To better understand the difference between a regular computer and a quantum computer, Eric Ladizinsky, co-founder of quantum computing company D-Wave, offered a very ‘real’ world analogy. Imagine you have only five minutes to find an ‘X’ written on a page somewhere in 50 million books. You’d never find it. Now imagine you had 50 million parallel realities (the quantum computer) and could look at a different book in each of those realities. You would now find the ‘X’.
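For readers curious about the mechanics behind the analogy: the book hunt is what computer scientists call unstructured search. A real quantum computer would not literally open every book in a separate reality; Grover’s algorithm instead amplifies the probability of the right answer, finding it in roughly √N steps rather than N. Here is a minimal sketch simulating that amplification with plain NumPy (the library size and marked index are illustrative, not from Ladizinsky’s example):

```python
import numpy as np

N = 8                      # size of the "library" (8 books, for illustration)
marked = 5                 # index of the book containing the 'X'

# Start in a uniform superposition over all N items.
state = np.full(N, 1 / np.sqrt(N))

# Grover iteration: flip the sign of the marked item (oracle),
# then reflect every amplitude about the mean (diffusion).
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # ~sqrt(N) steps
for _ in range(iterations):
    state[marked] *= -1                  # oracle
    state = 2 * state.mean() - state     # diffusion

probabilities = state ** 2               # amplitudes are real here
print(f"P(find the marked item) = {probabilities[marked]:.3f}")
```

With 8 ‘books’ and just two Grover iterations, the marked item is measured with about 95% probability, whereas a classical search would expect to check half the books before finding it.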
Common digital computing requires that data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1). Quantum computation uses quantum bits, or qubits, which can exist in a superposition of both states at once; a system of n qubits can represent 2^n states simultaneously, which is where the enormous potential comes from.
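To make superposition concrete, here is a minimal sketch, again in plain NumPy, of a single qubit as a pair of complex amplitudes (the variable names are illustrative):

```python
import numpy as np

# A classical bit is 0 or 1. A qubit is a unit vector of two complex
# amplitudes; measurement yields 0 or 1 with probabilities |a0|^2, |a1|^2.
zero = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)    # |1>

# An equal superposition: "both states at once" until measured.
qubit = (zero + one) / np.sqrt(2)

probs = np.abs(qubit) ** 2
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")  # 0.50 each

# Simulate one measurement: the superposition collapses to a single bit.
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=probs)
print(f"measured: {outcome}")
```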
But what does that really mean? In principle, quantum processors could tackle certain classes of problems dramatically faster (by factors of a million or more, in the most optimistic projections) than the ones we use today, while using less power and working on many levels and tasks simultaneously.
Possible applications include quantum encryption methods for increased security and data protection. In medicine, quantum computing could allow a person’s genes to be sequenced and analysed far more rapidly than today’s methods permit, opening the door to personalised drug development. Meteorologists, through massive real-time data analysis, would have a much better idea of when bad weather will strike, enabling them to alert people and ultimately save lives, anguish and money.
Which brings us back to AI. Information processing is critical to improving machine learning. Quantum computers could analyse large quantities of data to give artificial intelligence systems the feedback they need to improve performance, shortening the learning curve.