Computer Electronics

Understanding Quantum Computing

14 September 2017

In 2014, Google Quantum AI Lab announced that it had hired John Martinis, one of the world’s foremost experts on quantum computing. Martinis, a physics professor at the University of California, Santa Barbara, was allowed to keep his affiliation with the university as he led a group of researchers attempting to develop quantum computing hardware for Google. By the end of this year, Martinis’ group is expected to demonstrate what is known as quantum supremacy: using qubits to solve a problem that would be beyond the reach of the world’s fastest supercomputers. How did we get this far, this fast? The answer is related to how quantum computers work.

A quantum machine. Source: Eric Lucero / CC BY-SA 3.0

The fundamental difference between traditional computers and quantum computers lies in the difference between bits and qubits. Classical computers utilize binary digits, or bits, to represent information. One bit can have one of two unique values, 1 or 0. Two bits can represent four possible unique values: 00, 01, 10 and 11. More bits means more unique values can be represented. In general, there are 2^N values for N bits. For example, a byte (8 bits) can represent 256 unique values.
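The powers-of-two relationship can be checked directly with a few lines of Python (an illustrative sketch, not from the original article):

```python
# Number of distinct values representable with n classical bits: 2**n.
def classical_states(n_bits: int) -> int:
    return 2 ** n_bits

print(classical_states(1))  # 2   -> the values 0 and 1
print(classical_states(2))  # 4   -> 00, 01, 10, 11
print(classical_states(8))  # 256 -> one byte
```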

By contrast, a quantum computer uses quantum binary digits, or qubits, to represent information. Qubits exploit superposition and quantum entanglement to create an intrinsic parallelism. A qubit is in a superposition of states while it is not being measured, and because qubits can be entangled with one another, N qubits can carry the equivalent of 2^N bits of information during a computation. This means that a relatively small number of qubits can exceed the information-processing capacity of very large supercomputers. The catch is that the answer has to be simple: when an entangled state is measured, it loses its parallelism and collapses into a state carrying only N classical bits of information. Thus during a calculation qubits can far exceed the information processing of a traditional computer, but the amount of information that can be output as a result is the same.
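As a rough illustration (a toy simulation of the state vector, not how real quantum hardware works), an N-qubit register can be modeled as 2^N complex amplitudes, while a measurement returns only one N-bit value:

```python
import random

def uniform_superposition(n_qubits: int) -> list[complex]:
    # An n-qubit register is described by 2**n complex amplitudes.
    dim = 2 ** n_qubits
    amp = 1 / dim ** 0.5
    return [amp] * dim

def measure(state: list[complex]) -> int:
    # Measurement collapses the register to a single basis state,
    # chosen with probability |amplitude|**2 -- so only n classical
    # bits of information come out of the computation.
    probs = [abs(a) ** 2 for a in state]
    return random.choices(range(len(state)), weights=probs)[0]

state = uniform_superposition(3)  # 8 amplitudes track 8 basis states at once
outcome = measure(state)          # but a measurement yields just one 3-bit value
print(f"collapsed to |{outcome:03b}>")
```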

Quantum computers are ideal for certain classes of problems that would require massive parallelism on traditional computers but produce relatively straightforward results. These include public-key cryptography built upon integer factorization, simulating many-body systems whose components perturb one another (such as electrons in an atom or molecule), and searching large databases quickly. These problems do not scale well on traditional computers. In a sense, quantum computers will not replace traditional computers so much as complement them.
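To see why factoring in particular scales badly on classical machines, consider brute-force trial division (a simple sketch; real classical factoring algorithms are faster, but still far from efficient):

```python
def smallest_factor(n: int) -> int:
    """Find the smallest nontrivial factor of n by trial division.
    The loop runs up to sqrt(n) times -- roughly 2**(bits/2) steps --
    which is why adding bits to a key makes classical factoring explode."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is prime

print(smallest_factor(15))    # 3
print(smallest_factor(3599))  # 59 (3599 = 59 * 61)
print(smallest_factor(97))    # 97, i.e. 97 is prime
```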

So although Google achieving quantum computing supremacy sounds impressive and perhaps slightly sinister, in reality it equates to a 49-qubit computer, a small step on the way to the full-sized quantum computers of the future. Still, it’s a significant step. Qubits are notoriously fragile, and getting 49 of them entangled and performing operations in a way that produces meaningful results is an incredible achievement, much as in 1946, when the Electronic Numerical Integrator and Computer (ENIAC), calculating 2,400 times faster than humans, was seen as an important step in the development of traditional computing.

In 20 years, perhaps our devices will utilize quantum computing in the cloud for certain applications. Certainly, the field of materials science will benefit from the development of even small quantum computers. There are many financial reasons to justify Google’s interest in quantum computing, and Google isn’t the only game in town: IBM, Microsoft and others are all investing heavily in the technology in a race for the future of computing. By the end of this year, quantum computing supremacy will likely have been achieved, and quantum computers will begin their long march towards the mainstream.



Powered by CR4, the Engineering Community

Discussion – 6 comments

Re: Understanding Quantum Computing
#1
2017-Sep-26 10:26 AM

I think 2^N is incorrect. This is what a binary bit does.

2^8=256 would be 8 bits for a conventional computer.

Maybe N^N is right, but I'm not sure.

8^8=16,777,216

Any clarification?

Re: Understanding Quantum Computing
#2
2017-Sep-26 10:28 AM

I still do not understand quantum computing.

Re: Understanding Quantum Computing
#4
In reply to #2
2017-Sep-26 7:36 PM

It has to do with the entanglement of subatomic particles.

There was an experiment that proved they behave differently when being measured, to give you a more expected result.

The state of entanglement is the state of not being measured; this allows the subatomic particles to act differently, not be confined by our logic, until the answer is produced.

Re: Understanding Quantum Computing
#3
2017-Sep-26 12:18 PM

Well... I don't understand quantum computing either, but if

"...N qubits are the equivalent of 2^N bits of information..."

then 8 qubits = 2^8 bits = 256 bits = 2^256 unique values

Re: Understanding Quantum Computing
#6
In reply to #3
2017-Sep-27 11:36 AM

8 qubits = 2^8 bits = 256 bits = 2^256 unique values

Thanks for that explanation; I looked at it for a while before the light came on.

Re: Understanding Quantum Computing
#5
2017-Sep-27 10:01 AM

One step at a time, starting with the thing in itself.

What is the identity of the q-bit subatomic particle?

Next,

How does one introduce/apply a logic code to cause a unique, non-binary, presumably hyper-ephemeral dynamic entanglement of such particles?

We're off and running down the Yellow Brick Road to taking quantum computing from alchemy to science.
