The tech world is buzzing with excitement about artificial intelligence (AI) and the promise of artificial general intelligence (AGI), the point at which AI matches or surpasses human abilities across a wide range of tasks.
However, it’s probably a bit too early to get excited.
Many in the tech world expect progress toward AGI to stall as the growth in transistor density in classical computing slows. Quantum computing, meanwhile, is being touted as the next big technology with the potential to power AGI models.
The problem is, current quantum processing units (QPUs) are unreliable, as will be discussed shortly. These limitations haven't stopped numerous companies from pouring billions into developing them. In fact, the U.S. and China appear to be locked in a race to make quantum computing commercially viable first.
What are quantum processors?
A QPU is the main processor in a quantum computer. It uses the principles of quantum mechanics to carry out calculations much faster than traditional or classical computers.
Before looking at QPUs in more detail, let’s first explore two types of processors that have accelerated computers in recent years:
- GPUs (graphics processing units) handle graphics rendering and other highly complex calculations.
- DPUs (data processing units) move and process large volumes of data.
QPUs operate on a fundamentally different level. They apply properties of quantum mechanics to do the heavy lifting of complex calculations much more quickly and efficiently.
In fact, QPUs do much more than simply speed things up. They can tackle certain types of calculations that would be unfeasible with traditional computing technologies.
In practical terms, a GPU optimizes processing by handling multiple tasks at the same time, but still within the bounds of classical physics, using standard bits that represent 1 or 0. A QPU breaks those bounds by using quantum bits (qubits), which can represent multiple states simultaneously thanks to superposition.
Another quantum property called entanglement allows qubits to interconnect in ways that amplify their computational power.
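To make superposition and entanglement concrete, here is a minimal sketch that simulates both classically with NumPy statevectors. A real QPU manipulates physical qubits; this just tracks the amplitudes on an ordinary computer, which is only feasible for small qubit counts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Single qubit: |0> sent through a Hadamard gate into an equal superposition.
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
superposed = H @ ket0               # amplitudes [1/sqrt(2), 1/sqrt(2)]
probs = np.abs(superposed) ** 2     # 50/50 chance of measuring 0 or 1

# Two qubits: the Bell state (|00> + |11>)/sqrt(2) is entangled --
# measuring one qubit fixes the outcome of the other.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)  # basis order: 00, 01, 10, 11
outcome = rng.choice(4, p=np.abs(bell) ** 2)
# outcome is always 0 (00) or 3 (11): the two results are perfectly correlated.
print(probs, format(outcome, "02b"))
```

The exponential cost of tracking all these amplitudes classically is precisely why a hardware QPU is interesting: the quantum device holds them natively.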
The potential of quantum computing is vast, which makes it an exciting area, especially for fields like cryptography, materials science and complex system modeling. However, it's still largely experimental and not yet ready for mainstream use. Many industry thought leaders are saying that QPUs will be the future of AI and are investing heavily in pioneering chip companies such as Nvidia.
Such is the interest in AI, and the technology that enables it, that Nvidia overtook Microsoft to become the most valuable company in the world. Nvidia also became one of the first companies to announce research into coupling classical chips with quantum computing technology.
Limitations of quantum processors
Despite the excitement surrounding quantum computing, there are significant hurdles to its widespread adoption.
Noise and error rates
Quantum systems are highly sensitive to their environment. Even slight disturbances can cause quantum decoherence, or noise, which makes qubits behave more like classical particles. Under these conditions, it's difficult to maintain superposition and entanglement, which disrupts computations and increases error rates.
Without intervention, current quantum computers have an error rate of around 1%, but error rates need to fall to roughly one in a trillion for many useful applications. To mitigate this problem, researchers have started to develop quantum error correction (QEC) techniques that protect quantum information more effectively. These techniques preserve the integrity of quantum states, allowing QPUs to perform reliable calculations over longer periods, despite the inherent instability of qubits.
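To illustrate the core idea behind error correction, here is a deliberately simplified sketch of a three-copy repetition code, the classical ancestor of the quantum bit-flip code. Real QEC schemes such as surface codes are far more involved and must also correct phase errors without measuring the data qubits directly:

```python
import random

random.seed(1)
P_FLIP = 0.01  # per-copy error probability, roughly today's ~1% qubit error rate

def encode(bit):
    """Repetition code: store one logical bit in three physical copies."""
    return [bit] * 3

def noisy(copies):
    """Each physical copy flips independently with probability P_FLIP."""
    return [b ^ (random.random() < P_FLIP) for b in copies]

def decode(copies):
    """Majority vote corrects any single flip."""
    return int(sum(copies) >= 2)

trials = 100_000
errors = sum(decode(noisy(encode(0))) != 0 for _ in range(trials))
# The logical error rate is ~3 * P_FLIP**2 = 0.0003, well below the raw 1%,
# because a logical error now requires at least two simultaneous flips.
print(errors / trials)
```

The same trade-off drives quantum hardware roadmaps: many noisy physical qubits are spent to produce one dependable logical qubit.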
Limited qubit count
Scaling up quantum systems to the larger number of qubits needed for complex AI algorithms remains a technical challenge. At the moment, quantum computers can only reliably handle on the order of 100 qubits, which restricts their processing power.
Part of the problem of scaling up is that QPUs need ultra-low temperatures that are close to absolute zero (-273.15°C). As the number of qubits increases, maintaining these ultra-low temperatures becomes more challenging due to increased heat generation from control electronics and other components.
Immature quantum algorithms
Another problem with using QPUs for AI is that quantum algorithms differ from those used on classical computers. Designing and implementing them requires a new approach, one that is still in its infancy.
Quantum algorithms, such as Shor's algorithm for factorization and Grover's algorithm for search, are currently being explored for their potential to solve complex problems much faster than classical computers.
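Grover's algorithm can be sketched with a small classical simulation of its statevector. The example below searches 8 items in about (π/4)·√8 ≈ 2 iterations; the item count and marked index are arbitrary illustrative choices:

```python
import numpy as np

N, marked = 8, 5                    # search 8 items for index 5
state = np.full(N, 1 / np.sqrt(N))  # uniform superposition over all items

# Each Grover iteration phase-flips the marked amplitude (the "oracle"),
# then reflects every amplitude about the mean (the "diffusion" step).
# The optimal iteration count is about (pi/4) * sqrt(N).
for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
    state[marked] *= -1               # oracle: phase-flip the target
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

probs = np.abs(state) ** 2
print(int(probs.argmax()), round(float(probs[marked]), 3))  # -> 5 0.945
```

After just two iterations the marked item carries about 95% of the measurement probability, versus the 1-in-8 chance of a blind classical guess; this quadratic speedup is Grover's whole appeal.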
Another option is to use quantum transformers, which have shown some success in processing and analyzing data in a way that mimics the architecture of neural network transformers in classical computing. These quantum transformers aim to leverage quantum computing's capabilities to perform complex tasks like natural language processing more efficiently than traditional models.
Lack of practical applications
Despite the theoretical advantages of quantum computing in handling specific calculations, it's uncertain if QPUs will perform effectively with typical AI tasks such as those used in deep learning models. The potential for quantum computers to improve areas like cryptography and complex system modeling is clear, but their advantage in conventional AI applications is yet to be demonstrated.
Integration challenges
As mentioned earlier, quantum states are fragile, and holding them long enough to perform meaningful computations is difficult. When quantum hardware must work alongside classical systems, it becomes even harder to maintain environments shielded from the interference that disrupts quantum coherence.
The interface between quantum processing units and classical systems adds another layer of complexity. Data needs to be translated back and forth between classical and quantum forms, requiring new protocols and interfaces. These interfaces must handle quantum data without collapsing its state, all while ensuring efficient communication between different parts of the system.
High energy consumption
An article on AI wouldn't be complete without mention of its huge energy demands. Existing large language models (LLMs), such as those behind ChatGPT, already consume large amounts of energy and water to keep running.
QPUs could take this to an even higher level. Google's Sycamore quantum processor, for example, consumes about 26 kW of power, primarily due to the extensive cryogenic cooling systems it requires.
Skills gap
Finally, there's a knowledge and skills gap that must be bridged. The workforce skilled in quantum computing at the moment is tiny, and the educational infrastructure to support quantum computing is still developing.
Quantum computing requires knowledge from multiple fields, including physics, computer science, mathematics, and engineering. Many universities are only just starting to offer quantum computing courses, with the curriculum still evolving, as the field itself is rapidly changing.
Is the future quantum AI? Possibly
There’s little doubt that quantum computing will play some kind of role in the future of AI. Quite how much is a matter of opinion. It mainly depends on whether researchers can find innovative ways to maintain quantum states without such a demand on energy and resources.
It feels like anything is possible in this new world of artificial intelligence and quantum computing, but it's best to stay grounded. One thing is certain: billions of dollars of investment will pour into both technologies over the next couple of decades, and if that's anything to go by, the future will be quantum AI.
About the author
Daniel Martin is an ex-engineer and schoolteacher who has worked as a full-time technical content writer for the past decade. He applies his knowledge of technical topics to write clear, easy-to-understand and informative content. Daniel aims to bring a fresh perspective into topics and help other technical professionals to broaden their understanding and thinking.