Energy-Efficient & Practical Quantum AI
Bridging Quantum Principles and AI for the Next Technological Frontier

Won Joon Yun
(June 21, 2023)

Abstract

Artificial Intelligence (AI) has produced remarkable applications such as AlphaGo and Tesla's vision technology. Its fusion with quantum computing, termed Quantum AI, promises even greater potential. While Quantum Machine Learning (QML) lays the groundwork for Quantum AI, its definitive edge over classical methods has yet to be established. Major industry players such as Google and IBM have made strides in demonstrating quantum capabilities, with projections indicating the emergence of quantum AI by 2025. However, challenges such as high costs, elevated error rates, and limited scalability persist. For Quantum AI to outpace classical AI, a deeper understanding of QML, energy-efficient algorithms, and the identification of suitable real-world applications are crucial. Despite the current dominance of classical AI, Quantum AI, with focused research, holds the promise of becoming a transformative force in the tech landscape.

I. Quantum AI - Fusion of Quantum Computing and AI

Fig. 1. Current AI services in 2023

Fig. 2. When Quantum Computer Meets AI

Currently, AI technology is so useful that it is changing human activity and is being used in diverse ways (see Fig. 1). AlphaGo may now be considered dated, yet it still plays Go far better than any human. Tesla has advanced its vision technology to the point where its AI model behaves as if the car were equipped with a 3D sensor. Moreover, the recent ARC test results of GPT-4 were truly astonishing. AI technology will continue to advance, and it seems certain that we will come to live highly automated lives.

Let's consider what happens when quantum computing is integrated with this AI technology. First, we need to look at the characteristics of quantum computers. Quantum computing algorithms are based on the principles of quantum superposition and entanglement. Superposition allows the simultaneous representation of '0' and '1' (or, more generally, a pair of eigenstates), and by controlling the quantum state, manipulation of the '0' and '1' states becomes possible. These superposable and controllable quanta are called quantum bits (qubits). Additionally, multiple qubits can become entangled. Once entangled, qubits exhibit dependencies on one another, allowing for parallel computation. Leveraging superposition and entanglement, notable achievements include Shor's algorithm for factorization, the quantum Fourier transform that underpins it, and the powerful Grover's search algorithm. These algorithms have the advantage that their cost scales favorably as the problem size increases. Constructing Quantum AI on top of quantum computing technology is therefore likely to produce significant and positive results (see Fig. 2).
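To make superposition and entanglement concrete, here is a minimal sketch of a two-qubit Bell-state circuit. It assumes the PennyLane library, which is my illustrative choice; the text above does not prescribe a framework.

```python
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def bell_state():
    qml.Hadamard(wires=0)   # superposition: (|0> + |1>)/sqrt(2) on qubit 0
    qml.CNOT(wires=[0, 1])  # entanglement: state becomes (|00> + |11>)/sqrt(2)
    return qml.probs(wires=[0, 1])

# ~[0.5, 0, 0, 0.5]: only |00> and |11> occur, so the qubits are correlated.
print(bell_state())
```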

Let's delve into Quantum Machine Learning (QML), the quantum counterpart of Machine Learning (ML) that underpins Quantum AI. Maria Schuld proposed early ideas on quantum machine learning and subsequently drew wide attention to the field by suggesting quantum kernel methods. However, it has not yet been clearly proven that quantum machine learning directly outperforms classical methods in combinatorics, memory, or search. In other words, to progress to quantum AI, we need a quantum machine learning method that shows clear computational and memory advantages. This must be implemented and validated on quantum computers, which is by no means a simple challenge.
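To illustrate the quantum kernel idea mentioned above, the sketch below estimates the similarity of two data points as the overlap of their encoded quantum states. This is a minimal example assuming PennyLane; the angle-embedding feature map is my illustrative choice, not one prescribed by the text.

```python
import numpy as np
import pennylane as qml

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def overlap_circuit(x1, x2):
    # Encode x1, then undo the encoding of x2; the probability of
    # measuring |0...0> equals the squared overlap |<phi(x2)|phi(x1)>|^2.
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def quantum_kernel(x1, x2):
    return overlap_circuit(x1, x2)[0]  # entry 0 is the all-zeros outcome

print(quantum_kernel(np.array([0.1, 0.4]), np.array([0.2, 0.3])))
```

A classical kernel machine such as an SVM can consume this kernel in place of, say, an RBF kernel; the open question raised above is when this buys a genuine advantage.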

II. How Do Industry and Academia View Quantum AI?

Fig. 3. How Industry Views Quantum AI

Fig. 4. How Academia Views Quantum AI

The industrial sector is at the forefront of verifying quantum supremacy. In 2019, Google demonstrated quantum supremacy on the problem of sampling the outputs of random circuits. This was followed in 2022 by Xanadu, which demonstrated quantum supremacy in Gaussian Boson Sampling. More recently, in 2023, an IBM research team presented evidence that useful quantum computation can be achieved without full Quantum Error Correction (QEC), relying on error mitigation instead.

Google has stated, "Quantum Artificial Intelligence will enhance the most consequential of human activities, explaining observations of the world around us." IBM Quantum released a roadmap suggesting that quantum AI could be realized by 2025. Xanadu, D-Wave, IonQ, and Rigetti have projected similar timelines. Recently, IonQ drew attention by running a machine learning algorithm for MNIST on its hardware, but the accuracy was only around 80%, still short of what classical computers achieve.

How does academia view quantum AI? In the 20th century, with the advent of quantum information theory and the introduction of the concept of quantum computers, the fields of quantum algorithms and quantum information processing developed actively. After the emergence of deep learning, quantum kernels and quantum machine learning were studied. Subsequently, research was conducted on the Quantum Convolutional Neural Network (QCNN) for image processing and on quantum reinforcement learning, and to this day, models such as quantum shadow models continue to be proposed and analyzed theoretically.

Such algorithms must be validated on quantum computers. Quantum clouds such as IBM Quantum, IonQ, and QuEra are becoming widely available, and since 2021 there has been a surge of experiments aiming to demonstrate quantum supremacy. However, the reality is that current quantum machine learning technology offers low performance and robustness. Therefore, quantum AI seems unlikely to be realized in the very near future.

III. Why is Quantum AI Premature?

Fig. 5. Prematurity of QML

Fig. 6. Vagueness of QML Research

Quantum machine learning currently has lower performance than classical artificial intelligence, costs more, and suffers larger errors. In quantum machine learning, a single inference costs roughly $10-$100 (based on 1K-10K shots), and running a quantum computer for one minute costs around $100. By contrast, running the same computation in simulation costs well under $1, so quantum computers cannot currently compete with classical computers on cost. In terms of error, quantum computers exhibit error rates of roughly 0.01-0.1, whereas classical computing operates at error levels around 10^-6, orders of magnitude lower. Finally, on classical computers an MNIST model reaches over 99% accuracy, while on quantum computers it reaches only about 80%, implying that quantum machine learning still requires significant advancement.
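As a back-of-the-envelope check on these numbers, the snippet below converts the quoted shot counts and pricing into a per-inference cost. All figures are the rough 2023 estimates from the paragraph above, not measured benchmarks.

```python
# Rough 2023 cost model implied by the estimates quoted above.
shots_per_inference = 10_000   # upper end of the 1K-10K range
cost_per_shot_usd = 0.01       # implied by ~$100 for a 10K-shot inference

quantum_cost = shots_per_inference * cost_per_shot_usd
classical_cost = 1.0           # simulation: "doesn't even amount to $1"

print(f"quantum:   ~${quantum_cost:.0f} per inference")
print(f"classical: <${classical_cost:.0f} per inference")
print(f"gap:       ~{quantum_cost / classical_cost:.0f}x or more")
```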

Maria Schuld, who helped pioneer quantum machine learning, has pointed this out herself. From 2022 through QHACK 2023, she consistently noted that while researching quantum machine learning is worthwhile, the field's direction seemed overly geared toward demonstrating "quantum supremacy". The main open questions concern how to construct circuits properly, which problems to solve, how to evaluate performance, and why "quantumness" matters for the problems at hand. Indeed, many quantum machine learning papers claim better performance than traditional methods, but the improvements are modest at best and do not address the "why". Moreover, this progress does not even match the state of the art (SOTA) in classical machine learning.

So, is there a way for quantum machine learning to progress into quantum AI that can replace classical computing? Let's explore this in the next chapter.

IV. Methods for Quantum AI to Surpass Classical AI

Fig. 7. Problems of Classical AI

Fig. 8. Vision, Mission, and Technical Hurdles to Surpass AI

" My goal is to tackle real-world and scalable problems where Quantum AI can surpass traditional AI.   To achieve this, we must uncover the fundamentals of QML and strive to reduce costs.   This ambition has the potential to revolutionize our society. "

Since 2020, Large Language Models (LLMs) have evolved continuously, and their model sizes have grown tremendously. At present, inference with GPT-4 reportedly requires a minimum of 140 Nvidia H100 GPUs. Considering that roughly 500 mL of water is reportedly used for cooling per prompt, the cost is undoubtedly substantial. Expanding the size of LLMs is no longer sustainable and is likely harmful to the environment. Furthermore, a crossover point in computing costs between quantum computers and commercial computers may be approaching, suggesting a potential opening for Quantum AI. In fact, models such as the Quantum Transformer and the Quantum Vision Transformer have emerged, exhibiting quantum properties while achieving performance that could potentially rival classical AI even at the current NISQ scale.

In my perspective, for Quantum AI to become a reality, it must achieve high performance, large scale, and low cost simultaneously. The concrete steps toward this goal, together with the technical challenges they entail, are laid out in the missions of the next chapter.

V. Missions to Achieve Quantum AI

Fig. 9. Toward Quantum AI

A. Developing QML Fundamentals

To explain the QML fundamentals: in the noiseless setting, the mathematical expressiveness of the universal unitary gate and of the random circuit is equivalent to that of the parametrized unitary gate, but in the presence of noise, the mathematical spaces that the universal unitary gate and the random circuit can reach become different. Therefore, in a noise-mitigated environment, we need to work through the following steps.

1. First, research should be conducted on whether a universal data encoder can be created. Angle encoding and amplitude encoding are being studied for encoding classical data, but neither can be viewed as a universal encoder. Recently, a data re-uploading method has been proposed, which uploads the data, processes the quantum state with trainable parameters, and then uploads the data again; however, its computational cost is very high (a minimal sketch follows below).
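For concreteness, here is a minimal sketch of data re-uploading, assuming PennyLane; the qubit count, entangler, and observable are illustrative choices of mine rather than details from the text.

```python
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 2, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def reuploading_model(x, weights):
    # Alternate data encoding with trainable processing; re-encoding the
    # data each layer boosts expressiveness, at the price of a circuit
    # whose depth (and hence cost) grows with the number of layers.
    for layer in range(n_layers):
        qml.AngleEmbedding(x, wires=range(n_qubits))  # (re-)upload the data
        for q in range(n_qubits):
            qml.Rot(*weights[layer, q], wires=q)      # trainable rotation
        qml.CNOT(wires=[0, 1])                        # entangle
    return qml.expval(qml.PauliZ(0))

weights = np.random.uniform(0, np.pi, size=(n_layers, n_qubits, 3))
print(reuploading_model(np.array([0.5, -0.2]), weights))
```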

2. Second, research should aim to reduce the gap between the real world and the ideal world. Until 2022, the performance difference between simulation and real hardware was very significant: for MNIST, while simulation showed 80% accuracy, real hardware reached only about 30%. The challenge is to create a model that runs well on a variety of devices. Recently, IonQ has reported results that significantly reduce this gap by using the Reconfigurable Beam Splitter (RBS) block and Projection-Valued Measurement (PVM); the RBS gate is sketched below.
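The RBS gate itself has a simple closed form. The sketch below defines it in NumPy and checks its defining property; this is the standard textbook form of the gate, shown as an illustration rather than as IonQ's exact implementation.

```python
import numpy as np

def rbs(theta):
    # Reconfigurable Beam Splitter (RBS): rotates within the {|01>, |10>}
    # subspace and leaves |00> and |11> untouched, so the number of
    # excitations (Hamming weight) of the state is preserved.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([
        [1.0, 0.0, 0.0, 0.0],
        [0.0,   c,   s, 0.0],
        [0.0,  -s,   c, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

U = rbs(0.3)
print(np.allclose(U @ U.T, np.eye(4)))  # True: real orthogonal, hence unitary
```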

3. Third, research should address the model structures used in quantum machine learning. Just as classical artificial neural networks evolved from multi-layer perceptrons to transformers, the goal is to develop better circuits for quantum machine learning. Typically, quantum machine learning follows the encode-process-measure pipeline described in points 1-3 above, but recently a Flip model has been proposed that performs data encoding and measurement after the quantum state has been processed by a VQC. It has also been confirmed that a butterfly-structured arrangement of RBS gates can significantly reduce the gap in experiments (the connectivity pattern is sketched below). Most recently, a method that obtains a classical shadow, maps it to a quantum initial state, and then encodes the data has been proposed, and it has been shown to have theoretical advantages over traditional AI.
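To visualize the butterfly structure mentioned above, the helper below enumerates the FFT-like pairing of n qubits across log2(n) layers; each pair would host one RBS gate. The function and layout convention are my own illustrative choices.

```python
def butterfly_pairs(n):
    # FFT-style butterfly connectivity: log2(n) layers of disjoint qubit
    # pairs, with the pairing stride halving from n/2 down to 1.
    layers, stride = [], n // 2
    while stride >= 1:
        layers.append([(i, i + stride)
                       for block in range(0, n, 2 * stride)
                       for i in range(block, block + stride)])
        stride //= 2
    return layers

# For 8 qubits: [(0,4),(1,5),(2,6),(3,7)], then [(0,2),(1,3),(4,6),(5,7)], ...
for layer in butterfly_pairs(8):
    print(layer)
```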

B. Study on Energy-Efficient QML

The problem with quantum machine learning is that its performance is lower than that of current classical computers, its latency is high, and its cost is also high. For example, when running FashionMNIST, the reported latency on quantum hardware is 4,600-28,000 times higher than on classical hardware, yet the accuracy is much lower.
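A major driver of that latency and cost is the number of measurement shots needed per inference. The toy simulation below illustrates the underlying trade-off: fewer shots are cheaper and faster but make the estimated expectation value noisier. The "true" expectation value here is a made-up number for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
true_expval = 0.3  # hypothetical <Z> of some circuit (illustrative only)

# Each shot yields +1 or -1; the sample mean estimates <Z> with
# statistical error shrinking like 1/sqrt(shots).
p_plus = (1 + true_expval) / 2
for shots in (100, 1_000, 10_000):
    samples = rng.choice([1, -1], size=shots, p=[p_plus, 1 - p_plus])
    print(f"{shots:>6} shots -> estimate {samples.mean():+.3f}")
```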

Three solutions to this are introduced; if we work through this series of steps, we may be able to achieve energy-efficient quantum AI.

C. Finding Theoretical Advantage

Proving a theoretical advantage of quantum machine learning over classical machine learning would be a game-changer of paramount importance.

In summary, research that uses PAC learning theory and complexity theory to characterize which problems quantum machine learning can solve is crucial; the classic PAC bound below illustrates the kind of tool involved.
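As a reminder of the kind of guarantee PAC theory provides, here is the classic sample-complexity bound for a finite hypothesis class in the realizable setting; this is a standard textbook result quoted for orientation, not one the text singles out.

```latex
% With probability at least 1 - \delta, any learner that outputs a
% hypothesis consistent with m i.i.d. samples has true error at most
% \epsilon, provided
m \;\ge\; \frac{1}{\epsilon}\left(\ln|\mathcal{H}| + \ln\frac{1}{\delta}\right).
```

Showing that a quantum model class attains such a guarantee with markedly fewer samples, or over a hypothesis class no classical learner can search efficiently, is the kind of separation this mission targets.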

D. Finding Potential Applications

While it might still be a speculative discussion, research on various applications where classical artificial neural networks are used is also necessary.

Classical artificial neural networks with many parameters are mainly used in applications such as 3D object detection and Natural Language Processing (NLP). Research that directly reduces computation in these fields would be meaningful, and well-known quantum advantages, such as approaches to the max-cut problem and to search, could also inform the design of quantum artificial neural networks (a max-cut sketch follows below).
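As one concrete reference point for the max-cut remark above, the sketch below sets up QAOA for max-cut on a toy four-node ring, assuming PennyLane's qaoa module and networkx; the graph, circuit depth, and optimizer settings are my illustrative choices.

```python
import networkx as nx
import pennylane as qml
from pennylane import numpy as np

graph = nx.cycle_graph(4)                  # toy 4-node ring graph
cost_h, mixer_h = qml.qaoa.maxcut(graph)   # built-in max-cut Hamiltonians

depth = 2
dev = qml.device("default.qubit", wires=4)

def qaoa_layer(gamma, alpha):
    qml.qaoa.cost_layer(gamma, cost_h)     # phase separation
    qml.qaoa.mixer_layer(alpha, mixer_h)   # mixing

@qml.qnode(dev)
def cost(params):
    for w in range(4):
        qml.Hadamard(wires=w)              # uniform superposition over all cuts
    qml.layer(qaoa_layer, depth, params[0], params[1])
    return qml.expval(cost_h)              # minimizing this maximizes the cut

opt = qml.GradientDescentOptimizer()
params = np.array([[0.5, 0.5], [0.5, 0.5]], requires_grad=True)
for _ in range(50):
    params = opt.step(cost, params)
print("optimized cost:", cost(params))
```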

VI. Concluding Remark

So far, I have examined methods for moving toward energy-efficient and practical quantum AI. While classical artificial intelligence still significantly outperforms quantum artificial intelligence, the rate of advancement of massive AI models such as LLMs is expected to slow gradually. Due to cost issues, I anticipate that quantum AI will emerge as an alternative in the near future. At that juncture, if I further research the fundamentals of QML, discover energy-efficient QML techniques, identify theoretical quantum advantages, and study applications that can exploit them, I firmly believe I will be able to commercialize quantum AI when the moment arrives.