15. Using Quantum Linear Algebra for Learning
15.1 Video on Using Quantum Linear Algebra for Learning
15.1.1 Can measurement do linear algebra?
We returned to the first metaphor introduced in Module 1: quantum states are special probability distributions written as vectors. The quantum basic linear algebra subroutines (QBLAS) give you a toolbox to do linear algebra with these vectors. Would a measurement qualify as such an operation? The answer is yes, because the measurement outcome is a projection onto an eigenvector: measuring an observable collapses the state onto one of the observable's eigenvectors (the Born rule), and a projector is a linear operator. I thought measurement was just a laboratory process that changes the state, hence something bad, and not a computing process at all, let alone a linear algebra computation. The sketch below spells out why it really is a projection onto an eigenvector.
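Here is a minimal sketch, my own illustration rather than anything from the lecture: for each eigenvector of an observable, the Born rule gives the outcome probability, and the post-measurement state is the renormalized projection of the input state.

```python
import numpy as np

# My own sketch, not from the lecture: a projective measurement of an
# observable projects the state onto one of the observable's eigenvectors.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])    # observable: Pauli-Z
eigvals, eigvecs = np.linalg.eigh(Z)

psi = np.array([0.6, 0.8])                 # state 0.6|0> + 0.8|1>
for lam, v in zip(eigvals, eigvecs.T):
    P = np.outer(v, v.conj())              # projector onto this eigenvector
    prob = (psi.conj() @ P @ psi).real     # Born rule: p = <psi|P|psi>
    post = P @ psi / np.sqrt(prob)         # post-measurement state
    print(f"outcome {lam:+.0f}: probability {prob:.2f}, state {post}")
```

So a measurement is not just laboratory noise: mathematically it applies a linear projector followed by renormalization, which is why it counts as a linear algebra operation.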
15.1.2 Quantum random access memory (QRAM)
Quantum random access memory (QRAM) is a necessary component in many coherent quantum algorithms. Despite the name, its purpose has little to do with classical RAM or classical processing: the point is to retrieve the desired data as a quantum state in superposition, that is, to answer a query whose address register is itself in superposition.
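A toy illustration of what that means (my own sketch; actual QRAM proposals such as the bucket-brigade architecture are far more involved): an oracle acting as |i⟩|b⟩ → |i⟩|b ⊕ x_i⟩, queried with a uniform superposition of addresses, returns every stored bit entangled with its address in a single query.

```python
import numpy as np

# My own toy sketch of a QRAM oracle: |i>|b> -> |i>|b XOR x_i> for
# 1-bit data entries x_i. Real QRAM proposals are much more involved.
x = np.array([0, 1, 1, 0])            # classical bits at 4 addresses
n_addr = 2                            # address qubits
dim = 2**n_addr * 2                   # address register x 1-bit data register

U = np.zeros((dim, dim))
for i in range(2**n_addr):
    for b in range(2):
        U[2*i + (b ^ x[i]), 2*i + b] = 1.0

# Query all addresses at once: uniform superposition, data register in |0>
psi = np.kron(np.full(2**n_addr, 0.5), np.array([1.0, 0.0]))
out = U @ psi
for idx, amp in enumerate(out):
    if abs(amp) > 1e-12:
        print(f"|{idx >> 1:02b}>|{idx & 1}>: amplitude {amp:.2f}")
# Each address ends up entangled with its stored bit: superposed retrieval.
```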
15.2 Quantum-Assisted Gaussian Processes
15.2.1 Gaussian Prior, Posterior, and Posterior Updates
Bayesian approaches pivot on priors, posteriors, and how we update the latter based on observations. Why do we like a Gaussian prior and a Gaussian posterior? Note that the reason is not that Gaussians are the only option in a Bayesian approach, nor that a Gaussian is needed for the matrix inversion in the update; linear regression has nothing to do with it either. The answer is: it is easy to derive a convenient update rule, because a Gaussian prior combined with a Gaussian likelihood is conjugate and yields a Gaussian posterior in closed form.
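To see why the update rule is convenient, here is the standard one-dimensional conjugacy result (a textbook fact, not taken from the lecture): with prior θ ∼ N(μ₀, σ₀²) and likelihood y | θ ∼ N(θ, σ²), the posterior is again Gaussian,

$$
\theta \mid y \sim \mathcal{N}(\mu_1, \sigma_1^2), \qquad
\frac{1}{\sigma_1^2} = \frac{1}{\sigma_0^2} + \frac{1}{\sigma^2}, \qquad
\mu_1 = \sigma_1^2 \left( \frac{\mu_0}{\sigma_0^2} + \frac{y}{\sigma^2} \right).
$$

No numerical integration is needed: the update is pure linear algebra, which is exactly the part a quantum subroutine can assist with.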
15.2.2 Is the Gaussian approach a variational circuit?
The quantum-assisted forward pass has a fair bit of classical pre- and post-processing. Would you consider it a hybrid classical-quantum variational algorithm? The answer is no, since the quantum circuit has no parameters to optimize. I had a feeling the Gaussian approach is not variational, and the reason has nothing to do with shallow or deep circuits. What I had not realized is that the Gaussian circuit has no parameters to optimize, and optimizing circuit parameters is the defining objective of variational circuits (otherwise why would we need them?). Also, merely alternating quantum and classical processing does not make the Gaussian approach a variational circuit.
15.3 Module 4 Quiz
15.3.1 QFT notation for the binary representation after the binary point
This one is easy. For 0.5625 = 1/2 + 0/4 + 0/8 + 1/16, the binary representation is 0.1001.
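A quick sanity check (my own snippet) by repeated doubling: the integer part produced at each step is the next binary digit after the point.

```python
# My own check: expand a decimal fraction into binary digits by doubling.
x = 0.5625
bits = []
for _ in range(4):
    x *= 2
    bit = int(x)          # integer part is the next binary digit
    bits.append(bit)
    x -= bit
print("0." + "".join(map(str, bits)))   # prints 0.1001
```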
15.3.2 QFT formula derivation
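I did not write the derivation down, so for reference, here is the standard statement (a textbook result): the QFT maps a basis state |x⟩ = |x₁x₂…xₙ⟩ to a tensor product of single-qubit states whose phases are exactly the binary fractions from 15.3.1,

$$
\mathrm{QFT}\,|x\rangle
= \frac{1}{\sqrt{2^n}} \sum_{k=0}^{2^n - 1} e^{2\pi i\, x k / 2^n} |k\rangle
= \bigotimes_{l=1}^{n} \frac{|0\rangle + e^{2\pi i\, 0.x_{n-l+1} \ldots x_n} |1\rangle}{\sqrt{2}}.
$$

The product form is what makes the circuit implementation with Hadamards and controlled phase rotations possible.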
15.3.7 QBLAS Restrictions
QBLAS stands for quantum basic linear algebra subroutines, which is a convenient way of referring to quantum protocols that give you useful linear algebra operations. The main restrictions on these algorithms are:
· Vectors must be normalized – True. I guess the instructor sees normalization as something that limits the application scope of the linear algebra, although I don't see normalization as a restriction; I thought it was part of the power of linear algebra. (See the sketch after this list for what normalization costs in practice.)
· They run only on contemporary quantum computers – Wrong. The restriction is not the linear algebra itself; it is the coherent hardware, not yet available, that these protocols require. QBLAS cannot be compressed into the shallow circuits that contemporary devices can run.
· They must be reversible – Wrong. The instructor wants students to recall the Harrow-Hassidim-Lloyd (HHL) matrix inversion algorithm: it relies on postselection. I thought this one was true, because quantum circuits are unitary and therefore reversible, which limits what they can do compared with digital circuits. Apparently postselection, that is, conditioning on a measurement outcome, is the way around strict reversibility (?!).
· They map from quantum states to quantum states – True. From the video lecture, QBLAS include the QFT, phase estimation, and matrix inversion, i.e., the HHL algorithm for solving a system of linear equations (which is why HHL is so famous). So QBLAS operate quantum-to-quantum, in the spirit of a quantum Turing machine; they do not solve the classical-to-quantum data-loading problem.
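A small sketch (mine, not the lecture's) of the first and last points together: amplitude encoding only represents a vector up to its norm, so an HHL-style routine returns a state proportional to A⁻¹b, not the scaled classical answer.

```python
import numpy as np

# My own sketch: normalization in amplitude encoding, and the
# quantum-to-quantum character of matrix inversion.
b = np.array([3.0, 4.0])
state_b = b / np.linalg.norm(b)    # |b> = (0.6, 0.8); the scale 5 is lost

A = np.array([[2.0, 0.0], [0.0, 1.0]])
x = np.linalg.solve(A, b)          # classical answer, scale included
state_x = x / np.linalg.norm(x)    # what an HHL-style routine outputs

print("input state  |b>:", state_b)
print("classical     x :", x)
print("output state |x>:", state_x)
```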
15.3.8 No-Cloning Theorem and QRAM
Quantum random access memory (QRAM) is a necessary component in many coherent quantum algorithms, but it is unclear how to build it in a scalable manner. Could we store our data instances in a superposition and retrieve them as needed? The no-cloning theorem says no: however you try, you cannot copy the data held in the superposition, so reading it out consumes it.
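The standard linearity argument (textbook, not from the quiz) shows why. Suppose some unitary U could clone arbitrary states, U|ψ⟩|0⟩ = |ψ⟩|ψ⟩. For |ψ⟩ = α|0⟩ + β|1⟩, linearity forces

$$
U\big( \alpha|0\rangle + \beta|1\rangle \big)|0\rangle
= \alpha|0\rangle|0\rangle + \beta|1\rangle|1\rangle,
$$

whereas genuine cloning would require (α|0⟩ + β|1⟩) ⊗ (α|0⟩ + β|1⟩), including the cross terms αβ|0⟩|1⟩ and βα|1⟩|0⟩. The two agree only when α or β vanishes, so no such U exists.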
15.3.9 Gaussian Distribution, Kernel Matrix, and Sparsity
A Gaussian distribution is defined by its mean and variance, in other words, by its first and second moments. In Gaussian processes, we think of the variance as a kernel: f ∼ N(0, K), where K_ij = k(x_i, x_j). What structure would we like to see in the kernel matrix if we use quantum matrix inversion?
· A dense matrix, that is, one with few or no zero elements, which would let us capture long-range correlations between data points – Wrong. Dense matrices remain hard to simulate efficiently, even in theory.
· Sparsity, so we can do Hamiltonian simulation efficiently – True.
· A big difference between the smallest and largest eigenvalues – Wrong. The quantum matrix inversion algorithm is very sensitive to this spread (the condition number), although the matrix we actually invert compensates somewhat for large differences in the eigenvalues of the kernel matrix. The sketch below makes this concrete.
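A quick numerical illustration (my own, with an RBF kernel as an assumed example): the raw kernel matrix can be terribly conditioned, while adding the noise term, as the Gaussian process update does, tames the eigenvalue spread.

```python
import numpy as np

# My own sketch: condition number of a kernel matrix, before and after
# adding the noise term that the GP update inverts alongside K.
rng = np.random.default_rng(0)
X = rng.uniform(size=(20, 1))
K = np.exp(-(X - X.T)**2 / 0.1)       # RBF kernel, K_ij = k(x_i, x_j)

print("kappa(K)         =", np.linalg.cond(K))                       # huge
print("kappa(K + 0.1*I) =", np.linalg.cond(K + 0.1 * np.eye(20)))    # tamed
```

HHL-style inversion has a runtime that grows with the condition number, which is why this quiz option matters.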
15.3.10 Gaussian Forward Pass
In the forward pass of a Gaussian process, we calculate the posterior given the data:
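My notes stop here, so for completeness, here is the standard posterior (a textbook result; in the quantum-assisted version, the inversion is the step handed to quantum matrix inversion). At a test point x_* with training data (X, y), kernel matrix K, and noise variance σ²,

$$
\mu_* = k_*^{\top} (K + \sigma^2 I)^{-1} y, \qquad
\sigma_*^2 = k(x_*, x_*) - k_*^{\top} (K + \sigma^2 I)^{-1} k_*,
$$

where (k_*)_i = k(x_*, x_i).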