KDD 2018: Quantum Machine Learning

It must be an exciting time for quantum computing researchers. Until a few years ago, all work in this domain consisted of building theoretical models of what would be possible once a quantum computer existed. Today, a number of companies have built and made available physical quantum computers. So far these are toy-sized proofs of concept with no practical application, but they give hope that useful devices will become available in a few years’ time, so the wider community should take notice.

Quantum Computing

The two main approaches to the architecture of a quantum computer are gate-based universal quantum computers and special-purpose annealers. The majority of players – IBM, Google, Microsoft, Rigetti – are working towards a universal device. Quantum annealers are made by D-Wave, the first company to release a commercially available quantum computer. Programming a universal device somewhat resembles classical programming: the state of the system is modified by gates, analogous to logic gates in a classical computer. Annealers work in a more exotic fashion: given a Hamiltonian describing the problem, they smoothly evolve the state of the system from an easy-to-prepare initial state towards the ground (approximately lowest-energy) state of that Hamiltonian.
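
Schematically, the annealer interpolates between an easy initial Hamiltonian \(H_0\), whose ground state is simple to prepare, and the problem Hamiltonian \(H_P\):

\[ H(s) = (1 - s)\,H_0 + s\,H_P, \qquad s: 0 \rightarrow 1. \]

If the interpolation is slow enough, the adiabatic theorem guarantees that the system stays close to the instantaneous ground state, so a final measurement samples an (approximately) lowest-energy configuration of \(H_P\).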

To give a better idea of what universal quantum computing is about, here is a nice definition based on linear algebra only, given by Maria Schuld (a great speaker, by the way) – a toy simulation following the list makes it concrete:

  1. A quantum computer is a quantum physical system of \(n\) binary subsystems (qubits) that can be measured in \(2^n\) basis states. The computer is in a superposition of these states, and a measurement yields one of them.
  2. The state of the QC is described by a \(2^n\)-dimensional amplitude vector \(a\);
    • the entries are complex numbers;
    • the probability of measuring \(i\)th basis state is \(|a_i|^2\).
  3. A quantum algorithm is a manipulation of the physical system that transforms the amplitude vector;
    • every quantum algorithm can be decomposed into elementary transformations (gates) acting on only one or two subsystems.
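
To make this concrete, here is a minimal state-vector simulation in NumPy (my own toy illustration, not from the talk): two qubits give a \(2^2 = 4\)-dimensional amplitude vector, a Hadamard and a CNOT serve as the elementary gates, and measurement probabilities are the squared moduli of the amplitudes.

    import numpy as np

    # State of n = 2 qubits: a 2^n = 4-dimensional complex amplitude vector,
    # initialised to the basis state |00>.
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0

    # Elementary gates as matrices.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on a single qubit
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])                # flips qubit 1 if qubit 0 is set

    # A two-gate "algorithm": Hadamard on qubit 0, then CNOT -> a Bell state.
    state = np.kron(H, I) @ state
    state = CNOT @ state

    # The probability of measuring basis state i is |a_i|^2.
    probs = np.abs(state) ** 2
    print(probs)   # [0.5 0. 0. 0.5] - superposition of |00> and |11>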

It is acknowledged that quantum processing units (QPUs) are only suitable for certain specialised tasks and will not replace classical CPUs (perhaps at some point the “C” will evolve to stand for “classical” rather than “central”?). Rather, it is expected that QPUs will become co-processors, like today’s GPUs – perhaps only available remotely, in the cloud.

Quantum Machine Learning

The physical quantum devices available today have fewer than 100 imperfect (non-error-corrected, so-called NISQ) qubits. To run the seminal Shor’s factorisation algorithm that breaks RSA, we would need on the order of 1,000 perfect qubits, or around 1 million imperfect ones. It should come as no surprise, then, that the applications to machine learning are also currently in the realm of theory. What are those potential future applications?

At a very high level, certain classes of algorithms applicable in machine learning are known to be asymptotically faster on a quantum processor. Firstly, optimisation, estimation and constraint-satisfaction problems, exemplified by Grover’s unstructured search algorithm, where quantum computers offer a square-root speed-up over classical algorithms (only a square root, but it can be significant in practice!). This can be applied to accelerate Monte Carlo methods. Secondly, high-dimensional systems of linear equations can be efficiently “solved” on a quantum computer, which has found theoretical application in recommender systems. Thirdly, the adiabatic algorithm (function minimisation by evolution of a quantum Hamiltonian at zero temperature) and its relative, annealing (implemented in D-Wave’s hardware), can be used for heuristics – for example, finding approximate solutions to optimisation problems.
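
To illustrate the Grover speed-up with a toy example (my own sketch, simulated classically in NumPy): finding one marked item among \(N\) unstructured items takes \(O(N)\) classical queries, but only about \(\frac{\pi}{4}\sqrt{N}\) Grover iterations, each consisting of an oracle call and a “diffusion” step (inversion about the mean).

    import numpy as np

    N = 64            # size of the search space (2^6, i.e. 6 qubits)
    marked = 42       # index of the item the oracle recognises

    # Uniform superposition over all N basis states.
    state = np.full(N, 1.0 / np.sqrt(N))

    # Grover iteration: oracle (phase flip on the marked item),
    # then inversion about the mean (the diffusion operator).
    iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N), vs ~N classically
    for _ in range(iterations):
        state[marked] *= -1                   # oracle
        state = 2 * state.mean() - state      # diffusion

    print(iterations)                  # 6 iterations for N = 64
    print(np.argmax(state ** 2))       # 42: the marked item is found
    print((state ** 2)[marked])        # measurement probability close to 1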

More specific applications of quantum computers to machine learning are speculative, but the results are often theoretically proven. It turns out that translating standard neural networks to quantum mechanics is hard; on the other hand, restricted Boltzmann machines map naturally to the D-Wave-style annealer. Another construct that translates very well, this time to a universal quantum device, is kernel computation. A setup where the bulk of the learning algorithm runs on a CPU, but the kernel computation is handed off to a QPU, would (in theory) be relatively simple to implement. Further in the realm of the hypothetical are GANs: it turns out that if the data, generator and discriminator are all quantum, then the classical GAN property of the generator learning the distribution of the data at the Nash equilibrium still holds; however, a classical generator will not be able to fool a quantum-enabled discriminator if the data is quantum in nature – there are distributions a quantum device can sample from that a classical device cannot.
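
Here is a sketch of what that hybrid kernel setup could look like, with the QPU replaced by a NumPy simulation of a toy “angle encoding” feature map (the feature map and dataset are my own illustrative choices, not from the talks): the kernel \(k(x, x') = |\langle \phi(x) | \phi(x') \rangle|^2\) would be estimated on the quantum device, while scikit-learn’s SVM runs on the CPU, consuming the precomputed Gram matrix.

    import numpy as np
    from sklearn.svm import SVC

    def feature_state(x):
        # Toy "angle encoding": each feature x_j becomes a one-qubit state
        # (cos x_j, sin x_j); the full state is their tensor product.
        state = np.array([1.0])
        for xj in x:
            state = np.kron(state, np.array([np.cos(xj), np.sin(xj)]))
        return state

    def quantum_kernel(X1, X2):
        # k(x, x') = |<phi(x)|phi(x')>|^2 -- the overlap a QPU would estimate.
        S1 = np.array([feature_state(x) for x in X1])
        S2 = np.array([feature_state(x) for x in X2])
        return np.abs(S1 @ S2.T) ** 2

    # Tiny synthetic dataset; only the Gram matrix is handed to the SVM.
    rng = np.random.default_rng(0)
    X = rng.uniform(0, np.pi, size=(40, 2))
    y = (np.sin(X[:, 0]) * np.sin(X[:, 1]) > 0.5).astype(int)

    clf = SVC(kernel="precomputed").fit(quantum_kernel(X, X), y)
    print(clf.score(quantum_kernel(X, X), y))   # training accuracy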

In a rare example of a practical experiment, Piotr Gawron presented work in which a Markov random field was applied to image segmentation. Inference in the four-level Potts Markov random field model is NP-hard, but it could be computed efficiently on the D-Wave annealer. The trick was mapping the structure of the problem – image pixels – onto the topology of D-Wave’s qubit network, the Chimera graph. The results were encouraging: first, the approach worked; second, the D-Wave execution took a few seconds while a simulation needed minutes. The simulation was naive, and a highly optimised classical algorithm would likely beat the annealer at the current stage, but the asymptotic advantage of the quantum architecture means that this classical supremacy’s lifetime is limited.
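
To give a flavour of the formulation (my own rough reconstruction, simplified to a binary Ising model, with classical simulated annealing standing in for the D-Wave hardware): each pixel becomes a spin, a per-pixel field pulls the spin towards the pixel’s intensity, and couplings between neighbouring pixels encourage smooth segments.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy noisy image: a bright square on a dark background.
    img = np.zeros((16, 16)); img[4:12, 4:12] = 1.0
    img += rng.normal(0, 0.4, img.shape)

    h = 1.0 - 2.0 * img   # data term: field pulling each spin towards its pixel
    J = 1.0               # smoothness: neighbouring spins prefer to agree
    spins = rng.choice([-1, 1], img.shape)

    def local_energy(s, i, j):
        # Energy contribution of spin (i, j): field plus neighbour couplings.
        nb = sum(s[a, b] for a, b in [(i-1, j), (i+1, j), (i, j-1), (i, j+1)]
                 if 0 <= a < s.shape[0] and 0 <= b < s.shape[1])
        return s[i, j] * (h[i, j] - J * nb)

    # Simulated annealing: Metropolis acceptance at a falling temperature.
    for T in np.geomspace(3.0, 0.05, 200):
        for _ in range(img.size):
            i, j = rng.integers(0, img.shape[0]), rng.integers(0, img.shape[1])
            dE = -2.0 * local_energy(spins, i, j)   # energy change if flipped
            if dE < 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] *= -1

    print((spins > 0).astype(int))   # recovered segmentation mask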

In conclusion, the experts agree: quantum machine learning can be done, but it will be some years – perhaps five, perhaps more – before hardware is developed that makes it practical.

Workshop website: https://www.quantummachinelearning.org/qmlkdd2018.html

21/08/2018