- There are attempts to use quantum algorithms such as the HHL algorithm to speed up specific, computationally hard steps of the learning process. These attempts suffer from a number of drawbacks and only provide solutions in specific cases. However, I believe the majority of the classical machine learning machinery and intuition still applies, since the bulk of the algorithm is unmodified.
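For context, HHL addresses solving a sparse linear system $Ax = b$ (a subroutine in least-squares fits, kernel methods, etc.). The headline comparison, as I recall it from the original paper and modulo significant caveats about state preparation and readout (which are among the drawbacks I mean), is roughly:

```latex
% Classical conjugate gradient vs. HHL for an s-sparse N x N system
% with condition number \kappa, to precision \epsilon:
T_{\mathrm{CG}} = O\!\left(N s \kappa \log(1/\epsilon)\right),
\qquad
T_{\mathrm{HHL}} = \tilde{O}\!\left(\log(N)\, s^2 \kappa^2 / \epsilon\right)
```

The exponential advantage in $N$ only holds if you can prepare $|b\rangle$ efficiently and only need expectation values of the solution, not the full vector $x$.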
- Researchers have attempted to reformulate traditional machine learning algorithms in quantum terms. This is along the lines of the approach mentioned in the article you linked above. In my opinion this is a very interesting approach. It does not necessarily require a traditional gate-based quantum computer to implement, and the theoretical description is shockingly simple. The authors came up with the QBM by simply replacing the energy-based activation units in the classical Boltzmann machine with quantum "spins" (qubits) with controllable couplings undergoing evolution in a transverse field. Unfortunately, at this point in time there are only small-scale simulations demonstrating the improvement of the QBM. I believe the only way to demonstrate an overall superiority would be to run the algorithm on a large-scale quantum computer. In my opinion, the recent demonstrations on the D-Wave machine suffered from serious drawbacks. However, the QBM is well suited to an adiabatic quantum computer and could perform better on an improved machine.
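To make the "shockingly simple" description concrete, here is a minimal sketch (my own toy construction, not the authors' code) of the transverse-field Ising Hamiltonian underlying the QBM, built by exact Kronecker products for a few qubits, together with the Gibbs state the machine samples from:

```python
import numpy as np

# Pauli matrices and identity
I = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def op_on(site_op, site, n):
    """Embed a single-qubit operator on `site` into the n-qubit space."""
    out = np.array([[1.]])
    for k in range(n):
        out = np.kron(out, site_op if k == site else I)
    return out

def qbm_hamiltonian(n, gamma, b, w):
    """H = -sum_i gamma_i X_i - sum_i b_i Z_i - sum_{i<j} w_ij Z_i Z_j.

    The classical Boltzmann machine is the gamma = 0 special case;
    the transverse field gamma makes the units genuinely quantum.
    """
    H = np.zeros((2 ** n, 2 ** n))
    for i in range(n):
        H -= gamma[i] * op_on(X, i, n)
        H -= b[i] * op_on(Z, i, n)
        for j in range(i + 1, n):
            H -= w[i, j] * op_on(Z, i, n) @ op_on(Z, j, n)
    return H

# toy example: 3 qubits, random biases and couplings (illustrative values)
rng = np.random.default_rng(0)
n = 3
gamma = np.full(n, 1.0)
b = rng.normal(size=n)
w = rng.normal(size=(n, n))
H = qbm_hamiltonian(n, gamma, b, w)

# the QBM's model distribution is the thermal (Gibbs) state of H
beta = 1.0
evals, evecs = np.linalg.eigh(H)
p = np.exp(-beta * evals)
rho = (evecs * (p / p.sum())) @ evecs.conj().T  # rho = e^{-beta H} / Z
print(np.trace(rho).real)  # -> 1.0
```

Training then adjusts `b` and `w` (and possibly `gamma`) so the diagonal of `rho` matches the data distribution; that is exactly the step an annealer is supposed to accelerate by physically sampling the Gibbs state.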
- Finally, computational tools developed by physicists for the classical simulation of quantum systems have recently shown a lot of interesting results when applied to machine learning. In this recent paper, the authors demonstrate how tensor networks (specifically matrix product states) may be applied to supervised learning. They achieve a very respectable score on the MNIST dataset, and there appear to be many avenues for improvement. In many ways their work is a generalization of kernel learning. Some nice features are that it allows on-demand optimization of the underlying network representation, and that by examining the network one can obtain an intuition for the learning process.
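The basic idea is easy to sketch: each pixel is lifted by a small local feature map, and the resulting product of feature vectors is contracted against an MPS whose tensors are the learned weights. The sketch below uses random tensors and toy dimensions (`D`, `n_pixels`, `n_classes` are my illustrative choices, not values from the paper) just to show the contraction:

```python
import numpy as np

def feature_map(x):
    """Local feature map for a pixel value x in [0, 1]
    (the cos/sin map used in the MPS supervised-learning paper)."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def mps_scores(pixels, tensors, label_tensor):
    """Contract pixel feature vectors through an MPS, left to right.

    tensors[0]:   shape (2, D)       -- left boundary
    tensors[k>0]: shape (2, D, D)    -- bulk sites
    label_tensor: shape (D, n_classes) -- read out class scores
    """
    v = feature_map(pixels[0]) @ tensors[0]        # bond vector, shape (D,)
    for x, A in zip(pixels[1:], tensors[1:]):
        phi = feature_map(x)                       # shape (2,)
        v = np.einsum('a,s,sab->b', v, phi, A)     # absorb one site
    return v @ label_tensor                        # shape (n_classes,)

# toy "image" and random (untrained) MPS weights
rng = np.random.default_rng(0)
n_pixels, D, n_classes = 4, 3, 10
tensors = [rng.normal(size=(2, D))] + \
          [rng.normal(size=(2, D, D)) for _ in range(n_pixels - 1)]
label_tensor = rng.normal(size=(D, n_classes))
image = rng.uniform(size=n_pixels)
scores = mps_scores(image, tensors, label_tensor)
print(scores.shape)  # -> (10,)
```

The kernel-learning connection is visible here: the model is linear in the (exponentially large) tensor product of local feature vectors, and the MPS is a compressed parameterization of that linear map. The bond dimension `D` is what can be adapted on demand during training.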
Application of Quantum Annealing to Training of Deep Neural Networks