Future quantum computers with machine learning could attack larger sets of data than classical computers
July 31, 2013
Seth Lloyd of MIT and his collaborators have developed a quantum version of machine learning — a type of AI in which programs can learn from previous experience to become progressively better at finding patterns in data. It would take advantage of quantum computations to speed up machine-learning tasks exponentially, Nature News reports.
Data can be split into groups — a task that is at the core of handwriting- and speech-recognition software — or can be searched for patterns. Because n qubits can represent 2^n states simultaneously, massive amounts of information could be manipulated with a relatively small number of qubits.
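Splitting data into groups is the classical clustering problem. A minimal sketch of that task in plain Python (the 1-D data points and starting centroids below are illustrative assumptions, not from the papers) shows the kind of computation the quantum algorithms aim to speed up:

```python
# Minimal classical 2-means clustering in one dimension: repeatedly
# assign each point to its nearest centroid, then move each centroid
# to the mean of its assigned points.
def two_means(points, c0, c1, iterations=10):
    for _ in range(iterations):
        group0 = [p for p in points if abs(p - c0) <= abs(p - c1)]
        group1 = [p for p in points if abs(p - c0) > abs(p - c1)]
        c0 = sum(group0) / len(group0)
        c1 = sum(group1) / len(group1)
    return c0, c1

# Two obvious clusters around 1.0 and 9.0 (illustrative data)
points = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]
print(two_means(points, 0.0, 5.0))  # centroids settle near 1.0 and 9.07
```

Classically, each iteration touches every data point; the quantum proposals encode the data in quantum states so that distance estimates can be obtained without examining each point individually.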
“We could map the whole Universe — all of the information that has existed since the Big Bang — onto 300 qubits,” Lloyd says.
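The figure rests on the fact that n qubits span a 2^n-dimensional state space. A quick arithmetic check, taking a commonly quoted rough estimate of about 10^90 bits for the information content of the observable universe (an assumption made here for illustration):

```python
# n qubits span a 2**n-dimensional state space, so 300 qubits can
# index 2**300 basis states at once.
n_qubits = 300
dimension = 2 ** n_qubits

# Rough order-of-magnitude estimate of the information content of
# the observable universe (assumption for illustration only).
universe_bits = 10 ** 90

print(dimension > universe_bits)  # True: 2**300 is roughly 2e90
```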
Such quantum AI techniques could dramatically speed up tasks such as image recognition for comparing photos on the web or for enabling cars to drive themselves — fields in which companies such as Google have invested considerable resources.
Putting quantum machine learning into practice, however, will be more difficult. Lloyd estimates that a dozen qubits would be needed even for a small-scale demonstration.
The ideas are explored in a series of five (open-access) arXiv papers. [...]
- Patrick Rebentrost, Masoud Mohseni, Seth Lloyd, Quantum support vector machine for big feature and big data classification, arXiv, 2013, arxiv.org/abs/1307.0471
- Seth Lloyd, Masoud Mohseni, Patrick Rebentrost, Quantum algorithms for supervised and unsupervised machine learning, arXiv, 2013, arxiv.org/abs/1307.0411
- Seth Lloyd, Masoud Mohseni, Patrick Rebentrost, Quantum self analysis, arXiv, 2013, arxiv.org/abs/1307.0401
- Stefanie Barz et al., Solving systems of linear equations on a quantum computer, arXiv, 2013, arxiv.org/abs/1302.1210
- Jian Pan et al., Experimental realization of quantum algorithm for solving linear systems of equations, arXiv, 2013, arxiv.org/abs/1302.1946