Future quantum computers with machine learning could attack larger sets of data than classical computers

July 31, 2013


Seth Lloyd of MIT and his collaborators have developed a quantum version of machine learning, a form of AI in which programs learn from previous experience to become progressively better at finding patterns in data. The approach would exploit quantum computation to speed up machine-learning tasks exponentially, Nature News reports.

Data can be split into groups, a task at the core of handwriting- and speech-recognition software, or searched for patterns. Because the state of n qubits is described by 2^n amplitudes, a data set of comparable size can in principle be encoded in just n qubits, so massive amounts of information could be manipulated with a relatively small number of qubits.
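For a concrete sense of the grouping task involved, here is a minimal classical sketch in the spirit of k-means clustering; the toy two-blob data, the choice of k-means, and the plain-NumPy implementation are illustrative assumptions, not the quantum routine described in the papers.

```python
# Illustrative classical k-means clustering: split points into k groups.
# A conventional sketch of the task, not Lloyd's quantum algorithm.
import numpy as np

def kmeans(points, k, iterations=20, seed=0):
    """Alternate between assigning points to the nearest center and re-averaging centers."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iterations):
        # Distance from every point to every center, then nearest-center labels.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# Toy data: two Gaussian blobs standing in for feature vectors
# (e.g., digitized handwriting samples).
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(5.0, 1.0, (50, 2))])
labels, centers = kmeans(data, k=2)
print(centers)  # two cluster centers, near (0, 0) and (5, 5)
```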

“We could map the whole Universe — all of the information that has existed since the Big Bang — onto 300 qubits,” Lloyd says.
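The arithmetic behind that claim is that n qubits carry 2^n complex amplitudes, so a classical data vector of length 2^n can in principle be stored in the amplitudes of just n qubits, and 300 qubits index about 2 × 10^90 of them. Below is a minimal sketch of that kind of amplitude encoding; the example vector and the rough ~10^90-bit figure for the information content of the observable universe are assumptions used only for scale.

```python
# Back-of-the-envelope illustration of amplitude encoding and the 300-qubit figure.
# The ~1e90-bit "information content of the Universe" value is a rough, assumed estimate.
import math
import numpy as np

def amplitude_encode(vector):
    """Pack a length-2**n classical vector into the 2**n amplitudes of an n-qubit state."""
    vector = np.asarray(vector, dtype=float)
    n = int(round(math.log2(len(vector))))
    if 2 ** n != len(vector):
        raise ValueError("vector length must be a power of two")
    state = vector / np.linalg.norm(vector)  # amplitudes must square-sum to 1
    return n, state

# An 8-element data vector fits in the amplitudes of just 3 qubits.
n, state = amplitude_encode([1, 2, 3, 4, 5, 6, 7, 8])
print(n, state)

# 300 qubits index 2**300 amplitudes -- about 2e90, exceeding a rough 1e90-bit
# estimate of all the information registered in the observable universe.
print(f"2**300 = {2**300:.3e}")
```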

Such quantum AI techniques could dramatically speed up tasks such as image recognition for comparing photos on the web or for enabling cars to drive themselves — fields in which companies such as Google have invested considerable resources.

Putting quantum machine learning into practice, however, will be more difficult. Lloyd estimates that a dozen qubits would be needed for a small-scale demonstration.

The ideas are explored in a series of five (open-access) arXiv papers. […]

[more]