Boltzmann machine learning with a variational quantum algorithm

Yuta Shingu, Yuya Seki, Shohei Watabe, Suguru Endo, Yuichiro Matsuzaki, Shiro Kawabata, Tetsuro Nikuni, Hideaki Hakoshima

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

A Boltzmann machine is a powerful tool for modeling the probability distribution that governs the training data. A thermal equilibrium state is typically used in Boltzmann machine learning to obtain a suitable probability distribution. Boltzmann machine learning requires calculating the gradient of the loss function, which is given in terms of thermal averages; this is the most time-consuming procedure. Here, we propose a method to implement Boltzmann machine learning using noisy intermediate-scale quantum devices. We prepare an initial pure state that contains all possible computational basis states with the same amplitude, and we apply a variational imaginary-time simulation. Reading out the evolved state in the computational basis approximates the probability distribution of the thermal equilibrium state used for Boltzmann machine learning. We perform numerical simulations of our scheme and confirm that the Boltzmann machine learning works well. Our scheme is a significant step toward efficient machine learning on quantum hardware.
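The core idea above can be illustrated numerically. The toy model below (a hypothetical example, not the authors' variational circuit: the couplings, system size, and the use of exact rather than variational imaginary-time evolution are all illustrative assumptions) shows that evolving a uniform superposition under e^(-βH/2) with a diagonal Ising Hamiltonian and then measuring in the computational basis yields exactly the Boltzmann distribution e^(-βE)/Z:

```python
import numpy as np

n = 3                       # number of spins (illustrative)
beta = 1.0                  # inverse temperature
rng = np.random.default_rng(0)
J = np.triu(rng.normal(size=(n, n)), 1)   # random upper-triangular couplings

# Enumerate all 2^n spin configurations s in {-1, +1}^n
states = np.array([[1 - 2 * ((k >> i) & 1) for i in range(n)]
                   for k in range(2 ** n)])
# Diagonal Ising energies E(s) = s^T J s
E = np.array([s @ J @ s for s in states])

# Start from |+>^n: every computational basis state with equal amplitude ...
amp = np.ones(2 ** n) / np.sqrt(2 ** n)
# ... apply imaginary-time evolution e^{-beta H / 2}, then renormalize.
amp *= np.exp(-beta * E / 2)
amp /= np.linalg.norm(amp)

# Measurement probabilities |amp|^2 reproduce the Boltzmann weights.
p_measured = amp ** 2
p_boltzmann = np.exp(-beta * E) / np.exp(-beta * E).sum()
assert np.allclose(p_measured, p_boltzmann)
```

On quantum hardware the evolution e^(-βH/2) is not unitary, which is why the paper approximates it with a variational imaginary-time simulation on a parametrized circuit rather than applying it directly.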

Original language: English
Article number: 032413
Journal: Physical Review A
Volume: 104
Issue number: 3
DOIs
Publication status: Published - 2021 Sept
Externally published: Yes

ASJC Scopus subject areas

  • Atomic and Molecular Physics, and Optics
