
Neural network quantum states

Article snapshot taken from Wikipedia, licensed under the Creative Commons Attribution-ShareAlike license.

Neural Network Quantum States (NQS or NNQS) is a general class of variational quantum states parameterized in terms of an artificial neural network. It was first introduced in 2017 by the physicists Giuseppe Carleo and Matthias Troyer [1] to approximate wave functions of many-body quantum systems.

Given a many-body quantum state $|\Psi\rangle$ comprising $N$ degrees of freedom and a choice of associated quantum numbers $s_1 \ldots s_N$, an NQS parameterizes the wave-function amplitudes

$$\langle s_1 \ldots s_N | \Psi; W \rangle = F(s_1 \ldots s_N; W),$$

where $F(s_1 \ldots s_N; W)$ is an artificial neural network with parameters (weights) $W$, $N$ input variables ($s_1 \ldots s_N$), and one complex-valued output corresponding to the wave-function amplitude.
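As a concrete illustration, the following is a minimal sketch in Python (with NumPy) of such a parameterization, using the restricted-Boltzmann-machine (RBM) architecture of the original Carleo-Troyer work [1]. The function name, log-space convention, and array shapes are illustrative choices, not a fixed standard.

```python
import numpy as np

def log_amplitude(s, a, b, W):
    """Return log F(s; W) for an RBM-style NQS with complex parameters.

    s : configuration of N spins, entries +1 or -1
    a : visible biases, complex array of shape (N,)
    b : hidden biases, complex array of shape (M,)
    W : weights, complex array of shape (M, N)
    """
    theta = b + W @ s  # effective angles of the hidden units
    # F(s) = exp(a . s) * prod_j 2 cosh(theta_j); log-space avoids overflow
    return a @ s + np.sum(np.log(2.0 * np.cosh(theta)))
```

Working with $\log F$ rather than $F$ keeps amplitude ratios and derivatives numerically stable, which is convenient for the sampling and gradient estimates discussed below.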

This variational form is used in conjunction with specific stochastic learning approaches to approximate quantum states of interest.

Learning the Ground-State Wave Function

One common application of NQS is to find an approximate representation of the ground-state wave function of a given Hamiltonian $\hat{H}$. The learning procedure in this case consists of finding the best neural-network weights that minimize the variational energy

$$E(W) = \langle \Psi; W | \hat{H} | \Psi; W \rangle.$$

Since, for a general artificial neural network, computing this expectation value is an exponentially costly operation in $N$, stochastic techniques based, for example, on the Monte Carlo method are used to estimate $E(W)$, analogously to what is done in variational Monte Carlo (see, for example, [2] for a review). More specifically, a set of $M$ samples $S^{(1)}, S^{(2)}, \ldots, S^{(M)}$, with $S^{(i)} = s_1^{(i)} \ldots s_N^{(i)}$, is generated such that the samples are distributed according to the Born probability density $P(S) \propto |F(s_1 \ldots s_N; W)|^2$. It can then be shown that the sample mean of the so-called "local energy" $E_{\mathrm{loc}}(S) = \langle S|\hat{H}|\Psi\rangle / \langle S|\Psi\rangle$ is a statistical estimate of the quantum expectation value $E(W)$, i.e.

$$E(W) \simeq \frac{1}{M} \sum_{i=1}^{M} E_{\mathrm{loc}}(S^{(i)}).$$
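A sketch of this estimator, continuing the RBM example above, is given below. The single-spin-flip Metropolis sampler and the one-dimensional transverse-field Ising Hamiltonian are illustrative assumptions; any sampler producing Born-distributed configurations and any Hamiltonian with efficiently computable matrix elements would do.

```python
import numpy as np

def metropolis_samples(log_psi, N, M, rng, n_flips=200):
    """Draw M spin configurations distributed according to the Born
    density |F(s; W)|^2, using single-spin-flip Metropolis moves.
    n_flips is the (illustrative) number of decorrelation moves per sample."""
    s = rng.choice([-1, 1], size=N)
    samples = []
    for _ in range(M):
        for _ in range(n_flips):
            i = rng.integers(N)
            s_new = s.copy()
            s_new[i] = -s_new[i]
            # acceptance probability min(1, |F(s')/F(s)|^2)
            if rng.random() < np.exp(2.0 * (log_psi(s_new) - log_psi(s)).real):
                s = s_new
        samples.append(s.copy())
    return samples

def local_energy(s, log_psi, J=1.0, h=1.0):
    """E_loc(S) = <S|H|Psi> / <S|Psi> for the 1D transverse-field Ising
    chain H = -J sum_i s_i s_{i+1} - h sum_i sigma^x_i (periodic boundaries),
    chosen here purely as an example Hamiltonian."""
    diag = -J * np.sum(s * np.roll(s, -1))  # diagonal (classical) part
    lp = log_psi(s)
    off = 0.0
    for i in range(len(s)):  # each sigma^x_i connects S to one flipped state
        s_flip = s.copy()
        s_flip[i] = -s_flip[i]
        off += -h * np.exp(log_psi(s_flip) - lp)  # amplitude ratio F(s')/F(s)
    return diag + off
```

With these pieces, $E(W)$ is estimated as the sample mean `np.mean([local_energy(s, log_psi) for s in samples])`.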

Similarly, it can be shown that the gradient of the energy with respect to the network weights $W$ is also approximated by a sample mean

$$\frac{\partial E(W)}{\partial W_k} \simeq \frac{1}{M} \sum_{i=1}^{M} \left( E_{\mathrm{loc}}(S^{(i)}) - E(W) \right) O_k^{\star}(S^{(i)}),$$

where $O_k(S^{(i)}) = \frac{\partial \log F(S^{(i)}; W)}{\partial W_k}$ is the logarithmic derivative of the network output with respect to the $k$-th weight, which can be computed efficiently; in deep networks this is done through backpropagation.
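For the RBM ansatz sketched above, the log-derivatives $O_k$ have simple closed forms, so the gradient estimator can be written down directly, as in the following sketch; for a deep network the same quantities would instead come from backpropagation. The flattened parameter layout is an illustrative choice.

```python
import numpy as np

def log_derivatives(s, a, b, W):
    """O_k(S) = d log F(S; W) / d W_k for the RBM ansatz, flattened into
    one vector: d/da_i = s_i, d/db_j = tanh(theta_j),
    d/dW_ji = tanh(theta_j) * s_i."""
    t = np.tanh(b + W @ s)
    return np.concatenate([s.astype(complex), t, np.outer(t, s).ravel()])

def energy_gradient(e_locs, O):
    """Sample-mean gradient estimate following the formula above:
    dE/dW_k ~ (1/M) sum_i (E_loc(S^(i)) - E) O_k*(S^(i)).
    e_locs has shape (M,); O has shape (M, n_params)."""
    return np.mean((e_locs - np.mean(e_locs))[:, None] * np.conj(O), axis=0)
```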

The stochastic approximation of the gradients is then used to minimize the energy $E(W)$, typically with a stochastic gradient descent approach. Each time the neural-network parameters are updated, a new set of samples $S^{(i)}$ is generated, in an iterative procedure similar to what is done in unsupervised learning.
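Putting the previous sketches together, one such training loop could look as follows. This is a deliberately simplified sketch: the learning rate, system and sample sizes, and the plain SGD update on complex parameters are illustrative assumptions; in practice more refined optimizers, such as stochastic reconfiguration, are commonly used.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_hidden, n_samples, lr = 10, 20, 500, 0.02  # illustrative sizes

# small random complex initialization of the RBM parameters
init = lambda *shape: 0.01 * (rng.standard_normal(shape)
                              + 1j * rng.standard_normal(shape))
a, b, W = init(N), init(n_hidden), init(n_hidden, N)

for step in range(100):
    log_psi = lambda s: log_amplitude(s, a, b, W)
    # draw a fresh set of Born-distributed samples after every update
    samples = metropolis_samples(log_psi, N, n_samples, rng)
    e_locs = np.array([local_energy(s, log_psi) for s in samples])
    O = np.array([log_derivatives(s, a, b, W) for s in samples])
    grad = energy_gradient(e_locs, O)
    # unpack the flattened gradient and take a plain SGD step
    a -= lr * grad[:N]
    b -= lr * grad[N:N + n_hidden]
    W -= lr * grad[N + n_hidden:].reshape(n_hidden, N)
    print(step, np.mean(e_locs).real)  # running variational energy estimate
```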

Connection with Tensor Networks

Neural-network representations of quantum wave functions share some similarities with variational quantum states based on tensor networks. For example, connections with matrix product states have been established [3]. These studies have shown that NQS support volume-law scaling of the entanglement entropy. In general, an NQS with fully connected weights corresponds, in the worst case, to a matrix product state with a bond dimension that is exponentially large in $N$.

References

  1. Carleo, Giuseppe; Troyer, Matthias (2017). "Solving the quantum many-body problem with artificial neural networks". Science. 355 (6325): 602–606. arXiv:1606.02318. Bibcode:2017Sci...355..602C. doi:10.1126/science.aag2302. PMID 28183973. S2CID 206651104.
  2. Becca, Federico; Sorella, Sandro (2017). Quantum Monte Carlo Approaches for Correlated Systems. Cambridge University Press. Bibcode:2017qmca.book.....B. doi:10.1017/9781316417041. ISBN 9781316417041.
  3. Chen, Jing; Cheng, Song; Xie, Haidong; Wang, Lei; Xiang, Tao (2018). "Equivalence of restricted Boltzmann machines and tensor network states". Phys. Rev. B. 97 (8): 085104. arXiv:1701.04831. Bibcode:2018PhRvB..97h5104C. doi:10.1103/PhysRevB.97.085104. S2CID 73659611.