
Tensor Network Message Passing

Authors:
Wang, Yijia
Zhang, Yuwen Ebony
Pan, Feng
Zhang, Pan
Publication Year:
2023

Abstract

When studying interacting systems, computing their statistical properties is a fundamental problem in various fields such as physics, applied mathematics, and machine learning. However, this task can be quite challenging due to the exponential growth of the state space as the system size increases. Many standard methods have significant weaknesses: message-passing algorithms can be inaccurate and even fail to converge due to short loops, while tensor network methods can have exponential computational complexity on large graphs due to long loops. This work proposes a new method called "tensor network message passing." This approach allows us to compute local observables such as marginal probabilities and correlations by combining the strength of tensor networks in contracting small subgraphs with many short loops and the strength of message-passing methods on globally sparse graphs, thus addressing the crucial weaknesses of both approaches. Our algorithm is exact for systems that are globally tree-like and locally densely connected, provided the dense local subgraphs have limited treewidth. We conduct numerical experiments on synthetic and real-world graphs to compute magnetizations of Ising models and spin glasses, demonstrating the superiority of our approach over standard belief propagation and the recently proposed loopy message-passing algorithm. In addition, we discuss potential applications of our method to inference problems in networks, combinatorial optimization problems, and decoding problems in quantum error correction.
Comment: Comments are welcome!
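
For reference, below is a minimal sketch of the standard belief propagation baseline the abstract compares against: cavity-field message passing for Ising magnetizations on a sparse graph. The function name, the data layout (couplings J keyed by edge, fields h keyed by node), and the simple asynchronous sweep schedule are illustrative assumptions, not the paper's implementation; roughly speaking, the proposed tensor network message passing replaces each single-node update with an exact tensor-network contraction over a node's dense local neighbourhood.

```python
import math

def bp_ising_magnetizations(J, h, beta, iters=200, tol=1e-10):
    """Standard belief propagation (cavity method) for Ising magnetizations.

    J    : dict mapping frozenset({i, j}) -> coupling J_ij
    h    : dict mapping node i -> external field h_i
    beta : inverse temperature
    Returns a dict mapping node i -> magnetization <s_i>.
    (Illustrative baseline sketch, not the paper's tensor-network algorithm.)
    """
    # Build neighbor lists from the coupling dictionary.
    nbrs = {i: set() for i in h}
    for e in J:
        i, j = tuple(e)
        nbrs[i].add(j)
        nbrs[j].add(i)

    # Cavity messages m[(i, j)]: effective field node i passes to neighbor j.
    m = {(i, j): 0.0 for i in nbrs for j in nbrs[i]}

    for _ in range(iters):
        delta = 0.0
        for (i, j), old in list(m.items()):
            # Cavity field at i excluding the contribution coming back from j.
            cavity = h[i] + sum(m[(k, i)] for k in nbrs[i] if k != j)
            new = math.atanh(math.tanh(beta * J[frozenset((i, j))])
                             * math.tanh(beta * cavity)) / beta
            delta = max(delta, abs(new - old))
            m[(i, j)] = new
        if delta < tol:
            break

    # Marginal magnetization from the full local field at each node.
    return {i: math.tanh(beta * (h[i] + sum(m[(k, i)] for k in nbrs[i])))
            for i in nbrs}
```

On a tree this iteration converges to the exact marginals; on graphs with many short loops it is precisely where, as the abstract notes, plain belief propagation becomes inaccurate or fails to converge, which is the regime tensor network message passing targets.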

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2305.01874
Document Type:
Working Paper