p-Adic statistical field theory and convolutional deep Boltzmann machines
W A Zúñiga-Galindo (University of Texas Rio Grande Valley, School of Mathematical and Statistical Sciences, One West University Blvd., Brownsville, TX 78520, USA); C He (Oklahoma State University, Department of Mathematics, MSCS 425, Stillwater, OK, USA); B A Zambrano-Luna (University of Texas Rio Grande Valley, School of Mathematical and Statistical Sciences, One West University Blvd., Brownsville, TX 78520, USA)
Abstract Understanding how deep learning architectures work is a central scientific problem. Recently, a correspondence between neural networks (NNs) and Euclidean quantum field theories has been proposed. This work investigates this correspondence in the framework of p-adic statistical field theories (SFTs) and neural networks. In this case, the fields are real-valued functions defined on an infinite regular rooted tree with valence p, a fixed prime number. This infinite tree provides the topology for a continuous deep Boltzmann machine (DBM), which is identified with a statistical field theory on this infinite tree. In the p-adic framework, there is a natural method to discretize SFTs. Each discrete SFT corresponds to a Boltzmann machine with a tree-like topology. This method allows us to recover the standard DBMs and gives rise to new convolutional DBMs. The new networks use $O(N)$ parameters while the classical ones use $O(N^{2})$ parameters.
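To make the parameter-count claim concrete, the following is a minimal illustrative sketch, not the authors' construction: it compares a dense (fully connected) DBM coupling, which stores an $O(N^{2})$ weight matrix, with a sparse coupling wired along a rooted tree of valence p, where each hidden unit connects only to its p children and the layer therefore stores $O(N)$ weights. The function names, layer sizes, and the specific wiring are assumptions made for illustration only.

```python
import numpy as np

def dense_coupling(n_visible, n_hidden, rng):
    """Fully connected coupling: n_visible * n_hidden parameters, i.e. O(N^2)."""
    return rng.normal(size=(n_hidden, n_visible))

def tree_coupling(n_hidden, p, rng):
    """
    Illustrative tree-structured coupling (an assumption, not the paper's exact
    discretization): each hidden unit is wired only to its p children in the
    layer below, so the layer has p * n_hidden nonzero weights, i.e. O(N).
    """
    n_visible = p * n_hidden
    W = np.zeros((n_hidden, n_visible))
    for h in range(n_hidden):
        # hidden unit h couples only to its block of p child units
        W[h, p * h:p * (h + 1)] = rng.normal(size=p)
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, n_hidden = 3, 27                     # toy sizes; n_visible = p * n_hidden = 81
    W_dense = dense_coupling(p * n_hidden, n_hidden, rng)
    W_tree = tree_coupling(n_hidden, p, rng)
    print("dense parameters:", W_dense.size)              # 81 * 27 = 2187
    print("tree parameters:", np.count_nonzero(W_tree))   # 3 * 27 = 81
```

The toy comparison only illustrates the scaling gap stated in the abstract; the paper's convolutional DBMs arise from discretizing a p-adic SFT, not from this ad hoc sparsity pattern.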