
Gibbs Measures from Deep Shaped Multilayer Perceptrons

Phys Rev Lett. 2026 Feb 13;136(6):067301. doi: 10.1103/xm8d-v66z.

ABSTRACT

We develop a diagrammatic approach to analyzing Gibbs measures (i.e., Bayesian posteriors) in deep shaped multilayer perceptrons at arbitrary temperature. This gives the first (perturbatively) solvable model of learning with nonlinear neural networks in which the input dimension N_{0}, depth L, width N, and number of samples P can all be large, without severe assumptions on either the initialization or the training-data statistics. The limits N_{0},N,L,P→∞ do not commute, resulting in a rich phase diagram of learning regimes that we study to first order in 1/N. In particular, we find that the ratio LP/N defines a critical depth necessary for feature learning: if LP/N→0, then Bayesian posteriors coincide with those of a kernel method. Regimes where LP/N→λ>0, in contrast, correspond to learning with data-dependent deformations of these kernels, and we provide explicit formulas for the resulting features to first order in 1/N.
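The scaling behind the critical ratio can be made concrete with a small sketch (illustrative only, not from the paper): with samples P fixed, growing the width N at fixed depth L sends λ = LP/N → 0 (the kernel regime), while growing the depth proportionally to the width keeps λ at a positive constant (the feature-learning regime).

```python
# Illustrative sketch of the LP/N criterion from the abstract.
# The specific values of L, P, N below are arbitrary choices for demonstration.

def lam(L: int, P: int, N: int) -> float:
    """Ratio lambda = L*P/N that separates the learning regimes."""
    return L * P / N

P = 100  # number of samples (held fixed)

# Fixed depth L = 10, growing width N: lambda -> 0, i.e. the kernel regime.
kernel_like = [lam(10, P, N) for N in (1_000, 10_000, 100_000)]

# Depth growing proportionally to width (L = N/100): lambda stays at a
# constant lambda = 1 > 0, i.e. the feature-learning regime.
feature_like = [lam(N // 100, P, N) for N in (1_000, 10_000, 100_000)]

print(kernel_like)   # ratio shrinks toward zero
print(feature_like)  # ratio stays pinned at a positive constant
```

The point of the sketch is only that the same widths yield opposite limits of λ depending on how the depth scales, matching the abstract's claim that the limits do not commute.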

PMID:41765812 | DOI:10.1103/xm8d-v66z

By Nevin Manimala
