
Statistical Mechanics of Transfer Learning in Fully Connected Networks in the Proportional Limit

Phys Rev Lett. 2025 May 2;134(17):177301. doi: 10.1103/PhysRevLett.134.177301.

ABSTRACT

Transfer learning (TL) is a well-established machine learning technique for boosting the generalization performance on a specific (target) task using information gained from a related (source) task, and it crucially depends on the ability of a network to learn useful features. Leveraging recent analytical progress in the proportional regime of deep learning theory (i.e., the limit where the size of the training set P and the size of the hidden layers N are taken to infinity while keeping their ratio α=P/N finite), in this Letter we develop a novel single-instance Franz-Parisi formalism that yields an effective theory for TL in fully connected neural networks. Unlike the (lazy-training) infinite-width limit, where TL is ineffective, we demonstrate that in the proportional limit TL occurs through a renormalized source-target kernel that quantifies the relatedness of the two tasks and determines whether TL is beneficial for generalization.

PMID:40408730 | DOI:10.1103/PhysRevLett.134.177301
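The setting described in the abstract (a fully connected network pretrained on a source task and fine-tuned on a related target task, with the training-set size P and hidden width N chosen at a fixed ratio α = P/N) can be illustrated with a small numerical experiment. The sketch below is a toy illustration only, not the Letter's Franz-Parisi calculation: the data model, the relatedness parameter rho, and all sizes and hyperparameters are assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes chosen at a fixed ratio alpha = P / N, mimicking the
# proportional regime described in the abstract (values are assumptions).
N, alpha, D = 200, 2.0, 50
P = int(alpha * N)

# Related source/target regression teachers: the overlap rho (assumed here)
# plays the role of a task-relatedness knob.
rho = 0.8
w_src = rng.standard_normal(D)
w_tgt = rho * w_src + np.sqrt(1.0 - rho**2) * rng.standard_normal(D)

def make_data(w, n):
    X = rng.standard_normal((n, D)) / np.sqrt(D)
    return X, X @ w

def init_params():
    W1 = rng.standard_normal((D, N)) / np.sqrt(D)
    w2 = rng.standard_normal(N) / np.sqrt(N)
    return W1, w2

def forward(X, W1, w2):
    return np.tanh(X @ W1) @ w2

def train(X, y, W1, w2, lr=0.05, steps=2000):
    # Full-batch gradient descent on the mean squared error of a
    # one-hidden-layer fully connected network.
    for _ in range(steps):
        H = np.tanh(X @ W1)
        err = H @ w2 - y
        grad_w2 = H.T @ err / len(y)
        grad_W1 = X.T @ ((err[:, None] * w2) * (1.0 - H**2)) / len(y)
        w2 = w2 - lr * grad_w2
        W1 = W1 - lr * grad_W1
    return W1, w2

X_src, y_src = make_data(w_src, P)
X_tgt, y_tgt = make_data(w_tgt, P)
X_test, y_test = make_data(w_tgt, 5000)

# Transfer learning: pretrain on the source task, then fine-tune on the target.
W1, w2 = init_params()
W1, w2 = train(X_src, y_src, W1, w2)
W1, w2 = train(X_tgt, y_tgt, W1, w2)
mse_transfer = np.mean((forward(X_test, W1, w2) - y_test) ** 2)

# Baseline: train on the target task alone from a fresh initialization.
W1s, w2s = train(X_tgt, y_tgt, *init_params())
mse_scratch = np.mean((forward(X_test, W1s, w2s) - y_test) ** 2)

print(f"target test MSE -- transfer: {mse_transfer:.4f}, scratch: {mse_scratch:.4f}")
```

In this toy setup, varying rho (how aligned the two teachers are) and alpha (how much data is available relative to the width) changes whether pretraining on the source task helps or hurts on the target, which is the qualitative question the Letter addresses analytically via the renormalized source-target kernel.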

By Nevin Manimala
