Statistical Physics of Unsupervised Learning with Prior Knowledge in Neural Networks

Integrating sensory inputs with prior beliefs drawn from past experience is a common and fundamental characteristic of unsupervised learning in both brain and artificial neural computation. However, the quantitative role of prior knowledge in unsupervised learning remains unclear, precluding a scientific understanding of unsupervised learning. Here, we propose a statistical physics model of unsupervised learning with prior knowledge, revealing that sensory inputs drive a series of continuous phase transitions related to spontaneous intrinsic-symmetry breaking. The intrinsic symmetry includes both reverse symmetry and permutation symmetry, which are commonly observed in most artificial neural networks. Compared to the prior-free scenario, the prior more strongly reduces the minimal data size that triggers the reverse-symmetry-breaking transition; moreover, the prior merges, rather than separates, the permutation-symmetry-breaking phases. We show that the prior can be learned from the data samples, which in physics corresponds to a two-parameter Nishimori constraint. This Letter thus reveals mechanisms by which the prior influences unsupervised learning.

Media type:

E-article

Year of publication:

2020

Published:

2020

Contained in:

Link to complete record - volume: 124

Contained in:

Physical review letters - 124(2020), issue 24, 19 June, page 248302

Language:

English

Contributors:

Hou, Tianqi [Author]
Huang, Haiping [Author]

Links:

Full text

Subjects:

Journal Article

Notes:

Date Completed 09.07.2020

Date Revised 09.07.2020

published: Print

Citation Status PubMed-not-MEDLINE

doi:

10.1103/PhysRevLett.124.248302

funding:

Funding institution / project title:

PPN (catalog ID):

NLM312171366