Statistical Physics of Learning
Understanding neural network dynamics through statistical mechanics, random matrix theory, and variational principles to uncover the fundamental laws governing learning.
Advancing the fundamental understanding of how neural networks learn, compute, and exhibit emergent intelligence through the lens of physics and mathematics.
Investigating how complex behaviors such as reasoning and creativity emerge from simple learning rules, by treating neural networks as dynamical systems.
Bridging physics, mathematics, computer science, and neuroscience to develop unified theories of learning and computation.
Nature Machine Intelligence
We present a statistical physics framework for understanding how capabilities emerge as neural networks scale, revealing phase transitions in learning dynamics.
Physical Review Letters
Applying tools from statistical mechanics to analyze neural network learning dynamics and emergent computational properties.
ICML 2025
Mathematical characterization of how reasoning and creativity capabilities scale with model size and training data.
Open-source software for analyzing neural networks as physical systems, including dynamics visualization and emergent property detection.
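As a toy illustration of the random-matrix-theory perspective behind this kind of analysis, the eigenvalue spectrum of an untrained layer's weights can be compared against the Marchenko-Pastur law. This is a minimal sketch, not the toolkit's actual interface; the layer dimensions, Gaussian initialization, and 1/n variance scaling are illustrative assumptions.

```python
# Hypothetical example: spectral analysis of a random weight matrix,
# assuming i.i.d. Gaussian entries with variance 1/n (illustrative setup).
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 500  # input and output dimensions of the layer
W = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, p))

# Eigenvalues of the sample covariance W^T W (squared singular values of W).
eigvals = np.linalg.eigvalsh(W.T @ W)

# Marchenko-Pastur support for aspect ratio q = p / n under 1/n scaling:
# [(1 - sqrt(q))^2, (1 + sqrt(q))^2].
q = p / n
lam_minus = (1 - np.sqrt(q)) ** 2
lam_plus = (1 + np.sqrt(q)) ** 2

# Fraction of the empirical spectrum inside the theoretical support
# (small padding absorbs finite-size edge fluctuations).
inside = np.mean((eigvals >= lam_minus - 0.05) & (eigvals <= lam_plus + 0.05))
print(f"fraction of spectrum inside MP support: {inside:.3f}")
```

For an untrained network the empirical spectrum should fill the Marchenko-Pastur support almost entirely; deviations from it after training are one signal of learned structure.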
Comprehensive evaluation framework for measuring emergent capabilities in large neural networks across different domains.
We actively collaborate with researchers across multiple disciplines. If you're interested in contributing to our research on the physics of learning and neural computation, please reach out.