Research

Advancing our fundamental understanding of how neural networks learn, compute, and give rise to emergent intelligence, viewed through the lens of physics and mathematics.

Core Research Areas

Statistical Physics of Learning

Understanding neural network dynamics through statistical mechanics, random matrix theory, and variational principles to uncover the fundamental laws governing learning.

Key Focus: Energy landscapes, phase transitions, and emergent properties in neural computation
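As general background for this area, the canonical statistical-mechanics picture treats trained weights as samples from a Gibbs distribution over the training loss; the notation below (weights w, inverse temperature β, partition function Z) is the standard textbook form, not this group's specific formulation:

```latex
% Gibbs (Boltzmann) distribution over network weights w,
% with inverse temperature \beta and training loss L(w):
P(w) \;=\; \frac{1}{Z}\, e^{-\beta L(w)},
\qquad
Z \;=\; \int dw \; e^{-\beta L(w)} .
% Low temperature (large \beta) concentrates P(w) on the minima of
% the energy landscape L(w); non-analytic behavior of Z as parameters
% vary is the formal signature of a phase transition in learning.
```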

Neural Dynamics & Emergence

Investigating how complex behaviors like reasoning and creativity emerge from simple learning dynamics, treating neural networks as dynamical systems.

Key Focus: Emergence of intelligence, scaling laws, and capability transitions

Cross-Disciplinary Integration

Bridging physics, mathematics, computer science, and neuroscience to develop unified theories of learning and computation.

Key Focus: Interdisciplinary collaboration, theoretical frameworks, and practical applications

Research Publications

Understanding Emergent Capabilities in Large Neural Networks

Kuchynka, A.; Ganguli, S.; et al. (2024)

Nature Machine Intelligence

We present a statistical physics framework for understanding how capabilities emerge as neural networks scale, revealing phase transitions in learning dynamics.
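For readers unfamiliar with the terminology, a phase transition is conventionally diagnosed through an order parameter that switches on non-analytically at a critical point. The mean-field form below is a generic textbook illustration of that idea, not a result from this paper:

```latex
% Mean-field order parameter m near a critical temperature T_c
% (\beta here is the standard critical exponent):
m(T) \;=\;
\begin{cases}
0, & T > T_c, \\[4pt]
\left(1 - T/T_c\right)^{\beta}, & T \le T_c .
\end{cases}
% By analogy, an emergent capability can remain near zero below a
% critical model scale and grow sharply above it.
```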

Neural Networks as Complex Physical Systems

Doe, J.; Kuchynka, A.; Proton, M. (2024)

Physical Review Letters

Applying tools from statistical mechanics to analyze neural network learning dynamics and emergent computational properties.

Scaling Laws for Emergent Intelligence

Beta, A.; Kuchynka, A. (2025)

ICML 2025

A mathematical characterization of how reasoning and creative capabilities scale with model size and training-data volume.
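As context, empirical neural scaling laws are typically written as power laws in parameter count N and dataset size D. The forms below are the standard ones from the scaling-law literature, offered as an illustration rather than the specific characterization given in this paper:

```latex
% Power-law scaling of test loss with parameter count N and
% dataset size D (N_c, D_c, \alpha_N, \alpha_D are fitted constants):
L(N) \;\approx\; \left(\frac{N_c}{N}\right)^{\alpha_N},
\qquad
L(D) \;\approx\; \left(\frac{D_c}{D}\right)^{\alpha_D} .
% Apparently sharp "capability transitions" can arise even from a
% smoothly improving loss, once it crosses a threshold on a
% discontinuous downstream metric.
```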

Research Tools & Datasets

Neural Physics Toolkit

Open-source software for analyzing neural networks as physical systems, including dynamics visualization and emergent property detection.

Emergence Benchmark Suite

Comprehensive evaluation framework for measuring emergent capabilities in large neural networks across different domains.

Collaboration Opportunities

We actively collaborate with researchers across multiple disciplines. If you're interested in contributing to our research on the physics of learning and neural computation, please reach out.