Neural Dynamics
Understanding how the structure of data, learning dynamics, and neural architectures interact to yield emergent computations, including reasoning and creativity.
We are a Prague-based research startup advancing the understanding of how large neural networks learn, compute, scale, reason, and imagine. We employ powerful tools from physics, mathematics, computer science, and theoretical neuroscience to uncover the fundamental principles that make AI work.
Treating AI as a complex physical system to discover fundamental principles through statistical mechanics, random matrix theory, and variational principles.
Opening the "black box" of modern AI by bringing together researchers from physics, computer science, neuroscience, mathematics, and statistics.
We partner with leading research institutions and industry labs to advance the physics of learning and neural computation.
Cross-disciplinary collaborations
Research publications
Years of research focus
Loss convergence patterns across different architectures
Performance scaling with model size and data
Emergence of reasoning and creativity abilities
We treat AI as a complex physical system, using tools from physics, mathematics, and neuroscience to understand the fundamental principles of learning.
Yes, we actively collaborate with leading institutions and researchers across multiple disciplines to advance the physics of learning.
We employ powerful tools from statistical mechanics, random matrix theory, variational principles, and asymptotics to understand how neural networks learn and compute at scale.
We draw on theoretical neuroscience and computer science to understand how data structure, learning dynamics, and architecture interact to yield striking emergent computations, including reasoning and creativity.
CEO. Leads strategy and research direction. Focuses on the physics of learning, neural dynamics, and emergent computation in large-scale systems.
Reach out for research collaborations, cross-disciplinary partnerships, and advancing the physics of learning and neural computation.