Physics of Learning and Neural Computation

We are a Prague-based research startup advancing the understanding of how large neural networks learn, compute, scale, reason, and imagine. We apply powerful tools from physics, mathematics, computer science, and theoretical neuroscience to uncover the fundamental principles that make AI work.

CEO: Alex Kuchynka
Based in Prague, EU

Our Research Focus

Neural Dynamics

Understanding how the structure of data, learning dynamics, and neural architectures interact to yield emergent computations, including reasoning and creativity.

Physical Systems

Treating AI as a complex physical system to discover fundamental principles through statistical mechanics, random matrix theory, and variational principles.

Emergent Intelligence

Opening up the "black box" of modern AI by bringing together researchers from physics, computer science, neuroscience, mathematics, and statistics.

Explore projects →

Research Highlights

  • Novel approaches to understanding neural network learning dynamics
  • Statistical physics methods for analyzing large-scale AI systems
  • Cross-disciplinary collaboration on emergent computation

Collaborations

We partner with leading research institutions and industry labs to advance the physics of learning and neural computation.

See collaborations →

Research Impact

  • Cross-disciplinary collaborations
  • Research publications
  • Years of research focus

Research Visualizations

  • Neural Network Learning Dynamics: loss convergence patterns across different architectures
  • Scaling Laws Analysis: performance scaling with model size and data (see the sketch after this list)
  • Emergent Capabilities: emergence of reasoning and creativity abilities
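
As a flavor of what such a scaling-laws analysis involves, here is a minimal sketch that fits a saturating power law L(N) = a * N^(-alpha) + c to loss-versus-model-size points. The data below are synthetic and purely illustrative, not measurements from our research:

    # Minimal sketch: fit a saturating power law L(N) = a * N**(-alpha) + c
    # to loss-vs-model-size points. Synthetic, illustrative data only.
    import numpy as np
    from scipy.optimize import curve_fit

    def power_law(N, a, alpha, c):
        return a * N ** (-alpha) + c

    # Synthetic "loss vs. parameter count" points with mild noise.
    N = np.array([1e6, 3e6, 1e7, 3e7, 1e8, 3e8, 1e9])
    rng = np.random.default_rng(1)
    L = power_law(N, 400.0, 0.34, 1.7) * (1.0 + 0.02 * rng.standard_normal(N.size))

    # Fit the three parameters and report the recovered exponent.
    (a, alpha, c), _ = curve_fit(power_law, N, L, p0=(100.0, 0.3, 1.0))
    print(f"fit: L(N) = {a:.1f} * N^(-{alpha:.3f}) + {c:.2f}")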

Research Partners

Stanford University
Simons Foundation
Physics Institutes

Milestones

2024

  • Founded our physics-of-learning research program
  • Built a cross-disciplinary collaboration network

2025

  • Breakthrough studies in neural dynamics
  • Released open-source physics-based tools

FAQ

What makes your approach unique?

We treat AI as a complex physical system, using tools from physics, mathematics, and neuroscience to understand the fundamental principles of learning.

Do you collaborate with external researchers?

Yes, we actively collaborate with leading institutions and researchers across multiple disciplines to advance the physics of learning.

Approach

Statistical Physics & Mathematics

We employ powerful tools from statistical mechanics, random matrix theory, variational principles, and asymptotics to understand how neural networks learn and compute at scale.
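
As one concrete example of these tools, the following sketch compares the eigenvalue spectrum of a random weight matrix against the Marchenko-Pastur density that random matrix theory predicts at initialization. It is illustrative only (random data, no trained network); in practice, deviations from this baseline in trained weights are one signal of learned structure:

    # Minimal sketch of one random matrix theory tool: compare the spectrum
    # of W W^T / m for a random n x m "weight" matrix W (i.i.d. entries,
    # variance 1) with the Marchenko-Pastur density predicted at
    # initialization. Illustrative only; no trained network is involved.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 1000, 4000                        # aspect ratio q = n/m <= 1
    q = n / m
    W = rng.standard_normal((n, m))
    eigs = np.linalg.eigvalsh(W @ W.T / m)   # empirical eigenvalues

    # Marchenko-Pastur support edges and density for variance-1 entries.
    lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

    def mp_density(x):
        inside = np.maximum((lam_plus - x) * (x - lam_minus), 0.0)
        return np.sqrt(inside) / (2 * np.pi * q * x)

    # Compare a histogram of empirical eigenvalues with the MP prediction.
    hist, edges = np.histogram(eigs, bins=40, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    for x, h in zip(centers[::8], hist[::8]):
        print(f"x = {x:5.2f}   empirical {h:5.3f}   MP {mp_density(x):5.3f}")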

Neuroscience & Emergent Computation

We draw on theoretical neuroscience and computer science to understand how structure, dynamics, and architecture interact to yield striking emergent computations, including reasoning and creativity.
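
To make "learning dynamics" concrete, the toy sketch below (a standard illustration, not one of our specific results) runs gradient descent on a two-layer linear network; from a small initialization, the loss falls in stages as the singular modes of the target map are learned one at a time:

    # Toy sketch of learning dynamics: gradient descent on a two-layer
    # linear network W2 @ W1 fitted to a target map T with well-separated
    # singular values. From a small initialization, the loss drops in
    # stages as each singular mode of T is learned in turn.
    import numpy as np

    rng = np.random.default_rng(0)
    d = 8
    U, _ = np.linalg.qr(rng.standard_normal((d, d)))
    V, _ = np.linalg.qr(rng.standard_normal((d, d)))
    S = np.diag([5.0, 3.0, 1.0] + [0.0] * (d - 3))
    T = U @ S @ V.T                            # target linear map

    W1 = 1e-3 * rng.standard_normal((d, d))    # small init -> visible stages
    W2 = 1e-3 * rng.standard_normal((d, d))
    lr = 0.05

    for step in range(401):
        E = W2 @ W1 - T                        # residual of the composed map
        if step % 25 == 0:
            print(f"step {step:4d}  loss {0.5 * np.sum(E**2):9.4f}")
        gW2 = E @ W1.T                         # gradients of 0.5*||W2 W1 - T||_F^2
        gW1 = W2.T @ E
        W2 -= lr * gW2
        W1 -= lr * gW1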

People

Contact

Reach out about research collaborations, cross-disciplinary partnerships, and opportunities to advance the physics of learning and neural computation.