---
name: hamiltonian-emergent-deep-generative-models
description: "Hamiltonian extraction from autonomous emergent deep generative models. Discovering implicit physical interaction laws from learned generative models using Riemannian diffusion score fields. Activation: Hamiltonian, generative model, emergent dynamics, Riemannian diffusion, score field, physics discovery, implicit interaction."
---
# Hamiltonian of Autonomous Emergent Deep Generative Models
> Extract implicit physical Hamiltonians from autonomously emergent deep generative models, revealing that learned score fields encode Riemannian diffusion and interaction potentials.
## Metadata
- **Source**: arXiv:2604.20821
- **Authors**: Wenjie Xi, Wei-Qiang Chen
- **Published**: 2026-04-22
- **Categories**: cond-mat.dis-nn, cond-mat.stat-mech
## Core Methodology
### Key Innovation
Deep generative models trained on physical data implicitly learn the underlying interaction laws. This work demonstrates how to extract the Hamiltonian — the total energy function governing system dynamics — from the learned parameters of an autonomously emergent generative model. The key discovery is that the score field of the generative model decomposes into a Riemannian diffusion component and an interaction potential that matches the physical Hamiltonian.
### Technical Framework
1. **Score-Based Generative Models**: Generative models parameterized by score functions (∇_x log p(x)) that capture data distribution gradients
2. **Hamiltonian Extraction**: The learned score field encodes force-like quantities that decompose into conservative (Hamiltonian-derived) and dissipative (diffusion) components
3. **Riemannian Diffusion Score Field**: The diffusion process operates on a learned Riemannian manifold whose metric reflects the underlying physical geometry
4. **Autonomous Emergence**: No explicit physics supervision — the Hamiltonian structure emerges purely from data-driven training
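The conservative/dissipative split in point 2 has a cheap diagnostic: a smooth vector field is a gradient field (s = -∇V) iff its Jacobian is symmetric, so the antisymmetric part of ∂s/∂x measures the non-conservative component. The sketch below is an illustrative check, not the paper's decomposition procedure:

```python
import torch

def antisymmetry_ratio(score_fn, x):
    """How far a score field is from purely conservative at point x.

    s = -∇V implies a symmetric Jacobian, so the relative norm of the
    antisymmetric part of ∂s/∂x is a proxy for the dissipative component.
    (Illustrative diagnostic; the paper's decomposition is assumed richer.)
    """
    J = torch.autograd.functional.jacobian(score_fn, x)  # (D, D) at a single point
    antisym = 0.5 * (J - J.T)
    return (antisym.norm() / J.norm().clamp_min(1e-12)).item()

x = torch.tensor([0.3, -0.7])
print(antisymmetry_ratio(lambda y: -y, x))                          # gradient field → 0
print(antisymmetry_ratio(lambda y: torch.stack([-y[1], y[0]]), x))  # pure rotation → 1
```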
### Mathematical Foundation
- Hamiltonian mechanics: H(q,p) = T(p) + V(q)
- Score matching: learning ∇_x log p(x) from data
- Riemannian geometry: metric tensor g_{ij} defining the diffusion manifold
- Decomposition: score = -∇V(x) + Riemannian diffusion term
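The decomposition and the resulting Hamiltonian can be written explicitly. The notation below is a sketch under standard Riemannian-Langevin conventions (the symbols r_g and g are assumed here, not taken verbatim from the paper):

```latex
% Score splits into a conservative gradient and a metric-dependent
% diffusion correction; the extracted Hamiltonian pairs V with a
% metric-weighted kinetic term.
s_\theta(x) \;=\; \underbrace{-\,\nabla V(x)}_{\text{conservative}}
\;+\; \underbrace{r_g(x)}_{\text{Riemannian diffusion}},
\qquad
H(q,p) \;=\; \underbrace{\tfrac{1}{2}\, p^{\top} g^{-1}(q)\, p}_{T(p)} \;+\; V(q)
```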
## Implementation Guide
### Prerequisites
- Hamiltonian mechanics and classical mechanics
- Score-based generative modeling (diffusion models)
- Riemannian geometry basics
- PyTorch or JAX
### Step-by-Step
1. Train a score-based generative model on physical trajectory data
2. Extract the learned score field s_θ(x)
3. Decompose score into conservative (gradient of potential) and dissipative components
4. Identify the potential energy V(x) from the conservative component
5. Verify that the extracted Hamiltonian reproduces the observed dynamics
### Code Example
```python
import torch

def extract_hamiltonian_from_score(score_model, x_samples, n_steps=64):
    """Extract the implicit potential V(x) from a learned score field.

    Treats the conservative part of the score as s(x) = -∇V(x) and recovers
    V (up to an additive constant) by a line integral from the origin:
    V(x) = -∫₀¹ s(t·x)·x dt.
    """
    ts = torch.linspace(0.0, 1.0, n_steps).view(-1, 1, 1)
    rays = ts * x_samples.unsqueeze(0)            # points along each ray 0 → x
    scores = torch.stack([score_model(p) for p in rays.unbind(0)])
    integrand = (scores * x_samples.unsqueeze(0)).sum(dim=-1)
    potential = -torch.trapezoid(integrand, dx=1.0 / (n_steps - 1), dim=0)
    # Kinetic term T(p) comes from the learned Riemannian metric; H = T(p) + V(q)
    return potential

def verify_hamiltonian(H, q, p, dq_dt, dp_dt):
    """Verify H reproduces observed dynamics via Hamilton's equations:
    dq/dt = ∂H/∂p, dp/dt = -∂H/∂q. Returns the mean squared residual."""
    q = q.detach().requires_grad_(True)
    p = p.detach().requires_grad_(True)
    dH_dq, dH_dp = torch.autograd.grad(H(q, p).sum(), (q, p))
    return ((dq_dt - dH_dp) ** 2 + (dp_dt + dH_dq) ** 2).mean()
```
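Step 5 (verification) can be exercised on a system whose Hamiltonian is known in closed form. For a 1-D harmonic oscillator, H = p²/2 + q²/2, and Hamilton's equations give dq/dt = p, dp/dt = -q (a toy example, not from the paper):

```python
import torch

# Toy system: harmonic oscillator, H(q, p) = p²/2 + q²/2
H = lambda q, p: 0.5 * p**2 + 0.5 * q**2

q = torch.tensor([1.0, 0.5]).requires_grad_(True)
p = torch.tensor([0.0, 0.8]).requires_grad_(True)
dH_dq, dH_dp = torch.autograd.grad(H(q, p).sum(), (q, p))

# Hamilton's equations: dq/dt = ∂H/∂p = p, dp/dt = -∂H/∂q = -q
dq_dt, dp_dt = dH_dp, -dH_dq
print(dq_dt)  # equals p
print(dp_dt)  # equals -q
```

An extracted Hamiltonian passes this check when the autograd-derived right-hand sides match the finite-difference velocities of the observed trajectories.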
## Applications
- Discovering unknown physical laws from experimental data
- Validating that neural networks learn physically meaningful representations
- Building physics-informed generative models with guaranteed conservation laws
- Understanding emergent computation in neural networks
## Pitfalls
- Score decomposition requires careful regularization to separate conservative and dissipative parts
- The Riemannian metric estimation can be ill-conditioned in high dimensions
- Not all generative models will yield clean Hamiltonian structures — depends on training data
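The conditioning concern in the second pitfall is easy to see numerically: if the metric is estimated from a local sample second moment, its condition number grows quickly with dimension at a fixed sample budget (illustrative sketch; this crude estimator is an assumption, not the paper's method):

```python
import torch

torch.manual_seed(0)
conds = []
for d in (2, 16, 128):
    x = torch.randn(256, d)            # 256 local samples around a reference point
    g = (x.T @ x) / x.shape[0]         # crude metric estimate: sample second moment
    conds.append(torch.linalg.cond(g).item())
    print(f"d={d}: cond(g) ≈ {conds[-1]:.1f}")
```

In practice this motivates shrinkage or low-rank regularization of the estimated metric before inverting it for the kinetic term.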
## Related Skills
- energy-based-neurocomputation
- physics-informed-neural-networks-maximizing-quantum
- koopman-representation-learning