---
name: kirchhoff-inspired-neural-networks
description: "Kirchhoff-Inspired Neural Network (KINN) - state-variable-based network architecture built on Kirchhoff's current law for evolving high-order perception. Derives numerically stable state updates from ODEs, enabling explicit decoupling and encoding of higher-order evolutionary components. Keywords: KINN, Kirchhoff, ODE-based neural networks, high-order perception, PDE solving, physics-informed neural networks, state-variable networks."
---
# Kirchhoff-Inspired Neural Networks (KINN)
## Metadata
- **Source**: arXiv:2603.23977
- **Authors**: Tongfei Chen, Jingying Yang, Linlin Yang, Jinhu Lü, David Doermann, Chunyu Xie, Long He, Tian Wang, Juan Zhang, Guodong Guo, Baochang Zhang
- **Published**: 2026-03-25
- **Categories**: cs.LG, cs.AI
## Core Methodology
### Problem Statement
Conventional deep networks optimize weights and biases by adjusting connection strengths, and lack a systematic mechanism to jointly characterize the interplay among signal intensity, coupling structure, and state evolution. Biological systems, by contrast, compute through dynamic fluctuations in membrane potential, a fundamentally different principle.
### KINN Innovation
KINN is a **state-variable-based network architecture** constructed on **Kirchhoff's current law** (KCL), which states that the sum of currents entering a node equals the sum leaving it.
### Key Contributions
1. **ODE-Based State Updates**: Derives numerically stable state updates from fundamental ordinary differential equations
2. **Higher-Order Evolutionary Components**: Enables explicit decoupling and encoding of higher-order evolutionary components within a single layer
3. **Physical Consistency**: Preserves physical laws (Kirchhoff's laws) throughout computation
4. **Interpretability**: State-variable formulation provides clearer physical interpretation than weight-based networks
5. **End-to-End Trainability**: Compatible with standard gradient-based optimization
### Mathematical Framework
The KINN layer models each neuron as a node in an electrical circuit:
```
Current balance at node i:
Σ_j I_ij = C_i · dV_i/dt + I_leak,i
Where:
I_ij: Current from node j to node i
C_i: Membrane capacitance (learnable)
V_i: Membrane potential (state variable)
I_leak,i: Leak current
```
State update via ODE integration:
```
dV/dt = f(V, W, x) # Governed by KCL
V(t+dt) = V(t) + dt · f(V(t), W, x(t))
```
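Concretely, here is one explicit Euler update for a hypothetical two-node circuit (all parameter values below are arbitrary, chosen only to show the arithmetic):
```python
import torch

C = torch.tensor([1.0, 2.0])       # capacitances C_i
G = torch.tensor([[0.0, 0.5],      # conductances: G[i, j] couples node j -> i
                  [0.5, 0.0]])
leak = torch.tensor([0.1, 0.1])    # constant leak currents I_leak,i
x = torch.tensor([1.0, 0.0])       # external input current
V = torch.zeros(2)                 # initial membrane potentials
dt = 0.1

# One explicit Euler step of C * dV/dt = sum_j I_ij - I_leak
dVdt = (x + G @ V - leak) / C
V = V + dt * dVdt
print(V)  # tensor([ 0.0900, -0.0050])
```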
### Higher-Order Encoding
KINN captures higher-order temporal dynamics within a single layer by:
- Modeling state evolution as coupled ODEs
- Encoding derivatives of state variables as features
- Maintaining physical consistency through KCL constraints
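As a minimal sketch of the derivative-as-feature idea (assuming the same linear KCL dynamics as above, not the paper's exact layer), the state and its KCL-derived first derivative can be stacked into one feature vector; higher derivatives could be appended the same way by differentiating the KCL relation again:
```python
import torch

def higher_order_features(V, x, G, C, leak):
    """Stack the state V and its KCL-derived first derivative as features."""
    dVdt = (x + V @ G.T - leak) / C      # first-order evolution from KCL
    return torch.cat([V, dVdt], dim=-1)  # shape (batch, 2 * n_nodes)
```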
## Implementation Guide
### Architecture Design
```python
import torch
import torch.nn as nn


class KINNLayers(nn.Module):
    """Kirchhoff-Inspired Neural Network layer.

    Treats each node as a circuit node obeying Kirchhoff's current law;
    the membrane potential V is the state, evolved by explicit Euler steps.
    """

    def __init__(self, n_nodes, dt=0.1):
        super().__init__()
        self.n_nodes = n_nodes
        self.dt = dt
        # Learnable circuit parameters
        self.capacitance = nn.Parameter(torch.ones(n_nodes))
        # Small initial scale keeps the explicit Euler iteration stable
        self.conductance = nn.Parameter(0.1 * torch.randn(n_nodes, n_nodes))
        self.leak = nn.Parameter(torch.zeros(n_nodes))

    def forward(self, x, n_steps=10, v0=None):
        """
        Args:
            x: Input current (batch, n_nodes)
            n_steps: Number of ODE integration steps
            v0: Optional initial state (batch, n_nodes); defaults to zeros
        Returns:
            Final state (batch, n_nodes)
        """
        v = torch.zeros_like(x) if v0 is None else v0
        for _ in range(n_steps):
            # Kirchhoff's current law: C * dV/dt = sum(I_in) - I_leak
            current_in = x + v @ self.conductance.T
            dvdt = (current_in - self.leak) / self.capacitance
            v = v + self.dt * dvdt
        return v


class KINN(nn.Module):
    """Multi-layer Kirchhoff-Inspired Neural Network."""

    def __init__(self, input_dim, hidden_dims, output_dim, dt=0.1, n_steps=10):
        super().__init__()
        layers = []
        prev_dim = input_dim
        for h_dim in hidden_dims:
            # Linear projection matches the incoming width to this layer's nodes
            layers.append(nn.Linear(prev_dim, h_dim))
            layers.append(KINNLayers(h_dim, dt))
            layers.append(nn.ReLU())
            prev_dim = h_dim
        layers.append(nn.Linear(prev_dim, output_dim))
        layers.append(KINNLayers(output_dim, dt))
        self.layers = nn.ModuleList(layers)
        self.n_steps = n_steps

    def forward(self, x):
        for layer in self.layers:
            if isinstance(layer, KINNLayers):
                x = layer(x, self.n_steps)
            else:
                x = layer(x)
        return x
```
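A quick smoke test of the sketch above (dimensions and values are arbitrary):
```python
# Hypothetical usage: a batch of 8 samples with 16 input features
model = KINN(input_dim=16, hidden_dims=[32, 32], output_dim=10)
x = torch.randn(8, 16)
out = model(x)
print(out.shape)  # torch.Size([8, 10])
```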
### PDE Solving Application
```python
def solve_pde_with_kinn(spatial_grid, initial_condition, boundary_conditions,
                        n_time_steps, dt=0.01):
    """Evolve a PDE on a 1D spatial grid with KINN dynamics (sketch only).

    Each grid point becomes a KINN node; the initial condition sets the node
    states, and Dirichlet-style boundary values enter as constant external
    currents at the endpoints (one reading of the paper's setup, not exact).
    """
    n_nodes = spatial_grid.shape[-1]
    layer = KINNLayers(n_nodes, dt=dt)
    left, right = boundary_conditions
    external = torch.zeros(1, n_nodes)
    external[:, 0], external[:, -1] = left, right
    # Evolve node states through time under the learnable KCL dynamics
    return layer(external, n_steps=n_time_steps, v0=initial_condition)
```
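For instance, with a hypothetical 64-point grid and zero-valued endpoints:
```python
grid = torch.linspace(0.0, 1.0, 64)
u0 = torch.sin(torch.pi * grid).unsqueeze(0)  # initial condition, shape (1, 64)
u_final = solve_pde_with_kinn(grid, u0, (0.0, 0.0), n_time_steps=100)
print(u_final.shape)  # torch.Size([1, 64])
```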
## Applications
1. **PDE Solving**: Reported to outperform existing methods on benchmark PDE problems
2. **Image Classification**: Validated on ImageNet with competitive results
3. **Physics-Informed Learning**: Natural fit for systems governed by conservation laws
4. **Neuroscience Modeling**: Closer to biological neuron dynamics than standard ANNs
## When to Use
- Systems with physical conservation laws (electrical, fluid, thermal)
- Problems requiring interpretable state evolution
- High-order temporal dynamics modeling
- PDE-based scientific computing
## Pitfalls
- ODE integration step size (`dt`) must be chosen carefully for stability (a heuristic check is sketched after this list)
- Conductance matrix may require regularization for large networks
- Training may be slower than standard feedforward networks due to iterative state updates
- Physical constraints limit architectural flexibility
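For the linearized dynamics in the sketch above, a rough forward-Euler heuristic is to keep `dt` below `2 / |λ|_max`, where λ ranges over the eigenvalues of the Jacobian `diag(1/C) · G`. The helper below is an assumption layered on the sketch implementation, not a procedure from the paper:
```python
def suggest_dt(layer, safety=0.5):
    """Heuristic forward-Euler step bound for the KINNLayers sketch above."""
    with torch.no_grad():
        # Jacobian of dV/dt = (x + V @ G.T - leak) / C is G / C[:, None]
        jac = layer.conductance / layer.capacitance.unsqueeze(1)
        lam_max = torch.linalg.eigvals(jac).abs().max().item()
    return safety * 2.0 / max(lam_max, 1e-8)
```
For example, set `layer.dt = suggest_dt(layer)` before training, and re-check the bound after large conductance updates.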
## Related Skills
- physics-guided-neural-network
- pinn-neuronal-parameter-estimation
- energy-based-neurocomputation
- spiking-neural-network-analysis
## References
- Chen, T., et al. "Kirchhoff-Inspired Neural Networks for Evolving High-Order Perception." arXiv:2603.23977, 2026.
- Kirchhoff's Circuit Laws (1845)
- Neural ODEs (Chen et al., 2018)