---
name: brainca-neural-cellular-automata
description: "BraiNCA: brain-inspired neural cellular automata with complex brain-like topologies for morphogenesis and motor control. Long-range connections beyond Moore neighborhood. Triggers: neural cellular automata, morphogenesis, NCA, brain-inspired topology, motor control, complex networks."
---
# BraiNCA: Brain-Inspired Neural Cellular Automata
> Neural Cellular Automata with brain-inspired network topologies including long-range connections, applied to morphogenesis and distributed motor control.
## Metadata
- **Source**: arXiv:2604.01932v1
- **Authors**: Léo Pio-Lopez, Benedikt Hartl, Michael Levin, et al.
- **Published**: 2026-04-02
- **Institution**: Université Côte d'Azur, Tufts University
## Core Methodology
### Key Innovation
BraiNCA extends traditional Neural Cellular Automata (NCA)—which use regular grids with Moore neighborhoods—by incorporating complex brain-like network topologies with long-range connections. This brain-inspired architecture enables more efficient information propagation and robust pattern formation, demonstrating applications in morphogenesis (growing artificial tissues) and distributed motor control (coordinated multi-joint movement).
### Brain-Inspired Topology Features
#### 1. Long-Range Connections
Unlike standard NCAs with only local (1-hop) neighbors, BraiNCA includes:
- **Short-range connections**: Local neighbors (like cortical columns)
- **Long-range connections**: Skip connections across the grid (like white matter tracts)
- **Small-world topology**: High clustering with short path lengths
- **Hierarchical modularity**: Community structure with dense intra-module and sparse inter-module connections
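The small-world property can be checked directly with NetworkX. A minimal comparison of a purely local ring lattice against a rewired small-world graph (illustrative parameters, not taken from the paper):

```python
import networkx as nx

# Ring lattice: local connections only (p=0.0, no rewiring)
ring = nx.connected_watts_strogatz_graph(100, k=6, p=0.0, seed=0)
# Small-world: 10% of edges rewired into long-range shortcuts
small_world = nx.connected_watts_strogatz_graph(100, k=6, p=0.1, seed=0)

for name, G in [("ring lattice", ring), ("small-world", small_world)]:
    C = nx.average_clustering(G)
    L = nx.average_shortest_path_length(G)
    print(f"{name}: clustering={C:.2f}, avg path length={L:.2f}")
```

A few rewired long-range edges shrink the average path length dramatically while clustering stays high, which is exactly the regime the bullet list above describes.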
#### 2. Network Generation Algorithms
```python
import networkx as nx
import numpy as np

def generate_brainca_topology(n_nodes, connection_rules):
    """Generate a brain-inspired topology for an NCA.

    Args:
        n_nodes: Total number of cells in the grid (assumed square).
        connection_rules: Dict with 'local_weight', 'long_range_fraction',
            and 'long_range_weight' keys.
    """
    G = nx.Graph()
    grid_size = int(np.sqrt(n_nodes))
    positions = {(i, j): i * grid_size + j
                 for i in range(grid_size) for j in range(grid_size)}

    # Local connections (Moore neighborhood, toroidal wrap-around)
    for i in range(grid_size):
        for j in range(grid_size):
            idx = positions[(i, j)]
            for di in (-1, 0, 1):  # 8 neighbors
                for dj in (-1, 0, 1):
                    if di == 0 and dj == 0:
                        continue
                    ni, nj = (i + di) % grid_size, (j + dj) % grid_size
                    G.add_edge(idx, positions[(ni, nj)],
                               weight=connection_rules['local_weight'])

    # Long-range connections (small-world shortcuts)
    n_long = int(n_nodes * connection_rules['long_range_fraction'])
    for _ in range(n_long):
        u, v = np.random.choice(n_nodes, 2, replace=False)
        if not G.has_edge(int(u), int(v)):
            G.add_edge(int(u), int(v),
                       weight=connection_rules['long_range_weight'])
    return G
```
#### 3. Distance-Dependent Connection Probability
Following cortical connectivity patterns:
```
P(connection) ∝ d^(-α) * exp(-d/λ)
where:
- d: Euclidean distance between cells
- α: Power-law exponent (typically 1-2)
- λ: Characteristic length scale
```
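A minimal sketch of sampling connections under this rule. The values of α, λ, and the 10×10 grid are illustrative choices, not parameters from the paper:

```python
import numpy as np

def connection_prob(d, alpha=1.5, lam=5.0):
    """P(connection) ∝ d^(-alpha) * exp(-d / lam), for d > 0."""
    return d ** (-alpha) * np.exp(-d / lam)

rng = np.random.default_rng(0)
grid = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)

# Sample each candidate pair with distance-dependent probability
edges = []
for u in range(len(grid)):
    for v in range(u + 1, len(grid)):
        d = np.linalg.norm(grid[u] - grid[v])
        if rng.random() < min(1.0, connection_prob(d)):
            edges.append((u, v, d))

distances = np.array([d for _, _, d in edges])
print(f"{len(edges)} edges, mean edge distance {distances.mean():.2f}")
```

Because the probability decays as a power law with an exponential cutoff, most sampled edges are short-range with an exponentially rare tail of long-range shortcuts, mirroring cortical connectivity statistics.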
### Neural Cellular Automata Update Rule
#### State Variables
Each cell i maintains:
- **s_i**: Cell state vector (concentrations of morphogens, cell types, etc.)
- **h_i**: Hidden state (internal memory)
#### Update Equation
```python
def brainca_update(cell_states, hidden_states, topology,
                   perception_kernel, update_network):
    """Single update step for BraiNCA.

    Args:
        cell_states: (n_cells, state_dim) array.
        hidden_states: (n_cells, hidden_dim) array.
        topology: NetworkX graph defining connections (with 'weight' edges).
        perception_kernel: Callable processing a cell's perception input.
        update_network: Callable returning (delta_s, new_hidden) for a cell.
    """
    n_cells = len(cell_states)
    new_states = np.zeros_like(cell_states)
    new_hidden = np.zeros_like(hidden_states)

    for i in range(n_cells):
        # Collect weight-scaled neighbor states
        neighbor_states = [topology[i][j]['weight'] * cell_states[j]
                           for j in topology.neighbors(i)]

        # Perception: combine own state with mean neighbor state
        mean_neighbors = (np.mean(neighbor_states, axis=0) if neighbor_states
                          else np.zeros_like(cell_states[i]))
        perceived = perception_kernel(
            np.concatenate([cell_states[i], mean_neighbors]))

        # Update rule (learned neural network)
        update_input = np.concatenate(
            [cell_states[i], hidden_states[i], perceived])
        delta_s, new_h = update_network(update_input)

        # Stochastic update: each cell applies its delta with 90% probability
        if np.random.rand() < 0.9:
            new_states[i] = cell_states[i] + delta_s
        else:
            new_states[i] = cell_states[i]
        new_hidden[i] = new_h

    return new_states, new_hidden
```
## Implementation Guide
### Prerequisites
- Python 3.8+
- PyTorch or JAX for neural networks
- NetworkX for graph operations
- Matplotlib/Plotly for visualization
### Step-by-Step: Morphogenesis Application
1. **Setup Environment**
```python
import torch
import torch.nn as nn
import networkx as nx
import numpy as np

class BraiNCA(nn.Module):
    def __init__(self, n_cells, state_dim, hidden_dim, topology):
        super().__init__()
        self.n_cells = n_cells
        self.state_dim = state_dim
        self.topology = topology

        # Learned perception and update networks
        self.perception = nn.Sequential(
            nn.Linear(state_dim * 2, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
        )
        self.update_net = nn.Sequential(
            nn.Linear(state_dim + hidden_dim + 32, 128),
            nn.ReLU(),
            nn.Linear(128, state_dim + hidden_dim),
        )

        # States are plain tensors, not nn.Parameters: they are reset and
        # overwritten every rollout, so only the networks are trained
        # (reassigning an nn.Parameter attribute would raise a TypeError).
        self.cell_states = torch.randn(n_cells, state_dim) * 0.1
        self.hidden_states = torch.zeros(n_cells, hidden_dim)

    def forward(self, n_steps):
        for _ in range(n_steps):
            self.cell_states, self.hidden_states = self._step()
        return self.cell_states

    def _step(self):
        # Weighted mean over graph neighbors
        neighbor_means = torch.zeros_like(self.cell_states)
        for i in range(self.n_cells):
            neighbors = list(self.topology.neighbors(i))
            if neighbors:
                weights = torch.tensor(
                    [self.topology[i][j]['weight'] for j in neighbors])
                states = torch.stack(
                    [self.cell_states[j] for j in neighbors])
                neighbor_means[i] = (
                    (weights[:, None] * states).sum(0) / weights.sum())

        # Perception
        perceived = self.perception(
            torch.cat([self.cell_states, neighbor_means], dim=1))

        # Update
        update_input = torch.cat(
            [self.cell_states, self.hidden_states, perceived], dim=1)
        delta = self.update_net(update_input)
        new_states = (self.cell_states
                      + torch.tanh(delta[:, :self.state_dim]) * 0.1)
        new_hidden = torch.tanh(delta[:, self.state_dim:])
        return new_states, new_hidden
```
2. **Training for Target Pattern**
```python
def train_morphogenesis(target_pattern, n_epochs=1000):
    """Train BraiNCA to grow a target pattern.

    target_pattern: (n_cells,) LongTensor of target cell-type indices.
    """
    n_cells = len(target_pattern)
    topology = generate_brainca_topology(
        n_cells,
        {'local_weight': 1.0, 'long_range_fraction': 0.05,
         'long_range_weight': 0.5},
    )
    model = BraiNCA(n_cells, state_dim=16, hidden_dim=32, topology=topology)
    optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
    criterion = nn.CrossEntropyLoss()

    for epoch in range(n_epochs):
        optimizer.zero_grad()

        # Reset to a single seed cell with fresh (graph-free) state tensors
        model.cell_states = torch.zeros(n_cells, 16)
        model.cell_states[0] = torch.randn(16)  # seed
        model.hidden_states = torch.zeros(n_cells, 32)

        # Grow for n steps
        final_states = model(n_steps=100)

        # First 4 state channels are cell-type logits; compare to target
        loss = criterion(final_states[:, :4], target_pattern)
        loss.backward()
        optimizer.step()

        if epoch % 100 == 0:
            print(f"Epoch {epoch}, Loss: {loss.item():.4f}")
    return model
```
3. **Motor Control Application**
```python
class BraincaMotorController:
    """Distributed motor control using BraiNCA."""

    def __init__(self, n_joints, topology_type='small_world'):
        self.n_joints = n_joints
        # Each cell controls one joint
        if topology_type == 'small_world':
            topology = nx.watts_strogatz_graph(n_joints, k=4, p=0.3)
        elif topology_type == 'scale_free':
            topology = nx.barabasi_albert_graph(n_joints, m=2)
        else:
            raise ValueError(f"Unknown topology type: {topology_type}")
        # These generators produce unweighted graphs, but the NCA step
        # reads a 'weight' attribute on every edge, so set a default.
        nx.set_edge_attributes(topology, 1.0, 'weight')
        self.topology = topology
        self.nca = BraiNCA(n_joints, state_dim=8, hidden_dim=16,
                           topology=topology)

    def compute_angles(self, target_position, current_angles):
        """Compute joint angles to reach a target position."""
        # Encode target and current angles into cell states
        self.nca.cell_states.data[:, :4] = self.encode_target(target_position)
        self.nca.cell_states.data[:, 4:] = (
            torch.tensor(current_angles, dtype=torch.float32).unsqueeze(1))

        # Run NCA dynamics (inference only, no gradient tracking)
        with torch.no_grad():
            for _ in range(50):
                self.nca.cell_states, self.nca.hidden_states = \
                    self.nca._step()

        # Read out joint angles from one state channel
        return self.nca.cell_states[:, 4].detach().numpy()

    def encode_target(self, position):
        """Encode a 3D target position into per-cell encodings."""
        # Simple linear encoding
        encoding = torch.zeros(self.n_joints, 4)
        for i in range(self.n_joints):
            encoding[i] = torch.tensor([
                position[0] / 100,
                position[1] / 100,
                position[2] / 100,
                i / self.n_joints,  # joint index as positional code
            ])
        return encoding
```
## Applications
### 1. Artificial Morphogenesis
- **Tissue engineering**: Design growth patterns for organoids
- **Self-repairing systems**: Materials that heal damage through cellular regeneration
- **Developmental biology**: Model biological development processes
### 2. Distributed Robot Control
- **Modular robotics**: Self-organizing robot swarms
- **Soft robotics**: Continuous body coordination
- **Multi-agent systems**: Decentralized task allocation
### 3. Neural Development Models
- **Cortical wiring**: Model how brain connectivity develops
- **Neurodevelopmental disorders**: Understand atypical development
- **Evolution of brains**: Study how network topology affects function
### 4. Regenerative Medicine
- **Wound healing**: Simulate tissue regeneration
- **Limb regeneration**: Model salamander-like regrowth
- **Cancer modeling**: Understand uncontrolled growth
## Pitfalls
### Training Instability
- **Problem**: NCAs can collapse to trivial solutions or diverge
- **Solution**: Use curriculum learning; start with simple patterns; add noise during training
### Long-Range Communication Delay
- **Problem**: Information takes many steps to propagate across long distances
- **Solution**: Increase long-range connection density; use hierarchical organization
### Pattern Robustness
- **Problem**: Generated patterns are sensitive to initial conditions
- **Solution**: Train with multiple seeds; add perturbations during inference
### Computational Cost
- **Problem**: Graph-based NCA is slower than grid-based
- **Solution**: Use sparse matrix operations; parallelize across cells; consider GPU acceleration
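As a sketch of the sparse-matrix approach, the per-cell neighbor loop can be replaced by a single sparse matrix product. This uses `scipy.sparse`, an additional dependency not listed in the prerequisites, and a random adjacency matrix as a stand-in for the real topology:

```python
import numpy as np
import scipy.sparse as sp

n_cells, state_dim = 1000, 16
rng = np.random.default_rng(0)

# Random sparse weighted adjacency as a stand-in for the NCA topology,
# row-normalized so each row computes a weighted neighbor mean
A = sp.random(n_cells, n_cells, density=0.01, random_state=0, format='csr')
row_sums = np.asarray(A.sum(axis=1)).ravel()
row_sums[row_sums == 0] = 1.0  # avoid division by zero for isolated cells
A = sp.diags(1.0 / row_sums) @ A

states = rng.standard_normal((n_cells, state_dim))

# All cells' weighted neighbor means in one sparse matmul,
# replacing the O(n_cells) Python loop in the update step
neighbor_means = A @ states
print(neighbor_means.shape)
```

The same pattern carries over to PyTorch (`torch.sparse.mm`) for GPU acceleration, since the perception and update networks already operate on all cells in a single batch.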
## Related Skills
- brain-inspired-neural-cellular-automata: General brain-inspired NCA framework
- developmental-minimal-neural-circuits: Developmental neural circuit generation
- neuro-inspired-attention-mechanisms: Brain-inspired attention for neural networks
## References
```bibtex
@article{piolopez2026brainca,
title={BraiNCA: brain-inspired neural cellular automata and applications to morphogenesis and motor control},
author={Pio-Lopez, Léo and Hartl, Benedikt and Levin, Michael and others},
journal={arXiv preprint arXiv:2604.01932},
year={2026}
}
```