---
name: kernel-hopfield-associative-memory
description: "Kernel Hopfield networks: geometric analysis of attractor boundaries and storage capacity limits. KLR-trained associative memories with P/N ~16 for random sequences and ~20 for structured data. Trigger words: kernel Hopfield, associative memory, KLR, attractor basin, storage capacity, kernel logistic regression."
category: neuroscience
---
# Kernel Hopfield Associative Memory Networks
Skill based on arXiv:2605.00366v1 - Geometric analysis of attractor boundaries and storage capacity limits in kernel Hopfield networks.
## Core Concept
Associative memories trained with Kernel Logistic Regression (KLR) achieve storage capacities far beyond the classical Hopfield bound. The paper investigates the **global geometry of attractor basins** and the **physical determinants of storage limits** in KLR-trained Hopfield networks.
## Key Findings
### Storage Capacity
- **Random sequences**: P/N ≈ 16 (patterns per neuron)
- **Structured data (CIFAR-10)**: Effective load near P/N ≈ 20
- Both figures far exceed the classical Hopfield limit of P/N ≈ 0.14
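For concreteness: at N = 1,000 units, P/N ≈ 16 corresponds to roughly 16,000 storable random patterns, where the classical ≈ 0.14 bound would allow only about 140.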
### Attractor Geometry
- **"Ridge of Optimization"**: Attractors separated by sharp, phase-transition-like boundaries
- Morphing analysis reveals:
  - Well-defined basin boundaries
  - Transition regions between attractors
  - Capacity limit corresponds to boundary collapse
### Analysis Methods
1. **Empirical evaluation**: Random sequences + real-world image embeddings
2. **Phenomenological morphing**: Interpolate between stored patterns
3. **Statistical SNR analysis**: Signal-to-Noise Ratio of pattern retrieval
## Mathematical Framework
### Kernel Logistic Regression Training
```
E(x) = -Σᵢ αᵢ K(x, ξᵢ)
```
where:
- K(x, ξᵢ): Kernel function
- ξᵢ: Stored patterns (memories)
- αᵢ: Learned weights from logistic regression
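A minimal numpy sketch of this energy, assuming an RBF kernel (`gamma`, `alpha`, and `xi` are illustrative names, not the paper's notation):
```python
import numpy as np

def energy(x, alpha, xi, gamma=1.0):
    # E(x) = -sum_i alpha_i * K(x, xi_i), here with an RBF kernel
    # K(x, xi_i) = exp(-gamma * ||x - xi_i||^2)
    K = np.exp(-gamma * ((xi - x) ** 2).sum(axis=1))
    return -(alpha * K).sum()
```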
### Dynamics
```
dx/dt = -∇E(x)
```
- Energy landscape defined by kernel expansion
- Gradient flow converges to attractors
- Attractors correspond to stored patterns
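Discretizing the flow with an explicit Euler step gives a simple update rule; a sketch for the RBF energy above, with `dt` a hypothetical step size (the full retrieval loop appears under Implementation below):
```python
import numpy as np

def flow_step(x, alpha, xi, gamma=1.0, dt=0.05):
    # One explicit-Euler step of dx/dt = -grad E(x); for the RBF kernel,
    # grad E(x) = sum_i 2*gamma*alpha_i*K(x, xi_i)*(x - xi_i)
    K = np.exp(-gamma * ((xi - x) ** 2).sum(axis=1))
    grad = 2.0 * gamma * ((alpha * K)[:, None] * (x - xi)).sum(axis=0)
    return x - dt * grad
```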
### Kernel Choice
- RBF/Gaussian kernels for smooth landscapes
- Polynomial kernels for structured data
- Kernel bandwidth controls basin size
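A rough side-by-side of the two kernel families, with the bandwidth effect noted in comments (illustrative defaults, not the paper's settings):
```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Smaller gamma (wider bandwidth) -> flatter landscape, larger basins;
    # larger gamma -> sharper, more localized basins.
    return np.exp(-gamma * ((x - y) ** 2).sum())

def poly_kernel(x, y, degree=3, c=1.0):
    # Inhomogeneous polynomial kernel; degree shapes how correlations
    # in structured data are weighted.
    return (x @ y + c) ** degree
```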
## Attractor Basin Analysis
### Morphing Experiments
- Create interpolated states between two stored patterns
- Run dynamics from interpolated states
- Observe which attractor captures the state
- Map basin boundaries in pattern space
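A sketch of such a morphing sweep, assuming the `retrieve` routine from the Implementation section below; `n_points` and the nearest-pattern labeling are illustrative choices:
```python
import numpy as np

def morph_scan(xi_a, xi_b, alpha, patterns, gamma, n_points=21):
    # Interpolate between two stored patterns, run the dynamics from each
    # mixture, and record which attractor captures the state.
    labels = []
    for lam in np.linspace(0.0, 1.0, n_points):
        x0 = (1.0 - lam) * xi_a + lam * xi_b
        x = retrieve(x0, alpha, patterns, gamma)  # defined under Implementation
        labels.append(int(np.argmin(((patterns - x) ** 2).sum(axis=1))))
    return labels  # the lam where the label flips locates the basin boundary
```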
### Phase Transition at Capacity Limit
- Below capacity: Sharp basin boundaries
- Near capacity: Boundaries become irregular
- Above capacity: Boundaries collapse, retrieval fails
### SNR Analysis
- Signal: Component aligned with stored pattern
- Noise: Interference from other stored patterns
- Capacity limit: SNR drops below retrieval threshold
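A simple empirical estimator along these lines (a sketch assuming roughly unit-norm patterns, not necessarily the paper's exact definition):
```python
import numpy as np

def retrieval_snr(x, target, others):
    # Signal: overlap of the retrieved state with the intended pattern.
    # Noise: RMS crosstalk overlap with the other stored patterns.
    signal = abs(x @ target)
    noise = np.sqrt(np.mean((others @ x) ** 2))
    return signal / noise
```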
## Implementation
### Training and Retrieval
```python
import numpy as np

def rbf_matrix(X, Y, gamma=1.0):
    # Pairwise RBF kernel: K[i, j] = exp(-gamma * ||X_i - Y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def store_patterns(patterns, gamma=1.0, lam=1e-3):
    # Fit the expansion weights alpha in E(x) = -sum_i alpha_i K(x, xi_i).
    # A ridge-regularized linear solve stands in here for the full
    # iterative KLR fit described in the paper.
    K = rbf_matrix(patterns, patterns, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(K)), np.ones(len(K)))
    return alpha, patterns, gamma

def retrieve(cue, alpha, patterns, gamma, steps=100, lr=0.1):
    # Gradient descent on the kernel energy; for the RBF kernel,
    # grad E(x) = sum_i 2*gamma*alpha_i*K(x, xi_i)*(x - xi_i).
    x = cue.copy()
    for _ in range(steps):
        K = rbf_matrix(x[None, :], patterns, gamma)[0]
        grad = 2.0 * gamma * ((alpha * K)[:, None] * (x - patterns)).sum(axis=0)
        x -= lr * grad
    return x
```
### Parameters
- **Kernel type**: RBF, polynomial, linear
- **Kernel bandwidth**: Controls interaction range
- **Regularization**: Prevents overfitting
- **Steps**: Number of retrieval iterations
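A usage sketch tying the pieces together, with random patterns and a noisy cue (`gamma=0.05` is an illustrative bandwidth, not a tuned value):
```python
import numpy as np

rng = np.random.default_rng(0)
patterns = rng.standard_normal((10, 64))           # P = 10 patterns, N = 64 units
alpha, stored, gamma = store_patterns(patterns, gamma=0.05)

cue = patterns[3] + 0.3 * rng.standard_normal(64)  # noisy cue for pattern 3
x = retrieve(cue, alpha, stored, gamma, steps=200, lr=0.1)
print(np.argmin(((stored - x) ** 2).sum(axis=1)))  # should recover index 3
```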
## Applications
### Associative Memory
- Pattern completion from partial cues
- Error correction in noisy inputs
- Content-addressable memory
### Machine Learning
- Kernel-based classification
- Memory-augmented neural networks
- Few-shot learning via pattern storage
### Neuroscience Modeling
- Memory storage in biological networks
- Attractor dynamics in cortex
- Capacity limits of neural systems
## Comparison with Classical Hopfield
| Property | Classical Hopfield | Kernel Hopfield |
|----------|-------------------|-----------------|
| Capacity (P/N) | ~0.14 | ~16-20 |
| Energy Function | Quadratic | Kernel-based |
| Pattern Type | Binary | Continuous |
| Training | Hebbian | KLR optimization |
| Basin Geometry | Simple | Complex, analyzable |
| Kernel Flexibility | None | Multiple choices |
## References
- **Paper**: Geometric analysis of attractor boundaries and storage capacity limits in kernel Hopfield networks
- **Author**: Akira Tamamori
- **arXiv**: 2605.00366v1 [cs.NE]
- **Categories**: Neural and Evolutionary Computing (cs.NE)
- **Date**: May 1, 2026
## Related Skills
- hippocampal-replay-credit-assignment
- neuro-attractor-landscape-working-memory
- attractor-metadynamics-neural
- hippi-hippocampal-inspired-memory