Free SKILL.md scraped from GitHub. Clone the repo or copy the file directly into your Claude Code skills directory.
```
npx versuz@latest install hiyenwong-ai-collection-collection-skills-helamem-hebbian-learning-associative-memory-ll
```

Or install manually:

```
git clone https://github.com/hiyenwong/ai_collection.git
cp ai_collection/SKILL.MD ~/.claude/skills/hiyenwong-ai-collection-collection-skills-helamem-hebbian-learning-associative-memory-ll/SKILL.md
```

---
name: helamem-hebbian-learning-associative-memory-llm
description: "Long-term memory is a critical challenge for Large Language Model agents, as fixed context windows cannot preserve coherence across extended interactions. Existing memory systems represent conversation history as unstructured embedding vectors, retrieving information through semantic similarity. Activation: neuroscience, dynamics, cognitive, memory"
---

# HeLa-Mem: Hebbian Learning and Associative Memory for LLM Agents

## Overview

Long-term memory is a critical challenge for Large Language Model agents, as fixed context windows cannot preserve coherence across extended interactions. Existing memory systems represent conversation history as unstructured embedding vectors, retrieving information through semantic similarity. This paradigm fails to capture the associative structure of human memory, wherein related experiences progressively strengthen interconnections through repeated co-activation. Inspired by cognitive neuroscience, we identify three mechanisms central to biological memory: association, consolidation, and spreading activation, which remain largely absent in current research. To bridge this gap, we propose HeLa-Mem, a bio-inspired memory architecture that models memory as a dynamic graph with Hebbian learning dynamics. HeLa-Mem employs a dual-level organization: (1) an episodic memory graph that evolves through co-activation patterns, and (2) a semantic memory store populated via Hebbian Distillation, wherein a Reflective Agent identifies densely connected memory hubs and distills them into structured, reusable semantic knowledge.
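The episodic side of this dual-level design can be illustrated with a small sketch. This is my own toy illustration, not the paper's implementation: the class name, learning and decay rates, and memory labels are all hypothetical, and the update rule is just a standard bounded Hebbian strengthening with slow decay of unused edges.

```python
from collections import defaultdict
from itertools import combinations


class EpisodicMemoryGraph:
    """Toy episodic memory graph: nodes are memory items, and edge weights
    strengthen when two memories are active together, in the Hebbian spirit
    of "fire together, wire together"."""

    def __init__(self, learning_rate=0.1, decay=0.01):
        self.lr = learning_rate            # how fast co-activation strengthens an edge
        self.decay = decay                 # slow forgetting of unused associations
        self.weights = defaultdict(float)  # (a, b) -> association strength in [0, 1]

    def _key(self, a, b):
        return (a, b) if a <= b else (b, a)

    def co_activate(self, active_memories):
        """One Hebbian step: every pair of co-active memories gets a bounded
        weight increase; every other known edge decays slightly."""
        pairs = {self._key(a, b) for a, b in combinations(set(active_memories), 2)}
        for k in list(self.weights):
            if k not in pairs:
                self.weights[k] *= 1.0 - self.decay
        for k in pairs:
            w = self.weights[k]
            self.weights[k] = w + self.lr * (1.0 - w)  # saturates at 1.0

    def strength(self, a, b):
        return self.weights[self._key(a, b)]


g = EpisodicMemoryGraph()
for _ in range(5):
    g.co_activate(["trip_to_paris", "louvre_visit", "croissant"])
print(round(g.strength("trip_to_paris", "louvre_visit"), 3))  # prints 0.41
```

The bounded update keeps weights in [0, 1] so repeated co-activation strengthens but never explodes, while the decay term gives a crude analogue of consolidation-by-forgetting; the real system presumably makes both choices far more carefully.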
This dual-path design leverages both semantic similarity and learned associations, mirroring the episodic-semantic distinction in human cognition. Experiments on LoCoMo demonstrate superior performance across four question categories while using significantly fewer context tokens. Code is available on GitHub: https://github.com/ReinerBRO/HeLa-Mem

## Source Paper

- **Title:** HeLa-Mem: Hebbian Learning and Associative Memory for LLM Agents
- **Authors:** Jinchang Zhu, Jindong Li, Cheng Zhang et al.
- **arXiv:** [2604.16839v1](https://arxiv.org/abs/2604.16839v1)
- **Published:** 2026-04-18
- **Categories:** cs.CL
- **PDF:** [Download](https://arxiv.org/pdf/2604.16839v1)

## Key Contributions

Based on the abstract, this paper makes the following contributions:

1. **A bio-inspired memory architecture** (HeLa-Mem) that models agent memory as a dynamic graph governed by Hebbian learning dynamics
2. **A dual-level organization** pairing an episodic memory graph, which evolves through co-activation, with a semantic memory store populated via Hebbian Distillation
3. **An evaluation** on LoCoMo showing superior performance across four question categories while using significantly fewer context tokens

## Core Concepts

### Methodology

Long-term memory is a critical challenge for Large Language Model agents, as fixed context windows cannot preserve coherence across extended interactions. Existing memory systems represent conversation history as unstructured embedding vectors, retrieving information through semantic similarity. This paradigm fails to capture the associative structure of human memory, wherein related experiences progressively strengthen interconnections through repeated co-activation.
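How learned associations can rescue what similarity-only retrieval misses is easiest to see in a spreading-activation sketch. This is a generic textbook-style illustration, not the paper's algorithm: the graph, seed scores, damping factor, and memory names are all hypothetical.

```python
def spread_activation(graph, seeds, hops=2, damping=0.5):
    """Toy spreading activation over a weighted association graph.

    `seeds` maps memories (e.g. top semantic-similarity hits) to initial
    activation; activation then propagates along association edges, so
    strongly linked memories can surface without any textual overlap."""
    activation = dict(seeds)
    frontier = dict(seeds)
    for _ in range(hops):
        nxt = {}
        for node, act in frontier.items():
            for neighbor, weight in graph.get(node, {}).items():
                nxt[neighbor] = nxt.get(neighbor, 0.0) + damping * act * weight
        for node, act in nxt.items():
            activation[node] = activation.get(node, 0.0) + act
        frontier = nxt
    return sorted(activation.items(), key=lambda kv: -kv[1])


# Hypothetical association weights, as a Hebbian graph might have learned them.
assoc = {
    "louvre_visit": {"trip_to_paris": 0.8, "mona_lisa": 0.6},
    "trip_to_paris": {"louvre_visit": 0.8, "croissant": 0.4},
}
ranked = spread_activation(assoc, seeds={"louvre_visit": 1.0})
# "croissant" receives activation despite sharing no words with the seed
```

The damping factor bounds how far activation travels, which is the usual way spreading-activation models keep multi-hop retrieval from flooding the whole graph.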
Inspired by cognitive neuroscience, the paper identifies three mechanisms central to biological memory, namely association, consolidation, and spreading activation, and builds each into the HeLa-Mem architecture.

### Technical Details

- Memory is modeled as a dynamic graph whose association edges strengthen through Hebbian co-activation
- A Reflective Agent performs Hebbian Distillation, identifying densely connected memory hubs and distilling them into structured, reusable semantic knowledge
- Retrieval follows a dual path, combining semantic similarity with learned associations

## Practical Applications

### Application Area

This research has implications for:

- Long-horizon conversational agents that must stay coherent beyond a fixed context window
- Memory-augmented LLM pipelines and retrieval systems
- Computational modeling of human memory
- AI systems inspired by neuroscience

### Implementation Considerations

Key implementation aspects:

1. Representing conversation history as nodes in an episodic memory graph
2. Choosing Hebbian update (and decay) rates for association weights
3. Scheduling the Reflective Agent's distillation passes and evaluating on long-conversation benchmarks such as LoCoMo

## Related Work

This work builds on existing research in:

- Memory systems for LLM agents
- Hebbian learning and associative memory models
- Brain-inspired AI architectures

## References

- Jinchang Zhu, Jindong Li, Cheng Zhang et al. (2026). "HeLa-Mem: Hebbian Learning and Associative Memory for LLM Agents." arXiv:2604.16839v1.

## Activation Keywords

neuroscience, dynamics, cognitive, memory