---
name: l-system-neural-network-evolution
description: "L-System genetic encoding methodology for scalable neural network evolution. Uses Lindenmayer system grammar to encode neural networks, enabling compact representation and efficient evolutionary search. Applies to: neuroevolution, scalable network encoding, genetic algorithms, neural architecture search. Activation: L-system neural encoding, Lindenmayer neuroevolution, genetic network encoding, scalable neural evolution, grammar-based NAS."
---

# L-System Genetic Encoding for Neural Network Evolution

> Uses Lindenmayer system (L-System) grammar as a compact genetic encoding for evolving neural networks, enabling scalable architecture search beyond direct matrix encoding.

## Metadata

- **Source**: arXiv:2604.22000
- **Authors**: Alexander Stuy, Nodin Weddington
- **Published**: 2026-04-XX
- **Category**: cs.NE

## Core Methodology

### Key Innovation

Encodes neural network architectures using **L-System grammars** — formal rewriting systems originally developed for modeling plant growth — rather than direct weight matrices. This provides:

1. **Compact Representation**: Complex networks from short grammar rules
2. **Modularity**: Repeated structural patterns naturally emerge
3. **Scalability**: Genome size grows sub-linearly with network size
4. **Regularity**: Captures the repeating patterns common in biological and artificial networks

### L-System Basics

- **Axiom**: Initial string (starting point)
- **Production Rules**: Rewriting rules applied iteratively
- **Interpretation**: Final string decoded into network architecture

### Comparison with Direct Encoding

- Direct matrix encoding: O(N²) genome size for N-neuron networks
- L-System encoding: O(log N) or O(√N) for regular architectures
- Better suited for evolving large, structured networks

## Technical Framework

### Encoding Pipeline

1. **Grammar Definition**: Production rules → architectural patterns
2. **Derivation**: Apply rules iteratively from the axiom
3. **Decoding**: Interpret the derived string as network connectivity
4. **Evaluation**: Train and test the decoded network
5. **Selection**: Evolve grammar rules based on fitness

### Rule Design Principles

- Context-free rules for simple patterns
- Context-sensitive rules for conditional growth
- Stochastic rules for diversity
- Parameterized rules for continuous variation

## Applications

- Evolving large-scale neural networks efficiently
- Neural architecture search with grammar-based encoding
- Bio-inspired network topology evolution
- Modular neural network discovery

## Pitfalls

- Grammar design requires domain knowledge
- The decoding step adds computational overhead
- Not all architectures can be compactly represented
- The fitness landscape may be rugged under grammar encoding
- May struggle with irregular, non-modular architectures

## Related Skills

- evolutionary-snn-classifier
- snn-universal-approximation-theory
- developmental-minimal-neural-circuits
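## Example: Derivation and Decoding

The L-System basics and the first three pipeline steps (grammar, derivation, decoding) can be sketched as follows. This is a minimal toy, not the paper's actual encoding: the alphabet (`N` = neuron, `[` / `]` = branch push/pop) and the single production rule are illustrative assumptions.

```python
# Minimal L-System derivation and decoding sketch.
# Alphabet and rule are illustrative assumptions, not the paper's encoding.

def derive(axiom, rules, iterations):
    """Iteratively rewrite the axiom using context-free production rules."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def decode(genome):
    """Interpret the derived string as connectivity:
    'N' adds a neuron wired to the previous one; '[' / ']' push/pop
    a branch point, yielding a tree-structured topology."""
    edges, stack, prev, next_id = [], [], None, 0
    for ch in genome:
        if ch == "N":
            if prev is not None:
                edges.append((prev, next_id))
            prev = next_id
            next_id += 1
        elif ch == "[":
            stack.append(prev)
        elif ch == "]":
            prev = stack.pop()
    return next_id, edges

rules = {"N": "N[N]N"}          # one neuron grows into a branching motif
genome = derive("N", rules, 3)  # three rewriting steps
n_neurons, edges = decode(genome)
print(len(genome), n_neurons, n_neurons and len(edges))
```

Note how the genome that evolution manipulates is the fixed-size rule set, while each extra derivation step roughly triples the decoded network — the source of the sub-linear genome growth claimed above.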
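## Example: Genome-Size Comparison

The direct-vs-grammar comparison can be made concrete with a toy calculation. The convention that L-System genome size equals the total length of the rule strings, and the specific rule used, are assumptions for illustration.

```python
# Toy genome-size comparison: direct matrix encoding vs. an L-System
# whose genome is just its production rules (illustrative convention).

def direct_genome_size(n):
    """Direct encoding: one gene per entry of the N x N weight matrix."""
    return n * n

def lsystem_genome_size(rules):
    """Grammar encoding: genome is the rule strings themselves."""
    return sum(len(lhs) + len(rhs) for lhs, rhs in rules.items())

rules = {"N": "N[N]N"}
# The same 6-symbol genome derives networks of 3, 9, 27, ... neurons,
# while the direct genome needs 9, 81, 729, ... genes.
for n in (3, 9, 27):
    print(n, direct_genome_size(n), lsystem_genome_size(rules))
```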
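## Example: Selection over Grammars

Pipeline steps 4–5 (evaluation and selection) can be sketched as a tiny evolutionary loop over the rule's right-hand side. Everything here is hypothetical scaffolding: the fitness function (distance from a target neuron count, standing in for trained-network accuracy), the mutation operators, and the population settings are assumptions, not the paper's method.

```python
# Toy evolutionary loop over L-System rule strings (illustrative only).
import random

random.seed(0)
SYMBOLS = "N[]"

def derive(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def balanced(s):
    """Reject rule strings whose brackets cannot be decoded."""
    depth = 0
    for ch in s:
        depth += {"[": 1, "]": -1}.get(ch, 0)
        if depth < 0:
            return False
    return depth == 0

def fitness(rhs, target=20):
    """Hypothetical fitness: closeness of the decoded neuron count to a
    target (a stand-in for training and testing the decoded network)."""
    if not balanced(rhs):
        return float("-inf")
    genome = derive("N", {"N": rhs}, 3)
    return -abs(genome.count("N") - target)

def mutate(rhs):
    i = random.randrange(len(rhs))
    if random.random() < 0.5:
        return rhs[:i] + random.choice(SYMBOLS) + rhs[i:]      # insertion
    return rhs[:i] + random.choice(SYMBOLS) + rhs[i + 1:]      # substitution

pop = ["NN"] * 8
for _ in range(30):
    pop = sorted(pop, key=fitness, reverse=True)[:4]           # selection
    pop = pop + [mutate(p) for p in pop]                       # variation
best = max(pop, key=fitness)
```

Because survivors are carried over unchanged (elitism), the best fitness never decreases across generations; note that evaluating each candidate requires a full derive-and-decode pass, which is the decoding overhead listed under Pitfalls.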