---
name: dmosopt-neural-dynamical-systems
description: "DMOSOPT: Joint surrogate learning framework for multi-objective optimization of neural dynamical systems. Learns objectives, constraints, and sensitivities simultaneously. Activation: multi-objective optimization, neural dynamics, surrogate model, parameter optimization"
---

# DMOSOPT: Multi-Objective Optimization for Neural Dynamical Systems

> A scalable optimization framework using jointly learned surrogate models to capture the interplay between objectives, constraints, and parameter sensitivities in biophysical neural simulations.

## Metadata

- **Source**: arXiv:2603.20984
- **Authors**: [Paper authors]
- **Published**: 2026-03
- **Categories**: cs.LG

## Core Methodology

### Key Innovation

DMOSOPT addresses the challenge of optimizing high-dimensional neural system parameters under numerous constraints. It learns a unified surrogate model that simultaneously approximates objectives, feasibility boundaries, and parameter sensitivities, providing gradient signals where traditional methods fail.

### Technical Framework

#### Joint Surrogate Learning

1. **Objective landscape approximation**: Smooth surrogate for the objective functions
2. **Feasibility boundary learning**: Constraint-satisfaction prediction
3. **Sensitivity estimation**: Per-parameter gradient information
4. **Unified gradient**: Steers the search toward improved objectives AND constraint satisfaction

#### Optimization Strategy

- **Binary feasible/infeasible partition**: Handles hard constraints
- **Targeted exploration**: Uses sensitivity estimates for intelligent parameter updates
- **Supercomputing scale**: Validated on large-scale neural circuit models
- **Fewer evaluations**: Reaches comparable solutions with substantially fewer simulator evaluations

### Application Scope

#### Neural System Optimization

- Single-cell dynamics parameter fitting
- Population-level network activity tuning
- Multi-scale neural circuit modeling
- Biophysical model calibration

#### General Applicability

While demonstrated on neuroscience problems, the approach applies to any constrained multi-objective optimization task in scientific and engineering domains.

## Implementation Guide

### Prerequisites

- Simulation framework for the target dynamical system
- Multi-objective optimization library
- Surrogate modeling capabilities (Gaussian processes, neural networks)

### Workflow

1. **Define objectives**: Quantify what to optimize
2. **Specify constraints**: Identify hard and soft constraints
3. **Initial sampling**: Generate training data for the surrogate
4. **Joint learning**: Train the unified surrogate model
5. **Guided search**: Use unified gradients and sensitivities to propose new parameters
6. **Iterate**: Update the surrogate and continue the optimization

### Key Considerations

- High-dimensional parameter spaces require sufficient initial sampling
- Accurate constraint-boundary learning is critical for feasibility
- Sensitivity estimates enable efficient exploration
- Surrogate accuracy bounds final solution quality

## Applications

- Neural model parameter fitting
- Brain network dynamics optimization
- Biophysical simulation calibration
- Control policy optimization
- Engineering design under constraints

## Pitfalls

- Surrogate accuracy depends on training-data quality
- High-dimensional spaces may require many initial evaluations
- Constraint learning can be difficult for complex feasibility regions
- Surrogate training itself becomes costly at scale

## Related Skills

- learning-neuron-dynamics-deep-snn
- neural-dynamics-universal-translator
- snn-learning-survey
- optimization-neural-networks
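## Example Sketches

The joint-surrogate idea above (one model answering objective, feasibility, and sensitivity queries) can be illustrated with a minimal sketch. This is **not** the DMOSOPT implementation: it substitutes a radial-basis-function interpolant for the objective, a nearest-neighbour classifier for the feasibility partition, and finite differences for the sensitivities, and the class and method names are invented for illustration.

```python
import numpy as np

class JointSurrogate:
    """Toy joint surrogate: RBF interpolant for the objective, a
    nearest-neighbour feasibility predictor, and finite-difference
    sensitivities. Illustrative only, not the DMOSOPT model."""

    def __init__(self, length_scale=0.3):
        self.ls = length_scale

    def fit(self, X, y, feasible):
        self.X = np.asarray(X, float)          # sampled parameters, shape (n, d)
        self.y = np.asarray(y, float)          # objective values, shape (n,)
        self.feasible = np.asarray(feasible, bool)
        # Solve for RBF interpolation weights (small jitter for stability).
        K = self._kernel(self.X, self.X) + 1e-8 * np.eye(len(self.X))
        self.w = np.linalg.solve(K, self.y)
        return self

    def _kernel(self, A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.ls ** 2))

    def objective(self, x):
        """Surrogate prediction of the objective at x."""
        return float(self._kernel(np.atleast_2d(x), self.X) @ self.w)

    def is_feasible(self, x):
        """Binary feasible/infeasible prediction from the nearest sample."""
        i = np.argmin(((self.X - x) ** 2).sum(1))
        return bool(self.feasible[i])

    def sensitivity(self, x, eps=1e-4):
        """Cheap per-parameter gradient of the surrogate objective."""
        x = np.asarray(x, float)
        g = np.zeros_like(x)
        for j in range(x.size):
            e = np.zeros_like(x)
            e[j] = eps
            g[j] = (self.objective(x + e) - self.objective(x - e)) / (2 * eps)
        return g
```

Because the surrogate is cheap to query, the search can afford many objective, feasibility, and gradient evaluations per expensive simulator call, which is the central economy the framework exploits.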
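The six-step workflow can likewise be condensed into a toy surrogate-assisted loop. This is a sketch under simplifying assumptions, not the paper's guided search: it uses a kernel-weighted average as the joint surrogate and random candidate screening in place of gradient-guided proposals, and every name in it is hypothetical.

```python
import numpy as np

def toy_surrogate_loop(simulate, constraint_ok, bounds,
                       n_init=20, n_iter=15, seed=0):
    """Toy surrogate-assisted optimization loop (illustrative only):
    steps 1-3: sample the expensive simulator for initial training data;
    step 4:    fit a joint surrogate (objective + feasibility);
    step 5:    screen many cheap candidates on the surrogate;
    step 6:    evaluate the best predicted-feasible one and refit."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    X = rng.uniform(lo, hi, (n_init, lo.size))      # initial sampling
    y = np.array([simulate(x) for x in X])          # expensive evaluations
    feas = np.array([constraint_ok(x) for x in X])
    for _ in range(n_iter):
        def surrogate(x):
            # Kernel-weighted average of observed objectives and feasibility.
            d2 = ((X - x) ** 2).sum(1)
            w = np.exp(-d2 / 0.1)
            w /= w.sum()
            return w @ y, (w @ feas) > 0.5
        cand = rng.uniform(lo, hi, (256, lo.size))  # cheap screening set
        scored = [(surrogate(c), c) for c in cand]
        ok = [(s, c) for (s, f), c in scored if f]
        if not ok:                                   # nothing predicted feasible
            continue
        x_next = min(ok, key=lambda t: t[0])[1]
        X = np.vstack([X, x_next])                   # evaluate and refit
        y = np.append(y, simulate(x_next))
        feas = np.append(feas, constraint_ok(x_next))
    best = np.argmin(np.where(feas, y, np.inf))
    return X[best], y[best]
```

The design point this sketch mirrors is the evaluation budget: the true simulator is called once per iteration, while the surrogate absorbs the hundreds of candidate queries.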