---
name: contraction-theory-control-optimization
description: "Contraction theory framework for robust control, optimization, and neural computation. Provides unified geometric analysis of dynamical system convergence, controller design, and learning algorithms. Activation: contraction theory, robust control, dynamical systems convergence, contraction metric, exponential stability analysis."
---
# Contraction Theory for Control and Optimization
Contraction theory provides a unified geometric framework for analyzing convergence, designing controllers, and understanding optimization algorithms through the lens of differential stability.
## Core Concept
A dynamical system is **contracting** if all trajectories converge exponentially to each other, regardless of initial conditions.
### Mathematical Definition
For a system ẋ = f(x,t), if there exists a metric M(x,t) such that:
```
(∂f/∂x)ᵀM + M(∂f/∂x) + Ṁ ≤ -2λM
```
for some λ > 0, then the system is contracting with rate λ.
### Key Property
**Incremental Stability**: If a system is contracting, then:
```
||x₁(t) - x₂(t)|| ≤ e^(-λt) ||x₁(0) - x₂(0)||
```
This implies:
- Global exponential convergence to a unique equilibrium
- Robustness to bounded disturbances
- Entrainment to periodic inputs
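The bound above can be checked numerically. A minimal sketch, assuming a scalar system ẋ = -λx + sin(t) (not from the original text) integrated by forward Euler:

```python
import numpy as np

# Sketch (assumed example): the contracting scalar system
# ẋ = -λx + sin(t) with λ = 1, integrated by forward Euler.
# Two trajectories with the same input but different initial
# conditions satisfy |x₁(t) - x₂(t)| ≤ e^(-λt)|x₁(0) - x₂(0)|.
lam, dt, steps = 1.0, 1e-3, 3000
x1, x2 = 5.0, -5.0
d0 = abs(x1 - x2)
for k in range(steps):
    t = k * dt
    x1 += dt * (-lam * x1 + np.sin(t))
    x2 += dt * (-lam * x2 + np.sin(t))
T = steps * dt
print(abs(x1 - x2), d0 * np.exp(-lam * T))  # observed distance ≤ bound
```

Note that the input sin(t) drops out of the difference dynamics entirely, which is why the bound holds for any shared input.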
## Contraction Analysis
### 1. Finding Contraction Metrics
**Approach 1: Constant Metric**
Try M = I (the identity matrix) or a constant diagonal matrix.
**Approach 2: State-Dependent Metric**
Use M(x) that captures system geometry.
**Approach 3: Sum-of-Squares (SOS)**
For polynomial systems, use SOS programming to find M.
### 2. Contraction Conditions
| System Type | Contraction Condition |
|-------------|----------------------|
| Linear: ẋ = Ax | A is Hurwitz (eigenvalues in LHP) |
| Nonlinear: ẋ = f(x) | Symmetric part of Jacobian ∂f/∂x is uniformly negative definite |
| Time-varying | Metric M(t) compensates for time variation |
| Stochastic | Mean-square contraction |
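The linear and nonlinear rows of the table differ in an important way: a Hurwitz A guarantees contraction in *some* constant metric, but not necessarily in the identity metric. A small numeric illustration (the matrix A is an assumed example):

```python
import numpy as np

# Sketch: ẋ = Ax contracts in the identity metric iff the symmetric
# part of A is negative definite, whereas exponential stability only
# needs A Hurwitz. This A is Hurwitz but does not contract with M = I;
# a nonidentity metric (from the Lyapunov equation) is needed.
A = np.array([[-1.0, 5.0],
              [0.0, -1.0]])
eigs_A = np.linalg.eigvals(A)                  # Hurwitz check
eigs_sym = np.linalg.eigvalsh((A + A.T) / 2)   # identity-metric check
print(np.all(eigs_A.real < 0))    # Hurwitz: yes
print(np.all(eigs_sym < 0))       # contracting with M = I: no
```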
## Applications
### 1. Controller Design
**Contraction-based Control**:
Design controller u such that the closed-loop system is contracting.
```python
# Contracting tracking controller (assumed scalar sketch).
# System ẋ = sin(x) + u, reference r(t) = sin(t). The feedback
# u = ṙ - sin(r) - k(x - r) makes the closed-loop Jacobian
# cos(x) - k ≤ 1 - k < 0 for k > 1: contraction at rate k - 1.
import numpy as np
k, dt, x = 3.0, 1e-3, 4.0
for i in range(5000):
    t = i * dt
    u = np.cos(t) - np.sin(np.sin(t)) - k * (x - np.sin(t))
    x += dt * (np.sin(x) + u)
# x now tracks sin(5.0) closely
```
**Key Techniques**:
- Virtual contraction analysis
- Feedback linearization with contraction guarantees
- Passivity-based control
- Sliding mode with contraction properties
### 2. Neural Network Analysis
**Training Dynamics**:
View gradient descent as a dynamical system:
```
θ̇ = -∇L(θ)
```
If the loss is strongly convex, this flow is contracting, and gradient descent converges exponentially.
**Neural ODEs**:
For continuous-depth networks:
```
ẋ = f(x, θ, t)
```
Contraction ensures:
- Stable forward propagation
- Well-behaved backpropagation
- Robustness to perturbations
**Key Results**:
- Contracting RNNs avoid vanishing/exploding gradients
- Contracting layers provide implicit regularization
- Stable architectures through contraction constraints
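One common way to impose such a contraction constraint is spectral rescaling of the recurrent weights. A minimal sketch, assuming a vanilla RNN cell h⁺ = tanh(Wh + Ux) with random weights (all names here are illustrative):

```python
import numpy as np

# Sketch: since tanh is 1-Lipschitz, rescaling W to spectral norm
# γ < 1 makes the hidden-state map a contraction with factor γ.
# Hidden states from different initializations then converge, and
# Jacobian products along time stay bounded (no exploding gradients).
rng = np.random.default_rng(0)
gamma = 0.9
W = rng.normal(size=(8, 8))
W *= gamma / np.linalg.norm(W, 2)        # enforce ||W||₂ = γ
U = rng.normal(size=(8, 8))
x = rng.normal(size=8)                   # shared constant input
h1, h2 = rng.normal(size=8), rng.normal(size=8)
d0 = np.linalg.norm(h1 - h2)
for _ in range(20):
    h1 = np.tanh(W @ h1 + U @ x)
    h2 = np.tanh(W @ h2 + U @ x)
print(np.linalg.norm(h1 - h2))           # ≤ γ²⁰ · d0
```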
### 3. Optimization Algorithms
**Gradient Descent as Contraction**:
```
x_{k+1} = x_k - α∇f(x_k)
```
For μ-strongly convex f with L-Lipschitz gradient:
- GD is contracting if 0 < α < 2/L
- Per-step contraction factor: max(|1 − αμ|, |1 − αL|), which equals 1 − αμ for α ≤ 2/(μ + L)
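The per-step factor 1 − αμ can be verified directly on a strongly convex quadratic (an assumed toy example, with ∇f(x) = Hx):

```python
import numpy as np

# Sketch: verify the per-step factor 1 - αμ for gradient descent on
# f(x) = ½ xᵀHx, where μ and L are the extreme eigenvalues of H.
H = np.diag([1.0, 10.0])          # μ = 1, L = 10
mu, L_ = 1.0, 10.0
alpha = 1.0 / L_                  # α ≤ 1/L, so the factor is 0.9
x, y = np.array([3.0, -2.0]), np.array([-1.0, 4.0])
for _ in range(50):
    d = np.linalg.norm(x - y)
    x = x - alpha * H @ x
    y = y - alpha * H @ y
    assert np.linalg.norm(x - y) <= (1 - alpha * mu) * d + 1e-12
print(np.linalg.norm(x - y))      # shrunk by at least 0.9⁵⁰
```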
**Accelerated Methods**:
- Momentum methods can be viewed as contracting systems in extended state space
- Nesterov acceleration through time-varying metrics
**Distributed Optimization**:
- Consensus algorithms as contracting systems
- Gradient tracking with contraction guarantees
- Robustness to network topology changes
### 4. Multi-Agent Systems
**Distributed Contraction**:
For networked systems with coupling:
```
ẋ_i = f_i(x_i) + Σ_j a_ij h(x_j - x_i)
```
If individual systems are contracting and coupling is cooperative, the network contracts.
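A minimal sketch of this statement, assuming two agents with uniformly contracting individual dynamics f(x) = -x³ - x (Jacobian -3x² - 1 ≤ -1) and diffusive coupling:

```python
import numpy as np

# Sketch: two diffusively coupled agents ẋᵢ = -xᵢ³ - xᵢ + k(xⱼ - xᵢ).
# Each agent alone contracts at rate ≥ 1; cooperative coupling adds
# 2k, so the disagreement decays at least like e^(-(1+2k)t).
k, dt, steps = 2.0, 1e-3, 2000
x = np.array([2.0, -1.5])
d0 = abs(x[0] - x[1])
Lap = np.array([[1.0, -1.0], [-1.0, 1.0]])   # graph Laplacian
for _ in range(steps):
    x += dt * (-x**3 - x - k * Lap @ x)
T = steps * dt
print(abs(x[0] - x[1]))                 # observed disagreement
print(d0 * np.exp(-(1 + 2 * k) * T))    # contraction bound
```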
**Applications**:
- Synchronization (Kuramoto oscillators)
- Formation control
- Distributed estimation
- Flocking behavior
## Advanced Topics
### 1. Time-Varying Contraction Metrics
When the natural metric changes with time or state:
```
M(x,t) = M₀ + M₁(x,t)
```
**Use Cases**:
- Systems with multiple operating points
- Adaptive control
- Online learning
### 2. Partial Contraction
When only part of the state space needs to contract:
- Hierarchical systems
- Systems with symmetries
- Reduced-order models
### 3. Stochastic Contraction
For systems with noise:
```
dx = f(x,t)dt + σ(x,t)dW
```
**Mean-Square Contraction**:
```
dE[δxᵀMδx]/dt ≤ -2λE[δxᵀMδx]
```
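For additive noise, mean-square contraction is easy to observe empirically: two solutions driven by the same Brownian path have a deterministic difference. A sketch with an assumed Ornstein-Uhlenbeck example dx = -λx dt + σ dW, simulated by Euler-Maruyama:

```python
import numpy as np

# Sketch: with common additive noise, the difference of two OU
# solutions obeys δẋ = -λ δx, so E[δx²] contracts at rate 2λ.
rng = np.random.default_rng(1)
lam, sigma, dt, steps, paths = 2.0, 0.5, 1e-3, 1000, 200
x1 = np.full(paths, 3.0)
x2 = np.full(paths, -3.0)
for _ in range(steps):
    dW = rng.normal(scale=np.sqrt(dt), size=paths)  # shared noise
    x1 += -lam * x1 * dt + sigma * dW
    x2 += -lam * x2 * dt + sigma * dW
T = steps * dt
print(np.mean((x1 - x2) ** 2))           # observed E[δx²]
print(36.0 * np.exp(-2 * lam * T))       # bound e^(-2λT)·E[δx(0)²]
```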
### 4. Contraction on Riemannian Manifolds
For systems evolving on curved spaces:
- Robotics (SO(3), SE(3))
- Quantum systems
- Information geometry
## Practical Tools
### Numerical Verification
```python
import numpy as np

def check_contraction(f, x_grid, t=0.0, eps=1e-6):
    """
    Numerically check the contraction condition with M = I:
    the symmetric part of the Jacobian ∂f/∂x must be uniformly
    negative definite over the sampled points.

    Args:
        f: Vector field function f(x, t) -> ndarray
        x_grid: Iterable of state points to check
        t: Time
        eps: Finite-difference step

    Returns:
        (is_contracting, contraction_rate): rate is the worst-case
        -λmax over the grid; positive means contracting there.
    """
    rate = np.inf
    for x in x_grid:
        x = np.asarray(x, dtype=float)
        n = x.size
        J = np.empty((n, n))
        for i in range(n):            # central-difference Jacobian
            dx = np.zeros(n)
            dx[i] = eps
            J[:, i] = (f(x + dx, t) - f(x - dx, t)) / (2 * eps)
        lam_max = np.max(np.linalg.eigvalsh((J + J.T) / 2))
        rate = min(rate, -lam_max)
    return rate > 0, rate
```
### SOS Programming
For polynomial systems, use Sum-of-Squares programming to search for a polynomial metric M(x) (via an SOS toolbox); the constant-metric special case is a plain LMI that CVXPY can solve:
```python
import cvxpy as cp
import numpy as np
# Constant-metric LMI (degree-0 SOS): find M ≻ 0 with AᵀM + MA ⪯ -2λM
A = np.array([[-1.0, 5.0], [0.0, -1.0]]); lam = 0.2
M = cp.Variable((2, 2), symmetric=True)
cons = [M >> np.eye(2), A.T @ M + M @ A << -2 * lam * M]
cp.Problem(cp.Minimize(cp.trace(M)), cons).solve()
# M.value is a valid constant contraction metric if feasible
```
## Connections to Other Frameworks
| Framework | Connection to Contraction |
|-----------|--------------------------|
| Lyapunov | Contraction is incremental Lyapunov |
| Passivity | Contracting systems are output strictly passive |
| ISS | Contraction implies ISS |
| Gradient Flows | Gradient flows of strongly convex functions are contracting |
| Mirror Descent | Natural gradient as contraction in dual space |
## Implementation Guidelines
1. **Start Simple**: Try constant metrics first
2. **Use Symmetries**: Exploit system structure
3. **Numerical Verification**: Check contraction computationally
4. **Combine with Other Tools**: Use contraction with Lyapunov, passivity
5. **Consider Computational Cost**: Balance tightness with tractability
## References
- Bullo, F., Coogan, S., & Dall'Anese, E. (2025). Advances in Contraction Theory for Robust Optimization, Control, and Neural Computation.
- Lohmiller, W., & Slotine, J. J. (1998). On contraction analysis for non-linear systems.
- Manchester, I. R., & Slotine, J. J. (2017). Control contraction metrics: Convex and intrinsic criteria for nonlinear feedback design.
- Singh, S., et al. (2019). Learning stabilizable nonlinear dynamics with contraction-based regularization.
## Related Skills
- `ai-systems-engineering-v-model` - Systems engineering for AI
- `distributed-quantum-control-systems` - Quantum system control
- `complex-kuramoto-control` - Network synchronization
## Activation Keywords
- contraction-theory-control-optimization
- contraction theory control
- contraction theory control optimization
## Tools Used
- `read` - Read the skill documentation
- `write` - Create output
- `exec` - Run related commands
## Instructions for Agents
1. Understand the skill's core methodology
2. Give targeted answers to the user's question
3. Follow best practices
## Examples
### Example 1: Basic query
**User:** Please explain Contraction Theory Control Optimization
**Agent:** Contraction Theory Control Optimization is about...