---
name: large-adaptive-regularization-gaussian-graphical
description: "LARGE: Locally Adaptive Regularization for estimating Gaussian Graphical Models — improving brain network connectivity estimation via node-specific penalty tuning. Activation: Gaussian graphical model, GGM, graphical Lasso, GLASSO, brain connectivity, functional connectivity, precision matrix, adaptive regularization, network neuroscience."
---
# LARGE: Locally Adaptive Regularization for Gaussian Graphical Models
> A novel approach to estimating high-dimensional Gaussian Graphical Models (GGMs) that introduces locally adaptive regularization, overcoming the limitations of the single global penalty used by standard GLASSO in brain network analysis.
## Metadata
- **Source**: arXiv:2601.09686
- **Authors**: Ha Nguyen, Sumanta Basu
- **Published**: 2026-01-14
- **Categories**: stat.ME, stat.CO, stat.ML
## Core Methodology
### Key Innovation
Replaces the global ℓ₁-penalty in Graphical Lasso (GLASSO) with a locally adaptive regularization scheme. Each node/variable receives a penalty calibrated to its local connectivity structure, enabling more accurate recovery of sparse precision matrices in high-dimensional settings like brain functional connectivity networks.
### Technical Framework
1. **Standard GLASSO Limitation**: Global penalty λ treats all partial correlations equally — over-penalizes hub nodes, under-penalizes weak connections
2. **LARGE Penalty Design**: Node-specific penalties λⱼ derived from initial marginal correlation estimates
3. **Two-Stage Estimation**: (a) Obtain initial estimates via standard methods; (b) Refine with adaptive penalties
4. **Theoretical Guarantees**: Consistency results for precision matrix estimation under high-dimensional asymptotics
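The two-stage scheme above amounts to replacing the scalar λ in the GLASSO objective with entry-wise penalties. A plausible form of the objective (the notation here is a sketch, not quoted from the paper) is:

$$\hat{\Theta} = \arg\max_{\Theta \succ 0}\; \log\det\Theta - \operatorname{tr}(S\Theta) - \sum_{j \neq k} \lambda_{jk}\,|\theta_{jk}|, \qquad \lambda_{jk} = \tfrac{1}{2}(\lambda_j + \lambda_k)$$

where each node-specific λⱼ is derived from the initial marginal correlation estimates. When all λⱼ equal a common λ, this reduces to standard GLASSO.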
### Implementation Guide
#### Prerequisites
- Multivariate Gaussian distribution theory
- Convex optimization (coordinate descent)
- R or Python with optimization libraries
#### Step-by-Step
1. Compute sample covariance matrix S from observations
2. Obtain initial penalty weights from marginal correlations
3. Solve the LARGE objective: maximize log-likelihood with adaptive node-specific ℓ₁ penalties
4. Iterate until convergence of the precision matrix estimate
5. Threshold the estimated precision matrix to recover the graph structure
### Code Example
```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def large_penalty_weights(S, alpha=1.0):
    """Compute locally adaptive penalty weights from the sample covariance.

    Entries with a larger marginal association receive a smaller penalty,
    in the spirit of the adaptive lasso.
    """
    p = S.shape[0]
    weights = np.zeros((p, p))
    for j in range(p):
        for k in range(p):
            if j != k:
                # Small offset guards against division by zero
                weights[j, k] = alpha / (np.abs(S[j, k]) + 1e-6)
    return weights

def large_estimate(X, alpha=1.0, max_iter=200):
    """LARGE: locally adaptive regularization for GGM estimation (two-stage)."""
    S = np.cov(X, rowvar=False)
    # Stage 1: initial GLASSO estimate with a global penalty
    gl = GraphicalLasso(alpha=alpha, max_iter=max_iter)
    gl.fit(X)
    # Stage 2: compute adaptive, entry-wise penalty weights
    weights = large_penalty_weights(S, alpha)
    # Stage 3: weighted GLASSO with the adaptive penalties.
    # scikit-learn's GraphicalLasso only accepts a scalar alpha, so a
    # specialized solver for the weighted l1 penalty is needed here;
    # the Stage 1 estimate is returned as a placeholder.
    return gl.precision_
```
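The surrounding pipeline (steps 1, 4, and 5 above: sample covariance, fit, threshold) can be exercised end to end on synthetic data. This sketch uses plain scikit-learn GLASSO as the estimator and an arbitrary threshold of 1e-3; the chain-graph precision matrix and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Ground-truth sparse precision matrix: a chain graph on 5 nodes
p = 5
Theta = np.eye(p)
for j in range(p - 1):
    Theta[j, j + 1] = Theta[j + 1, j] = 0.4
Sigma = np.linalg.inv(Theta)

# Step 1: sample observations and fit a global-penalty GLASSO baseline
X = rng.multivariate_normal(np.zeros(p), Sigma, size=500)
gl = GraphicalLasso(alpha=0.05, max_iter=200).fit(X)

# Step 5: threshold the estimated precision matrix to recover edge structure
adj = (np.abs(gl.precision_) > 1e-3).astype(int)
np.fill_diagonal(adj, 0)
print(adj)
```

The recovered adjacency matrix should be symmetric with zero diagonal; how faithfully it matches the chain graph depends on the penalty and sample size.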
## Applications
- **Brain Connectivity**: Estimating functional brain networks from fMRI/EEG data
- **Gene Networks**: Inferring gene regulatory networks from expression data
- **Financial Networks**: Modeling dependencies between financial instruments
- **General**: Any high-dimensional sparse graph estimation problem
## Pitfalls
- Two-stage estimation increases computational cost
- Initial estimate quality affects final results
- Choice of meta-regularization parameter requires cross-validation
- May not outperform GLASSO when true graph has uniform sparsity
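On the cross-validation pitfall: for the global-penalty baseline, scikit-learn provides `GraphicalLassoCV`, and the same grid-search idea carries over to tuning the meta-regularization parameter of the adaptive weights. A minimal sketch on synthetic (independent) data, with illustrative grid and fold counts:

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 6))

# Cross-validate the global penalty over a 10-point grid with 5 folds;
# the selected value is exposed as cv.alpha_
cv = GraphicalLassoCV(alphas=10, cv=5).fit(X)
print(cv.alpha_)
```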
## Related Skills
- gaussian-graphical-connectivity-analysis
- brain-graph-neural
- structure-aware-coreset-fc-benchmarking
- brain-network-topology