---
name: gaussian-graphical-models-brain-connectivity
description: "Gaussian Graphical Models (GGMs) for brain functional connectivity analysis with comprehensive evaluation of regularized precision matrix estimators. Includes graphical lasso, ridge, elastic net, adaptive glasso, SCAD, MCP, CLIME, and TIGER methods. Use for neuroimaging connectivity estimation, Alzheimer's disease studies, and network structure learning. Keywords: GGM, functional connectivity, precision matrix, graphical lasso, brain network estimation."
---
# Gaussian Graphical Models for Brain Connectivity
Comprehensive statistical framework for estimating brain functional connectivity using Gaussian Graphical Models (GGMs) with extensive evaluation of regularized precision matrix estimators.
## Overview
GGMs provide a powerful statistical framework for estimating functional connectivity by capturing **conditional dependence** relationships among brain regions (not just correlations).
**Key Advantage:** Conditional dependencies reveal direct connections after accounting for other regions' effects.
## Mathematical Foundation
### Precision Matrix and Conditional Independence
For multivariate Gaussian $X \sim N(\mu, \Sigma)$:
- **Precision matrix** $\Theta = \Sigma^{-1}$
- $\Theta_{ij} = 0$ iff $X_i \perp X_j \mid X_{\setminus \{i,j\}}$
**Interpretation for fMRI:**
- Non-zero $\Theta_{ij}$: Direct functional connection between regions i and j
- Zero $\Theta_{ij}$: No direct connection (conditional independence)
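The precision matrix also yields partial correlations directly, via $\rho_{ij \mid \text{rest}} = -\Theta_{ij} / \sqrt{\Theta_{ii}\Theta_{jj}}$, which puts connectivity edges on a common scale. A minimal sketch:
```python
import numpy as np

def partial_correlations(Theta):
    """Convert a precision matrix to partial correlations:
    rho_ij = -Theta_ij / sqrt(Theta_ii * Theta_jj)."""
    d = np.sqrt(np.diag(Theta))
    rho = -Theta / np.outer(d, d)
    np.fill_diagonal(rho, 1.0)  # diagonal is 1 by convention
    return rho
```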
### Regularized Estimation Problem
$$
\hat{\Theta} = \arg\min_{\Theta \succ 0} \left\{ -\log\det(\Theta) + \text{tr}(S\Theta) + \lambda P(\Theta) \right\}
$$
Where:
- $S$: Sample covariance matrix
- $\lambda$: Regularization parameter
- $P(\Theta)$: Penalty function
## Methods Comparison
### 1. Graphical Lasso (glasso)
**Penalty:** $\ell_1$ norm
$$P(\Theta) = \sum_{i \neq j} |\Theta_{ij}|$$
**Properties:**
- Sparse solutions (many zeros)
- Convex optimization
- Well-established theory
**Implementation:**
```python
from sklearn.covariance import GraphicalLassoCV

def glasso_connectivity(timeseries, alphas=4):
    """
    Estimate connectivity using graphical lasso.

    Args:
        timeseries: [n_timepoints, n_regions]
        alphas: Number of alpha values for cross-validation

    Returns:
        precision: Estimated precision matrix
        covariance: Estimated covariance matrix
    """
    model = GraphicalLassoCV(alphas=alphas, cv=5)
    model.fit(timeseries)
    return model.precision_, model.covariance_
```
### 2. Ridge-Regularized glasso
**Penalty:** $\ell_2$ norm added
$$P(\Theta) = \sum_{i \neq j} |\Theta_{ij}| + \frac{\gamma}{2}\|\Theta\|_F^2$$
**Properties:**
- Better conditioning of the estimate
- Dense solutions when the $\ell_1$ term is dropped (pure ridge yields no exact zeros, as in the comparison code later in this document)
- More stable with small samples
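For the pure-ridge case there is a simple closed form. A minimal sketch (the default `gamma` is an illustrative choice, not a recommended value):
```python
import numpy as np

def ridge_precision(timeseries, gamma=0.1):
    """Ridge-type precision estimate: inflate the diagonal of the
    sample covariance, then invert. Dense but well-conditioned."""
    S = np.cov(timeseries.T)
    return np.linalg.inv(S + gamma * np.eye(S.shape[0]))
```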
### 3. Graphical Elastic Net
**Penalty:** Combined $\ell_1$ and $\ell_2$
$$P(\Theta) = \alpha \sum_{i \neq j} |\Theta_{ij}| + \frac{(1-\alpha)}{2}\sum_{i \neq j} \Theta_{ij}^2$$
**Properties:**
- Balance between sparsity and grouping
- Similar to elastic net regression
**Implementation:**
```python
import numpy as np
from sklearn.linear_model import ElasticNet

def graphical_elastic_net(timeseries, alpha=0.5, lambda_param=0.1):
    """
    Graphical elastic net via neighborhood selection: regress each
    region on all others (Meinshausen-Buhlmann style). This recovers
    the edge structure; the stored entries are regression
    coefficients, not precision values.

    Args:
        timeseries: [n_timepoints, n_regions]
        alpha: Mixing parameter (0=ridge, 1=lasso)
        lambda_param: Regularization strength
    """
    p = timeseries.shape[1]
    Theta = np.zeros((p, p))
    for i in range(p):
        # Predict region i from all other regions
        y = timeseries[:, i]
        X = np.delete(timeseries, i, axis=1)
        model = ElasticNet(alpha=lambda_param, l1_ratio=alpha)
        model.fit(X, y)
        # Store coefficients, skipping the diagonal entry
        Theta[i, :i] = model.coef_[:i]
        Theta[i, i+1:] = model.coef_[i:]
    # Symmetrize by averaging
    return (Theta + Theta.T) / 2
```
### 4. Adaptive Graphical Lasso
**Penalty:** Weighted $\ell_1$
$$P(\Theta) = \sum_{i \neq j} w_{ij} |\Theta_{ij}|$$
**Properties:**
- Data-dependent weights
- Reduces bias from large coefficients
- Two-stage procedure
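A minimal sketch of the first stage, assuming a ridge pilot estimate (the `gamma` exponent, `ridge` value, and `eps` guard are illustrative choices); the second stage feeds these weights into a glasso solver that accepts an entrywise penalty matrix (QUIC-style solvers typically do):
```python
import numpy as np

def adaptive_weights(timeseries, gamma=1.0, ridge=0.1):
    """Stage 1 of adaptive glasso: weights w_ij = 1 / |Theta_init_ij|^gamma
    from an initial ridge-type precision estimate."""
    S = np.cov(timeseries.T)
    Theta_init = np.linalg.inv(S + ridge * np.eye(S.shape[0]))
    eps = 1e-6  # guard against division by zero at exact zeros
    return 1.0 / (np.abs(Theta_init) + eps) ** gamma
```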
### 5. Non-Convex Penalties
**SCAD (Smoothly Clipped Absolute Deviation):**
$$P'(\theta) = \lambda \left\{ I(|\theta| \leq \lambda) + \frac{(a\lambda - |\theta|)_+}{(a-1)\lambda} I(|\theta| > \lambda) \right\}$$
**MCP (Minimax Concave Penalty):**
$$P(\theta) = \lambda |\theta| - \frac{\theta^2}{2a} \text{ for } |\theta| \leq a\lambda, \qquad P(\theta) = \frac{a\lambda^2}{2} \text{ for } |\theta| > a\lambda$$
**Properties:**
- Less bias than lasso
- Oracle properties
- Harder to optimize
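Both penalties are straightforward to evaluate elementwise, which is useful for plotting them or plugging them into a local-approximation solver. A sketch using the conventional defaults ($a = 3.7$ for SCAD, $a = 3$ for MCP):
```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty value (Fan & Li 2001), evaluated elementwise."""
    t = np.abs(theta)
    quad = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))
    flat = lam**2 * (a + 1) / 2
    return np.where(t <= lam, lam * t, np.where(t <= a * lam, quad, flat))

def mcp_penalty(theta, lam, a=3.0):
    """MCP penalty value (Zhang 2010), evaluated elementwise."""
    t = np.abs(theta)
    return np.where(t <= a * lam, lam * t - t**2 / (2 * a), a * lam**2 / 2)
```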
### 6. CLIME (Constrained $\ell_1$ Minimization)
**Formulation:**
$$\min \|\Theta\|_1 \text{ subject to } \|S\Theta - I\|_\infty \leq \lambda$$
**Properties:**
- Different optimization approach
- Good theoretical guarantees
- Column-wise estimation
**Implementation:**
```python
import numpy as np

def soft_threshold(x, lambda_):
    """Soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - lambda_, 0)

def clime_estimation(S, lambda_param, max_iter=100):
    """
    Approximate CLIME estimation of the precision matrix.

    CLIME is exactly a linear program per column; this simplified
    version solves each column by lasso-style coordinate descent on
    min 0.5 * beta' S beta - beta' e_i + lambda * ||beta||_1.

    Args:
        S: Sample covariance matrix
        lambda_param: Constraint parameter
    """
    p = S.shape[0]
    Theta = np.zeros((p, p))
    for i in range(p):
        e_i = np.zeros(p)
        e_i[i] = 1.0
        beta = np.zeros(p)
        beta[i] = 1.0 / S[i, i]  # Initialize from the diagonal
        for _ in range(max_iter):
            residual = S @ beta - e_i
            if np.max(np.abs(residual)) <= lambda_param:
                break  # Column already satisfies the CLIME constraint
            # One coordinate descent sweep over all coordinates
            for j in range(p):
                rho = S[j, :] @ beta - S[j, j] * beta[j] - e_i[j]
                beta[j] = soft_threshold(-rho / S[j, j],
                                         lambda_param / S[j, j])
        Theta[:, i] = beta
    # Symmetrize
    return (Theta + Theta.T) / 2
```
### 7. TIGER (Tuning-Insensitive Graph Estimation)
**Key Innovation:** Less sensitive to tuning parameter selection
**Properties:**
- More stable across lambda values
- Reduced tuning burden
- Good for exploratory analysis
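TIGER estimates each precision column by a SQRT-/scaled-lasso regression of one region on the rest, with a theory-driven penalty that needs no cross-validation. A rough per-column sketch via the alternating scaled-lasso recursion (the helper name `tiger_column`, the iteration count, and the default `lambda0` are illustrative, not the original implementation):
```python
import numpy as np
from sklearn.linear_model import Lasso

def tiger_column(X, i, lambda0=None, n_iter=10):
    """One column of the precision matrix via scaled lasso.
    Assumes X is standardized, shape [n_timepoints, n_regions]."""
    n, p = X.shape
    if lambda0 is None:
        lambda0 = np.sqrt(2 * np.log(p) / n)  # tuning-free default
    y = X[:, i]
    Z = np.delete(X, i, axis=1)
    sigma = np.std(y)
    beta = np.zeros(p - 1)
    for _ in range(n_iter):
        # Alternate: lasso fit at penalty lambda0 * sigma, then update sigma
        model = Lasso(alpha=lambda0 * sigma)
        model.fit(Z, y)
        beta = model.coef_
        sigma = np.linalg.norm(y - Z @ beta) / np.sqrt(n)
    # Convert regression form to precision entries
    theta_col = np.zeros(p)
    theta_col[i] = 1.0 / sigma**2
    theta_col[np.arange(p) != i] = -beta / sigma**2
    return theta_col
```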
## Method Selection Guide
| Method | Sparsity | Computation | Small Sample | Use Case |
|--------|----------|-------------|--------------|----------|
| glasso | High | Fast | Moderate | Standard analysis |
| Ridge | None | Fastest | Good | Dense networks |
| Elastic Net | Medium | Fast | Good | Balanced approach |
| Adaptive | High | Slower | Moderate | Reduced bias |
| SCAD/MCP | High | Slow | Moderate | Oracle properties |
| CLIME | High | Moderate | Good | Theoretical guarantees |
| TIGER | Medium | Fast | Good | Stable across lambdas |
## Practical Workflow
### 1. Data Preprocessing
```python
import numpy as np
from scipy.signal import detrend

def preprocess_fmri(timeseries):
    """
    Preprocess fMRI timeseries for GGM estimation.

    Steps:
    1. Detrend
    2. Standardize (z-score per region)

    Confound regression should be done upstream
    (e.g., with nilearn's signal-cleaning utilities).
    """
    # Remove linear trends along the time axis
    timeseries = detrend(timeseries, axis=0)
    # Z-score each region
    timeseries = (timeseries - timeseries.mean(axis=0)) / timeseries.std(axis=0)
    return timeseries
```
### 2. Method Comparison
```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

def compare_ggm_methods(timeseries, methods=None):
    """
    Compare multiple GGM estimation methods.

    Args:
        timeseries: [n_timepoints, n_regions]
        methods: List of method names to compare

    Returns:
        results: Dict of precision matrices per method
    """
    if methods is None:
        methods = ['glasso', 'ridge', 'elastic_net']
    results = {}
    for method in methods:
        if method == 'glasso':
            model = GraphicalLassoCV(cv=5)
            model.fit(timeseries)
            results[method] = model.precision_
        elif method == 'ridge':
            # Closed-form ridge: inflate the diagonal, then invert
            S = np.cov(timeseries.T)
            alpha = 0.1
            results[method] = np.linalg.inv(S + alpha * np.eye(S.shape[0]))
        elif method == 'elastic_net':
            results[method] = graphical_elastic_net(timeseries)
        elif method == 'clime':
            S = np.cov(timeseries.T)
            results[method] = clime_estimation(S, lambda_param=0.3)
    return results
```
### 3. Stability Selection
```python
import numpy as np

def stability_selection(timeseries, method, n_bootstrap=100):
    """
    Use bootstrap for stable edge selection.

    Key insight: edges that persist across bootstrap
    samples are more reliable.

    Args:
        timeseries: [n_timepoints, n_regions]
        method: callable mapping a timeseries array to a precision matrix
        n_bootstrap: number of bootstrap resamples
    """
    n_samples, n_regions = timeseries.shape
    edge_frequencies = np.zeros((n_regions, n_regions))
    for _ in range(n_bootstrap):
        # Resample timepoints with replacement (a block bootstrap would
        # better respect temporal autocorrelation in fMRI)
        idx = np.random.choice(n_samples, n_samples, replace=True)
        boot_sample = timeseries[idx]
        # Estimate precision on the bootstrap sample
        precision = method(boot_sample)
        # Record non-zero edges
        edge_frequencies += (np.abs(precision) > 1e-6).astype(float)
    # Convert counts to selection frequencies
    edge_frequencies /= n_bootstrap
    return edge_frequencies
```
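For example, pairing it with the glasso estimator above and keeping edges selected in at least 90% of resamples (the 0.9 cutoff is an illustrative choice):
```python
# glasso_connectivity returns (precision, covariance); keep the precision
freq = stability_selection(timeseries, lambda ts: glasso_connectivity(ts)[0])
stable_edges = freq >= 0.9  # boolean adjacency of reliably selected edges
```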
## Application to Alzheimer's Disease
**Key Findings:**
- AD shows altered connectivity patterns
- Default mode network disruption
- Method choice affects biomarker discovery
```python
import numpy as np
from scipy import stats

def ad_connectivity_analysis(control_subjects, ad_subjects, method):
    """
    Compare connectivity between control and AD groups.

    Args:
        control_subjects: List of timeseries matrices
        ad_subjects: List of timeseries matrices
        method: callable mapping a timeseries array to a precision matrix

    Returns:
        group_differences: Statistical comparison
    """
    # Estimate subject-level precision matrices
    control_precisions = [method(ts) for ts in control_subjects]
    ad_precisions = [method(ts) for ts in ad_subjects]
    # Average within groups
    control_mean = np.mean(control_precisions, axis=0)
    ad_mean = np.mean(ad_precisions, axis=0)
    # Compute difference
    diff = ad_mean - control_mean
    # Edge-wise statistical testing (simplified; uncorrected p-values)
    p_values = np.zeros_like(diff)
    for i in range(diff.shape[0]):
        for j in range(diff.shape[1]):
            control_edges = [p[i, j] for p in control_precisions]
            ad_edges = [p[i, j] for p in ad_precisions]
            _, pval = stats.ttest_ind(control_edges, ad_edges)
            p_values[i, j] = pval
    return {
        'control_mean': control_mean,
        'ad_mean': ad_mean,
        'difference': diff,
        'p_values': p_values,
    }
```
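Because every edge is tested, the raw p-values need a multiple-comparisons correction; one common choice is Benjamini-Hochberg FDR on the upper triangle. A sketch using `statsmodels` (the variable names `controls` and `patients` and the 0.05 level are illustrative):
```python
import numpy as np
from statsmodels.stats.multitest import multipletests

results = ad_connectivity_analysis(controls, patients,
                                   method=lambda ts: glasso_connectivity(ts)[0])
iu = np.triu_indices_from(results['p_values'], k=1)  # test each edge once
rejected, p_fdr, _, _ = multipletests(results['p_values'][iu],
                                      alpha=0.05, method='fdr_bh')
```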
## R Package: spice
The paper provides an R package `spice` for easy implementation:
```r
# Installation (when available)
# install.packages("spice")
library(spice)
# Estimate connectivity
result <- spice(timeseries, method = "glasso", lambda = 0.1)
# Compare methods
comparison <- compare_methods(timeseries,
methods = c("glasso", "ridge", "tiger"))
```
## Activation Keywords
- GGM
- functional connectivity
- precision matrix
- graphical lasso
- brain network estimation
- conditional dependence
- SCAD
- MCP
- CLIME
- TIGER
## Tools Used
- sklearn.covariance for glasso
- scipy for optimization
- nilearn for neuroimaging
- R package 'spice' (paper-specific)
## References
Zhang, P., Xiao, S., Robb, W. H., Liu, D., Jefferson, A. L., & Yan, J. (2026). Gaussian Graphical Models for Functional Connectivity Analysis: A Statistical Review with Applications to Alzheimer's Disease. arXiv:2604.10249.
## Related Skills
- brain-connectivity-analysis
- fmri-preprocessing
- network-neuroscience
- statistical-learning