---
name: hypergraph-u-net-for-brain-graph-embedding
description: Hypergraph U-Net (HUNet) for learning low-dimensional brain graph embeddings that preserve high-order relationships. Source: arXiv:2008.13118
---
# Hypergraph U-Net for Brain Graph Embedding
**Source:** arXiv:2008.13118
**Utility:** 0.95
**Created:** 2026-03-25
## Activation Keywords
- hypergraph U-Net
- HUNet
- hypergraph neural network
- brain graph embedding
- high-order relationship
- geometric deep learning brain
## Description
A data embedding framework that generalizes the U-Net architecture from graphs to hypergraphs, learning low-dimensional embeddings while capturing the high-order (group-wise) relationships that pairwise graph methods miss.
## Core Methodology
### 1. Problem: High-Order Relationships
**Limitation of Graph Methods:**
- Graph methods capture pairwise relationships only
- Brain networks have higher-order structures
- Missing complex multi-node interactions
**Solution:** Hypergraph representation captures high-order relationships
### 2. Hypergraph Structure
**Hypergraph:**
- Vertices (nodes)
- Hyperedges (connect multiple vertices, not just pairs)
- Captures group-wise relationships
**Advantages:**
- Models multi-way interactions
- Preserves complex structural patterns
- Better represents brain connectivity organization
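The hypergraph structure above is usually encoded as a node-by-hyperedge incidence matrix. The helper below is a minimal sketch (not from the paper) of how such a matrix can be built from a list of hyperedges, each given as the set of node indices it connects:

```python
import torch

def incidence_matrix(num_nodes, hyperedges):
    """Build a node-by-hyperedge incidence matrix H.

    H[v, e] = 1 if node v belongs to hyperedge e, else 0.
    """
    H = torch.zeros(num_nodes, len(hyperedges))
    for e, nodes in enumerate(hyperedges):
        for v in nodes:
            H[v, e] = 1.0
    return H

# Three hyperedges over five nodes; the second connects three nodes at
# once -- a group-wise relationship a single pairwise edge cannot express.
H = incidence_matrix(5, [[0, 1], [1, 2, 3], [3, 4]])
print(H.shape)  # torch.Size([5, 3])
```

An ordinary graph is the special case where every hyperedge contains exactly two nodes, so every column of `H` sums to 2.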
### 3. HUNet Architecture
**Components:**
1. **Hypergraph Encoder**
- Pooling layers reduce hypergraph resolution
- Feature aggregation at each level
- Capture multi-scale patterns
2. **Hypergraph Decoder**
- Unpooling layers increase resolution
- Reconstruct hypergraph structure
- Skip connections preserve details
3. **Skip Connections**
- Connect encoder and decoder at same resolution
- Preserve local information
- Enable gradient flow
### 4. Key Innovations
**Local Feature Aggregation:**
- Improved hyperedge-based message passing
- Better capture of local structure
**High-Order Relationship Preservation:**
- Hypergraph pooling maintains multi-node relationships
- Not just pairwise as in standard graphs
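Hyperedge-based message passing is commonly stabilized with degree normalization. The sketch below uses the symmetric HGNN-style normalization \(D_v^{-1/2} H D_e^{-1} H^\top D_v^{-1/2} x\); this is a standard scheme from the hypergraph neural network literature, and the paper's exact aggregation may differ:

```python
import torch

def normalized_hypergraph_propagate(x, H):
    """One round of degree-normalized hyperedge message passing
    (HGNN-style symmetric normalization; assumed, not from the paper).

    x: node features [N, F]; H: dense incidence matrix [N, E].
    """
    d_v = H.sum(dim=1).clamp(min=1)   # node degrees
    d_e = H.sum(dim=0).clamp(min=1)   # hyperedge degrees
    Dv_inv_sqrt = torch.diag(d_v.pow(-0.5))
    De_inv = torch.diag(1.0 / d_e)
    # vertex -> hyperedge -> vertex, with symmetric degree normalization
    return Dv_inv_sqrt @ H @ De_inv @ H.t() @ Dv_inv_sqrt @ x
```

Without the degree factors, high-degree nodes and large hyperedges dominate the aggregated features; the normalization keeps feature magnitudes comparable across resolutions.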
## Implementation Framework
```python
# Conceptual HUNet architecture (a sketch; the paper's exact operators,
# including hypergraph pooling/unpooling, are simplified here)
import torch
import torch.nn as nn
import torch.nn.functional as F


class HypergraphConv(nn.Module):
    """Hypergraph convolution layer (unnormalized two-step propagation)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x, H):
        """
        Args:
            x: Node features [N, F]
            H: Sparse hypergraph incidence matrix [N, E]
        Returns:
            Updated node features [N, out_features]
        """
        # Vertex-to-hyperedge: aggregate member-node features per hyperedge
        hyperedge_features = torch.sparse.mm(H.t(), x)  # [E, F]
        # Hyperedge-to-vertex: scatter hyperedge features back to nodes
        node_features = torch.sparse.mm(H, hyperedge_features)  # [N, F]
        return self.linear(node_features)


class HUNet(nn.Module):
    """Hypergraph U-Net for brain graph embedding."""

    def __init__(self, input_dim, hidden_dims, latent_dim):
        super().__init__()
        # Encoder: input_dim -> hidden_dims[0] -> ... -> hidden_dims[-1]
        self.encoders = nn.ModuleList([
            HypergraphConv(input_dim, hidden_dims[0]),
            *[HypergraphConv(hidden_dims[i], hidden_dims[i + 1])
              for i in range(len(hidden_dims) - 1)]
        ])
        # Latent bottleneck
        self.latent = HypergraphConv(hidden_dims[-1], latent_dim)
        # Decoder: latent_dim -> hidden_dims[-1] -> ... -> hidden_dims[0]
        self.decoders = nn.ModuleList([
            HypergraphConv(latent_dim, hidden_dims[-1]),
            *[HypergraphConv(hidden_dims[i + 1], hidden_dims[i])
              for i in range(len(hidden_dims) - 2, -1, -1)]
        ])
        # One skip projection per encoder resolution
        self.skip_connections = nn.ModuleList([
            nn.Linear(hidden_dims[i], hidden_dims[i])
            for i in range(len(hidden_dims))
        ])

    def encode(self, x, H):
        """Encode node features to the latent space."""
        features = [x]
        for encoder in self.encoders:
            x = F.relu(encoder(x, H))
            features.append(x)
        z = self.latent(x, H)
        return z, features

    def decode(self, z, H, features):
        """Decode from the latent space with skip connections."""
        x = z
        for i, decoder in enumerate(self.decoders):
            x = F.relu(decoder(x, H))
            # Skip connection from the encoder layer at the same resolution
            skip = self.skip_connections[-(i + 1)](features[-(i + 1)])
            x = x + skip
        return x

    def forward(self, x, H):
        z, features = self.encode(x, H)
        x_recon = self.decode(z, H, features)
        return x_recon, z
```
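The encoder description mentions pooling layers that reduce hypergraph resolution, which the conceptual code above omits. The sketch below is a hypothetical top-k node pooling operator in the spirit of graph U-Net pooling, adapted to restrict the incidence matrix; the paper's actual pooling operator may differ:

```python
import torch
import torch.nn as nn

class HypergraphPool(nn.Module):
    """Top-k node pooling for hypergraphs (illustrative sketch only).

    Scores nodes with a learnable projection, keeps the top-k, and
    restricts the incidence matrix to the surviving nodes.
    """
    def __init__(self, in_features, ratio=0.5):
        super().__init__()
        self.score = nn.Linear(in_features, 1)
        self.ratio = ratio

    def forward(self, x, H):
        k = max(1, int(x.size(0) * self.ratio))
        scores = self.score(x).squeeze(-1)              # [N]
        idx = scores.topk(k).indices                    # keep top-k nodes
        # gate surviving features by their (squashed) scores
        x_pool = x[idx] * torch.sigmoid(scores[idx]).unsqueeze(-1)
        H_pool = H[idx]                                 # [k, E]
        # drop hyperedges that lost all their member nodes
        keep_e = H_pool.sum(dim=0) > 0
        return x_pool, H_pool[:, keep_e], idx
```

The returned `idx` records which nodes survived, which is what a matching unpooling layer in the decoder would use to scatter features back to the original resolution.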
## Applications
### 1. Brain Disorder Classification
- Autism spectrum disorder
- Dementia/Alzheimer's disease
- Better accuracy than graph methods (4-14% gain)
### 2. Brain Network Embedding
- Low-dimensional representation
- Capture high-order connectivity patterns
- Multi-scale feature learning
### 3. Multi-Modal Integration
- Morphological brain networks
- Functional brain networks
- Heterogeneous dataset handling
## Key Results
| Dataset | Task | Improvement |
|---------|------|-------------|
| Autism | Classification | +4-14% vs SOTA |
| Dementia | Classification | +4-14% vs SOTA |
**Advantages:**
- Scalability to large datasets
- Generalizability across disorders
- High-order relationship modeling
## When to Use
- Brain network embedding and classification
- When high-order relationships matter
- Multi-scale feature learning
- Heterogeneous brain connectomic data
## Tools Used
- `read` - Read documentation and references
- `web_search` - Search for related information
- `web_fetch` - Fetch paper or documentation
## Instructions for Agents
Follow these steps when applying this skill:
### Step 1: Understand the Request
### Step 2: Search for Information
### Step 3: Hypergraph Encoder
### Step 4: Hypergraph Decoder
### Step 5: Skip Connections
### When to Apply
- Brain network embedding and classification
- When high-order relationships matter
- Multi-scale feature learning
## Examples
### Example 1: Basic Application
**User:** I need to apply Hypergraph U-Net for Brain Graph Embedding to my analysis.
**Agent:** I'll help you apply hypergraph-unet-brain. First, let me understand your specific use case...
**Context:** Problem: High-Order Relationships
### Example 2: Advanced Scenario
**User:** Brain network embedding and classification
**Agent:** Based on the methodology, I'll guide you through the advanced application...
### Example 3: Advanced Application
**User:** What are the key considerations for hypergraph-unet-brain?
**Agent:** Let me search for the latest research and best practices...
## Related Skills
- `gnn-transformer-fusion` - GNN architectures
- `multimodal-brain-connectivity-gnn` - Multi-modal integration
- `brain-graph-augmentation-template` - Brain graph preprocessing
## References
- Lostar, M., Rekik, I. "Deep Hypergraph U-Net for Brain Graph Embedding and Classification." arXiv:2008.13118 (2020)