---
name: hopfield-temporal-complexity-on-network-topologies
description: Analyze the temporal complexity of a Hopfield-type neural model on random and scale-free graphs (arXiv:2406.12895) using intermittency-driven complexity (IDC)
---
# Hopfield Temporal Complexity on Network Topologies
**Source paper:** arXiv:2406.12895 - Temporal Complexity of a Hopfield-Type Neural Model in Random and Scale-Free Graphs
**Utility score:** 0.99
**Created:** 2026-03-24 08:53
---
## Overview
Characterize the temporal complexity of Hopfield-type neural networks on two different topologies (scale-free vs. random), using intermittency-driven complexity (IDC) to analyze the metastability of self-organized states.
## Activation Keywords
- Hopfield temporal complexity
- intermittency-driven complexity
- scale-free neural network
- random network topology
- self-organized metastability
- Hopfield network complexity
- intermittency-driven complexity (IDC)
## Core Concepts
### Intermittency-Driven Complexity (IDC)
```
Traditional Hopfield dynamics       IDC analysis
┌─────────────────────────┐     ┌─────────────────────────┐
│ Equilibrium states      │     │ Metastable transitions  │
│ Fixed-point attractors  │  →  │ Intermittent dynamics   │
│ Synchronous updates     │     │ 1/f (pink) noise        │
└─────────────────────────┘     │ Power-law distributions │
                                └─────────────────────────┘
Key observations:
- Inter-event times follow a power-law distribution
- The diffusion process exhibits scaling behavior
- The network topology shapes the complexity signature
```
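The contrast between Poisson-like and intermittent event statistics can be illustrated with synthetic waiting times. This is a minimal sketch; the distributions and parameters are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential (Poisson-like) waiting times: coefficient of variation ~ 1
exp_intervals = rng.exponential(scale=1.0, size=100_000)

# Pareto (power-law) waiting times: heavy tail, coefficient of variation > 1
pareto_intervals = rng.pareto(a=1.5, size=100_000) + 1.0

def cv(x):
    """Coefficient of variation: std / mean."""
    return x.std() / x.mean()

print(f"exponential CV: {cv(exp_intervals):.2f}")
print(f"power-law   CV: {cv(pareto_intervals):.2f}")
```

A coefficient of variation near 1 is the exponential baseline; heavy-tailed intervals push it well above 1, which is the signature the IDC analysis looks for.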
## Implementation
### Hopfield Network Model
```python
import numpy as np
import networkx as nx


class HopfieldNetwork:
    """Hopfield-type associative memory network on a configurable topology."""

    def __init__(self, n_neurons, connectivity_type='scale-free', sparsity=0.1):
        self.n = n_neurons
        self.connectivity_type = connectivity_type
        # Build the connectivity matrix
        if connectivity_type == 'scale-free':
            self.W = self.create_scale_free_topology(sparsity)
        else:
            self.W = self.create_random_topology(sparsity)
        # Hopfield weight normalization; no self-connections
        self.W = self.W / n_neurons
        np.fill_diagonal(self.W, 0)

    def create_scale_free_topology(self, sparsity):
        """Create a scale-free topology (Barabási-Albert model)."""
        # Each new node attaches preferentially to m existing nodes;
        # m is chosen so the edge density roughly matches `sparsity`
        m = max(1, int(sparsity * self.n / 2))
        graph = nx.barabasi_albert_graph(self.n, m)
        adjacency = nx.to_numpy_array(graph)
        # Gaussian weights on the scale-free adjacency structure
        return np.random.randn(self.n, self.n) * adjacency

    def create_random_topology(self, sparsity):
        """Create an Erdős-Rényi-style random topology."""
        W = np.random.randn(self.n, self.n)
        mask = np.random.rand(self.n, self.n) < sparsity
        return W * mask

    def update(self, x, noise_level=0.01):
        """Asynchronously update the neural state.

        Args:
            x: current neural state in {0, 1}^n
            noise_level: standard deviation of additive Gaussian noise
        Returns:
            The new neural state.
        """
        # Local field plus noise
        h = np.dot(self.W, x)
        h += np.random.randn(self.n) * noise_level
        # Pick one neuron at random and threshold its field
        i = np.random.randint(0, self.n)
        x_new = x.copy()
        x_new[i] = 1 if h[i] >= 0 else 0
        return x_new
```
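Why the two topologies behave differently starts with their degree distributions: preferential attachment produces hubs that a random graph lacks. The following self-contained sketch contrasts the two; the `ba_degrees`/`er_degrees` helpers are illustrative, not part of the skill's code:

```python
import numpy as np

rng = np.random.default_rng(42)

def ba_degrees(n, m=2):
    """Degree sequence from a minimal Barabási-Albert growth process."""
    degrees = np.zeros(n, dtype=int)
    repeated = []  # each node appears here once per incident edge
    for new in range(m, n):
        if repeated:
            # Preferential attachment: sampling from `repeated` is
            # proportional to current degree
            targets = set()
            while len(targets) < m:
                targets.add(repeated[rng.integers(len(repeated))])
        else:
            targets = set(range(m))  # seed: attach to the initial core
        for t in targets:
            degrees[t] += 1
            degrees[new] += 1
            repeated.extend([t, new])
    return degrees

def er_degrees(n, p):
    """Degree sequence of an Erdős-Rényi random graph G(n, p)."""
    upper = np.triu(rng.random((n, n)) < p, k=1)
    adjacency = upper | upper.T
    return adjacency.sum(axis=1)

n = 2000
deg_sf = ba_degrees(n, m=2)
deg_rand = er_degrees(n, p=4 / n)  # matched mean degree ~ 4
print("scale-free max degree:", deg_sf.max())
print("random     max degree:", deg_rand.max())
```

The hub nodes visible in the scale-free maximum degree are what make its dynamics sensitive to a few highly connected neurons, even at equal mean degree.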
### Temporal Complexity Analysis
```python
import numpy as np
import powerlaw  # Alstott et al. package (pip install powerlaw), not scipy.stats.powerlaw


def compute_idc_analysis(trajectory, bin_size=100):
    """Intermittency-driven complexity analysis.

    Core idea: analyze the distribution of inter-event times.
    """
    # Detect state-transition events
    events = detect_state_transitions(trajectory, bin_size)
    # Inter-event intervals
    intervals = [events[i]['time'] - events[i - 1]['time']
                 for i in range(1, len(events))]
    # Fit a power-law distribution to the intervals
    fit = powerlaw.Fit(intervals, discrete=True)
    alpha = fit.alpha  # power-law exponent
    tau = fit.xmin     # smallest interval included in the fit
    # Temporal complexity summary statistics
    interval_std = np.std(intervals)
    interval_mean = np.mean(intervals)
    return {
        'alpha': alpha,
        'tau': tau,
        'cv': interval_std / interval_mean,  # coefficient of variation
        'n_events': len(events),
        'intervals': intervals,
    }


def detect_state_transitions(trajectory, bin_size):
    """Detect state-transition events.

    Args:
        trajectory: array of shape [n_timepoints, n_neurons]
        bin_size: time window used to define a state
    Returns:
        List of event dicts with 'time', 'state_id', and 'pattern'.
    """
    events = []
    n_bins = len(trajectory) // bin_size
    for b in range(n_bins):
        window = trajectory[b * bin_size:(b + 1) * bin_size]
        # Mean activation pattern over the window
        mean_pattern = window.mean(axis=0)
        # State ID derived from the binarized activation pattern
        state_id = hash(tuple((mean_pattern > 0.5).astype(int)))
        if not events or events[-1]['state_id'] != state_id:
            events.append({
                'time': (b + 1) * bin_size,
                'state_id': state_id,
                'pattern': mean_pattern,
            })
    return events
```
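If the `powerlaw` package is not available, the continuous maximum-likelihood estimator for the exponent (the Clauset-Shalizi-Newman form) is a few lines of NumPy. This sketch assumes `xmin` is known rather than fitted:

```python
import numpy as np

def powerlaw_alpha_mle(intervals, xmin):
    """Continuous MLE for the power-law exponent:
    alpha = 1 + n / sum(ln(x_i / xmin)) over samples x_i >= xmin."""
    x = np.asarray(intervals, dtype=float)
    x = x[x >= xmin]
    return 1.0 + len(x) / np.log(x / xmin).sum()

rng = np.random.default_rng(1)
# Synthetic Pareto samples with true exponent alpha = 2.5 and xmin = 1:
# inverse-transform sampling, X = (1 - U)^(-1/(alpha - 1))
samples = (1.0 - rng.random(50_000)) ** (-1.0 / 1.5)
alpha_hat = powerlaw_alpha_mle(samples, xmin=1.0)
print(round(alpha_hat, 2))
```

On synthetic data with a known exponent, the estimate should land close to the true value; in practice `xmin` must be chosen (or fitted) carefully, which is what `powerlaw.Fit` automates.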
### Diffusion Process Analysis
```python
def analyze_diffusion_dynamics(events, n_states):
    """Analyze the diffusion process over detected states.

    Args:
        events: event sequence from detect_state_transitions
        n_states: size of the (coarse-grained) state space
    Returns:
        Diffusion parameters.
    """
    # State-transition counts (state IDs hashed into n_states buckets)
    transition_matrix = np.zeros((n_states, n_states))
    for i in range(len(events) - 1):
        current = events[i]['state_id'] % n_states
        next_state = events[i + 1]['state_id'] % n_states
        transition_matrix[current, next_state] += 1
    # Row-normalize into transition probabilities
    row_sums = transition_matrix.sum(axis=1, keepdims=True)
    transition_matrix = transition_matrix / (row_sums + 1e-6)
    # Stationary distribution: left eigenvector of the largest eigenvalue
    eigenvalues, eigenvectors = np.linalg.eig(transition_matrix.T)
    stationary_idx = np.argmax(np.abs(eigenvalues))
    stationary_dist = np.abs(eigenvectors[:, stationary_idx])
    stationary_dist /= stationary_dist.sum()
    # Shannon entropy of the stationary distribution
    entropy = -np.sum(stationary_dist * np.log(stationary_dist + 1e-6))
    return {
        'transition_matrix': transition_matrix,
        'stationary_dist': stationary_dist,
        'entropy': entropy,
        'n_states': n_states,
    }
```
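The eigendecomposition route to the stationary distribution can be sanity-checked on a chain with a known answer; power iteration of `pi @ P` is an equivalent, decomposition-free route. The 2-state chain here is illustrative:

```python
import numpy as np

# Two-state chain with switching probabilities p = P(0->1) = 0.2, q = P(1->0) = 0.4
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# Power iteration on the left eigenproblem pi = pi @ P
pi = np.full(2, 0.5)
for _ in range(200):
    pi = pi @ P
pi /= pi.sum()

# Analytic answer for a 2-state chain: pi = (q, p) / (p + q) = (2/3, 1/3)
entropy = -np.sum(pi * np.log(pi))
print(pi, entropy)
```

Power iteration converges geometrically at rate |1 - p - q|, so a few hundred steps suffice here; the eigenvector approach in `analyze_diffusion_dynamics` gives the same distribution in one shot.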
### Comparing Network Topologies
```python
def compare_topologies(n_neurons=100, n_trials=10):
    """Compare the temporal complexity of scale-free vs. random networks."""
    results = {'scale_free': [], 'random': []}
    for trial in range(n_trials):
        # Scale-free network
        net_sf = HopfieldNetwork(n_neurons, connectivity_type='scale-free')
        trajectory_sf = simulate_dynamics(net_sf, n_steps=10000)
        results['scale_free'].append(compute_idc_analysis(trajectory_sf))
        # Random network
        net_rand = HopfieldNetwork(n_neurons, connectivity_type='random')
        trajectory_rand = simulate_dynamics(net_rand, n_steps=10000)
        results['random'].append(compute_idc_analysis(trajectory_rand))
    # Statistical summary
    return analyze_topology_comparison(results)


def simulate_dynamics(network, n_steps=10000, noise_level=0.01):
    """Simulate the neural dynamics from a random initial state."""
    x = np.random.randint(0, 2, network.n)
    trajectory = []
    for t in range(n_steps):
        x = network.update(x, noise_level)
        trajectory.append(x.copy())
    return np.array(trajectory)


def analyze_topology_comparison(results):
    """Summarize the topology comparison."""
    # Scale-free metrics
    sf_alphas = [r['alpha'] for r in results['scale_free']]
    sf_cvs = [r['cv'] for r in results['scale_free']]
    # Random-network metrics
    rand_alphas = [r['alpha'] for r in results['random']]
    rand_cvs = [r['cv'] for r in results['random']]
    return {
        'scale_free': {
            'mean_alpha': np.mean(sf_alphas),
            'std_alpha': np.std(sf_alphas),
            'mean_cv': np.mean(sf_cvs),
            'std_cv': np.std(sf_cvs),
        },
        'random': {
            'mean_alpha': np.mean(rand_alphas),
            'std_alpha': np.std(rand_alphas),
            'mean_cv': np.mean(rand_cvs),
            'std_cv': np.std(rand_cvs),
        },
        'difference': {
            'alpha': np.mean(sf_alphas) - np.mean(rand_alphas),
            'cv': np.mean(sf_cvs) - np.mean(rand_cvs),
        },
    }
```
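Whether the per-trial metrics from `analyze_topology_comparison` differ significantly between topologies can be checked with a two-sample test. A sketch on synthetic exponents; the group means and spreads are made up for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical per-trial power-law exponents from the two topologies
alphas_sf = rng.normal(loc=2.0, scale=0.1, size=10)
alphas_rand = rng.normal(loc=2.3, scale=0.1, size=10)

# Welch's t-test (unequal variances) on the trial-level exponents
t_stat, p_value = stats.ttest_ind(alphas_sf, alphas_rand, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With only ~10 trials per topology, Welch's test (rather than assuming equal variances) is the safer default; a nonparametric alternative such as `stats.mannwhitneyu` avoids the normality assumption entirely.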
## Application Scenarios
1. **Associative memory modeling** - dynamics of Hopfield networks
2. **Neural complexity** - complexity signatures of biological neural networks
3. **State-transition analysis** - metastable transition patterns
4. **Topology effects** - how network structure shapes the dynamics
## Key Findings
- **Scale-free networks**: can exhibit complexity signatures similar to those of random networks
- **Parameter dependence**: behavior differs across parameter regimes
- **Noise effects**: the noise level modulates the complexity signature
- **Topology dependence**: network structure shapes the temporal complexity pattern
## Related Skills
- `attractor-metadynamics-neural` - attractor metadynamics
- `neutral-theory-neural-dynamics` - neutral-theory neural dynamics
- `curvature-aware-nonconvex-optimization` - curvature-aware nonconvex optimization
- `weighted-brain-community-detection` - weighted brain community detection
---
_This skill applies the IDC method to analyze the temporal complexity of Hopfield-type neural networks on different network topologies._
## Description
Hopfield Temporal Complexity on Network Topologies
## Activation Keywords
- hopfield-temporal-complexity
- hopfield-temporal-complexity skill
## Tools Used
- `read` - Read documentation and references
- `web_search` - Search for related information
- `web_fetch` - Fetch paper or documentation
## Instructions for Agents
Follow these steps when applying this skill:
### Step 1: Understand the Request
Clarify which topology (scale-free vs. random), network size, and noise level the user cares about.
### Step 2: Build the Network Model
Construct a Hopfield-type network with the requested topology (see `HopfieldNetwork`).
### Step 3: Simulate and Detect Transitions
Run the dynamics and detect metastable state transitions (see `simulate_dynamics` and `detect_state_transitions`).
### Step 4: Run the IDC Analysis
Fit the inter-event interval distribution and compute the exponent, cutoff, and coefficient of variation (see `compute_idc_analysis`).
### Step 5: Compare Topologies
Summarize and compare the complexity metrics across topologies (see `compare_topologies`).
## Examples
### Example 1: Basic Application
**User:** I need to apply Hopfield Temporal Complexity on Network Topologies to my analysis.
**Agent:** I'll help you apply hopfield-temporal-complexity. First, let me understand your specific use case...
**Context:** Apply the methodology
### Example 2: Advanced Application
**User:** What are the key considerations for hopfield-temporal-complexity?
**Agent:** Let me search for the latest research and best practices...