---
name: bosonic-gaussian-state-learning
description: >
  Sample-optimal learning methodology for bosonic Gaussian quantum states. Proves tight
  sample complexity bounds for characterizing unknown Gaussian states: Omega(n^3/epsilon^2)
  for Gaussian measurements, Omega(n^2/epsilon^2) for arbitrary measurements, and O-tilde(n^2/epsilon^2)
  for pure/passive states. Use when: designing quantum state tomography protocols, analyzing
  sample complexity of continuous-variable systems, optimizing Gaussian state learning,
  or proving information-theoretic bounds for quantum state estimation. Triggers: bosonic
  Gaussian state learning, quantum state tomography sample complexity, continuous-variable
  quantum learning, Gaussian measurement bounds, passive Gaussian state, quantum state
  estimation efficiency, sample-optimal quantum learning.
---
# Bosonic Gaussian State Learning
Sample-optimal learning methodology for characterizing unknown bosonic Gaussian quantum
states from minimal measurements (arXiv: 2603.18136, Chen et al., Caltech/Stanford/Google).
## Core Principle
The ultimate efficiency limit for learning bosonic Gaussian states is determined by the
interplay between the measurement class (Gaussian vs. non-Gaussian), the state structure
(pure vs. passive vs. general), and the use of adaptivity in measurement protocols.
## Key Results
### Sample Complexity Bounds
| Setting | Lower Bound | Upper Bound | Optimal? |
|---------|-------------|-------------|----------|
| Gaussian measurements (general) | Omega(n^3/epsilon^2) | - | Nearly tight |
| Arbitrary measurements (general) | Omega(n^2/epsilon^2) | - | Information-theoretic limit |
| Pure or passive Gaussian states | - | O-tilde(n^2/epsilon^2) | Matches lower bound |
| Single-mode, non-adaptive | - | Theta-tilde(E/epsilon^2) | Energy-dependent |
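The gap between the Gaussian-measurement and arbitrary-measurement bounds is an extra factor of n. A quick numeric sketch of the table's scalings (illustrative only; the bounds fix sample counts only up to constants and polylogarithmic factors, not exact numbers):

```python
def scaling(n, epsilon, exponent):
    """Asymptotic scaling n^exponent / epsilon^2, constants suppressed."""
    return n**exponent / epsilon**2

n, eps = 50, 0.1
gaussian_general = scaling(n, eps, 3)    # Omega(n^3/eps^2), Gaussian measurements
arbitrary_general = scaling(n, eps, 2)   # Omega(n^2/eps^2), arbitrary measurements
pure_or_passive = scaling(n, eps, 2)     # O-tilde(n^2/eps^2), pure/passive states

# Restricting to Gaussian measurements costs an extra factor of n:
print(gaussian_general / arbitrary_general)  # factor of n = 50
```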
### Critical Insights
1. **Gaussian measurements are insufficient** for optimal learning of passive Gaussian states
- Non-Gaussian measurements are provably required
- This reveals a fundamental separation between measurement classes
2. **Adaptivity is indispensable** for energy-independent scaling
- Non-adaptive schemes suffer Theta-tilde(E/epsilon^2) dependence on energy E
- Adaptive protocols achieve nearly energy-independent bounds
3. **Pure vs. passive state structure matters**
- Pure Gaussian states can be learned nearly optimally with Gaussian measurements
- Passive states require non-Gaussian measurements for optimal learning
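Insight 2 can be made concrete with a toy comparison: under the Theta-tilde(E/epsilon^2) non-adaptive scaling, the sample overhead relative to an idealized energy-independent adaptive protocol grows linearly with the energy bound E. Illustrative scaling only, with constants and polylog factors suppressed:

```python
def nonadaptive_cost(E, epsilon):
    """Theta-tilde(E/eps^2): non-adaptive cost grows with the energy E."""
    return E / epsilon**2

def adaptive_cost(epsilon):
    """Idealized nearly energy-independent adaptive cost."""
    return 1.0 / epsilon**2

eps = 0.01
for E in (10, 100, 1000):
    overhead = nonadaptive_cost(E, eps) / adaptive_cost(eps)
    print(E, overhead)  # overhead grows linearly with E
```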
## Workflow
### Step 1: Characterize the State Class
```python
import numpy as np

def classify_gaussian_state(covariance_matrix, displacement_vector):
    """Classify Gaussian state properties (convention: vacuum covariance = I/2)."""
    n = len(displacement_vector) // 2
    # Purity: Tr(rho^2) = 1 / sqrt(det(2*sigma))
    det_sigma = np.linalg.det(covariance_matrix)
    purity = 1.0 / np.sqrt(det_sigma * (2 ** (2 * n)))
    is_pure = np.isclose(purity, 1.0, atol=1e-6)
    # Passivity: no energy extractable by unitary operations. Passive states
    # have zero displacement and covariance of thermal form; here we use a
    # necessary eigenvalue condition (>= 1/2 in this convention) as a proxy.
    eigenvalues = np.linalg.eigvalsh(covariance_matrix)
    is_passive = np.allclose(displacement_vector, 0.0) and np.all(eigenvalues >= 0.5 - 1e-9)
    return is_pure, is_passive
```
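As a sanity check on the purity formula, a single-mode thermal state with mean photon number n̄ has covariance sigma = ((2*n̄+1)/2)*I in the vacuum-= I/2 convention and purity 1/(2*n̄+1), so the vacuum (n̄ = 0) is pure. A minimal sketch, assuming that convention:

```python
import numpy as np

def gaussian_purity(sigma):
    """Purity of a Gaussian state: Tr(rho^2) = 1/sqrt(det(2*sigma))."""
    return 1.0 / np.sqrt(np.linalg.det(2.0 * sigma))

nbar = 2.0  # mean photon number of a thermal state
sigma_thermal = ((2 * nbar + 1) / 2) * np.eye(2)
sigma_vacuum = 0.5 * np.eye(2)

print(gaussian_purity(sigma_thermal))  # ~0.2, i.e. 1/(2*nbar+1)
print(gaussian_purity(sigma_vacuum))   # -> 1.0, vacuum is pure
```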
### Step 2: Choose Measurement Strategy
```python
def select_measurement_protocol(is_pure, is_passive, n_modes):
    """Select a measurement protocol based on state properties."""
    if is_pure:
        # Gaussian measurements suffice for nearly optimal learning
        return "homodyne_tomography", "O-tilde(n^2/epsilon^2)"
    elif is_passive:
        # Non-Gaussian measurements (e.g., photon counting) required for optimality
        return "photon_counting", "O-tilde(n^2/epsilon^2)"
    else:
        # General case: arbitrary measurements; Omega(n^2/epsilon^2) is the lower bound
        return "heterodyne_with_nonGaussian", "Omega(n^2/epsilon^2)"
```
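Why do Gaussian measurements reduce learning to classical statistics? Ideal quadrature outcomes on a Gaussian state are normally distributed, so learning the covariance matrix becomes covariance estimation from Gaussian samples, with error shrinking like 1/sqrt(N), i.e. N ~ 1/epsilon^2. A minimal simulation (not the paper's protocol; assumes zero displacement and an illustrative single-mode covariance):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative physical 1-mode covariance (vacuum = I/2 convention)
sigma_true = np.array([[0.9, 0.1],
                       [0.1, 0.7]])

def estimate_covariance(n_samples):
    """Estimate sigma from simulated zero-mean quadrature samples,
    a stand-in for ideal Gaussian-measurement statistics."""
    x = rng.multivariate_normal(np.zeros(2), sigma_true, size=n_samples)
    return (x.T @ x) / n_samples

for n in (100, 10_000):
    err = np.max(np.abs(estimate_covariance(n) - sigma_true))
    print(n, err)  # error shrinks roughly like 1/sqrt(n)
```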
### Step 3: Design Adaptive Protocol
```python
def adaptive_learning_protocol(target_precision, energy_bound, max_rounds=20):
    """Adaptive protocol sketch for (nearly) energy-independent sample complexity.

    The helpers used here (design_adaptive_measurement, compute_round_budget,
    perform_measurement, update_gaussian_estimate, check_convergence) are
    placeholders for experiment-specific implementations.
    """
    samples_per_round = []
    accumulated_info = []
    estimate = None
    for round_idx in range(max_rounds):
        # Design the next measurement from information gathered so far
        measurement = design_adaptive_measurement(accumulated_info, round_idx)
        # Collect this round's samples
        n_samples = compute_round_budget(round_idx, target_precision)
        results = perform_measurement(measurement, n_samples)
        accumulated_info.append(results)
        samples_per_round.append(n_samples)
        # Update the state estimate and check convergence
        estimate = update_gaussian_estimate(accumulated_info)
        if check_convergence(estimate, target_precision):
            break
    return estimate, sum(samples_per_round)
```
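The skeleton above can be made runnable in a toy setting: adaptively grow the sample budget until the estimate's standard error falls below the target precision. This is a stand-in for the convergence logic, not the paper's algorithm; the variance-of-a-Gaussian task and the doubling schedule are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def adaptive_variance_estimate(true_var, epsilon, max_rounds=30):
    """Toy adaptive loop: double the total sample budget each round until the
    standard error of the variance estimate drops below epsilon."""
    samples = []
    n_next = 16
    for _ in range(max_rounds):
        samples.extend(rng.normal(0.0, np.sqrt(true_var), size=n_next))
        n = len(samples)
        est = float(np.var(samples))
        # Standard error of the sample variance of a Gaussian: var * sqrt(2/n)
        if est * np.sqrt(2.0 / n) < epsilon:
            return est, n
        n_next = n  # double the total budget next round
    return est, len(samples)

est, n_used = adaptive_variance_estimate(true_var=1.0, epsilon=0.05)
print(est, n_used)  # n_used scales like 1/epsilon^2
```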
### Step 4: Compute Sample Budget
```python
import math

def compute_sample_budget(measurement_class, state_type, n_modes, epsilon):
    """Illustrative sample budget for target precision epsilon.

    Returns the asymptotic scaling only: the theory fixes these bounds up to
    constant (and polylogarithmic) factors, not exact sample counts.
    """
    if measurement_class == "gaussian" and state_type == "general":
        return math.ceil(n_modes**3 / epsilon**2)   # Omega(n^3/eps^2) lower bound
    elif measurement_class == "arbitrary" and state_type == "general":
        return math.ceil(n_modes**2 / epsilon**2)   # Omega(n^2/eps^2) lower bound
    elif state_type in ("pure", "passive"):
        return math.ceil(n_modes**2 / epsilon**2)   # O-tilde(n^2/eps^2) upper bound
    else:
        raise ValueError("Unknown configuration")
```
## Pitfalls
### Pitfall 1: Assuming Gaussian measurements are always sufficient
- **Wrong**: Using only homodyne/heterodyne for all Gaussian state learning
- **Correct**: Use non-Gaussian measurements (e.g., photon counting) for passive states
### Pitfall 2: Ignoring adaptivity benefits
- **Wrong**: Using fixed non-adaptive measurement sequences
- **Correct**: Design adaptive protocols that update measurements based on prior results
### Pitfall 3: Underestimating n-scaling
- **Wrong**: Assuming O(n/epsilon^2) scaling for n-mode states
- **Correct**: General Gaussian states require Omega(n^3/epsilon^2) with Gaussian measurements
### Pitfall 4: Energy dependence in non-adaptive protocols
- **Wrong**: Non-adaptive schemes with fixed measurement settings
- **Correct**: Use adaptive protocols to achieve energy-independent scaling
## Applications
1. **Quantum sensing**: Optimize sample complexity for gravitational wave detection
2. **Quantum communication**: Characterize channel output states efficiently
3. **Dark matter detection**: Learn quantum states of axion-photon conversion signals
4. **Continuous-variable QIP**: Verify and characterize CV quantum resources
## Related Papers
- Quantum state tomography bounds (arXiv: various)
- Continuous-variable quantum information theory
- Information-theoretic limits of quantum learning
## References
- arXiv: 2603.18136 - "Towards sample-optimal learning of bosonic Gaussian quantum states"
- Authors: Senrui Chen, Francesco Anna Mele, Marco Fanizza, Alfred Li, Zachary Mann,
Hsin-Yuan Huang, Yanbei Chen, John Preskill