This SKILL.md was scraped from GitHub. Clone the repo or copy the file directly into your Claude Code skills directory:
```shell
npx versuz@latest install vivekkarmarkar-claude-code-os-skills-spec-check
```

or:

```shell
git clone https://github.com/VivekKarmarkar/claude-code-os.git
cp claude-code-os/SKILL.MD ~/.claude/skills/vivekkarmarkar-claude-code-os-skills-spec-check/SKILL.md
```

---
name: spec-check
description: "Read spec files and compare them against what was actually implemented. Use this skill BEFORE starting implementation (to understand requirements) and AFTER implementation (to verify compliance). Also use when: the user asks 'does this match the spec?', 'check the specs', 'what do the specs say?', you've just finished a chunk of work, or you realize you might have deviated from the design documents. This skill prevents the pattern of ignoring specs and later discovering the implementation contradicts the design. Trigger on any mention of 'spec', 'specs', 'requirements', 'design doc', or references to files in the info/ directory."
---

# Spec Check

Read the design specs before coding. Compare the implementation against specs after coding. Flag discrepancies honestly.

## When to Use

### Before Implementation

- Read ALL relevant spec files in `info/` for the current task
- List the concrete requirements extracted from each file
- Identify any ambiguities or contradictions between spec files
- Present the requirements to the user before writing code

### After Implementation

- Re-read the spec files
- For each requirement, check whether the implementation satisfies it
- Flag any discrepancies with exact citations from both the spec and the code

## The Process

### Step 1: Identify Relevant Specs

For this project, the spec files live in `info/`. Read these based on the task:

- `info/modular_thinking.md` — Which phase are we on? Don't implement beyond the current phase.
- `info/layout_solution.md` — How the layout pipeline works (reference canvas, render-time scaling, three zones)
- `info/visual_system.md` — What the system-level and code-level views look like
- `info/aesthetic_specs.md` — Visual rules (translucent rectangles, no edge labels, no clutter)
- `info/explanation_philosophy.md` — How explanations should be written (WHAT-first, Feynman standard)
- `info/tech_stack.md` — Technology choices and rationale
- `info/sacred_rules.md` — Non-negotiable behavioral rules

Also check `blueprint/agent_output_schema.md` for data contract requirements and `CLAUDE.md` for project-level instructions.

### Step 2: Extract Requirements

For each spec file, extract concrete, testable requirements. Format:

```
From [file]:
1. [Requirement] — [exact quote from spec]
2. [Requirement] — [exact quote from spec]
```

### Step 3: Compare Against Implementation

For each requirement, check the code:

```
Requirement: [what the spec says]
Spec source: [file:line or quote]
Implementation: [what the code does — cite file and line]
Status: MATCHES / DIFFERS / NOT IMPLEMENTED
If DIFFERS: [explain the discrepancy honestly]
```

### Step 4: Report

Present all findings. Don't hide discrepancies. If the implementation differs from the spec for a good reason (architectural constraint, better approach discovered during implementation), explain why — but still flag it so the spec can be updated.

## Key Principles

- **Cite everything.** Spec file name + quote. Code file name + line number. No vague claims.
- **Discrepancies are not failures.** Sometimes the implementation is better than the spec. But the discrepancy must be acknowledged and the spec updated.
- **Check the phase.** `info/modular_thinking.md` defines what's in scope. Anything beyond the current phase should not be implemented, no matter how tempting.
- **Ambiguity is worth flagging.** If the spec is unclear, say so. Don't silently resolve ambiguity with an assumption — the user should decide.
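The skill itself is executed by the agent reading files, but the mechanics of Step 1 (are all the expected spec files present?) and Step 3 (the shape of one comparison record) can be sketched in code. A hypothetical Python sketch, assuming the file layout this skill describes; the function and field names are illustrative, not part of the skill:

```python
# Sketch only: a Step 1 presence check and a Step 3 finding record,
# using the spec paths listed in this skill. Names are illustrative.
from dataclasses import dataclass
from pathlib import Path

SPEC_FILES = [
    "info/modular_thinking.md", "info/layout_solution.md", "info/visual_system.md",
    "info/aesthetic_specs.md", "info/explanation_philosophy.md", "info/tech_stack.md",
    "info/sacred_rules.md", "blueprint/agent_output_schema.md", "CLAUDE.md",
]

def missing_specs(root: str = ".") -> list[str]:
    """Step 1 sanity check: which expected spec files are absent under root?"""
    return [f for f in SPEC_FILES if not (Path(root) / f).is_file()]

@dataclass
class Finding:
    """One Step 3 comparison record (hypothetical field names)."""
    requirement: str        # what the spec says
    spec_source: str        # spec file plus quote or line
    implementation: str     # what the code does, with file and line
    status: str             # "MATCHES" / "DIFFERS" / "NOT IMPLEMENTED"
    explanation: str = ""   # honest note when status is "DIFFERS"
```

A report in the Step 4 sense would then simply be a list of `Finding` records presented to the user, with nothing filtered out.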