Progressive Context Loading (Skills)
The three-level loading pattern Claude Code uses to keep skills lightweight. Level 1: the initial search reads only the YAML frontmatter (name + description, ~100 tokens) of every skill. Level 2: once a skill is selected, its full SKILL.md is loaded (typically 1-2K tokens). Level 3: referenced files (scripts, templates, reference markdown) are loaded only if the skill's instructions explicitly require them. This lets a project carry dozens of skills without blowing context on unused ones.
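As a concrete illustration, a minimal SKILL.md might look like the sketch below; the skill name, description, and referenced files are hypothetical, not taken from any shipped skill. Only the block between the `---` delimiters is read at Level 1.

```markdown
---
name: release-notes
description: Draft release notes from merged PRs. Use when the user asks to summarize changes for a release.
---

# Release notes

1. Run `scripts/collect_prs.sh` to gather merged PRs.
2. Fill the results into `templates/notes.md`.
```

Everything after the second `---` is the Level 2 body; `scripts/collect_prs.sh` and `templates/notes.md` are Level 3 files, touched only if step 1 or 2 actually runs.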
Key points
- Level 1 reads only YAML frontmatter of every skill in the project
- Level 2 loads the full SKILL.md only after selection
- Level 3 pulls referenced scripts, templates, and reference files on demand
- SKILL.md should stay under 500 lines, per Anthropic guidance
- Reference files can live inside the skill folder or anywhere a path in SKILL.md points to
- Enables dozens of skills in one project without proportional token cost
- Mornati's CLI-plus-skill experiment shows why the loading boundary matters: the GitHub CLI skill is cheap when invoked for GitHub work, but expensive if copied into always-loaded files such as CLAUDE.md [src-041]
- On-demand skills let low-frequency tools gain structured command guidance without paying the full Tool Schema Tax of Native MCP [src-041]; the loader sketch after this list illustrates the cost asymmetry
- OpenAI's API & Codex Build Hour generalizes the same idea to tool search: GPT-5.4 can discover namespaced tools progressively rather than loading hundreds of tool definitions into context up front [src-084].
- The Workspace Agents session gives the product version: agents can attach multiple skills, while concise skill descriptions and frontmatter-style summaries help the model decide when to load the detailed playbook [src-084].
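A minimal sketch of the three-level flow, assuming a conventional `skills/<name>/SKILL.md` layout; this models the loading boundary for illustration and is not Claude Code's actual implementation:

```python
"""Sketch of three-level progressive skill loading.

Illustrative model only, not Claude Code's actual implementation;
the skills/<name>/SKILL.md layout and token estimates are assumptions.
"""
from pathlib import Path

DELIM = "---"

def read_frontmatter(skill_md: Path) -> dict[str, str]:
    """Level 1: parse only the YAML frontmatter block (name + description)."""
    lines = skill_md.read_text().splitlines()
    meta: dict[str, str] = {}
    if lines and lines[0].strip() == DELIM:
        for line in lines[1:]:
            if line.strip() == DELIM:
                break  # closing delimiter reached; the body is never read
            key, sep, value = line.partition(":")
            if sep:
                meta[key.strip()] = value.strip()
    return meta

def scan_skills(skills_dir: Path) -> dict[str, Path]:
    """Level 1 across the project: ~100 tokens per skill, regardless of body size."""
    index: dict[str, Path] = {}
    for skill_md in sorted(skills_dir.glob("*/SKILL.md")):
        meta = read_frontmatter(skill_md)
        if meta.get("name"):
            index[meta["name"]] = skill_md
    return index

def load_skill(skill_md: Path) -> str:
    """Level 2: read the full SKILL.md body, but only for the selected skill."""
    return skill_md.read_text()

def load_reference(skill_md: Path, relative_path: str) -> str:
    """Level 3: pull a referenced script or template only when instructions call for it."""
    return (skill_md.parent / relative_path).read_text()

# Typical flow, using the hypothetical release-notes skill from above:
index = scan_skills(Path("skills"))         # Level 1: every skill, frontmatter only
body = load_skill(index["release-notes"])   # Level 2: one selected skill
script = load_reference(index["release-notes"], "scripts/collect_prs.sh")  # Level 3
```

This also makes the token arithmetic behind the last key point concrete: with 30 skills indexed at roughly 100 tokens each, Level 1 costs about 3K tokens, whereas eagerly loading every 1-2K-token body would cost 30-60K tokens before any work starts.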
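The tool-search variant follows the same shape. The registry below is a hypothetical sketch, not the actual OpenAI tool-search API: the model first sees one summary line per namespace, and full tool definitions are expanded only for the namespace it selects.

```python
# Hypothetical namespaced tool registry; names and schemas are illustrative.
REGISTRY = {
    "github": {
        "summary": "Issues, PRs, and repo queries",
        "tools": {"create_issue": {"params": ["repo", "title", "body"]}},
    },
    "calendar": {
        "summary": "Read and schedule events",
        "tools": {"create_event": {"params": ["title", "start", "end"]}},
    },
}

def list_namespaces() -> list[str]:
    """Cheap first pass: one summary line per namespace, not full schemas."""
    return [f"{ns}: {entry['summary']}" for ns, entry in REGISTRY.items()]

def load_namespace(ns: str) -> dict:
    """Expand full tool definitions only for the namespace the model picked."""
    return REGISTRY[ns]["tools"]
```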
Related concepts
- Claude Code Token Economics
- Claude Code Context Management Discipline
- On-Demand Skill Files
- MCP vs CLI Token Trade-off
- Harness Engineering
Source references
- [src-004] Nate Herk, Claude Code cluster (21 videos); videos referenced: zKBPwDpBfhs, RAZVk5NPNtE