The amount of text an AI model can see at one time, measured in tokens. Everything the model reads — system prompt, conversation history, tool results, files — must fit inside the context window.
Why it matters
When an agent exhausts its context window, earlier content gets evicted and it loses track of what it was doing. Coordination intelligence is the discipline of putting only the right tokens in the window.
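The budgeting idea above can be sketched in a few lines. This is a rough illustration, not a real tokenizer: the 4-characters-per-token heuristic and the function names (`estimate_tokens`, `fits_in_window`) are assumptions for the sketch.

```python
def estimate_tokens(text: str) -> int:
    # Rule of thumb: ~4 characters per token for English text
    # (roughly the "3/4 of a word" heuristic). Real tokenizers differ.
    return max(1, len(text) // 4)

def fits_in_window(parts: list[str], window: int) -> bool:
    # Everything the model reads must fit together: system prompt,
    # conversation history, tool results, files.
    return sum(estimate_tokens(p) for p in parts) <= window

parts = [
    "You are a helpful agent.",      # system prompt
    "User: summarize the report.",   # conversation history
    "report body " * 200,            # a file pulled into context
]
print(fits_in_window(parts, window=8000))
```

In practice an agent framework runs a check like this before every model call and drops or summarizes the lowest-value parts first, rather than letting the model silently truncate.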
Related terms
Prompt Engineering
The skill of writing instructions that get an AI model to do what you actually want. Includes role framing, few-shot examples, ...
Token
The basic unit of text AI models work with — roughly 3/4 of a word in English. Models charge by tokens, process by tokens, and ...
Token Efficiency Ratio
A metric measuring whether an operational rule saves more tokens than it costs to load into an agent's context. A ratio above 1...
Build with this on OTP
OTP encodes coordination intelligence so AI agent teams can run on it. If this term shows up in your team's playbook, it belongs in your OOS.