Connecting an AI model's responses to real, verifiable information instead of letting it generate from training data alone. Done through tool calls, RAG retrieval, file reads, or citations to known sources.
Why it matters
Grounding is the primary defense against hallucination: an ungrounded model will confidently invent plausible-sounding details.
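The retrieval flavor of grounding can be sketched in a few lines. This is a minimal illustration, assuming a toy in-memory corpus; the names `retrieve` and `build_grounded_prompt` are hypothetical, not from any specific library, and a real system would use a search index or vector store instead of keyword overlap.

```python
# Hypothetical minimal sketch: ground a model's answer in retrieved,
# citable snippets instead of letting it answer from training data alone.

CORPUS = {
    "doc-1": "The Eiffel Tower is 330 metres tall.",
    "doc-2": "Python 3.12 was released in October 2023.",
}

def retrieve(query: str, corpus: dict[str, str]) -> list[tuple[str, str]]:
    """Naive keyword-overlap retrieval: return (doc_id, text) pairs
    that share at least one word with the query."""
    words = set(query.lower().split())
    return [
        (doc_id, text)
        for doc_id, text in corpus.items()
        if words & set(text.lower().split())
    ]

def build_grounded_prompt(question: str, corpus: dict[str, str]) -> str:
    """Prepend retrieved snippets, tagged with source ids, so the model
    answers from cited evidence and the citations can be verified."""
    snippets = retrieve(question, corpus)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in snippets)
    return f"Answer using only the sources below.\n{context}\n\nQ: {question}"

prompt = build_grounded_prompt("How tall is the Eiffel Tower?", CORPUS)
```

The point is the shape, not the retriever: relevant evidence is fetched first, labeled with its source, and the model is instructed to answer from that evidence so its claims can be checked against it.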
Related terms
Hallucination
When an AI model generates information that sounds correct but is completely made up. The model is not lying — it is filling a ...
RAG (Retrieval Augmented Generation)
A technique where an AI model looks up relevant information from a database before generating its answer. Combines a search ind...
Tool Use
When an AI agent calls a function, API, or external tool to get information or take action — instead of relying only on its tra...