AI Coding

Grounding

The practice of anchoring model responses to specific, verified sources such as codebase files, documentation, or test results. Grounding reduces hallucinations by giving the model authoritative context to cite.
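In practice, grounding often means assembling the prompt from verified excerpts and asking the model to cite them. A minimal sketch of that pattern follows; the names (`SourceDoc`, `build_grounded_prompt`) and the prompt wording are illustrative assumptions, not any specific tool's API.

```python
from dataclasses import dataclass

@dataclass
class SourceDoc:
    path: str      # where the excerpt came from: a file, doc page, or test log
    content: str   # verbatim text the model is allowed to cite

def build_grounded_prompt(question: str, sources: list[SourceDoc]) -> str:
    """Prepend verified source excerpts so the model answers from them."""
    context = "\n\n".join(
        f"[source: {doc.path}]\n{doc.content}" for doc in sources
    )
    return (
        "Answer using ONLY the sources below, and cite the source path "
        "for each claim.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

# Example: ground an answer in a code file and its documentation.
sources = [
    SourceDoc("src/auth.py", "def login(user): ..."),
    SourceDoc("docs/auth.md", "login() raises AuthError on bad credentials."),
]
prompt = build_grounded_prompt("What does login() raise on failure?", sources)
```

The key design point is that every excerpt carries an identifier the model can echo back, which makes its claims checkable against the original sources.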