Hallucination

When an LLM generates plausible-sounding but factually incorrect or fabricated information. A fundamental limitation of generative models. Mitigated (not eliminated) by retrieval-augmented generation (RAG), grounding, citations, and temperature control.
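One of those mitigations, grounding via retrieval, can be sketched in a few lines: retrieve the passage most relevant to the question and build a prompt that restricts the model to that context. This is a minimal illustration only — the function names are hypothetical, and real RAG systems rank passages with vector embeddings rather than the word-overlap heuristic used here.

```python
def retrieve(question: str, passages: list[str]) -> str:
    """Return the passage sharing the most words with the question.

    Toy keyword-overlap scoring; production systems use embeddings.
    """
    q_words = set(question.lower().split())
    return max(passages, key=lambda p: len(q_words & set(p.lower().split())))


def grounded_prompt(question: str, passages: list[str]) -> str:
    """Build a prompt that confines the model to retrieved context,
    reducing the chance it fabricates an answer."""
    context = retrieve(question, passages)
    return (
        "Answer using ONLY the context below. "
        'If the answer is not in the context, say "I don\'t know."\n\n'
        f"Context: {context}\n\n"
        f"Question: {question}"
    )


passages = [
    "The Eiffel Tower was completed in 1889 for the World's Fair.",
    "Python was first released by Guido van Rossum in 1991.",
]
prompt = grounded_prompt("When was Python first released?", passages)
```

The explicit "say I don't know" instruction matters as much as the retrieved context: it gives the model a sanctioned alternative to inventing an answer.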