---

# Trust but Verify

In legal, medical, or coding fields, an AI hallucination isn't a quirk; it's a liability.

## RAG (Retrieval-Augmented Generation)

The most reliable way to reduce hallucinations is to force the model to read a trusted document *before* answering, and to answer only from what it read. This is RAG; a minimal sketch follows below.

## The "Critic" Agent

Always run a second AI pass over the draft answer, specifically instructed to "Act as a fact-checker and verify every claim." A sketch of that pass follows the RAG example.
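First, the RAG loop. This is a minimal sketch, not a production pipeline: the `TRUSTED_DOCS` corpus, the keyword-overlap `retrieve` helper, and the model name are illustrative assumptions. The client call uses the OpenAI Python SDK, but any chat-completion API would slot in the same way.

```python
# Minimal RAG: retrieve the most relevant trusted snippet,
# then force the model to answer from that snippet only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Stand-in for a real vector store: a handful of trusted snippets (assumed).
TRUSTED_DOCS = [
    "Policy 4.2: refunds are available within 30 days of purchase.",
    "Policy 7.1: medical claims must cite the attached formulary.",
]

def retrieve(question: str) -> str:
    """Naive keyword-overlap retrieval; swap in embeddings for real use."""
    def score(doc: str) -> int:
        return len(set(question.lower().split()) & set(doc.lower().split()))
    return max(TRUSTED_DOCS, key=score)

def answer(question: str) -> str:
    context = retrieve(question)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system",
             "content": "Answer ONLY from the provided context. "
                        "If the context is insufficient, say 'I don't know.'"},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How long do customers have to request a refund?"))
```

The key design choice is the system prompt: the model is told it may refuse rather than improvise, which turns "I don't know" into an acceptable output instead of an invitation to hallucinate.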
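And the critic pass, with the same caveats: the `critique` helper, its prompt wording, and the example policy text are assumptions, not a canonical implementation. The pattern itself is generic: draft, then audit against the same trusted context, then revise or flag.

```python
# Second pass: a fresh model call audits the first draft against
# the same trusted context used to generate it.
from openai import OpenAI

client = OpenAI()

def critique(draft: str, context: str) -> str:
    """Ask the model to fact-check every claim in the draft."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system",
             "content": "Act as a fact-checker and verify every claim. "
                        "Label each claim in the draft SUPPORTED or "
                        "UNSUPPORTED by the context, quoting the evidence."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nDraft answer:\n{draft}"},
        ],
    )
    return response.choices[0].message.content

# Usage: gate the draft on the critic's verdict before showing it to a user.
context = "Policy 4.2: refunds are available within 30 days of purchase."
draft = "Customers may request a refund within 90 days."
print(critique(draft, context))  # should flag the 90-day claim as UNSUPPORTED
```

---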