AI Garage Jargon

Hallucination (AI)

A hallucination occurs when an AI model generates confident-sounding but incorrect or fabricated output, presenting invented content as fact.

Use Cases

  • Risk reviews in legal/finance
  • Fact-checking pipelines
  • Human approval workflows
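
The human approval workflow above can be sketched as a simple gate: output that cannot be verified against a vetted fact set is routed to a reviewer instead of being published. This is a minimal illustration, not a production fact-checker; the `KNOWN_FACTS` set and routing labels are assumptions made for the example.

```python
# Minimal sketch of a human-approval gate for possibly hallucinated output.
# KNOWN_FACTS is a stand-in for a real vetted knowledge source.
KNOWN_FACTS = {
    "Paris is the capital of France.",
    "Water boils at 100 C at sea level.",
}

def route_output(model_output: str) -> str:
    """Auto-approve only if every sentence matches a vetted fact;
    otherwise flag the output for human review."""
    sentences = [s.strip() + "." for s in model_output.split(".") if s.strip()]
    if sentences and all(s in KNOWN_FACTS for s in sentences):
        return "approved"
    return "needs_human_review"
```

In practice the exact-match lookup would be replaced by retrieval against trusted documents, but the control flow is the same: unverified claims never skip the human step.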

ELI5

The AI sounds sure, but it made things up.

Why it matters

Unchecked hallucinations erode user trust, create compliance risk in regulated domains, and can lead to costly business decisions based on fabricated information.
