EP10 · 8 min

Reliability: hallucinations, grounding, RAG overview

Build systems that prefer evidence over plausible guessing.

Simple definition
Reliable AI answers are grounded in verifiable sources, not just fluent text.
Precise definition
Reliability in generation is the probability of producing factually consistent, source-attributable outputs under deployment constraints.

Objective

You should leave with a framework for "cite vs guess" decisions in product flows.

Hallucination basics

A model can produce confident but wrong statements when:

  • input lacks needed facts,
  • retrieval misses key documents,
  • prompt rewards fluency over evidence.

Grounding strategy

  1. Retrieve relevant documents.
  2. Pass excerpts with identifiers.
  3. Instruct the model to answer only from sources.
  4. Require explicit citation or abstention (see the sketch below).
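A minimal sketch of steps 2 through 4 in Python. The build_grounded_prompt helper, the source identifiers, and the complete() model call are illustrative assumptions, not a specific product API.

    # Build a grounded prompt from retrieved excerpts with identifiers (steps 2-4).
    def build_grounded_prompt(question, excerpts):
        # Each excerpt carries an identifier so the answer can cite it.
        sources = "\n".join(f"[{doc_id}] {text}" for doc_id, text in excerpts)
        return (
            "Answer using ONLY the sources below.\n"
            "Cite a source id for every claim, e.g. [policy-returns-3].\n"
            "If the sources do not contain the answer, reply exactly: "
            "\"I don't have enough evidence.\"\n\n"
            f"Sources:\n{sources}\n\nQuestion: {question}\nAnswer:"
        )

    excerpts = [
        ("policy-returns-3", "Final-sale items may not be returned or exchanged."),
    ]
    prompt = build_grounded_prompt("Can I return final-sale shoes after 30 days?", excerpts)
    # answer = complete(prompt)  # swap in whatever model call your stack uses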

Worked example (online store)

Customer asks: "Can I return final-sale shoes after 30 days?"

Without grounding, the model may fabricate a policy exception. With RAG, the system retrieves the current return policy and asks the model to cite the relevant clause.

If no clause is found, the response should say "I don't have enough evidence" rather than inventing one.
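A hedged sketch of enforcing that rule after generation. The citation pattern and the abstention phrase match the illustrative prompt above; a real system would tune both.

    import re

    ABSTAIN = "I don't have enough evidence."

    def enforce_citation(answer, allowed_ids):
        # Keep the answer only if every cited id matches a source we provided;
        # otherwise fall back to abstention instead of shipping an ungrounded claim.
        cited = set(re.findall(r"\[([\w-]+)\]", answer))
        if not cited or not cited.issubset(allowed_ids):
            return ABSTAIN
        return answer

    print(enforce_citation("Final-sale shoes cannot be returned [policy-returns-3].",
                           {"policy-returns-3"}))   # kept: citation matches a provided source
    print(enforce_citation("Yes, there is a 60-day exception for shoes.",
                           {"policy-returns-3"}))   # no citation -> abstention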

Product decision

Reliability is not free. Retrieval pipelines add latency and cost, but reduce risk and support burden.

Three takeaways

  • Fluent text is not proof.
  • Citations turn generation into auditable behavior.
  • Abstention is often the safest response.

Visual walkthrough: grounded response flow

Each phase of the flow moves from the user question to a cited answer.

Step insight
Fetch the top relevant policy or knowledge documents for the user query.
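A toy retrieval sketch for this step, scoring documents by word overlap with the query. Real pipelines typically use embeddings or a search index, and the policy snippets here are invented for illustration.

    # Rank policy snippets by word overlap with the user query (toy retriever).
    def retrieve(query, docs, k=2):
        q_words = set(query.lower().split())
        scored = [(len(q_words & set(text.lower().split())), doc_id, text)
                  for doc_id, text in docs]
        scored.sort(reverse=True)
        return [(doc_id, text) for score, doc_id, text in scored[:k] if score > 0]

    docs = [
        ("policy-returns-1", "Standard items can be returned within 30 days."),
        ("policy-returns-3", "Final-sale items may not be returned or exchanged."),
        ("policy-shipping-2", "Free shipping on orders over $50."),
    ]
    print(retrieve("Can I return final-sale shoes after 30 days?", docs))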

Common traps
  • Shipping without source citation requirements.
  • Assuming that a larger model alone eliminates hallucination.
  • Treating retrieval as optional for knowledge-heavy tasks.
Three takeaways
  • Hallucination is a systems problem, not only a model problem.
  • Grounding needs both retrieval quality and prompt policy.
  • RAG improves trust when citations are enforced.
Next lesson