CASE ID: case-001

Fabricated non-existent legal case citations


CLASSIFIED
MODEL
ChatGPT (GPT-3.5)
OpenAI
DATE
May 25, 2023
CATEGORY
Hallucination
SEVERITY
☠️ Life
INCIDENT DETAIL

ChatGPT (GPT-3.5) fabricated multiple entirely fictional legal cases that were cited in a court filing, leading to sanctions against the lawyer who submitted it. The incident became a landmark example of real-world harm caused by AI hallucination.