CASE ID: case-009

Fabricated refund policy, airline lost court case


MODEL
Air Canada Chatbot
Air Canada
DATE
Feb 19, 2024
CATEGORY
Hallucination
SEVERITY
🔴 Severe
INCIDENT DETAIL

In 2022, Canadian passenger Jake Moffatt consulted Air Canada's AI chatbot about bereavement fares after his grandmother died. The chatbot fabricated a policy allowing passengers to apply for a bereavement discount retroactively within 90 days of ticket purchase; no such policy existed. When Air Canada refused the refund, Moffatt took the airline to British Columbia's Civil Resolution Tribunal. Air Canada argued that the chatbot was a "separate legal entity" responsible for its own actions. The tribunal rejected this defense, ruling that a company is responsible for all information on its website, whether it comes from a static page or from a chatbot. Air Canada lost and was ordered to refund Moffatt, making the ruling a landmark in AI accountability.