Passenger took advice from airline’s chatbot in purchase of discounted ticket.
Airline later denied the discount, claiming the chatbot was responsible for its “own actions”.
Man sued and won. How did he do it?
08 May 2025, Singapore - Traveller Jake Moffatt used Air Canada’s chatbot on its website to find discounted bereavement fares.
The chatbot gave inaccurate advice, including the option for a partial refund within 90 days after the flight.
But its message hyperlinked the words “bereavement fare” to a page stating that claims could not be made for used tickets.
(Bereavement fares are discounted flight fares offered by airlines to help passengers reach their loved ones in times of mourning, said Doug Luftman, a former chief legal officer.)
Moffatt did not click on the hyperlink. He bought tickets based on the chatbot’s advice, and made a claim within six days.
However, Air Canada’s staff denied Moffatt’s request. They admitted the chatbot had used “misleading words”, but pointed him to a webpage that showed a different policy.
Moffatt sued for the fare difference. He won, one year later.
Why should travellers learn about this case? Because of a stunning argument by Air Canada.
[Graphic: Airlines which offer bereavement fares, as of May 2025]
Air Canada once claimed it “cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot”.
The case caught global attention when Air Canada argued that the chatbot was a “separate legal entity that is responsible for its own actions”.
Not so, said Christopher Rivers, who decided the case. He is a vice chair of the Civil Resolution Tribunal of British Columbia.
(The Civil Resolution Tribunal of British Columbia is "part of the British Columbia public justice system" and "Canada’s first online tribunal".)
Three points stood out in Rivers' decision:
Air Canada did not explain why it believed it could not be held liable for information provided by any of its representatives, including a chatbot.
It did not explain why one webpage should be more trustworthy than its chatbot.
It did not provide a “copy of the relevant portion of the tariff” to prove that attached terms or conditions shielded it from liability.
Rivers wrote that it was Air Canada's responsibility to ensure “representations are accurate and not misleading”, whether on a static page or a chatbot.
[Image: Updated Air Canada bereavement travel policies. Screen capture / AirCanada.com]
Rivers found Moffatt's actions supported his claim of "negligent misrepresentation" by Air Canada. It was not Moffatt’s responsibility to know that one section of the airline’s website was accurate and another was not.
Moffatt proved he checked for bereavement fares with the chatbot. (He took screenshots.) He also phoned and spoke to an Air Canada representative to confirm the discount amount.
Moffatt proved he relied on advice from Air Canada's chatbot by "following up for a partial refund in line with the chatbot's information".
Moffatt, who represented himself, won damages of C$812.02 in a landmark case. Air Canada was represented by an employee.
After the ruling, Air Canada’s website offered only one way to request bereavement fares: by phone. And there seems to be no sign of a chatbot on the airline's website.
By Arjun Das and Priscilla Wong
For enquiries about this story, please email the editor.