This is truly an incredible case.
Jake Moffatt’s grandmother passed away, so he logged onto AirCanada.com and a chatbot told him how to buy a ticket and apply for a partial refund afterward by sending in proof of qualification for a bereavement fare.
Air Canada later denied the claim, saying that you have to call to request a bereavement fare in advance, and that their website states there are no refunds available afterward.
Luckily, Mr. Moffatt was a smart cookie. He took a screenshot of the chatbot’s offer.
Surely Air Canada would just honor that and retrain their chatbot, right?
Wrong.
The contempt that airlines have for their customers is just astounding, so I guess I shouldn’t be that surprised.
Air Canada was perhaps the worst actor for COVID-19 refunds for canceled flights. While other airlines were stingy, they generally paid up when a passenger filed a DOT complaint for a refund after the airline canceled a flight. Air Canada, on the other hand, resisted refunds even when faced with DOT complaints, and the Canadian Transportation Agency punted on enforcement. It wasn’t until April 2021 that Air Canada finally started refunding passengers.
Air Canada told the judge that “it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot.”
I mean, are you kidding me? If an agent gives you bad information, whose fault is that?
The judge rightly scolded Air Canada for this argument: “It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
Air Canada also argued that a $200 flight coupon it had offered should reduce the value of the refund, but the judge denied that as well, since Mr. Moffatt had rejected the coupon. It’s a good reminder to decline inadequate compensation.
Mr. Moffatt represented himself and won a total of $812.02. It just boggles my mind that Air Canada tried to fight this case rather than just settling it.
The airline told the Vancouver Sun that they will comply with the ruling and won’t appeal.
Air Canada has talked up its implementation of AI into its operations for years; perhaps it jumped on board too soon?
AI hallucination, when an AI provides incorrect or misleading information, is dreadfully commonplace. I wrote about the issue a year ago, and it still never ceases to amaze me how services like ChatGPT will insist that a wrong answer is correct while refusing to cite sources to back up the claim. Lawyers have even been fined for using ChatGPT in court filings without realizing that the AI had made up facts out of thin air.
The reasons for AI hallucinations are complex, but if you’re interested in them, this video attempts to tackle the question:
For AI to become truly useful, that’s going to have to be solved.
Though I will say that OpenAI Sora absolutely terrifies me.
Air Canada appears to have removed the chatbot from their website for the time being.
Kudos to the judge for the common sense decision!
9 Comments On "Air Canada: Don’t Trust Anything Our Agents Or Chatbots Tell You; Thoughts On AI Hallucinations"
Maybe the chatbot has said some other things that they were trying to avoid?
AC were Karens when it came to masks – Canadians in general are a different species.
The IRS publishes lots of instruction booklets as well as informational booklets on many topics. However, be wary of relying on those alone. Information published in those formats may not stand up in court. What really counts in general is the Internal Revenue Code (US Code Title 26) and Treasury Regulations. Who cares that the official government agency tasked with administration of the nation’s tax collection and processing is the body which published that information on the tax topic! It’s not legally definitive.
Really mind-boggling how small they are. Horrible PR over an $800 case. If it would’ve been me, I would’ve given up; kudos to Mr. Moffatt for actually taking it to court.
Great take. These types of posts make DansDeals fun.
Why are you worried about SORA? Torah said long ago, al pi shnayim eidim yakum davar (kosher eidim that is). It’s high time people stopped believing everything they see on a video
It’s not just airlines. I chatted with an Amex rep online to ask if I had already used a benefit (i.e., the airline credit). They told me I hadn’t and that I could still qualify. I made the purchase, and then they refused to honor it, stating that I had used it a few months back. They didn’t care that I had a copy of the chat (which they can see themselves). I called and spoke to a manager, and nothing doing… “sometimes the agent in the chat makes mistakes.”
So what are my options? Sue for $200??
Complain to the BBB. I recently had a complaint with Amex Offers which the higher-ups had to answer to. They avoided my problem, but it’s hard to see how they would wiggle out of your issue; I think you have a good shot there.
I have a recording of Amazon Alexa confidently explaining why vital wheat gluten is gluten free and therefore safe for those with celiac. Not just hallucinating on the facts, but also giving a detailed explanation that could be imagined to make sense. I asked the question numerous times in different ways over several days, and it’s so confident in its accuracy that I Google-searched it multiple times before writing this.
They weren’t arguing over a few hundred bucks; they were fighting the precedent. Thankfully, they lost…