Air Canada Loses Case After Hallucinating Chatbot Lied About Bereavement Policy

Air Canada lost a small claims case after its AI-powered chatbot hallucinated on the job and gave a grieving customer incorrect information about the airline’s bereavement policy.

The customer said they bought tickets after being misled by the AI chatbot, which gave answers inconsistent with Air Canada’s policy. Although the chatbot provided a link to the airline’s Bereavement Travel page, British Columbia’s Civil Resolution Tribunal ruled in favor of the customer and awarded $812.02 in damages and court fees, according to Forbes.

This snafu isn’t the first time AI has caused a ruckus for companies and customers. Hallucination is a prominent problem in AI, in which a system generates nonsensical or flat-out incorrect information. In 2018, WestJet’s chatbot referred a customer who was complimenting a crew member to a suicide hotline, according to CBC. Vectara, a start-up founded by former Google employees, estimates chatbots invent information between 3% and 27% of the time.

Air Canada launched its fleet of chatbots last year and plans to expand to AI-powered voice customer service bots as well. Air Canada’s chief information officer, Mel Crocker, has said AI “creates a better customer experience,” but that wasn’t the case in this instance, and the Tribunal agrees.

The problem arose when the customer used Air Canada’s chatbot to book flights after the death of their grandmother. The chatbot said the passenger could retroactively apply for bereavement fares. On the chatbot’s advice that they would receive a partial refund later, the customer paid $590 for a next-day flight, then another $627 for a return flight. In total, they spent $1,216 on tickets that the chatbot said would cost only $564 after the rebate, according to The Washington Post.

The customer took a screenshot, which they shared with the Tribunal.

“Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family… If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”

However, the chatbot was incorrect. Air Canada’s bereavement policy “does not allow refunds for travel that has already happened,” according to its policy page. This directly conflicts with what the chatbot told the customer. Air Canada initially offered the customer a $200 flight voucher, which they declined before filing a claim in small claims court.

Air Canada’s defense was that the chatbot was a separate entity. The airline also pointed out that the chatbot did supply the customer with a link to the correct information.

“Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot,” said Civil Resolution Tribunal Member Christopher C. Rivers. “In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.” 

The Tribunal ruled Air Canada did not adequately explain why the passenger should know not to trust the chatbot on its website and determined the case constituted “negligent misrepresentation.”

Rivers said Air Canada did not take “reasonable care to ensure its chatbot was accurate,” nor did it explain why the bereavement page was “inherently more trustworthy than its chatbot.”

“It should be obvious to Air Canada that it is responsible for all the information on its website,” said Rivers. “It makes no difference whether the information comes from a static page or a chatbot.” 

Air Canada did not respond to a request for comment, but the chatbot appears to have been removed from its website.
