Air Canada chatbot error shows liability implications of AI

Air Canada Boeing 777 taking off from Vancouver International Airport at sunset, Feb. 11, 2022.

Air Canada has been ordered to honour a policy fabricated by its AI customer service chatbot in a recent Civil Resolution Tribunal (CRT) dispute.

The decision is a cautionary tale about why companies need to ensure their AI chatbots provide accurate information, or risk being held liable in court.

The dispute arose after passenger Jake Moffatt booked a flight with Air Canada in Nov. 2022, following the death of a relative. While researching flight options, Moffatt asked the airline’s chatbot about bereavement fare options.

The chatbot said Moffatt could apply for bereavement fares retroactively. 

“If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form,” the chatbot’s response read, according to CRT.  

The chatbot hyperlinked to a separate Air Canada webpage titled ‘Bereavement travel’ with additional information about Air Canada’s bereavement policy. But contrary to the chatbot’s answer, the webpage said the bereavement policy does not apply once travel has been completed.

Based on the information from the chatbot, Moffatt booked a one-way flight from Vancouver to Toronto for $794.98, and a second one-way flight from Toronto to Vancouver for $845.38. 

After the flights, Moffatt pursued the reduced rate, but learned from an Air Canada employee over the telephone that the airline did not permit retroactive applications.

An Air Canada representative later admitted the chatbot had provided “misleading words,” and said the chatbot would be updated.


Moffatt argued they would not have taken the flight had they anticipated paying the full fare. 

Air Canada argued it could not be held liable for information provided by one of its agents, servants or representatives — including a chatbot.

“It does not explain why it believes that is the case,” tribunal member Christopher C. Rivers wrote in the decision. “In effect, Air Canada suggested the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission.

“While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.” 

Rivers also found no reason why customers should treat the webpage as inherently more trustworthy than the chatbot. And Air Canada could not explain why customers should have to double-check information found on one part of its website against another.

Rivers found Air Canada liable for negligent misrepresentation and ordered the airline to pay damages to Moffatt, plus pre-judgement interest and CRT fees, totalling $812.02.  

 

Feature image by iStock.com/Alvin Man