404: Responsibility Not Found - Air Canada Case, 2022
As I was looking for another case to introduce, I came across a problematic one involving what happens when an AI system acts on its own and no one is clearly accountable. In 2022, Jake Moffatt, a traveler trying to reach a family funeral in Toronto, asked Air Canada's website chatbot for help, only to receive incorrect information that led to a financial loss.
Jake checked Air Canada's website on the day his grandmother died and asked the chatbot how to handle bereavement fares. "If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form." was the response he got. However, this information was wrong: Air Canada's actual bereavement policy did not allow passengers to apply for the reduced rate after travel had been completed.
When he later asked for a refund, the airline argued that the chatbot was a separate entity responsible for its own words. The tribunal rejected this argument, concluding that the company was responsible for everything on its website, including its AI bot's messages, and ordered Air Canada to compensate him.
This ruling takes a clear stance: businesses cannot shift blame onto their AI systems. If they use AI to guide customers, they are responsible for the consequences. As with the Uber accident and the other AI-related cases I introduced, the problem isn't just that AI can make mistakes, but that organizations are adopting these tools much faster than regulation can keep up, leaving a clear gap in responsibility.