Canadian court holds Air Canada liable for information chatbot gave to consumer

Many companies have started putting automated chatbots on their websites to answer customer service questions, and Air Canada is one of them. Its chatbot told a customer that he could retroactively seek a bereavement discount for a flight he needed to purchase. When the customer later submitted the paperwork the bot had told him to complete, Air Canada denied the discount, saying it was only available before purchase; in other words, the chatbot had been wrong. Air Canada said the customer should have found the correct information on its website himself.

The customer sued, and Air Canada insisted it was not responsible for what the chatbot said. The Civil Resolution Tribunal, a British Columbia body akin to a small claims court, rejected that argument in a strongly worded decision:

Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.

The tribunal ordered Air Canada to pay approximately 800 CAD.
