That Wasn't Me, That Was My Chatbot

MLT Aikins LLP


MLT Aikins LLP is a full-service law firm of more than 300 lawyers with a deep commitment to Western Canada and an understanding of this market’s unique legal and business landscapes.
A recent case serves as a cautionary tale when using AI tools.


The artificial intelligence (AI) revolution is underway and its impact on society is undeniable. When AI is used in the workplace to assist with (or even replace) tasks that were previously carried out exclusively by humans, the potential benefits are great but the risks also need to be effectively managed.

Across industries, AI is reshaping the Canadian workplace by providing unprecedented advancements and efficiencies. However, the potential for AI to cause harm – through disinformation, perpetuating bias or compromising information security – cannot be overlooked. As AI begins to execute tasks once performed by employees, employers must navigate the fine line between harnessing its potential and mitigating the risks.

Moffatt v Air Canada

The recent case of Moffatt v Air Canada, 2024 BCCRT 149, serves as a cautionary tale for organizations, highlighting the risk associated with inaccurate information that may be generated by AI tools, such as chatbots.

In this case, a customer used a chatbot on an airline's website to inquire about discounted bereavement fares following his grandmother's death. The chatbot suggested that the customer could apply for bereavement fares retroactively. The customer relied on the chatbot, booked a flight and was later told by the airline's employees that he could not retroactively apply for a reduced bereavement fare. The customer sought a partial refund of the ticket price based on the information previously provided by the chatbot.

The airline argued it could not be held liable for information provided by the chatbot. The tribunal disagreed, stating the airline was responsible for the information on its website, regardless of whether the information was provided on a static webpage or through a chatbot accessed on the website. The airline had further argued that it should not be liable because the accurate bereavement policy information was available elsewhere on its website. The tribunal stated there was no reason why the customer should know that one section of the website is accurate and another section (i.e. the chatbot) is not.

The tribunal found that the airline did not take reasonable care to ensure the information provided by the chatbot was accurate and that the customer reasonably relied on that information. On this basis, the tribunal held that the airline was liable for negligent misrepresentation.

The tribunal awarded the customer $650.88 in damages. While not a large amount, this case underscores a critical point: AI is not an independent entity but an extension of the organization that deploys it. Organizations must ensure the accuracy of the information generated by AI, just as they would for information generated by their own employees.

Managing workforce impacts of AI

As AI's role in the workplace expands, and potentially begins to supplement or replace tasks traditionally carried out by humans, there are a number of considerations for employers. The concern over misrepresentations generated by AI, as illustrated in the above case, is only one aspect. Where AI results in significant changes to employees' job duties, either because duties are removed to be carried out by AI tools or new duties (such as verifying AI outputs) are added, employers should proactively address legal risks, such as constructive dismissal allegations. Extensive use of AI may even result in the creation of new roles or the downsizing of entire business units, in which case legal advice should be sought in advance.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
