Two recent developments highlight the challenges businesses may face when exploring ways to integrate AI-powered chatbots into their customer service offerings:
- A putative class action filed in California federal district court alleges that, through the use of an artificial intelligence (AI) tool, Home Depot and Google “tapped” customers’ calls with Home Depot’s customer service centers, in violation of California’s Invasion of Privacy Act (CIPA).
- A decision from the British Columbia Civil Resolution Tribunal has held that Air Canada is required to respect information provided to a customer through its AI chatbot, even if that information was incorrect.
CIPA complaint
On February 14, 2024, Christopher Barulich filed a putative class action complaint in the U.S. District Court for the Central District of California against Home Depot and Google, alleging that the companies violated CIPA Section 631 by using an AI-based customer service tool.1 This section prohibits, in relevant part, any reading, attempted reading, or learning of the content or meaning of any message or communication voluntarily and without the consent of all parties to the communication, and permits private rights of action.
Barulich alleges that Home Depot used Google’s Cloud Contact Center AI (CCAI), a technology through which customers first speak to an automated agent that “listens” to the customer service call, transcribes and analyzes the call in real time, and then suggests possible answers to a live Home Depot agent to whom the customer is then transferred. Barulich claims that by activating this process, Home Depot allowed Google to “access, record, read and know the content of (customers’) calls” without their prior consent. Barulich claims he did not know he was speaking to an automated agent or that the content of his calls had been forwarded to a third party (here, Google) for analysis.
Barulich also alleges that Home Depot and Google have “the ability to use the content of communications they intercept for purposes beyond the scope of individual customer service calls,” for example, “using information and data gleaned from customer service calls” to continue training or developing their AI models.
The complaint alleges that the aforementioned activity allowed Google to “eavesdrop” on live conversations between callers and Home Depot, in violation of CIPA Section 631. The complaint further alleges that Home Depot violated this section of CIPA by “knowingly and willfully allowing” Google to learn the content of these communications in real time.
Barulich seeks injunctive relief, recovery of $5,000 per CIPA violation (with no cap on total statutory damages) on behalf of a putative California class, and attorneys’ fees and costs.
Air Canada decision
On February 14, 2024, the Civil Resolution Tribunal of British Columbia found that Air Canada negligently misrepresented its bereavement airfare policy to plaintiff Jake Moffatt via its AI chatbot, and ordered Air Canada to reimburse Moffatt for the difference in airfare based on what the AI chatbot represented he was eligible to receive.2
In November 2022, Moffatt visited the Air Canada website to learn more about Air Canada’s bereavement fares. Moffatt interacted with an AI chatbot that inaccurately stated that Moffatt could request reimbursement at a reduced bereavement rate within 90 days of the ticket issue date. In the response generated by the chatbot, the words “bereavement fares” were highlighted, underlined, and linked to Air Canada’s actual bereavement policy, which did not allow for such retroactive reimbursement.
Relying on the information provided by the chatbot, and not the actual policy, Moffatt booked his flights and then submitted a bereavement fare request within the 90-day window specified by the chatbot. Air Canada acknowledged that the chatbot provided “misleading language” but claimed that the link provided by the chatbot was to the actual and correct policy.
Given the commercial relationship between the parties as service provider and consumer, the Tribunal concluded that Air Canada owed a duty of reasonable care to ensure that its representations were accurate and not misleading. The Tribunal also found that Air Canada was responsible for all information on its website, including information generated by an AI-powered chatbot: “In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. … While a chatbot has an interactive component, it is still just a part of Air Canada’s website.”
Given the chatbot’s misrepresentation of the policy, the Tribunal found that Air Canada failed to take reasonable steps to ensure its chatbot provided accurate information.
The Tribunal ordered Air Canada to reimburse Moffatt the difference between the fare he paid and the approximate bereavement fare for each flight as quoted by the Air Canada agent.
Key takeaways
The above developments highlight some of the issues businesses may face when deploying AI-powered customer service tools:
- There has been an uptick in threatened and filed litigation and arbitrations related to alleged CIPA violations. Companies using AI have been the target of pre-litigation demand letters and pre-arbitration notices of dispute, and have faced complaints and demands for arbitration invoking CIPA’s civil cause of action to challenge commercially reasonable, expected, and ubiquitous technology tools such as live chats, session replay software, cookies, trackers, and pixels on a company’s website. Note that violations of the law can also carry criminal penalties.
- Using AI to facilitate call center workflows is something many companies are exploring, but the new theory asserted in the Barulich complaint (and the potential for another wave of CIPA litigation and AI-focused mass arbitration efforts) should be factored into a company’s risk analysis when evaluating AI tools.
- Businesses that use AI providers should also review and update customer disclosures and website policies to address the use of third-party AI technologies, as necessary. For calls, companies should audit their practices to confirm that required disclosures are made at the start of call flows – whether with live agents or interactive voice response – so that callers are informed of any call recording.
- The Canadian decision in Moffatt highlights the risk businesses may face if they allow AI chatbots to articulate company rules and policies. It is unclear whether Air Canada would have prevailed had it displayed a disclaimer stating that the chatbot discussion was for general information purposes only and should not be relied upon, and had it made clear that the actual bereavement policy was determinative and should be consulted. Even so, businesses may want to consider whether such disclaimers are appropriate.
_______________
1 Barulich v. Home Depot, Inc., No. 2:24-cv-01253 (C.D. Cal. Feb. 14, 2024).
2 Moffatt v. Air Canada, 2024 BCCRT 149, ¶¶ 32, 40 (Can. B.C. C.R.T.).