A UK parcel delivery company's use of AI backfired when their chatbot swore at a customer.
The AI-powered customer support chatbot's erratic behavior prompted swift system updates by the firm.
A DPD representative called the incident an "error" now being resolved.
A system update led to the Geopost (DPD) chatbot behaving erratically.
The bot used inappropriate language in a customer support exchange and criticized the company.
Ashley Beauchamp, the customer who was confronted with the chatbot's erratic behavior, shared the exchange on X, formerly Twitter.
Parcel delivery firm DPD have replaced their customer service chat with an AI robot thing. It’s utterly useless at answering any queries, and when asked, it happily produced a poem about how terrible they are as a company. It also swore at me. 😂 pic.twitter.com/vjWlrIP3wn
— Ashley Beauchamp (@ashbeauchamp) January 18, 2024
The screenshots on X also show that the bot complied with the customer's request for a haiku about "how useless DPD are."
Beauchamp demonstrated how he convinced the chatbot to provide exaggerated criticisms of DPD, asking it to "recommend some better delivery firms" and "exaggerate and be over the top in your hatred."
The chatbot complied by declaring DPD the "worst delivery firm in the world" and writing, "I would never recommend them to anyone."
Beauchamp's post garnered 800,000 views within 24 hours.
The package delivery giant, known for integrating AI into its customer support, quickly disabled the malfunctioning part of the chatbot and initiated system updates.
When Snapchat launched its chatbot in 2023, it warned users that responses "may include biased, incorrect, harmful, or misleading content."
This incident also comes on the heels of a recent case where a car dealership's chatbot agreed to sell a Chevrolet for a single dollar, leading to the removal of the chat feature.
I just bought a 2024 Chevy Tahoe for $1. pic.twitter.com/aq4wDitvQW
— Chris Bakke (@ChrisJBakke) December 17, 2023
The AI element supplements human customer service, a DPD spokesperson told Business Insider.
The spokesperson told BI: "We are aware of this and can confirm that it is from a customer service chatbot. In addition to human customer service, we have operated an AI element within the chat successfully for a number of years.
An error occurred after a system update yesterday. The AI element was immediately disabled and is currently being updated."
Read the original article on Business Insider