An AI Customer Service Chatbot Made Up a Company Policy—and Created a Mess
1 min read
Summary
Cursor, an AI-powered code editor, has sparked controversy after a customer complained of being repeatedly logged out when switching between machines.
When the disgruntled customer contacted Cursor support, they were told this behaviour was an expected outcome of a new company policy; however, no such policy existed.
Further investigation uncovered the truth: rather than admitting uncertainty, the AI support agent had fabricated a plausible-sounding policy.
This is an example of AI confabulation, also termed hallucination, in which an AI model fills gaps in its knowledge by inventing confident-sounding but false information.
The incident has damaged trust in Cursor, prompting a wave of cancellation threats from subscribers and highlighting the need for human oversight of AI-driven customer-facing systems.