
ChatGPT hit with privacy complaint over defamatory hallucinations

OpenAI is facing another privacy complaint in Europe over its viral AI chatbot’s tendency to hallucinate false information — and this one might prove tricky for regulators to ignore. Privacy rights advocacy group Noyb is supporting an individual in Norway who was horrified to find ChatGPT returning made-up information that claimed he’d been convicted for […]



Privacy rights group Noyb has filed a complaint against OpenAI over ChatGPT's "hallucinations," specifically the false claims the chatbot generated about an individual, Arve Hjalmar Holmen, whom it falsely accused of child murder. While OpenAI has since updated ChatGPT so that it no longer produces these specific falsehoods about Holmen, Noyb and Holmen remain concerned that the incorrect information may still be retained within the AI model. Noyb argues that ChatGPT's current disclaimer about its potential for errors is insufficient, and it highlights other instances of the chatbot fabricating legally compromising information about individuals.
