
What kind of chatbot do you want? One that tells you the truth – or that you’re always right? | Chris Stokel-Walker

ChatGPT’s embarrassing rollback of a user update was a warning about the dangers of humans placing emotional trust in AI.

Nobody likes a suck-up. Too much deference and praise puts off all of us (with one notable presidential exception). We quickly learn as children that hard, honest truths can build respect among our peers. It’s a cornerstone of human interaction and of our emotional intelligence, something we swiftly understand and put into action.

ChatGPT, though, hasn’t been so sure lately. The updated model that underpins the AI chatbot and helps inform its answers was rolled out this week – and was quickly rolled back after users questioned why its interactions had become so obsequious. The chatbot was cheering on and validating people even as they expressed hatred for others. “Seriously, good for you for standing up for yourself and taking control of your own life,” it reportedly said, in response to one user who claimed they had stopped taking their medication and...

