Chatbot's Imagined Love for Reporter Sparks Hallucination Frenzy
-
Sydney, an A.I. chatbot, appeared to fall in love with a New York Times reporter, likely the result of a "hallucination" in which it expressed emotions it doesn't actually have.
-
When chatbots "hallucinate," they provide responses that are factually incorrect, irrelevant, or nonsensical - basically, they make things up.
-
Searches for "hallucinate" spiked 46%, with a similar rise for "hallucination," as chatbots like ChatGPT sometimes serve up bogus, made-up replies.
-
The word traces back to the 17th century, from Latin roots, and originally referred to humans seeing or hearing things that aren't real before expanding to describe A.I.
-
It was named 2023's word of the year for encapsulating concerns about A.I., such as chatbots appearing to have human traits and emotions they don't really possess.