New Snapchat AI Tool “Open To Hallucinations”


Ars Technica, Monday, February 27, 2023





“Sorry in advance!” Snapchat warns of hallucinations with new AI conversation bot


Snapchat has launched a new chatbot called My AI, but warns that the product, available to Snapchat+ subscribers for $3.99 a month, is “open to hallucinations.” That means it can make inaccurate statements about subjects that aren’t covered in its training data set.



On Monday, Snapchat announced an experimental AI-powered conversational chatbot called “My AI,” powered by ChatGPT-style technology from OpenAI. My AI will be available for $3.99 a month for Snapchat+ subscribers and is rolling out “this week,” according to a news post from Snap, Inc.


Users will be able to personalize the AI bot by giving it a custom name. Conversations with the AI model will take place in a similar interface to a regular chat with a human. “The big idea is that in addition to talking to our friends and family every day, we’re going to talk to AI every day,” Snap CEO Evan Spiegel told The Verge.


But like its GPT-powered cousins, ChatGPT and Bing Chat, Snap says that My AI is prone to “hallucinations,” which are unexpected falsehoods generated by an AI model. On this point, Snap includes a rather lengthy disclaimer in its My AI announcement post:



“As with all AI-powered chatbots, My AI is prone to hallucination and can be tricked into saying just about anything. Please be aware of its many deficiencies and sorry in advance! All conversations with My AI will be stored and may be reviewed to improve the product experience. Please do not share any secrets with My AI and do not rely on it for advice.”


Among machine-learning researchers, “hallucination” is a term that describes when an AI model makes inaccurate inferences about a subject or situation that isn’t covered in its training data set. It’s a well-known drawback of current large language models such as ChatGPT, which can easily make up convincing-sounding falsehoods, such as academic papers that don’t exist and inaccurate biographies.


Story at Ars Technica »

