Why ChatGPT answered queries in gibberish on Tuesday

[Screenshot: ChatGPT goes colorfully crazy. Steven Vaughan-Nichols/ZDNET]

We all know that OpenAI's ChatGPT can make mistakes. They're called hallucinations, although I prefer to call them lies or blunders. But in a peculiar turn of events this Tuesday, ChatGPT really began to lose it. Users started to report bizarre and erratic responses from everyone's favorite …

ChatGPT meltdown: Users puzzled by bizarre gibberish bug

ChatGPT hallucinates. We all know this already. But on Tuesday, it seemed as if someone slipped on a banana peel at OpenAI headquarters and switched on a fun new experimental chatbot called the Synonym Scrambler. Actually, ChatGPT was freaking out in many ways yesterday, but one recurring theme was that it …