Why ChatGPT answered queries in gibberish on Tuesday
ChatGPT goes colorfully crazy. Screenshot by Steven Vaughan-Nichols/ZDNET

We all know that OpenAI's ChatGPT can make mistakes. They're called hallucinations, although I prefer to call them lies or blunders. But in a peculiar turn of events this Tuesday, ChatGPT began to really lose it. Users started to report bizarre and erratic responses from everyone's favorite AI chatbot.