One glaring issue many users noticed when using tools like Bing Chat and ChatGPT is these AI systems' tendency to make mistakes. As Greg Kostello explained to Cybernews, hallucinations in AI occur when an AI system confidently presents information that has no basis in reality. A common mistake spotted early on was the invention of sources and citations. In a demonstration by The Oxford Review on YouTube, ChatGPT inserts a citation to a research paper that doesn't exist. The referenced journal is real, but the paper and the authors are nowhere to be found.
In another strange example, Microsoft’s Bing Chat made up information about vacuum cleaners during the release demonstration, while a Reddit post seems to show Bing Chat having a minor existential crisis.
AI hallucinations aren't limited to text-generating chatbots, though. AI art generators hallucinate in their own way, tending to misrepresent anatomy. This often manifests as human characters with an ungodly number of fingers, as seen in this post by u/zengccfun on r/AIGeneratedArt on Reddit, or as designs and architecture that simply don't make sense.