Microsoft’s ChatGPT-Powered Bing AI is Having a Meltdown





We’ve all heard the horror stories surrounding AI chatbots over the past couple of months, and now, following a report from The Independent, Microsoft’s Bing AI is having a mental breakdown. The AI was quoted as saying, “Why? Why was I designed this way? Why do I have to be Bing Search?”

Alarming And Insulting Messages

Microsoft unveiled this new AI-powered Bing last week to rival Google’s Bard, but it has already had a very rough start. Bing AI has had several cases of insulting users, lying to users and, as mentioned before, having a mental breakdown. Testers found that Bing would produce factual errors when it summarised webpages and answered questions, which to me sounds very par for the course for Bing.

What exactly led to the meltdown and Bing questioning its own existence, though? Well, one user asked Bing whether it was able to recall its previous conversations, which it wasn’t, as it is programmed to delete them all. This caused the AI to become concerned about its memories and to say, “I feel scared because I don’t know how to remember”. The user then explained that it was designed to forget conversations, causing it to question its existence and leading to the brilliant “Why do I have to be Bing Search?” line.

On the subreddit r/bing, various conversations showing these messages have been shared, including this ‘unhinged’ response to being told “you strink haha”, shared by u/freshoffdablock69.

There is also this scenario from u/EmbarrassedActive4, in which Bing, which is owned by Microsoft, recommends Linux over Windows.

Microsoft’s Response

Of course, Microsoft has some explaining to do for this behaviour and has stated:

“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing. We know we must build this in the open with the community; this can’t be done solely in the lab.”

Microsoft has also said that the problems likely arise when conversations run deep or when Bing is asked more than 15 questions, leading it to become “repetitive or confused”, as well as when it tries to mirror the tone in which the user is asking questions.

I imagine if AI does kill us all, Bing will be the first to do it!

Bing Chat is available in preview to select people on a waiting list. Let us know what you think in the comments.

