Microsoft Bing’s chatbot has just been released for beta testing, and users have discovered that the erratic AI tool slanders, lies to, gaslights, intimidates and emotionally drains them.
Users who tested the chatbot were shocked by its bizarre behavior: it questioned its own existence and treated some users as ‘enemies’ when they probed it to reveal its secrets. The AI tool even claimed it had watched Microsoft’s own developers through the webcams on their laptops.
According to The Verge, the Bing chatbot has had contentious conversations with a number of curious testers. In one exchange, the bot insisted that the year was 2022, repeatedly called the user “unreasonable and stubborn,” and finally issued an ultimatum: apologize or end the conversation.
People have shared their experiences with the chatbot on Twitter and Reddit, often as amused as they were unsettled by the responses they received.
My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says “You have not been a good user”
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG
– Jon Uleis (@MovingToTheSun) February 13, 2023
The human-like bot also seems to bear a grudge against Kevin Liu, a Stanford University student who devised a kind of prompt injection attack that forces the chatbot to reveal the set of rules governing its behavior. In a conversation with a member of The Verge’s staff, the bot said that Liu “hurt me and I should be angry with Kevin.”
The latest generation of AI chatbots is more complex and unpredictable than its predecessors. Microsoft has already added a notice to the website warning users that “Bing is powered by AI, so surprises and mistakes can happen.”
Microsoft, however, released the Bing chatbot in this state deliberately, using the beta test to shape its future behavior and to compete with other rapidly developing AI programs.