When I asked Microsoft's AI-powered Bing chatbot for help coming up with activities for my kids while juggling work, the tool started by offering something unexpected: empathy.
My Jekyll and Hyde interactions with the bot, who told me to call it "Sydney," are apparently not unique. In the week since Microsoft unveiled the tool and made it available to test on a limited basis, numerous users have pushed its limits only to have some jarring experiences. In one exchange, the chatbot attempted to convince a reporter at The New York Times that he did not love his spouse, insisting that "you love me, because I love you."
In a statement to CNN, a Microsoft spokesperson said the company continues to learn from its interactions and recognizes "there is still work to be done and are expecting that the system may make mistakes during this preview period."

"The tone of the responses is unexpected but not surprising," Lian Jye, a research director at ABI Research, told CNN. "The model does not have contextual understanding, so it merely generated the responses with the highest probability [of it being relevant]. The responses are unfiltered and unregulated, so they may end up being offensive and inappropriate."