Microsoft Testers Notice Unusual Bing AI Chatbot Responses: What Was So Bizarre?

Microsoft's Bing team discovers bizarre behaviors that the AI chatbot does when answering a question.

Microsoft wants to see what Bing's new AI is capable of. The Redmond tech giant just added this feature to the Edge browser a week ago.

According to the testers, the chatbot is showing bizarre behaviors when it answers certain questions. As a matter of fact, it's still far from being perfect software.

Bing Chatbot's Bizarre Behavior

Microsoft Bing AI
Yusuf Mehdi, Microsoft Corporate Vice President of Modern Life, Search, and Devices, speaks during a keynote address announcing ChatGPT integration for Bing at Microsoft in Redmond, Washington, on February 7, 2023. - Microsoft's long-struggling Bing search engine will integrate the powerful capabilities of language-based artificial intelligence, CEO Satya Nadella said, declaring what he called a new era for online search. by JASON REDMOND/AFP via Getty Images

As expected from a newly-released product, Microsoft's dedicated AI for the Edge browser came out unpolished. Some users think that the company further needs to improve its ability to respond to different issues.

According to a report by Engadget, the AI has been displaying an "unhinged behavior" which appears to be bizarre in every manner.

As such, it spat out false information about "Avatar: The Way of the Water." Bing's AI chatbot told the user "unreasonable and stubborn" and kept on telling that the movie has not yet been released in 2022.

Although the user told Bing that the information was erroneous, the AI refused to accept the criticism.

With regard to this series of strange responses, Microsoft wrote in its recent post that Bing's AI is flawed. Using it on the search engine could be a "bad idea" for the users.

"Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone," the tech firm said.

The generated error could be a result of continuous asking of questions to the point that it tends to forget what it has to say to the user.

How Can Microsoft Engineers Fix this Chatbot Behavior

Microsoft proposes a solution to solve the problem with Bing AI's behavior. The company plans to introduce a tool that will reset the context of a question on the search bar.

Microsoft acknowledges the complexity of the model as it responds with a corresponding tone. The company suggests that non-stop prompting could fix it. With that being said, the company will give more control to the users who will utilize AI.

Even though Bing's AI is discovered to produce strange responses, it was cited in the report that there are areas that go well with it.

For instance, the chatbot provides timely data for live sports events. It can give you a piece of advice on how to improve the financial reports, as well.

At the moment, the trial for the Bing chatbot is ongoing. The testers hope that through this experiment, they could add more improvements to this product so it will be more usable for the next years to come.

Meanwhile, The Korea Herald reported that ChatGPT might not be suited to identifying Korean names. The program tends to be confused in differentiating the identity of Yoon Suk Yeol, the country's president, and Lee Jae-Myung, his archrival.

Elsewhere, CNBC wrote that Google employees can train Bard by giving good examples of the problem. This way, they can fix how the AI replies to simple queries.

Joseph Henry
Tech Times

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.
Join the Discussion
Real Time Analytics