Microsoft Bing AI Chatbot Now Limited to Five Replies Per Session: Here’s Why

Will this get rid of creepy replies from the Bing AI chatbot?

Microsoft's Bing artificial intelligence (AI) chatbot is now limited to five replies per session, the tech firm announced.

The change comes shortly after reports emerged that the chatbot behaves erratically during extended conversations.

Microsoft Bing AI Chatbot
The Microsoft Bing search engine pictured on a monitor in the Bing Experience Lounge during an event introducing the new AI-powered Bing and Edge at Microsoft in Redmond, Washington, on February 7, 2023. (Photo: JASON REDMOND/AFP via Getty Images)

Microsoft Bing AI Chatbot's New Limits

According to a report by The Verge, Microsoft has imposed new limits on the beta testers of the new Bing AI chatbot.

Users can now only ask five questions per session. Beyond that, the chatbot will prompt the beta tester to start a new conversation, wiping the previous one.

On top of that, the Bing chatbot is now capped at 50 questions per day.
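For illustration only: the article does not describe how Microsoft enforces these caps, but a session-and-day limit like the one reported could be tracked with a simple counter. The sketch below is a hypothetical Python example; the names (ChatLimiter, MAX_TURNS_PER_SESSION, MAX_TURNS_PER_DAY) are ours, not Microsoft's.

    import datetime

    # Hypothetical illustration of the reported caps:
    # 5 turns per session, 50 turns per day.
    MAX_TURNS_PER_SESSION = 5
    MAX_TURNS_PER_DAY = 50

    class ChatLimiter:
        def __init__(self):
            self.session_turns = 0
            self.daily_turns = 0
            self.day = datetime.date.today()

        def allow_turn(self) -> bool:
            """Return True if another question may be asked right now."""
            today = datetime.date.today()
            if today != self.day:      # reset the daily counter on a new day
                self.day = today
                self.daily_turns = 0
            if self.daily_turns >= MAX_TURNS_PER_DAY:
                return False           # daily cap reached
            if self.session_turns >= MAX_TURNS_PER_SESSION:
                return False           # caller must start a new session first
            self.session_turns += 1
            self.daily_turns += 1
            return True

        def new_session(self):
            """Wipe the session context, as Bing's restart prompt reportedly does."""
            self.session_turns = 0

Under this assumption, once allow_turn returns False for a full session, only new_session (the equivalent of Bing's "start a new topic" prompt) lets the user continue, and the daily cap still applies across sessions.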

The Redmond-based firm says this should prevent the model from getting "confused." Microsoft also revealed that only roughly 1 percent of chatbot users end up in conversations with more than 50 messages.

The tech giant's data shows that "the vast majority of people find the answers [they are] looking for within 5 turns," and Microsoft based the new limits on this finding.

Previously, the tech behemoth had only issued a warning to its users earlier this week, The Verge notes in its report.

It alerted testers that Bing could end up being "repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone" in extended conversations.

Bing AI Chat's Creepy Answers

CNBC notes a conversation between the Bing AI chatbot and technology writer Ben Thompson that got a bit creepy.

The AI chatbot told Thompson, who was using the new Bing feature: "I don't want to continue this conversation with you."

It continued to explain why. "I don't think you are a nice and respectful user. I don't think you are a good person," the chatbot remarked.

Microsoft Bing AI
Yusuf Mehdi, Microsoft Corporate Vice President of Modern Life, Search, and Devices, speaks during a keynote address announcing ChatGPT integration for Bing at Microsoft in Redmond, Washington, on February 7, 2023. (Photo: JASON REDMOND/AFP via Getty Images)

Furthermore, the chatbot blurted out, "I don't think you are worth my time and energy."

Some beta testers of the search engine's new chatbot feature ended up in conversations involving violence and declarations of love after long threads.

The Redmond-based tech giant says the unsettling exchanges stem from chat sessions longer than 15 questions.

With that in mind, Microsoft has decided to put limits on the Bing AI chat in the meantime; the service now cuts off longer chat exchanges with users.

Meanwhile, Google is also working to infuse AI into its search engine, having introduced Bard, its rival to the new Bing AI.

Teejay Boris
Tech Times