Microsoft's Bing artificial intelligence (AI) chatbot is now limited to five replies per session, the renowned tech firm announced.
The move comes shortly after reports emerged that the chatbot gets pretty weird during extended conversations.
Microsoft Bing AI Chatbot New Limit
According to a report by The Verge, Microsoft has imposed new limits on the beta testers of the new Bing AI chatbot.
Users can now only ask five questions per session. Beyond that, the chatbot will now prompt the beta tester to start a new conversation, wiping the previous one.
On top of that, the Bing chatbot now accepts only 50 questions per day.
The Redmond-based firm says this should prevent the model from getting "confused." Microsoft also revealed that only roughly 1 percent of chatbot users end up in conversations with more than 50 messages.
The tech giant's data shows that "the vast majority of people find the answers [they are] looking for within 5 turns," and Microsoft based the new limits on this finding.
Previously, the tech behemoth had only issued a warning to its users earlier this week, The Verge notes in its report.
It alerted testers that Bing could end up being "repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone" in extended conversations.
Read Also: Microsoft Bing with ChatGPT Now Available for Desktop Integration, Coming to Mobile, iOS Soon
Bing AI Chat Creepy Answers
CNBC notes a conversation between the Bing AI chatbot and technology writer Ben Thompson that got a bit creepy.
The AI chatbot told Thompson, who was trying out the new Bing feature, "I don't want to continue this conversation with you."
It continued to explain why. "I don't think you are a nice and respectful user. I don't think you are a good person," the chatbot remarked.
Furthermore, the chatbot blurted out, "I don't think you are worth my time and energy."
Some beta testers of the search engine's new chatbot feature found that long threads veered into talk of violence and declarations of love.
The Redmond-based tech giant says the unsettling exchanges stem from chat sessions longer than 15 questions.
With that in mind, Microsoft has decided to impose limits on Bing AI chat for the time being. The service now cuts off longer exchanges with users.
Meanwhile, Google is also working to infuse AI into its search engine, having introduced Bard, a rival to the new Bing AI.
Related Article: Microsoft Bing ChatGPT vs. Google Bard: Redmond Event Announced Minutes After Google's AI Reveal