Microsoft tells us why its Bing chatbot went off the rails

And it's all your fault, people - well, those of you who drove the AI chatbot to distraction with lengthy bombardments of questions.


The new Bing chatbot has been in the headlines for the wrong reasons lately due to some of its very odd (or just plain unhinged) responses to users - but Microsoft has come up with some explanations for its behavior.

You may have heard tales of the Bing AI refusing to provide listings for the new Avatar movie, claiming that this wasn't possible as 'The Way of Water' hadn't yet been released, and that the year was still 2022.

Or the bizarre experience of Kevin Roose of the New York Times, who shared some disturbing revelations from a long chat with the AI.

So what gives? Well, Microsoft wrote a blog post to detail what it has learned in the first week of the Bing chatbot going live, and one of the key points revealed is that the company didn't expect users to be going to the AI for the likes of "social entertainment" (ahem).

Furthermore, Microsoft didn't anticipate people engaging with the chatbot for such lengthy sessions, such as the two-hour marathon that Roose engaged in.

The blog post further explains that in longer chat sessions exceeding 15 questions, the ChatGPT-powered Bing can become repetitive, or be provoked to "give responses that are not necessarily helpful" and that diverge considerably from Microsoft's intended tone.

Microsoft informs us: "Very long chat sessions can confuse the model on what questions it is answering and thus we think we may need to add a tool so you can more easily refresh the context or start from scratch."

So, expect a 'reset' button of sorts to be added to the AI for when it starts chasing its tail or getting confused. Although, of course, part of the fun for users is getting Bing to do exactly that (which presumably qualifies as "social entertainment").

Microsoft also notes that scenarios where the AI model responds in a tone mirroring the user's queries - potentially the source of some of the weirder replies - aren't all that likely to crop up. Or as the company puts it, this is a "non-trivial scenario that requires a lot of prompting," and most folks won't get there.

Apparently, one incoming measure will be the introduction of a toggle that will give the user more fine-grained control in terms of the balance of precision versus creativity when it comes to Bing's answers. (Cue users ramping the creativity levels right up and embarking on a 12-hour chat session to attempt to drive the chatbot truly insane).

The blog post finishes by thanking folks for their attempts at pushing the limits of various use cases, including those massively long chats, and noting that all of this is helping Microsoft tune the Bing chatbot.

Clearly, there's still a good bit of tuning to be done - to say the least - but to be fair, it's still early days. And it's not like rival services such as Google Bard aren't also running into, shall we say, a spot of trouble here and there.

