Bing Chat Gets Update to Make It Less Stubborn

  By: Nick Gambino

Bing Chat is an AI chatbot that uses ChatGPT to offer a more unique search experience. But like any AI chatbot trying to act human, this one has a tendency to get weird at times, much like a petulant child that gets confused.

It’s not that Bing Chat won’t answer questions, but when you ask it to create something, like an image or a piece of writing, it will sometimes flat out refuse. That’s probably its most human quality – being unhelpful. It wasn’t doing this back when it launched. It seems it picked up the habit the more it engaged with us lovely people.

Microsoft has just introduced an update to Bing Chat that will hopefully fix this problem.

“Starting to ship Prompt v98 today: it is a two-stage process, by tomorrow you should see big reduction in the number of cases when Bing Chat refuses to create something (write code, for example),” an April 5th tweet from the head of Advertising and Web Services at Microsoft, Mikhail Parakhin, reads. “Then the second stage will be deployed, reducing disengagements.”


So in the first stage we should see Bing Chat being less stubborn and more helpful, rather than flat out refusing to carry out a task. In the second stage, there should be fewer instances where the AI chatbot simply calls the task done and “disengages.”

In another tweet, Parakhin stated that AI image creation in Bing Chat is “also rapidly improving” and should continue to become less restrictive. Since the AI learns as it goes, tweaks and fixes like these should help it course correct and become more helpful.

Most importantly, Microsoft is aware of these problems, is acknowledging them, and is actively working on getting things right. Bing Chat has proven pretty popular, and with Bing’s history as the black sheep of the search engine family, it’s a win the company so desperately needed.