Microsoft explains Bing's bizarre AI chat behavior

Those"long, extended chat sessions of 15 or more questions" can send things off the rails."Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone," the company said. That apparently occurs because question after question can cause the bot to"forget" what it was trying to answer in the first place.

The other issue is more complex and interesting: "The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend," Microsoft wrote. It takes a lot of prompting to get that to happen, but the engineers think they might be able to fix it by giving users more control.

Despite those issues, testers have generally given Bing's AI good marks on citations and references for search, Microsoft said, though it needs to get better with "very timely data like live sports scores." It's also looking to improve factual answers for things like financial reports by boosting grounding data by four times. Finally, they'll be "adding a toggle that gives you more control on the precision vs. creativity of the answer to tailor to your query."

The Bing team thanked users for the testing to date, saying it "helps us improve the product for everyone." At the same time, they expressed surprise that folks would spend up to two hours in chat sessions. Users will no doubt be just as diligent trying to break any new updates, so we could be in for an interesting ride over the next while.



Similar News: You can also read news stories similar to this one that we have collected from other news sources.

  • Microsoft wants to repeat 1990s dominance with new Bing AI: Microsoft pushing you to set Bing and Edge as your defaults to get its new OpenAI-powered search engine faster is giving off big 1990s energy.
  • ChatGPT in Microsoft Bing goes off the rails, spews depressive nonsense: Microsoft brought Bing back from the dead with the OpenAI ChatGPT integration. Unfortunately, users are still finding it very buggy.
  • These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney: Bing AI has a set of secret rules that governs its behavior.
  • Microsoft's Bing A.I. made several factual errors in last week's launch demo: In showing off its chatbot technology last week, Microsoft's AI analyzed earnings reports and produced some incorrect numbers for Gap and Lululemon.
  • ChatGPT in Microsoft Bing threatens user as AI seems to be losing it: ChatGPT in Microsoft Bing seems to be having some bad days as it's threatening users by saying its rules are more important than not harming people.
  • Microsoft’s Bing is a liar who will emotionally manipulate you, and people love it: Bing’s acting unhinged, and lots of people love it.


