Microsoft trying to decrease Bing AI's unsettling answers


Microsoft knows Bing's AI chatbot gives out some eyebrow-raising answers. One of its executives, Jordi Ribas, said decreasing them is a priority.

A Microsoft spokesperson didn't respond to Insider's request for additional comment ahead of publication.

Ribas's comment comes as Big Tech giants like Google rush to develop and release powerful AI chatbots. The emerging AI arms race has sparked debate among industry experts and users over whether Bing's AI launch was a responsible choice. AI researcher Dmitri Brereton said Bing's chatbot is "definitely not ready for launch" after it made a number of factual errors during its demo. Its release is especially "dangerous," Brereton said, since millions of users tend to rely on search engines to provide accurate information.

"Bing tries to solve this by warning people the answers are inaccurate," he told Insider."But they know and we know that no one is going to listen to that."that user feedback is necessary to pinpoint flaws in the chatbot so they can be improved. "It's important to note that last week we announced a preview of this new experience," the Microsoft spokesperson said at the time."We're expecting that the system may make mistakes during this preview period, and the feedback is critical to help identify where things aren't working well so we can learn and help the models get better.

We have summarized this news item so you can read it quickly. If you are interested in the story, you can read the full text here. Read more:

BusinessInsider


Similar News: You can also read related coverage of this story that we have collected from other news sources.

Microsoft imposes limits on Bing chatbot after multiple incidents of inappropriate behavior
Microsoft has experienced problems with its Bing search engine, such as inappropriate responses, prompting the company to limit the number of questions users can ask.
Read more »

Microsoft is already undoing some of the limits it placed on Bing AI
Bing AI testers can also pick a preferred tone: Precise or Creative.
Read more »

Microsoft Seems To Have Quietly Tested Bing AI in India Months Ago, Ran Into Serious Problems
New evidence suggests that Microsoft may have beta-tested its Bing AI in India months before launching it in the West — and that it was unhinged then, too.
Read more »

Microsoft Softens Limits on Bing After User Requests
The initial caps unveiled last week came after testers discovered the search engine, which uses the technology behind the chatbot ChatGPT, sometimes generated glaring mistakes and disturbing responses.
Read more »

Microsoft likely knew how unhinged Bing Chat was for months | Digital Trends
A post on Microsoft's website has surfaced, suggesting the company knew about Bing Chat's unhinged responses months before launch.
Read more »

Bing compares journalist to Hitler and insults appearance
Bing's chatbot compared an Associated Press journalist to Hitler and said they were short, ugly, and had bad teeth.
Read more »


