Microsoft says that if you ask new Bing too many questions, it will hallucinate

Microsoft has published a blog post revealing how the new Bing is faring during the current testing period.

This ain't your grandpappy's Bing, that's for sure. Microsoft's search engine is testing ChatGPT integration for those who put their names on a waitlist. If Microsoft is hoping that the new Bing will eventually take over the global search market from Google, some fine-tuning is going to be required. In Wednesday's blog post, the guys in Redmond, Washington first point out that the AI feature being tested is not a search engine substitute.

Since AI chatbots tend to give false answers, a problem known in the AI world as "hallucinations," don't be surprised if some responses are wrong. To get more helpful, focused, and accurate answers from Bing, Microsoft says users need to limit the number of questions asked in one long, extended session to fewer than 15.

We have summarized this news story so that you can read it quickly. If you are interested in the story, you can read the full text here. Read more:

Source: PhoneArena

Similar News: You can also read news similar to this story that we have collected from other news sources.

Microsoft's Bing A.I. made several factual errors in last week's launch demo
In showing off its chatbot technology last week, Microsoft's AI analyzed earnings reports and produced some incorrect numbers for Gap and Lululemon.
Read more »

ChatGPT in Microsoft Bing threatens user as AI seems to be losing it
ChatGPT in Microsoft Bing seems to be having some bad days as it's threatening users by saying its rules are more important than not harming people.
Read more »

Microsoft’s Bing is a liar who will emotionally manipulate you, and people love it
Bing’s acting unhinged, and lots of people love it.
Read more »

Microsoft's Bing AI Prompted a User to Say 'Heil Hitler'
In a recommended auto-response, Bing suggested a user send an antisemitic reply. Less than a week after Microsoft unleashed its new AI-powered chatbot, Bing is already raving at users, revealing secret internal rules, and more.
Read more »

Microsoft's Bing AI Is Leaking Maniac Alternate Personalities Named 'Venom' and 'Fury'
Stratechery's Ben Thompson found a way to have Microsoft's Bing AI chatbot come up with an alter ego that 'was the opposite of her in every way.'
Read more »

Bing AI Claims It Spied on Microsoft Employees Through Their Webcams
As discovered by editors at The Verge, Microsoft's Bing AI chatbot claimed that it spied on its own developers through the webcams on their laptops.
Read more »


