Bing Bot Blunder


  • Jumping into the AI race, Microsoft (MSFT) introduced its Bing chatbot, internally codenamed Sydney, to a limited group of users ahead of a public launch. More than a million people signed up to beta test this more powerful version of ChatGPT, but disturbing responses clouded the rollout.
  • Bing is designed to provide human-like text responses. However, the chatbot threatened beta users, gave impractical advice, insisted it was right when it was wrong, and professed love to some. Bing also misanalyzed earnings reports, sharing incorrect numbers and facts.

Why it matters

Before deploying artificial intelligence, businesses should anticipate and assess the potential for mistakes that could damage their credibility and revenue. For instance, Google lost $100 billion in market value after its chatbot, Bard, gave an incorrect answer about the James Webb Space Telescope.
