
"42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.

A new study out of Europe suggests that Microsoft Copilot is spectacularly bad at offering medical advice and, in some cases, has even given seriously harmful, potentially life-threatening recommendations.


Nov 06 2024
"42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.
"42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.
What you need to know
  • Researchers based in Germany and Belgium recently asked Microsoft Copilot a range of commonly asked medical questions.
  • Analysing the results, the research suggests… [+5341 chars]
