What you need to know

- Researchers based in Germany and Belgium recently asked Microsoft Copilot a range of commonly asked medical questions.
- Analysing the results, the research suggests…
"42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.
A new study out of Europe suggests that Microsoft Copilot is spectacularly bad at offering medical advice, and in some cases has even given seriously harmful, life-threatening advice.