What to know about an AI transcription tool that ‘hallucinates’ medical interactions
- Published Jan 26, 2025
- Many medical centers use an AI-powered tool called Whisper to transcribe patients’ interactions with their doctors. But researchers have found that it sometimes invents text, a phenomenon known in the industry as “hallucinations,” raising the possibility of errors such as misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more.
Watch PBS News for daily, breaking and live news, plus special coverage. We are home to PBS News Hour, ranked the most credible and objective TV news show.
Stream your PBS favorites with the PBS app: to.pbs.org/2Jb...
Find more from PBS NewsHour at www.pbs.org/ne...
Subscribe to our YouTube channel: bit.ly/2HfsCD6
Follow us:
TikTok: / pbsnews
X (formerly Twitter): / newshour
Instagram: / newshour
Facebook: www.pbs.org/new...
Subscribe:
PBS NewsHour podcasts: www.pbs.org/ne...
Newsletters: www.pbs.org/ne...