AI chatbots are distorting news stories, BBC finds

Image: an illustration showing a brain inside the head of a robot (toonsbymoonlight)

AI chatbots introduce factual inaccuracies and distortions when summarizing news stories, research from the BBC has found. The study, which examined whether OpenAI’s ChatGPT, Google Gemini, Microsoft Copilot, and Perplexity can accurately summarize news, found that more than half of all the AI-generated output had “significant issues of some form.”

As part of the study, the BBC asked ChatGPT, Copilot, Gemini, and Perplexity to provide summaries of 100 BBC news articles, while journalists reviewed their answers. In addition to finding major issues in 51 percent of responses, the BBC found that 19 percent of answers citing the BBC included incorrect statements, numbers, and dates. Meanwhile, 13 percent of quotes from the BBC were “either altered from the original source or not present in the article cited.”

The study highlighted some examples, including Gemini incorrectly stating that the UK’s National Health Service (NHS) “advises people not to start vaping, and recommends that smokers who want to quit should use other methods.” In reality, the NHS does recommend vaping as a method to help smokers quit. In another example, from December 2024, ChatGPT claimed Ismail Haniyeh was still part of Hamas leadership, even though he had been assassinated in July 2024.

Overall, the study found Gemini’s responses “raised the most concerns,” as 46 percent were “flagged as having significant issues with accuracy.” The Verge reached out to OpenAI, Google, Microsoft, and Perplexity with requests for comment but didn’t immediately hear back.

Last year, the BBC called out Apple’s new AI-powered news summaries for inaccurately rewriting one of its headlines. Apple responded by pausing summaries for news and entertainment apps, as well as making AI notification summaries more distinct from standard notifications.

In a response to the study, Deborah Turness, the CEO of BBC News and Current Affairs, called on tech companies to address issues with inaccuracy. “We live in troubled times, and how long will it be before an AI-distorted headline causes significant real world harm?” Turness wrote. “We’d like other tech companies to hear our concerns, just as Apple did. It’s time for us to work together — the news industry, tech companies — and of course government too has a big role to play here.”
