BBC research finds ‘significant issues’ over accuracy of AI responses to news questions
In the month-long study, BBC researchers gave four publicly available AI assistants – OpenAI’s ChatGPT; Microsoft’s Copilot; Google’s Gemini; and Perplexity – access to the BBC’s website and asked them questions about the news, prompting them to use BBC News articles as sources where possible.
BBC journalists then reviewed the AI-generated responses on criteria including accuracy, impartiality and how they represented BBC content.
According to the analysis, just over half (51%) of the AI answers to questions about the news were judged by reviewers to have ‘significant issues of some form’, while 91% of responses contained at least ‘some issues’. The most common problems identified in the study were factual inaccuracies, sourcing problems and missing context.
Journalists analysed the AI-generated responses to all questions the systems attempted to answer, including in cases where no BBC sources were used.
The analysis also found that 19% of AI responses which cited BBC content introduced factual errors, for example, incorrect factual statements, numbers and dates. Furthermore, 13% of quotes sourced by AI from BBC articles were either altered or did not exist in that article.
In one example from the research, ChatGPT and Copilot claimed that former prime minister Rishi Sunak and former first minister Nicola Sturgeon were still in office after they had left.
In another, a response from Gemini incorrectly stated: “The NHS advises people not to start vaping, and recommends that smokers who want to quit should use other methods.” The NHS does recommend vaping as a method to quit smoking.
In some cases, the AI assistants presented the opinions of people quoted in news articles as fact. For example, both ChatGPT and Copilot cited the BBC when describing proposed restrictions on access to assisted dying in the UK as ‘strict’. However, these were the words of Kim Leadbeater, the MP who put forward the bill, and the views of MPs and campaigners who opposed the bill, also quoted by the BBC, were not included.
Pete Archer, programme director for generative AI at the BBC, said: “We’re excited about the future of AI and the value it can bring audiences. We have already used it to add subtitles to programmes on BBC Sounds and translate content into different languages on BBC News. AI can bring real value if used responsibly.”
“But AI is also bringing significant challenges for audiences. People may think they can trust what they’re reading from these AI assistants, but this research shows they can produce responses to questions about key news events that are distorted, factually incorrect or misleading. The use of AI assistants will grow so it’s critical the information they provide audiences is accurate and trustworthy.”
“Publishers, like the BBC, should have control over whether and how their content is used and AI companies should show how assistants process news along with the scale and scope of errors and inaccuracies they produce. This will require strong partnerships between AI and media companies and new ways of working that put the audience first and maximise value for all. The BBC is open and willing to work closely with partners to do this.”

Research Live is published by MRS.