Apple has temporarily suspended a new artificial intelligence (AI) feature designed to summarise news headlines, following widespread criticism over its repeated inaccuracies.
The decision comes after mounting pressure from media organisations and journalism watchdogs, who warned that the feature was not ready for release. The feature, which generated notifications that appeared to come directly from news apps, had been flagged for spreading false information.
In a statement, an Apple spokesperson said: “We are working on improvements and will make them available in a future software update.”
One of the most notable incidents involved the BBC, which reported that Apple’s AI-generated alert had falsely informed readers that Luigi Mangione, accused of killing UnitedHealthcare CEO Brian Thompson, had shot himself. The claim was entirely fabricated by the AI model.
Similar errors were reported by journalists and users of Sky News, the New York Times, and the Washington Post.
Jonathan Bright, head of AI for public services at the Alan Turing Institute, highlighted the risks of such technology: “Hallucinations – where an AI model makes things up – are a real concern. As yet, firms don’t have a systematic way to guarantee that AI models will never hallucinate, apart from human oversight.”
He added that these errors not only misinform the public but also risk further eroding trust in the media.
Journalism body Reporters Without Borders (RSF) welcomed Apple’s decision but warned of the dangers of rushing AI-powered tools. “Innovation must never come at the expense of the right of citizens to receive reliable information,” RSF said in a statement.
Vincent Berthier of RSF added: “This feature should not be rolled out again until there is zero risk it will publish inaccurate headlines.”
The BBC, one of the most vocal critics, said it had raised concerns with Apple as early as December. While the company initially promised to clarify the role of AI in the summaries, critics argued the response was insufficient.