Fake news and ridiculing the dead — what’s wrong with Microsoft’s AI news

Illustration of a woman typing on a keyboard, her face replaced with lines of code. Image: The Verge / toonsbymoonlight
A new CNN report about the MSN AI model’s news aggregation kicks off with examples of questionable editorial calls, like highlighting a story claiming President Joe Biden dozed off during a moment of silence for Maui wildfire victims (he didn’t), or an obituary that inexplicably referred to an NBA player as “useless.” An editorial staff of humans probably would’ve spotted the problems. But Microsoft’s system did not. The company ditched human editors in favor of algorithms a few years ago, and the result continues to feel more like a social experiment than a helpful tool.

These AI-picked stories are no better than the travel guide that suggested Ottawa tourists grab a meal at the local food bank (Microsoft said it was created by its algorithm and reviewed by a human), or the AI-created poll that asked readers to vote on why a young woman died.

It’s not just Microsoft, of course. AI is creeping into journalism just as it is everywhere else. The BBC is undertaking AI experiments, sites like Macworld use chatbots to query their archive, and The Associated Press has used AI for its “Automated Insights” for over eight years.

Over the last year, egregious examples like error-riddled Star Wars stories and bad financial advice doled out by chatbots have shown why AI chatbots shouldn’t be journalists, though at least those stories are generally just SEO plays.

Microsoft Start and MSN are presented as resources for finding actual news. But Microsoft’s automated system keeps featuring or generating content with needlessly upsetting language and outright falsehoods, and there’s little indication anyone involved in the process cares. There are no careless journalists to blame, no editors with names and faces to take (or even shirk) responsibility. It’s all just software doing what it’s made to do, and spokespeople shrugging when it goes wrong, promising they’ll try to make sure it doesn’t happen in the future.