There is no conceivable world in which enough bomb-making equipment is being sold on Amazon to train an algorithm to make this recommendation.
So Channel 4 has discovered that fireworks enthusiasts and chemistry teachers shop on Amazon.
But by blending these innocent observations into an explosive tale of terrorism, they’ve guaranteed that their coverage will attract the maximum amount of attention.
The ‘Amazon teaches bomb-making’ story has predictably spread all over the Internet.
When I contacted the author of one of these pieces to express my concerns, they explained that the piece had been written on a short deadline that morning, while they were already at work on an unrelated article. The author cited coverage in other mainstream outlets (including the New York Times) as justification for republishing, and not correcting, the assertions made in the original Channel 4 report.
The real story in this mess is not the threat that algorithms pose to Amazon shoppers, but the threat that algorithms pose to journalism. By forcing reporters to optimize every story for clicks, not giving them time to check or contextualize their reporting, and requiring them to race to publish follow-on articles on every topic, the clickbait economics of online media encourage carelessness and drama. This is particularly true for technical topics outside the reporter’s area of expertise.
And reporters have no choice but to chase clicks. Because Google and Facebook have a duopoly on online advertising, the only measure of success in publishing is whether a story goes viral on social media.
The very machine learning systems that Channel 4’s article purports to expose are eroding online journalism’s ability to do its job.
Moral panics like this one are not just harmful to musket owners and model rocket builders. They distract and discredit journalists, making it harder for them to perform their essential function of serving as a check on the powerful.
The real story of machine learning is not how it promotes home bomb-making, but that it’s being deployed at scale with minimal ethical oversight, in the service of a business model that relies entirely on psychological manipulation and mass surveillance. The capacity to manipulate people at scale is being sold to the highest bidder, and has infected every aspect of civic life, including democratic elections and journalism.
Together with climate change, this algorithmic takeover of the public sphere is the biggest news story of the early 21st century. We desperately need journalists to cover it. But as they grow more dependent on online publishing for their professional survival, their capacity to do this kind of reporting will disappear, if it has not disappeared already.