Trying to cleanse the internet of fake news is a bit like trying to sweep Bondi Beach clear of sand. No matter how much we do, the next wave simply washes more sand back up.
But surely those big platforms, Google and Facebook, should be able to fix it, right? Spoiler alert: probably not.
If you’re reading this as a Crikey subscriber (and why wouldn’t you be?), here’s some good news: subscriber news (genuine, paid-for journalism, that is) is the one small segment of the internet structurally designed to exclude fake news.
The same is true if you restrict your browsing to the manicured gardens of journalistic media, where the excesses are trimmed and tamed by journalism ethics and professional practice.
But it’s another picture out in the wild jungle of the internet where social media algorithms and Macedonian bots roam free, swinging from site to site, spreading and sharing the fruits and seeds of fake news.
There, the political economy of the internet encourages the fake over the real. Fake news — falsehoods dressed up in news style, often sourced to an equally bogus organisation — is the most disturbing part of the digital misinformation and deliberate disinformation that floods the web.
[Sorry, ‘fake news’ isn’t information you don’t like]
Because Google’s search engine maps that jungle (and makes good money doing so) there’s a constant call for it to just fix it. Yet while naming and mapping confer control, sometimes it’s a bit like those 18th-century sea captains stumbling over an island, claiming ownership for some European monarch and then sailing on.
Google devotes a lot of resources (not a lot by Google standards, but a lot for anyone else) to spreading this message: we don’t own the internet! We’re not the internet police! But the money the search engine makes out of it makes that a hard message to sell.
Google has had some success with its “all care, no responsibility” mantra in its home base in the United States. But the more heavily regulated environment of the European Union is pressing the internet giant to adapt, as with, for example, the right to be forgotten. And that’s before Google has to work out appropriate responses to authoritarian regimes like China.
That’s because Google’s business model is a response to the political economy of the web. It continually struggles to shape its algorithms for quality and accuracy rather than whatever they might randomly pick up out in the wilds.
But Google is trying. In fact, the entire point of Google search is to rank results by usefulness to the searcher, so it is starting from a practical place. Over the past 12 months, its algorithms and bots have factored in more ethical assumptions about reliability, such as the source of the news, so the major traditional news sources get prioritised over more dubious ones. The downside, of course, is that this might exclude new or innovative players.
Google has prioritised fact-checked responses to fake news over the fake itself. However, this runs into human perception: the fake, being more dramatic, remains more memorable than the correction.
It has tried to interrupt the money flow by reshaping AdSense to exclude ads from fake sites as well as racist, sexist or other offensive material. Often this involves the radical concept of actual humans making decisions, rather than bots. This has affected some fake news producers, but many have either morphed or shifted to other programmatic ad software.
For Facebook, the challenge is different. Facebook, after all, operates in its own walled garden. But it’s a garden that can be a bit like New York’s Central Park at the peak of the 1980s crime wave; there are parts of it you want to stay out of, particularly when it’s dark.
[The ‘fake news’ shitstorm gathers momentum]
Facebook has always been more a sharing company than an information company, more of a distribution channel. The entire point of Facebook is to create a bubble that links you, friends and family as, well, “Friends”. This is what makes it such a frustrating vehicle for journalists. It filters by design.
About 12 months ago, after being criticised for promoting liberal views (in the US sense) in trending topics, Facebook adjusted its algorithms to prioritise “family and friends” over, say, posts or shares from media organisations. After the US election, Facebook introduced some flagging of fake news. However, it’s precisely fake news shared by family and friends that has real power. We lend shares from family and friends a trust, whether right or wrong.
Facebook hasn’t come up with a solution to this structural challenge. So far it seems to have implemented a short-term hack: having actual people rein in the bots. That’s not scalable. But Facebook is smart. Expect some significant shift — even a major disruption to the current model — in the next few months.
As part of that short-term hack, both Google and Facebook have been supporting fact-checking and trust building projects. They’ve encouraged and trialled tagging software — sort of algorithmic bullshit detectors. There’s a lot of frantic paddling going on underneath the water line.
The platforms have a major role to play in cleansing fake news. They’ve got a real commercial imperative to fix it. But they’re not there yet.
*Seen any Australian fake news lately? That is, false reports made to look like news reports, often from an equally bogus media organisation, and often seen and shared on Facebook or other social media. The first step in fighting fake news is knowing what’s out there. Let us know what you find.
“The same is true if you restrict your browsing to the manicured gardens of journalistic media, where the excesses are trimmed and tamed by journalism ethics and professional practice.”
Surely Christopher Warren is being ironic here. Or does he believe that the Australian, the Daily Telegraph and other stablemate purveyors of invented outrage campaigns masquerading as news fit into the manicured gardens?
We can trust the old gatekeepers eh, that’s reassuring. So Belgian babies really were caught on the Hun bayonets, the Vietnamese coast guard attacked a US destroyer in the Gulf of Tonkin and Iraq had WMDs.
Well, that’s all hunky-dory, I am relieved.
Isn’t there a song, something about “Peeeple who trust peeeeple are the dumbest …”?
I’m reminded of the reliance on calculators — without some sense of basic arithmetic, most people would not realise when they have touched the wrong key, and would just accept the answer on screen, no matter how absurd.
Try giving a checkout operator $50.45 for an item costing $10.45 and watch them struggle to understand why they should give you $40 change.