As News Corp Australia faces pressure from staff over its use of artificial intelligence (AI), many of the company’s mastheads are already publishing AI-generated articles with errors, formatting mistakes and bizarre language.
Earlier this week News Corp editorial staff wrote to company chair Michael Miller expressing their disappointment that the company had been using AI “for years” to write more than 3000 hyperlocal articles a week without consulting journalists. The letter called for more information from News Corp management about the use of the technology.
“New technologies have a place in supporting good, accessible journalism, but it is crucial that implementation processes are transparent, ethical and done in consultation with journalists and readers,” it said.
Today News Corp Australia’s general manager of employee relations Andrew Biocca responded to the staff’s letter saying that the company has “held a number of presentations” and “worked closely with employees on how to positively utilise AI”.
The company has a number of article formats that are automatically generated using information from public records for News Corp local mastheads, as first reported by Guardian Australia. These include traffic alerts, weather, sports results, court appearances, liquidation records and stock prices. Articles will sometimes specify the source of the information — like the New South Wales government’s Fuel Check website — and are typically published under the byline of one or more News Corp data journalists or as “Staff Writers”.
A Crikey review of content published on News Corp mastheads by its Data Local team found these articles frequently contain errors, despite the company’s claim that journalists are still responsible for the editing process.
Some of the errors are outright factual mistakes. For example, The Daily Telegraph’s August 10 article “Parramatta traffic: Crashes, delays, updates”, published at 3.15pm and archived by Crikey at 3.48pm, lists traffic alerts timestamped in the future, like an incident recorded at 4.14pm on Olympic Drive near Bridge Street.
Other errors involve formatting issues. For example, at least 16 traffic articles published today for various NSW regions in The Daily Telegraph feature the text “Notavailable(NotAvailable)” sprinkled several times between the listed traffic incidents, seemingly an artefact of the underlying code. The Courier-Mail’s generated articles on the state’s premier league soccer results refer to the leagues as “McDonald’s FQPL Men_” and “McDonald’s FQPL Women_”.
Some of the articles include unconventional or unwieldy language. Automated weather articles use abbreviations like “Today’s forecast is mostly sunny; n’ly winds tending fresh nw”. Lists of daily remote VCAT appearances carry the grammatically incorrect title “Victorian Civil and Administrative Tribunal (VCAT) hearings in Videoconference for Thursday, August 10” (a headline format seemingly designed for hearings that take place in various real-world locations like Melbourne).
A News Corp Australia spokesperson defended the Data Local team’s output when presented with examples of the errors, calling Crikey’s inquiries “confused, wrong and not reflecting any reality”.
“Every word published is overseen by working journalists using only trusted and publicly available sources,” they told Crikey. “We are proud of their work delivering this important service journalism to their communities.”
They also told Crikey that the company doesn’t use the popular artificial intelligence product ChatGPT, despite Crikey sharing an example of a News Corp journalist citing their use of it in an article titled “ChatGPT, Midjourney AI creates ‘typical’ men, women of Gympie”.
The article appeared to accidentally include an unnecessary response from ChatGPT: “Please note that more recent data beyond 2021 could provide a more current and accurate description.”
While News Corp does not use ChatGPT as part of its Data Local team, the manual use of AI by a journalist shows how incorporating the technology into an editorial workflow can present a risk.
News Corp Australia’s embrace of AI comes as the company negotiates a claim for compensation from tech companies that used news publishers’ intellectual property to train their AI products. Earlier today News Corp chief executive Robert Thomson said these discussions were “fruitful”.
Apparently none of its readership have noticed, unsurprisingly.
“Every word published is overseen by working journalists using only trusted and publicly available sources … We are proud of their work delivering this important service journalism to their communities.”
Just wondering if their statement was…from an AI?
Maybe the A, but not any I – at least in the old-fashioned common definition of intelligence.
They do have supporters (a right-wing, Zero Hedge-influenced knockoff masquerading as centrist), who also now have it in for Crikey?
‘Crikey panics about AI journalism. It should’ (10 Aug ’23): ‘Crikey’s recently developed fake left point-of-view (culture war symbolism over class war substance) does not produce objective reporting or argument. Neither does it at The Guardian. Nor the ABC.’
Goes on to include their own AI response…
https://www.macrobusiness.com.au/2023/08/crikey-panics-about-ai-journalism-it-should/
There you go….
One would think a professional news provider would be the first organisation to oppose AI writing stories. But Murdoch has not prioritised factual reporting for decades, likewise informed opinion. Murdoch has sacked journalists and closed down reliable community newspapers almost every year since 2012. So it is understandable why News Corpse relies on AI and a lack of facts.
Murdoch isn’t treating his serious journalists with contempt; he is treating his readers with contempt.
I sometimes wonder if the ABC is using AI, which might account for the frequent atrocious grammar and malapropisms. But unfortunately, I suspect that it’s actually the English skills of the latest crop of Media graduates they’ve hired.
Cam, I don’t think AI is the root problem for News Corp, and it shouldn’t be talked down.
News Corp are famous for their errors, BS and lack of “I”.
In the US, MSNBC’s Tapper gave a suitable response to a statement made by a proprietor’s son over an outlet’s legal issue, i.e. laughing at it. In Oz the same would never happen; the statement would be taken as authoritative locally. Why?