- Posted on 15 Aug 2025
- 5 min read
In 2023, when we produced a landmark report surveying Australian editors and journalists for the first time about generative AI, the pressure genAI could place on the integrity of news output was the dominant theme. In 2024–2025 we returned to those editors and extended our interviews to other editors, journalists, and product developers. The overriding concern remains maintaining the integrity of news output, with caution in experimentation and implementation seen as a lever to protect it. Our participants also expressed increased scepticism that the tech sector would help them navigate the technology’s known challenges – including bias and verification – so that greater AI integration could proceed within the boundaries of editorial practice.
Bargaining with the tech industry has been a challenge for the news industry for some time. It railed for many years against digital platforms using its product – journalism – to attract consumers and make money. And it was – and still is – vocal about the flight of advertising revenue from media to platforms. This week another fight looms: the Productivity Commission’s interim report, Harnessing Data and Digital Technology, seeks feedback on whether big tech should be handed a text and data mining exception to the Australian Copyright Act for AI training. News Corp, whilst not alone, is particularly concerned. In 2021, the government lent news media a helping hand with the News Media Bargaining Code (NMBC). The sector basked in the financial benefit that flowed from two digital platforms, Meta and Google, until Meta decided last year it would make no more deals to pay the news industry for its journalism. By 2024, when we conducted our interviews for this research, it was clear the NMBC was faltering and technological disruption was widening and deepening thanks to genAI, which is slowing click-through rates to news websites and causing some news organisations to fear they are losing control over the integrity of news output once it enters AI domains.
The news media’s bargaining position against big tech should be stronger now than it was in the lead-up to the NMBC, because AI companies rely on gatekept news and information to power their LLMs – although the proposed text and data mining exception might weaken this bargaining power. Yet we found that news media had little confidence in their ability to collaborate as an industry to counter the threats posed by generative AI. As Craig McCosker, the ABC’s Group Product Manager, told us: “Now is the potential time for publishers to be less passive and stand up for a fair deal with AI companies. But doing that as an industry seems hard. It requires publishers to come together and bargain as a united bloc, but the industry is diverse, with lots of different interests.”
Editors were also sceptical that AI companies would assume greater accountability for the flaws in their products that can distort news output, as ABC Standards editor Matt Brown noted: “I think their licence to operate should be rooted in those sorts of principles that inform what we’re all on about: transparency, accountability. [But] we’re not worth enough to them. … I really think if we want these sorts of values and principles, even codes of practice, built in, it may require regulation. I do think you should be able to say to an LLM vendor, how’s it made? What’s it from? What have you done with all the data soup that you poured into it?”
As one representative of a large technology company told us, his company wants to be known for its willingness to collaborate with information providers, and it wants to discuss the challenges with media companies. He said AI had created an explosion in data points, which would slow down or complicate the development of principles that can be applied across the board – particularly in journalism, where AI principles would need to intersect with editorial principles.
The interim solution, he suggested, would be for media organisations to move from consumer-level tools to enterprise-level ones, which are more reliable and would give them complete control over who accesses the system and its outputs. As this participant noted, not all AI is the same.
At least two major news organisations have begun walking this road, even developing their own custom LLMs: News Corp has created NewsGPT, and the ABC has created an internal search tool. Both efforts may yet incentivise tech companies to speed up discussions with news media to help resolve some of the known challenges.
Still, the general feeling in the news sector was that the horse had bolted, despite the minimal safeguard of blocking AI scrapers, which many outlets have adopted – and all the more so given the debate the Productivity Commission has now opened on copyright exceptions for AI.
References
Gen AI and Journalism report 2023 | Centre for Media Transition https://www.uts.edu.au/globalassets/sites/default/files/2024-04/gen-ai-and-journalism_web-version-9-april-2024.pdf
News Media Bargaining Code | Federal Register of Legislation https://www.legislation.gov.au/C2021A00021/latest/text
Meta pulls out of news deals | Australian Financial Review https://www.afr.com/companies/media-and-marketing/meta-refuses-to-pay-for-news-setting-up-war-with-publishers-20240301-p5f93l
Slowed click-through rates | The Guardian https://www.theguardian.com/technology/2025/jul/24/ai-summaries-causing-devastating-drop-in-online-news-audiences-study-finds
News Corp NewsGPT | The Guardian https://www.theguardian.com/media/2025/jun/20/news-corp-bets-big-on-ai-tools-but-journalists-voice-concerns
Productivity Commission interim report | Productivity Commission https://www.pc.gov.au/inquiries/current/data-digital/interim/data-digital-interim.pdf
News Corp concern over copyright | The Australian https://www.theaustralian.com.au/commentary/dangers-lurk-beneath-productivity-commissions-prose/news-story/a8655b9ed3710f9ab5c2566b6a398c38