Generative artificial intelligence is upending the rules of the game in newsrooms. How can the profession adapt without betraying its fundamentals? That’s the question being asked this Wednesday, on the second day of the Nexus Luxembourg 2025 tech conference, by two figures from the sector: Lukas Görög, AI and data strategy lead at the Swiss media company NZZ, and Thomas Mattsson, senior advisor at Bonnier News, Scandinavia’s leading press group.
Since ChatGPT appeared in November 2022, “everything has changed,” says Görög. AI has made its presence felt in newsrooms at breakneck speed. And not just via ChatGPT: other tools such as Perplexity have become daily companions.
“A good journalist who knows how to use AI is a journalist on steroids,” he sums up. Thanks to advanced prompting techniques, AI makes it possible to increase efficiency, explore in greater depth, synthesise masses of information, transcribe interviews and, ultimately, produce richer content for the reader. “It’s as if the journalist had superpowers,” explains the AI consultant.
Education as a foundation
But you still have to know how to use it. “If the teams are not trained, AI will not deliver any return on investment,” warns Mattsson. He therefore recommends appointing an AI ambassador in each newsroom, tasked with coaching his or her colleagues for one or two hours a week on concrete use cases.
For both speakers, the key to success lies in education. Whatever the technical level of journalists, collective learning is essential. “Whilst younger people are quick to pick up these tools, this is not always the case for previous generations,” observes Mattsson, who stresses the importance of securing and supervising this transition.
“AI is emerging as a turning point as radical as the internet was in its day. And as with the web, refusing to use it is not a viable option in the long term,” he warns.
Transparency and trust
This transformation cannot be achieved without transparency towards the reader. “You have to indicate when content has been generated by an AI,” says Görög. It’s an imperative that comes up against a reality: revealing the use of AI in the writing of an article can damage the perception of its credibility. “It’s a dilemma,” admits Mattsson.
However, the two speakers are clear: humans must remain in the loop. Proofreading, validation, critical analysis: the quality of the information ultimately rests on human work. AI is a tool, not an end in itself.
And whilst it can translate or summarise on the fly, it remains incapable of doing what a journalist can do in the field. “Tomorrow, if something happens outside your home, AI won’t be able to get there. It can summarise, but not investigate,” Mattsson points out.
Preserving the value of content
Another strategic issue: the protection of journalistic content. “We must prevent our work from being sucked up by AIs without authorisation,” warns Mattsson. He advocates clear regulation based on copyright. “If someone wants to use our content, they have to pay. Full stop.”
But AI should not just be seen as a threat. It also paves the way for more targeted journalism, better adapted to public expectations. “We already have the data, the KPIs. AI allows us to go further in personalising information,” he adds.
And above all, concludes Görög, “stay in the loop.” Understand trends, test tools, keep a critical eye open. Journalists will not be replaced by AI, but they could be replaced by other journalists... who know how to use it.