Exploring RSF’s new AI charter for journalism

Arthur Grimonpont, Project manager at Reporters Sans Frontières (RSF)
6th December 2023
13:00 - 14:00
Online

The context and the speaker 

In November 2023, Reporters Sans Frontières (RSF) announced a new charter aimed at regulating the use of AI in the media. The charter is a collaborative effort by representatives from 31 leading media organisations including AFP, Rappler, the Tow Center for Digital Journalism, the Media Institute of Southern Africa, the Thomson Reuters Foundation, the Canadian Journalism Foundation and the Committee to Protect Journalists.

In this seminar, project manager Arthur Grimonpont speaks about the initiative and about its members' collective commitment to uphold the ethics of journalism and the right to information in the age of AI. 

Grimonpont is an engineer and consultant who now manages the project at RSF. He is the author of Algocratie, a book on the impact of algorithms on democracies around the world. 

The video

Seven takeaways from the talk and the discussion: 

1. New journalistic approaches for new technologies. 2023 was a breakout year for generative AI, which disrupted a number of industries, including journalism. The new RSF charter aims to provide a common set of ethical guidelines for the industry's use of AI. 

While Grimonpont did acknowledge that many outlets have produced their own guidelines, he highlighted the need for industry-wide rules and regulations. “What we need is not each actor individually choosing where to set the bar on the integrity scale,” he said. “We need shared ethical standards, like journalism has already done in the past.”

2. Journalistic ethics should remain relevant in the age of AI. AI has allowed many industries to streamline operations and cut costs. Journalism is a highly competitive market, and media outlets are likewise incentivised to use AI to their advantage. 

However, Grimonpont pointed out that while this might indeed cut costs in newsrooms, there might be negative consequences in the long run. “This might very well be profitable in the short term, at least from an economic perspective, but it may also harm the journalism industry and its social function in the long run,” he said. Journalists, he argued, must be careful not to compromise journalistic ethics for economic incentives. This means maintaining quality journalism, accuracy, impartiality, and nuance.

3. Media outlets must prioritise human agency. “Every time a decision is automated, human judgement is set aside,” said Grimonpont. In journalism, human judgement and agency are central to finding, developing, and expressing ideas. 

Grimonpont said that while human agency is not necessary for every task journalists perform every day, such as transcription or spelling and grammar correction, human judgement must take priority over artificial intelligence if journalism is to remain effective and meaningful. 

4. Draw a clear line between synthetic and authentic content. “Media have a responsibility to enable a distinction between real world and AI generated content and they must always avoid misleading the public in their use of AI,” Grimonpont highlighted. 

This applies to everything from AI-generated articles to AI-generated images. Grimonpont explained that media organisations should also refrain from creating visual content for illustration purposes that mimics reality. “These may seem restrictive at first glance, but we must keep in mind that we will soon live in a digital world dominated by synthetic content,” he elaborated. 

5. The use of journalistic content by Big Tech should involve fair compensation. "AI system owners must credit sources, respect intellectual property rights and provide just compensation to right holders. This compensation must be passed on to journalists through fair remuneration,” Grimonpont said. 

He added that this involves more than financial compensation. When a media organisation does authorise a tech company to use its data, for example, it should ensure the data is used responsibly and in line with the intended purpose. 

6. There is power in collective action. Grimonpont also highlighted that negotiations with Big Tech should be conducted collectively as an industry, rather than individually. Those actions, he explained, are about protecting the future of journalism as a whole, rather than individual media companies. 

“Media outlets as a whole need to understand that it is essential for them to unite their voices in the face of big companies, because even a large media corporation is very small compared to Big Tech and protecting the news industry, in the long term, requires more than just bilateral agreements,” he said. 

7. The media industry should have a seat at the table when it comes to AI regulation. “AI regulation should not be left solely to AI experts because we need to understand how AI systems work in order to express our views on how to regulate them,” Grimonpont said. 

There is not yet institutional oversight in place to regulate AI. Because AI systems have a massive impact on the information landscape, Grimonpont said, it is crucial that a cross-section of information professionals and experts, including journalists, is at the heart of the regulation debate as it is being shaped. 

The bottom line 

With generative AI becoming more ubiquitous in the journalism landscape, the industry needs rules and regulations that set clear expectations about its use. While RSF's charter addresses many of these, it is important that they are applied consistently across the news media. 
