New AI & tech collabs for electoral accountability - the Meedan story
The speaker
Dr Dima Saber is the director of programs and impact lead at Meedan, a nonprofit focused on enabling access to reliable information during major events such as elections, climate and public health emergencies, armed conflicts and social turmoil. Meedan works with over 70 organisations in more than 45 countries across North Africa and Western Asia, Latin America and the Caribbean, Sub-Saharan Africa and the Asia-Pacific region.
One of Meedan’s offerings is Check, a tool that lets organisations deploy AI-powered chatbots on closed messaging apps. Community members send in questions, which the AI groups and matches against resources compiled by news outlets, and the chatbot then recommends the relevant material back to the user. Saber stressed in the seminar that the tool is designed to always be used with human oversight, and that it works best when the chatbot line is staffed by journalists who can take over less straightforward user questions.
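To make that workflow concrete, here is a minimal, hypothetical sketch in Python of the kind of question-matching step such a chatbot pipeline could perform. It is not Meedan’s implementation: the embedding model, the resource list, the `answer_or_escalate` function and the similarity threshold are all illustrative assumptions.

```python
# Illustrative sketch only: match an incoming question to a curated resource
# by embedding similarity, and escalate to a human when the match is weak.
# Model name, threshold and resource list are assumptions, not Check's internals.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Resources a newsroom might have compiled ahead of an election (made up).
resources = [
    "How to register to vote and check your registration status",
    "Where to find your polling station and its opening hours",
    "What identification documents are accepted at the polling station",
]
resource_embeddings = model.encode(resources, convert_to_tensor=True)

SIMILARITY_THRESHOLD = 0.5  # below this, hand the question to a journalist


def answer_or_escalate(question: str) -> str:
    """Return the best-matching resource, or flag the question for a human."""
    question_embedding = model.encode(question, convert_to_tensor=True)
    scores = util.cos_sim(question_embedding, resource_embeddings)[0]
    best_idx = int(scores.argmax())
    if float(scores[best_idx]) < SIMILARITY_THRESHOLD:
        return "No confident match -- escalating to a journalist on the line."
    return f"Suggested resource: {resources[best_idx]}"


print(answer_or_escalate("What ID do I need to bring to vote?"))
```

The escalation branch mirrors the human-oversight point Saber stressed: when the automated match is weak, the question goes to a journalist rather than the bot guessing.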
Saber is also a researcher and writer on media depictions of conflict and the role of archival records in identity-building processes, focusing primarily on the work of political activists in post-revolution and conflict countries such as Lebanon, Egypt, Syria, Yemen, and Palestine. Until late 2022, she was a media and cultural studies reader and associate director at the Birmingham Centre for Media and Cultural Research at Birmingham City University.
Five takeaways from the talk and the discussion
1. Information inequality is a critical issue worldwide. Dr Saber and Meedan are working on projects related to elections all over the globe this year, from Pakistan to Brazil to the US. “Regardless of where we’re looking, there’s definitely one trend circulating across which is information inequality. So people lack access to information that they need that is suited for their communities and languages,” Saber said.
2. It’s important to address the needs of communities. Saber came back to this idea throughout the seminar. “There’s often a mismatch between what journalists talk about and what voters want to know. More often than not media outlets do not tailor [information to] or exchange information with their communities,” Saber said. Meedan is trying to address this issue, as well as that of online misinformation, through its work with Check, which involves directly asking audience members what their questions are. “What makes the work that we're trying to do a little bit different from other AI-powered solutions is that we are not trying to solve the technological wickedness by a tech solution. It’s always a community-centred human solution,” she said.
3. The lack of regulation on closed messaging apps can lead to disinformation spreading unchecked. Closed messaging apps, such as WhatsApp, Facebook Messenger and Telegram, are not monitored for disinformation in the same way post-based platforms like Instagram, X and TikTok are. “These spaces are end-to-end encrypted, and so they are unregulated. And so misinformation, disinformation, political candidates can reach out to voters. And all this can spread virally with no oversight, really from tech platforms,” Saber said.
4. Cuts to trust and safety teams on platforms have made them more vulnerable to disinformation. “The biggest issue we’ve been facing as a civil society organisation is that platforms have cut funding to third-party organisations, and they’ve cut their trust and safety teams,” Saber said. This, in combination with the growing popularity of generative AI tools able to create cheap and potentially viral content, creates a higher risk of mis- and disinformation spreading widely, she explained.
5. It’s not just about polling day. When it comes to understanding trends in disinformation and different narratives around elections, it’s valuable to observe what’s going on over a longer period, Saber explained. “It's not enough to look at what's happening on election day. It actually is more important to look at the period leading up to the election and to stick around a little bit after the election, as the narratives continue to unfold,” she said.
The bottom line
AI can pose a threat to the information ecosystem by facilitating the creation and spread of disinformation. However, it can also be part of the solution, if used under human supervision and tailored carefully to the community it serves.
Part of our Global Journalism Seminars.