Amid war, vicious attacks and political turmoil, global fact-checkers fear the impact of the end of Meta’s programme
Meta’s founder and CEO Mark Zuckerberg recently announced he was ending the company’s long-running partnership with fact-checking organisations. The programme, launched in 2016, was part of the company’s efforts to improve content moderation on its platforms. Under it, suspicious posts were referred to fact-checkers for assessment; posts they flagged as false or misleading were labelled and moved lower in users’ feeds.
That system will now be replaced with X-style “community notes,” where users themselves add context to misleading posts. This comes just as Zuckerberg adds himself to the list of tech billionaires donating money to President-elect Donald Trump’s inauguration.
“After Trump first got elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy,” Zuckerberg said. “We tried in good faith to address those concerns without becoming the arbiters of truth. But the fact-checkers have just been too politically biased and have destroyed more trust than they’ve created, especially in the US.”
While Zuckerberg said these changes will be rolled out imminently in the United States, the company works with hundreds of fact-checking partners in more than 115 countries. According to the Financial Times, Meta said there is “no immediate plan” to end third-party fact-checking and introduce community notes outside the US. But fact-checkers around the world are already preparing for what they see as the inevitable demise of Meta’s global fact-checking programme.
Against this depressing backdrop, I spoke to seven fact-checking organisations from Brazil, Croatia, Italy, Nigeria, the Philippines and Ukraine to examine the potential impact of this decision on their work, their newsrooms, and their audiences.
1. Will the rest of the world follow?
While Meta’s decision only applies to the US for now, experts expect the company to apply it globally. None of the fact-checkers I spoke to has been contacted by Meta since the announcement, but most are confident that their contracts with the company will remain intact at least until the end of this year. Even so, all of them are preparing for the decision to extend beyond the United States.
“I see no reason why we should take Mark Zuckerberg at his word,” says Ana Brakus, Executive Director of Croatian fact-checking organisation Faktograf. “He hasn’t shown himself to be exactly trustworthy and it would be very careless to base our future business decisions on taking him at his word.”
If Meta can’t counter disinformation without the fact-checking programme, the EU should enforce its rules. Grovelling to foreign pressure would be very dangerous, and morally wrong
Rolling out these new rules might look different from country to country, depending on local legislation and the political will to enforce it. Any decision, for example, might be impacted by the European Union’s Digital Services Act and the UK’s Online Safety Act.
As our former colleague Lucas Graves pointed out in a recent piece for The Guardian, the DSA’s “regulatory framework is unfinished and untested” as the first formal charges brought under the DSA against Elon Musk’s X remain unresolved.
My sources in the European fact-checking community are not optimistic about the EU enforcing the DSA. So far, the EU’s only public comment on the matter is that any platform that wishes to remove such policies “will have to conduct a risk assessment and send it to the EU Commission.”
“We hope the EU will be strong in facing these threats, but I don't have any particular optimism on this point,” says Tommaso Canetta, Managing Editor of Italian fact-checking organisation Facta News.
“As a fact-checker and journalist,” he added, “I don't think I can tell the EU institutions what they should do specifically, but I think that they should defend the rules they have created and implement them. If Meta proves that it can effectively counter disinformation even without the fact-checking programme, no fines should be issued. But if they can't, EU rules should be enforced. Grovelling to foreign pressure and disowning its own rules would be very dangerous, and morally wrong.”
Brakus is not sure Meta has any incentive to follow EU regulation and questions the capacity of the European Commission to enforce its own laws, especially against a tech giant. She points to the precedent of X, whose owner Elon Musk quickly dismantled the platform’s content moderation system with little pushback.
“The Digital Services Act has a chance to work, but it’s too soon to tell,” says Brakus. “Even when the EU approves good regulation, it’s not always great at implementing it and is notoriously slow.”
2. How will this impact fact-checkers’ finances?
Back in 2022, Meta bragged about building “the largest global fact-checking network of any platform” and having “contributed more than $100 million to programs supporting our fact-checking efforts since 2016.”
Many organisations around the world rely on Meta’s funding and may now be left in the lurch by this decision. None of the outlets I spoke to said that losing Meta’s funding would force them to fold. Most even said they will be able to secure other sources of funding to make up for any losses.
Several outlets refrained from giving me figures on how much of their funding comes from Meta, citing the non-disclosure agreement they had to sign with the company. Those that did share some information said it is significantly less than 50% of their budget, hovering between 20% and 30%. Several fact-checkers mentioned they’ve already diversified their sources of revenue so they don’t rely too heavily on a single programme.
Yevhen Fedchenko is the co-founder and chief editor of the Ukrainian fact-checking organisation StopFake.org. When the invasion of Ukraine started in 2022, Meta was the first organisation to reach out to StopFake.org and ask what they could do to help. Fedchenko now hopes Meta sees the value of their partnership in safeguarding Ukrainians against Russian disinformation in the midst of war.
“Financial threats have a different meaning for us because we have gone through any kind of threats during the war,” he says. “[Meta’s money] helped us to adjust to those changing circumstances, but we could still survive [without them]. That's what Ukrainian audiences expect from us. I can’t imagine how difficult it’d be for Ukrainian society to survive without access to our fact-checks.”
For example, Fedchenko points to recent work debunking lies designed to discredit Ukraine and its leadership in the eyes of Western audiences, which could affect the provision of international aid to the country. He also mentions their work debunking lies discrediting Ukrainian refugees, which could create conflicts with the communities receiving them.
This is roughly the outlook of the fact-checking organisations I spoke to: they’ll need to adjust their budgets to avoid lay-offs and programme cuts, but the biggest loss will be borne by their countries’ information ecosystems. Most outlets expect to have at least a year to prepare for any changes, as their contracts have been renewed for 2025.
Kemi Busari is the editor of Nigerian fact-checking organisation Dubawa, which has been partnering with Meta since 2019. If the decision from Meta had come as abruptly as in the US, he says, they would have been forced to reduce their staff, which in turn would have diminished their capacity to fact-check claims.
But if they have a year to prepare, Busari thinks they will be able to find other revenue streams. “We are also viewing this as a challenge to look inwards and think about other options,” says Busari. “We have this understanding that fact-checking is not a business and should never be a business. It’s a social enterprise. With that kind of mindset, we should be able to find some other ways to continue to do our work.”
Tai Nalon is the Executive Director of Aos Fatos, a fact-checking organisation in Brazil that has been partnering with Meta since 2018. In addition to grants, they have diversified their financing through the licensing of journalistic content, a membership programme, and the sale of technology and intelligence services. Nalon, however, said Meta’s support has been essential for their journalistic work.
“Our partnership with Meta was crucial for establishing Aos Fatos as a leading journalistic organisation in Brazil and across the continent,” she says. “For a long time, Meta shared tools for monitoring trends that supported our journalistic investigations, such as the public Crowdtangle API. Monitoring the attacks in Brasília on 8 January 2023 would not have been possible without a robust strategy to combat misinformation through fact-checking and investigations.”
Nátalia Leal, CEO of Brazilian fact-checking organisation Agência Lupa, says their partnership with Meta has allowed them to grow as a company and to expand their audiences by reaching users they were not able to reach before. Their sources of revenue range from selling their content to other news outlets to offering workshops and training.
“We will need more people supporting our work,” Leal says. “It’s not just the money. It's the perception of the importance of journalism and fact-checking.”
3. How will this impact the information ecosystem?
Fact-checkers say the most important impact of Meta’s decision will be felt in the information ecosystem, especially in many countries in the Global South.
Facebook and Instagram are still major sources of news in many of those countries. So the removal of fact-checking from news feeds could cause an increase in the amount of misleading information users see. X is often mentioned as an example of how Meta’s platforms can evolve.
One of those countries is the Philippines, where 61% of people get their news from Facebook, according to our Digital News Report 2024.
Celine Samson is the head of the online verification team at VERA Files, a Filipino fact-checking organisation which has been partnering with Meta since 2018. “Facebook is still king here,” Samson says. “Despite the rise of other platforms, Facebook continues to be the most used social media platform. It’s where local Filipinos and our huge diaspora get their news. If the programme gets removed, we are worried about the quality of the information they would be getting.”
In Ukraine, Fedchenko says, Facebook has become an informational lifeline to many during the war. But Russia has also used this platform to spread war propaganda, which is one of the reasons why fact-checking is so crucial.
“People are using social networks here as a platform for life-saving communication,” he says. “It’s a place where people share important information and our ability to verify that information is also crucial for people.”
[Without the programme] it will become more difficult to distinguish high-quality, professionally-verified information from other types of content on social media. Trust will be weakened
For the 2023 Nigerian general election, Busari’s fact-checking organisation Dubawa researched the types of misinformation circulating across social media platforms, drawing on published fact-check reports from three African fact-checking outlets. They found that Facebook was the platform where falsehoods were most prevalent.
Shutting down the programme, Busari says, would be a blow to democracies on the continent and around the world. “If you are not equipping fact-checkers to combat that kind of disinformation, then it is a very huge threat,” he says.
Nalon from Brazil’s Aos Fatos stresses that this decision has also been followed by a relaxation in the rules regarding hate speech. Fact-checking, she says, has often played a crucial role in showing that certain types of misinformation were conspiracy theories promoted by hateful groups.
“[Without the programme] it will become more difficult to distinguish high-quality, professionally-verified information from other types of content on social media. Trust will be weakened,” Nalon says. “Lax rules will likely turn the network into a sort of hub for scams. It is what we’ve seen on X, which is now regarded by Zuckerberg as an example.”
4. Will Meta’s new ‘community notes’ work?
Meta’s plan is simply to replace its current verification system with X-style ‘community notes’, where users themselves submit context, clarifications or fact-checks on posts. The plans are still nebulous, but Meta said that community notes will be written and rated by contributing users and “will require agreement between people with a range of perspectives to help prevent biased ratings.”
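Meta has not published technical details of how that agreement would be measured. Purely as an illustration, and not Meta’s (or X’s) actual algorithm, the toy sketch below uses entirely hypothetical names and thresholds to show one way a bridging-style rule could work: a note is only displayed if raters from at least two different viewpoint clusters find it helpful.

```python
# Toy illustration only: Meta has not published how "agreement between people
# with a range of perspectives" will be computed. All names and thresholds
# here are hypothetical.
from collections import defaultdict

def note_is_shown(ratings, min_clusters=2, min_helpful_share=0.6):
    """ratings: list of (viewpoint_cluster, found_helpful) tuples for one note.

    The note is shown only if raters from at least `min_clusters` different
    viewpoint clusters each rated it helpful at least `min_helpful_share`
    of the time.
    """
    by_cluster = defaultdict(list)
    for cluster, helpful in ratings:
        by_cluster[cluster].append(helpful)

    clusters_in_agreement = sum(
        1 for votes in by_cluster.values()
        if sum(votes) / len(votes) >= min_helpful_share
    )
    return clusters_in_agreement >= min_clusters

# A note rated helpful by only one side of a divide is not shown:
print(note_is_shown([("left", True), ("left", True), ("right", False)]))  # False
print(note_is_shown([("left", True), ("right", True), ("right", True)]))  # True
```

X’s open-sourced system is considerably more elaborate, inferring raters’ viewpoints from their rating histories rather than taking them as given, but the underlying idea of requiring cross-perspective agreement is the same.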
Samson from VERA Files is concerned about these changes. She wonders who these users will be, how Meta will select them, what the company means by a diversity of perspectives and what kinds of facts will be put in these notes. Another concern is how Meta will apply these rules across different cultural and political contexts, given the kind of disastrous mistakes the company has made in the past.
“I am not totally discounting [community notes] but I am very sceptical of it, just because of the era we're living in now, where people don't even agree on what facts are,” she says. “In our experience, when you put out a fact-check, some people would just not believe it, even if you supply them with so many sources.”
Meta will also get rid of restrictions on how issues like immigration and gender identity can be discussed. The company has already implemented some of these changes with an update to its ‘Hateful Conduct’ policy. For example, its new policies allow users to make “allegations of mental illness or abnormality when based on gender or sexual orientation”. It has also deleted warnings against self-admission of racism, homophobia and Islamophobia.
While this goes beyond the scope of what fact-checkers do, Leal from Brazil’s Lupa thinks that both these changes and the new community notes system will result in the amplification of Meta’s most radical users.
“Social media will become more polarised,” Leal says. “The system of community notes is based on engagement and the most engaged users are often the most polarised. If they are the ones responsible for community notes, these notes will not be impartial and there will probably be more rage and more hate.”
Will requiring agreement between people with a range of perspectives help prevent biased ratings? Leal does not think so.
“This idea of ‘different sides’ evaluating something as a sign of balance is nonsense to me. It is not fair for an evaluation based on scientific evidence to have the same weight as an evaluation based on conspiracy theories, for example. This is what happens on X and it is very different from balance or from giving different perspectives to users,” she said.
There is a trackable impact of how the programme has reduced false information on Meta. We try to provide people with accurate information. I don’t think there is any way community notes can replace that. Think about X
One major study of US users has shown that right-leaning users tend to share misinformation in greater volume than left-leaning users and are therefore more often fact-checked, which could explain the perception that fact-checking is biased, even though that perception is not supported by research. There is also some evidence that fact-checking helps reduce the spread of misinformation.
Many fact-checkers I spoke to vehemently repudiate Zuckerberg’s assertion that they are biased or that they hinder freedom of speech.
“We pick up false information and fact-check it by providing context or ratings in some cases,” says Busari from Nigeria’s Dubawa. “There is a trackable impact of how that has reduced false information on Meta. We try to provide people with accurate information. I don’t think there is any way community notes can replace that. Think about X.”
Canetta from Italy’s Facta News also points to Elon Musk’s X as an example of how a faulty content moderation system can degrade a platform’s information ecosystem and how this degradation can bleed into real life.
“The platform quickly became a haven for people spreading hate and propaganda through demonstrably false content and this created harmful effects,” he says. “Think about the riots in the UK last summer that were propelled by false news about immigration circulating unchallenged on X.”
5. What kind of effect will Zuckerberg’s message have?
Both Zuckerberg’s video message and Meta’s press release claimed fact-checkers were biased and presented their work as censorship. But fact-checkers do not provide content moderation for Meta or decide what type of content is moderated. They simply provide a service: fact-checking misleading posts. It’s not fact-checkers who decide what Meta does with their work.
Meta explains the system clearly in its own policies (a brief illustrative sketch follows the list below):
- Meta’s own technology detects posts that are likely to be misinformation based on various signals.
- Fact-checkers then independently review those pieces of content and rate their accuracy.
- Once content has been rated, Meta attaches a label to it so that people can read additional context. It is Meta that ultimately decides how content found to be false or misleading is labelled or down-ranked.
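To make that division of labour concrete, here is a minimal sketch written for this piece rather than taken from Meta; the function names, ratings and label text are all hypothetical. The only thing it encodes is the point above: the fact-checker supplies the rating, while labelling and down-ranking are actions applied by the platform’s own systems.

```python
# Minimal illustrative sketch, not Meta's code. Names, ratings and label text
# are hypothetical; only the division of responsibility is taken from Meta's
# published description of the programme.

def moderate_post(post, detect_likely_misinfo, request_fact_check):
    """Illustrative pipeline: detection -> independent review -> platform action."""
    # 1. The platform's own technology flags posts likely to be misinformation.
    if not detect_likely_misinfo(post):
        return post

    # 2. Independent fact-checkers review the content and rate its accuracy.
    rating = request_fact_check(post)  # e.g. "false", "partly false", "true"

    # 3. The platform, not the fact-checker, decides what happens next:
    #    labels and down-ranking are applied by the platform's own systems.
    if rating in ("false", "partly false"):
        post["label"] = f"Rated {rating} by independent fact-checkers"
        post["rank_penalty"] = True  # shown lower in users' feeds
    return post


# Example run with stub detector and fact-checker (both hypothetical):
flagged = moderate_post(
    {"id": 1, "text": "Miracle cure announced"},
    detect_likely_misinfo=lambda p: True,
    request_fact_check=lambda p: "false",
)
print(flagged["label"])  # Rated false by independent fact-checkers
```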
Moreover, for fact-checking organisations to be part of Meta’s programme, they have to be certified through non-partisan institutions: either the International Fact-Checking Network (IFCN) or the European Fact-Checking Standards Network (EFCSN). To do so, organisations have to go through independent assessments and follow strict standards of non-partisanship. This is part of the IFCN’s Code of Principles that all members have to adhere to and that Meta has praised in the past.
Many of the fact-checkers I spoke to were concerned that Meta’s scapegoating would result in a rising tide of online attacks.
“Both this decision and the things that Zuckerberg said to justify this decision are really bad,” says Canetta from Italy’s Facta News. “Fact-checkers already face a huge amount of attacks. Calling us censors and politically biased will give more weapons to the people that already want to harass us.”
Brakus from Croatia’s Faktograf told me that, as soon as Meta made its announcement, they started receiving emails with threats and attacks.
“Our research shows that harassing fact-checkers is just part of the populist playbook and is used to create mistrust in society,” says Brakus. “When someone like Mark Zuckerberg accuses you of censorship, what do you think happens after that? Anyone who’s previously attacked us will feel emboldened to do it again.”
Agência Lupa’s Leal says that they have also been targeted with several forms of harassment, with the peak of attacks happening during the pandemic.
Brakus’ Faktograf has been tracking this kind of harassment. Its latest report, published in 2023, tracked attacks on fact-checking media outlets in Europe, based on a survey of 41 fact-checking outlets from 28 European countries. Most respondents said attacks against them have become more frequent since they joined Meta’s programme. The report also outlines how fact-checkers are falsely presented as “censors” who down-rank or even remove user content, which is not the case.
Despite Meta’s decision, several of the organisations I spoke to still hope these policy changes do not extend abroad. Brakus thinks that fact-checking is now more crucial than ever, as the news ecosystem has become fragmented, most people access news online, and levels of mistrust are unprecedentedly high.
“Most people want good information,” says Brakus. “When they are sharing stuff, they actually think they are helping their friends and families and they deserve good information.”
Meta has been one of the funders of the Reuters Institute in the past and supported our Journalism Innovation Project and our Trust in News Project. At the time this piece was published, the company was not one of the Institute's funders.