The most successful fact-checks with Africa Check's visitors: lessons from Kenya, Nigeria, and South Africa

31st January 2020

External fact-checking – the practice where specialised organisations evaluate claims made by public sources such as politicians or the media – is a fast-growing field. When the Duke Reporters’ Lab started counting in 2014, it found 44 fact-checking organisations around the world. Five years later, the number has more than quadrupled to 188 in more than 60 countries.

Africa Check, the continent’s first independent fact-checking organisation, of which I was chief editor until July 2019, has itself expanded rapidly. The organisation – a non-profit mainly funded by large foundations – launched in 2012 in South Africa with a junior researcher and a part-time editor. Africa Check then opened a second office on the continent in Dakar, Senegal, in October 2015, where the team runs a French-language version of the website to serve West African audiences. The English-language team branched out to Lagos, Nigeria, in November 2016, with Nairobi in Kenya following in January 2017.

Besides publishing fact-checking reports on our website, Africa Check syndicates its content to other news organisations to republish free of charge, provided proper attribution is given. The team also regularly discusses its findings on radio and television to reach offline audiences, since internet penetration is still comparatively low in most of Africa.

What’s in a click?

To track the impact of its work, Africa Check has dedicated increasing resources to monitoring and evaluating its output. Initially, the editorial team mainly kept track of unique visits to the website and the number of times other media groups republished our fact-check reports.

As more donors came on board, they requested that a wider range of website metrics be reported to them. For example, the Bill and Melinda Gates Foundation requires the average time on page for each fact-check as well as the share of external referrals to the website.

Nevertheless, most donors still prioritise unique visitors as a “catch-all” metric for reach, according to Africa Check’s impact manager Nicola Theunissen. In May 2019, Africa Check appointed a full-time impact researcher to join forces with Theunissen.

However, we still do not know enough about what influences audience interest in our reports, particularly across the different countries Africa Check serves. My research project at the Reuters Institute attempted to tease out which kinds of articles are most successful with readers, especially in helping Africa Check reach its goals.

What this project focused on

Africa Check’s foremost goal is to promote accurate public debate by ensuring that policymakers and the public retain a more accurate understanding of key matters. But just how do you measure progress towards this goal, especially when relying on web metrics?

To delve into the quality of visits to Africa Check’s website, the following research questions formed my departure point:

  • Which Africa Check reports generated the greatest engagement on social media?
  • Which of these reports were most likely to have been clicked on and read?
  • What are the common attributes of these reports, if any?

I collected data using Buzzsumo and Google Analytics. Buzzsumo captures engagement on several social media sites, while Google Analytics is the tracking tool Africa Check employs to record visits to its website.

With the help of Buzzsumo, I located the ten Africa Check reports with the highest engagement on Facebook, Twitter, Reddit and Pinterest in each of 2015, 2016, 2017 and 2018. From Google Analytics I then pulled the following data for each report: total page views, top source and medium of traffic, average time spent on the page and the country most visitors came from.
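To make this joining step concrete, here is a minimal sketch of how the two exports could be combined with pandas. The URLs, column names and figures are invented for illustration and do not reflect Africa Check’s actual data.

```python
import pandas as pd

# Hypothetical Buzzsumo export: top reports by total social engagement.
buzzsumo = pd.DataFrame({
    "url": ["/report-a", "/report-b"],
    "total_engagement": [15200, 9800],
    "word_count": [950, 1400],
})

# Hypothetical Google Analytics export for the same URLs.
analytics = pd.DataFrame({
    "url": ["/report-a", "/report-b"],
    "pageviews": [12000, 21000],
    "avg_time_on_page_sec": [95, 260],
    "top_source_medium": ["facebook.com / referral", "google / organic"],
    "top_country": ["South Africa", "Nigeria"],
})

# One row per report, combining social engagement with on-site behaviour.
reports = buzzsumo.merge(analytics, on="url", how="left")
print(reports)
```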

I also categorised each report by:

  • Type of headline (such as question, answer, combination or statement)
  • Overarching topic (health or the economy, for example)
  • Subject of the fact-check (these included politicians, public figures, the media and so forth)
  • Verdict (Africa Check has seven possible verdicts, ranging from correct to incorrect)

From the backend of Africa Check’s website, I copied each article’s readability score, as calculated by the Yoast SEO plugin and based on the Flesch reading ease test.
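For readers unfamiliar with the metric, the Flesch reading ease score combines average sentence length and average syllables per word. The sketch below shows the standard formula with a crude vowel-group syllable counter; it is only an approximation and will not match the Yoast plugin’s exact output.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch formula: higher scores indicate easier-to-read text.
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

print(round(flesch_reading_ease("The claim is incorrect. The data shows otherwise."), 1))
```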

Lastly, I estimated how long it should take someone to read each article by taking the word count provided by Buzzsumo and dividing it by 238. (A meta-analysis found this to be the average number of words adults read silently per minute in non-fiction English text.) I then subtracted this expected reading time from the average time-on-page metric as a makeshift indication of whether someone had read the entire article.

By considering the time someone would have needed to read an article, I wanted to approximate whether social media users had actually read an Africa Check article before engaging with it.
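A minimal sketch of that calculation, using illustrative numbers rather than Africa Check’s real figures:

```python
READING_RATE_WPM = 238  # average adult silent reading rate for non-fiction English text

def read_time_gap(word_count: int, avg_time_on_page_sec: float) -> float:
    """Return seconds on page minus the expected reading time.

    A positive value is a rough sign that the average visitor stayed long
    enough to read the whole article; a negative value suggests they did not.
    """
    expected_sec = word_count / READING_RATE_WPM * 60
    return avg_time_on_page_sec - expected_sec

# Illustrative example: a 950-word report with 95 seconds average time on page.
print(round(read_time_gap(950, 95)))  # roughly -144, i.e. well short of a full read
```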

My findings

Here’s what I learned about the contribution of social media to Africa Check’s website traffic:

  • The data shows that in each year there were a few “blockbuster” articles, after which engagement tailed off. Facebook accounted for the overwhelming majority of likes and shares of Africa Check content, with Twitter trailing far behind.
  • Facebook has become less and less important in driving traffic to Africa Check’s website, as has been the case for major brands and publishers all over the world. Facebook was the main source of views for all but one article in the 2015 and 2016 top-ten lists; this dropped to just two articles in 2017 and three in 2018.
  • In most cases where Twitter was the biggest source of traffic, the average time users spent on the page was higher than the time it should theoretically take to read the piece.
  • Content engagement showed a major jump from 2015 to 2016, reflecting the rapid growth of Africa Check in that year. However, it has since decreased year on year, likely on the back of Facebook’s decline in importance as a traffic driver.

Here’s what I discovered about engagement versus views:

  • Between 2015 and 2018, the average number of page views in this content set increased. Overall, however, the ten articles with the most engagement were not the most viewed.
  • In 2016 and especially in 2017, the articles with the highest engagement had lower views than likes and shares, suggesting that people engaged with the social media post without necessarily clicking through to read it.
  • If the verdict of the fact-check is clear from the social media card – which shows an image, the headline and a description of 160 characters in a clickable format – people reached by it could still learn the main takeaway. But the cards concerned here mostly contained only a teaser of the conclusion.

Our verdicts and topics followed these trends:

  • Fact-checks with incorrect verdicts and spot-checks (short pieces focused on claims that we had fact-checked before) attracted the most engagement.
  • As for preference by overarching topic, health articles attracted the most engagement in Nigeria and economic ones in Kenya. (The finding reflects both national preoccupations and newsroom priorities.)

About Google News’s role I uncovered the following:

  • Google News started featuring as a top driver of traffic to Africa Check’s website in 2018, after the company made its “fact-check” label available globally in April 2017.
  • In almost all the cases where Google News was the largest source of traffic to the site, most visitors came from outside the continent.
  • Google News was a significant traffic driver to Kenyan content with the most engagement in 2018, forming half of the top ten. In Nigeria’s case, Google News did not feature on our top ten of 2018 at all. Instead, organic Google searches played a large role in driving traffic to the Nigerian reports that enjoyed the most engagement.
  • In most cases where Google News was the top driver of traffic to an article, average time on page was lower than the time it should theoretically take to read an article.

Political content displayed these aspects:

  • High-profile speeches by politicians made the top ten in 2015, 2017 and 2018, but not in 2016. These pieces received far more views than engagement, suggesting they are a worthy investment of newsroom resources.
  • The gap between time on page and expected read time is large for multi-claim reports focused on politics. This is to be expected, since readers may be interested in or search for information about only one claim.
  • In each year, a politician or political party was the most frequent subject of the top ten fact-checks by engagement.

Here’s what I determined about country differences:

  • On average, Kenyan copy was the easiest to read as judged by the Flesch reading ease test, followed closely by South African content and with Nigerian articles trailing behind.
  • Whereas content about Nigeria and South Africa was represented equally in the top ten between 2015 and 2017, South African-related content pulled ahead in 2018 with seven pieces on the top ten list.
  • Based on the data from 2017 and 2018, people interested in South African content seemed to prefer sharing headlines conveying the verdict of the fact-check. For Nigeria and Kenya, there was no discernible pattern.

What it all boils down to

To recap: Africa Check exists to promote accurate public debate by ensuring that policymakers and the public retain a more accurate understanding of key matters.

With this in mind, by which metrics should we judge whether a visit has helped Africa Check achieve this goal? Some of my findings show that traditional indicators of success may run counter to achieving knowledge retention. Posts that enjoy a lot of engagement on Facebook are not necessarily read. Articles featured on Google News mostly attract foreign audiences and are unlikely to be read in their entirety.

In deciding which web metrics Africa Check should focus on to achieve this goal, first prize seems to be when audience members read an entire report.

Alternatively, given the summary Africa Check provides at the top of a fact-check and the conclusion at the end, one could administer a test to check whether someone has retained the information, even though the visit was brief.

A few ideas for Africa Check

In addition to carefully defining the kind of attention that Africa Check wants to strive for, I recommend that the English-language team act on the following recommendations, based on this research:

  • Continue prioritising major speeches. Given my finding that major political speeches received high engagement and even higher views, plus the fact that politicians or political parties formed the most frequent subject of the top ten fact-checks by engagement, investing newsroom resources to verify these speeches appears worth the effort.
  • Adjust Facebook boosting. In its current format, boosting Facebook posts delivers “empty calories”: the posts receive more likes and shares than click-throughs. To help convey accurate information, the verdict should be made clear in the image and excerpt. Alternatively, Africa Check could use sponsored posts to more carefully target people who don’t typically engage with news content. (This practice was found to have made a significant difference to the reach of Comprova, a collaborative fact-checking project that took place during the 2018 Brazilian presidential election.)
  • Measure attention more accurately. Africa Check should explore better measurement tools to help judge which fact-checks truly capture readers’ attention, such as the native scroll-depth trigger in Google Tag Manager or the Audience Explorer analytics dashboard created for the Center for Cooperative Media at Montclair State University.
  • Serve existing needs better. Rather than “publish and pray” that a fact-check reaches users truly interested in the claim, Africa Check should better capitalise on existing mass interest. My finding that visitors from Nigeria often locate Africa Check’s content through organic searches presents an opportunity the organisation can purposefully tap into. One way could be to follow The Sun’s example: the UK tabloid employs a team to gauge what the public wants to know, as judged by online searches revealed through Google Trends. Its journalists then either update existing content or write a report from scratch.
  • Introduce quizzes. Africa Check has already floated the idea of adding a basic web quiz to fact-checks to test whether a report has helped make visitors’ understanding of the topic at hand more accurate. This should be pursued with urgency.