Report on tax evasion and listen to women of colour: Safiya Noble's tips to cover tech

The author of the book 'Algorithms of Oppression' reflects on how journalists should report on big technology companies
17th October 2020

The speaker

Safiya Noble is one of the most respected voices on the impact of technology companies on society. She is an Associate Professor at the University of California, Los Angeles, where she serves as Co-Director of the UCLA Center for Critical Internet Inquiry. She is the author of Algorithms of Oppression, a book that challenges the idea that search engines offer an equal playing field for all forms of ideas, identities, and activities. You can read an excerpt here.

Why data bias matters

Professor Noble’s first encounter with racism in search happened in 2009, when a friend told her: “You should see what happens when you google ‘black girls.’” She was stunned to discover that most of the results were related to porn or sex, even though those words were not included in the search box.

Google blocked explicit content from AdWords in 2014. And yet, even today it’s possible to find hypersexualised results when searching for “Latino girls” or “Asian girls.” “What we know about Google’s responses to racial stereotyping in its products is that it typically denies responsibility or intent to harm, but then it is able to ‘tweak’ or ‘fix’ these aberrations or ‘glitches’ in its systems,” Professor Noble wrote in her book, published in 2018 and reviewed here by The New York Times.

It’s important for journalists to understand what data bias is and how to report on it. It’s also important for people who work at tech companies to be educated in the histories of marginalised people so they don’t make the same mistakes.

Watch the seminar

Seven takeaways from the seminar

On 14 October 2020, Professor Noble spoke at a seminar chaired by Meera Selva, Director of our Journalist Fellowship Programme. Here are seven takeaways from her talk: 

1. On search as a biased space. “Search engines are huge classification projects and we know from other fields outside the techno-utopian bubble that classifications are deeply political,” Professor Noble said. “Many of the technologies are deepening the politics of classification of people and communities. That was the reason why I decided to take on something that many people use every day [in my book].”

2. On the benefits of diversity. “We have seen the consequences of not having people of colour adequately represented in newsrooms and not being able to frame the concerns they report on from the vantage point of the people they are reported about,” she said. “The same is true in technology. The absence of people of colour results in lack of perspective and lack of point of view. But even if we had the most diverse newsrooms and technology companies, we still live in societies that are profoundly structured by inequality. And those problems won’t be solved at the level of the employee. Every institution is responsible for contending with that.” 

“I often say: ‘You should have listened to women of colour’,” Professor Noble pointed out. “We understand [what’s happening] because we are often closest to the crisis produced by lack of care and lack of opportunity. And we are definitely seeing the elevation of really important voices who are on the frontlines of the understanding of what these crises look like. COVID-19 has created such a disproportionate health impact on poor people and communities of colour. Any of us can switch on the television and take a look at the Congress and say they are not quite a match to the people who need so desperately to be represented by public policy.” 

3. On technology and Black Lives Matter. “People are now able to join the conversation and be politicised in ways that weren’t possible before, in the same way as when some newspapers and radio stations in the civil rights movement helped us understand what was happening. I’d say though that one of the consequences of the rise of tech platforms has been the profound dehumanisation of Black people and how they have allowed white supremacists to organise and influence the White House. As much as we’ve been able to coalesce around Black Lives Matter, we have more problems to coalesce around that the platforms also enable.”

4. On journalists reporting on tech. “There are key journalists, many of them women and LGBTQ journalists, who have been foregrounding the most critical and important issues [in the field of technology],” she said. “At the same time, I’ve also seen other journalists, sometimes more junior, unfortunately male, journalists who go and look for experts to support their stories and only find people who look like them, even though the people who have been doing the groundbreaking, really gruelling work of exposing the harms of the tech industry have mostly been women and people of colour. And again their expertise somehow doesn’t get recognised for making these stories possible.”

5. On the importance of storytelling. “Data is a social construction,” Professor Noble said. “We need to remember that about the way in which we are engaging with large datasets. A survey gives you something really different than an interview. To what degree can you mitigate that and use multiple methods? I think those are ways to uncover the truths we are trying to tell.”

She acknowledged that business models based on advertising make it difficult for many journalists to tell important stories. However, she encouraged reporters to dig into topics such as the tech giants’ tax evasion schemes, even if they sometimes get fewer pageviews: “We have to story-tell better the impact of corporate tax evasion. We need people to understand what was defunded as a result of tax evasion, who was harmed, who didn’t get medical care, who languishes in a job that can’t pay the rent… Journalists have to figure out how to tell those stories.”

6. On the importance of language. “It’s really important to get the texture of a problem,” she said. “We have words that flatten very important concepts that need to be distinguished. The word ‘content’ is a useless word. Unfortunately, it’s a word invented by the tech industry to absolve themselves from responsibility for being publishers. It’s a way to say that they are not responsible, that they are just the damned pipes. However, the word ‘content’ does not disambiguate evidence-based research from biography, advertising, misinformation, disinformation or propaganda. There’s no way to disambiguate when you use a word like ‘content.’ So we have to be very thoughtful and reclaim our own words and put those into the mix.”

7. On how to get platforms to fix problems. “Systems can be gamed and people spend a lot of money on the advertising mechanism that drives search to dominate certain keywords and make their content more visible, and white nationalists and neo-Nazis are particularly excellent at doing that,” she said. “If you are talking about an advertising product that is predicated upon making the most money, it means that those with the most capital are going to have more power and influence in it. If you have a system that is predicated upon hyperlinking and good metadata, those with more technical skills will have more power and influence. These tools and platforms are not democratic because democracy fails as a model when capital and power can take control. These systems are rigged for the most powerful. That’s what they are designed to do. So we have a fundamental conflict between what we imagine these platforms to be and what they really are.”

What can journalists do about this? “In my own experience you raise the hell out of a problem to the point that it gets their attention and they respond to it,” Professor Noble said. “These companies have enormous backlogs and work in triage mode. They figure out what’s on fire and tend to that. So this is where journalists amplifying problems is really important, because journalists have a really wide platform and can make a problem visible.”

The bottom line 

Technology companies are powerful actors and deserve to be covered as such. This means looking at things like data bias, content moderation and tax evasion schemes. In doing this, it’s essential that journalists find ways to tell these complex stories in ways that capture the imagination of their audience. Many of those stories have been broken by women and people of colour. This shows how important it is to create newsrooms that truly reflect the make-up of their audience.

If you want to know more…