
Can an algorithm be racist? Spotting systemic oppression in the age of Google

Can a bridge be racist?

It sounds ridiculous, but that’s exactly the argument political theorist Langdon Winner makes in “Do Artifacts Have Politics?”, a classic essay in which he examines several low bridges built over the parkways of Long Island, New York.

Many of these bridges were extremely low, with as little as nine feet of clearance at the curb. Most people would be unlikely to attach any special meaning to their design, yet Winner suggested that they actually embodied the social-class bias and racial prejudice of their designer, Robert Moses, the man responsible for building many of the roads, parks, bridges, and other public works in New York between the 1920s and 1970s.


Winner wrote that Moses intended the low bridges to grant only whites of the “upper” and “comfortable middle” classes access to the public park, since these were largely the only demographics able to afford cars at the time. Poorer people, including many people of color, relied on taller public buses, which could not pass beneath the low overpasses and had to take alternative routes, so they were effectively shut out of the park.

As Long Island planner Lee Koppelman later recalled, “The old son of a gun … made sure that buses would never be able to use his goddamned parkways.”

Jump forward the best part of 40 years, and Dr. Safiya Umoja Noble, a faculty member at the University of Southern California (USC) Annenberg School for Communication and Journalism, has written a book that updates Langdon Winner’s critique for the digital age.

Noble’s Algorithms of Oppression makes the argument that many of the algorithms driving today’s digital revolution (she focuses particularly on those created by Google) are helping to marginalize minorities through the way that they structure and encode the world around us. They are, quite literally, a part of systemic racism.

Discriminatory patterns

Before she earned her PhD, Noble was in the advertising industry where, she told Digital Trends, one of the big discussions was about “how to game Google for our clients, because we knew that if we could get content about our clients on the first page, that’s what mattered.”

A few years later, she glimpsed this world of search engine optimization and prioritization from another angle, when a friend mentioned the results that come up when a person searches for the term “black girls.”

Dr. Safiya U. Noble, author of Algorithms of Oppression

“The first page was almost exclusively pornography or highly sexualized content,” she said. “I thought maybe it was a fluke, but over the next year I did the same for other identities, such as Asian girls and Latinas.”

The same thing held true: frequent pornographic results, even when the search terms didn’t include words like “sex” or “porn.” “That’s when I started taking seriously that this wasn’t just happening in a random way, and thought that it was time for a more systemized study.”

“…I started taking seriously that this wasn’t just happening in a random way.”

Noble isn’t the first person to spot worrying discrimination embedded in tools that many of us still believe are objective. Several years ago, Latanya Sweeney, an African-American professor at Harvard University, noticed that her search results were accompanied by ads asking, “Have you ever been arrested?” These ads did not appear for her white colleagues.

Sweeney began a study ultimately demonstrating that the machine-learning tools behind Google’s search were inadvertently racist, linking names more commonly given to black people to ads relating to arrest records.

It’s not just racial discrimination, either. Google Play’s automated recommender system has been found to suggest that those who download Grindr, a location-based social-networking tool for gay men, also download a sex offender location-tracking app.

In both cases, the issue wasn’t necessarily that a racist programmer was responsible for the algorithm, but rather that the algorithms had picked up on prevalent discriminatory cultural associations: between black people and criminal behavior, and between gay men and predatory behavior.
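
That failure mode is easy to reproduce in miniature. The sketch below, written in Python with entirely hypothetical install logs and app names, is not Google Play’s actual recommender; it only shows how a simple item-to-item co-occurrence recommender, trained on whatever patterns already exist in user behavior, will hand those patterns straight back, prejudiced ones included, without any engineer writing a discriminatory rule.

    # A minimal sketch of an item-to-item co-occurrence recommender, assuming
    # hypothetical install logs and app names. It is not Google Play's actual
    # system; it only illustrates how such a system inherits patterns from data.
    from collections import Counter
    from itertools import combinations

    # Hypothetical install histories: each list is one user's downloaded apps.
    install_logs = [
        ["app_a", "app_b"],
        ["app_a", "app_b", "app_c"],
        ["app_a", "app_b"],
        ["app_c", "app_d"],
    ]

    # Count how often each pair of apps appears in the same user's history.
    co_counts = Counter()
    for apps in install_logs:
        for pair in combinations(sorted(set(apps)), 2):
            co_counts[pair] += 1

    def recommend(app, top_n=3):
        """Suggest the apps most often co-installed with `app`."""
        scores = Counter()
        for (a, b), count in co_counts.items():
            if a == app:
                scores[b] += count
            elif b == app:
                scores[a] += count
        return [other for other, _ in scores.most_common(top_n)]

    # If users' past behavior pairs app_a with app_b, for whatever reason
    # (including a prejudiced one), the recommender repeats that pairing.
    print(recommend("app_a"))  # ['app_b', 'app_c']

Swap in real install data and the same few lines will “recommend” whatever the crowd has already paired together, which is how discriminatory associations can surface even when nobody intends them.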

Who has the responsibility?

Noble makes the point in her book that companies like Google are now so influential that they can help shape public attitudes, as well as reflect them.

“We are increasingly being acculturated to the notion that digital technologies, particularly search engines, can give us better information than other human beings can,” she said.

“The idea is that they are vetting the most important information, and provide us with [objective answers] better than other knowledge spheres. People will often take complex questions to the web and do a Google search rather than going to the library or taking a class on the subject. The idea is that an answer can be found in 0.3 seconds to questions that have been debated for thousands of years.”

To Noble, the answer is that tech giants need to be held accountable for the results that they provide — and the harm they might cause.

“If your platform allows this kind of content to flourish, then you’re also responsible.” 

“Tech companies have really invested in lobbying in the U.S. that they are simply intermediaries,” she continued. “They [claim that they] are tech companies and not media companies; they’ve designed a platform but they are not responsible for the content that flows through it. In the U.S. they do that because it means they are held to be harmless for trafficking in [things like] anti-Semitism, Nazi propaganda, white supremacist literature, child pornography, and all the most hideous dimensions of the things which are out there on the web.”

She has little patience for the suggestion that tech companies like Google are simply reflecting what users search for. Google itself made that argument several years ago, when it was taken to court in Germany over allegedly defamatory autocomplete results that linked the name of Bettina Wulff, wife of former German president Christian Wulff, with rumors about prostitution and escorts.

“We believe that Google should not be held liable for terms that appear in autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself,” Google said at the time.

“If your platform has been designed in such a way as to allow this kind of content to flourish, then you’re also responsible for the way that you’ve designed your platform,” Noble said. “You cannot absolve your company from that.”

Asking the right questions

She also argues that tech companies are not as far removed from other, more traditional industries as they might like to believe.

“When we’re talking about corporations and their impact on society,” she said, “I don’t think there are many industries where we can trace specific actions to one individual. What typically happens is that a CEO or a board of directors are held accountable if there is harm that hits communities. The tech industry doesn’t have to be different; it’s not really that different from fossil fuels or other industries that might cause harm through a particular product that’s been developed.”


Finally, she calls for better training of those developing the algorithms that dictate which information is shown to us.

“If you are going to design technology for society, you should have a deep education on societal issues.”

“One of the things that is particularly frightening to me is that many of the people who are designing these technologies, and embedding their own values and world views and best judgment into them, have very limited education around the liberal arts, humanities, or the social sciences,” Noble said. “They typically come into engineering curriculums where they are hyper-focused on theoretical or applied math … A lot of them are operating on twelfth-grade level humanities instruction … If you are going to design technology for society, you should have a deep education on societal issues.”

The arguments she makes in Algorithms of Oppression are incisive and provocative. Google and other tech companies have taken steps to address some of the most egregious, high-profile problems (my top result for “black girls” now leads to Black Girls Code, a San Francisco tech-training initiative for underrepresented youth), but issues like these will only become more common as platforms such as Google and Facebook play an ever greater role in our lives, with ever more information to parse.


There are no easy solutions. How much do search results influence public opinion? Are tech companies qualified to make value judgements about what is and isn’t acceptable? Should search engines display answers that are purely based on what gets clicks, even when these results are offensive or even harmful? Will making companies responsible for their content mean that they err on the side of censorship, even when the material doesn’t warrant it? These are conundrums that companies like Google will face as they grow ever larger.

Dr. Safiya Umoja Noble doesn’t have all the answers. But she’s asking the right questions.

Luke Dormehl
Former Digital Trends Contributor