New Netflix doc ‘Coded Bias’ is so keen to show AI is racist that it ignores how tech tyranny is dehumanizing EVERYONE
‘Coded Bias’ explores how artificial intelligence algorithms propagate racial and gender bias. But its obsession with identity politics means it fails to address the very real prospect of big tech taking over the world.
Big-tech totalitarianism is one of the most important issues of our time, and I’m on board with any film highlighting the inherent perils of over-reliance on insidious technologies. But Coded Bias, directed by Shalini Kantayya, while being somewhat informative, ultimately falls flat because its focus on race and gender is much too narrow.
The film sets out to show how artificial intelligence dehumanizes people and encodes racial bias into application processes for jobs, colleges, mortgages and loans, as well as the criminal justice system. But this misses the techno-tyranny forest for the trees and is akin to complaining about a lack of art by people of color on the walls of the Titanic.
Massachusetts Institute of Technology computer scientist Joy Buolamwini opens the movie by recounting how she discovered racial bias in facial recognition software and then documents her attempts to combat it with her collection of activists, the Algorithmic Justice League (AJL).
Buolamwini makes for a compelling protagonist on this journey into the Orwellian hellscape of artificial intelligence, due to her superior knowledge of the subject matter and magnetic personality.
Equally compelling is the disturbing information about the Chinese government’s totalitarian use of algorithms to control its populace through a social credit system, and the UK’s baby steps down the same authoritarian path as it implements its own flawed facial recognition program.
Americans are under the same invasive surveillance and are imprisoned by a similar social credit system, the only difference being that they are unaware of it, and it’s being done by big tech companies such as Google, Facebook, Amazon and Apple.
But these issues are painfully complex, and Coded Bias is often at cross-purposes with itself when confronting them. For instance, the film highlights the Chinese and UK governments’ draconian use of technology, but then spotlights activists demanding the American government assert itself more aggressively regarding oversight.
The same is true when Buolamwini takes her racial bias study to IBM to demonstrate that its facial recognition tech fails to work adequately on black faces. In response, the company fixes the problem… which results in more black people being able to be put in facial recognition databases. This Pyrrhic victory makes the AJL seem like controlled opposition.
In this way, the AJL is reminiscent of Black Lives Matter, in that it’s really a grievance delivery system designed to divide people and distract them from the much bigger issue. The race- and gender-obsessed AJL, just like BLM, makes enemies of potential allies by refusing to see all victims as equal.
For example, the conservatives and ‘conspiracy theorists’ who have been de-platformed by algorithms from Twitter, Facebook, Google and YouTube are not considered worthy victims of tech totalitarianism by the AJL (and are never mentioned in the movie). But these ‘deplorables’ could be powerful allies in the fight to rein in the Sauron of Silicon Valley.
In one scene, Republican Congressman Jim Jordan of Ohio is aghast at the power and pervasiveness of the FBI’s extra-judicial facial recognition program. The AJL no doubt loathes Jordan (an easy thing to do), but he could be an effective asset in attempting the Herculean task of restraining the tech behemoth.
In contrast to Jordan, in the same Congressional hearing, Democrat Alexandria Ocasio-Cortez ignores deeper concerns and instead theatrically focuses her ire on the majority “demographic group” that writes the code for artificial intelligence – white males.
The arch-villains of big tech are expanding their surveillance capabilities without giving the slightest thought to ethics or human rights, making a dystopian corporate and draconian governmental future (and present) not merely possible but probable. However, the film and the AJL are simply incapable of moving beyond their slavish devotion to identity politics, and their own biases against white men, to focus on that truly horrifying bigger picture.
The reality is that artificial intelligence doesn’t just dehumanize black people; it dehumanizes all people, and any movement that fails to put that fact front and center is deserving of distrust, if not disdain.
If the AJL were serious about stopping techno-tyranny, it would be fighting vociferously to restore every person’s right to privacy and freedom of speech, especially when that speech is ugly and hateful; for people’s right to own their personal information and data; for a halt to tech companies collecting and selling that data; and for the tech monopolies to be either shattered into a million pieces or transformed into public utilities. But it isn’t serious, and it doesn’t aggressively address any of those issues.
Coded Bias ends by recounting the true story of Stanislav Petrov, a Soviet soldier who in 1983 defied technology during a missile scare and refused to launch a nuclear counter-attack against the US. The film states that if the artificial intelligence of a Strangelovian ‘doomsday machine’ was in charge, and not Petrov’s humanity, then the world would have been obliterated. This nod to individualism is a nice sentiment, but rings hollow after 90 minutes of relentless identity politics. It’s also somewhat amusing since the heroic Petrov is a member of the dreaded white male demographic.
In keeping with the Dr. Strangelove metaphor, Coded Bias and the activists it spotlights unfortunately aren’t truly interested in fighting against big tech’s artificial intelligence ‘doomsday machine’, they just want to make sure the war room is diverse and inclusive enough.
The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.