Coded Bias

Coded Bias is one of those shit-yourself-with-fear documentaries. It’s about the proliferation of artificial intelligence algorithms and facial recognition technology in a million unseen ways throughout our society right now, and how the racist and sexist biases that already exist in our societies are being replicated and compounded by those technologies.

Almost nothing in the documentary is about future technology. It is about what is happening right now, and in some cases has been happening for years under our noses. Things we have only a vague sense of, without feeling any explicit intrusion, have been designed with exactly that effect in mind: to become ubiquitous, convenient, and unseen, while holding a massive amount of power to be sold to the highest bidder or the state.

First things first: facial recognition technology. The artificial intelligence that recognises what a face is, and whether it matches another face, is only as good as the data set it learns from. And unsurprisingly, the white men who created the code to sell to the white men who’d buy the software mostly entered white men into the data set. Women and people of colour were wildly underrepresented, and so the software failed to recognise them, or to match their faces correctly, a disproportionate amount of the time. Oh, and gender minorities? Those don’t exist. You are either a woman or a man, light-skinned or dark-skinned. The cissexist, mono-genderist model erases trans and non-binary folks entirely.
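For the technically curious, here is a rough sketch (in Python, with made-up numbers, and not the film’s own methodology) of the kind of disaggregated audit that exposes this failure: rather than one headline accuracy figure, you measure the error rate separately for each demographic group.

```python
# Hypothetical audit sketch: `preds`, `truth`, and `groups` are invented inputs.
from collections import defaultdict

def error_rate_by_group(predictions, labels, groups):
    """Return the misclassification rate for each demographic group."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for pred, true, group in zip(predictions, labels, groups):
        totals[group] += 1
        if pred != true:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# A matcher can look fine on average while failing badly on the groups
# underrepresented in its training data.
preds  = [1, 1, 0, 1, 0, 0, 1, 0]
truth  = [1, 1, 0, 1, 1, 1, 0, 0]
groups = ["lighter-skinned men"] * 4 + ["darker-skinned women"] * 4
print(error_rate_by_group(preds, truth, groups))
# {'lighter-skinned men': 0.0, 'darker-skinned women': 0.75}
```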

So what does it mean if facial recognition technology doesn’t work on you? Well, for one, you are going to be massively more likely to be mismatched, possibly by police looking for wanted criminals, possibly by airport security looking for no-fly-list terror suspects. In short, harassment happening wholesale against populations of people of colour will now be automated, built into the code that controls our lives, and depicted as the neutral, infallible judgement of an emotionally detached system.

And it’s not simply the lack of diversity in data sets. If you have a program that is designed to replicate what is already there, it will replicate all the injustices that are already there. A company that hires mainly white men finds that the AI sorting through CVs for HR is excluding almost all women and people of colour. Why? Because it is designed to find matches that replicate the existing outcome. So a computer program meant to take human partiality out of the equation ends up entrenching prejudice instead.
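Again, purely as an illustration (invented data, hypothetical features, not anyone’s actual hiring system), here is a rough sketch of why “learn from our past hires” bakes in past bias: if the historical label already encodes discrimination, a model fitted to reproduce that label reproduces the discrimination too, even when the protected attribute is never an explicit input.

```python
# Toy demonstration only: every feature and number here is invented.
import random
from sklearn.linear_model import LogisticRegression

random.seed(0)

rows, hired = [], []
for _ in range(2000):
    skill = random.random()          # genuine qualification, same scale for everyone
    group = random.randint(0, 1)     # 1 = the historically excluded group
    # Historical decisions: skill mattered, but group 1 was mostly passed over.
    past_decision = int(skill > 0.5 and not (group == 1 and random.random() < 0.7))
    # A proxy feature that correlates with group (think: a keyword on the CV).
    proxy = group if random.random() < 0.9 else 1 - group
    rows.append([skill, proxy])      # note: `group` itself is never a feature
    hired.append(past_decision)

model = LogisticRegression().fit(rows, hired)

# Two equally skilled candidates; only the group-correlated proxy differs,
# yet the model gives them very different chances of being "hired" again.
print(model.predict_proba([[0.8, 0], [0.8, 1]]))
```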

And there is no accountability for this technology. Because the artificial intelligence is designed to learn on its own, beyond its original programming, sometimes even its own developers don’t know exactly how it is making its calculations and judgements.

So gone are the days you could boycott a bus company for not hiring ‘coloured’ workers. Gone are the days you could protest a sheriff’s department for its discriminatory policing. In the current era of civil rights, neither the bus company nor the sheriff’s office will have any control over who is selected for hire or frisking; it will be determined by an algorithm designed by an entirely different company, maybe one that isn’t even in the country, and even they themselves won’t fully understand why it’s happening.

Scary, no?

So how do you resist? Luckily this documentary gives us a number of activists and human rights groups to root for. Predominantly led by women of colour, the charge is for more regulation of this technology, for raising awareness of its prevalence, and for ways to undermine its use. The film follows Big Brother Watch and the Algorithmic Justice League as they mount legal challenges against the unregulated use of untested software on powerless, poor, and predominantly black communities. The fight for equality, privacy, and human rights goes on, now on new technological frontiers.

If you like this …