Facescapes: A critical guide to facial recognition technologies.

Racial Discrimination and Bias in Facial Recognition Technology (Courtney Miller)

The accuracy of facial recognition technology in identifying people is not universal. The Gender Shades project revealed that the algorithms behind this technology were least accurate at identifying darker-skinned women and most accurate at identifying lighter-skinned men. These disproportionate error rates can lead to the wrongful arrest and penalization of people of color in the United States.
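
To make this finding concrete, the sketch below illustrates the kind of disaggregated audit that Gender Shades popularized: rather than reporting one overall accuracy number, error rates are computed separately for each intersection of skin type and gender, which is what exposes the disparity. The records here are purely illustrative placeholders, not the project's actual benchmark data or results.

# Illustrative sketch of a disaggregated accuracy audit.
# Each record: (predicted_label, true_label, skin_type, gender).
from collections import defaultdict

predictions = [
    ("male", "male", "lighter", "male"),
    ("male", "female", "darker", "female"),
    ("female", "female", "darker", "female"),
    ("male", "male", "darker", "male"),
    # ... a real audit would use thousands of labeled face images
]

correct = defaultdict(int)
total = defaultdict(int)
for predicted, actual, skin_type, gender in predictions:
    group = (skin_type, gender)
    total[group] += 1
    if predicted == actual:
        correct[group] += 1

# Report accuracy per subgroup; large gaps between groups signal bias
# that a single aggregate accuracy figure would hide.
for group in sorted(total):
    accuracy = correct[group] / total[group]
    print(f"{group[0]} {group[1]}: {accuracy:.1%} ({total[group]} samples)")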

"Racial Discrimination in Face Recognition Technology" Alex Najibi Participation in facial recognition occurs without consent or notice and is an effect of failures in legislation to keep this in check. This article, published as an analysis in science policy by Harvard University, contextualizes the ways in which facial recognition algorithms fail to identify all races and genders equally. Najibi details how policing agencies utilize facial recognition technologies to execute discriminatory law enforcement and explains that, beyond a long history of surveillance of Black activists for purposes of suppression, that heightened surveillance of Black communities is a fear-inducement tactic with potential to cause psychological harm and make those being surveilled more vulnerable to abuse. Additionally, these inaccuracies in facial recognition can lead to targeting of other marginalized populations like undocumented immigrants or Muslim citizens. Facial recognition technology has increasingly lead to false reports, harsher sentences and higher bails. It is a powerful tool to surveil and oppress marginalized groups, and there is a need for these algorithms to be more proportionate and effective.

"How is Face Recognition Surveillance Technology Racist?" Kade Crockford In this article published by the American Civil Liberties Union, Kade Crockford details modern uses of facial recognition technology being used to oppress and target Black people, but also details the history of surveillance of Black people and communities in the United States. They explain its roots in the 18th century lantern laws that demanded the carrying of candle lanterns by Black, mixed-race and Indigenous enslaved people if they were to travel in New York City after sunset, unaccompanied by a white person. Failure to comply led to punishment. These lantern laws allowed for the identification, observation and control of marginalized people in public spaces. Many describe this as a precedent set for the later establishment of stop-and-frisk laws today.

Police surveillance using facial recognition technology performs a similar function. Cameras disproportionately installed by law enforcement in Black and Brown neighborhoods enable surveillance and tracking much like that once enabled by the lantern laws.

Crockford warns that if government agencies continue to employ facial recognition technology to monitor and control communities, Black and Brown people will continue to be disproportionately targeted. By failing to learn from the mistakes of the past, the United States fails to achieve racial justice.

"The Perpetual Line-Up: Unregulated Police Face Recognition in the America" Clare Garvie, Alvaro Bedoya, Jonathan Frankle This analysis on unregulated police use of facial recognition technology by the Center on Privacy and Technology at Georgetown Law provides background information, a risk framework and provides potential recommendations on the subject. Additionally, they have developed a tool that allows the user access to information on this usage by city or state. This information includes:

Homepage Image Source: Gender Shades