ACLU, NAACP call on US DOJ to investigate racial bias in law enforcement facial recognition – One World Identity
Facial recognition technology is increasingly being used by law enforcement to automatically locate known criminals, fugitives and suspects. But potential biases in the technology — particularly against nonwhites — have drawn the attention of a number of organizations, including the ACLU and NAACP.
Earlier this month, a coalition of concerned groups submitted a letter to the U.S. Department of Justice’s Civil Rights Division, calling for an investigation into whether face recognition has a disparate effect on communities of color.
The letter cites a 2012 study co-authored by an FBI expert that found several leading facial recognition algorithms were anywhere from 5 to 10 percent less accurate for African Americans than for Caucasians. Accordingly, the coalition has requested that the DOJ cooperate with the Federal Bureau of Investigation to find out more about any potential biases.
“Such inaccuracies raise the risk that, absent appropriate safeguards, innocent African Americans may mistakenly be placed on a suspect list or investigated for a crime solely because a flawed algorithm failed to identify the correct suspect,” the letter reads.
The American Civil Liberties Union and NAACP were joined by the Council on American-Islamic Relations, the Electronic Frontier Foundation, The Innocence Project, National Action Network, and many more in supporting the call for an investigation.
“Face recognition technology has enormous civil liberties implications and its use must be closely examined to ensure that it is not violating Americans’ civil rights,” the letter reads. “We stand ready to work with you to ensure that the voices of our communities are heard in this important, ongoing national conversation.”
As the technology behind facial recognition has become more available and affordable, its use has spread to smaller, local law enforcement agencies across the U.S.
Security advocates contend that these new tools give law enforcement a useful way to identify potentially dangerous suspects without violating their constitutional rights, since the technology relies only on faces visible in public. But critics worry that flaws in the software, particularly its lower accuracy for people with darker skin tones, could exacerbate racial tensions and lead to wrongful accusations or convictions.