Congressman or convict? ACLU testing of Amazon’s Rekognition facial recognition software raises concerns

Amid controversial use of its Rekognition face scanning software in Orlando, Amazon is now under fire from the ACLU, which contends that the system is flawed enough to misidentify politicians as criminal suspects. Amazon disputes the findings, saying the civil rights organization did not use its recommended confidence threshold.

The test, whose results were published on Thursday by the American Civil Liberties Union, used 25,000 publicly available arrest photos, which were then compared against photos of every current member of the U.S. House and Senate.

In the ACLU’s test, Amazon’s Rekognition software incorrectly matched 28 members of Congress with arrest photos. The ACLU also noted that the false matches disproportionately affected people of color, reaffirming racial and gender biases previously documented in other facial recognition algorithms.
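For readers unfamiliar with how such a comparison is wired up, the sketch below shows one way the general workflow could be reproduced against Rekognition’s public API: index arrest photos into a face collection, then search a portrait against that collection. This is an illustrative assumption, not the ACLU’s actual code; the use of Python and boto3, the collection name, file paths, and IDs are all placeholders.

```python
# Illustrative sketch only -- the ACLU has not published its code.
# Assumes boto3 is installed and AWS credentials/region are configured.
import boto3

rekognition = boto3.client("rekognition")

COLLECTION_ID = "arrest-photos-demo"  # hypothetical collection name

# One-time setup: create a face collection to hold the arrest photos.
rekognition.create_collection(CollectionId=COLLECTION_ID)

def index_arrest_photo(path: str, photo_id: str) -> None:
    """Add one arrest photo to the collection, tagged with an external ID."""
    with open(path, "rb") as f:
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            ExternalImageId=photo_id,
            MaxFaces=1,
        )

def search_probe_photo(path: str):
    """Search a probe photo (e.g., a legislator's portrait) against the collection."""
    with open(path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            MaxFaces=5,
        )
    # Each returned match carries a similarity score; any match at all
    # counts as a "hit" at whatever threshold the caller configured.
    return [
        (match["Face"]["ExternalImageId"], match["Similarity"])
        for match in response["FaceMatches"]
    ]
```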

Notably, the ACLU ran its test with the confidence threshold set at 80 percent, which Amazon said is well below the 95 percent setting it recommends for law enforcement use. Still, law enforcement agencies that use Rekognition (such as the city of Orlando, where it is currently being tested) could theoretically set the threshold to any level they consider appropriate, and lowering it increases the number of false positives.
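To make the threshold point concrete, here is a minimal sketch of a single one-to-one comparison, again assuming Python and boto3 (neither of which the article specifies): the similarity threshold is just a caller-supplied parameter, and raising it from 80 to 95 discards lower-confidence matches. The file names are placeholders.

```python
# Minimal sketch: the confidence threshold is a caller-supplied parameter.
# Assumes boto3 and AWS credentials; image file names are placeholders.
import boto3

rekognition = boto3.client("rekognition")

def matches_at_threshold(source_path: str, target_path: str, threshold: float):
    """Compare two photos, returning matches at or above the given similarity threshold."""
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        response = rekognition.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            SimilarityThreshold=threshold,
        )
    return response["FaceMatches"]

# The same pair of images can "match" at 80 percent but not at 95 percent.
loose = matches_at_threshold("legislator.jpg", "arrest_photo.jpg", 80)
strict = matches_at_threshold("legislator.jpg", "arrest_photo.jpg", 95)
print(f"matches at 80%: {len(loose)}, at 95%: {len(strict)}")
```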

Rekognition’s trial in Orlando has drawn skepticism, with the ACLU leading the charge alongside local organizations including the Orange County Classroom Teachers Association, the Farmworker Association of Florida, and the United Faculty of Florida at UCF. Those groups all contend that Amazon’s technology could be used for discriminatory immigration enforcement.

Although facial recognition systems have repeatedly been found to carry racial and gender biases, and have at times delivered poor accuracy in law enforcement settings, police around the world have turned to the technology as a more automated and thorough way of identifying criminals in public. Some projections call for the facial recognition market to balloon to nearly $7 billion over the next three years.

OWI Insight: The use of facial recognition for law enforcement is here to stay, as police around the world continue to adopt the technology. However, tests, including the latest one from the ACLU, clearly demonstrate that the technology needs significant improvement to prevent false identifications. The space will continue to grow, but adoption could be held back by questions about effectiveness, as well as public pushback driven by fears of a surveillance state. As adoption of facial biometrics expands, officials must be transparent about its use, including potential false positives and failure rates, in order to earn the trust of the general public.