Orlando police will continue to test Amazon’s facial recognition software, despite outcry from privacy advocates – One World Identity
Although a number of organizations, including the American Civil Liberties Union, have spoken out against the Orlando Police Department using Amazon’s Rekognition technology, officials recently decided to extend the program, which automatically identifies individuals through facial recognition.
City officials sided with police, saying that more time was needed for a “thoughtful, precise, and comprehensive recommendation,” according to the Orlando Sentinel. During the trial run, the program has used only eight city cameras and has been configured to identify just seven officers who volunteered to participate.
The trial has garnered skepticism from critics, with the ACLU being joined by a number of local organizations, including the Orange County Classroom Teachers Association, the Farmworker Association of Florida, and the United Faculty of Florida at UCF. They contend that the technology could be used for “discriminatory immigration enforcement.”
“The context of increased ICE raids, FBI targeting of Black Lives Matter activists, the securitizing of communities through Countering Violent Extremism (CVE) initiatives, racial disparities in the use of police force, and the President’s Muslim Ban has led to increased levels of distrust both within our community and across the nation,” the ACLU wrote in a letter to Police Chief John Mina. “The mere use of Rekognition, or similar public surveillance and facial recognition systems, will exacerbate that distrust, and will promote suspicion and public self-censorship in the Orlando metro area.”
For its part, the city has expressed pride that Amazon selected Orlando for the free trial period with Rekognition. Mayor Buddy Dyer told the Sentinel that facial recognition technology is already the norm, noting that the iPhone X uses Face ID technology for secure biometrics, while U.S. Customs and Border Protection uses facial recognition for people entering and leaving the country.
“This is just using it in a little bit broader sense for crime prevention or crime apprehension,” Dyer said. “I think we’ll be able to balance that need. It’s not something where we’re going big brother and following everybody.”
In fact, at Orlando International Airport, international fliers can already leave their passports and boarding passes in their bags, as an automatic facial recognition system identifies them in place of documentation checks.
Such technology is also catching on around the world: in China, officials have used it to identify individuals at concerts, leading to three arrests, and organizers of the upcoming Tokyo 2020 Olympics have announced plans to use facial biometrics to identify athletes, officials, and journalists.
However, critics contend that because of how the technology has been developed — mostly by white males in America — facial recognition has inherent biases. Tests have shown that current systems perform best at identifying white males, while darker-skinned females face much higher failure and false-positive rates.
To offset this, IBM last month released over 1 million images to fight bias in facial recognition. The previous largest facial-attribute dataset contained just 200,000 images, making IBM’s release by far the largest of its kind.
OWI Insight: The widespread public use of facial recognition technology, including for identifying potential threats in public spaces, seems inevitable. However, the points raised by the ACLU and others are valid, as scientific testing has shown. As adoption of facial biometrics continues to grow across numerous use cases, including law enforcement, officials must be transparent about how the technology is used — including its false-positive and failure rates — in order to earn the trust of the general public.