Researchers find a way to steal your identity with 22-cent glasses
In a recent study from Carnegie Mellon University, individuals wearing specially crafted eyeglasses were able not only to evade facial recognition software 100 percent of the time, but also to impersonate other people at a rate near 90 percent.
Current facial recognition software uses deep neural networks to learn patterns in individual facial features. The technology can also determine whether a given set of facial features belongs to the same person.
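At a high level, such systems reduce each face to a numeric feature vector (an "embedding") and declare a match when two vectors are sufficiently similar. The sketch below is purely illustrative, with made-up three-dimensional vectors and an assumed similarity threshold; real systems use deep networks producing much higher-dimensional embeddings, and this is not the study's actual pipeline.

```python
# Illustrative sketch: face verification by comparing feature vectors.
# The embeddings and the 0.8 threshold here are invented for demonstration.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def same_person(emb1, emb2, threshold=0.8):
    """Declare a match when embedding similarity exceeds the threshold."""
    return cosine_similarity(emb1, emb2) >= threshold

# Hypothetical embeddings: two photos of one person, plus an impostor.
photo_a = [0.9, 0.1, 0.4]
photo_b = [0.85, 0.15, 0.42]
impostor = [0.1, 0.9, 0.3]

print(same_person(photo_a, photo_b))   # similar vectors -> True (match)
print(same_person(photo_a, impostor))  # dissimilar vectors -> False
```

An adversarial accessory like the printed glasses works by nudging the extracted features so that the wearer's vector lands close to someone else's, flipping this comparison without changing the underlying algorithm.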
To test the accuracy of modern facial recognition technology, the Carnegie Mellon research team crafted a pair of eyeglasses printed with a pixelated color pattern. In their tests, an individual wearing the eyeglasses successfully evaded the facial recognition software's ability to recognize his or her face.
In a potentially serious security concern for biometric authentication, the eyeglasses also demonstrated success in impersonating the identity of other individuals with 87.9 percent accuracy.
The study simulated multiple impersonations to test whether subjects could pass as famous celebrities. The researchers found that a 41-year-old white male was able to impersonate Milla Jovovich, a 40-year-old white female.
In a more extreme example, a 24-year-old South Asian female successfully impersonated former U.S. Secretary of State Colin Powell.
The research effort also demonstrated the ease with which the average person could replicate the experiment. The eyeglasses cost just 22 cents to produce, using an office inkjet printer and glossy photo paper to print the pixelated color overlay.
Facial recognition software has recently become popularized through services such as Facebook and Google that can automatically detect and tag familiar faces within photos. Advanced applications of facial recognition software are being developed to assist with a variety of use cases, including surveillance by law enforcement agencies, airport security, and identity verification for passports and at ATMs.
The study demonstrates important gaps in facial recognition software and the extent to which the technology remains vulnerable to disguise.
“As our reliance on technology increases, we sometimes forget that it can fail,” the researchers said, advocating for machine learning algorithms that are more “robust” against such simple evasion techniques.