Published on August 8th, 2016 | by Joshua New
Tattoo-Matching Algorithms Do Not Mean We’re in Nazi Germany
Tattoos, no surprise, can be helpful in criminal investigations. Much like a person’s size, hairstyle, or eye color, they add unique detail to the physical descriptions that law enforcement agencies use to identify suspects and victims. Tattoos can also provide useful information about people’s affinities and affiliations, whether with a fraternity or sorority, a military unit, a gang, or a hate group.
Since manually sorting through databases of tattoo photos is labor-intensive, the U.S. National Institute of Standards and Technology (NIST) sponsored a project to investigate how algorithms could be used to automatically recognize and match tattoos. In response, the Electronic Frontier Foundation (EFF) has denounced this effort as a dangerous step toward a Big Brother-style police state in which police will freely use technology to violate people’s civil rights in a variety of invasive and terrifying ways. EFF even goes so far as to compare law enforcement’s use of tattoo-matching algorithms to Nazi Germany’s use of tattoos to identify Jews during the Holocaust. These are preposterous claims that not only condemn useful research, but also stand in the way of developing legitimate tools to make policing more efficient and effective.
Right now, law enforcement agencies use a standardized system of keywords and labels to describe tattoos in their databases, and they manually search through these keywords to find tattoos that could help identify suspects or victims and generate new leads. But these methods are imprecise, since they depend on subjective interpretations of the images. In addition, investigators do not always capture all of the information that can be gleaned from a tattoo, such as possible hidden meanings or relationships to similar tattoos or graffiti, which limits the usefulness of tattoo data as an investigatory aid. NIST’s Tattoo Recognition Technology program is part of an effort to improve the accuracy and effectiveness of tattoo-matching technology so that these processes can be automated.
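To see why keyword-based matching is fragile, consider a minimal sketch of how such a lookup might work. Nothing here reflects NIST’s or any agency’s actual system; the record IDs, keywords, and scoring function are all hypothetical, chosen only to illustrate how two officers labeling the same tattoo with different words can push a genuine match down the rankings.

```python
def jaccard(a, b):
    """Overlap between two keyword sets: 0.0 (disjoint) to 1.0 (identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_matches(query, records):
    """Return (record_id, score) pairs sorted by keyword overlap, best first."""
    scored = [(rid, jaccard(query, kws)) for rid, kws in records.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical database entries. Records A-102 and B-230 could describe
# the same design labeled by different officers -- the subjective
# interpretation problem the paragraph above describes.
records = {
    "A-102": {"eagle", "flag", "left-forearm"},
    "B-230": {"bird", "banner", "forearm"},
    "C-415": {"skull", "rose", "shoulder"},
}

query = {"eagle", "flag", "forearm"}
ranking = rank_matches(query, records)
```

Because B-230 uses “bird” and “banner” instead of “eagle” and “flag,” it scores barely above the unrelated skull tattoo, even if the underlying images are near-identical. Image-based matching algorithms of the kind NIST is studying aim to compare the pictures themselves rather than these hand-assigned labels.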
Despite this laudable goal, EFF says that NIST’s research program should terrify anyone who cares about freedom of expression and religious liberty, and it has called for NIST to “halt this program immediately.” Tattoo recognition technology, EFF argues, “sets law enforcement on a dangerous path, since it would allow a police officer to learn more than just your identity, but your interests, political beliefs, or religion.” But since police regularly use tattoos to infer information about individuals they come into contact with, it takes a creative feat of mental gymnastics to conclude that using an algorithm to make this practice more reliable, accurate, and efficient is in any way a cause for concern.
EFF’s criticisms veer into the absurd when they attempt to draw a comparison between Nazi Germany and NIST’s research. In discussing NIST’s efforts to determine whether algorithms could identify similar, but not identical, tattoos on different people, EFF writes, “This should raise bright red flags for those concerned about religious freedom, especially in light of how authoritarian governments have used tattoos to oppress religious minorities. Nazi Germany’s use of tattoos to track Jews during the Holocaust comes to mind.” Raising the specter of Nazism whenever one does not like what the government is doing should be off-limits in technology policy debates, because it does nothing to enlighten, only to inflame. In this case, EFF’s analogy is a morally obtuse false equivalence. There is absolutely no legitimate comparison between the forced tattooing of a religious minority for the purpose of ethnic cleansing and the use of algorithms to help law enforcement better identify victims and perpetrators of crimes based on their tattoos.
Regardless, government programs are not nefarious simply because they involve technology. Moreover, those who oppose providing police with technologies to operate more efficiently and effectively are standing in the way of advancements that could actually offer greater civil liberties protections and increase public safety. For example, if law enforcement could apply a tattoo-matching algorithm to security camera footage of a robbery to quickly and accurately identify robbers by their tattoos when their faces are obscured, then the police could avoid unnecessarily detaining innocent people. While policymakers should be keenly aware of the complicated policy questions that can arise from law enforcement applications of new technologies, they should also be just as aware of how some are willing to dramatically mischaracterize beneficial and benign technology to stoke fears about its use.
Image: Javier Ramirez.