Cameras are everywhere today: traffic and security cameras, news crews and other professional productions, and the phones nearly everyone around us carries. Most of us have become accustomed to this, but it can have profound implications if we are ever accused of a crime.
Police often dig through whatever video footage they can find, looking for evidence that might link a suspect to a crime. They can look through all sorts of material that is publicly available and may be able to get warrants for material that is not.
Of course, it isn’t easy for law enforcement officers to scroll through hundreds or thousands of hours of video in the hope of catching a glimpse of one suspect, but digital facial recognition technology can make the job easier. Some facial recognition services promise to let users scan through billions of images, identifying faces by biometric data (such as the distance between the eyes and the nose) even in cases where human observers could not recognize them.
False arrests and bad evidence
Many law enforcement agencies use these services to run video of unknown suspects against a huge database of known individuals.
This practice has frightening implications for the rest of us. For one thing, sometimes the technology gets faces wrong. That can lead to defendants being convicted with faulty evidence. In some cases, it has led to innocent people being arrested.
Last year, police arrested a pregnant woman in front of her children after facial recognition software misidentified her as the person who committed a carjacking. In fact, she had a solid alibi, and prosecutors eventually dropped the case, but not before her life was turned upside down.
There’s a racial component to this issue. Studies have shown that facial recognition software is far more likely to return a false match for the faces of Black people than for the faces of others. The American Civil Liberties Union says it knows of more than a half-dozen cases in which people were arrested because of bad facial recognition matches. In every case, the person wrongfully arrested was Black.
More training?
The federal government has taken small steps to address the possible abuse of facial recognition technology. Of seven federal law enforcement agencies currently using the technology, only two require their agents to take special training that might help protect the public’s civil liberties. Some federal watchdogs have recommended that all agencies increase their training.
It’s too early to tell whether more training will stop the abuse of the technology, the false arrests and the bad evidence. In the meantime, those who are facing criminal charges should be aware of the possibility that the prosecution will use this type of evidence against them, and learn how to defend against it in court.