Technology has been used for many things, with maintaining security and upholding the law among them. However, the innovations we have now are still far from perfect, especially when it comes to facial recognition, so much so that this particular technology led to an innocent pregnant woman being jailed.
Facial Recognition Gone Wrong
The Detroit police turned to facial recognition technology to track down the perpetrator of a robbery and carjacking. This eventually led to the arrest of Porcha Woodruff, a 32-year-old woman who was eight months pregnant.
As it turned out, the arrested woman was innocent of the crime. Woodruff was held by the police for 11 hours, questioned, and had her phone confiscated to be scanned for evidence. She was charged in court and ultimately released on a $100,000 personal bond.
The police used DataWorks Plus to run a facial scan on the surveillance footage. The scan was then matched to Woodruff's mug shot from a 2015 arrest for an unrelated offense. To make matters worse, the victim of the robbery identified Woodruff as the thief in a photo lineup.
A month after the mistaken arrest, Woodruff filed a wrongful arrest lawsuit against the city of Detroit. The city's police chief, James E. White, said the allegations are concerning and that the matter is being taken seriously, as reported by Ars Technica.
Reports show that this incident is not the first of its kind. Facial recognition has led to three other wrongful arrests in Detroit alone, out of six that have been reported overall. Another concern is that all of the victims of false identification were said to be Black.
Woodruff is not the only one suing the city for wrongful arrest; three other lawsuits stemming from facial recognition have already been filed. The American Civil Liberties Union of Michigan is calling for the city to end the practice because of these false arrests.
These mistaken identifications alone should be enough to convince the police to suspend the use of the technology, at least until a more accurate system is available, to avoid further wrongful arrests and the lawsuits that follow them.
Why Facial Recognition Might Do More Harm
The case of Porcha Woodruff is just one factor fueling the argument against facial recognition. Aside from the technology not being reliable enough to avoid wrongful arrests, it also poses a threat to people's privacy.
Some even argue that the technology is a stepping stone toward an Orwellian future. Facial records remain in police databases, which becomes a risk if they fall into the wrong hands: threat actors could use the data to commit fraud or track people.
Many security systems also fall short of the quality that facial recognition requires, as pointed out by Tech Business Guide. Low light and poor image or video quality greatly affect how accurately the system can identify a person.