Detroit woman suing police after 'shoddy' AI facial recognition leads to false arrest

Porcha Woodruff in Oak Park, Mich. Woodruff, who was falsely arrested when she was eight months pregnant and accused of a carjacking, is suing the city of Detroit due to what she says is an overreliance on facial recognition technology. (Carlos Osorio, Associated Press)

DETROIT — Porcha Woodruff was eight months pregnant when six cops showed up at her home in February, told her she had been identified as a suspect in a carjacking incident and arrested her in front of her children.

Woodruff, a 32-year-old Black woman, thought it was some kind of joke and, according to a report from the New York Times, pointed to her stomach and asked police, "Are you kidding me?"

It was no joke. Woodruff was charged with robbery and carjacking and held for 11 hours before being released on a $100,000 personal bond.

Police zeroed in on Woodruff after an artificial intelligence-driven software program matched the image of a woman captured in surveillance video footage, believed to be a suspect in the incident, to a picture in a law enforcement database: an 8-year-old mugshot of Woodruff taken after an arrest for driving without a license. The carjacking victim then picked Woodruff out of a lineup of six photographs, and that identification led to her arrest. Even though police had access to a more recent photograph of Woodruff, the one taken for her Michigan driver's license, they failed to double-check the match, according to the Times.

Police eventually recognized their mistake and the case against Woodruff was dismissed, but now she is suing the city of Detroit and says she and her family continue to suffer from the impacts of the mistaken identity and arrest.

"My two children had to witness their mother being arrested," Woodruff told the Associated Press. "They stood there crying as I was brought away."

The lawsuit says that Woodruff has suffered, among other things, "past and future emotional distress" because of the arrest, per the AP. Woodruff said her pregnancy already involved multiple complications, and she worried the stress surrounding the arrest would exacerbate them.

"I could have lost my child," Woodruff told the AP in a phone interview.

The ACLU of Michigan is also pursuing litigation related to false identifications made by police using AI software tools. Woodruff's arrest represents the third known allegation of a wrongful arrest by Detroit police based on the technology, according to the ACLU. In each case, the person misidentified was Black.

"It's deeply concerning that the Detroit Police Department knows the devastating consequences of using flawed facial recognition technology as the basis for someone's arrest and continues to rely on it anyway," said Phil Mayor, senior staff attorney at ACLU of Michigan, in a statement.

Mayor represents Robert Williams, a Detroit man who was arrested in January 2020 for shoplifting based on a faulty facial recognition match, for which the prosecutor's office later apologized, according to the New York Times.

"Shoddy technology makes shoddy investigations, and police assurances that they will conduct serious investigations do not ring true," Mayor told the Times.

A slew of independent analyses has uncovered racial bias in the models used to train artificial intelligence facial recognition software, including a study published last fall by Georgia State University researchers that found significant racial disparities in arrests that relied on facial recognition matches to establish probable cause. The researchers' conclusions included a call for heightened human involvement and oversight of law enforcement's use of AI-driven software in criminal investigations.

"Results suggest a need for civic leaders to scrutinize the relative contributions of structural factors, agency policies and government directives to officer decision-making before widely deploying (facial recognition technology) in jurisdictions," researchers wrote. "For agencies currently using this technology, it would imply the need for policies and supervision that guide, and in some cases restrict, officer discretion in (facial recognition technology)-assisted contexts."

Art Raymond, Deseret News
Art Raymond works with the Deseret News' InDepth news team, focusing on business, technology and the economy.
