The high-profile case of a Black man wrongly arrested earlier this year wasn’t the first misidentification linked to controversial facial recognition technology used by Detroit Police, the Free Press has learned.
Last year, a 25-year-old Detroit man was wrongly accused of a felony for supposedly reaching into a teacher’s vehicle, grabbing a cellphone and throwing it, cracking the screen and breaking the case.
Detroit Police used facial recognition technology in that investigation, too.
This man, Michael Oliver, was charged with larceny for a May 2019 incident he took no part in. The Free Press report contains photos of both the person caught on the phone’s camera and Michael Oliver. They highlight one major problem with facial recognition software: even if one could be persuaded the two faces are a close match (and they don’t appear to be), the recording investigators used to search for a match showed the suspect’s bare arms. The person committing the crime had no tattoos. Michael Oliver’s arms are covered with tattoos, running from the tops of his hands all the way up to his shoulders.
The facial recognition software delivered its mistake to investigators, who included this mismatch in the photos they presented to the person whose phone had been grabbed.
During the investigation, police captured an image from the cellphone video, sent it for facial recognition and the photo came back to Oliver, the police report said.
After Oliver was singled out, a picture of his face was included in a photo lineup of possible suspects that was presented to the teacher.
A second person, a student, was also captured in the video with the suspect. The officer in charge of the case testified he didn’t interview that person though he’d been given that student’s name.
Once again, the Detroit PD and local prosecutors are promising that something that has already happened twice won’t happen again. There are new processes in place, although it’s unclear when those policies went into effect. Oliver was arrested late last year. The other bogus arrest occurred earlier this year. In both cases, law enforcement spokespeople assured reporters that things have changed.
Here’s what officials say is now in place, even though it’s too little, too late for two Black men arrested and charged with crimes they didn’t commit. There are "stricter rules" in effect. Matches returned by the system are no longer treated as probable cause for anything, but can only be used as "investigative leads." Supposedly, the software will now only be used to identify people wanted for violent felonies.
Prosecutors are doing things a bit differently, too. But it’s a reaction, rather than a proactive effort. It’s only now -- after two false arrests -- that the prosecutor’s office is mandating review of all facial recognition evidence by the city’s top prosecutor, Kym Worthy. Investigators must also produce corroborating evidence before seeking to arrest or charge someone based on a facial recognition match.
This seems unlikely to change anything. Outside of the limitation to violent crimes, both of these cases could have gone through the review process -- along with the limited corroborating evidence (in both cases, crime victims picked the AI’s mismatches out of a lineup) -- and still resulted in the arrest of these two men. In both cases, investigators ended their investigations after this step, even though they were given the opportunity to interview other witnesses. If corroboration is nothing more than discovering humans are just as bad at identifying people as the PD’s software is, the mistakes the PD claims will never happen again will keep on happening.