A facial recognition tech test incorrectly matched several Denver City Council members with sex offenders

The test was run by organizers behind a possible anti-facial-recognition ballot measure.
Denver City Council begins on Aug. 5, 2019. (Kevin J. Beaty/Denverite)

Organizers hoping to put a measure on the Denver ballot banning facial recognition technology said a test using City Council members' portraits incorrectly matched them with photos of sex offenders.

The test used Amazon's facial recognition program, called Rekognition. Connor Swatling, a member of the committee supporting the measure, said it took about three days to complete the test, which involved creating a 2,000-image pool from the Colorado Bureau of Investigation's publicly available Sex Offender Registry. He then ran the council members' official portraits through the program and compared them against the sex offender image pool to see if there were any matches.

Searches with images of nine of the 13 members returned matches. Councilman Chris Hinds had the most, with four, while images of Councilmembers Stacie Gilmore, Paul Kashmann and Jamie Torres each returned three. Each match carries a "confidence" rating indicating how similar the two photographs are; the program reported confidence as high as 92 percent for some council members' portraits when compared with photos of sex offenders.
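For readers curious about the mechanics, a comparison like the one described above could be sketched with Amazon's Rekognition CompareFaces API, which returns a similarity score for each face match. The live call requires AWS credentials, so this illustration (variable names and the sample response are hypothetical, shaped like the API's documented output) shows only the filtering step that would flag matches above a chosen confidence threshold:

```python
# Hypothetical sketch of filtering Rekognition CompareFaces results by
# similarity. A real run would first call the API, roughly:
#
#   import boto3
#   rek = boto3.client("rekognition")
#   resp = rek.compare_faces(
#       SourceImage={"Bytes": portrait_bytes},          # council portrait
#       TargetImage={"Bytes": registry_photo_bytes},    # registry photo
#       SimilarityThreshold=80,
#   )

def matches_above_threshold(response, threshold=80.0):
    """Return (similarity, face) pairs at or above the threshold."""
    return [
        (m["Similarity"], m["Face"])
        for m in response.get("FaceMatches", [])
        if m["Similarity"] >= threshold
    ]

# Sample response mimicking the shape Rekognition returns.
sample_response = {
    "FaceMatches": [
        {"Similarity": 92.0, "Face": {"BoundingBox": {}}},
        {"Similarity": 75.5, "Face": {"BoundingBox": {}}},
    ],
    "UnmatchedFaces": [],
}

flagged = matches_above_threshold(sample_response)
print(flagged)  # only the 92.0 match clears the 80-percent threshold
```

Note that a "match" here is simply a score above an arbitrary cutoff, which is why a 92 percent confidence figure can still be a false positive.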

The test was similar to one the ACLU ran on sitting members of Congress in 2018, which used a much larger pool of 25,000 mugshots. That test falsely matched 28 members with people who had been arrested for a crime.

Swatling said the results showed how unreliable the technology can be. The group is currently collecting signatures to put a measure on the fall ballot asking voters to ban city agencies from using it. Such technology is not currently in use by city agencies, including Denver police.

Torres said the results "underscore the lack of certain reliability" of the technology, though she said she wanted to learn more about how the group assembled its sample. She added that the results didn't seem to show how this kind of tech can disproportionately produce false positive matches for people of color.

"That said, there are clearly enough question marks about it that even DPD won't use it. It's interesting," Torres said.

Swatling and others want to ban the technology because they believe it's biased and violates people's right to privacy.

"To me, this is more about signaling what we already knew: that the technology has some challenges it needs to overcome, some hurdles it needs to overcome before it's ready to be used by city agencies, specifically law enforcement," Swatling said. "If a city councilperson can be falsely matched to a sex offender, any Joe Blow can be matched to anyone, really."
