The Aurora Police Department can start using AI facial recognition software in its investigations after getting the go-ahead from the Aurora City Council.
The resolution passed in a vote of 5-2 at the council’s Monday meeting. Commander Chris Poppe of APD is one of the architects of the policy. He told council members the police department has been “working on this for a couple of years.”
APD says it will use the vendors Clearview AI and Lumen. Clearview AI has a database of tens of billions of faces scraped from all over the internet that law enforcement can use to find matches when they have photos of unidentified suspects. Lumen matches photos to mugshots that already exist in the police database.
APD’s website states it will use facial recognition technology as “an investigative lead,” not as the sole grounds for arrests.
According to Poppe, Clearview AI will give police a list of potential suspect matches, rather than offering just a singular match. He added that officers will be allowed to use the software for any active or ongoing investigation, but not as part of “ongoing surveillance or persistent tracking.” APD is also pledging not to use the software for “civil immigration enforcement.”
What Aurora city council members think
Council member Alison Coombs brought up ethical concerns around Clearview AI in particular. She questioned the company’s trustworthiness, pointing to lawsuits that have been brought against it for violating privacy laws.
“I think that it's deeply concerning that we would be looking at contracting with a vendor that has demonstrated that they will violate the law and wait to be sued to then comply,” Coombs said.
Coombs later told CPR that some of her constituents have voiced concerns to her about the use of this tech.
“I was asked about it at a forum that I attended with a group of community members here in Aurora,” Coombs said. “They were mostly older adults and they had significant concerns with the level of surveillance in general and the use of AI in connection with the fact that we already have such a large amount of surveillance taking place.”

“I think that the very folks that the department needs to rebuild trust with are going to have less trust and more suspicion in relation to this technology,” she said.
Other council members weighed in on why they voted yes on the resolution, mentioning the public’s desire for safer communities.
“I've heard consistently from my constituents that crime and addressing public safety issues is a top priority, if not the most important priority,” said council member Curtis Gardner. “I feel like the Aurora Police Department made a compelling case as to why these tools could be helpful in addressing crime and solving crime. I think they also have appropriate safeguards in place to protect potential civil liberties concerns.”
Council member Amsalu Kassaw echoed those views.
“When we do a town hall meeting with our communities, they're very concerned with crime. They're always concerned about public safety. And we thought as a council, this is a good tool to fight criminals, to fight the crime that is happening in our city,” said Kassaw. Kassaw works as a lieutenant for the GEO Group, the private contractor that runs the U.S. Immigration and Customs Enforcement detention center in Aurora. ICE is one of Clearview AI’s biggest customers.
Civil rights concerns
Multiple cities across the country, including Minneapolis, Boston, and San Francisco, have banned the use of facial recognition software. In Colorado, civil rights advocates are concerned about APD having access to this technology.
“At baseline, the ACLU of Colorado does not think that law enforcement should have the ability to use facial recognition technology,” said ACLU public policy director Anaya Robinson.
“I think our concerns generally around the use of facial recognition technology by law enforcement are even more exacerbated when we're talking about governments that already have complicated and somewhat problematic relationships with communities of color, like the Aurora Police Department does and historically has had.”

In 2021, state-appointed investigators published a report detailing how the “Aurora Police has engaged in a pattern and practice of racially biased policing against people of color as a whole and Black people in particular.” That same year, the Colorado Attorney General’s office and the City of Aurora agreed on a framework for the city’s police and fire departments to reform racist policing practices and rebuild public trust by signing a consent decree, which is still in place today.
Robinson believes it is a matter of when, not if, people in Aurora are harmed by this technology.
“We really see the use of facial recognition as a potential to maintain and very likely increase those negatively impactful interactions with police,” Robinson said. “While we understand that this won't be allowable to be used as a basis of arrest or a basis of prosecution alone, it does and has the very real potential to increase interactions with law enforcement based on a system that we know is historically, and still currently flawed, in misidentifying at higher rates, people of color, women, people with disabilities.”
Robinson noted that Aurora has followed state rules in its quest to start using facial recognition technology, but said state lawmakers should revisit those rules to make them “stronger and with more regulatory guardrails, before more law enforcement agencies across the state engage in the use of facial recognition technology.”
A number of research studies have found racial bias in facial recognition technologies, with potentially higher error rates when identifying people of color.
What’s next?
Council members Coombs and Gardner mentioned that Aurora will hold community feedback sessions, but didn’t know exactly when that would happen.
According to APD, the city auditor will also conduct annual audits to ensure the technology is not being misused. However, Robinson said that the ACLU of Colorado would like to see an outside agency not affiliated with the city of Aurora also oversee the use of this technology.
Poppe said Aurora Police won’t start using AI facial recognition until at least January. The department’s policy concerning this technology is still being finalized and officers and detectives will be trained by the FBI and a private consultant on how to use the technology before incorporating it into their work.