Bill seeks to regulate facial recognition technology
DENVER — As technology develops, facial recognition software is increasingly becoming a part of our daily lives, from unlocking an iPhone to complex criminal investigations.
Airports, schools, police departments and even government agencies have started using the software to identify people.
Now, Colorado lawmakers are taking a closer look at whether more safeguards are needed for biometric data.
Senate Bill 22-113 calls for the creation of the Task Force for the Review of Artificial Intelligence, which will take a closer look at the use of technology by public bodies and offer recommendations on how it should be regulated.
“It’s really focused on putting in place some safeguards and analysis of the use of artificial intelligence for facial recognition,” said Sen. Chris Hansen, D-Denver, one of the co-sponsors of the bill.
Part of the reason the bill is needed, Hansen says, dates back to a 2018 Massachusetts Institute of Technology study that found a high error rate when the software tried to correctly identify women with darker skin tones.
The study found a 0.8% error rate in the software identifying white men, but a 34.7% error rate in identifying dark-skinned women.
“You can imagine the issues with some sort of false identification, false positives, false negatives, and that’s really where we think we need the bill: let’s take the time to carefully assess how the public sector uses the technology,” Hansen said.
Defenders of the technology insist it has changed a lot since the study was conducted in 2018 and has become more accurate at recognizing faces.
However, Kerstin Haring, an assistant professor at the Ritchie School of Engineering and Computer Science at the University of Denver, says problems with the technology still exist.
She breaks the problem of bias down into two parts: a lack of data for artificial intelligence to learn from and derive algorithms from, and a lack of diverse coders inputting the data.
“Our current datasets, for example, don’t contain a lot of black people. So it’s very difficult for a machine learning algorithm to identify them correctly,” Haring said. “We can’t check our own biases as coders when we don’t have a diverse representation of who is creating these algorithms.”
There is also a lack of understanding of how and why artificial intelligence reaches its conclusions.
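The disparity described above can be hard to see in a single headline accuracy number. A minimal sketch, using entirely hypothetical numbers (not the MIT study's data), shows how a low overall error rate can mask a much higher error rate for an underrepresented group:

```python
# Hypothetical illustration: per-group error rates can diverge sharply
# even when the overall error rate looks acceptable, which is why
# audits of facial recognition report results by demographic group.

def error_rate(outcomes):
    """Fraction of misclassified samples (True marks an error)."""
    return sum(outcomes) / len(outcomes)

# Hypothetical audit results: group_a is heavily represented in the
# test set, group_b is not.
results = {
    "group_a": [False] * 990 + [True] * 10,  # 1.0% error
    "group_b": [False] * 70 + [True] * 30,   # 30.0% error
}

# Pooling all samples together hides the disparity.
overall = [o for group in results.values() for o in group]
print(f"overall error: {error_rate(overall):.1%}")  # 3.6%
for name, outcomes in results.items():
    print(f"{name} error: {error_rate(outcomes):.1%}")
```

Because the underrepresented group contributes few samples, its errors barely move the pooled number, which is one reason the bill's accountability reports would matter.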
The task force would meet every four months beginning in October and would be repealed in 2032. The bill also calls on state and local government agencies that use or plan to use the software to submit an accountability report describing how they use the technology.
It would also prohibit law enforcement from using the software to identify, observe, or track an individual without prior probable cause.
Schools would also be banned from using the technology at this time.
“It’s not about banning, it’s really about carefully reviewing the use and making sure we’re getting the right results and doing human review,” Hansen said.
Some groups, including Colorado county sheriffs, opposed the bill. At a committee hearing on Wednesday, a representative from the Security Industry Association spoke out against it, saying one section was so vague it would have a chilling effect on the use of facial recognition technology.
Jake Parker also expressed concerns about a requirement to make the product available to third parties for testing, saying it would limit the availability of some of this technology. Finally, he opposed banning the technology in schools, saying it could be an important safety tool.
Sen. Hansen offered several amendments to the bill, including clarification of the appropriate uses of the technology by law enforcement. The bill was passed by the committee and is proceeding through the legislative process.