At least its bandwagon-detection AI still works
Microsoft said on Thursday it will not sell facial-recognition software to the police in the US until the technology is regulated by federal law.
The move comes after the Windows giant faced mounting pressure to denounce face-analyzing machine-learning systems as rivals, namely IBM and Amazon, publicly dumped the tech.
In IBM’s case, Big Blue said it was doing away with “general purpose” facial-recognition systems entirely, conveniently keeping the door open for customized services for clients. Meanwhile, Amazon said it was going to pause touting Rekognition to the cops for a year, long enough for this hot potato to drop out of the news cycle, and for AWS to, presumably, quietly resume offering it again 12 months down the line.
This all comes amid protests across the US against systemic racism and police brutality, and fears the plod will use facial-recognition gear to identify and crack down on protesters – well, the ones not wearing masks for pandemic or privacy reasons.
“For the past two years we have been focused on developing and implementing strong principles that govern our use of facial recognition, and we’ve been calling for strong government regulation,” a Microsoft spokesperson told The Register.
“We do not sell our facial recognition technology to US police departments today, and until there is a strong national law grounded in human rights, we will not sell this technology to police departments. We’re committed to working with others to advocate for the legislation that is needed. We’re also taking this opportunity to further strengthen our review processes for any customer seeking to use this technology at scale.”
A previous study revealed that commercial facial-recognition systems sold by companies like Microsoft and IBM struggled more with identifying women and people with darker skin. Error rates for Microsoft's API were as high as 20.8 per cent for black women, and for IBM's model as high as 34.7 per cent.
These types of racial bias are well known, making it particularly problematic for such technology to be used by law enforcement. Police mugshot databases are predominantly made up of black people, Clare Garvie, senior associate at Georgetown University Law Center's Center on Privacy & Technology, previously explained in a congressional hearing on facial recognition. The technology is therefore more likely to misidentify black people and mistake them for criminals, she argued.
Microsoft's president Brad Smith has urged the US government to adopt laws to regulate the technology. Redmond has even gone as far as sponsoring legislation on the topic in its home state.
Washington Governor Jay Inslee signed a bill backed by Microsoft that allows law enforcement to use facial-recognition software, albeit with a few caveats to ensure the technology is rolled out more transparently. That bill strikes a softer blow than the measures passed in cities like San Francisco and Oakland in California, or Somerville in Massachusetts, where law enforcement has been banned from using the tech.
Now, Microsoft has decided to scrap the service altogether for US police departments – and, presumably, just American police departments. Read that way, it sounds like Microsoft is still open to peddling its services to government agencies, such as immigration and border officers, or the military, just like Amazon.
This article has been published from a wire agency feed without modifications to the text. Only the headline has been changed.