Amazon bans police from using facial recognition tech Rekognition for 1 year

Amazon has barred police from using its facial recognition technology for one year.

In a company blog post Wednesday, Amazon said it will implement a one-year “moratorium on police use of Amazon’s facial recognition technology” — known as Rekognition.

Amazon said the decision to temporarily halt police use of Rekognition comes after activists pushed members of Congress to regulate, or outright ban, law enforcement's use of the technology.

The company said it has “advocated” for government officials to “put in place stronger regulations” and is hoping the one-year moratorium “might give Congress enough time to implement appropriate rules.”

In February, Senators Cory Booker and Jeff Merkley introduced the Ethical Use of Facial Recognition Act, which would create guidelines for how the government uses facial recognition. And last year, two lawmakers, Rep. Elijah Cummings and Rep. Jim Jordan, had planned to introduce a bipartisan bill on the use of the technology.

The company said it will still allow organizations like Thorn, the International Centre for Missing & Exploited Children, and Marinus Analytics to use the facial recognition technology “to help rescue human trafficking victims and reunite missing children with their families.”

It is unclear just how many police forces and government agencies use Amazon’s Rekognition. When asked how Amazon will remove the technology from police departments that already deploy it, a company spokesperson declined to comment.

IBM recently announced it would halt all research and development of facial recognition technology. Protesters and civil rights groups have raised privacy concerns about the use of the technology to identify demonstrators.

Experts told Digital Trends there is inherent bias in facial recognition technology. Because most systems of this kind are trained primarily on white male faces, the probability of misidentification is much higher for anyone who is not white and not a man. In a study by the Massachusetts Institute of Technology, researchers found that Amazon’s Rekognition performed poorly at identifying dark-skinned, female faces, misclassifying women as men 19% of the time.

In the past, Amazon has come under fire, from consumers as well as its own employees, for selling facial recognition to police departments across the U.S., a move that it later addressed in a similar company blog post.
