‘Black Lives Matter’ Protests Put Orwellian Facial Recognition Software in the National Spotlight

As “Black Lives Matter” protests continue throughout the country following the death of George Floyd, which has caused widespread outrage, privacy and human rights groups are calling out Orwellian facial recognition technology that they say is discriminatory toward people of color.

“We need to make sure technologies like facial surveillance stay out of our communities,” said Kade Crockford, director of the Technology for Liberty Program at the American Civil Liberties Union (ACLU) of Massachusetts. The ACLU is leading the charge against widespread facial recognition.

Groups like the ACLU have cited federal studies showing that minorities are 10 to 100 times more likely to be misidentified by facial recognition technology. They believe that the use of this technology leads to more encounters between law enforcement and people of color, encounters that could turn deadly.

“People are marching in record numbers to demand justice for black communities long subject to police violence,” Crockford said to the Thomson Reuters Foundation.

“In response, government agencies are mounting increasingly aggressive attacks on freedom of speech and association, including by deploying dystopian surveillance technologies,” Crockford added.

The ACLU wants more cities to follow the lead of Oakland and San Francisco, where local authorities have banned the use of facial recognition technology.

“We have a fundamental duty to safeguard the public from potential abuses,” Aaron Peskin, the San Francisco city supervisor who championed the prohibition on facial-recognition technology, said before his board voted to approve the ban.

The ACLU has called out corporations such as Amazon that have paid lip service to social justice protesters while selling police facial recognition technology that can be used to profile minorities.

“Cool tweet. Will you commit to stop selling face recognition surveillance technology that supercharges police abuse?” the ACLU asked Amazon in a Twitter post.

Sarah Chander of the European digital rights group EDRi is raising awareness of how artificial intelligence is being used in predictive policing, in which police use algorithms to determine which communities should receive additional enforcement resources. She believes that the technology will be used as an excuse for law enforcement to crack down on communities of color.

“Governments across the world need to step up and protect communities. This means drawing red lines at certain uses of technology,” she said in emailed comments.

Although many of the “Black Lives Matter” protesters may be radical socialists, pursuing some of their goals may unwittingly defend privacy rights and liberty against technological innovations that empower Big Brother.