Digital rights campaigners called on the European Union on Monday to tighten export controls on surveillance tools such as facial recognition systems to prevent European technology from being used in countries where it could fuel human rights abuses.
Sales of digital surveillance systems are not currently restricted by the EU despite posing risks to privacy and other freedoms in countries that lack adequate safeguards, Amnesty International said in a report.
“These technologies can be exported freely to every buyer around the globe,” said the report, which was published as the European Parliament and EU member states prepare to review the bloc’s export rules.
“The EU exports regulation framework needs fixing, and it needs it fast,” it said.
It called for the technology to be treated in the same way as goods with dual civilian and military use, meaning export deals could be blocked if judged to pose a significant threat to human rights.
Amnesty said it had conducted an investigation that found several European companies had sold digital monitoring systems to China.
China’s efforts to build one of the world’s most sophisticated surveillance technology networks, with hundreds of millions of cameras in public places, have drawn criticism from human rights advocates.
Morpho, a French company that later became part of IDEMIA, supplied facial recognition equipment to Shanghai police in 2015, the Amnesty report said.
IDEMIA said the sale had involved an old-generation system for the identification of faces on recorded footage rather than live surveillance, adding it “did not and does not sell facial recognition technologies to China.”
Amnesty’s probe also found Swedish company Axis Communications had been selling surveillance cameras to Chinese law enforcement agencies since 2012.
The Lund-based company said its network video solutions were used all over the world to help increase security and safety, adding that it had “export control mechanisms” and “systematic screening of customers.”
Meanwhile, Dutch company Noldus Information Technology sold emotion recognition systems to Chinese authorities and universities, according to Amnesty.
Noldus said it was technically impossible to use its software – designed for the study of human behavior – for the purposes of mass surveillance.
“We have never come across a single instance where human rights were violated with the aid of our software,” it said in a statement, adding that Amnesty had not provided evidence to the contrary and had declined an offer to inspect the software.
Amnesty said individual member states were blocking proposals by the European Parliament and European Commission for tougher controls.
A spokeswoman for the Council of the European Union, which represents the member states, said negotiations to review the regulations were ongoing.
Ella Jakubowska, policy and campaigns officer at European digital rights group EDRi, welcomed Amnesty’s report, saying biometric mass surveillance technologies “run an enormous risk of fundamental rights violations.”
“EDRi is currently urging the EU to ban biometric mass surveillance technologies within the EU – and this certainly means that we are also against the use of these dystopian technologies elsewhere,” she said.