Around the world

Smart surveillance technologies are being tested all over the world, in developed and developing countries alike, and in democratic as well as repressive systems. The direction of further political developments, however, may depend on the power of citizens to stand up against uncontrolled surveillance and insist that their human rights and freedoms be respected.

United States of America

Significant points of resistance are often found in the most technologically developed environments, such as cities in California (San Francisco, Berkeley, Oakland) and Massachusetts (Somerville, Cambridge), where public authorities, including the police, are prohibited from using facial recognition technology. In other cities, such as Detroit, there is growing awareness of the dangers that biometric surveillance systems pose to the African-American community.


Certain companies have decided not to deploy this technology until a clear legal framework is established. The ethics committee of a company that supplies equipment to the majority of police forces in the USA, for instance, concluded that it would not use facial recognition technology due to “serious ethical issues”.

China

In cooperation with a number of corporations, China has been using facial recognition technology for years, and the system has become an integral part of daily life in many big cities. As early as 2015, the Chinese authorities endorsed the concept of an “omnipresent, totally connected, always-on, and completely controlled” nationwide video surveillance network as a public security imperative. Two years later, they developed a facial recognition database that can establish a person’s identity within a couple of seconds in a country of over a billion citizens.


This technology is used for identity checks in online payments and at the entrances to public institutions – even schools – as well as for continuous police surveillance of citizens. The surveillance system is “gamified” through social credit, which people gain and lose; street offences, for example, can lead to public shaming. Human rights organisations warn that the system is also used to monitor the activity of ethnic minorities, such as the Uyghur Muslim community.

United Kingdom

The number of surveillance cameras in Britain has increased drastically in less than two decades: from 100 installed cameras in 1990 to over four million in the first decade of the 21st century. According to a report by Big Brother Watch published in 2018, contemporary smart surveillance technology remains largely unsuccessful: 98% of the identifications made in London were wrong. In Wales, accuracy was as low as 9%, and the images of 2,451 wrongly suspected persons remained in the police system for a year.


On the other hand, the biggest legal victory for British citizens so far came in October 2019, when parliament drafted a law that would impose a moratorium on facial recognition technology and make the operation, installation, and purchase of equipment for analysing citizens’ biometric data in public spaces a criminal offence.

Argentina

The case of a Buenos Aires resident showed the Argentinian public how much trouble facial recognition technology can create for innocent people. The man was stopped by the police on entering the metro because smart cameras identified him as an alleged robber who had been wanted for 17 years. For hours the wrongly accused man tried to convince the police of his innocence, and the police then tried to “convince” the computer program. In the end, the man was released, saying he was lucky to be white. In a similar incident, a wrongly accused man with darker skin spent as many as six days in detention after being misidentified by a “smart” surveillance system.

France

Regional authorities in the south of France allowed a facial recognition system to be installed at the entrances of two high schools, as a pilot within a larger project aimed at covering all schools with smart surveillance. However, several human rights organisations, a trade union, and a parents’ association jointly appealed to the court and requested that the experiment be stopped, arguing that the system was too intrusive and did not comply with the Law on Personal Data Protection. At the end of February 2020, the court terminated the regional government’s project, finding among other things that the system was disproportionate to the purpose of controlling entry, and that the previously obtained consent of parents was not a valid legal basis because they had not been sufficiently informed about the system.