The European Commission is drafting new laws that will give European citizens the right to access their facial recognition data and to know how it will be used.
In a move that effectively goes beyond the existing General Data Protection Regulation (GDPR), public entities such as the police and government agencies, as well as private businesses, will be required to comply with the regulations if they use facial recognition technology (FRT).
The existing regulations prohibit the collection of biometric data that uniquely identifies individuals. The update is said to set a precedent and a ‘world standard for AI regulation.’
The news about the reining in of FRT comes from a report by the business news publication Financial Times. Citing ‘senior officials’, the FT reports that this new legislation on artificial intelligence is part of new European Commission President Ursula von der Leyen’s goals for her first 100 days.
She is understood to envisage the creation of ‘a coordinated European approach on the human and ethical implications of artificial intelligence.’
It will limit ‘the indiscriminate use of facial-recognition technology’ and let citizens ‘know when [facial recognition] data is used.’
Facial recognition has been treated at arm’s length within the European Union. Three years ago, Facebook was asked to remove FRT from its private photo and video sharing app Moments within the European market. Facebook was also affected when the Irish Data Protection Commissioner barred the company from using FRT within Europe and requested the deletion of all its biometric records.
The report, which the FT has managed to access, says that ‘AI applications can pose significant risks to fundamental rights. Unregulated AI systems may take decisions affecting citizens without explanation, possibility of recourse or even a responsible interlocutor.’
A spokesperson for the Commission explained that the proposals were drawn together through consultations with experts on how to properly regulate FRT. Part of this was ensuring that citizens know when they are under surveillance.
Cameras in Paceville
The news of the European Commission’s plans to better regulate FRT comes at a time when the Maltese government’s decision to install FRT cameras is in tension with concerns over personal privacy.
In 2016, the Maltese government signed a Memorandum of Understanding with the Chinese telecommunications company Huawei, to install cameras fitted with facial recognition technology.
The move has raised real concerns over what the government was getting into, given previous allegations that the company and its founder are directly affiliated with the Chinese state.
So strong were the allegations that the United States government blocked the sale of Huawei Mate 10 mobile phone handsets in the US market and ended a deal between Huawei and the American telecoms company AT&T, citing threats to US national security.
The Maltese government and Safe City Malta, a company set up specifically to facilitate the process, said that the cameras would be trialed to help with crime prevention in high-impact areas like Paceville.
Prime Minister Joseph Muscat said the project would feature a set of high-definition CCTV cameras, ‘eyes in the sky’, that would relay information to a mobile hub.
It was understood at the time that this technology would allow facial recognition to pick out faces in crowds and support police units on the ground. If the project is successful in Paceville, it is intended to be rolled out in Marsa.
Is it legal?
The Malta Information Technology Law Association (MITLA) raised serious questions over the legality of the cameras under GDPR rules on the processing of information.
In its response, MITLA highlights that the cameras, which will be used to collect biometric data, are subject to a ‘stricter legal regime’, while also urging a debate over the balance between the rights and freedoms of the citizen and the roles and responsibilities of law enforcement to maintain law and order. It says that ‘such balance will not be easily achieved and will require careful consideration.’
On the topic of data processing, MITLA points out that in addition to complying with the GDPR (in force since May 2018), the processing of ‘biometric data by law enforcement agencies will also need to be carried out in line with the provisions of EU Directive 2016/680 on the protection of natural persons with regard to the processing of data by competent authorities for the purpose of prevention, investigation, detection or prosecution of criminal offences.’
Needs an impact assessment
In a written response to a letter sent by Nationalist Local Council candidate Michael Briguglio, the EU Commissioner for Civil Liberties, Ms Věra Jourová, said that with such a large-scale project gathering so much data from public areas using sophisticated technologies, the processing could ‘likely result in a high risk for the freedoms of natural persons.’
For this reason, a data protection impact assessment would need to include ‘an assessment of the necessity and proportionality of the processing operations, an assessment of the risks to rights and freedoms of data subjects, and measures envisaged to address those risks, safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with the GDPR.’
The response followed a letter Briguglio sent to Ms Jourová and her colleagues leading related EU departments covering Security, Digital Technology and Data Protection. In it, Briguglio called on the Commissioners to act over the Maltese government’s plan to introduce facial recognition technology into some of Malta’s most populated streets.
Public interest and data controllers
In addition to an impact assessment, Jourová states that those identified as the data controllers must also determine that the task being carried out and the processing of the data ‘is necessary for the performance of a task in the public interest.’
She adds that, ‘According to settled case law, the protection of personal data requires that limitations in relation to that fundamental right can only apply in so far as it respects the essence of that right and is strictly necessary and proportionate.’
With this in mind, Jourová explains further that the data controller should be held accountable for their actions and should choose technology which complies with the regulations.
‘It is therefore for the accountability of the controller, to choose only such technology, which is compliant with the data protection principles and requirements, regardless whether that technology is developed by a European company or by a provider from outside the EU.’