Data protection: facial recognition and the most problematic aspects.

Facial recognition, like all artificial intelligence techniques, undoubtedly raises privacy concerns because it processes extremely sensitive biometric data. Specifically, the technique applies biometric software capable of uniquely identifying and verifying a person's identity by analysing the distinctive features of the face and comparing them with those of other images.

In fact, each of us has unique facial characteristics.

This type of software can analyse them, compare them with images stored in a database and, if a match is found, identify the person.
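As a purely illustrative sketch (not part of the source article), the matching step described above can be thought of as comparing a "probe" feature vector against stored templates; the vectors, the similarity threshold and the database here are all hypothetical toy values:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.9):
    """Return the best-matching identity above the threshold, or None."""
    best_id, best_score = None, threshold
    for person_id, template in database.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Hypothetical stored templates (real systems use high-dimensional embeddings).
db = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.8, 0.5]}
print(identify([0.88, 0.12, 0.21], db))  # prints: alice
```

Real systems extract far richer embeddings from images, but the decision logic is the same: no identity is returned unless a similarity score clears a configured threshold.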

As is often the case with new technologies, the relationship between facial recognition and personal data protection is not one of absolute incompatibility: processing can certainly be possible and lawful provided it complies with the principles of lawfulness, necessity, proportionality and minimisation set out in European Regulation 2016/679 (GDPR).

Otherwise, we could face the unlawful dissemination of extremely sensitive data, or even identity theft, automated processing not permitted under the Regulation, or other forms of unlawful processing of personal data.

However, even where these technologies are perceived as particularly effective, data controllers should first assess the impact on fundamental rights and freedoms and consider less intrusive means of achieving their legitimate aim.

In our legal system there are no ad hoc laws regulating facial recognition, but the related processing of personal data must comply with certain fundamental provisions of the GDPR: in particular Article 5, which sets out the principles applicable to the processing of personal data; Article 6, which defines the conditions of lawfulness of processing; and Article 9, which governs the processing of special categories of data, including biometric data. Moreover, the European Data Protection Board (EDPB), in its recent Guidelines 3/2019 on the processing of personal data through video devices, devotes particular attention to facial recognition and intelligent video surveillance.

With regard to biometric data, in the light of Articles 4(14) and 9 of the GDPR, the following three criteria must be taken into account:

  • Nature of the data: data relating to the physical, physiological or behavioural characteristics of a natural person;
  • Means and methods of processing: data "resulting from specific technical processing";
  • Purpose of processing: the data must be used to uniquely identify a natural person.

The guidelines state that Article 9 of the GDPR applies where the controller stores biometric data (most commonly templates created by extracting key features from the raw biometric data, e.g. facial measurements from an image) in order to uniquely identify a person. If a controller wishes to identify a data subject re-entering the area or entering another area (for example, to display personalised advertising), the purpose is to uniquely identify a natural person, and the operation falls within the scope of Article 9 from the outset.
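To illustrate what "extracting key features" can mean in practice, here is a deliberately simplified, hypothetical sketch (not from the guidelines): a toy template built from ratios of distances between facial landmark points, normalised by the inter-eye distance so the template retains derived measurements rather than the raw image:

```python
import math

def face_template(landmarks):
    """Toy biometric template: pairwise landmark distances normalised by the
    inter-eye distance, so the template is scale-invariant and does not
    retain the raw image, only derived facial measurements."""
    eye_span = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    points = sorted(landmarks)  # fixed ordering so templates are comparable
    template = []
    for i, a in enumerate(points):
        for b in points[i + 1:]:
            template.append(math.dist(landmarks[a], landmarks[b]) / eye_span)
    return template

# Hypothetical landmark coordinates (in pixels) detected in an image.
landmarks = {"left_eye": (0.0, 0.0), "right_eye": (2.0, 0.0),
             "nose": (1.0, 1.0), "mouth": (1.0, 2.0)}
template = face_template(landmarks)
```

Production systems use learned embeddings rather than hand-picked distances, but the legal point is the same: what is stored is a technically processed derivative used to uniquely identify a person, which is exactly what brings it under Article 9.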

Biometric systems are sometimes installed in uncontrolled environments, meaning that the system captures the face of any person passing within range of the camera, including people who have not consented to biometric processing. The resulting templates are compared with those created from people who gave their prior consent during an enrolment process (i.e. biometric users), so that the controller can recognise whether or not a person is an enrolled user.

In such cases, according to EDPB guidance, the system is typically designed to distinguish the individuals it wants to recognise from those who are not enrolled in its database, and a specific exemption under Article 9(2) of the GDPR is clearly required.

When biometric processing is used for authentication purposes, the guidelines recommend that the controller offer a fallback solution that does not involve biometric processing, without restrictions or additional cost for the data subject. Such a fallback is also necessary for individuals who cannot meet the constraints of the biometric device (impossibility of enrolment or of reading the biometric data, a disability hampering its use, etc.). Likewise, if the biometric device is unavailable (such as a malfunction), a fallback solution should be implemented to ensure continuity of the service, though limited to exceptional use.

Of course, in line with the principle of data minimisation, data controllers should ensure that the data extracted from a digital image to build a model are not excessive and contain only the information necessary for the specified purpose, thus avoiding any possible further processing.

From a technical point of view, according to the guidelines, the controller must take all necessary precautions to preserve the availability, integrity and confidentiality of the processed data.

In this regard, the controller must:

  1. compartmentalise data during transmission and storage;
  2. store biometric templates and raw or identity data in separate databases;
  3. encrypt biometric data, in particular biometric templates, and define an encryption and key management policy;
  4. integrate organisational and technical measures for fraud detection;
  5. associate an integrity code with the data (e.g. a signature or a hash) and prohibit any external access to the biometric data.
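Two of the measures above (separate storage of templates and identity data, and an integrity code over the template) can be sketched as follows. This is a hypothetical illustration, not an implementation from the guidelines; in practice the key would come from a managed key-management system rather than being generated in code:

```python
import hmac
import hashlib
import secrets

# Hypothetical key; a real deployment would load it from a KMS/HSM,
# per the encryption and key management policy the guidelines require.
INTEGRITY_KEY = secrets.token_bytes(32)

# Measure 2: biometric templates and identity data in separate stores.
template_store = {}   # record_id -> (template_bytes, integrity_tag)
identity_store = {}   # record_id -> identity data

def integrity_tag(template: bytes) -> bytes:
    """Measure 5: an HMAC-SHA256 integrity code associated with the template."""
    return hmac.new(INTEGRITY_KEY, template, hashlib.sha256).digest()

def enrol(record_id: str, name: str, template: bytes) -> None:
    """Store the template (with its tag) and the identity data separately."""
    template_store[record_id] = (template, integrity_tag(template))
    identity_store[record_id] = {"name": name}

def verify_integrity(record_id: str) -> bool:
    """Detect tampering: recompute the tag and compare in constant time."""
    template, tag = template_store[record_id]
    return hmac.compare_digest(tag, integrity_tag(template))

enrol("u1", "Alice", b"\x01\x02\x03")
print(verify_integrity("u1"))  # prints: True
```

Keeping the two stores separate means a breach of one database does not by itself link a biometric template to a named person, while the HMAC tag makes any modification of a stored template detectable.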

Of course, these measures will have to evolve with the progress of technologies.

SOURCE: FEDERPRIVACY
