Use of biometric data: Assessment from a data protection perspective

Operations involving biometric data can serve many different purposes, such as proof of life, identification, authentication, tracking, profiling, automated decision-making, etc. Biometric operations can use different techniques, sometimes several at once, and the same technique can be implemented in different ways. The degree of intrusiveness of biometric operations within a particular processing, and their impact on individuals' privacy, will vary with the technique used, but also with the definition of the processing itself, its nature, its scope or extent, its context and, in particular, the purposes pursued. The impact assessment of biometric operations must therefore be carried out within the framework of a processing operation and in relation to its ultimate purposes.

Biometric data processing techniques are based on collecting and processing physical, behavioral, physiological or neural traits of individuals by means of devices or sensors, creating signatures or patterns that enable the identification, tracking or profiling of individuals. Some methods require the cooperation of the individual, while other methods can capture biometric data remotely, without requiring the cooperation of the individual and without the individual's awareness.

In the framework of a processing operation, any of the different biometric techniques involved have to be assessed according to their adequacy, proportionality and necessity, their purpose, their impact on the rights and freedoms of natural persons and the risks they entail, both for the individual and for society.

There are different criteria for classifying biometric systems: some are based on the technology used, others on the device or sensor involved, others on the trait or set of traits studied, and so on. However, when demonstrating the adequacy of a processing operation under the GDPR, and when assessing the risk that the processing of such data may pose to the rights and freedoms of individuals, it is appropriate to classify biometric operations from a data protection point of view and in relation to the processing in which they are implemented.

The following is a non-exhaustive list of some of the criteria that may be useful in typifying biometric operations in the context of a processing operation:

Purpose of biometric data operations in relation to the purpose of processing

The GDPR is aimed at the adequacy of processing. The technologies used in the processing are part of the nature of the processing and the processing is defined by its purposes, which must be specified, explicit and legitimate.

Biometric technologies are not processing operations in themselves, but a means to carry out operations within a processing operation, which will have a well-defined ultimate purpose.

One (or more) biometric technology can be used in the framework of a processing operation to implement different purposes within the processing operation:

  • Detection of human beings
  • Detection of a face or other anatomical feature
  • Evaluation of patterns and behaviours
  • Profiling, classification, and decision making
  • Authentication
  • Identification
  • Tracking of individuals

The list above is ordered, approximately, from the least to the most intrusive operations, i.e. those with the greatest impact on data subjects' rights.
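As an illustration only, the ranking above can be sketched as an ordered enumeration. The names and the strictly linear scale are assumptions for the example, not a normative classification:

```python
from enum import IntEnum

class BiometricPurpose(IntEnum):
    """Purposes of biometric operations, ordered (illustratively)
    from least to most intrusive, following the list above."""
    HUMAN_DETECTION = 1     # detecting that a human being is present
    FEATURE_DETECTION = 2   # detecting a face or other anatomical feature
    PATTERN_EVALUATION = 3  # evaluating patterns and behaviours
    PROFILING = 4           # profiling, classification, decision making
    AUTHENTICATION = 5      # 1:1 verification of a claimed identity
    IDENTIFICATION = 6      # 1:N search against a database
    TRACKING = 7            # following individuals over time and space

def more_intrusive(a: BiometricPurpose, b: BiometricPurpose) -> bool:
    """True if purpose `a` ranks as more intrusive than purpose `b`."""
    return a > b

print(more_intrusive(BiometricPurpose.TRACKING,
                     BiometricPurpose.AUTHENTICATION))  # True
```

In a real assessment the ranking would of course be qualified by the context and purpose of the processing, not reduced to a single integer.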

Legal framework

The legal framework for biometric operations will depend on data protection regulations, specific regulations on biometrics and also the sectoral regulations that are applicable to the controller and the specific processing.

This may enable or restrict the use of biometric operations, impose additional obligations (e.g. the performance of a DPIA) and determine the legal validity of their results.

Scope or extent of processing

The greater the scope of the processing in which the biometric operation is embedded (the number of data subjects affected, the volume of biometric parameters used, the geographical extent, the duration of the processing, the biometric data collection rate or the data retention period, among other factors), the greater its potential impact on individuals.

In particular, the extent must be considered in relation to the categories of data subjects concerned, especially when the processing applies to vulnerable groups (minors, the elderly, the sick, migrants, refugees, etc.), bearing in mind whether they are in a position to object to the biometric operation and whether the operation is appropriate for them.

Qualified human intervention in relation to the biometric result

Qualified human intervention is an additional guarantee for resolution of problems related to the biometric operation, as well as for immediate identification of biases.

In addition, presentation (spoofing) attacks are easier for a human to detect with some biometric features, such as the face, than with others, such as the iris, where a manipulation is harder to recognize.

On the other hand, if the result of the biometric operation produces legal effects on the data subject or similarly significantly affects them, Art. 22 GDPR provides for additional prohibitions and safeguards, including possible qualified human intervention.

In this way, the degree of qualified human involvement in the biometric operation diminishes some of the risks inherent in this operation.

Processing of special categories of data

Some biometric techniques could or do process special categories of data beyond their possible involvement in identification and authentication processes.

In that case, an exception lifting the prohibition will have to be identified for each of the special categories of data referred to in Article 9 of the GDPR on which processing is intended. If no exception to the prohibition laid down in Article 9.1 of the GDPR applies for each of the categories of data processed, the processing will not only be intrusive, but also prohibited.

Transparency of the biometric operation

The collection of data for a biometric operation within a processing can take place in one of two extreme situations:

  • Data collection is done consciously by the individual, and even requires a positive action from the individual to initiate biometric data processing
  • Or, at the other extreme, the individual is not aware that biometric information is being captured, nor what kind of data or when, and no positive action is required on their part

The latter case is the most intrusive to the individual's privacy.

Free choice of the data subject to the biometric operation

The processing may be designed so that the biometric procedure is an option which the data subject can choose in a free, specific, informed and unambiguous way, in particular when submission to the biometric operation requires a positive and conscious action by the data subject.

To the extent that these conditions are not actually met, the processing becomes increasingly intrusive.

Adequacy of the biometric operation

The biometric technique used in the processing has to meet the performance parameters appropriate to the context in which the data processing is deployed.

The accuracy of the different biometric features and techniques varies, as does the performance of different implementations and the peculiarities of each context. These need to be assessed throughout the life of the processing, and especially before deployment, by means of the necessary tests, audits and/or certifications in the specific processing context, verifying that the implemented biometrics allow the purpose of the processing to be fulfilled.

The lower the accuracy of the data obtained from the biometric operation (profiling biases, incorrect identifications, identity theft, discrimination against segments of the population such as the elderly, the disabled, racial groups or the sick, denial of access to services due to errors in data capture, etc.), the greater the intrusion that the operation causes in the rights of the data subjects.
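The accuracy concerns above are commonly quantified with error rates such as the False Rejection Rate (FRR) and the False Acceptance Rate (FAR). A minimal sketch of how they are computed at a decision threshold, using invented match scores for illustration:

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """Compute FRR and FAR for a similarity-score biometric matcher.
    Scores >= threshold are treated as a match."""
    # FRR: genuine (same-person) comparisons wrongly rejected
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    # FAR: impostor (different-person) comparisons wrongly accepted
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far

# Hypothetical scores from a test campaign (invented figures)
genuine = [0.91, 0.88, 0.75, 0.95, 0.62]   # same-person comparisons
impostor = [0.30, 0.45, 0.81, 0.20, 0.15]  # different-person comparisons

frr, far = error_rates(genuine, impostor, threshold=0.7)
print(f"FRR={frr:.0%}, FAR={far:.0%}")  # FRR=20%, FAR=20%
```

Raising the threshold lowers FAR but raises FRR; which trade-off is acceptable depends on the context and purpose of the processing, which is why these requirements must be defined per processing operation.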

Minimum data

Not all biometric technologies, nor all specific implementations of a technology, process the same amount of personal data or process the same number of physical aspects of the data subject.

In that sense, the biometric operation needs to be adapted to the purpose of the processing. For example, if the purpose of the processing is only to determine that a human exists or to find a random face, it is not necessary to use a biometric operation with such a granularity that it allows identification (that is able to distinguish between two faces or two humans).
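A minimal sketch of this minimization idea, assuming a hypothetical detector that returns bounding boxes: only the boolean that the purpose requires is retained, never the image or any identity-level features:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class MinimalResult:
    """Only what the stated purpose needs: presence, not identity."""
    human_present: bool

def detect_presence(frame: Any, detector: Callable[[Any], list]) -> MinimalResult:
    """Run a (hypothetical) detector that returns bounding boxes, then
    keep only the boolean the purpose requires; the frame and the boxes
    are discarded rather than stored or transmitted."""
    boxes = detector(frame)
    return MinimalResult(human_present=len(boxes) > 0)

# Stub standing in for a real detection model
fake_detector = lambda frame: [(10, 10, 50, 50)] if frame == "person" else []
print(detect_presence("person", fake_detector))  # MinimalResult(human_present=True)
```

The design choice is that the granularity of the output matches the purpose: a presence check emits one bit, not an embedding capable of distinguishing individuals.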

Some techniques, depending on how they are implemented, may reveal information about the subject (e.g., emotional state, race, health, ...) that goes beyond the purpose of the processing.

The use of multibiometrics, which aims to improve accuracy, may involve taking excessive additional data on individuals contrary to the minimization principle required by Article 25.2 of the GDPR.

Finally, some processing operations combine biometric data with other types of personal data, which may result in excessive data collection in relation to the purpose of the data processing.

Suitability and need for biometric operation

Different means can be used to achieve the purpose of a data processing, and biometrics can be one of them.

It must be assessed whether biometric operations are really necessary to achieve the purpose of the processing and whether other, less intrusive means could be used instead. To that end, the performance metrics required by the processing should be defined, and the suitability of the different options, including biometrics, assessed against those requirements.
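One way to make this assessment concrete is to define the required metrics first and then select, among the options that satisfy them, the least intrusive one. A toy sketch in which the candidate means, their error rates and the intrusiveness scores are all invented for illustration:

```python
# Required performance (invented figures for illustration)
requirements = {"max_far": 0.01, "max_frr": 0.05}

# Candidate means of achieving the purpose, with assumed metrics
candidates = {
    "access card + PIN":   {"far": 0.005, "frr": 0.02, "intrusiveness": 1},
    "fingerprint match":   {"far": 0.002, "frr": 0.03, "intrusiveness": 3},
    "face identification": {"far": 0.001, "frr": 0.01, "intrusiveness": 5},
}

def suitable(c):
    """A candidate is suitable if it meets the defined requirements."""
    return c["far"] <= requirements["max_far"] and c["frr"] <= requirements["max_frr"]

# Among the suitable options, prefer the least intrusive one
viable = {name: c for name, c in candidates.items() if suitable(c)}
least_intrusive = min(viable, key=lambda n: viable[n]["intrusiveness"])
print(least_intrusive)  # access card + PIN
```

The point of the sketch is the order of reasoning: necessity and proportionality are tested against explicit requirements, so a biometric option is chosen only when no less intrusive means satisfies them.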

Degree of user control

In a simplified way, a biometric process can involve the collection of a physical characteristic (e.g., face), its processing with a biometric analysis machine, and its storage or comparison with an already stored biometric pattern.

If the last two are done by the controller (or third parties) we would have the configuration with less user control. Another situation could occur if the biometric pattern is stored by the user (e.g., on a card), and only the process of analyzing the characteristic against the pattern is performed outside the user's control. The least intrusive form appears when all three elements are controlled by the user (e.g. match-on-card) and the output to the controller or third parties is only a yes or no match.
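The match-on-card configuration can be sketched as follows. The cosine similarity and the threshold are toy stand-ins for a real biometric matcher; the essential property is that only the yes/no outcome leaves the user's device:

```python
import math

def similarity(a, b):
    """Cosine similarity between two feature vectors (toy stand-in
    for a real biometric comparison algorithm)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_on_card(stored_template, captured_template, threshold=0.9):
    """Both templates stay on the user's device (match-on-card style);
    only the boolean result is released to the controller."""
    return similarity(stored_template, captured_template) >= threshold

# The controller never sees the templates, only this boolean:
print(match_on_card([0.2, 0.9, 0.1], [0.21, 0.88, 0.12]))  # True
```

Because neither the stored pattern nor the fresh capture ever reaches the controller, a breach on the controller's side cannot expose biometric templates, which is what makes this configuration the least intrusive of the three described above.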

Implicit collateral effects of the biometric operation

In biometric operations, more personal data than strictly necessary could be captured, inferred and exposed. This is the case, for example, with techniques such as proctoring, which exposes an individual's intimate environment (e.g. posters, books, pendants, etc.) that may reveal political, religious or philosophical convictions. These factors can lead to bias or impose self-censorship on legitimate behaviour.
Personal data breaches

The reality of technology is that every day new technological or social engineering ways of generating vulnerabilities appear.

In the framework of the processing where the biometric operation is located, it is necessary to consider breach scenarios, and to determine the impact that a personal data breach derived from the use of biometric techniques may have on the rights and freedoms of data subjects.

These can result in the leakage or loss of biometric patterns, spoofing of stored patterns, intrusion into the biometric analysis system and its results, bypassing of communication between subsystems, denial-of-service attacks, discontinuity of third-party services, etc. All scenarios must be analysed, independently of the estimated probability of their materialisation, measuring the degree of intrusion they may cause to rights and freedoms.

Likewise, it is necessary to monitor the breaches that are already occurring, which could reveal the inadequacy of a particular biometric technique or of biometrics in general. This implies continuous assessment of the processing in the light of events as they occur.

Implementation

There is a big difference between the concept of a biometric operation and its implementation. The actual implementation involves the selection of sensors, communications, development libraries, the devices in which they are integrated (e.g. mobile phones or ATMs), storage (e.g. in the cloud), etc., each with different degrees of quality, certification, auditing, security and third-party involvement. The less control the controller has over these elements, the greater the risk of the processing.
Data processing context

In relation to the above, just as the biometric operation must be assessed within the framework of the processing, both have to be assessed taking into account the social context and the collateral and unforeseen effects on rights and freedoms that a processing operation incorporating biometric operations has produced or is producing in its environment (deviation of purposes, social impacts, regulatory changes, religious or cultural changes, conflicts, etc.).

 

The validation of the biometric techniques used in a processing must be carried out "by design" as required by article 25.1 of the GDPR and with the recommendations set out in the Privacy by Design Guide. The analysis of these factors, and others that may be specific to the processing or to the biometric operation chosen, will enable an analysis of regulatory compliance, necessity, and proportionality of the processing, allowing for a more appropriate risk management.

It should be noted that, due to the nature of biometric operations, there is a high probability that the processing operations to be included will require an impact assessment as required by Article 35 of the GDPR and, where appropriate, prior consultation as provided for in Article 36 of the GDPR. To help determine this obligation, the controller can use the Assess Risk tool.

This post is related to other material published by the AEPD such as:

Related posts