UEBA and data protection
In recent years, the use of techniques known as “User and Entity Behavior Analytics” (UEBA) has become widespread. These techniques have many applications that share a common pattern: record how users behaved in the past, model how they behave in the present and, where possible, predict how they will behave in the future.
UEBA techniques used in Internet services collect massive amounts of user or entity data and almost always apply machine learning or artificial intelligence techniques to generate behavioral models. Users are always people; entities can be animals, vehicles, mobile devices, sensors, etc. How these techniques are applied depends on the specific application domain, since it may be of interest to analyze the individual behavior of people or their behavior from a social perspective (interrelation, communication, distance and movement).
We can summarize the most significant domains of application of these techniques in three broad categories:
1. Optimization of services and marketing: This is perhaps the most widespread domain today, mainly on web pages and in mobile applications. In this context, UEBA solutions model the behavior of users and their devices while they browse or use an application, extracting information about how they use it, where they focus their attention, where they spend the most time, etc. This is achieved by recording sessions, drawing heat maps or identifying the user's journey. This information can then be used to design interfaces that place each element where it is most efficient or productive (advertising, for example) or to classify users into groups so that marketing initiatives can be personalized. Other examples within this domain include optimizing the use of communications networks (including 5G and the Internet of Things), transport networks (especially in smart cities) or electricity networks.
2. Cybersecurity: UEBA in this domain is evolving rapidly. Here the objective is usually to prevent and detect threats: detecting behavior by employees and external users that departs from the usual pattern, inferring the possible threat their behavior poses, and acting on the individual accordingly. It also allows the detection of anomalies in a network caused by malware infection, or in the use of a user account whose identity has been spoofed.
3. Health and safety of people: In this case, the applications aim to detect potentially anomalous or unexpected behaviors that could be indicative of different types of physical or mental illness. They can also be used to prevent accidents at work (analyzing the behavior of workers at their posts, for example) or in traffic (analyzing the behavior of the driver).
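As an illustration of the behavioral-baseline idea common to the domains above, the sketch below flags days whose activity deviates strongly from a user's own history. All figures are hypothetical, and real UEBA systems use far richer features and models than a simple standard-deviation rule; this is only a minimal sketch of the principle.

```python
from statistics import mean, stdev

# Hypothetical daily action counts for one user (illustrative data only).
baseline = [42, 38, 45, 40, 44, 39, 41]

def is_anomalous(todays_count: float, history: list, threshold: float = 3.0) -> bool:
    """Flag a day whose activity deviates more than `threshold`
    standard deviations from the user's historical mean."""
    mu = mean(history)
    sigma = stdev(history)
    return abs(todays_count - mu) > threshold * sigma

print(is_anomalous(40, baseline))   # a typical day -> False
print(is_anomalous(120, baseline))  # a sudden spike -> True
```

The same deviation-from-baseline logic, with more sophisticated models, underlies alerts such as "this account is behaving unlike its owner".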
COMPLIANCE WITH DATA PROTECTION REGULATIONS
In any of these domains, personal data is processed. UEBA solutions can be very intrusive, routinely processing personal data and building behavioral profiles that allow users to be identified, categorized, and acted upon.
These techniques rarely incorporate typical data protection strategies by design and by default. For example, the principle of data minimization is usually not applied, since these solutions aim to collect and process all available information about the individual's actions in the digital environment, in case it proves useful at some point. For the same reason, generalization or reduced-granularity techniques are rarely applied, since the goal is to associate the most precise data possible with a user and thereby identify the user's profile in order to classify it into, for example, what the system considers suspicious behavior. Anonymization, pseudonymization and aggregation of data are also often ruled out in the absence of privacy strategies specifically designed for machine learning or artificial intelligence.
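By way of contrast, pseudonymization can be applied before data ever reaches the analytics pipeline. The sketch below uses a keyed hash (HMAC) so that records about the same user remain linkable for analysis without storing the raw identifier; the key value and identifiers are assumptions for illustration only.

```python
import hashlib
import hmac

# Assumption for illustration: in practice this key would be managed in a
# secrets vault and kept separate from the pseudonymized data set.
SECRET_KEY = b"stored-separately-from-the-analytics-data"

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash: the same input
    always yields the same pseudonym, so behavior can still be linked
    and analyzed, but without the key the mapping cannot be reproduced
    or reversed from the data alone."""
    return hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

print(pseudonymize("alice@example.com") == pseudonymize("alice@example.com"))  # True
```

Note that, under the GDPR, pseudonymized data is still personal data: whoever holds the key can re-identify the individuals.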
When personal data is processed, the principles established in the GDPR apply, including the principle of transparency. In many cases, users are not informed that these types of techniques are being used, of the depth of the processing (for example, whether they are right-handed or left-handed, their routines or habits, their mood or health, their profiles or categorizations, etc.), or of the potential impact that a data breach could have.
Data retention must be limited in time to what is necessary to achieve the purposes pursued by the processing. Compliance with this storage limitation principle should be analyzed before such processing is launched.
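A minimal sketch of enforcing such a storage limitation period, assuming a hypothetical 90-day retention window (the actual period must be justified by the purpose of the processing):

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # assumption: the period depends on the purpose

def purge_expired(records, now=None):
    """Keep only records still inside the retention window; everything
    older should be deleted (or anonymized, where appropriate)."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "collected_at": datetime(2024, 5, 20, tzinfo=timezone.utc)},  # recent
    {"id": 2, "collected_at": datetime(2024, 1, 5, tzinfo=timezone.utc)},   # expired
]
print([r["id"] for r in purge_expired(records, now)])  # [1]
```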
Likewise, the rights of the users whose data is processed must be guaranteed, including the rights of access and rectification and, where applicable, the right to object.
In some of the domains presented, this technology could involve automated individual decisions within the meaning of Article 22 of the GDPR, in which case the right of individuals not to be subject to decisions based solely on such automated processing must also be guaranteed.
In addition, in a large number of applications the providers of the solutions that carry out the UEBA processing are not established in the EEA, which may entail international transfers of personal data; these may only take place if the safeguards established in the GDPR are met.
As in any processing of personal data, risks to the rights and freedoms of individuals must be properly managed, and in cases where the processing may be considered high risk, a data protection impact assessment is mandatory. In particular, Article 35.3(a) of the GDPR states that a data protection impact assessment is mandatory in cases of systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect them. This is precisely the kind of processing that UEBA techniques involve.
It is the responsibility of the data controller to assess the need for this impact assessment in each case before implementing the processing of personal data.
You can find more material related to this topic on the Innovation and Technology | AEPD web page, in particular:
- Guide to risk management and impact assessment on the processing of personal data
- List of types of processing that require DPIA (Art. 35.4 GDPR)
- Privacy by design: Secure multi-party computation, additive secret sharing
- Anonymization and pseudonymization (II): differential privacy
- Anonymisation and pseudonymisation | AEPD
- Metaverse and privacy