The General Data Protection Regulation (GDPR) introduces new rules in relation to certain kinds of automated decision making and profiling. Earlier this week, the Article 29 Working Party published its draft guidance on how those rules should be interpreted.

What is automated decision making and profiling?

Automated decision making and profiling are two separate, but often interlinked concepts.

  • Profiling is a form of automated processing of personal data used to analyse or predict matters relating to an individual, for example their performance at work, financial status, health, interests or location.
  • Automated decision making is the making of decisions by technological means, without human involvement. In practice, profiling is often a precursor to automated decision making.

Profiling and automated decision making can be used in three ways:

  • General profiling – where individuals are segmented into different groups, based on data analysis
  • Decision-making based on profiling – where a human makes a decision based on profiling
  • Solely automated decision making – where an algorithm makes a decision, with no human intervention
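The distinction can be captured in a short sketch. The following Python is purely illustrative (the category and function names are invented, not terms from the GDPR or the guidance) and simply encodes the three-way classification above:

    from enum import Enum, auto

    class ProfilingUse(Enum):
        GENERAL_PROFILING = auto()             # individuals segmented into groups
        DECISION_BASED_ON_PROFILING = auto()   # a human decides, informed by a profile
        SOLELY_AUTOMATED_DECISION = auto()     # an algorithm decides, no human input

    def classify(decision_is_made: bool, human_makes_decision: bool) -> ProfilingUse:
        """Map a processing activity onto the three categories (illustrative only)."""
        if not decision_is_made:
            return ProfilingUse.GENERAL_PROFILING
        if human_makes_decision:
            return ProfilingUse.DECISION_BASED_ON_PROFILING
        return ProfilingUse.SOLELY_AUTOMATED_DECISION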

General prohibition on certain types of automated decision making

Under Article 22(1) of the GDPR, decisions based solely on automated processing which produce legal effects for an individual, or similarly significantly affect them, are prohibited unless:

  • It is necessary for entering into, or the performance of, a contract;
  • It is authorised by law; or
  • It is based on the data subject's explicit consent.

Automated decision making that involves special categories of personal data, such as information about health, sexuality, and religious beliefs, is only permitted where it is carried out on the basis of explicit consent or where it is necessary for reasons of substantial public interest, such as fraud prevention and operating an insurance business.

Necessity is interpreted narrowly, and organisations must be able to show that it is not possible to use less intrusive means to achieve the same goal.

Further regulatory guidance on what constitutes "explicit" consent is expected in due course. As with general consent under the GDPR, any consent must be freely given, unambiguous, specific and informed.
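Read together, these rules amount to a decision procedure. The following Python sketch is a reading aid only, under the simplifying assumption that each condition can be answered yes or no; the field and function names are hypothetical, and this is not something an organisation could rely on for compliance:

    from dataclasses import dataclass

    @dataclass
    class Decision:
        solely_automated: bool
        legal_or_similarly_significant_effect: bool
        necessary_for_contract: bool        # interpreted narrowly (see above)
        authorised_by_law: bool
        explicit_consent: bool
        uses_special_category_data: bool    # e.g. health, sexuality, religious beliefs
        substantial_public_interest: bool   # e.g. fraud prevention

    def permitted_under_article_22(d: Decision) -> bool:
        # The prohibition only bites on solely automated decisions that
        # produce legal or similarly significant effects (Article 22(1)).
        if not (d.solely_automated and d.legal_or_similarly_significant_effect):
            return True
        # Special categories of data: only explicit consent or substantial
        # public interest will do.
        if d.uses_special_category_data:
            return d.explicit_consent or d.substantial_public_interest
        # Otherwise, one of the three exceptions must apply.
        return (d.necessary_for_contract or d.authorised_by_law
                or d.explicit_consent)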

What is meant by "legal effects" or "similarly significantly affects"?

"Legal effects" are things that have an impact on an individual's legal rights or affect a person's legal status or rights under a contract. Examples include:

  • Being granted or denied a social benefit, such as housing or child benefit
  • Being refused entry at a national border
  • Automatic disconnection from a mobile phone service because an individual forgot to pay their bill

"Similarly significantly affects" means decisions that have non-trivial consequences, such as:

  • Automatic refusal of an online credit application
  • Automated decisions about credit limits, based on analysis of spending habits and location
  • E-recruiting without any human intervention
  • Certain types of targeted advertising
  • Online profiling that leads to different individuals being offered different pricing

In practice, this will require an analysis of how automated decision making and profiling are used, and of the consequences for the individual.

Can I get round the restrictions by just having a human nominally supervise the decision?

No. Any human intervention must be meaningful. The person reviewing the decision must analyse all available data and have the authority and competence to change the decision.

What do I need to tell individuals?

Where decisions are made solely using automated decision making, organisations must:

  • tell the individual that automated decision making is being used for these purposes;
  • provide meaningful information about the logic involved (for example by explaining the data sources and main characteristics of the decision making process); and
  • explain the significance and envisaged consequences of the processing for the individual.

The Article 29 Working Party recommends that these steps be followed whenever automated decision making is used, as this helps ensure that the processing is carried out fairly.
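As a sketch of how these three elements might be recorded internally before relying on solely automated decision making, consider the following Python; the record and field names are invented for illustration and do not come from the guidance:

    from dataclasses import dataclass, fields

    @dataclass
    class AutomatedDecisionNotice:
        statement_of_use: str       # plain statement that automated decision making is used
        logic_description: str      # data sources and main characteristics of the process
        significance_and_consequences: str  # what the decision means for the individual

    def is_complete(notice: AutomatedDecisionNotice) -> bool:
        """Check that every required element of the notice is actually filled in."""
        return all(getattr(notice, f.name).strip() for f in fields(notice))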

Safeguards and transparency

Individuals must be told when a decision has been taken solely using automated decision making and they must have the right to request a review of the decision. The review should be carried out by a person with appropriate authority and capacity to change the decision and should involve a thorough review of all relevant data and any additional information provided by the individual.

Organisations using automated decision making should also carry out regular reviews and use appropriate procedures to prevent errors.
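A hypothetical sketch of the review safeguard, assuming a contested decision is routed to a named reviewer (all names are invented for illustration):

    from dataclasses import dataclass

    @dataclass
    class Reviewer:
        has_authority_to_change_decision: bool

    def review_decision(reviewer: Reviewer, decision_data: dict,
                        info_from_individual: dict) -> dict:
        """Route a contested automated decision to meaningful human review."""
        if not reviewer.has_authority_to_change_decision:
            raise ValueError("Review must be carried out by someone with the "
                             "authority and capacity to change the decision")
        # Assemble the full evidence bundle the reviewer must consider: all
        # relevant data plus any additional information from the individual.
        return {**decision_data, **info_from_individual}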

Data Protection Impact Assessments

When considering using automated decision making and profiling, organisations should assess the risks using a data protection impact assessment (DPIA). Conducting a DPIA will help organisations show that appropriate measures have been put in place to mitigate those risks and help demonstrate compliance with the GDPR.

Organisations should also remember that any use of automated decision making and profiling must comply with the general principles in the GDPR in relation to fair and lawful processing and the requirement to provide individuals with a privacy notice. Such processing will also be subject to the general rights of individuals under the GDPR, including the right to object to certain types of processing (including direct marketing), the right to rectification and the right to erasure.

Where can I get more information?

The Article 29 Working Party's draft guidance on profiling can be downloaded from the Article 29 Working Party website. The draft guidance is open for comments until 28 November 2017.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.