Rights related to automated decision making including profiling


Jun 04, 2023


To comply with the UK GDPR...

☐ We have a lawful basis to carry out profiling and/or automated decision-making and document this in our data protection policy.

☐ We send individuals a link to our privacy statement when we have obtained their personal data indirectly.

☐ We explain how people can access details of the information we used to create their profile.

☐ We tell people who provide us with their personal data how they can object to profiling, including profiling for marketing purposes.

☐ We have procedures for customers to access the personal data input into their profiles, so they can review it and correct any accuracy issues.

☐ We have additional checks in place for our profiling/automated decision-making systems to protect any vulnerable groups (including children).

☐ We only collect the minimum amount of data needed and have a clear retention policy for the profiles we create.

As a model of best practice...

☐ We carry out a DPIA to consider and address the risks before we start any new automated decision-making or profiling.

☐ We tell our customers about the profiling and automated decision-making we carry out, what information we use to create the profiles and where we get this information from.

☐ We use anonymised data in our profiling activities.

To comply with the UK GDPR...

☐ We carry out a DPIA to identify the risks to individuals, show how we are going to deal with them and what measures we have in place to meet UK GDPR requirements.

☐ We carry out processing under Article 22(1) for contractual purposes and we can demonstrate why it's necessary.

OR

☐ We carry out processing under Article 22(1) because we have the individual's explicit consent recorded. We can show when and how we obtained consent. We tell individuals how they can withdraw consent and have a simple way for them to do this.

OR

☐ We carry out processing under Article 22(1) because we are authorised or required to do so. This is the most appropriate way to achieve our aims.

☐ We don't use special category data in our automated decision-making systems unless we have a lawful basis to do so, and we can demonstrate what that basis is. We delete any special category data accidentally created.

☐ We explain that we use automated decision-making processes, including profiling. We explain what information we use, why we use it and what the effects might be.

☐ We have a simple way for people to ask us to reconsider an automated decision.

☐ We have identified staff in our organisation who are authorised to carry out reviews and change decisions.

☐ We regularly check our systems for accuracy and bias and feed any changes back into the design process.

As a model of best practice...

☐ We use visuals to explain what information we collect/use and why this is relevant to the process.

☐ We have signed up to [standard], a set of ethical principles, to build trust with our customers. These principles are available on our website and on paper.

Automated individual decision-making is a decision made by automated means without any human involvement.

Examples of this include an online decision to award a loan, or an aptitude test used for recruitment which relies on pre-programmed algorithms and criteria.

Automated individual decision-making does not have to involve profiling, although it often will.

The UK GDPR says that profiling is:

"Any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements."

[Article 4(4)]

Organisations obtain personal information about individuals from a variety of different sources. Internet searches, buying habits, lifestyle and behaviour data gathered from mobile phones, social networks, video surveillance systems and the Internet of Things are examples of the types of data organisations might collect.

Information is analysed to classify people into different groups or sectors, using algorithms and machine-learning. This analysis identifies links between different behaviours and characteristics to create profiles for individuals. There is more information about algorithms and machine-learning in our paper on big data, artificial intelligence, machine learning and data protection.

Based on the traits of others who appear similar, organisations use profiling to find out about individuals' preferences, predict their behaviour and make decisions about them.

This can be very useful for organisations and individuals in many sectors, including healthcare, education, financial services and marketing.

Automated individual decision-making and profiling can lead to quicker and more consistent decisions. But if they are used irresponsibly there are significant risks for individuals. The UK GDPR provisions are designed to address these risks.

The UK GDPR restricts you from making solely automated decisions, including those based on profiling, that have a legal or similarly significant effect on individuals.

"The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her."

[Article 22(1)]

For something to be solely automated there must be no human involvement in the decision-making process.

The restriction only covers solely automated individual decision-making that produces legal or similarly significant effects. These types of effect are not defined in the UK GDPR, but the decision must have a serious impact on an individual to be caught by this provision.

A legal effect is something that affects someone's legal rights. Similarly significant effects are more difficult to define, but would include, for example, the automatic refusal of an online credit application or e-recruiting practices without human intervention.

Solely automated individual decision-making - including profiling - with legal or similarly significant effects is restricted, although this restriction can be lifted in certain circumstances.

You can only carry out solely automated decision-making with legal or similarly significant effects if the decision is: necessary for entering into or performing a contract between you and the individual; authorised by law that applies to you; or based on the individual's explicit consent.

If you’re using special category personal data, you can only carry out processing described in Article 22(1) if you have the individual's explicit consent or the processing is necessary for reasons of substantial public interest.

Because this type of processing is considered to be high-risk the UK GDPR requires you to carry out a Data Protection Impact Assessment (DPIA) to show that you have identified and assessed what those risks are and how you will address them.

As well as restricting the circumstances in which you can carry out solely automated individual decision-making (as described in Article 22(1)), the UK GDPR also requires you to give individuals specific information about the processing and obliges you to adopt suitable measures to safeguard their rights, freedoms and legitimate interests.

These provisions are designed to increase individuals’ understanding of how you might be using their personal data.

You must: give individuals information about the processing; introduce simple ways for them to request human intervention or challenge a decision; and carry out regular checks to make sure your systems are working as intended.

Article 22 applies to solely automated individual decision-making, including profiling, with legal or similarly significant effects.

If your processing does not match this definition then you can continue to carry out profiling and automated decision-making.

But you must still comply with the UK GDPR principles.

You must identify and record your lawful basis for the processing.

You need to have processes in place so people can exercise their rights.

Individuals have a right to object to profiling in certain circumstances. You must bring details of this right specifically to their attention.


In more detail – European Data Protection Board

The European Data Protection Board (EDPB), which has replaced the Article 29 Working Party (WP29), includes representatives from the data protection authorities of each EU member state. It adopts guidelines for complying with the requirements of the EU version of the GDPR.

WP29 has adopted guidelines on Automated individual decision-making and Profiling, which have been endorsed by the EDPB.

Other relevant guidelines published by WP29 and endorsed by the EDPB include:

WP29 guidelines on Data Protection Impact Assessment

Further reading – ICO guidance

The Accountability Framework looks at the ICO's expectations in relation to rights related to automated decision making including profiling.
