GDPR Brief: automated decision-making and profiling

7 Jan 2019

Individual opportunity increasingly depends on automated decisions made by companies and (prospective) employers. Any automated decision-making, including profiling, is subject to the usual requirements of the GDPR. Special attention should be paid to (a) the selection and use of software, (b) transparency, and (c) the qualified prohibition on significant decisions based solely on automated processing.

(a) Selection and Use of Software – Data Protection by Design and Default

Software developers should design products that support the fulfilment of data protection obligations, including data subject rights. Controllers must implement appropriate and effective technical measures. A Data Protection Impact Assessment may be required before new technologies are introduced into automated decision-making operations, particularly where profiling is involved.

(b) Transparency – Right to be Informed and to Obtain Information

A data subject must be informed of the existence of automated decision-making, including profiling, and given “meaningful information” about the algorithmic logic involved, as well as its significance and envisaged consequences, at least where the processing relates to decisions based solely on automated processing.

The responsibility is (normally) to provide this information at the time personal data is collected. This suggests that the information provided will relate to system functionality rather than to specific decisions. A data subject also has a right to obtain this information at any point. If it is requested after an algorithm has been applied, it may be possible for a controller to provide information about a specific decision, although the requirement still appears to be future-oriented.

(c) Qualified Prohibition on Significant Decisions Based Solely on Automated Processing

Decisions that have legal or similarly significant effects (on a data subject) should ordinarily not be based solely on automated processing: human intervention should be present. Exceptions exist where the decision is (i) necessary for a contract, (ii) made with the data subject's explicit consent, or (iii) otherwise authorised by law with suitable safeguards. In cases (i) and (ii), the data subject still has the right, at least, to obtain human intervention, to express his or her viewpoint, and to contest the decision. Solely automated decision-making (usually) ought not to concern a child.

Furthermore, special categories of personal data ought not to be used without explicit consent unless for reasons of substantial public interest, on a legal basis that is proportionate to the aim pursued, respects the essence of data protection, and provides appropriate safeguards. Such safeguards may include a right to obtain an explanation of the specific decision reached.

Mark Taylor is Associate Professor in Health Law and Regulation at Melbourne Law School.

Please note that GDPR Briefs neither constitute nor should be relied upon as legal advice. Briefs represent a consensus position among Forum Members regarding the current understanding of the GDPR and its implications for genomic and health-related research. As such, they are no substitute for legal advice from a licensed practitioner in your jurisdiction.