Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling

Michael Veale, Lilian Edwards

Research output: Contribution to journal › Article › peer-review



The Article 29 Data Protection Working Party's recent draft guidance on automated decision-making and profiling seeks to clarify European data protection (DP) law's little-used right to prevent automated decision-making, as well as the provisions around profiling more broadly, in the run-up to the General Data Protection Regulation. In this paper, we analyse these new guidelines in the context of recent scholarly debates and technological concerns. The guidelines foray into the less-trodden areas of bias and non-discrimination, the significance of advertising, the nature of "solely" automated decisions, impacts upon groups, and the inference of special categories of data, at times appearing more to be making or extending rules than to be interpreting them. At the same time, they provide only partial clarity, and perhaps even some extra confusion, around both the much-discussed "right to an explanation" and the apparent prohibition on significant automated decisions concerning children. The Working Party appears to feel less mandated to adjudicate in these conflicts between the recitals and the enacting articles than to explore altogether new avenues. Nevertheless, the directions it chooses to explore are particularly important ones for the future governance of machine learning and artificial intelligence in Europe and beyond.
Original language: English
Number of pages: 7
Journal: Computer Law and Security Review
Issue number: 2
Early online date: 10 Jan 2018
Publication status: Published - 30 Apr 2018


  • automated decision making
  • algorithmic decision making
  • right to an explanation
  • right of access
  • data protection regulation
