Explainable Aesthetics: Explainability and the Aesthetics of Explanation

Computation combined with artificial intelligence and machine learning has raised interesting questions about authorship, authenticity, post-human futures, creativity and AI-driven aesthetics. Many of these debates foreground the question of the human, whether as post-human technologies or as challenges to the privileged status of humans as intelligent, thinking or creative beings. However, with the recent Data Protection Act 2018, the enabling legislation in the UK for the GDPR (General Data Protection Regulation), a new right has been created in relation to automated algorithmic systems that requires the "controller" of the algorithm to supply an explanation to the user (or "data subject") of how a decision was made – the right to explanation.[1] This has come to be known as the problem of explainability. More particularly, under GDPR Article 22,
the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision (GDPR, Art. 22)
The GDPR has a number of interesting effects. Firstly, it defines a new kind of subject, the "data subject", to whom this right to explanation about an algorithm (amongst other data protection and privacy rights) has been given. The data subject is defined as follows,
relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person (GDPR Art. 4)
Secondly, it creates a particular legal definition of what processing through a computer algorithm is. In this case, processing,
means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction (GDPR Art. 4)
Brought together, this creates the notion of a "data subject" with a range of very specific and unique rights as a "natural person", which distinguishes that person from the artificial intelligence, machine-learning system or algorithm. Indeed, one might read this definition as a post-posthuman subjectivity, creating and reinforcing a boundary between human and machine. The full range of rights is:
Rights of the data subject 
Section 1 – Transparency and modalities
Article 12 – Transparent information, communication and modalities for the exercise of the rights of the data subject
Section 2 – Information and access to personal data
Article 13 – Information to be provided where personal data are collected from the data subject
Article 14 – Information to be provided where personal data have not been obtained from the data subject
Article 15 – Right of access by the data subject
Section 3 – Rectification and erasure
Article 16 – Right to rectification
Article 17 – Right to erasure (‘right to be forgotten’)
Article 18 – Right to restriction of processing
Article 19 – Notification obligation regarding rectification or erasure of personal data or restriction of processing
Article 20 – Right to data portability
Section 4 – Right to object and automated individual decision-making
Article 21 – Right to object
Article 22 – Automated individual decision-making, including profiling
Section 5 – Restrictions
Article 23 – Restrictions (GDPR, Chapter III)
The non-binding GDPR Recital 71 is the most interesting in regard to the notion of explainability. It states,
the data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention... such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision (GDPR Recital 71)
Whilst a recital is non-binding, European Court of Justice (ECJ) jurisprudence reveals that the role of recitals is "to dissolve ambiguity in the operative text of a framework"; it therefore provides a critical reference point for future interpretations (Casey et al 2018: 17). This is where the notion of explanation as a derivation from automated algorithmic systems is largely given as a requirement, and in effect it requires a deconstruction of the processes of computation, including the value-specific and calculative model that was used to perform the processing. But there are difficult questions in relation to this requirement, indeed,
this might be impossible, even for systems that seem relatively simple on the surface, such as the apps and websites that use deep learning to serve ads or recommend songs. The computers that run those services have programmed themselves, and they have done it in ways we cannot understand. Even the engineers who build these apps cannot fully explain their behavior (Knight 2017).
The major reason for these requirements is the growing concern over biases, whether intentional or not, being built into an algorithmic or machine-learning system, reflected in an "anxiety felt by those who fear the potential for bias to infiltrate machine decision-making systems once humans are removed from the equation" (Casey et al 2018: 4). So this new "right to explanation" has been mobilised as an attempt to mitigate these worries, but also to put in place legislative means to seek redress for them. As Casey et al (2018) have explained,
the “complexity of machine-learning” algorithms used in such systems “can make it challenging to understand how an automated decision-making process or profiling works.” But such complexity, it insisted, “is no excuse for failing to provide information” to data subjects... companies making automated decisions which fall under Article 22(1) “should find simple ways to tell the data subject about the rationale behind, or the criteria relied on in reaching the decision”—albeit “without necessarily always attempting a complex explanation of the algorithms used or [a] disclosure of the full algorithm” (Casey et al 2018: 30)
Indeed, they further argue that this does not necessarily mean that the algorithm as such need be provided, nor the processing steps detailed; rather, this is a representational issue. The processing needs only to be presented as a simplified model or explanation that shows the general contours of the algorithm used in a particular case,
The “right to explanation” may not require that companies pry open their “black boxes” per se, but it does require that they evaluate the interests of relevant stakeholders, understand how their systems process data, and establish policies for documenting and justifying key design features throughout a system’s life cycle (Casey et al 2018: 39)
So the "GDPR provides an unambiguous 'right to explanation' with sweeping legal implications for the design, prototyping, field testing, and deployment of automated data processing systems. Failing to countenance this right could subject enterprises to economic sanctions of truly historic magnitudes—a threat that simply did not exist under the GDPR’s predecessor" (Casey et al 2018: 49).
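The kind of simplified, criteria-based explanation Casey et al describe – telling the data subject "the rationale behind, or the criteria relied on in reaching the decision" without disclosing the full algorithm – can be given a minimal sketch in code. The following is purely illustrative: a hypothetical linear credit-scoring model (the weights, baselines and feature names are invented for the example) whose per-feature contributions are ranked and converted into plain-language "reasons" for the data subject.

```python
# Illustrative sketch only: a hypothetical linear scoring model whose
# decision is summarised as plain-language reasons for a data subject,
# rather than by disclosing the full algorithm or its code.

# Hypothetical model parameters (assumptions, not real credit criteria).
WEIGHTS = {"income": 0.4, "existing_debt": -0.5, "years_employed": 0.3}
BASELINE = {"income": 30000, "existing_debt": 10000, "years_employed": 5}
SCALE = {"income": 10000, "existing_debt": 5000, "years_employed": 2}
THRESHOLD = 0.0


def score(applicant):
    """Stand-in for the automated decision: a weighted sum of features."""
    return sum(
        WEIGHTS[f] * (applicant[f] - BASELINE[f]) / SCALE[f]
        for f in WEIGHTS
    )


def explain(applicant, top_n=2):
    """Return the decision plus the features that most influenced it."""
    contributions = {
        f: WEIGHTS[f] * (applicant[f] - BASELINE[f]) / SCALE[f]
        for f in WEIGHTS
    }
    decision = "approved" if score(applicant) >= THRESHOLD else "refused"
    # Rank features by the magnitude of their contribution to the score.
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    reasons = [
        f"{feature} {'raised' if value > 0 else 'lowered'} your score"
        for feature, value in ranked[:top_n]
    ]
    return decision, reasons


decision, reasons = explain(
    {"income": 25000, "existing_debt": 20000, "years_employed": 1}
)
print(decision, reasons)
```

Note that this is exactly the representational move described above: the data subject receives a ranked summary of criteria, not the "full algorithm", and for a genuinely opaque deep-learning system such contributions would themselves have to be approximated by a surrogate model.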

So what are the implications for aesthetic works in this case?

Screenshot from Emissary in the Squat of Gods, Ian Cheng 2015.
Live simulation and story, infinite duration, sound. 
The discussion I wish to open is largely speculative. It seems to me that we have two issues that are interesting to consider. Firstly, the GDPR might require algorithmic artworks to have, or to be, explainable aesthetics, and therefore be subject to the same data protection regime as other algorithms. This may mean they are required to provide their processing descriptions under this "right to explanation". So, for example, it would be interesting to raise a request for explanation in relation to Ian Cheng's work on algorithmically structured visual environments (Steyerl and Cheng 2017). What would such an explanation consist in? How would it be presented, and what would be the relationship of this explanation to the installation as an object of art?

Secondly, there is the question of the aesthetics of explanation, in as much as the explanations will not necessarily need to be code-based or detailed descriptions of the algorithm; instead they may well be representations – stereotyped, designed and visually engaging – that mediate the underlying algorithm and its processing. It seems to me that for the average "data subject" a highly complex mathematical explanation will be useless as an explanatory device, so the kinds of explanatory models being calculated from a system will need to be mediated through aesthetics (such as in the user interface, or through a video-essay of some kind).[2]
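To make this point concrete, even the crudest aesthetic mediation changes what an explanation is. The sketch below is hypothetical: it takes invented per-feature contribution values from some model and renders them as a signed text "bar chart", substituting visual proportion for the raw coefficients a data subject would otherwise be given.

```python
# Illustrative sketch: mediating a numerical model explanation for a lay
# data subject as a simple text bar chart rather than raw coefficients.
# The contribution values passed in below are hypothetical.

def render_explanation(contributions, width=20):
    """Draw each feature's contribution as a signed bar of '#' marks."""
    biggest = max(abs(v) for v in contributions.values())
    lines = []
    # Largest influences first, mirroring how a designed interface
    # would foreground the most salient reasons.
    for feature, value in sorted(contributions.items(),
                                 key=lambda kv: -abs(kv[1])):
        bar = "#" * round(width * abs(value) / biggest)
        sign = "+" if value >= 0 else "-"
        lines.append(f"{feature:<15} {sign} {bar}")
    return "\n".join(lines)


print(render_explanation({
    "existing_debt": -1.0,
    "years_employed": -0.6,
    "income": -0.2,
}))
```

Every choice here – ordering, scaling, the glyph used for the bars – is already an aesthetic decision about how the algorithm is represented to its subject, which is precisely what footnote 2 gestures towards.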

As we see the roll-out of a greater number of projects using AI or machine learning, we might see other artists' and individuals' interventions, not as algorithms in and of themselves, but as a kind of algorithmic critique (or "tests") (see Berry 2015: 65) that seek to force prior artworks to explain themselves. I should also imagine that more attention will start to be paid to the explanations that are generated and the way in which they are structured, the information they provide and the aesthetic language they deploy.

This post was prompted by a paper by Joanna Zylinska (Goldsmiths), "On Creative Computers, Art Robots and AI Dreams", given at Intelligent Futures: AI, Automation and Cognitive Ecologies, organised at the University of Sussex, 1-2 Oct 2018.


1. A "controller" is defined as "the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law" (GDPR Art. 4)
2. There are also very interesting implications for the notion of the "truth" of an algorithm and what constitutes an accurate or correct representation of it. 


Berry, D. M. (2015) The Philosophy of Software, London: Palgrave.

Casey, B., Farhangi, A. and Vogl, R. (2018) Rethinking Explainable Machines: The GDPR's 'Right to Explanation' Debate and the Rise of Algorithmic Audits in Enterprise, Berkeley Technology Law Journal, forthcoming. Available at SSRN: https://ssrn.com/abstract=3143325

Steyerl, H. and Cheng, I. (2017) Simulated Subjects: Glass Bead in conversation with Ian Cheng and Hito Steyerl, Glass Bead, http://www.glass-bead.org/article/simulated-subjects/

Knight, W. (2017) The Dark Secret at the Heart of AI, Technology Review, https://www.technologyreview.com/s/604087/the-dark-secret-at-the-heart-of-ai/