The Antinomies of Computation
AntiSurveillance Feminist Poet Hair & Makeup Party (photo by emilyraw.com)
Computation makes the collection of data relatively easy, increasing visibility through what Rey Chow (2012) calls "capture". Software enables more effective systems of surveillance and hence new capture systems. As Foucault argues, "full lighting and the eye of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap" (Foucault 1991: 200). The question is also linked to who is made visible in these kinds of systems, especially where, as feminist theorists have shown, visibility itself can be a gendered concept and practice, demonstrated for example in the historical invisibility of women in the public sphere. Here we might also reflect on the way in which the practice of making-visible also entails a making-invisible: computation involves making choices about what is to be captured. Zach Blas's work, for example, is helpful in showing the various forms of race, gender and class-based exclusion built into computational and biometric systems (Magdaleno 2014).
The challenge for critical theory today lies in conceptualising how to resist the totalising computational gaze that seeks to render all aspects of social and political life transparent and calculable. This requires careful thought about creating spaces that remain opaque to algorithmic surveillance while still enabling democratic politics to flourish. The metaphor of the black box offers a productive way to think through this problem. In technical discourse, a black box is a system whose internal workings stay hidden even as its external interfaces remain readable. This tension between visibility and opacity suggests possibilities for political spaces that retain their internal autonomy while participating in broader democratic life.
The creation of what could be termed opaque zones of political experimentation becomes crucial. These would be spaces where democratic deliberation and formation can occur without being subject to complete computational visibility and control. Such zones would need to operate at the edges of algorithmic legibility while remaining fully political in nature. They must be open and inclusive spaces for democratic debate and organisation, yet structured to resist the drive towards total transparency that characterises computational capitalism.
Following Rossiter and Zehle's (2014) argument for a politics of anonymity, we can understand how algorithmic mediation itself could paradoxically create planes of opacity that enable political action. The key lies in appropriating computational infrastructures to generate spaces of relative autonomy. These would not be spaces of complete darkness or full withdrawal, but rather zones where the political can emerge through selective revelation and concealment. This suggests the need for new practices of computational opacity that work against the logics of total visibility while preserving democratic openness.
This reconceptualisation of political space under computational conditions requires moving beyond simple binaries of transparency versus secrecy. Instead, it points toward cultivating what could be understood as graduated or differential opacity: spaces that remain selectively readable while maintaining crucial zones of algorithmic illegibility. Such spaces would enable the essential work of democratic politics to continue without succumbing to the totalising visibility that threatens to foreclose genuine political experimentation and formation. The challenge ahead lies in developing both theoretical frameworks and practical techniques for creating and maintaining these zones of opacity. This requires careful attention to the specific affordances of computational systems, seeking ways to turn their own logics toward the preservation of political autonomy.
The aim is not to generate spaces of paranoia or underground societies, but rather to cultivate conditional moments where the regime of computational visibility temporarily falters. This approach moves beyond individualistic notions of privacy towards the creation of collective spaces that enable critical reflection and political formation. Such spaces facilitate the development of theoretical and practical responses to computational power through what Goldberg (2014) terms 'open source theory', constantly reformulated through collective revision and reimagining.
These theoretical practices align with what CTI (2008) describes as 'poor theory', suggesting modes of thinking that remain provisional and adaptive rather than totalising. Through cryptographic practices and related technical interventions, it becomes possible to generate shadows within the supposedly transparent spaces of computational society. These shadows do not represent complete withdrawal or invisibility, but rather create zones where critical thought and political organisation can flourish.
A central paradox emerges in that attempts to create opacity may themselves attract heightened surveillance from state authorities and intelligence agencies. The very act of employing encryption or seeking to establish zones of reduced visibility often triggers increased attention from those systems of control that these practices seek to evade. This demonstrates the complex dialectic between opacity and visibility under contemporary conditions of computational capitalism.
This paradox should not, however, lead to abandonment of the project of creating opaque spaces for critical reflection and political formation. Rather, it points toward the need for more sophisticated theoretical and practical approaches that can work within and against the logics of computational visibility. These approaches must remain attentive to the ways in which attempts at opacity may themselves generate new forms of visibility, while still pursuing the essential task of creating space for genuine political thought and action. The cultivation of such spaces requires careful attention to both technical practices and theoretical frameworks. It demands engagement with cryptographic tools and techniques through crypto practices, while remaining mindful of their limitations and of the ways they may be recuperated by systems of control.
CV Dazzle Project by Adam Harvey
These artworks point towards the notion of "opaque presence" explored by Broeckmann (2010), who argues that in "the society of late capitalism – whether we understand it as a society of consumption, of control, or as a cybernetic society – visibility and transparency are no longer signs of democratic openness, but rather of administrative availability" (Broeckmann 2010). It is also suggestively explored by the poet Édouard Glissant, who believes that we should "agree not merely to the right to difference but, carrying this further, agree also to the right to opacity that is not enclosure within an irreducible singularity. Opacities can coexist and converge, weaving fabrics" (Glissant 1997: 190).
So this is not just a technical (e.g. cryptographic) practice. Indeed, crypto practices have to be rethought to operate on the terrain of the political and the technical simultaneously. Political activity, for example, is needed to legitimate these cryptographically enabled "dark places": within the system (to avoid paranoia and attack), with the public (to educate and inform about them), and with activists and others.
That is, we could think about these crypto-practices as (re)creating the possibility of being a crowd, both in terms of creating a sense of solidarity around the ends of a political/technical endeavour and in providing the means that act as a condition of possibility for it. Thus we could say, in a real sense, that computer code can act to create "crowd source", as it were, both in the technical sense of the computer source code and in the practices of coming together to empower actors within a crowd, connecting to notions of the public and the common. But these crypto-practices could also help individuals to "look to comprehend how things fit together, how structural conditions and cultural conceptions are mutually generative, reinforcing, and sustaining, or delimiting, contradictory, and constraining. [They] would strive to say difficult things overlooked or purposely ignored by conventional thinking, to speak critically about challenging matters, to identify critical and counter-interests" (Goldberg 2014).
In contrast, to think for a moment about the other side of the antinomy, liberal societies hold a notion of a common good of access to information to inform democratic citizens, whilst also seeking to valorise that information. That is, the principle of visibility is connected not only to the notion of seeing one's representatives and the mechanisms of politics themselves, but also to the knowledge that makes acting as a citizen possible.
Meanwhile, with the exploding quantity of information in society and the moves towards a digital economy, information is increasingly seen as a source of profit for capitalism if captured in an appropriate way. Indeed, data and information are said to be the new 'oil' of the digital age (e.g. Alan Greenspan 1971) (Berry 2008: 41, 56). This highlights both the political and the economic desire for data. At the same time, the digital generates quantities of data that are increasingly hard to contain within organisational boundaries.
One response to these computational changes in politics and the economy has been the kind of digital activism connected with whistleblowing and megaleaks: the release of massive amounts of data into the public sphere and the use of social media and the internet to distribute it. These practices act to take information out of the "black boxes" of corporations, governments and security services, placing information about their mechanisms, practices and machinations in the public domain. They seek to counter the opaqueness of the organisational form, making use of the copyable nature of digital materials.
However, as megaleaks place raw data into the public sphere – usually as files and spreadsheets of data – there is a growing problem of being able to read and comprehend it, and hence a growing need for journalists to become data journalists. Ironically, then, "opening the databanks" (Berry 2014: 178; Lyotard 1984: 67) creates a new form of opaqueness. Computational strategies are needed to read these new materials (e.g. algorithmic distant readings). Connected to the problem of information overload is the fact that this mechanism can also be harnessed by states seeking to attack and delegitimise megaleaks through counter-leaking. Additionally, in some senses the practices of Wikileaks are connected to creating an informational overload within organisations, both in terms of their inability to cope with the release of their data and in the requirement to close communication channels within the organisation. So information overload can become a political tactic of both control and resistance.
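To give a concrete, if deliberately simple, sense of what such an algorithmic distant reading might involve, the following sketch counts the most frequent terms across a directory of released documents. It is a minimal illustration only, written in Python using the standard library; the leaked_documents/ folder, the plain-text file layout and the stopword list are hypothetical examples, and real data journalism would involve far more careful cleaning, context and interpretation.

# Minimal sketch of an algorithmic "distant reading": term frequencies
# across a corpus of released plain-text documents.
# The directory name "leaked_documents" is a hypothetical example.
import re
from collections import Counter
from pathlib import Path

STOPWORDS = {"the", "and", "of", "to", "a", "in", "that", "is", "for", "on"}

def term_frequencies(corpus_dir, top_n=20):
    """Return the top_n most frequent terms across all .txt files in corpus_dir."""
    counts = Counter()
    for path in Path(corpus_dir).rglob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        tokens = re.findall(r"[a-z]{3,}", text)  # crude tokenisation: words of 3+ letters
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(top_n)

if __name__ == "__main__":
    for term, count in term_frequencies("leaked_documents"):
        print(term, count)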
But what is at stake here is not just the relationship between visibility and incarceration, nor the deterritorialisation and becoming-mobile made possible by computation. Rather, it is the collapse of the "time lag between the world and its capture" (Chow 2012): capture becomes real-time through softwarized monitoring technologies, with the mediation of "police" functions and control that this implies.
The question then becomes: what social force is able to realise the critique of computational society while also blocking the real-time nature of computational monitoring? What practices become relevant when monitoring and capture are not only prevalent but actively engaged in? Tentatively, I would like to suggest embedding critical cryptographic practices within what Lovink and Rossiter (2013) call OrgNets (organised networks).
Antisurveillance Feminist Party (photo by emilyraw.com)
Notes
[1] This post is drawn from a talk given at Digital Activism #Now: Information Politics, Digital Culture and Global Protest Movements, at King's College London (KCL), 04/04/14. See http://www.kcl.ac.uk/aboutkings/worldwide/initiatives/global/nas/news-and-events/events/eventrecords/Digital-Activism-Now-Information-Politics,-Digital-Culture-and-Global-Protest-Movements.aspx
Bibliography
Berry, D. M. (2008) Copy, Rip, Burn: The Politics of Copyleft and Open Source, London: Pluto Press.
Berry, D. M. (2011) The Philosophy of Software, London: Palgrave.
Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.
Broeckmann, A. (2010) Opaque Presence / Manual of Latent Invisibilities, Berlin: Diaphanes Verlag.
Chow, R. (2012) Entanglements, or Transmedial Thinking about Capture, London: Duke University Press.
CTI (2008) Poor Theory Notes: Toward a Manifesto, Critical Theory Institute, accessed 14/04/2014, https://www.humanities.uci.edu/critical/poortheory.pdf
Deleuze, G. (1992) Postscript on the Societies of Control, October, vol. 59, pp. 3-7. Available at https://files.nyu.edu/dnm232/public/deleuze_postcript.pdf
Foucault, M. (1991) Discipline and Punish, London: Penguin Social Sciences.
Glissant, É. (1997) Poetics of Relation, Ann Arbor: University of Michigan Press.
Goldberg, D. T. (2014) Afterlife of the Humanities, accessed 14/04/2014, http://humafterlife.uchri.org
Harvey, A. (2014) Stealth Wear, accessed 04/04/2014, http://ahprojects.com/projects/stealth-wear/
Lovink, G. and Rossiter, N. (2013) Organised Networks: Weak Ties to Strong Links, The Occupied Times, accessed 04/04/2014, http://theoccupiedtimes.org/?p=12358
Lyotard, J. F. (1984) The Postmodern Condition: A Report on Knowledge, Manchester: Manchester University Press.
Magdaleno, J. (2014) Is Facial Recognition Technology Racist?, The Creators Project, accessed 05/04/2014, http://thecreatorsproject.vice.com/blog/is-facial-recognition-technology-racist
Oliver, J. (2014) Julian Oliver, accessed 05/04/2014, http://julianoliver.com/output/
Rossiter, N. and Zehle, S. (2014) Toward a Politics of Anonymity: Algorithmic Actors in the Constitution of Collective Agency and the implications for Global Justice Movements, in Parker, M., Cheney, G., Fournier, V. and Land, C. (eds.) The Routledge Companion to Alternative Organization, London: Routledge.