|AntiSurveillance Feminist Poet Hair & Makeup Party|
Computation makes the collection of data relatively easy, increasing visibility through what Rey Chow (2012) calls “capture”. Software enables more effective systems of surveillance and hence new capture systems. As Foucault argues, “full lighting and the eye of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap” (Foucault 1991: 200). The question is also linked to who is made visible in these kinds of systems, especially where, as feminist theorists have shown, visibility itself can be a gendered concept and practice, as demonstrated, for example, in the historical invisibility of women in the public sphere. Here we might also reflect on the way in which the practice of making-visible also entails the making-invisible: computation involves making choices about what is to be captured. Zach Blas's work, for example, is helpful in showing the various forms of race-, gender- and class-based exclusion in computational and biometric systems (Magdaleno 2014).
The question then becomes: how to “darken” visibility and so prevent the totalising, full top-view that computational society makes possible? Using the metaphor of “black boxes” – the technical notion of objects whose internal states are opaque or impossible to read but whose surfaces are readable – how can we think about spaces that paradoxically enable democracy and the political, whilst limiting the reading of the internal processes of political experimentation and formation? Thus, how to create the conditions of possibility for “opaque places” working on the edges, or at the limits, of legibility? These we might call opaque temporary autonomous zones, which seek to enable democratic deliberation and debate. They should be fully political spaces, open and inclusive, but nonetheless opaque to the kinds of visibility that computation makes possible. As Rossiter and Zehle (2014) argue, we need to move towards a "politics of anonymity", part of which is an acknowledgement of the way in which the mediation of algorithms could operate as a plane of opacity for various actors.
It is important to note that this is not to create paranoid spaces or secret societies, but conditional and temporary moments – glitches in the regime of computational visibility. The idea is not to recreate notions of individual privacy as such, but rather collective spaces of critical reflection for practices of creating a political response. That is, to draw on theory and "un-theory" as a way of proceeding theoretically, as "an open source theory [and practice] in constant reformulation from multiple re-visions and remixings" (Goldberg 2014), what CTI (2008) calls "poor theory". Indeed, crypto practices can create shadows in plain sight, thus tipping the balance away from systems of surveillance and control. Of course, paradoxically, these opaque spaces may themselves draw the attention of state authorities and the intelligence community, who monitor the use of encryption and cryptography – demonstrating again the paradox of opacity and visibility.
|CV Dazzle Project by Adam Harvey|
These artworks point towards the notion of "opaque presence" explored by Broeckmann (2010), who argues that in "the society of late capitalism – whether we understand it as a society of consumption, of control, or as a cybernetic society – visibility and transparency are no longer signs of democratic openness, but rather of administrative availability" (Broeckmann 2010). It is also suggestively explored by the poet Edouard Glissant, who believes that we should "agree not merely to the right to difference but, carrying this further, agree also to the right to opacity that is not enclosure within an irreducible singularity. Opacities can coexist and converge, weaving fabrics" (Glissant 1997: 190).
So this is not just a technical (e.g. cryptographic) practice. Indeed, crypto practices have to be rethought to operate on the terrain of the political and the technical simultaneously. Political activity, for example, is needed to legitimate these cryptographically enabled “dark places”: with the system (to avoid paranoia and attack), with the public (to educate and inform about them), and with activists and others.
That is, we could think about these crypto-practices as (re)creating the possibility of being a crowd, both in terms of creating a sense of solidarity around the ends of a political/technical endeavour and the means that act as a condition of possibility for it. Thus we could say, in a real sense, that computer code can act to create a “crowd source”, as it were: both in the technical sense of computer source code, and in the practices of coming together to empower actors within a crowd, connecting to notions of the public and the common. But these crypto-practices could also help individuals to "look to comprehend how things fit together, how structural conditions and cultural conceptions are mutually generative, reinforcing, and sustaining, or delimiting, contradictory, and constraining. [They] would strive to say difficult things overlooked or purposely ignored by conventional thinking, to speak critically about challenging matters, to identify critical and counter-interests" (Goldberg 2014).
In contrast, thinking for a moment about the other side of the antinomy: liberal societies hold a notion of a common good in the access to information that informs democratic citizens, whilst also seeking to valorise it. That is, the principle of visibility is connected not only to the notion of seeing one's representatives and the mechanisms of politics themselves, but also to the knowledge that makes the condition of acting as a citizen possible.
Meanwhile, with the exploding quantity of information in society and the move towards a digital economy, information is increasingly seen as a source of profit for capitalism, if captured in an appropriate way. Indeed, data and information are said to be the new ‘oil’ of the digital age (e.g. Alan Greenspan 1971) (Berry 2008: 41, 56). This highlights both the political and economic desire for data. At the same time, the digital generates growing quantities of data that are increasingly hard to contain within organisational boundaries.
One response to computational changes in politics and the economy has been the kind of digital activism connected with whistleblowing and megaleaks, that is, the release of massive amounts of data into the public sphere and the use of social media and the internet to distribute it. These practices act to take information out of the "black boxes" of corporations, governments and security services and to place information in the public domain about their mechanisms, practices and machinations. They seek, then, to counter the opaqueness of the organisational form, making use of the copyable nature of digital materials.
However, as megaleaks place raw data into the public sphere – usually as files and spreadsheets of data – there is a growing problem of being able to read and comprehend it, and hence a growing need for journalists to become data journalists. Ironically, then, “opening the databanks” (Berry 2014: 178, Lyotard 1984: 67) creates a new form of opaqueness. Computational strategies are needed to read these new materials (e.g. algorithmic distant readings). Attached to the problem of information overload is the fact that this mechanism can also be harnessed by states seeking to attack megaleaks by counter-leaking in order to delegitimise them. Additionally, in some senses the practices of Wikileaks are connected to creating informational overload within organisations, both in terms of their inability to cope with the release of their data and in terms of the requirement to close communicational channels within the organisation. Information overload can thus become a political tactic both of control and of resistance.
But what is at stake here is not just the relationship between visibility and incarceration, nor the deterritorialisation and becoming-mobile made possible by computation. Rather, it is the collapse of the “time lag between the world and its capture” (Chow 2012) when capture becomes real-time through softwarized monitoring technologies, and the mediation of “police” functions and control that this implies.
The question then becomes: what social force is able to realise the critique of computational society and also to block the real-time nature of computational monitoring? What practices become relevant when monitoring and capture become not only prevalent but actively engaged in? Tentatively, I would like to suggest embedding critical cryptographic practices in what Lovink and Rossiter (2013) call OrgNets (organised networks).
|Antisurveillance Feminist Party|
This post is drawn from a talk given at Digital Activism #Now: Information Politics, Digital Culture and Global Protest Movements, at King's College London (KCL), 04/04/14. See http://www.kcl.ac.uk/aboutkings/worldwide/initiatives/global/nas/news-and-events/events/eventrecords/Digital-Activism-Now-Information-Politics,-Digital-Culture-and-Global-Protest-Movements.aspx
Berry, D. M. (2008) Copy, Rip, Burn: The Politics of Copyleft and Open Source, London: Pluto Press.
Berry, D. M. (2011) The Philosophy of Software, London: Palgrave.
Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.
Broeckmann, A. (2010) Opaque Presence / Manual of Latent Invisibilities, Berlin: Diaphanes Verlag.
Chow, R. (2012) Entanglements, or Transmedial Thinking about Capture, London: Duke University Press.
CTI (2008) Poor Theory Notes: Toward a Manifesto, Critical Theory Institute, accessed 14/4/2014, https://www.humanities.uci.edu/critical/poortheory.pdf
Deleuze, G. (1992) Postscript on the Societies of Control, October, vol. 59, pp. 3-7. Available at https://files.nyu.edu/dnm232/public/deleuze_postcript.pdf
Foucault, M. (1991) Discipline and Punish, London: Penguin Social Sciences.
Glissant, E. (1997) The Poetics of Relation, Michigan: The University of Michigan Press.
Goldberg, D. T. (2014) Afterlife of the Humanities, accessed 14/04/2014, http://humafterlife.uchri.org
Harvey, A. (2014) Stealth Wear, accessed 04/04/2014, http://ahprojects.com/projects/stealth-wear/
Lovink, G. and Rossiter, N. (2013) Organised Networks: Weak Ties to Strong Links, Occupy Times, accessed 04/04/2014, http://theoccupiedtimes.org/?p=12358
Lyotard, J. F. (1984) The Postmodern Condition: A Report on Knowledge, Manchester: Manchester University Press.
Magdaleno, J. (2014) Is Facial Recognition Technology Racist?, The Creators Project, accessed 05/04/2014, http://thecreatorsproject.vice.com/blog/is-facial-recognition-technology-racist
Oliver, J. (2014) Julian Oliver, accessed 05/04/2014, http://julianoliver.com/output/
Rossiter, N. and Zehle, S. (2014) Toward a Politics of Anonymity: Algorithmic Actors in the Constitution of Collective Agency and the implications for Global Justice Movements, in Parker, M., Cheney, G., Fournier, V. and Land, C. (eds.) The Routledge Companion to Alternative Organization, London: Routledge.