API: Authoritarian Programming Interface

David M. Berry

DALL-E inspired dashboard. Unfortunately DALL-E cannot spell "impound".
In this article, I explore the current unprecedented attempt to gain control over the U.S. federal government's payment systems through direct code manipulation and how it represents a dangerous experiment in authoritarian control through computational means (see Tankus 2025). The aim is to understand why the Trump Administration's Department of Government Efficiency's (DOGE) access to legacy financial systems is so serious and why it potentially creates what I call an Authoritarian Programming Interface (API). Through an analysis of this case, I examine how computational infrastructures have become fundamental to attempts at state capture and how technical systems mediate new forms of political control. This is a fast-changing situation and I will be updating this post as new information becomes available. Here, I aim to explain how these systems have developed in the way they have and how they create an unprecedented and unplanned concentration of power inside computer code.[1]

This is a speculative article and as such I will be covering a lot of material relatively quickly. I aim to outline how traditional programming interfaces might be transformed into instruments of authoritarian control and how this signals a qualitative shift in how computational systems might be used to structure governance. I argue that what looks like an attempted creation of new control points, by changing the computer code within federal payment systems, reveals broader transformations in how technical architectures enable or constrain political power. Through an examination of the technical mechanisms and political implications of these new computational interfaces, I attempt to show how control over critical infrastructure might become central to contemporary forms of political sovereignty. As Glasser (2025) notes, quoting an unnamed Republican,

“Elon figured out that the personnel, information-technology backbone of the government was essentially the twenty-first-century equivalent of the nineteen-fifties television tower in the Third World”... and “that you could take over the government essentially with a handful of people if you could access all that.” My friend, incidentally, chose to speak on background despite his years of public criticism of Trump, noting that a think tank with which he is affiliated receives government contracts. Fear, in this revolution, as in all revolutions, is perhaps the most effective weapon of all (Glasser 2025).

This stark assessment demonstrates how computational infrastructures have become central to government functioning. Access to technical systems potentially enables an unprecedented consolidation of power through the medium of computation. The metaphor of the television tower is shocking not just because it captures how control over computer systems represents a new vector for political control, but also because it shows how we have sleepwalked as a society into creating chokepoints that can be manipulated. Indeed, the privileged access to the federal payment systems suggests a troubling situation where computational architectures might be weaponised to bypass traditional democratic safeguards through direct manipulation of technical infrastructure.

In the first section, I examine how access to computer code and privileged access rights to programming interfaces function as mechanisms of control, showing how they structure and mediate power relations through technical standardisation. Next, I look at the potential material resistance offered by legacy computer systems against attempts at authoritarian capture. Finally, I try to unravel the infrastructural stakes of this struggle over computational control, arguing that the future of democracy might increasingly depend on protecting critical computer systems in order to prevent them from becoming instruments of authoritarian power.

Programming Interfaces and Control

The technical architecture of computational systems is increasingly seen as creating the possibilities for political control and through the standardisation and formalisation of programmatic access new forms of power can potentially be unlocked. In computing, an Application Programming Interface (API) is a form of design that provides a set of definitions, protocols and tools that allow different software applications to communicate with each other. APIs can be understood as creating "hooks" and "access points" into computer systems that enable programmers to call functions to direct and control how computer systems work – they provide a simplified but extremely powerful abstraction of the underlying computer system. To understand how APIs have developed into potential instruments of control, we need to examine their historical development.
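To make the idea of an API as a set of "hooks" and "access points" concrete, the following is a purely illustrative Python sketch, not drawn from any real payment system: a small interface exposes a few stable functions through which callers direct the system, while the underlying state stays hidden behind the abstraction. All names here (`PaymentSystemAPI`, `schedule_payment`) are hypothetical.

```python
class PaymentSystemAPI:
    """Hypothetical abstraction over a complex underlying ledger."""

    def __init__(self):
        # Internal state that callers never touch directly: the API
        # mediates all access to it.
        self._ledger = {}

    def schedule_payment(self, payee: str, amount: int) -> str:
        """An access point: callers direct the system via this call alone."""
        ref = f"PAY-{len(self._ledger) + 1}"
        self._ledger[ref] = {"payee": payee, "amount": amount, "status": "scheduled"}
        return ref

    def status(self, ref: str) -> str:
        """Query the state of a payment through the interface."""
        return self._ledger[ref]["status"]


api = PaymentSystemAPI()
ref = api.schedule_payment("Agency A", 100)
print(ref, api.status(ref))  # PAY-1 scheduled
```

The political point follows from the design: whoever controls this thin layer of functions controls everything beneath it, which is why privileged access to such interfaces matters so much.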

The historical development of APIs reveals their increasing use in computational systems. Originally developed in the 1960s to deal with the complexity of operating systems such as Unix, APIs emerged as standardised protocols to enable programmatic access to system resources (Mahoney, 2008). Through a process of technical standardisation they were further developed through the growth of object-oriented programming in the 1980s, which emphasised the importance of stable interfaces for managing complex systems (Kay, 1993). The rise of web APIs in the 1990s and 2000s marked an important shift as APIs became mechanisms for controlling access to services over networks and remote data flows (Fielding and Taylor, 2002). Indeed, it is argued that Amazon and Google adopted APIs in a way that transformed them from purely technical protocols into instruments of platform power, using them to structure markets (e.g. online advertising) and control digital infrastructure (e.g. AWS) (Helmond, 2015). This brief genealogy of APIs helps us to understand how technical standardisation enables new forms of computational control, while also potentially creating the conditions for their later use as mechanisms of authoritarian power (Galloway, 2004). Indeed, this form of power was described by Deleuze (1992) as creating societies of control, where power operates through modulation and continuous monitoring rather than disciplinary forms of control. Algorithmic systems therefore create the possibility for what Rouvroy and Berns (2013: 163) term "algorithmic governmentality", where power operates through the automated processing of data rather than direct intervention. As Chun (2021: 89) argues, these systems discriminate through correlation and pattern-matching, creating what she terms "proxies" that mediate new forms of control.

These theoretical insights into algorithmic control take on a striking significance when we examine current developments. We should therefore be concerned at the Trump Administration's Department of Government Efficiency's attempt to gain access to the underlying source code of government computer systems (Stanley-Becker et al. 2025). By being able to change this code, we can perhaps start to see how changes at the level of source code could prefigure a very different use of APIs. Indeed, this attempt to change the code of the federal payment systems, reported by Tankus (2025) and others, potentially represents a radical intensification of control tendencies already present in the historical development of computational interfaces. We might say that it marks a movement from application programming interfaces towards authoritarian programming interfaces.

What makes the current situation at the U.S. Treasury particularly concerning is a possible attempt to create new programmatic interfaces directly within these mission-critical payment systems, which process over $5.4 trillion annually (Tankus, 2025). By embedding new code into the existing Bureau of the Fiscal Service systems, DOGE will almost certainly be attempting to create mechanisms for unprecedented executive control over federal financial flows. Indeed, Wired reported that this access gave DOGE "the capability to, among other things, illegally cut off Congressionally authorised payments to specific individuals or entities" (Elliott 2025b). This transformation of the abstraction provided by traditional APIs into an instrument of authoritarian control potentially represents a fundamental shift in how computational infrastructures mediate political power. More worryingly, this shift occurs not merely at a technical level but reconstructs the very possibility of governance through new forms of programmatic control via computer code. Getting a sense of what this transformation might mean requires understanding both the technical mechanisms and the political implications of computational interfaces.

As recently revealed (Tankus 2025; Snyder, 2025), DOGE members have been granted administrative access to modify core payment systems, including the Payment Automation Manager (PAM) and Secure Payment System (SPS) (Alemany et al. 2025; Elliott 2025). These are the systems into which new programmatic "hooks" might be inserted, allowing the Administration to exercise selective blocking, redirection, or manipulation of intra-government or external payments. One might suspect that one of the goals would be to establish a computational "dashboard" providing real-time visibility and control over federal money movements (see Tkacz 2022). Politico recently reported that "government payments will now have a 'payment categorisation code' for auditing purposes" and that "payments must provide a rationale in a comment field" (Bianco 2025). From this we can speculate that an "ID code" and a "Comment" field have been added to the Treasury database, providing the extra information needed to categorise and display payments on, for example, such a dashboard. These extra fields would also make it easier to filter payments, pause them and potentially cancel them.
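To see why such seemingly innocuous fields matter, consider a minimal sketch in Python using an in-memory SQLite database. The table and field names (`payments`, `category_code`, `comment`) are entirely hypothetical, not the Treasury's actual schema; the point is simply that once payments carry a categorisation code, a single query can hold every payment in a category.

```python
import sqlite3

# Hypothetical, simplified payments table with the two speculated extra
# fields: a categorisation code and a rationale/comment field.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE payments (
        id INTEGER PRIMARY KEY,
        payee TEXT,
        amount_cents INTEGER,
        category_code TEXT,            -- the reported 'payment categorisation code'
        comment TEXT,                  -- the reported rationale field
        status TEXT DEFAULT 'scheduled'
    )
""")
conn.executemany(
    "INSERT INTO payments (payee, amount_cents, category_code, comment) "
    "VALUES (?, ?, ?, ?)",
    [
        ("Agency A", 100_000, "GRANT", "FY25 research grant"),
        ("Agency B", 250_000, "AID", "foreign assistance tranche"),
    ],
)

# Once the field exists, pausing an entire category of federal payments
# is a one-line operation.
conn.execute("UPDATE payments SET status = 'held' WHERE category_code = ?", ("AID",))
held = conn.execute("SELECT payee FROM payments WHERE status = 'held'").fetchall()
print(held)  # [('Agency B',)]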

I am concerned that this attempt to create programmatic control over federal payment systems through authoritarian programming interfaces signals a mutation of state power into a new form of computational sovereignty. This form of power might therefore operate through the mobilisation of technical standardisation and control of computational infrastructures, particularly focussed on the critical systems that facilitate the operation of the state apparatus. The attempt to insert programmatic "hooks" into legacy financial systems should therefore be seen for what it is: an attempt to establish computational control which bypasses traditional forms of political and bureaucratic control of the state. This could, therefore, represent a new way in which state power is exercised, moving from bureaucratic administration mediated through humans towards automated forms of computational administration operating at the level of technical infrastructure (e.g. auto-technocratic governance).

Legacy Systems and Fragility 

This desire to access and change mission-critical payment systems suggests that computational infrastructures have become a crucial site of political contestation in the 21st century. By attempting to establish computational control over federal financial flows, DOGE might be seeking to create new forms of API governance that would potentially override existing institutional checks and balances. This can be seen as an example of how algorithmic systems create new forms of sovereign decision-making through the modulation of data flows. As Amoore writes, we therefore need to "acknowledge[...] that algorithms contain, within their spatial arrangements, multiple potentials for cruelties, surprises, violences, joys, distillations of racism and prejudice, injustices, probabilities, discrimination, and chance" (Amoore 2020: 7). Yet whilst these theoretical frameworks help us understand the potential for algorithmic control, the concrete reality of legacy systems also presents interesting material constraints.

However, technical complexities may frustrate these ambitions. The Treasury's payment infrastructure is said to consist of multiple legacy systems, some written in COBOL dating back to the 1960s, which have been carefully maintained and updated over decades (Tankus, 2025). As David Lebryk's forced departure demonstrates, the reality is that very few people fully understand how these complex, interrelated systems work and, indeed, how fragile they are. Even with the artificial intelligence assistance that DOGE is rumoured to be using, changing or refactoring such mission-critical code requires extreme care and extensive testing – something that appears to be absent from descriptions of the process currently underway.

DOGE access to US Government Agencies as of Feb 2025

One way of describing this is through the notion of an accumulation of technical debt, which in these agencies' legacy systems represents a form of deep infrastructural friction that complicates attempts at authoritarian control. Technical debt, understood as the long-term consequences of past technical decisions, creates what we might call the installed base problem in infrastructure (see Star and Ruhleder 1996). Indeed, the Treasury's payment systems can be seen as an example of how maintenance and repair work become crucial sites of infrastructural politics (Jackson, 2014). Each modification or update to these systems requires careful negotiation with decades of accumulated technical decisions within complex maintenance ecologies (Denis and Pontille 2019). These ecologies of repair and maintenance work can be seen not just as technical complexity but also as a potential barrier to modification that constitutes a form of infrastructural resistance arising from the historical development of these systems. The social organisation of maintenance labour would therefore be a crucial factor in determining the possibility of transforming these systems into instruments of authoritarian control – and it seems that DOGE is keener to ignore this knowledge than use it, in the case of Lebryk losing important institutional knowledge in the process and thereby undermining their own plans.[2]

We should note that DOGE's alleged approach of potentially pushing untested changes directly into production systems, rather than following proper development and deployment procedures, also creates serious risks of catastrophic failure. As one Treasury IT staff member noted, "this level of unchecked access is critically dangerous to the economy and the government" (Tankus, 2025b). This means that rather than giving them control, this seemingly reckless modification of these fragile systems could trigger a very different kind of API, what we might call an authoritarian payment implosion: a collapse of trillions of dollars of federal payment processing that would paralyse government operations and have knock-on effects in the wider economy.

The tension between DOGE's authoritarian ambition and technical reality might therefore reveal a broader dialectic within computational governance. While computational systems seem to enable new forms of control and surveillance, they simultaneously create new vulnerabilities and points of resistance. 

The Computational Underground

Having examined both the ambitions for control and the technical constraints that might frustrate them, we should also consider the broader implications of this struggle. I therefore argue that the attempt to create an authoritarian programming interface reveals how important what I call the computational underground is likely to be in future political tussles over control. In this case, the financial computational infrastructure represents a new frontier in the contestation of state power. However, as described above, the technical complexity and brittleness of legacy systems may actually prove to enact their own form of resistance to authoritarian capture. The question remains whether DOGE's attempt to reprogram federal payments will result in control or catastrophe.

The technical mechanisms through which control could be asserted over a system are particularly interesting, even when they seem minor, like the reported DOGE access being "set to 'insert' for the Secure Payment System (SPS)" (Tankus 2025c). For example, even the limited INSERT permissions within database systems, reported by Tankus (2025c), represent a significant security vulnerability that extends beyond simple data creation. As he describes, "this type of permission lets you 'add a row to a table' which is a 'type of write access but very limited'. According to my source who is unfamiliar with the situation it lets you 'create data but not change its structure or delete it, create tables etc.'" (Tankus 2025c). But in actuality, within computer database systems, INSERT rights allow the addition of new records which can potentially alter the behaviour and processing of a system. Within the federal payment system, the capacity to insert new records, even without full modification rights, creates possibilities for redirecting payments, creating processing conflicts, or manipulating automated verification systems. The complexity of these legacy interdependent infrastructures makes it extremely difficult to detect or understand the full implications of inserting records into a database. The ability to insert data might therefore seem to be a limited form of computer privilege, but it could potentially be a devastating access right, one that could be exploited to compromise the integrity of the database system while appearing more restricted than full modification rights.
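A minimal sketch, again with an entirely hypothetical schema, of why INSERT-only access can still change a system's behaviour: if downstream processing reads the most recent record for each payee, appending a new row effectively shadows the legitimate one, without any UPDATE or DELETE rights being needed.

```python
import sqlite3

# Illustrative only: a hypothetical routing table where a downstream
# processor uses the *latest* row per payee.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE routing (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        payee TEXT,
        account TEXT
    )
""")
conn.execute(
    "INSERT INTO routing (payee, account) VALUES ('Agency A', 'legit-account')"
)

# With INSERT-only access: no row is modified or deleted, yet the
# effective routing changes, because a newer row now shadows the old one.
conn.execute(
    "INSERT INTO routing (payee, account) VALUES ('Agency A', 'diverted-account')"
)

# A processor that reads the most recent record per payee now routes
# the payment somewhere else entirely.
row = conn.execute(
    "SELECT account FROM routing WHERE payee = 'Agency A' "
    "ORDER BY id DESC LIMIT 1"
).fetchone()
print(row[0])  # diverted-account
```

The design lesson is that the severity of an access right depends on how the rest of the system interprets the data, not on the nominal narrowness of the permission itself.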

These details, far from being merely technical, reveal the close connection between computational architecture and political power. The speculative dialectic I have identified between computational control and infrastructural resistance raises broader questions about the future of democratic governance under conditions of increasing computational mediation. This transformation of APIs from neutral technical protocols into instruments of authoritarian control shows how computational infrastructures have become fundamental to our societies (Berry 2014). Computational systems are now deeply embedded in state power and control. Indeed, the technical architecture of these systems may increasingly shape the space of contestation between authoritarian and democratic forms of governance. The struggle over programming interfaces and computer code might therefore become a site where future battles over democracy take place.

If the materiality of computational infrastructure emerges as a key terrain of political struggle, as seems to be the case, then there is an urgent need to develop new theoretical frameworks and political strategies for ensuring democratic oversight and strong protection of critical computational infrastructure. It may seem strange to conclude that the future of democracy depends on our ability to prevent the transformation of computational interfaces into instruments of authoritarian control, but this is the lesson we seem to be being taught in what may be a world-historic moment.


Written by David M. Berry

This post builds on work developed in my forthcoming book on Artificial Intelligence

** Headline image generated using DALL-E in Feb 2025. The prompt used was: "A dystopian, authoritarian programming interface displayed on a futuristic computer screen. The interface is dark, rigid, and heavily restricted, with red warning messages, locked access controls, and a highly structured command-line layout. Surveillance icons, security protocols, and cryptic government-style directives dominate the screen. The aesthetic is cold, metallic, and oppressive, evoking a sense of control and restriction. Prominent buttons labeled 'STOP PAYMENTS' and 'IMPOUND SPENDING' are displayed in bold red text, reinforcing the strict financial control theme. Ensure the spelling is accurate: 'IMPOUND SPENDING'." Due to the probabilistic way in which these images are generated, future images generated using this prompt are unlikely to be the same as this version. 

Notes

[1] This article emerges out of a conversation with Mark Marino in relation to Critical Code Studies and its potential for reading and understanding computer source code as a critical practice in scholarly work (see Marino 2020). We were both struck by the request by DOGE members for access to the source code.  Apparently, "Musk’s DOGE team has been asking for what the New York Times reporting refers to as 'source code information' since December" (Tankus 2025). They now have it. 

[2] We can only speculate that the resignation of David Lebryk may have been a heroic act of civil disobedience which denied DOGE his extensive knowledge and experience, and hopefully slows down or stalls their attempts to gain control over these computer systems. People who have experience of working on legacy systems will know that once institutional knowledge of a legacy computer system has left the building, it can be very difficult to patch or change the code without serious risk of breaking it.

Bibliography

Alemany, J., Stein, J. and Torbati, Y. (2025) DOGE deputy to oversee powerful Treasury system as Musk demands cuts, Washington Post, 7 February. Available at: https://www.washingtonpost.com/business/2025/02/07/treasury-doge-payments-musk/ (Accessed: 8 February 2025).

Amoore, L. (2020) Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press.

Berry, D. M. (2014) Critical Theory and the Digital. New York: Bloomsbury.

Berry, D. M. (2024) Algorithm and code: explainability, interpretability and policy, in Handbook on Public Policy and Artificial Intelligence. Edward Elgar, pp. 134-146.

Bianco, A. (2025) Musk says Treasury, DOGE instituting reporting changes to all government payments, POLITICO. Available at: https://www.politico.com/news/2025/02/08/elon-musk-doge-government-payments-014920 (Accessed: 9 February 2025).

Chun, W.H.K. (2021) Discriminating data: correlation, neighborhoods, and the new politics of recognition. Cambridge, Massachusetts: MIT Press.

Deleuze, G. (1992) ‘Postscript on the Societies of Control’, October, 59, pp. 3–7.

Denis, J. and Pontille, D. (2019) ‘Why do maintenance and repair matter?’, in A. Blok, I. Farías and C. Roberts (eds) The Routledge Companion to Actor-Network Theory. Routledge, pp. 283–293. Available at: https://minesparis-psl.hal.science/hal-02172939 (Accessed: 6 February 2025).

Elliott, V. (2025) ‘A US Treasury Threat Intelligence Analysis Designates DOGE Staff as “Insider Threat”’, Wired. Available at: https://www.wired.com/story/treasury-bfs-doge-insider-threat/ (Accessed: 8 February 2025).

Elliott, V. (2025b) ‘The US Treasury Claimed DOGE Technologist Didn’t Have “Write Access” When He Actually Did’, Wired. Available at: https://www.wired.com/story/treasury-department-doge-marko-elez-access/ (Accessed: 8 February 2025).

Fielding, R.T. and Taylor, R.N. (2002) Principled design of the modern Web architecture, ACM Trans. Internet Technol., 2(2), pp. 115–150. Available at: https://doi.org/10.1145/514183.514185.

Galloway, A. R. (2004) Protocol: How Control Exists after Decentralization. MIT Press.

Glasser, S.B. (2025) Elon Musk’s Revolutionary Terror, The New Yorker, 6 February. Available at: https://www.newyorker.com/news/letter-from-trumps-washington/elon-musks-revolutionary-terror 

Helmond, A. (2015) The Platformization of the Web: Making Web Data Platform Ready, Social Media + Society, 1(2), p. 2056305115603080. Available at: https://doi.org/10.1177/2056305115603080.

Jackson, S.J. (2014) ‘Rethinking Repair’, in T. Gillespie, P.J. Boczkowski, and K.A. Foot (eds) Media Technologies: Essays on Communication, Materiality, and Society. The MIT Press. Available at: https://doi.org/10.7551/mitpress/9780262525374.003.0011.

Kay, A.C. (1993) The early history of Smalltalk, in The second ACM SIGPLAN conference on History of programming languages. New York, NY, USA: Association for Computing Machinery (HOPL-II), pp. 69–95. Available at: https://doi.org/10.1145/154766.155364.

Mahoney, M. S. (2008) What Makes the History of Software Hard, IEEE Annals of the History of Computing, 30(3), pp. 8-18.

Marino, M.C. (2020) Critical code studies. The MIT Press.

Rouvroy, A. and Berns, T. (2013) ‘Algorithmic governmentality and prospects of emancipation: Disparateness as a precondition for individuation through relationships?’, Réseaux, 177(1), pp. 163–196.

Snyder, T. (2025) Of course it's a coup, Thinking About... Newsletter, 6 February.

Stanley-Becker, I. et al. (2025) ‘Musk’s DOGE agents access sensitive personnel data, alarming security officials’, Washington Post, 6 February. Available at: https://www.washingtonpost.com/national-security/2025/02/06/elon-musk-doge-access-personnel-data-opm-security/ (Accessed: 6 February 2025).

Star, S.L. and Ruhleder, K. (1996) ‘Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces’, Information Systems Research, 7(1), pp. 111–134. Available at: https://doi.org/10.1287/isre.7.1.111.

Tankus, N. (2025) Elon Musk Wants to Get Operational Control of the Treasury’s Payment System. This Could Not Possibly Be More Dangerous, Notes on the Crises. Available at: https://www.crisesnotes.com/elon-musk-wants-to-get-operational-control-of-the-treasurys-payment-system-this-could-not-possibly-be-more-dangerous/ (Accessed: 6 February 2025).

Tankus, N. (2025b) Day Five of the Trump-Musk Treasury Payments Crisis of 2025: Not “Read Only” access anymore, Notes on the Crises. Available at: https://www.crisesnotes.com/day-five-of-the-trump-musk-treasury-payments-crisis-of-2025-not-read-only-access-anymore/ (Accessed: 6 February 2025).

Tankus, N. (2025c) Day Seven of the Trump-Musk Treasury Payments Crisis of 2025: “Yours and WIRED’s Reporting is Actually Doing Something”, Notes on the Crises. Available at: https://www.crisesnotes.com/day-seven-of-the-trump-musk-treasury-payments-crisis-of-2025-yours-and-wireds-reporting-is-actually-doing-something/ (Accessed: 6 February 2025).

Tkacz, N. (2022) Being with Data: The Dashboarding of Everyday Life. Polity.

