"Every disease is a musical problem, every cure is a musical solution."

― Novalis        

This project is an odd confluence of three of my interests: political economy, machine learning, and the luthiery of odd acoustic instruments. I can claim no particular expertise in any of these affairs, so, per usual, this project began by wielding my typical artist-as-excuse ethos to embark on an adventure wrought by intrepid ignorance. Of Strings and Kings is a set of musical compositions written for strings and machine learning algorithms, intended to accompany this manuscript. The compositions interspersed throughout the essay serve as aesthetic counterparts to each chapter’s subjects.

The first chapter will draft a framework of some of the bleak vectors of platforms, big data, cloud computation, and centralized AI when drawn to their conclusions. I will tell a cautionary tale of the totalizing hegemony being established by harnessing cluster computation and transcontinental network infrastructure, and will draw historical parallels to the despotic doctrine that entrenches sovereign rule. In the second chapter I will address the ideas of play and practice as ways of claiming agency in a domain of increasingly unique sets of uncertainty. I will use a multitude of musical metaphors for various deep learning techniques (training, prediction, learning, feedback, and bias) and will use these metaphors to articulate ways of retaining harmonious improvisational dynamics with the abounding swarms of behavior-modifying algorithms that live in our pockets, networks, and cities. The third chapter’s aim is to review alternative tuning systems as a tool for recalibrating not just our music but our social relations. Much like acoustic instruments, algorithms are tuned and optimized. I will posit different ideas for how we might get our hands on these parameters to bring about more harmonious and equitable outcomes, discussing a number of topics including data-as-labor, distributed ownership, co-operative parametric design, right-to-explanation, decentralized machine learning, and optimizing for utilities that redistribute decision making and minimize extraction.

Before diving into the first chapter, allow me to provide a brief primer on the composition and instrumentation. These compositions were written for custom-fabricated lyres, hammered dulcimer, lute, gittern, and popular machine learning techniques including LSTM RNNs1 (long short-term memory recurrent neural networks), adversarial neural audio synthesis2, and a multitude of others cited in appendix B3. I will only partially expound upon the relevance of the various tuning systems’ and algorithms’ aesthetic and technical significance throughout the essay, but will provide detailed auxiliary reflections in appendix A4. In short, the music is stylized in neo-feudal minstrel songform with accompaniment from gothic automata. This synthesis of ars antiqua and deep learning portends a speculative future in which the class asymmetries of yore are resurrected through menacing algorithmic injunctions.

1    “LSTM Networks”
2    “Adversarial Neural Audio Synthesis | OpenReview”
3    “Appendix B: Algorithms & Libraries”
4    “Appendix A: Auxiliary Musical Context”


Masks of Power


Mystification, Naturalization, & Literacy

This section will outline three masks that contemporary power has donned in the age of deep learning. These masks are ancient methods of refashioning authority, now being retrofitted for planetary computational dominion. With these historically vetted tactics the sovereign justify and cloak their authority, distancing and obfuscating material jurisdiction away from the general population.


Where realpolitik’s brute-force approach to wielding power dwindles, its twin, mysticism, intervenes. Across most cultures, authority can be found seated next to the keepers of transcendental doctrine: monarchs’ devotion to the papacy, maharajahs’ consultation with the brahmins, shahs’ relation to the caliph, emperors’ adjacency to the hierophants, and so on. The mystical consulate is positioned as the adjudicator of transcendental doctrine, whose creeds, built upon complex, abstract, and remote precepts, typically required its hermeneutic expertise.

The modern humanist project that began during the European Enlightenment quite literally dethroned old-world autocratic rule as reason triumphed and laissez-faire capitalism formed a new political subject: the individual liberal citizen. This social subject was granted inalienable rights that superseded the divine decrees of pontiffs and monarchs. However, in the age of computational capital and ecological meltdown we are witnessing a crisis of the Westphalian nation-state, the primary maintainer of the principles of the humanist project. In an accelerating geopolitical landscape where intelligent algorithmic systems continue to agitate national identities, desubjectify modern democratic subjecthood, and retopologize new jurisdictions according to remote technocratic administration, it is becoming evident that monolithic power is re-emerging.

Refer to Appendix A1 for ancillary musical, historical, and technical details

Given the emergent complexities in our global networks of coordination, algorithmic capitalism has anointed its entrepreneurial engineers to remodel the world to reflect its staggering rate of growth. Human oversight is bypassed to contend with the sheer magnitude of information processing tasks. These convoluted processes of automation are developed and maintained by the embrocated: a software canonicate that occupies the edges of computation where mathematics becomes metaphysics, where transcendental logic seeks to untether intelligence from the shackles of humanism.

Some of these machine intelligence researchers utilize allegories of gods to frame models of intelligence that exceed the capacity of humanity. While mythology can serve as a mirror for reflecting insights into the human condition, these analogies with AI generally play into teleological and theological schemas of determinism and inevitability that are ultimately unhelpful misnomers. Deifying complex systems serves as another tactic for elevating and naturalizing investment capital’s capacity to aggregate and analyze colossal amounts of data at planetary scale.

Myth-making is to be expected as humans attempt to explicate these ineffable, foreign processes that play such an enormous role in our lives. Since its origins humanity has spun stories, using in-built associations to construct meaning and negotiate with uncertainty. However, these types of animistic myths of enchanted relationality and ecological cohabitation with other intelligences are not the ones cropping up in Silicon Valley circles. These factions have categorically discarded pre-colonial models of temporality, including dreamtime and cyclical time, supplanting them with a chronological telos of autocatalytic productivity that will beget emergent computational supremacy. This belief in a deterministic, a priori supreme being of pure reason (logos) is quite literally derived from ecclesiastical Christian dogma and rabbinical Hebrew scripture. In fact, the Jesuit priest Pierre Teilhard de Chardin postulated the Omega Point5: a theory that the universe is evolving toward a maximum state of complexity and consciousness. Despite its religious provenance, this ur-Singularity cosmology has been widely adopted by many of the secular executives and engineers helming the burgeoning technocracy.

[The Ethics of Big Data. O’Reilly Media, 2012]
[Server Blessing / Data Center Blessing; # /etc/init.d/daemon stop. Image Credit: India Times]

The Sophist notion of technē, which forms our most essential figuration of technology, is derived from Promethean myth: fire stolen from the gods and bestowed upon humanity, spawning progress and civilization. This correlation is apotheosized in Yudkowsky’s Bayesian horrors6, Alexander’s transhumanism7, Kurzweil’s technological singularity8, Land’s cosmological singularity9, Bostrom’s Superintelligence10, and Levandowski’s Church of AI. The irony in these rationalists’ accounts of intelligence is that they are all reifications of inherited cosmologies predicated not on formal logic but on myth. When subjected to methodological rigor these narrative-based scare tactics are subsumed by heuristic biases and collapse into xenophobic rhetoric.

Mystifying computation in this way tends toward a cosmic narcissism: gazing into the abyss as the abyss affirms its own preconceptions of itself. Regardless of one’s stance on the othering of hypercomplex computation, I would argue that humanity should cultivate a lexicon for talking about aliens, others, or xeno-intelligence without dynamic divergence.

The theories of breakaway, recursively self-enhancing technology that undergird these conceptions of intelligence are often discussed in a bounded, immaterial domain without examination of their anatomical, geographic scale. These exceedingly brilliant cerebral meta-linguists and transcendental number theorists often neglect to acknowledge the material, earthen corpus upon which their symbolic logic is expressed.11

Researchers Kate Crawford and Vladan Joler’s Anatomy of an AI System12 is a graphical dissection of the infrastructural assemblage required to embody artificial intelligence. Spanning the fabric of capital to include mineral resource extraction, human labor, supply chain logistics, data collection and distribution, analytics, prediction, and optimization, their project synopsizes the pipeline for constructing a deep learning system. This corporeal plexus is typically trivialized with the public-relations nomenclature of the Cloud.

Referring to the vast material architecture of data centers with the ephemeral parlance of the Cloud is not just a bit disingenuous; it is downright deceptive. Deploying preemptive algorithms into the cultural apparatus requires a delicate and tactical marketing narrative as a trojan horse. As architecture and design critic Keller Easterling aptly suggests: “You can see the discrepancy between what organizations are saying and what they are doing. You can even see temperament in construction or potentials for violence. That disposition is propensity within a context, property or tendency that is unfolding over time.”13 By steering the public narrative away from its material operations, these supranational syndicates are able to truncate opposition by minimizing attention to what they are building.


Historically, the doctrinal gatekeeping that delineated class and caste was maintained by ecclesiastical scribes. There are striking parallels to the emerging niche of computational literati contributing to the steepening disparity of wealth and technical literacy. Both clergy and programmers provide order, in the form of textual statutes, that designates the foundational axioms upon which a society rests. Similar to the vertical denominations that emerge within religious sects, emanant power has stratified social order and restricted access to its inner sanctum by way of media illiteracy and technical deficiency.

User experience design serves as an exegetical layer between the code and its graphical representation. These simplified behavioral flows divert users away from the software’s extractive disposition and carry the hermeneutic subtext of “leave it to the experts”. UX is a set of inherited decisions that form interfaces for navigating a computational domain. While user experience design allows those without literacy to use digital media with relative ease, and is an essential facet of all computing, the Jobsian design ethos of “it just works” is synonymous with consumerism and is not designed to foster media literacy. By funneling users into compromised sets of autonomy, these firms maintain opacity by occluding access to the operational facets of their services. For example, the frictionless front-end of Amazon’s Alexa and its ilk are reverse portals into behavioral surplus supply chains.

Developing formal design criteria, let alone literacy, for deep learning is a sophisticated process that even those working in the field have failed to complete. Arguably, since artificial intelligence interpolates the latent fields between engineering, science, philosophy, mathematics, design, and spectacle, it doesn’t satisfy the methodologies that quantify any real metrics specific to each respective field.14 This provides us interesting output, but output that is subject to motivated logic15 and used to “prove” certain assumptions without the rigor and criteria that each of the aforementioned fields employs. This effectively allows interesting but unprovable claims to slip through under the guise of technical or conceptual rigor.16

AI researchers are more likely to have been educated in fields that take formal problems as inputs: engineering, computer science, mathematics, or theoretical physics. Yet the problems being tackled are mostly ones for which a design approach, maintaining a continuous, open-ended relationship with nebulosity, may be more appropriate.17 These applications assume a highly technocratic, solutionist position on social life; entire fields and industries are “disrupted” by platform engineering that ignores the specialist knowledge held by experts in their respective practices. This approach circumvents domain-specific expertise and supplants it with big data. With appeals to the bottom line, businesses can afford to overlook the resultant margin of error, bypassing proficient professionals with automated solutions. What the adopters of these “disruptive solutions” may not realize is that these platforms use tautology to substantiate their claims: in essence, the proponents create the criteria they need to fulfill, fit the data to qualify their results, and prove that their “solution” will cost less than hiring experts.

Refer to Appendix A2 for ancillary musical, historical, and technical details


Venkatesh Rao alleges that “deep learning has an authoritarian right wing bias. It feeds on vast data sets created by natural behavior, has a tendency to inherit and reproduce endemic biases, and codify them in favor of conservative authoritarians who see the incumbent balance of power as natural and just.”

Rao goes on to state that the management class organizing the business and social formations around deep learning claims that trying to “regulate” the functioning of deep learning algorithms directly, through human political processes, or by demanding ‘justifiable AI’ that can explain itself, is a fool’s errand.

By adopting this framing, deep learning becomes a tautological justification for itself. Outside of the market-based rhetoric of profit motivation, how is this being justified as data science? Some of these algorithms leverage what are called adversarial networks, in which competing models discipline one another’s outputs; others accelerate simulated evolutionary processes, determining “data fitness” through a mathematical sorting procedure, and are referred to more broadly as genetic algorithms.18 These information processing models simulate Darwinian natural selection: only the best data “survives” a statistical gauntlet of selection and regression. By invoking these sorting algorithms, many Silicon Valley executives vindicate the results as “natural” data science; the truth is that they are anything but natural. Their reasoning generally claims that, given enough data, the algorithm will arrive at a statistical equilibrium after running through enough permutations of evolution.
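The selection–crossover–mutation loop behind such genetic algorithms can be sketched in a few dozen lines. This is a minimal, illustrative toy: the fitness function (counting 1-bits, the classic “OneMax” problem) and every parameter are assumptions of mine, not drawn from any production system.

```python
import random

random.seed(0)

GENOME_LEN = 32     # bits per candidate "datum"
POP_SIZE = 50
GENERATIONS = 60
MUTATION_RATE = 0.01  # per-bit flip probability

def fitness(genome):
    # "Data fitness": here simply the number of 1-bits (the OneMax toy problem).
    return sum(genome)

def select(population):
    # Tournament selection: the "statistical gauntlet" in which only the
    # fitter of two random candidates survives to reproduce.
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover between two parent genomes.
    point = random.randrange(1, GENOME_LEN)
    return p1[:point] + p2[point:]

def mutate(genome):
    # Occasionally flip bits, keeping the population from stagnating.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(fitness(best))  # climbs toward GENOME_LEN as generations accumulate
```

Nothing here is “natural”: the fitness function, selection scheme, and mutation rate are all design decisions, which is precisely the point the essay is making.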

This assumption is predicated on the cum hoc logical fallacy19, which, in statistical lexicon, can be summed up as “correlation doesn’t imply causation.” As more of big data’s conclusions are subjected to scientific rigor outside of its own self-affirming, means-tested regressions, it has been shown that more data can often lead to erroneous results.20 21
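The fallacy is easy to reproduce numerically. In this hypothetical sketch (all variable names and distributions are my own illustration), a hidden confounder z drives two variables that never influence one another, yet a large sample of them correlates strongly; more data only makes the spurious correlation more confident, not more causal.

```python
import random

random.seed(1)

def pearson(xs, ys):
    # Pearson correlation coefficient, computed from first principles.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A hidden confounder z drives both x and y; neither causes the other.
z = [random.gauss(0, 1) for _ in range(10_000)]
x = [zi + random.gauss(0, 0.5) for zi in z]
y = [2 * zi + random.gauss(0, 0.5) for zi in z]

print(round(pearson(x, y), 2))  # strong correlation, zero direct causation
```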

Cultural critic Mike Pepi articulates the hubristic naturalization of platforms as biological organisms in his astute analysis of Silicon Valley’s sublime administration, Between Platform and Organism22. The careful use of biological and evolutionary language around these techniques is intentional: proponents position their critics as being against “science”. This has been a continual assertion of those advocates seeking to deepen the entrenchment made by these systems. The evolutionary justification for these conclusions is not just fallacious; it has been empirically falsified.

Francis Galton, a pioneer of eugenics and biometrics, was also a progenitor of the field of statistics. The statistical techniques that Galton invented23 (correlation, regression) and the phenomena he established (regression to the mean, the bivariate normal distribution) form the basis of the biometric approach and now operate at the core of deep learning data analytics.
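Galton’s regression to the mean can be illustrated with a small simulation. The “parent”/“child” framing and the chosen correlation of 0.6 are illustrative assumptions of mine, not his actual anthropometric data: fitting a regression line to correlated pairs yields a slope below 1, so extreme parents predict less extreme children.

```python
import random

random.seed(2)

# Galton's bivariate setup: each "child" value is a correlated copy of its
# "parent" plus independent noise, scaled so both have unit variance.
rho = 0.6
parents = [random.gauss(0, 1) for _ in range(20_000)]
children = [rho * p + random.gauss(0, (1 - rho ** 2) ** 0.5) for p in parents]

# Ordinary least-squares slope of child on parent.
n = len(parents)
mx = sum(parents) / n
my = sum(children) / n
slope = (sum((p - mx) * (c - my) for p, c in zip(parents, children))
         / sum((p - mx) ** 2 for p in parents))

print(round(slope, 2))  # close to rho, i.e. below 1: regression to the mean
```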

Historically, we’ve seen this type of evolutionary rhetoric attempt to draw sinister, erroneous conclusions about race and criminology. In the age of deep learning we are experiencing the resurgence of outmoded 19th-century ‘race (pseudo)science’ (e.g. criminal anthropology23, biological determinism24, social Darwinism25, phrenology26, physiognomy27, and eugenics28). Like many of the inductive biases latent in deep and reinforcement learning systems, these fallacious notions are predicated on racist methodological weaknesses (poor sampling technique, bias in gathering data, poor statistics)29. Even though these claims have been categorically disproven, we are observing deep learning bearing these biases rapidly mounted within the American martial systems30 of ICE31, the NYPD32, the U.S. Army33, the Orlando Police34, the New Orleans PD35, and the Washington Sheriff’s Department36.

5    "Teilhard de Chardin and Transhumanism."
6    "Rationality: A-Z - LessWrong 2.0."
7    "Transhumanism | Slate Star Codex."
8    "Ray Kurzweil | Singularity"
9    "Fanged Noumena, Nick Land"
10   "Nick Bostrom | Superintelligence"
11   "Total Consumer Power Consumption Forecast - ResearchGate."
12   "Anatomy of an AI System."
13   "Keller Easterling — Extrastatecraft: The Power of Infrastructure Space."
14   "How should we evaluate progress in AI?"
15   "Artificial intelligence pioneer says we need to start over - Axios."
16   "Why AI Is Not A Science - Stanford University"
17   "Troubling Trends in Machine Learning Scholarship"
18   "Genetic Algorithm [Deep Learning Patterns]"
19   "Logical Fallacies » Cum Hoc Fallacy."
20   "Causal Inference and Statistical Fallacies"
21   "Issues with data and analyses: Errors, underlying themes, and ..."
22   "Jenna Sutela - Orgs: From Slime Mold to Silicon Valley - Printed Matter."
23   "Francis Galton: Pioneer of Heredity and Biometry | Johns Hopkins University Press"
23   "Neural Network Learns to Identify Criminals by Their Faces - MIT."
24   "OSF | Deep neural networks are more ...."
25   "Researchers Want to Link Your Genes and Income—Should They?"
26   "FACEPTION | Facial Personality Analytics."
27   "Automated Inference on Criminality using Face Images - Brown CS"
28   "Sociogenomics is opening a new door to eugenics - MIT Technology"
29   "Machine Bias — ProPublica."
30   "AI is sending people to jail—and getting it wrong - MIT Technology."
31   "ICE Extreme Vetting Initiative: A Resource Page | Brennan Center for Law."
32   "Palantir Contract Dispute Exposes NYPD's Lack of Transparency"
33   "Palantir wins competition to build Army intelligence system"
34   "ZeroEyes AI Threat Detection - ZeroEyes."
35   "An improved kernelized discriminative canonical correlation analysis - IEEE"
36   "Orlando Pulls the Plug on Its Amazon Facial Recognition Program"


Vectoralism: Content Isn’t King 

Reminiscent of rentier capitalism or, even worse, absolute technocratic monarchism, we increasingly find our interactions mediated by platform economies operated by what McKenzie Wark refers to as the vectoralist class37. I’ve adopted this term because it redefines the terms of the game: society no longer seems to operate according to capitalist rules but by something worse, and even more extractive. Vectoralism tracks the space of possibility. It annihilates the familiar concepts of ownership and property upon which capitalism is predicated. Goods and services once retrieved or accessed outright for a reasonable price are now temporarily leased as services; the vectors of access are monopolized by those who operate the infrastructure, protocols, and applications through which we have historically interacted and transacted in an unsurveilled, peer-to-peer marketplace. By driving a wedge into our social interactions to mediate our economies and social lives, these private entities are able to learn intimate knowledge about us and siphon off a rentier tax from our labor, property, and interactions without any of the risk or responsibility to the commons. Contemporary power works as an environmental form of pre-emption; it is perpetually produced, monitored, refactored, and presupposed.

Venture capitalist Marc Andreessen has infamously said that “software is eating the world”38, and unfortunately it seems hungry to replace democracy’s checks and balances with something that runs more efficiently. Vectoralists envision governance models in which citizenship is relegated to the status of a user. Using computer networks, this vision aims to circumvent the nation-state’s granting of rights, supplanting rights with privileges and permissions managed by remote administration. In addition to physical walls, their prototypes are augmented with paywalls gating access to sites, services, and goods by embedding software and sensors into everything. The platform operates as the proxy layer through which all transactions are coordinated. Typically obscured under the auspices of ‘sharing’ and ‘convenience’, the actual subtext signifies the end of ownership.

Refer to Appendix A3 for ancillary musical, historical, and technical details

[Zuckerberg Family: How to Train Your Dragon, 2018. Image Credit: Facebook]

Allow me to position this in the neo-feudal aesthetic framework of the compositions written for Of Strings and Kings. The task of the class of lords & scribes (managers, programmers) is to cultivate, legitimize, and reify the knowledge systems of the kingdom (company, platform). These scribes take direction from the monarchic technocrats (CEOs) presiding over the kingdom and its wealth (data, software, infrastructure, capital), which the serfs (users, netizens) temporarily lease from the fiefdom. The court jesters’ and minstrels’ (the creative class) role is to entertain the kingdom’s inhabitants and hypostatize absolute power through aesthetics (streaming content, marketing, branding, PR).

37  "The Vectoralist Class - e-flux journal 56th Venice Biennale."
38  "Why Software Is Eating the World – Andreessen Horowitz"