INTRODUCTION
Circuits in Common
It began as a prank. It became a passion and obsession, a crime and stigma, an identity to be defended with pride, an international community, a fad, and finally, a political practice. In the span of six decades of digital experimentation, hacking has shape-shifted constantly to become a catch-all term for pretty much anything tentatively clever or subversive or makeshift with varying degrees of technical prowess. Now it seems to be everywhere but with radically distinct contours. From cryptic and clever forms of détournement to its dissemination as an identifier for disparate practices, hacking has traveled far and wide as a practical symbol.
What does this obsession with hacking tell us? It was only with the spectacularization of underground computer collectives in the 1980s that hackers made their way into the commercial media spotlight, which brought their practices, if somewhat distorted, into public view. While ambiguously representing “menace” and “solution,” hackers have figured ever since in conflicting narratives as rebels and freedom fighters, tricksters and occultist magicians, cybercriminals and government allies, information security consultants and anti-corporate technologists. Across a fairly wide range of historical experiences, the moral pendulum has oscillated between the identification of hacking with criminal activities and virtuous practices of community-making. But it is beneath the surface of public discourse that we find minor histories of computing connected through a common thread: hackers have stubbornly fostered the common1 as a principle to create community with, for, and through computing and have increasingly done so beyond and in connection with Euro-American computerdom.
Common Circuits offers an anthropological study of a transnational network of community spaces that distinguish themselves through their collaborative work-arounds to create alternatives to “Big Tech” computing. Their alternatives are made possible through practices and infrastructures of commoning to redress social and environmental problems. Their ways of making in collaboration demonstrate that “another possible is possible,” an expression made memorable by anthropologist Arturo Escobar in his reformulation of the alterglobalization slogan.2 Against the Big Tech Midas that transforms everything it touches into property, in minor histories of hacking we find explorations of computing for the purposes of community building. In this context, commoning refers to the practices for creating, sharing, and sustaining technologies of common interest—where “interest” is not to be taken in its economic or colloquial sense but by its etymological roots: inter-esse, an expression in Latin that stands for the “in-between,” a social tie and, for our purposes here, a bundle of ties between us and our technical objects.3 This insistence on sociotechnical ties is an important challenge to the entrepreneurial history of computing for demonstrating that its purported innovations came from elsewhere, through the substantial work of technologists made invisible.4 The computer enthusiasts who figure in this book have dedicated their lives to (informal) learning through (convivial) sharing, not necessarily by breaking computer security and exploiting remote networks. They have defended the need for community in a world where digital technologies have been designed and implemented to deny one’s ability to learn through collaborative exploration.
Much ink has been spilled in the examination of the negative effects of digital computing as supporting infrastructure of surveillance states, where hacking figures as a wild card in not only advancing but also interrupting ever-expanding circuits of dispossession and extraction under neoliberalism. In this book, we take a detour into alternative spaces where marginal projects counter dominant circuits with computing as convivial technology.5 Our starting point is not only that we can better devise alternative technopolitics through minor practices of computing, but also that contemporary hacker sensibilities, spaces, and projects open up fundamental questions regarding the persistent divides between the technical, the ethical, and the social in science and technology.6 To explore the implications of this debate from the perspective of a group of (mostly) invisible technologists, I pursue counterexamples of conviviality in common circuits7 of community spaces called “hackerspaces.” In this particular context, common circuits refer to the infrastructural conditions for informal learning based on open technologies.8 It is through active participation in these circuits that I describe (1) how technologists become hackers (through a process we call personification in anthropology9); (2) how hacker projects are spatialized through the creation of convivial places for community gathering; and (3) how these two interrelated processes culminate in the politicization of computing expertise at transnational scales. As I explain in what follows, these three analytic frames represent the core components of the book.
Common in Computing
From efforts in the 1950s to break free from the caste of computer priests to the cross-pollination of the countercultural movements in the 1960s and 1970s, we find key experiences through which phone and computer networks came to be used as a means for not only war and bureaucratic governance but also community-making. From the first compilers and programming languages to computer hardware kits and software-sharing collectives, early discourses and practices of liberation constitute one of the aspirational legacies of contemporary hacker circuits. To probe for origin stories, one could start in many places, anywhere from the assumed beginnings with historical fragments of Al-Khwarizmi’s work on algorithms in Baghdad around 820 CE; René Carmille and his détournement of Nazi punch cards and IBM sorting machines during the occupation; or Britain’s secret code-breaking Bletchley Park “women computers” and the trials and tribulations of Alan Turing. For our purposes, however, it is more urgent to begin with what anthropologist Michael M. J. Fischer has called the “cultural switching points”10 through which digital technologies are peopled in alternative circuits. It is by engaging computing as if people mattered that we find technologies at the scale of everyday human and computer interactions. This approach, we must say, does not entail anthropomorphizing computers (as thinking machines) or reducing social matters to computational problems that have become trillion-dollar industries. Rather, what we need in order to break out of social and technical divides is to study (and foster) new forms of exchange to enable openings for the common between (and beyond) public and property relations. We can say to this effect with a good degree of methodological confidence that every starting point is arbitrary, but some are meaningful.
One of the well-known figures in the folklore of computerdom is computer scientist and mathematician Grace Hopper, an inescapable character, we could say, for fulfilling institutional and technical roles. Her often-quoted piece of wisdom “It is easier to ask for forgiveness than it is to get permission” would become an article of faith for hackers since it came to print.11 Hers is a tale of intimacy with computers and collaboration with fellow technologists that many hackers would identify as their own. Hopper’s biographer, Kurt Beyer, observed that the celebration of “Amazing Grace” has led, unexpectedly, to a misinterpretation of her contributions. Rather than rebellious, Beyer notes, her work was defined by her collaborative and gregarious nature. Hers was an incredible capacity to overcome gender divides and break into the high priesthood of computing in the 1940s.12 Despite emerging disputes over patents and prior art concerning early computer technologies, informal sharing was common at the time. Early circuits of information exchange were sustained by the circulation of technologists working on similar problems, but they were hardly open by any stretch of imagination. Women mathematicians had to work hard around established men-identified engineers with technical work-arounds of their own. Hopper, for example, gifted engineers with compiler programs that liberated them from having to use machine code to instruct commercial and military computers. They could finally use “natural languages” for creating a common pool of computer programs and calculate, among other “domestic” applications, precise bomb trajectories (including atomic ones). Software was not yet seen as an “original expression” or “innovation” in the state of the art to be enclosed, respectively, with copyright titles or patents.13
Another inescapable point of passage in hacker histories is the one that marks the transformation of the technopolitical landscape of computerdom with the conversion of a disability into a special ability. By the end of the 1950s, Joe “Joybubbles” Engressia is known to have come across an important finding that granted him the honorary place of pioneer in the art of exploration of telephony known as “phone phreaking.” Joybubbles was born blind but gifted in his ability to hear, memorize, and reproduce sounds within a wide range of frequencies by whistling. Since childhood he had an obsession with the telephone as a medium of exploration, where he sought refuge from painful memories of domestic violence and sexual abuse. Joybubbles is known for having figured out that by whistling a 2,600-hertz tone, he could unlock long-distance calls. This discovery was also made by electronics hobbyists who proceeded to build a device to substitute for his whistling, the famous Blue Box, an exciting gift widely circulated among phreaks to generate the standard and maintenance tones of the phone system. In his comprehensive history of phone hacking, engineer Phil Lapsley calls our attention to one of the fundamental aspects of this scene: the communal experience in which the telephone came to represent the very embodiment of curiosity as the network was transformed into a site for exploration and discovery, as well as community-based learning.14
Phone phreaking is of particular interest to us for pinpointing one of the earliest hacker circuits through which technical knowledge and autonomist values started to converge to form a network-shaped commons. As noted by legal scholar Fred Shapiro, one of the first negative occurrences of the term hacking as a “computer-mediated crime” is intimately linked with phreaking. In 1963 the MIT student newspaper The Tech published a front-page article titled “Telephone Hackers Active, Services Curtailed.” It described the usage of MIT’s PDP-1 computer for “war-dialing,” or traversing lists of phone numbers to identify modems on the other side of the line. The professor who was responsible for the phone system is alleged to have said, “We do not have too much trouble with the boys; we appreciate their curiosity.” Commenting on the case, the article’s author also quotes an “accomplished hacker” who voiced a shared sentiment at the time: “The field is open for experimentation.” Indeed, it was all open for experimentation within the parameters of exclusivity through which the boys exercised their privilege.15
One year before the somber event of his assassination in 1968, Dr. Martin Luther King Jr. set the tone for the upcoming generation of hackers who inherited the countercultural attitude of the phone phreaks. “We must rapidly begin the shift from a thing-oriented society to a person-oriented society,” Dr. King urged while signaling to the space where commoning thrives. “When machines and computers, profit motives and property rights are considered more important than people,” he said, “the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered.”16 As we reached the peak effervescence of the late 1960s, phreaking was taken further and beyond (gendered) technical exploration and youth pranksterism. The hippie movement was radicalized by the yippies (with their politics illustrated by their flag: a black background with a red communist star and a green cannabis leaf)—and new types of restricted expert knowledge were put into circulation and promptly included in the hacker toolbox. As part of a whole series of circumvention techniques for liberating things from their proprietary modes of existence, lock picking was taught alongside numerous techniques of phreaking in the Youth International Party Line (YIPL). An important newsletter of the movement, which led in the subsequent decade to the formation of the long-lasting 2600: The Hacker Quarterly, YIPL reclaimed phone hacking with an overt political orientation. It represented not only a community for the insatiably curious and technically inclined but, most importantly, a venue for engaging the antiestablishment. Jason Scott, an organic historian of the computer underground, remarks that the newsletter stems from the integrated circuit of radicalized elements of the counterculture: the autonomist New Left of the 1960s.
The wild shake-up in the 1960s would prove pivotal for convivial experimentation and critique of computing as one of the key symbols of bureaucratic and instrumental governance. Computing was still infrastructural, but it would soon become a locus of sociotechnical change with the extension of information, knowledge, and power to nonexperts outside academic and governmental agencies.17 Media scholar Charlton McIlwain reminds us that before the 1980s, there were only two places nonexperts could have access to computers: research laboratories and workplaces. Yet, neither of these spaces was welcoming to Black and Indigenous students and professionals.18 Curiously, these places would eventually breed a culture of laser-focused computer enthusiasts with unprecedented degrees of openness toward information sharing among themselves. It was a generation of undisciplined technologists obsessed with technical puzzles but also quite oblivious to the massive industrial-government-military complex—as historian Nathan Ensmenger describes in his historical reconstruction19—where they found the conditions to cultivate skills by circumventing restrictions to mainframe and minicomputers. As early as 1976, computer scientist Joseph Weizenbaum would identify hackers as an “international phenomenon,” the so-called computer bums.20
Phone and computer hacking would find fertile ground in communal experiments of the mid-1970s. These experiments of a social and technical nature became well-known for popularizing new ways of imagining how communication technologies could also be designed (within limits) for convivial purposes. Instances of commoning in this period included “homebrew” projects to convert computer networks into autonomous means for exchanging information of local interest.21 Following social critic Ivan Illich’s plan for deschooling society with distributed “learning webs,” one famous project involved the creation of a small network of public terminals in the San Francisco Bay Area called Community Memory. As a “peer-matching network,” it was rather simple: users identified themselves “by name and address and describe[d] the activity for which [they] sought a peer. A computer would send [them] back the names and addresses of all those who had inserted the same description.”22
Another minor project with major consequences consisted in imagining and implementing a “computerized bulletin board” for exchanging messages between members of the Chicago Area Computing Hobbyist Exchange (CACHE). In his prehistory of social media, communication scholar Kevin Driscoll dubbed these experiments “improvised workshops” where social technologies were first devised.23 These workshops enabled outward-expanding circuits where parts, kits, manuals, tricks, tapes, and program listings circulated as gifts to create ties, but they would soon be challenged with commercial enclosures in a not-so-distant future. Think of what the Apple I computer kit would become as soon as it abandoned its hobbyist affordances to serve as a launchpad for a soon-to-be global business. Commoning was made possible through the diversion of chips and parts from the military-patronized local electronic and silicon components industry but also by refocusing expertise from military-industrial affairs to an uncharted experimental, technopolitical terrain.24 From industry and university-backed digital infrastructures to convivial spaces and back, reciprocity ties would break as hacker technologies left their common circuits to enter the register of the nascent personal computer market. The “personal computer revolution,” anthropologist of technology Bryan Pfaffenberger would claim, was anything but revolutionary.25
A sweeping transformation in the global political economy would take place in the 1980s, not by chance but by ideology, with the rise of commercial software and personal computer businesses. Shaped as much as it was by intellectual property (IP) with rapidly expanding enclosures of “immaterial goods,” this was also a period of consolidation of international computer hobbyist scenes that had branched out from their radio amateur and phone phreaking ancestors. As IP became the new currency in the corporate game, computer businesses started to have more frequent and unpleasant encounters with hackers—soon to be recast as “computer criminals” with an unstoppable if not “destructive” drive to know more and more about digital technologies. For the emergent industry, hackers represented an inestimable source of expert labor and, simultaneously, an antagonistic force to be combated for their practices of software sharing, social engineering, and systems penetration. Meanwhile, a well-sheltered transnational community continued to carry forward a rather elite circuitry of technical exchange, fostering a software commons by circulating code for the collaborative development of the Unix operating system. An up-and-coming generation of hackers were cutting their teeth at the console by exploring security flaws and internal secrets of various commercial “flavors” of the system.
The ancestral hacker behind the first implementation of Unix, Ken Thompson, would tellingly dedicate his prestigious Turing Award Lecture (1983) for the Association for Computing Machinery to the rising problem of “trust” in computing. After demonstrating instruction by instruction how to deceitfully introduce Trojan horses at various levels of computation—that is, from “userland” applications all the way down to the microcode that runs in a processor—Thompson addressed the media-saturated activity of teenage hackers. “There is an explosive situation brewing,” he warned. “On the one hand, the press, television, and movies make heros [sic] of vandals by calling them whiz kids. On the other hand, the acts performed by these kids will soon be punishable by years in prison.”26 It was as if two major genealogies of hacking ran neatly in parallel up until this point, where they crossed paths, colliding with quite distinctive practices and understandings.27
The shadowy figure of the “dark-side hacker” would take strong hold of the cultural imagination by the mid-1980s, engulfing the computer common with mystical clout. One of the earliest collectives to gain media attention, the same Thompson originally condemned, was the 414s, a teenage group credited with propelling the passage of the Computer Fraud and Abuse Act (1986) in the United States due to (mostly) harmless visits to computer networks with lax but standard security at that time. In fact, a myriad of hacker collectives from various countries, such as Germany, the Netherlands, England, and Australia, were quite active in this period. Clustered in places where access to personal computers and modems was extended to the middle classes, these collectives were quickly stigmatized as “rebellious youth,” organized in gangs to wreak havoc behind the keyboard—a very popular but partial and misguided representation.
In the mid-1980s, tech journalist Steven Levy published Hackers: Heroes of the Computer Revolution as a corrective to widespread vilification of the community. In his modern classic, Levy narrates the earliest and most compelling stories of hardware and software hacking at MIT, the Homebrew Computer Club in the San Francisco Bay Area, and the very first steps of the computer gaming industry. The book circulated rapidly among hackers, created instant fans, bridged several generations, and gained international acclaim through numerous translations in subsequent decades. One of its lasting contributions, however, was Levy’s elaboration on the “hacker ethic” as a technopolitical ethos that was not codified but could be abstracted from the practical experience of early hackers. According to the tech journalist, core tenets characterized hackerdom, all of which were either based on the computing commons or represented an overt commitment to maintaining one. The first and most important was identified as the “hands-on imperative,” with the idea that “access to computers or anything that might teach you something about the way the world works should be unlimited and total.” From this cardinal orientation followed all the others—that “all information should be free”; that “authority” and “centralization” should be mistrusted; and that “art and beauty” were digitally possible, because computers “can change [lives] for the better.” To date, one of the most inspirational and yet most controversial tenets is the one that concerns socioeconomic and cultural belonging: “Hackers should be judged by their hacking and not,” Levy concludes, “bogus criteria such as degrees, age, race, or position.”28 As early hacker circuits were rerouted with a wide range of technopolitical orientations in subsequent decades, the “hacker ethic” would be problematized, revisited, and rewritten across contexts. 
There was no unified ethic but rather a set of context-sensitive “genres” and moral economies of hacking with distinctive commitments within quite specific contexts of invention—as aptly demonstrated by anthropologists Gabriella Coleman, Alex Golub, and Christopher Kelty.29
After the first half of the 1980s, the moral valence of hacking would tilt toward its identification with devious, addictive, and criminal behaviors. The controversial term cracker was put into circulation to draw a distinction between “malicious attackers” and the ancestral hackers that Levy first brought to public attention. Collectives that shared commercial software and video games further popularized the notion by circulating “cracked” versions with copy protections removed.30 We find in this period droves of young technologists who would become police cases as systems penetration became widespread. Raids became commonplace, targeting individual hackers in the United States, Germany, the UK, and Australia. By the late 1980s, police cases were promptly converted into veritable thrillers of computer exploitation told and retold by journalists—such as sci-fi author Bruce Sterling in The Hacker Crackdown—as well as hackers themselves in their memoirs.31 Like Kevin Mitnick, who served an unfair sentence after being labeled the “dark-side hacker,” and Robert Tappan Morris, who unleashed one of the first self-replicating internet viruses (worms) and brought down machines on ARPANET and MILNET, many hackers of a much lower profile made it into a flurry of publications, depicted as “bad actors” with addictive personalities. In retrospect, these were romantic times of network exploration that were met with overbearing laws against “computer fraud,” followed by the consolidation of the profitable business of computer security.32
By the early 1990s, the situation would change dramatically from the early days of the computer underground, as security consultants Dan Farmer and Wietse Venema suggest:
This seems to be the popular image of a system cracker. Young, inexperienced, and possessing vast quantities of time to waste, to get into just one more system. However, there is a far more dangerous type of system cracker out there. One who knows the ins and outs of the latest security auditing and cracking tools, who can modify them for specific attacks, and who can write his/her own programs. One who not only reads about the latest security holes, but also personally discovers bugs and vulnerabilities. A deadly creature that can both strike poisonously and hide its tracks without a whisper or hint of a trail. The übercracker is here.33
Expressions of computer-aided superpowers of “übercrackers” have appeared in the public discourse ever since. Security, cryptographer Bruce Schneier reminds us, is often a fear sell. What is curious to note, above and beyond the media circus, is how the magical power of hackers—here understood anthropologically as a set of persuasive techniques that create a (symbolic) separation between magicians and their audiences34—has been widely yet covertly utilized. We find strong evidence in the do-it-yourself (DIY) media of the “hacker underground”: the self-organization of a commons around bulletin board systems (BBS) where hackers were not only socialized but also shared studies and findings with other magicians. It was through the participation in “board culture” that the underground would thrive through commoning that was considered, due to the mad race for software commercialization, a serious threat for IP-based business models.
On the flipside of this rather exclusive collective experience, electronic networks also had many harmful aspects, such as the idiotic (in the etymological senses of both “unskilled, rude, simple” and “private, unprofessional, ordinary”) celebration of text files on how to prepare explosives and other deadly weapons, known as “anarchy files,” as well as vast streams of misogynistic and racist materials.35 As Charlton McIlwain reports from his study of the experience of Black Americans in online spaces: “BBS, Usenet, The Internet. Yes, they were creating a whole new world. But it wasn’t a question about if and when racism would rear its ugly head in this new world. Racism, fueled by anti-blackness, was already there when it began.”36 It should be well understood that in the minor histories of computing the common is no panacea, but rather an opening to computing otherwise that deserves our attention.
Commoning as Method
My own point of departure in research was an origin myth (“there once was a time when software was freely shared among hackers . . .”) crafted with a mastery of persuasive magic to interpellate the disquieted hearts and minds of computer aficionados in Porto Alegre, a peripheral town of a peripheral country. Digital technologies were hard to come by, so everything we could put our hands on, we shared. Computer parts circulated as much as games, applications, manuals, technical magazines, and books. Very few of us had access to computer machines throughout the 1980s, and by the late 1990s we were still gathering in a friend’s house, taking turns at the keyboard, browsing the internet for the first time. Running a BBS was a luxury that only the richest had the opportunity to experience: phone lines were traded as valuable property before the first wave of large-scale privatization in Brazil in the mid-1990s.
During the terminal phase of the military regime in the mid-1980s, a market reserve for computers was implemented through the Política Nacional de Informática (National Informatics Policy), and we witnessed the growth of a local industry with a vibrant homebrew scene. Social studies of computing of this period have emphasized how much it all depended upon cannibalizing computer architectures from abroad (in this case, through reverse engineering, according to computer science scholar Ivan da Costa Marques37). However, the most original experiments included creating our own flavor of Unix, called SOX, from scratch and assembling our first digital computers based on minicomputer architectures of the 1970s, such as the Patinho Feio—which made it into the archive through the work of historian Márcia de Oliveira Cardoso.38 The sad part of this minor history is that it has mostly disappeared into the oblivion of perishing feed-paper printouts and design drawings in the basements of the first computer engineers.
By the late 1990s a new convivial circuit began to be routed through software sharing—known in our circles as Software Livre39—with technical collectives that were formed through the unlikely encounter between social movements and software engineers, government staff, university researchers, and enthusiasts of all ages (though not all genders and socioeconomic classes) to rethink what digital commoning could be from a peripheral standpoint. We started to self-organize computer users’ groups to share what we learned through whatever means we had—say, a faded Xerox copy of a copy of a Unix manual.40 Our nascent community imagined itself through unlikely connections that were forged with the rise of the newest Left in Porto Alegre at the World Social Forum.
Immersed in the technopolitical effervescence of the early 2000s, we found each other through but also against corporate computational orders in pursuit of collective exploration and self-training by downloading at the lowest speeds, testing and studying, or writing and sharing software to the best of our knowledge and infrastructural capabilities. We were certainly not afraid of disassembling technical objects for the purposes of learning. Little did we know that we were embarking upon a long decade of technical activism, articulating political debates and struggles for “digital sovereignty” in all-out opposition to the transnational IP regime. Our genealogy was not that of the Northern California libertarian culture—far from it. We were much closer to the project of political self-determination that we identified in the Zapatistas’ reappropriation of communication technologies in Chiapas, Mexico. “Hacktivism” became a key symbol that stood for the solidarity between hackers and social movements with powerful examples of direct action coming from groups like the Electronic Disturbance Theater and the Independent Media Center—set against the socioeconomic evils and upheavals of corporate globalization. The discourse on software freedom interpellated us, but different understandings of “freedom” were at stake in this historical process. The musician, technologist, and main agitator of the Quilombola cultural center Casa Tainã, Antônio Carlos TC, saw the urgency of this digital project early on. “Free!” he once interjected. “For those of us who were enslaved, the question of freedom gets immediately our full attention.” And so we persisted (under)commoning41 beyond the hegemonic circuits of computing in the Euro-American world.
Alterglobalized circuits of software sharing have one of their starting points, curiously enough, in a conventional research laboratory under rather ordinary circumstances in the Global North. By the mid-1980s, the project of Free Software (conceived simultaneously as a software development practice, a set of legal devices for circumventing IP restrictions, and an international technopolitical project) came into existence through the common experience of a “moral (and technical) breakdown”—that is, a breakdown in our naturalized, intentional (but limited) relationship with the world.42 As far as the origin myth goes, the project started at the MIT Artificial Intelligence (AI) Lab with a malfunctioning printer. We are told that a prolific operating system programmer, Richard Stallman, wanted to use his engineering prowess to fix the issue but was prevented by having his access to the code that controlled the printer denied. As we learned from Levy’s depiction of the “hacker ethic,” denial of access went against a common practice of information sharing among peers.
The annoyance with a failed network printer gave Stallman the opportunity to realize that the “original affluent society” of software sharing—to borrow anthropologist Marshall Sahlins’s felicitous expression—was rapidly changing around him. From a minor technical disruption, Stallman diverted his attention to one of a more serious and profound moral nature: having his access denied represented a major interruption of the common circuit to which he belonged. From his vantage, the denial represented a form of betrayal. “To be a hacker at the AI Lab meant that your ethical code was driven by progress of the computer code—it was wrong and almost evil to keep code and computational resources for yourself. Hackers respected one another because they were good at what they did, not because they had titles and money,” he would protest alongside another ancestral hacker, Larry Wall. “This led to profound conflicts with other ethical systems, particularly those who gave supremacy to individual ownership of ideas. From the hacker perspective, to keep an idea or a new program to yourself was the same as a spit in the eye of everyone else.”43
Common Circuits is a product of these experiments with commoning across unequal contexts, which I partook in firsthand at the intersection between computing and anthropology through numerous border crossings. After living through and studying technopolitical cultures of Software Livre in Brazil, I found myself working closely with “cyborg anthropologists” in California to examine moral economies of science and technology.44 At the crossroads of distinct technopolitical histories, I came to inhabit a position between cyborg and cannibal traditions. Cannibal intellectuals in Brazil historically had an attitude of non-submission but also, in the words of literary critic Haroldo de Campos, of creative expropriation, recontextualization, and transformation of the European canon.45 I realized over time that being tactically placed in between these traditions gave me the opportunity to study instrumental forms of economic rationality while participating in emergent forms of convivial relationality. I found in convivial projects an interruption, albeit temporary and localized, of the radical monopoly of “Big Computing,” thus offering a point of departure from the abstract utilitarianism of computing (as a discipline) to the practice of hacking in self-organized collectives (as a generative indiscipline).46 At this crossroads, I set out to examine practices of commoning through active participation in the open technology circuits that I document in this book.
The interface between cyborg and cannibal anthropologies helped me realize a broader potential for the discipline through their persistent indisciplines: fostering collaborative ties for conceptual work across disciplinary borders, staking the importance of situatedness with the recognition of partiality in our knowledge practices, and opening up the field for multi-locale research of emergent technoscientific domains of practice. What cyborg and cannibal anthropologists investigated through renewed ethnographic engagement was our shared sociotechnical condition that could not be included in the disciplinary agenda (an anthropology without the “primitive”) until more recently. Informed by this line of work, I dedicated myself to studying the transposition of the economy of software sharing to the domain of hardware design and fabrication, a very unlikely process, we must say, taking place in community spaces for computing worldwide. This problem led me to study the passage of a moral economy of software to hardware commoning under the guise of Open Hardware.
The more I learned about contemporary, self-organized convivial spaces for computing, the more evident it became that active participation in technical exchange was a fundamental condition for the study of commoning. For the purposes of multi-locale ethnography, I followed technical and political activities in situ at hackerspaces, as well as in motion as I traveled to conferences as a participant and, at times, as an organizer, between 2013 and 2017. During preliminary field research, I identified the Pacific Rim as an area of key importance for concentrating closed circuits of migration of IT professionals and exchange of open technologies. The hackerspace network in the Pacific Rim embodies a technopolitical location of concentrated Western investment and projection, helping to reconfigure geopolitical dynamics. To study community spaces through their placemaking practices, I drew from an understanding of ethnography as “translation circuitry”—that is, “a circuit whose input is the ethnographic, multi-locale experience and whose output is an ethnographic representation that is constituted by symbolic links to other forms of representation.”47
The criteria I applied for selecting community spaces in the Pacific Rim had to do with the importance of their projects vis-à-vis the broader hackerspace network. Not every community space participated in the social life of the network; each location created specific conditions for sharing wildly different sociotechnical experiments. There were substantial contrasts in terms of how influence and prestige were distributed across locations, projects, and members. Differences, more often than not, masked underlying frictions of class, ethnic, and gender identifications and belongings. Whereas many spaces represent regular hobbyist clubs without much of an active connection to broader transnational circuits of exchange, expressing a much longer genealogy that branched from the first ham radio clubs to present-day robotics clubs, the ones I describe here were characterized by a strong influx of open projects and cosmopolitan technologists that became known well beyond their communities of origin.
In preparing this book, I drew primarily from projects and trajectories of active community members in San Francisco, Shenzhen, and Tokyo, in addition to several local and international hacker conferences. In what follows I will explain why placemaking matters for the contemporary reinvention of hacking as a cosmopolitan practice of commoning.
Common Spaces and Places
“A hackerspace,” I was told by one of the founders of Tokyo Hacker Space, “relies on three variables: its space, its people, and its projects.” Through its people, a hackerspace becomes a site for socialization and self-directed learning with public performances of technical knowledge. Through the initiative of its members, a hackerspace expands beyond geopolitical boundaries, hosting regular gatherings in bustling urban centers. Through member projects in circulation, a hackerspace travels far and wide through tangible and intangible media, articulated in instances of virtual and actual exchange. Based on its physical space and equipment in place, a hackerspace affords certain projects while constraining many others, serving as a magnet as much as a centrifugal force for the uninitiated. Some places are well-stocked with donations from the most powerful tech companies on the planet, while others are maintained mostly from the means of their members at the periphery of peripheral circuits of computing. Through their spatial and technical affordances, hackerspaces nevertheless constitute simultaneously imagined spaces and convivial places. They are conceived by their members as heterotopic spaces for reclaiming information technology and sharing technical expertise—which is to say, they are spaces of difference in the context of corporate circuits for computing in educational and professional spheres.
One of the origin stories of the global hackerspace network starts with the publication of a blueprint for new computer clubs: the “hackerspace design patterns” delivered by two German hackers, Jens Ohlig and Lars Weiler, at the influential hacker gathering Chaos Communication Congress (24C3) in 2007. The hackerspace design patterns provided the general orientation not only for organizing places of conviviality but more importantly for avoiding conventional separations between designers, builders, and users. The productive capacity of a community space is perceived by network participants to have central importance in creating social ties: hackerspaces that are not active are deemed uninteresting and undeserving of the title. Through the circulation of (privileged) people and (open) projects, hackerspaces take on a life of their own. First encounters often involve asking what someone is working on in order to create an opportunity for demonstration of a technique or technology of shared interest. The creation of a “pattern language” for hackerspaces represents an important contribution, identifying challenges and proposing infrastructural solutions for facilitating exchange: for example, infrastructures of “radical openness”; horizontalist conflict resolution; independence from governments and companies through membership-based funding and donations; sustainability through active volunteering; “creative chaos” for enticing practices of hacking as expressions of poiesis; and distributed responsibility for the governance of the space through consensus-based decision-making.
The hackerspace-design-patterns catalog included key issues of concern in the creation and maintenance of new spaces (“sustainability patterns”); autonomy from universities, companies, landlords, and neighbors who may not be happy with the space (“independence patterns”); sustenance of active interest and participation as well as renewal of the membership body (“regularity patterns”); governance and interpersonal conflict issues (“conflict resolution patterns”); and finally, the basic ingredients for keeping the members running wild with their projects, such as a good supply of mate-infused48 soda drinks and materials to hack on (“creative chaos patterns”).
The orientation for creating and sustaining spaces through “design patterns” has been widely debated and has given rise to important derivations, such as “feminist hackerspace design patterns,” drafted after numerous events of harassment against gender minorities in the network. In her study of community spaces, communication scholar Sophie Toupin analyzed how openness, one of the underlying principles of the design patterns, was problematized through the creation of feminist hackerspaces.49 As an anthropological intuition would have us suspect, the original design patterns were key in instituting the imagined global network, but their application was anything but straightforward, given insurmountable differences in political histories and available expertise and infrastructure, in addition to events of discrimination that ignited multiple forms of friction and conflict. Over time, hackerspace members were faced with the hard lesson that commoning is much easier said than done in the context of rampant individualism, sexism, dispossession, and socioeconomic precarity that intervenes in communitarian experiments worldwide. The sociotechnical foundations upon which common spaces were built were highly disparate from the start. Community spaces in global cities, for example, were much better off than others in their capacity to sustain a fragile common, based on the relative socioeconomic privilege of their members. The distribution of computing skills was highly unequal as well, mirroring divides of gender, race, ethnicity, and socioeconomic class. Despite these profound challenges, the design patterns served as a generative guide for implementing learning webs in community-driven, self-organized educational spaces.
The importance of studying and contributing ethnographically to the informal, marginal, and self-organized spaces for computing has to do with the fact that they are responsible for the spatialization of hacking. This is an emergent process that deserves careful consideration as commoning is practiced in more perennial spaces—not to be identified, we must say, with the process that political theorist Wendy Brown has identified as the “self-cannibalizing tendencies” of the so-called sharing economy that involves, in fact, very little sharing but aggressive rent-seeking that drives mass dispossession from Silicon Valley outward.50 To demonstrate this crucial distinction, we will start our exploration of the hard challenges of commoning in the highly exclusionary, high-tech hub of the San Francisco Bay Area. We will examine the social experiment of radical openness and horizontalist governance at Noisebridge, a hackerspace that is globally known both as an inspiration for community building and for its community-unraveling contradictions (chapter 1). From its extensions through the circulation of hackers and their projects, we will follow the transnational circuits that inform central sites of digital production in Southern China. We will turn to the experience of Chaihuo and the hacker group Shenzhen Do-It-Yourself (SZDIY) to discuss how convivial circuits can interrupt entrepreneurial pressures in highly unequal contexts (chapter 3). We will see how this community space became relevant to the global network by creating a space for self-organized local collectives through cosmopolitan linkages with itinerant hackers and Open Hardware businesses to convert gifts into commodities. Finally, we will follow the unlikely response of Tokyo Hacker Space (THS) to the Fukushima disaster (chapter 5). An otherwise ordinary computer hobbyist club, THS rapidly became a critical site for the circulation of open technologies and technologists.
At the onset of the nuclear disaster, THS became a laboratory for serious bricolage of open radiation-monitoring devices, thus serving as an influential example of the technopolitical alternatives that thrive in autonomous spaces.
Based on the ethnographic engagement with these three community spaces, I caution readers against the recurrent error of equating entrepreneurial apples with hackerspace oranges—which is to say, against misreading our examples as entrepreneurial circuits of “digital innovation.” An important and frequently asked question concerns the classification of any particular space: is it a hackerspace, a makerspace, or a fablab? Within the global network, each space I describe draws from local genealogies in political and computing histories as they intersect with transnational circuits of open technologies. Variants of “hacking” and “making” can be as culturally and technically distant as a politicized, autonomist convivial zone or a coworking space for “flexible” and precarized IT workers; a space to match computer programmers with investors to create start-up ventures; or merely a shared place to meet, functioning as a hub for expatriate computer aficionados in global cities. Hackerspaces are particularly important in this debate because they are characterized by a form of indiscipline that consists in building social and technical ties for exploring digital alternatives. They are not innovative in the industrial sense of the term, but they can, at times, be anti-innovation.51 They often interface with entrepreneurial circuits and give rise to companies and services of all sizes, but they are not built for market purposes. This is an important distinction that is seldom observed in the existing literature. The hackerspace network is responsible for spatializing the practice of hacking in distinctive ways, but also for shaping conditions for technologists within and beyond the Euro-American axis to be cultivated as hackers. And this is the subject of another core component of this book: the personification of computing expertise in spaces for computing otherwise.
Personhood through Commoning
If it is between the public and the private spheres that we identify the common, it is between the personal and the technical that we find the moral circuitry through which technologists are formed. When it comes to practices of commoning that extrapolate market logics, we are reminded by Marcel Mauss, founding figure of economic anthropology, that it all comes down to mixtures: persons and objects, persons-in-things, and things-as-tokens-for-persons create and sever social ties, create and re-create sociotechnical arrangements that escape utilitarian justifications for why people engage in exchange practices. Persons and things come “out of their spheres,” as Mauss poetically suggests, “to mingle.” And this is “precisely the exchange and the contract” that creates a social form distinct from established institutions of computerdom. The reciprocal dynamic that makes this possible is that of technical objects as gifts, circulating to create pathways (“traces”) linking hackerspaces.
In the common circuitry of the hackerspace network, technical objects are made and remade, but so too are persons through their trajectories of circulation and their practices of exchange. Technical personae are cultivated in specific political histories, made through particular sociotechnical trajectories, individuated in relational arrangements. In the debate over what makes technologists hackers, it is my contention that commoning is foundational for self-cultivation in and through computing—one of the most important means for educating one’s attention through active participation52—thus serving as an interpretative key for examining the formation of technical personhood.53 It is in this particular sense that engaging personal trajectories can help us illuminate intricate technical and experiential dimensions of hacking. It is through the study of commoning that we get to understand how technologists are cultivated otherwise.
Despite the moral demands for cultivation in spaces for community-oriented learning, little is currently known about the idioms of ethical reasoning54 of computer experts beyond the liberal tradition deftly described by anthropologist Gabriella Coleman.55 Despite the existence of a solid body of literature on the historical foundations of hacking in the Euro-American context,56 substantial work remains to be done on the cosmopolitan dynamics of the alterglobalization of computing expertise, the process responsible for spatializing the practice of hacking with the emergence of expert collectives beyond the Euro-American axis. While these collectives are still entangled in many ways with their counterparts in the hegemonic centers of digital development, their trajectories are often marked by contrasting and unequal conditions for cultivation of (noncredentialed) expertise. As such, their technopolitical histories describe a different orientation toward questions of openness, autonomy, and conviviality. It is in these distinctive contexts that computing expertise is cultivated otherwise.
Common Circuits describes personal narratives of technical cultivation to give us access to the ways in which hackers reflect on their moral experiences. By retracing the trajectories of three influential hackers—Gniibe (Japan), NalaGinrut (China), and Mitch Altman (US)—we get a better sense for how moral and technical orientations are articulated through practices of commoning. Here I invite us to redirect our attention to an important but often forgotten dimension of computing expertise by asking what the moral sources for the practices of commoning are. What binds technologists in moral economies beyond corporate circuits of computing? Each personal trajectory we will trace brings to light the distinctive experiential grounding of hackers as they “learn to learn”57 through the exchange of open technologies. We describe the constant search for community through which Mitch, one of the founders of Noisebridge, became one of the global spokespersons for the hackerspace network (chapter 2). In the tech pilgrimage site of San Francisco, Mitch promoted community as an antidote to the never-ending gold rush where computing stands for the ultimate symbol of “disruption,” leading to dispossession and displacement for those who are not connected to computing and financial industries. His trajectory is one of political cultivation in queer spaces, communes, and happenings from the 1970s to the present. We turn next to the trajectory of NalaGinrut (chapter 4), Chaihuo resident hacker and one of the Shenzhen Do-It-Yourself (SZDIY) founders, who combines the practice of software and hardware sharing with questions of prosperity, familial commitment, and community-making in the cosmopolitan center of Shenzhen. In NalaGinrut’s experience, we find the articulation of hacking as a spiritual calling at the crossroads of neo-Calvinism and neo-Confucianism in postsocialist China. 
His work is perceived by the local community as fundamental for bringing young Chinese engineers to participate in common circuits. Finally, in the technical landscape of Tokyo, we engage with the trajectory of Gniibe (chapter 6), a veteran hacker whose software and hardware development practices are interpreted curiously through the Golden Rule in its negative form: hacking as a moral imperative that prevents expert technologists, such as himself, from exercising power over others, or as he puts it, “hacking for doing no harm.”
As we retrace these three trajectories with their moral sources for ethical reasoning and self-cultivation, we position ourselves to better identify differential conditions of possibility for commoning. In these three cases, hacker personhood is made body and skill58 through contrastive spiritual orientations, ethical struggles, and political relationalities as much as by their common sense of intimacy with computer architectures, programming languages, network protocols, and digital tools. Through active making and mixing of technical persons and objects, the common opens up as an interstitial domain of exchange in the midst of institutional and corporate circuits. Narrative articulations of what it means to be a hacker represent the output of the (context-dependent) transformative circuits we abstracted from individuated trajectories. From the joint exploration of the spatial and personal dimensions of hacking, we find the last process that we will examine in this book: the formation of a technopolitics where hacking stands for the practice of hacking ties.
Technopolitics Otherwise?
As the 1980s came to an end along with Cold War–era “closed computing,”59 a watershed event, the Galactic Hacker Party—also known as the International Conference on the Alternative Use of Technology (ICATA ’89)—brought together hackers from various generations, nationalities, and political orientations for the first time.60 In ecstatic anticipation of the gathering, the Dutch techno-anarchist collective Hack-Tic disseminated a call for participation in their zine of the same name. “During the summer of 1989 the world as we know it will go into overload,” the organizers promised. “An interstellar particle stream of hackers, phone phreaks, radioactivists, and assorted technological subversives,” coming from most corners of hackerdom up to that point, “will be fusing their energies into a media melt-down as the global village plugs into Amsterdam for three electrifying days of information interchange and electronic capers.”
And so they did. Over one thousand participants formed a communitas of technical exchange accompanied by debates of pressing political matters. Not only was the scale unusual for hacker conferences at the time, but so were the internationalist angle and political sharpness of the debates. Telephone uplinks were established for hackers, activists, and academics interested in examining the political use of information technologies in capital cities, such as Nairobi, Moscow, and Wellington. Computer networks were wired in a public hacklab for the exploratory delight of the conference participants. Ancestral figures of the Homebrew Computer Club, John Draper and Lee Felsenstein, were guests of honor and took the stage to share their experiences in phone phreaking and community-driven networks. On the table was the potential of digital technologies to be used in democratic processes, but also for mass surveillance, long before we got to witness the demise of cyber-utopianism on both counts. Ecological activists, in their turn, planted the seeds of an urgent debate that would mark future problematizations of the digital: the double bind in which many activists find themselves when defending the use of computer networks for mitigating environmental impact while contributing to environmental pollution through the consumption and disposal of computers and peripherals. Particularly fitting for the hacker party was Terry Gilliam’s film Brazil, whose screening was meant to serve as a warning, the co-organizers remarked, about a potential future with “too many machines and too little humanity.”61 Fascist software (“Naziware”) was controversially brought to the stage, examined, strongly objected to, and promptly rejected. The contemporary anarchist classic bolo’bolo (1983) in its Dutch translation was presented as well, but as a miraculous antidote to fortify the imagination of alternative futures to capitalist and Soviet modernities.
The publication sold out amazingly fast as excerpts were read aloud over the PA system. And a historic, heated panel was held on the ethics of information disclosure involving a co-founder of Chaos Computer Club (CCC), Wau Holland, and Pengo, a club member who had been ousted for selling information to the KGB that he and his friends had obtained from foreign computer networks. “Responsible handling of information,” Wau explained, “has enabled us to use it very effectively in a political sense.” He concluded with a reminder about their mission: “I think of the CCC as a public service instead of a secret service,” drawing an important distinction that would inspire scores of future hacker activists. At the end of the conference, the ICATA ’89 declaration was drafted to address the “deeply [disturbing] prospects of an information technology let loose by economic and political actors without democratic control and effective popular participation.” The declaration included a telling contribution from the University of Nairobi: “The current discussions on alternative uses of computer technology are guided by the same misleading assumptions that dominated the modernisation theories of the 1950s—particularly the notion of information technology as an independent variable.” After all that had been presented and discussed at the conference, how could computing be defined as an independent variable? The declaration proceeded with a prescient warning: “Computer technology is a dependent variable—its effectiveness will largely be determined by the existing social conditions. As a result, more computers means more global inequality.”62
We started our pathfinding in this book with the hint that we could conceive of hacking as a contentious variable with at least six decades of experimental history. If we were to take into ethnographic account the spatial and personal aspects of the practice, I suggested, we would be able to expand considerably our understanding of who counts as a hacker and what counts as a hack. From the postwar period to the present, we saw how computing was practiced and transformed into different objects of discourse, indexing different infrastructures, technopolitical practices, and moral experiences. The valuation of hacking shifted between the defense of the virtuous practice of computer exploration and the criminalization of the underground study of information security. Little did we know that, at the turn of the century, hacker tools and techniques would be vested in digital infrastructures of planetary scope: from (open) cryptographic technologies for secure communications to basic internet services with (mostly open) system software. In the process, hacking would shape-shift once again to become strangely profitable and desirable for markets and government contracts: amassing online vulnerability databases, creating independent mailing lists for sharing expert information (outside and beyond credentialed expert circuits), and bundling infosec tools in software distributions for the study and exploitation of computer networks.63 Barriers to entry into hackerdom were brought to the lowest point at the same time that, paradoxically, sociotechnical systems increased in complexity. 
The hacker underground was professionalized and contested, creating room in the spotlight for reactionary and rebel assets to appear: the oppositional figures of the state-sponsored hacker and the politicized hacktivist.64 Free and Open Source projects succeeded in popularizing their collaborative development tools and methodologies, becoming less combative as they provided their not-so-secret sauce to nascent internet companies, empowering infrastructures and research and development teams for the anti-common of Silicon Valley. The fierce opposition to “corporate lock-ins” of the early Free Software activists slowly gave in to the creation of what philosopher Pierre Dardot and sociologist Christian Laval called the “ersatz commons” or the “quasi-commons” of corporate Open Source projects.65 Between the booms and busts of the computer industry with rapidly expanding sociopolitical and ecological implications, “hacking” became a critical practice with indisputable urgency, for it allows us to see in clearer relief major interplays between the moral and the technical, the personal and the computational, the institutional and the convivial. Hacking became, in sum, central to our understanding of contemporary technopolitics.
In the upcoming chapters, we will retrace the common circuit of hackerspaces through what I call the alterglobalization of computing expertise, a scale-making process that re-creates the practice of computing within and beyond the Euro-American context. In the cosmopolitan trajectory of community spaces and projects, we find an important rearticulation of hacking that contrasts with well-established popular definitions of the practice as shadowy computer crime. To the standard economist’s shock, we encounter a worldwide range of practices that place in common technical objects, services, and infrastructures of high market value, where hacking consists in an attitude of openness that stands for, as philosopher of technology Gilbert Simondon notes, “respect for the work of the others.”66 By virtue of a hack, in this sense, we obtain backdoor access to a domain of moral orientations concerning shared technical objects under the charged symbols of “collaboration,” “autonomy,” and “openness.” More than a symbolic act, we will see how hacking generates sociotechnical ties through situated practices of commoning, performing an overlooked binding function that creates, sustains, and also dissolves technical, moral, and political ties. Hacking establishes, in other words, technopolitical connections of transnational scope among cosmopolitan technologists in community spaces but also severs ties with entities that are perceived to be adversarial to the purposes of collective exploration. Depending as much on its moral as on its digital circuits, open technical objects circulate widely to find collaborators, embodying invitations for practical engagement. Hacking ties technologists and technical objects, in sum, but it also serves to hack ties of institutional control to create and sustain common circuits. Ultimately, it is the pursuit of alternative technological futures that organizes the technopolitical formation we examine here.
In what follows, we will probe the noisy signals of spatialization, personification, and politicization of computing expertise to account for the hard challenges of sustaining the common that community spaces have struggled to create at global and yet partial scales.
Notes
1. The commons is most frequently identified as a politico-economic framework that stands as an alternative form of governance to markets and states, offering self-organized institutional arrangements for managing shared goods (Ostrom 1990). The common in its singular form, however, refers to the recent debate on commoning as a political practice that is not limited to the affordances of “natural resource systems” that Vincent Ostrom and Elinor Ostrom (1977) identify as “common-pool resources.” The theoretical move away from the definition of the commons as resource is identified as particularly pressing for recuperating the meaning of commoning practices that create de facto politico-economic, technical, and ecological alternatives. This move is also defended as politically urgent in foregrounding the shift in the historical understanding of dispossession from land enclosures and other natural resources toward intangible goods (with intellectual property enclosures and, more recently, large-scale data extraction and rent collection on digital platforms). In this book, I use the term common after Pierre Dardot and Christian Laval (2014) and Silvia Federici (2018) to describe technopolitical practices of commoning—that is, practices that “institute something in common,” politically and materially, for collective self-governance and mutual benefit. I continue to use the term commons throughout the book in reference to the public debate on the digital commons—a domain of exchange of “open technical objects” under the rubric of Free and Open Source, Open Hardware, Open Data, and Open Access technologies. By using the two terms I hope to highlight the distinction and the importance of these two domains of study, bridging areas of scholarship that explore technical and political practices alongside self-governance structures without, it is important to highlight, inscribing the work I present here in the tradition of rational choice theory (Obeng-Odoom 2021).
2. Escobar (2018).
3. Social ties are fundamentally the means through which moral economies of gift giving are sustained. Here I follow the definition of the “gift” (don) after Alain Caillé’s work to renew the political and economic anthropology of Marcel Mauss. According to Caillé, the economy of the gift is characterized by “any prestation of gifts or services conducted, without guarantee of return, that is meant to create, sustain, or regenerate social ties” (Caillé 2007, p. 124, my translation). Camille Tarot corroborates this line of interpretation when he describes the relevance of Mauss beyond former misinterpretations of the gift as “pure disinterestedness” or “reciprocity.” What Mauss offers instead is a “radical reflection of the emergence and nature of social ties, caught between violence, rivalry, reason and obligation” (Tarot 2003, p. 73, my translation).
4. The list of colleagues exploring minor histories of computing is not as long as it should be, but it keeps growing with contributions from historians and media studies scholars, such as Jennifer Light (1999), Nathan Ensmenger (2010), Mar Hicks (2018), Joy Rankin (2018), Charlton McIlwain (2020), and Kevin Driscoll (2022).
5. I use the term “conviviality” (eutrapelia) after Ivan Illich’s (1973) critical examination of the “radical monopoly” of the technosciences over social life, a situation where alternatives are made difficult if not impossible to conceive. Illich described “convivial tools” as an antidote to technocratic, instrumental reasoning. In his definition, conviviality is the “opposite of industrial productivity,” being first and foremost about “autonomous and creative intercourse among persons, and the intercourse of persons and their environments” (Illich 1973, p. 20). This is key for our analysis here as the philosopher inspired a shift in the 1970s from a discourse on computing as an instrument of bureaucratic governance to its integration in everyday forms of life where technologies are “peopled”—to borrow Michael M. J. Fischer’s (2009) expression. In Illich’s (1971) argument for “deschooling,” we also find a discussion of a spectrum of organizational forms that separates “manipulative” and “convivial” institutions. Here we find his critique of “credentialed expertise” that is more often than not a meager substitute for embodied and situated knowledges. The celebration of conviviality with high levels of skepticism with respect to educational institutions and credentialed forms of expertise has been a constant in hacking. My suggestion is that anthropology can help us identify convivial alternatives in emergent forms of knowledge making and sharing, even if only in fragmentary and prefigurative forms.
6. Gilbert Simondon laid the foundational work for bridging the “big divide” between the technical and the social in his critique of what he called “facile humanism.” “Our culture,” he writes with reference to one of the centers of philosophical and technoscientific knowledge production in continental Europe, “is one that is based on the systematic error of considering the technical to have an independent existence from human affairs” (Simondon [1958] 2012, p. 9). It is in his philosophical démarche—after Marcel Mauss’s and André Leroi-Gourhan’s contributions to an anthropology of techniques—that we find the conceptual tools to decompose technical objects into processes, resonances, and structures, so as to examine their genealogies and answer the question: How do they transform over time to become concrete, active, autonomous, and functional parts of human activity? This is a generative point of departure for an anthropology within and beyond informatics—one that is best grounded in ethnography for its engagement with naturalist and nonnaturalist epistemologies much more broadly.
7. The concept of common circuits was inspired by the relational anthropology of Marcel Mauss, whose contribution has been a lasting one for identifying the imbrication of social, spiritual, and material patterns in the circuitries of exchange. In examining the dynamics of gift economies, Mauss identified what sustains their generative character in fostering social ties: they concern magical and technical practices, aesthetic sensibilities, political mediations, moral orientations, and societal obligations forming an intricate circuitry by virtue of circulation of persons and objects. As I quoted in the epigraph, in their “mixing,” persons and objects come out of their realms to integrate an emergent process where the divides between the technical, the personal, and the political cannot be empirically sustained. Mauss’s key finding of the “triple injunction” of giving-receiving-returning as the elementary dynamic of moral circuits can be found in spaces of conviviality that have not been completely overtaken by market logics (Mauss [1923] 1950; Caillé 2007). As a methodological imperative, one must calibrate one’s ethnographic attention to identify moments of gift giving in contemporary circuits since they are quite often mistaken for illogical, residual, retrograde forms of exchange running on obsolete technologies that no longer have a place under the neoliberal order of “big computing.” “Retro-computing,” “permacomputing,” and “post-collapse computing” are illustrative examples in the everyday life of community spaces that I discuss in this book.
8. Definitions of Free Software and Open Source technologies have framed the debate around two hegemonic Euro-American orientations for digital commoning: the ethics of “software freedom” for the former (under the injunction of the Golden Rule) and the legal and pragmatic defense of the benefits of collaborative software development practices for the latter. In this book I use the rubric of “open technologies” to encompass all the alternatives in the experience of commoning involving software, hardware, data, and other types of information technology that matter to the hackerspace community. Openness, in this context, is better understood as a function of expertise (of the technologist qua hacker), the nature of the technical object, and its associated milieu (which provide conditions for openness to be meaningfully realized). A more suitable term for our purposes here would be common technologies.
9. I follow Strathern’s theorization of the relations between persons and things in exchange practices: “By objectification I understand the manner in which persons and things are construed as having value, that is, are objects of people’s subjective regard or of their creation. Reification and personification are the symbolic mechanisms or techniques by which this is done . . . If these concepts thus refer to anything it is to the forms in which persons appear, and thus with the ‘making’ (-ification) of persons and things” (Strathern 1988, p. 176).
10. Fischer (2018).
11. Source: US Navy’s Chips Ahoy magazine (1986).
12. Before the term computer was attributed to machines by the mid-1940s, it was used to identify a human occupation. Not only were computers human, but their inhuman activity was mostly carried out by women in key but often unrecognized positions (Fritz 1996; Light 1999; Ensmenger 2010). Active recruiting of women for “human computer” positions led to the creation of a team of early programmers. Several reasons account for what Jennifer Light identifies as the “feminization” of the work of ballistic computing and, with the advent of computer machines, that of the “computer operator.” “It is a curious paradox,” Light ponders, “that while the War Department urged women into military and civil service and fed the media uplifting stories about women’s achievements during the war, its press releases about a critical project like the ENIAC do not mention the women who helped to make the machine run” (Light 1999, p. 473). Both Jennifer Light and W. B. Fritz depicted the artifices used to render women invisible, despite their centrality in training new operators as well as documenting, operating, and maintaining the ENIAC. The practice of rendering women invisible, Light (1999) notes, went as far as cropping them out of the publicity images about the machine.
13. This argument has been supported by historians of computing: “The fact that Hopper wholeheartedly welcomed non-UNIVAC personnel to learn about the A-2 compiler sheds some light on her beliefs concerning intellectual property. Hopper did not view software as a commodity to be patented and sold. She took her cue from the mathematics community” (Beyer 2009, p. 242). William Aspray, Martin Campbell-Kelly, Nathan Ensmenger, and Jeffrey Yost corroborate this line of interpretation when they affirm that “probably no one did more to change the conservative culture of 1950s programmers than Grace Hopper” (Campbell-Kelly et al. 2019, p. 169).
14. Lapsley (2013).
15. See Levy (2010) and Johns (2009) for descriptions of this intersection between phone phreaking and computer hacking at MIT in the 1960s. According to Levy, some of the early hackers were, in fact, quite young, such as Peter Deutsch, who was eleven years old when he first joined the ranks of the informal technical elite.
16. King (1968).
17. It is important to emphasize that this genealogy of “cyber utopianists” became not only mythologized but also deterritorialized, fueling the persuasive magic of “digital innovation” well beyond Silicon Valley. This process has been well explored in the existing literature, so there is no need to recount it here. For a critique of this cultural and political formation, refer to the classic essay “The Californian Ideology” by Richard Barbrook and Andy Cameron (1996), which examines the romantic, individualist, and techno-utopian roots that shaped this hegemonic technopolitical horizon. For a cultural history of this tradition, see Turner (2008). For a geospatial and political critique of its imperial extensions, see McElroy (2024).
18. McIlwain (2020).
19. Ensmenger (2015).
20. The political alienation of many “computer bums” is not only a point of departure for the historical critique leveled at the contemporary celebration of hacking (Ensmenger 2015). It is also a pivotal point for the study of the relation between computing and selfhood pioneered by Sherry Turkle (1984). We owe her not only for first pointing us to Weizenbaum’s work on “computer ethics” but also for providing an empirically grounded examination of the phenomenology of hacking through narratives of “deep hack” as a mode of complete immersion (with undivided attention) in computing.
21. The question of autonomy is best articulated by Johan Söderberg and Maxigas (2022) through their concept of “functional autonomy” in early networks—including community-driven bulletin board systems (BBS), free-space optical (FSO) communication technologies, and internet relay chat (IRC) service providers. According to these authors, functional autonomy is exercised by self-managed, collaborative control and maintenance of a digital service by the technologists themselves.
22. Illich (1970), p. 93.
23. Driscoll (2022).
24. See Lécuyer (2006) for a comprehensive history of the semiconductor industry in the San Francisco Peninsula. For the urgent (and mostly overlooked) examination of the toxic legacy of this industry, see Pellow and Park (2002) and Gabrys (2013).
25. Pfaffenberger (1988).
26. Thompson (1984), p. 763.
27. Kelty (2008) demonstrated that the computer pioneers of the 1970s continued to foster their own circuits through the informal practice of code sharing, animating a rather complex moral economy that involved legal, commercial, technical, and aspirational dimensions. See Salus (1994) and Weber (2004) for additional historical sources on this communal expert practice.
28. Levy (2010), pp. 28–31.
29. Coleman and Golub (2008); Kelty (2010).
30. See Coleman (2012) on the controversy over the distinction between “hackers” and/as “crackers” and Eve (2021) for an examination of underground software distribution networks (“warez”) where crackers are regarded as high-status technologists, given the difficulty of their contributed work: “as with other types of computer information security roles, this type of breakage [of software protection] is akin to an incredibly elaborate puzzle that the cracker must solve to succeed” (Eve 2021, p. 103).
31. Important (and controversial) memoirs of this period include Out of the Inner Circle (1985) by Bill Landreth; The Cuckoo’s Egg (1989) by Clifford Stoll; and Ghost in the Wires (2011) by Kevin Mitnick. Influential collectives of this period, such as The Realm, Legion of Doom, Masters of Deception, and Cult of the Dead Cow, have been portrayed, respectively, by Suelette Dreyfus in collaboration with Julian Assange (1997); Michelle Slatalla and Joshua Quittner (1994); and Joseph Menn (2019). It took quite some time for hackers to make peace with journalists, but it eventually happened around the time that hacktivism started to circulate as a keyword in the late 1990s. This historical trajectory is best narrated by Gabriella Coleman (2016) in “From Internet Farming to Weapons of the Geek,” a brilliant concept she coined by borrowing from resistance studies and adapting it to the political history of hacking.
32. This trajectory was minutely and extensively tracked by Matt Goerzen and Gabriella Coleman (2022) in their research report Wearing Many Hats on the spawning of the information security industry from the so-called hacker underground. The authors detail two very important processes in the professionalization of hackers: (1) the rise of the practice of “full disclosure” of security vulnerabilities in important online channels, such as Bugtraq, involving underground hackers and “above-ground” researchers; and (2) public attack and shaming of large corporations, such as Microsoft, for lack of security in their commercial software. Firsthand accounts of the process of criminalization of hacking in the mid-1980s and 1990s can be found in Sterling (1992); Taylor (1999); Thomas (2002); and Goldstein (2009).
33. Farmer and Venema (1993).
34. See Murillo (2020b) for the application of a theory of power in Mauss and Hubert’s Esquisse d’une théorie générale de la magie to the interpretation of historical encounters between hackers (as magicians) and non-hackers (Mauss 1950).
35. See Scheirer (2024) for the proper contextualization of “fake” materials in underground hacker publications, such as text files and e-zines.
36. McIlwain (2020), p. 96.
37. Marques (2005).
38. Cardoso (2003).
39. “Software Livre” is the term for the sociotechnical experiments that took place from the late 1990s to the present around Free and Open Source projects in Brazil. See Murillo (2010) for the study of its cultural and historical formation.
40. A similar experience has reportedly rung true for several generations of computer science students who learned about operating systems through Xerox copies of a classic reference in the field, Lions’ Commentary on UNIX 6th Edition, with Source Code by Professor John Lions (first published in 1976). In my case, the heavily copied book was Operating Systems Design and Implementation by Andrew Tanenbaum, coauthored with Albert Woodhull (and first published in 1987) and known to be one of the most influential references to be challenged by the generation of hackers who gave us the Linux operating system (Moody 2001; Torvalds 2002). The crucial difference here is that many community members in our scene were “unschooled” in computing. For the analysis of the historical relevance of these publications for the Free and Open Source community, see Kelty (2008). For the discussion of schooling and unschooling practices, see Blum (2024).
41. Here I follow the elaboration on “study” as “undercommoning” after Harney and Moten’s (2013) work on the possibilities for creating political communion and collaborative knowledge production despite the exclusionary conditions and dynamics of university settings and corporate research laboratories. Following the authors’ definition, conditions for studying computing otherwise are gathered in the interstices (through under-commoning) of institutional and corporate settings.
42. Zigon (2008).
43. Stallman and Wall (2002), p. 243.
44. Though marginal with respect to the anthropological canon, cyborg anthropologists were deep at work questioning the Euro-American tradition’s claims to universality through ethnographic research, demonstrating the situatedness and narrative strategies that have historically been deployed to institute and universalize the figure of the self-centered expert individual working within the well-guarded walls of the citadel of science. That was a tough battle as the field of anthropology was very skeptical of their contributions and reluctant to incorporate their critique. Under the theoretical and methodological orientation of the original Committee for the Anthropology of Science, Technology, and Computing established in the context of the American Anthropological Association of the early 1990s, ethnographies of the technosciences were finally brought to the forefront of the anthropological research agenda. See Haraway (1991); Traweek (1988); Downey, Dumit, and Williams (1995); and Downey and Dumit (1997) for key interventions that managed to storm the citadel of anthropology (in the United States).
45. After Viveiros de Castro (2015) I take the political commitment of Brazilian anthropology to be grounded in a mode of knowledge production that has to do with the reinvention of the European modernist tradition through theoretical, artistic, and philosophical cannibalism. In his analysis of the “anthropophagic reason,” Haroldo de Campos examined the Brazilian modernist metaphor of the cannibal. In his reading, anthropophagy is the “critical devouring of the universal cultural legacy, which is formulated not from the resigned and submissive perspective of the noble savage” but from that of a cannibal, “which does not involve a submission (a catechesis), but a transculturation or, better, a transvalorization: a critical view of history as negative function (in the sense of Nietzsche), capable of appropriation as well as expropriation, de-hierarchization, deconstruction. Every past that is other for us deserves to be negated. Better yet: it deserves to be eaten, devoured” (Campos 1992, pp. 234–35, my translation). The need for “critical digestion” with de-hierarchization of Euro-American influences, we could say, has been a constant in the reflection upon the place, the identity, and the role of Brazilian anthropologists. The same applies to hackers in the so-called majority world. In addition to its importance for “transvalorizing” philosophies from elsewhere, the cannibal approach has broader political implications for anthropology through its indiscipline: it might help us foster collaborative ties across national traditions, inviting ethnographers to reestablish collaborative relationships with co-participants that differ from traditional forms of distancing and interpreting through secondhand ethnographic textualization and theorizing.
46. I borrowed this term from Illich’s (1973) critique of the effects of the technosciences in everyday life. The basic idea is that every hegemonic scientific and technological state creates a monopoly on the possibilities of engagement with techniques and knowledge, suppressing alternatives by presenting itself as the only way of organizing the search for knowledge, forms of exchange, and learning practices.
47. Fischer (2003), p. 304.
48. Mate (also known as chimarrão) is a traditional tea of the American Southern Cone made from the ground leaves of Ilex paraguariensis. Its ritualistic aspects have been described (briefly) by Lévi-Strauss in Tristes tropiques (1955).
49. Toupin (2014).
50. Brown (2016).
51. See Söderberg and Maxigas (2022) for a series of ethnographic cases of hacker projects and collectives that are organized against “digital innovation” as a technopolitical project. Examples range from Open Hardware–based networking technologies and 3-D printing collectives to autonomous networks of community spaces (such as hacklabs and hackerspaces) and internet network service providers, such as Internet Relay Chat servers.
52. See Ingold (2000) on the topic of “enskillment” and his usage of the ecological psychology of Eleanor and James Gibson.
53. “Personhood” is a key but often neglected aspect of technoscientific expertise, but it is central for understanding the distinctive aspects of hacking in contrast to other domains of computing. Important exceptions in the literature can be found in the work of Turkle (1984); Kelty (2008); and Coleman (2012). Hacker “personification” has to do with self-cultivation in a particular sociotechnical milieu, as well as with the process through which technical objects rendered as gifts personify the giver in computer expert circuits. Pina-Cabral (2021) has demonstrated the importance of Mauss for the contemporary study of relations and persons as they are materially and symbolically implicated. “Being as relation” is the ontogenetic (more than an ontological) orientation we fundamentally pursue here: from Mauss’s classic study of personhood ([1938] 2012) to Simondon’s definition of being ([1964] 2017) as “singular common”—a fortuitous expression elaborated by Aspe (2002) in his interpretation of the process of individuation.
54. Ethical reasoning is the practice of self-reflection concerning a set of relationships one is confronted with and through which one is, might, or ought to be implicated (Zigon 2008, 2011). It is a form of coping and dealing with contradiction and conflict experienced in everyday life. The late Foucault (1988) of Technologies of the Self is one of the key references in this debate, but the most productive framing of the problem comes from the anthropology of morality and ethics, which has helped us to transpose and extrapolate on Foucault’s approach to study other historical and political contexts, informed by other elaborations on personhood in contrast with the Greco-Roman and the early Christian traditions (Zigon and Throop 2014; Throop and Mattingly 2018). Ethical reasoning is a fundamental part of the process through which hackers work on themselves as we will see in the chapters that are dedicated to personal trajectories, but it is not sufficient to account for the intimate technical engagement with computing for the formation of hacker personhood. See Turkle (1984), Kelty (2008), and Coleman (2012) for the psychosocial and cultural elaboration on the relationship between hacker selfhood and technical objects.
55. Coleman (2012).
56. Anthropologists have explored manifestations of hacking as cultural re-elaborations on the liberal tradition (Coleman and Golub 2008; Coleman 2012) and pragmatism to create “recursive publics” (Kelty 2008), as well as a generative domain of technical development animated by a shared “moral imagination” of freedom (Leach 2009), technological autonomy, and gift-giving (Zeitlyn 2003; Apgaua 2004; Murillo 2010; Chan 2013). In philosophy and political economy, the most influential contributions have identified in hacker communities the distinctive dynamic of “commons-based peer-production,” politico-economic resistance, and social experimentation to create distinct notions of property (Nissenbaum 2004; Benkler 2006; Benkler and Nissenbaum 2006; Weber 2004; Söderberg 2007).
57. Based on Gestalt psychology, Gregory Bateson’s concept of deutero-learning as meta-learning went through important revisions in the course of his work. The definition first appears in 1942 as a response to Margaret Mead’s article “The Comparative Study of Culture and the Purposive Cultivation of Democratic Values.” In this early elaboration, deutero-learning is articulated as the means for “getting an insight into problem-solving in context.” Bateson then provides a working definition: “Let’s say that there are two sorts of gradient discernible in all continued learning. The gradient at any point on a simple learning curve (e.g., a curve of rote learning) we will say chiefly represents the rate of proto-learning. If, however, we inflict a series of similar learning experiments on the same subject, we shall find that in each successive experiment the subject has a somewhat steeper proto-learning gradient, that he learns somewhat more rapidly. This somewhat progressive change in the rate of proto-learning we will call ‘deutero-learning’” (Bateson 2000, p. 167). “Learning to learn,” Bateson concludes, “is a synonym for the acquisition of habits of thought” (Bateson 2000, p. 166). In his address to a group of psychologists and psychiatrists in 1959, he reformulates his previous observations about deutero-learning in terms of “trito-learning”—that is, as an advanced stage characterized by “learning to learn to receive signals.” Deutero-learning is elaborated here in cybernetic terms as “second-order learning” that constitutes the “expectation” that the world (as a system) will be “structured in a certain way” (Bateson 2000, p. 249).
58. It is well established in the anthropological literature that the human body represents a primordial instrument, thanks in large part to the work of Mauss ([1934] 2012) on the “techniques of the body.” Yet, under-explored and under-theorized is his definition of technique as a “traditional and efficacious act” that is inseparable from other symbolic and material practices of moral, religious, technical, aesthetic, and magical nature. Tradition, in other words, is the regular act of transmitting techniques which, in turn, can be considered efficacious due to the very nature of their transmissibility. This debate has been recuperated and updated by Sigaut (2010) and Schlanger (2012). The tacit dimension of this transmission is as important as the conscious, overt, and systematic ones because it speaks to the preobjective conditions for “learning to learn” about computing. This is particularly salient in the context of independent computer collectives: transmission-through-sharing-and-practice is a condition for cultivating one’s dispositions (bodily habitus) and also foundational for community building through shared technical objects that, at once, circulate and serve as infrastructure. The feedback mechanism that sustains a “hacker public” through the sharing of technical objects (that become infrastructural for the public itself) has been theorized as a “recursive” property by Kelty (2008).
59. The expression “closed world” was employed by Edwards (1996) in his study of Cold War–era computing and cybernetic imaginaries. I draw from Edwards here to highlight the passage of a particular kind of infrastructural moment in computing toward a phase where “openness” would become a key orientation for neoliberal and cybernetic governance. This understanding of openness, however, is not to be confused with the definitions of the very same term by Gilbert Simondon, Ivan Illich, and André Gorz. It is important to highlight that the opposition Edwards established was between a “green” world of natural, magical, and transcendental forces and the “closed” political and ideological world with its overbearing instrumental reason, allocated to total war with (the possibility of) total annihilation.
60. Nevejan and Badenoch (2014).
61. Riemens (1989), p. 20.
62. Riemens (1989), p. 72. Alongside conference co-organizers Patrice Riemens and Rop Gonggrijp, Caroline Nevejan (2007) registered that the gathering had a double agenda, so to speak. It was not meant to be only a party, but to be tasked with drafting a declaration, which they did, representing an early hacker engagement with public policy that would reappear in the mid-1990s and then again with full force in the 2000s with the rise of hacker activism. For more details about this type of political engagement on an international scale, see Maxigas (2012); Coleman (2016); and Menn (2019).
63. Goerzen and Coleman (2022).
64. See Coleman (2015); Follis and Fish (2020); and Zetter (2014) for the description of these opposed trajectories.
65. Dardot and Laval (2014), p. 163.
66. Simondon (2014), p. 402.