jbork - 2015-07-31

From Automated Genocide to the Dumbest Generation

Histories of computing technologies often feature images of early electronic machinery developed in America and Britain during World War II, and connect their emergence to heroic narratives of the triumph of democratic civilization over genocidal, totalitarian regimes. By a seamless progression of innovation, commercialization, and dissemination, these forerunners laid the ground for subsequent eras of mainframe, mini, personal, and networked computers, on to state-of-the-art mobile, multimedia devices. However, in this dissertation, which suggests an approach toward a philosophy of computing rather than another history, my inaugural image shall be a 1933 advertisement for Hollerith punch cards (Figure 1), the same one with which Edwin Black begins his provocatively titled IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most Powerful Corporation. Rays emanating from an all-seeing eye illuminate an enormous punch card, factory, and smokestack. Black implicates IBM machinery, its employees, and its partners in America and Europe, along with their bureaucratic counterparts in the murderous Nazi regime, as embodying the evil latent in apparently benign technological devices. IBM published but quickly withdrew a promotional book, The History of Computing in Europe, which detailed the exploits of famous employees on both the American and Nazi sides who were the computer wizards of their day; Black claims the book is now so rare that it cannot be found in any public library, or even in Internet archives (425). Thus both a hidden history of the modern computing era and a ripe store of unexamined philosophical perspectives await scholarly attention. My research devolves to the readily available material by early workers in the field, from Bush, Burks, Goldstine, von Neumann, Licklider, Kemeny, and so on, through Stallman, Knuth, Stroustrup, Gates, Torvalds, Jobs, and the like, yet this suppressed text symbolizes a holy grail of sorts. Though information technology may now be imbricated in mass atrocities by dropping smart bombs from drones, rather than by extracting innocents from the populace via census forms, herding them into ghettos, and operating death camps, it remains important to think about how information is gathered and processed, and how the fruits of human invention are used. Are the perpetrators as steeped in evil as the vilest hacker, or morally ambivalent managers like Adolf Eichmann, or rank and file office workers blind to the purposes of their efforts? Or are the lists already being made by machines on their own, leading to future genocides like those portrayed in science fiction apocalypses such as The Terminator? That is one extreme perspective, epitomized in science fiction and excellently argued by N. Katherine Hayles and a host of digital humanities theorists. However, I believe we are on a trajectory aimed not toward automated genocide but rather toward unintentional stupefaction, a reduction of human potential in comparison to the increasing competence of machine cognition, despite enormous hopes and efforts by many well-intentioned policy makers, engineers, and educators.
It is from this vantage point that contemporary digital media theorist Douglas Rushkoff argues that social hopes for the Internet seem to be failing, draining values and denying deep thinking rather than fostering highly articulated connections and new forms of creativity (16). To compound the damage to society as a whole, Jaron Lanier explains how the Internet has spawned an ecosystem of what he calls siren servers, which allow a select few to monetize the network usage of the masses. It is from such critical perspectives that I argue that the decline in human intelligence, industriousness, and creativity (whose empirical validity is a research question, but which will be taken for granted here) will not consummate in a regression to prehuman forms, a digital dark age, or a machine apocalypse, but will rather leave behind traces suggesting that more advantageous synergies with machine intelligence could have been achieved. My thesis is that the problem is complicated by humans getting dumber for want of time spent programming, or working code, which has been replaced by ordinary computer application use. Thus the sinister Dehomag poster foreboding automated genocide gives way to imagery from the 2008 Disney movie WALL-E, of the evolutionary effects of generations lived in the machine-controlled spaceship environment of the Axiom (Figure 2). It depicts obese, shallowly content, physically and mentally unchallenged human consumers, whose needs are met and whose desires are fulfilled precisely because they are also supplied and conditioned by the surrounding intelligence of the built environment, their lives unfolding on screens aboard a gigantic cruise ship. My basic premise is that the condition of the dumbest generation has infected human being, steering it toward WALL-E torpor rather than toward the apocalyptic science fiction narratives of automated destruction.

A Collective Intelligence Problem

Henry Jenkins uses the term collective intelligence to name the contemporary collective process in which humans collaborate with nonhuman information technologies, especially Internet resources, together consuming and creating knowledge (4). On the one hand, the concept seems essential for any knowledge to exist at all, implicating signs, symbols, and artifacts with biological entities, making all intelligence collective; on the other hand, that the nonhuman, technological component plays an active, participatory role seems to be an emergent phenomenon. Hayles refers to the nondeterministic, evolutionary development of technology systems as technogenesis, and is adamant that it is deeply intertwined with concurrent synaptogenesis, the lifetime, peri-generational (rather than long-term, epochal) changes in the human brains that use them (How We Think 11). Kemeny, who invented the BASIC programming language with Thomas Kurtz, makes the implicit argument that by learning to program computers to perform the formerly mundane, repetitive knowledge work of prior generations, the masses of humans who knowledgeably use information technologies will prosper in a new golden age. Such one-sided predictions of overall social benefit by technology evangelists recall Plato's critique of writing. Winner calls the conviction that widespread adoption of computers and communications systems will automatically produce a better world for human living mythinformation: the contemporary ideology that all aspects of life will likewise benefit from speedy digitized information processing, compounded by the political assumptions of computer romantics who mistake the supply of information for the ability to leverage it. It is simply a false assumption that ordinary citizens equipped with microcomputers will be able to counter the influence of massive, computer-based organizations. Postman likewise concludes that computers, like television, afford little to the masses and make no substantive, positive transformation of their condition, but instead primarily intrude on their lives, making the majority losers and only a few winners.

Societies of Control

The Quintessential Postmodern Object

Foss Hopes

Default Philosophies of Computing

Digital Humanities Solutions

Not to Use Old Tools for New Problems

Scholarship Requires a Cybersage

Digital Humanities Projects

Critical Programming Studies

