Artificial Life After Frankenstein

Beginning with Mary Shelley's great novels, Frankenstein and The Last Man, Eileen Hunt Botting's Artificial Life After Frankenstein reveals the techno-political stakes of modern political science fiction and brings them to bear upon the ethics and politics of making artificial life and intelligence in the twenty-first century.

Eileen Hunt Botting

Dec 2020 | 306 pages | Cloth $34.95
Political Science / Literature

Table of Contents

Preface. Learning to Love the Bomb

Introduction. Mary Shelley and the Genesis of Political Science Fictions
Interlude. Births and Afterlives
Chapter I. Apocalyptic Fictions
Chapter II. Un/natural Fictions
Chapter III. Loveless Fictions

Coda. A Vindication of the Rights and Duties of Artificial Creatures

Acknowledgments
Postscript. "The Journal of Sorrow"
Notes
Index


Excerpt [uncorrected, not for citation]

Preface
Learning to Love the Bomb

Back in the 1980s, it was hard for us American kids to have fun. Even if you were a virgin—like a mock-tragic character in a John Hughes film—you could still catch AIDS. If you didn't "just say no" to drugs, your brain would be fried like an egg cracked into a sizzling hot pan. If you survived into your thirties and got famous, you would probably be shot in public like John Lennon, or at least need surgery to survive the attempt, like President Ronald Reagan. You could never forget the threat of nuclear apocalypse, looming in the shadow of the wall the communists had built to divide East and West into clear sets of enemy camps.

Without remorse, video had killed the radio star. While Sting mournfully wondered if the Russians love their children too, we teens watching him stateside knew the real, hard truth: the question barely mattered, since the USA would fight back if the USSR launched the first strike, even to the point of annihilating the world. We wanted our MTV to be a New Wave antidote to doom, but our chosen medium was not immune to a catastrophic message. The original ID for the channel featured footage of the Apollo 11 moon landing with the MTV logo optimistically superimposed on the American flag. It played on a nearly endless loop—forty-eight times per day between 1981 and 1986—until NASA's Challenger blew up in front of us.

Just seventy-three seconds after takeoff, this spectacular national disaster played on the television sets our teachers had wheeled into their social studies classrooms for a special display of current events. The first American teacher-astronaut, Christa McAuliffe, taught us that no one was safe from the dangers of technological hubris. MTV removed its iconic moon landing ID to protect us—its audience, whose parents paid for subscriptions—but the damage was already done. We had already developed a speculative knack for thinking in terms of worst-case scenarios.

This appetite for destruction may have been a function of the television and music videos we had watched. Before the cable hookup, my parents had a simple two-step rule for determining family-friendly television: it had to be on PBS and finish before 7:30 p.m. Their moral algorithm got me into bed on time, but certainly not to sleep, what with my time spent puzzling through the existential questions of Doctor Who. Time travel, reincarnation, octopi-aliens, and the mysterious attractive force of blue British police boxes filtered through my mind in a dreamlike cycle during my elementary school years in suburban Boston.

When I was about twelve, my family moved back to my dad's tiny hometown—way, way up in northern Maine. I quickly fell into a gang of girl-nerds. Perhaps because of the suffocating absence of things to do, we started to obsessively consume science fiction, fantasy, and horror in whatever forms we could find them. This literary enterprise was a challenge to coordinate in our network of villages in Aroostook County, 90 miles past the nearest "Mr. Paperback" bookstore in Bangor, and without the possibility of Amazon home delivery. The Internet had just begun to be imagined in the most cutting-edge science fiction—most famously in the cyberspace of William Gibson's Neuromancer (1984).

Up in "The County"—a bleak frontier of forest and fields on the border with Canada—we caught only some crude cinematic glimpses of cybernetics, as in Ridley Scott's 1982 film Blade Runner. It starred Han Solo—I mean Harrison Ford—as a bounty hunter in the near future who falls in love with a target android. Our cool English teacher, who had just graduated from Colby, would have known it was based on Philip K. Dick's 1968 philosophical sci-fi satire Do Androids Dream of Electric Sheep? We had to settle for finding the disc of Blade Runner at the local Qwik Stop, where my parents would pay for a twenty-four-hour rental that included the massive RCA Videodisc player required to watch it. Once home, the slumber party would gather around the TV to collectively memorize the futuristic love story and dark urban aesthetic, as we replayed the disc again and again before it had to be returned.

With a sense of existential urgency only teenagers can possess, my fellow nerds and I circulated books like other kids passed around cigarettes. After school, our gang of girls would huddle behind the walls of the wheelchair ramp to read weighty novels about the end of the world. Snug in our bunker, we inhaled hand-me-down copies of the classics of our local hero Stephen King, who lived in Bangor.

King's best work, The Stand (1978), took three full days to read if you did almost nothing else and occasionally snuck a look at it under your desk during class. A whopping 823 pages, the epic presented a pregnant, college-aged woman from coastal Maine who survives a plague made by government scientists. After withstanding the contagion, she helps defeat the Devil himself out West, before returning to the true wilderness of New England, with her heroic lover, to live off the pure fruits of the land and repopulate the Earth. There was a real satisfaction in seeing the good guys and girls win and relocate happily to your home state. When you are from Maine, this does not happen much.

Sensing my need for speculative literature more complex than The Stand, my Cornell-educated chemistry teacher from upstate New York introduced me to ironic sci-fi in the form of the confessional satires of Kurt Vonnegut and his fictional alter ego, the failed science fiction writer, Kilgore Trout. I felt adult reading books that used alien kidnappings to mitigate the trauma of witnessing the US bombing of Dresden or pictured—in a crude cartoon—the liquid-lye Drano under the sink as the "Breakfast of Champions" for a postwar suburbia. Yet the loneliness and isolation of his characters—including the author, who always depicted himself sleepless, drinking, and writing soliloquies—made me honestly wonder: what could be missing?

The atheist Vonnegut dared to imagine God without omniscience. God was a master artificer who stepped back and laughed at the surprising free choices of his own human creations. Like Dr. Hoenikker in Vonnegut's 1963 novel Cat's Cradle, God lacked control over his children. If they wished, they could choose to use the raw materials of the cosmos to construct the instruments of human extinction.

Based on the Nobel Prize-winning chemist and physicist Irving Langmuir of Schenectady, Dr. Hoenikker is "one of the chief creators" of the atomic bomb. He invents ice-nine for the mundane purpose of freezing muddy ground for the swift passage of military troops through the jungle. Seeing its market value in the arms race, his kids steal shards of ice-nine from his kitchen laboratory and sell them to the highest bidders, securing their own financial futures without thought to the risks. When a piece mistakenly falls into the placid waters of the Caribbean, Jonah—a new-age Ishmael—finds himself in the extreme position of being not just the last man on Earth but the last human atop a dead sea of ice.

The theme of human beings as the creative and destructive authors of their own destinies kept appearing in the books I picked up. Pushing back into the history of American science fiction, to the twilight of the Golden Age of pulp magazines and the dawn of the New Wave of literary fiction, I acquired Daniel Keyes's Flowers for Algernon. It was first published as a short story in the Magazine of Fantasy & Science Fiction in 1959, then expanded seven years later into a novel, which won the Nebula Award. This poignant story of a man with an IQ of 68 who voluntarily subjects himself to surgical experimentation on his brain that turns him, briefly, into a genius made me break down and cry like a baby. I could not stop sobbing as Charlie—this artificially enhanced intelligence—mentally deteriorated.

Knowing that he would soon lose long-term memory altogether, Charlie scrawls in childish prose his instructions for flowers to be placed on the grave of his best friend, the mouse Algernon, who had died from the same neurological experiment. The implication of this ending stuck with me: the transformation of life through science—and, more broadly, education itself—does not necessarily constitute improvement. Watching the idiot-savant Charlie observe himself ascend to the heights of intellect, then decline to the point where he had lost or virtually forgotten everyone he had loved, was as close to a real horror story as I had ever encountered. For a child of the Cold War, this was saying a lot.

I dug deeper into science fiction to try to find the hidden source of its dark ideas. When my high school friends and I read George Orwell's post-World War II novels Animal Farm (1945) and Nineteen Eighty-Four (1949) in our small, college-track English class, I wanted more. Evading the censorious small-town librarian, I discovered Aldous Huxley's Brave New World (1932) and Mary Shelley's Frankenstein (1818) in the forbidden adult section of the stacks. The opening pages of Huxley's reproductive dystopia reminded me dimly of my Catholic parents discussing the first "test-tube" babies over dinner. Back then, I had silently worried that the babies would grow too big for the glass tubes. Huxley's bottled fetuses and decanted children seemed just as preposterous.

Frankenstein, by contrast, affected me much as Flowers for Algernon had: I responded not with skepticism but with pathos. I found Victor Frankenstein's Creature eerily familiar due to the tragicomic—bulky and bolted, flat-topped and green-tinted—versions of him that I had encountered in the syndicated midcentury Universal and Hammer monster movies, plus the Addams Family cartoon series of the 1970s. Although made by a scientist in an extraordinary way, the Creature did not seem unnatural to me. He resonated more with the geeks and goonies of my favorite adolescent angst films of the 1980s.

The parallels with reality did not escape me. When we were seniors, my high school class of fifty officially voted me "most likely to succeed." At the same time, the scribbled results of a secret poll—passed furtively around school on a crumpled piece of paper—ranked me first on the list of those "most likely to be a virgin." Funny: I heard echoes of my growing desolation in the urgency of the abandoned Creature's voice. "I had never yet seen a being resembling me," he cried, "or who claimed any intercourse with me. What was I?" He may have been an eight-foot monster and I a five-foot-five Mainer, but a strange kinship between us was, undeniably, already there.

Alienated by competition over grades, my studious friends and I no longer shared books. Rather, we hoarded them as special knowledge for our own private browsing. The summer before senior year, on an extended trip to the University of Maine campus at Orono for the Girls State summer civics course, I drifted away from my classmates with a delegate from another school. Though she was basically a stranger, we instantly bonded: it was so obvious that we were both outsiders. She told me about a computer science lab where we could exchange electronic messages with people at a university in Canada. Without really knowing what we were doing, we created email avatars of ourselves over a direct link to the BITNET network at Yale University and on to the University of Toronto over the Canadian network NETNORTH. Entering the matrix of the nascent Internet, we spent hours chatting virtually with a male graduate student. Thrilled by the illusion of intimacy generated by the shield of digital anonymity, we stopped only when he seemed to be making awkward propositions for us to meet up. The girl and I didn't stay in touch either—ashamed, a bit, by our flirtation with finding an alternate reality.

Soon I left them all behind for my own little utopia. Bowdoin College was just a few hours south of "The County," but it seemed light-years away to me. During late-night study breaks there, I sat on institutional furniture and stared at the television installed in the common space of the nineteenth-century residence hall. A competitive runner, I dutifully peeled cheese off the warm Domino's pizza delivery, thinking to myself, Too many calories. In the silent company of my peers, I observed the Berlin Wall come down and the US bombing of Iraq start up on dorm TVs. Satellites directly transmitted live images of missile-driven destruction into our lounge. CNN newscasters informed us that we were the first generation to see world politics play out like a video game. But where was the joystick? I silently wondered. There was no remote control for Gen X to operate the system. A bit numb, I sat and watched the screen, chewing my slice of carbo-heaven.

Political science fiction had become hyperreal.