Cross Talk
Artbyte (New York) 2, no. 3 (September-October 1999), pp. 28-29
Jon Ippolito

Whatever happened to the scary cyborg?

Back in the mid-1980s, when James Cameron's original Terminator was driving through police stations and shooting up nightclubs, Donna Haraway proclaimed that the figure of the cyborg would destabilize our stereotype-ridden culture. Now, Star Trek's Seven of Nine is taking dancing lessons and going on dates, yet new media theorists who still parrot Haraway's assertion seem not to have noticed how easily the cyborg has been co-opted by old media. Contemporary movies, TV, and comic books portray cyborgs as efficient, strong, reticent, good with firearms, and willing to risk their lives to get the job done. The human protagonists of these action and science-fiction genres, by contrast, are efficient, strong, reticent, good with firearms, and willing to risk their lives to get the job done. So much for subverting stereotypes.

Truth, however, can be stranger than fiction. In November of 1994, Leonard Adleman of the University of Southern California in Los Angeles published a paper in the journal Science that documented a truly disturbing hybrid of the organic and the artificial. Adleman's cyborg bore little resemblance to Terminators or Borg: it had no arms, legs, head, skin, or personality, but neither did it have silicon chips or a metal endo- or exoskeleton. It didn't even have a solid form. Adleman's cyborg was simply a few drops of DNA in a test tube. But Adleman's DNA could compute.

The premise behind the DNA computer is that all life on our planet is based on a genetic code that is essentially digital. Genes may account for a bewildering variety of birds, bugs, and bacteria, but they are all just different combinations of the same few molecular units: the nucleotides. Adleman realized that the 1s and 0s that make up computer code could be represented by the two kinds of nucleotide pairs (adenine-thymine and cytosine-guanine) that make up DNA. Then a particular sequence of bits stored in the computer's memory, say, 100110101, could be represented by a particular sequence of nucleotides stored on a strand of DNA, say, ACCAACACA. A computer can search for the bit sequence 100110101 on its hard drive, read the information encoded there, and interpret these bits as an executable command. A cell's RNA, analogously, can search for the gene sequence ACCAACACA on its DNA, make a chemical imprint of the nucleotides encoded there, and build the corresponding protein. The most familiar example of this powerful analogy is the computer virus, a sequence of runaway programming code that, like its biological counterpart, infects other hosts by instructing them to replicate its code.
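The bit-to-nucleotide correspondence described above can be sketched in a few lines of code. This is only an illustration of the analogy, not Adleman's actual encoding scheme: following the article's example, 1 is mapped to adenine (A) and 0 to cytosine (C).

```python
# A toy sketch of the bit-to-nucleotide mapping: 1 -> adenine (A),
# 0 -> cytosine (C), per the article's example. The complementary
# strand would carry the paired bases T and G.
ENCODE = {"1": "A", "0": "C"}
DECODE = {nuc: bit for bit, nuc in ENCODE.items()}

def bits_to_dna(bits: str) -> str:
    """Translate a bit string into a nucleotide string."""
    return "".join(ENCODE[b] for b in bits)

def dna_to_bits(strand: str) -> str:
    """Translate a nucleotide string back into bits."""
    return "".join(DECODE[n] for n in strand)

print(bits_to_dna("100110101"))  # ACCAACACA
```

Running the sketch on the article's sample sequence, 100110101, yields exactly the strand ACCAACACA quoted above.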

Computer scientists and biologists alike treat the viruses in their respective fields as uncontrollable pests to be "disinfected." A problem-solving test tube of DNA is no renegade, however, but an obedient servant; its most apt silicon analogue is probably the "bot," a mini-program that plays some useful but limited role on a computer network. Like network bots, DNA bots cannot perform much more than one task at a time; holding a very long string in memory or computing the answer to a particular problem in logic is about all they can handle.

It's one thing to acknowledge that DNA is like computer code; it's another to coax DNA into solving a problem for us. Of course, DNA already solves lots of metabolic problems for the cell in its bodily niche; but for the DNA bot to be useful, Adleman had to figure out a way to program it in vitro. Unlike network bots, DNA bots don't act by themselves; a lab worker has to perform manually the operations that the microprocessor of a computer would perform automatically. Suppose a DNA programmer wants to compute the driving route from New York to Los Angeles with the fewest tolls. If he identifies each of the toll stations along the Eastern Seaboard with a different nucleotide sequence, then a test tube of random DNA strands will automatically represent billions of such possible combinations. Of course, only combinations of toll routes that begin and end in the right cities would be acceptable. To select for them, the technician might pour into the test tube two chemical primers designed to bond to the sequences corresponding to New York and Los Angeles. Heat applied by the technician will then cause the DNA's double helix to unzip; the two primers will lock into their positions, serving as molecular bookends for free-floating nucleotides that pair up with their complements in the intervening stretch of DNA. The researcher would then detach and collect these strands from the DNA, confident that they contain the code for the correct origin and destination cities at either end. By applying other techniques that tie strands together or select them by length, the researcher can gradually weed out the DNA strands corresponding to the worst toll routes in favor of the best ones. In the end, the DNA sequence that passes this biochemical obstacle course will be a statement of the problem's solution written in genetic characters.
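The wet-lab procedure above can be caricatured in silicon. In this sketch, each "strand" is a random sequence of station labels (the station list is hypothetical, invented for illustration), the primer step becomes a filter for strands anchored at New York and Los Angeles, and length selection becomes a search for the shortest survivor.

```python
# An in-silico caricature of Adleman-style selection, assuming a
# hypothetical set of toll stations between New York and Los Angeles.
import random

random.seed(0)  # reproducible 'test tube'
STATIONS = ["NY", "NJ", "PA", "OH", "MO", "AZ", "LA"]  # hypothetical

def random_strand() -> list[str]:
    """A random 'DNA strand' encoding some sequence of stations."""
    length = random.randint(2, len(STATIONS))
    return random.sample(STATIONS, length)

pool = [random_strand() for _ in range(100_000)]  # the test tube

# Primer step: keep only strands that begin in NY and end in LA.
anchored = [s for s in pool if s[0] == "NY" and s[-1] == "LA"]

# Length-selection step: favor the shortest surviving strand,
# i.e. the route with the fewest intermediate tolls.
best = min(anchored, key=len)
print(best)
```

With a pool this large, some strand reading simply NY-to-LA is virtually certain to appear, just as the sheer number of molecules in a real test tube guarantees that the correct route is represented somewhere.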

Why would Adleman go through this laborious process, if his laptop could have done the computation without all that mixing and stewing? The answer is that most silicon-based computers can only attack a problem in a linear fashion, starting at the top of their instructions and proceeding, one command at a time, to the bottom. Because billions of DNA molecules can fit in a test tube, however, operations like those described above can occur on an extraordinary number of sample molecules at the same time. In such "massively parallel" computing, the problem is attacked from literally billions of angles at once. Imagine a billion safecrackers all trying to guess the combination of a safe at the same time: by sheer probability, one of them is bound to get it right.
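The safecracker image can be made concrete with a small, deterministic sketch: rather than one guesser trying combinations in sequence, all 1,000 possible three-digit combinations are generated and sieved in a single pass, the way a test tube holds every candidate strand at once. The secret combination here is arbitrary, chosen only for illustration.

```python
# A deterministic sketch of the 'billion safecrackers' analogy.
import itertools

SECRET = (4, 9, 2)  # an arbitrary combination, for illustration only

# 'Pour' every candidate combination into the pool at once, then
# filter for the one that opens the safe.
pool = itertools.product(range(10), repeat=3)
hits = [combo for combo in pool if combo == SECRET]
print(hits)
```

On a serial machine the filter still visits candidates one by one, of course; the point of the DNA computer is that its chemistry tests all of them literally simultaneously.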

The reason Adleman's real-life cyborg is so much scarier than the clichéd cyborgs of mass media is that it poses a fundamental challenge to our concept of free will. We have a much-cherished belief that biological organisms do what they want and that computers do what they are told. When we say this, however, we are ignoring our history of breeding animals for selected traits: thoroughbreds for racing, Clydesdales for pulling carriages, quarterhorses for roping cattle. Recently perfected recombinant DNA techniques and cloning have allowed us to go further, by circumventing the messy reproduction of species and tinkering directly with an organism's DNA. Creating a DNA bot also demands working directly on the DNA, but at the next level of control. For unlike the genetically engineered tomato, which is grown for a variety of characteristics ("make it juicy, fleshy, plump, and red"), a DNA bot is designed only for the resolution of a single problem ("find the New York-Los Angeles route with the fewest tolls"). It's not just a body without a mind, but a body without the survival instinct that evolution has bred into every life form from amoebas to antelopes. The DNA bot is like a kamikaze investigator who is destroyed in the process of solving his problem and leaves only the solution as his cadaver. Is this not the ultimate evolution of slavery: a self-consuming servant that transforms itself into the object desired by its master?

At the same time that computers are starting to look less like silicon lattices and more like cytoplasm, our models of the brain have been moving away from the computer-like storage of 1s and 0s and more toward an evolutionary system of weighted neurons. Neurobiologists like Gerald Edelman stress the evolutionary development necessary to produce an embodied mind, while researchers in artificial intelligence build silicon-based "neural networks" that learn to recognize patterns in a way that is not preordained by the programmer. There is a critical difference, however, between these developments and the DNA bot. Neural nets and embodied automata are scrappy gadgets that can learn on their own, while the DNA bot is a compliant biomass whose one-track mind is completely determined by its creator. Advanced research in computer science is thus in the ironic position of endowing silicon with memories and associations with one hand and removing them from flesh with the other. It is not merely the conjunction of the words "DNA" and "computer" that makes Adleman's cyborg so frightening, but the attempt to wrest learned experience away from biological tissue, replacing it with the subservience we expect from the unfeeling circuit board in our desktop PC.