Last week my stepfather, a retired electrical engineer, passed away at the age of 89, and in the boxes and boxes of papers he had kept as the signifiers of his long life and work were sheaves upon sheaves of circuit blueprints, elegant statements in purple ink of logical relations laid out in sequences of transistors that represented the height of Eisenhower era computing intricacy. They are beautiful, not only aesthetically, or as a representation of a life now past, but as a reminder of a time when a single person could manageably lay out a computer’s mind on a series of oversized pieces of paper.
That era seemed to be over in 1971, when Intel made its first microprocessor, the 4004, commercially available. It boasted 2300 transistors, a daunting enough logical and geometric puzzle to unravel, but it would soon be dwarfed as next year’s 8008 had 3500 transistors, 1974’s 8080 boasted 4500, and the industry defining 8086 chip of 1978 featured some 29,000. Under the strain of such intensive chip complexification, the computer industry was poised to fracture into a collection of independent, mutually incomprehensible, research fiefdoms.
Chip design was threatening to become so intense a discipline that only a walled off collection of engineering gurus could possibly understand its intricacies, leaving regular computer scientists completely out of the game of designing the devices that implemented their ideas. That sense of helpless isolation in the face of mounting complexity, however, dissipated suddenly in 1979 when Lynn Conway and Carver Mead published Introduction to VLSI Systems, a work which took the complexity of chip design, boiled it down to its essentials, and presented rules that any interested computer scientist could grasp and implement, kickstarting a dizzying wave of creativity as graphic and interface engineers applied the new methods to produce ways of approaching computers that were unthinkable a decade before.
Lynn Conway (b. 1938), though listed second on the book cover, was the primary author of the text and though, from the point of view of the larger computing community, her name appeared as if materialized from the ether, she had by that point worked a number of under-the-radar revolutions that had the bad fortune of running hard against institutional inertia and social prejudice.
The fact of the matter was that, for the first thirty years of her life, Lynn Conway was not the person she was meant to be.
Born Robert Conway into a loving and attentive family, she discovered very early that she identified as a girl and wanted nothing more than to be a part of that world, a desire that came hard against reality one day at the age of four. Seeing a dress in a store, she asked her mother if she could have it, only to hear in tones of sudden rage, “No you cannot have that dress. You are NOT a girl!” Suddenly the loving environment Robert had known growing up was stripped away as her parents tried withholding affection as a means of masculinizing their child. Robert’s father soon left the family, and her mother grew increasingly distant, leaving Robert to figure out as best she could how to surreptitiously become the girl she knew instinctively that she was without bringing the wrath of her parents or peers down upon her.
As hard as these years of youth were, however, they were nothing as compared to the hellscapes of adolescence, when Robert watched in horror as her body daily betrayed her, becoming more and more male, while the face and figure she saw in the mirror each day drifted further and further from the person she felt she actually was. She devoured books on biology and science to learn more about what caused gender differentiation, and what might be done for somebody seeking gender reassignment at a time when the mere act of wearing clothes of the opposite gender was enough to land one in prison under most state laws.
In and amongst the keen biological anguish of these years, however, were moments of satisfaction as Robert discovered not only that she was fascinated by science and particularly astronomy, but that she possessed a first-rate mind for it as well. She maintained high grades, which kept her mother’s scorn largely at bay, and found delight in building telescopes and conducting experiments with a small cadre of like-minded advanced students. She graduated in the top 10% of her class and was admitted to MIT in 1955.
Here, she lived in dorms with students of her own intellectual caliber and found that, far from condemning her for her attempts to become more feminine by dressing in women’s clothing, her roommates seemed to take the situation largely in stride, accepting her needs as real and normal. This allowed her to know some degree of domestic peace, but also perhaps gave a false impression of just how accepting intellectual circles would be of her attempts to become her true gender. Three years into her college career, she took the reins of her biological destiny into her own hands and paid some friends to steal hormones for her, building up a stockpile to last several years as she created, using the best research she could find, a hormone therapy treatment for herself.
It is a desperately sad picture to contemplate now, sixty years on – a person in pain, having to resort to breaking the law to bring herself relief based on a program she has to devise herself because No One Will Help Her. The hormones were having an effect, as Robert slowly transformed into Lynn, but her studies had led her to believe that, as long as her body was still producing testosterone, the transformation could only ever go so far. She began researching self-castration methods, seeking one with a minimum of pain and, perhaps more importantly, no need to go to a hospital afterwards where her secret might be discovered and her person transported to a mental asylum for observation as a degenerate.
She managed to see a highly placed member of the medical faculty at Boston University, to whom she laid out her problem, and asked, please, for any advice about what she might do, only to be met by fury as the doctor railed against her plan, telling her she would only become a “freak” thereby, institutionalized and broken. After the casual acceptance of her identity by her classmates, the violent reaction by this educated man was a jagged cut to the bone, severing her all at once from her dreams of becoming a woman at last.
Lynn dropped out of MIT and wandered the country, trying to figure out at last who she was and what she might do. Finding no answers, and no road to the future that every fiber of her body and mind told her ought to be hers, she threw herself into an attempt to cultivate a male persona, working out, riding motorcycles, and going on hunting excursions to put a wall of machismo between herself and the feminized past she was assiduously attempting to convince herself she was ashamed of.
She transferred to Columbia University to complete a BS in electrical engineering and begin her life’s work at the heart of the digital computing revolution, and in 1963 married a young woman with whom she had two daughters over the course of the next half decade. To the outside observer, all was going well for Conway as, at last, the slim shadow of Lynn was being buried under layer upon rocky layer of Robert, and normalcy was being decisively established complete with a wife, two kids, and a job at computing giant IBM working on their cutting edge supercomputing project, the ACS-1.
From the point of view of her professional work, life was indeed aces and roses as Conway at last had a project that matched her abilities. Within a year she made a breakthrough discovery that rewrote what engineers thought single-stream computer architecture might achieve. The old assumption was that a machine could only issue one instruction per machine cycle, an assumption that gummed up a computer’s ability to effectively use all of the new hardware being developed by IBM in the mid 1960s. Conway, too new and eager to realize that she was tackling a problem that had been declared intractable by a generation of engineers, developed Dynamic Instruction Scheduling (DIS), a means of allowing more instructions to be issued per cycle by intelligently scanning queues of instructions for elements that can be processed out of order to take advantage of a machine’s separate functional units. To cite the example Lynn used in her 1966 IBM confidential memorandum, given the queue
R1 + R2 -> R3
R6 + R2 -> R7
R1 x R4 -> R5
R3 x R6 -> R8
it would make more sense, given a machine with separate adding and multiplying units, to take the steps out of order as follows:
R1 + R2 -> R3
R1 x R4 -> R5
R6 + R2 -> R7
R3 x R6 -> R8
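The reordering above can be suggested with a short sketch. To be clear, this is only an illustrative toy, not Conway’s actual IBM design: it assumes one adder and one multiplier, each able to issue a single instruction per cycle, and treats an instruction as ready once no earlier or in-flight instruction still has to write one of its sources.

```python
# Toy sketch of out-of-order issue in the spirit of Dynamic Instruction
# Scheduling -- an illustration, not Conway's actual IBM design.
from collections import namedtuple

Instr = namedtuple("Instr", "srcs op dst")

# The queue from the 1966 memorandum, in original order.
QUEUE = [
    Instr(("R1", "R2"), "+", "R3"),
    Instr(("R6", "R2"), "+", "R7"),
    Instr(("R1", "R4"), "x", "R5"),
    Instr(("R3", "R6"), "x", "R8"),
]

UNIT_FOR = {"+": "adder", "x": "multiplier"}

def ready(instr, idx, pending, in_flight):
    # Ready when no earlier queued instruction and no in-flight
    # instruction still has to write one of this instruction's sources.
    producers = {p.dst for p in pending[:idx]} | {f.dst for f in in_flight}
    return not any(s in producers for s in instr.srcs)

def schedule(queue, latency=1):
    """Return the instructions in issue order, assuming one adder and
    one multiplier, each issuing at most one instruction per cycle."""
    pending, in_flight, issued = list(queue), {}, []
    while pending or in_flight:
        picks = []
        for unit in ("adder", "multiplier"):
            # Each free unit scans the queue for its first ready instruction.
            for idx, instr in enumerate(pending):
                if UNIT_FOR[instr.op] == unit and ready(instr, idx, pending, in_flight):
                    picks.append(instr)
                    break
        for instr in picks:
            pending.remove(instr)
            in_flight[instr] = latency
            issued.append(instr)
        for instr in list(in_flight):  # results become available over time
            in_flight[instr] -= 1
            if in_flight[instr] == 0:
                del in_flight[instr]
    return issued

for i in schedule(QUEUE):
    print(f"{i.srcs[0]} {i.op} {i.srcs[1]} -> {i.dst}")
```

Run on the original queue, the scheduler issues the instructions in exactly the reordered sequence shown above: the multiplication into R5 jumps ahead of the second addition, since its operands are already available, while the multiplication into R8 waits for R3.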
The architectural innovations that Conway devised to drive DIS were instantly recognized as a major step forward by her coworkers at IBM, and were incorporated into the ACS-1 machine, a quite substantial achievement for an electrical engineer freshly out of college, and they are furiously at work this very moment, a half century after their original devising, in the microprocessor of whatever device you are reading this piece on.
Even as the genius innovator-in-the-making flourished at work, Lynn’s personal life was rapidly deteriorating. She dearly loved spending time with her daughters, but knew that her marriage could never be of the intensely physical type that her wife fundamentally needed. The future seemed a path of mutual and eternal dissatisfaction until 1966 brought news from the medical community that genuine gender reassignment methods were at last available. Lynn, after a period of intense agonizing, told her wife that this was something she needed to do, and informed IBM of her intention to undergo surgery in late 1968.
Her immediate superiors at IBM approved of her plan to relocate to a different branch after the operation to minimize confusion, only to have the upper echelons of IBM management declare the agreement unworkable and Lynn’s employment immediately terminated. Citing the inevitable distress of her colleagues at having to work with a transgender individual, the company fired her in spite of her series of fundamental contributions to its most advanced project. In quick succession, Lynn’s friends and extended family abandoned her, while her wife stood by her until a month before the surgery, when the couple divorced at last.
After December of 1968, Robert disappeared at last forever, and Lynn Conway began her agonizing attempt to build a meaningful career in the shadows of the computing industry, afraid that at any moment somebody might recognize her, report her to management, and force her to begin the cycle of hiring and reputation building all over again. She worked at Memorex, designing the microprocessor for a new computer line meant to go toe-to-toe with IBM, employing strategies she had developed to streamline the design process to create a processor that worked virtually without flaw from the firing of the first prototype.
That computer would never launch, as it turned out – IBM’s position was, as of yet, unassailable – but it hardly mattered because the Next Big Thing was on its way. While Conway worked at Memorex to create a CPU that would never be used, Intel was launching its 4004 chip and sending engineers like Conway equally into reveries of anticipation and apoplexies of consternation about what the new complexity might betoken for the future of computer science.
1972 found Conway at Xerox, working at the fabled Palo Alto Research Center (PARC), the base from which she would launch the Mead-Conway Revolution in 1977. That revolution came as a result of Caltech professor Carver Mead’s interest in the principles and limits implied in the rapid scaling down of microprocessor component size throughout the 1970s. If component size continued to halve every two years, soon the complexity would be overwhelming unless somebody found a way to boil the chip design problem down to a few essentials which could be grasped by anybody with a computer science background.
Ever since college Conway had had a reputation as a person who could take a mass of data, digest it, and find a way to explain it that was approachable in its first concepts but powerful in its implementation. Hers was precisely the sort of mind Mead needed to develop a new set of design principles, and in 1976 she began the process of recreating the chip design process from the ground up. Her approach used color coded stick diagrams that explained transistors and logic relations as intersections of paths embedded into the three layers of a computer chip. By taking advantage of a small, graspable subset of common intersection types, a non-specialist computer scientist could control the flow of data through a chip with exquisite precision. What was more, thanks to Conway’s use of a scalable “Lambda” factor, those basic intersection types could be readily rescaled as technology improved and changed, meaning that the Mead-Conway principles, once learned, were easy to apply to new generations of chips and the varying standards and specifications of different chip manufacturers.
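The power of the Lambda idea can be suggested with a trivial sketch. The rule names and values below are illustrative placeholders rather than the exact Mead-Conway rule set: the point is that widths and spacings are written once in dimensionless lambda units, and retargeting a whole design to a new manufacturing process means changing the single value of lambda.

```python
# Illustrative sketch of lambda-based design rules. The specific rule
# names and values are placeholders, not the exact Mead-Conway set.
LAMBDA_RULES = {              # widths and spacings in lambda units
    "min_poly_width":   2,
    "min_metal_width":  3,
    "min_metal_space":  3,
    "contact_size":     2,
}

def physical_rules(lambda_um):
    """Scale the dimensionless rules to a particular process, where
    lambda_um is the process's lambda value in microns (conventionally
    on the order of half the minimum feature size)."""
    return {name: units * lambda_um for name, units in LAMBDA_RULES.items()}

# The same layout retargets from a coarser to a finer process by
# changing one number:
rules_old = physical_rules(3.0)   # e.g. an early-1970s-scale process
rules_new = physical_rules(1.5)   # the same design, shrunk in half
```

Every geometric relationship in the layout is preserved under the rescaling, which is what made a design learned once portable across generations of technology and across the differing specifications of chip manufacturers.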
Conway wrote up her methods in a book, Introduction to VLSI Systems, and presented the results in a class at MIT that demonstrated how accessible her principles were by having students learn the theory, design prototype chips, and then receive working examples of those chips, all within the course of a semester. The course was a success, and colleges all over the world began using her text and accompanying resources, with the book ultimately selling an astounding 70,000 copies. She then expanded the system that allowed students to design VLSI (or Very Large Scale Integration) projects and receive prototypes to a national level with MOSIS, a service that connected (and continues to connect) universities and researchers with chip manufacturers to allow a seamless transition from design to prototype. The simplicity of Conway’s principles, combined with the ease of the prototyping process, unleashed a wave of creativity that counts 3-D graphics capabilities and the optical mouse among its descendants.
DIS kicked computing speeds into hyperdrive, and VLSI vastly expanded the number of people who could participate in chip design, and together they have substantially contributed to the rich electronic world whose intricacies we now find so entirely natural. That kind of success brought with it exposure to the glare of public scrutiny. Conway was naturally terrified of the implications of fame – could her career, so hard won a second time, be ended by an act of recognition, an anonymous accusation? Faced with the choice of hiding from the limelight or acknowledging her past publicly, Conway made the courageous decision to share her story, placing a full account of her life’s path on the Internet.
Fortunately, the world now is not the world of 1968, and Conway’s identity has been embraced as deeply as her contributions have been lauded. She has been inducted into the Electronic Design Hall of Fame, made a fellow of the IEEE, the Computer History Museum, and the American Association for the Advancement of Science, and holds multiple honorary doctorates. And I am especially happy to report that the young girl who agonized over whether she could ever find love did eventually meet somebody to live out the rest of her days with – Lynn Conway met Charles Rogers in 1987. They married in 2002.
Lead photo by Joseph Xu, courtesy of Lynn Conway.
FURTHER READING: Conway’s homepage is a treasure trove of biographical resources and original documents from her time at IBM and Xerox. Michael Hiltzik’s Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (1999) has a chapter on the development of the Mead-Conway VLSI system which is pretty nifty, and so for that matter is the whole book as a glimpse into one of the great computing think tanks of the Sixties and Seventies.