Caffeinated Bitstream

Bits, bytes, and words.

The Dream Machine, and highlights from the dawn of computing

It's easy to imagine that computing sprang into existence with the advent of home computers in the late 1970's and 1980's, just as many people have the perception that the Internet sprang into existence in the mid- to late-1990's. In both cases, these technologies crossed thresholds that made them accessible to general consumers, leading to greatly increased usage that makes their pre-consumer existences seem quite meager. However, in the early days, great minds were hard at work developing revolutionary ideas that are so ingrained in computing today that they are largely taken for granted.

I've gained a much better appreciation for this pre-PC era of computing after recently reading The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal, written by M. Mitchell Waldrop and published in 2002. While ostensibly a biography of J.C.R. Licklider, it actually devotes more text to the contributions of many other pioneers, including Norbert Wiener, Vannevar Bush, Claude Shannon, Alan Turing, John von Neumann, John McCarthy, Douglas Engelbart, Bob Taylor, and Alan Kay. Licklider's life and career provide the book with a convenient backbone, both because he crossed paths with so many of these notable figures and because the book focuses on how they helped bring about his vision of interactive computing as a means of boosting human productivity.

I can definitely recommend this book to anyone interested in the history of computing, as it clearly conveys how our modern world of computing didn't arrive overnight but rather is the result of a long continuum of ideas from throughout the 20th century. In the words of the author:

I finally began to realize that windows, icons, mice, word processors, spreadsheets, and all the other things that seem so important to the modern software industry actually come at the end of the story, not at the beginning.

— Waldrop, M. Mitchell. The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal (p. 472). Kindle Edition.

In this post, I'll share some notable highlights from this history, grouped into a few common themes I found interesting.

The time travelers

Douglas Engelbart delivers what was later known as The Mother of All Demos in 1968.

Some of the pioneers of computing were so far ahead of their time that they almost seem like time travelers from the future.

  • Vannevar Bush's Memex. In his 1939 draft article submitted to Fortune, Vannevar Bush described a desktop device that would allow a user to instantly access a library of printed material, all of which could be connected with links so the user could effortlessly jump from one resource to another, with the goal of enhancing the productivity of intellectual workers. Conceived 52 years before the World Wide Web, Bush's Memex is astonishingly similar to hypertext and Wikipedia. The Memex was described as an electromechanical system based on microfilm. While it was never built, the idea inspired many later researchers.
  • The Mother of All Demos. In 1968, Douglas Engelbart delivered a groundbreaking technology demonstration showing his team's work at Stanford Research Institute on the "oN-Line System" (NLS), which introduced a number of new ideas such as the mouse, a practical hypertext system, and a raster-scan video output. It integrated a number of pre-existing ideas such as word processing to create a unique system for productivity and online collaboration. The demonstration itself was extraordinarily elaborate for its day, featuring a 22-foot-tall projection screen, microwave video links back to SRI, and backstage video mixing. It later came to be known as The Mother of All Demos. It's a testament to how fully Engelbart's vision has become reality that, after reading about his presentation from 50 years ago, you can follow a link to YouTube and be watching it in seconds.

World War II

Reading the chapters that covered the 1930's and 1940's, I was struck by how much progress was put on hold as World War II threw a colossal non-maskable interrupt at the scientists working on the foundations of computing. Industry and academia shifted gears to support the war effort, and individual innovators put their computing projects aside while they lent their skills to facing the imminent threats. The author concludes that the war ultimately forged the pieces of computer theory "into a unified whole". (Ibid., p. 40) Nonetheless, the book contains a number of examples where critical ideas were put on ice for the duration of the war, such as:

  • Vannevar Bush put his Memex research on hold in 1940 to create and lead the National Defense Research Committee (NDRC), which organized scientific research into defense technology. Norbert Wiener proposed including digital computer research in the NDRC's scope, but Bush felt the need to prioritize more immediately useful technology such as radar, anti-aircraft fire control, anti-submarine warfare, and the atomic bomb. It wasn't until 1945 that Bush returned to the topic and published his influential essay, "As We May Think".
  • Claude Shannon's 1937 master's thesis "A Symbolic Analysis of Relay and Switching Circuits" laid the foundation for digital computing, but his even more revolutionary contributions would have to wait. In 1941, he joined Bell Labs and worked on anti-aircraft fire control and encrypted radio systems during the day, and spent what time he could in the evenings developing his ideas about the fundamentals of communication. It wasn't until 1948 that he published his ideas in "A Mathematical Theory of Communication" and became known as the father of Information Theory.
  • John V. Atanasoff invented the first electronic digital computer in 1939. In 1942 he left academia to oversee the acoustic testing of mines at the Naval Ordnance Laboratory, and continued to work in non-computing fields after the war.

An interesting counterpoint is John von Neumann's career. The war brought him into the Manhattan Project, where his experience performing large-scale calculations on mechanical tabulators pushed him to develop his ideas about electronic computing, ultimately leading to his famous von Neumann architecture.

The things we take for granted

Many ideas that seem obvious to us today were actually not obvious at all, but rather had to be invented. Some examples include:

  • Interactive computing. We take it for granted today that you use a computer by entering commands (via the keyboard, mouse, or the touchscreen on your phone or tablet), receiving immediate feedback on the screen, and then entering more commands. In the 1950's and 1960's, however, such a usage model was not only uncommon but actually controversial.

    One of the central themes of the book is Licklider's push for interactive computing, where an operator can work with a computer in real time to solve problems through the exploration of ideas, trial and error, and rapid feedback. This went against the prevailing idea of the era that batch processing on centralized computers would always make more sense. Proponents of batch processing maintained that no matter how inexpensive a unit of computation became, it would still be most efficient to concentrate those resources in one giant computer that could focus its entire capacity on performing a sequence of submitted tasks one after the other. Indeed, early attempts to provide interactivity via time-sharing suffered from considerable overhead as the processor had to switch between many users fast enough to maintain the illusion that each user had his or her own computer. Ultimately, the productivity benefits outweighed the overhead, and we all use computers interactively today. (A toy sketch of this kind of round-robin time-slicing appears at the end of this list.)

  • Programmability. Even though Charles Babbage had described how a programmable machine might be built in the 1830's, the concept was still exotic in the 1930's. The idea that a machine might make a decision and choose between alternate courses of action was quite radical and sounded eerily similar to thinking. Thus, Howard Aiken's Mark I, one of the first programmable computers, was often called "the electronic brain". (Interestingly, according to Wikipedia, loops were initially implemented on the Mark I by physically creating loops of tape so the same instructions could be fed to the processor repeatedly!)

    Today, programmability seems like the essence of computing. But in those days, even after engineers had the means to implement Babbage's vision, it took a great deal of further thought to work out how to harness the power of programming. In 1947, Herman H. Goldstine and John von Neumann published "Planning and Coding Problems for an Electronic Computing Instrument", which outlined techniques such as flow charting and subroutines, and software engineering was born. (Ibid., p. 87)

    Years later, in the 1950's, engineers were still struggling to grasp the complexities of software:

    Lincoln Lab's initial guess for the programming requirements on SAGE — that it would require perhaps a few thousand lines of computer code to run the entire air-defense system — was turning out to be the most laughable underestimate of the whole project. True, the Lincoln Lab team was hardly alone in that regard. Many computer engineers still regarded programming as an afterthought: what could be so hard about writing down a logical sequence of commands?

    — Waldrop, M. Mitchell. The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal (p. 118). Kindle Edition.

  • Binary math. We take it as a given today that the bit is the fundamental unit of information and that binary, since electrical switches naturally hold one of two states (off or on), is the most elegant way of expressing values in a computer. This was not always obvious. Designers of early computing devices assumed that decimal arithmetic was the natural approach, as each digit could store more information and no base conversion of inputs and outputs was needed for the benefit of humans. Manufacturers of mechanical calculators in the 1930's actually used complex systems of gears to perform decimal arithmetic end-to-end. Even Aiken's Mark I, which was proposed in 1937 but not built until 1944, operated directly on decimal numbers.

    Everything changed in 1937 with Claude Shannon's master's thesis, "A Symbolic Analysis of Relay and Switching Circuits", which showed how Boolean logic could be used to design and analyze binary digital circuits. Independently in 1937 at Bell Labs, George Stibitz invented a binary adding circuit. Shannon's 1948 paper "A Mathematical Theory of Communication" established that the fundamental unit of information is the bit, a term coined by his colleague J.W. Tukey, and the rest is history. (A small sketch of binary addition built from that kind of switching logic follows this list.)
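
To make the binary point concrete, here is a small sketch in Python (my own illustration, not the book's, and certainly not a reconstruction of Stibitz's circuit) of a full adder built from nothing but Boolean operations, the kind of switching logic Shannon showed could carry out arithmetic:

    # A full adder expressed with Boolean operations (AND, OR, XOR) -- the
    # kind of switching logic Shannon's thesis analyzed with Boolean algebra.
    def full_adder(a, b, carry_in):
        """Add two bits plus a carry; return (sum_bit, carry_out)."""
        sum_bit = a ^ b ^ carry_in                  # XOR yields the sum bit
        carry_out = (a & b) | (carry_in & (a ^ b))  # carry propagates onward
        return sum_bit, carry_out

    def add_binary(x, y, width=8):
        """Ripple-carry addition of two non-negative integers, bit by bit."""
        result = carry = 0
        for i in range(width):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result

    print(add_binary(0b0101, 0b0011))   # prints 8, i.e. 0b1000

Chaining the adder across the bits is all it takes; the base conversion that worried the decimal-machine builders happens only at the edges, when humans write the inputs and read the outputs.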
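
While I'm adding sketches, here is a similarly small toy (again mine, not anything from the book, with made-up names and numbers) of the round-robin time-slicing described in the interactive computing item above: each user's job gets a small quantum of processor time, and every switch between jobs burns a little overhead, which is exactly the cost the batch-processing camp pointed to.

    # Toy round-robin time-sharing loop. Each job gets a fixed quantum of
    # processor time per turn, and every context switch costs a little
    # overhead -- the inefficiency that batch-processing advocates cited.
    from collections import deque

    QUANTUM_MS = 10   # useful work per turn, in milliseconds (made-up figure)
    SWITCH_MS = 1     # cost of each context switch (made-up figure)

    def run_time_shared(jobs):
        """Run {name: remaining_ms} jobs round-robin; return (elapsed, overhead)."""
        queue = deque(jobs.items())
        elapsed = overhead = 0
        while queue:
            name, remaining = queue.popleft()
            elapsed += SWITCH_MS              # pay the switching cost
            overhead += SWITCH_MS
            work = min(QUANTUM_MS, remaining)
            elapsed += work                   # do a slice of useful work
            if remaining > work:
                queue.append((name, remaining - work))  # back of the line
        return elapsed, overhead

    print(run_time_shared({"alice": 25, "bob": 40, "carol": 15}))
    # (89, 9): 80 ms of useful work plus 9 ms lost to switching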

Skepticism

New technology often faces skepticism and opposition from people invested in the existing technology.

  • Packet-switched networks. People invested in circuit-switched communication were highly resistant to the idea of packet switching. The ARPANET designers were routinely criticized by their Pentagon colleagues and AT&T engineers for their decision to base the network on packet switching, receiving comments such as "this is how a telephone works...", "there's just no way this can work", and "the buffers are going to run out". (Ibid., p. 227)
  • Moore's Law. Jacob Goldman, creator of Xerox's Palo Alto Research Center (PARC), tried to explain Moore's Law to the pencil pushers at Xerox, but they refused to believe that such a thing could even be possible. (Ibid., p. 389)
  • Laser printers. Goldman also struggled to convince Xerox of the value of one of PARC's legendary inventions, the laser printer. Apparently many at Xerox felt safer going with an alternative printing technology developed by another team, which basically involved gluing a CRT screen to a photocopier. In the end, Goldman somehow managed to convince the product selection committee to embrace the laser printer, which eventually made billions of dollars for Xerox. (Ibid., p. 392)
  • Internet. As late as 1990, it was still hard to sell people on the value of the Internet. Notably, AT&T looked into its business potential but concluded that it couldn't be profitable. (Ibid., pp. 462-463)