The Machine That Changed the World (PBS) is the longest, most comprehensive documentary about the history of computing ever produced, but since its release in 1992, it’s become virtually extinct. Out of print and never released online, the only remaining copies are VHS tapes floating around school libraries or in the homes of fans who dubbed the original shows when they aired.
It’s a whirlwind tour of computing before the Web, with brilliant archival footage and interviews with key players — several of whom have passed away since filming. Jointly produced by WGBH Boston and the BBC, it originally aired in the UK as The Dream Machine before its U.S. premiere in January 1992. Its broadcast was accompanied by a companion book co-written by the documentary’s producer, Jon Palfreman.
EPISODE 1: GIANT BRAINS
The first part begins with a brief introduction to the series, summarizing the impact of computers on every aspect of our lives, attributed to their versatile nature. The history of computing begins with the original definition of “computers”: human beings, like William Shanks, who calculated numbers by hand. Frustration with human error led Charles Babbage to develop his difference engine, the first mechanical computer. He later designed the analytical engine, the first general-purpose programmable computer, but it was never finished. Ada Lovelace assisted Babbage with the design and worked out programs for the unbuilt machine, making her the first programmer.
A century later, German engineer Konrad Zuse built the Z1, the first functional general-purpose computer, which counted in binary using mechanical telephone relays. During World War II, Zuse wanted to switch to vacuum tubes, but Hitler killed the project because it would take too long. At the University of Pennsylvania, John Mauchly and J. Presper Eckert built ENIAC, the first general-purpose electronic computer, to aid in military calculations. It wasn’t finished in time to be useful for the war, but soon after, Eckert and Mauchly started the first commercial computer company. It took years before they brought a computer to market, so a British radar engineer named Freddie Williams beat them to building the first stored-program computer. In Cambridge, Maurice Wilkes built EDSAC, the first practical stored-program computer. Alan Turing imagined greater things for computers beyond calculation, after seeing the Colossus computer break German codes at Bletchley Park. Actor Derek Jacobi, performing as Alan Turing in “Breaking the Code,” elaborates on Turing’s insights into artificial intelligence. Computers can learn, but will they be intelligent?
EPISODE 2: INVENTING THE FUTURE
Shortly after the war ended, ENIAC’s creators founded the first commercial computer company, the Eckert-Mauchly Computer Corporation, in 1946. The early history of the company’s funding and progress is told through interviews and personal home movies. They underestimated the cost and time to build UNIVAC I, their new computer for the US Census Bureau, quickly sending the company into financial trouble. Meanwhile, in London, the J. Lyons and Co. food empire teamed up with the EDSAC developers at Cambridge to build LEO, its own computer to manage inventory and payroll. It was a huge success, inspiring Lyons to start building computers for other companies.
The Eckert-Mauchly company was in trouble, with several high-profile Defense Department contracts withdrawn because of a mistaken belief that John Mauchly had Communist ties. After several rescue attempts, the company was sold to Remington Rand in 1950. Remington Rand, then focused on electric razors and business machines, gave UNIVAC its television debut by tabulating live returns during the 1952 presidential election. To CBS’s amazement, it accurately predicted an Eisenhower landslide with only 1% of the vote counted. UNIVAC soon made appearances in movies and cartoons, leading to more business.
IBM was late to enter the computing business, though it had built the massive SSEC in 1948 for scientific research. When the US Census Bureau ordered a UNIVAC, Thomas Watson, Jr. recognized the threat to IBM’s tabulating machine business. IBM introduced its first mass-produced commercial business computer, the IBM 650, in 1953. Though technologically inferior, it soon dominated the market thanks to IBM’s strong sales force, relative affordability, and integration with existing tabulating machines. In 1956, IBM soared past Remington Rand to become the largest computer company in the world. By 1960, IBM had captured 75% of the US computer market.
But developing software for these systems often cost several times as much as the hardware itself, because programming was so difficult and programmers were hard to find. FORTRAN was one of the first higher-level languages, designed for scientists and mathematicians. It didn’t work well for business use, so COBOL soon followed. This led to wider adoption across industries, as software was developed that could automate human labor. “Automation” became a serious fear, as workers worried they’d lose their jobs to machines. Across the country, companies like Bank of America (with ERMA) were eliminating thousands of tedious tabulating jobs with a single computer, though the country’s prosperity and booming job market tempered some of that fear.
In the ’50s, vacuum tubes were an essential component of the electronics industry, found in every computer, radio, and television. The transistor, smaller and far more reliable, meant that far more complex computers could be designed, but they couldn’t be built because wiring together so many components was a logistical nightmare. This “tyranny of numbers” was solved in 1959 with the first working integrated circuit, developed independently by both Texas Instruments and Fairchild. But ICs were virtually ignored until adopted by NASA and the military for use in lunar landers, guided missiles, and jets. Electronics manufacturers soon realized that ICs could be mass-produced; within a decade, they cost pennies to make while becoming a thousand times more powerful. The result was the birth of Silicon Valley and a reborn electronics industry.
EPISODE 3: THE PAPERBACK COMPUTER
Like the books of the Middle Ages, early computers were large, extremely expensive, and maintained by a select few. It seemed unlikely they’d ever be commonplace, partly because they were so difficult to use: developing software was extremely tedious, with interfaces limited to writing instructions on punched cards. Ivan Sutherland’s revolutionary Sketchpad was the first graphical user interface, pioneering the fields of interactive computing, computer-aided drawing, and object-oriented programming. Douglas Engelbart’s NLS, shown in the 1968 “Mother of All Demos,” demonstrated for the first time several concepts that would become commonplace: the mouse, the CRT display, windowing systems, hypertext, videoconferencing, collaborative editing, screen sharing, word processing, and relevance-ranked search. Xerox, realizing computers might lead to paperless communication, created the PARC research laboratory to make computers easy to use. PARC unified several concepts into a usable computer environment, the Xerox Alto, inventing the modern GUI paradigm of folders, files, and documents, along with Ethernet, Smalltalk, WYSIWYG editing, and the laser printer. Xerox marketed the Xerox Star, but it was expensive and a commercial failure.
In 1971, the invention of the microprocessor led to affordable computer kits like the Altair 8800. Groups of computer hobbyists like the Homebrew Computer Club spawned a cottage industry of hardware and software startups, including the founders of Apple Computer, whose Apple I in 1976 was followed by the hugely successful Apple II in 1977. The success of personal computers like the Commodore PET, Atari 400/800, and TRS-80 inspired IBM to enter the market with the PC in 1981, and it soon dominated the industry. Inspired by the work at Xerox PARC, Apple responded with the Macintosh, the first successful mass-produced computer with a mouse and a GUI.
Software turned computers into diverse machines, usable for business, flight simulators, music, illustration, or anything else that could be imagined. Pure software companies like Lotus and Microsoft became tremendously successful, making their founders and early employees very rich. Using a computer required no knowledge of how it worked, and an entire generation was raised on computers as familiar objects. The episode concludes with some excellent conceptual designs of future computers from Apple, and a discussion of the potential uses of virtual reality in future computing.
EPISODE 4: THE THINKING MACHINE
The fourth episode of The Machine That Changed the World covers the history of artificial intelligence and the challenges that come from trying to teach computers to think and learn like us.
EPISODE 5: THE WORLD AT YOUR FINGERTIPS
Here’s the fifth and final episode of The Machine That Changed the World, this one focusing on global information networks including the Internet, and the communication benefits and privacy risks they create.