The Machine That Changed the World (Documentary)

The Machine That Changed the World is the longest, most comprehensive documentary about the history of computing ever produced. The film consists of five fantastic episodes, but unfortunately, since its release in 1992, it has become virtually extinct.

Out of print and never released online, the series survives only as VHS tapes floating around school libraries or in the homes of fans who dubbed the original shows when they aired. Below you can watch all five parts of The Machine That Changed the World, and even though it has been 20 years since the series was originally produced, it is still absolutely worth watching.

Part I: Giant Brains

The first part begins with a brief introduction to the series, summarizing the impact of computers on every aspect of our lives, attributed to their versatile nature. The history of computing begins with the original definition of “computers”: human beings like William Shanks who calculated numbers by hand. Frustration with human error led Charles Babbage to develop his difference engine, the first mechanical computer. He later designed the analytical engine, the first general-purpose programmable computer, but it was never finished. Ada Lovelace assisted Babbage with the design and worked out programs for the unbuilt machine, making her the first programmer.
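
What the difference engine actually automated was the method of finite differences: for a polynomial, the nth differences are constant, so an entire table of values can be produced with nothing but repeated addition. Here is a minimal Python sketch of that idea (the function name and the sample polynomial are illustrative, not taken from the film):

    def difference_engine(column, steps):
        # column holds [f(0), first difference, second difference, ...];
        # for f(x) = x**2 + x + 1 sampled at x = 0, 1, 2, ... it starts as [1, 2, 2].
        cols = list(column)
        table = [cols[0]]
        for _ in range(steps):
            for i in range(len(cols) - 1):  # each register adds the one beneath it
                cols[i] += cols[i + 1]
            table.append(cols[0])
        return table

    print(difference_engine([1, 2, 2], 5))  # -> [1, 3, 7, 13, 21, 31]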

A century later, German engineer Konrad Zuse built the Z1, the first functional general-purpose computer, a binary machine built from mechanical parts (his later machines used telephone relays). During World War II, Zuse wanted to switch to vacuum tubes, but Hitler killed the project because it would take too long. At the University of Pennsylvania, John Mauchly and J. Presper Eckert built ENIAC, the first general-purpose electronic computer, to aid in military calculations. It wasn’t finished in time to be useful for the war, but soon after, Eckert and Mauchly started the first commercial computer company. It took them years to bring a computer to market, and a British radar engineer named Freddie Williams beat them to building the first stored-program computer. In Cambridge, Maurice Wilkes built EDSAC, the first practical stored-program computer. Alan Turing imagined greater things for computers beyond calculation after seeing the Colossus computer break German codes at Bletchley Park. Actor Derek Jacobi, performing as Alan Turing in “Breaking the Code,” elaborates on Turing’s insights into artificial intelligence. Computers can learn, but will they be intelligent?

Interviews:

Paul Ceruzzi (computer historian), Doron Swade (London Science Museum), Konrad Zuse (inventor of the first functional computer and high-level programming language, died in 1995), Kay Mauchly Antonelli (human computer in WWII and ENIAC programmer, died in 2006), Herman Goldstine (ENIAC developer, died in 2004), J. Presper Eckert (co-inventor of ENIAC, died in 1995), Maurice Wilkes (inventor of EDSAC), Donald Michie (Codebreaker at Bletchley Park)

 

Part II: Inventing The Future

The rise of commercial computing, from UNIVAC to IBM in the 1950s and 1960s:

Shortly after the war ended, ENIAC’s creators founded the first commercial computer company, the Eckert-Mauchly Computer Corporation in 1946. The early history of the company’s funding and progress is told through interviews and personal home videos. They underestimated the cost and time to build UNIVAC I, their new computer for the US Census Bureau, quickly sending the company into financial trouble. Meanwhile, in London, the J. Lyons and Co. food empire teamed up with the EDSAC developers at Cambridge to build LEO, their own computer to manage inventory and payroll. It was a huge success, inspiring Lyons to start building computers for other companies.

The Eckert-Mauchly company was in trouble, with several high-profile Defense Department contracts withdrawn because of a mistaken belief that John Mauchly had Communist ties. After several attempts to save it, the company was sold to Remington Rand in 1950. Remington Rand, then focused on electric razors and business machines, gave UNIVAC its television debut by tabulating live returns during the 1952 presidential election. To CBS’s amazement, it accurately predicted an Eisenhower landslide with only 1% of the vote counted. UNIVAC soon made appearances in movies and cartoons, leading to more business.

IBM was late to enter the computing business, though it had built the massive SSEC in 1948 for scientific research. When the US Census Bureau ordered a UNIVAC, Thomas Watson, Jr. recognized the threat to IBM’s tabulating machine business. IBM introduced its first mass-produced commercial business computer, the IBM 650, in 1953. Though technologically inferior, it soon dominated the market thanks to IBM’s strong sales force, relative affordability, and integration with existing tabulating machines. In 1956, IBM soared past Remington Rand to become the largest computer company in the world. By 1960, IBM had captured 75% of the US computer market.

But developing software for these systems often cost several times as much as the hardware itself, because programming was so difficult and programmers were hard to find. FORTRAN was one of the first higher-level languages, designed for scientists and mathematicians. It didn’t work well for business use, so COBOL soon followed. This led to wider adoption across industries, as software was developed that could automate human labor. “Automation” became a serious fear, as workers worried they would lose their jobs to machines. Across the country, companies like Bank of America (with ERMA) were eliminating thousands of tedious tabulating jobs with a single computer, though the country’s prosperity and booming job market tempered some of that fear.

In the ’50s, vacuum tubes were an essential component of the electronics industry, found in every computer, radio, and television. The transistor replaced the bulky, fragile tube and meant that far more complex computers could be designed, but they couldn’t be built, because hand-wiring thousands of individual components together was a logistical nightmare. This “tyranny of numbers” was solved with the first working integrated circuits, developed independently at Texas Instruments in 1958 and Fairchild in 1959. But ICs were virtually ignored until NASA and the military adopted them for use in lunar landers, guided missiles, and jets. Electronics manufacturers soon realized that ICs could be mass-produced; within a decade, they cost pennies to make while becoming a thousand times more powerful. The result was the birth of Silicon Valley and a reborn electronics industry.

Interviews:
Ted Withington (network engineer, industry analyst), Paul Ceruzzi (Smithsonian), J. Presper Eckert (ENIAC co-inventor, died 1995), Morris Hansen (former US Census Bureau, died 1990), John Pinkerton (Chief Engineer, LEO, died 1997), Thomas J. Watson, Jr. (Chairman Emeritus, IBM, died 1993), James W. Birkenstock (retired Vice President, IBM, died 2003), Jean Sammet (programming language historian), Dick Davis (retired Senior V.P., Bank of America), Robert Noyce (co-inventor, integrated circuit, died 1990), Gordon Moore (former Chairman of the Board, Intel), Steve Wozniak (Co-founder, Apple)

 

Part III: The Paperback Computer

Like the books of the Middle Ages, early computers were large, extremely expensive, and maintained by a select few. It seemed unlikely they would ever become commonplace, partly because they were so difficult to use: developing software was extremely tedious, with the interface limited to writing instructions on punched cards. Ivan Sutherland’s revolutionary Sketchpad was the first graphical user interface, pioneering the fields of interactive computing, computer-aided drawing, and object-oriented programming. Douglas Engelbart’s NLS, shown in the 1968 “Mother of All Demos,” demonstrated for the first time several concepts that would become commonplace: the mouse, the CRT display, windowing systems, hypertext, videoconferencing, collaborative editing, screen sharing, word processing, and a search engine that ordered results by relevance. Xerox, realizing computers might lead to paperless communication, created the PARC research laboratory to make computers easy to use. PARC unified several of these concepts into a usable computer environment, the Xerox Alto, inventing the modern GUI paradigm of folders, files, and documents, along with Ethernet, Smalltalk, WYSIWYG editing, and the laser printer. Xerox marketed the Xerox Star, but it was expensive and a commercial failure.

In 1971, the invention of the microprocessor led to affordable computer kits like the Altair 8800. Groups of computer hobbyists like the Homebrew Computer Club spawned a cottage industry of hardware and software startups, among them the founders of Apple Computer. Their Apple I in 1976 and Apple II in 1977 were huge hits. The success of personal computers like the Commodore PET, Atari 400/800, and TRS-80 inspired IBM to enter the market with the PC in 1981, and IBM soon dominated the industry. Inspired by the work at Xerox PARC, Apple responded with the Macintosh, the first successful mass-produced computer with a mouse and GUI.

Software enabled computers to become diverse machines, usable for business, flight simulation, music, illustration, or anything else that could be imagined. Pure software companies like Lotus and Microsoft became tremendously successful, making their founders and early employees very rich. Using a computer required no knowledge of how it worked, and an entire generation grew up with computers as familiar objects. The episode concludes with some excellent conceptual designs of future computers from Apple, and a discussion of the potential uses of virtual reality in future computing.

Interviews:
Canon John Tiller (Library Master, Hereford Cathedral), Mitch Kapor (Founder, Lotus), Robert Taylor (Xerox PARC), Ted Nelson (Creator, Project Xanadu), Douglas Engelbart, Larry Tesler (Xerox PARC), Alan Kay (Xerox PARC), Ted Hoff (Co-inventor, microprocessor), Steve Jobs (Cofounder, Apple), Steve Wozniak (Cofounder, Apple), Mike Markkula (Investor, Apple), Lee Felsenstein (Designer, Osborne 1), Bill Gates (Chairman, Microsoft), Chris Peters (Manager, Office), Anne Meyer (Center for Applied Special Tech.), Dr. Henry Fuchs (UNC, Chapel Hill), Dr. Jane Richards (UNC, Chapel Hill), Dr. Frederick P. Brooks, Jr (UNC, Chapel Hill)

 

Part IV: The Thinking Machine

The history of artificial intelligence, from Minsky to neural networks.

The fourth episode of The Machine That Changed the World covers the history of artificial intelligence and the challenges that come from trying to teach computers to think and learn like us.

Key People/Concepts:
Jerome B. Wiesner, Marvin Minsky, John McCarthy, Oliver Selfridge of Lincoln Labs, Claude Shannon, Freddy Robot at the University of Edinburgh, Sir James Lighthill, Terry Winograd’s SHRDLU, Edward Feigenbaum’s work on expert systems, Doug Lenat’s Cyc project, Oliver Sacks, neural networks, NETtalk

Interviews:
Marvin Minsky (MIT), Hubert Dreyfus (UC Berkeley), Edward Feigenbaum (Stanford University), Hans Moravec (Carnegie Mellon Robotics Institute), Doug Lenat (University of Texas, Austin), Dean Pomerleau (Carnegie Mellon Robotics Institute), Terrence Sejnowski (Salk Institute)

 

Part V: The World At Your Fingertips

Computer networks, including the Internet, and their global impact on communication and privacy.

Here’s the fifth and final episode of The Machine That Changed the World, this one focusing on global information networks including the Internet, and the communication benefits and privacy risks they create. This is the most familiar material of the documentary, so I’m going to skip the notes and annotations this time. I hope you enjoyed the documentary as much as I did.

Interviews:
Robert Lucky (AT&T Bell Labs), Dave Hughes, Kathleen Bonner (Trader, Fidelity), George Hayter (Former Head of Trading, London Stock Exchange), Ben Bagdikian (UC Berkeley), Arthur Miller (Harvard Law School), Forman Brown (songwriter, died in 1996), Tan Chin Nam (Chairman, National Computer Board of Singapore), B.G. Lee (Minister of Trade and Industry, Singapore), Lee Fook Wah (Assistant Traffic Manager, MRT Singapore), David Assouline (French activist, now a senator), Mitch Kapor (Founder, Lotus), Michael Drennan (Air traffic controller, Dallas-Fort Worth)
