From space travel to the World Wide Web, email to nanoscale robots, humanity’s constant drive for improvement and innovation has changed the course of history. So hop aboard and get comfortable — the time machine is all fired up and ready to relive the greatest scientific discoveries of decades past. Let’s go!
Our first trip took us back to the 1960s — a decade of moon landings and microchips. This time we’re setting course for the most innovative technology in the ’70s: the personal computer.
The Mainframe Goes Mainstream
Today, the computer revolution is complete: personal computers (PCs) are commonplace. Low-cost business models are bought in bulk for offices; mid-range technological workhorses can be easily purchased online; and high-end, bespoke gaming PCs offer maximum performance for a premium price. But just over 40 years ago, the PC industry was nonexistent.
While massive mainframes gained traction for large-scale enterprise computing, their price and size put them well beyond the reach of the middle class, and they required extensive technical knowledge to operate. All that changed in 1974, with the release of the MITS Altair 8800, which used the Intel 8080 CPU and ran Altair BASIC, a version of the BASIC programming language written for the machine by Paul Allen and Bill Gates. As noted by Computer History, after Popular Electronics featured the 8800, priced at just under $300, demand soared. MITS co-founder Ed Roberts coined the term “personal computer” and the rest, as they say, is history.
And history is where we’ll start — we’re heading all the way back to 1703.
Binary Beginnings
Before the computer revolution really gained speed, early computers were cumbersome and limited in scope. Developers needed a framework to underpin critical functions while controlling for complexity, and binary fit the bill. As Inverse points out, while the binary number system of 1s and 0s dates back to ancient Egypt, mathematician Gottfried Wilhelm Leibniz refined the concept in 1703 in his article Explication de l’Arithmétique Binaire, which described a method to represent any number using only 1s and 0s. More than two centuries later, binary made it possible for programmers to easily define computing “on” and “off” states, spurring the rapid advancement of computer science.
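The concept itself fits in a few lines of code. The sketch below is written in modern Python purely for illustration (nothing like it appears in Leibniz’s paper, and 1970s machines were programmed much closer to the hardware); it converts an ordinary decimal number into its binary form by repeatedly dividing by two and collecting the remainders:

    def to_binary(n):
        """Write a non-negative whole number using only 1s and 0s."""
        if n == 0:
            return "0"
        bits = []
        while n > 0:
            bits.append(str(n % 2))  # the remainder is the next binary digit
            n //= 2                  # halve the number and repeat
        return "".join(reversed(bits))

    print(to_binary(1974))  # prints 11110110110

Every instruction and every piece of data a computer handles ultimately boils down to strings like this one, with each 1 or 0 realized in hardware as an “on” or “off” state.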
By 1936, Alan Turing had advanced the idea of a “universal machine” capable of computing anything, and in 1941, J.V. Atanasoff and Clifford Berry created a computer capable of storing information in memory. In the early 1950s, Grace Hopper developed the first compiler, work that paved the way for COBOL, and in the mid-1960s, integrated circuits opened doors for the first generation of computers with graphical user interfaces (GUIs). IBM engineers developed the floppy disk for information sharing in 1971, and in 1974 personal computers emerged: affordable, easy to understand and (for the time) powerful.
Now, It’s Personal
The MITS Altair 8800 caught consumer attention in 1974, but other offerings weren’t far behind. As noted by Computer History, the same era produced Xerox’s Alto, a personal computer that could share and print files and offered a word processing program called “Bravo.” Other options, such as IBM’s 5100 and Radio Shack’s TRS-80 (sometimes called the Trash 80), also garnered attention; beyond the clever nickname, the TRS-80 changed the PC market by allowing ordinary users to write their own simple programs.
1976 saw the release of the Apple I, a single-circuit-board computer sold as an assembled board without keyboard, power supply or housing; since users had to supply those themselves, its technological advancement didn’t really spur seismic market change. A year later, however, the now-famous mobile device maker doubled down with the Apple II, a feature-complete PC that included color graphics and a floppy disk drive. And in 1979, the first electronic spreadsheet, VisiCalc, hit the market. While not exactly exciting for consumers, it helped make things personal for business users, offering valuable number-crunching power without the need for a room-sized mainframe.
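The core idea that made an electronic spreadsheet so valuable is easy to sketch: a grid of cells, some holding plain numbers and some holding formulas, where changing one input automatically updates every result that depends on it. The toy example below uses modern Python and made-up figures purely for illustration; VisiCalc itself was written in assembly language for the Apple II and worked very differently under the hood:

    def get(sheet, name):
        """Return a cell's value, evaluating it first if it holds a formula."""
        value = sheet[name]
        return value(sheet) if callable(value) else value

    # Cells hold either plain numbers or formulas written as functions of the grid.
    sheet = {
        "A1": 120,                                    # January unit sales (made-up figure)
        "A2": 95,                                     # February unit sales (made-up figure)
        "A3": lambda s: get(s, "A1") + get(s, "A2"),  # running total
        "A4": lambda s: get(s, "A3") * 3,             # revenue at a hypothetical $3 per unit
    }

    print(get(sheet, "A4"))  # 645
    sheet["A2"] = 150        # change a single input...
    print(get(sheet, "A4"))  # ...and every dependent cell reflects it: 810

That instant recalculation, which once meant redoing a paper ledger by hand, is a big part of why VisiCalc is remembered as the PC’s first “killer app.”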
The result of a decade’s worth of PC development? Tech revolution. As noted by research firm Gartner, even in a digital world increasingly driven by mobile device sales, worldwide PC shipments topped 68 million in the fourth quarter of 2018 alone.
Cultural Computing
Innovative technology in the ’70s culminated in the development of accessible, attainable personal computers. Large-scale business adoption, the digital gaming industry and electronic word processing flourished, as did technology giants, including Microsoft and Apple.
But getting a true sense of the PC’s impact means looking at another 1970s technological and cultural touchstone: “Star Wars.” Released in 1977, just as the PC craze was starting to take off, “A New Hope” casually showcased computer use in everyday life. From advanced onboard ship systems and AI-driven droids to simple handheld devices, computing was just part of life as Luke (mostly) brought balance to the Force. Today, PCs are ubiquitous in homes and offices worldwide, often so unremarkable that they’re ignored until actively required.
Informed by the integrated circuit technology of the 1960s and bolstered by more than two centuries of computing science, PCs took the ’70s by storm to rank among humankind’s greatest scientific discoveries. They also paved the way for our next stop: 1989 and the invention of the World Wide Web.
Are you interested in all things related to technology? We are, too. Check out Northrop Grumman career opportunities to see how you can participate in this fascinating time of discovery.