Tech Giants: Steve Jobs and Bill Gates
Written by: Benjamin Waterhouse, University of North Carolina
By the end of this section, you will:
- Explain the causes and effects of economic and technological change over time
Suggested Sequencing
Use this narrative with the Rodney King and the Los Angeles Race Riots Narrative; the Timothy McVeigh and the Oklahoma City Bombing Narrative; the Is Affirmative Action Justified? Point-Counterpoint; and the AIDS Memorial Quilt, 1987 Primary Source to discuss domestic issues between 1980 and the present day.
Bill Gates, the founder of Microsoft and one of the most influential philanthropists in American history, came from a comfortable, upper-middle-class background. The entrepreneur, whose company became synonymous with personal computing and software development in the 1980s, was born in Seattle, Washington, in 1955, in the midst of the post-war baby boom. He nurtured his interest in electronics while at a private preparatory school and gained admission to Harvard University in 1973. Yet, within two years he had left college, eventually becoming one of Harvard’s most famous dropouts and one of the world’s wealthiest people by the time he was in his forties.
What lured Gates away from Harvard and toward his iconic status in modern American history was the pull of opportunity in the burgeoning world of computer technology. In the years after World War II, government-funded research programs at universities and private companies led to major advances in computing. In 1946, a team of researchers at the University of Pennsylvania announced the completion of one of the first general-purpose, electronic computing machines. Their Electronic Numerical Integrator and Computer (ENIAC) had been sponsored by the U.S. Army, which used it to calculate artillery firing tables and, later, to run calculations for nuclear weapons research. In 1950, the Remington Rand typewriter company acquired the patent rights to the ENIAC and produced the first marketable mainframe computer, the Universal Automatic Computer (UNIVAC), which was about the size of a small truck. At first, the Social Security Administration and other government agencies, which needed to catalog huge volumes of data, were the primary customers of the computer industry. In 1954, General Electric became the first private company to own a mainframe computer.
In the 1960s and early 1970s, mainframe computers shrank in physical size as their computational power grew. This was made possible, in part, by electronic transistors, developed at the private research facility Bell Labs in the late 1940s, which replaced bulky vacuum tubes as switches for electrical current. Then, in 1958, an engineer at Texas Instruments invented the integrated circuit – a group of tiny transistors carved from a single piece of semiconducting material (typically silicon). Integrated circuits, and the single-chip microprocessors built from them in the early 1970s, exponentially increased computational power while shrinking machines' size and cost. In 1974, the MITS company (which originally made calculators) released a microprocessor-based product called the Altair 8800 that was small enough to sit on a desk. It is widely considered the first personal computer (PC), designed for individual electronics hobbyists.
Bill Gates was one of those hobbyists. As a sophomore at Harvard, he and his friend Paul Allen wrote a version of the BASIC programming language for the Altair. Quickly realizing the tremendous business opportunity before them, Gates dropped out of school in 1975 and moved to New Mexico, near MITS’s headquarters. There, he and Allen founded a company to write software (or computer code) for microprocessor-based computers – hence the company’s name: Microsoft.
In the late 1970s, as Gates was launching his company, another recent college dropout began a career that came to reshape the way people around the world interacted with computer technology. In 1976, Steve Jobs and his friend and fellow electronics enthusiast Steve Wozniak co-founded a company they called Apple to market a competitor to the Altair designed by Wozniak: the Apple I. The next year, they introduced the Apple II, one of the first microprocessor-based personal computers to succeed in the mass market. This device was user friendly and included a keyboard, monitor, color graphics, and a built-in programming language, making it suitable for small companies and households, not just hobbyists and large corporations.
In the early 1980s, Gates and Jobs both expanded their respective companies. In 1981, IBM, a decades-old corporation and long-time leader in mainframe computing that had initially stayed away from personal computers, entered the market and used its size and market power to quickly dominate PC sales. Gates built Microsoft into the largest seller of disk operating systems (DOS), licensing its software to IBM and the many makers of IBM-compatible machines, while also writing application programs for Apple’s computers. Jobs, on the other hand, faced growing challenges. By the mid-1980s, IBM PCs and their clones were cheaper and far more prevalent than Apple products, including the Macintosh introduced in 1984. Amid stiff competition and disputes over the direction the company should take, Jobs clashed with Apple’s board of directors and left the company in the fall of 1985. (Wozniak had also resigned earlier that year.)
Advances in computer technology revolutionized all aspects of business and society in the United States and around the world. Faster, smaller, and more powerful computers could more quickly and efficiently store, process, and share data, from payrolls to global price fluctuations to consumer trends. Computerized bar-code scanners allowed retailers to control inventory with precision, and the most successful, like Walmart, used computers to streamline their deliveries, orders, and other logistics. Smoother and more efficient management and communication helped improve worker productivity, the measure of how much an individual worker got done in a given amount of time. New software products like Microsoft’s Windows 95 operating system made personal computing easier and more widespread, cementing the company’s position as a leader in the new technology. Bill Gates stepped down as Microsoft’s chief executive in 2000, increasingly turning his attention and the massive wealth he had accumulated to a wide range of global philanthropic initiatives, particularly through the Bill and Melinda Gates Foundation.
By the 1990s, the boom in information technology spearheaded by companies like Microsoft and Apple had expanded even further with the rapid adoption of the internet. First developed through U.S. Department of Defense research in the 1960s, the internet became increasingly accessible to private citizens and companies as the Cold War came to an end. Companies like AOL, CompuServe, and Netscape developed new programs to make the internet available to, and usable by, the general public. Entire new industries formed in the latter half of the 1990s as online-only companies tried to profit from the new technology. Many burned out quickly when the speculative bubble in technology stocks burst in 2000. Yet several giants emerged even stronger from the “dot-com” bust, including search engine Google and online retailer Amazon, which fundamentally altered traditional models of shopping and outcompeted “brick-and-mortar” stores.
Steve Jobs returned to Apple in 1997. Under his renewed leadership, the company moved beyond personal computers into a wide range of new and extremely successful consumer products. The year 2001 saw the opening of the first Apple stores, designed to bring products directly to consumers. Also that year, Apple launched the iPod, a portable digital audio player; two years later, it created the iTunes store to make it easier for customers to purchase and download music. In 2007, Jobs introduced the iPhone, the standard-bearer of a new line of smartphones that united internet connectivity and computational power in a single handheld device that also made phone calls. The iPad, a tablet-sized device with a touch screen, followed in 2010. By the time of Jobs’s death from pancreatic cancer in 2011 at the age of 56, Apple was among the world’s 20 most profitable companies.
The proliferation of high-speed internet connectivity, easy-to-use software applications (apps), and personal smartphones fundamentally reshaped modern life around the world, from high schools in the United States to rural villages in developing countries to business centers in fast-growing Asian cities. After the dot-com bubble burst early in the new millennium, commentators noted a new way in which consumers used computer technology. A short-lived term, “Web 2.0,” captured an enduring trend: the internet, once a mostly one-way vehicle for consuming information or opinion, was increasingly interactive. Social media platforms arose that allowed users to generate their own content, from photographs and videos to blog posts and commentary. People around the world could connect with each other at very low costs in ways that reshuffled everything from entertainment to education to politics. The world created by business and technological upstarts like Bill Gates and Steve Jobs continues to mold every aspect of modern life.
Review Questions
1. The primary users of early computers were
- home-based businesses
- university students
- government agencies
- online retailers
2. What communications technology had its roots in the U.S. Department of Defense during the Cold War but became increasingly accessible to private citizens and companies during the 1990s?
- The semiconductor
- The internet
- The personal computer
- Smartphones
3. The company co-founded by Steve Jobs and Steve Wozniak made its mark by
- becoming the first private company to own a mainframe computer
- becoming the largest seller of disk operating systems (DOS)
- creating an inexpensive, low-tech, mass-produced microprocessing unit
- developing a user-friendly personal computer system
4. The mass-marketing of computer technology became possible with the invention of
- mainframe computers
- vacuum tubes
- microprocessors
- the internet
5. The work of the Bill and Melinda Gates Foundation is comparable to
- the Gilded Age philanthropy of Andrew Carnegie
- nativist sentiment during the first half of the nineteenth century
- isolationism of the 1930s
- anti-communism of the Cold War
6. The innovative work of Bill Gates and Steve Jobs has most clearly contributed to
- global connectivity
- a rise in nationalism
- an increase in democratic liberalism
- educational reform
Free Response Questions
- Analyze the factors that led to the proliferation of computers in the United States starting in the 1960s.
- Explain the impact of computer technology on business in the United States.
AP Practice Questions
Refer to the chart provided.
1. For the United States, a direct result of the trend demonstrated in the charts was
- an increase in manufacturing-sector employment
- promotion of an isolationist foreign policy
- increased immigration from Central America
- increasing challenges to its superpower status
- Allocation of increased resources to the war on terrorism
- Debates over the appropriate use of American power in the world
- New global developments in science and technology
- Growing economic inequality in the United States
- Debates over the use of resources foster regional identity
- Government policy affects technological innovation
- Demographic change affects the environment
- The investment of private capital halts the development of communication technology
Primary Sources
Goodell, Jeff. “Steve Jobs in 1994: The Rolling Stone Interview.” June 16, 1994. https://www.rollingstone.com/culture/culture-news/steve-jobs-in-1994-the-rolling-stone-interview-231132/
Suggested Resources
Isaacson, Walter. Steve Jobs. New York: Simon and Schuster, 2011.
Lichtenstein, Nelson. The Retail Revolution: How Wal-Mart Created a Brave New World of Business. New York: Picador, 2010.
Mowery, David, and Nathan Rosenberg. Paths of Innovation: Technological Change in 20th-Century America. Cambridge, UK: Cambridge University Press, 1999.
O’Mara, Margaret. Cities of Knowledge: Cold War Science and the Search for the Next Silicon Valley. Princeton, NJ: Princeton University Press, 2005.
Russell, Andrew. Open Standards and the Digital Age: History, Ideology, and Networks. New York: Cambridge University Press, 2014.
Swaine, Michael, and Paul Freiberger. Fire in the Valley: The Birth and Death of the Personal Computer. Third ed. New York: Pragmatic Bookshelf, 2014.
Wallace, James. Hard Drive: Bill Gates and the Making of the Microsoft Empire. New York: John Wiley and Sons, 1992.
Waterhouse, Benjamin C. The Land of Enterprise: A Business History of the United States. New York: Simon and Schuster, 2017.
Wozniak, Steve, with Gina Smith. iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It. New York: W.W. Norton and Company, 2006.