Information: The Revolution that Didn’t Happen

By Alex Sayf Cummings

It is not uncommon for Americans to think they are always in the middle of a communication revolution. In the early nineteenth century, when transportation essentially was communication and a letter traveled only as fast as it could physically be moved, the invention of the steamship sped the movement of information.

Then came advances in printing technology that gave us the penny press and the dime novel, and, of course, the telegraph, which transmitted messages instantaneously. By the 1870s came the telephone and the phonograph; soon after followed radio and film. Each of these moments could qualify as a revolution in its own right, or all of them together as episodes in one long information revolution.

But that, of course, is not how we typically use the term “information revolution.” Since the 1960s, journalists, intellectuals, and tech companies have promoted the idea that the United States was becoming a postindustrial society, one that produced information instead of things.  Manufacturing was declining, if not dead, and in the future Americans would make music, movies, pharmaceuticals, software, and the like.

The message came from many quarters. Sociologist Daniel Bell talked about the “coming of post-industrial society” and the rise of “knowledge workers” in an influential 1973 book; companies such as RCA and IBM had already hyped the “information revolution” in the 1960s, framing their own computing technologies as beneficial to society.  Economist Marc Uri Porat found that information industries already accounted for 53% of labor income by 1967.

Of course, Porat used a very elastic definition to come up with his 53% figure.  Information workers included “the research scientist, engineer, designer, draftsman, manager, secretary, clerk, accountant, lawyer, advertising manager, communications officer, personnel director” — people who are “paid to create knowledge, communicate ideas, and process information.”

But is a mailman an information worker? He handles information — if, in fact, a Crate & Barrel catalog can be understood to constitute “knowledge.” What about the clerk at Barnes & Noble? A book changes hands; he or she uses an electronic cash register, paired with a credit card terminal that processes information, sending immaterial signals shooting through the financial circuits of the world. 

Indeed, computing technologies have permeated virtually every line of work since the 1960s, but that does not mean that everyone whose job involves processing information in some way or another works in an information industry. If we classify everyone who doesn’t literally labor on an assembly line making physical objects as an information worker, then, yes, the so-called information industry is the largest of all and has been for decades. Manufacturing has been in decline as a share of employment since the 1950s, even though it has grown ever more productive in creating value and output—American factories make more stuff with less labor, owing in large part to automation. 

In other words, the notion that America doesn’t “make stuff” anymore is spurious. It may be true that industries such as software, pharmaceuticals, and biotechnology have grown more significant for the US economy in the last forty years. But the idea of an “information revolution” or an “information society” obscures more than it illuminates. 

For example, healthcare remains one of the fastest growing sectors of the economy, accounting for nearly a fifth of GDP. Is this an information industry? Certainly new technologies have transformed the practice of medicine, and doctors, nurses, and other medical professionals may be “knowledge workers” in the sense that they require extended education and training to do their jobs. 

Education, too, might be considered the quintessential information industry. After all, aren’t schools and universities all about the transmission of knowledge? And scientists and other academics literally produce information in the form of articles, books, and patents.

However, Georgia State University, where I work, is not quite Google; Wellesley is not GlaxoSmithKline. Lumping together a variety of different jobs and industries under the single rubric of information is a careless move, akin to Richard Florida’s maneuver of categorizing the guitar-playing barista, the computer programmer, and the corporate CEO all as members of the “creative class.”

Rather, I would suggest that the service economy is a far better descriptor of where we are today than the information economy. As manufacturing declined, more and more people began providing services: the accountant, the financial planner, the teacher, the nurse, the massage therapist, the realtor, the dog walker. Even if we developed a machine that churned out all the stuff we need, much like the all-purpose replicator on Star Trek, human needs would still have to be met by other humans. Hence, “service.”

Critics might quibble that this is a purely semantic debate. However, as the conservative thinker Richard Weaver reminded us long ago, “ideas have consequences.” As I argue in my book Democracy of Sound, lawmakers have passed increasingly powerful protections for intellectual property—copyrights, patents, and trademarks—since the 1970s, on the theory that information industries deserved special treatment in a postindustrial world. And as the loud calls to “let Detroit die” during the 2008 economic crisis showed, many Americans have taken for granted the idea that manufacturing is a thing of the past, missing the real value that the auto industry and other sectors contribute to the economy.

None of this is to deny that computers have transformed our lives in countless ways, or that many new jobs have been created in the tech sector (who could have guessed 25 years ago what an “app developer” would be?). But the idea of the information revolution distorts our understanding of the economy we live in and privileges the interests of a few key sectors—Hollywood and Silicon Valley—in ways that may not benefit workers, consumers, or other industries.

Partly, this boils down to recency bias. Thinkers such as Jeremy Rifkin credit computing with launching a “Third Industrial Revolution,” one that transforms society as thoroughly as the first two. It took humans tens of thousands of years to figure out the first revolution (agriculture), and another few thousand to get to the second (manufacturing). Then, perhaps 150 years later, we arrive at yet another transformative moment, with the rise of ENIAC and Clippy and Angry Birds.

What is nearest in the rearview mirror often seems most important. But communication technologies arguably revolutionized the world far more dramatically in the nineteenth and early twentieth centuries than in the recent past, the heyday of high-tech. We might be wiser to see the information revolution of the late twentieth century as an extension of a much longer transformation—or perhaps as no revolution at all.

Alex Sayf Cummings is an associate professor of History at Georgia State University. He is the author of Democracy of Sound: Music Piracy and the Remaking of American Copyright in the Twentieth Century (Oxford, 2013) and co-editor of the blog Tropics of Meta.

Title image: IBM Pavilion visitors view a computer operating a Jacquard loom at the Lakeside Pavilion during HemisFair ’68, the World’s Fair held in San Antonio from April to October 1968. Photograph.

3 thoughts on “Information: The Revolution that Didn’t Happen”

  1. Isn’t labeling someone a “service worker” just as “careless” a move as labeling everyone an “information worker”? Dog walkers and professors are substantially different things.

