The “extended mind” theory of cognition argues that the reason humans are so intellectually dominant is that we’ve always outsourced bits of cognition, using tools to scaffold our thinking into ever-more-rarefied realms. Printed books amplified our memory. Inexpensive paper and reliable pens made it possible to externalize our thoughts quickly. Studies show that our eyes zip around the page while performing long division on paper, using the handwritten digits as a form of prosthetic short-term memory. “These resources enable us to pursue manipulations and juxtapositions of ideas and data that would quickly baffle the unaugmented brain,” as Andy Clark, a philosopher of the extended mind, writes.
Granted, it can be unsettling to realize how much thinking already happens outside our skulls. Culturally, we revere the Rodin ideal—the belief that genius breakthroughs come from our gray matter alone. The physicist Richard Feynman once got into an argument about this with the historian Charles Weiner. Feynman understood the extended mind; he knew that writing his equations and ideas on paper was crucial to his thought. But when Weiner looked over a pile of Feynman’s notebooks, he called them a wonderful “record of his day-to-day work.” No, no, Feynman replied testily. They weren’t a record of his thinking process. They were his thinking process:
“I actually did the work on the paper,” he said.
“Well,” Weiner said, “the work was done in your head, but the record of it is still here.”
“No, it’s not a record, not really. It’s working. You have to work on paper and this is the paper. Okay?”
Every new tool shapes the way we think, as well as what we think about. The printed word helped make our thought linear and abstract and vastly increased our artificial memory. Newspapers shrank the world; then the telegraph shrank it even further, producing a practically teleportational shift in the world of information. With every innovation, cultural prophets bickered over whether we were facing a technological apocalypse or utopia. Depending on which Victorian-age pundit you asked, the telegraph was either going to usher in a connected era of world peace or drown us in idiotic trivia. Neither was quite right, of course, yet neither was quite wrong. The one thing that both apocalyptics and utopians understand is that every new technology invisibly pushes us toward new forms of behavior while nudging us away from older, familiar ones. Harold Innis—the lesser known but arguably more interesting intellectual midwife of Marshall McLuhan—called it the “bias” of a new tool.
What exactly are the biases of today’s digital tools? There are many, but I’d argue three large ones dominate. First, they’re biased toward ridiculously huge feats of memory; smartphones, hard drives, cameras, and sensors routinely record more information than any tool did before, and keep it easily accessible. Second, they’re biased toward making it easier to find connections—between ideas, pictures, people, bits of news—that were previously invisible to us. Third, they encourage a superfluity of communication and publishing. This last bias has many surprising and often ill-understood effects. Any economist can tell you that when you suddenly increase the availability of a resource, people not only do more things with it; they do increasingly odd and unpredictable things. As electricity became cheap and ubiquitous in the West, its role expanded from things you’d expect—like nighttime lighting—to the unexpected and seemingly trivial: battery-driven toy trains, electric blenders. The superfluity of communication today has produced everything from a rise in self-organized projects like Wikipedia to curious new forms of expression: television-show recaps, video-game walk-throughs, map-based storytelling.
In one sense, these three shifts—infinite memory, dot-connecting, explosive publishing—are screamingly obvious to anyone who’s ever used a computer. Yet they also somehow constantly surprise us by producing ever-new “tools for thought” (to use the writer Howard Rheingold’s lovely phrase) that upend our daily mental habits in ways we never expected. Indeed, these phenomena have already woven themselves so deeply into the lives of people around the globe that it’s difficult to stand back and take account of how much things have changed and why. While this book maps out what I call the future of thought, it’s also frankly rooted in the present, because many parts of our future have already arrived, even if they are only dimly understood. As the sci-fi author William Gibson famously quipped: “The future is already here—it’s just not very evenly distributed.” This is an attempt to understand what’s happening to us right now, the better to see where our augmented thought is headed. Rather than dwell in abstractions, like so many marketers and pundits—not to mention the creators of technology, who are often remarkably poor at predicting how people will use their tools—I focus more on the actual experiences of real people.