A teaser edition of our interview with Nicholas Carr, the entirety of which can be read in the Technology Issue! You can subscribe to our magazine here.

In his book, The Shallows, which was a 2011 finalist for the Pulitzer Prize, Nicholas Carr talks about the internet’s re-wiring of the human mind. Like a number of well-regarded tech skeptics (Marshall McLuhan, Neil Postman, Sherry Turkle), Carr argues that the way the internet presents information to us is changing the way we think everywhere else—in our jobs, in our free time, in our inner lives. Towards the end of his book, he quotes a passage from Psalm 115:

Their idols are silver and gold, the work of human hands.
They have mouths, but do not speak; eyes, but do not see.
They have ears, but do not hear; noses, but do not smell.
They have hands, but do not feel; feet, but do not walk;
they make no sound in their throats.
Those who make them are like them; so are all who trust in them.

According to Carr, the psalmist isn’t just decrying our susceptibility to golden calves. The psalmist is also saying what John Culkin argues: “We shape our tools, and then our tools shape us.” While it is easy to vilify the growing role of computers in society, Carr notes that, since the beginnings of human technology, whenever a tool was held in a hand, the hand became that tool. This was both an enriching and a limiting experience. While the tool gives our hands new capabilities, it also keeps them from doing the other things they can naturally do.

In his new book, The Glass Cage: Automation and Us, Carr is arguing that beyond our mere propensity to be shaped by our technologies, we are also susceptible to giving away the things in life which give us pleasure. Since the Industrial Age, the endgame for our technological progress has been automation—to make more and more things that require less and less human intervention. For the sake of leisure, we have sought less work—and this arrival of more leisure has ironically left us less happy. This, Carr notes, is the human affliction of ‘miswanting’:

Those are symptoms of a more general affliction, on which psychologists have bestowed the poetic name miswanting. We’re inclined to desire things we don’t like and to like things we don’t desire. “When the things we want to happen do not improve our happiness, and when the things we want not to happen do,” cognitive psychologists Daniel Gilbert and Timothy Wilson have observed, “it seems fair to say we have wanted badly.”

Of course there are automatic conveniences you’d be silly not to be grateful for—dishwashers and remote controls come to mind—and Carr is not suggesting we all become backwoods craftsmen. But he is interested in the misleading motivation of automation. Amidst all it gains, what does a world with so much leisure lose? And beyond the science fiction ghosts of robots and artificial intelligence, what elemental parts of being human get forgotten in a world with an automatic transmission?

We had the chance to talk with Nicholas about these topics, as well as the history of aviation, and the future of online privacy. He spoke with us from his home in the Denver area.


In The Glass Cage, you mention science fiction writers of the past, who warn us of a future when we would be the “architects of our own obsolescence.” We certainly see this happening in our social lives and the social media use around us. Where else in the world of automation do you see this happening? What sticks out in your mind as a place where we are crafting our own obsolescence?


A lot of the recent discussions about robots and artificial intelligence have focused on the labor market and the sense that computers will be able to replace us very quickly at almost all the jobs that human beings do. And that’s certainly a concern. I think some of the speculations have gotten ahead of the reality of the technology.

But my focus is a little bit different—or at least my main focus. What I argue is that even if a computer or a robot or an artificial intelligence program doesn’t take your job, it’s going to change the way you do your job, and it’s going to change the role you take in your job. And it’s going to go beyond questions of employment. It’s going to have effects on the way you live your life, the way you perform the various activities that we do in our day-to-day lives, even outside of our jobs, whether it’s socializing or gathering news or whatever. We human beings are very quick to seek convenience and more leisure. And as a result of that, and of broader economic forces, we’re very quick to think, “Well, let’s just let the technology do this activity for us.” And it turns out that it’s very easy to begin to give away tasks and activities that are actually quite fulfilling and quite meaningful. I think we’re at risk of handing over to computers activities—whether they’re activities of the mind or the body—that we really should be holding onto.

So, at a fundamental level, I think what we’re seeing is something we’ve always seen with tools and technologies: we’re determining a division of labor between ourselves and our technologies. And I think we do that a little too cavalierly. In deciding the division of labor between ourselves and our technologies, we’re also in some deep way establishing the terms of our own existence: what we’re going to do, how we’re going to think, how we’re going to act, how we’re going to socialize. These questions should be given a good deal of serious consideration. And so far, I fear that we’re not doing that.


That’s actually one of the questions I was going to ask you: does an automated existence, or our propensity to hand over things to our computers, allow us to believe things about ourselves that we wouldn’t believe otherwise?


What changes is the scope and the scale of the decisions. I think you could certainly argue that technology has from the very start—from levers and wheels and maps, some of the earliest technologies—always changed the terms of our engagement with the world. And in many, many cases it’s changed for the good. We’ve been able to use tools to understand the world more deeply and to enrich our interaction with the world and with each other, and so it’s really expanded our horizons and expanded the possibilities of our lives.

But there’s also a dark side to these decisions. We can also use tools and technologies to isolate us from the world, to impoverish our interactions with one another, with the social world, with the physical world and so forth. As computers have become more capable of both acting autonomously in the physical world and also taking over more intellectual tasks like parsing data, analyzing phenomena, making judgments and decisions, suddenly the scope of the decisions we’re making about what we hand off to machines and other technologies has just gotten much, much broader, and I fear that, more often than not, what’s happening is that we’re making decisions that are impoverishing our lives rather than enriching them or widening them.

There’s that famous New Yorker cover where the parents are taking their kids out trick-or-treating on Halloween and all the parents are staring into their phones. And that gets across what I’m talking about, that the technology can seduce us away from the kind of activities that actually are very, very enjoyable and very meaningful, and yet we’ll give all that up in order to read something trivial or watch something that pops up on our phones.


St. Augustine calls it incurvatus in se—being curved inwards, curved in on yourself. Augustine’s talking about human sinfulness, that we have this tendency to turn inward. But are there places where you see turning inward happening? I mean, we definitely see this with our smartphones. Anywhere else in particular?


We often think, “I wish I didn’t have anything to do; I wish I had leisure and I didn’t have to work,” and yet when that happens what we see is that people naturally do turn inward and start becoming obsessed with themselves. They fall into what I think Ralph Waldo Emerson called “the prison house of self-consciousness.” And because we have this bias, which I think can be a pernicious bias—to believe that not having to face challenges is somehow good—we’re very, very quick to give up the sources of pleasure and fulfillment and satisfaction that come from utilizing some deep talent, whether it’s a talent that has to do with thinking deeply, or a talent that has to do with manual tasks that are difficult to accomplish. Software writers have become very good at finding any activity that challenges us, and they assume that being challenged is somehow a problem that can be solved with software.