In the late 1980s, the psychologist Mihaly Csikszentmihalyi gave electronic pagers to hundreds of workers from five businesses around Chicago. For one week, each pager beeped at seven random moments during the day. At each beep, the workers stopped what they were doing and filled out a short questionnaire. They answered questions about what they were working on, how challenging their work was, and the psychological state they were in. Csikszentmihalyi was motivated by a simple question: How do people spend their time, and how do they feel while spending it?

He found that the workers felt more fulfilled and happy during work hours than during leisure time, even though they said they preferred leisure to work. The findings suggested not just that we underestimate the link between work, even monotonous work, and well-being; they also suggested that we overvalue free time. A mentally calm day is not necessarily a pleasant one.

Yet from the steam engine to driverless cars, technology is designed to reduce labor and free up more leisure time. This shift, to be sure, is a net positive. Writers no longer have to spend hours rifling through books to find quotes and citations; we use Google. Most people don't worry about growing their own food or treating their own illnesses. The transfer of labor-intensive work to machines has allowed individuals to solve some of humanity's greatest challenges.

But is there such a thing as too much automation? Every day millions of Americans order food online and "cook" microwavable dinners. One wonders if these time-saving alternatives come at the expense of learning a new skill (how to cook), a sense of accomplishment (getting a steak medium rare), and the joy of providing (making a hungry spouse happy). More automation means more leisure time. But instead of tackling higher pursuits, we typically use that time to do nothing rather than something.

In The Glass Cage, Nicholas Carr writes about the substitution myth: "A labor-saving device doesn't just provide a substitute for some isolated component of a job or other activity. It alters the character of the entire task, including the roles, attitudes, and skills of the people taking part."

Commercial pilots, for example, typically switch autopilot on just after takeoff and switch it off just before landing, sometimes only a few feet above the runway. Above 28,000 feet, the FAA requires autopilot, because computers hold altitude more precisely than humans can. The role of the pilot is changing, and the shift is one reason flying has never been safer.

But Carr worries that the same shift is breeding a related problem he calls automation complacency. In 2013, a government report attributed half of recent aviation accidents to overreliance on automation. "Once you put pilots on automation, their manual abilities degrade and their flight-path awareness is dulled: flying becomes a monitoring task, an abstraction on a screen, a mind-numbing wait for the next hotel," writes William Langewiesche in a feature for Vanity Fair.

Boeing planes, Carr says, are preferred by many pilots because their automation is human-centered. The control yokes, for example, provide artificial feedback that mimics what pilots felt before automation became standard (think about playing Star Fox on the N64 with the Rumble Pak). Boeing never lets its software override a pilot; in certain situations, Airbus planes do.

Automation bias is a closely related phenomenon. "It creeps in," Carr writes, "when people give undue weight to the information coming through their monitors." Automation bias is usually portrayed in apocalyptic terms: the rise of Skynet in The Terminator, HAL cutting off life support in 2001: A Space Odyssey, Sonny violating the Second Law of Robotics in I, Robot, the film loosely based on Isaac Asimov's short-story collection. In this view, we blindly take orders from computers until the computers turn on us.

These fictitious examples exaggerate, but they hint at real problems. We're all guilty of following the GPS into a dead end or against traffic on a one-way street. Sometimes automation bias even leads to life-threatening mistakes. Carr writes that Inuit hunters could once detect subtle changes in the wind, snowdrift patterns, and animal movement. When snowmobiles replaced dogsleds and GPS replaced wayfinding, accidents, injuries, and deaths increased; hunters who had come to rely on the devices didn't know what to do once the batteries froze or the receivers broke.

Technology gives us more free time, but it doesn't tell us how to use it. That is worrisome given how much we overestimate the upside of leisure, according to decades of research. Automation is not necessarily bad, but because it is inevitable, we must remember that whenever we use technology, we're making a trade-off. To paraphrase Carr, if we use machines to bypass learning, to carry out tasks we don't understand, we'll never acquire the skills that come from mastering a task, or reach the new horizons that those skills open up.