The Boeing B-17, also known as the Flying Fortress, helped the Allies win World War II, but it had a design flaw that cost the lives of many pilots and crew members. The shift in thinking required to fix that error--from focusing on the machine to focusing on the human using it--was the first step down the path that led to the iPhone and the iPad today.

The B-17 was rushed into wartime production, going from design to an actual plane in just a year, in time to make a real difference to the war effort, as Cliff Kuang, author and founding editor of Fast Company's Co.Design, explains in an excerpt from his book User Friendly. But something strange kept happening: the planes kept crashing unexpectedly, usually during what should have been a routine landing. By the end of the war, there had been thousands of such crashes. They were generally attributed to pilot error--after all, wartime had necessitated quickly training a lot of new pilots. But in many cases the pilots survived and couldn't think of anything they'd done wrong. On the other hand, there was no evidence of mechanical malfunction either.

The accidents remained a mystery until after the war, when psychologists Paul Fitts and Alphonse Chapanis finally figured it out--and the answer was stupidly simple. The control that lowered the landing gear and the one that lowered the wing flaps looked identical. It was all too easy for a pilot, especially at night, to reach for the landing gear control and grab the wing flap one instead. If that happened, instead of putting down the plane's wheels for a safe landing, he would slow the plane and drive it into the ground. Rather than pilot error, Chapanis called it "designer error"--the first time anyone had used that term. Chapanis pioneered the field of shape coding by creating a system of levers and knobs for airplanes in which each control had a different shape, making it much harder to mistake any of them for something else. He's considered one of the creators of the field of ergonomics.

Teach computers about people.

That B-17 redesign was the first time it occurred to anyone that we should design machines to accommodate human behavior instead of retraining human behavior to fit machines. But it certainly wasn't the last. In the 1980s, that idea came to the computer industry. Until then, the computer field had been dominated by coders who had learned their jobs by feeding stacks of punched cards into mainframes to convey their instructions. Then Steve Jobs and Apple came along to upend their most fundamental assumptions. As one of the first ads for the Mac put it, "Since computers are so smart, wouldn't it make sense to teach computers about people, instead of teaching people about computers?"

Apple was founded by both Steve Jobs and Steve Wozniak, of course, but it was Jobs' particular genius to see the role that design and user experience had to play in computers (and later in music players, smartphones, and tablets). Before Apple popularized the mouse-driven graphical user interface, the first thing anyone who wanted to use a computer had to do was spend some time learning its language.

These days, our technology studies us more than the other way around, which Kuang finds disturbing. It's a trend that leads to technology making decisions for you--for example, when Facebook fills your news feed only with items it thinks you want to see. I agree that it's a bad thing when technology does your choosing for you, but I don't agree with Kuang that it's a logical outgrowth of the idea of user-friendliness. There's a distinction between, say, autocorrect trying to guess what you intended based on what you typed, and Gmail's new feature that actually tries to help you write your email by guessing what you want to say.

But regardless of which one of us you agree with, the next time you ask Siri to find movie times or tell you the weather, remember that you not only have Steve Jobs to thank, but also Paul Fitts, Alphonse Chapanis, and the Boeing B-17.