On January 16, 2003, the space shuttle Columbia was successfully launched from the Kennedy Space Center on a sixteen-day research mission. The next day, shuttle engineer Rodney Rocha reviewed a video of the launch and became deeply concerned about the size and position of a chunk of insulating foam that appeared to have fallen off the shuttle’s external tank and struck its left wing. The video images were grainy, and it was impossible to be sure what had happened. To determine whether damage had occurred, Rocha hoped to obtain photographic images of the shuttle’s wing from spy satellites. Although the photos would have to be authorized by the Air Force, the request would require neither a technical nor financial miracle. It did mean that NASA would have to ask for help from the Department of Defense.

Rocha initially expressed the need for the satellite images in an e-mail to his immediate superior, emphasizing the urgency by using bold-faced type. When he learned that his request was unlikely to be honored, Rocha wrote a scathing e-mail: “Remember the NASA safety poster everywhere around, stating, ‘If it’s not safe, say so?’ Yes, it’s that serious.” He didn’t send the e-mail to the Mission Manager, however; he shared it only with fellow engineers. Later, he explained that “engineers were often told not to send messages much higher than their own rung in the ladder.”

Discouraged by his early efforts to call attention to the foam-strike issue and convinced that voicing concerns was career limiting at NASA, Rocha refrained from sharing his anxiety in a critical mission management team meeting, eight days into the flight. He fervently hoped others with more clout might offer their concerns. The opportunity passed, however, and the issue was never formally revisited in a mission management team meeting. Just eight days after this lost opportunity to speak up, the shuttle burned up upon reentry into the Earth’s atmosphere, resulting in the death of all seven astronauts. Much later, asked in a television interview with ABC News Anchor Charlie Gibson why he didn’t voice his doubts about the safety of the shuttle in that mission management team meeting, Rocha replied, “I just couldn’t do it. I’m too low down . . . and she [Mission Management Team Leader Linda Ham] is way up here,” gesturing with his hand held above his head.

The 2003 Columbia space shuttle tragedy reflects an unusually dramatic consequence of not speaking up in the workplace—especially with tentative concerns or unproven ideas—an all-too-common organizational dynamic. Instances where people are reluctant to voice concerns or engage in behaviors that could threaten their image occur within a wide spectrum of industries and organizations. Although it’s understandable to keep silent about mistakes when not much is at stake, in many situations errors can be deadly. Consider a nurse momentarily pondering, but then immediately dismissing, the possibility that the medication dosage for a hospital patient seems high. As the thought crosses her mind to call the doctor, by then fast asleep at home, she recalls his disparaging comments the last time she called. In that brief moment of opportunity to voice concern, her brain exaggerates the importance of the doctor’s scorn and minimizes the chance of harm to the patient.

Far from the urban hospital, a young pilot in a military training flight notices that the senior pilot may have made a crucial misjudgment, but lets the moment go by without pointing out the error. The young pilot is not only of lower rank, but is also formally evaluated on every flight. The prospect of speaking up to the superior officer brings significant emotional costs, even though the pilots are interdependent members of a cockpit team. Unlike the nurse, the pilot may actually be choosing silence over preservation of his own life. Here again, his mind, against reason, discounts the chances that not speaking up will lead to a fatal crash and exaggerates the importance of his discomfort at being chastised or ignored.

Even those at the top of the hierarchy are not exempt from the fear of speaking up. Consider the following example: a senior executive, recently hired by a successful consumer products company, has grave reservations about a planned takeover. New to the top management team, and conscious of his status as an outsider, he remains silent because other executives seem uniformly enthusiastic. Many months later, when the takeover has failed, the team gathers to review what happened. Aided by a consultant, each executive muses on what he or she might have done to contribute to or avert the failure. The silent executive, now less of an outsider, reveals his prior concerns. Openly apologetic about his past silence, he explains that the others’ enthusiasm left him afraid to be “the skunk at the picnic.” 

What all of these vignettes have in common is the degree to which interpersonal fear can dominate modern work life and thwart the collaboration that is desperately needed in the knowledge-intensive organizations that dominate today’s economy.

Interpersonal fear—the fear associated with personal interaction and social risks—is at the root of many of these failures. The problem is widespread. In corporations, hospitals, and government agencies, my research has found that interpersonal fear frequently gives rise to poor decisions and incomplete execution. Fortunately, effective leadership and practice with new ways of thinking and working can create an environment of psychological safety that mitigates this problem.

The term psychological safety describes a climate in which people feel free to express relevant thoughts and feelings. Although it sounds simple, the ability to seek help and tolerate mistakes while colleagues watch can be unexpectedly difficult. Yet, frank conversations and public missteps must occur if teaming is to realize the promise of collaboration across differences.

This chapter explains the construct of psychological safety and examines methods and behaviors for developing a psychologically safe environment. Drawing from extensive research, I begin by defining psychological safety and exploring its fundamental attributes. I then describe the seven ways that psychological safety contributes to successful teaming and organizational learning, and examine the corrosive effect hierarchy can have on psychological safety. I end the chapter with a detailed explanation of how to cultivate psychological safety, including how a team leader can shape and strengthen the collective learning process both directly and indirectly by fostering an open, safe environment. 

Trust and Respect

Simply put, psychological safety makes it possible to give tough feedback and have difficult conversations without the need to tiptoe around the truth. In psychologically safe environments, people believe that if they make a mistake others will not penalize or think less of them for it. They also believe that others will not resent or humiliate them when they ask for help or information. This belief comes about when people both trust and respect each other, and it produces a sense of confidence that the group won’t embarrass, reject, or punish someone for speaking up. Thus psychological safety is a taken-for-granted belief about how others will respond when you ask a question, seek feedback, admit a mistake, or propose a possibly wacky idea. Most people feel a need to “manage” interpersonal risk to retain a good image, especially at work, and especially in the presence of those who formally evaluate them. This need is both instrumental (promotions and rewards may depend on impressions held by bosses and others) and socio-emotional (we simply prefer approval over disapproval).

Psychological safety does not imply a cozy situation in which people are necessarily close friends. Nor does it suggest an absence of pressure or problems. Psychological safety does not mean a group has to be cohesive or in agreement about things. As research has shown, group cohesiveness can reduce people’s willingness to disagree with or challenge each other. The term groupthink refers to this problem. Specifically, in many cohesive groups, people are reluctant to disturb the feeling of harmony created by the group’s apparent agreement about an important issue. This leads them to hold back or fail to admit to holding a different view, and thus contributes to poor decision making. Yale professor Irving Janis attributed President Kennedy’s ill-fated plan to send Cuban exiles to invade the Bay of Pigs in 1961 to groupthink. In contrast, psychological safety describes a climate in which raising a dissenting view is expected and welcomed. A tolerance of dissent allows productive discussion and early detection of problems.

I have found that many people are genuinely pained and frustrated by keeping silent at work. For the most part, the people I’ve studied aren’t failing to provide ideas or input because they’ve “checked out” or don’t care, but because of a subtle but pervasive fear of what others, particularly those in power, might think of them. As most people intuitively recognize, each of us engages in a tacit “calculus” in which we assess the risk associated with a given interpersonal behavior, quickly and effortlessly, as we face a micro-behavior decision point. To illustrate what I mean by a micro-behavior decision point, imagine that while you are in a conversation with your boss, you consider fleetingly, “Should I say something about this?” In this almost imperceptible thinking process, you weigh the potential gain against the potential loss. You wonder, “If I do this, will I be hurt, embarrassed, or criticized?” If you quickly conclude that the answer is no, then you have a sense of psychological safety, and you proceed to voice your thoughts. (If you believe that the answer might be that you could be hurt but you speak anyway, then you are demonstrating courage.) Typically, proceeding means being authentic. It means expressing the work-relevant thoughts and feelings on your mind without excessive self-censorship.

Consider the fact that admitting a mistake or asking for help may be unthinkable in one work setting and yet readily accepted, even valued, in another setting. The difference between the two situations is what psychological safety is all about.

The easy solution to minimizing image risk at work is to avoid doing or saying anything unless you’re absolutely sure you’re right. This is obviously a facetious solution. Not only does it limit creativity, stifle innovation, and preclude authentic relationships, it also creates important risks of another kind—risks to performance and safety. This is especially true in dangerous industries such as nuclear power, where admitting errors and asking for help may be critical for avoiding catastrophe. The human tendency to favor silence over voicing concerns is also particularly troubling in organizations where lives are at stake, such as in hospitals. Extensive research on hospitals and other high-risk organizations has shown that rules and required procedures are not enough: errors still go uncaught and uncorrected when psychological safety is lacking. This isn’t because people deliberately break rules, but rather because of the subtle ways in which we make sense of uncertainty and view each other at work. Whether frequently or infrequently, overtly or implicitly, most people in organizations are being evaluated in an ongoing way. The presence of others with more power or status makes the threat associated with being evaluated especially powerful, but it by no means disappears in the presence of peers and subordinates.

When we speak up about concerns or ask questions at work, we risk being seen as ignorant. Right or wrong, people may expect us to already know the answer or understand the situation. Similarly, most people intuitively believe that speaking up about mistakes or seeking help will lead people to conclude that they’re incompetent. And when someone speaks up about problems or errors, he or she also risks being seen as negative. Because most people also believe themselves to be working to the best of their abilities, when others give them negative feedback, it can be seen as inaccurate, and so the messenger can be seen as a troublemaker. Finally, in speaking up about something, we might risk giving an impression of being disruptive. This is particularly true in busy organizations where it’s often hard to accomplish the day’s tasks within normal business hours, and so interruption can seem more disruptive than helpful.

Amy C. Edmondson, Teaming: How Organizations Learn, Innovate, and Compete in the Knowledge Economy, © 2012 by John Wiley & Sons, Inc. All rights reserved; reprinted with permission of the publisher.