Knowledge Risk Management and the Challenger disaster

When tragic events that could have been prevented occur, we all want to know how and why they happened. One such example is the Challenger disaster. In 1986, the space shuttle Challenger broke apart shortly after launch, killing everyone on board. In the aftermath, an investigation was conducted to determine how and why the tragedy happened. Kumar and Chakrabarti examine this tragedy through the lens of bounded awareness and tacit knowledge as they pertain to decisions made by management. Bounded awareness is experienced when decision makers “overlook relevant and readily available information and take a decision that is either suboptimal or entirely erroneous” (Kumar and Chakrabarti, 2012, p. 935). We have all probably experienced bounded awareness at some point in our lives (at least, I know I have). We do not always make good decisions, and we often overlook or ignore information that could help us make optimal ones. Tacit knowledge, on the other hand, is usually seen in a more positive light, as many consider it important to organizations. In the case of the Challenger tragedy, both tacit knowledge and bounded awareness affected the launch decision. Both the managers and the engineers were aware of issues with the flight. However, that information was not considered a serious threat, especially by the managers, so it was ignored and an erroneous decision (especially in hindsight) was made.

Kumar and Chakrabarti (2012) “take it that managers approved the launch only because they genuinely did not perceive that the explicit information given to them by the engineers on the previous night about high failure likelihood was relevant” (p. 938).

According to Massingham (2010), knowledge can move us toward greater certainty, which helps people make decisions, especially in risk management (p. 465). The problem with knowledge, however, is that it can be very subjective. People do not always see risks in a logical way, and they have different perceptions of the world and the reality around them. Our experiences and our tacit knowledge (which varies from person to person) affect how we see the world and the risks in our lives. Even when people have exactly the same knowledge, like the knowledge of the risks to the Challenger flight, each person sees that knowledge through their own specific lens. The managers had the same knowledge as the engineers, but they did not view it the same way: whereas the engineers were concerned and told the managers they should not launch, the managers discounted the warning because the engineers did not have hard proof and the stakeholders were determined that the flight go on as planned.

Now this is where this post is going to take a hard left into social and intellectual capital for a little bit, but hopefully it will all come together in the end. According to Nahapiet and Ghoshal (1998), “the central proposition of social capital theory is that networks of relationships constitute a valuable resource for the conduct of social affairs” (p. 243). Intellectual capital revolves around knowledge and the knowing capability of an organization. Both social and intellectual capital affect organizations and the decisions they make. In the organization involved in the Challenger flight, many relationships were at play that could have had an impact on the launch decision: the relationship between the managers and the engineers (which seemed to be a good one), the relationship between the managers and the stakeholders (who were invested in the launch moving forward), and the relationships among the managers themselves and among the engineers themselves. Nahapiet and Ghoshal (1998) argue that “it is the interaction between social and intellectual capital that underpins organizational advantage” (p. 259).

These articles and the themes that showed up in them (and in previous articles I have read) got me thinking about the value of tacit knowledge. Tacit knowledge is typically seen as a huge asset to organizations. However, it seems that if tacit knowledge stays stagnant, rather than serving as the basis for creating new knowledge, it can become a detriment. That being said, our assumptions, biases, and the like may have more of an impact than our tacit knowledge itself. It is important to create new knowledge rather than just relying on existing knowledge.

How do we determine what knowledge and information is important? The managers had the right information to make a good decision but did not realize it was important. Of all the knowledge and information we have access to each day, how can we determine which information and knowledge are necessary for making the best decisions we can in each situation?


Kumar J, A., & Chakrabarti, A. (2012). Bounded awareness and tacit knowledge: Revisiting Challenger disaster. Journal of Knowledge Management, 16(6), 934-949.

Massingham, P. (2010). Knowledge risk management: A framework. Journal of Knowledge Management, 14(3), 464-485.

Nahapiet, J., & Ghoshal, S. (1998). Social capital, intellectual capital, and the organizational advantage. The Academy of Management Review, 23(2), 242-266.


13 thoughts on “Knowledge Risk Management and the Challenger disaster”

    • I want to say yes. From the readings I have done so far (and my understanding of them, which I’m still trying to wrap my head around), knowledge is incredibly complicated, and context often plays a part in how we act and the risks we take. So knowledge could prevent people from taking risks they might otherwise have taken, though that could be a positive or a negative thing.
      That’s probably more than you asked for lol.


  1. Just to add — what I think @abigailkeller2 has pointed out well — is that there are some constraints on how we interpret knowledge, on how we weigh its importance. That is, to know something doesn’t presume some kind of new objective state of being. This is because any two of us can know the same thing, but neither or only one of us may realize its significance.


  2. I haven’t read these articles yet, but it seems like the managers didn’t realize the importance of the information because they didn’t have the right knowledge to understand it, right? If so, it makes a lot of sense that there can be issues when managers are just managers but don’t have the organizational knowledge to be a true part of the organization. I feel like there should be a system of checks and balances in place. Like in order to set off a nuclear bomb, you have to have a bunch of codes and turn two keys on opposite sides of the room at the same time. Basically, there must be consensus that setting off the nukes is the right answer. Consensus, voting, and things like that can help make sure everyone agrees (in the sense of happy compromise) and everyone has a stake in the decisions that are made. I remember a TED Talk we watched in one class about introverts in the workplace, and one thing that stood out to me was their approach to brainstorming and collaboration. Maybe that could help with keeping track of bounded awareness, tacit knowledge, and risk management?


      • There didn’t seem to be that much consensus (at least true consensus) between the engineers and the managers. If there had been more of a consensus, maybe they could have prevented it. Then again, there were so many factors involved! I agree that a checks-and-balances system could be beneficial. I also think trust is a huge part of it as well.



  3. In the context of the Challenger disaster, it’s hard for me to separate bounded awareness from arrogance. Additionally, it’s hard for me to view this as an unintentional disaster (a crisis communication term).

