Effective space safety efforts require learning from both success and failure. One way to learn is to examine investigation reports of accidents and incidents. Studying accidents is important because it helps us understand what went wrong in the past so that we can use this knowledge to prevent similar mishaps in the future. But the study of mishaps can also change the way we look at complex systems, helping us foster a positive, questioning attitude about our own systems. Developing successful space systems is a challenging endeavor under the best of circumstances, so healthy skepticism is essential to ensure that assumptions are challenged and that problems are examined from many different perspectives. Studying accidents can fuel this skepticism, helping us formulate the questions we should ask when making critical decisions. This compilation provides summaries of several accidents with the intention of improving space safety decision making.

The scene of the railway accident near Aberdeen

Railroad accidents, like the one near Aberdeen depicted here, can be a source of important lessons learned in the development of safe space systems.

What Can We Learn from Accidents on Earth?

The accidents and incidents described in this section of the website are intentionally selected from outside the space industry for a couple of reasons. First, members of the space safety community tend to already be familiar with space system mishaps. Second, improving the safety of complex systems requires thinking outside our own fields of interest. There are many lessons we can gather from other industries to improve space safety. It is hoped that these accident summaries become tools we all use to reduce risks in the development of our own systems.

Note that in discussing these events, the intent is not to oversimplify the conditions that led to them. Rarely is there only one identifiable cause of an accident. Accidents and incidents are usually the result of complex factors that include hardware, software, human interactions, procedures, and organizational influences. Readers are encouraged to review the full investigation reports referenced here to understand the often complex conditions that led to each accident discussed.

The video below is a documentary about the Texas City explosion, one of the worst human-made disasters in history, which killed at least 185 people.

The Role of Human Judgment

The variety of the accidents in this compilation illustrates the safety challenges facing all organizations that implement complex technologies. Safety activities require human judgment and are therefore subject to common human failings. For that reason, we must avoid the inclination to blame individuals and organizations when studying these accidents. We must all heed the words of Lord Anthony Hidden, chairman of the Clapham Junction railway accident investigation: “There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight.” The review of these accidents should help us see where others failed to prevent an accident in spite of their honest intentions, intelligence, and hard work. We should then apply those lessons to our own systems with a certain humility, recognizing that the incident could have happened to us had the circumstances been slightly different.


– By Terry Hardy

Inside “Lessons for Space Safety from Life on Earth”

  1. Lessons for Space Safety from Life on Earth
  2. Lessons for Space Safety from Life on Earth: System Safety Planning
  3. Lessons for Space Safety from Life on Earth: Flawed Risk Assessment Leads to In-Flight Engine Failure
  4. Lessons for Space Safety from Life on Earth: Hazard Identification
  5. Lessons for Space Safety from Life on Earth: Hazard Controls
  6. Lessons for Space Safety from Life on Earth: Hazard Analysis
  7. Lessons for Space Safety from Life on Earth: Design Reviews

About the author

Terry Hardy


Terry Hardy founded and leads efforts in system safety, software safety, and emergency management at Great Circle Analytics. Mr. Hardy has over 30 years of engineering experience and has performed engineering, safety, emergency management, and risk management activities for a number of commercial and government organizations, including NASA and the U.S. Federal Aviation Administration. Mr. Hardy created the website www.systemsafetyskeptic.com to share lessons learned in system safety, and he is the author of several books on system safety, including "The System Safety Skeptic: Lessons Learned in Safety Management and Engineering" and "Software and System Safety: Accidents, Incidents, and Lessons Learned."
