The Columbia shuttle accident that killed seven astronauts last February resulted from failures in both technical and organizational systems, according to MIT Institute Professor Sheila Widnall, a member of the Columbia Accident Investigation Board.
Lessons learned from the investigation can be applied not only to NASA but also to other types of organizations -- and engineers must play a key role in implementing them, she said during the third annual Brunel Lecture Series on Complex Systems on Nov. 4, sponsored by MIT's Engineering Systems Division.
"The response of engineers and program managers during the 16 days that Columbia was in orbit raises important issues for educating and utilizing engineers, as well as questions about their responsibility to treat system-level issues with the same disciplinary respect and expertise with which they treat components," said Widnall.
The Columbia Accident Investigation Board (CAIB) initially consisted solely of government employees and was intended to report to NASA. "Congress and the press let us know very quickly that this was not a good idea, so the CAIB was rechartered and civilian members were added," said Widnall, who joined the board as one of its new members on Feb. 18, seventeen days after the accident. "We decided that NASA would be a colleague in the investigation and that we would report to the American people."
The source of the tragedy was a piece of insulating foam from the external fuel tank that hit the shuttle on takeoff, creating a breach in the wing's leading edge. Hot gases entered the wing on reentry, devastating the internal structure.
Although foam problems had been noted in prior shuttle launches, schedule pressure created a motivation to treat these in-flight anomalies as maintenance turnaround events, or even as unplanned tests, rather than as an immediate danger to the shuttle and its occupants, Widnall said.
"Well-intentioned people and high-risk organizations can become victims of the normalization of deviance," she said. Although there had been several close calls before both the Columbia and the Challenger disasters, she noted that "the unexpected became the expected, which became the accepted."
The lesson: poor organizational structures can be just as dangerous to a system as technical, logistical or operational factors. "They can create blind spots, group-think and unwritten rules that make it change-resistant," Widnall said.
Mishap prevention often lies at the interface between technology and the organizational frameworks in which it's embedded, Widnall said. "Engineers must think about the organization as well as the technology and learn how to put their concerns in actionable form."