June 9, 2008:
The loss of a $1.4 billion B-2
bomber in Guam last February was the result of three of 24 pressure sensors on
the wings providing incorrect data, which led the flight control software to
take unneeded corrective action during takeoff, causing the aircraft to
crash. The pressure sensors were designed to deal with high humidity, but
conditions in Guam's tropical climate produced more moisture than the
system could handle. Some maintainers noticed this and manually
cleared the moisture. But this anomaly was not reported to the people who
maintained the pressure sensor system (and its related software), so it was
just a matter of time before the condition caused the kind of takeoff accident
that occurred last February. As a result of the accident, pre-flight procedures
now take high-humidity conditions into account and ensure all the sensors
are properly calibrated before the aircraft rolls down the runway.
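
To make the failure mode concrete, here is a minimal sketch of redundant-sensor voting, the general technique for tolerating a few bad inputs. Everything in it (the names, the tolerance value, the sample readings) is hypothetical and is not drawn from the B-2's actual flight control software. The point is that a median-based vote rejects a few wild outliers, but sensors that were all thrown off the same way (say, by trapped moisture at calibration time) produce readings that still pass the plausibility check:

    from statistics import median

    # Hypothetical sketch of redundant-sensor voting; names and
    # thresholds are invented for illustration, not taken from the B-2.
    MAX_DEVIATION = 5.0   # plausibility band around the median (made up)

    def fused_pressure(readings):
        """Fuse redundant pressure readings, discarding outliers.

        A median vote tolerates a few sensors that disagree wildly with
        the rest. It does not help when several sensors share the same
        calibration bias: their values sit inside the plausibility band
        and skew the fused result, the failure mode described above.
        """
        m = median(readings)
        plausible = [r for r in readings if abs(r - m) <= MAX_DEVIATION]
        return sum(plausible) / len(plausible)

    # 21 healthy sensors near 101.3 kPa, three moisture-skewed ones:
    readings = [101.3] * 21 + [87.0, 88.5, 86.2]
    print(fused_pressure(readings))   # outliers voted out -> 101.3

This is also why the new pre-flight calibration step matters: voting protects against sensors that disagree with each other, not against sensors that were calibrated wrong together.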
Such accidents
are common with complex systems. In the last seventy years, engineers have
become much better at discovering dangerous situations before they occur. But
for the same reasons software still has bugs, complex weapons systems still
have these kinds of problems. Eliminating these problems is more
difficult in more complex systems, and also depends on the quality of the
engineers developing, and later maintaining, the system, as well as the
"corporate culture" of the developers and maintainers. The Japanese, for
example, have a corporate culture that enables them to create critical software
with very few bugs. America, on the other hand, has a corporate culture that
enables complex systems to be developed and built.
A related
problem is the difficulty some nations have in maintaining complex systems,
because they have a "corporate culture" that is not exacting enough to operate
and maintain these systems effectively. It's a complex world out there, and
avoiding disaster takes a lot of work, talent and management skill.