The Challenger Disaster’s Minority Report

Following the space shuttle Challenger disaster, a presidential commission was convened to determine both the cause of the accident and how future accidents could be prevented. One of the commission's members was the iconoclastic scientist Richard Feynman.

Feynman challenged the whitewashed report officially sanctioned by the commission and insisted on including his own “minority report” regarding the accident. I think it’s important to reflect on this report and the warnings it imparted to NASA and the manned space program.

Feynman was appalled at the apparent willful ignorance of the NASA management team and their reliance on imaginary statistics and fanciful reasoning.

It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life. The estimates range from roughly 1 in 100 to 1 in 100,000. The higher figures come from the working engineers, and the very low figures from management. What are the causes and consequences of this lack of agreement? Since 1 part in 100,000 would imply that one could put a Shuttle up each day for 300 years expecting to lose only one, we could properly ask “What is the cause of management’s fantastic faith in the machinery?”

What was most upsetting to Feynman was the obliviousness of the management team to the risks they were ignoring.

The phenomenon of accepting for flight, [solid rocket booster] seals that had shown erosion and blow-by in previous flights, is very clear. The Challenger flight is an excellent example. There are several references to flights that had gone before. The acceptance and success of these flights is taken as evidence of safety. But erosion and blow-by are not what the design expected. They are warnings that something is wrong. The equipment is not operating as expected, and therefore there is a danger that it can operate with even wider deviations in this unexpected and not thoroughly understood way.

The report continues to detail problems with the main engines in addition to the solid rocket boosters.

Let us make recommendations to ensure that NASA officials deal in a world of reality in understanding technological weaknesses and imperfections well enough to be actively trying to eliminate them. They must live in reality in comparing the costs and utility of the Shuttle to other methods of entering space. And they must be realistic in making contracts, in estimating costs, and the difficulty of the projects. Only realistic flight schedules should be proposed, schedules that have a reasonable chance of being met. If in this way the government would not support them, then so be it. NASA owes it to the citizens from whom it asks support to be frank, honest, and informative, so that these citizens can make the wisest decisions for the use of their limited resources.

For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.

One has to wonder whether the Challenger disaster was an inevitability, given NASA’s culture of conformity at the time. Rocking the boat was punished, and keeping the shuttles flying (and the dollars flowing into NASA’s coffers) was seen as paramount.

Machines That Kill

In the latest issue of Radical Philosophy, Susan Schuppli attempts to define the brave new world of autonomous systems which are programmed to take human lives. How shall we react? What of morality? What of law? Who is accountable? What is accountability in this context? Her commentary, “Deadly algorithms: Can legal codes hold software accountable for code that kills?”, is well worth a read.

Decision-making by automated systems will produce new relations of power for which we have as yet inadequate legal frameworks or modes of political resistance – and, perhaps even more importantly, insufficient collective understanding as to how such decisions will actually be made and upon what grounds. Scientific knowledge about technical processes does not belong to the domain of science alone, as the Daubert ruling implies. However, demands for public accountability and oversight will require much greater participation in the epistemological frameworks that organize and manage these new techno-social systems, and that may be a formidable challenge for all of us. What sort of public assembly will be able to prevent the premature closure of a certain epistemology of facts, as Bruno Latour would say, that are at present cloaked under a veil of secrecy called national security interests – the same order of facts that scripts the current DOD roadmap for unmanned systems?

Osama Bin Laden has done more damage to the future of our country than he could possibly have imagined.  He managed to tap into our innate fears and drive us to create the weapons of our undoing.

The end is nigh…

Cyberdyne Systems Latest Autonomous System