Long ago, in 1984, Charles Perrow published what would become his most influential book: Normal Accidents. In it, Perrow argued that accidents arise from increasing complexity combined with tight coupling in organizations: snowball effects can propagate through unexpected interactions among parts of the system.
Perrow has had many followers (Hollnagel is perhaps one of the most brilliant) who voiced their concern about the rationale behind technological improvement. Accidents amplify their outcomes through the same channels that organizations use for their normal activity: accidents in efficient organizations are efficient too. Complexity reshapes the concept of risk, understood as the product of impact and probability: probability decreases while potential impact increases.
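The shift described above can be made concrete with the classic risk formula, risk = probability × impact. A minimal sketch, with invented numbers purely for illustration: a complex, tightly coupled system can have a much lower failure probability than a simple one and still carry a higher expected loss, because when it fails, everything fails at once.

```python
def risk(probability: float, impact: float) -> float:
    """Expected loss: the classic risk formula (probability x impact)."""
    return probability * impact

# Hypothetical numbers, for illustration only.
# A simple, loosely coupled system: failures are frequent but cheap.
simple_system = risk(probability=0.01, impact=100)        # expected loss 1.0

# A complex, tightly coupled system: failure is far less likely,
# but a single failure takes the whole system down.
complex_system = risk(probability=0.0005, impact=10_000)  # expected loss 5.0

print(simple_system, complex_system)
```

Lowering the probability by a factor of twenty while the potential impact grows a hundredfold leaves the expected loss five times higher, which is exactly the trade that efficiency-driven complexity tends to make.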
A few years ago, Nassim Taleb coined an interesting concept, the Black Swan, to describe situations that simply were not supposed to happen, and for which, therefore, nobody had allocated resources. Once such an accident happened, we could watch an exhibition of incompetence in the management of the event, because everyone had been convinced it could not happen. We can reason as carefully and honestly as we like, but we will not eliminate black swans; there will always be fully unexpected situations. However, there is another variety of situation that can pass undetected: the blackened swan.
What is a blackened swan? Something we have actively chosen not to see. Perrow, Hollnagel and others describe a dynamic that many others prefer to ignore. If an airline runs into serious financial trouble or operates on very thin margins, we can reasonably suspect that it is saving money in the less visible areas (maintenance being one of them), but aviation regulators do not deal with financial issues and financial specialists do not deal with aviation. As on the Titanic, the two domains work as watertight compartments; and, as on the Titanic, the compartments are not fully watertight and the vessel can sink. When a pilot is taught that advanced electronic systems guarantee the plane cannot stall, he will pull back on the sidestick; he will do it still more whole-heartedly if a synthetic voice encourages him to, and he will do it without any feeling that something is wrong if the sidestick gives no feedback through pressure. If regulations allow a twin-engine plane to fly more than five hours on oceanic routes without an available airport, sooner or later a plane will have to ditch with a full load of passengers. If a company loads the minimum fuel required by law and has already had problems because of it, a moment will come when a plane goes down from fuel starvation. These are the blackened swans: risk situations that everybody knows about, but where the accountable people look the other way.
When these blackened-swan situations produce their expected outcomes, we will always find people telling us that it was a black swan. Of course, to do that, they will carefully hide the fact that they painted the swan black beforehand in order not to see the risk, and can therefore claim they were ignorant of it before the event. And there is still another resource for justifying why a situation does not change: call it Human Error.