Technology development model, safety and security: Time to change?

Many people warned years ago about a flawed model of technological development; one example can be found in this blog under the title “Improving Air Safety through Organizational Learning”. However, the development model did not change at all and nobody paid attention. A high-level statistical analysis could justify that behavior but, in recent years, many things have happened that should be read as serious warnings, hard to ignore. The so-called “black swans”, that is, events supposedly impossible to forecast, are starting to become a full flock. Some cases:

  1. Economic crisis: We can speak about greed, opportunism and many other things. The fact is that the financial system was so complex that real oversight was impossible: many people understood parts of it while ignoring the secondary effects those parts produced elsewhere in the same system. Perhaps the best explanation of what happened is a humorous one: opening any newspaper shows how hard it is to agree on a diagnosis and, hence, on solutions. Depending on which expert one asks, the forecasts will be different. Even with emerging issues like Bitcoin, there is no agreement among experts about what the consequences are going to be.
  2. AF447: This air accident should have marked a change. Experts with an interest in the market rushed to show that, in the end, everything came down to a sensor and a human error. Is that right? If a faulty sensor and a tired pilot are enough to crash a plane, something is seriously wrong with air safety. Journalists ready to help manufacturers and regulators emphasized the time the pilot needed to get back to the cockpit. Have you ever seen the layout of a long-haul plane? Perhaps you have noticed a door in the middle of the cabin through which crewmembers come and go; it leads to the rest bunks downstairs. Now, watch in hand, try to measure how long it takes to get from those bunks to the cockpit. Why were the bunks moved away from a position near the cockpit? The reality is that the crewmembers got confused, and a system that does not give enough information about what is happening is, at least, questionable, because under unplanned events it can produce confusion and an inability to find the right action to perform.
  3. Cyberattacks: About two years ago, Iran claimed to be able to take control of a U.S. drone and force it to land. U.S. officers rejected the feasibility of that but, only days ago, something happened that invites us to think it was a real possibility: if a mobile phone is enough to take control of a manned plane, can we seriously claim that it is not possible, with more advanced technology, over an unmanned one? Aviation is not the only activity where things like these can happen: it was said months ago that something bigger than 9/11 could come from powerful cyberattacks able to knock out vital installations. Even some technology managers worry about the possibility of an undetected cyberattack that could turn a sophisticated weapon into something useless at the critical moment. Beyond cyberattacks, the feasibility of an EMP (electromagnetic pulse), with similar but longer-term effects, is another real and present danger.
  4. Food fraud: Is it so hard to identify the kind of meat, if any, that hamburgers contain? Perhaps it is hard for consumers, but is it also hard for regulators? Why does a new scandal appear every other day with products containing something that is not in the declared composition? Not only horse or moose meat but dog meat too; what else?
  5. Aviation safety: Beyond AF447, is the average passenger informed and concerned about the real safety level? Does the passenger know that, when crossing an ocean in a twin-engine plane, if one engine fails the plane is certified to fly for hours on the remaining one? Are the manufacturing processes that were certified the ones actually used or, as some people say, different ones? What about the air quality on board? What about the practice in some airlines of having a flight student as first officer, paying to fly a plane with passengers?

In short, technology has kept to the same track and it is harder and harder to understand and check. Regulators cannot be trusted, independently of their knowledge or professional attitude, if the final user cannot check their work. If users cannot verify how regulators protect their interests, the expected result is that regulators will take care of their own interests, which are not always the same as those of the users.

The old Rasmussen rule, “the operator has to be able to run cognitively the program that the system is running”, has not been followed for a long time. However, as time goes by, we find that the problem goes beyond operators. Designers themselves understand specific parts without a clear understanding of the full product and are therefore unable to foresee the consequences of interactions among different parts of the system. AF447 was a big warning light, but it is not the only one and perhaps not even the worst. It is the right time for a reassessment of the technology development model. Otherwise, the consequences will get worse and worse.


