Dealing with, and preventing, problems

The recent crash of an Ethiopian Airlines Boeing 737 Max 8 aircraft, with the loss of all on board, has once again focused considerable attention on airline safety. The main reason for this is, of course, that when an aircraft crashes, many people lose their lives in that single incident. Whilst I don’t in any way want to diminish the seriousness and sadness of such tragedies, it is perhaps important to recognise that the proportion of lives lost flying on commercial airlines, relative to the number of people flying, is considerably and significantly smaller each year than, for example, the proportion of patients who lose their lives undergoing medical or surgical procedures in national health services around the globe. Yet the latter seldom receives much publicity, because those deaths occur individually, spread out over time.

Now of course, it could be said that it’s foolish to try to compare safety in the airline industry with safety in the health services; so many things are different that the comparison is unfair and unreasonable. There is some truth in that, though possibly not quite as much as most people assume. Even so, there are two things the airline industry does with very strict adherence which, sadly, national health services either do not do at all or do nowhere near as rigorously and thoroughly. These relate to the prevention of problems, and to the way problems are addressed when they do occur.

The first of these is the use of the ‘black box’. Introduced as a mandatory requirement across the globe in the late 1960s, these virtually indestructible boxes record all the important data from the aircraft’s systems as well as the voices of the officers in the cockpit; each commercial aircraft carries two of them. Following a crash, the recovery of one, and preferably both, of these boxes is vitally important. Why? Because the contents are analysed in the minutest detail by experts to establish the root cause of the crash, and to identify long-term solutions that will, as far as is humanly and technologically possible, prevent it ever occurring again, rather than simply to point the finger of blame at individuals or organisations. The solutions can cover many aspects, not solely technical ones: sometimes, for example, weaknesses in crew training or briefing are identified and addressed. People are encouraged to come forward and contribute to the whole process in as open and honest a way as possible. Air industry experts reckon that the invention and vigorous use of the ‘black box’ has contributed significantly to the excellent safety record the industry now has.

Contrast this, if you will, with the health sector. As far as I’m aware, no hospital operating theatre anywhere in the world has the equivalent of a ‘black box’, recording, from the beginning to the end of each surgical procedure, all the data produced by the operating and monitoring machines, video of the procedure, and the verbal communications between the theatre staff. On the contrary, in some authorities, certainly here in the UK, far from everything being recorded in a full and transparent way, there appears to be a culture of fear in which mistakes are covered up and lessons are seldom learnt from them. This is so prevalent that the National Health Service (NHS) has had to introduce a policy to safeguard ‘whistleblowers’. What sort of culture needs a formal policy to encourage people to come forward and tell the truth without fear of recrimination?! Of course, in the absence of any form of independent recording mechanism, reliable statistics on preventable deaths due to surgical errors are hard to come by; those statistics that do exist depend upon the people responsible recording and reporting them. In a culture where blame and recrimination appear rife, it is hardly surprising that little confidence is placed in the figures available. Yet even the estimates that exist, which must surely under-represent the true situation, are staggering: a report in 2017 estimated that in NHS England alone there were up to 9,000 deaths in hospitals each year caused by failings in NHS care. That’s the equivalent of FIFTY fully loaded Boeing 737 crashes with NO survivors each YEAR in England alone!
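(To make the arithmetic behind that comparison explicit: a Boeing 737 in a typical single-class layout seats somewhere in the region of 180 passengers, a working figure of my own rather than one taken from the report, and 9,000 ÷ 180 comes to 50 full aircraft per year.)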

The second thing the airlines now do with considerable rigour to help prevent accidents and major problems occurring is Crew (or Cockpit) Resource Management (CRM). Introduced in the late 1970s following two major airline disasters, it became a global standard by the 1990s and is now followed throughout the industry. Focusing on training, leadership and communication, it essentially brought the concept of true teamwork into the cockpit. Prior to this, the Captain ‘reigned supreme’. He (and in those days it was almost entirely “He”) made all the decisions and was assumed to be the expert who knew everything. It would be a brave, and possibly foolish, First Officer who challenged a decision or action by the Captain, even in Western cultures where challenging a superior had become more the norm in other industries. But CRM changed all that, and today, as any viewer of “Pilotseye” videos will know, all flight deck verbal communications are repeated back, every key action taken by a flight crew member is checked by a colleague, and every key decision is discussed and agreed by the team. I was fascinated to observe that whilst Captain Sullenberger was bringing his crippled airliner down onto the Hudson River in 2009, an amazing act of flying skill, he was still ‘doing’ CRM. With less than a minute to go before landing he asked his First Officer, “Got any ideas?” Even though the answer was “None, actually”, it demonstrated how deeply CRM was embedded in his consciousness. Again, as with the ‘black box’, air industry experts consider that CRM has contributed significantly to the excellent safety record the industry now has, by helping to prevent accidents and major incidents occurring.

Contrast this, again if you will, with the health sector. Although I’m sure every member of a surgical operating theatre ‘team’ will assure you that ‘teamwork’ is essential and that they all practise it, I suspect they are largely referring to the distinct roles each person has and how these roles are all needed in the process, rather than to communication, leadership and team-focused problem prevention and solving. There are plenty of anecdotes, backed up by patients’ observations, suggesting that there are still far too many cases where the consultant ‘in charge’ of a surgical procedure does indeed tend to ‘reign supreme’, with very little allowance for challenge or team-based problem solving. How many, I wonder, of those estimated 9,000 preventable deaths in England alone would not have occurred if a rigorous and enforced system such as the airlines’ CRM had been in place?

So, all of this may raise some questions for you (since you’ve read this far!). Which of these industries, the airline or the health service, is your organisation most like? Do you have a culture, and supporting systems and processes, that investigate fully, and without attaching blame, each and every serious problem that occurs, with a view to learning from it and taking action that will substantially help prevent it ever happening again? Do you have a truly team-based approach to decision making and problem solving? If not, then maybe there’s an opportunity for you to initiate some improvements!