

Just Culture

In this article, Margriet Bredewold, CEO of Co-Guard GmbH, explains her concerns with interpretations of the ‘Just Culture’ approach.

Safety management system-based operations are being introduced across aviation, so the discussion about ‘Just Culture’ is increasingly relevant. The European Commission, EASA and various trade associations have taken up the concept and are making a serious attempt to oblige operators, maintenance organisations and ground handlers to adopt a ‘Just Culture approach’. Most of these attempts involve regulatory frameworks, policies and signed declarations stating that ‘honest mistakes’ are accepted as long as they are reported, but that cases of ‘gross negligence’ and wilful conduct are a matter for the courts. The Just Culture concept thus provides managers with a procedure to justify assigning culpability and disciplinary action (Hudson et al., 2008). However, is baking judicial involvement into the concept the way forward?

To understand the potentially very harmful effect of such an interpretation of a Just Culture, we need to look more closely at human behaviour, and human error in particular. Our industry accepts that 70% of accidents or incidents are down to ‘human error’ (UK CAA, 2014; Stevens and Vreeken, 2014). This can take many forms, ranging from ‘pilot judgment and actions’ and ‘situational awareness’ to ‘unsafe acts and errors’: ‘human error’ serves as a hindsight label for the root cause of incidents and accidents.

We speak of human error when something happens that was unintended by the actor, was not desired by a set of rules or an external observer, or led the task or system outside its acceptable limits. Human error as such is defined by its (undesired) outcome. To detect ‘error’, we have to be confronted by the (unwanted) consequences first. We then backtrack through an event to a decision point where the actor could have acted differently, and we assign the label ‘human error’ there (Dekker, 2014; Hollnagel, 2014). Unfortunately, the operator involved did not have the information about the eventual outcome at that point in time. Therefore, assigning responsibility for decisions or actions whose unintended consequences could not possibly have been foreseen is neither meaningful nor ‘just’.

The main difference between error and violation is intent. This difference is important, as it justifies many of our disciplinary policies across aviation. As with assigning error, we backtrack from an unfortunate action and compare it to existing procedures. When we find that a procedure was in place and had not been followed to the letter, we assign the label of ‘violation’ and base our judgement of culpability on this comparison.

In this way, we accept reducing human performance to either ‘good’ or ‘bad’ in relation to (unwanted) outcomes and the existence of, and post facto adherence to, rules and procedures. We conveniently strip a situation of its context – time pressure, lack of resources, customer demands, bad weather – which may provide valuable information to understand why a particular action or decision made sense to a professional in that situation (Dekker, 2014; Hollnagel, 2014). What is worse, we justify assigning culpability and serious legal and professional consequences on the basis of this simplification and misrepresentation of facts. Once formalised and legalised, we call it a ‘Just Culture’. However, operators perform despite imperfect information, technology and context (Dekker, 2014; Hollnagel, 2014; Pepe and Cataldo, 2011). Most of the time, they get it right.
Day-to-day performance adjustments are necessary to make our operations successful (Dekker, 2014; Hollnagel, 2014). So instead of focusing on the few times something goes wrong, conveniently simplifying complexity and ignoring the imperfect conditions to serve the courts, our focus should be on understanding that complexity (Hollnagel, 2014) and on the adjustments needed to support our front-line operators, reducing the chances that they end up in a bad situation where only split-second judgements save the day.

Claiming an a priori right to punish, based on a hindsight label assigned through oversimplified models, will deepen the distrust already present in our industry and undermine any opportunity for learning. It risks undermining the great safety and performance achievements our industry has built up. Instead, if we are willing to change our attitudes and opinions about our operators by actually understanding what they do and how they make our systems successful, we may already move towards that desired gain in trust.

List of References

Bredewold, G.M. (2014) A Socio-Technical Approach to Safety, European Rotorcraft Conference, Aeronautical Society, United Kingdom.
Dekker, S. (2008) Just Culture: Who Gets to Draw the Line?, Springer-Verlag London Limited.
Dekker, S. (2013) Second Victim, CRC Press, New York.
Dekker, S. (2014) The Field Guide to Understanding Human Error, third edition, Griffith University, Australia.
GAIN Working Group E (2004) A Roadmap to a Just Culture: Enhancing the Safety Environment.
Hollnagel, E. (2014) Safety-I and Safety-II: The Past and Future of Safety Management, University of Southern Denmark, Ashgate, England.
Hollnagel, E., J. Leonhardt, T. Licu and S. Shorrock, From Safety-I to Safety-II, European Organisation for the Safety of Air Navigation, www.eurocontrol.int.
Hudson, P. et al. (2008) Meeting Expectations: A New Model for a Just and Fair Culture, Society of Petroleum Engineers.
Pepe, J. and P. Cataldo (2011) Manage Risk, Build a Just Culture, www.chausa.org.
Reason, J. (1990) Human Error, Cambridge University Press, England.
Senders, J.W. and N.P. Moray (1991) Human Error: Cause, Prediction, and Reduction, Lawrence Erlbaum Associates, p. 25. ISBN 0-89859-598-3.
Stevens, J.M.G.F. and J. Vreeken (2014) The Potential of Technologies to Mitigate Helicopter Accident Factors – An EHEST Study, NLR, Amsterdam.
UK CAA (2014) EASA Rotorcraft Symposium, Cologne.
