Chapter 15: Human Factors and Ergonomics Practice in Aviation: Assisting Human Performance in Aviation Operations

by Jean Pariès and Brenton Hayward

Practitioner summary

By nature a highly dynamic and intrinsically risky system, aviation has always had to take HF/E very seriously. It first did so empirically, through trial and error, following intuitions that naturally designated pilots or aircraft (and their designers and/or maintenance technicians) as primarily responsible for mishaps. It subsequently opened up to scientific rigour and became the focus of much research. Over the past 30 years the global footprint of aviation has facilitated the rapid adoption and dissemination of sound HF/E rules and principles. Aviation remains relatively open to new ideas and is often regarded as a model by other domains in regard to safety management matters. It is today a very demanding playground for HF/E practitioners, who face high-level requests from demanding, experienced and knowledgeable clients. This chapter attempts to reflect the authors’ diverse experience of HF/E practice in aviation personnel selection, training, equipment design and certification, safety management, and safety occurrence investigation.


Chapter Quotes

“HF/E issues have always been an integral component of the aviation industry, from the selection of personnel, through training development and implementation, equipment and procedural design, to safety occurrence investigation.”

“While HF/E focuses on design, one thing that has certainly not changed with time is the critical importance of having a clear understanding of the relevant job characteristics (of the pilot/air traffic controller/flight attendant/mechanic…) and selecting individuals that possess KSAOs appropriate for that job. While other elements of the aviation system are very important to safety and success, well designed equipment, procedures and training – the common purview of the HF/E practitioner – will amount to little if we don’t start by selecting the right people, and of course training them appropriately.”

“It is recognised today that proper consideration of HF/E in the design of a cockpit or an air traffic control (ATC) tower or operations room could actually govern the efficiency and the reliability of the interaction between those systems and their human operators and maintainers; hence influencing productivity and safety. In theory, the perspective has almost been inverted, and many efforts are made to adapt the systems to the humans as well as the other way around.”

“In an attempt to help investigators… the authors of this chapter have developed two safety event analysis methodologies. The first, SOAM…evolved from the Reason model and guides the analyst in tracing back contributing factors from human involvement, through contextual conditions to organisational factors, and on to absent or failed barriers. The second, MINOS…is the result of work undertaken with Airbus and EUROCONTROL in the early 2000s to overcome the limitations of causal approaches.”
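To make the layered structure described in the SOAM quote a little more concrete, here is a minimal, purely illustrative Python sketch of how an analyst might record contributing factors under those four headings. It is not the authors’ tooling and not the official SOAM taxonomy; the occurrence and factor labels are invented for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OccurrenceAnalysis:
    """Illustrative record of a SOAM-style layered analysis.

    The four layers mirror the sequence described in the chapter quote:
    human involvement -> contextual conditions -> organisational factors
    -> absent or failed barriers.
    """
    occurrence_id: str
    human_involvement: List[str] = field(default_factory=list)
    contextual_conditions: List[str] = field(default_factory=list)
    organisational_factors: List[str] = field(default_factory=list)
    absent_or_failed_barriers: List[str] = field(default_factory=list)

    def summary(self) -> str:
        """Return a short, readable summary of the analysis layers."""
        lines = [f"Occurrence {self.occurrence_id}"]
        for label, items in [
            ("Human involvement", self.human_involvement),
            ("Contextual conditions", self.contextual_conditions),
            ("Organisational factors", self.organisational_factors),
            ("Absent/failed barriers", self.absent_or_failed_barriers),
        ]:
            lines.append(f"  {label}: {', '.join(items) if items else '-'}")
        return "\n".join(lines)

# Hypothetical example occurrence, for illustration only.
analysis = OccurrenceAnalysis(
    occurrence_id="EX-001",
    human_involvement=["late go-around decision"],
    contextual_conditions=["unstable approach in gusty conditions", "high workload"],
    organisational_factors=["stabilised-approach policy not reinforced in training"],
    absent_or_failed_barriers=["no automated unstable-approach alert"],
)
print(analysis.summary())
```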

Practitioner reflections

Reflection by Stephane Deharvengt

On CRM…
“CRM must be subjected to rigorous evaluation or business case analyses”. This triggered old memories about why HF/E matters should be subjected to such “rigor and analyses”. HF/E is not part of the usual academic training in aviation; it is paid lip service in engineering schools, and top management rarely acknowledges the value of training in this area. One has only to look at the expertise that has been developed, maintained and retained in various aviation organizations: Airlines? None. Regulators? It gradually disappeared at the end of the 1990s, with the notable exception of the FAA. This discussion of CRM misses the question of the sociological acceptance of HF/E knowledge and techniques in aviation. Were CRM designers and facilitators rewarded for their enthusiasm or involvement? Was it a career enhancement? No. Therefore you can’t expect regulations or industry practice to reflect rigor, or to develop business cases, when no one has the knowledge to properly develop and evaluate those programmes. (See Deharvengt, S. (2007). Barriers to Regulating Resilience: Example of Pilots’ Crew Resource Management Training. Resilience Engineering Workshop, Vadstena, Sweden, Linköping University Electronic Press.)

On design…
The description of the regulatory requirements lacks the systemic perspective embedded in the cockpit design requirement (CS 25.1302): in addition to the ergonomics items listed on p. 9, the regulation requires the manufacturer not to consider its design in isolation (the integration aspect) and to perform a systemic analysis of the cases expected to occur in service. The “methodology” discussed on p. 10 is one aspect of that specific regulation.

In order to do that, and before the regulation was even developed, the major aircraft manufacturers and some major equipment suppliers invested in high-level academic HF/E expertise. This is a profound change for those organizations, where HF/E practitioners have built a place for themselves (in-house or as consultants), are accepted, and have their expertise sought by top management, designers and flight test pilots. This took some time, but I think the criticism of the cockpit design community is not entirely well founded. However, the lack of HF/E expertise in one of the two major regulators represents a real danger for the implementation of HF/E.

Now, I can’t say much about ATM; my in-house experience does not show a high level of understanding of HF/E integration needs, but rather a scattered concern for low-level issues, although things are progressively moving the way they have for the cockpit community. ATM is really the next big field for systemic design issues.

On SMS…
I agree with Jean and Brenton that SMS implementation relies on concepts (safety culture) and structures (the pipes) that constrain the activities of front-line operators without properly accounting for their adaptive capacities.

Even though the conceptual framework is well established and workable grids for SMS are yet to be developed, maybe the major change should be the recognition by top and middle managers that process and organisational issues have a real influence on how we deliver safety. A move from a compliance-based (quality-oriented) view to a socio-technical management view of organisations is needed, one that would take into account local and historical cultures, professional-culture gridlocks, and managers’ lack of training in the social sciences, for example.

On occurrence analysis…
The major bias nowadays is not a lack of an organizational view of the accident. On the contrary, most safety practitioners and their managers are naive believers that the more data we get, the better we will be at predicting, maybe not the next accident, but the next safety action plan we will implement. This big-data myth is pervasive in the still widely accepted view of the Heinrich pyramid.

In addition, although managers see a lot of organizational dimensions in incident analysis, the usual response (corrective action) still targets the front-line operators (change the procedure or the training) rather than the aspects of the bigger picture that those issues are trying to draw our attention to.

On the “causation” issue, I’ve had several discussions with DEDALE about this. From a pragmatic point of view, and considering the number of incident investigators involved in our organizations (airlines or ANSPs), it would be unrealistic to try to change their views, training or procedures and shift investigation away from causation (not to mention AAIBs…). That’s why my message is to use and build on a Safety-I perspective (causation, things that go wrong) and to introduce the Safety-II perspective progressively, in small touches. I agree with the concepts, but the real world is absolutely not ready for a radical change.

In summary, I really concur and feel “at home” with this historical description of HF/E and aviation, and I recognise the concepts that motivate my own practice. As an insider to the system and one of the key players in this game, my reflections point to the successes and failures in the field, as well as to avenues for furthering HF/E in aviation, since there is still a lot to do for motivated and knowledgeable individuals.

Reflection by Julia Pitsopoulos

It is great to see the focus on recruitment and selection in this chapter. Upon reflection, in the aviation industry today there appears to be a strong HF/E focus on non-technical skills training (e.g. via CRM training), while the other parts of the employee lifecycle, namely recruitment, selection and assessment, are sometimes somewhat neglected. A challenge to overcome is to bridge the gap that often exists between Human Resources teams and Operational/Safety teams. There are often differences in philosophies, objectives and focus in terms of what matters most. A common language and approach to the definition of safety-critical competencies for aviation personnel is vital to ensure we are utilising all methods for managing human risk across the entire employee lifecycle.

The challenge of non-technical skills training is ‘transfer’: how do we optimise training methods to ensure that, when it matters, the non-technical skills needed to manage threats are enacted in an effective and timely manner? Fast-tracking the development of mental models through experiential methods, such as fully integrating non-technical skills training across all training touchpoints (e.g. LOFT, technical training), and providing performance feedback using a structured behavioural framework, is critical to effective learning.
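As a minimal sketch of what ‘performance feedback using a structured behavioural framework’ could look like in data terms, the Python example below records debrief ratings against behavioural categories. The category names loosely echo common non-technical-skills frameworks, and the scenario, scores and comments are invented; this is not a framework taken from the chapter.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical behavioural categories, loosely echoing common
# non-technical-skills frameworks; not the chapter's framework.
CATEGORIES = [
    "cooperation",
    "leadership_and_management",
    "situation_awareness",
    "decision_making",
]

@dataclass
class DebriefFeedback:
    """One trainee's debrief scores on a 1 (poor) to 5 (very good) scale."""
    trainee: str
    scenario: str
    ratings: Dict[str, int]
    comments: Dict[str, str]

    def development_points(self, threshold: int = 3) -> Dict[str, str]:
        """Return comments for categories rated below the threshold."""
        return {
            cat: self.comments.get(cat, "")
            for cat in CATEGORIES
            if self.ratings.get(cat, threshold) < threshold
        }

# Invented example of a post-LOFT debrief record.
feedback = DebriefFeedback(
    trainee="FO example",
    scenario="LOFT: engine failure after take-off",
    ratings={"cooperation": 4, "leadership_and_management": 3,
             "situation_awareness": 2, "decision_making": 4},
    comments={"situation_awareness": "Monitor flight path earlier during checklist."},
)
print(feedback.development_points())
```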

One of the biggest focus areas for today’s HF/E practitioner, certainly in commercial airlines, needs to be flight path management and automation. The complexity of the system means that we are relying on the human operator to hold a mental model of a system so complex that it will inevitably present ‘surprises’ that need to be managed. As system complexity increases, the relative expertise of operators in the industry is decreasing, because everyday aviation experience provides less opportunity to build the skills and mental models necessary for a deep understanding of these complex systems. Furthermore, as the industry continues to grow, the supply of operators with the requisite deep knowledge, skills and experience in operating these complex systems, relative to demand, is also under strain. The automation surprise element certainly appears likely to be a key risk to be managed into the future.

Reflecting on the aviation industry as compared with other high-hazard industries, there still appears to be a large focus on issues that are historically and regulation driven, rather than data driven and operator specific. Risk management needs to continue to focus on factors such as context, national culture, operating environment, systems, organisational culture and types of automation, in order to ensure that operators are identifying and managing their own current and emerging risks. Generally the aviation industry has been seen as having an enviable culture of adherence to SOPs compared with many other high-hazard industries. However, recent industry data (e.g. LOSA) appear to reveal that perhaps this culture is not as strong as is perceived.

Concerning safety management, the challenge in any industry, not just aviation, is to establish metrics and data sources that are valid and reliable enough for ‘weak signals’ to be identified and managed, and to have a system for collecting, interpreting and presenting this information in a timely and efficient manner so that appropriate actions can be taken. Good investigation ‘process’ skills are critical to get past the superficial level. Thinking in a ‘systems’ manner does not always come naturally, and these skills require education and practice in order to continue to develop.
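To illustrate the ‘weak signals’ point, here is a deliberately simple sketch that flags occurrence-report categories whose recent reporting rate exceeds their historical baseline. Real SMS analytics are far richer than this; the categories, counts and threshold below are invented for illustration.

```python
from typing import Dict

def flag_weak_signals(
    baseline_counts: Dict[str, int],   # reports per category over a long reference period
    recent_counts: Dict[str, int],     # reports per category over a recent window
    baseline_months: int,
    recent_months: int,
    ratio_threshold: float = 1.5,
) -> Dict[str, float]:
    """Return categories whose recent monthly report rate exceeds the baseline
    rate by more than the given ratio. A crude trend indicator, for illustration."""
    flagged = {}
    for category, recent in recent_counts.items():
        baseline = baseline_counts.get(category, 0)
        baseline_rate = baseline / baseline_months if baseline_months else 0.0
        recent_rate = recent / recent_months if recent_months else 0.0
        if baseline_rate == 0.0:
            if recent_rate > 0.0:
                flagged[category] = float("inf")  # a new category of report
        elif recent_rate / baseline_rate > ratio_threshold:
            flagged[category] = recent_rate / baseline_rate
    return flagged

# Invented example data: occurrence reports by category.
baseline = {"unstable_approach": 24, "altitude_bust": 12, "runway_incursion": 6}
recent = {"unstable_approach": 3, "altitude_bust": 4, "runway_incursion": 1}
print(flag_weak_signals(baseline, recent, baseline_months=24, recent_months=2))
```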

 

 


