Pilot Duty of Care and the Role of the Human Factors Expert

When an aircraft accident occurs, everyone wants to uncover the underlying causal and contributory factors; in other words, the why. After careful examination of all available information in an aviation accident investigation, a skilled human factors expert can render educated opinions regarding a wide range of issues, from duty of care considerations to external influences and beyond. At a minimum, these issues will include physical, physiological, psychological and psychosocial factors, as well as matters pertaining to the human-machine, human-environment and human-system interfaces.

First, some background: Most of us immersed in the airline industry, and particularly those of us qualified as captains in scheduled commercial airline operations, are intimately familiar with the pilot duty of care concept. Varied explanations and definitions of duty and degree of care exist; for the purposes of this discussion, however, only a few matter. First, the Code of Federal Regulations provides that a person may not operate an aircraft in a careless or reckless manner so as to endanger the life or property of another.1 Second, a pilot has a duty to remain vigilant throughout the flight, and a duty to observe, recognize, and avoid dangerous conditions which confront him/her.2 In Webb v. United States, 840 F. Supp. 1484 (D. Utah 1994), the court held that “the pilot of the aircraft is directly and ultimately responsible for the operation of his aircraft. Pilots are charged with that which they should have known in the exercise of the highest degree of care.” Finally, with specific regard to the airline, a common carrier by air must exercise the highest degree of care consistent with the practical operation of its plane for the safety of its passengers.3

Practical Operation. Perhaps it was unforeseeable at the time, but in this modern era of highly advanced, automated aircraft capable of virtually every maneuver except automatic takeoffs, the definition of “practical operation” lends itself to various interpretations. While some salty dog whose last aircraft has long been soaking in the desert may contend that his definition of practical operation is anything between the “shaker and the clacker” (i.e., stall and overspeed), mine is a slightly more conservative reading of what I assume was intended. For example, upon being cleared for a visual approach on a cloudless, sunny day with light winds and unlimited visibility, I might very well decide that one option for the practical operation of the airplane is to disconnect every last bit of automation (including the flight directors) and hand-fly the approach all the way to landing, an option I routinely prefer. Conversely, given the same conditions, my First Officer may elect to leave the automation fully engaged throughout the entire approach, landing, and rollout, assuming conditions and equipment permitted an autoland. Did either of us fail to exercise the highest degree of care consistent with the practical operation of the aircraft for the safety of our passengers?

Ask 10 experts and, surprisingly, you might get 10 different answers.

Several fatal commercial aviation accidents within the last five years have raised the issue of automation; specifically, whether flight crews were exclusively focused on and/or overly reliant upon it, to the complete detriment of their situational awareness (SA), ultimately resulting in the loss of the aircraft. Despite the presumed lessons learned in the aftermath of these accidents, NASA’s Aviation Safety Reporting System (ASRS) continues to receive reports in which crews appear to focus on the autoflight system to the extent that their SA is reduced, sometimes during critical phases of flight. Awareness of the aircraft’s current flight dynamics (altitude, heading, airspeed, attitude, etc.) and the pertinent aspects of the approach appear to have become secondary notions rather than primary elements of flying the aircraft.4

While a strong argument can be made for using automation to the maximum extent consistent with the practical operation of today’s complex aircraft, the duty of care required of a pilot to fully comprehend, properly utilize, and diligently monitor the autoflight system must evolve in lockstep with these advancements in technology. Training is obviously critical; however, even as contemporary flight crew training now fully embraces human factors concepts such as Crew Resource Management (CRM), it has also become increasingly focused on the passive task of automation management vs. active, physical manipulation of the aircraft. As we have seen, we ignore the myriad human factors issues associated with this sea change in thinking at our own peril.

Recently raised human factors red flags such as automation fixation, human interaction vs. workload, and decreased takeover skills and other “out-of-the-loop” performance issues are not new; unfortunately, however, they persist as contributory and sometimes causal influences in modern-day aviation accidents. From a regulatory perspective, it can be argued that not much has been done. As a start, a conceptual shift from “Pilot-Not-Flying” (PNF) to “Pilot Monitoring” (PM) occurred in 2003 with the publication of FAA Advisory Circular (AC) No. 120-71A, Standard Operating Procedures for Flight Deck Crewmembers, and a related appendix that addressed crew monitoring and cross-checking. The AC stated, among other things, that “it is increasingly acknowledged that it makes better sense to characterize pilots by what they are doing rather than by what they are not doing.”5 It went on to say that studies of crew performance, accident data, and pilots’ own experiences all point to the vital role of the non-flying pilot as a monitor. In my experience, this role has become increasingly critical in the effort to maintain the level of cockpit SA required in highly automated aircraft, one not easily eroded by distractions, mode confusion, workload, fatigue, or over-reliance.

After AC 120-71A was published, all was relatively quiet until 2006, when the FAA announced its recognition of the need to address technological advancements in the cockpit, corresponding changes in procedures, and increased reliance on automated flight management systems. It responded by forming a Flight Deck Automation Working Group to assess the safety and efficiency of modern flight deck systems for flight path management, equipment design, procedures, qualification, and training, among other issues. On November 21, 2013, the FAA published a document entitled “Fact Sheet – Report on the Operational Use of Flight Path Management Systems,” ultimately taking over seven years to conclude that “modern flight path systems create new challenges that can lead to errors…including complexity in systems and in operations, concerns about degradation of pilot knowledge and skills, and integration and interdependence of the components of the aviation system.”6 Along the way, the working group made 18 recommendations, some of which the FAA claims to have taken action on; however, most of that action has come in the form of guidance and research studies rather than enforceable regulation. One important development has been the recently finalized Pilot Training and Pilot Certification Rules requiring balanced automation management and manual flying skills training and checking for all air carrier pilots. In addition, on January 4, 2013, the FAA published Safety Alert for Operators (SAFO) 13002, “Manual Flight Operations,” encouraging operators to promote manual flight operations when appropriate.7 Unfortunately, the SAFO is not regulatory in nature and therefore not enforceable; it merely encourages operators to take this integrated approach and suggests in its recommended action that all affected parties should be familiar with its content and should work together to ensure it is incorporated into policy, training, and proficiency checks.

Despite industry-wide attention on the impact of automation on pilot SA and the latent erosion of manual flying skills, a 2013 study I conducted on a representative sample of my peers as part of my Master’s degree in Aviation/Aviation Safety indicates that we still have a long way to go. Approximately two-thirds of respondents indicated that they had at some point been confused by what the automation was doing, and 100% of respondents reported having to disconnect the automation to make the aircraft respond as desired. When asked if the automation confusion event could have resulted in an accident or serious incident, almost 75% responded that it could have, had it gone unnoticed and/or unresolved. There is cause for optimism in that each pilot acted appropriately, disconnecting the automation and hand-flying prior to ending up in an undesired aircraft state; regrettably, history shows this doesn’t always occur. Of further concern, respondents reported that between 85% and 95% of their annual recurrent simulator training is accomplished with the automation engaged, with very few maneuvers practiced and/or evaluated manually. Exacerbating the issue, not every airline has an automation policy mandating or advocating hand-flying to maintain proficiency, and those that tout such policies don’t truly have any viable means to enforce them. Thus, fair or not, it is essentially left up to the individual pilot to establish and maintain the requisite proficiency level both with and without automation engaged, particularly in critical phases of flight, and to ensure that it continuously meets or exceeds his/her duty of care.

That said, the issue of pilot duty and degree of care continues to evolve with advancements in technology and aircraft capability, along with other progressions such as electronic flight bags, NextGen, touchscreen avionics, and perhaps someday soon, adaptive automation. This evolution persists irrespective of whether the pilot community has been given adequate time, resources and/or training to adapt, or whether the necessity of such a period of acclimatization is even acknowledged. By virtue of stepping into the cockpit day after day, pilots both acknowledge and tacitly accept the duty of care concept, regardless of what it takes to uphold it. In consideration of the challenges that advanced automation can present, coupled with my personal observations from the cockpit over the last 20 years, my own preparation consists of the following (minimum) requirements:

–       The necessity of fully understanding, both functionally and operationally, the capabilities, modes, limitations, incompatibilities, and possible failure scenarios of the autoflight system, and of maintaining comprehensive proficiency in its utilization, particularly during critical phases of flight;

–       The commitment to maintaining a high level of manual flight proficiency (i.e., autopilot, autothrust, and flight directors off), particularly in challenging conditions and/or critical phases of flight, so that the level of care I provide in the event the autoflight system fails or is otherwise unavailable equals that which exists when the system is fully functional and engaged;

–       Full participation in my role as Pilot Monitoring when my First Officer is at the controls, including active vigilance/situational awareness, complacency avoidance, and the readiness to manually intervene should the flight path become compromised; and

–       Enhanced crew coordination and communication efforts, to include briefing specific levels/modes of automation to be used, and verbal annunciation of active flight management system modes as they vary throughout the flight.

Using these examples as a template of sorts, and with regard solely to automation in the context of this discussion, is it then reasonable to conclude that the absence of similar preparation constitutes careless or reckless behavior, or otherwise falls short of the duty of care standard? That will be but one of numerous complex issues to sort through when litigating an aviation accident, and, while it’s ultimately up to you and your client, an experienced human factors expert may be able to help. How? By providing you with an expert analysis of the why.

Regardless of what the NTSB eventually concludes, frequently absent from the report is an exhaustive human factors analysis. It may be apparent to all involved parties that the aircraft stalled and subsequently crashed; what is not so obvious is how a competent and skilled crew allowed that to occur, and the answer is rarely simple. Cockpit voice recordings, digital flight data, and ATC radar and voice tapes can be meticulously analyzed and are routinely used to construct a “simulation” of the accident. What this won’t reveal, however, is how both pilots became sufficiently distracted, fixated, task saturated, or confused; why they failed to reference visual cues or raw data; why they failed to appropriately monitor the automation and/or manually intervene; or any additional human factors issue(s) that may have contributed to the loss of SA that resulted in the accident. Unless, for example, a surviving pilot discloses in the post-accident interview that he/she was uncomfortable shooting a visual approach on a clear, beautiful day, the data will never explain why the pilot sat idly by and didn’t intervene when the aircraft’s flight path became compromised. Further, and more critically, the data will divulge neither the underlying reason for the flying pilot’s anxiety (e.g., lack of training) nor the reason for the absence of intervention by the pilot monitoring (e.g., culture). Sadly, investigators rarely have the luxury of a surviving pilot to interview, much less one willing to speak candidly regarding pertinent human factors deficiencies that may have emerged in hindsight, so as not to put a chink in his/her duty-of-care armor. This means that it is frequently left to the human factors expert to discover the underlying contributing or causal influences that might have been overlooked in the investigation, or to explain why any human factors issues alluded to might be far more significant than they appear in the context of the accident overview.

In conclusion, a good human factors expert can often be the difference in determining the why when an aviation accident occurs. While a detailed description of the full range of issues the human factors expert will attempt to determine is beyond the scope of this discussion, they will include the aforementioned physical, physiological, psychological and psychosocial factors, as well as the human interface with the machine, the systems and the environment. Additionally, in accordance with International Civil Aviation Organization (ICAO) guidelines for the Investigation of Human Factors in Accidents and Incidents,8 the expert will generally collect sufficient information to construct a detailed chronology of each significant event known to have occurred prior to and, if appropriate, following the occurrence (placing particular emphasis on behavioral events and what effect they may have had on the accident event sequence), as well as information permitting reasonable inferences to be made about factors which may have influenced or motivated a particular accident-producing behavior. This includes examining the conditions under which front-line operators were working; the decisions, actions and behavior of all parties concerned with the occurrence; and the organizational structure, policies, procedures and practices under which activities were performed. In this manner, a human factors expert can assist with gaining a full understanding of how the “window of opportunity” for the accident may have been created.

References

1 14 CFR 91.13.

2 Scruggs v. United States, 959 F. Supp. 1537 (S.D. Fla. 1997).

3 Andrews v. United Airlines, 24 F.3d 39 (9th Cir. 1994).

4 Autoflight Associated Loss of Situational Awareness, National Aeronautics and Space Administration (NASA) Aviation Safety Reporting System (ASRS), Callback, Issue 407, December 2013.

5 Standard Operating Procedures for Flight Deck Crewmembers, Federal Aviation Administration, Advisory Circular No. 120-71A, February 27, 2003.

6 Fact Sheet – Report on the Operational Use of Flight Path Management Systems, Federal Aviation Administration, November 21, 2013.

7 Manual Flight Operations, Safety Alert for Operators (SAFO) 13002, Federal Aviation Administration, Flight Standards Service, Washington, DC, January 4, 2013.

8 Investigation of Human Factors in Accidents and Incidents, Human Factors Digest No. 7, International Civil Aviation Organization (ICAO) Circular 240-AN/144, 1993, Montreal, Canada.
