26 June 2025

Communication Key to Safety Improvements

Through crew resource management and threat and error management, better communication on the flight deck has driven decades of safety gains.

American author and aviator Ernest K. Gann wrote of his experiences in what some call the golden age of the airlines - the late 1930s to the 1950s. He chronicled a world where captains were kings and rarely questioned. In Gann’s memoir Fate is the Hunter, he described his role as a copilot: ‘I was expected to operate the landing gear and flaps on command, keep the log, the flight plan, and my mouth shut.’

But the golden age was not so golden. In 1959, in the United States alone, there were 40 fatal accidents per 1 million aircraft departures. Today, the fatal accident rate is a fraction of that; according to most calculations, it hovers not far above zero.

Improvements in cockpit dynamics and communication, championed by Flight Safety Foundation and other organisations, contributed to that progress. Copilots, or first officers, are no longer expected to keep silent. Captains are expected to encourage crew input to make better decisions and to prevent and mitigate errors. Crew resource management (CRM) can be defined as the effective use of all available resources by flight crew members to ensure safe operations, reduce errors, avoid stress, and increase efficiency.

A 1979 workshop sponsored by the U.S. National Aeronautics and Space Administration (NASA) helped establish the CRM concept. The workshop, titled Resource Management on the Flightdeck, stemmed from NASA research on air transport accidents. Research presented at the event identified the types of human error that lead to accidents, including failures in interpersonal communication, decision-making, and leadership.

Examples included the worst aviation disaster of all time, the March 27, 1977, collision of two Boeing 747s on the ground at Tenerife in the Canary Islands. The accident killed 583 people.

A KLM 747 began a takeoff roll in low daylight visibility at the same time a Pan American World Airways 747 backtracked on the same runway. The investigation found that the KLM captain:

  • Took off without clearance.
  • Did not obey the ‘stand by for takeoff’ instruction from the tower.
  • Did not reject the takeoff when the crew of the Pan Am aircraft reported they were still on the runway.
  • Replied emphatically in the affirmative when the flight engineer asked if the Pan Am 747 had cleared the runway.

The accident demonstrated in tragic terms what can happen when communication breaks down.

Early research referred to the CRM concept as cockpit resource management. By the time NASA held another workshop in 1986, the name had been changed to crew resource management. Training focused on topics such as team building, briefing strategies, situational awareness, and stress management.

By the 1990s, CRM training had evolved to better reflect the flight deck environment. Airlines began to include modules on applying CRM concepts to aircraft automation, which led to memory aids that help prevent mistakes in automation use. An example is CAMI, which stands for confirm, activate, monitor, intervene:

  • Confirm the function the crew wants to use.
  • Activate that function.
  • Monitor aircraft performance.
  • Intervene if the automation does not do what the crew intended.

Training also began to look at human factors such as fatigue and hazardous attitudes that can lead to accidents.

Those attitudes include:

  • Anti-authority - Don’t tell me what to do.
  • Impulsivity - The weather is marginal, but let’s just try it.
  • Invulnerability - I never have a problem with this approach.
  • Macho - The rules are for average pilots, and I’m better than average.
  • Resignation - What’s the use?

Further refinements led to the U.S. Federal Aviation Administration (FAA) approving a major change in airline training, the advanced qualification program (AQP). AQP training includes CRM concepts put to use in a line-oriented flight training (LOFT) simulator session. The ‘LOFT ride’ takes a captain and first officer on a normal line flight from departure to destination. Along the way, they encounter problems ranging from minor issues such as a runway change to major issues such as an engine failure or fire. For most of these problems, there is no right or wrong answer as long as the flight terminates safely. The pilots are graded not only on how they flew the aircraft but also on how they communicated and considered options.

Evolving CRM research brought an acknowledgement that human error cannot be eliminated entirely. This led to the concept of threat and error management (TEM): If we cannot eliminate error, then how do we minimise, mitigate, and manage it? The origin of TEM can be traced to line operational safety audits (LOSA) conducted by the University of Texas Human Factors Research Project (UT). During the 1990s, UT conducted jump-seat observations with Delta Air Lines and Continental Airlines. Trained observers categorised the origin of errors, the crew’s response to them, and the outcome.

This research led to a TEM framework model with three main components:

  1. Threats – events or errors beyond the control of line personnel. Threats can include a wide variety of things, such as weather, malfunctions, air traffic congestion, and disruptive passengers.
  2. Errors – actions or inactions by line personnel that lead to deviations from intentions or expectations.
  3. Undesired states – conditions in which an unintended situation results in a reduced safety margin. These conditions can range from relatively minor mistakes, such as turning onto the wrong taxiway, to potentially disastrous situations, such as a runway incursion.

During TEM-oriented training, pilots are debriefed after simulator sessions and asked to identify the threats they faced, how they handled the threats, and the result. In addition, airlines have begun using threat-forward briefings to stay ahead of potential problems. For example, during approach briefings, instead of a rote recitation of data on an approach chart, a captain and first officer discuss anticipated threats: ‘There are low-level wind shear advisories for our destination. Let’s review the wind shear escape procedure.’ Or for a departure briefing: ‘This is a low-visibility takeoff. Call out “centre line” if you see me drifting off it.’

This type of threat-forward briefing came about in the aftermath of the Aug. 14, 2013, crash of an Airbus A300-600 freighter during an approach to Birmingham-Shuttlesworth International Airport in Alabama, U.S. The crash killed the captain and first officer - the only people on board - and destroyed the aeroplane.1 The U.S. National Transportation Safety Board (NTSB) report on the accident said that because the crew did not re-brief when elements of their planned approach changed, they ‘placed themselves in an unsafe situation because they had different expectations of how the approach would be flown.’

In 2017, an AeroSafety World article noted that the Birmingham accident was an example of how briefings had become a ‘one-sided, box-checking’ event. The article advocated a more collaborative briefing concept.

To facilitate such collaborative briefings, airlines have begun using briefing aids. These are cards or placards that organise potential threats into categories, such as personal issues, weather, mechanical problems, and others. The aids encourage conversations between captains and first officers, such as ‘What other threats do you see? Did I miss anything?’ Ideally, the most imminent hazards become top of mind for the crew, and they become better prepared to handle those hazards.

In the decades since Ernest K. Gann’s career, the aviation community has learned much about how communication on the flight deck can enhance safety. Gann dedicated Fate is the Hunter to ‘old comrades with wings… forever folded.’ His list of those comrades runs for five pages and concludes, ‘Their fortune was not so good as mine.’ Lessons learned from such fortunes continue to benefit the flying public today.

Note:

1. The NTSB said the probable cause of the accident was ‘the flight crew’s continuation of an unstabilised approach and their failure to monitor the aircraft’s altitude during the approach, which led to an inadvertent descent below the minimum approach altitude and subsequently into terrain.’ Among the contributing factors was ‘the captain’s failure to communicate his intentions to the first officer once it became apparent the vertical profile was not captured’.

Author

Thomas W. Young

Flight Safety Foundation

www.flightsafety.org
