Are The ACS Working?

It depends. The implementation is going well, but it could be years before the new Airman Certification Standards can move the safety needle for GA.


The first Airman Certification Standards (ACS) were issued in 2016, after a five-year gestation period, replacing the Practical Test Standards (PTS) system that previously governed checkrides for airman certificates and ratings. Implementation has proceeded smoothly, and according to designated pilot examiners (DPEs), the ACS is not more burdensome than the PTS. The big unknown, however, is whether the ACS will help improve safety, as reflected in accident rates.

The answer to that question will depend on how the risk management elements are trained and tested, and on how those standards are adopted by existing pilots who may never take another practical test. In the final analysis, it will depend on how pilots, instructors and others take the ACS to heart after the checkride. I have written about the ACS before (see “The Coming Airman Certification Standards,” July 2013, and “New Certification Standards,” September 2015). It’s been four years since the first ACS was issued, so it’s time to take a look at how well they are working.


A little history can explain the origin of the ACS concept and connect it to the real safety issues still affecting general aviation pilots. We need to look back several years to get the whole picture.

In response to several tragic airline accidents in the 1990s, a joint FAA/industry program called the Commercial Aviation Safety Team (CAST) was formed to improve airline safety. By 2001, its programs were already bearing fruit, and today the domestic U.S. airline fatal accident rate has effectively been reduced to zero. A similar effort, the General Aviation Joint Steering Committee (GAJSC), was also formed in the late 1990s to address general aviation accidents.

I was the manager of the FAA’s General Aviation and Commercial Division from 2001 to 2005, and I played a major role in creating the FAA Industry Training System (FITS) program to test the premise that training reform would be an important means to lower the accident rate (see the sidebar on the opposite page). In partnership with aviation universities, manufacturers and others, the FAA tested new training concepts such as scenario-based training, student-centered learning, single-pilot resource management (SRM) and better automation management. These differed from traditional training concepts—like maneuver-based training and rote learning—used in general aviation as far back as the Civilian Pilot Training Program in 1939.

Between 2005 and 2010, the FAA made only limited progress in implementing FITS-related training improvements. For example, the FAA issued the Risk Management Handbook (FAA-H-8083-2) in 2009 but did not initially change knowledge and practical tests to incorporate risk-management concepts.

In late 2010, the newly formed Society of Aviation and Flight Educators (SAFE) decided it was time to highlight the need for flight training reform by sponsoring a conference. The event was held in May 2011, attended by several hundred leaders in the flight training community and included a keynote address by the FAA administrator.

The SAFE conference and other events catalyzed FAA efforts and, in late 2011, the FAA chartered an Aviation Testing Standards and Training Aviation Rulemaking Committee (ARC) to study testing standards and make recommendations. In a 2012 report and 2013 follow-on report, the ARC laid out broad recommendations to make training standards more effective, and the agency also formed an Airman Certification Standards Work Group (ACSWG).

FITS Findings and Recommendations

The FAA Industry Training System (FITS) team concluded that the then-current FAA methodology for developing its Practical Test Standards had “not been updated to include modern scientific or statistical methods to select training and evaluation tasks.” For example:

• “Many of the maneuvers currently included in the Practical Test Standards were originated by the Army Air Corp (pre-1947) and later adapted by the FAA for general aviation. There has never been any scientific evaluation of these maneuvers to determine validity.”

• Drawing a distinction between the content of a maneuver and the criterion for its completion standards, the FITS team found that “a number of PTS maneuvers (eights on pylon, chandelles, lazy eights, s-turns, etc) do not represent actual flight maneuvers and therefore are not content valid. In addition, a handful of maneuvers also seem to lack criterion validity (180 degree accuracy approach and landing, steep spiral, lazy eights, eights on pylon, and chandelles).”

• “If a particular skill is sufficiently tested in one maneuver, it need not be evaluated in other maneuvers.” The team identified “180-degree accuracy approach and landing, chandelles, lazy eights, eights on pylons, s-turns, turns around a point, steep spirals, and rectangular course” as maneuvers exhibiting “excessive redundancy.”

• “If a particular evaluation maneuver is artificial (i.e. it does not mimic a maneuver required in actual flight; low content validity), and if the skills that are evaluated by that maneuver are sufficiently tested by other maneuvers…then the artificial maneuvers could be eliminated—especially if training time or cost is critical.”

Results of the FITS research are still available on the FAA website.


The working group began developing the first ACS documents in 2013-2014, starting with the private pilot airplane certificate. The premise was that the existing PTS comprehensively addressed the stick-and-rudder skill requirements for the practical test, and that these should be carried forward into the ACS system. What was missing were standards specifying the knowledge requirements supporting each task in the practical tests. Until this change took place, the separate knowledge tests administered by the FAA were only loosely correlated with the practical tests and included considerable numbers of obsolete questions.

From the beginning of the ACS creation process, I and others advocated for including risk management standards in the ACS. The evidence implicating poor risk management as a major root cause of fatal accidents was strong. The ACSWG thus decided that the trilogy of knowledge, risk management and skills would be present in each task of every ACS document. In addition, the ACSWG determined that the risk management elements of every task would require the applicant to demonstrate the ability to identify, assess and mitigate risk associated with that task.
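The identify-assess-mitigate sequence the ACS requires can be sketched with the likelihood/severity matrix approach found in FAA risk management guidance. This is only an illustrative sketch: the hazard names, numeric scores and band cut points below are my own assumptions for demonstration, not FAA or ACS data.

```python
# Illustrative sketch of the identify/assess/mitigate sequence required by
# each ACS task. Hazards, scores and thresholds here are hypothetical
# examples, not FAA-published values.

LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4}
SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}

def assess(likelihood: str, severity: str) -> int:
    """Combine likelihood and severity into a simple risk score (1-16)."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

def classify(score: int) -> str:
    """Map a score onto high/medium/low bands (cut points are assumptions)."""
    if score >= 9:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# 1. Identify: list hazards for the task (e.g., a night cross-country).
hazards = {
    "deteriorating weather": ("occasional", "critical"),
    "pilot fatigue": ("remote", "critical"),
}

# 2. Assess each hazard, then 3. Mitigate anything not rated low.
for hazard, (lik, sev) in hazards.items():
    band = classify(assess(lik, sev))
    if band != "low":
        print(f"{hazard}: {band} risk, mitigate (delay, add fuel, brief alternates)")
```

The point of the sketch is the structure, not the numbers: the applicant must show they can name the hazards, rate them in some defensible way, and state a concrete mitigation for anything that rates above low.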

Once the prototype documents were available, it was time to validate them in the real world of flight training and testing. This was accomplished by selecting several university flight programs as beta test locations. Dozens of applicants were tested for the private pilot and eventually other certificates, including the instrument rating. The applicants, flight instructors and examiners were all polled regarding their views on the effectiveness of the ACS documents in comparison with the pre-existing PTS.

Overall, implementation and rollout of the prototypes were successful, with feedback incorporated in the final documents. The final private pilot airplane and instrument rating ACS documents became effective in June 2016, followed by commercial pilot airplane in June 2017 and airline transport pilot in 2018. As this is written, flight instructor, helicopter and other ACS are being developed. Information on the status of the ACS program is available on the FAA website.

Make the Flight Review More Effective

Most general aviation pilots only see a CFI once every two years for their required flight review. A few meet the biennial requirement by accomplishing one of the alternatives permitted by FAR 61.56. These include taking a practical test for a certificate or rating, or completing an FAA Pilot Proficiency Program (Wings) phase. The number of pilots using the Wings option is very low. 

The GAJSC issued a safety enhancement (SE) several years ago that recommended the creation of a risk-based flight review. This SE has gone nowhere, and I can vouch for that from personal experience. Through most of my flying career, I have been subject to mandated FAA proficiency checks that substitute for the flight review. For example, I was required to take annual pilot-in-command (PIC) proficiency checks to serve as PIC in turbojets. For the last several years, however, I have taken the route that most of you take: hunting down a CFI to give me a flight review. The experience has been depressing.

For the last three cycles, I asked the CFI to give me a “risk-based” flight review. All I got back was a blank stare. Of course, this isn’t really the fault of the CFI, since the GAJSC SE on this subject has not gotten out to the training community. However, when I asked these CFIs what guidance they would use to conduct a flight review, I got more blank stares. I asked them if they were aware of the guidance in FAA Advisory Circular (AC) 61-98 (the current edition is 61-98D). Again, more blank stares. The resulting flight reviews I received were desultory affairs: maneuver-based, unrelated to the missions I typically fly and utterly unchallenging. It’s past time for the FAA and the aviation community to end this and make the flight review more effective.



Over the long run, I’m confident the ACS will help lower the GA accident rate. There are, however, two factors that could be an impediment to this taking place anytime soon. The FAA and the general aviation community can take actions to reduce these impediments and, in some cases, these actions are already underway.

First and foremost, the FAA and its industry partners have not, until just recently, created adequate guidance on how to perform, teach and test risk management. The existing FAA Risk Management Handbook, published in 2009, does not completely explain how to identify, assess and mitigate risk, as required by the ACS. The handbook does a fair job on risk identification, offers a less complete treatment of risk assessment, and provides almost no information on how to mitigate risk. In addition, the handbook does not provide meaningful real-world case studies.

The need for revising this important guidance material has not gone unnoticed by the GAJSC. A comprehensive revision of the Risk Management Handbook was begun in January 2020. With any luck, we will see a completely rewritten handbook sometime in 2021.

The GAJSC also agreed that revised guidance was needed for flight instructors on how to teach risk management and for designated pilot examiners (DPEs) on how to test risk management. Such guidance was developed during 2019, and a revised Aviation Instructor’s Handbook was issued in June 2020. Revised guidance for DPEs is pending. For these two professional groups, instructors and examiners, guidance material is only half the story. These groups will also require FAA standardization training in order to learn how to properly apply the guidance material. There are fewer than 1000 DPEs nationwide, and the FAA has annual standardization requirements for all of them. This should enable the FAA to promptly bring the examiner community in line with risk management testing requirements.

The 100,000-plus flight instructor community is another matter. However, flight instructors must renew their certificates every two years, and most of them do it through online or live flight instructor refresher clinics (FIRCs). If the FAA takes immediate action to require a FIRC element on teaching risk management, most CFIs should have the ability to teach this important skill within a couple of years.

That finally brings us to the question of how to provide practical risk management training to the existing population of 500,000 or so pilots. Many of them may never take another practical test. Further, a percentage of new pilot certificates are issued to individuals training to become professional pilots, many of whom will never enter the general aviation community after completing training. At this rate, it would take years for the entire pilot population to have taken a practical test under the ACS system. The most effective way to reach the existing pilot population is through the flight review and other currency events required under Part 61. The sidebar on the opposite page discusses this in more detail.


Even if you never take another checkride, you can make good use of the standards in the ACS, especially those pertaining to risk management. The steps listed here will help you master the art and science of risk management, improve your operations, and perhaps avoid an accident.

• Take a risk management course, or a single-pilot resource management (SRM) course that includes a risk management element.

• When the FAA reissues the Risk Management Handbook, obtain a copy and take a deep dive into it. The final version should have plenty of examples and case studies to help you.

• For your next flight review, try to find an instructor who at least follows the guidance in AC 61-98D, and maybe will agree to provide a risk-based review.

• If you are a CFI who gives flight reviews, familiarize yourself with AC 61-98D. Also, dive into the revised Aviation Instructor’s Handbook and familiarize yourself with the chapter on Teaching Risk Management.

• If you are a CFI, for your next online renewal course, pick one that emphasizes risk management. I use the one from King Schools, but they all should be adequate.


Robert Wright is a former FAA executive and President of Wright Aviation Solutions LLC. He is also a 10,000-plus-hour ATP with four jet type ratings, and he holds a flight instructor certificate. The opinions in this article do not necessarily represent those of clients or other organizations he represents.


  1. Excellent article.

    So, why are there so few FAA Wings seminars (OR webinars) that address risk, risk mitigations, etc.?

    FWIW, I look for and attend a large number of interesting webinars each year. I am disappointed in most of the FAA presented events. Yes, some are very useful, chock full of material to think long and hard about. However, beyond presenting grim stories the insights drawn from accident data are hit or miss. I seldom see any Wings or other webinars integrate meaningful polls within webinars as tools to stimulate thinking or test understanding of presented material.

  2. Risk management is difficult to evaluate because it is highly subjective. What represents a high-risk operation in one scenario may very well be low risk with certain factors changed. How, then, can a DPE assess whether a pilot has made the correct decision, other than by using their own subjective judgment? Additionally, evaluating a change in practical standards across a university setting, and no other, leaves a very large segment of the GA training system out of the equation, especially considering that most university instructors are very low-time upperclassmen simply teaching for the hours. Not asking the high-time, long-term Part 61 instructors for input is rejecting valuable data that is directly correlated to GA training performance. Unfortunately, ignoring GA needs is a common theme behind many FAA and other alphabet-organization efforts. Leaving such a large segment of highly experienced users out of an analysis is a poor evaluation of any concept, and predestines it to either fail or, at least, be relatively ineffective.

