ICAO Doc 9841 was written for national authorities: the CAAs that approve and oversee aviation training organisations. In practice, it’s also the reference document for ATOs themselves. The third edition (2018) runs to 180 pages. Most of the operational effort it asks for sits in two appendices: A and B.
Appendix A: what the training and procedures manual must contain
Appendix A lists eight main sections that any ATO’s training and procedures manual must cover. Four address the organisation itself: organisation chart, management qualifications, policies, and facilities. Four go directly to training production: client training programmes, practical and theoretical syllabi, competency-based programmes, and tests for the issuance of licences or ratings.
Section 3 (Client training programmes) is by far the largest. It requires, among other things:
- An explicit statement of each course’s aim and the required level of performance
- Pre-entry requirements and the handling of credit for prior learning
- Detailed training curricula covering theoretical knowledge, practical skills, human factors, assessment, and monitoring
- Student evaluation procedures: remediation, skill tests, knowledge tests, question analysis
- Training effectiveness policies: coordination between services, internal feedback, interim standards, handling unsatisfactory progression, instructor changes, suspensions
For competency-based programmes (the explicit target of EU Regulation 2025/2143 and of ICAO Doc 9868), section 3.3 adds a module-based organisation requirement: each module carries its own training objective and integrates knowledge, skills, and attitudes.
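That module structure is straightforward to express as data. A minimal sketch, with hypothetical field names (the class and attribute names below are illustrative, not taken from Doc 9868 or EDS):

```python
from dataclasses import dataclass, field

@dataclass
class Module:
    """One module of a competency-based programme (hypothetical schema)."""
    name: str
    objective: str  # the training objective tied to this module
    knowledge: list[str] = field(default_factory=list)
    skills: list[str] = field(default_factory=list)
    attitudes: list[str] = field(default_factory=list)

    def is_complete(self) -> bool:
        # The requirement integrates knowledge, skills and attitudes:
        # a module missing any of the three (or its objective) is flagged.
        return bool(self.objective and self.knowledge
                    and self.skills and self.attitudes)
```

Once modules live in a structure like this, the completeness check that an auditor performs by reading the manual becomes a query over the programme.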
Read quickly, this looks like sensible pedagogical practice. Read carefully, it’s a list of documentary deliverables that most ANSPs still produce manually in Word and Excel. Which immediately raises the maintenance question: how do you guarantee consistency across instructors, sites, and intakes when documents are scattered?
Appendix B: the quality system, where things get harder
Appendix B is longer (20 sections), and it is where the robustness of an approval is really decided. Three concepts keep coming back.
The coherence matrix (section 7) is a tabulated document that lists every applicable regulatory requirement, with the corresponding internal process and the named manager responsible for each line. It must also point to the most recent audit and the next scheduled one. The text calls it a “powerful addition to the ATO’s compliance efforts”. Many ATOs confuse it with their procedures manual. They’re not the same thing.
Records (section 19) must be kept for at least three years, or longer where national requirements demand it. That covers audit schedules, inspection and audit reports, responses to findings, corrective and preventive action reports, follow-ups, and management review reports. Beyond storage, accessibility is the actual constraint. The text speaks of “accurate, complete and readily accessible records”. During a CAA audit, you get asked for documents. You don’t get the time to rebuild them.
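Both constraints, retention and accessibility, are easy to make explicit in code. A minimal sketch under stated assumptions: the record kinds and field names are hypothetical, and the retention check uses approximate 365-day years where a production system would apply exact calendar arithmetic and the applicable national rule:

```python
from dataclasses import dataclass
from datetime import date

RETENTION_YEARS = 3  # Appendix B minimum; national requirements may extend it

@dataclass
class QualityRecord:
    kind: str      # e.g. "audit report", "corrective action", "management review"
    created: date

def may_discard(rec: QualityRecord, today: date,
                retention_years: int = RETENTION_YEARS) -> bool:
    """True only once the record has passed the minimum retention period."""
    return (today - rec.created).days >= retention_years * 365

def by_kind(records: list[QualityRecord]) -> dict[str, list[QualityRecord]]:
    """Index records by kind: an auditor's request becomes a lookup, not a search."""
    index: dict[str, list[QualityRecord]] = {}
    for r in records:
        index.setdefault(r.kind, []).append(r)
    return index
```

The `by_kind` index is the point: storage satisfies section 19 on paper, but only retrieval speed satisfies it during an audit.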
Continuous improvement (section 17) requires a formal Plan-Do-Check-Act cycle, run by the quality manager and fed by the risk profile, the coherence matrix, corrective actions, and inspection reports. The logic is ISO 9001 quality management, applied to aviation training.
Section 20 adds the satellite ATO rule: a main ATO’s quality responsibility extends to its subcontractors. For an ANSP running several training sites or using third-party centres, that point deserves attention.
The real operational workload
Taken together, these two appendices describe a substantial documentary and traceability burden. On the ground, here’s what we typically observe:
- Training exercises are designed by instructors case by case, without formal weighting and with no guarantee of equivalence between intakes
- KSA assessments are captured on paper or in local files, with no centralised archive usable for audit
- The coherence matrix, where it exists, is an Excel file updated manually at unpredictable intervals
- Records are stored but hard to retrieve within audit timeframes
- Pedagogical quality monitoring is retrospective (after each intake), not continuous
None of these points is a deal-breaker on its own. But the accumulation makes the annual quality system update more time-consuming than the training itself, and weakens auditability under pressure.
Reading the appendices as data
Another way to read Appendices A and B is to ask: what can be structured as data rather than as documents?
The answer: almost everything related to pedagogical production and records. A performance objective is data. A weighted PCS is data. A KSA assessment is data. A control load is computable. An audit trail is exportable. The coherence matrix becomes a view onto that data, not a document maintained in parallel.
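To make "a weighted PCS is data" concrete, here is a minimal sketch of a weighted performance score computed directly from assessment records. The function name, the per-competency weighting scheme, and the input shapes are all assumptions for illustration, not the EDS implementation:

```python
def weighted_score(assessments: dict[str, float],
                   weights: dict[str, float]) -> float:
    """Weighted average of per-competency scores (hypothetical PCS weighting).

    `assessments` maps competency -> observed score for one trainee;
    `weights` maps competency -> its weight in the performance criteria scheme.
    """
    total = sum(weights[c] for c in assessments)
    return sum(assessments[c] * weights[c] for c in assessments) / total
```

With scores held as data, equivalence between intakes or between instructors stops being a matter of trust in the document and becomes something you can recompute.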
That reading guided the design of EDS, our platform for the digitalisation of the ATC training chain. Out of roughly fifty detailed requirements across Appendices A and B, around twenty are natively covered by EDS, fifteen are assisted, and the remainder is organisational (HR, SMS, facilities). That’s the honest ratio. The sections covered (client programmes, records, monitoring, pedagogical continuous improvement, multi-site operations) are the ones that consume the most time and are the most exposed during audits.
Going further
ICAO Doc 9841 remains the best operational checklist available to an ATO, even though it was written for the authorities. Re-reading it while asking what must stay a document and what can become data changes the nature of the burden it imposes.
To discuss practical readings of Doc 9841, Doc 9868, or compliance with EU regulation 2025/2143, please get in touch.
