DORA Gap Assessment Spreadsheet
A structured framework for assessing your compliance gaps across all five DORA pillars
A gap assessment is only useful if it is honest. An inflated score tells the board a comfortable story while the regulator builds a different one. The framework below is designed to give you an accurate picture of where you stand — so your remediation effort goes to the right places before an auditor points them out for you.
Bottom Line
Assess all five pillars. Score each control honestly: 0 (not in place), 1 (partially in place), 2 (fully in place). For every gap, assign a P1/P2/P3 priority, a named owner, and a target remediation date. P1 gaps are regulatory breach risks — escalate them to the board. A gap assessment without follow-through is just paperwork.
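To make the tracking concrete, the sketch below shows what one row of such a spreadsheet might look like as a record. It is a minimal illustration in Python; the Score and GapRecord names, fields, and status values are assumptions of this sketch, not anything DORA prescribes.

```python
from dataclasses import dataclass
from datetime import date
from enum import IntEnum


class Score(IntEnum):
    NOT_IN_PLACE = 0   # absent, or exists on paper but never tested
    PARTIAL = 1        # operating, but incomplete or not fully evidenced
    FULL = 2           # in place, tested, and evidenced


@dataclass
class GapRecord:
    pillar: int                # 1-5
    control_area: str          # e.g. "Backup policy (Art. 12)"
    score: Score
    priority: str              # "P1" | "P2" | "P3"
    owner: str | None          # a named individual, not a team; None = unassigned
    target_date: date | None   # None = no remediation date agreed yet
    status: str = "open"       # "open" | "in progress" | "closed"

    def is_breach_risk(self) -> bool:
        # Per the framework: any P1 gap that is not fully in place
        # is a regulatory breach risk and must reach the board
        return self.priority == "P1" and self.score < Score.FULL
```

Making owner and target_date nullable is deliberate: an unassigned owner or a missing date is itself a finding, not a formatting gap.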
What Auditors Will Actually Look For
- A gap assessment conducted at DORA go-live — and updated at least annually or after material changes.
- Evidence the assessment was honest: a perfect score across all pillars is a red flag, not a green one.
- P1 gaps with named owners, target dates, and progress tracking — not just a list of problems.
- Board visibility: evidence P1 gaps were escalated and the board understood the regulatory risk.
- Closed gaps with evidence of what was implemented — not just a status change in a spreadsheet.
Common Mistakes
- Scoring "partially in place" when a policy exists but the control has never been tested — this is 0, not 1.
- Assigning P2 to gaps in controls that are legally mandated under DORA (e.g. Arts. 5–16) — these are P1 regardless of remediation difficulty.
- No remediation tracking: the gap assessment is completed, filed, and nothing is done with it.
- Assessment updated only when forced — a live programme reassesses after every major incident, significant change, or regulatory interaction.
Pillar 1 — ICT Risk Management (Arts. 5–16)
The largest pillar. Score every control honestly — the board must see your real position.
| Control Area | Key Questions | Priority |
|---|---|---|
| Governance (Art. 5) | Has the board formally approved the ICT risk management framework (IRMF)? Is board-level ICT training documented in minutes? | P1 |
| ICT Risk Framework (Art. 6) | Is the framework documented, reviewed annually, and updated after major incidents? | P1 |
| Asset Identification (Art. 8) | Is a complete, current ICT asset inventory maintained and mapped to critical functions? | P1 |
| Protection controls (Art. 9) | Are MFA, patching SLAs, network segmentation, and encryption policies in place and tested? | P1 |
| Detection (Art. 10) | Is 24/7 monitoring in place for critical systems? Are detection use cases defined and tested? | P1 |
| BCP / DRP (Art. 11) | Are BCPs and DRPs tested? Have live DR tests met documented RTO/RPO targets? | P1 |
| Backup policy (Art. 12) | Are backups isolated from production? Is restoration tested — not just creation? | P1 |
| Post-incident learning (Art. 13) | Is there a formal post-incident review process? Are findings tracked to closure with evidence? | P2 |
Pillar 2 — ICT Incident Management & Reporting (Arts. 17–23)
| Control Area | Key Questions | Priority |
|---|---|---|
| Incident management process (Art. 17) | Is there a documented, tested process covering detection through to closure, including reporting to the CA? | P1 |
| Classification policy (Art. 18) | Are the six criteria documented, embedded in triage, and applied to every incident? | P1 |
| CA notification capability (Art. 19) | Is the CA notification channel confirmed? Have staff been trained on the 4-hour notification clock? | P1 |
| Report templates | Are initial, intermediate, and final report templates pre-approved and ready? | P1 |
| Voluntary threat notification (Art. 19(2)) | Is there a process for escalating significant cyber threats that have not yet caused an incident? | P2 |
Pillar 3 — Digital Operational Resilience Testing (Arts. 24–27)
| Control Area | Key Questions | Priority |
|---|---|---|
| Testing programme (Art. 24) | Is there a documented annual testing programme covering all DORA-required test types? | P1 |
| Vulnerability assessments (Art. 25) | Are VA scans conducted at least annually across critical systems? Are findings tracked? | P1 |
| Penetration testing | Is penetration testing conducted on systems supporting critical functions? How frequently? | P1 |
| TLPT obligation (Art. 26) | Has the entity confirmed its TLPT designation status with the CA in writing? | P1 |
| Test findings tracking | Are findings from all test types tracked to remediation with defined SLAs? | P2 |
Pillar 4 — ICT Third-Party Risk (Arts. 28–44)
| Control Area | Key Questions | Priority |
|---|---|---|
| Provider inventory (Art. 28) | Is the register of information complete, current, and ITS-compliant? | P1 |
| Pre-onboarding DD (Art. 28) | Is there a documented DD process applied before new ICT providers are onboarded? | P1 |
| Contractual provisions (Art. 30) | Do all ICT contracts include mandatory Art. 30 clauses? Have legacy contracts been updated? | P1 |
| Concentration risk (Art. 29) | Has concentration risk (single-provider, geographic, technology) been assessed and documented? | P1 |
| Exit strategies | Are exit strategies in place for all Tier 1 providers — with named alternatives? | P2 |
| Ongoing monitoring | Is there a structured annual review programme for Tier 1 providers with documented evidence? | P2 |
Pillar 5 — Intelligence Sharing (Art. 45)
| Control Area | Key Questions | Priority |
|---|---|---|
| Threat intelligence consumption | Does the entity consume current threat intelligence relevant to its sector and geography? | P2 |
| Sharing arrangement | Has the entity evaluated joining a sector threat intelligence sharing arrangement? | P3 |
| GDPR compliance | If sharing threat intelligence externally, has GDPR compliance been assessed for personal data content? | P2 |
Scoring and Readiness Calculation
- Score each control: 0 = not in place, 1 = partially in place, 2 = fully in place.
- Pillar score = (sum of scores ÷ maximum possible score) × 100.
- Overall readiness = weighted average across five pillars (weight Pillars 1 and 4 more heavily).
- Traffic light: 0–40% = Red (escalate to board), 41–70% = Amber (material gaps), 71–100% = Green (on track).
- Open P1 gaps in a Red pillar are regulatory breach risks — escalate them with a remediation plan and a named owner. The sketch after this list shows the full calculation.
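The sketch below walks the arithmetic end to end. It is illustrative only: the split in PILLAR_WEIGHTS is an assumption, since the framework says to weight Pillars 1 and 4 more heavily but not by how much.

```python
# Illustrative readiness arithmetic. The weights are an assumption:
# the framework says only that Pillars 1 and 4 weigh more heavily.
PILLAR_WEIGHTS = {1: 0.30, 2: 0.15, 3: 0.15, 4: 0.30, 5: 0.10}


def pillar_score(control_scores: list[int]) -> float:
    """(sum of control scores / maximum possible) * 100; each control is 0-2."""
    return sum(control_scores) / (2 * len(control_scores)) * 100


def overall_readiness(pillar_scores: dict[int, float]) -> float:
    """Weighted average of the five pillar percentage scores."""
    return sum(PILLAR_WEIGHTS[p] * s for p, s in pillar_scores.items())


def traffic_light(pct: float) -> str:
    if pct <= 40:
        return "Red"    # escalate to board
    if pct <= 70:
        return "Amber"  # material gaps
    return "Green"      # on track


# Example: the eight Pillar 1 control areas above, scored honestly
print(pillar_score([2, 1, 1, 2, 0, 1, 2, 1]))  # 62.5 -> Amber
```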
3-Step Action Checklist
1. This week: If you have not completed a formal gap assessment in the last 12 months, schedule one. Assign a named owner and a completion deadline. Do not delegate scoring to the team responsible for the controls — independent challenge produces more honest results.
2. This month: For every P1 gap in your most recent assessment, confirm a named owner, a target date, and a current progress status. If any P1 gap has no owner or no date, escalate immediately (see the filter sketch after this list).
3. This quarter: Present gap assessment results — including P1 gaps and remediation status — to the board or audit committee. Document their response. This is the evidence that the board is genuinely engaged with DORA compliance.
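Using the hypothetical GapRecord sketched earlier, the step-2 check reduces to a short filter. Again a sketch under the same assumed field names:

```python
def p1_gaps_to_escalate(gaps: list[GapRecord]) -> list[GapRecord]:
    """Open P1 gaps missing a named owner or a target date (step 2 above)."""
    return [
        g for g in gaps
        if g.priority == "P1"
        and g.status != "closed"
        and (g.owner is None or g.target_date is None)
    ]
```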
Need a DORA gap assessment?
Use our free readiness tool to identify your compliance gaps across all five DORA pillars.