DORA Regulation: Why Digital Resilience Starts With Human Behaviour
Compliance

DORA is in force, but technical compliance alone is not enough. Discover why human behaviour remains the greatest risk and how behavioural analysis is the key to genuine digital resilience in the financial sector.

Nexus-7 Security Team · Cybersecurity Experts
· March 11, 2026 10:02 · 6 min read

The DORA Regulation: A New Era for Financial Cybersecurity

The Digital Operational Resilience Act (DORA) has been in force since 17 January 2025 and is reshaping the rules for the entire European financial sector. From banks and insurers to payment service providers and crypto platforms, every entity must be able to demonstrate its digital resilience. Yet while most organisations focus on technical compliance, the greatest risk is often overlooked: human behaviour.

In this article, we explore what DORA means for your organisation, why technical measures alone fall short, and how behavioural analysis makes the difference between ticking a box and achieving genuine operational resilience.

What Is DORA and Who Does It Apply To?

DORA is a European regulation with direct applicability across all EU member states. Unlike a directive, DORA does not need to be transposed into national legislation — its rules apply immediately. The regulation is built on five core pillars:

  1. ICT risk management — Organisations must maintain a robust framework for identifying, protecting against, detecting, responding to, and recovering from ICT risks.
  2. Incident reporting — Significant ICT-related incidents must be reported to supervisory authorities in a timely manner.
  3. Digital operational resilience testing — Regular testing, including threat-led penetration testing (TLPT) for systemically important institutions.
  4. Third-party risk management — Strict requirements for managing ICT service providers, including cloud providers.
  5. Information sharing — Encouraging the exchange of threat intelligence between financial entities.

The regulation applies to more than 22,000 financial entities across the EU, plus their critical ICT service providers. The European supervisory authorities (EBA, EIOPA, and ESMA) continue to publish regulatory technical standards that flesh out the details.

The Blind Spot: The Human Element

DORA speaks extensively about technical measures, governance structures, and testing programmes. But read the regulation carefully, and you will find that human behaviour runs as a common thread through all five pillars.

Consider ICT risk management. Article 13 of DORA explicitly requires financial entities to establish ICT security awareness and training programmes. This goes beyond an annual e-learning module — DORA expects training to be tailored to the specific risks and roles within the organisation.

The reality on the ground? Research consistently shows that the vast majority of successful cyberattacks involve a human component. Phishing emails that get opened, passwords that are reused, sensitive documents sent to the wrong recipients, USB drives plugged into corporate machines. No firewall in the world catches these.

Why Standard Awareness Training Falls Short

Most organisations in the financial sector offer their employees periodic security awareness training. The problem? These programmes treat everyone the same. The compliance officer receives the same phishing simulation as the receptionist. The IT administrator follows the same module as the policy adviser.

This one-size-fits-all model ignores a fundamental insight from behavioural science: people respond differently to risk depending on their personality, workload, motivation, and cognitive style. Some employees are naturally cautious and sceptical, and they recognise phishing intuitively. Others tend to act quickly, bypass rules for the sake of efficiency, or hesitate to question authority. These are precisely the traits that make them vulnerable to social engineering.

Behavioural Analysis as a DORA Compliance Instrument

This is where Q-Method behavioural analysis enters the picture. This scientifically validated methodology maps the subjective risk perceptions and behavioural patterns of individual employees. Instead of guessing who is vulnerable, you measure it.

The Q-Method works as follows:

  • Employees rank statements about cybersecurity scenarios based on their personal beliefs and preferences
  • Statistical factor analysis identifies clusters of like-minded risk profiles
  • Targeted interventions are designed per profile, aligned with specific drivers and vulnerabilities
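The statistical step in the middle of this workflow can be sketched in a few lines. The example below is a minimal illustration, not the actual Nexus-7 implementation: the data is simulated, and a production Q-Method analysis would add a factor rotation (typically varimax) and significance checks on the loadings. The key idea it demonstrates is Q-methodology's inversion of ordinary factor analysis: you correlate *people* with each other rather than variables, so the resulting factors represent shared viewpoints, i.e. risk profiles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated Q-sorts: 12 employees each rank the same 20 cybersecurity
# statements from -4 (strongly disagree) to +4 (strongly agree).
# Two underlying "viewpoints" plus noise mimic two distinct risk profiles.
viewpoint_a = rng.integers(-4, 5, size=20)
viewpoint_b = rng.integers(-4, 5, size=20)
sorts = np.vstack(
    [viewpoint_a + rng.normal(0, 1, 20) for _ in range(6)]
    + [viewpoint_b + rng.normal(0, 1, 20) for _ in range(6)]
)

# Person-by-person correlation matrix (12 x 12): how similarly did
# each pair of employees rank the statements?
corr = np.corrcoef(sorts)

# Principal components of the person correlation matrix. A varimax
# rotation would normally follow; it is omitted here for brevity.
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])

# Assign each employee to the factor they load on most strongly,
# i.e. the risk profile that best matches their viewpoint.
profiles = np.argmax(np.abs(loadings), axis=1)
print(profiles)
```

In practice the interesting output is not the cluster labels themselves but the statement rankings that characterise each factor: those tell you *why* a profile is vulnerable, which is what the targeted interventions are built on.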

This delivers three concrete benefits for DORA compliance:

1. Demonstrable risk management (Pillar 1)
By making behavioural risks measurable, you can show supervisory authorities that your ICT risk management extends beyond technical measures. You have data on the human factor — and a plan to address it.

2. More effective training (Article 13)
Instead of generic training, you offer personalised programmes aligned with the actual risk profiles in your organisation. This is precisely what DORA means by training tailored to specific roles and risks.

3. Better incident prevention (Pillar 2)
By proactively identifying the employees most vulnerable to social engineering or unsafe behaviour, you prevent incidents rather than merely reporting them.

DORA and the Supply Chain: Behavioural Risks at Third Parties

An often underestimated aspect of DORA is the obligation to manage risks at ICT service providers. Your organisation may have everything in order internally, but if an employee at your cloud provider clicks a phishing link, you still have a problem.

The behavioural analysis approach can be extended to critical suppliers. By requiring your ICT partners to conduct comparable behavioural assessments, you create a chain of resilience that goes beyond contractual SLAs.

Practical Steps for DORA Compliance

Want to prepare your organisation for DORA with proper attention to the human element? Consider these steps:

  1. Map your behavioural risks — Conduct a Q-Method assessment to understand which risk profiles exist in your organisation.
  2. Segment your training — Design awareness programmes per risk profile instead of one-size-fits-all.
  3. Integrate behavioural data into your risk framework — Make human behaviour a measurable component of your ICT risk register.
  4. Test realistically — Combine technical pentests with social engineering simulations tailored to vulnerable profiles.
  5. Engage your suppliers — Set requirements for the behavioural awareness programmes of your critical ICT service providers.
  6. Measure and improve — Repeat the behavioural analysis periodically to track progress and identify new risks.
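Step 3 above, making human behaviour a measurable component of the ICT risk register, can be made concrete with a small data model. The sketch below is purely illustrative: the field names, metrics, and scoring formula are invented for this example and are not prescribed by DORA or by any particular GRC tool.

```python
from dataclasses import dataclass


@dataclass
class BehaviouralRisk:
    """Hypothetical risk-register entry for a human-factor risk."""

    risk_id: str
    description: str
    affected_profiles: list[str]     # profiles from the behavioural assessment
    phishing_failure_rate: float     # share of simulation clicks, 0.0-1.0
    training_completion_rate: float  # share of tailored training completed
    review_cycle_months: int = 6     # periodic re-assessment, per step 6

    def residual_score(self) -> float:
        """Naive residual risk: exposure reduced by training coverage."""
        return round(self.phishing_failure_rate * (1 - self.training_completion_rate), 3)


entry = BehaviouralRisk(
    risk_id="HR-001",
    description="Susceptibility to authority-based phishing",
    affected_profiles=["fast-acting", "authority-deferent"],
    phishing_failure_rate=0.18,
    training_completion_rate=0.75,
)
print(entry.residual_score())  # 0.18 * 0.25 = 0.045
```

The point is auditability: when a supervisor asks how you manage the human factor under Pillar 1, an entry like this shows a measured exposure, a mitigating control, and a review cadence, rather than a generic statement that awareness training exists.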

Conclusion: Compliance That Truly Protects

DORA compels the financial sector to take digital operational resilience seriously. But genuine resilience is more than a technical checklist. It is about understanding and influencing the behaviour of the people who interact with your systems every day.

Organisations that invest in scientifically grounded behavioural analysis — such as the Q-Method — comply not only with the letter of DORA, but with its spirit. They build a culture where cybersecurity is not an annual obligation, but an integral part of how people work.

The question is not whether your organisation falls under DORA. The question is whether you are ready to give the human element the attention it deserves.
