
AI meets DORA

  • oliverluerssen7
  • Feb 8
  • 3 min read

The use of artificial intelligence in the financial sector is steadily increasing, ranging from fraud detection and automated credit decisions to intelligent customer service solutions. Alongside these technological advances, regulatory requirements for IT security and operational stability are becoming more stringent. With the Digital Operational Resilience Act (DORA), the European Union has established a binding regulatory framework, applicable since 17 January 2025, that aims to strengthen the digital operational resilience of financial institutions and their ICT service providers. AI systems typically fall under DORA's definition of information and communication technology (ICT) and are therefore clearly within its scope.

Artificial intelligence can significantly contribute to strengthening digital resilience by enabling early detection of cyberattacks, identifying system anomalies, and supporting automated responses to security incidents. At the same time, the use of AI introduces new risks, including opaque decision-making processes, flawed or biased training data, model drift, and increased dependency on external technology providers. DORA directly addresses these challenges by requiring AI systems to be systematically integrated into the institution-wide ICT risk management framework. Financial institutions must identify, assess, and document AI-related risks and ensure that AI models are properly governed, continuously monitored, and regularly reviewed. AI must not be treated as a standalone innovation initiative but must be embedded within clearly defined governance structures.
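The continuous-monitoring obligation described above can be made concrete with a simple statistical drift check. The sketch below, in plain Python, computes a population stability index (PSI) between the distribution a model was validated on and the distribution seen in production. It is a minimal illustration only: the 0.2 alert threshold is a common rule of thumb, not a regulatory figure, and all names are invented for this example.

```python
import math

def population_stability_index(expected, actual, bins=10, eps=1e-6):
    """Compare a model's live distribution ('actual') against the baseline
    it was validated on ('expected'). Values above roughly 0.2 are a common,
    purely conventional signal of drift worth investigating."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins

    def frac(sample, i):
        left = lo + i * width
        right = lo + (i + 1) * width
        # the last bin is closed on the right so the maximum value is counted
        if i == bins - 1:
            n = sum(1 for x in sample if left <= x <= right)
        else:
            n = sum(1 for x in sample if left <= x < right)
        return max(n / len(sample), eps)  # eps avoids log(0) for empty bins

    return sum(
        (frac(actual, i) - frac(expected, i))
        * math.log(frac(actual, i) / frac(expected, i))
        for i in range(bins)
    )
```

In a governance process, a check like this would run on a fixed schedule, with breaches of the agreed threshold raised as findings into the institution-wide ICT risk management framework rather than handled informally by the model team.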

Another key element of DORA concerns incident management. Malfunctions or security breaches in AI-driven systems can have significant impacts on critical business processes and may constitute reportable ICT-related incidents. Organizations must therefore be able to detect disruptions in AI systems at an early stage, escalate them appropriately, and report them to supervisory authorities when required. This necessitates not only technical monitoring capabilities but also organizational processes that allow for thorough root cause analyses at the level of models, data, and underlying systems.
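As a rough illustration of such an escalation process, the hypothetical triage function below decides whether an AI-related disruption should be escalated as potentially reportable. The field names and thresholds are placeholders chosen for this sketch; the binding materiality criteria for major ICT incidents are laid down in DORA's technical standards and would need to be transposed from those, not from these numbers.

```python
from dataclasses import dataclass

@dataclass
class AIIncidentReport:
    system: str                      # e.g. "fraud-detection-model" (invented)
    affects_critical_function: bool  # does a critical business process depend on it?
    downtime_minutes: int
    clients_affected: int

def needs_regulatory_escalation(incident: AIIncidentReport,
                                downtime_threshold: int = 60,
                                client_threshold: int = 1000) -> bool:
    """Illustrative triage rule only: the actual materiality criteria for
    'major' ICT incidents come from the DORA technical standards, not from
    these placeholder thresholds."""
    if not incident.affects_critical_function:
        return False
    return (incident.downtime_minutes >= downtime_threshold
            or incident.clients_affected >= client_threshold)
```

The value of encoding even a simplified rule like this is that triage decisions become documented and repeatable, which supports the root cause analyses and supervisory reporting the regulation expects.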

DORA further requires regular testing of digital operational resilience. In the context of AI systems, this includes testing for scenarios such as data manipulation, model failures, service outages, or extreme input values. These tests are particularly important when AI is used in decision-critical or customer-facing processes. The objective is to identify vulnerabilities in advance and ensure that systems remain stable, reliable, and controllable even under stress conditions.
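A resilience test of this kind can start very simply: systematically feed a model extreme and malformed inputs and check that it neither crashes nor leaves its documented output range. The sketch below uses a deliberately unguarded toy scoring model (purely hypothetical, not any real system) to show how such a harness surfaces failures.

```python
import math

def toy_scoring_model(income, debt_ratio):
    """Hypothetical stand-in for a deployed model; deliberately unguarded
    against malformed inputs to demonstrate what the harness catches."""
    z = 1e-5 * income - 2.0 * debt_ratio
    return 1.0 / (1.0 + math.exp(-z))

def stress_test(model):
    """Run extreme and malformed inputs; record any case where the model
    crashes or returns a score outside the documented [0, 1] range."""
    cases = [
        ("typical", 50_000, 0.3),
        ("zero", 0, 0.0),
        ("extreme negative", -1e12, 0.0),   # overflows math.exp here
        ("not a number", float("nan"), 0.3),
    ]
    failures = []
    for name, income, debt_ratio in cases:
        try:
            score = model(income, debt_ratio)
            if math.isnan(score) or not 0.0 <= score <= 1.0:
                failures.append((name, f"out-of-range score {score!r}"))
        except Exception as exc:
            failures.append((name, f"crashed: {exc!r}"))
    return failures
```

Each recorded failure is a candidate vulnerability to fix before such a model touches a decision-critical or customer-facing process.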

The management of ICT third-party risks is also a central aspect of DORA. Many AI solutions rely on external cloud providers or AI platforms, which creates additional operational dependencies. DORA obliges financial institutions to ensure transparency regarding these dependencies, to establish appropriate contractual safeguards, and to develop viable exit strategies. Importantly, regulatory responsibility remains with the financial institution, even when AI capabilities are sourced as a service from third parties.
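One practical starting point for this transparency is a machine-readable register of third-party dependencies that can be queried for gaps. The minimal sketch below (all provider names invented) flags arrangements that support a critical function but lack a documented exit strategy, which is precisely the combination the third-party provisions target.

```python
from dataclasses import dataclass

@dataclass
class ThirdPartyEntry:
    provider: str                     # e.g. "cloud-ml.example" (invented)
    service: str
    supports_critical_function: bool
    exit_strategy_documented: bool

def review_register(entries):
    """Return entries that support a critical function but have no
    documented exit strategy -- the gaps to remediate first."""
    return [e for e in entries
            if e.supports_critical_function and not e.exit_strategy_documented]
```

A real register would carry far more detail (contractual clauses, subcontracting chains, substitutability assessments), but even this skeleton makes dependencies visible and auditable instead of scattered across contracts.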

Finally, DORA reinforces the role of senior management in overseeing digital risks. Responsibilities for AI systems must be clearly assigned, decision-making processes must be documented, and systems must be designed in a way that ensures traceability and auditability for supervisory authorities. When considered alongside other regulatory frameworks such as the EU AI Act, the GDPR, and existing national ICT supervision requirements, it becomes evident that the sustainable use of AI requires an integrated governance and compliance approach.

Overall, DORA should not be viewed as a barrier to innovation in artificial intelligence. Instead, it provides a structured framework that enables the secure, resilient, and responsible use of AI. Organizations that proactively align their AI initiatives with DORA requirements not only strengthen regulatory compliance but also enhance their digital resilience and long-term competitiveness in an increasingly technology-driven financial market.




Responsible for content:

Oliver Lürssen
