Maturity Assessments: Allocating AI Compliance Resources Smartly
4 min read
2025-04-09

Topic: AI Maturity | Jurisdiction: Global

Gregor Rutow
Global Data Protection Officer, Allianz Partners SAS

Executive Summary

  • Maturity assessments systematically evaluate compliance across legal entities by measuring both design and effectiveness of the compliance program.
  • Such assessments identify compliance gaps and can justify resource allocation by providing structured insights that support staffing, budgeting, and tool implementation decisions.
  • AI compliance obligations can be derived from sources like the EU AI Act, ISO/IEC 42001:2023, and NIST frameworks, but should be refined over time by incorporating case law, regulatory guidance, and jurisdiction-specific requirements.


Why conduct a maturity assessment for AI compliance?

A maturity assessment is a tool to measure how advanced and effective a function, process, or organisation is against defined standards. It helps identify strengths, close gaps, and drive improvement in processes. Originally used in software quality management, these assessments are now widely applied across industries and functions, including compliance.

Five years ago, we built a maturity assessment to measure data protection compliance across hundreds of group entities and prove “accountability”. Accountability is a requirement under the EU General Data Protection Regulation (GDPR) and demands proof of design and effectiveness across ~87 GDPR obligations—not just GDPR “on paper.” The maturity assessment tool delivered fast, effective results.

We are now applying this approach to the EU AI Act and other AI regulations worldwide. Like the GDPR, the EU AI Act is complex, has extra-territorial effect, and is hard to monitor across a group of companies. To avoid over- or under-investing, a maturity assessment will give us a clear, structured view of where the compliance program truly stands.

Maturity Assessments for AI: Where do they help?

What are the advantages?

In the compliance build phase,[1] a maturity assessment pinpoints gaps and tracks progress. It helps justify new resources or reallocation—something simple Key Performance Indicators (KPIs) cannot do.

For example, if an assessment flags privacy by design as weak, it may justify allocating engineering resources to strengthen it. Once new tools are implemented, maturity scores should rise.
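
To make this concrete, here is a minimal sketch of per-domain scoring in Python. The 0–5 scale, domain names, scores, and investment threshold are illustrative assumptions, not the scoring model actually used in the programme described here:

```python
# Minimal sketch: per-domain maturity scoring on a hypothetical 0-5 scale.
# Domain names, scores, and the 3.0 threshold are illustrative assumptions.
from statistics import mean

# Each domain aggregates the answers entities gave to its questions.
responses = {
    "privacy_by_design": [2, 1, 3, 2],   # weak: would justify new resources
    "transparency":      [4, 4, 3, 5],
    "human_oversight":   [3, 4, 4, 3],
}

THRESHOLD = 3.0  # illustrative cut-off for "needs investment"

for domain, scores in responses.items():
    avg = mean(scores)
    status = "invest" if avg < THRESHOLD else "ok"
    print(f"{domain:<20} {avg:.2f}  {status}")
```

Re-running the same calculation after new tools are implemented makes the expected rise in maturity scores directly visible over time.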

How did you start?

To define the obligations to measure against, we drew from the EU AI Act, AI standards like ISO/IEC 42001:2023,[2] and NIST controls.[3] We translated these rather technical obligations into clear, actionable questions—enabling teams across jurisdictions, from the US and Mexico to India and China, to respond with confidence.
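
A minimal sketch of what such a translation layer might look like in Python follows. The source frameworks are those named above; the question wording and data shape are assumptions made for illustration:

```python
# Minimal sketch of translating obligations into plain-language questions.
# The source references follow the frameworks named in the text; the
# question wording and data shape are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AssessmentItem:
    source: str      # where the obligation comes from
    obligation: str  # the legal/technical wording
    question: str    # the plain-language question teams answer

items = [
    AssessmentItem(
        source="EU AI Act, Art. 13",
        obligation="High-risk AI systems shall be sufficiently "
                   "transparent to deployers.",
        question="Can the people deploying your AI system see, in plain "
                 "language, what it does and what its limitations are?",
    ),
    AssessmentItem(
        source="ISO/IEC 42001:2023",
        obligation="Establish, implement, and maintain an AI management "
                   "system.",
        question="Does your entity have a named owner and a documented "
                 "process for managing its AI systems?",
    ),
]

for item in items:
    print(f"[{item.source}] {item.question}")
```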

How long does it take to build the assessment tool?

Building the initial question set is quick. Refinement takes longer. Adjustments need to be made to incorporate case law, regulatory guidance, and jurisdiction-specific requirements. For data protection, it took two to three years, and we expect a similar timeline for the AI maturity assessment.

Maturity Assessments for AI: Benchmarking Group Entities

What are the key challenges in implementing maturity assessments?

First, ensuring individuals provide accurate and complete answers. Second, defining clear evidence requirements to support maturity scores and agreeing on who provides what. Third, addressing jurisdictional differences while distinguishing between global and local compliance rules.
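
On the second challenge, one way to pin down evidence requirements is to attach them, together with an owner and a global/local scope flag, directly to each assessment question. A minimal sketch, with hypothetical question IDs, field names, and values:

```python
# Minimal sketch: each question carries its evidence requirements, an
# owner, and a global/local scope flag. Question IDs, field names, and
# values are hypothetical.
evidence_requirements = {
    "Q-PBD-01": {
        "question": "Is privacy by design applied to new AI use cases?",
        "evidence": ["DPIA records", "design review sign-offs"],
        "owner": "Local DPO",
        "scope": "global",  # applies group-wide
    },
    "Q-DSR-04": {
        "question": "Are data subject access requests answered on time?",
        "evidence": ["request log", "response-time report"],
        "owner": "Local privacy team",
        "scope": "local",   # jurisdiction-specific rule
    },
}

for qid, req in evidence_requirements.items():
    print(f"{qid} ({req['scope']}): owner={req['owner']}; "
          f"evidence={', '.join(req['evidence'])}")
```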

How do you handle differences in AI regulations across jurisdictions?

Our scoring accounts for regional differences. For example, early GDPR assessments showed lower scores in Hong Kong, where certain data subject rights were not mandated. We adjusted the calculations to avoid unfair penalization.

The same will be done for AI compliance. The system remains flexible to reflect evolving legal obligations.
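
A minimal sketch of such an adjustment: obligations marked inapplicable in a jurisdiction are excluded from the denominator, so an entity is not penalised for rules it is not subject to, as in the Hong Kong example above. The applicability map and scores are illustrative assumptions:

```python
# Minimal sketch of jurisdiction-adjusted scoring. The applicability map
# and scores are illustrative assumptions.
def adjusted_score(answers, applicable):
    """Average only over obligations that apply in this jurisdiction."""
    relevant = [score for obligation, score in answers.items()
                if applicable.get(obligation, True)]
    return sum(relevant) / len(relevant) if relevant else None

answers = {"dsr_access": 0, "record_keeping": 4, "transparency": 3}

# A data subject right not mandated locally is marked inapplicable
# rather than scored as a failure.
local_applicability = {"dsr_access": False,
                       "record_keeping": True,
                       "transparency": True}

print(adjusted_score(answers, local_applicability))  # 3.5 instead of 2.33
```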

How does an AI standards certification differ from a maturity assessment?

Maturity assessments help secure resources and track ongoing improvement. Certification, by contrast, confirms that compliance thresholds are met at a specific point in time, without ensuring continuous development. While not required for certification, maturity assessments are a valuable tool to prepare for it. Companies with mature compliance programs, sufficient budgets, and dedicated teams may run both in parallel. Others may find it too resource-intensive.

How much does a maturity assessment cost, and what types of organisations would find it useful?

External costs typically range from €20,000 to €30,000, plus internal resources and a tool to create audit trails.

Not every company needs a maturity assessment. Businesses using AI solely for internal tasks (e.g., HR automation) may not require this level of oversight. However, for consumer-facing companies handling sensitive data or making AI-driven decisions that impact customers, a maturity assessment is strongly recommended to assess the strength of the compliance program.

Any final thoughts?

Do not give up too early. For us, the main challenge was team resistance. Initially, many teams found the process too complex and time-consuming.

But once they saw the benefits, their view changed. The assessment helped them secure extra resources, especially for regional teams. It also gave them a clear baseline to report privacy efforts to management.

When teams received lower scores, this triggered quality assurance reviews. These reviews helped spot gaps and showed where improvements were needed.

Today, most teams value the assessment because it gives them structure, reduces uncertainty, and lowers personal risk in compliance work.

Gregor Rutow is the Global Data Protection Officer of Allianz Partners group. He is also an Editorial Committee Member of 20Minds.

Sources

  1. The “build phase” of compliance is the initial stage where an organization designs and implements the frameworks, policies, and processes required to meet regulatory and internal standards.
  2. ISO/IEC 42001:2023 is an international standard that provides a framework for organizations to establish, implement, maintain, and continually improve an AI management system.
  3. The AI Risk Management Framework is a set of guidelines from the National Institute of Standards and Technology (NIST) that outlines best practices for managing AI risks, focusing on reliability, security, and transparency.