Artificial Intelligence

How to comply with the AI Act and GDPR

Do you manage an AI project or use AI-integrated solutions, and are you wondering how to ensure the security and confidentiality of your data?

At Lexagone, every process, whether related to generative AI or high-risk AI systems, is analyzed based on security and compliance requirements (AI Act and GDPR).

Together, we transform regulatory requirements into strategic opportunities.

TALK TO AN EXPERT

What You Need to Know About the AI Act

The first question is: What is the risk level of your Artificial Intelligence System (AIS)? A simple yet crucial question for managing your AI project and regulatory governance (AI Act and GDPR). If you’re asking yourself the same question, you’re in the right place.

The AI Act (or Regulation on Artificial Intelligence) governs AI Systems by establishing a risk-based approach.

The AI Act Defines 4 Risk Levels

Minimal Risk AIS

These systems, such as spam filters, require no special measures.

Limited Risk AIS

Specific transparency obligations apply (e.g., chatbots and artificially generated content).

High-Risk AIS

These require in-depth risk management mechanisms. They include AI integrated into products already under market surveillance (medical devices, toys, vehicles, etc.) and AI used in eight specific areas, such as biometric systems and recruitment tools (Annex III of the AI Act).

Unacceptable Risk AIS

These systems are outright banned (e.g., social scoring, real-time biometric recognition in public spaces).
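Purely as an illustration of the tiered logic above, and not a legal classification tool, the four risk levels and the obligations just summarized can be sketched as a simple mapping (the Python names below are hypothetical):

```python
# Illustrative sketch only: a simplified mapping of the AI Act's four risk tiers
# to the obligations summarized above. Real classification requires a legal
# analysis of the system and its intended use.
from enum import Enum


class RiskLevel(Enum):
    MINIMAL = "minimal"            # e.g. spam filters
    LIMITED = "limited"            # e.g. chatbots, AI-generated content
    HIGH = "high"                  # e.g. Annex III areas, regulated products
    UNACCEPTABLE = "unacceptable"  # e.g. social scoring


OBLIGATIONS = {
    RiskLevel.MINIMAL: ["no specific measures"],
    RiskLevel.LIMITED: ["transparency obligations"],
    RiskLevel.HIGH: ["in-depth risk management", "technical documentation"],
    RiskLevel.UNACCEPTABLE: ["prohibited: may not be placed on the market"],
}


def obligations_for(level: RiskLevel) -> list[str]:
    """Return the simplified obligation list for a given risk tier."""
    return OBLIGATIONS[level]


if __name__ == "__main__":
    for level in RiskLevel:
        print(f"{level.value}: {', '.join(obligations_for(level))}")
```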

The AI Act Specifically Regulates Generative AI (General Purpose AI Models)

The AI Act imposes specific frameworks on these systems.

Why? Because these systems raise sensitive issues: transparency, harmful biases, and systemic risks.

To address these challenges, the AI Act aims to strike a balance between innovation and security.

Key Requirements to Remember:

  • Enhanced transparency: You must inform users that your content is AI-generated.
  • Mandatory risk assessment: Every generative AI project must include a thorough evaluation to anticipate biases and systemic risks.
  • Minimum documentation requirements: The AI Act requires clear reports explaining how your models function.
  • Risk mitigation measures: You must limit exposure to cyberattacks and reduce harmful biases.
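As a rough, hypothetical illustration of the transparency and documentation points above (none of the names below come from the AI Act itself), a generative AI output could be labelled and logged along these lines:

```python
# Illustrative sketch only (function and field names are hypothetical): one way
# to attach a user-facing "AI-generated" disclosure and keep a minimal
# documentation trail, in the spirit of the transparency and documentation
# requirements listed above. This is not a format prescribed by the AI Act.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class GeneratedContent:
    text: str
    model_name: str
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def with_disclosure(content: GeneratedContent) -> str:
    """Prepend a clear notice that the text was produced by an AI system."""
    return f"[AI-generated content - model: {content.model_name}]\n{content.text}"


def documentation_record(content: GeneratedContent) -> dict:
    """Minimal record that could feed the project's model documentation."""
    return {
        "model": content.model_name,
        "generated_at": content.created_at,
        "disclosure_shown": True,
    }


if __name__ == "__main__":
    sample = GeneratedContent(text="Draft summary...", model_name="internal-llm-v1")
    print(with_disclosure(sample))
    print(documentation_record(sample))
```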

Why is following a regulatory methodology important for generative AI projects?

Imagine your AI model producing biased content that harms users or decision-making. Beyond damaging your reputation, you could face scrutiny from regulators.

Compliance with the AI Act protects your business, your users, and your innovation.

What the AI Act Means for Your AI Projects

1. Know your classification
Is your AI system high-risk or limited-risk?

2. Risk assessment is mandatory for critical systems and some generative AI systems
This includes tools such as the Data Protection Impact Assessment (DPIA).

3. Obligations vary by risk level
Transparency, documentation, and increased monitoring may be required.

4. Non-compliance carries heavy penalties
Fines can reach 7% of global annual revenue or €35 million, whichever is higher, so it is best to prepare in advance. An illustrative calculation follows below.
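As a simple illustration of that ceiling (assuming the fine is capped at the higher of the two amounts), the calculation looks like this:

```python
# Purely illustrative calculation of the ceiling described above for the most
# serious infringements: 7% of global annual revenue or EUR 35 million,
# whichever is higher. Actual fines depend on the infringement and many factors.
def max_fine_eur(global_annual_revenue_eur: float) -> float:
    """Upper bound of the fine, in euros."""
    return max(0.07 * global_annual_revenue_eur, 35_000_000)


if __name__ == "__main__":
    # Example: a company with EUR 1 billion in global annual revenue
    print(f"{max_fine_eur(1_000_000_000):,.0f} EUR")  # 70,000,000 EUR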

Why Choose Lexagone for AI Act Compliance?

Our GDPR consulting firm has been supporting healthcare stakeholders (software publishers, medical device manufacturers, healthcare institutions, and GRADeS) since 2019 in the development and deployment of AI-integrated healthcare solutions.

Lexagone also continuously monitors industry developments by taking part in working groups, conferences, and round tables, such as a discussion with the Head of the AI Department at the CNIL on the impact of AI on the DPO profession (CNIL GDPR Day in Lille, 2024) and the AFCDP webinar in June 2024 on the experimentation with algorithmic cameras.

Thanks to its hands-on experience in sectors handling sensitive data and its ongoing exchanges with peers, Lexagone offers AMOA (project owner assistance) compliance services tailored to your needs and to the regulatory requirements of both the AI Act and the GDPR.

Our services cover the various types of AI projects and integration approaches you may choose:

  • Implementation of an in-house solution (to leverage industry expertise).
  • Integration of third-party solutions (AI integration via an existing service provider or selection of a new provider).
  • Internal use of generative AI (to enhance team productivity).

Our Services Meet 5 Key AI Project Needs

1. Lexagone conducts an initial assessment to evaluate:

  • Potential risks linked to AI use (AIS classification),
  • Existing processes (robustness and biases),
  • Specific business needs.

Regulatory aspect:

We identify compliance gaps with the AI Act & GDPR and provide tailored recommendations.

2. Based on the diagnostic assessment, Lexagone supports you in the design and integration of your AI System (AIS).

This phase includes:

  • Defining objectives,
  • Identifying appropriate technologies,
  • Structuring data governance processes.

Lexagone helps you select the most suitable AIS for your needs while considering regulatory constraints, including evaluating technology providers and ensuring solution compliance with security and data protection requirements.

Regulatory aspect:

Lexagone ensures that integration plans comply with:

  • Data minimization,
  • Consent management,
  • Rights of data subjects,
  • AI ethics principles.

Your role under the AI Act is also defined, whether as a provider, importer, distributor, deployer, or authorised representative.

3. Depending on the classification of the implemented AI systems (high-risk or not), Lexagone conducts a Data Protection Impact Assessment (DPIA) when necessary.

4. Lexagone supports you in drafting your AI governance documentation, ensuring compliance with regulatory requirements and best practices.

This includes:

  • AI Governance Policy
  • Preliminary risk and opportunity assessment procedure for AI projects
  • Documentation process for models, data, and algorithmic parameters
  • Validation procedure for AI Systems according to AI Act and GDPR compliance criteria
  • Update and maintenance procedures for AI Systems
  • Incident and malfunction management procedures for AI Systems
  • AI Administrator Charter
5. Lexagone offers awareness sessions for teams to prepare them for the use of AI-integrated tools while ensuring a clear understanding of the associated regulatory challenges. These sessions include specific modules on GDPR, risk management, and legal obligations related to AI usage.

OUR REFERENCES


Client Testimonials: Our Approach to AI Advisory Services

As part of the implementation of artificial intelligence software for medical imaging within the Hospital Group, I reached out to Lexagone's teams to help define the roles of the involved parties (data controller, joint controller, processor) and assess the pseudonymization process. This project allowed me to discover a multidisciplinary, expert, and agile team that was able to mobilize efficiently to meet our deadlines.

CIO, Hospital Group

Frequently Asked Questions (FAQ)

Implementation Timeline of the AI Act Provisions

Classification of AI Systems Under the AI Act

Risk pyramid under the European AI Regulation

Source: https://www.cnil.fr/fr/entree-en-vigueur-du-reglement-europeen-sur-lia-les-premieres-questions-reponses-de-la-cnil

Do You Have Questions? We Have the Answers

Let's talk about your compliance


Contact Information

Email: contact@lexagone.fr
Phone: +33 (0)972 169 310

Lexagone is present at:

  • Biarritz
  • Bordeaux
  • Grenoble
  • Lille
  • Lyon
  • Marseille
  • Montpellier
  • Nantes
  • Toulon

Our GDPR consulting firm offers external DPO services managed by teams of specialized legal experts to ensure controlled GDPR governance.

Member of

AFCDP, APSSIS, Club Décision DSI
