Step 5
Staying up to date: ongoing compliance in AI governance
1. Continuous monitoring and adjustment
The AI Act imposes requirements that make it essential for companies to continuously track regulatory changes, review their AI system portfolio, and adjust their AI governance strategies accordingly. As part of a comprehensive compliance approach, organisations should:
Implement ongoing regulatory monitoring
As the European Commission continues to refine and adjust the AI Act through delegated and implementing acts, companies must ensure their governance frameworks are agile enough to incorporate these updates. This includes staying aware of new guidelines, codes of practice, and technical standards issued by the newly established AI Office and national authorities.
Commit to regular governance review cycles
Companies will need to adopt a systematic review process, such as a five-step cycle, to reassess their AI systems, governance structure, and compliance processes. Regular review cycles will ensure that AI governance remains aligned with both current regulations and evolving best practices.
2. Navigating a complex regulatory landscape
The AI Act is part of a broader regulatory ecosystem, requiring a nuanced understanding of multiple overlapping requirements. Key regulations relevant to AI governance include:
- General Data Protection Regulation (GDPR): The GDPR remains central to AI-related data practices, particularly concerning personal data use in AI training. Compliance with GDPR is critical to avoiding data protection breaches and associated penalties.
- Cyber Resilience Act (CRA): This act introduces cybersecurity requirements directly impacting AI systems, especially for safeguarding against security vulnerabilities. Compliance will be essential for AI applications involving sensitive data or critical infrastructures.
- Data Act: The Data Act provides a framework for data sharing, especially relevant for companies using AI-based IoT devices or third-party training data. Understanding data access rights and limitations will be important for AI development and operations.
- AI/Product Liability Directive: These liability rules hold companies accountable for damage caused by their AI systems. Companies must evaluate their liability exposure and put safeguards in place to reduce risk.
- Digital Services Act (DSA): For AI content moderation applications, the DSA outlines obligations for platforms to monitor, report, and manage content responsibly. Understanding these requirements is essential to maintain platform integrity and avoid potential sanctions.
- Directive on Copyright in the Digital Single Market: This directive addresses licensing and compensation for rights holders, which is particularly relevant to AI-generated content. Creators and distributors of AI-generated content must stay informed of copyright requirements to ensure compliance and avoid infringement risks.