The Malta Gaming Authority (MGA) has opened a targeted consultation on a proposed AI Gaming Charter, setting out voluntary principles for the ethical and responsible use of artificial intelligence in licensed gaming operations.
The Charter, developed jointly with the Malta Digital Innovation Authority (MDIA), is framed as principles-based guidance rather than binding regulation. It is designed to sit alongside existing legal frameworks — including the EU Artificial Intelligence Act — while addressing the specific operational conditions of the gaming sector.
Voluntary Framing, Strategic Intent
The MGA has been building toward this consultation for several months. In January 2026, MGA CEO Charles Mizzi outlined the rationale in an interview with iGaming Business.
“As AI becomes increasingly embedded in operational, compliance and player-facing tools, it is essential that there is no ambiguity for operators about what good practice means in real terms.”
The voluntary nature of the Charter is deliberate. The MGA has positioned the framework as a tool for early alignment rather than a precursor to enforcement. For operators, the consultation is an opportunity to shape how responsible AI is defined in practice — before EU-level obligations move from broad principle into sector-specific requirements.
“Voluntary, for us, means creating space to lead. Operators who engage early are helping shape standards rather than reacting to fixed requirements later,” Mizzi said.
The regulator has been clear that the framework is intended as a practical reference rather than a conceptual policy exercise. It is aimed at a sector where AI is no longer experimental: it is embedded in day-to-day operations, from player risk modelling to fraud detection to marketing personalisation.
The MDIA Partnership
The involvement of the MDIA gives the Charter a technical foundation that extends beyond gaming regulation. The MDIA has run Malta’s national AI governance programme since 2019, when it launched what it described as the world’s first national AI certification programme — built around ethical alignment, transparency and social responsibility, and anticipating much of the structure later embedded in the EU AI Act.
That history makes the MDIA a credible co-author for a sector-specific framework. The collaboration reflects Malta’s approach to AI governance: co-ordinated across regulators rather than developed in isolation within a single sector.
Internal AI Deployment Running in Parallel
The Charter is not the MGA’s only AI initiative. The regulator has confirmed an internal AI implementation roadmap covering 2026 and 2027, targeting supervisory functions including AML, player support and financial compliance.
In AML and financial crime supervision, the MGA has indicated that AI tools are already being used to process large transaction datasets and identify anomalies that manual processes would miss. AI support for responsible gambling oversight is also under development, with tools being assessed for their ability to evaluate how closely licensees’ policies align with regulatory requirements. In financial compliance, automation is being explored to reduce manual error and accelerate reporting cycles.
The internal programme is relevant to the Charter in a practical sense: the MGA is applying the same principles internally, which gives it operational grounding when the framework moves toward finalisation.
What This Means for MGA Licensees
Malta hosts a high concentration of Europe’s leading online operators, suppliers and platform providers. Regulatory direction from the MGA typically carries weight beyond Malta’s jurisdiction — because many operators structure their EU compliance frameworks around their Maltese licence, MGA guidance often informs how responsible AI is interpreted across multiple markets simultaneously.
For operators currently integrating AI into player management, CRM, compliance monitoring or game recommendation systems, the consultation is the first formal opportunity to submit views on how those deployments should be governed. Operators who do not engage leave that definition to others.
The EU AI Act’s requirements for high-risk AI systems — including automated systems that affect individual consumer outcomes — apply across all licensed sectors. Gaming operators using AI for credit-risk assessment, player affordability checks or behavioural profiling will face specific obligations as the Act’s implementation timelines advance. The MGA’s Charter, even as a voluntary instrument, is intended to give licensees a gaming-specific interpretation of those obligations ahead of binding deadlines.
The MGA has not yet published a consultation deadline. The targeted format suggests it is directed at licensees and relevant industry bodies rather than open to general respondents.
For context on how European regulators are approaching cross-jurisdictional governance challenges, see the seven-regulator roundtable in Madrid and the rising cost of non-compliance analysis published in Q4 2025.
Source: Malta Gaming Authority