Our Blog

The Maccabi AI Scandal: Why Businesses Deploying Copilot Need a Governance Strategy

30 Mar 2026

A recent inquiry has put AI governance in the spotlight, and the findings should concern businesses across both the public and private sectors.

In February 2026, the Home Affairs Select Committee published a report into a public sector organisation’s decision to ban a group of overseas football fans from an event, based in part on intelligence that referenced a fabricated football match. The fictitious fixture had been generated by Microsoft Copilot, which the organisation had used in its research process without adequate oversight, verification, or policy controls.

The outcome: a senior leader resigned, a formal misconduct investigation was launched, and the organisation’s reputation for both competence and community trust took a hit.

Microsoft Copilot is embedded across the Microsoft 365 suite: it’s in Teams, Outlook, Word, and SharePoint. Millions of employees across the UK use it daily, and that familiarity means many of them don’t feel like they’re “using AI” at all.

What Went Wrong and Why It Matters Beyond This Case

The parliamentary committee found that AI-generated content was included in a formal decision-making document without being verified. There was no policy specifying how AI tools could be used in this context. There was no audit trail. And when challenged, senior leadership was unaware that AI had been involved at all.

This is not a case of rogue technology. It is about the gap between tool adoption and governance.

Sky News has reported that at least 21 public sector organisations are still using Copilot in similar contexts despite the findings. The absence of a national standard means that individual organisations are left to determine their own risk appetite, often without the frameworks, expertise, or specialist talent to do so effectively.

The Risk for Your Organisation

Microsoft Copilot is a powerful tool. Used well, it can accelerate research, surface insights, and reduce administrative burden. Used without proper controls, it can generate answers that are factually wrong, known in the industry as “hallucinations”.

This is not just a public sector problem: private sector organisations are also using Copilot in legal, commercial, HR, and client-facing contexts.

The questions every technology and business leader should be asking right now are:

  • Do we have a clear policy on where and how AI-generated content can be used?
  • Are our teams trained to critically evaluate AI outputs rather than accept them at face value?
  • Do we have audit and verification controls in place for important decisions?
  • Do we have the right people internally or through specialist support to govern this effectively?

Getting AI Governance Right: Where Peel Digital Can Help

At Peel Digital, we work with organisations across the public and private sectors to navigate these challenges. Whether you need consultancy support, or specialist talent to fill the digital and technology roles that responsible AI adoption demands, we can help.

Our consultancy team brings practical experience of the Microsoft ecosystem, including Copilot, Dynamics 365, and Power Platform, and can work with your organisation to assess your current AI usage, identify governance gaps, and build the policies and processes that protect you operationally and reputationally.

If you are scaling your digital team to meet growing demand, our talent solutions give you access to experienced digital professionals, on a permanent or contract basis, who understand both the technology and the governance requirements that come with it.

The Maccabi case is a warning, not a verdict on AI. The technology works. What organisations need now is the expertise and the structure to use it responsibly.

Get in touch to find out how Peel Digital can support your AI governance and digital talent strategy:

📞 01925 377 878

โœ‰๏ธ [email protected]
