AI error kept fans out of stadium
Fear-driven decision after synagogue attack
In October 2025, Birmingham’s Safety Advisory Group assessed security risks ahead of a Europa League match between Aston Villa and Maccabi Tel Aviv. The review took place amid heightened religious tensions following a deadly attack on a synagogue in Manchester.
According to reporting by Wirtualna Polska, West Midlands Police, a key member of the advisory group, recommended that the match be played without away supporters. Officers argued that allowing Israeli fans into the stadium could increase the risk of violence.
Claims from abroad shaped local policy
Police documents cited alleged incidents involving Maccabi Tel Aviv supporters during a previous match in Amsterdam. The report claimed fans had attacked Muslim communities and that Dutch authorities had required as many as 5,000 officers to restore order.
Dutch police later rejected those claims. According to statements from Amsterdam Police, the events had been exaggerated and the scale of deployment was significantly smaller than suggested.
A match that never happened
One of the most serious errors involved references to unrest during a match between West Ham United and Maccabi Tel Aviv. That fixture, however, had never taken place.
Despite the inaccuracies, the Aston Villa match on 6 November went ahead without Maccabi supporters in attendance.
Shifting explanations under parliamentary scrutiny
The decision soon reached Westminster. Craig Guildford, Chief Constable of West Midlands Police, was summoned to appear before members of the British Parliament.
In December and again in early January, Guildford denied that artificial intelligence tools had played any role in the decision-making process. On 6 January, he told lawmakers that officers had relied on internal databases and standard Google searches.
That explanation changed days later. In a letter dated 12 January, Guildford acknowledged that false information about the West Ham match had originated from Microsoft Copilot.
When automation meets accountability
The admission prompted calls from several MPs for Guildford’s resignation and for an independent audit of police technology use. Critics argue the case contradicts earlier assurances that AI systems were not being used in operational judgments.
Microsoft has not commented on the incident. The company has not addressed allegations that so-called AI hallucinations contributed to the flawed report.
According to parliamentary records and police correspondence reviewed by Wirtualna Polska, the case is now being cited as an early warning of how automated tools can influence real world security decisions when oversight fails.
Sources: Wirtualna Polska, British Parliament, West Midlands Police
