Hungary’s New Biometric Surveillance Laws Violate the EU AI Act
Written by Marina Linde de Jager – Legal Advisor & AI Ethics Specialist at AI for Change Foundation
Introduction
In March 2025, the Hungarian government enacted sweeping amendments that dramatically expand the state’s use of facial recognition technology (FRT), raising urgent questions about legality, transparency, and fundamental rights. These changes, which took effect on 15 April, allow police to deploy biometric identification tools not just in serious criminal investigations, but also for minor infractions and at public demonstrations, including LGBTQIA+ events like Budapest Pride. Civil society organisations, including the Civil Liberties Union for Europe, EDRi, the
European Center for Not-for-Profit Law (ECNL), and the Hungarian Civil Liberties Union (HCLU), argue that the amendments directly violate the EU’s Artificial Intelligence Act (AI Act) as well as the Charter of Fundamental Rights of the European Union.
What has changed?
Until recently, Hungarian law permitted the use of facial recognition technology only in cases involving infractions punishable by custodial sentences. Even this limited application raised human rights concerns. The legal amendments passed in March, however, broaden this scope dramatically, enabling law enforcement to use FRT in all types of infraction procedures, regardless of severity.
This means Hungarian police can now use biometric identification to detect individuals at public protests, including those attending a banned Pride march, or to pursue minor infractions such as jaywalking. Crucially, this surveillance draws on video footage that is routinely captured at public demonstrations.
What is Real-Time Biometric Identification and Why is it Regulated?
The EU’s Artificial Intelligence Act, adopted in 2024, introduced binding rules for the use of high-risk AI systems. One of the most critical provisions is the prohibition of real-time remote biometric identification (RBI) in publicly accessible spaces by law enforcement authorities, except under extremely limited circumstances such as preventing imminent terrorist attacks or locating missing persons.
Real-time RBI involves scanning and identifying individuals as they move through public spaces using facial or other biometric data, often without their knowledge or consent. Because of its deeply invasive nature, this form of surveillance has been heavily restricted under Article 5(1)(h) of the AI Act. Even where permitted, strict safeguards must apply, including prior authorisation and proportionality assessments.
Why Hungary’s New Law Conflicts with the AI Act
Although Hungary’s system relies on still images, such as CCTV footage, it enables automatic, near-instant comparisons with a government database. Hungarian police now have direct connections to this system, which is run by the Hungarian Institute for Forensic Sciences. Based on available analysis, the setup facilitates rapid identification of individuals at or near the time of public demonstrations.
Under the AI Act, identification counts as "real-time" when the capture of biometric data, the comparison, and the identification all occur without significant delay, which includes any system fast enough to influence behaviour during public events. Hungary's model appears to meet this definition. It does not involve retrospective processing of pre-recorded material generated independently of police monitoring. Instead, it enables automated comparisons using material that is generated specifically for surveillance purposes.
This is a critical distinction: while retrospective facial recognition is classified as “high-risk” (subject to regulatory controls taking effect in 2026), real-time biometric surveillance is already prohibited. Hungary’s implementation effectively crosses this line.
Impact on Civil Liberties
The most immediate concern is the chilling effect this surveillance regime could have on civic participation. When individuals know that attending a peaceful protest may result in their being scanned, identified, and potentially penalised, even for a minor infraction, many will choose not to attend at all.
This effect threatens the core rights guaranteed by both the AI Act and the EU Charter of Fundamental Rights: freedom of assembly, freedom of expression, and the right to privacy. Surveillance tools of this nature risk turning democratic public spaces into zones of quiet fear and self-censorship.
A Targeted Threat to LGBTQIA+ Communities
The legal amendments were passed just weeks before Budapest Pride and appear designed, in part, to criminalise LGBTQIA+ demonstrations. By enabling police to identify and track individuals attending banned or unofficial gatherings, the Hungarian state is weaponising AI-driven surveillance to marginalise specific communities.
This has profound implications not just for Hungary, but for the wider EU. It sends a message that powerful technologies can be used to monitor vulnerable groups under the guise of law enforcement, despite binding regulations intended to prevent exactly this outcome.
What Needs to Happen Next?
Hungary’s approach poses a direct challenge to the integrity of the EU AI Act. If the European Commission does not act swiftly, it risks allowing member states to ignore the law’s protections before they’ve had a chance to take full effect.
The AI Office of the European Commission, which is tasked with ensuring compliance with the AI Act and protecting citizens from AI-related risks, must urgently investigate the legal amendments and assess whether Hungary’s use of real-time biometric surveillance is lawful.
This is not only a Hungarian issue. The response, or lack thereof, from EU institutions will send a clear message to other governments across Europe. Will the AI Act be meaningfully enforced, or will it be undermined before implementation is complete?
Conclusion
Hungary’s newly expanded biometric surveillance laws are a litmus test for the European Union’s resolve. The AI Act was introduced to prevent the exact kind of rights violations that are now unfolding on the ground in Hungary. Without immediate oversight and enforcement, the EU risks normalising unlawful surveillance practices under the cover of innovation and public order.
The AI for Change Foundation stands with civil society partners across Europe in urging the European Commission to take action. Biometric surveillance, especially when deployed in protest contexts and against marginalised communities, has no place in a democratic society without strict adherence to fundamental rights protections.
The EU must treat this moment as more than a legal issue. It is a question of values, accountability, and the future of civil liberties in an age of artificial intelligence.
References
European Center for Not-for-Profit Law. (2025, April 28). Hungary’s new biometric surveillance laws violate the AI Act [Legal analysis]. ECNL.
https://ecnl.org/news/hungarys-new-biometric-surveillance-laws-violate-ai-act
Liberties.EU. (2025, June 24). Facial recognition to target Pride in Hungary: Civil society orgs call on the EU to commit to rights and rule of law. Liberties.EU.
https://www.liberties.eu/en/stories/hungary-facial-recognition-pride/45453
International Comparative Legal Guides. (2025, March 27). Hungary’s surveillance of Pride attendees may breach EU law. ICLG.
https://iclg.com/news/22439-hungary-s-surveillance-of-pride-attendees-may-breach-eu-law
Cybernews. (2025, May 9). Hungary’s biometric laws breach EU AI Act, endanger LGBTQ+ rights. Cybernews.
https://cybernews.com/news/hungarys-biometric-laws-endanger-lgbtq-rights/
AlgorithmWatch. (n.d.). Pride with Pride! Stop mass surveillance at Pride, stop face recognition now [Petition page]. AlgorithmWatch.
https://algorithmwatch.org/en/pridewithpride/