Yesterday, March 11, 2026, the European Parliament voted to restrict the mass scanning of private communications.
An amendment tabled by MEP Markéta Gregorová (Greens/EFA) passed by a narrow margin: any scanning of private communications must be limited to individual users or groups of users suspected by a competent judicial authority.
In the phrasing used by Patrick Breyer’s summary of the vote: no warrant, no scanning.
Primary/near-primary links (as compiled by Breyer):
- https://www.europarl.europa.eu/doceo/document/A-10-2026-0040-AM-004-006_DE.pdf (Amendment 5)
- https://www.patrick-breyer.de/wp-content/uploads/2026/03/ChatControl1-plenary.pdf (Mandate text as shared by Breyer)
- https://www.patrick-breyer.de/en/historic-chat-control-vote-in-the-eu-parliament-meps-vote-to-end-untargeted-mass-scanning-of-private-chats/ (Breyer’s write-up)
This is not the end of Chat Control.
But it is a political signal about a system that, looking at the available data, has never worked as advertised.
It’s worth going back to the beginning, because March 11 didn’t come out of nowhere.
## What Chat Control is, and why it’s been on the table for years
There are two tracks that tend to get blurred together in public debate:
- The permanent regulation proposal (often referred to as “Chat Control 2.0” by critics): the Commission’s 2022 proposal for a Regulation laying down rules to prevent and combat child sexual abuse (CSAR).
- The interim/temporary derogation (often referred to as “Chat Control 1.0” by critics): a time-limited carve-out from ePrivacy confidentiality rules that allows providers to voluntarily detect/report CSAM.
The interim derogation is Regulation (EU) 2021/1232.
Breyer’s summary of the March 11 vote frames it as a mandate concerning this interim system, under time pressure because that regime is set to expire in early April 2026 (he cites April 6):
This “expiring regime” is also referenced in an EU institutional document: the EDPS notes that the Interim Regulation is set to expire, and discusses a proposal to extend it until 3 April 2028.
(The “1.0 / 2.0” labels are political shorthand, not official EU names.)
## A timeline (why this keeps coming back)
This dossier has been declared “dead” more than once.
A few waypoints that are directly linkable from Breyer’s document pool:
- 22 Nov 2023: the European Parliament adopts its position on the permanent CSAR proposal (EDRi write-up):
- 14 Oct 2025: planned Council (JHA) vote was removed from the agenda (Breyer links the Council agenda document):
- Oct 2025: Germany’s justice minister states Germany “will not agree” to mass scanning legislation (reported by POLITICO, linking to her post on X):
- Oct 2025: Denmark backs away from mandatory detection orders and shifts to a voluntary-detection posture (reported by Euractiv):
Even if you ignore every opinion piece around it, the procedural pattern is visible:
- the vote date moves
- the proposal gets softened or rebranded
- the same core conflict returns: bulk scanning vs. confidentiality/encryption
## The track record of the “voluntary” version
Before discussing what should replace the current system, it’s worth looking at what the current system has produced.
Patrick Breyer cites the European Commission’s own implementation report on the interim derogation as containing some of the most damning admissions.
The report he links is:
- European Commission report (COM(2025) 740 final): https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52025DC0740
Breyer summarizes the report’s findings as follows:
- Around 99% of reports sent to police in Europe come from Meta.
- Germany’s Federal Criminal Police Office (BKA) reports that, out of roughly 300,000 chats reported annually, 48% are false positives / criminally irrelevant.
- 40% of investigations in Germany target minors (e.g., consensual sexting), rather than organised networks.
- The number of reports has dropped by 50% since 2022, which Breyer attributes to increasing end-to-end encryption adoption and migration.
And a separate Breyer post about the implementation report:
I’m deliberately phrasing these as “Breyer summarizes the Commission report as…” because I’m not going to pretend second-hand numbers are first-hand evidence.
But even as second-hand summaries of a Commission document, the shape of the story is consistent:
- one company dominates the reporting pipeline
- error rates are high enough to become a system property
- encryption adoption changes what the system can even see
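The “error rates become a system property” point is a base-rate effect, and a quick back-of-the-envelope calculation shows why. The numbers below are purely illustrative assumptions (not taken from the Commission report): even a scanner with a 0.1% false-positive rate, run untargeted over enough messages with a tiny fraction of actually illicit content, produces flags that are overwhelmingly false.

```python
# Illustrative base-rate arithmetic (hypothetical numbers, NOT figures
# from the Commission report): why untargeted mass scanning yields
# mostly false positives even with a seemingly accurate classifier.

def flag_stats(messages, prevalence, sensitivity, false_positive_rate):
    """Return (true_flags, false_flags, precision) for one scanning pass."""
    illicit = messages * prevalence          # messages that are actually illicit
    innocent = messages - illicit
    true_flags = illicit * sensitivity       # correctly flagged
    false_flags = innocent * false_positive_rate  # innocent users flagged
    precision = true_flags / (true_flags + false_flags)
    return true_flags, false_flags, precision

# Assume 1 in a million messages is illicit, and a scanner with 99%
# sensitivity and a 0.1% false-positive rate (both generous guesses).
tp, fp, precision = flag_stats(1_000_000_000, 1e-6, 0.99, 0.001)
print(f"true flags: {tp:,.0f}, false flags: {fp:,.0f}, precision: {precision:.2%}")
```

Under these assumptions, roughly a thousand false flags are generated for every true one; no amount of downstream policy fixes that ratio, because it is set by the base rate, not by police diligence.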
## The technical problem can’t be solved by policy
There’s a recurring argument in Chat Control debates: the idea that the challenge is finding a balance between security and privacy.
But in end-to-end encrypted systems, the deeper problem is architectural: the provider has no access to plaintext messages at all.
That’s not a missing API.
That’s the point.
One of the more aggressive workarounds proposed in policy debates is client-side scanning: scanning content on the user’s device before encryption.
A detailed technical analysis of client-side scanning risks is:
- Bugs in our Pockets: The Risks of Client-Side Scanning (Abelson et al.): https://arxiv.org/abs/2110.07450
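Mechanically, client-side scanning is simple to sketch, which is part of why it keeps resurfacing in policy debates. The sketch below is a deliberately simplified, hypothetical illustration (real deployments use perceptual hashes rather than exact cryptographic hashes, so matching is fuzzier and error-prone): the device checks content against a fingerprint blocklist before encryption ever happens.

```python
# Hypothetical, simplified sketch of the client-side scanning idea.
# Real systems use perceptual hashing (NeuralHash-style), not SHA-256;
# exact hashing is used here only to keep the structure visible.
import hashlib

# The blocklist is supplied by an authority, not by the user.
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def scan_before_encrypt(content: bytes) -> bool:
    """Return True if content matches the on-device blocklist."""
    return hashlib.sha256(content).hexdigest() in BLOCKLIST

# The core critique in "Bugs in our Pockets": the check runs on
# plaintext, so end-to-end encryption no longer bounds what is visible,
# and whoever controls BLOCKLIST controls what every device reports.
```

The design choice the paper attacks is visible in the sketch: the scanning step sits before encryption, and the blocklist is opaque to the user, so the same machinery works unchanged for any content a list-provider later decides to target.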
Client-side scanning also has a very concrete “this seemed like a good idea until it touched the real world” example.
Apple announced a CSAM detection system for iCloud Photos in 2021 and later abandoned it. CNN quotes Apple as saying it had “decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos.”
The Verge’s reporting also describes this as Apple ending development of the client-side scanning plan.
And the “encryption backdoors aren’t a surgical tool” point shows up in case law too.
The European Court of Human Rights held (in Podchasov v. Russia) that a decryption requirement “cannot be regarded as necessary in a democratic society”.
HUDOC entry:
## Legal signals inside the EU system
In November 2023, the European Parliament adopted its position on the CSAR dossier.
EDRi describes the Parliament’s position as rejecting mass scanning of private messages and instead requiring reasonable suspicion:
EDRi also links to the Council Legal Service warning (April 2023) that the original proposal would violate the essence of the right to privacy:
(Podchasov is not “about Chat Control”, but it is directly relevant to the recurring claim that you can mandate decryption or degraded encryption and still stay within fundamental-rights constraints.)
## The lobbying problem
I’m not describing a conspiracy.
I’m describing a documented incentive structure.
One part is the “revolving door” pattern.
Breyer summarizes an EU Ombudsman decision as finding maladministration in Europol’s handling of a staff member’s move to Thorn:
- https://www.patrick-breyer.de/en/chat-control-eu-ombudsman-criticises-revolving-door-between-europol-and-chat-control-tech-lobbyist-thorn/
- Ombudsman decision page: https://www.ombudsman.europa.eu/en/decision/en/200017
Another part is direct, documented lobbying and coalition-building around the 2022 proposal.
A detailed investigative report by Balkan Insight describes, among other things:
- a close working relationship between Commissioner Johansson’s services and Thorn in 2022
- lobbying spend figures (e.g. Thorn paying FGS Global at least €600k in 2022)
- the role of WeProtect Global Alliance (including Commission participation)
- funding flows involving Oak Foundation grants to organisations campaigning for the proposal
Source:
## What happens now
The March 11 vote is a negotiating mandate, not a law.
According to Breyer’s write-up of the vote, trilogue negotiations are set to begin immediately, under time pressure from the interim regulation’s expiry.
Breyer also writes that the Commission and “the vast majority” of the Council have so far rejected restrictions on untargeted scanning (his phrasing), with Italy called out as an exception.
Source:
This dossier has survived on two things:
- procedural persistence
- rebranding
“Voluntary scanning.”
“Upload moderation.”
“Age verification.”
New labels for ideas that often converge on the same outcome: bulk surveillance of private communications.
The question now is whether the Parliament’s mandate survives the closed-door reality of trilogues.
## References
- Patrick Breyer (vote summary): https://www.patrick-breyer.de/en/historic-chat-control-vote-in-the-eu-parliament-meps-vote-to-end-untargeted-mass-scanning-of-private-chats/
- EDRi (Parliament position, Nov 2023): https://edri.org/our-work/csar-european-parliament-rejects-mass-scanning-of-private-messages/
- Breyer (Chat Control dossier / document pool): https://www.patrick-breyer.de/en/posts/chat-control/
- Commission implementation report (linked in Breyer’s document pool): https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52025DC0740
- Ombudsman / Europol ↔ Thorn: https://www.patrick-breyer.de/en/chat-control-eu-ombudsman-criticises-revolving-door-between-europol-and-chat-control-tech-lobbyist-thorn/
- Balkan Insight investigation (Thorn/WeProtect/Oak): https://balkaninsight.com/2023/09/25/who-benefits-inside-the-eus-fight-over-scanning-for-child-sex-content/
- Council agenda document (Oct 2025): https://data.consilium.europa.eu/doc/document/ST-13309-2025-INIT/en/pdf
- POLITICO (Germany justice minister statement): https://www.politico.eu/article/germany-split-online-eu-child-abuse-scanning-bill-pressure-mounts/
- Euractiv (Denmark backs away from mandatory detection orders): https://www.euractiv.com/news/danish-presidency-backs-away-from-chat-control/
- EDPS Opinion 7/2026 (Interim Regulation set to expire; extension proposal to 3 April 2028): https://www.edps.europa.eu/data-protection/our-work/publications/opinions/2026-02-16-edps-opinion-72026-regulation-extending-application-regulation-eu-20211232_en
- CNN (Apple drops iCloud CSAM detection tool): https://www.cnn.com/2022/12/08/tech/apple-csam-tool
- The Verge (Apple drops CSAM scanning plans): https://www.theverge.com/2022/12/7/23498588/apple-csam-icloud-photos-scanning-encryption
- Bugs in our Pockets (CSS analysis): https://arxiv.org/abs/2110.07450