Chat Control: the European Parliament says no. For now.

Yesterday, March 11, 2026, the European Parliament voted to restrict the mass scanning of private communications.

An amendment tabled by MEP Markéta Gregorová (Greens/EFA) passed by a narrow margin: any scanning of private communications must be limited to individual users or groups of users under suspicion, and authorised by a competent judicial authority.

In the phrasing used by Patrick Breyer’s summary of the vote: no warrant, no scanning.

Primary/near-primary links (as compiled by Breyer):

This is not the end of Chat Control.

But it is a political signal about a system that, looking at the available data, has never worked as advertised.

It’s worth going back to the beginning, because March 11 didn’t come out of nowhere.

🔗What Chat Control is, and why it’s been on the table for years

There are two tracks that tend to get blurred together in public debate:

  • The permanent regulation proposal (often referred to as “Chat Control 2.0” by critics): the Commission’s 2022 proposal for a Regulation laying down rules to prevent and combat child sexual abuse (CSAR).
  • The interim/temporary derogation (often referred to as “Chat Control 1.0” by critics): a time-limited carve-out from ePrivacy confidentiality rules that allows providers to voluntarily detect/report CSAM.

The interim derogation is Regulation (EU) 2021/1232:

Breyer’s summary of the March 11 vote frames it as a mandate concerning this interim system, under time pressure because that regime is set to expire in early April 2026 (he cites April 6):

This “expiring regime” is also referenced in an EU institutional document: the EDPS notes that the Interim Regulation is set to expire, and discusses a proposal to extend it until 3 April 2028.

(The “1.0 / 2.0” labels are political shorthand, not official EU names.)

🔗A timeline (why this keeps coming back)

This dossier has been declared “dead” more than once.

A few waypoints that are directly linkable from Breyer’s document pool:

Even if you ignore every opinion piece around it, the procedural pattern is visible:

  • the vote date moves
  • the proposal gets softened or rebranded
  • the same core conflict returns: bulk scanning vs. confidentiality/encryption

🔗The track record of the “voluntary” version

Before discussing what should replace the current system, it’s worth looking at what the current system has produced.

Patrick Breyer cites the European Commission’s own implementation report on the interim derogation as containing some of the most damning admissions.

The report he links is:

Breyer summarizes the report’s findings as follows:

  • Around 99% of reports sent to police in Europe come from Meta.
  • Germany’s Federal Criminal Police Office (BKA) reports that, out of roughly 300,000 chats reported annually, 48% are false positives / criminally irrelevant.
  • 40% of investigations in Germany target minors (e.g., consensual sexting), rather than organised networks.
  • The number of reports has dropped by 50% since 2022, which Breyer attributes to growing adoption of end-to-end encryption and user migration to encrypted services.

Source:

And a separate Breyer post about the implementation report:

I’m deliberately phrasing these as “Breyer summarizes the Commission report as…” because I’m not going to pretend second-hand numbers are first-hand evidence.

But even as second-hand summaries of a Commission document, the shape of the story is consistent:

  • one company dominates the reporting pipeline
  • error rates are high enough to become a system property
  • encryption adoption changes what the system can even see
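The "error rates become a system property" point follows from base-rate arithmetic: scanning applied indiscriminately to billions of overwhelmingly innocuous messages yields false positives on the same order as (or far above) true positives, even with an unrealistically good classifier. A rough illustration, with all numbers hypothetical rather than taken from any report:

```python
# Base-rate arithmetic for bulk scanning. All numbers below are
# hypothetical illustrations, not figures from the Commission report.
daily_messages = 10_000_000_000   # messages scanned per day
prevalence = 1e-6                 # fraction that is actually illegal material
tpr = 0.99                        # true positive rate (sensitivity)
fpr = 0.001                       # false positive rate (99.9% specificity)

true_hits = daily_messages * prevalence * tpr
false_hits = daily_messages * (1 - prevalence) * fpr

# Precision: of everything flagged, how much is actually relevant?
precision = true_hits / (true_hits + false_hits)
print(f"true hits:  {true_hits:,.0f}")
print(f"false hits: {false_hits:,.0f}")
print(f"precision:  {precision:.2%}")
```

Even with 99.9% specificity, the flagged pool is dominated by false positives, because the innocuous traffic outnumbers the targeted material by a factor of about a million. High false-positive shares at the investigation stage are not an implementation bug to be patched away; they are what the arithmetic predicts.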

🔗The technical problem can’t be solved by policy

There’s a recurring framing in Chat Control debates: that the challenge is simply finding the right balance between security and privacy.

That framing assumes scanning is a dial providers can turn up or down. In end-to-end encrypted systems, the deeper problem is that the provider has no access to plaintext messages at all.

That’s not a missing API.

That’s the point.
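A minimal sketch of why the provider has nothing to scan. The cipher below is a toy XOR-keystream stand-in for a real AEAD such as AES-GCM, and the key agreement (in real systems, a protocol like X3DH) is elided; the structural point is only that the server relays ciphertext it cannot invert:

```python
import hashlib

def toy_stream_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy XOR keystream cipher, a stand-in for a real AEAD (e.g. AES-GCM).
    Do not use for anything real. Encrypting and decrypting are symmetric."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob share a key the server never sees (derived in practice
# via a key-agreement protocol; elided here).
shared_key = b"\x01" * 32
nonce = b"\x02" * 12

ciphertext = toy_stream_cipher(shared_key, nonce, b"private message")

# The server only ever relays `ciphertext`. Without shared_key, no
# server-side filter, however accurate, ever sees the plaintext.
plaintext = toy_stream_cipher(shared_key, nonce, ciphertext)
assert plaintext == b"private message"
```

Server-side scanning mandates therefore don't ask providers to flip a switch; they ask for an architecture in which the key material is no longer exclusive to the endpoints.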

One of the more aggressive workarounds proposed in policy debates is client-side scanning: scanning content on the user’s device before encryption.
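The mechanism can be sketched in a few lines. This is a hedged simplification: exact hashing is used below for clarity, whereas real proposals rely on fuzzy perceptual hashes (PhotoDNA, Apple's NeuralHash), which match near-duplicates and can collide; the blocklist and function names here are hypothetical:

```python
import hashlib

# Hypothetical on-device blocklist of known-content hashes, distributed by
# a provider or authority. Real systems use perceptual hashes that match
# near-duplicates, which is where false positives and collisions enter.
BLOCKLIST = {hashlib.sha256(b"known illegal file").hexdigest()}

def send_message(payload: bytes) -> str:
    # Client-side scanning: the check runs on the *plaintext*, on the
    # user's own device, before any end-to-end encryption is applied.
    digest = hashlib.sha256(payload).hexdigest()
    if digest in BLOCKLIST:
        return "reported"          # flagged content is reported upstream
    return "encrypted-and-sent"    # everything else proceeds as E2EE

assert send_message(b"known illegal file") == "reported"
assert send_message(b"dinner at 8?") == "encrypted-and-sent"
```

The contested part is the first line of `send_message`: it inspects plaintext before encryption, so the confidentiality guarantee of E2EE no longer bounds what whoever controls the blocklist can target.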

A detailed technical analysis of client-side scanning risks is:

Client-side scanning also has a very concrete “this seemed like a good idea until it touched the real world” example.

Apple announced a CSAM detection system for iCloud Photos in 2021 and later abandoned it. CNN quotes Apple as saying it had “decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos.”

The Verge’s reporting also describes this as Apple ending development of the client-side scanning plan.

And the “encryption backdoors aren’t a surgical tool” point shows up in case law too.

The European Court of Human Rights held (in Podchasov v. Russia) that a decryption requirement “cannot be regarded as necessary in a democratic society”.

HUDOC entry:

(That case is not “about Chat Control”, but it is relevant to the recurring claim that you can mandate decryption / degraded encryption and still stay within fundamental-rights constraints.)

In November 2023, the European Parliament adopted its position on the CSAR dossier.

EDRi describes the Parliament’s position as rejecting mass scanning of private messages and instead requiring reasonable suspicion:

EDRi also links to the Council Legal Service warning (April 2023) that the original proposal would violate the essence of the right to privacy:

🔗The lobbying problem

I’m not describing a conspiracy.

I’m describing a documented incentive structure.

One part is the “revolving door” pattern.

Breyer summarizes an EU Ombudsman decision as finding maladministration in Europol’s handling of a staff member’s move to Thorn:

Another part is direct, documented lobbying and coalition-building around the 2022 proposal.

A detailed investigative report by Balkan Insight describes, among other things:

  • a close working relationship between Commissioner Johansson’s services and Thorn in 2022
  • lobbying spend figures (e.g. Thorn paying FGS Global at least €600k in 2022)
  • the role of WeProtect Global Alliance (including Commission participation)
  • funding flows involving Oak Foundation grants to organisations campaigning for the proposal

Source:

🔗What happens now

The March 11 vote is a negotiating mandate, not a law.

According to Breyer’s write-up of the vote, trilogue negotiations are set to begin immediately, under time pressure from the interim regulation’s expiry.

Breyer also writes that the Commission and “the vast majority” of the Council have so far rejected restrictions on untargeted scanning (his phrasing), with Italy called out as an exception.

Source:

This dossier has survived on two things:

  • procedural persistence
  • rebranding

“Voluntary scanning.”

“Upload moderation.”

“Age verification.”

New labels for ideas that often converge on the same outcome: bulk surveillance of private communications.

The question now is whether the Parliament’s mandate survives the closed-door reality of trilogues.

🔗References