The EU Commission is planning what Apple abandoned after a backlash from privacy groups: automatic CSAM scanning of your private communication.

Act now to stop surveillance!

The European Commission has published a draft law that would force companies to scan private messages and images for CSAM - just like Apple once proposed. This opens the door to general surveillance and must be stopped.
Only last year, Apple wanted to introduce AI-based CSAM scanning on iPhones, turning people’s devices into little surveillance machines. After an international public outcry, the company rolled back its plans. Now the EU Commission has published very similar plans that would become mandatory for all companies offering communication services in Europe. We must act now to stop these plans.

Apple-like CSAM surveillance in Europe

What are the EU Commission’s plans?

The plan presented by the EU Commission entails AI-based scanning of all messages and images for CSAM (child sexual abuse material) directly on citizens’ devices. This so-called client-side scanning would be an attack on any confidential communication.
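To make concrete what “client-side scanning” means, here is a minimal, purely illustrative Python sketch. Real proposals rely on perceptual hashing (such as Microsoft’s PhotoDNA or Apple’s NeuralHash) rather than the exact cryptographic hash used here, and all names below are hypothetical:

```python
# Illustrative sketch only: real systems use perceptual hashes that
# tolerate resizing and re-compression; SHA-256 stands in here just
# to show the matching logic. All names are hypothetical.
import hashlib

# Hypothetical blocklist of hashes of known illegal images,
# pushed to the device by the provider or an authority.
BLOCKLIST = {
    # SHA-256 of the byte string b"test", used as a stand-in entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_outgoing_image(image_bytes: bytes) -> bool:
    """Runs on the user's own device before a message is sent.
    A True result would trigger a report to the provider."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

# Every image leaving the device passes through the scanner:
print(scan_outgoing_image(b"test"))         # True: matches the listed hash
print(scan_outgoing_image(b"holiday pic"))  # False: no match
```

The crucial point is where this code runs: not on a server, but on the user’s phone or laptop, inspecting content before it is ever encrypted or sent.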

Apple’s CSAM scanning plans

The EU plan is very similar to what Apple wanted to introduce in 2021: Client-side CSAM scanning directly on your Apple device, e.g. your iPhone.

“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering at the University of Cambridge, to the Financial Times in 2021. Cryptography professor Matthew Green says: “This will break the dam — governments will demand it from everyone.”

The immense public outcry by privacy activists around the world made Apple stop its plan for CSAM scanning on people’s own phones.

New surveillance plans

Now the European Commission is introducing the very same plans, trying to turn communications providers into deputy sheriffs for the authorities and enabling general mass surveillance of all our emails and chat messages.

Scanning people’s devices for material that depicts the sexual abuse of children amounts to nothing less than bugs – or security weaknesses – embedded in the system, permanently searching for specified content. Exactly what would be searched for is not yet defined. But what could possibly go wrong?

Let’s explore this thought a little further!

What could go wrong?

  1. First of all, we would all be secretly monitored - all the time. The list of images and content being searched for can be adapted: once a law forces communication providers to implement client-side scanning, the tool could in principle search for anything, and the list can be expanded on request. Initially, it is child sexual abuse material - as always when the broadest possible consensus is needed for new surveillance capabilities. Then come terrorists, human traffickers, drug dealers, and so on and so forth.

  2. Based on the draft by the European Commission, it is not clear who defines the list or who gets access to the content being searched for. We can assume, however, that it will be at least all European governments - including countries such as Poland (which recently introduced a near-total abortion ban) and Hungary (which is known for its suppression of the media).

  3. One major issue that is completely neglected by the European Commission: cybersecurity. Ways will be found to hack the client-side scanning process. For instance, malicious attackers could plant images or documents on people’s devices in order to discredit them. Or attackers could find a way to siphon off the data being scanned on our devices and use it for cyberattacks.

After all, we all know that a ‘backdoor for the good guys only’ is not possible.

CSAM scanning harms cybersecurity

Equipping all our communication with a permanent monitoring function that reports supposed misbehavior of any kind to the provider is a terrible idea. It is a breach of the dam that would lead to unprecedented total surveillance. Anyone even considering such a thing must face massive headwinds.

And opposition is already forming.

The German ‘Digitale Gesellschaft’ organized a protest on very short notice to coincide with the EU Commission’s publication of the draft law calling for CSAM scanning on people’s phones and computers - just like Apple once unsuccessfully proposed.

The European digital rights organization EDRi says that the “European Commission’s online CSAM proposal fails to find right solutions to tackle child sexual abuse”.

Ella Jakubowska, Policy Advisor at EDRi, says:

“The European Commission is opening the door for a vast range of authoritarian surveillance tactics. Today, companies will scan our private messages for CSAM content. But once these methods are out there, what’s stopping governments forcing companies to scan for evidence of dissidence or political opposition tomorrow?”

Even the German Child Protection Association considers “intervention in encrypted communication in the fight against child pornography to be unnecessary.” Board member Türk has described the EU Commission's planned scanning of private communications via messenger or email without any reason as “disproportionate and not effective.”

What is at stake is clear: Everybody’s privacy on the internet as well as good cybersecurity protections.

Comparison to Apple’s CSAM plans

The coming international outcry can be expected to mirror what was said last year when Apple announced that it would scan for CSAM images on people’s iPhones (and then did not follow through with the plan).

So let’s take a look at how security experts evaluated client-side CSAM scanning after Apple announced its plans:

For instance, security expert Bruce Schneier called the plan to scan every iMessage “Apple’s iPhone backdoor”; Edward Snowden wrote that Apple had declared war on your privacy and added: “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

Now, in regards to the EU Commission’s proposal, cryptography professor Matthew Green adds:

“Speaking of actual free speech issues, the EU is proposing a regulation that could mandate scanning of encrypted messages for CSAM material. This is Apple all over again.”

Mass surveillance will not solve the issue

The EU Commission's focus on a technical solution to the complex social problem of child abuse is misguided.

Shifting responsibility to communications providers, placing all citizens under general suspicion, and undermining the secrecy of our communication is no substitute for responsible, targeted police work in a constitutional state.

The German Chaos Computer Club warns in a statement:

“So far, it is not clear who is to define and control the detection algorithms and databases. Such a non-transparent system can and will be easily expanded after its introduction. It is already foreseeable that the rights exploitation industry will be just as interested in the system as anti-democratic governments. It is therefore all the more frightening to see the guilelessness with which it is now to be introduced.”

Take action now to stop the law

Security experts agree that the surveillance measures proposed in the current draft of the EU Commission would destroy privacy online and infringe our right to the secrecy of communication.

General mass surveillance – even if done by an AI – is against the Charter of Fundamental Rights of the European Union and must be stopped.

We must act now to prevent this draft from becoming a law!

  1. If you live in Germany, sign the petition by Campact to fight the EU Commission’s draft that would lead to unprecedented mass surveillance in Europe.

  2. If you live in Austria, sign the petition by #aufstehn to fight the EU Commission’s draft that would lead to unprecedented mass surveillance in Europe.

  3. Voice your feedback to the European Commission and share your concerns with this draft law.

  4. Check here how you can join the protest as well as call and email your representatives.

Together we must fight mass surveillance!

What is new with the EU CSAM proposal compared to Apple's approach?

The EU Commission's proposal goes further than Apple's: it would make CSAM scanning mandatory, would include scanning for “grooming”, and would cover end-to-end encrypted communication. Providers of end-to-end encrypted services cannot scan for CSAM without undermining the end-to-end encryption itself.
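Why end-to-end encryption and a scanning mandate are fundamentally at odds can be sketched in a few lines. The “encryption” below is a toy XOR cipher used for illustration only, and all names are hypothetical; the point it demonstrates holds for real ciphers too: the provider’s server sees only ciphertext, so any scanning would have to happen on the device before encryption - which is exactly what breaks the end-to-end guarantee.

```python
# Toy illustration: XOR is NOT real encryption; it only stands in
# for the property that the server cannot read message contents.
KEY = b"shared-secret-known-only-to-the-two-users"

def e2e_encrypt(plaintext: bytes) -> bytes:
    # Toy cipher: XOR each byte with the shared key (repeated).
    return bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(plaintext))

def server_can_scan(ciphertext: bytes, needle: bytes) -> bool:
    # The provider's server only ever sees ciphertext,
    # so content matching on the server comes up empty.
    return needle in ciphertext

message = b"strictly private message"
ciphertext = e2e_encrypt(message)

print(server_can_scan(ciphertext, b"private"))  # False: the server sees nothing
# The plaintext exists only on the endpoints, so a scanning mandate
# inevitably forces the scan onto the user's own device:
print(b"private" in message)  # True: the device could scan pre-encryption
```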