After multiple years of debate, the highly criticized Online Safety Bill will be passed into law in the coming weeks, with massive consequences for the way people in the UK access online services.
During one of the bill's final readings in the House of Lords, Sunak's government backed down, confirming that it did not plan to break encryption and would only require messaging apps to scan for illegal content once doing so became "technically feasible".
This statement was hailed as a "victory" by privacy advocates. For a moment there was hope that Britain would rewrite the so-called playbook for dictators. For instance, security expert Alec Muffett tweeted:
"RIGHT HERE, RIGHT NOW, IN TERMS OF POLICY, LET'S ACKNOWLEDGE ONE THING: After years - especially after the bluster of the past 6 months - that there is acknowledgement from the government of the intractability of client-side scanning, is a HUGE WIN."
And Signal's Meredith Whittaker tweeted:
"WOW. I'm so moved, a bit stunned, and more than anything sincerely grateful to those who came together to ensure sunlight on the dangerous OSB Spy Clause, and to those in the UK gov who synthesized the facts and acted on them."
"I knew we had to fight. I didn't know we'd win❤️🙏"
However, Meredith also adds:
"Of course this isn't a total victory. We would have loved to see this in the text of the law itself. But this is nonetheless huge, and insofar as the guidance for implementation will have the force to shape Ofcom's implementation framework, this is, again, very big and very good."
Because, crucially, while the government said it would not force tech companies to break encryption, that phrase didn't make it into the amended legislation. Worse, government representatives only said the government would hold its fire until scanning became "technically feasible".
Stephen Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, said to the House of Lords: Companies will not be required to scan encrypted messages until it is "technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content." He also directly mentioned the controversial clause 122 of the Online Safety Bill:
"If the appropriate technology doesn’t exist which meets those requirements, then Ofcom will not be able to use clause 122 to require its use."
While the UK government has admitted that no technology currently exists to scan for child sexual abuse material without violating the human right to privacy, it is still seeking such technology. When discussing options, politicians most often talk about client-side scanning, which the EU also eyes closely as the 'holy grail' for law enforcement.
Services that use end-to-end encryption guarantee that only the sender and the recipient can see the content of the messages. To determine whether messages contain CSA material, the data would have to be scanned locally on people's devices, before encryption, so that the messages could still be sent end-to-end encrypted. The scan results would then be compared against a dataset on a remote server, which in effect breaks the end-to-end encryption and violates people's privacy.
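To make the problem concrete, here is a deliberately simplified sketch in Python of how client-side scanning is usually proposed to work. All names are hypothetical, and it uses exact cryptographic hashes for brevity, whereas real proposals (such as Apple's abandoned NeuralHash system) rely on perceptual hashing so that slightly altered images still match. The key point the sketch illustrates: the scan has to run on the plaintext, before encryption, on exactly the content that end-to-end encryption is supposed to protect.

```python
import hashlib

# Hypothetical blocklist: hashes of known illegal material that a
# scanning authority would distribute to every device.
BLOCKED_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}

def client_side_scan(plaintext):
    """Return True if the plaintext matches the blocklist.
    This must run BEFORE encryption: the device inspects the very
    content that end-to-end encryption is meant to keep private."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKED_HASHES

def send_message(plaintext, encrypt):
    """Scan, then encrypt and send. In a real deployment a match would
    be reported to a server, creating a side channel outside the
    end-to-end encrypted conversation."""
    if client_side_scan(plaintext):
        return None  # message blocked / reported instead of sent
    return encrypt(plaintext)
```

Even in this toy version, the conversation is no longer truly end-to-end private: a third party decides, via the distributed blocklist, what the device looks for, and a match triggers behavior outside the encrypted channel.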
Because scanning content while respecting people's right to privacy is essentially impossible, Apple killed its plans to develop client-side scanning for iCloud, stating that it couldn't make the scanning process work without infringing on users' privacy.
Ultimately, today no feasible technology to scan for abuse material on encrypted data exists.
Yet privacy activists are claiming a victory. For instance, Wired wrote:
"Britain Admits Defeat in Controversial Fight to Break Encryption: The UK government has admitted that the technology needed to securely scan encrypted messages sent on Signal and WhatsApp doesn’t exist, weakening its controversial Online Safety Bill."
In part it is a victory, because - at least for now - the so-called "spy clause" in the UK's Online Safety Bill, which would have required messaging apps to break end-to-end encryption in order to scan for abuse material, will no longer be enforced.
Finally, after months of convincing by security and privacy experts, the government admitted that the technology to securely scan encrypted messages for signs of child sexual abuse material (CSAM) without compromising users' privacy doesn't exist - yet.
However, other privacy experts are not so optimistic. Matthew Hodgson, CEO and co-founder of UK-based Element, a decentralized messaging app, said:
This is "not a change, it’s kicking the can down the road."
"It’s only what’s actually written in the bill that matters. Scanning is fundamentally incompatible with end-to-end encrypted messaging apps. Scanning bypasses the encryption in order to scan, exposing your messages to attackers. So all ‘until it’s technically feasible’ means is opening the door to scanning in future rather than scanning today."
A spokesperson for the campaign organisation Index on Censorship said:
"The online safety bill as currently drafted is still a threat to encryption and as such puts at risk everyone from journalists working with whistleblowers to ordinary citizens talking in private. We need to see amendments urgently to protect our right to free speech online."
Another expert, Martin Albrecht, a professor of cybersecurity at King's College London and a critic of clause 122 of the Online Safety Bill, said he does not see any possible way that a message-scanning technology could be "feasible": how could such a technology accurately scan only for abuse material while still protecting people's privacy?
Albrecht told the Guardian:
"I am relieved to see the government accepting the scientific consensus that the technology does not exist to scan encrypted messages without violating users’ privacy. However, it is not clear what test the government plans to apply to decide on whether the technology is feasible in the future."
Since the government did not change the wording of the bill, the option to force companies to scan later still exists. The media regulator Ofcom still has the power to declare any technology good enough for the task of scanning for abuse content while respecting people's right to privacy - whether the technology can actually achieve this or not doesn't matter.
Judging by politicians' statements about the Online Safety Bill, the current plan is not to break encryption, but this could be mere lip service.
Minister for Tech and the Digital Economy Paul Scully MP said:
"Our position on this matter has not changed and it is wrong to suggest otherwise. Our stance on tackling child sexual abuse online remains firm, and we have always been clear that the Bill takes a measured, evidence-based approach to doing so."
This means: the UK government still wants to be able to scan every message and every text you send, whether it is end-to-end encrypted or not.
If the Online Safety Bill is passed in its current form, Ofcom will be able "to direct companies to either use, or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content — which we know can be developed," Scully said.
In the end, all we have is a promise. The UK government says it won't force messaging apps to use unproven technology to scan for CSAM, but the powers to do so are still in the bill. The government can change its mind at any moment.
The UK government's new tone came about due to heavy pressure from tech companies. For instance, WhatsApp and Signal threatened to leave the UK if the Online Safety Bill were passed in its current form, as they would not weaken or break their end-to-end encryption to comply with UK legislation.
We at Tutanota will not accept the Online Safety Bill.
We at Tutanota took a different approach, stating that we would not leave the UK, but that the UK would need to block access to Tutanota, just like Russia and Iran.
We focus on our users' security and privacy, now and in the future. Instead of thinking about how to break or bypass encryption, we make encryption stronger by already investing in post-quantum secure encryption.
As tech experts we understand the need for end-to-end encryption, and as freedom fighters we would rather fight the Online Safety Bill in court than tinker with our built-in encryption that protects the data of millions of users around the world. We have not given in to China or Iran, which already block access to Tutanota, and we will not do so for the UK.
Our passion is to fight for your right to privacy.
What is the Online Safety Bill?
The Online Safety Bill aims to make the UK the safest place to be online. The bill forces platforms to remove illegal content such as child sexual abuse material, as well as content that is banned under their own terms of service.
The bill contains a set of new laws intended to protect children and adults online. Social media companies are expected to take more responsibility for the content published on their sites and to remove illegal content faster.
The first draft of the Online Safety Bill was published in May 2021, and it's highly likely that the current form of the bill will be passed into law in autumn 2023.