Featured Article: Temporary Climbdown By UK Government

In an apparent admission of defeat, the UK government has conceded that scanning platforms like WhatsApp for messages containing harmful content, as the Online Safety Bill would require, is not (currently) feasible.

The ‘Spy Clause’ 

Under what’s been dubbed the ‘spy clause’ (Clause 122) in the UK’s Online Safety Bill, the government had stated that Ofcom could issue notices to messaging apps like WhatsApp and Signal (which use end-to-end encryption) requiring the deployment of scanning software. The reason given was the need to scan for child sex abuse images on the platforms. However, the messaging apps argued that this would effectively destroy end-to-end encryption, an important privacy feature valued by customers. This led to both WhatsApp and Signal threatening to pull their services out of the UK if the Bill went through with the clause in it.

Also, some privacy groups, like the Open Rights Group, argued that forcing the scanning of private messages on apps amounted to an expansion of mass surveillance.

Climbdown 

However, in a recent statement to the House of Lords, junior arts and heritage minister Lord Stephen Parkinson announced that the government would be backing down on the issue. Lord Parkinson said: “When deciding whether to issue a notice, Ofcom will work closely with the service to help identify reasonable, technically feasible solutions to address child sexual exploitation and abuse risk, including drawing on evidence from a skilled persons report. If appropriate technology which meets these requirements does not exist, Ofcom cannot require its use.”

In other words, the technology that enables scanning of messages without violating encryption doesn’t currently exist and, therefore, under the amended version of the bill, WhatsApp and Signal will not be required to have their messages scanned (until such technology does exist).

This is a significant climbdown for the government, which has been pushing for ‘back doors’ and the scanning of encrypted apps for many years, particularly since it was revealed that the London Bridge terror attack appeared to have been planned via WhatsApp.

Victory – Signal & WhatsApp 

Writing on ‘X’ (formerly Twitter), Meredith Whittaker, the president of Signal, said the government’s apparent climbdown was “a victory, not a defeat” for the tech companies. She also admitted, however, that it wasn’t a total victory, saying “we would have loved to see this in the text of the law itself.”

Also posting on ‘X,’ Will Cathcart, head of WhatsApp, said that WhatsApp “remains vigilant against threats” to its end-to-end encryption service, adding that “scanning everyone’s messages would destroy privacy as we know it. That was as true last year as it is today.”

Omnishambles 

Following the news of the government’s ‘spy clause’ climbdown, privacy advocates the Open Rights Group (ORG) highlighted the fact that, on the one hand, the government had conceded that the technology that would have been needed to scan messages didn’t exist, while on the other hand it appeared to say it hadn’t conceded anything. Describing the matter as an “omnishambles,” the ORG highlighted how, during an appearance on Times Radio, Michelle Donelan MP said that “We haven’t changed the bill at all” and that “further work to develop the technology was needed.”

What Does This Mean For Your Business? 

For apps like WhatsApp and Signal, this is not only a victory against government pressure but is also good news for business as, presumably, they will continue to operate in the UK market.

This is also good news for the many UK businesses that routinely use WhatsApp as part of their business communications and won’t need to worry (for the time being) about having their commercially (and personally) sensitive messages scanned, something that would pose a risk to privacy and security and perhaps increase the risk of hacks and data breaches. It appears that the UK government has been forced to admit that the technology to scan messages on end-to-end encrypted services while maintaining the integrity of that encryption does not yet exist. It also appears that it may realistically take quite some time (years) before this technology exists, thereby making the victory all the sweeter for the encrypted apps.

The government’s climbdown on ‘clause 122’ (the ‘spy clause’) is also being celebrated by the many privacy groups that have long argued against it on the grounds that it enables mass surveillance.

Featured Article: UK Gov Pushing To Spy On WhatsApp (& Others)

The recent amendment to the Online Safety Bill, which means a compulsory report must be written for Ofcom by a “skilled person” before encrypted app companies can be forced to scan messages, has led to even more criticism of this rather controversial bill, which would bypass security in apps and give the government (and therefore any number of people) more access to sensitive and personal information.

What Amendment? 

In the House of Lords debate, which was the final session of the Report Stage and the last chance for the Online Safety Bill to be amended before it becomes law, government minister Lord Parkinson amended the bill to require that a report be written for Ofcom by a “skilled person” (appointed by Ofcom) before powers can be used to force a provider / tech company (e.g. WhatsApp or Signal) to scan its messages. The purpose of scanning messages using the powers of the Online Safety Bill is (ostensibly) to uncover child abuse images.

The amendment states that “OFCOM may give a notice under section 111(1) to a provider only after obtaining a report from a skilled person appointed by OFCOM under section 94(3).” 

Prior to the amendment, the report had been optional.

Why Is A Compulsory Report So Important? 

The amendment says that the report is needed before companies can be forced to scan messages “to assist OFCOM in deciding whether to give a notice…. and to advise about the requirements that might be imposed by such a notice if it were to be given”. In other words, the report will be used to assess the impact of scanning on freedom of expression and privacy, and to explore whether other, less intrusive technologies could be used instead.

It is understood, therefore, that the report’s findings will be used to help decide whether to force a tech firm to scan messages. Under the detail of the amendment, a summary of the report’s findings must be shared with the tech firm concerned.

Reaction 

Tech companies may be broadly in agreement with the aims of the bill. However, one detail of the bill that operators of encrypted messaging services (e.g. WhatsApp, Signal and others) have always opposed is being forced to scan user messages before they are encrypted (client-side scanning). Operators say that this completely undermines the privacy and security of encrypted messaging, and they object to the idea of having to run government-mandated scanning services on their users’ devices. They also argue that this could leave their apps more vulnerable to attack.
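
To make the objection concrete, here is a minimal, purely illustrative Python sketch of what client-side scanning means in principle: the message is inspected in plaintext on the sender’s device before it is ever encrypted, so the scanner sees exactly the content the encryption was meant to protect. The hash list, the scanning function and the use of Fernet for the encryption step are stand-ins for illustration only, not any messenger’s actual implementation.

```python
# Purely illustrative sketch of "client-side scanning" (not any vendor's real code).
# Assumed stand-ins: a SHA-256 hash list for "known prohibited content" and
# Fernet (from the 'cryptography' package) for the encryption step.
import hashlib
from cryptography.fernet import Fernet

# Hypothetical database of hashes of known prohibited content.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example prohibited content").hexdigest(),
}

def client_side_scan(plaintext: bytes) -> bool:
    # The key point: the check runs against the *unencrypted* message,
    # on the sender's device, before any encryption happens.
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_BAD_HASHES

def send_message(plaintext: bytes, key: bytes) -> bytes | None:
    if client_side_scan(plaintext):
        return None  # blocked/reported before it is ever encrypted
    return Fernet(key).encrypt(plaintext)  # only then encrypted for transport

key = Fernet.generate_key()
print(send_message(b"hello", key) is not None)            # True: passed the scan, then encrypted
print(send_message(b"example prohibited content", key))   # None: caught pre-encryption
```

Real proposals tend to involve perceptual rather than exact hashing, but the ordering is the point of contention: the scan happens while the content is still readable on the device.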

The latest amendment, therefore, has not changed this situation for the tech companies and has led to further criticism and objections. Many objections have also been raised by campaign and rights groups such as Index on Censorship and the Open Rights Group, which have always opposed what they call the “spy clause” in the bill. For example:

– The Ofcom-appointed “skilled person” could simply be a consultant or political appointee, and having these people oversee decisions about free speech and privacy rights would not amount to effective oversight.

– Judicial oversight should be a bare minimum and a report written by just a “skilled person” wouldn’t be binding and would lack legal authority.

Other groups, however, such as the NSPCC, have broadly backed the bill in terms of finding ways to make tech firms mitigate the risks of child sexual abuse when designing their apps or adding features, e.g. end-to-end encryption.

Another Amendment 

Another House of Lords amendment to the bill requires Ofcom to look at the possible impact of the use of technology on journalism and the protection of journalistic sources. Under the amendment, Ofcom would be able to force tech companies to use what’s been termed “accredited technology” to scan messages for child sexual abuse material.

This has also been met with similar criticisms over government-mandated scanning technology’s effects on privacy and freedom of speech, and the potential for it to be used as a tool for monitoring and surveillance. WhatsApp, Signal, and Apple have all opposed the scanning idea, with WhatsApp and Signal reportedly indicating that they would not comply.

Breach Of International Law? 

Clause 9(2) of the Online Safety Bill, which requires platforms to prevent users from “encountering” certain “illegal content”, has also been soundly criticised recently. This clause means that platforms which host user-generated content will need to immediately remove any such content, which covers a broad range of material, or face considerable fines, blocked services, or even jail for executives. Quite apart from the technical and practical challenges of achieving this effectively at scale, criticisms of the clause include that it threatens free speech in the UK and that it lacks the detail expected of legislation.

Advice obtained by the Open Rights Group suggests that the clause may even be a breach of international law, in that there could be “interference with freedom of expression that is unforeseeable”, and that it goes against the current legal order on platforms.

It’s also been reported that Wikipedia could withdraw from the UK over the rules in the bill.

Investigatory Powers Act Objections (The Snooper’s Charter) 

Suggested new updates to the Investigatory Powers Act (IPA) 2016 (sometimes called the ‘Snooper’s Charter’) have also come under attack from tech firms, not least Apple. For example, the government wants messaging services, e.g. WhatsApp, to clear security features with the Home Office before releasing them to customers. The update to the IPA would mean that the UK’s Home Office could demand, with immediate effect, that security features are disabled, without telling the users/the public. Currently, a review process with independent oversight (with the option of appeal by the tech company) is needed before any such action could happen.

The Response 

The response from tech companies has been swift and negative, with Apple threatening to remove FaceTime and iMessage from the UK if the planned update to the Act goes ahead.

Concerns about granting the government the power to secretly remove security features from messaging app services include:

– It could allow government surveillance of users’ devices by default.

– It could reduce security for users, seriously affect their privacy and freedom of speech, and could be exploited by adversaries, whether they are criminal or political.

– Building backdoors into encrypted apps essentially means there is no longer end-to-end encryption (illustrated in the short sketch after this list).
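
As a purely illustrative sketch of that last point (again using Fernet from the ‘cryptography’ package as a stand-in, not any real messenger’s protocol), a ‘backdoor’ can be reduced to key access: if any party other than the two endpoints holds a key that decrypts the traffic, the conversation is readable outside those endpoints and is therefore no longer end-to-end encrypted in any meaningful sense.

```python
# Purely illustrative: a copied ("escrowed") key is enough to read the traffic,
# so the encryption is no longer end-to-end. Fernet is a stand-in for illustration.
from cryptography.fernet import Fernet

endpoint_key = Fernet.generate_key()   # key shared only by the two endpoints
escrowed_copy = endpoint_key           # the "backdoor": a copy held by a third party

ciphertext = Fernet(endpoint_key).encrypt(b"private message")

print(Fernet(endpoint_key).decrypt(ciphertext))   # the intended recipient can read it...
print(Fernet(escrowed_copy).decrypt(ciphertext))  # ...and so can the escrow holder
```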

Apple 

Apple’s specific response to the proposed updates/amendments (which will be subject to an eight-week consultation anyway) is that:

– It refuses to make changes to security features specifically for one country that would weaken a product for all users globally.

– Some of the changes would require issuing a software update, which users would have to be told about, thereby stopping changes from being made secretly.

– The proposed amendments threaten security and information privacy and would affect people outside the UK.

What Does This Mean For Your Business? 

There’s broad agreement about the aims of the UK’s Online Safety Bill and IPA in terms of wanting to tackle child abuse, keep people safe, and make tech companies take more responsibility and do more to improve safety. However, these are global tech companies whose UK users represent only a small part of their total user base, and ideas like building back doors into secure apps, running government-approved scanning of user content, and using reports written by consultants or political appointees to support scanning all go against privacy, one of the key features of apps like WhatsApp.

Allowing governments access into apps and granting them powers to turn off security ‘as and when’ raises issues and suspicions about free speech, government monitoring and surveillance, legal difficulties, and more. In short, even though the UK government wants to press ahead with the new laws and amendments, there is still a long way to go before there is any real agreement with the tech companies. In fact, it looks likely that they won’t comply, and some, like WhatsApp, have simply said they’ll pull out of the UK market, which could be very troublesome for UK businesses, charities, groups and individuals.

The tech companies also have a point in that it seems unreasonable to expect them to alter their services for one country in a way that could negatively affect their users in other countries. As some critics have pointed out, if the UK wants to be a leading player on the global tech stage, alienating the big tech companies may not be the best way to go about it. It seems that a lot more talking and time will be needed to get anywhere near workable, real-world laws, and, at the moment, with the UK government seen by many as straying into areas that alarm rights groups, some tech companies are suggesting the government ditch its new laws and start again.

Expect continued strong resistance from tech companies going forward if the UK government doesn’t slow down or re-think many aspects of these new laws – watch this space.