Technological advancement has always come loaded with data security and privacy concerns, and this is hardly the first time governments have demanded a “backdoor” undermining encrypted messaging protocols. The latest salvo, in which Meta-owned WhatsApp opposes the UK’s proposed Online Safety Bill (OSB), has once again fueled the decades-old encryption debate.

The bill includes provisions that could require messaging app companies to implement content moderation policies that would be impossible to comply with without compromising end-to-end encryption (E2EE). A company that fails to comply could face fines of up to 4% of its parent company’s annual turnover – a penalty significant enough to leave companies with little choice but to comply with the regulations or withdraw from the UK market altogether.

E2EE is regarded as the “gold standard” for secure communication. An encryption algorithm transforms each message into a seemingly random string of characters, making it practically impossible for anyone to decipher the message without the encryption “key.” The keys exist only on the sending and receiving devices, meaning that even if hackers intercept a message in transit, they cannot decode it. While E2EE does not guarantee complete security, keeping the encryption keys on the endpoints makes it extremely difficult for unauthorized parties to access the contents of messages.
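
The mechanics are easier to see in code. Below is a minimal sketch using Python and the PyNaCl library – an illustration of the principle, not WhatsApp’s actual protocol, which uses the more elaborate Signal Protocol with key ratcheting. The names and message are invented for the example; the point is that whatever relays the ciphertext only ever sees random-looking bytes:

```python
from nacl.public import PrivateKey, Box

# Each device generates a key pair; the private half never leaves the device.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

# A server relaying `ciphertext` cannot read it: decryption requires
# Bob's private key (paired with Alice's public key), held only on his device.
plaintext = Box(bob, alice.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```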

What Makes the OSB Controversial?

Secure messaging apps like WhatsApp, Signal, and Element, which safeguard users’ privacy with this robust encryption technology, are openly criticizing the OSB, arguing that the bill will jeopardize online safety and warning that they will stop providing services in the UK if it passes.

“Our users all around the world want security – 98% of our users are outside the UK, they do not want us to lower the security of the product,” said Will Cathcart, the head of WhatsApp, adding that the app would rather accept being blocked in the UK.

Meredith Whittaker, the president of Signal, has also expressed her opposition, stating in February that the company would not hesitate to “absolutely 100% walk” away and cease operations in the UK if forced to weaken the privacy of its encrypted messaging system.

While detractors of the bill argue that it would grant Ofcom (the Office of Communications) the authority to mandate that private, encrypted messaging apps and other services implement “accredited technology,” the government maintains that the bill “does not represent a ban on end-to-end encryption.”

The legislation seeks to tackle a range of dangers the government says are posed by the internet, including illegal content such as child sexual abuse and terrorism, as well as “harmful” content such as pornography and bullying. The bill aims to make technology platforms more responsible for the content they host, requiring them to prevent such material from appearing or to remove it swiftly when it does.

According to a report by the NSPCC, the number of child abuse image offenses in the UK has increased by over 66% in the past five years. Police recorded over 30,000 such offenses in the most recent year, compared with 18,574 five years earlier.

Of the cases where a social media or gaming site was recorded, more than 75% were attributed to just two companies: Snapchat and Meta. Snapchat was responsible for over 4,000 incidents, while Meta’s flagship apps – Facebook, Instagram, and WhatsApp – were mentioned in more than 3,000.

For years, the government and child protection organizations have argued that encryption poses a major obstacle to combating online child abuse.

“It is important that technology companies make every effort to ensure that their platforms do not become a breeding ground for pedophiles,” the Home Office said.

In response to such criticisms, platforms like Meta have taken measures to address concerns over the use of E2EE. Although the social media giant argues that the technology protects the human rights of billions of its users, it has also pledged to introduce extra safeguards for children, such as investing in “proactive detection technology” that analyzes metadata to detect signs of trafficking in illegal images.
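
Meta has not published how that technology works. As a purely hypothetical sketch of what metadata-only detection could look like – invented features and thresholds, with no message content ever decrypted – consider:

```python
from dataclasses import dataclass

@dataclass
class AccountMetadata:
    account_age_days: int
    groups_joined_last_week: int
    media_sent_last_week: int
    user_reports_received: int

def risk_score(m: AccountMetadata) -> int:
    """Score an account from behavioral metadata alone; content stays encrypted."""
    score = 0
    if m.account_age_days < 7:
        score += 1                            # brand-new account
    if m.groups_joined_last_week > 20:
        score += 2                            # mass-joining groups
    if m.media_sent_last_week > 500:
        score += 2                            # unusually heavy media volume
    score += min(m.user_reports_received, 3)  # reports from other users
    return score

def flag_for_review(m: AccountMetadata, threshold: int = 4) -> bool:
    # Accounts above the (invented) threshold would be queued for human review.
    return risk_score(m) >= threshold
```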

While the government strives to make the internet a safer place by introducing “accredited technology,” many activists and IT professionals fear that doing so will bring more harm than good.

Dr. Monica Horten from the Open Rights Group said: “With over 40 million users of encrypted chat services in the UK, this turns it into a mass-surveillance tool, with potentially damaging consequences for privacy and free-expression rights.”

“Rather than using kids and terrorists as an excuse to expand bulk intercept capabilities, governments need to calmly revisit several policy areas, including family violence, political violence, and online crime. Details matter; they will vary from one country to another depending on local law, police practice, the organization of social work, the availability of firearms, and political polarisation (this list is not exhaustive),” states Prof. Ross Anderson in his paper titled “Chat Control or Child Protection?”


Governments’ Persistent Attempts to Weaken Encryption

Governments have a history of compelling tech companies to weaken or eliminate end-to-end encryption in their products, or to create “backdoors” – like the Clipper Chip of the 1990s – to aid government surveillance.

In 2016, the FBI made an aggressive attempt to force Apple to unlock the iPhone belonging to one of the perpetrators of the 2015 mass shooting in San Bernardino, California. Apple’s CEO, Tim Cook, strongly opposed the FBI’s move, referring to the requested software as “the software equivalent of cancer.” Cook argued that complying would set a dangerous precedent, opening the door to further government surveillance.

“Maybe it’s an operating system for surveillance, maybe the ability for law enforcement to turn on the camera,” Mr. Cook said. “I don’t know where it stops.”

Apple’s feud with the FBI shows no sign of resting. Back in December 2022, the FBI expressed its distaste when the company rolled out a new, optional, end-to-end encryption scheme that prevents users’ iCloud data from being accessed via an “untrusted” device, saying it is “deeply concerned with the threat end-to-end and user-only-access encryption pose.”

In 2018, Australian legislators approved the Assistance and Access Act 2018, which requires technology companies to grant law enforcement and security agencies access to encrypted communications. The law empowers the government to obtain a court order that can secretly compel technology companies and experts to redesign software and hardware to enable spying on users.

This law is modeled on Britain’s 2016 Investigatory Powers Act, which can require British companies to hand government authorities the keys to unscramble encrypted data.

Other countries are also exploring the possibility of implementing new encryption laws. For instance, in India, officials informed the country’s Supreme Court in October 2019 that Facebook is required by Indian law to decrypt messages and provide them to law enforcement when requested.

“They can’t come into the country and say, ‘We will establish a non-decryptable system,’” India’s attorney general, K.K. Venugopal, told the court, referring to Facebook and other big tech platforms.

As it stands, the United States is a party to several international intelligence-sharing arrangements – one of the most prominent being the “Five Eyes” alliance. Born from spying arrangements forged during World War II, the Five Eyes alliance facilitates the sharing of signals intelligence among the U.S., the U.K., Australia, Canada, and New Zealand.

With the addition of India and Japan, the surveillance group in 2020 called on the tech industry to loosen end-to-end encryption and help government agencies access private conversations through backdoors, while maintaining that they “support strong encryption.”

“We call on technology companies to work with governments to take the following steps, focused on reasonable, technically feasible solutions: Embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offenses and safeguarding the vulnerable; enable law enforcement access to content in a readable and usable format where an authorization is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.”


The Hunt for a Middle Ground

While tech companies are “pushing back,” the government institutions concerned are optimistic that the proposed laws will help find a middle ground where national security is assured without compromising data privacy.

“It is not a choice between privacy or child safety – we can and we must have both,” the UK government statement read.

In the summer of 2021, Apple announced an on-device scanning feature that would use a machine learning model to check individual users’ photos for known CSAM (Child Sexual Abuse Material), flagging matches for review by human technicians and, ultimately, the police.
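
Apple never published the internals of its NeuralHash system, but the general shape of such a scheme – on-device perceptual hashing compared against a database of known-image hashes, with a threshold before anything is escalated – can be sketched as follows. Everything here is illustrative: the imagehash library’s phash stands in for NeuralHash, and the hash value, match distance, and reporting threshold are invented for the example.

```python
from PIL import Image
import imagehash

# Illustrative stand-in for a database of known-image perceptual hashes
# (the hex value below is made up for the example).
KNOWN_HASHES = {imagehash.hex_to_hash("f0e4d2c1b3a59687")}

MAX_DISTANCE = 4      # assumed Hamming-distance cutoff for a "match"
REPORT_THRESHOLD = 3  # assumed number of matches before human review

def should_escalate(photo_paths):
    """Return True if enough photos match known hashes to trigger review."""
    matches = 0
    for path in photo_paths:
        h = imagehash.phash(Image.open(path))  # perceptual hash of this photo
        if any(h - known <= MAX_DISTANCE for known in KNOWN_HASHES):
            matches += 1
    return matches >= REPORT_THRESHOLD
```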

This was the kind of middle ground the government appreciated, but soon after the announcement, the company faced intense backlash from privacy and security experts. Apple initially defended the plan, then postponed it, and ultimately canceled the launch.

Activists suggest that the FBI and other authorities should build up their own technical expertise instead of depending on tech vendors or third-party security experts to crack encryption and other security systems.

“Enhancing the government’s technical capability is one potential solution that does not mandate backdoors,” Representative Diana DeGette, a Colorado Democrat, said during a hearing of the House of Representatives Energy and Commerce Committee’s oversight subcommittee.

An FBI representative argued that the bureau is unlikely to be able to hire the experts it needs to keep up with the new encryption services that continue to roll out.

“We live in such an advanced age of technology development, and to keep up with that, we do require the services of specialized skills that we can only get through private industry.”

Privacy advocates hold that encryption backdoors can never be made safe: even if the vulnerabilities are hidden or kept secret, there is a risk of them being discovered by others and misused. The 2015 paper “Keys Under Doormats,” authored by a group of leading cryptographers, highlights the inherent and inevitable risks of such approaches.


This article was originally published by Swasti Kaushik on Hackernoon.