The government’s Online Safety Bill, which has drawn concern from technology companies over provisions they say could undermine encrypted communications, has now become law.

The Online Safety Act, passed on 26 October 2023, aims to create a safer online environment for children. It places legal duties on tech firms to prevent and swiftly remove illegal content, including material relating to terrorism and revenge pornography. Companies are also obliged to protect children from harmful but legal content, such as material promoting self-harm, bullying, pornography and eating disorders.

The communications regulator, Ofcom, has been given new powers to fine non-compliant tech companies up to £18 million or 10% of their revenue, whichever is higher, meaning the largest tech corporations could face fines running into billions. The act is expected to cover around 100,000 online services, with the strictest requirements reserved for “Category 1” services, those carrying the highest risk and with the widest reach.

The technology secretary, Michelle Donelan, has said the act will keep people safe online for years to come while safeguarding freedom of speech and empowering adults by ensuring illegal content is removed. Yet over the legislation’s four-year journey into law, technology companies have repeatedly raised concerns about provisions that could undermine encrypted communications. Encrypted messaging and email services, including WhatsApp, Signal and Element, have threatened to leave the UK if Ofcom compels them to deploy “accredited technology” to scan encrypted communications for illegal content.
Section 122 of the act gives Ofcom the power to require tech companies to install systems that critics argue would compromise the privacy and security of encrypted services by scanning the content of all messages and emails for child sexual abuse material (CSAM).

Mathew Hodgson, chief executive of Element, a secure communications provider whose customers include the Ministry of Defence, the US Navy, Ukraine and Nato, said his customers are demanding assurances that the company will not implement message scanning if ordered to under the Online Safety Act. While the bill’s objectives are laudable, he argues, private messaging apps should not be subjected to blanket surveillance, which would seriously undermine safety and privacy for everyone. Enforcing Section 122 against tech companies, he contends, would introduce vulnerabilities that hackers could exploit.

Andy Yen, chief executive of the encrypted mail service Proton, shares this view, warning that without adequate safeguards for end-to-end encryption the Online Safety Act poses a genuine threat to privacy. Yen asserts that the bill grants the government unrestricted access to private conversations. Although he is reasonably confident that Ofcom will not use its powers to require Proton to monitor customer emails, he is deeply concerned that the act was passed with a provision allowing the British government access to anyone’s private communications.