Article Details

Scrape Timestamp (UTC): 2024-06-18 16:23:07.126

Source: https://thehackernews.com/2024/06/signal-foundation-warns-against-eus.html

Original Article Text


Signal Foundation Warns Against EU's Plan to Scan Private Messages for CSAM

A controversial proposal put forth by the European Union to scan users' private messages for the detection of child sexual abuse material (CSAM) poses severe risks to end-to-end encryption (E2EE), warned Meredith Whittaker, president of the Signal Foundation, which maintains the privacy-focused messaging service of the same name.

"Mandating mass scanning of private communications fundamentally undermines encryption. Full stop," Whittaker said in a statement on Monday. "Whether this happens via tampering with, for instance, an encryption algorithm's random number generation, or by implementing a key escrow system, or by forcing communications to pass through a surveillance system before they're encrypted."

The response comes as lawmakers in Europe are putting forth regulations to fight CSAM with a new provision called "upload moderation" that allows for messages to be scrutinized ahead of encryption. A recent report from Euractiv revealed that audio communications are excluded from the ambit of the law and that users must consent to this detection under the service provider's terms and conditions. "Those who do not consent can still use parts of the service that do not involve sending visual content and URLs," it further reported.

Europol, in late April 2024, called on the tech industry and governments to prioritize public safety, warning that security measures like E2EE could prevent law enforcement agencies from accessing problematic content, reigniting an ongoing debate about balancing privacy against the need to combat serious crimes. It also called for platforms to design security systems in such a way that they can still identify and report harmful and illegal activity to law enforcement, without delving into the implementation specifics.
iPhone maker Apple famously announced plans to implement client-side screening for child sexual abuse material, but called it off in late 2022 following sustained blowback from privacy and security advocates. "Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types," the company said at the time, explaining its decision. It also described the mechanism as a "slippery slope of unintended consequences."

Signal's Whittaker further said calling the approach "upload moderation" is a word game that's tantamount to inserting a backdoor (or a front door), effectively creating a security vulnerability ripe for exploitation by malicious actors and nation-state hackers. "Either end-to-end encryption protects everyone, and enshrines security and privacy, or it's broken for everyone," she said. "And breaking end-to-end encryption, particularly at such a geopolitically volatile time, is a disastrous proposition."
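To make the critics' point concrete, here is a minimal Python sketch of where client-side scanning sits in a messaging pipeline. Everything in it is hypothetical: real systems use perceptual hashing (not SHA-256) against databases of known material, and real E2EE uses a protocol like Signal's (the `encrypt` stand-in below is deliberately trivial and not secure). The sketch only illustrates the structural objection from the article: the plaintext is inspected on the sender's device before any encryption happens.

```python
import hashlib

# Hypothetical fingerprint blocklist; a real deployment would hold
# perceptual hashes of known illegal material, not SHA-256 digests.
BLOCKLIST = {hashlib.sha256(b"known-bad-sample").hexdigest()}

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Stand-in for real end-to-end encryption. A repeating-key XOR is
    # used purely so the example runs; it is NOT a secure cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes):
    # The "upload moderation" step: the check runs on the *plaintext*,
    # client-side, before encryption is ever applied. This is why
    # opponents argue it bypasses the guarantees E2EE is meant to give.
    fingerprint = hashlib.sha256(plaintext).hexdigest()
    if fingerprint in BLOCKLIST:
        return ("reported", None)  # flagged and withheld before encryption
    return ("sent", encrypt(plaintext, key))

status, ciphertext = send_message(b"hello", b"secretkey")
```

The security argument in the article maps directly onto this structure: whoever controls `BLOCKLIST` (or the scanning code path) can inspect or flag arbitrary content pre-encryption, which is the "front door" Whittaker describes.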

Daily Brief Summary

MISCELLANEOUS // EU Proposal Threatens Encryption with Private Message Scans

The European Union has introduced a controversial proposal to scan users' private messages for child sexual abuse material (CSAM), raising significant concerns among privacy advocates.

Meredith Whittaker, president of the Signal Foundation, criticized the proposal, stating that it severely undermines the integrity of end-to-end encryption (E2EE).

The proposed measure, known as "upload moderation," would require messages to be analyzed before encryption, allowing for the detection of CSAM.

The law excludes audio communications and requires user consent under service provider terms, offering alternatives for users who do not consent to scanning.

Europol emphasizes the need for tech industry cooperation to balance public safety and privacy, suggesting the design of systems capable of reporting harmful activity without breaking encryption.

The response from the tech community, including Apple, highlights concerns about privacy infringement and the potential for such measures to lead to broader surveillance practices.

Signal warns that interfering with encryption algorithms or creating backdoors for scanning could introduce vulnerabilities exploitable by both malicious actors and nation-state hackers.