Source: Ars Technica
A European Commission proposal could force tech companies to scan private messages for child sexual abuse material (CSAM) and evidence of grooming, even when those messages are supposed to be protected by end-to-end encryption.
Online services that receive "detection orders" under the pending European Union legislation would have "obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges," the proposal says. The plan calls end-to-end encryption an important security tool but essentially orders companies to break that encryption by whatever technological means necessary:
In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation.
That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
A questions-and-answers document describing the plan emphasizes the importance of scanning end-to-end encrypted messages. "NCMEC [National Center for Missing and Exploited Children] estimates that more than half of its CyberTipline reports will vanish with end-to-end encryption, leaving abuse undetected, unless providers take measures to protect children and their privacy also on end-to-end encrypted services," it says.