March 28, 2024, 1:29


EU Council’s legal opinion deals blow to anti-child sexual abuse law


The legal service of the EU Council of Ministers slammed the EU proposal to fight child sexual abuse material (CSAM), criticising, in particular, the ambiguity of detection orders and their possible impact on privacy rights.

The CSAM draft law has faced controversy since it was proposed by the European Commission last year. It gives judicial authorities the power to issue detection orders addressed to communications service providers deemed at significant risk of being used to disseminate this type of illegal content.

Upon receiving a detection order, services like Gmail or WhatsApp would be forced to implement tools that automatically scan private emails or texts to detect suspected content.

This instrument has been accused of disproportionately affecting people’s privacy, as potentially every person using the service might be affected. These concerns were echoed by the European Data Protection Supervisor and by a study commissioned by the European Parliament.

The legal service of the EU Council, which is extremely influential in the EU legislative process, is now adding to the proposal’s troubled history, according to extracts of its legal opinion seen by EURACTIV.

LEAK: Commission to force scanning of communications to combat child pornography

The European Commission is to put forward a generalised scanning obligation for messaging services, according to a draft proposal obtained by EURACTIV.

Detection orders

In the Commission’s text, detection orders can be issued by a national judicial or independent administrative body to detect known material, new material and grooming, the practice of predators trying to lure children.

While the declared intent is for the proposal to be technologically neutral, the legal opinion notes that the “content of all communications must be accessed and scanned, and be performed using available automated tools.”

On paper, the draft law is also meant to be as minimally intrusive as possible in terms of impact on users’ rights to privacy and data protection.

However, the opinion notes that when all communications have to be scanned “with the assistance of an automated operation,” it interferes “with the right to data protection, regardless of how that data is used subsequently.”

The legal opinion adds that the application of the orders cannot “exceed 24 months for the dissemination of known or new CSAM and 12 months for the solicitation of children.”

Ambiguity

According to the document, the detection orders are not “sufficiently clear, precise and complete.”

For example, what “effective” technology means is not elaborated on; the interpretation ultimately lies with the service providers, which raises “serious doubts as to the foreseeability of the impact of these measures on the fundamental rights at stake.”

The extent of interference would be determined by those who pick the technologies used to implement “the detection order on a case-by-case basis,” such as the EU Centre, national authorities, judges, and service providers.

“The extent of discretion involved could give rise to a very broad range of possible different interpretations and concerns as regards compliance with fundamental rights,” the legal opinion says, calling for more detailed limitations.

Child sexual abuse material: EU Council proposes survivors’ board

A new EU Council presidency compromise text of the proposal aiming to prevent child sexual abuse material (CSAM) online introduces a survivors’ board and puts more focus on competent authorities.

Violation of fundamental rights

Beyond the detection orders not being sufficiently clear, the “screening of interpersonal communications” also affects “the fundamental right to respect for private life”, as it gives access to text messages, emails, audio conversations, pictures, or any other kind of exchanged personal information.

It can also have a deterrent effect on the freedom of expression, the document says. Moreover, the data is being processed, which “affects the right to protection of personal data.”

The legal opinion emphasises that interpersonal communication services are used by almost everyone, including the people who view and share CSAM. Detecting such material through general screening, however, would weaken, if not circumvent, cybersecurity measures, especially end-to-end encryption.

Moreover, to detect these materials, in text or audio form, “age assessment/verification generalised to all users of the service concerned” would be needed, as “without establishing the precise age of all users, it would not be possible to know that the alleged solicitation is directed towards a child,” the text explains.

There are three ways to do this: mass profiling of the users, biometric analysis of the user’s face and/or voice, or a digital identification/certification system. Any of these would add “another layer of interference with the rights and freedoms of the users.”

Child sexual abuse: leading MEP sceptical of technical limitations

As the European Parliament published its draft report on the proposal to fight child sexual abuse material (CSAM), the rapporteur shared with EURACTIV his vision of the key aspects of the file.


Proportionality problem

The legal service also referred to the jurisprudence of the EU Court of Justice, which has ruled against generalised data retention of IP addresses to fight crime.

Since the measure is potentially much more invasive than retaining IP addresses, the legal opinion notes that automatically processing traffic and location data “can meet the requirement of proportionality only in situations in which a member state is facing a serious threat to national security.”

Child sexual abuse, even though it is a crime of “a particularly serious nature,” does not count as a “threat to national security,” the legal opinion states.
