June 23, 2024, 2:06


EU countries seek flexibility in enforcing online child sexual abuse rulebook

A new compromise text, seen by EURACTIV, introduces more discretion for EU countries in the national setup for important parts of the new regulation against child sexual abuse material.

The draft regulation to fight Child Sexual Abuse Material is set to introduce specific mechanisms and legal tools to deal with this type of illegal content. The file is currently being discussed in the Law Enforcement Working Party, a technical body of the EU Council of Ministers.

On Thursday (24 February), the national representatives are set to discuss a new compromise text that gives more discretion to the European capitals on the enforcement architecture and introduces significant changes to blocking and removal orders and the reporting obligations.

“In summary, the Presidency has sought to accommodate the text to the need for flexibility that Member States have expressed. Given the layer of involved entities, the attainment of flexibility has however in some instance also made the text slightly more complex,” the compromise reads.


National discretion

In the explanatory note to the compromise text, the Swedish presidency points out that the implementation architecture of the proposed regulation is complex as it can involve one or more competent authorities, a coordinating authority at the national level and the EU Centre to prevent and combat child sexual abuse.

The member states might designate competent authorities among law enforcement, judicial or administrative authorities. The national governments would also choose a coordinating authority to liaise with other national authorities and the EU Centre.

These coordinating authorities might or might not be competent authorities themselves, and would carry out certain tasks, such as following up on the application of blocking orders. The text notes that countries might give this role to the same authorities designated as Digital Services Coordinators under the EU’s Digital Services Act.

Removal & blocking orders

A competent authority can request hosting services – in essence, any platform sharing content online – to remove or disable access in all EU countries to one or more pieces of material identified as child sexual abuse material.

These orders would be issued if all necessary investigations have been carried out and the reasons for taking down the content outweigh the possible negative consequences, notably the interference with the users’ freedom of speech and the platforms’ freedom to conduct business.

Before a removal order is issued, the relevant authority must inform the platform of its intention and reasoning, allowing a reasonable time for the service provider to comment.

The removal order might be modified or repealed following a redress procedure. In this case, the hosting provider should immediately reinstate the material.

A similar procedure is envisaged for orders to block entire websites addressed to internet access services. In the case of blocking orders, however, they can only be issued if their subject matter is on a list provided by the EU Centre.

“Do Member States think that there should be such a requirement? Or should it be sufficient that Member States share their blocking orders with the EU Centre and other Member States once they become final?” a footnote to the text asks.

Moreover, for blocking orders, service providers must immediately inform the relevant authority about their execution, including any protective measures taken.

In the original proposal, anyone who could no longer access the blocked material could appeal against the blocking order. In the compromise text, the right to redress a blocking order was limited to the internet access service and the users who uploaded the blocked material.

Reporting suspicious content

Another amendment to the text means that users whose content has been flagged as suspicious will receive less information.

The users in question will still be notified that the material has been reported to the EU Centre, which will filter false positives from illegal content before transmitting it to law enforcement.

However, compared to the original proposal, the users will no longer be told how the service provider identified the suspicious content, nor about any follow-ups to the report. Instead, how the provider became aware of the suspicious content was added to the information the platforms must provide when reporting suspicious material.

At the same time, the online platform will have to provide a mechanism for users to be able to report suspicious content, which must be easy to access, effective, age-appropriate and user-friendly.

The presidency added that users must also be able to report suspicious material anonymously and exclusively via electronic means. The submission form should be sufficiently precise and allow for adequately substantiated notices, including the exact location of the information.

In this regard, the European Commission is due to propose a definition of Uniform Resource Locator (URL), better known as the web address of a webpage.