March 29, 2023, 10:32


Number-based communications services excluded from EU scanning rules


The EU Council is moving towards excluding number-based communications services from the scope of the draft regulation to combat child sexual abuse material (CSAM), in a new text that also refines delisting, removal and blocking orders.

The EU proposal to fight CSAM is currently making its way through the legislative process. Following feedback from EU countries, the Swedish presidency of the EU Council put forward a new compromise text, dated 23 March and seen by EURACTIV.

The document was discussed at the Law Enforcement Working Party, a technical body of the Council, on Wednesday (29 March).

Number-independent communication services

The compromise text clarifies that the draft law will only apply to number-independent interpersonal communication services, meaning that number-based services will be outside the scope of the regulation.

Facebook’s Messenger and WhatsApp are examples of number-independent communication services, as they enable instant messaging without being part of a national numbering plan. These types of services face lighter regulation than number-based ones.

For example, number-based services must register with national authorities so that their users can dial emergency service numbers. ViberOut, which lets users call national and international numbers, is one such number-based service; Skype offers a similar feature called SkypeOut.


Delisting orders

The EU Council introduced the possibility for judicial authorities to issue delisting orders, requiring online search engines like Google and DuckDuckGo to delist websites containing specific items of child sexual abuse material from their search results.

According to the document, delisting is also important because it can prevent the dissemination of CSAM and protect victims. However, major search engines already apply delisting widely on a voluntary basis.

The delisting order will have to specify a period of application, with clear start and end dates, and can be translated into any official language of the member state.

Providers of online search engines will have the right to challenge the order before the courts of the member state of the authority that issued it. If the order is repealed as a result of such a redress procedure, the provider will have to reinstate the delisted online location in its search results.

Removal orders

The compromise elaborates on the conditions for the competent national authorities to issue removal orders.

The removal will have to be completed within 24 hours, instead of the previously suggested one hour, a deadline some EU countries considered too short. Others, however, found 24 hours too long.

The provider should also be able to reinstate the material following a redress procedure.

If the hosting service provider does not have its main establishment in the member state where the order was issued, a copy of the order must be sent to the Coordinating Authority of the country where the company is based.


Blocking orders

For blocking orders targeting child sexual abuse material, the provider will have to carry out the blocking within one week. The order will also have to include clear information enabling the provider to identify the CSAM concerned.

The document adds that a “blocking order can only be issued if the subject matter of the blocking list is on the list provided by the EU Centre”.

Privacy

The document now adds that the regulation cannot have the “effect of modifying the obligation to respect the rights, freedoms and principles referred to in Article 6 TEU and shall apply without prejudice to fundamental principles relating to the right for respect to private life and family life and to freedom of expression and information.”

This addition appears to respond to the privacy concerns the draft law has raised, since it would allow judicial authorities to issue detection orders obliging communications services to scan private communications for suspected content.

Data collection, emergency report, and ‘user-friendly’ mechanism

Regarding information sharing, the Coordinating Authority will be able to request that a peer from another member state share specific information, following the mutual assistance mechanism of the Digital Services Act.

It should also be possible to file an emergency report if there is an ‘imminent threat’ to a child’s life. In such cases, the national law enforcement authority of the country where the offence is taking place, or where the suspect resides, should be notified.

Europol, the EU law enforcement agency, should also be kept in the loop for such emergency reporting, which would be particularly relevant if the location of the offence or the offender is unknown.

The text also mentions that the provider will have to “establish and operate an accessible, age-appropriate and user-friendly mechanism” through which users can file complaints.

Timeline

The deadline for member states to designate competent authorities has been extended from two to 18 months after the regulation’s entry into force.