March 2, 2024, 3:46

The Daily

Read the World Today

EU-funded project trials device scanning for child sexual abuse content

A new project by EU and UK researchers will seek to develop a machine learning tool to detect and block child sexual abuse material (CSAM) on devices in real time.

The European Commission-funded project, announced on Thursday (23 February) by UK child safety organisation the Internet Watch Foundation (IWF), will begin next month and will work for two years to develop the tool.

The software will be trialled on a voluntary basis by those who consider themselves at risk of viewing CSAM online. The tool will conduct real-time monitoring and block access to any images or videos in this category before the user can view them.

“Sadly, the demand for images and videos of children being sexually abused is unceasing”, said IWF’s Chief Technology Officer, Dan Sexton.

“But we know that finding and removing this horrendous content is not enough in the ongoing, global fight to stop the sexual abuse of children, which is why we are glad to play our role in this project to train and test software which could prove vital in lowering the demand for the criminal material in the first place.”

The €2 million “Protech” project, which is set to begin in March, will be led by university hospital Charité – Universitätsmedizin Berlin (CUB) and will gather experts from fields including child protection, psychology, software engineering and public health.

The participating organisations, drawn from the UK and EU, will include groups such as the Lucy Faithfull Foundation, Stop it Now Netherlands and the University Hospital Antwerp’s University Forensic Centre.

Researchers from these institutions will collaborate to develop a machine learning tool – titled Salus – that will detect and intercept CSAM to prevent it from appearing on users’ screens.

Installation of the software on a device will be voluntary, and it will require no user interaction unless such material is detected. The tool, the IWF says, will provide intervention for users who fear they are at risk of offending against children and will strive to lessen child sexual abuse survivors’ fears of re-victimisation through existing content.

The project will also explore what will happen once such content is detected. An IWF spokesperson told EURACTIV that the tool is not intended to report users to the police but to work on the prevention side, reducing the availability of such illegal content.

“Volunteers who use the app will be people who are looking to stop themselves seeing child sexual abuse material and will be recruited by organisations aiming to help them control their behaviour”, the spokesperson said.

During its pilot phase, the app will be offered to at least 180 users in five countries – Germany, the Netherlands, Belgium, Ireland and the UK – over an 11-month period. The project members will then evaluate its rollout and impact and consider its potential deployment as a component of public health prevention initiatives.

“Digital devices can play a role in reducing sexual abuse online by providing certain features and measures that enhance the safety of users, such as security and privacy settings, reporting and blocking options,” said Stefan Bogaerts, Chair of Tilburg University’s Department of Developmental Psychology.

Last year, the Commission released its own proposal to tackle CSAM, which includes the possibility for judicial authorities to request that communications service providers, such as messaging and email services, put measures in place to detect child abuse content.

Whilst welcomed by child safety advocates, the measure was criticised for not being compatible with end-to-end encryption, a technology whereby only those involved in the communication can decrypt the messages.

The enforcement of detection orders might happen in two main ways: by routing communications through an additional ‘scanning’ server, or via on-device scanning.

Ella Jakubowska, Senior Policy Officer at European Digital Rights (EDRi), told EURACTIV that, while the project seems to be oriented towards on-device operation, strong safeguards would be needed to prevent “scope-creep” and ensure that the technology did not end up being used for third-party surveillance of devices.

The involvement of privacy, data protection, and cybersecurity experts in the project would also be vital, said Jakubowska, adding, “whilst innovation in the prevention of child abuse is crucial, there are always limits to what a technological approach can do.”

“No app can replace investment in education, in social services, in preventative healthcare, and in tackling the structural issues that lead to offending”, she said.
