Meta must answer revelations on child protection failures, says EU’s Breton
Meta’s voluntary code on child protection is not working, European Commissioner Thierry Breton said in the wake of revelations that Instagram’s algorithms had facilitated and promoted child sexual abuse material networks.
On Wednesday (7 June), the Wall Street Journal (WSJ) published the results of an investigation it had run in partnership with researchers, which found that the platform had helped to connect and boost a network of accounts used to commission and trade in child sexual abuse material (CSAM).
In response, the EU’s internal market chief Thierry Breton said that the voluntary code on child protection operated by the platform’s parent company Meta “seems not to work”, adding that Meta CEO Mark Zuckerberg “must now explain and take immediate action”.
Breton and Zuckerberg are due to meet at the company’s headquarters in California on 23 June, before the Digital Services Act (DSA), the EU’s new content moderation rulebook, enters into force for ‘systemic’ platforms like Instagram in August.
According to the WSJ’s investigation, conducted with researchers from Stanford University and the University of Massachusetts Amherst, Instagram does not merely host paedophiles and their content but actively promotes them.
Its recommendation system, investigators found, connects these accounts and links them to others selling sexual content featuring minors, including those run by underage individuals themselves.
The platform also allows people to search for explicit hashtags linked to this material and features accounts accepting “commissions” or even offering in-person meetings with children.
The promotion of such content, the WSJ notes, violates both US federal law and Meta’s own policies. The company says it maintains “zero-tolerance policies” and “cutting-edge, preventative tools” that facilitate the reporting of potential harms and the response to them.
“We work aggressively to fight it on and off our platforms and to support law enforcement in its efforts to arrest and prosecute the criminals behind it,” a Meta spokesperson told EURACTIV, stressing that sexual predators constantly change their tactics, which is why the company enforces strict policies and hires specialists to track evolving behaviour.
“Between 2020 and 2022, these teams dismantled 27 abusive networks, and in January 2023, we disabled more than 490,000 accounts for violating our child safety policies,” the spokesperson added.
Breton, who on Monday told reporters he considers himself a ‘hands-on regulator’, does not seem to take Meta’s response at face value. Already in April, he announced that he was in talks to arrange for ‘stress tests’ at the headquarters of what he saw as the most troublesome companies, including Twitter, Meta and TikTok.
The French Commissioner warned the social media company that, once the new digital regulation kicks in, “Meta has to demonstrate measures to us or face heavy sanctions.” Under the DSA, sanctions can reach 6% of a company’s global annual turnover.
The legislation is due to introduce stricter obligations on the swift removal of illegal online content such as child sexual abuse material. The largest online platforms, including Meta’s Facebook and Instagram, will be required to submit an annual risk assessment, including an analysis of potential negative effects on children’s rights.
The DSA also features a ban on advertising targeted at children and limitations on using sensitive data.