Surfshark's Study Reveals Alarming Rise in CSAM as Tech Firms Challenge EU Regulation

Introduction

A recent study by Surfshark has uncovered a concerning issue: over 1,700 websites in the European Union (EU) may contain unreported Child Sexual Abuse Material (CSAM).

Global Increase in CSAM Reports

The study reveals a global increase in CSAM reports, with a staggering 83 million reports filed between 2020 and 2022.

Out of these reports, 3.1 million originated from EU countries.

Tech Companies’ Response

Tech companies, including Surfshark, have expressed their concerns through an open letter urging EU ministers to reconsider a proposed anti-CSAM regulation.

Concerns about Proposed Regulation

The proposed regulation, while aiming to address the CSAM issue, raises concerns about privacy infringement: it could allow authorities to scan private communications for dangerous content.

Regional CSAM Statistics in Europe

Within Europe, Poland accounts for the largest share of suspected cases, 16% of the EU total (269 unreported harmful websites). Other countries with notable counts include France (260), Germany (158), Hungary (152), and Italy (110).

Global CSAM Concerns

Globally, Asia is a significant contributor to CSAM concerns, accounting for two-thirds of the 83 million reports. India tops the list with almost 16% of reports, followed by the Philippines, Pakistan, Indonesia, and Bangladesh.

Data Sources and Analysis

Researchers used open-source data from the National Center for Missing & Exploited Children (NCMEC) covering 2020 to 2022.

This data was compared with information from the Communications Regulatory Authority of Lithuania (RRT).
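
The study itself is not published as code, but the comparison described above is essentially a country-level join of two datasets. The Python sketch below only illustrates that idea under stated assumptions: the file names (ncmec_reports.csv, rrt_flagged_sites.csv) and column names are hypothetical placeholders, not the researchers' actual data layout.

```python
# Hypothetical sketch of the country-level comparison described above.
# File names and columns are illustrative assumptions, not the study's data.
import pandas as pd

# NCMEC-style export: CSAM reports per country, 2020-2022
ncmec = pd.read_csv("ncmec_reports.csv")        # columns: country, reports
# RRT-style export: suspected unreported websites per country
rrt = pd.read_csv("rrt_flagged_sites.csv")      # columns: country, flagged_sites

# Join the two sources on country and compute each country's share of flagged sites
merged = ncmec.merge(rrt, on="country", how="inner")
merged["share_of_flagged"] = merged["flagged_sites"] / merged["flagged_sites"].sum()

print(merged.sort_values("flagged_sites", ascending=False).head())
```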

Privacy-Preserving Experiment

Surfshark highlights a privacy-preserving experiment conducted by the RRT in 2022 in collaboration with proxy service provider Oxylabs.

Oxylabs developed an AI-powered tool to scrape the web, identify illegal content, and analyze image metadata for matches in police databases.

The tool successfully identified 19 local websites violating laws, leading to police reports and investigations.
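
The report does not detail how the tool works internally. One common building block for this kind of pipeline, sketched below in Python, is comparing cryptographic hashes of crawled images against a list of known hashes; treat it only as an illustration under that assumption, since the URL, file names, and hash-list format are hypothetical and not drawn from Oxylabs' implementation.

```python
# Illustrative sketch: hash-matching crawled images against a known-hash list.
# This is a generic technique, not a description of Oxylabs' actual tool;
# the URL, file name, and hash-list format below are hypothetical.
import hashlib

import requests


def load_known_hashes(path: str) -> set[str]:
    """Load a newline-separated list of known SHA-256 digests (stand-in for a police database)."""
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}


def image_matches_database(image_url: str, known_hashes: set[str]) -> bool:
    """Download an image and check whether its SHA-256 digest appears in the known set."""
    response = requests.get(image_url, timeout=10)
    response.raise_for_status()
    digest = hashlib.sha256(response.content).hexdigest()
    return digest in known_hashes


if __name__ == "__main__":
    hashes = load_known_hashes("known_hashes.txt")   # hypothetical hash list
    url = "https://example.com/image.jpg"            # hypothetical crawled URL
    if image_matches_database(url, hashes):
        print("Match found - flag for review and report to authorities")
```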

EU Parliament’s Request

In October of the previous year, the EU Parliament requested the removal of the Chat Control clause from the EU's CSAM scanning proposal.

This move emphasized privacy as a fundamental right.

Criticism of Chat Control Proposal

The Chat Control proposal, currently under debate, has faced criticism for potentially compromising citizens' security through client-side scanning of chats.

Such scanning is seen as an attack on encryption and privacy.

Surfshark’s Perspective

Surfshark spokesperson Lina Survila emphasizes that an individual’s right to privacy should be non-negotiable.

She argues that less invasive tools, such as web scraping, should be explored before resorting to more intrusive measures.

Oxilabs and Technological Alternatives

Denas Grybauskas, Head of Legal at Oxylabs, believes that intrusion into citizens’ privacy should be allowed only as a last resort.

He also sees a need for more detailed discussion of technological alternatives in EU regulations.

Oxylabs continues to work with the RRT to enhance its AI-powered web scraping tool and collaborates with organizations, students, and researchers to develop further software solutions for addressing online threats.

Tags: child sexual abuse material, EU child abuse websites, global CSAM reports, privacy concerns in CSAM scanning, tech companies vs. anti-CSAM regulation