Flashback: the draft regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse, presented on 11 May 2022, aims to step up the fight against the dissemination of child pornography images and videos and the sexual solicitation of minors.
As part of this, service providers would be obliged to report online child sexual abuse committed on their platforms and alert the authorities. This obligation, known as “Chat Control,” amounts to introducing surveillance of electronic messaging. How does it work? By allowing access to people’s private messages in order to scan them and identify their authors. In other words, by breaching the encryption processes that are meant to ensure that no one, not even the operators of the messaging applications on the market, can read the contents of a message.
This systematic surveillance is being denounced by privacy advocates.
The failure of the Belgian presidency
On 20 June, an amended version of the draft was due to be examined by the Council of the European Union in its justice and home affairs configuration. The Belgian presidency cancelled the examination of the dossier at the last minute, noting that the qualified majority required to adopt the draft would not be reached.
The trade-off between fighting paedophile crime and guaranteeing the privacy and confidentiality of communications appeared to tip too far towards what the project’s detractors described as a “terrifying mass surveillance mechanism.” In order to be adopted, the regulation would need the support of at least 15 member states representing at least 65% of the EU’s population. But the opposition of Germany, Austria, Poland, the Netherlands, the Czech Republic and Luxembourg made it impossible for a majority in favour of the text to emerge.
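By way of illustration, here is a minimal sketch of that qualified-majority arithmetic in Python. The vote count and population share used in the example are hypothetical placeholders, not actual voting data.

```python
# Minimal sketch of the Council's qualified-majority test: a proposal passes
# if at least 55% of member states (15 of 27) are in favour AND those states
# represent at least 65% of the EU population.

def qualified_majority(states_in_favour: int,
                       population_share_in_favour: float,
                       total_states: int = 27) -> bool:
    """Return True if both qualified-majority thresholds are met."""
    enough_states = states_in_favour >= 0.55 * total_states      # >= 15 of 27
    enough_population = population_share_in_favour >= 0.65
    return enough_states and enough_population

# Hypothetical example: 20 states in favour, but the opposing bloc is populous
# enough that supporters only cover 60% of the EU population -> no majority.
print(qualified_majority(states_in_favour=20,
                         population_share_in_favour=0.60))       # False
```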
The principle of moderation on sending
Technically, the control system is based on the principle of moderation on sending. This involves examining the photos and links contained in messages before they are sent. The general terms of use of the services concerned stipulate that users may only share content if they consent to their messages being scanned.
The first version of the text, heavily amended by the European Parliament, provided for media to be analysed when they were loaded into an application rather than when they were sent: in other words, a principle of moderation on upload. The difference is largely rhetorical, given that the underlying principle of mass analysis of communications remains unchanged. For industry professionals, introducing such a mechanism means doing away with the end-to-end encryption used by most private messaging services, through the creation of what specialists call “backdoors.”
These backdoors could be exploited by hackers and intelligence services far more easily, in any case, than if they had to break the encryption keys used by the messaging services.
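To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of client-side scanning. The function names, the fingerprint list and the stand-in “encryption” are hypothetical and not drawn from any real messaging application; the point is simply that the comparison against the list has to run on the plaintext, before end-to-end encryption is applied.

```python
import hashlib

# Hypothetical fingerprint list of known illegal files, as it might be
# pushed to a device under a detection order (placeholder value only).
KNOWN_CONTENT_HASHES = {
    "3f79bb7b435b05321651daefd374cd21b4c44d97",
}

def matches_list(plaintext: bytes) -> bool:
    """Return True if the message content matches the fingerprint list."""
    return hashlib.sha1(plaintext).hexdigest() in KNOWN_CONTENT_HASHES

def send_message(plaintext: bytes, encrypt, report):
    # The scan happens BEFORE encryption: once the message is end-to-end
    # encrypted, neither the operator nor the authorities can read it.
    # This is why critics describe client-side scanning as a backdoor.
    if matches_list(plaintext):
        report(plaintext)           # flagged content is reported
    return encrypt(plaintext)       # only the ciphertext leaves the device

# Hypothetical usage with stand-in callables (not real encryption):
ciphertext = send_message(b"hello", encrypt=lambda m: m[::-1], report=print)
```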
Luxembourg against generalised surveillance
Luxembourg is in the camp of Chat Control’s opponents. “The effective fight against sexual abuse and exploitation of children is a priority for the government. In order to act more consistently, the government supports efforts to strengthen legislation at European Union level in terms of detecting, reporting, blocking and deleting child pornography on the internet,” said the justice minister (CSV) on 25 June, in response to a parliamentary question tabled by MPs (Piraten) and (Piraten, who has since switched to the LSAP).
But at the same time, the justice minister reaffirmed that “this government also insists on the importance of ensuring that citizens’ essential fundamental rights remain guaranteed, and that the proposed regulation must respect EU law, including the principle of proportionality.” “Just as the legal service of the Council of the EU recommends in its opinion on the legality of ‘chat control,’ the government is in favour of more targeted detection of child pornography and grooming material and of avoiding blanket surveillance.”
The improvements made by the Belgian presidency last spring (the categorisation of risks according to a defined methodology, the limitation of detection orders to visual content and internet addresses, and moderation on upload) did not tip Luxembourg into the camp of the project’s supporters. “These changes do not reduce the risk of general surveillance, and the proportionality criteria of the EU Charter of Fundamental Rights are not respected. The mechanism still does not rule out the possibility that the communications of all users of a service, even those who have no connection with the offence, could be analysed. For these reasons, the government still cannot accept the text as it stands.”
The emergence of censorship lists
Since then, the Hungarian presidency of the Council of the EU, which has made this project a political priority, has taken the dossier back up.
What has changed in the version that will be presented to the Council? Hungary has taken up the Belgian presidency’s compromise proposal, restricting the detection orders that the authorities can issue against a service deemed risky to content that has already been identified, to the exclusion of new content.
Sven Clement is not convinced by these “censorship lists”: lists of illegal content, identified by file fingerprints, that the police would transmit to operators, who would then have to track the content down. “A minor change that does not rule out the risk of widespread surveillance of the population,” says Clement, a member of the Luxembourg Pirate Party who is spearheading the fight against the project at European level. In his view, these censorship lists are technically ineffective because they are very easy to circumvent: “all you have to do is change a pixel in a file so that it is no longer recognised by the list, a manipulation that is easy for anyone to do.” “The priority should be to tackle the production of child pornography content,” he continues.
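His objection is easy to illustrate. Assuming, as a simplification, that such a list is built from exact cryptographic fingerprints of files, the hypothetical Python sketch below shows how altering a single byte produces an entirely different fingerprint that the list no longer matches. (Detection systems can also use perceptual hashes that tolerate small changes; the sketch only covers the exact-match case Clement describes.)

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact cryptographic fingerprint (SHA-256) of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for an image file's raw bytes.
original = bytes(range(256)) * 16

# "Change a pixel": flip one bit of a single byte somewhere in the file.
modified = bytearray(original)
modified[1000] ^= 0x01

blocklist = {fingerprint(original)}               # the "censorship list"
print(fingerprint(original) in blocklist)         # True  -> original is flagged
print(fingerprint(bytes(modified)) in blocklist)  # False -> altered copy slips through
```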
This is not enough to change the mind of the Luxembourg government, which believes that “the current proposal does not provide for targeted detection mechanisms that would help to combat online sexual abuse and exploitation of children effectively, but poses a risk of generalised surveillance of users of the communication services concerned. The government therefore notes that its concerns regarding the protection of fundamental rights, and in particular data protection, have still not been allayed at this stage. Luxembourg therefore maintains the position it has consistently expressed, namely that the draft European regulation must respect the principle of proportionality, as recalled in the opinion of the legal service of the Council of the European Union.”
“In view of these concerns, Luxembourg is not at this stage in a position to support the text as currently proposed, but will continue to support efforts to find a solution at European level,” the justice ministry states.
This article was originally published in .