The law expired on 3 April 2026 after the European Parliament decided not to vote on extending it, amid privacy concerns from some lawmakers. The lapse has created uncertainty for big tech companies: while scanning their platforms for harms is now illegal, they remain liable under the Digital Services Act to remove any illegal content they host.
The law was a carve-out of the EU's ePrivacy Directive, put in place in 2021 as a temporary measure allowing companies to use automated detection technologies to scan messages for harms, including child sexual abuse material (CSAM), grooming and sextortion. The voluntary exemption had already been extended once, in 2024, but the temporary rules have now lapsed.
Parliament's actions appear contradictory: while it blocked that extension, it has also endorsed a temporary extension of the current derogation of the ePrivacy Directive – due to expire on 3 April 2026 – so that an agreement on the long-term legal framework to prevent and combat child sexual abuse online can be reached. With 458 votes in favour, 103 against and 63 abstentions, MEPs backed extending the exemption to privacy legislation allowing the voluntary detection of child sexual abuse material online until 3 August 2027.
In response to the legal gap, Google, Meta, Snap and Microsoft said in a joint statement posted on a Google blog that they would continue to voluntarily scan their platforms for CSAM. Privacy concerns have shaped Parliament's position on scanning technologies: while MEPs support extending the derogation, they say the voluntary measures must remain proportionate and targeted, and should not apply to end-to-end encrypted communications.
Scanning traffic data alongside content data should not be allowed either, they argue. According to MEPs, the technology used for voluntary detection should only apply to material that has already been identified as CSAM, or flagged as potential CSAM by a user, a trusted flagger or an organisation, and measures should target users or specific groups of users reasonably suspected of being connected to CSAM.
Child protection advocates had warned that allowing the legislation to lapse would probably trigger a steep fall in reports of child sexual abuse. A similar legal gap occurred in 2021, when reports of such material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) fell by 58% over a period of 18 weeks. NCMEC receives reports from around the world concerning some 8 million images, videos and other files suspected of being related to child abuse.
About 90% of these reports relate to countries outside the US. Negotiations for a permanent framework continue despite the current impasse. The European Parliament said in a statement that it was prioritising its work on legislation to prevent and combat child sexual abuse online, and that negotiations on a permanent legal framework were ongoing, though the body has offered no timeline for agreement or implementation.
Parliament has been ready to negotiate the permanent framework since November 2023, and talks on the permanent law have been under way since the Council adopted its position in November 2025. Parliament is now also ready to negotiate with the Council on extending the exemption, though no specific deadline has been set for concluding either negotiation.
A spokesperson for the European Parliament declined to comment on whether the legislative body had conducted any assessments of the consequences of the law's lapse. EU institutions have not quantified how the legal gap will affect the number of child sexual abuse reports and victim identifications in the EU. The political controversy surrounding what critics call 'chat control' has complicated the legislative process.
The disagreement over 'chat control' complicates the fight against child abuse online. Critics argue that it is a surveillance law that allows the EU to monitor citizens' communications. Supporters counter that it is fundamentally about protecting children from abuse by preventing the spread of abuse material. "The police agree with us that the legislation is needed," says Susanna Pettersson, child rights lawyer at Ecpat.
The EU's decision to prohibit scanning will have ripple effects in other regions around the world, child safety experts said. How the rules will be implemented technically, and how cross-border cooperation will work, remain open questions.
