The European Commission on Tuesday urged EU member states and lawmakers to “dramatically speed up” work on new rules to tackle child sexual abuse material online, after an old set expired unreplaced.

A legal derogation allowing online platforms and messaging services to voluntarily detect and report abusive images lapsed on April 3, as governments and the European Parliament squabbled over an overhaul of the system.

“The co-legislators must now dramatically speed up their work” on finding a long-term solution, said Guillaume Mercier, a spokesman for the commission, the European Union’s top executive body.

“We will support them in the negotiations to proceed as quickly as possible to reduce any legal gap,” he told a press conference in Brussels.

Google, Microsoft, Meta and Snapchat said in a letter last week that they would continue to “take voluntary action” and scan messages when necessary.

But they complained that the lapsing of the derogation to privacy rules, which had granted them the ability to do so, clouded “the legal certainty that has helped responsible platforms try to protect our communities”.

“We are disappointed by this irresponsible failure to reach an agreement to maintain established efforts to protect children online,” the tech giants wrote.

On Tuesday, the commission declined to say whether the companies could be in breach of privacy rules for continuing to scan messages with no clear legal grounding.

“The protection of our children should not be subject to autonomous business decisions by companies, but rather be based on clear and binding rules,” Mercier said.

But Brussels is “fully committed to ensuring that detection continues to be possible”, he added.
The 27-nation EU has for years been debating strengthening the regulations, under which online platforms and messaging services currently detect and report abusive images on a voluntary basis. The commission proposed in 2022 to make that detection compulsory and also to require the reporting of attempts by predators to contact minors.

Though supported by several child protection groups, the plans, nicknamed “Chat Control”, sparked fierce debate, with critics including the EU’s own data protection authorities saying they could pose a “disproportionate” threat to privacy.

Talks between lawmakers and member states to find a compromise failed last month, and a last-ditch effort for parliament to approve a temporary extension of the existing system also fell short.

Published – April 08, 2026 10:00 am IST