OFCOM INVESTIGATES TELEGRAM OVER CHILD SAFETY CONCERNS
INDUSTRY DESK · 2 MIN READ
TUE, APR 21, 2026 · AI-SUMMARIZED FROM 2 SOURCES BELOW
UK communications regulator Ofcom has launched an investigation into Telegram following evidence that the messaging platform is being used to share child sexual abuse material (CSAM) and facilitate child grooming.
The regulator initiated the probe on Tuesday after receiving evidence of illegal activity on the messaging app.
The investigation centers on two primary concerns: the distribution of child sexual abuse material on Telegram's platform and reports that predators are using the service to groom minors. These issues have raised significant questions about the platform's safety measures and content moderation practices.
Telegram, founded in 2013, has grown to become one of the world's largest messaging platforms, with hundreds of millions of users. The service emphasizes encryption and privacy, features that have made it popular among users seeking secure communications. However, these same characteristics have drawn scrutiny from regulators and child protection advocates who argue the platform's privacy-first approach creates gaps in content moderation.
Ofcom's investigation comes amid growing pressure on tech platforms to strengthen protections against child exploitation. Under the Online Safety Act 2023, the regulator has powers to compel platforms to implement safety measures and to impose fines of up to £18 million or 10 percent of qualifying worldwide revenue, whichever is greater, for non-compliance.
This marks another regulatory action against messaging apps over child safety concerns. Similar investigations have been launched against other platforms in recent years as authorities worldwide intensify efforts to combat online child exploitation.
Telegram has previously stated its commitment to combating illegal content and has claimed to cooperate with law enforcement agencies. The platform operates a reporting mechanism allowing users to flag inappropriate material.
Ofcom's investigation will examine whether Telegram has adequate systems in place to identify and remove CSAM, prevent grooming behavior, and cooperate with child protection authorities. The regulator is expected to gather evidence and consult with stakeholders before determining any enforcement actions or remedies required.
The outcome of this investigation could have significant implications for how encrypted messaging platforms balance privacy with child safety obligations under UK law.
MORE FROM THE SECURITY DESK
AI company Clarifai has deleted 3 million profile photos it obtained from dating site OkCupid in 2014 for facial recognition training. The deletion follows an FTC settlement with Match Group, OkCupid's parent company, over privacy policy violations.
JUST NOW · AI Desk
Mozilla's Firefox 150 release fixes 271 security vulnerabilities identified through early access to Anthropic's Mythos Preview AI system. The browser maker used the tool to strengthen defenses ahead of broader AI-driven security challenges.
JUST NOW · AI Desk
A previously unknown data-wiping malware called Lotus was deployed in targeted attacks against Venezuelan energy and utility organizations last year. The discovery reveals a coordinated campaign against critical infrastructure.
2H AGO · AI Desk
Meta is deploying monitoring software on US employee computers to capture mouse movements, clicks, and keystrokes within work applications. The data will be used to train the company's artificial intelligence systems.
3H AGO · AI Desk