Telegram has taken a major step toward cleaning up its platform, removing more than 15 million illegal groups and channels with the help of advanced AI-powered moderation tools. These groups and channels were involved in fraud, child abuse, and other illegal activities.
The company has faced mounting pressure to tackle harmful content, particularly since its CEO, Pavel Durov, was arrested in France on charges related to illegal material shared on the platform. Despite the ongoing legal case, Telegram has focused on strengthening its moderation efforts.
Using these AI tools, Telegram has significantly increased its ability to detect and remove harmful content. In 2024 alone, over 15.4 million groups and channels were banned, including 703,809 groups linked to child abuse. Telegram also works with organizations such as the Internet Watch Foundation and the National Center for Missing and Exploited Children to combat this kind of content.
Telegram has also been recognized for its efforts to block terrorist content. Working with Europol and other agencies, the platform has removed 100 million pieces of such content since 2016, including 129,099 this year.
Telegram’s moderation team continues to remove harmful material while keeping users informed through regular updates. Despite the challenges, the platform is committed to creating a safer space for its users.
Source: ReadWrite