
2025 Guide: How Does Telegram Perform Internal Content Moderation?

Aston · Maintainers of All In One TG · 4 min read

In March 2024, Australia's online safety regulator, eSafety, launched an investigation into Telegram's content moderation practices, and Telegram disclosed to the regulator how it detects and handles illegal content. Below is a summary of the key points to help you quickly understand how Telegram performs internal moderation.

1. Private Chats: Only Reviewed Upon User Reports

Telegram is well known for its focus on privacy. In private one-on-one chats, Telegram does not proactively scan content or look for illegal material. Only when a chat participant reports the conversation will Telegram initiate further checks—either via automated tools or human reviewers. Otherwise, private chats remain completely inaccessible to moderation systems.
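To make that flow concrete, here is a minimal sketch of a report-triggered review queue. The names (Report, ModerationQueue) are hypothetical and nothing here is Telegram's actual implementation; the sketch only illustrates the idea that a user report is the sole entry point into review.

```python
from dataclasses import dataclass, field
from queue import Queue


@dataclass
class Report:
    """A user report filed from inside a private chat (hypothetical structure)."""
    chat_id: int
    reporter_id: int
    reported_message_ids: list[int]


@dataclass
class ModerationQueue:
    """Private chats enter review only when a participant files a report."""
    pending: Queue = field(default_factory=Queue)

    def submit_report(self, report: Report) -> None:
        # Nothing is scanned proactively; a report is the only entry point.
        self.pending.put(report)

    def next_case(self) -> Report | None:
        # Automated tools or human reviewers pull cases from this queue.
        return None if self.pending.empty() else self.pending.get()


queue = ModerationQueue()
queue.submit_report(Report(chat_id=42, reporter_id=7, reported_message_ids=[101, 102]))
print(queue.next_case())  # the reported conversation is now eligible for review
```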

2. Encrypted Chats: Even Harder to Moderate

Telegram’s "Secret Chats" offer even stricter protection. Even if users report a conversation, Telegram does not pass the messages to moderators. Instead, Telegram uses a system of "alternative signals"—undisclosed behavioral indicators—to detect anomalies without exposing message content.
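Because the message bodies in Secret Chats never reach moderators, any checks would have to work on metadata alone. The sketch below is purely illustrative, since Telegram has not disclosed which signals it uses; the ChatMetadata fields and weights are assumptions that only show the idea of scoring behavioral indicators without touching message content.

```python
from dataclasses import dataclass


@dataclass
class ChatMetadata:
    """Hypothetical behavioral indicators; message content is never available."""
    reports_last_30d: int          # how often this chat has been reported recently
    prior_reports_on_account: int  # how often the reported account was flagged before
    account_age_days: int          # very new accounts are treated as riskier


def anomaly_score(meta: ChatMetadata) -> float:
    """Combine content-free signals into a 0..1 score (weights are made up)."""
    score = 0.0
    score += min(meta.reports_last_30d, 10) * 0.05
    score += min(meta.prior_reports_on_account, 5) * 0.08
    if meta.account_age_days < 7:
        score += 0.2
    return min(score, 1.0)


print(anomaly_score(ChatMetadata(reports_last_30d=4, prior_reports_on_account=2, account_age_days=3)))
```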

3. Groups and Channels: Public Visibility Determines Review Level

Groups and channels are classified as public or private, and public groups and channels are subject to stricter moderation. Content such as depictions of violence, extremist material, or illegal pornography is prohibited.

Even private channels may be subject to review if their invite links are publicly shared, making them "publicly accessible" in practice.

Telegram also stated it automatically scans images and videos uploaded to groups or channels for known illegal content such as terrorist propaganda or child sexual abuse material.
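Scanning uploads for already-known material is usually a matter of comparing file fingerprints against a database of hashes rather than inspecting the media itself. The snippet below sketches that idea with plain SHA-256 digests; production systems typically use perceptual hashes that survive re-encoding, and none of the names or values here come from Telegram.

```python
import hashlib

# Hypothetical fingerprints of previously confirmed illegal media (demo value:
# the SHA-256 of an empty file, so the example below matches).
KNOWN_BAD_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def is_known_illegal(upload: bytes) -> bool:
    # A match against the known-content database can trigger automatic removal
    # in public groups and channels.
    return fingerprint(upload) in KNOWN_BAD_HASHES


print(is_known_illegal(b""))       # True: the empty file's digest is in the demo set
print(is_known_illegal(b"photo"))  # False
```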

4. Automated Detection: AI Combined with Human Review

Telegram operates a detection system that analyzes images, videos, and texts for potentially illegal content. This system does not scan all messages arbitrarily; it focuses on content that has been reported or flagged as high risk.

If the system detects possible violations—like extremist content—it uses AI to decide whether to automatically delete the content or escalate it to human moderators. About 65% of suspected extremist content is manually verified by reviewers.
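A common way to combine automated and human review is to route by classifier confidence: clear-cut cases are removed automatically, uncertain ones go to a person. The thresholds and function below are illustrative assumptions, not figures Telegram has published; only the rough 65% escalation rate comes from the source.

```python
AUTO_REMOVE_THRESHOLD = 0.95  # assumed value: very confident -> delete automatically
ESCALATE_THRESHOLD = 0.50     # assumed value: uncertain -> send to a human reviewer


def route(classifier_confidence: float) -> str:
    """Decide what happens to content the classifier flags as potentially extremist."""
    if classifier_confidence >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"
    if classifier_confidence >= ESCALATE_THRESHOLD:
        return "human_review"  # reportedly ~65% of suspected extremist content is checked by people
    return "no_action"


for confidence in (0.99, 0.70, 0.30):
    print(confidence, "->", route(confidence))
```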

5. How Telegram Handles Known vs Suspected Illegal Content

  • Known illegal content (already identified by Europol or by Telegram itself) is scanned for automatically across all public groups and channels, but not in private chats.
  • Suspected illegal content is reviewed only when reported; this covers texts, images, videos, profile photos, usernames, and group/channel descriptions.
  • All removed illegal content is added to Telegram's internal blacklist database so it can be detected automatically in the future.
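The last point, feeding removed content back into an internal blacklist, closes the loop between manual decisions and automated scanning. Here is a minimal sketch of that feedback loop, with an in-memory set standing in for whatever storage Telegram actually uses.

```python
import hashlib

internal_blacklist: set[str] = set()  # stand-in for Telegram's internal database


def record_removal(content: bytes) -> None:
    """After a reviewer confirms and removes content, remember its fingerprint."""
    internal_blacklist.add(hashlib.sha256(content).hexdigest())


def seen_before(content: bytes) -> bool:
    """Future uploads to public groups/channels can then be matched automatically."""
    return hashlib.sha256(content).hexdigest() in internal_blacklist


record_removal(b"confirmed illegal upload")
print(seen_before(b"confirmed illegal upload"))  # True
print(seen_before(b"unrelated upload"))          # False
```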

6. Technology Strategy: Focus on AI, Not URL Blacklists

Telegram does not rely on traditional URL blacklists. Instead, it prioritizes machine learning-based recognition.

Its detection system includes:

  1. Internal database: content manually flagged as illegal by moderators.
  2. Europol's database: integrated regularly to pull in updated threat data.
  3. AI behavior modeling: signals such as how often admins ban users, for how long, and for what reasons, which help prioritize moderation actions.

Telegram also matches patterns to detect whether new groups resemble previously banned ones, allowing preemptive action.
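One plausible way to flag a new group that resembles previously banned ones is to compare simple features, such as the words in its title and description, against those of banned groups. The sketch below uses Jaccard similarity over word sets; it illustrates the general idea rather than Telegram's actual method, and the example texts are invented.

```python
def word_set(text: str) -> set[str]:
    return set(text.lower().split())


def jaccard(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0


# Invented descriptions of groups that were already banned.
banned_group_texts = [
    "official channel for banned propaganda reuploads",
    "buy illegal content here fast delivery",
]


def resembles_banned(new_group_text: str, threshold: float = 0.5) -> bool:
    """Flag a new group for preemptive review if it looks like a banned one."""
    new_words = word_set(new_group_text)
    return any(jaccard(new_words, word_set(t)) >= threshold for t in banned_group_texts)


print(resembles_banned("official channel for banned propaganda reuploads v2"))  # True
print(resembles_banned("weekly book club announcements"))                       # False
```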

7. Consequences and Protection for Users

Telegram analyzes how users find and join illegal content. If certain keywords are commonly used to access such groups or channels, they may be removed from Telegram's public search results.
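In practice this can be as simple as keeping a deny-list of query terms that are known to funnel users toward banned groups and filtering them out of public search. A minimal, hypothetical sketch:

```python
# Hypothetical query terms observed to funnel users toward banned groups or channels.
SUPPRESSED_SEARCH_TERMS = {"banned-propaganda", "illegal-market"}


def public_search(query: str, index: dict[str, list[str]]) -> list[str]:
    """Return public search results, skipping queries on suppressed terms."""
    if query.lower() in SUPPRESSED_SEARCH_TERMS:
        return []  # the term no longer surfaces anything in public search
    return index.get(query.lower(), [])


demo_index = {"news": ["@daily_news"], "illegal-market": ["@some_banned_group"]}
print(public_search("news", demo_index))            # ['@daily_news']
print(public_search("illegal-market", demo_index))  # []
```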

  • Minor violations: If admins did not post the illegal content themselves, Telegram may issue a temporary ban to allow cleanup.
  • Severe violations: Groups that repeatedly spread extremism or illegal pornography will be permanently removed, along with the admins’ accounts.
  • Subscriber protection: Regular members who do not distribute content or help manage the group will not be penalized. This aims to protect journalists, law enforcement, and researchers who monitor such content as part of their work.
  • Child exploitation content: Telegram enforces a zero-tolerance policy. Any associated accounts, channels, or groups will be permanently banned.
