New Safeguards Added to Government’s Online Content Blocking Rules
Oct. 23, 2025

Why in news?

The Ministry of Electronics and IT is amending its content blocking rules. Under the new rules, only senior officials at the Centre and state levels can issue content removal notices under Section 79(3)(b) of the Information Technology Act, 2000.

This aims to ensure greater accountability and prevent misuse of online blocking powers.

What’s in Today’s Article?

  • New Safeguards for Content Blocking
  • Background: X’s Legal Challenge
  • Section 79(3)(b) vs Section 69A

New Safeguards for Content Blocking

  • The Ministry of Electronics and IT (MeitY) is amending the Information Technology Rules, 2021 to ensure that only senior officials can issue online content blocking notices under Section 79(3)(b) of the IT Act, 2000.
  • This means that content removal requests sent to platforms like YouTube, Instagram, and X (formerly Twitter) will now be authorised only by:
    • A Joint Secretary (JS) or an officer of equivalent rank at the Centre or state level,
    • A Director-level officer where no JS-level post exists, and
    • In police departments, an officer of the rank of Deputy Inspector General (DIG) or above who has been specifically authorised.
  • Each order must specify:
    • The legal basis and statutory provision,
    • The nature of the unlawful act, and
    • The specific URL or digital location of the content to be removed.
  • Additionally, all such orders will undergo a monthly review by an officer not below the rank of Secretary, such as the IT Secretary (Centre) or Home/IT Secretaries (State).
  • The amendments will come into effect from November 15, and a Template Blocking Order will be provided to central and state agencies for uniformity in issuing lawful takedown notices.
  • Rule 3(1)(d): The Basis for Safeguards
    • The new rules focus on Rule 3(1)(d) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which allows officials to flag specific online content.
    • When a government notice is issued under this rule, social media platforms lose “safe harbour” protections—meaning they may be held legally responsible for user-generated content unless they justify or remove it.
    • Officials clarified that such notices are not direct takedown orders but warnings indicating that safe harbour protections no longer apply to the flagged content.
  • Reason for the Amendment
    • According to a senior official, in some states, junior police officers such as sub-inspectors and assistant sub-inspectors had been issuing blocking notices to social media companies.
    • The new amendment seeks to prevent misuse of power by restricting this authority to senior officers, thereby ensuring greater accountability and transparency.
  • Experts say the move will ensure that all blocking orders are reasoned, justified, and issued by senior officials.

Background: X’s Legal Challenge

  • Elon Musk’s X (formerly Twitter) had earlier challenged the government’s use of Rule 3(1)(d), calling it unconstitutional and arbitrary, claiming it enabled local police officers to issue “censorship” orders nationwide.
  • However, the Karnataka High Court recently upheld the government’s authority to empower officials under this rule.
  • Officials clarified that the new amendments are not a response to X’s case, though they do address some of its concerns by formally restricting who can issue such orders and mandating detailed justifications.

Section 79(3)(b) vs Section 69A

  • Section 79(3)(b): Allows the government to direct platforms to remove any unlawful content, failing which they can lose safe harbour protection (legal immunity for user-generated content).
  • Section 69A: Permits blocking content only if it affects sovereignty, integrity, defence, or security of India.
  • The new changes bring more clarity, accountability, and uniformity in applying Section 79(3)(b), aiming to balance content regulation with due process.
