Explained: Draft IT Rules 2026 and what could change for social media users

Southcheck Network

Hyderabad: Imagine you post a photo, video, comment or news update on Instagram, YouTube, X or Facebook, or forward something on WhatsApp. If that content breaks the law, who is responsible: you, or the platform hosting it?

What is the role of online platforms?

In India, online platforms are legally classified as ‘intermediaries.’ They generally enjoy what is known as safe harbour protection, meaning they are not held liable for user-generated content, as long as they follow certain due diligence requirements.

These requirements are defined under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (commonly called the 2021 IT Rules).

Now, the government has proposed changes.

Govt wants public input on new IT rules 

On March 30, the Ministry of Electronics and Information Technology (MeitY) released draft amendments to these rules, titled ‘Draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021’. 

The public has been invited to submit comments by April 14, after which the government will finalise the rules.

What does the government want to do?

What exactly is changing, and why does it matter? To understand that, it helps to first look at who the ‘intermediaries’ are and why the rules exist.

Intermediaries are platforms that allow users to create, upload and share content. This includes social media platforms, video-sharing websites and messaging services. The platforms themselves do not create content — users do.

The 2021 IT Rules were designed to balance three key objectives:

- Protecting free speech and user rights

- Holding platforms accountable for harmful or illegal content

- Providing a clear compliance framework so platforms retain safe harbour protection

The proposed amendments largely aim to clarify and expand these existing rules.

Key proposed changes in the draft amendments

The draft introduces targeted changes, mainly in Part II (intermediary obligations) and Part III (digital media ethics code). 

Here are the most important ones:

1. Data retention obligations clarified

Under existing rules, platforms must retain certain user data (such as registration details or logs) for a specified period in certain cases.

The draft adds language to clarify that these obligations apply in addition to requirements under other laws.

What this means:

If other laws, such as tax, cybersecurity or law enforcement regulations, require longer data retention, platforms must comply with those as well. This removes ambiguity when multiple laws apply.

2. Mandatory compliance with government directions

A new provision requires intermediaries to comply with any written instructions issued by MeitY. These may include advisories, orders, guidelines, standard operating procedures (SOPs) or codes of practice.

Compliance with such directions will now be considered part of the ‘due diligence’ required to retain safe harbour protection.

What this means:

Platforms will need to follow government-issued instructions more closely to avoid legal liability.

3. Expansion of rules to user-generated news content

The draft expands the scope of Part III (which deals with news and current affairs content).

Earlier, these rules mainly applied to publishers such as digital news platforms. The amendment extends oversight to intermediaries, and even to users who are not publishers but share news-related content.

What this means:

Posts by ordinary users about current affairs or politics could potentially come under the same regulatory framework as professional news content.

4. Broader powers for the oversight committee

The Inter-Departmental Committee (IDC), which currently examines complaints, will see its role expanded.

The draft replaces the term ‘complaints’ with ‘matters’ and allows the committee to review issues referred by the Ministry of Information and Broadcasting, not just individual grievances.

What this means:

The committee can take up broader issues beyond specific complaints and recommend actions such as content modification or removal.

5. New rules for AI-generated and deepfake content

One of the most significant additions is a framework for what the draft calls Synthetically Generated Information (SGI).

This includes audio, video or images that are artificially created or altered using technology in a way that makes them appear real.

The draft excludes routine edits like colour correction or noise reduction. It generally does not cover text unless it creates false documents.

Under the proposed rules, platforms must:

- Use reasonable technical measures (including automated tools) to detect and prevent harmful AI-generated content

- Clearly label permissible synthetic content through visible or audio disclosures

- Embed metadata or technical markers to identify such content

- Prevent tampering with these labels or identifiers

For large platforms (Significant Social Media Intermediaries), additional requirements include user declarations and verification mechanisms before publishing such content.

The draft also clarifies that using automated tools to remove such content will not affect safe harbour protection.

What this means:

Platforms will be required to actively detect, label and manage deepfakes and AI-generated media, especially content that could mislead users or cause harm.

Why is the government proposing these changes?

According to MeitY, the amendments are largely ‘clarificatory and procedural.’

The stated objectives include:

- Improving legal clarity

- Strengthening enforcement mechanisms

- Updating the framework to address emerging challenges like AI-generated content

- Expanding oversight of online harms

What happens next?

These are draft rules; they are not in force yet.

The government has invited feedback from individuals, companies and organisations. After reviewing public comments, it may revise the draft before notifying the final rules in the official gazette.

If implemented, the changes could lead to:

- Greater compliance obligations for platforms

- Increased oversight of user-generated content, especially news-related posts

- Stricter handling of AI-generated and deepfake content

The bigger picture

The proposed amendments reflect the government’s attempt to update digital regulations in response to a rapidly evolving online environment, from social media influence to AI-generated misinformation.

At the core, the debate remains the same: how to balance user freedom, platform responsibility and government oversight.

The final version of these rules will play a key role in shaping how Indians create, share and consume content online in the coming years.
