Published on 15th April 2025
Authored By: Nidhi Chakrapani
Lords Universal College of Law
Abstract
Social media has transformed communication, providing a forum for free expression while also posing challenges such as disinformation, hate speech, and privacy concerns. In India, social media regulation is anchored in Article 19(1)(a) of the Constitution, which guarantees freedom of speech and expression subject to reasonable restrictions under Article 19(2). The key statutes governing online content and platform liability are the Information Technology Act, 2000, the Intermediary Guidelines and Digital Media Ethics Code Rules, 2021, and the Digital Personal Data Protection Act, 2023.
This article examines the evolving legal landscape of social media regulation in India, focusing on significant decisions such as Shreya Singhal v. Union of India (2015), which struck down Section 66A of the IT Act for violating the right to free expression. It discusses the role of government control under Sections 69A (content blocking) and 79 (intermediary liability), as well as recent regulatory developments such as SEBI’s request for expanded powers and legal action against objectionable online content.
A comparative analysis of foreign approaches, including the GDPR (EU), Section 230 (US), and China’s strict censorship model, illustrates the difficulty of balancing free expression with accountability. The paper argues that a middle-ground strategy centered on transparent content moderation, user awareness, and an independent oversight body is needed to build a responsible digital environment in India.
Introduction
Social media has evolved into a powerful communication tool, allowing people to rapidly express their views, access news, and participate in public discourse. Platforms such as Facebook, Twitter (now X), Instagram, and WhatsApp have democratized expression by enabling people to communicate across borders. However, this freedom presents considerable challenges, including disinformation, hate speech, defamation, cyber harassment, fake news, and threats to national security. The unregulated dissemination of harmful content has spurred governments around the world, including India, to enact regulatory frameworks that promote responsible digital conduct while protecting freedom of speech.
In India, the right to free speech and expression, guaranteed by Article 19(1)(a) of the Constitution, is essential to democratic discourse. However, Article 19(2) permits reasonable restrictions in the interests of public order, decency, morality, defamation, state security, and the sovereignty and integrity of India. Social media platforms, which act as intermediaries, play an important role in disseminating content, but they also face scrutiny over their liability in cases of disinformation, illegal content, and data breaches.
To address these issues, India has implemented a number of regulations, including the Information Technology Act, 2000, which governs online conduct, and the Intermediary Guidelines and Digital Media Ethics Code Rules, 2021, which set compliance standards for platforms. Additionally, the Digital Personal Data Protection Act, 2023 seeks to strengthen user privacy and govern data processing. Judicial decisions, notably in landmark cases such as Shreya Singhal v. Union of India (2015), have defined the scope of online speech and intermediary liability.
As digital platforms expand, new concerns are emerging, including AI-generated disinformation, deepfakes, financial fraud, and social media’s influence on elections. The government must strike a delicate balance between regulatory control and the fundamental right to free expression, ensuring that laws prevent harm without enabling censorship or stifling innovation. This article examines India’s legal framework for social media regulation, analyzing key legislation, case law, recent regulatory developments, and global best practices to recommend a balanced approach to online freedom and responsibility.
Constitutional Framework: Freedom of Speech and Its Limitations
The Indian Constitution guarantees the right to free speech and expression under Article 19(1)(a). However, this freedom is not absolute and is subject to reasonable restrictions under Article 19(2), which cover the sovereignty and integrity of India, the security of the State, public order, decency or morality, defamation, and incitement to an offence. These grounds form the constitutional basis for laws regulating content on social media platforms.
Statutory Provisions Governing Social Media
Information Technology Act, 2000[1]
The Information Technology Act, 2000 (IT Act) is the primary legislation governing electronic communication in India. Several provisions of the Act are particularly relevant to social media regulation:
- Section 66A criminalized sending offensive messages through a communication service. The Supreme Court struck it down as unconstitutional in Shreya Singhal v. Union of India, AIR 2015 SC 1523, holding that it was vague and overbroad and infringed the right to free expression.
- Section 69A empowers the government to block public access to information on computer resources in the interests of sovereignty, security, or public order. The Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 lay down the procedural safeguards for exercising this power.
- Section 79 protects “intermediaries” (such as social media platforms) from liability for third-party content if they observe due diligence and do not participate in the unlawful act. This “safe harbor” provision was clarified in the Shreya Singhal case, which held that intermediaries must take down specific content upon a court order or government notification in order to retain that protection.
Indian Penal Code, 1860[2]
Certain sections of the Indian Penal Code (IPC) also apply to offenses committed through social media:
- Section 499 defines defamation as any spoken, written, or visual representation that harms a person’s reputation.
- Section 500 outlines penalties for defamation, including imprisonment for up to two years, fines, or both.
- Section 505 penalizes those who spread rumors or statements that incite offenses against the State or disturb public tranquility.
Landmark Judicial Pronouncements
Shreya Singhal v. Union of India, 2015[3]
In this landmark 2015 judgment, the Supreme Court declared Section 66A of the IT Act unconstitutional owing to its vague and overbroad wording, which invited arbitrary enforcement and the suppression of free expression. The Court stressed that laws limiting fundamental rights must be clear and narrowly drawn. The ruling also read down the scope of intermediary obligations under Section 79, holding that intermediaries are required to remove content only upon receiving a court order or a notification from an authorized government agency.
Mouthshut.com v. Union of India[4]
In Mouthshut.com v. Union of India, the petitioner, Mouthshut.com, a consumer review website, challenged the constitutionality of certain provisions of the Information Technology Act, 2000, specifically those relating to intermediary liability under Section 79 and the Intermediary Guidelines of 2011. The petitioner argued that these provisions imposed onerous obligations on intermediaries, compelling them to remove content without judicial scrutiny, thereby chilling free speech and unfairly burdening digital platforms. The case gained relevance alongside Shreya Singhal v. Union of India (2015), in which the Supreme Court of India read down Section 79(3)(b) of the IT Act and the corresponding provision of the 2011 Intermediary Guidelines, emphasizing that intermediaries would be obligated to remove content only upon receiving a court order or a government notification. This outcome gave internet platforms greater protection, ensuring that they would not be held liable for user-generated content unless expressly required by law, thereby strengthening free speech rights while preserving regulatory oversight.
Regulatory Framework: Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021[5]
In February 2021, the Indian government notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, with the goal of creating a comprehensive framework for regulating digital content and social media platforms. The key features include:
- Due Diligence Requirements: Significant social media intermediaries must appoint a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer based in India to ensure compliance with local laws and the timely resolution of user grievances.
- Content Removal Timelines: Platforms must remove or disable access to unlawful content within 36 hours of receiving a court order or a notification from an authorized government agency.
- Traceability Mandate: Significant social media intermediaries providing messaging services must be able to identify the first originator of information for the prevention, detection, investigation, and prosecution of offenses relating to sovereignty, security, public order, or sexual violence.
These requirements have sparked debate over user privacy, potential governmental overreach, and the practical burdens placed on social media companies.
Recent Developments and Challenges
SEBI’s Request for Enhanced Powers
In February 2025, the Securities and Exchange Board of India (SEBI) sought expanded powers to access social media records and to have unauthorized financial advice removed from platforms such as WhatsApp and Telegram. This move highlights the regulatory challenges posed by the circulation of unverified financial information and the need for robust investor protection mechanisms. SEBI’s proposal underlines the evolving nature of social media regulation, particularly in financial markets.
Content Moderation and Free Speech
The case against podcaster Ranveer Allahbadia in February 2025 for allegedly making obscene remarks on an online show highlights the tension between content moderation and free expression. The Supreme Court’s direction that he stop airing his shows pending investigation raises questions about the limits of permissible speech and the role of judicial oversight in content regulation.
Data Privacy and Intermediary Liability
The Competition Commission of India’s (CCI) inquiry into the allegedly monopolistic practices of major social media platforms, particularly with respect to data privacy and algorithmic transparency, adds another layer of regulatory complexity. Concerns have been raised about the use of user data for targeted advertising and the lack of accountability in content moderation practices. The enactment of the Digital Personal Data Protection Act, 2023 is a step toward strengthening user rights and imposing stricter data protection obligations.
The Digital Personal Data Protection Act, 2023[6]
The Digital Personal Data Protection Act, 2023 (DPDP Act) seeks to provide a comprehensive framework for data privacy in India by imposing obligations on entities that process personal data and giving users greater control over their information. The key provisions include:
- Consent-Based Data Processing: Platforms must obtain user consent before collecting and processing personal data.
- Right to Data Erasure: Users may request the deletion of their personal data once it is no longer needed for the purpose for which it was collected.
- Penalties for Non-Compliance: Companies that violate data protection regulations risk severe fines.
This legislation directly affects social media platforms, which rely heavily on user data for personalized content recommendations and targeted advertising.
Comparative Analysis: Global Social Media Regulations
India’s legislative framework reflects worldwide trends in social media regulation. A comparative overview highlights the main approaches:
United States
- First Amendment Protections: The US prioritizes First Amendment protections, limiting government intervention in content moderation on social media.
- Section 230 of the Communications Decency Act: Shields online platforms from liability for third-party content, with exceptions for intellectual property violations and federal criminal offenses.
European Union
- The General Data Protection Regulation (GDPR): Imposes strict data privacy obligations on platforms operating in the EU.
- The Digital Services Act (DSA) and Digital Markets Act (DMA): Mandate transparency in content moderation and protect users against harmful content and unfair platform practices.
China
- Strict Government Oversight: Social media platforms in China operate under strict state control, including real-time content monitoring and censorship.
India’s approach seeks to strike a balance between enabling free expression and preventing harm from disinformation, hate speech, and unlawful content.
The Way Forward: Striking a Balance
Striking a balance between freedom of expression and reasonable regulation is difficult. The following measures could help:
- Clearer Guidelines for Content Moderation: Establishing objective, transparent criteria to prevent arbitrary censorship.
- Stronger Grievance Redressal Mechanisms: Strengthening user rights by establishing an independent oversight body to resolve content-related disputes.
- Public Awareness Initiatives: Educating users on responsible social media use and the impact of disinformation.
Conclusion
India’s social media legislation is evolving to address the complexities of digital speech, disinformation, privacy concerns, and the growing influence of online platforms on public discourse. While legal instruments such as the Information Technology Act, 2000 (IT Act), the Intermediary Guidelines and Digital Media Ethics Code Rules, 2021, and the Digital Personal Data Protection Act, 2023 (DPDP Act) provide a framework for content regulation, intermediary liability, and data protection, their implementation and enforcement are critical to their effectiveness. Judicial interpretations, particularly in landmark cases such as Shreya Singhal v. Union of India (2015), have been pivotal in defining the scope of free expression, establishing intermediary responsibility, and preventing arbitrary governmental interference. However, with rapid technological advances and the rise of AI-driven disinformation, deepfake technology, and online financial fraud, existing regulations will need continual revision to keep pace with emerging digital threats.
A balanced and well-defined regulatory strategy is required, one that protects constitutional rights while ensuring accountability and responsible digital conduct. Overregulation or excessive state control may infringe fundamental rights and result in censorship, while weak enforcement may allow digital platforms to be exploited for unlawful purposes. Moving forward, collaboration among the government, the courts, social media companies, and civil society groups is needed to develop clear, fair, and effective rules. Strengthening grievance redressal processes, combining AI-driven content filtering with human oversight, and boosting digital literacy among users will be critical to creating a safer and more accountable digital environment in India.
References
- Shreya Singhal v. Union of India, 24 March 2015. Available at: https://indiankanoon.org/doc/110813550/ (Accessed: 19 February 2025).
- Mouthshut.com v. Union of India, Supreme Court. Available at: https://www.mouthshut.com/pdf/main_pitition.pdf (Accessed: 19 February 2025).
- Information Technology Act, 2000. Available at: https://www.indiacode.nic.in/bitstream/123456789/13116/1/it_act_2000_updated.pdf (Accessed: 19 February 2025).
- The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, PRS Legislative Research. Available at: https://prsindia.org/billtrack/the-information-technology-intermediary-guidelines-anddigital-media-ethics-code-rules-2021 (Accessed: 19 February 2025).
- The Digital Personal Data Protection Bill, 2023, PRS Legislative Research. Available at: https://prsindia.org/billtrack/digital-personal-data-protection-bill-2023 (Accessed: 19 February 2025).
- India regulator seeks greater access to social media records, source and memo say | Reuters. Available at: https://www.reuters.com/world/india/indias-sebi-seeks-greateraccess-social-media-records-say-source-memo-2025-02-13/ (Accessed: 19 February 2025).
- India’s top court tells podcaster charged with obscenity to stop shows for now | Reuters. Available at: https://www.reuters.com/world/india/indias-top-court-tellspodcaster-charged-with-obscenity-stop-shows-now-2025-02-18/ (Accessed: 19 February 2025).