The Supreme Court (SC) has recently weighed in on the issue of social media influencers making derogatory remarks, emphasizing the need to strike a balance between free speech and the dignity of vulnerable groups. The Court noted the growing trend of commercializing free speech on platforms and the potential harm it can cause, especially to marginalized communities.
Commercialisation & Accountability:
Social media influencers often monetize free speech, which can overlap with prohibited speech.
The SC cautioned that such speech should not target vulnerable groups such as disabled individuals, women, children, minorities, or senior citizens.
Humour vs. Dignity:
The Court acknowledged that humour is vital, but it cautioned against jokes or remarks that perpetuate stigma or discrimination and undermine the constitutional mandate of inclusivity.
Clear Boundaries in the Digital Space:
The SC emphasized the need for clear delineation between free speech, commercial speech, and prohibited speech, as irresponsible online content erodes social harmony and community trust.
Regulatory Guidelines:
The Court directed the Union Government to frame regulatory guidelines in consultation with stakeholders like the National Broadcasters and Digital Association.
These guidelines should ensure proportionate consequences for violators, moving beyond "empty formalities".
Sensitisation & Responsibility:
The Court emphasized the importance of awareness and sensitisation on digital ethics, holding influencers accountable for any harm caused by their content.
Apology & Balance of Rights:
Influencers were directed to issue unconditional apologies via their platforms, reiterating that the Court's intention is not to curtail free speech but to balance liberty with dignity in a diverse society.
The regulation of social media in India is primarily governed by the Information Technology Act, 2000, and subsequent rules. These include provisions aimed at ensuring online safety, data privacy, and the removal of unlawful content.
Information Technology Act, 2000:
Section 79(1) provides safe harbour protection to intermediaries (e.g., Facebook, Twitter, Instagram) from liability for third-party content, provided they act as neutral platforms and observe the prescribed due diligence.
Section 69A of IT Act, 2000:
Allows the Government to block content that threatens sovereignty, security, or public order.
Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021:
Mandates social media platforms to ensure user safety, remove unlawful content, and promote awareness on issues such as privacy, copyright, and national security.
2023 Amendment:
This amendment empowered a government Fact Check Unit to flag false or misleading information relating to the Central Government, which platforms would then be required to take down. However, the Supreme Court has stayed the notification of the Fact Check Unit due to concerns about misuse.
Shreya Singhal v. Union of India (2015): The SC struck down Section 66A of the IT Act, declaring that criticism, satire, and dissent are protected under Article 19(1)(a) unless they fall within reasonable restrictions.
K.S. Puttaswamy v. Union of India (2017): The SC recognized privacy as a fundamental right under Article 21, influencing subsequent data protection measures.
There are several pressing reasons to regulate social media in India:
Protection of Vulnerable Groups:
Social media platforms, when unregulated, allow derogatory content, cyberbullying, and exploitation, particularly of marginalized communities (e.g., women, children, elderly, minorities).
Curbing Misinformation & Hate Speech:
The spread of fake news, deepfakes, and hate speech can damage social harmony and national security. Regulation can address these threats by curbing the disinformation ecosystem.
Safeguarding Mental Health:
The addictive nature of social media (e.g., fear of missing out (FOMO), endless scrolling) has led to increased cases of anxiety, depression, and addiction. Regulatory measures can promote digital well-being and responsible platform design.
Ensuring Accountability of Influencers:
With the rise of influencer marketing, there have been instances of misleading promotions and financial risks. Regulation can introduce transparency, disclosure norms, and consumer protection.
Data Privacy & Security:
Social media platforms collect vast amounts of user data. Regulation is needed to protect users from privacy violations, surveillance, and misuse for profit or political influence.
Balancing Free Speech with Responsibility:
While free speech is protected under Article 19(1)(a), it is subject to reasonable restrictions under Article 19(2), particularly regarding public order, morality, and security. Effective regulation can help strike this balance.
Volume & Anonymity:
The sheer volume of content and anonymity of users make it difficult to regulate online speech effectively. This creates an environment conducive to hate speech and misinformation.
Transparency & Accountability Gaps:
Social media platforms often lack transparency in their content moderation processes, and there's insufficient independent oversight, leading to concerns about opaque practices and arbitrary decision-making.
Defining Harmful Content:
Defining harmful content is subjective. Cultural, political, and social differences complicate the process of distinguishing between legitimate expression and prohibited speech.
Free Speech vs. Censorship:
Any effort to regulate social media risks being seen as censorship. Clear, objective, and proportionate criteria are necessary to avoid the curtailment of free speech.
Cross-Border Jurisdictional Issues:
A large portion of harmful content originates from outside India's jurisdiction, making it difficult to enforce domestic laws.
Political Neutrality Concerns:
Content moderation decisions often raise accusations of political bias, affecting the credibility and effectiveness of regulation.
Robust Legal-Policy Framework:
Update the IT Act, 2000 to create a Digital India Act, ensuring platform accountability, data protection, and free speech, with judicial oversight to avoid overreach.
Algorithmic Transparency & Accountability:
Mandate algorithm audits and encourage the use of AI-driven moderation tools to ensure neutrality and quick redressal.
Technological & Institutional Capacity:
Expand cyber forensic labs and enhance agency capacity to monitor online content while ensuring privacy standards are maintained.
Digital Literacy & Ethical Usage:
Launch nationwide campaigns to promote digital literacy, combat misinformation, and encourage responsible online behavior.
Global & Multi-Stakeholder Cooperation:
Strengthen international collaboration on cross-border regulation and involve civil society, academia, and industry in creating a future-ready digital ecosystem.
Effective regulation of social media is essential to balance the right to free speech with the dignity and rights of vulnerable groups. By developing a robust legal framework, ensuring algorithmic transparency, fostering digital literacy, and promoting ethical online behavior, India can build a safe, inclusive, and credible digital ecosystem that supports both innovation and accountability.