POLICIES AND SCHEMES

India’s new social media code

Objective: Regulation of Digital and Online Media

Introduction: The code establishes regulatory measures for social media platforms, digital news media, and over-the-top (OTT) content providers.

Regulatory Framework: The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, framed under the Information Technology Act, 2000, govern intermediaries (including social media platforms) and digital media publishers. Various legal precedents and legislative discussions guide these rules.

Legal Basis:

  • Supreme Court Observation (2018): In the Prajjwala case, the Supreme Court observed that the Government of India may frame necessary guidelines to eliminate videos and imagery of child sexual abuse and rape from content-hosting platforms.
  • Supreme Court Order (2019): The Court subsequently directed the Ministry of Electronics and Information Technology to indicate a timeline for notifying the new rules, further emphasizing the need for regulatory measures.
  • Parliamentary Discussions (2018 and 2020): Deliberations in the Rajya Sabha, including an ad-hoc committee report laid in 2020 on the effects of pornography on social media, affirmed the need for rules that empower ordinary users of digital platforms to seek redress of grievances and ensure accountability when their rights are infringed.

The aim is to balance the empowerment of digital platform users with a framework that promotes accountability and safeguards their rights.

Key Proposals for Social Media in the New Guidelines

The newly introduced guidelines offer key proposals regarding social media platforms:

1. Due Diligence Requirements: Under Section 79 of the Information Technology Act, 2000, intermediaries enjoy a “safe harbour” from liability for user-generated content. The guidelines now require intermediaries to observe due diligence; failure to do so forfeits this safe harbour protection. The provision applies to platforms such as Twitter, Facebook, YouTube, and WhatsApp.

2. Grievance Redressal Mechanism: The guidelines mandate that intermediaries, including social media platforms, establish a mechanism for receiving and resolving user complaints, and appoint a grievance officer for this purpose. The grievance officer must acknowledge a complaint within 24 hours and resolve it within 15 days of receipt, ensuring efficient redressal of user grievances (a sketch of these timelines follows below).
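
To make the timelines concrete, here is a minimal Python sketch that computes the two statutory deadlines for a complaint. The helper name and structure are assumptions made for illustration, not anything prescribed by the Rules:

    from datetime import datetime, timedelta

    # Statutory timelines from the 2021 Rules: acknowledge within 24 hours,
    # resolve within 15 days of receipt. Helper names are hypothetical.
    ACK_WINDOW = timedelta(hours=24)
    RESOLVE_WINDOW = timedelta(days=15)

    def grievance_deadlines(received_at: datetime) -> dict:
        """Compute acknowledgement and resolution deadlines for a complaint."""
        return {
            "acknowledge_by": received_at + ACK_WINDOW,
            "resolve_by": received_at + RESOLVE_WINDOW,
        }

    deadlines = grievance_deadlines(datetime(2021, 5, 26, 10, 0))
    print(deadlines["acknowledge_by"])  # 2021-05-27 10:00:00
    print(deadlines["resolve_by"])      # 2021-06-10 10:00:00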

Content Removal Rules in the Guidelines

The guidelines establish rules for the removal of content from social media platforms, listing ten categories of content that platforms must not host. Grouped broadly, the prohibited content includes:

  1. Content that threatens India’s unity, integrity, defense, security, sovereignty, friendly relations with foreign states, or public order.
  2. Content that incites the commission of any cognizable offense, hinders the investigation of any offense, or insults a foreign state.
  3. Content that is defamatory, obscene, pornographic, pedophilic, invasive of privacy (including bodily privacy), or insulting or harassing based on gender.
  4. Content that is libelous, racially or ethnically objectionable.
  5. Content that relates to or encourages money laundering or gambling, or is inconsistent with or contrary to Indian laws.

These rules guide the removal of such content to maintain the safety and integrity of social media platforms in compliance with Indian laws.

Due Diligence Requirements for Social Media Companies

Social media companies are required to observe due diligence under the new guidelines, with significant social media intermediaries (platforms whose registered user base exceeds a threshold notified by the government) subject to additional requirements:

  1. Appointment of Chief Compliance Officer: Significant social media intermediaries must appoint a Chief Compliance Officer resident in India, responsible for ensuring compliance with the Act and the rules.
  2. Appointment of Nodal Contact Person: These platforms must also appoint a Nodal Contact Person available 24×7 for coordination with law enforcement agencies, enabling a quick response to legal and law-enforcement matters.
  3. Monthly Compliance Report: Significant social media intermediaries must publish a monthly compliance report detailing the complaints received, the action taken on them, and any content removed proactively.

These due diligence requirements aim to enhance accountability, compliance, and coordination with Indian authorities for social media companies operating within the country.
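
For illustration, a minimal sketch of what such a monthly report could look like as a data structure. The Rules prescribe the contents to be reported, not a format; all field names and figures here are illustrative assumptions:

    from dataclasses import dataclass

    # Hypothetical shape for the monthly compliance report described above.
    @dataclass
    class MonthlyComplianceReport:
        month: str                # reporting period, e.g. "2021-06"
        complaints_received: int  # user complaints received during the month
        actions_taken: int        # complaints on which action was taken
        proactive_removals: int   # items removed through proactive monitoring

    report = MonthlyComplianceReport(
        month="2021-06",
        complaints_received=120,  # placeholder figure
        actions_taken=115,        # placeholder figure
        proactive_removals=40,    # placeholder figure
    )
    print(report)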

Penalties for Violating the Guidelines

Companies that violate the new guidelines may face various penalties, which are detailed as follows:

  1. Loss of Safe Harbour: An intermediary that fails to comply with the prescribed rules and due diligence requirements loses its safe harbour protection and becomes liable under existing Indian laws, including the Information Technology Act and the Indian Penal Code.
  2. Penal Provisions: Violations of the Information Technology Act cover a range of offenses, including document tampering, hacking of computer systems, online misrepresentation, breaches of confidentiality and privacy, and publication of fraudulent content. Depending on the offense, punishment can extend from three to seven years of imprisonment, with fines starting at Rs 2 lakh.

These penalties are designed to encourage compliance with the guidelines and ensure that intermediaries and social media platforms adhere to the specified regulations.

Data Privacy Laws and Internet Regulations in India

India’s legal framework with respect to data privacy and internet regulations is outlined as follows:

  1. Privacy in the IT Act of 2000: The Information Technology (IT) Act of 2000 does not define privacy, nor does it contain penal provisions aimed squarely at privacy violations. However, certain sections of the Act address specific instances of data and privacy breaches.
  2. Section 43A: This section provides for compensation where a body corporate handling sensitive personal data is negligent in implementing and maintaining reasonable security practices and procedures, thereby causing wrongful loss or wrongful gain.
  3. Section 72: This section prescribes penalties, including imprisonment, for any person who, in exercise of powers conferred under the Act (typically a government official), gains access to information and discloses it without the consent of the person concerned. It is designed to address unauthorized access and data leaks.
  4. Section 72A: This section provides criminal consequences where a service provider discloses a user’s personal information, obtained while providing services under a lawful contract, without that user’s consent. It is intended to protect individuals from unauthorized data sharing.

These sections within the IT Act aim to address specific instances of data privacy and security but do not provide a comprehensive framework for user data protection and online privacy on social media platforms. The newly introduced guidelines seek to fill this gap by imposing additional compliance measures on social media intermediaries and digital platforms.

Impact of OTT Service Regulations on Consumers

The rules governing Over-The-Top (OTT) services, such as Netflix and Amazon Prime Video, have implications for consumers and aim to ensure age-appropriate content:

  1. Content Classification: OTT service providers are required to classify their content into five categories based on age suitability. These classifications help consumers determine the appropriateness of the content for different age groups.
  • U (Universal): Content suitable for viewers of all ages.
  • U/A 7+: Content suitable for individuals aged 7 and older, with parental guidance for younger viewers.
  • U/A 13+: Content suitable for individuals aged 13 and older, with parental guidance.
  • U/A 16+: Content suitable for individuals aged 16 and older, with parental guidance.
  • A (Adults): Content restricted to adults (18 and older).

These content classifications ensure that consumers, including parents and guardians, can make informed decisions regarding the content they access on OTT platforms. This promotes age-appropriate viewing and helps protect the interests of both children and adults.
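
As a concrete illustration, the five categories can be encoded as a simple age-gating check. The category names come from the Rules; the minimum-age mapping and helper below are assumptions made for this sketch:

    from enum import Enum

    # Illustrative encoding of the five age-based content categories,
    # mapping each rating to the minimum suitable viewer age.
    class ContentRating(Enum):
        U = 0       # Universal: all ages
        UA_7 = 7    # U/A 7+
        UA_13 = 13  # U/A 13+
        UA_16 = 16  # U/A 16+
        A = 18      # Adults only

    def is_age_appropriate(rating: ContentRating, viewer_age: int) -> bool:
        """Return True if content with this rating suits the viewer's age."""
        return viewer_age >= rating.value

    print(is_age_appropriate(ContentRating.UA_13, 12))  # False
    print(is_age_appropriate(ContentRating.U, 5))       # True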

