
Tackling The Monkey in The Middle: Intermediary Liability in India


Last Updated on September 11, 2023 by Administrator


In today’s digital era, many individuals find themselves deeply engrossed in their devices. According to the Global Digital Report, an individual spends about 6.5 hours online every day. The increase in netizens and the proliferation of digital media platforms such as social media and Over-the-Top (OTT) streaming services have led to an upsurge in User Generated Content (UGC).[1]

As UGC spreads, it is inevitable that some people will upload content that is offensive or unlawful. This might take the form of defamatory content, child nudity, sexual exploitation, the display of triggering content such as self-harm, content that hurts religious sentiments, content that threatens national security, etc.[2] A telling example of UGC abuse is the Bulli Bai and Sulli Deals apps, which were created on the GitHub platform. Both apps uploaded publicly available images of Muslim women and announced that they were up for auction. The unfiltered nature of UGC presents new challenges to the Indian legal system.

The Indian courts have held that, apart from newspapers, cinema and social media are also important sources of information dissemination for the public.[3] In India, the general population greatly idolizes film stars, and the youth are easily influenced by content shown on social media. Creative liberty is at the heart of a democratic society, but it becomes imperative to regulate it when the content becomes harmful for the audience.[4]

Intermediaries are platforms that host UGC and enable the free flow of information. They facilitate interactions between two or more people over the internet and play a crucial role in moderating the content posted.[5] The discussion on intermediary liability revolves around whether, apart from the person who posted the unlawful content, the intermediary that hosted it can also be held liable. Before jumping into this fascinating legal question, it is vital to demystify the concept of an intermediary.

Who is an Intermediary?

The Information Technology Act, 2000 defines an intermediary in Section 2(1)(w) as any person who, “on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes.”

The definition of an intermediary in the Indian law is broad enough to include both physical and online intermediaries. Physical intermediaries are those that provide the necessary infrastructure for accessing the internet. For example – Internet Service Providers (ISPs). Online intermediaries provide a platform for hosting content. For example – Facebook, Twitter, Amazon Prime etc. Online Intermediaries include both social media platforms and OTT streaming services.[6]

Many developed countries and regional groupings such as the UK, the USA and the EU have separate definitions of intermediary that specifically encompass digital intermediaries. While the EU uses the umbrella term ‘information society service,’ the UK uses the term ‘intermediary service provider.’ India’s definition of intermediary is not limited to online intermediaries but also includes physical ones. The developed countries’ definitions are perhaps more narrowly focussed on addressing their specific needs.

The Need for Intermediary Liability

Deliberations on intermediary liability began in the Indian judiciary with the case of Avnish Bajaj v. State.[7] In this case, a student from IIT Kharagpur listed an offering containing obscene and pornographic content on the e-commerce platform Baazee.com. The content circulated rapidly through the internet as many people began sharing it. Tracing the many individuals who use online platforms such as email, social media and texting services to share illegal content is very difficult, especially when the circulation multiplies rapidly. The issue becomes more complex when government agencies identify that some users are in other countries outside the jurisdiction of the national government. Another potential problem is user anonymity, which makes the tracing process cumbersome.

When direct deterrence is ineffective due to anonymity or because users fall under a different jurisdiction, creating third-party liability becomes an enticing option. If gatekeepers can ensure that illegal information is not displayed in India, the government saves considerable resources that would otherwise be spent attempting to contact a foreign author directly to take down the content.[8]

Intermediary liability evolved as an earnest attempt to regulate online content by making the best use of intermediaries’ position as “gatekeepers to the flow of information.” Intermediaries are seen as gatekeepers because they facilitate and host huge volumes of internet content, and physical intermediaries like BSNL or Vodafone physically connect users to the internet. Making the gatekeepers liable for filtering, removing, blocking and monitoring online content is an effective way of preventing the uploading and sharing of illegal content.

Exploring the Safe Harbour Provision and Its Impact on Intermediary Liability

It seems rather unfair that a third-party intermediary which merely hosted the content, but did not actively participate in making, uploading or circulating it, is held liable. The law intends to punish wrongdoers, and technically, intermediaries did not commit any wrong.

When an individual posts illegal content, the intermediaries are “necessary but insufficient causes of online harms.” They may be held secondarily liable for hosting illegal content. “Secondary liability is imposed on someone who does not commit the legal wrong directly, but is found responsible for encouraging, facilitating or profiting from it.” If this is the legal position, then intermediaries stand a substantial risk of being held liable because of the sheer volume of users and content hosted. To ensure that intermediaries have sufficient incentive to operate despite this substantial risk, the law grants a safe harbour provision in the form of Section 79 in the Information Technology Act, 2000.

Section 79 grants intermediaries immunity from liability for hosting or making available illegal content, barring claims for criminal liability and monetary damages against them. However, this immunity is not absolute: the intermediary must meet certain conditions in order to avail it, and it does not extend to non-monetary remedies such as injunctions.

Sections 79(2) and 79(3) list the conditions to be met by intermediaries to avail the defence. The defence applies only if the intermediary’s function is restricted to providing access to a communication system over which information posted by users is transmitted. The intermediary must not initiate the transmission or select and modify the content in the transmission, and it must observe due diligence as prescribed under the relevant rules. This is where delegated legislation in the form of the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 comes into play, prescribing in detail the due diligence to be carried out by intermediaries.

According to Section 79(3), the safe harbour provision will not apply if the intermediary is actively involved in committing the unlawful act. This can take the form of conspiring, abetting, aiding or inducing, whether by threats or promises. If the intermediary receives actual knowledge that illegal content is being hosted on its platform, it is bound to remove it or disable access to it; if it fails to do so, it cannot rely on the safe harbour provision.

The term “actual knowledge” has been subjected to much interpretation. Knowledge can be actual or constructive, and it can be general awareness or specific knowledge. When knowledge is actual, it is demonstrable: for example, a court order or a government agency directing the intermediary to take down the content. When knowledge is constructive, the intermediary is expected to be aware of the content hosted on its platform.

The Delhi High Court interpreted Section 79 in MySpace Inc. v. Super Cassettes Industries Ltd. (SCIL). In this case, SCIL filed an infringement suit against MySpace, a popular social networking platform, claiming that MySpace used its copyrighted works without proper authorisation.

The court held that Section 79 is an affirmative defence: not a blanket immunity but a “measured privilege to an intermediary.” Section 79 neither mentions penalties for non-compliance nor acts as an enforcement provision.

The court ruled in favour of MySpace because the facts amounted merely to general awareness of infringement, not actual knowledge; MySpace had no reasonable ground to believe infringement had occurred.

In Christian Louboutin SAS v. Nakul Bajaj and Ors.,[9] the court stated that “when an e-commerce website is involved in or conducts its business in such a manner, which would see the presence of a large number of elements enumerated above, it could be said to cross the line from being an intermediary to an active participant.”

The judgement in Shreya Singhal v. Union of India[10] is important for understanding the safe harbour provision. Among other major considerations, the constitutional validity of Section 79 was in issue in this case. While the hon’ble Supreme Court upheld the provision, it read it down: an intermediary is now bound to remove or disable access to content only upon receiving actual knowledge through a court order, or upon being intimated by the appropriate government, that unlawful acts falling within Article 19(2) of the Constitution are being committed.

Rule 5(1) of the Consumer Protection (E-Commerce) Rules, 2020 expressly states that e-commerce marketplaces can make use of the safe harbour provision of Section 79 in consonance with the Intermediary Guidelines.

Much of the language in Section 79 is supplemented by delegated legislation in the form of guidelines and rules, so Section 79 must be read holistically, taking these rules into account. Moreover, the rules were recently updated in 2021, and a mixed bag of criticism and appreciation has surrounded them since. This is further examined in the next section.

The Copyright Act, 1957 also grants a safe harbour provision for intermediaries. Section 51(a)(ii) states that secondary copyright infringement occurs when a person permits any place to be used for profit to communicate copyright-infringing material to the public. While this is an offence-creating provision, Section 52(1)(b) provides the safe harbour: it can be availed where the copyrighted work is stored incidentally in the “technical process of electronic transmission or communication to the public.” This provision grants immunity to online intermediaries.

IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021

The Intermediary Guidelines Rules were enacted by the Central Government under the powers conferred on it by the IT Act, 2000. They stress the need for online intermediaries to strictly comply with the Indian Constitution and domestic laws. The rules are revolutionary in instilling accountability in intermediaries and in properly bringing them under the framework of the IT Act.

According to Rule 3, intermediaries are duty-bound to publish their privacy policies, rules and regulations, user agreements, etc. This information should emphasise the user’s responsibility not to “host, display, upload, modify, publish, transmit, store, update or share” content from a prescribed list, which includes content that is defamatory, obscene, dangerous for minors, infringing of intellectual property, misinformation, impersonation, a threat to the security of the country, etc. Intermediaries should notify users that non-compliance will lead to termination of their user rights. Upon receiving knowledge through a court or government order, the intermediary is bound to remove or restrict access to the illegal content within 36 hours.

The rules create a new class of intermediaries called “Significant Social Media Intermediaries” (SSMIs), demarcated on the basis of their volume of users. SSMIs are required to appoint a Chief Compliance Officer, a nodal contact person and a Resident Grievance Officer. They must also publish periodic compliance reports and design an appropriate complaint processing mechanism.

For publishers of digital news and online curated content, the rules establish a three-tiered grievance redressal mechanism which consists of:

  1. Level I – Self-regulation by the publisher
  2. Level II – Self-regulation by the self-regulating bodies of publishers
  3. Level III – Central Government oversight mechanism

To address content moderation in OTT streaming services, the rules require platforms to classify content according to the target audience, using the ratings “U, U/A 7+, U/A 13+, U/A 16+ and A.” Content must be further classified based on the themes and messages portrayed, such as sex, nudity, violence and substance abuse.

Final Thoughts: Harmonizing the Pros and Cons

After critically examining the legal framework governing intermediaries, it is evident that the new rules strongly regulate their conduct while, at the same time, preserving their access to safe harbour protection. The law is balanced in terms of the rights and duties it gives intermediaries.

But if intermediary liability is strictly enforced, businesses will take all possible steps to minimise their exposure. In the process, there is a substantial risk that intermediaries will overcompensate by blocking or removing lawful content. This is a severe blow to the freedom of expression enshrined as a fundamental right in Article 19(1)(a) of the Indian Constitution. Intermediaries create abundant and rich avenues for communication, and they help individuals actively participate in political, social and economic life. The legal consequences of intermediary liability threaten the potential of these tools.

Since some of the definitions and prescribed lists of illegal content may be vague, intermediaries are incentivised to err on the side of caution by over-censoring content. Intermediary liability also makes companies supervise and monitor users extensively, which creates privacy concerns: in order to regulate their networks, they might collect more personal information than ordinarily required. It also has the potential to thwart economic development, as companies have little incentive to develop new products and services while bearing the brunt of additional risk. Strict enforcement of intermediary liability might therefore create a situation where the risk of conducting business outweighs the incentives.[11]

The consequences of strict enforcement are in no way an argument for abandoning intermediary liability altogether. It remains important for ensuring that companies adequately monitor online content and remove potentially illegal content; if the legal consequences are not serious, companies will neglect their due diligence obligations.

Too much or too little enforcement of intermediary liability is an equal cause for concern, and government agencies must walk a fine line to keep the scales balanced. The government can act to bring down illegal content without threatening freedom of expression. The key is to empower users to regulate what content reaches their screens, for instance by promoting the voluntary use of filtering software that allows users to block undesirable content. It is high time to prioritise digital literacy and, alongside other content moderation tools, give people the power to regulate content themselves. This will help strike the balance between freedom of expression and intermediary liability.


[1] We Are Social, Digital 2022: Global Overview Report.

[2] Anna Liz Thomas and Others, How Safe is Your Harbour? Discussions on Intermediary Liability and User Rights, The Centre for Internet and Society, India (July 10, 2020).

[3] Anand Patwardhan v. The Central Board of Film Certification, 2003 (5) BomCR 58.

[4] Sephali Svati, Classification of Content, Content Regulation and Ethics – OTT Platforms in India, 3(4) IJLSI 509-525 (2021).

[5] Giancarlo Frosio, Oxford Handbook of Online Intermediary Liability (May 7, 2020).

[6] Vasudev Devadasan, Report on Intermediary Liability in India, Centre for Communication Governance (December 2022).

[7] 2005 (79) DRJ 576.

[9] CS(COMM) 344/2018.

[10] AIR 2015 SC 1523.

[11] Supra note 2. 

Written By – Anisha L, Hidayatullah National Law University
