
STREAMING VIOLENCE: THE RISE OF GORE CONTENT ON OTT PLATFORMS AND ITS SOCIETAL IMPACTS

By Law Jurist
13 April 2026
in Articles

Author: Ompriya Mishra, LL.M. student (Cyber Law and Cyber Security), Amity University, Noida

“I object to violence because, when it appears to do good, the good is only temporary and the evil it does is permanent.” — Mohandas K. Gandhi

ABSTRACT

The rapid, exponential growth of Over-the-Top (OTT) platforms has transformed content consumption, offering unprecedented access to a diverse collection of audio-visual material. However, this rapid growth has also sparked major concerns about the portrayal and broadcasting of violent content on OTT platforms. Unlike traditional broadcast media, OTT platforms mostly operate under lighter regulatory scrutiny, fuelling debates about accountability and the societal risks posed by unrestricted exposure to violence.

This paper thoroughly examines the legal framework governing OTT platforms in India, concentrating on the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, and the balance between freedom of expression and the regulation of harmful content. It evaluates whether the prevailing laws sufficiently address the challenges posed by violent content, analyses judicial trends, and examines the extent to which platforms can be held liable for their impact on society.

Through a comparative legal approach, the paper analyses the regulatory models of India and selected foreign jurisdictions to identify gaps and recommend reforms. It argues for a stable regulatory mechanism that preserves creative freedom while safeguarding against societal harm. The paper concludes that platform accountability must evolve alongside technological progress, demanding a hybrid model combining self-regulation, statutory oversight and consumer empowerment.

Keywords: OTT Platforms, Violent Content, Platform Liability, Media Regulation, Freedom of Expression, Digital Governance

  1. INTRODUCTION

The sudden growth of OTT platforms such as Disney+, Netflix and Amazon Prime Video has drastically changed media services by bypassing traditional institutions such as film censorship boards. Unlike cinema, which is controlled under the Cinematograph Act, online streaming platforms function in a comparatively light regulatory environment, permitting them to host content that often features psychologically disturbing narratives, gore and explicit violence.

This change has sparked a growing debate in India over how such violent content should be regulated under the existing legal framework. The debate has intensified in recent years because age-verification mechanisms are not strict and hyper-violent content has frequently been available in web series and films without such safeguards. The absence of pre-certification, combined with algorithm-driven content promotion, has further amplified exposure to such material.

India's regulatory response has primarily been shaped by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which introduced a code of ethics for digital publishers and a three-tier grievance redressal mechanism. OTT platforms are bound by these Rules and required to classify content by age and display descriptors such as "violence" or "substance abuse."

Nevertheless, the effectiveness of this framework remains disputed. The Indian government has repeatedly taken action against OTT platforms for hosting obscene or gore content. For example, multiple platforms faced bans in recent years, particularly in 2025 and 2026, for breaching laws relating to obscenity and public morality.

At the same time, the judiciary has maintained a restrained approach. The Supreme Court has repeatedly held that regulating OTT content falls within the domain of the executive and the legislature, declining to create new censorship instruments through judicial action.

This regulatory tension underscores a fundamental legal challenge: how to balance freedom of expression under Article 19(1)(a) with the need to safeguard society from the harm caused by unchecked violent content. Critics highlight the psychological and social consequences of rampant violence for younger generations, while OTT platforms argue for creative freedom.

This research paper seeks to address the following key questions:

  1. Do the existing Indian legal frameworks adequately regulate the violent and gore content shown on OTT platforms?
  2. To what extent can OTT platforms be held legally responsible for harmful content?
  3. Does the current co-regulatory model effectively balance free speech and social harm?

Through recent legal developments, enforcement actions and policy disputes, this research aims to offer a comprehensive insight into the evolving regulatory framework governing violent content on OTT platforms in India.

LITERATURE REVIEW

The discussion of violent content on digital platforms has changed markedly as OTT media has evolved. Scholars have approached the issue from different perspectives, including regulatory theory and media psychology.

Early media violence studies by Albert Bandura established social learning theory, suggesting that individuals, especially children, imitate aggressive behaviour observed in visual media. This framework is consistently raised in modern discussions of OTT media, where the absence of regulatory filters increases access to graphic violence.

In the Indian context, legal scholarship has mainly focused on the regulatory void that existed before the introduction of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. Gautam Bhatia argues that digital content regulation must be examined through the lens of constitutional free speech protections, warning against excessive state control that may amount to indirect censorship. Similarly, Apar Gupta critiques the ambiguity in defining "harmful content", noting that vague standards permit discretionary enforcement by authorities.

More recent scholarship has analysed the co-regulatory model introduced under the IT Rules, 2021. Scholars such as Nikhil Pahwa have highlighted the structural weaknesses of self-regulation, particularly the lack of independence of grievance redressal bodies and the potential chilling effect on creative expression.

The European Audiovisual Observatory has highlighted a mounting trend towards platform liability, under which OTT platforms are increasingly treated not as neutral intermediaries but as publishers. In the case of violent or otherwise harmful content, this shift has substantial consequences for liability.

Yet a significant gap persists in the existing literature: while obscenity and hate speech have received much attention, violent and gore content, a distinct category of harm, remains underexplored within the Indian legal framework. This paper addresses that gap between violence-based societal harm and regulatory responsibility.

  2. CONCEPTUAL FRAMEWORK

WHAT IS “VIOLENT AND GORE CONTENT”

Violence is typically the application of force in a manner that inflicts injury, while gore refers to graphic depictions of injury, blood and dismemberment. Gore content is intense in its violent detail and realism, as opposed to implied or symbolic violence.

Online media platforms have broadened the boundaries of acceptable depictions of violence towards the hyper-realistic. This raises some vital legal questions:

  1. Does the right to free speech cover graphic violence?
  2. When does such content become harmful enough to justify regulation?

Indian law provides no statutory definition of violent content, resulting in vagueness in enforcement. Instead, regulation rests on broader grounds such as decency, morality and public order under Article 19(2) of the Constitution.

HARM THEORY AND MEDIA VIOLENCE.

The control of violent material may be justified by the harm principle articulated by John Stuart Mill, under which individual liberty may be restricted only to prevent harm to others. Building on this, Joel Feinberg distinguished offense from harm, arguing that offensive material should not be legally restricted unless it causes tangible harm to society.

It is difficult to establish a causal relationship between violent material and real-world harm in the context of OTT platforms. Although psychological studies indicate effects such as desensitization, aggression and fear, legal regulation typically demands a higher standard of evidence.

DESENSITIZATION AND PSYCHOLOGICAL EFFECT.

Recent studies show that repeated exposure to violent media may cause emotional desensitization, diminishing empathy with victims and normalising violence. This is of particular concern in the OTT ecosystem, where binge-watching culture and algorithm-driven suggestions intensify the experience.

Children and adolescents are considered especially vulnerable. Most platforms lack strict age verification, which adds to the risk of unrestricted access to adult content.

PLATFORM ACCOUNTABILITY (PUBLISHER OR INTERMEDIARY)

One of the major questions surrounding OTT regulation is whether platforms should be treated as intermediaries (merely hosting content) or publishers (entities that exercise editorial control). Section 79 of the Information Technology Act, 2000 protects intermediaries provided they do not exercise editorial control and adhere to due diligence requirements. OTT platforms, however, actively curate, commission and promote content, blurring the line between passive hosting and active publishing and prompting calls for stricter liability akin to that of traditional broadcasters.

The IT Rules, 2021 address this by treating OTT platforms as publishers of online curated content, adding new responsibilities for content classification and grievance redressal.

  3. FREEDOM OF SPEECH VS SOCIAL CONTROL.

The constitutional protection of freedom of speech and expression under Article 19(1)(a) of the Constitution of India is the starting point for regulating violent and gore content on OTT platforms. But this freedom is not absolute. Article 19(2) permits the State to impose reasonable restrictions in the interests of, among other grounds, public order, decency or morality, the sovereignty and integrity of India and the security of the State. Violent and gore material generally falls within decency or morality, and in some situations may also raise public order concerns if it incites violence or social unrest.

The Supreme Court has consistently held that freedom of expression encompasses the right to publish content that shocks or disturbs, so long as it does not exceed constitutional limits. This principle is imperative in the OTT context, where storytelling relies heavily on creative expression.

However, OTT platforms enjoy a comparatively less restrictive environment than traditional media, raising concerns about whether constitutional protection is properly balanced against societal interests.

THE INFORMATION TECHNOLOGY ACT, 2000.

In India, the main legal framework governing digital content is the Information Technology Act, 2000 (IT Act).

  1. Safe Harbour and Intermediary Liability.

Section 79 of the IT Act offers intermediaries safe harbour, that is, immunity from liability for third-party content, provided they do not create or modify that content, exercise due diligence, and take down unlawful content upon notice.

But OTT services do not fit neatly into the definition of an intermediary. In contrast to social media, they actively curate, commission and control content, casting doubt on their entitlement to such protection.

  2. Penal Provisions Relevant to Violent Content.

Some parts of the IT Act indirectly control violent/ harmful content:

  • Section 67: Punishes the electronic publication of obscene material.
  • Section 69A: Authorizes the government to block public access to content on grounds of sovereignty, public order or morality.

These sections were designed chiefly for obscenity and threats to national security, but are increasingly invoked against OTT services hosting controversial or explicit material.

IT RULES, 2021

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 mark the first attempt to regulate OTT platforms in India.

  1. Classification of OTT Platforms:

The Rules classify OTT platforms as "publishers of online curated content", separating them from the intermediary category and imposing additional responsibilities.

This classification is important because it recognizes the editorial role of OTT platforms, thereby justifying greater regulatory oversight.

  2. Content and Age Classification:

OTT platforms are required to classify content as U (Universal), U/A 7+, U/A 13+, U/A 16+ or A (Adult). In addition, platforms must display content descriptors such as nudity, violence, language and substance abuse. Critics argue that overreliance on self-assessment allows harmful violent content to be under-classified, undermining a system that is meant to empower viewers.
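As an illustration only, the rating tiers above, together with the parental-lock requirement for U/A 13+ and above discussed under parental controls below, can be sketched in a few lines of code. The function names and structure here are hypothetical, invented for this sketch, and are not drawn from any platform's actual implementation or from the text of the Rules:

```python
# Hypothetical sketch of the IT Rules, 2021 age-rating tiers described
# above. Names and logic are illustrative only, not any real system.
RATING_ORDER = ["U", "U/A 7+", "U/A 13+", "U/A 16+", "A"]

def requires_parental_lock(rating: str) -> bool:
    # Content rated U/A 13+ or higher must sit behind a parental lock.
    return RATING_ORDER.index(rating) >= RATING_ORDER.index("U/A 13+")

def requires_age_verification(rating: str) -> bool:
    # Adult-rated content additionally requires age verification.
    return rating == "A"

for r in RATING_ORDER:
    print(f"{r}: lock={requires_parental_lock(r)}, "
          f"verify={requires_age_verification(r)}")
```

The point of the ordering is that each obligation kicks in at a threshold tier and applies to everything above it, which is how the Rules' graded scheme operates in substance.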

  3. Parental Controls and Access Restrictions:

The IT Rules require parental locks for content classified U/A 13+ and above, and age verification for adult content. Nevertheless, enforcement remains lax: many platforms do not offer robust verification systems, enabling minors to consume violent and explicit media with ease. The IT Rules also establish a three-tier regulatory framework:

  1. Level I: Self-regulation by Publisher.

OTT platforms have to appoint a grievance officer to address complaints within 15 days.

  2. Level II: Self-Regulatory Bodies.

Unresolved complaints are reviewed by independent bodies headed by either retired judges or eminent persons.

  3. Level III: Government Oversight Mechanism.

The Ministry of Information and Broadcasting has the overall control and can give instructions, such as content changes or blocking.
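The three tiers can be pictured as a simple escalation ladder. The sketch below is purely illustrative: the class and function names are invented for this example, and the Rules themselves specify far more procedural detail than an unresolved complaint simply moving up one level:

```python
# Illustrative model of the three-tier grievance flow described above.
# Names and escalation logic are hypothetical, not the Rules' text.
from dataclasses import dataclass

LEVELS = [
    "Level I: publisher's grievance officer (15-day response)",
    "Level II: self-regulatory body",
    "Level III: government oversight mechanism (MIB)",
]

@dataclass
class Complaint:
    text: str
    level: int = 0        # index into LEVELS
    resolved: bool = False

def escalate(c: Complaint) -> Complaint:
    # An unresolved complaint moves up one tier, capped at Level III.
    if not c.resolved and c.level < len(LEVELS) - 1:
        c.level += 1
    return c

c = Complaint("Excessively graphic scene, arguably mis-classified")
c = escalate(c)   # unresolved at Level I -> Level II
c = escalate(c)   # unresolved at Level II -> Level III
print(LEVELS[c.level])
```

The cap at Level III mirrors the structure of the mechanism: the Ministry sits at the top of the ladder and there is no further tier within the Rules.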

While this model attempts to balance autonomy and accountability, critics remain concerned that the executive ultimately retains final control.

REGULATORY CONTRAST WITH THE CINEMATOGRAPH ACT, 1952.

Under the Cinematograph Act, 1952, traditional films in India must be pre-certified by the Central Board of Film Certification (CBFC). The CBFC may require modifications or excisions, restrict exhibition, or refuse certification altogether.

By contrast, OTT platforms are not pre-certified, resulting in a regulatory imbalance. Graphic violence and gore, including content that might not survive CBFC scrutiny, can be streamed online with ease. This exemption has been a significant point of contention and has prompted demands to apply comparable certification measures to OTT platforms.

RECENT GOVERNMENT ACTIONS (2024–2026)

The past few years have seen a rise in executive intervention in the regulation of OTT content.

  1. OTT Blocking.

The Government of India has exercised its powers under Section 69A of the IT Act to block OTT sites carrying obscene or threatening content, especially explicit sexual and violent material. Such measures suggest a move towards a tougher stance, despite the lack of explicit statutory provisions against violent content.

  2. Heightened Content Scrutiny.

The government has issued advisories to OTT services about excessive violence, graphic scenes and their influence on children, indicating growing concern about the social effects of unregulated online material.

JUDICIAL APPROACH: UNWILLINGNESS TO INTERFERE.

Indian courts have typically taken a hands-off stance towards OTT regulation. The Supreme Court has repeatedly declined to entertain Public Interest Litigations demanding more stringent censorship of OTT platforms, stating that regulation is a policy issue and that courts should not establish new censorship regimes.

This judicial deference strengthens the role of the executive in determining OTT regulation, yet it leaves open questions about constitutional protections and accountability.

CRITICAL LEGAL PROBLEMS ARISING.

Based on the analysis above, there are a number of critical issues:

  1. Lack of Clarity on Violent Content: there is no legal precision about what counts as excessive or harmful violence.
  2. Excessive Reliance on Self-Regulation: platforms enjoy wide discretion, resulting in uneven standards.
  3. Executive Dominance: government control without adequate checks and balances can result in ad hoc enforcement.
  4. Inadequate Protection of Minors: age checks have not been properly implemented.

  4. JUDICIAL TRENDS, RECENT CASES AND REGULATORY DEVELOPMENTS.

FROM REGULATORY VACUUM TO ACTIVE SCRUTINY.

The legal environment for OTT platforms in India has changed dramatically in recent years. Initially defined by a regulatory vacuum, the field has since seen growing executive intervention, selective judicial engagement and rising public attention, especially regarding violent, explicit and controversial content.

Recent instances underscore a trend: even as courts take issues of harmful content into consideration, they are unwilling to resort to pre-censorship, leaving the matter to the executive.

THE TANDAV CONTROVERSY: CRIMINAL LIABILITY AND CREATIVE FREEDOM.

The Amazon Prime series Tandav (2021) sparked one of the first and biggest controversies over OTT content, resulting in numerous FIRs across India alleging that the show offended religious sentiments and propagated obscene content.

Although the controversy mainly concerned religious sentiment, it was a milestone in OTT regulation, showing how digital content could attract criminal liability under existing penal laws, including the Indian Penal Code.

While granting interim protection to the creators, the Supreme Court made clear that content must be created responsibly, indirectly recognizing the social influence of digital media.

The episode had several effects on the regulation of violent content:

  1. It established a precedent for state interference with OTT content.
  2. It introduced a chilling effect, prompting self-censorship by platforms.
  3. It exposed the lack of clarity in legal standards.

THE COLLEGE ROMANCE CASE.

In Apoorva Arora v State (NCT of Delhi), the College Romance case, the Delhi High Court had to address the problem of obscene language in a web series streamed on an OTT platform.

The Court noted that OTT platforms cannot claim complete immunity and must comply with community standards of decency, even in the digital realm. It further observed that OTT content is not beyond regulation: standards used for cinema and television can be applied to the internet, and the defence of realistic portrayal cannot excuse excessive obscenity. Although the case concerned obscenity, the same reasoning can be applied to graphic violence, since both fall under the Article 19(2) grounds of decency and morality.

POSITION OF THE SUPREME COURT: JUDICIAL RESTRAINT IN THE OTT REGULATION.

One of the most notable recent trends is the Supreme Court's unwillingness to regulate OTT platforms directly by devising regulatory frameworks. Ruling on several Public Interest Litigations seeking increased censorship of OTT material, the Court has held that regulation of digital platforms is a legislative and executive policy prerogative, and that courts are not entitled to act as censorship authorities.

This practice respects the separation of powers, yet it leaves a regulatory gap in which courts do not engage with substantive issues concerning violent and harmful content.

GOVERNMENT CRACKDOWNS AND BLOCKING ORDERS (2024–2026).

The last significant trend is the more active use of executive powers to control OTT platforms. The Government of India has used Section 69A of the IT Act to block access to a number of OTT sites allegedly hosting explicit sexual content, excessively violent material and content contrary to public morality. Such moves indicate a shift towards stronger enforcement, despite the lack of clear statutory definitions of violent content. Concerns have been raised about the lack of transparency in blocking orders, the absence of judicial oversight and the potential for abuse of power.

THE RISE OF SOFT CENSORSHIP AND SELF-REGULATION

In reaction to legal uncertainty and government pressure, OTT platforms have progressively embraced self-censorship techniques, such as editing or dropping controversial scenes, adding stronger content warnings and avoiding politically or socially sensitive subjects.

This has come to be known as soft censorship, and it raises concerns that creative freedom is being undermined without any formal legal requirement.

SOCIAL PRESSURE AND PUBLIC INTEREST LITIGATIONS.

A number of PILs have been filed demanding much stricter regulation of OTT content, especially regarding violent and explicit scenes, effects on children and cultural degradation.

Although most of these petitions have been rejected or left unheard, they indicate growing societal concern about the unregulated status of OTT platforms.

CRITICAL ANALYSIS: ARE THE CURRENT TRENDS EFFECTIVE?

The present regulatory paradigm reflects a hybrid model of executive enforcement, judicial restraint and industry self-regulation. But this model has a number of shortcomings:

  1. Inconsistency: dissimilar standards are applied across cases and platforms.
  2. Risk of Overreach: executive authority, especially blocking orders, may result in arbitrary censorship.
  3. Lack of Focus on Violence: most cases turn on obscenity or religious sentiment, while violent and gore content remains inadequately regulated.
  4. Lack of Protection of Minors: despite the concern, there is little judicial engagement with the effects of violent material on children.

  5. SOCIAL HARM INFLICTED BY VIOLENT AND GORE CONTENT.

DESENSITIZATION AND NORMALISATION OF VIOLENCE.

Desensitization is one of the best-documented effects of repeated exposure to violent content. Regular consumers of graphic violence can become emotionally disengaged from such material. In the long run, this may erode empathy with the victims of violence and lead to the acceptance of violent acts as normal.

This is exacerbated by OTT platforms, whose on-demand, binge-watching model enables users to watch several episodes or films at a time. Such extended exposure makes a psychological effect more probable than with traditional media.

IMPACT ON CHILDREN AND ADOLESCENTS.

The impact of violent content is especially severe for children and teenagers, who are more impressionable and less able to distinguish fiction from reality.

The IT Rules, 2021 require parental controls and age classification, but in practice these are not always effective. Common failures include weak or easily defeated age checks, shared family accounts allowing unrestricted access, and parental unawareness.

Viewing violent and graphic material in childhood can lead to aggressive behaviour, anxiety and fear, and a warped perception of conflict resolution. This underscores a major regulatory failure, since current mechanisms do not effectively protect vulnerable groups.

BROADER SOCIETAL IMPACT

Beyond its impact on individuals, violent content may affect societal attitudes and cultural values. Frequent exposure can normalize brutality and crime, reduce empathy towards serious societal problems, and foster tolerance of violence as a method of conflict resolution.

In India's culturally diverse society, which is sensitive to matters of identity, representations of violence can occasionally inflame social tensions, especially when tied to religion, caste or politics.

  6. PLATFORM ACCOUNTABILITY: ASSIGNING RESPONSIBILITY.

DEBATE ON INTERMEDIARY VS. PUBLISHER

One of the central legal questions of OTT regulation is whether platforms should be treated as intermediaries or as publishers.

Section 79 of the IT Act provides safe harbour to intermediaries, shielding them from liability for third-party content so long as they observe due diligence. But OTT services differ markedly from traditional intermediaries: they select and filter content, produce original work and exercise editorial control.

Such functions are far closer to those of publishers, the direct source of the content they distribute. The IT Rules, 2021 acknowledge this difference by classifying OTT platforms as publishers of online curated content, though without subjecting them to the strict liability standards applied to traditional media.

GOVERNMENT INTERVENTION AND ITS RISKS.

Section 69A of the IT Act allows the government to block content. Though this offers a means of dealing with objectionable material, it raises significant concerns: lack of transparency in decisions, absence of prior judicial review, and the possibility of arbitrary or excessive censorship.

Over-reliance on executive action can undermine the constitutional right to freedom of speech under Article 19(1)(a) and create uncertainty for content producers and platforms.

LACK OF A HARM-BASED LEGAL STANDARD.

One of the biggest flaws of the existing structure is the absence of an explicit, harm-oriented criterion for regulating violent content.

Indian law lacks a clear definition of:

  1. what constitutes excessive violence;
  2. when content becomes socially detrimental; and
  3. how such harm is to be assessed.

Instead, regulation rests on subjective notions such as decency and morality, producing inconsistent enforcement and legal ambiguity.

  7. CONCLUSION

The rapid growth of OTT platforms in India has radically altered the production, distribution and consumption of content. Although these platforms have enabled creative freedom and a variety of storytelling, they have also brought many problems, especially concerning violent and gore content.

As this paper has shown, the current legal framework, resting chiefly on the Information Technology Act, 2000 and the IT Rules, 2021, has proved to be only a partial regulatory response. Although the framework provides for content classification, parental controls and grievance redressal, it is concerned mostly with procedural compliance rather than substantive harm.

Judicial trends reinforce this limitation. The Supreme Court's consistent unwillingness to step in and regulate OTT platforms has produced a system in which executive powers and the platforms themselves assume a central role. Meanwhile, recent government steps, such as blocking orders and advisories, signal a move towards greater control, raising concerns of overreach and lack of transparency.

One critical gap identified in this study is the absence of a clear legal standard for controlling violent content. Unlike obscenity or hate speech, violent and gore content is not explicitly defined in Indian law, leading to inconsistency in application and interpretation. Moreover, self-regulation has proved ineffective, since platforms tend to prioritize profit over social responsibility.

Nor can the social impact of such content be disregarded. Violent content carries real and measurable risks, whether desensitization and psychological damage or the effects on children and on wider societal attitudes. The existing regulatory strategy, however, gives these issues insufficient consideration.

In sum, India's OTT regulatory environment stands at a crossroads. It should evolve from a form-based into a harm-based system, regulating content not solely by its nature but by its effects on individuals and society. Striking this balance is crucial to preserving both freedom of expression and the welfare of society.

References

  • Ali Abbas Zafar v Union of India (2021) (Supreme Court interim protection order).
  • Apoorva Arora v. State of NCT of Delhi
  • Bandura, Albert, ‘Social Learning Theory’, Prentice Hall, 1977
  • Bhatia, Gautam, ‘Offend, Shock, or Disturb: Free Speech under the Indian Constitution’, Oxford University Press, 2016
  • Cinematograph Act 1952
  • Constitution of India 1950, art 19(1)(a)
  • Craig A Anderson and others, ‘The Influence of Media Violence on Youth’, Psychological Science in the Public Interest, 4 (2003), 81
  • European Audiovisual Observatory, Regulation of Video Sharing Platforms, Council of Europe, 2021
  • Feinberg, Joel, ‘The Moral Limits of the Criminal Law: Offense to Others’, Oxford University Press, 1985
  • Gupta, Apar, ‘Digital Speech and the Problem of Vagueness in Indian Law’, Indian Journal of Law and Technology, 6 (2021)
  • Huesmann, L. Rowell, ‘The Impact of Electronic Media Violence: Scientific Theory and Research’, Journal of Adolescent Health, 41 (2007) <https://doi.org/10.1016/j.jadohealth.2007.09.005>
  • Indian Penal Code 1860, Ss 153A, 295A (as Invoked in FIRs Related to Tandav).
  • ‘Information Technology Act, 2000’, Indian Journal of Public Administration, 46 (2000), 417–55 <https://doi.org/10.1177/0019556120000313>
  • Information Technology Act 2000, s 69A; Government of India Blocking Orders (2024–2026).
  • Information Technology Act 2000, s 79.
  • Justice for Rights Foundation v Union of India W.P.(C) 13526/2019 (Delhi HC 2020)
  • MeitY, The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 1, 2023 <https://www.meity.gov.in/static/uploads/2024/02/Information-Technology-Intermediary-Guidelines-and-Digital-Media-Ethics-Code-Rules-2021-updated-06.04.2023-.pdf>
  • Mill, John Stuart, ‘On Liberty (First Published 1859, Penguin 1985).’, Penguin(1985), 1859
  • Ministry of Electronics and Information Technology, Government of India, Internet of Things, 2016 <https://www.meity.gov.in/content/internet-things>
  • Pahwa, Nikhil, ‘Regulating OTT Platforms in India: A Critique of the IT Rules 2021’, Medianama, 2022
  • Sakshi; Raj, Aditya, Case Study: S. Rangarajan v. P. Jagjivan Ram (1989), Issue 2 Int’l JL Mgmt. & Human., 2023, vi, 1170
Made with ❤ in India. © 2025 -- Law Jurist, All Rights Reserved.