{"id":5196,"date":"2025-07-02T00:21:08","date_gmt":"2025-07-01T18:51:08","guid":{"rendered":"https:\/\/lawjurist.com\/?p=5196"},"modified":"2025-07-02T00:28:38","modified_gmt":"2025-07-01T18:58:38","slug":"ai-generated-evidence-in-indiancourts-admissibility-and-legal-challenges","status":"publish","type":"post","link":"https:\/\/lawjurist.com\/index.php\/2025\/07\/02\/ai-generated-evidence-in-indiancourts-admissibility-and-legal-challenges\/","title":{"rendered":"AI-Generated Evidence in IndianCourts: Admissibility and Legal Challenges"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"5196\" class=\"elementor elementor-5196\">\n\t\t\t\t<div class=\"elementor-element elementor-element-3975517f e-flex e-con-boxed e-con e-parent\" data-id=\"3975517f\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-549fc6d1 elementor-widget elementor-widget-text-editor\" data-id=\"549fc6d1\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\n<p>Author: <strong>Sakshi Tripathi, student of BBA.LLB (3<sup>rd<\/sup> Year) United university Prayagraj<\/strong><\/p>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-44c68fb e-flex e-con-boxed e-con e-parent\" data-id=\"44c68fb\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-f1d9c60 elementor-widget elementor-widget-text-editor\" data-id=\"f1d9c60\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><b>Introduction<\/b><span style=\"font-weight: 400;\">&nbsp;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Artificial intelligence (AI) has evolved from a futuristic idea to a 
reality in today\u2019s quickly evolving digital world. It can be found in self-driving cars, content curation, smart surveillance systems, smartphone apps, and even the legal system. As AI is incorporated into Indian legal and law enforcement procedures, an important question emerges: can machine-generated evidence be admitted into court and, if so, how can its use be fairly regulated?<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The subject of AI-generated evidence is becoming increasingly prominent in Indian legal discussions. This kind of evidence, which includes algorithmic reconstructions, deepfake detection, predictive crime mapping, and facial recognition data, presents both tremendous opportunities and formidable obstacles. Although Indian courts have not yet reached a definitive ruling on the admissibility of such evidence, the issue will soon take centre stage.&nbsp;<\/span><span style=\"font-weight: 400;\">This article looks at what AI-generated evidence is, how Indian law currently handles it (or fails to), the ethical and legal issues it raises, and the changes that may be required to maintain accountability in technology and justice.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The advancement of artificial intelligence has changed how we create, store, and analyse data. AI-generated content such as deepfakes, AI-written documents, facial recognition data, and predictive policing outputs is increasingly being considered as potential evidence in legal systems worldwide, including India. However, this technological development poses a serious challenge to established standards of evidence. Can evidence produced by AI be used in Indian courts? If so, under what conditions? 
The legal challenges, reliability, and admissibility of AI-generated evidence in Indian courts are examined in this article.<\/span><\/p>\n<p><b>Understanding AI-Generated Evidence<\/b><\/p>\n<p><span style=\"font-weight: 400;\">AI-generated evidence refers to digital data or content created entirely or in part by artificial intelligence systems. This comprises:<\/span><\/p>\n<ul>\n<li><span style=\"font-weight: 400;\">Deepfake audio or video recordings<\/span><\/li>\n<li><span style=\"font-weight: 400;\">AI-generated documents or emails<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Facial recognition matches<\/span><\/li>\n<li><span style=\"font-weight: 400;\">AI-powered predictive analysis<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Conversation logs from chatbots<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">In contrast to traditional evidence, AI-generated materials are not the direct result of human intention or action, which raises significant questions regarding manipulation, authorship, and reliability.<\/span><\/p>\n<p><b>Legal Framework in India<\/b><\/p>\n<p><span style=\"font-weight: 400;\">India\u2019s rules of evidence are rooted in the Indian Evidence Act of 1872. This colonial-era law has had difficulty adjusting to the new complexity brought about by the digital and artificial intelligence eras.<\/span><\/p>\n<p><b>Section 65B \u2013 Electronic Evidence<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Section 65B of the Indian Evidence Act governs the admissibility of electronic records. Under this section, electronic records accompanied by a certificate attesting to their integrity and provenance may be used as evidence in court. However, this provision was designed with conventional digital formats\u2014such as emails, SMS messages, computer files, and digital photos\u2014in mind, rather than the autonomous choices made by self-learning AI systems. Therefore, even though AI-generated outputs could in theory be regarded as electronic records, there is no formal legal procedure in place to confirm how an AI system came to its conclusions. 
When the AI&#8217;s decision-making process is opaque, a straightforward Section 65B certificate might not be enough.<\/span><\/p>\n<p><b>The Bharatiya Sakshya Adhiniyam, 2023<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The Bharatiya Sakshya Adhiniyam, 2023, which replaces the Indian Evidence Act, is an attempt to update India\u2019s evidentiary law. Although it acknowledges digital evidence more explicitly, it still lacks specific provisions to handle machine-generated outputs or artificial intelligence (AI) analyses. This legislative ambiguity creates a grey area that may result in uneven application of the law.<\/span><\/p>\n<p><b>Judicial Precedents and Current Practice<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Although Indian courts have started to consider digital evidence, few cases specifically deal with AI-generated information. The Supreme Court made clear in Anvar P.V. v. P.K. Basheer (2014) that electronic evidence must comply with Section 65B. More recently, in Arjun Panditrao Khotkar v. Kailash Kushanrao Gorantyal (2020), the Court reaffirmed that digital evidence is inadmissible without a Section 65B certificate. However, these decisions address electronic evidence in general rather than AI-generated data specifically, where the issue is not only authenticity but also the reasoning behind the output, which AI systems frequently do not make explicit.<\/span><\/p>\n<p><b>Admissibility Challenges&nbsp;<\/b><\/p>\n<ol>\n<li><b> Authentication<\/b><span style=\"font-weight: 400;\">: How can it be demonstrated that evidence produced by AI has not been altered? 
For example, deepfakes are notoriously hard to spot without specialised equipment.&nbsp;<\/span><\/li>\n<li><b> Chain of Custody: <\/b><span style=\"font-weight: 400;\">AI systems might not keep thorough records of all data input, processing, and output, which makes the chain of custody more difficult to establish.<\/span><\/li>\n<li><b> Reliability and Bias: <\/b><span style=\"font-weight: 400;\">AI systems can only be as objective as the data they are trained on. For instance, facial recognition systems have demonstrated higher error rates for certain ethnic groups. Using such results as evidence may violate principles of fairness.<\/span><\/li>\n<li><b> Absence of Regulation:<\/b><span style=\"font-weight: 400;\"> India currently has no specific legislation governing the use of AI in court. This regulatory gap makes judicial interpretation unpredictable and inconsistent.&nbsp;<\/span><\/li>\n<li><b> Expert Testimony: <\/b><span style=\"font-weight: 400;\">Deciphering AI results frequently calls for in-depth knowledge of data science, machine learning models, and algorithms, expertise that is still scarce in Indian legal proceedings.<\/span><\/li>\n<\/ol>\n<p><b>AI and Deepfakes: A Looming Threat to Evidence Integrity&nbsp;<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Deepfake technology, a particularly hazardous offshoot of artificial intelligence, poses a serious threat to the Indian legal system. Deepfakes use artificial intelligence (AI) to produce incredibly realistic but completely fake audio, video, or image content that can be very challenging to spot with the unaided eye. This technology presents a terrifying prospect for law enforcement and courts: fake evidence that appears to be authentic. Consider a deepfake video that uses publicly accessible voices and images to depict an innocent person committing a crime. 
Such a video could cause irreparable harm to justice if it is admitted as evidence without rigorous technical review. Furthermore, because deepfakes are now simple to create using free tools, even a layperson can produce convincingly false evidence.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In India, where digital literacy is still in its infancy, deepfakes are not only a cyberthreat but also a potential tool for character assassination, false implication, or communal unrest. Fake videos of celebrities and politicians have recently gone viral in efforts to sway public opinion, and it is not implausible that litigants will try to use similar content in civil or criminal cases in the future.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">At the moment, India lacks a specialised legal or forensic framework to verify the veracity of AI-generated audio or video evidence. Section 65B of the Indian Evidence Act (or its updated counterpart in the Bharatiya Sakshya Adhiniyam, 2023) mandates certification requirements for electronic records, but these provisions were never intended to cover deepfakes and other synthetic content.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To combat this threat, Indian courts may soon have to:<\/span><\/p>\n<ul>\n<li><span style=\"font-weight: 400;\">Create standardised forensic procedures to check for the presence of deepfakes.<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Require digital submissions to include disclosures about AI-generated content.<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Work together with cybersecurity specialists and international AI watchdogs.<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Educate judges on how this kind of deception can happen.<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">The challenge lies not just in detecting deepfakes, but also in ensuring that real 
evidence is not wrongly dismissed as fake due to suspicion. The solution lies in balancing technical expertise with legal sensitivity, and in creating a legal environment where truth is not just sought but technologically verified.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As AI tools get smarter, so must our courts.<\/span><\/p>\n<p><b>Use of AI by Law Enforcement in India<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Numerous Indian law enforcement organisations have started experimenting with AI tools, frequently without public discussion or legislative approval.<\/span><\/p>\n<p><b>Facial Recognition:<\/b><span style=\"font-weight: 400;\"> For surveillance and investigation purposes, police departments in Hyderabad, Delhi, and other cities have implemented facial recognition systems (FRS).<\/span><\/p>\n<p><b>Crime Mapping and Prediction: <\/b><span style=\"font-weight: 400;\">Some cities employ tools that forecast future crime hotspots based on historical crime data.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Despite their dubious reliability, experimental tools such as <\/span><b>voice stress analysis and AI lie detectors<\/b><span style=\"font-weight: 400;\"> are occasionally employed during interrogations.&nbsp;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">What unites all of these deployments is the absence of oversight, transparency, and judicial accountability. In a criminal justice system where people\u2019s rights are at stake, this is dangerous.<\/span><\/p>\n<p><b>International Developments: A Learning Opportunity<\/b><\/p>\n<p><span style=\"font-weight: 400;\">India is not the only country grappling with the legal issues raised by AI. 
Several jurisdictions have already addressed the issue of AI-generated evidence:<\/span><\/p>\n<p><b>United States<\/b><\/p>\n<p><span style=\"font-weight: 400;\">In State v. Loomis (2016), a risk-assessment algorithm was used in sentencing. The defendant contested the algorithm\u2019s opacity. The court permitted its use but cautioned against exclusive reliance on it. In the United States, AI evidence sits uneasily with the Sixth Amendment right to confront and cross-examine witnesses.<\/span><\/p>\n<p><b>European Union&nbsp;<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The EU\u2019s AI Act (2024) places judicial and law enforcement AI under the \u201chigh-risk\u201d category and mandates accountability, transparency, and human oversight. European courts are adopting a rights-first stance.<\/span><\/p>\n<p><b>United Kingdom<\/b><\/p>\n<p><span style=\"font-weight: 400;\">In R (Bridges) v. South Wales Police (2020), the Court of Appeal ruled that police use of facial recognition technology was unlawful, in part because the legal framework governing it was unclear. India can learn from these cases and put protective safeguards in place sooner rather than later.<\/span><\/p>\n<p><b>Ethical and Constitutional Dilemmas<\/b><\/p>\n<p><b>Right to a Fair Trial (Article 21)<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The use of opaque AI-generated evidence may violate the right to a fair trial. An individual cannot rationally defend against a \u201cdecision\u201d made by an algorithm whose logic is inaccessible.<\/span><\/p>\n<p><b>Right to Privacy (Puttaswamy Judgment)<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Large volumes of personal data are gathered and processed by AI tools like surveillance analytics and facial recognition. This infringes upon the right to privacy outlined in the Puttaswamy v. 
Union of India (2017) ruling in the absence of adequate safeguards.<\/span><\/p>\n<p><b>Presumption of Innocence<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The principle that a person is innocent until proven guilty may be threatened by predictive policing or risk-scoring algorithms that flag someone as suspicious without any concrete proof of wrongdoing.<\/span><\/p>\n<p><b>The Road Ahead: Recommendations for Reform<\/b><\/p>\n<p><span style=\"font-weight: 400;\">India must quickly adopt a forward-thinking, rights-based approach to AI-generated evidence to ensure that justice is not lost to technology.<\/span><\/p>\n<ul>\n<li aria-level=\"1\"><b>Enact Specific Legislation on AI in Criminal Justice<\/b><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">A comprehensive law governing the use, limits, and admissibility of AI tools in criminal and civil matters is needed. This law must cover:<\/span><\/p>\n<ul>\n<li><span style=\"font-weight: 400;\">Certification of AI tools<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Transparency requirements<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Disclosure of training data<\/span><\/li>\n<li><span style=\"font-weight: 400;\">Independent audits<\/span><\/li>\n<\/ul>\n<ul>\n<li aria-level=\"1\"><b>Establish a Legal Definition of AI Evidence<\/b><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">Legislators and courts need to clarify what \u201cAI-generated evidence\u201d is and set it apart from conventional electronic evidence.<\/span><\/p>\n<ul>\n<li aria-level=\"1\"><b>Mandate Explainability and Open Access<\/b><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">Legal proceedings should only use AI tools whose operation can be audited and explained. 
Criminal trials should not rely on black-box algorithms.<\/span><\/p>\n<ul>\n<li aria-level=\"1\"><b>Include Human Supervision at All Levels&nbsp;<\/b><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">No AI-generated evidence should be used without a human expert confirming its validity. Judges need to be taught to assess these tools critically rather than accept their outputs at face value.&nbsp;<\/span><\/p>\n<ul>\n<li aria-level=\"1\"><b>Form a Committee on Judicial Technology&nbsp;<\/b><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">A national committee comprising judges, technologists, ethicists, and lawyers ought to be formed to examine and authorise AI tools for legal use.<\/span><\/p>\n<ul>\n<li aria-level=\"1\"><b>Invest in Training and Education in Law<\/b><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">AI literacy must be taught in law schools and judicial academies. Future judges and lawyers need to understand the strengths and limitations of algorithmic systems.<\/span><\/p>\n<p><b>Conclusion<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Artificial intelligence has the potential to make justice faster, smarter, and more efficient if it is used responsibly, transparently, and with a strong commitment to human rights. Without robust safeguards, however, AI-generated evidence could be a double-edged sword that accelerates rather than prevents injustice. India is at a turning point. The legal community, legislature, and courts must work together to ensure that technology serves justice, not the other way around. AI evidence must always be accepted on the basis of sound legal reasoning and constitutional principles, never on novelty or convenience. As we look to the future, we must remember that justice is not a formula. It is a commitment to treat everyone equally, with respect and care. 
And a machine should never take precedence over that.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The integration of artificial intelligence into the Indian legal system is a question of \u201cwhen,\u201d not \u201cif.\u201d Even though AI has the potential to significantly increase the effectiveness and precision of criminal investigations and court cases, it should never take the place of human judgement, empathy, or accountability. In a nation as diverse, democratic, and constitutionally rich as India, we must make sure that technology advances justice rather than impedes it. Even though evidence produced by AI might seem objective and scientific, it is ultimately built and trained by humans, who have their own prejudices, presumptions, and constraints. If we ignore this fact, we run the risk of turning our courtrooms into data centres where justice is reduced to code and human stories are lost in algorithms.&nbsp; <\/span><span style=\"font-weight: 400;\">An AI tool\u2019s incorrect identification is more than just a technical error: it can lead to wrongful arrest, humiliation, incarceration, or even death.&nbsp;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In any civilised society, such outcomes are unacceptable. The real danger with AI lies not in its use, but in its uncritical and unregulated adoption. When courts begin to rely on results that even experts are unable to fully explain, we violate basic legal principles such as the presumption of innocence, the right to a fair trial, and the right to be heard. 
Despite advancements in technology, these principles remain unalterable.<\/span><\/p>\n<p><b>References &amp; Sources<\/b><\/p>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Indian Evidence Act, 1872, Section 65B \u2013 Admissibility of electronic records (Bare Act, Indian Kanoon)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Bharatiya Sakshya Adhiniyam, 2023 \u2013 Legislation replacing the Indian Evidence Act (PRS Legislative Research summary)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Anvar P.V. v. P.K. Basheer, (2014) 10 SCC 473 \u2013 Landmark case on the admissibility of electronic evidence (Supreme Court judgment)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">State (NCT of Delhi) v. Navjot Sandhu (Parliament Attack Case), (2005) 11 SCC 600 \u2013 Use of digital evidence (case summary)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Justice K.S. Puttaswamy v. Union of India, (2017) 10 SCC 1 \u2013 Landmark judgment recognising the Right to Privacy (judgment text)<\/span><\/li>\n<\/ol>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Author: Sakshi Tripathi, student of BBA.LLB (3rd Year), United University, Prayagraj Introduction&nbsp; Artificial intelligence (AI) has evolved from a futuristic idea to a reality in today\u2019s quickly evolving digital world. It can be found in self-driving cars, content curation, smart surveillance systems, smartphone apps, and even the legal system. 
An important question that emerges as [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":5046,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[85],"tags":[],"_links":{"self":[{"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/posts\/5196"}],"collection":[{"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/comments?post=5196"}],"version-history":[{"count":10,"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/posts\/5196\/revisions"}],"predecessor-version":[{"id":5208,"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/posts\/5196\/revisions\/5208"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/media\/5046"}],"wp:attachment":[{"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/media?parent=5196"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/categories?post=5196"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lawjurist.com\/index.php\/wp-json\/wp\/v2\/tags?post=5196"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}