Facial Recognition and the Right to Privacy in India

Published on 29th July 2025

Authored By: Shruti Kumari
Bharati Vidyapeeth New Law College, Pune

Introduction

The constitutional basis for challenging Facial Recognition Technology (FRT) lies in the Supreme Court's landmark decision in Justice K.S. Puttaswamy (Retd.) v. Union of India (2017)[1], wherein a nine-judge bench affirmed privacy as a fundamental right under Articles 14, 19, and 21. The Court shaped a stringent three-fold test of legality, legitimate state aim, and proportionality that any biometric surveillance measure, including FRT, must satisfy.[2] Under the IT Act, 2000 (Sections 43A and 72A) and its 2011 Rules, biometric data is explicitly classified as "sensitive personal data", imposing duties on data handlers and penalties for misuse. In Puttaswamy, Justice Chandrachud emphasised that lawful data collection must be the least intrusive means of achieving its objective and must include procedural safeguards against abuse. Consequently, any deployment of FRT by state or local authorities must be anchored in clear legislation, pursue a legitimate purpose, be necessary and proportionate, and include built-in oversight, failing which it risks infringing the right to privacy guaranteed under Article 21.

Surveillance Deployment of Facial Recognition and Legal Risks

Facial Recognition Technology (FRT) is being deployed by Indian police, airports, and public institutions without a clear statutory mandate. Courts require that such surveillance meet the Puttaswamy three-pronged test of legality, legitimate state aim, and proportionality.[3] However, unlike Aadhaar, no central or state legislation explicitly authorises FRT deployment, meaning legal legitimacy is often assumed rather than established. Additionally, the Indian Evidence Act, 1872, does not yet address FRT-derived evidence as such, raising concerns over its admissibility and the procedural safeguards attached to it. These regulatory voids underscore that current use of FRT by public authorities may fail to satisfy judicially mandated privacy protections, placing such programmes at risk of being struck down unless they are supported by transparent laws, targeted purposes, necessity assessments, and oversight mechanisms.

Surveillance Deployment and Legal Gaps in Facial Recognition

Despite the robust principles established in Puttaswamy (2017)[4], actual deployment of Facial Recognition Technology (FRT) by police and public authorities operates in a statutory vacuum.

As noted by legal analysts in the context of AFRS and state-level police systems, such surveillance "fails to meet any of these thresholds": there is an absence of legality, manifest arbitrariness, and a lack of safeguards and accountability.[5] FRT systems in use, from the National Automated Facial Recognition System (AFRS) to Punjab's PAIS and Uttar Pradesh's Trinetra, lack authorisation via primary legislation or subordinate rules. The result is a serious risk of privacy violations: no oversight over data retention, no standards for accuracy, and no process for redressal of errors. Unless FRT use is anchored in clear, democratically enacted law with judicial or parliamentary oversight, routine deployments remain constitutionally precarious.

Accuracy, Bias and Chilling Effects

Independent audits of facial recognition tools highlight significant accuracy concerns, particularly in the Indian context. A study published in the Indian Journal of Legal Studies & Innovations found error rates as high as 14–42% for gender and age classification on Indian faces, with notably greater misidentification of women, evidencing substantial algorithmic bias.[6] Such inaccuracies are not merely technical shortcomings; they translate into real harms. When FRT is deployed for crowd monitoring or identifying suspects at protests, these misidentifications can lead to wrongful detentions, stigmatisation, and erosion of civil liberties. For constitutional protection, FRT systems must adhere to Puttaswamy standards of not only legality and necessity but also reliability. Without validation against error thresholds and independent audit mechanisms, using FRT risks chilling fundamental freedoms of speech, movement, and dissent under Articles 19 and 21.

A pivotal study, "Cinderella's Shoe Won't Fit Soundarya: An Audit of Facial Processing Tools on Indian Faces" by Gaurav Jain and Smriti Parsheera (2021), examined four commercial facial recognition systems using a dataset of 32,184 Indian faces from the Election Commission. It found error rates as high as 14.68% for gender classification (particularly for women), and 14.3% to 42.2% for age estimation, even within a ±10-year margin.[8] Such substantial inaccuracies are far from benign: when authorities employ FRT to identify suspects in crowds or to surveil public gatherings, these error rates can lead to misidentification, wrongful suspicion, and unwarranted restrictions on speech and movement. Under the Puttaswamy framework, surveillance tools must not only be legal, necessary, and proportionate; they must also be reliable. With error margins this high, FRT lacks reliability, heightening the risk of chilling effects on fundamental rights under Articles 19 and 21.

Judicial and Statutory Responses

In a watershed April 2025 ruling, the Calcutta High Court prohibited police agencies from deploying Facial Recognition Technology (FRT) in public spaces without an explicit legislative framework. The division bench held that any biometric surveillance must satisfy the Puttaswamy thresholds of legality, necessity, proportionality, and procedural safeguards, and mandated transparent disclosures regarding deployment zones and data retention policies.[10] In parallel, the Supreme Court issued a consequential directive: police may use FRT only under judicial warrant in active investigations, and all facial data must be purged once investigations conclude, unless a statute requires retention.[11] These decisions collectively affirm that public use of FRT without statutory sanction breaches Articles 14, 19, and 21 and is thus constitutionally untenable.


Comparative and Regulatory Perspectives

India currently lacks a comprehensive statutory framework for Facial Recognition Technology (FRT). By contrast, the European Union treats biometric identifiers as "special category data" under the General Data Protection Regulation (GDPR), demanding explicit consent, impact assessments, and data minimisation before processing, thereby setting a high global benchmark.

In India, the Digital Personal Data Protection Act, 2023 nominally safeguards biometric data but contains a key loophole: Section 17 allows the government to exempt its agencies from the Act's obligations, potentially permitting unregulated FRT use. Without explicit regulations specific to facial biometric systems, data subjects lack transparency, oversight, or recourse for misuse.

Accordingly, Indian policy must evolve to incorporate:

  • Warrant-based deployment: FRT should be deployable only via judicial warrant in active criminal investigations.[14]
  • Mandatory Privacy Impact Assessments (PIAs): Before any system rollout, independent audits should verify standards for accuracy, bias, data retention, and deletion.
  • Public transparency mechanisms: Prior public notice, signage in deployment zones, and publication of data retention policies, reflecting the Calcutta High Court's transparency directions.[15]
  • Independent Oversight Body: A specialised agency tasked with FRT certification, audit authority, and enforcement powers over misuse.

This layered regulatory model of judicial warrant, technical safeguards, transparency, and independent oversight aligns with both the Puttaswamy precedent and global best practices, ensuring that FRT deployment does not become a covert erosion of constitutional freedoms.

Conclusion

India stands at a critical juncture in reconciling the benefits of Facial Recognition Technology (FRT) with its constitutional obligations. The April 2025 landmark judgment of the Calcutta High Court explicitly barred unregulated FRT use in public spaces, mandating that any deployment adhere to the Puttaswamy trilogy of legality, necessity, and proportionality, and be supported by transparent data retention and access protocols. Shortly thereafter, the Supreme Court reinforced this position, ruling that FRT use by police is permissible only under a judicial warrant and that facial data must be purged post-investigation unless statutory retention applies. These twin judicial pronouncements reaffirm that biometric surveillance cannot proceed without statutory or regulatory grounding.

Yet, statutes like the Digital Personal Data Protection Act, 2023 and the Aadhaar Act, 2016,  while covering biometric data broadly, do not explicitly govern FRT systems. This silence  creates a legal vacuum, encouraging piecemeal deployments without proportional oversight. To bridge this divide, two legislative responses are essential:

  1. Enact a Facial Recognition–specific regulation: Parliament should mandate judicial warrants for FRT use, define accuracy standards, prescribe data-deletion timelines, and institute robust redress mechanisms.
  2. Set Up an Independent Oversight Authority: This body would oversee pre-deployment Privacy Impact Assessments (PIAs), certify FRT systems, audit deployments, and investigate complaints, thus embedding procedural integrity and public trust.

By translating jurisprudential principles into binding law and instituting independent oversight, India can align its FRT deployment with democratic norms. As the courts have clarified, liberty and security need not be antagonistic; with proper framing and governance, technological progress can coexist with constitutional rights. The judiciary's recent push sets the blueprint; now it is Parliament's turn to legislate responsibly.


References

[1] K.S. Puttaswamy (Privacy-9J.) v. Union of India, (2017) 10 SCC 1

[2] www.nishithdesai.com

[3] https://thedialogue.co/does-the-traceability-requirement-meet-the-puttaswamy-test

[4]  K.S. Puttaswamy (Privacy-9J.) v. Union of India, (2017) 10 SCC 1

[5] https://en.wikipedia.org/w/index.php?title=Facial_recognition_system&oldid=1292816955

[6] https://wilj.law.wisc.edu/

[7] https://internetfreedom.in/problems-with-facial-recognition-systems-operating-in-a-legal-vacuum

[8] https://www.catalyzex.com/author/Gaurav%20Jain

[9] https://www.smritiparsheera.com/research/artifical-intelligence

[10] https://en.wikipedia.org/w/index.php?title=Facial_recognition_system&oldid=1292816955

[11] https://www.lawgratis.com

[12] https://www.lawgratis.com/blog-detail/calcutta-hc-directs-police-not-to-use-facial-recognition-tech-without-regulation-a-win-for-privacy-in-the-digital-age

[13] https://www.lawgratis.com/blog-detail/supreme-court-bans-the-use-of-facial-recognition-technology-by-police-without-consent

[14] https://www.devdiscourse.com/article/law-order/3337162-calcutta-high-court-rules-against-consumer-forum-arrest-warrants

[15] https://www.lawgratis.com/blog-detail/calcutta-hc-directs-police-not-to-use-facial-recognition-tech-without-regulation-a-win-for-privacy-in-the-digital-age
