Facial Recognition and the Right to Privacy: Legal and Ethical Concerns in India

Published On: September 30, 2025

Authored By: Heena Bedi
Janhit College of Law, CCSU

Abstract

Facial Recognition Technology (FRT) is being deployed at scale in India with negligible legislative guardrails. This paper argues, from a constitutional and rights-based perspective, that current deployments risk contravening the Puttaswamy proportionality framework and that the state-centric exemptions in the data protection regime are dangerously broad. The solution is a bespoke statutory scheme with mandatory judicial authorisation, strict purpose limitation, algorithmic auditability, retention limits, and an independent oversight authority.

  1. Introduction

Facial Recognition Technology (FRT) converts human faces into biometric signatures and enables automated identification and verification at scale. Its rapid uptake in policing, public surveillance and administrative workflows promises operational efficiencies, but it also creates systemic risks to privacy, equality and democratic participation. In India, the problem is acute: law enforcement agencies are piloting or operationalising FRT-linked systems (notably the Automated Facial Recognition System proposals associated with the National Crime Records Bureau) without a coherent legislative framework, parliamentary scrutiny, or robust judicial oversight. This is not merely a regulatory gap; it is a constitutional fault line. The scale and permanence of facial biometrics demand more than ad hoc executive memos; they demand a statute calibrated to the Puttaswamy test. [1]
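To make the underlying mechanics concrete, the following is a minimal sketch of how one-to-many identification typically works: a model converts each face image into a fixed-length numeric vector (a "faceprint" or embedding), and a probe image is matched against a gallery by similarity score. Everything here is illustrative; the random vectors stand in for a real embedding model, and the 0.6 threshold is an arbitrary assumption, not a figure drawn from any deployed system.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, gallery, threshold=0.6):
    """Return the gallery identity most similar to the probe if the score
    clears the match threshold, otherwise None. The threshold trades
    false positives against false negatives."""
    best_id, best_score = None, -1.0
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

# Illustrative stand-ins: a real system derives these 128-dimensional
# vectors from face images via a neural network.
rng = np.random.default_rng(seed=0)
gallery = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = gallery["person_a"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(identify(probe, gallery))  # matches "person_a"
```

The threshold choice is precisely where the accuracy and bias questions discussed later in this paper arise: lowering it increases false matches, and error rates can differ across demographic groups.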

  2. Constitutional Touchstones: Puttaswamy and the Three-fold Test

The Supreme Court’s judgment in Justice K.S. Puttaswamy (Retd) and Anr v Union of India elevated privacy to a fundamental right under Article 21 and articulated the tripartite legality-necessity-proportionality test for permissible state intrusions. [2] Any biometric surveillance programme that operates without a clear legal mandate, fails to pursue a legitimate aim, or is not the least intrusive means will struggle to survive constitutional scrutiny. PUCL v Union of India emphasised procedural safeguards to prevent executive overreach in communications interception: an instructive precedent for FRT, which similarly involves the pervasive capture of personal data, albeit in public spaces rather than over telephone lines. [3]

The Aadhaar litigation points in the same direction: while the Court upheld the constitutional footing of Aadhaar for welfare delivery, it insisted on strict purpose limitation, data minimisation, and safeguards against function creep. [4] FRT must be assessed through the same lens: if the State cannot demonstrate statutory authority, a precise purpose, and built-in constraints, the deployment is constitutionally infirm.

  3. Statutory Landscape and Its Inadequacy

India’s statutory scaffolding is fragmented. The Information Technology Act 2000 provides a patchwork regulatory overlay for digital data, but it was not drafted with biometric mass surveillance in mind. The Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act 2016 is narrowly tailored to identity authentication for welfare delivery and does not provide a comprehensive governance model for surveillance FRT. The Digital Personal Data Protection Act, 2023 (DPDP Act) brings welcome baseline norms of consent and data subject rights, but it simultaneously carves out broad state exemptions for sovereignty, public order, and security, a carve-out that, if read expansively, could legitimise wholesale biometric surveillance without judicial oversight. [5]

This statutory incoherence matters because FRT captures immutable identifiers: a breach of facial biometric data is irreversible. Unlike passwords or tokens, faces cannot be changed. The legislature must therefore treat facial biometrics as a special category, not as a mere “personal data” sub-class to be swept into general data processing rules. The DPDP Act’s failure to expressly classify biometric facial templates as ultra-sensitive under a tailored regime is a regulatory omission that invites function creep.

  4. Key Legal Risks under Puttaswamy

First, legality: many FRT deployments rest on administrative orders or procurement contracts rather than a statute conferring investigatory power or surveillance authorisation. Administrative fiat is not a substitute for parliamentary law where fundamental rights are at stake.

Second, necessity: the State must show evidence that FRT is necessary to achieve the asserted objective and that less intrusive alternatives have been considered. Blanket, indiscriminate scanning of public spaces fails this requirement.

Third, proportionality: the infringement must be proportionate to the aim. This requires narrowly defined authorisation, retention caps, access controls, independent oversight, and redress mechanisms. Absent these, the use of FRT in policing and public administration risks being disproportionate.

A corollary risk is function creep. Once infrastructural investments and datasets exist, their reuse for non-original purposes (political profiling, immigration enforcement, commercial analytics) becomes likely unless statutes expressly prohibit secondary uses and prescribe strict penalties.

  5. Equality and Algorithmic Bias

Empirical studies outside India have documented disparate false-positive rates for women and darker-skinned individuals. In the Indian socio-legal context, algorithmic inaccuracies translate into wrongful detentions, reputational harm, and discriminatory enforcement, all of which engage Article 14’s guarantee of equality and Article 21’s protection of life and liberty. The State cannot hide behind proprietary algorithmic black boxes: any regulatory regime must mandate independent, third-party bias audits; transparency obligations regarding training datasets; and accessible remedies for misidentification.
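What such a bias audit would actually compute is not mysterious. The sketch below, run on entirely synthetic records with illustrative field names, shows the core metric a regulator would compare across groups: among comparisons where the two faces belong to different people, how often did the system wrongly declare a match?

```python
from collections import defaultdict

def false_positive_rates(records):
    """Per-group false-positive rate: among comparisons where the probe and
    the matched identity are truly different people (ground-truth negatives),
    the share the system nonetheless flagged as a match."""
    negatives = defaultdict(int)   # ground-truth non-matches seen per group
    false_pos = defaultdict(int)   # of those, how many the system matched
    for r in records:
        if not r["same_person"]:          # ground truth: different people
            negatives[r["group"]] += 1
            if r["system_matched"]:       # system wrongly said "match"
                false_pos[r["group"]] += 1
    return {g: false_pos[g] / negatives[g] for g in negatives}

# Synthetic audit log: each row is one comparison the system performed.
records = [
    {"group": "A", "same_person": False, "system_matched": True},
    {"group": "A", "same_person": False, "system_matched": False},
    {"group": "B", "same_person": False, "system_matched": False},
    {"group": "B", "same_person": False, "system_matched": False},
]
print(false_positive_rates(records))  # e.g. {'A': 0.5, 'B': 0.0}
```

A statutory audit would go further (confidence intervals, multiple thresholds, intersectional groups), but disparity in this single figure is already the harm that the equality analysis targets.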

  6. Ethical and Democratic Costs

Beyond legal doctrine, FRT has chilling effects on civic participation. Continuous or easily searchable surveillance suppresses dissent, disincentivises attendance at protests, and corrodes the associational fabric of liberal democracy. These are not speculative harms; they are predictable social reactions to pervasive identification systems. Ethics requires more than downstream mitigation; it requires purposive limits at design and statutory levels.

  7. Operational Governance: Minimum Statutory Features

A credible statutory regime should contain the following non-negotiable elements:

  1. Clear purpose limitation — statutory enumeration of permissible uses (e.g., narrowly defined serious crime investigations), with an express prohibition on bulk, indiscriminate identification for general policing or administrative profiling.
  2. Judicial authorisation — real-time or retrospective access to identification databases should be subject to prior judicial or magistrate authorisation except in immediate exigencies, with mandatory post-action reporting.
  3. Independent oversight — a statutory supervisory authority with investigatory and sanctioning powers, capable of conducting algorithmic audits, reviewing procurement, and enforcing transparency.
  4. Algorithmic accountability — mandatory pre-deployment independent testing for accuracy and bias, publication of error rates, and open channels for independent researchers (with privacy-preserving access).
  5. Data minimisation and retention limits — strict rules on what biometric data may be stored, for how long, and for what purpose, with automatic deletion on expiry unless preservation is ordered by a court (a rule mechanical enough to encode directly, as the sketch after this list shows).
  6. Strong criminal sanctions and civil remedies — for unauthorised use, retention, or secondary use of facial biometric data.
  7. Compulsory impact assessments — privacy and human-rights impact assessments made public prior to large-scale deployments.
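As flagged in item 5, the retention rule is mechanical enough to be expressed in code, which is part of why its absence from current deployments is so conspicuous. The sketch below is illustrative only: the 30-day cap, the field names, and the in-memory store are assumptions for demonstration, not proposals for actual statutory figures.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # illustrative cap, not a proposed statutory figure

@dataclass
class BiometricRecord:
    subject_id: str
    captured_at: datetime
    court_hold: bool = False  # preserved under an express court order

def purge_expired(store, now=None):
    """Keep only records within the retention cap or under a court hold,
    i.e. the deletion rule sketched in item 5 above."""
    now = now or datetime.now(timezone.utc)
    return [r for r in store if r.court_hold or now - r.captured_at <= RETENTION]

store = [
    BiometricRecord("s1", datetime.now(timezone.utc) - timedelta(days=45)),
    BiometricRecord("s2", datetime.now(timezone.utc) - timedelta(days=45), court_hold=True),
    BiometricRecord("s3", datetime.now(timezone.utc) - timedelta(days=5)),
]
store = purge_expired(store)  # s1 is purged; s2 (court hold) and s3 survive
```

The point of the exercise is that retention limits are cheap to implement and trivially auditable; their omission is a policy choice, not a technical constraint.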

These features operationalise Puttaswamy’s proportionality requirement and introduce concrete procedural safeguards in lieu of the current ad hoc approach.

  8. Enforcement Architecture and Doctrinal Remedies

Statutory rights without enforcement are illusory. The oversight authority should be empowered to issue binding remediation orders, impose monetary penalties, and recommend criminal prosecution for wilful breaches. Courts should interpret the state exemptions in the DPDP Act narrowly and insist on granular, case-by-case authorisations rather than blanket immunities. In litigation, petitioners should be able to seek interlocutory relief to halt ongoing unlawful processing.

  9. Drafting Pointers for Counsel and Legislators

From a drafting perspective, the statute should: define “facial biometric data” and “faceprint” explicitly; permit interoperability standards only to the extent necessary for law enforcement purposes and subject to judicial oversight; prohibit commercial re-use of state-collected facial databases; mandate procurement transparency (including contract disclosure, subject to narrow confidentiality redactions); and require sunset clauses for pilot projects with mandatory parliamentary review.

Counsel representing civil society should press for injunctive relief and systemic discovery in public interest litigation to force disclosure of algorithms, accuracy metrics, and data governance processes. Legislators should embrace a problem-driven approach — identifying concrete harms and designing targeted prohibitions — rather than adopting a technology-agnostic or permissive statute.

  10. Conclusion: Urgency and Proportionality

FRT’s adoption without robust legal guardrails is not merely a technical or administrative problem — it is a constitutional crisis in the making. The Puttaswamy test is not an academic checklist; it is a practical framework that requires the State to justify surveillance intrusions in a transparent, proportionate, and legally accountable manner. India does not need a binary choice between security and liberty; it requires a calibrated legislative architecture that permits legitimate uses while constraining the State’s surveillance appetite. Anything less is governance by default: functional, expedient, and dangerously irreversible.

References

[1] Justice K.S. Puttaswamy (Retd) and Anr v Union of India (2017) 10 SCC 1, paras 63–69 (constitutional recognition of right to privacy and three-fold test).

[2] Ibid (summary of legality, necessity, proportionality framework); see also Justice K.S. Puttaswamy v Union of India (Aadhaar) (2019) 1 SCC 1, paras 119–127 (on purpose limitation and data minimisation in biometric schemes).

[3] People’s Union for Civil Liberties v Union of India (1997) 1 SCC 301 (telephone tapping and need for procedural safeguards).

[4] Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016.

[5] Digital Personal Data Protection Act, No. 22 of 2023, s 17(2) (state exemptions for sovereignty, public order, etc.).

[6] Information Technology Act, 2000 (overview of digital data governance; limited biometric specificity).

[7] Constitution of India, arts 14, 19(1)(a), 19(1)(b), 21 (equality, free speech, assembly, life and personal liberty).

[8] National Crime Records Bureau — Automated Facial Recognition System (AFRS) proposals and procurement (referenced for factual context; procurement actions indicate operationalisation trends).

(Note: procurement and policy documents should be appended in litigation or policy submissions.)
