Facial Recognition and the Right to Privacy: Legal and Ethical Concerns in India

Published On: September 30th 2025

Authored By: Samruddhi Pawar
ILS Law College

Introduction:

Consider walking through a railway station while looking at a camera. Your face is captured and scanned without your awareness. This is no longer science fiction: it is a current reality in India. Facial Recognition Technology (FRT) is regarded worldwide as a potent means of surveillance, and India is no exception. Governmental, quasi-governmental, and even private organizations in India have adopted FRT, often without an established legal framework, which invites misuse. This raises serious concerns about its constitutional implications, especially for the right to privacy protected under Article 21[1] of the Indian Constitution, and deepens the ongoing debate over FRT's legal and ethical implications in India, particularly in light of the Puttaswamy judgment,[2] the absence, until recently, of a national data protection law, and the erosion of constitutional and democratic rights. At its core, FRT enables the categorization, indexing, and semi-automated recognition of persons in photos and videos: an image or video frame of a face can be compared with previously captured images in a database to authenticate and verify the person in question. Even though reliance on this technology for administrative efficiency, security, and law enforcement is growing everywhere, its unregulated use raises serious questions about the erosion of fundamental rights, especially the right to privacy. The emergence of facial recognition systems deserves particular attention in the Indian context, as privacy has only recently been recognized as a fundamental right under Article 21 of the Constitution.
A legal and ethical framework is urgently required, given unclear statutory regulation, the potential for misuse, data breaches, and the technology's bias against underrepresented communities. This article examines the legal framework governing FRT in India, its ethical concerns, and comparative jurisprudence.

Function and Application:

Facial Recognition Technology (FRT) captures an individual’s photographic face and scans biometric features like the eye distance and jawline shape to match with previously stored data. In India, this technology has been adopted in the following ways:

  1. It has been implemented by law enforcement agencies to locate and identify both suspects and missing individuals.
  2. It is used by airports for seamless passenger boarding through systems like DigiYatra.
  3. It is used by municipal corporations for tracking employee attendance.
  4. It has been adopted by police departments for crowd surveillance during public events and protests.

In 2019, the National Crime Records Bureau (NCRB) proposed a country-wide National Automated Facial Recognition System (NAFRS), designed to collate facial data from different state records and cross-reference images against databases to identify and track offenders. Although such systems increase operational efficiency, they threaten individual privacy and democratic rights. Meanwhile, contactless travel is enabled through systems like DigiYatra, and the Delhi Police used FRT to surveil protestors during the 2019-2020 anti-CAA protests.[3]
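The matching step described above, comparing a captured face against pre-enrolled database entries, can be illustrated with a minimal, purely hypothetical sketch. Real FRT systems use trained neural networks to convert a face image into a numeric "embedding"; the vectors, names, and threshold below are invented for illustration only.

```python
import math

def cosine_similarity(a, b):
    """Measure how alike two face embeddings are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.9):
    """Return the best-matching identity above the threshold, or None."""
    best_id, best_score = None, threshold
    for identity, stored in database.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Illustrative database of pre-enrolled embeddings (values are invented)
database = {
    "person_A": [0.9, 0.1, 0.3],
    "person_B": [0.2, 0.8, 0.5],
}

probe = [0.88, 0.12, 0.31]  # embedding of a newly captured face
print(match_face(probe, database))  # prints: person_A
```

Note that the threshold is a policy choice as much as a technical one: set it too low and the system misidentifies strangers as known persons; set it too high and genuine matches are missed. This trade-off is exactly where the bias and misidentification concerns discussed later arise.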

Legal Framework:

The right to privacy in India received constitutional recognition through the landmark decision in Justice K.S. Puttaswamy & Anr. v. Union of India (2017). A nine-judge bench of the Supreme Court held that the right to privacy is an aspect of personal liberty and a fundamental right under Article 21. Any infringement of privacy must satisfy three conditions: legality, legitimate aim, and proportionality.

  1. Legality: The existence of a law.
  2. Legitimate aim: The law’s purpose must be reasonable and in the public interest.
  3. Proportionality: The law must be rationally connected to its purpose and it must infringe the right as little as possible.

Applying this framework to FRT, many current uses by state actors appear unconstitutional. India has no law that specifically regulates facial recognition technology. Most deployments of FRT rest on executive orders, administrative guidelines, or pilot programs, with no public dialogue or parliamentary oversight.

Digital Personal Data Protection Act, 2023[4]: In 2023, India passed the Digital Personal Data Protection Act, at least in part to respond to calls for a data protection regime. The Act is intended to provide a legal framework for personal data, including biometrics, but its applicability has limitations:

  1. There are no specific regulations of FRT in the Digital Personal Data Protection Act.
  2. Government agencies benefit from broad exemptions.
  3. There are no regulations for algorithmic accountability or audits.

As a result, although the Act attempts a framework, state and private actors remain largely unimpeded in how they deploy FRT.

Ethical Issues in India: 

Informed Consent and Autonomy: The application of facial recognition technology in India largely disregards the ethics of informed consent. FRT deployments typically involve no notice to, or consent from, the individuals being captured, particularly in public spaces. This violates the ethical principle of autonomy, which calls for respecting individuals’ right to control their personal data and decisions regarding their own identities. In many cases, individuals are surveilled covertly or indirectly coerced into consenting. For example, at airports, DigiYatra users are often encouraged to opt in without being shown an explicit privacy policy.

Disproportionate Surveillance and the ‘Chilling’ Effect: Mass surveillance through FRT during public protests or gatherings has the potential to chill the rights to freedom of expression and assembly. These rights are fundamental in any democracy, allowing people to protest without fear of retaliation or profiling. The Delhi Police’s use of FRT to surveil protesters, especially from minority communities, raises alarms about entrenched inequalities, escalating political surveillance of dissenting views, distortion of public narratives, and the inability of marginalized people to protest for fear of targeted police surveillance.

Bias and Discrimination: Several international studies have established that facial recognition systems are more likely to misidentify women, children, and people with darker skin tones. The use of imported or inadequately trained algorithms could reproduce and amplify these biases in India, raising ethical concerns about discrimination and fairness. This is particularly problematic because the technology is used in consequential decisions such as criminal identification. In India’s diverse society, algorithmic bias risks a disproportionate impact on historically marginalized groups, e.g., Dalits, Adivasis, and Muslims.

Data Retention and Purpose Limitation: Most present and upcoming FRT projects do not require a narrow purpose for collection, and there is no limit on how long facial data may be retained. Without data minimization, there is an obvious risk of function creep: data collected for one purpose may be used for another, usually without consent. This breaches the ethical principles of transparency, data minimization, and purpose limitation, core norms of fair information practices.

Lack of Accountability and Redressal Mechanisms: Most FRT deployments in India are based on vague administrative orders or pilot projects with no independent oversight mechanism. If someone is misidentified or their facial data is misused, there is no grievance redressal process or defined institutional responsibilities for criminal misidentifications or violations of policy.

Ethical Problems and Discrimination: FRT is as much an ethical issue as a legal one. Studies by MIT researchers and the ACLU have demonstrated that face recognition algorithms have higher error rates when identifying women, children, and people with darker skin, including in India. In a society stratified by caste, class, and religion, which already shapes access to equality and justice, biased data and biased algorithms will only perpetuate discrimination. Deploying FRT in public spaces without notice, consent, or any opportunity to opt out is an assault on dignity and autonomy. Using FRT to surveil peaceful protestors, as occurred in Delhi during the anti-CAA protests, violates the right to peaceful assembly and expression. Such practices have a chilling effect: people refrain from exercising their democratic rights for fear of being identified and punished. In sum, FRT contravenes the privacy rights recognized under Article 21 in Justice K.S. Puttaswamy v. Union of India (2017), and its routine, consent-free deployment enables the tracking of individuals in their public movements, potentially restricting the freedoms of movement and assembly protected under Article 19(1).

There is no specific legal basis for the implementation of FRT. India’s Digital Personal Data Protection Act of 2023 provides only tepid protection for biometric data and is not yet in force. The use of FRT in law enforcement, such as by the Delhi Police during protests, raises concerns about profiling and discrimination. In addition, there is no defined framework to ensure accountability or prevent misuse, a significant regulatory gap.

Recommendations:

Facial recognition technology should be governed by dedicated legislation that defines relevant terms, permissible applications, approval requirements, and independent oversight mechanisms.

All FRT deployments should be subject to mandatory notification, risk assessment, and monitored limits on the retention of sensitive data.

Facial recognition technology should be prohibited, or at a minimum strictly regulated, during democratic protests and gatherings, in order to preserve democratic liberties.

An oversight body independent of the organizations that deploy facial recognition technology should be established to monitor and audit its use.

All facial recognition systems and procedures must undergo systematic, independent evaluation for error rates and bias before and during deployment.

Citizens must be granted the rights to appeal, amend, and permanently erase their facial data when an error or misuse of facial recognition data occurs.

Conclusion:

Facial Recognition Technology (FRT) can be both advantageous and disadvantageous. While FRT can offer efficiency and security, the improper use of this technology can undermine fundamental rights and ethical values. In the case of India, using FRT without laws, mechanisms, informed consent, regulation, and liability frameworks puts individuals’ privacy, expression, and equality at risk. From an ethics perspective, FRT policies should be guided by explicit criteria such as transparency, fairness, autonomy, and accountability. FRT must not violate the principles outlined in the Puttaswamy case in the Indian context. India is at a critical juncture today. The path chosen will either embrace mass surveillance or restrict it in the name of our enshrined ideals. Which direction India chooses will determine the future of the country’s digitized environment.

References:

[1] Indian Const. art 21.

[2] Justice K.S. Puttaswamy (Retd) and Anr. Vs. Union of India and Ors, AIR 2017 SC 4161 (India).

[3] Internet Freedom Foundation, Project Panoptic: Facial Recognition in India, https://panoptic.in, accessed 5 Aug 2025.

[4] Digital Personal Data Protection Act, 2023, No. 22 of 2023, https://www.meity.gov.in.
