Published On: October 4th 2025
Authored By: Madhura Karmakar
Jogesh Chandra Chaudhuri Law College, University of Calcutta
ABSTRACT:
Facial Recognition Technology (FRT) is a technology capable of identifying or verifying individuals from digital images by comparing their facial features. These systems are employed throughout the world by various governmental and private agencies. Though their effectiveness varies, the use of FRT has raised serious concerns such as violation of privacy, incorrect identifications, racial profiling and leakage of biometric data. In India, where biometric systems like Aadhaar have come into existence, there has been a growing confrontation between technological needs and constitutional protections of privacy. Though technologies like FRT promise better convenience and streamlined access to services, they still raise questions about civil liberties and the boundaries of state and corporate power. Through this article we will explore what FRT is, the need for it, and notable developments, along with the legal and ethical safeguards which India needs.
INTRODUCTION:
Facial recognition systems analyse our facial features to detect faces and match them to stored data. FRT is an AI-based system which allows identification and verification of a person based on images or video clips by using complex algorithms. FRT can be used in various fields, but its most common uses are:
- Verification of identity (unlocking phones)
- Identification of people (surveillance)
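At a high level, the two uses above differ in how a face is compared: verification is a one-to-one check of a captured face against a single stored template, while identification is a one-to-many search across a database. The sketch below illustrates this distinction, assuming (as is common in practice, though not stated in this article) that faces have already been converted into numeric feature vectors ("embeddings") and are compared by cosine similarity; the names, the tiny 4-number vectors and the 0.6 threshold are all illustrative placeholders, not real system parameters.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify(probe, enrolled, threshold=0.6):
    """1:1 verification (e.g. phone unlock): does the probe face
    match one specific enrolled template?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.6):
    """1:N identification (e.g. surveillance): search the whole
    gallery and return the best-matching identity, or None."""
    best_name, best_score = None, threshold
    for name, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy 4-dimensional "embeddings" stand in for real model outputs.
gallery = {"alice": [1.0, 0.0, 0.2, 0.1],
           "bob":   [0.0, 1.0, 0.1, 0.3]}
probe = [0.9, 0.1, 0.2, 0.1]

print(verify(probe, gallery["alice"]))   # 1:1 check
print(identify(probe, gallery))          # 1:N search
```

The threshold is where the accuracy concerns discussed later enter: set it too low and the system produces false matches; set it too high and genuine users are rejected.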
The Indian legal framework faces some shortcomings in addressing the complexities of FRT and its impact on privacy. Although the Information Technology Act (2000) and the Aadhaar Act (2016) provide some framework for data protection, they do not address specific concerns related to facial recognition. The absence of clear and comprehensive legislation leaves a huge gap in addressing data privacy in India.
CONSTITUTIONAL BACKDROP: RIGHT TO PRIVACY AS A FUNDAMENTAL RIGHT
Article 21, enshrined under Part III of the Indian Constitution, guarantees the right to life and personal liberty to every citizen of India. The right to privacy is one such right that comes under the purview of Article 21. The Aadhaar Act of 2016 raised serious concerns that the use of biometric authentication could violate the privacy of Aadhaar holders. Justice K.S. Puttaswamy (Retd.) filed a case in the Supreme Court of India challenging the validity of the Aadhaar scheme. The Supreme Court's landmark decision in K.S. Puttaswamy v. Union of India (2017)[1] held that the right to privacy is intrinsic to the right to life and personal liberty under Article 21 of the Indian Constitution. This judgment also provided a threefold test for state intrusions into privacy:
- A law must be present that authorizes such intrusion,
- It must have a legitimate aim,
- Such action must be necessary and minimally intrusive in nature.
CURRENT LEGAL SCENARIO
There is no single or comprehensive Indian statute that deals with facial recognition per se. Rather, a patchwork of legislations deals with the use of FRTs:
- Aadhaar Act (2016) and UIDAI[2]: It governs the use of the biometric identification system, and UIDAI has issued guidance on face authentication as one of the means of verification. Still, the Act has been heavily criticized for its lack of clarity on data protection and citizens' privacy.
- The Information Technology Act (2000)[3]: It contains some provisions relating to sensitive personal data and consent requirements. Still, it was not designed to address the challenges posed by FRT.
- The Digital Personal Data Protection Act (2023)[4]: Enacted by Parliament in August 2023, it aims to establish a framework for personal data protection, including biometric data. The Act offers some hope for addressing the challenges of FRTs but leaves a wide gap when it comes to state surveillance and sensitive biometrics.
RECOMMENDATIONS OF NITI AAYOG FOR THE RESPONSIBLE USE OF FRTs:
- PRIVACY AND SECURITY: As set out by the Supreme Court in the Puttaswamy judgment, a data protection regime should be established following the three-pronged test of legality, reasonableness and proportionality.
- PROTECTION: There should be a committee to assess ethical implications, following the principle of protection of human values.
- SAFETY: Publishing the standards of FRT and removing errors.
- ACCOUNTABILITY: To secure the trust of people in the system, all the issues relating to algorithmic accountability and AI biases should be addressed accordingly.
- REDRESSAL CELL: A grievance related redressal cell should be established for dealing with any FRT related issues.
INDIA’S MIXED RECORDS ON THE USE OF FRTs:
India has already seen large scale FRT deployment which gave a mixed record on such use:
- AIR TRAVEL: The government's Digi Yatra scheme, which uses face-based boarding to expedite passenger flows, has led to rapid expansion of FRT use, now covering a large portion of domestic travel. However, this system has posed serious questions about consent, data retention and third-party access to passengers' biometric data.
- PUBLIC SURVEILLANCE: Police forces of multiple states have adopted FRT systems for identifying habitual protestors. This has drawn criticism for chilling free expression and assembly, and transparency reports are often absent in such cases.
- WELFARE AND PUBLIC SERVICES: Governments use FRTs to verify welfare beneficiaries. Such use aims to prevent fraud, but it also puts people's biometric data at risk.
Technical and governance failures have at times led to high-profile online leaks of biometric data collected by law enforcement agencies. This illustrates the catastrophic consequences that can occur when large biometric datasets are mishandled.
LEGAL AND ETHICAL CONCERNS ON THE USE OF FRTs:
- CONSENT: Although many government and private organizations use FRTs, it is often unclear whether genuine user consent is obtained. Consent is essentially forced on many public service platforms: a citizen who needs a boarding pass has no real choice but to submit a face scan to obtain one. Even though consent is specifically mentioned in data protection principles, large-scale FRT services do not pass the test of meaningful, freely given consent.
- PURPOSE LIMITATION: To ensure that data collected for one purpose is not used for another, strict purpose-limiting regulations and audits must be implemented. The likelihood of biometric data being misused or repurposed increases with retention time.
- LEGALITY AND THE PUTTASWAMY TEST: When the state uses FRT, Puttaswamy requires a valid law authorizing the intrusion, a legitimate aim, and proportionality. A constitutional challenge arises because many deployments are carried out under administrative orders rather than explicit statutory authorization.
- RACIAL BIAS: FRT systems perform worse on certain demographic groups, producing higher rates of false matches and false non-matches. Inaccurate matches can result in false suspicions and wrongful exclusion from services.
- DATA SECURITY: Biometric information is extremely private and, unlike a password, cannot be changed once stolen. Centralized or inadequately secured FRT datasets are therefore irresistible targets for hackers.
INTERNATIONAL PRACTICES:
Globally, countries have adopted various approaches, strict regulatory frameworks and sector-specific rules. Some of them are as follows:
- European Union (EU):
Under the EU's AI Act[5], real-time remote biometric identification in publicly accessible spaces is generally prohibited for law enforcement. Its use in narrowly defined cases, such as locating missing persons or preventing terrorist attacks, requires prior authorization from judicial or administrative authorities. The Act also imposes stringent controls and penalties on the misuse of biometric systems, including a ban on untargeted scraping of facial images to build FRT databases.
- United Kingdom:
The Protection of Freedoms Act (2012)[6] made certain provisions for biometric data handling. It introduced a parental consent requirement for the use of biometric systems, including FRT, in schools, and it permits such processing by public authorities only with the consent of the individuals concerned.
- China:
The Personal Information Protection Law (PIPL)[7] has been passed to limit the use of FRTs in public areas. FRT may only be used there for public security purposes, and only with the consent of the individuals concerned. Data cannot be repurposed without clear consent.
- Latin America:
Clearview AI has expanded operations in Latin America for identifying offenders and victims in child exploitation cases. However, potential abuse in jurisdictions with weaker data protection regimes has been highly criticized.
RECOMMENDATIONS FOR INDIA ON THE USE OF FRTS:
To use FRTs consistently with constitutional protections and public trust, India should pursue a multi-pronged approach:
- Any use of FRT by the state should be authorized through clear legislation that specifies the necessity of such a system and satisfies the Puttaswamy test.
- For any use of biometric data, consent should be obtained from individuals, and they should be informed whenever such use is made by the authorities.
- Independent audits should be published for all significant deployment of FRTs.
- Pertaining to the risks of using FRTs, operators should face strict cybersecurity obligations and penalties for negligence.
- Wherever possible, biometric data should be processed and stored on local devices such as smartphones, subject to strict retention limits.
- State must ensure public transparency and parliamentary oversight for the use of FRTs.
CASE STUDIES:
- 2020 DELHI RIOTS: FRT was widely used by police to identify suspects in the communal riots. However, it raised serious concerns about the potential misuse of such technology to target people from specific communities.
- AADHAAR: The use of biometric datasets for Aadhaar authentication has raised serious concerns about the potential misuse of sensitive data.
- SCHOOLS AND UNIVERSITIES: Various schools and universities have started using biometric data and FRT for attendance. This has raised serious concerns about the privacy of children.
CONCLUSION:
While the use of FRTs promotes enhanced security and public safety, their misuse has raised serious concerns about people's right to privacy. India's constitutional commitment to privacy, as set out in the Puttaswamy test, provides a principled framework for judging the deployment of FRTs. What India needs now is statutory clarity, strict technical standards and a precautionary approach to protect the privacy of citizens. The goal should be to ensure the responsible use of FRTs while safeguarding the individual right to privacy.
REFERENCES:
[1] https://www.scobserver.in/cases/puttaswamy-v-union-of-india-fundamental-right-to-privacy-case-background/
[2] https://uidai.gov.in/en/legal-framework/aadhaar-act.html
[3] https://www.geeksforgeeks.org/ethical-hacking/information-technology-act-2000-india/
[4] https://www.pib.gov.in/PressReleaseIframePage.aspx?PRID=1947264
[5] https://en.wikipedia.org/wiki/Artificial_Intelligence_Act
[6] https://en.wikipedia.org/wiki/Protection_of_Freedoms_Act_2012
[7] https://en.wikipedia.org/wiki/Personal_Information_Protection_Law_of_the_People%27s_Republic_of_China