Published On: February 2nd 2026
Authored By: Tejaswini Uppala
Alliance University
ABSTRACT
Facial Recognition Technology (FRT) is a new-age technology that, with the advent of high-end technological appliances, has emerged as one of the most powerful tools of digital surveillance, transforming the way individuals, private entities, and governments identify, track, and authenticate people. The utilisation of FRT raises urgent concerns regarding the constitutional right to privacy, data protection standards, and the potential for misuse. This article critically examines the legal and ethical implications of FRT use in India in light of the Supreme Court's landmark Justice K.S. Puttaswamy (Retd.) v. Union of India judgment, which affirmed privacy as a fundamental right under Article 21 of the Constitution, and identifies gaps in the regulation of biometric surveillance.
As India embraces digital innovation, it must ensure that FRT is governed by robust laws, inclusive policies, and a commitment to civil liberties and human dignity.
INTRODUCTION
Facial Recognition Technology (FRT) identifies an individual by comparing the face captured in a photo or video against images stored in a database. The system detects the face, extracts its distinctive features, and matches them against existing records.
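The matching step described above can be illustrated with a minimal sketch. Real FRT systems use deep neural networks to convert a face image into a numeric "embedding" vector and then compare vectors; the embeddings and names below are purely hypothetical toy values, assumed for illustration only.

```python
# Conceptual sketch of the FRT matching step (illustrative only).
# Real systems derive embeddings from face images with neural networks;
# here the embedding vectors are hypothetical toy numbers.
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, database, threshold=0.9):
    """Return the enrolled identity most similar to the probe embedding,
    or None if no stored embedding exceeds the similarity threshold."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy database of stored "embeddings" (assumed values)
db = {
    "person_a": np.array([0.9, 0.1, 0.3]),
    "person_b": np.array([0.1, 0.8, 0.5]),
}
probe = np.array([0.88, 0.12, 0.31])  # embedding from a newly captured photo
print(match_face(probe, db))  # prints the closest match: person_a
```

The threshold trades off false matches against missed matches; the design-based risks discussed later in this article (lighting, ageing, pose) manifest precisely as embeddings drifting below or above such a threshold.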
FRT is now part of everyday life and is used across sectors: at airports and border checkpoints for security, and for ticketing, attendance, and personal authentication. Yet FRT can capture intimate personal data without the user's proper consent, violating the right to privacy under Article 21 of the Indian Constitution. This raises serious legal and ethical concerns; even so, India has introduced no legal regulation on the matter.
IMPORTANCE OF RIGHT TO PRIVACY:
Privacy is a fundamental right as it protects an individual’s autonomy, dignity and freedom from unwarranted interference. In the digital age, where personal data can be collected silently and analysed instantly, privacy has become even more crucial.
Landmark Cases for Right to Privacy:
In India, the Right to Privacy has evolved gradually through a series of court decisions. Early cases such as A.K. Gopalan (1950) and Kharak Singh (1962) declined to recognise privacy fully as a right. The right began to take shape in Gobind (1975) and was reinforced in Maneka Gandhi (1978). The courts then extended privacy to media publications (Rajagopal, 1994), phone tapping (PUCL, 1997), medical examinations (Sharda, 2003), bank records (Canara Bank, 2005), and protection against compelled medical testing (Selvi, 2010). Privacy also underpinned LGBTQ+ rights (Naz Foundation, 2009; Navtej Johar, 2018). The decisive shift came with Puttaswamy (2017), which declared privacy a fundamental right, followed by the Aadhaar judgment (2018), which set boundaries on the use of personal data. Joseph Shine (2018) safeguarded personal choice, while cases such as Madhukar Mardikar (1991), on the privacy of women, and Ritesh Sinha (2019), on voice samples, further entrenched the right within Article 21.
FRT poses unique threats to privacy, making its regulation essential.
As of 2025, India has no specific legislation on FRT. This legal gap heightens the danger of abuse: mass surveillance, profiling, and unchecked data collection.
In Justice K.S. Puttaswamy (Retd.) v. Union of India, the Supreme Court held that any State intrusion into privacy under Article 21 must satisfy three conditions:
1. Legality – any State action affecting privacy must be backed by a valid statute.
2. Legitimate state aim – the goal must be neither arbitrary nor unreasonable (the Article 14 standard).
3. Proportionality – the action must be necessary, minimally intrusive, and proportionate to the goal.
In India, FRT fails on all three tests:
- There is no specific law regulating its use.
- Where it is deployed, the justification offered is usually national security; yet even a national-security purpose does not excuse its frequently excessive and disproportionate application.
RISKS ASSOCIATED WITH FRT:
These include both technical (design-based) risks and rights-based risks. FRT's real-world deployment, particularly by government agencies, is objectionable on both ethical and privacy grounds.
Design-Based Risks: Intrinsic factors such as facial expression, ageing, plastic surgery, or disfigurement, and extrinsic factors such as lighting, image quality, and pose variation, can cause FRT systems to produce inaccurate results. These limitations may lead to false matches.
Bias: FRT is also biased because most systems are trained on limited datasets. Error rates vary with skin colour, and research indicates higher error rates for Indian men and women. Imported FRT systems are even less accurate in India, given the country's diversity of facial features. Improving accuracy would require a comprehensive, pan-India facial biometrics database, but such a database would itself create serious privacy risks.
Cybersecurity: FRT systems are vast repositories of biometric information, making the companies that build or deploy them attractive targets for hackers.
Rights-Based Challenges: One of the biggest rights concerns is purpose creep: facial data collected for one purpose is often reused for entirely unrelated purposes without the individual's knowledge. Consent is usually given only for the first use, making any further use ethically questionable.
FRT databases are often built on AI trained on images scraped from the internet. Although such scraping is not directly criminal in India, it raises severe ethical concerns, since the people whose photographs are used usually never agreed to the practice.
Case Study: Digi Yatra
Digi Yatra is a Government of India programme of the Ministry of Civil Aviation that allows air travellers to verify their identity through facial biometrics. The system claims to store information on a decentralised registry, providing distributed trust and greater security.
The New Indian Express reported that passenger information gathered by Digi Yatra had been shared with the Income Tax Department to monitor tax evasion. Although the CEO of Digi Yatra and the Income Tax Department denied the allegations, the episode revived concerns about data governance, transparency, and the absence of effective legal protection in India.
NITI Aayog's discussion paper on Responsible AI proposed the following:
- Deletion of Biometric Data: Airports must ensure that all facial biometric information stored under FRT systems is deleted from their databases within 24 hours of a passenger's departure, to prevent retention and the privacy threats it poses.
- Strict, Meaningful, and Informed Consent: Information should not be shared automatically with cab operators or commercial organisations, but only with the passenger's strict, meaningful, and informed consent.
- Enforcement of Strong Security Standards: In compliance with Rule 7 of the SPDI Rules, the Digi Yatra Central Ecosystem must install advanced, secure systems that prevent any breach of, or unauthorised access to, sensitive biometric data.
- Frequent Cybersecurity Audits and Vulnerability Tests: Since facial biometrics are highly sensitive, cybersecurity audits and vulnerability assessments should be conducted regularly to ensure that data protection mechanisms remain robust, up to date, and effective.
Recommendations to policy makers:
- Establish a national policy on FRT with minimum safeguards and impact assessments for all large-scale deployments.
- Require bias and accuracy testing through algorithmic impact assessments before approval.
- Establish a rapid judicial review mechanism for emergency deployments by the police.
- Ensure that the DPDP framework expressly treats facial biometric data as sensitive data, with more stringent protection and fewer exceptions.
- Encourage academic research to support audit work and fund independent audits.
- Conduct mass public-awareness campaigns educating citizens about how FRT is used and how their rights are protected.
CONCLUSION
In conclusion, FRT is rapidly gaining momentum, yet India has failed to establish a definite legal framework to govern it. This vacuum has already resulted in misuse: the Delhi Police, for example, used FRT to screen people at a political rally, even though the High Court had restricted its use to tracing missing children. Such incidents are serious breaches of privacy and civil liberties, directly contrary to the protections laid down in the Puttaswamy ruling.
Without clear legislation, FRT will remain an uncontrolled instrument of mass surveillance. India needs a comprehensive law ensuring that FRT is implemented ethically, transparently, and proportionately, with stringent restrictions so that citizens' fundamental rights are not compromised.