Published On: September 30, 2025
Authored By: Priyanshu Singh
Amity University, Noida
ABSTRACT
Facial Recognition Technology (FRT) is being used more and more in India, in places such as airports, police work, and even shopping malls. While it can make processes faster and help in catching criminals, it can also harm privacy if not used carefully. The Supreme Court of India has held that privacy is a fundamental right, which means the government and private companies must protect people’s personal data. However, India does not yet have adequate laws explaining when and how FRT may be used. This article looks at how FRT works, why it can be risky, and what India can learn from other countries. It also suggests steps to make sure the technology is used in a way that is safe, fair, and respects our rights.
INTRODUCTION
Facial Recognition Technology (FRT) is a system that can identify a person by looking at their face. It works by scanning an image or video of the face and comparing it with faces already stored in a database. In recent years, India has started using this technology in many places. For example, at airports through the DigiYatra program, in public spaces through CCTV networks, and by the police through the Automated Facial Recognition System (AFRS).[1]
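To make the matching step concrete, the following is a minimal, illustrative sketch in Python. It is not the design of DigiYatra, AFRS, or any specific system: real deployments derive “embeddings” (numeric vectors) from face images with trained neural networks, whereas here the vectors are random stand-ins and the matching threshold is an assumed value.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, database: dict[str, np.ndarray],
             threshold: float = 0.8) -> str | None:
    """Return the best-matching enrolled identity, or None if no score
    clears the threshold (which trades false matches against misses)."""
    best_id, best_score = None, threshold
    for person_id, stored in database.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy database of three enrolled faces; the probe is a noisy re-capture
# of person_1, standing in for a fresh camera scan at a checkpoint.
rng = np.random.default_rng(0)
db = {f"person_{i}": rng.standard_normal(128) for i in range(3)}
probe = db["person_1"] + 0.1 * rng.standard_normal(128)
print(identify(probe, db))  # prints "person_1"
```

The threshold is the policy-relevant knob: lowering it catches more true matches but also produces more false matches against innocent people, which is why accuracy questions (discussed later) matter legally.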
While FRT can make processes faster and help solve crimes, it can also be used in ways that invade people’s privacy. This is a serious issue because in 2017, the Supreme Court of India decided in Justice K.S. Puttaswamy vs. Union of India that the right to privacy is part of the fundamental right to life and personal liberty under Article 21 of the Constitution.[2]
The problem is that India does not yet have strong laws that clearly say when and how FRT can be used. Without clear rules, there is a risk that the technology could be misused for mass surveillance, tracking people without their consent, or storing personal data without limits.
This article studies the relationship between FRT and the right to privacy in India. It will look at the legal background, ethical problems, examples from other countries, and suggestions for a better legal framework.
EVOLUTION OF THE RIGHT TO PRIVACY IN INDIA
The right to privacy in India has developed slowly over many years through court decisions. In the beginning, the Constitution of India did not clearly say that people had a right to privacy. Early court cases even refused to accept privacy as a separate fundamental right.
In M.P. Sharma vs. Satish Chandra (1954), the Supreme Court said that the Constitution did not mention privacy, so it could not be treated as a guaranteed right.[3] In Kharak Singh vs. State of Uttar Pradesh (1963), the Court agreed that police surveillance could violate personal liberty, but it still did not fully recognise privacy as an independent right.[4]
The real change came with the judgment in Justice K.S. Puttaswamy vs. Union of India (2017). A nine-judge bench of the Supreme Court ruled that the right to privacy is a fundamental right under Article 21, which protects the right to life and personal liberty. The Court explained that privacy includes the right to control personal information, to be left alone, and to make decisions about one’s own life without interference.
This case is very important for the debate on Facial Recognition Technology. Since FRT collects and stores biometric data (like face scans), it directly affects the right to privacy. Any use of FRT by the government or private companies must meet the standards set in Puttaswamy, especially the “proportionality test”: the use of technology must be legal, necessary, and must not harm rights more than required.
GROWTH OF FACIAL RECOGNITION IN INDIA
Facial Recognition Technology is spreading very fast in India. Both the government and private companies are using it in many different areas.
One major example is the DigiYatra program at airports. Passengers can choose to register their face so they do not have to show boarding passes or ID at different checkpoints. Instead, cameras scan their face and match it with stored data to allow entry.[5]
The Automated Facial Recognition System (AFRS), started by the National Crime Records Bureau (NCRB), allows police to match faces from CCTV footage or photographs with a database of known persons. This can be used for finding missing children, identifying criminals, or verifying identities.[6]
Private companies also use FRT. Some banks use it for customer verification, shopping malls use it for security, and offices use it for attendance tracking.
The problem is that India does not have a clear national law that says how this technology should be used, how long data can be stored, or how people can complain if their privacy is violated. Without such rules, there is a high risk of mass surveillance and misuse.
LEGAL AND CONSTITUTIONAL CONCERNS
Facial Recognition Technology affects some of the most important rights in the Indian Constitution.
Right to Privacy (Article 21)
The Supreme Court has said that the right to privacy is part of the right to life and personal liberty. This means the government cannot collect and use personal data unless it follows the law and respects basic rights. Since FRT collects biometric data like face scans, it directly affects privacy.
Freedom of Speech and Expression (Article 19(1)(a))
If people know they are being watched all the time, they might feel scared to speak freely or take part in protests. This is called a “chilling effect.”[7] FRT in public places can make people feel like they are always under surveillance.
Proportionality Test
The Puttaswamy judgment said that any action that affects privacy must pass three checks:
- Legitimate aim – there must be a real and lawful reason.
- Necessity – there must be no better, less harmful option.
- Proportionality – the benefits must be greater than the harm to rights.
For example, using FRT to find a dangerous criminal may pass this test, but using it to monitor all citizens at all times would fail.
Lack of a Strong Data Protection Law
The Digital Personal Data Protection Act, 2023 is a new law that sets out how personal data should be handled. However, experts say it has significant gaps. In particular, it gives the government broad powers to collect data without enough checks.[8] This means FRT can still be used in ways that might harm privacy.
COMPARATIVE PERSPECTIVE
Looking at how other countries handle Facial Recognition Technology can give India useful ideas.
European Union
The EU has strong privacy laws under the General Data Protection Regulation (GDPR). It also passed the AI Act in 2024, which places strict limits on the use of FRT in public spaces. Real-time facial recognition by police is mostly banned unless there is a serious threat like terrorism.[9]
United States
The U.S. does not have one national law on FRT, but some cities like San Francisco and Boston have completely banned government use of it.[10] Others allow it but with clear rules about warrants and public notice.
Lessons for India
From the EU, India can learn to set strict limits and make privacy rules legally binding. From the U.S., it can see how local governments can decide their own policies, allowing for flexibility based on local needs. Both models stress transparency and accountability, which India currently lacks.
ETHICAL ISSUES
Facial Recognition Technology raises several ethical questions apart from legal ones.
Accuracy and Bias
FRT is not always correct. Studies in other countries have shown that it can make more mistakes in identifying women, children, and people with darker skin tones.[11] In India, this could lead to false arrests or wrongful suspicion.
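As a simple illustration of how such disparities are measured, the sketch below computes a false match rate (the share of different-person pairs wrongly declared a match) separately for two demographic groups. The trial records are invented for illustration only, not real audit data.

```python
from collections import defaultdict

# Each record: (demographic group, system said "match", truly same person)
trials = [
    ("group_a", True, True),  ("group_a", False, False), ("group_a", False, False),
    ("group_b", True, True),  ("group_b", True, False),  ("group_b", True, False),
]

def false_match_rate(records: list[tuple[str, bool, bool]]) -> float:
    """Share of impostor pairs (different people) wrongly matched."""
    impostor = [said_match for _, said_match, same_person in records
                if not same_person]
    return sum(impostor) / len(impostor) if impostor else 0.0

by_group: dict[str, list] = defaultdict(list)
for record in trials:
    by_group[record[0]].append(record)

for group, records in sorted(by_group.items()):
    print(group, f"false match rate = {false_match_rate(records):.2f}")
# group_a false match rate = 0.00
# group_b false match rate = 1.00
```

Unequal false match rates across groups are exactly the kind of disparity the Gender Shades study documented in commercial systems, and in a policing context a false match can mean a wrongful detention.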
Mass Surveillance
If cameras with FRT are placed everywhere, it can create a system where everyone is watched all the time. This is dangerous for a democracy because it can be used to track political opponents, activists, or ordinary citizens without any crime being committed.
Consent and Awareness
Often, people do not even know that FRT is being used on them. For example, in public places like railway stations, no one is asked before their face is scanned. This goes against the idea of informed consent, which is an important part of both ethics and privacy.
Data Security
If the large databases of face scans are hacked, millions of people’s personal data could be stolen and misused. Since biometric data cannot be changed like a password, the harm can be permanent.
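One commonly suggested safeguard, sketched below under assumed requirements, is to keep stored face templates encrypted at rest, so that a breach of the database alone does not expose raw biometrics, and the encryption key (unlike a face) can be rotated after a compromise. The example uses the Fernet recipe from the Python cryptography package; the template bytes are a stand-in.

```python
from cryptography.fernet import Fernet

# The key lives in a separate key-management system, not in the database.
key = Fernet.generate_key()
vault = Fernet(key)

# A serialized face template (a stand-in byte string for this sketch).
template = b"example-face-embedding-bytes"
stored_ciphertext = vault.encrypt(template)  # what the database holds

# Unlike a face, a key can be rotated after a suspected compromise:
new_key = Fernet.generate_key()
stored_ciphertext = Fernet(new_key).encrypt(vault.decrypt(stored_ciphertext))
```

Encryption reduces the damage of a leak but does not remove it entirely; whoever holds the keys can still decrypt, which is why oversight of the operator matters as much as the technology.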
NEED FOR REGULATION IN INDIA
Facial Recognition Technology can be useful, but without rules, it can easily be misused. India urgently needs a clear legal framework that protects privacy while allowing limited, necessary use of the technology.
Independent Oversight
An independent body, separate from the police or government, should check and approve the use of FRT in sensitive cases.
Judicial Authorization
Before using FRT for surveillance, especially in public spaces, authorities should get permission from a court. This ensures there is a legal reason and not just convenience.
Data Retention Limits
Face scan data should be deleted after a short, fixed period unless it is part of an ongoing investigation.
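A retention rule of this kind is straightforward to enforce in software. The sketch below is a minimal illustration with hypothetical field names and an assumed 30-day window; the actual period and exceptions would have to be fixed by law.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=30)  # assumed; a statute would set this

@dataclass
class FaceRecord:
    record_id: str
    captured_at: datetime
    open_case_id: str | None = None  # set only while an investigation runs

def purge_expired(records: list[FaceRecord],
                  now: datetime | None = None) -> list[FaceRecord]:
    """Keep only records inside the window or tied to an open case."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records
            if r.open_case_id is not None
            or now - r.captured_at <= RETENTION_WINDOW]
```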
Transparency Reports
Government and private companies should publish regular reports explaining when, where, and why FRT was used, and how many people were affected.
Public Awareness
People should be informed when FRT is being used, either through notices, announcements, or public campaigns.
These steps will help India balance safety with privacy, avoiding the dangers of mass surveillance.
CONCLUSION
Facial Recognition Technology in India is growing quickly, but the laws have not caught up. The Supreme Court has clearly said that privacy is a fundamental right. This means FRT cannot be used freely without strong safeguards. While the technology can help in solving crimes and making services faster, it can also lead to constant surveillance and loss of personal freedom if not controlled.
Other countries have already set limits and rules. India can learn from them to create its own clear, strict framework. Until such a framework is in place, the use of FRT should be minimal, necessary, and always tested against the standards of legality, necessity, and proportionality.
The future of FRT in India should be one where technology serves people, not one where people serve technology.
REFERENCES
[1] National Crime Records Bureau, Automated Facial Recognition System (AFRS) Tender Document, 2019.
[2] Justice K.S. Puttaswamy vs. Union of India, (2017) 10 SCC 1.
[3] M.P. Sharma vs. Satish Chandra, AIR 1954 SC 300.
[4] Kharak Singh vs. State of Uttar Pradesh, AIR 1963 SC 1295.
[5] Ministry of Civil Aviation, Government of India, DigiYatra Guidelines, 2022.
[6] National Crime Records Bureau, Automated Facial Recognition System (AFRS) Tender Document, 2019.
[7] K.S. Park, “Chilling Effect of Mass Surveillance,” Human Rights Law Review, Vol. 18, Issue 1 (2018).
[8] Internet Freedom Foundation, “Analysis of the Digital Personal Data Protection Act, 2023,” 2023.
[9] European Parliament, Artificial Intelligence Act, 2024.
[10] City of San Francisco, Stop Secret Surveillance Ordinance, 2019.
[11] Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” Proceedings of Machine Learning Research 81 (2018).