AI And Facial Recognition In India: Privacy Under Threat?
Striking a balance between leveraging surveillance for societal benefits and protecting individual rights will be critical for a future where security does not come at the cost of privacy
I. State-Sponsored AI Surveillance
CCTV cameras have evolved from tools aimed primarily at enhancing security and reducing crime into sophisticated systems capable of continuous surveillance, individual identification, and even emotion recognition. Governments and law enforcement agencies are increasingly adopting these technologies for crime prevention, counter-terrorism, and traffic management, embedding surveillance ever more deeply into daily life.
Modern systems integrate facial recognition, high-definition zoom, and AI-driven emotion analysis, enabling real-time, unobtrusive monitoring and predictive behaviour analysis. This growing reliance on surveillance technologies raises critical concerns about unconsented monitoring, indefinite data retention, and the potential misuse of information by operators. Emerging tools such as drones and other AI-powered systems further complicate the balance between security interests and the right to privacy, underscoring the urgent need for regulatory and oversight mechanisms.
II. Expanding Surveillance in India
India has embraced facial recognition technology (FRT) through initiatives such as the Automated Facial Recognition System (AFRS), significantly impacting sectors like law enforcement, telecommunications, and healthcare. However, this rapid adoption raises substantial privacy concerns owing to insufficient regulatory safeguards. The recently enacted Digital Personal Data Protection Act, 2023 (DPDP) aims to strengthen privacy protections and align Indian law with global standards such as the GDPR. Nevertheless, challenges remain in ensuring the Act’s effectiveness.
Over the past decade, India has seen a marked expansion of state-operated CCTV infrastructure across major urban areas. Delhi, with an average of 1,826.6 CCTV cameras per square mile, is often described as the world’s most surveilled city, surpassing even major Chinese cities in camera density. Other cities such as Chennai, Mumbai, and Hyderabad have also deployed extensive CCTV systems, supported by centralized command and control centres for real-time monitoring. Proponents argue that CCTV surveillance bolsters national security and aids crime prevention; however, studies suggest that the evidence of its effectiveness in reducing crime is inconclusive. Meanwhile, concerns about privacy violations and the collection of personal data, particularly visual and audio-visual information in public spaces, have become increasingly prominent.
III. Legal Framework Governing CCTV Surveillance in India
1. Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 under the Information Technology Act, 2000
India’s existing legal framework for CCTV surveillance is primarily governed by the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, framed under the Information Technology Act, 2000. A critical analysis of these rules, however, highlights several areas for improvement.
One key issue is the lack of explicit guidelines regarding the placement of CCTV cameras in public spaces. Although the laws prohibit surveillance in areas where individuals have a reasonable expectation of privacy, the definition of such areas remains ambiguous, creating room for potential misuse. Additionally, while organizations are required to obtain consent before collecting, storing, or using personal data, broad exceptions for safety and security purposes leave the framework vulnerable to abuse under the pretext of ensuring security.
Another major shortcoming is the absence of robust mechanisms for oversight and regulation, which limits accountability for misuse of CCTV footage or breaches of privacy. The reliance on the general provisions of the Information Technology Act, 2000 further contributes to a fragmented and unclear regulatory structure, highlighting the need for more comprehensive and precise legislation.
2. The Digital Personal Data Protection Act, 2023
The recently enacted Digital Personal Data Protection Act, 2023 (DPDP), while not explicitly addressing the use of CCTV systems, introduces several safeguards that are relevant to CCTV surveillance in public spaces. The Act emphasizes that organizations cannot use malfunctioning CCTV systems as an excuse for non-compliance, underscoring the importance of proper maintenance and ensuring the functionality of these systems to uphold individuals’ data access rights. Additionally, it requires organizations to establish and adhere to clear data retention policies for CCTV footage, ensuring timely access to data when responding to access requests while also preventing unnecessary or prolonged retention of personal information.
Furthermore, the DPDP mandates that organizations develop well-defined internal procedures for handling subject access requests related to CCTV footage, ensuring that responses are both timely and comprehensive. By establishing a robust legal framework for data privacy and access, the DPDP Act encourages improved data management practices and compliance. When applied to CCTV systems, it empowers individuals, including victims of crime, by safeguarding their rights and ensuring their access to personal data.
However, significant legal and ethical concerns remain regarding the government’s use of data collected by CCTV systems. While the DPDP allows the processing of data through advanced technologies such as AI and surveillance systems, it treats such use by the government as resting on “implied consent,” raising fears of unchecked surveillance and potential misuse under the pretext of governance or national security.
Section 7(c) of the DPDP further broadens the government’s surveillance powers by permitting non-consensual data processing for purposes such as maintaining public order, sovereignty, or security of the state. However, the lack of clear definitions for terms like “public order” and “security of the state” leaves these provisions open to interpretation, increasing the risk of overreach.
Additionally, the DPDP Act does not establish an independent regulatory body to oversee the government’s surveillance activities, leaving its expansive powers unchecked. This lack of accountability, combined with vaguely worded provisions, poses a serious threat to individuals’ right to privacy. To mitigate these risks, safeguards must be put in place to ensure that any intrusion or non-consensual data processing by the government is both necessary and proportionate. The rules under the DPDP Act are yet to be released for public consultation with relevant stakeholders; the legal and operational contours of the regime will become clearer once they are notified.
IV. Case Study: Massive Installation of CCTV Cameras in Delhi
Delhi’s recent “Safe City” initiative has seen the installation of thousands of CCTV cameras aimed at curbing crime, particularly against women. However, this extensive surveillance raises pressing privacy concerns, as individuals are monitored daily, often without their knowledge. These cameras have the capability to collect sensitive data, including movement patterns and facial recognition information, potentially creating detailed digital profiles of citizens.
The rapid expansion of CCTV surveillance in Delhi also prompts critical questions about its effectiveness and its impact on fundamental rights. While the initiative is often justified by the urgent need to combat violent crimes against women and children, data from the National Crime Records Bureau (NCRB) suggests otherwise. In 2021, Delhi reported 13,892 crimes against women, an increase of over 40% from the 9,782 cases recorded in 2020. This sharp rise occurred despite Delhi being the most surveilled city globally, with the highest number of CCTV cameras per square mile.
Moreover, the lack of robust legal safeguards intensifies privacy concerns. The Delhi Rules for Regulation of CCTV Systems in NCT of Delhi, 2018, state that cameras should not be placed in locations that invade individual privacy and that any data collected must be used solely for specified purposes. However, the integration of facial recognition technology has heightened fears of misuse, particularly its disproportionate impact on marginalized communities. For example, the use of facial recognition technology during the investigation of the 2020 North East Delhi riots drew widespread criticism over potential human biases and the lack of preventive safeguards. Such biases can arise when human operators act without adequate oversight or protective measures.
V. Conclusion
The rise of state-sponsored AI surveillance through widespread CCTV and facial recognition technology presents both opportunities and challenges. While these innovations can enhance public safety and national security, they also pose substantial risks to individual privacy and freedoms. India’s rapid adoption of such technologies, exemplified by cities like Delhi, underscores the urgency of addressing these concerns.
To navigate this landscape effectively, the Indian government must establish comprehensive legal frameworks that prioritize transparency, informed consent, and stringent data protection measures. Public engagement and oversight mechanisms are essential to safeguard against misuse and ensure accountability. Striking a balance between leveraging surveillance for societal benefits and protecting individual rights will be critical for a future where security does not come at the cost of privacy.
Disclaimer – The views expressed in this article are the personal views of the author and are purely informative in nature.