The issue of legality
Third, it is assumed that the use of facial recognition has a legal foundation. Responding to a legal notice sent by the Internet Freedom Foundation (IFF), the Home Ministry has stated that facial recognition systems such as the AFRS do not suffer from illegality, as they were approved by the Cabinet via a 2009 note on the CCTNS.
This assumption is unsound. At the outset, as noted by the IFF, a cabinet note 'is not a statutory enactment' and cannot be considered the legal foundation on which facial recognition is carried out.
Further, in 2017 the Supreme Court reaffirmed the Right to Privacy, explicitly stating that this right extends to public spaces and that citizens cannot be subjected to indiscriminate surveillance or collection of their personal data. The court also clarified that any infringement of the Right to Privacy, such as collecting data for law enforcement purposes, must be carried out in accordance with the proportionality standard: it must be legal, necessary, and proportionate. Not only does facial recognition fail to satisfy the first part of this standard, it also falls short on the other two. Collection of biometric data that may or may not be used in the future is an inherent part of facial recognition systems.
Another popular argument is that the Right to Privacy is not absolute, and that the state can infringe it in the interest of national security and public safety. While this is true – no fundamental right is absolute – a recent ruling of the Bombay High Court clarifies that even in the case of an overarching national security or public safety concern, any breach of a right must be justified through the proportionality test laid down in the Supreme Court's 2017 Puttaswamy judgment: the threat to public safety has to be demonstrated, not merely claimed.
While India does not have data protection legislation at the moment, the draft data protection bill of 2018 contemplates proportionality as the standard to justify law enforcement exceptions.
Law enforcement authorities around the world are grappling with the limitations and dangers of facial recognition. Earlier this year, San Francisco banned police use of facial recognition technology. Police trials of facial recognition systems in the UK have shown that the technology failed 80 per cent of the time and has 'operational deficiencies'. Given how unreliable and harmful these technologies have proven, police forces in the UK are resisting piloting facial recognition systems.
India's approach buys into the hype of emerging technology, and wilfully ignores evidence of bias, harm, and unconstitutionality. Facial recognition is not a panacea for understaffed police forces, and neither is it an example of the modernisation of policing, particularly given the legal vacuum in which its deployment is being contemplated.
This article is part of a series examining The Future of Data, in partnership with Carnegie India, leading up to its Global Technology Summit 2019 in Bengaluru on December 4-6. Additional information about the summit is available here.
Vidushi Marda is a lawyer and directs research on AI and human rights for the global team at Article 19. She is also a non-resident scholar at Carnegie India. Views are personal.