An Indian state's scary proposal to use facial recognition and AI has experts fuming
Jan 24, 2021, 13:21 IST
- The Lucknow police are installing cameras with facial recognition technology, powered by artificial intelligence (AI), in prominent harassment hotspots.
- The cameras will gauge whether a woman is in distress based on her facial expression and send an alert to the police.
- Anita Gurumurthy, founding member and executive director of IT for Change, called the idea “both absurd and undemocratic.”
The Lucknow police seem dead serious about providing women with a sense of security, whether they want it or not. And, experts say, the move tests the fine line between safety measures and a person’s right to privacy.
Under the Uttar Pradesh government's Mission Shakti programme, Lucknow police are installing cameras equipped with facial recognition technology, powered by artificial intelligence (AI), that will gauge whether a woman is in distress based on her facial expression and alert the police even before she or anyone else can call for help.
While the idea may seem well-intended at first, most experts consider this surveillance programme precarious. Sumathi Chandrashekaran, a policy lawyer based in New Delhi, sums it up: “This policy proposal is laughable and problematic in so many multiple dimensions, that it is hard to parse it into a coherent argument,” she told Business Insider.
But do Lakhnawis care?
First up: Indian cops do not have a great track record in surveillance
Anita Gurumurthy, founding member and executive director of IT for Change, called the idea “both absurd and undemocratic.” And she has her reasons. She cited the example of how cameras in metro trains have become instruments of voyeurism, with law enforcers violating the privacy and dignity of women commuters.
Secondly, the technology is untested and unproven
“There are ethical concerns that need democratic deliberation because the AI does not discern the difference between socially meaningful ends and socially repressive and harmful ones,” Gurumurthy added.
There are other practical problems for the cops as well. “In a situation wherein two women who are friends are having a heated conversation, an alert generated would only lead to unnecessary harassment by the police. If only certain alerts are responded to, the question that arises is who is making the decision about which alerts to respond to,” explained Anushka Jain, Associate Counsel (Transparency & Right to Information) at the Internet Freedom Foundation.
Not just that, Jain raised several pertinent questions:
- How and where will the data be stored?
- Who will have access to the data and who will approve this access?
- Who will be accountable for the misuse of information?
- What if women are harassed by the police themselves?
‘The proposal to use facial recognition tech in this manner is legally untenable’
Chandrashekaran contends that the technology is not foolproof. “Has the AI algorithm been tested for failures, biases, and the numerous other issues it will likely have?” she asked, highlighting that the policy strikes at the heart of the individual right to privacy. “Any act of surveillance has to pass through hoops of tests under law before it can be deemed a legitimate exercise,” she said.
According to Chandrashekaran, if this kind of surveillance receives sanction, nothing prevents the state from invading our private spaces, like homes, more flagrantly, purportedly to “protect” us from becoming victims. “Actions like these are fundamentally opposed to democratic principles, and thus, untenable,” she said.