Tuesday, February 2, 2021

Automated Facial Recognition System of India and its Implications


Article Contributor(s)

Vaishnavi Krishna Mohan


Publisher

Global Views 360

Publication Date

February 2, 2021


CCTV in operation | Source: Rich Smith via Unsplash

On 28 June 2019, the National Crime Records Bureau (NCRB) opened bids and invited turnkey solution providers to implement a centralized Automated Facial Recognition System, or AFRS, in India. As the name suggests, AFRS is a facial recognition system proposed by the Indian Ministry of Home Affairs, aimed at modernizing the police force and at identifying and tracking criminals using Facial Recognition Technology, or FRT.

The technology draws on databases of photos collected from criminal records, CCTV cameras, newspapers and other media, driver’s licenses, and government-issued IDs to gather facial data. FRT uses these biometrics to map facial features and the geometry of the face, and the software then creates a “facial signature” based on the information collected. A mathematical representation is associated with each facial signature, which is subsequently compared against a database of known faces.
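The comparison step described above can be sketched in a few lines. This is a minimal illustration of signature matching, not the NCRB’s actual pipeline; in particular, `facial_signature` below is a stand-in for a real embedding model, replaced here by a deterministic pseudo-random vector so the sketch stays self-contained.

```python
import numpy as np

def facial_signature(image_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a real embedding model: an FRT pipeline maps a face
    image to a fixed-length feature vector (the "facial signature").
    Here we derive a deterministic unit vector from the pixel bytes
    instead of running an actual neural network."""
    seed = int.from_bytes(image_pixels.tobytes()[:8].ljust(8, b"\0"), "little")
    rng = np.random.default_rng(seed)
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

def match(sig: np.ndarray, database: dict, threshold: float = 0.6):
    """Compare a probe signature against a database of known faces
    using cosine similarity; return the best match above threshold,
    or (None, threshold) if nothing clears it."""
    best_name, best_score = None, threshold
    for name, ref in database.items():
        score = float(sig @ ref)  # cosine similarity (unit-norm vectors)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score
```

Note that everything hinges on the similarity threshold: set it low and the system floods operators with false positives of exactly the kind discussed below; set it high and genuine matches are missed.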

This article explores the implications of implementing Automated Facial Recognition technology in India.

Facial recognition software has become widely popular in the past decade. Several countries have tried to establish efficient facial recognition systems for tackling crime and building criminal tracking systems. Although the technology offers a few potential benefits, they appear insignificant when weighed against the serious concerns about people’s privacy and safety that it raises.

Images of every person captured by CCTV cameras and other sources will be treated as images of potential criminals and matched by the FRT against the Crime and Criminal Tracking Network and Systems (CCTNS) database. This implies that all of us will be treated as potential criminals whenever we walk past a CCTV camera. As a consequence, the presumption of “innocent until proven guilty” is turned on its head.

China, unsurprisingly, has installed the largest centralized FRT system in the world. Data can be collected and analyzed from the more than 200 million CCTV cameras the country operates, and an additional 20 million specialized facial recognition cameras continuously collect data for analysis. China currently uses these systems to track and control the ethnic Uyghur minority in the camps set up in the Xinjiang region. FRT was also used during Hong Kong’s pro-democracy protests to profile and identify protestors. These steps have raised concerns worldwide about the erosion of freedom of expression, the right to privacy, and basic dignity.

It is very likely that Indians will face the same consequences if AFRS is established across the country.

There are several underlying concerns about implementing AFRS.

Firstly, the system has proven inefficient in several instances. In August 2018, Delhi Police used a facial recognition system that was reported to have an accuracy rate of just 2%. The FRT software used by the UK’s Metropolitan Police returned a staggering false-positive rate of more than 98%. In another instance, the American Civil Liberties Union (ACLU) used Amazon’s face recognition software, “Rekognition,” to compare images of members of the US Congress against a database of criminal mugshots; to Amazon’s embarrassment, the results included 28 incorrect matches.

Another significant piece of evidence came from an experiment by McAfee. Its researchers used CycleGAN, an image-to-image translation algorithm adept at morphing photographs: one can use it to turn horses into zebras or paintings into photographs. McAfee used the software to misdirect a facial recognition algorithm. The team fed 1,500 photos of two team members into CycleGAN, which morphed them into one another, and kept feeding the resulting images into different facial recognition algorithms to check whom they recognized. After hundreds of such images, CycleGAN eventually generated a fake image that looked like person ‘A’ to the naked eye but tricked the FRT into thinking it was person ‘B’. Owing to these unsatisfactory results, the researchers expressed concern about the inefficiency of FRTs. In fact, mere eye makeup can fool an FRT into letting a person on a no-fly list board a flight. This pattern of inefficiency has been observed worldwide.

Secondly, facial recognition systems rely on machine learning, and it is troubling that FRT has often reflected the biases prevalent in society, leading to frequent facial mismatches. A study by MIT shows that FRT routinely misidentifies people of color, women, and young people: while the error rate was 8.1% for men, it was 20.6% for women, and 34% for women of color. Error rates this high, in a supervised laboratory study on a sample population, are simply unacceptable. In the ACLU study mentioned above, the false matches were disproportionately African Americans and other people of color. In India, 55% of undertrial prisoners are Dalits, Adivasis, or Muslims, although the combined population of these three groups amounts to just 39% of the total population (2011 census). If AFRS is trained on these records, it will reproduce the same social prejudices against minority communities and display inaccurate matches. The tender issued by the Ministry of Home Affairs gave no indication of how these biases would be eliminated, nor any mention of human-verifiable results. Using a system embedded with societal bias to replace biased human judgement defeats any claim of technological neutrality. Deploying FRT in law enforcement will be ineffective at best and disastrous at worst.
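A disaggregated audit of the kind the MIT study performed can be expressed in a few lines. The sketch below is illustrative only; the record layout and group labels are invented, not taken from any cited study. The point is that a single headline accuracy figure hides disparate error rates, which only become visible when broken out per group.

```python
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_identity, predicted_identity)
    tuples. Returns the misidentification rate per demographic group,
    making disparate error rates visible instead of hiding them
    inside a single average."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, true_id, predicted_id in records:
        totals[group] += 1
        if predicted_id != true_id:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}
```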

Thirdly, concerns about invasion of privacy and mass surveillance have not been addressed satisfactorily. Facial recognition makes data protection almost impossible: publicly available information is collected, but it is analyzed to a point of intimacy. India does not have a well-established data protection law, given that the Personal Data Protection Bill is yet to be enacted; implementing AFRS in the absence of such a safeguard is a potential threat to our personal data. Moreover, police and other law enforcement agencies will have a great degree of discretion over our data, which can lead to mission creep. Adding to the privacy concerns, the AFRS bidder will be largely responsible for maintaining the confidentiality and integrity of the data, which will be stored apart from the established ISO standard. The tender also expresses no preference for “Make in India” and raises no objection to foreign bidders, even those headquartered in China, a known hub of data breaches. There is no governing framework and no legal limits or restrictions on the technology, and no legal standard to ensure proportionate use or to protect those who interact with the system without consent.

Furthermore, the tender does not define “criminal.” Is a person a criminal when a charge sheet is filed against them? When they are arrested? When they are convicted by a court? Or whenever they are a suspect? Since the word is never clearly defined in the tender, law enforcement agencies will ultimately be able to track far more people than necessary.

The notion that AFRS will lead to greater efficacy must be critically questioned. San Francisco imposed a total ban on police use of facial recognition in May 2019, and police departments in London have been pressured to stop using FRT after several instances of discrimination and inefficiency. India would do well to learn from the mistakes of other countries rather than repeating them.


Read More

February 22, 2021 11:06 PM

WhatsApp's New Privacy Policy: Collecting Metadata and Its Implications

According to WhatsApp’s new privacy policy, the app is set to collect “only” users’ metadata. Metadata can reveal a lot more than merely a person’s app usage. Former NSA General Counsel Stewart Baker stated, “Metadata absolutely tells you everything about somebody’s life. If you have enough metadata you don’t really need content.”

This article explores the ways in which WhatsApp is underselling the true significance of metadata.

Facebook-owned WhatsApp recently announced an update to its privacy policy terms. February 8, 2021, was initially set as the deadline for users to either accept the new privacy policy or delete their accounts. By now, most of us have witnessed or been part of the backlash WhatsApp is experiencing. A survey conducted by LocalCircles indicated that 15% of India’s users are likely to move away from the app entirely, 36% will drastically reduce their usage, and 67% are likely to discontinue chats with WhatsApp business accounts.

To restore users’ trust, WhatsApp released a clarification stating that the new policy update does not compromise the privacy of messages with friends and family, and that the changes related to WhatsApp business accounts are optional.

However, owing to severe backlash, WhatsApp has pushed the deadline to May 15 while they further clarify their policy updates.

It is true that WhatsApp cannot read our messages, as they are end-to-end encrypted, meaning only a message’s sender and receiver can read them. The updated privacy policy alerts users that some businesses will soon be using Facebook servers to store messages with their customers. By accepting the new policy, users allow WhatsApp to reserve all rights to collect their data and share it with the expansive Facebook and Instagram networks, ‘regardless of whether you have profiles on those apps.’

A person using WhatsApp | Source: Andrés Rodríguez via Pixabay

By using WhatsApp, you may now be sharing your usage data, your phone’s unique identifier, and your location when the location service is enabled, among several other types of metadata. Taken together, all of this metadata is linked to your identity.

The value of metadata has been underestimated because the term isn’t clearly understood. Metadata is data about our data. In a cell phone conversation, for instance, the conversation itself isn’t metadata, but everything around it is: who you called, how long you spoke, where you were when you placed the call, where the other person on the line was, and when you placed the call. Imagine that every time you made a call, you had to inform a particular person who you called, how long you spoke, when and where, and every other detail except the content of the conversation; imagine this applied to every single call, and that everyone else’s metadata was being recorded too. The person holding the metadata can analyze it and learn a great deal about your personal life: who you work with, who you spend time with, who you are close to, where you are at particular times, and so on.
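The analysis described above takes only a few lines of code. The sketch below is purely illustrative, with hypothetical field names ("number", "start", "duration_s", "cell_tower"); it shows how a metadata holder can recover frequent contacts, late-night activity, and habitual locations without touching a single word of conversation.

```python
from collections import Counter
from datetime import datetime

def profile_from_call_metadata(call_log):
    """call_log: list of dicts with keys "number", "start" (ISO 8601
    timestamp), "duration_s", and "cell_tower". Builds an intimate
    profile from call records alone: top contacts, calls placed in
    the small hours, and the most frequent location."""
    contacts = Counter(c["number"] for c in call_log)
    late_night = sum(
        1 for c in call_log if datetime.fromisoformat(c["start"]).hour < 5
    )
    towers = Counter(c["cell_tower"] for c in call_log)
    return {
        "top_contacts": [n for n, _ in contacts.most_common(3)],
        "late_night_calls": late_night,
        "usual_location": towers.most_common(1)[0][0] if towers else None,
    }
```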

Kurt Opsahl, in a post for the Electronic Frontier Foundation, gives examples of how companies and governments collect intimate details about your life under cover of the innocuous-sounding word “metadata.” The following examples are excerpted from his article:

“They know you rang a phone sex service at 2:24 am and spoke for 18 minutes. They know that you called suicide prevention hotline from the Golden Gate Bridge.

They know you spoke with an HIV testing service, then your doctor, then your health insurance company in the same hour.

They know you called a gynaecologist, spoke for a half hour, and then called the local Planned Parenthood's number later that day. But nobody knows what you spoke about.”

Metadata provides more than enough context to know some of the most intimate and personal details of your life. When this data is correlated with the records of other phone calls, one can easily obtain far more information and track our daily routines. And this is merely phone calls: WhatsApp includes many more features and will collect metadata on chats, businesses, and money transactions.
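Opsahl’s examples can be mechanized trivially. The sketch below is hypothetical in every particular (the numbers, the labels, the field names), but it shows how little it takes to flag the “testing service, then doctor, then insurer” pattern from call metadata alone.

```python
SENSITIVE_SERVICES = {
    # Hypothetical numbers and labels, for illustration only.
    "555-0101": "HIV testing service",
    "555-0102": "doctor",
    "555-0103": "health insurer",
}

def sensitive_sequences(call_log, window_s=3600):
    """call_log: list of dicts with "number" and "start_epoch" (seconds).
    Returns runs of two or more calls to sensitive services placed
    within window_s of each other, recovered from metadata alone."""
    tagged = sorted(
        (c["start_epoch"], SENSITIVE_SERVICES[c["number"]])
        for c in call_log if c["number"] in SENSITIVE_SERVICES
    )
    runs, current = [], []
    for t, label in tagged:
        if current and t - current[-1][0] > window_s:
            if len(current) >= 2:
                runs.append([lab for _, lab in current])
            current = []
        current.append((t, label))
    if len(current) >= 2:
        runs.append([lab for _, lab in current])
    return runs
```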

In WhatsApp’s words:

“We collect service-related, diagnostic, and performance information. This includes information about your activity (such as how you use our Services, how you interact with others using our Services, and the like), log files, and diagnostic, crash, website, and performance logs and reports.”

In addition, WhatsApp collects your IP address, operating system, browser information, and phone number.

Stanford computer scientists conducted an analysis to understand the extent of the privacy intrusion metadata enables. They built a smartphone app that retrieved the metadata of calls and text messages from more than 800 volunteers’ phone logs, receiving records of more than 250,000 calls and 1.2 million texts. Their inexpensive analysis revealed personal details of several participants, such as health conditions. From metadata alone, the researchers were even able to learn that one participant owned an AR semi-automatic rifle.

Gen. Michael Hayden | Source: Wikimedia

Gen. Michael Hayden, the former head of the National Security Agency once stated that “the U.S. government kill[s] people based on metadata.”

In 2016, Facebook was involved in the infamous data privacy scandal centered on the collection of personal data of over 87 million people by Cambridge Analytica, a political consulting and strategic analysis firm. The organization harvested user data for targeted advertising, particularly political advertising during the 2016 U.S. election. While the central offender was Cambridge Analytica, Facebook’s apparent indifference to data privacy facilitated Cambridge Analytica and several other organizations.

In June 2018, Facebook confirmed that it was sharing data with at least four Chinese companies: Huawei, Oppo, Lenovo, and TCL. Facebook came under scrutiny from U.S. intelligence agencies, which claimed that data held by the Chinese telecommunication companies would provide an opportunity for foreign espionage.

In September 2019, there were reports that the Indian government contemplated making it mandatory for companies like Google, Facebook, and Amazon, to share the public data of users.

The Ministry of Electronics and IT (MeitY) was planning to issue new guidelines under the Information Technology Act, according to which tech giants would have been required to share freely available data, or the public information they collate in the course of their operations, including traffic, purchasing, and illness patterns.

Europe is exempt from WhatsApp’s new privacy policy. EU antitrust authorities fined Facebook 110 million euros for misleading regulators during the 2014 takeover of WhatsApp, and the EU’s strict privacy laws empower regulators to fine companies that breach the bloc’s rules up to 4% of their global annual revenue.

Your metadata is extremely personal. By giving WhatsApp the authority to access it, you also give access to several other organizations and businesses, and you become more vulnerable to third-party hackers and trackers. WhatsApp has given multiple assurances that its updated privacy policy is non-invasive; however, most of these assurances are cleverly worded and misleading. It is important to read the fine print of the new policy before accepting it.
