Monday, July 27, 2020

India’s Transgender Persons (Protection of Rights) Act: Why Are Activists Opposing It?



Article Contributor(s)

Vanshita Banuana


Publisher

Global Views 360

Publication Date

July 27, 2020


Protests in Mumbai against the Transgender Bill | Source: Tamravidhir via Wikimedia

On July 13, 2020, the Ministry of Social Justice and Empowerment of India notified the release of draft Rules for the much-disputed Transgender Persons (Protection of Rights) Act, 2019, and gave citizens 30 days to submit suggestions and objections.

The Ministry first published the draft Rules on April 18, 2020 and asked for comments by April 30, later extended to May 18. Based on the central government’s consideration of the submitted feedback, the updated Rules were once again opened to critique.

As summarised in this analysis by PRS Legislative Research, the Rules lay out the detailed process for issuing Certificates of Identity, along with welfare measures, medical facilities, and similar provisions for transgender people. They also specify that the National Institute of Social Defence will act as the secretariat for the National Council for Transgender Persons.

Analysis

  1. The Act is infamous for claiming to confer the right to self-perceived gender identity, a right also enshrined in the National Legal Services Authority (NALSA) vs. Union of India judgement, while continuously neglecting that same right, thereby going against both a Supreme Court judgement and its own stated position.
  2. This manifested once again in Rule 4 of the first draft of the Rules, which required a psychologist’s report as part of the application process while paradoxically insisting that “no medical examination” was required. This requirement was removed from the recent draft of the Rules after backlash.
  3. Also, as stated in the Act, it is the District Magistrate who will determine the final “correctness” of the application, essentially stripping transgender people of any supposed right to self-determination. It is worth noting that this places the District Magistrate, an executive figure, in a judicial position: that of ‘judging’ the ‘authenticity’ of a person’s gender identity.
  4. The above-mentioned application will only provide a Certificate of Identity that states a person’s gender identity as transgender. To apply for a revised Certificate of Identity changing one’s gender to male/female under Rule 6, a person must undergo gender reassignment surgery and, in addition, provide a certificate confirming the surgery from the Medical Superintendent or Chief Medical Officer of the medical institution where it was performed.
  5. This is problematic for a multitude of reasons, including but not limited to: many transgender people not feeling the need for medical or surgical intervention, the policing of transgender people’s identity as only being ‘valid’ if they undergo surgery, and the sky-high costs of surgery contrasted with large numbers of transgender people living in unsupportive environments and/or being unable to finance their surgery.
  6. The right to self-identification continues to be blatantly violated in Rule 8, under which a District Magistrate can reject an application; as stated in Rule 9, the applicant may appeal the rejection only within 60 days of being informed of it.
  7. The right to self-determination was also thrown out the window when the first draft Rules imposed a penalty on “false” applications, once again referring to the arbitrary power of the District Magistrate. This has also been removed following strongly negative reactions.

It is important to compare the two versions of the Rules despite the second one being arguably better and cognizant of some of the demands made by the citizens and other stakeholders.

The first version of the Rules quite clearly depicted the narrowly cisnormative perspective through which transgender lives are seen by the people in power. Despite the many changes as a result of relentless protests, the Act is nowhere near to truly respecting and empowering transgender people.

The Act also gives the final say to the District Magistrate, which some argue makes the process harder than it was before the Act, and refuses to provide affirmative action or reservations to ensure transgender representation in positions of authority that transgender people have historically been denied access to.

It also does little to counter discrimination, as is seen most clearly in the punishment for the sexual assault and rape of a transgender person being much lighter than that for the rape of a cisgender woman. It advocates for plenty of measures but does pitifully little to ensure or enable these changes.

History of the Act

The history of the Act is a turbulent one. The Transgender Persons (Protection of Rights) Bill, 2016 was almost immediately slammed by activists, NGOs, other human rights organisations, and citizens, for multiple reasons.

The most derided provision was the establishment of a ‘District Screening Committee’, comprising the District Magistrate, a chief medical officer, and a psychiatrist among others, for the sole purpose of scrutinising a transgender person’s body and identity. The bill also criminalised organised begging, an activity particularly common among the Hijra community.

The Lower House of the Parliament, the Lok Sabha, rejected all the changes proposed by the parliamentary standing committee along with the demands of the transgender community, and passed the bill with some amendments in 2018. A short-lived victory came in the form of the lapse of the bill due to the 2019 general elections.

However, as soon as the NDA government was re-elected, the bill was reintroduced in the Parliament with some more changes, particularly the removal of the section on District Screening Committees, but was still unsatisfactory.

The full text of the bill was not released when the Union Cabinet approved it on July 10, 2019; it only became public on the morning it was tabled in the Lok Sabha, garnering yet another year of protest since the bill was first introduced.

This is the bill as it exists today, having been passed by the Lok Sabha on August 5, 2019. When the motion to refer it to a select committee failed in the Rajya Sabha, it was passed on November 26, 2019, and received presidential assent on December 5, 2019. Recent developments include a writ petition in the Supreme Court challenging the validity of the Act.

Despite it becoming the law of the land, transgender citizens and activists such as Esvi Anbu Kothazam and Kanmani Ray continue to criticise it and the insidious transphobic thinking that has always guided it.




February 4, 2021 5:22 PM

Automated Facial Recognition System of India and its Implications

On June 28, 2019, the National Crime Records Bureau (NCRB) opened bids and invited turnkey solution providers to implement a centralized Automated Facial Recognition System, or AFRS, in India. As the name suggests, AFRS is a facial recognition system proposed by the Indian Ministry of Home Affairs, geared towards modernizing the police force and identifying and tracking criminals using Facial Recognition Technology, or FRT.

The technology draws on databases of photos collected from criminal records, CCTV cameras, newspapers and media, driver’s licenses, and government-issued IDs to gather facial data. FRT then maps facial features and the geometry of the face, and the software creates a “facial signature” based on the information collected. A mathematical representation is associated with each facial signature, which is subsequently compared against a database of known faces.
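The matching step described above can be sketched in miniature. The sketch below is an illustrative assumption, not the NCRB’s actual pipeline: it treats a “facial signature” as a plain numeric feature vector (the values here are invented), and compares signatures against a small hypothetical gallery using cosine similarity.

```python
import math

def facial_signature(features):
    """Normalize a raw feature vector into a unit-length 'signature'."""
    norm = math.sqrt(sum(x * x for x in features))
    return [x / norm for x in features]

def match(signature, database, threshold=0.9):
    """Return (best identity or None, similarity score) for a probe signature."""
    best_name, best_score = None, -1.0
    for name, known in database.items():
        # Cosine similarity: dot product of two unit-length vectors
        score = sum(a * b for a, b in zip(signature, known))
        if score > best_score:
            best_name, best_score = name, score
    return (best_name if best_score >= threshold else None), best_score

# Hypothetical gallery of known faces (feature values are made up)
gallery = {
    "person_a": facial_signature([0.80, 0.10, 0.50, 0.30]),
    "person_b": facial_signature([0.10, 0.90, 0.20, 0.40]),
}

# A probe whose features sit close to person_a's
probe = facial_signature([0.79, 0.12, 0.50, 0.31])
name, score = match(probe, gallery)
```

The threshold is where the policy questions live: set it low and the system floods investigators with false positives; set it high and it silently misses matches, which is exactly the trade-off the accuracy figures discussed below bear on.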

This article explores the implications of implementing Automated Facial Recognition technology in India.

Facial recognition software has become widely popular in the past decade. Several countries have been trying to establish efficient Facial Recognition systems for tackling crime and assembling an efficient criminal tracking system. Although there are a few potential benefits of using the technology, those benefits seem to be insignificant when compared to the several concerns about privacy and safety of people that the technology poses.

Images of every person captured by CCTV cameras and other sources will be treated as images of potential criminals and matched by the FRT against the Crime and Criminal Tracking Networks and Systems (CCTNS) database. This implies that all of us will be treated as potential criminals when we walk past a CCTV camera. As a consequence, the presumption of “innocent until proven guilty” will be turned on its head.

It wouldn’t surprise you to learn that China has installed the largest centralized FRT system in the world, able to collect and analyze data from the country’s more than 200 million CCTV cameras. Additionally, 20 million specialized facial recognition cameras continuously collect data for analysis. China currently uses these systems to track and manipulate the behavior of ethnic Uyghur minorities in the camps set up in the Xinjiang region. FRT was also used by China during the Hong Kong democracy protests to profile and identify protestors. These steps raised concerns worldwide about the erosion of a person’s freedom of expression, right to privacy, and basic dignity.

It is very likely that the same consequences will be faced by Indians if AFRS is established across the country.

There are several underlying concerns about implementing AFRS.

Firstly, this technology has proven inefficient in several instances. In August 2018, Delhi police used a facial recognition system reported to have an accuracy rate of just 2%. The FRT software used by the UK’s Metropolitan Police returned a staggering false-positive rate of more than 98%. In another instance, the American Civil Liberties Union (ACLU) used Amazon’s face recognition software, “Rekognition”, to compare images of members of the American Congress with a database of criminal mugshots. To Amazon’s embarrassment, the results included 28 incorrect matches.

Another significant piece of evidence was the outcome of an experiment performed by McAfee. The researchers used CycleGAN, an image-to-image translation algorithm adept at morphing photographs: it can, for example, turn horses into zebras and paintings into photographs. McAfee used the software to misdirect a facial recognition algorithm. The team fed 1,500 photos of two team members into CycleGAN, which morphed them into one another, and kept feeding the resulting images into different facial recognition algorithms to check who was recognized. After generating hundreds of such images, CycleGAN eventually produced a fake image which looked like person ‘A’ to the naked eye but tricked the FRT into identifying it as person ‘B’. Owing to these unsatisfactory results, the researchers expressed concern about the inefficiency of FRTs. In fact, mere eye makeup can fool an FRT into allowing a person on a no-fly list to board a flight. This trend of inefficiency has been noticed worldwide.

Secondly, facial recognition systems use machine learning technology, and it is concerning to note that FRT often reflects the biases prevalent in society, leading to facial mismatches. A study by MIT shows that FRT routinely misidentifies people of color, women, and young people: while the error rate was 8.1% for men, it was 20.6% for women, and 34% for women of color. Error rates this high, measured in a supervised laboratory study on a sample population, are simply unacceptable. In the above-mentioned ACLU study, the false matches were disproportionately African American and people of color. In India, 55% of undertrial prisoners are Dalits, Adivasis, or Muslims, although the combined population of all three amounts to just 39% of the total population (2011 census). If AFRS is trained on these records, it will deploy the same socially held prejudices against minority communities and produce inaccurate matches. The tender issued by the Ministry of Home Affairs gave no indication of eliminating these biases, nor did it mention human-verifiable results. Using a system embedded with societal bias to replace biased human judgement defeats claims of technological neutrality. Deploying FRT systems in law enforcement will be ineffective at best and disastrous at worst.
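The scale of the disparity in those error rates is easy to make concrete. The minimal sketch below uses only the figures reported above from the MIT study; the grouping labels are simplified for illustration.

```python
# Misidentification rates as quoted from the MIT study above
error_rates = {"men": 0.081, "women": 0.206, "women_of_color": 0.340}

# Express each group's error rate relative to the best-served group
baseline = min(error_rates.values())
disparity = {group: rate / baseline for group, rate in error_rates.items()}

for group, ratio in disparity.items():
    print(f"{group}: {ratio:.1f}x the baseline error rate")
```

Women of color are misidentified at roughly 4.2 times the rate of men; a system trained on skewed records reproduces exactly this kind of disparate impact, but at national scale.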

Thirdly, concerns about invasion of privacy and mass surveillance haven’t been addressed satisfactorily. Facial recognition makes data protection almost impossible: publicly available information is collected, but it is analyzed to a point of intimacy. India does not have a well-established data protection law, given that the Personal Data Protection Bill is yet to be enforced, and implementing AFRS in the absence of such a safeguard is a threat to our personal data. Moreover, police and other law enforcement agencies will have a great degree of discretion over our data, which can lead to mission creep. To add to the list of privacy concerns, the bidder for AFRS will be largely responsible for maintaining the confidentiality and integrity of the data, which will be stored apart from the established ISO standard. Additionally, the tender expresses no preference for “Make in India” and raises no objection to foreign bidders, even those headquartered in China, a hub of data breaches. There is no governing system, and no legal limitations or restrictions on the technology; no legal standard has been set to ensure proportional use and to protect those who non-consensually interact with the system. Furthermore, the tender does not define “criminal”. Is a person a criminal when a charge sheet is filed against them? When they are arrested? When they are convicted by a court? Or whenever they are a suspect? Since the word isn’t clearly defined, law enforcement agencies will ultimately be able to track a far larger number of people than required.

The notion that AFRS will lead to greater efficacy must be critically questioned. San Francisco imposed a total ban on police use of facial recognition in May 2019, and police departments in London are under pressure to stop using FRT after several instances of discrimination and inefficiency. India would do well to learn from the mistakes of other countries rather than repeating them.
