Accounting For Diversity In Automated Gender Recognition Systems

The Law of Tech
9 min read · Jun 6, 2021

Developments in Artificial Intelligence (AI) promise incredible progress for many fields in the near future: from AI-powered board directors to AI-enabled life-saving medical treatments, and from virtual companions to self-driving cars. Nevertheless, the introduction and implementation of AI in society raises a variety of ethical, legal and societal concerns, and in this context there is still substantial room for improvement. One concrete example is that AI systems do not always account for diversity, which can have a detrimental impact on the lives of many individuals.

In recent years, particular concerns have been raised with regard to so-called Automated Gender Recognition Systems (AGRS): AI systems that predict a person's gender (and sexual orientation). This technology is believed to be fundamentally flawed when it comes to recognising and categorising human beings in all their diversity, as many of these AI systems work by sorting people into just two groups, male or female. Not surprisingly, the consequences for people who do not fit within these binary categories can be severe. Moreover, these algorithms are often programmed in ways that reinforce outdated stereotypes about race and gender that are harmful to everyone. From a legal…
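The structural problem described above can be made concrete with a minimal, hypothetical sketch (the function name and threshold are illustrative assumptions, not taken from any real AGRS product): a binary classifier has no way to output "neither", "both", or "uncertain", so every person is forced into one of two labels, however poorly either fits.

```python
# Hypothetical sketch of the final decision step in a binary AGRS.
# The model produces a score in [0, 1]; this step forces it into
# exactly two labels, which is the design flaw discussed above.
def classify_gender(score: float) -> str:
    """Map a model score to 'female' or 'male'.

    Note there is no category for non-binary people and no way to
    abstain: even a borderline score of 0.5 yields a confident-looking
    binary label.
    """
    return "female" if score >= 0.5 else "male"

# Borderline inputs are silently resolved into one of the two boxes.
print(classify_gender(0.51))  # 'female'
print(classify_gender(0.49))  # 'male'
```

Whatever features the real systems use (face images, voice, names), the output stage looks like this: the binary choice is baked into the design, so no amount of model accuracy can make the system fair to people outside those two categories.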


Written by The Law of Tech

The Law of Tech is an online platform which aims to educate on the ongoing technology-driven transformation of the law and legal industry.