Trying to Force Analogue (Gender) to Binary

Service that uses AI to identify gender based on names looks incredibly biased

Some tech companies make a splash when they launch, others seem to bellyflop.

Genderify, a new service that promised to identify someone’s gender by analyzing their name, email address, or username with the help of AI, looks firmly to be in the latter camp. The company launched on Product Hunt last week and quickly picked up a lot of attention on social media as users discovered biases and inaccuracies in its algorithms.

As the article details, there were a lot of inaccuracies, not to mention some strange results.

Although these sorts of biases appear regularly in machine learning systems, the thoughtlessness of Genderify seems to have surprised many experts in the field. The response from Meredith Whittaker, co-founder of the AI Now Institute, which studies the impact of AI on society, was somewhat typical. “Are we being trolled?” she asked. “Is this a psyop meant to distract the tech+justice world? Is it cringey tech April fool’s day already?”

The problem is not that Genderify made assumptions about someone’s gender based on their name. People do this all the time, and sometimes make mistakes in the process. That’s why it’s polite to find out how people self-identify and how they want to be addressed. The problem with Genderify is that it automated these assumptions, applying them at scale, sorting individuals into a male/female binary (and so ignoring people who identify as non-binary), and reinforcing gender stereotypes in the process (such as: if you’re a doctor, you’re probably a man).
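To see why the structure itself is the problem, here is a minimal, entirely hypothetical sketch (not Genderify's actual code; the names, lookup table, and default behaviour are all invented for illustration). However accurate the underlying data, an output space hard-coded to two labels misrepresents anyone it wasn't designed to describe:

```python
# Hypothetical toy lookup, NOT Genderify's implementation: entries and
# confidence values are invented to illustrate the structural flaw.
TOY_LOOKUP = {
    "james": ("male", 0.99),
    "maria": ("female", 0.98),
    "alex":  ("male", 0.60),  # ambiguous name, still forced to one side
}

def classify(name: str) -> tuple[str, float]:
    """Return a (label, confidence) guess for a given first name."""
    # The output space is fixed to two labels. Non-binary people, and
    # anyone whose name is absent from the table, are mislabelled by
    # design -- no amount of extra training data fixes that.
    return TOY_LOOKUP.get(name.lower(), ("male", 0.50))

print(classify("Alex"))  # an assumption reported as an answer
```

Note that the confidence score doesn't help: a 60% guess applied at scale is still a guess, and the categories themselves exclude people before any prediction is made.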

The biggest question in my mind is why such an AI would be deemed necessary, rather than, oh, I don't know, asking someone up-front how they identify. Especially if you're the kind of person (or organisation) that's going to discriminate against someone based on the answer to that question.

Sadly, I suspect something similar to Genderify will come to market eventually, if only because automated discrimination would find a lot of eager customers...
