Facial recognition has a gender problem

[Morgan Klaus Scheuerman] I’m Morgan Klaus Scheuerman, and I’m an
information science PhD student. Facial recognition software, specifically,
uses a method called computer vision, which takes in a bunch of images of
people and then matches an individual person to that database of images. We
knew that people of minoritized gender identities, so people who are trans,
people who are non-binary, were very concerned about this technology, but we
didn’t actually have any empirical evidence about the misclassification
rates for that group of people. We made our own data set of images from
Instagram. So, we scraped them using different gender hashtags people used to
describe themselves, for example, trans man, trans woman, agender, things like
that. And then we ran those images through four different commercially
available facial analysis services. [Jed Brubaker] Bottom line, what we found is that the
computer vision systems that run all of our facial detection and our facial analysis
do not handle the level of gender diversity that we live with every day. [Morgan] Different systems basically classified me differently, so I had, on one hand, like,
IBM would classify me as female, and maybe Clarifai would classify me as male. [Jed] If you’re a cisgendered man or cisgendered woman, you’re doing pretty
okay in these systems. If you are a trans woman, not as well, and if you’re a trans
man, then we’re looking at, for example, on Amazon’s system, it’s called Rekognition, you’re looking at about 61 percent accuracy. But
if we step beyond people who have binary gender identities to people who
identify as non-binary or agender, 100 percent of the time you’re going
to be classified incorrectly. [Morgan] As it gets used more and more for security,
some people might be stopped, because they just don’t match what’s in the
database. [Jed] We’re at the beginning of an era where we are training our systems. We are creating an infrastructure that will assist, enable and hopefully enhance our
daily lives, but that only works if those systems are taught who we are and are
taught to think about the diversity and the complexity of who we are. [Be Boulder]
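
For readers wondering what “running images through” one of these services looks like in practice: each of the four systems the study tested returns gender as a binary label in its API response. The sketch below is an illustration under that assumption, not the researchers’ actual pipeline; it calls Amazon Rekognition’s DetectFaces endpoint via boto3 (the file name and helper function are hypothetical), and IBM, Microsoft, and Clarifai exposed analogous face-analysis calls at the time of the study.

```python
# Minimal sketch, assuming boto3 is installed and AWS credentials are
# configured. This illustrates the kind of call the study describes; it is
# not the authors' actual code.
import boto3


def classify_gender(image_path: str) -> str:
    """Send one image to Amazon Rekognition and return its gender label.

    The DetectFaces response schema only allows "Male" or "Female" as
    Gender values, so non-binary and agender people cannot be labeled
    correctly no matter how accurate the underlying model is.
    """
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # "ALL" includes the Gender attribute
        )
    faces = response["FaceDetails"]
    if not faces:
        return "no face detected"
    gender = faces[0]["Gender"]
    return f'{gender["Value"]} ({gender["Confidence"]:.1f}% confidence)'


# Hypothetical usage: compare the returned label with the hashtag the person
# chose for themselves, e.g. #transman or #agender.
# print(classify_gender("selfie_tagged_agender.jpg"))
```

The point Jed makes in the transcript follows directly from that schema: an agender or non-binary person is misgendered every time because the response has no value that could be right.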

7 comments

  1. This isn't a problem. The problem is that what these people are doing to their bodies and faces is basically a disguise. They're attempting to conceal their identity by altering their outward appearance in order to appear as something they're not. You can't expect AI to decipher a masquerade.

  2. To quote the study:

    “As evidenced through these services’ abilities to only recognize “male” and “female,” they also have the inability to recognize whether someone is transgender, binary or not. So, while some users expressed pride in their identities, the systems are unable to affirm this. For example, one image was of an #agender person wearing a t-shirt that read “not cis.” The underlying infrastructure would not have the ability to recognize trans from this declaration.”

    It seems there is a misunderstanding of what FACIAL RECOGNITION SOFTWARE is designed to do. It is not designed to read shirts.

    Also here:

    “These systems lacked the ability to contextualize implicit and explicit visual markers of gender identity, particularly in the context of trans images. For example, Microsoft misclassified an image of a bearded #transman holding up a syringe, presumably for testosterone injection based on the Instagram caption which read: “I’m back on T (testosterone) after months so hopefully I’ll be back to myself.” This is considered a definitive marker of hormone replacement therapy and trans identity, and includes “insider” markers contextual to trans communities.”

    Did it ever even receive the caption as data? How did it receive it?

    You are asking a piece of software designed to recognize faces to read and understand written captions it may or may not even have access to. You are asking it to recognize and contextualize inanimate objects.

    If it did this and used it to draw conclusions IT WOULD NOT BE FACIAL RECOGNITION SOFTWARE.

  3. It’s almost like AI isn’t putting up with your transgender and gender fluid nonsense. It knows gender better than you do. Also, that’s a dude with long hair and piercings. Get out of here.

  4. The problem lies within humans, not machines. Transgender people are by definition confused individuals, so let's not blame the machines now for not putting up with your "non-binary" nonsense.

  5. First of all it’s called sex. Nobody gives a crap about “gender.” Second of all the software is just telling the truth, unlike “trans” and “nonbinary” people.
