Feminism: Sex and gender discussions

Facial recognition software regularly misgenders trans people [shock]

36 replies

ItsAllGoingToBeFine · 20/02/2019 11:28

They [Keyes] studied 58 separate research papers to see how those researchers handled gender. It wasn’t good. Keyes found that researchers followed a binary model of gender more than 90 percent of the time, viewed gender as immutable more than 70 percent of the time, and—in research focused specifically on gender—viewed it as a purely physiological construct more than 80 percent of the time.

Shock

“Such a model fundamentally erases transgender people, excluding their concerns, needs and existences from both design and research,” Keyes wrote in The Misgendering Machines, a research paper they published in November.

Shock

Machines aren’t value neutral; they act as they’re programmed. “We’re talking about the extension of trans erasure,” Keyes said. “That has immediate consequences.”

Shock

motherboard.vice.com/amp/en_us/article/7xnwed/facial-recognition-software-regularly-misgenders-trans-people

AssassinatedBeauty · 20/02/2019 11:45

Presumably facial recognition software is identifying the sex of people, and not their gender? Given that gender is an internal feeling, I'm not sure how any visual software could analyse and recognise it.

I bet that if you tried to get software to recognise gender from people's presentation and not their face shape/features, then it would regularly misgender any non-conforming women and men.

Mner2019 · 20/02/2019 11:49

that software is transphobic Grin

NotTerfNorCis · 20/02/2019 11:52

Quick, send the police around to arrest it.

lottiegarbanzo · 20/02/2019 11:56

Huh? The writer is simply confusing gender and sex. How can facial recognition software 'see' how you feel in your head?!

It is perfectly possible that the software designers have used the word 'gender' in the traditional science-paper sense, to mean sex. That would be a failure to discern meaning, on the part of the writer of the paper cited.

When someone does come up with mind-reading software, its ability to discern gender feelings is not going to be the first thing I'd be worrying about.

DpWm · 20/02/2019 11:58

What a monumental waste of time and resources.

Human faces reveal their sex.
It's impossible to tell someone's internal sense of gender from looking at their face.
Next up, shock reveal, how babies are made.

LuggsaysNotaWomen · 20/02/2019 11:58

Software that recognizes the reality of somebody's anatomy, as opposed to the individual's fantasy of it, is most certainly a phobic piece of shit.

If it doesn’t recognize me as a twenty-five-year-old, sylph-like goddess, I’m gonna lose it.

Fucking bastards.

QuietContraryMary · 20/02/2019 11:58

Computers are transphobic.

If only they used silicone instead of silicon...

Whatisthisfuckery · 20/02/2019 12:00

Facial recognition programmes say I’m a man 9 times out of 10. I assume it’s because I’ve got very short hair. I’m reliably informed that I don’t actually look like a man.

FunnyTinge · 20/02/2019 12:01

Research finds PCs aren't PC. Grin

Take the machines to the reeducation reprogramming camps.

NottonightJosepheen · 20/02/2019 12:02

This reply has been deleted

Message withdrawn at poster's request.

FunnyTinge · 20/02/2019 12:03

Whatisthisfuckery are you quite sure that you weren't assigned female at birth by mistake?

AssassinatedBeauty · 20/02/2019 12:05

And yet... AI software has been shown to be sexist and racist. Does that have "immediate consequences"?

tech.co/news/sexist-ai-doomed-reflect-worst-2018-10

Whatisthisfuckery · 20/02/2019 12:34

FunnyTinge hm, well I thought I was female, because of vag and giving birth and that, but I do like some boys’ things now you mention it, and my hair is very short. Now I’m confused. If I google ‘am I transgender?’ do you think I might get some useful advice?

FunnyTinge · 20/02/2019 12:38

Whatisthisfuckery eek...you've asked the question, so that has invoked The Trans. I guess a bit like saying 'Bloody Mary' three times in the mirror.

You'll need some blue hair dye.

Homestar · 20/02/2019 12:40

Keyes found that researchers followed a binary model of gender more than 90 percent of the time,

I'm now laughing very hard at the idea of trying to build a facial recognition model that can reliably identify nonbinaries.

Seriously though, this seems laughable when it's just facial recognition software but how long before they start attacking researchers for including a "binary model of gender (sex)" in health research?

FunnyTinge · 20/02/2019 12:44

Homestar

All the AI needs is labelled data to train with.

Good luck with getting the non-binaries to agree on a set of labels Grin
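
To be concrete, here's a toy sketch of how little is involved (PyTorch, with random stand-in tensors rather than any real face dataset): the model learns whatever label column you hand it, so the "genders" it can output are exactly whatever labels someone chose up front.

```python
import torch
import torch.nn as nn

# Stand-in for a labelled face dataset: 256 fake 64x64 greyscale "images"
# and a binary label column. A real system would load annotated photos here.
images = torch.rand(256, 1, 64, 64)
labels = torch.randint(0, 2, (256,))

# Tiny classifier: the 2 outputs mean "however many labels you defined".
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(8 * 32 * 32, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    opt.zero_grad()
    loss = loss_fn(model(images), labels)  # fit to whatever the labels say
    loss.backward()
    opt.step()
```

Swap the label column and the exact same code is a "gender recogniser", an "age recogniser", or anything else - the AI has no opinion of its own.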

Oldermum156 · 20/02/2019 12:45

Facial recognition should be informed that if you wear pink, you're a feeeemale :P

ItsAllGoingToBeFine · 20/02/2019 12:46

how long before they start attacking researchers for including a "binary model of gender (sex)" in health research

I suspect that will start happening very soon, if it hasn't already.

Homestar · 20/02/2019 12:48

Keyes doesn’t see a need for any kind of AGR [automatic gender recognition] at all.

“Technologies need to be contextual and need-driven,” they said. “What are the values of the people who use the space that you're deploying a technology in? Do the people in that space actually need it? If we're not discussing gender at all, or race at all...it doesn't necessarily lead to a better world.”

But you can't solve the racist AI problem without actually discussing race. If you pretend race doesn't exist, you'll get racist inputs and racist outputs because we're in a racist society.

The "genderfucky nightmare" wants sex/gender to be ignored. They want it to not even be a variable. Very disingenuous of them to pretend that the "gender" problem and the "race" problem are analagous when they think the gender problem can be solved by ignoring it.

LangCleg · 20/02/2019 12:49

All. The. LOLs!

ErrolTheDragon · 20/02/2019 12:50

Machines aren’t value neutral; they act as they’re programmed.

Actually, a lot of AI depends on neural networks etc., not deterministic programming.

I assume they'll generally use a large training set of pictures, each of which has a 'sex' or 'gender' attribute. (This is the level at which things may not always be 'value neutral', I suspect - what attributes are being tested?) A possibility would be to include both (though probably harder if the options aren't binary?) and then see how the results differ - a sketch of that idea is below.

I'd take a guess that if the training data is representative and has appropriate attributes, AI might be able to accurately detect some transgender people using the same cues humans do unconsciously. Whether that's the outcome those people would want, I don't know...

Vixxxy · 20/02/2019 13:53

Erm, how would this software ever be able to go on 'gender' rather than sex? Bonkers.

Bezalelle · 20/02/2019 14:11

Puts me in mind of the Cultural Revolution in China, where machines were said to operate only by "Mao Zedong Thought" and not by mechanisms.

ErrolTheDragon · 20/02/2019 14:15

Well, who knows what a neural network is going on? (There was the infamous, maybe mythical but plausible, case of an AI trained to detect tanks which actually detected cloudy weather really well; a toy version of that failure mode is sketched below.)

If there were sex and 'gender' attributes, I'd think there was a fair chance of AIs distinguishing eg people with prominent browbones wearing makeup, or people with smaller facial features and stubble.

'Nonbinaries' ... who knows. Hairstyle and poutiness?
