
Feminism: Sex and gender discussions

Lesbian dating app to use facial recognition to exclude males

117 replies

zibzibara · 20/06/2024 00:54

https://www.dailymail.co.uk/news/article-13484121/Lesbian-dating-app-use-facial-recognition-exclude-trans-women-matching-biological-females.html

This is from Jenny Watson, who's known for fearlessly kicking out the males invading her lesbian speed dating events; now she's doing the same with a new dating app for women. Hopefully it will work as advertised, but even if the usual crop of males manages to impose themselves through some deceitful trickery, that only further proves her point about how lesbians are being bullied out of their own spaces. Wishing her the best of luck with this!

Lesbian dating app to use facial recognition to exclude trans women

The first dating app for lesbians is set to launch - using sex-recognition technology to exclude trans women and ensure only biological females can sign up.


PatatiPatatras · 20/06/2024 09:01

Is there any way of identifying a trans 'inner being'? No. But, but, but... trans anyway.

Is there a way of identifying female humans? Yes, but it's only 60-80% right. No, no, no.

Don't worry: the AI will eventually become 100% accurate, because you've challenged it to become exactly what you don't want, a way of differentiating the sexes exactly.

And the only way it will ever be wrong is if you lie to it. So yeah, keep challenging it.

AquaFurball · 20/06/2024 09:11

No3387 · 20/06/2024 08:01

This gave Dylan Mulvaney (two separate pictures used) a higher accuracy % than me. An actual born woman.

Give it a photo of Dylan before the extensive surgery and it's very much a man: 97% man without the hair, makeup and earrings; a second image, with them, is unfortunately 66.7% woman.

Whoopi Goldberg's imdb main image is 96.13% woman @DuchessNope

I ran my very manly-looking female friend (super plus size, 6ft tall, a carbon copy of her dad) through it and she's still over 95% woman.

None of the unmanipulated pics of regular, non-celebrity transwomen produced anything other than man. There are only 5 of them, including a couple with facial surgery.

DrBlackbird · 20/06/2024 09:13

Actually, it’s interesting. I don’t think the police should use it, or private firms on their employees. So should this dating app use it? I think I’d have to agree that it’s men’s responsibility to stay out of female spaces. Women should not be put in the position of having to ‘police’ female spaces.

DuchessNope · 20/06/2024 09:16

Whoopi Goldberg's imdb main image is 96.13% woman @DuchessNope

That's the one I used! Weird.

AquaFurball · 20/06/2024 09:22

@DuchessNope
The same image. Very odd.

Live camera image of myself was 99%, all saved images over 97%, and I'm not dainty or delicate, nor do I conform to beauty standards @GrammarTeacher

CantDealwithChristmas · 20/06/2024 09:24

DrBlackbird · 20/06/2024 08:52

To reassure @CantDealwithChristmas: absolutely no one thinks or is suggesting that black women are ‘less beautiful’, or that comments alluding to the idea that AI might have difficulty 'recognising' Black people's faces carry an underlying idea that Black people's bone structures mark them out as fundamentally different and less 'evolved' than Caucasians.

To be clear, the concerns about AI having difficulty recognising black faces such as mentioned by @GrammarTeacher arises from the many research studies on FRT. For example, MIT computer scientist Joy Buolamwini has written extensively on this issue.

My guess is that anybody questioning AI’s sensitivity to misidentifying black faces comes to it from this understanding. Kate Crawford wrote about this in her book Atlas of AI, and how it comes from a lack of representation in AI’s training data. This then seems to have been corroborated by real-life stories of black men being wrongly arrested on the basis of (poorly designed) FRT.

However, going by what you say here, it is at least reassuring that the UK border controls use more reliable technology.

https://www.media.mit.edu/articles/facial-recognition-software-is-biased-towards-white-men-researcher-finds/

https://mitsloan.mit.edu/ideas-made-to-matter/unmasking-bias-facial-recognition-algorithms

https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html

Edited

To reassure @CantDealwithChristmas: absolutely no one thinks or is suggesting that black women are ‘less beautiful’, or that comments alluding to the idea that AI might have difficulty 'recognising' Black people's faces carry an underlying idea that Black people's bone structures mark them out as fundamentally different and less 'evolved' than Caucasians.

Not being a child, I don't require 'reassurance'.

I never said anything about 'beauty' - why did you?

How do you know what's in the minds or the intentions of the posters I mentioned?

I was simply saying that I have been made slightly uneasy by the speed and keenness with which certain posters jumped to chat about race, and specifically the faces of Black Women, on a thread about using AI to differentiate between the sexes.

To be clear, since some on this thread seem unaccountably not to know: the biological differences between the two sexes are real. 'Biological differences' between 'races' are not.

DuchessNope · 20/06/2024 09:32

I was simply saying that I have been made slightly uneasy by the speed and keenness with which certain posters jumped to chat about race, and specifically the faces of Black Women, on a thread about using AI to differentiate between the sexes.

I can understand this and I’m sorry - I should have been much clearer that I was basing this on a lot of reading (to do with my job) about how these systems interact with race.

ETA - rather than just guessing black women would be less likely to be recognised as women or something. Which is gross, agreed.

334bu · 20/06/2024 09:47

Given that safeguarding the safety and dignity of same-sex-attracted women is the reason for using this technology to prevent men gatecrashing the site, and that any safeguarding protocol can only ensure a certain level of security, what might be considered a normal margin of error?

AntoinetteCosway · 20/06/2024 09:48

DuchessNope · 20/06/2024 07:33

https://www.nyckel.com/pretrained-classifiers/gender-detector/

This is the AI the app uses if you want to play around with it. I got 98% woman.

I just tried this with a picture of my daughter (who has short hair) and it was 88% sure she was a man! Obviously I'm biased, but she's a very pretty girl, so I don't think this is very accurate...!

kittykarate · 20/06/2024 09:49

Doesn't the confidence of skeleton sexing vary depending on how much of it you have and how good a condition it's in? Like, if all you get is a little finger, it's a lot harder to be certain about the sex (though I guess you could give an age range), but if you get a pelvis or a skull it's like a klaxon going off.

GrammarTeacher · 20/06/2024 09:51

DuchessNope · 20/06/2024 09:32

I was simply saying that I have been made slightly uneasy by the speed and keenness with which certain posters jumped to chat about race, and specifically the faces of Black Women, on a thread about using AI to differentiate between the sexes.

I can understand this and I’m sorry - I should have been much clearer that I was basing this on a lot of reading (to do with my job) about how these systems interact with race.

ETA - rather than just guessing black women would be less likely to be recognised as women or something. Which is gross, agreed.

Edited

Same. Although it's not my job, I'm genuinely interested in this, and some students (from a range of backgrounds) have presented on this issue at our school FemSoc. The problem is the racism of the culture fed into the AI, not the people who point out the problem.
LLMs such as ChatGPT that are trained on the internet are demonstrably racist, sexist and homophobic, because the internet is a Wild West of opinions with no regulation.

I apologise if, in my eagerness to debate the problems with AI processing, I rushed ahead and didn't explain myself as respectfully as I should have. Sorry.

LittleLittleRex · 20/06/2024 09:52

Even if it isn't correctly sexing Whoopi Goldberg, it will have the capacity to know when a face is half covered by hair and sunglasses, and to reject the image.

We can't complain that it isn't 100% accurate; it will weed out the vast majority of males and make dating easier for lesbians. The current situation means having to pussyfoot around wanting a female partner; on this app, men can't complain that female is specified, and most of them won't get in anyway.

If we waited on perfect, nothing would change. If we dismissed any field of study because some bad people once tried something similar, we wouldn't have treatments for genetic disorders.

YouJustDoYou · 20/06/2024 09:54

DuchessNope · 20/06/2024 07:37

Just tried some out - Whoopi Goldberg came out as woman but with 86% confidence.

Dylan Mulvaney got woman as 98% confidence.

So it’s pretty useless.

Don't forget Dylan has had facial reconstruction surgery to try to achieve a more "female" profile though, so that might be why it came out like that... shaving his jawline down, restructuring his face, etc.

334bu · 20/06/2024 09:57

Also all photographs of Dylan are photoshopped to the nth degree.

SerendipityJane · 20/06/2024 10:03

DuchessNope · 20/06/2024 07:33

https://www.nyckel.com/pretrained-classifiers/gender-detector/

This is the AI the app uses if you want to play around with it. I got 98% woman.

Well, by using the word 'gender', that saved me a read.

GailBlancheViola · 20/06/2024 10:05

Datun · 20/06/2024 07:50

Quite. The fact that women have to go to these lengths is insane.

Exactly. If men of any and every flavour just stayed the fuck away from things that are not for them, there would be no need to go to these lengths.

That people are criticising Jenny Watson and the use of this software, and whether it is good enough, instead of criticising the men who have made its use necessary, shows just how much of a problem there is.

Fariha31 · 20/06/2024 10:13

GrammarTeacher · 20/06/2024 09:51

Same. Although it's not my job, I'm genuinely interested in this, and some students (from a range of backgrounds) have presented on this issue at our school FemSoc. The problem is the racism of the culture fed into the AI, not the people who point out the problem.
LLMs such as ChatGPT that are trained on the internet are demonstrably racist, sexist and homophobic, because the internet is a Wild West of opinions with no regulation.

I apologise if, in my eagerness to debate the problems with AI processing, I rushed ahead and didn't explain myself as respectfully as I should have. Sorry.

You don't seem to be listening to @CantDealwithChristmas.
She has said her AI is trained on skeletal proportions, not social media or adverts.
It's what is used by professionals in big organisations when the decisions are likely to matter a lot.
That's a very basic difference.
You're basically talking about something else.

No3387 · 20/06/2024 10:22

AquaFurball · 20/06/2024 09:11

Give it a photo of Dylan before the extensive surgery and it's very much a man: 97% man without the hair, makeup and earrings; a second image, with them, is unfortunately 66.7% woman.

Whoopi Goldberg's imdb main image is 96.13% woman @DuchessNope

I ran my very manly-looking female friend (super plus size, 6ft tall, a carbon copy of her dad) through it and she's still over 95% woman.

None of the unmanipulated pics of regular, non-celebrity transwomen produced anything other than man. There are only 5 of them, including a couple with facial surgery.

Not the point though, is it... The point is, it is picking him up as a woman, so the technology doesn't work.

The photos I put through, one more feminine-looking and one more masculine of him, both came out at over 98%; I, however, got only 96. That's fucking insulting.

oiwheresmymug · 20/06/2024 10:49

AntoinetteCosway · 20/06/2024 09:48

I just tried this with a picture of my daughter (who has short hair) and it was 88% sure she was a man! Obviously I'm biased, but she's a very pretty girl, so I don't think this is very accurate...!

To be fair, this particular tool says it recognises gender (not sex) based on 'societal norms', so that might be why?

oiwheresmymug · 20/06/2024 10:51

No3387 · 20/06/2024 10:22

Not the point though, is it... The point is, it is picking him up as a woman, so the technology doesn't work.

The photos I put through, one more feminine-looking and one more masculine of him, both came out at over 98%; I, however, got only 96. That's fucking insulting.

That tool says it determines gender (not sex) based on 'societal norms'. It seems that it isn't necessarily trying to ascertain sex, like the dating app's tool would. It's not my area of expertise in the slightest so I can't speak as to whether these tools are basically the same or not though.

simmertime · 20/06/2024 10:51

I see Joy Buolamwini's work was referenced above. Her landmark study on the interaction of race and sex-identification has a website here: http://gendershades.org/ where you can explore the data and see how bad the effects are. It's pretty stark - for the Microsoft product, 94% of the cases it gets wrong are darker-skinned people.
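[Editor's note: as a purely illustrative aside, the "94% of errors" figure boils down to asking what fraction of a classifier's misclassifications land on each subgroup. The numbers below are invented for illustration and are not the Gender Shades data; the real figures are on gendershades.org.]

```python
def error_share_by_group(results):
    """results: list of (group, was_correct) pairs, one per test image.
    Returns each group's share of the total number of misclassifications."""
    errors = [group for group, ok in results if not ok]
    total = len(errors)
    return {g: errors.count(g) / total for g in set(errors)}

# Hypothetical evaluation run: 100 darker-skinned faces with 47 errors,
# 100 lighter-skinned faces with 3 errors.
results = (
    [("darker", False)] * 47 + [("darker", True)] * 53 +
    [("lighter", False)] * 3 + [("lighter", True)] * 97
)
shares = error_share_by_group(results)
print(shares["darker"])  # 47 of the 50 total errors = 0.94
```

Note that a group's share of the errors is a different quantity from its error *rate*; a skewed share can reflect either a biased model or an unbalanced test set, which is why the study controlled the benchmark composition.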


SerendipityJane · 20/06/2024 11:06

The photos I put through,

Just a note that it was possible to filter out photos in 2010. Not sure why this fantastic development is using such old tech. Is it steam-powered?

DrBlackbird · 20/06/2024 11:16

CantDealwithChristmas · 20/06/2024 09:24

To reassure @CantDealwithChristmas: absolutely no one thinks or is suggesting that black women are ‘less beautiful’, or that comments alluding to the idea that AI might have difficulty 'recognising' Black people's faces carry an underlying idea that Black people's bone structures mark them out as fundamentally different and less 'evolved' than Caucasians.

Not being a child, I don't require 'reassurance'.

I never said anything about 'beauty' - why did you?

How do you know what's in the minds or the intentions of the posters I mentioned?

I was simply saying that I have been made slightly uneasy by the speed and keenness with which certain posters jumped to chat about race, and specifically the faces of Black Women, on a thread about using AI to differentiate between the sexes.

To be clear, since some on this thread seem unaccountably not to know: the biological differences between the two sexes are real. 'Biological differences' between 'races' are not.

Like other posters' comments, this comment of yours implicitly references appearance (i.e. not having feminine features), which is why I mentioned beauty.

Got to say I'm feeling pretty uneasy about the posters @GrammarTeacher and @DuchessNope who seem to be implying or stating outright that Black Women have more masculine features because they are Black??!?

You’re right in that I can’t know for certain what’s in the minds of other posters, I guess neither can you. However, I took their comments to mean no more than a) AI has problems recognising sexes from facial recognition and b) Facial recognition AI is even more rubbish at recognising black female and male faces.

This is a well-recognised outcome of many AI algorithms, including those used by a variety of British police forces. No one, afaik, was making the ridiculous suggestion that 'biological differences' between races are real. Such a suggestion is vile and I have never seen it expressed on this board. The concern on ‘Feminism: Sex and Gender Discussions’ tends to focus on differences between the sexes.

DrBlackbird · 20/06/2024 11:20

simmertime · 20/06/2024 10:51

I see Joy Buolamwini's work was referenced above. Her landmark study on the interaction of race and sex-identification has a website here: http://gendershades.org/ where you can explore the data and see how bad the effects are. It's pretty stark - for the Microsoft product, 94% of the cases it gets wrong are darker-skinned people.

Joy’s work has been phenomenal in highlighting the problem of race in AI training data as has Kate Crawford’s.

https://excavating.ai/
