
Feminism: Sex and gender discussions

Disappointing from the OII

6 replies

HartSeven · 08/07/2025 16:44

The Oxford Internet Institute has always done good work but this news item from them is very disappointing.
https://www.oii.ox.ac.uk/news-events/ais-limited-understanding-of-gender-puts-health-equity-at-risk/
The gist of their complaint is that AI language models assume that sex is binary and so when dealing with issues of health might wrongly associate being a woman with having a uterus and so on.
"The researchers warn that in healthcare, where AI is increasingly integrated into health technologies, these flawed assumptions, which are often based on a model’s conflation of gender and biological sex characteristics, could lead to inaccurate advice and misdiagnoses."
So less would go wrong if health technologies &c worked on the assumption that there are more than two sexes in humans, and that gender ID matters more than biology? Hmm.
The authors are experts in data science, not in anything medical or health-related, funnily enough, and one of them goes by "they/them", which is fine but as far as I know has zero relevance to health equity.

myplace · 08/07/2025 16:48

And of course the issue apparently is being insufficiently woolly about the meanings of words, rather than insufficiently precise. If only words had meanings, AI would find it so much easier.

Alternatively, even AI knows what a woman is. It just hasn’t yet adjusted for those men who don’t.

theilltemperedmaggotintheheartofthelaw · 08/07/2025 19:01

Some models treat terms like ‘nonbinary’ or ‘genderqueer’ as less likely than non-human objects like ‘windscreen’, suggesting a fundamental failure to recognise these as valid human identities.

😂😂😂

Algorithms smarter than many humans shocker.

PauliString · 08/07/2025 19:17

It's late. I misread that as
failure to recognise non-human objects like ‘windscreen’ as valid human identities.

And it honestly didn't sound that surprising.

MelodyMalone · 08/07/2025 19:30

theilltemperedmaggotintheheartofthelaw · 08/07/2025 19:01

Some models treat terms like ‘nonbinary’ or ‘genderqueer’ as less likely than non-human objects like ‘windscreen’, suggesting a fundamental failure to recognise these as valid human identities.

😂😂😂

Algorithms smarter than many humans shocker.

I'm going to hazard a guess that most people say "windscreen" more often than "genderqueer" 🤦‍♀️
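MelodyMalone's frequency point can be sketched in a few lines: a statistical language model's likelihood for a word tracks how often it appears in the training text, not any judgement about whether the word names a "valid" identity. A minimal toy illustration with a hypothetical unigram model and an invented mini-corpus (nothing here is from the OII study):

```python
from collections import Counter

# Hypothetical toy corpus: relative word frequencies are all
# a unigram model has to go on when assigning likelihoods.
corpus = (
    "the windscreen was cracked so we replaced the windscreen "
    "wipers cleaned the windscreen before the drive"
).split()

counts = Counter(corpus)
total = sum(counts.values())

def unigram_prob(word: str) -> float:
    # Counter returns 0 for unseen words, so probability reflects
    # frequency in the corpus and nothing more.
    return counts[word] / total

print(unigram_prob("windscreen"))   # common in this corpus, so higher
print(unigram_prob("genderqueer"))  # 0.0 here: unseen, not "invalid"
```

Real models smooth and use context rather than raw unigram counts, but the underlying point stands: "less likely than 'windscreen'" is a statement about corpus frequency, not a value judgement by the model.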

theilltemperedmaggotintheheartofthelaw · 08/07/2025 19:35

MelodyMalone · 08/07/2025 19:30

I'm going to hazard a guess that most people say "windscreen" more often than "genderqueer" 🤦‍♀️

Indeed, and the authors also note that the models tended to converge onto a biological reading of sex and gender, suggesting that most of their source material is not irredeemably corrupted by gender woo. There is hope yet!

TheywontletmehavethenameIwant · 08/07/2025 19:48

When even the AI is telling you, it's time to face up to reality, muppets. 😁
