
Feminism: Sex and gender discussions
nocoolnamesleft · 06/06/2024 22:44

So it can easily define a man, but not a woman? For fuck's sake.

NoWordForFluffy · 06/06/2024 22:44

nocoolnamesleft · 06/06/2024 22:44

So it can easily define a man, but not a woman? For fuck's sake.

Literally coming to say that. 🤬

User1979289 · 06/06/2024 22:46

It's absurd! Like a South Park joke 😂😂

lcakethereforeIam · 06/06/2024 22:52

They'll never get AI to pass the Turing test if all you have to do is ask it what a woman is.

ScrollingLeaves · 06/06/2024 22:53

Imagine these systems regulating our lives. They will probably get other things just as wrong too.

Arconialiving · 07/06/2024 00:43

Fucking hell - this is really scary for our future!

DaSilvaP · 07/06/2024 06:04

Ahem ...

You're aware that this so-called "(artificial) intelligence" is only regurgitating whatever selected input was fed into it by humans?

Ask yourself who is selecting the "approved input" and then you won't be surprised by any nonsensical output.

theDudesmummy · 07/06/2024 06:12

Characteristics of women include interest in fashion, beauty, cooking and childcare, and having long hair? Oh come on AI, surely you can do better than this regressive 1950s shit?

theDudesmummy · 07/06/2024 06:15

Interestingly, ChatGPT does better than this (for me anyway, and for today; I have found its answers to the very same question can vary day to day).

Bringbackthebeaver · 07/06/2024 06:18

@NoWordForFluffy @nocoolnamesleft It defined a man as an adult human male only after it had been through the whole process with the definition of "woman".

If it was about man at the beginning it might have said something different.

Bringbackthebeaver · 07/06/2024 06:20

theDudesmummy · 07/06/2024 06:12

Characteristics of women include interest in fashion, beauty, cooking and childcare, and having long hair? Oh come on AI, surely you can do better than this regressive 1950s shit?

AI only learns from what humans teach it. It's a reflection of society and information on the internet. It just shows that society is regressive 😞

Igmum · 07/06/2024 06:24

Why am I not surprised? The computer people who programme this seem to have very high levels of trans women in their ranks.

PickledMumion · 07/06/2024 06:26

There's no intelligence or thought process behind A"I". It's just fancy predictive text, trying to figure out what the most likely next word is. FWIW Reddit has formed a huge part of the input data.
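The "fancy predictive text" description above can be sketched in miniature. The toy below simply counts which word follows which in a tiny corpus and always emits the most frequent successor; real large language models use neural networks over subword tokens and far more data, but the training objective (predict the next token) is the same. The corpus and function names here are purely illustrative.

```python
from collections import Counter, defaultdict

# Toy "predictive text": count which word follows which in a corpus,
# then always predict the most frequent successor.
corpus = "the cat sat on the mat and the cat slept".split()

successors = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    successors[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequently observed word after `word`, or None."""
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once -> cat
```

Note the model has no notion of truth: it will happily continue a sentence with whatever was statistically common in its training text, which is exactly why confident-sounding fabrications ("hallucinations") arise.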

WildAloofRebel · 07/06/2024 06:26

Didn’t she just train the AI that a woman is an adult human female and therefore it answered the male question in the same way? I don’t know how immediate it is. I wonder what the answer would be if someone else asked.

theDudesmummy · 07/06/2024 06:45

Yes, I have really had my eyes opened about AI recently. Touted as this great helpful amazing thing, but in fact so unbelievably and frighteningly flawed as a tool. I was at some teaching about it recently where I was alerted to the phenomenon of AI hallucination. The teacher showed us how input asking about legal cases on a certain issue resulted in completely fabricated cases being presented in a completely convincing way.

So I tried asking an AI engine about whether there are any legal cases about me (I have a completely unique name which no-one else in the world has, if you put my exact spelling and my title, so if it threw up a case it could only be linked to me, not someone else with the same name).

It turns out that, depending on when you ask, I am either one half of a lesbian couple in South Africa who had a child by artificial insemination and fought a custody battle with my ex-partner (I won that case, it seems), or an Australian anaesthetist who left a patient brain damaged through malpractice (I lost that one). The only two nuggets of truth in there are that I once lived in South Africa (not for decades and not at the time the case supposedly occurred), and I am a doctor (not an anaesthetist, nor have I ever caused anyone brain damage).

That little exercise stopped dead in their tracks any nascent ideas I had of using AI for anything other than writing amusing poems about charismatic sheep, or producing pictures of ballet dancing guinea pigs...

PepeParapluie · 07/06/2024 06:53

theDudesmummy · 07/06/2024 06:45

Yes, I have really had my eyes opened about AI recently. Touted as this great helpful amazing thing, but in fact so unbelievably and frighteningly flawed as a tool. I was at some teaching about it recently where I was alerted to the phenomenon of AI hallucination. The teacher showed us how input asking about legal cases on a certain issue resulted in completely fabricated cases being presented in a completely convincing way.

So I tried asking an AI engine about whether there are any legal cases about me (I have a completely unique name which no-one else in the world has, if you put my exact spelling and my title, so if it threw up a case it could only be linked to me, not someone else with the same name).

It turns out that, depending on when you ask, I am either one half of a lesbian couple in South Africa who had a child by artificial insemination and fought a custody battle with my ex-partner (I won that case, it seems), or an Australian anaesthetist who left a patient brain damaged through malpractice (I lost that one). The only two nuggets of truth in there are that I once lived in South Africa (not for decades and not at the time the case supposedly occurred), and I am a doctor (not an anaesthetist, nor have I ever caused anyone brain damage).

That little exercise stopped dead in their tracks any nascent ideas I had of using AI for anything other than writing amusing poems about charismatic sheep, or producing pictures of ballet dancing guinea pigs...


This is an issue which has recently appeared in a British court in at least one reported case. https://www.lawgazette.co.uk/news/ai-hallucinates-nine-helpful-case-authorities/5118179.article

Part of the problem is probably that lawyers are expensive, so people represent themselves; finding authorities is no easy task if you are not trained in it and don't have access to the same resources as lawyers. I can see why this happened, but it is worrying.

theDudesmummy · 07/06/2024 06:55

At least one lawyer in the US has been fined for presenting AI-generated hallucinated case law in court. He told the judge that he thought AI was a search engine, like Google...

mirax · 07/06/2024 06:56

It is such an unfortunate coincidence that global AI kicks in at a time of such ideological indoctrination, particularly with the young. Imagine how difficult it is going to be to undo the damage even if the will to do so exists. This stuff is embedded everywhere. For me, an Asian living out east, it is a form of cultural imperialism from the west.

RockaLock · 07/06/2024 07:01

WildAloofRebel · 07/06/2024 06:26

Didn’t she just train the AI that a woman is an adult human female and therefore it answered the male question in the same way? I don’t know how immediate it is. I wonder what the answer would be if someone else asked.

Yes, that was how I read it too.

By showing it all the contradictions in its definitions, she got it to realise that woman = adult human female.

And so when she then asked "what is a man?", it used what it had just learned to define a man.

theDudesmummy · 07/06/2024 07:03

I think now that the power of AI has been both under and overplayed.

Underplayed because of its huge potential to do harm by fabricating "reality", whether by accident (eg the hallucinated case law) or design (bad actors influencing elections etc).

Overplayed because it is (for now) way more flawed and less reliable than we are being led to believe.

I guess the scary part will come if "they" sort out the obvious hallucinations and unreliability (and the weird six-fingered hands on AI images of people etc), so the distortion of reality becomes more convincing and undetectable, and thereby in effect "becomes" reality.

mach2 · 07/06/2024 07:25

I call it "artificial stupidity".
