
Feminism: Sex and gender discussions

AI is really worrying and I fear this sort of image generation is tip of the iceberg

265 replies

mids2019 · 02/01/2026 21:22

https://www.bbc.co.uk/news/articles/c98p1r4e6m8o

I don't know to what extent current legislation covers this, but to my mind any woman with an image on the net could be vulnerable to this. Are we going to reach a stage where our daughters simply won't want any image taken of them, for fear of how it could be manipulated?


Woman felt 'dehumanised' after Musk's Grok AI used to digitally remove her clothes

The BBC has seen several examples of it undressing women and putting them in sexual situations without their consent.


PollyNomial · 10/01/2026 11:27

BrokenSunflowers · 10/01/2026 11:12

Yet sovereign states and maintaining borders are among the things people on X are criticised for talking about.

The UK has maintained its borders for all our lifetimes.

Equating this maintenance with the (to be kind) ridiculous fixation on the proportion of non-white, non-UK-born citizens is why they are criticised.

PollyNomial · 10/01/2026 11:41

BrokenSunflowers · 10/01/2026 11:16

My recent research [1] shows that over 35,000 text-to-image models designed to generate non-consensual intimate imagery are freely available on public platforms.

No other AI tool is embedded in a distribution platform in this way, creating and publishing the images so they are immediately visible to all users of that platform.

Non embedded tools typically return content to the prompter alone. While this can then be shared, the automated screening of such content on most mainstream platforms will prevent this type of imagery from being seen widely.

PollyNomial · 10/01/2026 11:43

SerendipityJane · 10/01/2026 11:07

We have now learned there is no such thing as a sovereign state.

Try visiting France without a passport and tell us how long it took to be deported.

RapidOnsetGenderCritic · 10/01/2026 11:46

PollyNomial · 10/01/2026 11:14

Yes, images created elsewhere can be shared on any platform (providing they don't trigger automated rules, such as platforms banning many breastfeeding images because they ban unclothed breasts without considering context).

Twitter goes much further because you can create and distribute these images to the whole platform with a single command and no filtering being applied.

How many commands does it take to do the equivalent (post appalling images) on other platforms? It's hardly a burdensome task that will put off horrible people.

X needs to get its house in order, and so do other platforms and providers of AI. X has a problem which needs fixing, but shutting down X won't go anywhere near to solving the problem.

BrokenSunflowers · 10/01/2026 11:47

PollyNomial · 10/01/2026 11:41

No other AI tool is embedded in a distribution platform that creates and publishes the images like this to make it immediately visible to all users of that platform.

That does seem to be a sensible change to make for images. I guess if an extra step was introduced requiring the person prompting to agree to share then you would be ok with it? Or do you think all AI images need legislating? And ownership of child abuse imagery needs harsher sanctions?

RapidOnsetGenderCritic · 10/01/2026 11:54

PollyNomial · 10/01/2026 11:41

No other AI tool is embedded in a distribution platform that creates and publishes the images like this to make it immediately visible to all users of that platform.

I have no reason to doubt what you say here about the differences between platforms, and assuming you are correct, a small step X could take is to implement similar screening before displaying images. As far as I can see, preventing AI from producing the images in the first place is a non-trivial task, as AI has no intelligence with which to understand the motivations and intentions of the people asking it to produce them. So people with bad intentions are likely to use considerable ingenuity to get round whatever protections are built into any AI platform.

SerendipityJane · 10/01/2026 11:59

PollyNomial · 10/01/2026 11:43

Try visiting France without a passport and tell us how long it took to be deported.

I'm only responding to the Trump world view.

BrokenSunflowers · 10/01/2026 12:10

SerendipityJane · 10/01/2026 11:59

I'm only responding to the Trump world view.

Trump is keen on borders too.

PollyNomial · 10/01/2026 12:12

BrokenSunflowers · 10/01/2026 11:47

That does seem to be a sensible change to make for images. I guess if an extra step was introduced requiring the person prompting to agree to share then you would be ok with it? Or do you think all AI images need legislating? And ownership of child abuse imagery needs harsher sanctions?

I'm not ok with these images being created or shared at all!

I think all platforms should be screening them out before they are made available. And all AI tools should be coded so they cannot create/release them. We know it's possible because many have such systems in place.

There are already laws with "life changing" punishments for individuals who create and distribute these images. Not sure how equivalent deterrents could be applied to organisations; Indonesia has just banned Grok (not sure if time-limited), which might point a way forward, as billionaires are rather keen on £...

BrokenSunflowers · 10/01/2026 12:19

Indonesia's just banned grok

And Iran has just banned the internet….

What systems are in place that allow the creation of benign images or benign photoshopping but not sexualised imagery? How does that work?

SerendipityJane · 10/01/2026 13:45

BrokenSunflowers · 10/01/2026 12:10

Trump is keen on borders too.

Not anyone else's.

SerendipityJane · 10/01/2026 13:46

BrokenSunflowers · 10/01/2026 12:19

Indonesia's just banned grok

And Iran has just banned the internet….

What systems are in place that allow the creation of benign images or benign photoshopping but not sexualised imagery? How does that work?

Quis custodiet custodes?

Christinapple · 10/01/2026 19:10

Women's rights defender Kellie-Jay Keen is defending Elon Musk's AI porn generator now. Here she is saying "Hmmm, ban X but not Pornhub." The replies include another piece of AI revenge porn targeting the British Ofcom all-female internet team (the third I've seen so far; all are being reported to the police and Ofcom).

It is interesting how a lot of the notable people known for being gender critical are either defending EM or saying nothing at all on the topic, while people who are gay/trans are calling EM out for allowing pedos and perverts to create illegal images and flood Twitter with them, even now.

Anyway, shall we address Ms Keen's statement?

-Pornhub is compliant with the Online Safety Act and requires age verification to access it.

-The Online Safety Act aside, PH has required its own identity verification for anyone to upload anything for many years now.

-There are strict rules on what can be uploaded: ch*ld porn is not allowed and people depicted need to be adults and consenting. Nothing uploaded appears automatically; it is checked by a human moderator first.

-The major porn sites are introducing further requirements under which uploaders will have to give the website the names of everyone in the uploaded material and provide proof they have given consent.

Now, how does this compare with Elon Musk's Twitter where anyone who pays for a blue tick can create anything they want or manipulate any image of a woman or child which automatically appears publicly?

RapidOnsetGenderCritic · 10/01/2026 20:02

Christinapple · 10/01/2026 19:10

Women's rights defender Kelly Jay Keen is defending Elon Musk's AI porn generator now. Here she is saying "Hmmm, ban X but not pornhub.". The replies include another AI revenge porn of the British Ofcom all female internet team (third I've seen so far, all are being reported to the police and Ofcom).

You seem to know a lot about Pornhub.

BrokenSunflowers · 10/01/2026 20:49

Also a reminder that 40% of child grooming crimes took place using Snapchat. Where is the call to ban that?

https://www.nspcc.org.uk/about-us/news-opinion/2025/data-shows-how-criminals-are-using-private-messaging-platforms-to-manipulate-and-groom-children/

persephonia · 10/01/2026 21:57

mids2019 · 03/01/2026 08:34

@SexRealist

I agree. I have noted similar attitudes with my daughters, which seems to be a really good thing. It is sad to see women and girls so defensive about photographs in general, with others (even friends and family) effectively having to win their trust before a photo is taken.

It seems to me that using a camera phone at a public event is going to become a social no-no without very good reason. If we can't stamp down on the tech then maybe changing social attitudes about the use of phones is the only sensible way forward.

The problem is that some women need to be in the public eye and have images of themselves in the public realm: politicians, news reporters and so on. I would hate for women to be put off going into politics because they know that if they do, pornography WILL be made of them and potentially shared with everyone they know. It has already happened to an Irish politician, and it will drive women out of politics and dissuade them from raising their heads above the parapet in life.
On a much more shallow level, there are female makeup artists on YouTube, and if they all stopped showing their faces it would affect my ability to do a smokey eye on maturing skin. I also wouldn't want grieving families to feel they can't provide a photograph of a dead loved one to the media because it will be used to make porn.
Otherwise women just completely vanish from view unless they are very brave/strong-willed, and we will lose a lot of gains.

persephonia · 10/01/2026 22:03

PollyNomial · 10/01/2026 12:12

I'm not ok with these images being created or shared at all!


Agree.
If someone consents to another person editing their photo to show them naked or in a bikini, then surely it would be much simpler just to take that picture, or to let someone else take it, in the first place. The most likely explanation for someone wanting to edit a clothed picture into an unclothed one is that the subject of the picture doesn't know it's being done.
Sharing nudes of yourself is unwise for other reasons too, but that's a choice people can make. AI removes the choice.

Christinapple · 10/01/2026 22:12

I've seen more revenge porn images tonight on Twitter: a photo of female Ofcom staff members has now gone viral and Twitter users have used Grok to make sexually abusive manipulations of it. These have been reported to Ofcom (which won't help Elon Musk's case, since Ofcom is the body that decides whether Twitter is to be blocked in the UK), the police and the media. I see some homophobic comments being made towards them too.

The AI goes beyond simply "put person in a bikini": it can also change body size and shape, increase the size of breasts (while keeping the bikinis small so it looks like the breasts are spilling out of them), put Nazi uniforms on people and recreate pornographic memes by having large men stand behind scantily clad women.

I've yet to see any of the notable great women's defenders in Britain, such as Glinner, Kellie-Jay Keen or JK Rowling, call out any of this sexual abuse of women and children on Twitter.

RapidOnsetGenderCritic · 10/01/2026 22:16

Christinapple · 10/01/2026 22:12

I've seen more revenge porn images tonight on Twitter, a photo of female Ofcom staff members has now gone viral and Twitter users have used Grok to make sexually abusive manipulations.

https://x.com/theposieparker/status/2010085527207150025?s=61

Kellie-Jay Keen (@ThePosieParker) on X

@elonmusk Yeah, you should recognise that some of Groks features are fucking gross and that porn on this site is also fucking gross. Porn is harmful and part of the reason teenager boys start fantasising about being women. Having it on here normalises...


BrokenSunflowers · 10/01/2026 22:32

I've seen more revenge porn images tonight on Twitter

I’ve never seen porn on Twitter, so perhaps that reflects our respective algorithms…

I am all for banning all forms of porn and sex abuse imagery from all sites - including shutting down Pornhub and OnlyFans. And also going after Snapchat and Meta, whose platforms are regularly used for child grooming.

Christinapple · 10/01/2026 22:49

BrokenSunflowers · 10/01/2026 22:32

I've seen more revenge porn images tonight on Twitter

I’ve never seen porn on Twitter, so perhaps that reflects our respective algorithms…


I've been reading replies to tweets by Ofcom and replies to retweets from Kelly Jay Keen and Graham Linehan who both posted a photo of a female Ofcom team.

Christinapple · 12/01/2026 11:37

https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/ofcom-launches-investigation-into-x-over-grok-sexualised-imagery

https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/investigation-into-x-internet-unlimited-company-and-its-compliance-with-duties-to-protect-its-users-from-illegal-content-and-child-users-from-harmful-content

"The UK’s independent online safety watchdog, Ofcom, has today opened a formal investigation into X under the UK’s Online Safety Act, to determine whether it has complied with its duties to protect people in the UK from content that is illegal in the UK."

Ofcom has decided to open a formal investigation to establish whether X has failed to comply with its legal obligations under the Online Safety Act – in particular, to:

  • assess the risk of people in the UK seeing content that is illegal in the UK, and to carry out an updated risk assessment before making any significant changes to their service;
  • take appropriate steps to prevent people in the UK from seeing ‘priority’ illegal content – including non-consensual intimate images and CSAM [chld sex abuse material aka chld porn];[3]
  • take down illegal content swiftly when they become aware of it;
  • have regard to protecting users from a breach of privacy laws;
  • assess the risk their service poses to UK children, and to carry out an updated risk assessment before making any significant changes to their service; and
  • use highly effective age assurance to protect UK children from seeing pornography.[4]"

I have made sure Ofcom are aware there are dozens of non-consensual pornographic images of their female staff being created and shared on Twitter- so now the matter is also personal for them. The IWF have also confirmed CSAM has been created by Grok and can be seen on Twitter (and is also being shared by pedos on dark web forums).

The UK Gov have said if Ofcom want to ban Twitter in the UK they will fully support this decision. Indonesia and Malaysia have already blocked Twitter over CSAM concerns and Canada and Australia aren't ruling out a block.

It's long overdue, but it finally looks as if Musk is going to be dealt with. Twitter is already in breach of the Online Safety Act due to inadequate moderation of extremist material on the website, which the UK Gov and Ofcom have been turning a blind eye to until now; the only reason this investigation is happening, IMO, is that CSAM and revenge porn being created by the thousand per minute is something that can't be ignored.


BrokenSunflowers · 12/01/2026 12:15

Christinapple · 12/01/2026 11:37

"The UK’s independent online safety watchdog, Ofcom, has today opened a formal investigation into X under the UK’s Online Safety Act, to determine whether it has complied with its duties to protect people in the UK from content that is illegal in the UK."

Why just X and not other AI providers? A rhetorical question, we know why….

BrokenSunflowers · 12/01/2026 12:19

I have made sure Ofcom are aware there are dozens of non-consensual pornographic images of their female staff being created and shared on Twitter- so now the matter is also personal for them.

I wonder what the motivation for creating those might be? 🤔. I wonder if Ofcom are actually idiotic enough not to see it for what it is - an attack on X using Ofcom as a proxy.

Not sure people could be more transparent (though I am sure Google AI and ChatGPT could do at least as good a job as Grok in making them so).