
Feminism: Sex and gender discussions

BBC: Fake naked photos of thousands of women shared online

24 replies

WootMoggie · 20/10/2020 18:16

I had a bit of fun the other day, pasting some friends' faces into famous movie scenes using an app on the iPhone.

I figured the results were pretty extraordinary and showed my wife, who said immediately - "You do know this will be used to target women and children, don't you?". Well - it had crossed my mind, but I must confess it wasn't "front and center" - I was more amazed at how well it could map my friends' faces into any kind of scene lighting, along with facial expressions, in a very convincing way.

That was a few days ago, and sure enough - this has today become a lead story on the BBC news (see link below).

It's (as usual) pretty lazy reporting on the part of the BBC to say that they had poor results. THAT app may be poor, but the one I was using did a remarkable job using some built-in famous movie scenes. With a paid subscription you can apparently use your own footage (I've not tested this).

Anyway, I mention this here because most people are probably not aware that (notwithstanding the app the BBC tried) the quality of this is now way past "silly joke" territory. We're now in the realm of very convincing footage that could 100% be used to ruin relationships, facilitate blackmail, or bring mass humiliation upon an unwitting victim. It also only takes seconds to do, and works off one single photo of the victim's face. It's easily within the bounds of pushing (e.g.) teenage girls into taking their own lives.

The big question is: what can be done about it?

I don't think this is one of those "let the market decide" issues. I think this technology is advancing by the minute right now (it's A.I. - so it constantly learns how to improve), the best apps are extraordinary, and I now believe that what is needed urgently is legislation.

It wouldn't be popular, but I think the only option is to put in place legislation that makes it a serious offence to make a fake video of anyone without their consent. No doubt that's probably going to require a bunch of vague exceptions for "legitimate comedy or satirical purposes", but I can't really see any other option.

Interested to hear from the legal minds (and others) on FWR around this or other options; because this conversation is already happening far too late for many, many women.

www.bbc.co.uk/news/technology-54584127

midgebabe · 20/10/2020 18:23

Yip, saw that. Did I read correctly, those responsible said something like ...get over it, it's a bit of fun not war and famine ?

yourhairiswinterfire · 20/10/2020 18:24

way past "silly joke" territory. We're now in the realm of very convincing footage that could 100% be used to ruin relationships, facilitate blackmail, or bring mass humiliation upon an unwitting victim.

Yes, this is really concerning. I saw an app on the news maybe a year ago, where you could put in a picture of any woman and the app would undress her to be completely naked, and it apparently looked very realistic. So anyone could steal a picture of a woman from Facebook or Twitter to use.

And funnily enough, this app didn't have the option to do this to men. Hmm.

WootMoggie · 20/10/2020 18:32

Well, “they would, wouldn’t they?”

About three years ago Adobe (of photoshop fame) demonstrated a product that once you fed it a twenty minute sound recording of someone’s speech, you could get it to read out anything you typed into a text box using that person’s voice.

They gave a number of benign use cases (swearing replacement/deletion for movies etc) and claimed they would watermark the output so it could be identified as fake, but the whole demo felt a bit disturbing.

At the time, they dubbed it “photoshop for voice”. Now, they say it was “just an experiment” and act like it never existed. It was very odd.

Obviously, combining this technology with the face mapping is both powerful and dangerous. What can be done at the outer fringes of use cases is unimaginable to us right now.

ArabellaScott · 20/10/2020 19:14

The deepfake stuff is terrifying. To be honest, what worries me more than anything is governments using it - when you consider surveillance, and places like Belarus, China, etc., and how human rights are always in such a fragile state.

I don't know much about tech, but I imagine there will be tech created to counter it, surely? Ways to test and ascertain veracity?

I don't know what the solution is.

Imnobody4 · 20/10/2020 19:29

Totally agree. Technology is increasingly an ethics-free zone at the same time as they're increasing their power to destroy people's mental health and threaten democracy and human rights, and it's all just for a laugh.

CaraDuneRedux · 20/10/2020 19:29

(1) WTAF is wrong with men? Just stop it, fgs, stop it you fucking perves.

(2) totally agree with Arabella - the potential for governments to use this to create propaganda is terrifying.

WootMoggie · 20/10/2020 19:33

I think any incidental mathematical signature on the image could be eradicated by further smoothing or other processing - this is why I think the law is the only suitable tool.

QuentinWinters · 20/10/2020 19:33

Given the total lack of interest in anything to do with protecting females, I doubt much will happen.
The law hasn't caught up with smartphones yet ffs (I'm referring to rapes not being prosecuted now the police have to wade through tens of thousands of messages to declare there is nothing relevant to the defence).
Or how long it took to introduce stalking laws. Or make upskirting illegal.
Sadly I think nothing will happen until deep fakes are used to do political soundbites.

Goosefoot · 20/10/2020 19:39

I'm not so much worried about people using this technique a lot to make fake scenarios or fake images of naked people. Once it's known it can be done, the problem may be that no one will believe the real ones are real.

RHTawneyonabus · 20/10/2020 19:42

I imagine you’d have to create several offences against the creation and distribution of footage that others would reasonably believe to be genuine. You could also add that the intention must be to cause harm / reputation damage to that person but that might be tricky as I could foresee problems with deep faking people doing positive things. Satirists would have to be very cautious that their fakes would be very obviously not really the real person.

There are numerous issues with this though how would you enforce it against content created outside the country? How would you enforce it against those who shared it believing it to be genuine?

highame · 20/10/2020 19:43

Ah but Quentin if a famous daughter is used, then something will very quickly happen and you can guarantee there'll be some arse out there, very happy to have a crack

QuentinWinters · 20/10/2020 19:44

Maybe the answer is to make it illegal to share naked pics full stop. Make porn pay to view and regulate it. That would remove a lot of the "is it real or fake? Consensual or not?" stuff.

MondayYogurt · 20/10/2020 19:59

Every single time one of those apps trends and everyone makes themselves look old or fat or the opposite sex or whatever - you are teaching the AI to get better.
Every photo you upload to almost every app - read the Ts and Cs - they can (and do) sell it to companies who use it for these things.
When the app is free, YOU are the product.

PopperUppleton · 20/10/2020 20:02

@highame

Ah but Quentin if a famous daughter is used, then something will very quickly happen and you can guarantee there'll be some arse out there, very happy to have a crack

Most arses have cracks, don't they? Grin

WootMoggie · 20/10/2020 20:04

Thinking on this further, it wouldn't be effective just to make the creation of the images illegal (proxy server uploads make that untraceable) - you would need an additional offence of "possession" for those viewing it.

Gingerkittykat · 20/10/2020 20:41

It's been going on in various forms for a very long time.

My stepsister was a model and her head was placed on porn images 20 years ago. What is especially worrying is that it is now available to a lot more users.

Coyoacan · 20/10/2020 20:42

I agree with Goosefoot. Once everyone knows that this is possible, nobody sane will believe any footage of that type.

ArabellaScott · 20/10/2020 20:46

Maybe the answer is to make it illegal to share naked pics full stop

So I won't be able to do life drawing, then?

TicTacTwo · 20/10/2020 20:52

I saw a demo of what looked like Obama reading words that Trump actually said. This technology could cause wars.

I believe that there's lots of deep fake porn of famous women online

lucylucky1977 · 20/10/2020 21:06

There’s already loads of Deepfake prom of famous women both alive or dead who have had nothing to do with that industry in their lives.
It will 100% be used to blackmail and bully innocent girls and women.
And men won’t stand up and do a thing about it.

lucylucky1977 · 20/10/2020 21:07

*porn not prom

PurpleHoodie · 21/10/2020 14:32

The TV drama "The Capture" perfectly encapsulates this dark, dark issue.

Available on iPlayer.

Ereshkigalangcleg · 21/10/2020 14:35

Yes I was thinking of that series too, Purple

MyOwnSummer · 21/10/2020 20:40

I couldn't agree more, this is fucking terrifying. I can easily see a vulnerable girl or woman doing herself serious harm over some realistic faked video.

It is about time the law got out in front of these issues. Is it beyond the wit of politicians to devise a set of rules?

The exemptions would need to put the burden of proof on the creator / distributor. In cases of genuine satire, etc, you should need to be able to demonstrate that this was your primary goal. Otherwise the creeps and the bullies will hide behind "it was only a joke!" as they always do.
