
Feminism: Sex & gender discussions

Pornhub removes the majority of its content due to concerns about rape and underage content

78 replies

CoffeeTeaChocolate · 15/12/2020 18:43

edition.cnn.com/2020/12/15/business/pornhub-videos-removed/index.html

Has this been posted yet?

I can hardly believe it is true. So relieved!

OvaHere · 15/12/2020 18:50

Great news. I'm no fan of studio made porn but it's preferable to random users uploading videos that can't be verified and contain a high proportion of abuse and underage content.

10 million removed videos is a lot!

BlackForestCake · 15/12/2020 18:51

Can't be the majority of it, surely?

Sexnotgender · 15/12/2020 18:52

10 million... fucking hell. That’s a lot of abuse 🙁

testing987654321 · 15/12/2020 18:52

The changes took the number of videos on the website from 13.5 million videos down to a little under 3 million

About 75% taken down.

fantasmasgoria1 · 15/12/2020 18:55

I'm glad. I think it should just be shut down.

NiceGerbil · 15/12/2020 18:58

To be clear that doesn't mean 10 million had illegal content.

They have removed everything that wasn't uploaded by verified performers.

They say that they are very moral and wonderful for doing this, which they only did after a newspaper report saying there were films of unconscious women and girls being attacked, and also underage content generally, which led to credit card companies blacklisting them. They were hit in the pocket.

They also continue to deny there was ever any illegal content. Even though everyone knows there was. And they've taken this very broad action. Which would be a big thing to do if they knew there was nothing iffy there.

So still bastards. But it's still good news.

NiceGerbil · 15/12/2020 18:59

The article linked is not a long read and is very clear if anyone is interested in more detail.

PronounssheRa · 15/12/2020 19:00

Mastercard, Visa etc. blocked their cards from being used on the site. I suspect this had a large part to play; god knows Pornhub weren't acting when asked to by the victims of filmed abuse.

www.theguardian.com/us-news/2020/dec/10/pornhub-mastercard-visa-rape-child-abuse-images

ErrolTheDragon · 15/12/2020 19:01

Their 'verification' process doesn't sound exactly rigorous ... upload a picture of themselves and their username. I suppose better than nothing though.

zzizz · 15/12/2020 19:02

Sounds like a huge step in the right direction.

zzizz · 15/12/2020 19:03

I'm quite pleased with the thought that a bunch of creepy entitled men will be scouring the web for their favourite videos without as much luck tonight.

littlbrowndog · 15/12/2020 19:05

💪💪💪

Hurray for this

Sexnotgender · 15/12/2020 19:05

@ErrolTheDragon

Their 'verification' process doesn't sound exactly rigorous ... upload a picture of themselves and their username. I suppose better than nothing though.

I thought the same.
talesofginza · 15/12/2020 19:09

A positive result for the NYT article, but shameful that it took them so long. Let's wait to see if they follow suit on their other platforms. These tech companies have access to all the latest artificial intelligence technology. It would cost PH nothing to create algorithms which could recognise with quite high accuracy - underage people (or people who users would perceive to be underage, even if the 'performer' is not), abusive content, content already identified as needing to come down (thinking specifically of that poor young woman whose underage revenge porn videos kept getting uploaded over and over again), etc etc.

LumpySpacedPrincess · 15/12/2020 19:11

They knew exactly what they were doing for years and years. Happy to exploit and demean women and girls.

ErrolTheDragon · 15/12/2020 19:15

@talesofginza

A positive result for the NYT article, but shameful that it took them so long. Let's wait to see if they follow suit on their other platforms. These tech companies have access to all the latest artificial intelligence technology. It would cost PH nothing to create algorithms which could recognise with quite high accuracy - underage people (or people who users would perceive to be underage, even if the 'performer' is not), abusive content, content already identified as needing to come down (thinking specifically of that poor young woman whose underage revenge porn videos kept getting uploaded over and over again), etc etc.

It wouldn't cost nothing, but it seems like a case where it wouldn't matter too much if they got quite a lot of 'false positives' and refused a proportion of material which didn't break their rules.
BertieBotts · 15/12/2020 19:15

@talesofginza

A positive result for the NYT article, but shameful that it took them so long. Let's wait to see if they follow suit on their other platforms. These tech companies have access to all the latest artificial intelligence technology. It would cost PH nothing to create algorithms which could recognise with quite high accuracy - underage people (or people who users would perceive to be underage, even if the 'performer' is not), abusive content, content already identified as needing to come down (thinking specifically of that poor young woman whose underage revenge porn videos kept getting uploaded over and over again), etc etc.

Actually I think this is almost impossible to do currently. AI / image recognition is nowhere near that good. It can barely discern whether a person is in an image, let alone the age of that person or what's happening to them.

You need humans to regulate content in order to discern abuse or age concerns.
BertieBotts · 15/12/2020 19:17

Recognising already banned content is easy to do, and they probably use this already, but unfortunately it's really easy to get around simply by slightly changing the speed of the video. If you've ever seen an episode of a kids' TV programme on YouTube where the characters talk in slightly high-pitched voices, this is what has happened: the uploader is avoiding copyright detection.

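[Editor's note: a minimal, hypothetical Python sketch of the point above. It uses made-up byte strings as stand-ins for video files, and shows why a blocklist based on exact cryptographic hashes cannot match a copy that has been altered even slightly, e.g. by changing the playback speed. Real systems use perceptual fingerprints precisely because of this.]

```python
import hashlib

# Hypothetical stand-ins for video files: the second is the first with a
# tiny alteration, as an uploader might make by nudging the playback speed.
original = b"frame-data " * 1000
altered = original + b"x"  # even a one-byte change...

h_original = hashlib.sha256(original).hexdigest()
h_altered = hashlib.sha256(altered).hexdigest()

# ...produces a completely different digest, so an exact-hash blocklist
# of known banned videos would not match the altered copy at all.
print(h_original == h_altered)  # prints: False
```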
everythingcrossed · 15/12/2020 19:21

I wonder if this New York Times article describing how exploited and trafficked children end up on Pornhub has anything to do with the decision? It was published less than two weeks ago.

everythingcrossed · 15/12/2020 19:22

Oops, I see the NYT piece has already been referenced.

testing987654321 · 15/12/2020 19:23

You need humans to regulate content in order to discern abuse or age concerns.

We need humans to not enjoy watching others having sex. It's degrading for everyone involved. It's like prostitution: I don't believe it can ever be a job that doesn't involve abuse.

VictoriasCousin · 15/12/2020 19:36

Can porn ever be ok?


OhHolyJesus · 15/12/2020 19:36

The Canadian-based website jabbed other social media websites, writing that "every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat and Twitter have yet to institute."
"In today's world, all social media platforms share the responsibility to combat illegal material," Pornhub stated. "Solutions must be driven by real facts and real experts. We hope we have demonstrated our dedication to leading by example."


Lead by example? You've got to be kidding me. Comparing a porn site to YouTube? Didn't Pornhub ignore a teenage girl when she begged them to remove the video of her rape? What are the top 10 categories of videos they host? How many girls have gone missing and turned up on their platform (one found by her own brother)?

It's a step in the right direction, but PH can get down from their high horse on the 'lead by example' point. Disgusting. How dare they.

BloggersBlog · 15/12/2020 19:42

@testing987654321

You need humans to regulate content in order to discern abuse or age concerns.

We need humans to not enjoy watching others having sex. It's degrading for everyone involved. It's like prostitution, I don't believe it can ever be a job that doesn't involve abuse.

Well put. As a society we are warping kids' minds at a young age. They will never recover.

And that isn't even the start of how the tragic abuse victims have no chance of ever fully recovering.

This whole "industry" is the basis for millions of lives being ruined: either the abused, those watching while too young, or even those adults whose marriages are ruined by it.
laudemio · 15/12/2020 19:51

I was very pleased to see this. The pressure must be kept on, though, as they will allow general users to upload their own content again with identity checks. There is potential for fraud. Also, how they ascertain consent when many of the women are trafficked, coerced, groomed or otherwise abused, I don't know.
