Pornhub removes the majority of its content due to concerns about rape and underage

(79 Posts)
CoffeeTeaChocolate Tue 15-Dec-20 18:43:08

edition.cnn.com/2020/12/15/business/pornhub-videos-removed/index.html

Has this been posted yet?

I can hardly believe it is true. So relieved!

OvaHere Tue 15-Dec-20 18:50:24

Great news. I'm no fan of studio made porn but it's preferable to random users uploading videos that can't be verified and contain a high proportion of abuse and underage content.

10 million removed videos is a lot!

BlackForestCake Tue 15-Dec-20 18:51:18

Can't be the majority of it, surely?

Sexnotgender Tue 15-Dec-20 18:52:11

10 million... fucking hell. That’s a lot of abuse 🙁

testing987654321 Tue 15-Dec-20 18:52:58

*The changes took the number of videos on the website from 13.5 million videos down to a little under 3 million*

About 75% taken down.

fantasmasgoria1 Tue 15-Dec-20 18:55:15

I'm glad. I think it should just be shut down.

NiceGerbil Tue 15-Dec-20 18:58:52

To be clear that doesn't mean 10 million had illegal content.

They have removed everything that wasn't uploaded by verified performers.

They say that they are very moral and wonderful for doing this, but they only acted after a newspaper report revealed films of unconscious women and girls being attacked, and underage content generally, which led to credit card companies blacklisting them. They were hit in the pocket.

They also continue to deny there was ever any illegal content. Even though everyone knows there was. And they've taken this very broad action. Which would be a big thing to do if they knew there was nothing iffy there.

So still bastards. But it's still good news.

NiceGerbil Tue 15-Dec-20 18:59:31

The article linked is not a long read and is very clear if anyone is interested in more detail.

PronounssheRa Tue 15-Dec-20 19:00:58

Mastercard, Visa etc blocked their cards being used on the site; I suspect this had a large part to play. God knows Pornhub weren't acting when asked to by the victims of filmed abuse.

www.theguardian.com/us-news/2020/dec/10/pornhub-mastercard-visa-rape-child-abuse-images

ErrolTheDragon Tue 15-Dec-20 19:01:10

Their 'verification' process doesn't sound exactly rigorous ... upload a picture of themselves and their username. I suppose better than nothing though.

zzizz Tue 15-Dec-20 19:02:13

Sounds like a huge step in the right direction.

zzizz Tue 15-Dec-20 19:03:44

I'm quite pleased with the thought that a bunch of creepy entitled men will be scouring the web for their favourite videos without as much luck tonight.

littlbrowndog Tue 15-Dec-20 19:05:45

💪💪💪

Hurray for this

Sexnotgender Tue 15-Dec-20 19:05:48

ErrolTheDragon

Their 'verification' process doesn't sound exactly rigorous ... upload a picture of themselves and their username. I suppose better than nothing though.

I thought the same.

talesofginza Tue 15-Dec-20 19:09:24

A positive result for the NYT article, but shameful that it took them so long. Let's wait to see if they follow suit on their other platforms. These tech companies have access to all the latest artificial intelligence technology. It would cost PH nothing to create algorithms which could recognise with quite high accuracy - underage people (or people who users would perceive to be underage, even if the 'performer' is not), abusive content, content already identified as needing to come down (thinking specifically of that poor young woman whose underage revenge porn videos kept getting uploaded over and over again), etc etc.

LumpySpacedPrincess Tue 15-Dec-20 19:11:08

They knew exactly what they were doing for years and years. Happy to exploit and demean women and girls.

ErrolTheDragon Tue 15-Dec-20 19:15:14

talesofginza

A positive result for the NYT article, but shameful that it took them so long. Let's wait to see if they follow suit on their other platforms. These tech companies have access to all the latest artificial intelligence technology. It would cost PH nothing to create algorithms which could recognise with quite high accuracy - underage people (or people who users would perceive to be underage, even if the 'performer' is not), abusive content, content already identified as needing to come down (thinking specifically of that poor young woman whose underage revenge porn videos kept getting uploaded over and over again), etc etc.


It wouldn't cost nothing, but it seems like a case where it wouldn't matter too much if they got quite a lot of 'false positives' and refused a proportion of material which didn't break their rules.

BertieBotts Tue 15-Dec-20 19:15:42

talesofginza

A positive result for the NYT article, but shameful that it took them so long. Let's wait to see if they follow suit on their other platforms. These tech companies have access to all the latest artificial intelligence technology. It would cost PH nothing to create algorithms which could recognise with quite high accuracy - underage people (or people who users would perceive to be underage, even if the 'performer' is not), abusive content, content already identified as needing to come down (thinking specifically of that poor young woman whose underage revenge porn videos kept getting uploaded over and over again), etc etc.

Actually I think this is almost impossible to do currently. AI / image recognition is nowhere near that good. It can barely discern whether a person is in an image, let alone the age of that person or what's happening to them.

You need humans to regulate content in order to discern abuse or age concerns.

BertieBotts Tue 15-Dec-20 19:17:47

Recognising already banned content is easy to do, and they probably use this already, but unfortunately it's really easy to get around simply by slightly changing the speed of the video. If you've ever seen an episode of a kids' TV programme on YouTube where the characters talk in slightly high-pitched voices, this is what has happened: the uploader is avoiding copyright detection.

everythingcrossed Tue 15-Dec-20 19:21:44

I wonder if this New York Times article describing how exploited and trafficked children end up on Pornhub has anything to do with the decision? It was published less than two weeks ago.

everythingcrossed Tue 15-Dec-20 19:22:24

Oops, I see the NYT piece has already been referenced.

testing987654321 Tue 15-Dec-20 19:23:44

You need humans to regulate content in order to discern abuse or age concerns.

We need humans to not enjoy watching others having sex. It's degrading for everyone involved. It's like prostitution, I don't believe it can ever be a job that doesn't involve abuse.

VictoriasCousin Tue 15-Dec-20 19:36:09

Can porn ever be ok?

OhHolyJesus Tue 15-Dec-20 19:36:39

*The Canadian-based website jabbed other social media websites, writing that "every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat and Twitter have yet to institute."
"In today's world, all social media platforms share the responsibility to combat illegal material," Pornhub stated. "Solutions must be driven by real facts and real experts. We hope we have demonstrated our dedication to leading by example."*

Lead by example? You've got to be kidding me. Comparing a porn site to YouTube? Did Pornhub ignore a teenage girl when she begged them to remove the video of her rape? What are the top 10 categories of videos they host? How many girls have gone missing and turned up on their platform (one found by her own brother)?

It's the right direction but PH can get down from their high horse on the lead by example point. Disgusting. How dare they.

BloggersBlog Tue 15-Dec-20 19:42:47

testing987654321

*You need humans to regulate content in order to discern abuse or age concerns.*

We need humans to not enjoy watching others having sex. It's degrading for everyone involved. It's like prostitution, I don't believe it can ever be a job that doesn't involve abuse.

Well put. As a society we are warping kids' minds at a young age. They will never recover.

And that isn't even the start of how the tragic abuse victims have no chance of ever fully recovering.

This whole "industry" is the basis for millions of lives being ruined: either the abused, those watching while too young, or even those adults in marriages ruined by it.
