
AIBU?


Am I just really naive?

8 replies

Dixiestamp · 05/06/2017 12:02

Sorry if this has been asked already, but if YouTube is full of videos that are radicalising and encouraging extremism, why are they not removed? Are they monitored carefully and regularly? I don't get it- surely YouTube has the power to do this! Sorry if the answer to this is really obvious, but I really don't understand it.

OP posts:
Gileswithachainsaw · 05/06/2017 12:06

I think there is a clause that states websites are not liable for anything posted by a third party.

However, I do agree there should be some way round this, because there are so many things that just shouldn't be left for people to access.

ScarletForYa · 05/06/2017 12:06

I'm not sure if YouTube is moderated....?

Gileswithachainsaw · 05/06/2017 12:06

230 or something.

ZacharyQuack · 05/06/2017 12:09

I don't think YouTube hires people to view every video before it is added to the site, just like MN doesn't pre-moderate every post.

You can report videos to YouTube, which I presume would mean that inappropriate content is taken down.

MrsHathaway · 05/06/2017 12:12

YouTube has over a billion users (almost a third of all people on the Internet), and every day people watch hundreds of millions of hours of YouTube videos and generate billions of views.

Around 300,000 videos are uploaded every day. How big a team would they have to employ to watch every one? I think it's a volume issue.

There are a lot of deleted videos so they must react to reports. But how would you come across a dodgy video to report it? Do you see what I mean? I have never stumbled across anything like that, myself, so I wouldn't be in a position to report it.
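[Editor's note: MrsHathaway's volume point can be made concrete with a rough back-of-envelope estimate. The 300,000 videos/day figure is from the post above; the average video length and reviewer workday below are illustrative assumptions, not YouTube figures.]

```python
# Rough estimate of how many full-time reviewers it would take to
# pre-watch every upload in real time.

VIDEOS_PER_DAY = 300_000        # figure quoted in the post above
AVG_MINUTES_PER_VIDEO = 10      # assumption, for illustration only
REVIEWER_HOURS_PER_DAY = 8      # assumption: one full-time shift

# Total hours of new video arriving each day
total_hours = VIDEOS_PER_DAY * AVG_MINUTES_PER_VIDEO / 60

# Reviewers needed if each one watches footage for a full shift
reviewers_needed = total_hours / REVIEWER_HOURS_PER_DAY

print(f"{total_hours:,.0f} hours of new video per day")
print(f"~{reviewers_needed:,.0f} reviewers watching in real time")
```

Even with these conservative assumptions the headcount runs into the thousands, which is why platforms rely on after-the-fact reports rather than pre-moderation.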

Dixiestamp · 05/06/2017 12:21

I see what you mean, MrsHathaway- I guess that only people who went looking for that sort of thing would see it, so no one else would see it to report it. I really think that something has to change- but I don't know what, given the sheer scale of it described in other posts.

OP posts:
Gileswithachainsaw · 05/06/2017 12:24

I don't think you necessarily have to be searching for it. I mean, searches pick up key words in the title of the video, so yes, in some cases you would have to search for "how to be an X", but if they titled the video after, say, a song track, it would come up when you searched for that.

chipscheeseandgravy · 05/06/2017 13:31

It's not just places like YouTube. I read an article (it's on the BBC website) about WhatsApp and similar platforms that work on sharing content amongst users. They start the content under a generic group name, build a fan base and then change it to something that can be identified as ISIS to its members. You then have thousands of people sharing this content, building up more awareness of these additional sites. As quickly as these sites are shut down, they can put another back up in seconds. Although both of these platforms do take the content down, the uploaders will just reload it under a different user.
The same goes for Twitter. It's a needle in a haystack sort of scenario.
Think about how much content is loaded onto these sites on a daily basis. No one sits and checks the content before approving it. It's usually only withdrawn if people complain about it.
The official ISIS news sites have also now started broadcasting in English, therefore making it even more readily available to people who previously wouldn't have been able to understand the content.
