Am I just really naive?

(9 Posts)
Dixiestamp Mon 05-Jun-17 12:02:58

Sorry if this has been asked already, but if YouTube is full of videos that are radicalising and encouraging extremism, why are they not removed? Are they monitored carefully and regularly? I don't get it- surely YouTube has the power to do this! Sorry if the answer to this is really obvious, but I really don't understand it.

Gileswithachainsaw Mon 05-Jun-17 12:06:00

I think there is a clause that states websites are not liable for anything posted by a third party.

However, I do agree there should be some way round this, because there are so many things that just shouldn't be left for people to access.

ScarletForYa Mon 05-Jun-17 12:06:21

I'm not sure if YouTube is moderated...?

Gileswithachainsaw Mon 05-Jun-17 12:06:32

Section 230 or something.

ZacharyQuack Mon 05-Jun-17 12:09:25

I don't think YouTube hires people to view every video before it's added to the site, just like MN doesn't pre-moderate every post.

You can report videos to YouTube, which I presume means that inappropriate content gets taken down.

MrsHathaway Mon 05-Jun-17 12:12:39

YouTube has over a billion users–almost a third of all people on the Internet–and every day, people watch hundreds of millions of hours of YouTube videos and generate billions of views.

Around 300,000 videos are uploaded every day. How big a team would they have to employ to watch every one? I think it's a volume issue.

There are a lot of deleted videos so they must react to reports. But how would you come across a dodgy video to report it? Do you see what I mean? I have never stumbled across anything like that, myself, so I wouldn't be in a position to report it.

Dixiestamp Mon 05-Jun-17 12:21:41

I see what you mean, MrsHathaway. I guess that only people who went looking for that sort of thing would see it, so no one else would see it to report it. I really think something has to change, but judging from the sheer scale of it described in other posts, I don't know what.

Gileswithachainsaw Mon 05-Jun-17 12:24:55

I don't think you necessarily have to be searching for it. I mean, searches pick up keywords in the title of the video, so yes, in some cases you would have to search for "how to be an X", but if they titled the video after, say, a song track, it would come up when you searched for that.

chipscheeseandgravy Mon 05-Jun-17 13:31:13

It's not just places like YouTube. I read an article (it's on the BBC website) about WhatsApp and similar platforms that work on sharing content amongst users. They start the content under a generic group name, build a fan base and then change it to something that can be identified as ISIS by its members. You then have thousands of people sharing this content, building up more awareness of these additional sites. As quickly as these sites are shut down, they can put another back up in seconds. Although both of these platforms do shut the content down, they will just reload it under a different user.
The same goes for Twitter. It's a needle-in-a-haystack sort of scenario.
Think about how much content is loaded onto these sites on a daily basis. No one sits and checks the content before approving it. It's usually only withdrawn if people complain about it.
The official ISIS news sites have also now started broadcasting in English, making it even more readily available to people who previously wouldn't have been able to understand the content.
