If the police and ISPs can remove images of terrorism, why not child abuse? (49 Posts)
I heard on the radio this morning that a police task force working with the ISPs is removing around 1,000 images which promote terrorism EVERY WEEK from the web. The aim is to prevent young people from seeing them and being encouraged by them.
If they can do this, why can't they do the same with images of child abuse? Seems like a good idea to me. I will try and find a link...
The stuff they're removing is on the open web. Child abuse images are shared on the dark web, which is designed to evade the police and security services.
GCHQ/NSA and related services are the only ones with the computing power and coverage to really try and curtail the sharing of the images, but every time they try and assert control over the dark web a 'whistleblower' conveniently triggers some civil liberties debate and they have to back off. Very, very convenient.
The Dark Web is also used to evade the secret police in totalitarian countries. It's like the camera and the printing press and I suppose the quill pen: a dual-use technology. Very difficult to suppress the bad uses without also shutting down the good ones.
So, why are the pro-terrorism organisations using the 'normal' web then? Wouldn't it make more sense for them to go to the dark side too? Presumably those interested could find a way in to whatever they are looking for?
I take both your points, about GCHQ/NSA and the fact that the dark web has a potential good side eg in totalitarian countries - a difficult circle to square.
Because terrorists want the general public to see them. Spreading these images and videos in the open, to incite fear or hate or to intimidate and distress people, is their aim.
I think the material they were talking about was more of a recruitment campaign for would-be jihadists. Not inclined to google it myself!
I think they probably do do the same with child abuse, the problem is there is always more.
I'm also not inclined to google, but I would imagine that the stream of uploaded images is faster than the stream of images being taken down. Once an image is shared it's duplicated; they can't delete every copy on the computer of everyone who has seen or downloaded it, so even deleting the original makes no difference.
I'm very cynical about this and think it's because there is no monetary reward for it, so motivation is lacking.
Terrorism damages consumer interests and creates fear - people spend less, travel less etc. Big motivation for govts and corporations to do something about it.
Children don't spend. They have no political or consumer power. They are the most vulnerable members of society and so dealing properly with online images of child abuse is not prioritised.
You Tube can run their user generated content business without any porn on it to protect their business model - online images can be controlled if there is enough motivation to do so.
There is constantly work being done by many organisations world wide to remove images of child abuse from the deep web. Sites are shut down, people arrested, victims identified. The problem is once something has been uploaded and shared it will be out there forever. It is continuous. There are always new images, new videos made and shared faster than they can be removed. It's not simple.
It's like gossip - once it starts it is uncontrollable. Especially when the more savvy types pass it on on a CD or a USB stick rather than online.
There are agencies hard at work in the UK such as Child Exploitation and Online Protection (CEOP) and the Internet Watch Foundation (sounds a horrific, draining job). It's just not the big news story that ISIS is at the moment (and as Winterbells pointed out, ISIS are deliberately going for as much publicity as possible).
Possibly stupid question.
If a member of the public decided that they were going to try to find such images to report them to CEOP for removal, they could get prosecuted for accessing child abuse images. So it's not like we as the public can help to 'clean up' the Web, is it?
I doubt a member of the public could find images that CEOP etc can't. And I would wonder what the motivation really is of someone who went looking for these very disturbing images. After all you would be exposing yourself to seeing them.
A person I know who innocently saw some images - about 5 - (on her partner's computer) was really very damaged by seeing them. Twelve years on she still has them seared into her mind. The police team that subsequently investigated said that police officers were harmed by their work (via exposure to the images) and were only able to serve on the team that deals with child abuse images for short periods, due to the negative effects it has on them.
I once typed in Peacocks, as in the shop which sells clothes, and some 'blocked from viewing this site' message made me feel I had done something wrong. I think it should all be removed, so that the innocent can't be damaged and those who are doing something wrong have to look elsewhere. Why wouldn't anyone want it removed?
It's just not as easy as saying you want it removed.
For one thing there's the size of the internet. It's not like a town centre where you can stroll around looking at shop windows for illegal adverts, but more like having to turn over each grain of sand on every beach in the world. And then checking it again an hour later because it has changed.
It can't be automated except in a very crude way. You need people to look or you need ordinary users to report it and then have your people look to see if it is illegal. You probably need to take on millions of extra staff to make a dent.
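To show why the automated approach is so crude: the simplest filters (and the principle behind real hash lists, such as the one the Internet Watch Foundation maintains) compare each file's cryptographic digest against a list of digests of known illegal material. A minimal sketch in Python, with a made-up blocklist entry standing in for the real thing:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad files.
# (This entry is just the digest of b"test", for illustration.)
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_bad(data: bytes) -> bool:
    """Flag a file only if its digest exactly matches the blocklist."""
    return hashlib.sha256(data).hexdigest() in BLOCKLIST

# The weakness: change a single byte and the digest changes completely,
# so any re-save, crop, or re-compression of an image evades the filter.
print(is_known_bad(b"test"))   # True
print(is_known_bad(b"test!"))  # False
```

This only ever catches exact copies of material a human has already reviewed and listed, which is why new or slightly altered uploads still need human eyes.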
Also of course most of the internet is not coming from the UK. We don't have any authority over sites in other countries. In some cases we can mention it to their law enforcement, but that's about it.
We can block it from being viewed from here, but it's quite easy to get around those blocks and in some cases it means blocking legitimate sites. Like the time they blocked Wikipedia (over a picture that wasn't obscene in the first place)
You can post images on here now. If someone posted something illegal and it was reported, what would you want to happen? Would you want all the MN servers confiscated and the owners arrested? Would you want the domain name permanently blocked? You could ask MN to remove the image and they would in this case, but then what? Do the authorities check all images on MN daily/hourly to see if there's another?
It's just not as easy as it looks.
Blocking software is crude in any case. In the early days of the internet some filters blocked the website of the local council in SCUNTHORPE and you can all see what they spotted!
@lougle: Deliberately searching it out (even with good intentions) could, one imagines, get you into trouble. The behind-the-scenes Guardian article on the Internet Watch Foundation* that I linked to before does state that members of the public can anonymously report any images/videos they stumble across for investigation via its website.
*That article stuck in my head ever since I read it. Am in awe of the people who work there, think I'd last all of two seconds trying to do a job like that.
To add a further comment to Andrewofgg: blocking software is not just crude, but sometimes outright dangerous. Last year it turned out that O2's porn filter was also cutting off access to ChildLine, the NSPCC and the Samaritans. O2 did fix this, but they then also removed the tool that enabled people to discover that these sites were blocked (one imagines - although hopes not - as a way to stop more negative stories).
There can also be some pretty dodgy motivations behind the people or businesses pushing filtering/blocking software. There was a story in the mid-2000s when it turned out that a company that was (at the time) providing the blocking software bought by most UK schools was run by Evangelical Christians who were deliberately blacklisting sites that provided teen advice on LGBT issues.
Audeca That's horrible, sorry I was flippant about it, I had no idea of that.
The cynical part of me agrees with "Children don't spend. They have no political or consumer power. They are the most vulnerable members of society and so dealing properly with online images of child abuse is not prioritised."
I think there are too many people in positions of authority who have more than a vested interest in keeping vulnerable children just that vulnerable. Cleaning up the internet will not be what they want!
I think the interest in historical abuse is the tip of the iceberg.
"You Tube can run their user generated content business without any porn on it"
That's not how it works though. It just means that when something is reported they take it down, because it's their site and they control it. At any given time there could be thousands of videos waiting for someone to notice them and report them.
If I offered you £100 for each bad site you close down you still couldn't because you don't have control of them.
Can we talk about Ad-click revenue?
Do you think advertising revenue is being generated by sites that put up child abuse images?
As lots of these images are available for free, are these sites run and paid for out of personal pockets, or are they generating ad-click revenue, i.e. contributing to Google's billions?
I just can't imagine that all these images of abuse are being freely distributed, sites run and funded by people, without generating any income at all for anyone. Money must be involved. Follow the money to find a solution. Credit card, PayPal, ad-click - there will be money involved.
And a question for the techy people among us, is it possible to digitally 'stain' files on the internet? So images on an abuse site can be somehow stained prior to the site being closed down, so if they are uploaded on another site they can be easily located? If it doesn't exist yet, please can someone invent it soon.
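On the 'staining' question: something close to this already exists. Rather than marking the file itself (which a determined sharer could strip out), systems such as Microsoft's PhotoDNA compute a "perceptual hash", a fingerprint of the image's visual content that survives resizing and re-compression, and match uploads against a database of fingerprints taken from known abuse images. A toy sketch of the idea in Python (the 4x4 grid and the sample pixel values are purely illustrative; real systems are far more robust):

```python
# Toy "average hash": shrink an image to a tiny grid of brightness
# values and record which cells are brighter than the grid's average.
# Small edits (re-compression, slight brightness shifts) leave most
# bits unchanged, so near-copies can still be matched.

def average_hash(pixels):
    """pixels: a small square grid of grayscale values (0-255)."""
    flat = [v for row in pixels for v in row]
    avg = sum(flat) / len(flat)
    return tuple(1 if v > avg else 0 for v in flat)

def hamming(h1, h2):
    """Number of differing bits between two fingerprints."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200, 30, 220],
            [15, 210, 25, 215],
            [12, 205, 28, 225],
            [11, 198, 33, 230]]

# A slightly re-encoded copy: every pixel nudged a little brighter.
copy = [[v + 5 for v in row] for row in original]

h1, h2 = average_hash(original), average_hash(copy)
print(hamming(h1, h2))  # 0 - the near-copy still matches
```

So the fingerprint lives in a database held by the matching service, not in the file itself, which is what makes it hard for uploaders to wash out.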
Whilst I understand that there are many of these sites around, they can't be THAT hard to find - as the people who want these images manage to find them!