

Internet porn may be blocked at source


David51 · 20/12/2010 11:05

Communications minister Ed Vaizey is working on plans designed to prevent children gaining access to internet pornography.

He hopes to introduce a system that would enable parents to ask internet service providers (ISPs) to block adult sites at source, rather than relying on parental controls that they need to set themselves.

Adults using the internet connection would then have to specifically 'opt in' if they want to view pornography.

Full story:

www.metro.co.uk/news/850896-new-porn-controls-for-children-on-internet-planned-by-government

Mumsnet PLEASE think about doing a campaign about this. Or at least keep us posted on if & when the government decides to ask for our views.

In the meantime maybe we should all contact our current ISPs to ask what they plan to do and to let them know what we want as their customers.

slhilly · 24/12/2010 09:40

Hmm...Kaloki and Snorbs, you're overstating the position somewhat re image filtering. Google SafeSearch, for example, is not hopelessly inaccurate. It has more Type 1 and Type 2 errors than you'd want, but it's not too bad -- if you search for complaints about its inaccuracies, for example, you'll find only a few articles. Whether it's good enough for a mandatory ISP-side filter is a different question; and of course it does nothing re torrents, facebook, etc. But let's not overplay the problem.

Most anti-porn feminists are primarily (and understandably) concerned about video and still images made using women, because that's where the bulk of exploitation happens. Text-based porn is a lot lower down the list of priorities.

So clearly the technical solution being striven for is an automated image-recognition system that triggers blocking of IP addresses, presumably with some sort of human oversight. I could imagine a system that piggybacks off Google SafeSearch, eg if an image doesn't get through the "moderate" filter, the IP address the image comes from is blocked. I think there'd be significant technical challenges, but it could be done. However, as I've said repeatedly, it wouldn't materially affect any teen's exposure to internet-sourced porn, and would have significant downsides.

BadgersPaws · 24/12/2010 10:13

"So clearly the technical solution being striven for is an automated image-recognition system that triggers blocking of IP addresses, presumably with some sort of human oversight. I could imagine a system that piggybacks off Google SafeSearch, eg if an image doesn't get through the "moderate" filter, the IP address from which the image comes from is blocked."

The whole IP address is blocked? So one suspect image, which might be completely innocent, on a web page and the entire site gets blocked? And not just that site but anything on the same IP address?

About 1000 photos per second are uploaded to Facebook; if one of them trips the image filter, the whole of Facebook gets blocked until someone comes along with some human oversight and unblocks it.

And then the 1000 photos a second keep on coming, so how long before it's blocked again?

So Facebook would be repeatedly blocked throughout the day, and a small workforce would be assigned to just try and keep on top of the photos out of the 1000 a second that trip the filter.

And that's just Facebook. It would be the same for every single site that accepts user submitted content.

Utterly impractical.

"I think there'd be significant technical challenges, but it could be done."

I don't think that it could be done; the sheer volume of new content is overwhelming, and the technical and human infrastructure required to keep on top of it would be impossible to put in place.

Look again at China, the example given by the promoters of this as working Internet regulation: even they can't do it. The solution they've put in place for sites with a lot of user content is just to block the whole thing; they don't even try to keep on top of it.

slhilly · 24/12/2010 10:26

Badgers, if you re-read my earlier messages, you'll see that I explicitly say that filters cannot deal with user-generated content and community websites such as Facebook.

BadgersPaws · 24/12/2010 10:27

I can't find statistics on the number of false positives for Google's image-scanning software, but similar packages for home use boast of having only a 10% false positive rate.

So apply that to Facebook: 100 false-positive photos a second out of the 1000 uploaded would trip the image filter and lead to the IP address being blocked, which would block Facebook.

So Facebook would be permanently blocked and produce a work queue growing by 100 images every second, all to be checked by a human.

Call it 5 seconds to call up an image, look at it and approve or reject it, and you need a workforce of 500 people working 24 hours a day just to keep on top of Facebook.
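To put rough numbers on that, here's a back-of-the-envelope sketch in Python (the upload rate, false positive rate and review time are just the assumptions above, nothing measured):

```python
# Back-of-the-envelope staffing estimate for human review of filter false positives.
# All figures are assumptions from the discussion above, not measured data.

uploads_per_second = 1000     # photos uploaded to Facebook per second (assumed)
false_positive_rate = 0.10    # filter wrongly flags 10% of innocent images (assumed)
seconds_per_review = 5        # time for a human to check one flagged image (assumed)

flagged_per_second = uploads_per_second * false_positive_rate   # 100 images/sec

# Each reviewer clears one image every 5 seconds (0.2 images/sec), so the
# number of reviewers needed at any instant just to keep pace is:
reviewers_needed = flagged_per_second * seconds_per_review      # 500 people

print(f"Images flagged per second: {flagged_per_second:.0f}")
print(f"Reviewers needed around the clock: {reviewers_needed:.0f}")
```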

And that's just Facebook.

And that's why automatic filtering and blocking with human oversight just doesn't work. As I said, even the Chinese couldn't put the resources in place to keep up.

BadgersPaws · 24/12/2010 10:29

"Badgers, if you re-read my earlier messages, you'll see that I explicitly say that filters cannot deal with user-generated content and community websites such as Facebook."

Fair point.

But what I said was still worth saying for those that believe that automatic image filters with automatic blocking are a technical and human possibility.

Snorbs · 24/12/2010 11:09

I was talking about a specific technology - the kind that analyses the actual image to decide whether it is pornographic or not. Those are (as far as I am aware) hopelessly unreliable.

I think Google image Safe Search works by taking a look at the context surrounding the image. So if the image comes from a page that's got lots of porn-related descriptions then it's assumed to be dodgy. Or if the site has links to other, known porn sites, then it's assumed dodgy. And if someone clicks on the "Report this image" link in Safe Search then that is also added to the weighting.

I do recall a discussion a year or so ago on some techie website (could've been slashdot) about an analysis of Google safe search results. Apparently it does tend to err on the side of caution and exclude images that are actually ok, just as a "guilt by association" thing based on the sites that link to them.

I don't think Google safe search actually analyses the image itself. It's hard enough to get a computer to understand what's in a picture at the best of times, let alone to distinguish between (say) a picture of a bloke with his shirt off and a woman with her shirt off.
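Purely to illustrate that kind of context-only weighting, here is a toy sketch (the signals, weights and threshold are invented; this is not how Google actually does it):

```python
# Toy illustration of context-based image filtering: the image itself is never
# analysed, only signals from the page it sits on. All signals, weights and the
# threshold below are invented for illustration.

def context_score(adult_words_on_page, links_to_known_adult_sites, user_reports):
    """Crude 'likely adult content' score for the page hosting an image."""
    score = 0.0
    score += 0.5 * min(adult_words_on_page, 10)   # porn-related text on the page
    score += 2.0 * links_to_known_adult_sites     # links to/from known porn sites
    score += 1.0 * user_reports                   # "report this image" clicks
    return score

def is_filtered(adult_words_on_page, links_to_known_adult_sites,
                user_reports, threshold=3.0):
    return context_score(adult_words_on_page, links_to_known_adult_sites,
                         user_reports) >= threshold

# An innocent image on a page with two links to known adult sites still gets
# filtered - the "guilt by association" effect mentioned above.
print(is_filtered(adult_words_on_page=0,
                  links_to_known_adult_sites=2,
                  user_reports=0))   # True
```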

Arguably it would be easier to analyse and detect porn videos than still images. Most sex acts have a certain, um, rhythm to them and it's easier to identify limb position in moving images than in still pictures. It's a hugely processor-intensive task but not an impossible one.
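And just to sketch what "detecting rhythm" could even mean in code, a toy example (the frame rate, frequency band, threshold and the idea of a per-frame motion-energy signal are all invented; a real system would be vastly more complicated):

```python
# Toy "rhythm" detector: given a per-frame motion-energy signal (how much the
# picture changes from frame to frame), check whether one frequency in a low
# band carries a large share of the spectral energy. Every number here is an
# invented placeholder, not a tested threshold.
import numpy as np

def has_strong_rhythm(motion_energy, fps=25, min_hz=0.5, max_hz=3.0, ratio=0.3):
    x = np.asarray(motion_energy, dtype=float)
    x = x - x.mean()                               # drop the DC component
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= min_hz) & (freqs <= max_hz)
    total = spectrum[1:].sum()
    return total > 0 and spectrum[band].max() / total > ratio

# A strongly periodic motion signal (about 1.5 Hz) trips the check; random
# motion does not (usually - it is noisy by construction).
t = np.arange(0, 20, 1 / 25)
print(has_strong_rhythm(np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)))  # True
print(has_strong_rhythm(np.random.randn(t.size)))                                      # False
```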

BadgersPaws · 24/12/2010 12:37

"Arguably it would be easier to analyse and detect porn videos than still images. Most sex acts have a certain, um, rhythm to them and it's easier to identify limb position in moving images than in still pictures. It's a hugely processor-intensive task but not an impossible one."

YouTube adds 30 hours of new video every minute. That's an enormous amount of content to keep on top of, and that's just YouTube, not all the other video sites out there.

And do you filter every new video that appears? In which case you need some serious hardware.

Or do you just filter videos as they are requested by someone within the firewall? In which case you still need some very heavy hardware, and you give everyone who watches a video for the first time a serious delay while the video is first downloaded by the checker and then analysed before being released.

And then you still need the staff to deal with the enormous volume of falsely categorised, either as safe or unsafe, videos.

In isolation scanning one video and saying that it's porn or not might be plausible. Once you scale that up and include the impact of the false positives and negatives it becomes completely unworkable.
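To put a rough number on that scale, using only the 30 hours per minute figure above and generously assuming the analysis can run at real-time speed:

```python
# How much analysis capacity does 30 hours of new video per minute imply?
# The 30 hours/minute figure is from the post above; "one machine can analyse
# video at playback speed" is an assumed (and optimistic) processing rate.

hours_uploaded_per_minute = 30
minutes_of_video_per_minute = hours_uploaded_per_minute * 60   # 1800

# If one machine analyses video no faster than real time, you need this many
# machines running flat out just to keep pace with YouTube alone:
machines_needed = minutes_of_video_per_minute                  # 1800

hours_uploaded_per_day = hours_uploaded_per_minute * 60 * 24   # 43,200

print(f"Minutes of new video arriving every minute: {minutes_of_video_per_minute}")
print(f"Real-time analysis pipelines needed, for YouTube alone: {machines_needed}")
print(f"Hours of new video per day: {hours_uploaded_per_day:,}")
```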

Once again, look at China, which has thrown itself into Internet filtering with serious commitment and is unable to do this; and they'd be happy to accept a very high false positive rate, as they've got no qualms about blocking when they shouldn't. Even then they can't manage it.

And once again the moment network traffic becomes encrypted the ISP is utterly powerless to do anything.

Snorbs · 24/12/2010 13:44

Oh absolutely. It's not something that scales well at all. I think I used the wrong word when I said "easier".

What I meant was that as a technical exercise, and just pondering off the top of my head, I reckon you'd have a greater chance of a machine analysing and correctly identifying porn videos than still porn images.

You'd need to throw a staggering amount of bandwidth and processor time at it and you'd still get a huge number of false positives and false negatives. You'd likely need a significant proportion of Google's server estate and backbone links just to keep up with the new videos being added every day, let alone checking all the umpty-exabytes of existing video files out there.

And it's definitely not something that is feasible on-the-fly as you're downloading it, so it wouldn't be difficult to circumvent (hint: make the humans enter captchas to get to the videos. The indexer/analyser will never see the videos and so will be unable to check them. And you can't just blanket-deny all non-verified videos, as that'll screw all legitimate sites sitting behind a login system or paywall).

It's just that I think that, as a gedanken experiment, with video you'd get fewer false-pos and false-neg than you do with still images. I absolutely agree that it's still totally impracticable in the real world.

NetworkGuy · 02/02/2011 22:20

Very disappointing to see Mumsnet has decided to support this idea and now list it on the campaigns page.

I would urge the team at MNHQ to talk to the Tech bods and see how poor this idea is from a technical viewpoint.

I would then urge MNHQ to publicly tell the Minister that it will be a waste of time and customer money to expect the ISPs to build some sort of giant filtering facility, especially when some ISPs offer generous quantities of data for very low cost.

(I pay under 10 pounds for one account and can download without limit between 0000 and 0800. My average traffic for the quarter ending 31/12/2010 was nearly 200 GB a month - and it could have been several times that if my line were not at 2.5 Mbps, when you consider someone on fibre could get 30 Mbps all night! My ISP gives me an allowance of 60 GB to use in the 'peak hours' (0800-2400), and I used over double that during off-peak hours.)

Who really expects the ISPs to be able to implement this for under, say, 5 pounds a month? Also, for those who decide to "opt in", why should they want to pay the ISP's costs of filtering out material when they are going to get an uncensored feed anyway?

I suggest 5 pounds a month, but of course that would be a 5 pound surcharge forever, because the initial tens of millions would have to be borrowed by the ISPs, and they would likely never recover the full cost, as there would be running costs and upgrades to cope with ever more traffic.

Please, MNHQ, have a rethink and get your technical support people to give a summary of the points where the whole thing fails to make sense. Educating parents and offering a software product (or promoting one that existing users can no doubt recommend) would be far more beneficial than attempts to filter any of the material out, because there are ways around each and every firewall, given some ingenuity and that 'academic interest' in achieving a goal.

[ Incidentally, "Academic Interest" was the name of an underground news sheet by hackers, for hackers. I still have a copy I was given by someone running a security firm, who had managed to infiltrate the ranks, so he could keep tabs on what had been achieved. ]

I am another person who hopes that the .XXX TLD comes into use sooner rather than later (took 5 years because of the ICANN people being put under pressure by groups like the American Family Association). Of course, once it comes into use, there needs to be pressure to force any businesses using .org and eventually .com onto .xxx so they can be more easily filtered out by internet users with minimal effort.

Snorbs · 02/02/2011 22:44

What!? FFS.

MNHQ, did you not bother to read this thread?

NetworkGuy · 02/02/2011 23:13

Come on Snorbs - it doesn't need the MNHQ staff to read it, but their techie staff to explain the problems, as what they say will be taken notice of. However computer literate MNHQ people are, they perhaps underestimate the scale of the problem, as various people have explained earlier. It sounds such a good idea in theory, but the practical hurdles are many and we know it will be as useful as a condom with a hole in it!

If Mumsnet then went public on radio and TV saying the suggestion is unlikely to work, will cost a fortune with the costs passed on to ISP customers, and, overall, will not guarantee everything is 'clean' (so parents will still need to be educated and consider software to block porn), then the Minister might take some notice.

The rest of the media would take notice too, and I suspect they'd get journalists from Computing and Computer Weekly to discuss the pros and cons.

It could save time and cash if a quick radio debate headed off the Minister commissioning some civil service study, in which a number of IT consultants would suggest figures that might treble before implementation, and of course with only a partial guarantee of achieving the stated aims.

Snorbs · 02/02/2011 23:34

Or, maybe, just make the point that ISP-level filters are already sodding available for anyone who cares to move ISP to one that offers them as a service.

But, apparently, that's not good enough. No, we've got to have a crappy, expensive, poorly designed and above all ineffective technical solution to what is predominantly a social problem. Technical solutions to social problems never work.

Anybody who knows what the real issues are will continue using PC-level filtering and keeping an eye on what their kids are doing. Anybody who doesn't will either opt out of the ISP-level filters because they realise their favourite (not necessarily porn) sites are being inadvertently blocked, or stay opted in but still not supervise, and have their kids circumventing the filter anyway.

I thought nanny-state politics was supposed to be a vice of the left, not the right?

NetworkGuy · 02/02/2011 23:36

Out of interest, which ISPs do offer them, and are they effective?

NetworkGuy · 02/02/2011 23:37

sorry - mostly effective :)

BaroqueAroundTheClock · 03/02/2011 01:24

Google safe search is hopelessly inaccurate.

I had to turn mine off recently to find anything vaguely usable that would give enough impact (without being graphic) for a project.

40 images, covering trafficking, poverty, war, homelessness, and the like - and I found only a handful without the safe search turned totally off.

And the end results were not graphic - I have shown both my older DS's what I produced at the end of it.

However, going back to the actual thread topic (which I have just read in its entirety).

Let's just say they do it, and it's technically possible, doesn't add £20 a month to my broadband bill, and I only have to ring up 25 times in the first month to get safe websites unblocked (presuming it was individual websites that were blocked and not just a "yes to porn" / "no to porn" option). So my children are "safe" (almost).

But then they go to their friends' house, and their parents watch porn and have asked for all porn websites to be accessible... well, there's nothing I can do about that. Just like there's nothing I can do about it now, while I have (personal-ish) control over what they see at home on the computer.

ChunkyPickle · 03/02/2011 01:30

Or they do what I would do and use a secure proxy, which encrypts what's sent back so the filter at the ISP can't see it anyway.

That's what I would do, and that's exactly what any kid would do after a 2 minute google session.

Snorbs · 03/02/2011 07:33

AOL is the first ISP that springs to mind that offers content filtering. I think Talk Talk does as well.

Plus, many (if not all) of the mobile Internet providers provide adult content filtering. Vodafone does it as an opt-out service - ie, you have to prove you're over 18 before you're allowed onto adult sites.

Snorbs · 03/02/2011 07:36

Baroque, as noted on the other thread, you won't have the ability to turn on and off access to individual sites. It would be a system management nightmare for the ISP. It would have to be either on or off.

KalokiMallow · 03/02/2011 10:33

I still think that the most useful and most cost-effective solution would be free software for every household, together with a centrally updated database which the user can then choose to use as is, or allow/add individual sites to as they see fit.
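As a very rough sketch of what that could look like on the PC (the file names and the one-domain-per-line format are just assumptions for illustration, not any existing product):

```python
# Minimal sketch of PC-side filtering: a centrally updated blocklist plus the
# household's own allow/block overrides. File names and formats are hypothetical;
# a real product would hook into the browser or operating system.

def load_list(path):
    """Load one domain per line; ignore blank lines and # comments."""
    try:
        with open(path) as f:
            return {line.strip().lower() for line in f
                    if line.strip() and not line.startswith("#")}
    except FileNotFoundError:
        return set()

central_blocklist = load_list("central_blocklist.txt")  # downloaded updates
user_allowed = load_list("user_allowed.txt")            # parent's exceptions
user_blocked = load_list("user_blocked.txt")            # parent's additions

def is_blocked(domain):
    domain = domain.lower()
    if domain in user_allowed:   # the household's own override wins
        return False
    return domain in user_blocked or domain in central_blocklist

print(is_blocked("example.com"))
```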

But would that involve people actually taking responsibility? Hmm

Seriously MNHQ, please read this thread. Or in fact any article based on technological facts rather than scaremongering.

Snorbs · 03/02/2011 11:14

Kaloki, you're absolutely right. The place to do filtering is on the PC, and that should be backed up with parental supervision. That some parents cannot be bothered to do so is a social issue, not a technological one.

There are already a number of free software filters available, of course. The Windows PC that my DCs use is set up with the (free!) Microsoft Family Safety Centre. That's been working well but, even then, I've had to fine-tune the list of allowed and disallowed sites.

KalokiMallow · 03/02/2011 12:45

Thought I'd post this here as well, to save people wading through. It should be pretty jargon free (I tried at least) for the non-tech lot. Explanation.

differentnameforthis · 04/02/2011 00:28

"A lot of people who are against campaigns like this will be porn users or even make a living from the sex industry"

What a load of....

I am neither a porn user nor do I make my living in the sex industry... but I don't want a filter!

I don't want the Gov/my ISP dictating what I use my PC for. Wars were not fought for dictatorship.

threefeethighandrising · 05/02/2011 01:23

"So clearly the technical solution being striven for is an automated image-recognition system that triggers blocking of IP addresses "

So no pictures of breastfeeding on the internet then. Or any art with nudes in it. Or medical sites with naked people. Or .. the list is endless.

A computer is never going to get it right, and as people explained above, the sheer number of people needed to check the decisions made by the computer makes this unworkable.

I don't give a stuff about porn on the internet. But I do care greatly about freedom of information. Once we go down the censorship route, what will be next?

It's not about porn; there are better, cheaper ways of stopping your children getting access to porn. The internet is bringing people together in ways which were unimaginable not so long ago, and giving us access to an amazing wealth of information. Knowledge is power, and the internet is shifting power to the people. Look at how the uprising in Egypt began, for example.

Of course governments are twitchy and the less enlightened ones (such as ours, right now) are keen to control it.

This is not something we should support!

threefeethighandrising · 05/02/2011 01:28

How can we write to mumsnet to ask them to reconsider their support for this?

BaroqueAroundTheClock · 05/02/2011 01:31

There is another (quite lengthy) discussion on this taking place here: http://www.mumsnet.com/Talk/site_stuff/1141192-recent-decision-by-MNHQ - including MNHQ's response to the situation.

It also mentions the sort of stuff which I can guarantee you wouldn't want your children to be finding, and which wouldn't be blocked by an ISP filter.