FWIW, here is what the 2008 Byron Review (conducted by Dr Tanya Byron for the last government) had to say on the subject of "network level blocking" (what the present govt is proposing):
Network level blocking
4.54
Some material on the internet, such as child abuse images, material inciting racial hatred and extreme pornography, is clearly illegal in the UK. For such material, there is a strong case for it to be blocked by ISPs at a "network level" using the Internet Watch Foundation's list, so that when a user tries to access a website they are blocked from doing so. Countries like China and Saudi Arabia have a much wider list of content which is illegal, and use similar techniques to prevent their citizens (including adults) from accessing it.
4.55
In the UK, at least one ISP offers users the option of a connection to the internet which blocks material that is unsuitable for children to access. Some people have suggested that this approach should be extended to all ISPs in the UK. Users aged 18 and over would have to opt out of such a system in order to receive unfiltered access to the internet from their ISP. Proponents of extending network level blocking point to the fact that it does not rely on families to set up their own filtering software, and that, unlike filtering software on the user's computer, it cannot be disabled by technologically advanced children. However, there are a number of problems with a policy of blocking non-illegal material at a network level.
4.56
Firstly, there is the problem of deciding what material should be blocked. There is a general social consensus, reflected in our approach to film and television content, that explicit pornography and violent material such as videos of executions are not suitable for children. However, there is no such consensus about material such as non-pornographic nudity, violence or death in an educational context (such as information about wars or the Holocaust), and the websites of extremist political parties. Similarly, many parents would wish to stop young children from stumbling across such material, but would be keen for their children to see it when they are older teenagers or when it can be put in an appropriate context.
4.57
The decision about what constitutes "inappropriate content" can be highly subjective. What one person views as harmful, another might find offensive, whilst yet another might see it as an important, empowering learning experience for their child; and this view is likely to change depending on the age of the child. An example of this might be a sex education website. In consequence, any attempt to block content which falls into these grey areas would leave some parents unhappy that the system was either too restrictive or not restrictive enough (especially where there is more than one child in the house). There is also the possibility that someone whose content had been blocked as being unsuitable for under 18s might bring a successful legal challenge under Article 10 of the European Convention on Human Rights (right to freedom of expression).
4.58
Secondly, the task of blocking material at a network level presents a range of technical issues. The construction of a comprehensive list of harmful and inappropriate material (even if a satisfactory definition could be agreed) would be extremely difficult and expensive. Alternatively, the use of a program to filter content automatically based on words, phrases and the properties of images is likely to prove difficult. The extra equipment required by ISPs to operate such a system can be costly, and the process may have the side effect of slowing down internet access for users. For example, an Australian Government feasibility study of a network level filtering trial in Tasmania (NetAlert, 2006) found that the use of filters significantly reduced network performance, although only one in six users noticed this. Problems may also arise around words which can be used in several different contexts (e.g. the word "breasts" might denote a pornographic website, but it might appear on a site about breast cancer support or recipes for chicken breasts). Although this problem applies to all types of content filter, it is particularly problematic at a network level, where users cannot override the filter for sites they know to be acceptable or set a different level of filtering for different members of the family, as they can with many PC-based filters.
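The context problem the review describes here is easy to demonstrate. As an illustrative sketch (my own, not part of the report), a naive word-list filter treats a pornographic page and a recipe page identically; the `BLOCKLIST` contents and the `is_blocked` helper are hypothetical:

```python
import string

# Hypothetical single-word blocklist, for illustration only.
BLOCKLIST = {"breasts"}

def is_blocked(page_text: str) -> bool:
    """Return True if any blocklist word appears on the page,
    with no regard for the surrounding context."""
    words = {w.strip(string.punctuation).lower() for w in page_text.split()}
    return not BLOCKLIST.isdisjoint(words)

print(is_blocked("Explicit photos of breasts"))                    # True: blocked, as intended
print(is_blocked("Our best recipes for grilled chicken breasts"))  # True: a false positive
print(is_blocked("Walking holidays in the Lake District"))         # False: allowed
```

A network-level deployment makes such false positives worse in exactly the way the paragraph describes: unlike a PC-based filter, no individual household can override the block on the recipe site or relax the setting for an older child.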
4.59
Thirdly, there are problems with the way that network level blocking can appear to be an easy way of protecting children from all harmful and inappropriate material online. Even if it were possible to put a block on all content that is "unsuitable for under 18s", the presence of a content filter would do nothing to prevent harmful or inappropriate contact with the child, or conduct by the child, online. Also, it is wrong to assume that tech-savvy children determined to access blocked material could not "get round" the system. There are a number of techniques, such as using "proxy websites" and certain kinds of encryption software, which make any network level filter (including those used by the Chinese and Saudi governments) possible to evade. As such, there is a risk that purporting to give parents a "safe" internet connection could lull them into a false sense of security, preventing them from developing effective parenting strategies to empower their children, especially older children, to use the internet safely.
4.60
For these reasons I do not recommend that the UK pursue a policy of blocking non-illegal
material at a network level at present. However, this may need to be reviewed if the other
measures recommended in this report fail to have an impact on the number and frequency
of children coming across harmful or inappropriate content online.
Safer Children in a Digital World: The Report of the Byron Review, pp. 92-94.
media.education.gov.uk/assets/files/pdf/s/safer%20children%20in%20a%20digital%20world%20the%202008%20byron%20review.pdf