Tuesday, 20 August 2013
Voluntary Filtering Coming to UK Web Hosts
In an attempt to protect minors from inappropriate Internet content, PM David Cameron's government recently announced plans to implement a new voluntary filtering system among the nation's Web hosts. The system is designed to prevent users from being inadvertently exposed to objectionable content while browsing the web.
In a speech outlining the plans, Cameron said that the voluntary filtering program would make it more difficult to directly access adult material or to find illegal content via standard Internet searches. Although it will not cut off inappropriate content entirely, the system will supposedly make it much more difficult to come by.
In the early stages, ISPs deciding to implement the filtering will be responsible for classifying content and training employees appropriately. New customers will have to actively opt out of the filtering when signing up for the service. Existing customers will not be impacted immediately, but they will eventually be automatically subject to filtering. They will need to inform their ISPs if they want access to blocked content.
The government insists filtering will occur at the domain level rather than the IP level. This essentially gives web hosts and ISPs greater control over what is blocked and what's not. In theory, two sites utilising the same shared server could see one filtered and the other left alone as long as their domain names are different.
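The difference between the two approaches can be sketched in a few lines of Python. This is a purely illustrative example, not any ISP's actual implementation: the domain names, the IP address, and the lookup table are all invented, and real filtering happens in DNS resolvers and network equipment rather than application code.

```python
# Hypothetical sketch: domain-level vs IP-level filtering.
# Two sites share one server IP; a domain blocklist can filter
# one while leaving the other reachable, whereas an IP blocklist
# inevitably takes down both.

DOMAIN_BLOCKLIST = {"blocked-example.co.uk"}   # invented entry
IP_BLOCKLIST = {"203.0.113.10"}                # the shared server's IP

def resolve(domain):
    # Both domains point at the same shared server (illustrative only).
    shared_hosting = {
        "blocked-example.co.uk": "203.0.113.10",
        "allowed-example.co.uk": "203.0.113.10",
    }
    return shared_hosting[domain]

def domain_filter_allows(domain):
    # Domain-level filtering: decide on the name alone.
    return domain not in DOMAIN_BLOCKLIST

def ip_filter_allows(domain):
    # IP-level filtering: decide on the resolved address.
    return resolve(domain) not in IP_BLOCKLIST

# Domain-level filtering distinguishes the two sites...
print(domain_filter_allows("blocked-example.co.uk"))   # False
print(domain_filter_allows("allowed-example.co.uk"))   # True
# ...while IP-level filtering blocks both, since they share an address.
print(ip_filter_allows("blocked-example.co.uk"))       # False
print(ip_filter_allows("allowed-example.co.uk"))       # False
```

The sketch shows why the government's stated choice matters for shared hosting: blocking by IP address is a blunt instrument that sweeps up every site on the server, while blocking by domain can target a single offender.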
Critics warn the system is too simple to be of any real value for protecting kids from adult material. For starters, it wouldn't take much for children to opt out of the filtering without their parents or carers knowing. Secondly, there are software and hardware tools that can easily get around filters. Today's Internet users are already well-versed in these tools.
While the plan sounds very appealing on the surface, there are real concerns about potential censorship issues. Those concerns come by way of how the filtering deals with content other than what is deemed 'adult material'. This additional content would be classified as objectionable if it dealt with certain topics in a way deemed inappropriate by ISPs.
Examples of such content would include topics promoting extremism, terrorism, violence or suicide. Questionable material regarding various emotional and mental disorders might also be included. The problem is that all of this is rather subjective.
If political organisations or advocacy groups were to become involved, for example, they might be able to convince an ISP to block a number of sites whose only offence is displaying content that runs contrary to the opinions of those lobbying the ISPs. This type of thing happens in China all the time.
Even though the idea of filtering may be a good one, more thought needs to be put into how to implement it in a way that still contains safeguards against censorship. In the end, no amount of filtering is going to totally eliminate all of the dangers on the web, so trampling on the rights of content owners in the name of Internet safety does not seem reasonable.