Following up from my last post, I’ve pieced together a bunch of relevant articles and arguments for anyone interested in some further reading on this issue.
A good article on the ABC by Suzanne Dvorak, chief executive of Save the Children in Australia, about why the filter will not make children safer. The synopsis is that our effort would be better spent teaching our children how to be safe:
*The best way to protect children from harm is to teach them to protect themselves. Just as we teach children how to cross the road safely, we must teach them how to safely navigate the internet. We do this by assisting and guiding them in the first instance, teaching them how to identify and avoid danger, and what to do if an unsafe situation occurs. As they mature, we allow them greater freedom.*
From the official Google Australia blog, a post detailing the key points of their submission to the Department of Broadband, Communications and the Digital Economy. The synopsis is:
- It would block some important content – the scope of ‘refused classification’ is incredibly wide
- It removes choices and creates a false sense of security
- It isn’t effective at protecting kids – sexual abuse content is not found on public web sites but in chat rooms and closed peer-to-peer networks
- It won’t work for high-volume sites such as Facebook, YouTube, Twitter etc.
The full submission can be read here (24 pages).
While on Facebook, it appears that Senator Conroy doesn’t quite know how it works and thinks that companies like Facebook can somehow prevent cyber vandalism. As per Google’s point above, it’s not possible for companies like Facebook to moderate and clear all comments before they become visible to other users – the systems just wouldn’t scale. This particular outrage came after people vandalised a page set up in memory of murdered schoolboy Elliot Fletcher. After hundreds of people left heartfelt messages of grief and support for his family, dickheads started leaving messages that contained pornographic images and other vulgar material.
The article describes this as a hack, but I highly doubt that any hack was involved – it wouldn’t take much for someone to set up a Facebook account under a false name and then leave unsavoury content.
This is the quote attributed to Conroy from the article:
“I think there is a situation where people take Facebook with an enormous amount of trust and they’ve got to clearly explain what went wrong with their security systems, how this was able to happen (and) importantly, how they’re going to ensure that this doesn’t happen again.”
What has security got to do with it? If people want to post unsavoury content (either with their real accounts or under false names), then security has nothing to do with it. I’m not sure what Senator Conroy expects them to do here, but to me it demonstrates that he doesn’t get it.
The irony is that, according to the article, Queensland Police contacted Facebook and had the page taken down – proving that there are already mechanisms in place for dealing with dodgy material.
What is the proposed legislation?
We don’t know yet because there has been no draft legislation released by Senator Conroy. But fellow Labor Senator Kate Lundy wants to include a clause that would allow people to opt out of the filter. This implies that Senator Conroy would be proposing legislation that would be mandatory for all. Let’s hope Lundy is successful.
From the speech made by Senator Conroy on Dec 15 2009 it appears that RC content filtering will be mandatory and optional additional filtering will be available through ISPs.
While the proposed filter will start with excluding content that has been refused classification (RC), groups such as the Australian Christian Lobby are already getting excited at the prospect of being able to extend the filter to exclude content that was only rated at R! According to this article even Senator Conroy thinks this is going too far and indicates further legislation will be required to bump up the level of the filter. But once the filter is in place the hurdles to increasing the level of content that is excluded are considerably lower.
The technical limitations
The internet wasn’t designed with content filtering in mind. The sheer volume of web content available today is beyond the capacity for any government to classify. Among the problems are:
- Secure web sites that use HTTPS, such as online banking sites, can’t be filtered. The content is encrypted – that’s what makes this technology secure. Sites hosting RC content could simply use HTTPS
- The filter is based on a list of site addresses. Every request to a web site has to be checked against this list to ensure that the request is not for one of these blacklisted sites. The list could be anywhere from 1,000 to 10,000 sites – we’ll never really know, because it will be secret. The longer the list, the slower the internet will be
- The filter is targeting websites only. The internet has many more applications than the world wide web – email, FTP, peer-to-peer file sharing such as BitTorrent. Unless they start scanning your email attachments, how will they know if they contain pictures of Miranda Kerr or something much nastier?
- There are already systems out there called anonymisers that allow users to bypass these filters. They work by using computers overseas to act as proxies and retrieve the content that has been blocked by the Australian ISP. There are also virtual networks like Tor that take this to the next level and help to obscure the identity of the person making the requests.
- Mandatory ISP level filtering will be less configurable than software installed in your own home.
(With thanks to the Open Internet campaign)
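To make the HTTPS point above concrete, here’s a rough sketch of how a URL blacklist check works and why encryption defeats it. This is illustrative only – the site name and list entry are made up, not from any real blacklist:

```python
# Hypothetical sketch of an ISP-level URL blacklist check.
from urllib.parse import urlsplit

BLACKLIST = {                       # the real list would be secret
    "badsite.example/banned-page",
}

def is_blocked(url: str) -> bool:
    parts = urlsplit(url)
    # For plain HTTP the filter can see the hostname and the full path,
    # so it can match a specific page on the list.
    if parts.scheme == "http":
        return (parts.hostname or "") + parts.path in BLACKLIST
    # For HTTPS the path is encrypted in transit; a page-level blacklist
    # entry has nothing to match against, so the request sails through.
    return False

print(is_blocked("http://badsite.example/banned-page"))   # True
print(is_blocked("https://badsite.example/banned-page"))  # False
```

Same page, same server – just switching the scheme to HTTPS is enough to slip past a page-level blacklist, which is exactly the limitation described above.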
On the whole this just doesn’t seem to add up. We know from the outset that this will only be effective for content hosted on websites and can’t do anything about content on other protocols. So why go to this effort at all?
An education and policing based approach would be more effective because it would be holistic. Parents would still be free to install filtering software in their own homes if they so wished (though much of that software is subject to the same technical limitations listed above).
So do something – find out how!