In Bernard Keane’s response yesterday — “Filtering the facts: Conroy slips up when hitting back” — to my article before Christmas, he again makes several claims that are wrong.

The closest Keane comes to admitting he was wrong, when he claimed that the independent Enex filtering trial saw 3.4% over-blocking of the ACMA blacklist, is to say that he is, conveniently, happy to “leave aside the point”.

But he still wishes to dispute the independent trial report’s finding that all filters were able to achieve 100% accuracy in blocking the ACMA blacklist.

Keane seems to think he has stumbled on some sinister secret in a paper written by one of the trial vendors — Watchdog. The fact that some participants, including Watchdog, had initial difficulties in loading the ACMA blacklist was no secret.

As previously stated, the ISP content filtering pilot showed that a specified list of URLs can be blocked with 100% accuracy. I quote from the Enex TestLab report, Internet Service Provider (ISP) Content Filtering Pilot Report:

Initially, several participants experienced difficulty loading and blocking the complete ACMA blacklist. Some of the filters needed adjustments made so that they could recognise URLs that were long and complex and included spaces. Others included colons, question marks and percentages. Some URLs were associated with more than one IP address and some URLs redirected the user to a second URL.

Following consultations with the product vendors, all issues experienced with loading URLs contained on the ACMA blacklist were resolved (page 11).

These issues were resolved by the filter vendors making adjustments to their filtering devices so that they would accurately block URLs of the nature described above. Contrary to Keane’s claims, these URLs were not removed from the testing process.
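For readers who want a concrete picture of what such an adjustment involves, the following is a minimal sketch in Python (illustrative only; the trial filters were proprietary vendor products, and the example entries and helper names here are invented) of normalising blacklist URLs that contain spaces, percent signs, colons and question marks before matching them against requested URLs:

```python
# A hypothetical sketch (the trial filters were proprietary): normalise
# blacklist URLs containing spaces, percent signs, colons and question
# marks, then compare requested URLs in the same canonical form.
from urllib.parse import quote, urlsplit, urlunsplit

def normalise(url: str) -> str:
    """Return a canonical form of a URL for exact-match blocking."""
    parts = urlsplit(url.strip())
    # Hostnames are case-insensitive; paths and query strings are not.
    host = parts.netloc.lower()
    # Percent-encode awkward characters such as spaces, but leave existing
    # percent-escapes, colons and query delimiters alone.
    path = quote(parts.path, safe="/%:@")
    query = quote(parts.query, safe="=&%?:@")
    return urlunsplit((parts.scheme.lower(), host, path, query, ""))

# Invented example entries of the kinds the report describes.
blacklist = {
    normalise("http://example.org/some page with spaces.html"),
    normalise("http://example.org/view?id=100%25&type=a:b"),
}

def is_blocked(requested_url: str) -> bool:
    return normalise(requested_url) in blacklist

print(is_blocked("HTTP://EXAMPLE.ORG/some%20page%20with%20spaces.html"))  # True
```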

The lists used in the testing, including the ACMA blacklist, were washed before testing to remove any URLs that were no longer live — that is, the URL no longer existed so the filter would not be able to find it. This is a process ACMA currently carries out periodically.

Enex TestLab also checked the lists used for testing:

Prior to performing the testing during the pilot using the three lists, each site on each list was tested by Enex to ensure that it was still live. (page 11)
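By way of illustration only (this is not ACMA’s or Enex’s actual tooling), a list wash of this kind can be sketched in a few lines of Python: each URL on the list is requested, and entries that no longer respond are dropped:

```python
# Illustrative sketch only, not ACMA's or Enex's actual process:
# "wash" a URL list by dropping entries that no longer respond.
import urllib.error
import urllib.request

def is_live(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL still resolves and answers with a usable response."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, OSError, ValueError):
        # Dead domain, refused connection, timeout, 404/410, malformed URL, etc.
        return False

def wash(urls):
    """Keep only the URLs that are still live."""
    return [url for url in urls if is_live(url)]

# Example with placeholder addresses:
# washed_list = wash(["http://example.com/", "http://no-longer-exists.example/page"])
```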

Keane also points out what he calls the “well-known problem” that blocking a YouTube page could place additional load on the filter. The reason this issue is well known is that it is set out in the Enex report (page 19):

Capacity of filters to handle high traffic loads/sites

In a pass-by filtering solution the actual traffic load placed through the filters is very low because only a small percentage of end-users would be attempting to access sites on the blacklist at any one time. However, in situations where there is a potential for very high traffic sites, such as YouTube, to have pages on the filtering list, this could result in significantly higher traffic rates passing through the filter, even though the specific pages being accessed are not those on the blacklist. This could cause additional load on the filtering infrastructure and subsequent performance bottlenecks.

As Keane points out, the issue is also addressed in the FAQs on the department website.
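To illustrate why a single blacklisted page on a very high-traffic site matters in a pass-by design, here is a rough sketch with made-up figures (not trial data): all traffic bound for a listed domain is diverted through the filter so that the full URL of each request can be checked, even though only one page on that domain is actually blocked:

```python
# Back-of-the-envelope sketch with invented figures, not trial data.
# In a pass-by design, listing any page on a domain typically means all
# traffic for that domain is diverted through the filter, which then
# inspects each full URL to decide whether that particular page is blocked.

requests_per_second = {        # hypothetical ISP-wide traffic
    "obscure-site.example": 2,
    "youtube.com": 50_000,
}

blacklisted_pages = {          # one blocked page on each domain
    "obscure-site.example": {"/banned-page"},
    "youtube.com": {"/watch?v=banned"},
}

for domain, rps in requests_per_second.items():
    if domain in blacklisted_pages:
        print(f"{domain}: roughly {rps:,} requests/sec now pass through the filter,"
              f" although only {len(blacklisted_pages[domain])} page is blocked")
```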

Keane’s speculation about whether Google will comply with Australian law is interesting; however, it should be noted that Google has operated within the Chinese regime for many years. It also abides by laws in Thailand requiring it to filter from its search results any criticism of the Thai king, and it filters Nazi propaganda content from its German search results.

As Keane points out, there are many videos on YouTube “about euthanasia and suicide, some offering instructions or recommending it”. Euthanasia has long been a hotly debated and divisive issue in Australia, but the fact remains that instruction in self-harm is a crime in Australia, and content containing such instruction is therefore deemed Refused Classification under the National Classification Scheme guidelines.

A time may come when instruction in self-harm is no longer a crime under Australian law, and such content would therefore no longer be deemed Refused Classification. People who object to this content being included in the filtering policy should turn their focus to changing the laws regarding euthanasia in Australia.

Keane criticises the Department’s website for pointing out what RC-rated content is, but it is very clear that opponents of the policy either do not know what it is or are wilfully misleading the public. Colin Jacobs, the CEO of Electronic Frontiers Australia, wrote in his article in Crikey on 21 December that “subjects such as abortion, anorexia, Aborigines and legislation on the sale of marijuana would all risk being filtered.”

RC content can’t be found on library shelves, is not available in the newsagency, won’t be seen in the cinema, and certainly can’t be watched on DVD or television. That is why the Government will continue to explain to the Australian people what RC content is.

The Government has never claimed that ISP filtering is a silver-bullet solution and that is why our cyber-safety policy includes $49 million in funding for an additional 91 Australian Federal Police officers for the Child Protection Unit, additional funding for prosecution of offenders, $32.8 million for education and outreach programs for teachers, parents and students, research into cyber bullying and online threats, and the establishment of a Youth Advisory Group on cyber safety.

Filtering is one component of the policy, but unfortunately the rest of the policy is largely ignored by those who oppose it.