Wednesday, February 25, 2009

Brooklyn Law School study highlights net censorship problems


A recent study entitled “Filtering in Oz: Australia’s Foray Into Internet Censorship” by Derek Bambauer of Brooklyn Law School in New York examines the Rudd Government’s upcoming Internet filtering plan and provides a thorough analysis of its legitimacy. This report is important: not only is it authored by a reputable and neutral foreign observer, but it focuses on the legitimacy of the scheme rather than on technical or other concerns, and it highlights some serious problems.
The study’s author applies a process-based methodology to assess the legitimacy of such a scheme, asking four questions. Is the country open about its censorship plans and the reasons behind them? Is it transparent about what is to be restricted? How narrow is the filtering? And finally, are the processes and decision makers behind the scheme accountable? While the Government earns praise for openness (Internet filtering was a central campaign promise), serious issues are highlighted in the other three areas.
Electronic Frontiers Australia has consistently called for clarity on the aims of the censorship scheme and on which material is to be targeted. Yet phrases like “other unwanted material” still represent the best information we have received from the Government. Whether this is a deliberate attempt to hobble debate or merely reflects confusion within the DBCDE we cannot say, but the situation was not lost on Bambauer:
To date, Australia’s transparency regarding its filtering has been poor. The country has vacillated on what material it will target for blocking. This uncertainty makes it difficult for citizens to assess whether the scope of material blocked is appropriate, and whether the set of targeted sites comports with the underlying rationales for censorship. The Labor government is opaque about the types of sites that will be blocked, how a site will be evaluated for filtering, and how those decisions map to larger social and political goals.
Indeed, in another part of the study the author examines the hypothetical 10,000-site blacklist floated by the Government, and wonders whether this figure shows the Government has a real idea of the scope or is merely guessing. “The latter seems more likely,” he concludes.
This confusion not only robs Australians of the ability to judge the merits of the scheme, but also makes it hard to measure the scheme against its stated goal of protecting children. If the target of the filter is now primarily web sites accessed by adults, this suggests that the rationale for Net censorship has changed since the election promises were made. “In short, the Rudd government’s inability, or unwillingness, to elucidate a consistent set of content categories that will be off-limits, either to all Australians or to minors, undermines citizens’ ability to compare concrete plans for filtering to the reasons for implementing it initially.”
On the issue of narrowness, the author examines the dynamic filters tested by the Government and, like other commentators, concludes that such filters come with inherent under- and over-blocking. Furthermore, since commercial filtering products are developed and administered by third parties, discretion over what is blocked may be ceded to the potentially over-broad, built-in lists supplied by independent software vendors. “If the country’s filtering employs vendor-supplied block lists, or allows ISPs to choose which product to implement… then Australia’s controls will inevitably be both under- and overbroad, with implications for access to legitimate information, transparency, and accountability.”
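To see why any list-based approach is simultaneously too coarse and too porous, consider a toy illustration. The sketch below is ours, not the study’s, and the domain names, blocklist and matching logic are entirely hypothetical; real vendor products are far more elaborate, but the structural problem is the same. Blocking by list entry catches legitimate material that happens to share an address with prohibited content, while missing identical content hosted anywhere not on the list.

```python
# Illustrative sketch only: hypothetical domains and logic, not any vendor's
# product, nor the software trialled by the Government.
from urllib.parse import urlparse

# A hypothetical vendor-supplied blocklist, keyed by host name.
BLOCKLIST = {"prohibited-example.org", "mixed-hosting-example.com"}

def is_blocked(url: str) -> bool:
    """Block a URL if its host appears on the vendor blocklist."""
    host = urlparse(url).hostname or ""
    return host in BLOCKLIST

# Over-blocking: a legitimate page is refused simply because it shares a host
# with targeted material.
print(is_blocked("http://mixed-hosting-example.com/health-advice"))      # True

# Under-blocking: the same prohibited material on an unlisted mirror passes.
print(is_blocked("http://unlisted-mirror-example.net/prohibited-page"))  # False
```

Dynamic content analysis, the other approach examined, merely trades list errors for classification errors, so the inherent under- and over-blocking remains.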
Although the study largely refrains from a technical analysis, it makes the point that Australia’s network was designed for openness and transparency, in contrast to other nations such as China and Iran where bottlenecks were deliberately engineered into the network from the beginning.
Technologically, the Rudd government seeks to mandate that ISPs retrofit filtering to a heterogeneous network architecture designed to avoid network blockages of exactly the type that on-line censorship creates. The cost, difficulty, and performance drop that the country’s Internet access suffers as a result of filtering will guide other nations that consider implementing broad, mandatory content restrictions on-line.
The study does take note of one of EFA’s major concerns, that any filtering system, no matter how narrow, “makes later expansion of censorship easier by reducing the cost of blocking additional content.”
Finally, the study looked at the accountability issues surrounding expanded Internet censorship powers, and it is, unsurprisingly, hard to avoid the conclusion that the Government has not made its own accountability a centrepiece of its cyber-safety platform. The lack of clarity on who controls the blacklist undermines citizens’ ability to ensure the scheme is fairly administered.
The planned outsourcing of filtering decisions to unaccountable, overseas third parties such as the Internet Watch Foundation also raises serious issues, and may in some instances contravene existing law by bypassing the ACMA’s complaints-based mechanism. And what are the implications of installing filters, by Government mandate, that make blocking decisions based on software algorithms rather than accountable human judgement?
If filtering is implemented based on software vendors’ decisions about whether content is sexually explicit, rather than on the Classification Board’s judgments, this will decrease the Australian citizens’ ability to have a voice in what they can access on-line.
Some of these concerns could be remedied, perhaps, if the Government could lay out their plans in sufficient detail. Instead, we are left waiting for clarity until after the ISP trial. (What end is served by a trial conducted in such a policy vacuum we cannot say.) The author notes that the Minister’s rhetoric has not helped assuage concerns. His habit of tarring opponents as supporters of illegal material has not gone unnoticed overseas: “While hyperbolic rhetoric is common in democracies, attempts to silence dissenters or to conflate policy differences with support for unlawful behavior undermine accountability.”
Overall, the study concludes that
Accountability problems are inherent in censorship achieved through computer technology. These challenges increase when some voices are magnified, and others silenced, in policy debates, and when content categorization is done by unaccountable (and perhaps foreign) entities. How Australia implements filtering will influence the control its citizens have over on-line content restrictions.
The (draft) study is quite comprehensive, and the Ministry would be well served to read it. It does err, we think, in the amount of power it ascribes to Senator Steve Fielding of Family First in driving the policy. Nevertheless, it reinforces the position of the many stakeholders in Australia who have opposed the filter, not solely on technical grounds or from some misguided sense of cyber-anarchism, but on solid and fundamental legal and democratic principles. We are not the only ones who question the ability of our Government to anticipate, understand and manage the many complex issues surrounding such a radical Internet policy.
In his conclusion, the study’s author makes the following observation:
Filtering looks easy and cheap, and calls to block access to material that is almost universally condemned – such as child pornography, extreme violence, or incitements to terrorism – are hard to resist. But this focus confuses means with ends.
Electronic Frontiers Australia concurs, and awaits clarity on both means and ends from the Government. The Government cannot claim a mandate for such a poorly-defined policy. If it is to have any legitimacy, the public and industry must be informed well in advance of the next stages of the plan.