Update, 4/4: Australia’s Parliament has passed the amendments.
Update, 4/3: The draft legislation is now available online here.
AUSTRALIA — The Australian Government has announced the introduction of a new bill aimed at imposing criminal liability on social media platform executives who fail to remove “abhorrent violent content.” The hastily drafted legislation could have serious unintended consequences for human rights in Australia.
The rushed and secretive approach, the lack of proper open, democratic debate, and the placement of far-reaching and unclear regulatory measures on internet speech in the criminal code are all matters of grave concern for digital rights groups, including Access Now and Digital Rights Watch.
“Poorly designed criminal intermediary liability rules are not the right approach here, which the Government would know if it had taken the time to consult properly. It’s simply wrong to assume that an amendment to the criminal code is going to solve the wider issue of content moderation on the internet,” said Digital Rights Watch Chair, Tim Singleton Norton.
The lack of any public consultation is particularly worrisome, as it suggests that impacts on human rights were likely not considered by the government when drafting the text. Forcing companies to regulate content under threat of criminal liability is likely to lead to over-removal and censorship, as companies do whatever they can to avoid jail time for their executives or hefty fines on their turnover. Worryingly, the bill could also encourage online companies to constantly surveil internet users by requiring proactive, general content monitoring, a measure that would be a blow to free speech and privacy online.
“Reforming criminal law in a way that can heavily impact free expression online is unacceptable in a democracy. If Australian officials seek to ram half-baked fixes through Parliament without proper expert advice and public scrutiny, the result is likely to be a law that undermines human rights. Last year’s encryption-breaking powers are a prime example of this,” said Lucie Krahulcova, Australia Policy Analyst at Access Now.
“Regulating online speech in a matter of days is a tremendous mistake. Rather than pushing through reactionary proposals that make for good talking points, the Australian government and members of Parliament should invest in a measured, participatory process carefully aimed at achieving their legitimate public policy goals,” added Ms. Krahulcova.
“The reality here is that there is no easy way to stop people from uploading or sharing links to videos of harmful content. No magic algorithm exists that can distinguish footage of a violent massacre from videos of police brutality. The draft legislation creates a great deal of uncertainty that can only be dealt with by introducing measures that may harm important documentation of hateful conduct. In the past, measures like these have worked to harm, rather than protect, the interests of marginalised and vulnerable communities,” said Mr. Singleton Norton.
“This knee-jerk reaction will not make us safer or address the way that hatred circulates and grows in our society. We need to face up to the cause of this behaviour, and not look for quick fixes and authoritarian approaches to legislating over it,” he concluded.