CIO

Law Council: Turnover-based fines for social media platforms set ‘dangerous precedent’

President Arthur Moses claims proposed post-Christchurch legislation will have a ‘chilling effect’ on business

Government plans to fine social media companies up to 10 per cent of their annual turnover for failing to quickly remove violent material from their platforms set a “dangerous precedent”, according to the president of the Law Council of Australia.

Arthur Moses SC said today the proposed policy was “bad for certainty and bad for business” and was being rushed through Parliament without proper consultation.

“Bad and ineffective legislation is enacted when it is a knee-jerk emotional reaction to a tragic event,” Moses said in a statement from the council, which represents 65,000 Australian lawyers.

On Saturday, Prime Minister Scott Morrison announced new legislation – to be introduced into Parliament this week – aimed at preventing the live-streaming of violent crimes online, as was the case in the Christchurch terror attack.

“Big social media companies have a responsibility to take every possible action to ensure their technology products are not exploited by murderous terrorists. It should not just be a matter of doing the right thing. It should be the law,” Morrison said.

The proposed legislation will make it a criminal offence for “social media platforms not to remove abhorrent violent material expeditiously,” punishable by three years imprisonment or “fines that can reach up to 10 per cent of the platform’s annual turnover”. Juries would decide whether the content was removed fast enough.

Moses said the legislation was “being thought up on the run without any proper consultation with the companies that will be bound by it and lawyers who will be asked to advise on it”.

“It will lead to difficulties with sentencing and mean companies will be punished by reference to their size rather than the seriousness of their breach … Such an approach to penalties, if used as a precedent for other areas of government regulation, could have a chilling effect on businesses investing in Australia or providing their services in this country,” Moses said.

'A serious step'

According to Facebook, a live stream from the head-cam of the Christchurch attacker was viewed fewer than 200 times during the broadcast, and about 4000 times in total before being removed. But users shared and re-uploaded the video to the platform: Facebook said it had removed 1.5 million videos of the attack globally, 1.2 million of them at upload.

On Friday, the company banned praise, support and representation of white nationalism and white separatism on its platforms.

YouTube said last month it had removed “tens of thousands of videos and terminated hundreds of accounts created to promote or glorify the shooter”, and had been working “around the clock” to rid the platform of the footage.

Twitter said it had created new automation tools to improve its “proactive removal powers”.

“More than 90 per cent of terrorist content on our service is now removed proactively using our own purpose-built proprietary technology, and the majority of accounts are suspended before their first Tweet. As the Christchurch attack shows, our work will never be complete,” a Twitter spokesperson said.

Despite these responses, the firms have been roundly criticised for failing to do enough to curb hateful content being spread on their networks. On Tuesday, Attorney-General Christian Porter met with representatives from Facebook, Google and Twitter to threaten legislative action, after calling their response “thoroughly underwhelming”.

“The time that it took Facebook to act with respect to the Christchurch events was totally unreasonable,” Porter said after the meeting.

However, Moses questioned whether the demands being made of the social media companies were reasonable.

“We also need to be sensible when working on these offences and not demand of social media companies what they cannot reasonably be expected to do. A machine cannot easily pick up the difference between a computer game and online live streaming. The algorithms may need time to be developed, assuming they can be,” he said.

“Parliament making social media companies and their executives criminally liable for the live streaming of criminal content is a serious step which needs to be thought through carefully, including what defences will be available,” Moses added.