Media law developments in the third quarter of 2021

I haven’t posted anything for a while, so this is an update post to share some of my recent pieces in the media, along with other developments.

At the beginning of July this year, the Constitutional Court handed down its decision in the CR17 case, in which President Ramaphosa successfully challenged the Public Protector’s report finding that he had acted unlawfully in relation to donations made to his CR17 election campaign. We acted for amaBhungane, which was interested in the case not because it thought the Public Protector’s report was correct, but because it argued that if the Executive Ethics Code did not require disclosure of donations made to internal political campaigns, the Code was unconstitutional. In the piece I authored with Lavanya Pillay, we analyse the Constitutional Court’s decision – available here – which held that the Full Bench of the High Court ought to have considered amaBhungane’s constitutional challenge, having found that the Code did not require disclosure by Mr Ramaphosa. You can read our Daily Maverick analysis, “To disclose or not to disclose”, here. The Full Bench of the High Court acted swiftly in re-enrolling amaBhungane’s application for hearing: the application was heard on 7 September 2021, and judgment is awaited.

The next analysis Lavanya and I did was of the Constitutional Court’s landmark hate speech judgment in the case involving the late Jon Qwelane. The judgment – available here – was handed down at the end of July. It related to a disgusting piece penned by Mr Qwelane in the Sunday Sun in 2008 – the wheels of justice certainly turn slowly. Our “Sticks and Stones” analysis in the Daily Maverick is here. The case is certainly a game changer in relation to the contours of hate speech under our Equality Act. We acted for an amicus in the case, Media Monitoring Africa. On the facts, the Court rightly held that Qwelane’s column was hate speech.

On a related issue, with Advocates Stuart Scott and Tidimalo Ngakane, we assisted a new NGO – the Campaign for Freedom of Expression, headed by Anton Harber – with its submissions (sent on 1 October) on the Prevention and Combating of Hate Crimes and Hate Speech Bill. Anton wrote a neat summary of our submissions on News24 – see here. Essentially, we welcome the hate crimes part of the Bill but query whether criminalising hate speech is the best way to deal with the scourge of hate speech – not least given that existing crimes such as crimen injuria already do the necessary work in this regard.

Then, as the social media giants come under increased scrutiny around the world, the issue of a media company’s liability for publishing the Facebook comments of its readers was examined by the Australian High Court – the equivalent of our Constitutional Court. I wrote a piece in the Sunday Times considering this issue – and the broader issue of social media companies’ liability to defamed individuals. The piece is reproduced below. It also served as a segue from an interesting launch by GIBS in September of a Media Leadership Think Tank, in which I was privileged to participate – see here.

And here’s my Sunday Times piece, published on 26 September (link to the Australian judgment here).

To post or not to post: Holding media and social media companies liable for the comments of others

A few weeks ago, the Australian High Court – the equivalent of our Constitutional Court – handed down a decision which has sent shockwaves through internet publishing. The Court held that newspapers and broadcasters may be responsible for comments posted on their Facebook pages by members of the public.

The case deals with one of the burning questions of digital media law that faces countries around the world including South Africa: should companies who allow others to post content on their pages or platforms be legally responsible for that content?

Depending on how broadly the question is framed, this would cover not only media companies who allow comments on their social media pages, as in the Australian case, but also the tech giants such as Facebook and Twitter, who provide the platform for the content in the first place.

The question is sometimes answered by the legislature. The United States of America has its famous section 230 of the Communications Decency Act, passed in 1996 when the Internet was in its infancy. The objective then was to encourage the free flow of information over the Internet by immunising Internet companies that acted as intermediaries. Providers of what the legislation quaintly calls “interactive computer services” are not regarded as publishers in law, meaning that the big social media companies like Facebook and Twitter are not liable for defamatory content they carry. Shorn of the burdens of being regarded as a publisher, internet companies that qualify for this protection can carry third-party defamatory content with impunity. This is, of course, an approach that is highly protective of the internet giants.

On the other side of the spectrum is the strict common law position, now confirmed in Australia, which in principle imposes liability on media companies for comments posted by readers on their social media pages. In the Australian case, three media companies published links on their Facebook pages to online articles about a former youth detention centre inmate. The articles themselves were not problematic, but the media companies allowed readers to comment on the articles on their Facebook pages, and some of those comments were alleged to be defamatory. While the plaintiff could have sued the members of the public who posted the comments, he chose instead to sue the media companies.

The question before the court was whether the media companies could be regarded as publishers of the comments, even though they did not author or edit them. The court split: five judges held that the media companies bore responsibility for the comments as publishers; two judges disagreed. The majority’s reasoning relied heavily on the wide notion of “publication” adopted under the common law since a case dating back to 1928. The judges concluded that the acts of the media companies in “facilitating, encouraging and thereby assisting the posting of comments by the third-party Facebook users” made them publishers of those comments – not least in the context of the evidence, which showed that the primary purpose of the Facebook pages was to optimise readership of the articles and advertising revenue. The judges thought a good analogy was with live television or talk radio: the broadcaster remained the publisher of the comments.

The minority pointed to the artificiality of saying that the media companies had “published” content they had never seen:

“The act of posting on a public Facebook page starts an electronic conversation, whether long or short, with potentially millions of other Facebook users. A public Facebook page is exposed to receiving potentially thousands of comments from around the world; a Facebook page owner has no actual means of controlling the contents of such comments”.

Now the media companies in the Australian case may yet win: the High Court did not consider whether the comments were indeed defamatory or what defences might apply. But the decision is nevertheless a setback for internet publishers, who may well now decide to shut down comments rather than face the hassle of a lawsuit.

The position in South Africa is unclear. We have had no cases directly on point, possibly because so many of our media companies have discontinued online comments.

The innocent dissemination defence to defamation claims would provide a fair and reasonable response to a claim against a media company in this context: the defence imposes liability on the media company for others’ content only if it acted negligently. This means that if the media or social media company is made aware of a defamatory comment, and acts swiftly to remove it, it will not be liable for the posting of the defamatory comment on its page. A similar regime applies to print and electronic media who are members of the Press Council.

While the “notice and take down” model that this essentially amounts to has its own challenges from a freedom of speech perspective, it is certainly preferable to imposing liability on the media company simply for carrying the content. Nor is it optimal, from the perspective of balancing the constitutional rights to free speech and dignity, to adopt the absolute immunity model which applies in America. Indeed, even Facebook has realised that absolute immunity is not good for business: it has set up the Facebook Oversight Board, colloquially known as Facebook’s Supreme Court. This was the body that held that former President Donald Trump had violated Facebook’s Community Standards by supporting violence, leading to Facebook’s decision to ban Trump from the platform for two years.

The development of the Facebook Oversight Board highlights a lesson we have learnt from the world of traditional media: some sort of credible alternative to state regulation, and to vindicating common law rights in the courts, may well be needed to regulate social media platforms. Ultimately, for the platforms, self-regulation or co-regulation is likely to be far preferable to state regulation from both an accountability and a freedom of expression perspective.