In today's digital era, individuals are free to share their thoughts and content on websites owned by others. However, posting false or harmful statements on the internet can carry legal consequences. Whether platform owners should be shielded from liability for content posted by third-party users is a fundamental question in the digital landscape. Supporters argue that such protection is vital for encouraging the growth of online platforms, facilitating free expression, and preventing undue restrictions on user-generated content. Conversely, critics contend that unlimited immunity may contribute to the proliferation of harmful content and hinder accountability. Striking a balance between fostering digital innovation and ensuring responsible online conduct remains at the heart of this ongoing debate.

US: Section 230 Defence

In the US, Section 230 of the Communications Decency Act 1996[1] ("Section 230") serves as a legal safeguard, offering online platform owners protection from legal repercussions concerning user-generated content. This provision empowers online platform owners to moderate content without incurring legal liability for the material posted by users.

However, it is worth noting that the Section 230 defence does not provide total immunity to online platform owners; instead, it is subject to a three-stage test:[2]

  1. Is the defendant a provider or user of an interactive computer service?
  2. Is the plaintiff attempting to hold the defendant liable as a publisher or speaker?
  3. Is the plaintiff's claim based on content posted by another information content provider?

In essence, the Section 230 defence comes into play only when a plaintiff sues an interactive computer service provider, seeking to hold it liable as a publisher for statements made by its users.

Additionally, it is essential to recognise that this defence may not extend to generative Artificial Intelligence (AI) systems like ChatGPT, as such systems are likely to be classified as information content providers rather than mere interactive computer services.[3] Nonetheless, establishing whether a defendant qualifies for Section 230 protection entails a close examination of how the specific technology is applied in the context of a particular case.

Malaysia: Section 114A, Evidence Act 1950

In Malaysia, Section 114A of the Evidence Act 1950 provides that a person is presumed responsible for the content of a publication if their name, photo, or online alias is associated with that publication as the owner, host, administrator, editor, or contributor, unless they can prove otherwise.

In the Federal Court case of Peguam Negara Malaysia v Mkini Dotcom Sdn Bhd & Anor,[4] internet users published critical comments on an online news portal, leading to legal action by the Attorney General against the respondents, namely the company that owns the online news portal (the 1st respondent) and the editor-in-chief of the portal (the 2nd respondent).

Relying on this provision, the Federal Court held that Section 114A of the Evidence Act 1950 establishes a rebuttable presumption: the existence of the basic fact is sufficient to hold the principal actor (in this case, the 1st respondent, as host of the publication) liable for the publication of the comments. There is no requirement for the Attorney General to prove an intention to publish on the part of the respondents. While the 1st respondent was ordered to pay a fine of RM500,000.00 within three days, the 2nd respondent was not held liable due to the lack of evidence linking him to the comments.

Other Jurisdictions

In contrast, countries such as the UK and Australia have no provisions equivalent to Section 230 of the Communications Decency Act 1996 or Section 114A of the Evidence Act 1950, leaving online platform owners in those jurisdictions subject to different legal standards of liability for user-generated content.

In Australia, under the newly enacted Online Safety Act 2021, the eSafety Commissioner is tasked with overseeing online content and may issue notices to the relevant service providers requiring the removal of certain content. Compliance with such removal notices is mandatory within the specified timeframe.[5] Failure to do so may attract a penalty expressed in penalty units, with the actual penalty value calculated by multiplying the number of penalty units by the penalty unit value determined by each jurisdiction within Australia.[6]
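By way of illustration only (the figures below are assumed for the purposes of the arithmetic and are not drawn from the Act): if non-compliance with a removal notice carried a maximum of 500 penalty units and the applicable penalty unit value were AUD 222, the maximum penalty would be:

500 penalty units × AUD 222 per unit = AUD 111,000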

In the UK, website operators can be held liable for defamatory third-party content only under specific conditions. The claimant would, amongst other things, be required to demonstrate that "the person who posted the statement was not identifiable", that the claimant issued a "notice of complaint" to the operator, and that the operator did not respond to the notice of complaint.[7]

Conclusion

With the Communications and Multimedia Act (CMA) amendment bill expected to be tabled by early 2024,[8] we stand at a critical juncture. To navigate the evolving digital landscape and promote responsible online conduct, it is essential to reconsider Malaysia's approach to online content liability. Instead of rigidly adhering to the presumptive liability stipulated under Section 114A of the Evidence Act 1950, or simply transplanting Section 230 from the US, which may afford too much protection to online platform owners, we should explore a more nuanced and adaptive framework.

This framework should take into account various factors, including the platform owner's actual awareness of the comments on its platform and the measures it has taken to deal with offensive content. Ultimately, the aim should be to strike the right balance between fostering online innovation, managing harmful content effectively, and preserving online freedoms.

Footnotes

1. Title V, U.S. Telecommunications Act of 1996 (Pub.L. No. 104-104, 110 Stat. 56); codified as 47 U.S.C. § 230 (CDA).

2. Kathleen Ann Ruane, 'How Broad A Shield? A Brief Overview of Section 230 of the Communications Decency Act' <https://sgp.fas.org/crs/misc/LSB10082.pdf> accessed 1 October 2023

3. Matt Perault, 'Section 230 Won't Protect ChatGPT' <https://www.lawfaremedia.org/article/section-230-wont-protect-chatgpt> accessed 26 September 2023

4. [2021] 3 CLJ 603

5. Online Safety Act 2021, Sections 109, 110, 114, 115

6. Online Safety Act 2021, Sections 111, 116

7. Defamation Act 2013, Section 5

8. 'Communications and Multimedia Act amendment bill to be tabled by early 2024' Free Malaysia Today (17 September 2023) <https://www.freemalaysiatoday.com/category/nation/2023/09/17/communications-and-multimedia-act-amendment-bill-to-be-tabled-by-early-2024/>

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.