Regulation of social media intermediary liability for illegal and harmful content

Authors

  • Murdoch Watney University of Johannesburg

DOI:

https://doi.org/10.34190/ecsm.9.1.104

Abstract

The discussion focuses primarily on the manner in which the distribution of social media content should be governed to limit illegal and harmful content (e.g. by making it inaccessible to children) whilst ensuring freedom of expression and speech. Closely linked to intermediary liability is the manner in which social media platforms self-regulate harmful content on their platforms. In September 2021 criticism was leveled at Facebook that it was not doing enough to prevent harmful content targeting children. It appears that self-regulation of social media content by companies is not effective. In some instances, it is alleged that profit outweighs safety and security concerns. Zuckerberg, CEO of Facebook, indicated in testimony to the United States Congress in March 2021 that he was of the opinion that social media companies should be regulated. Governments are now in the process of implementing regulations pertaining to social media content. Such regulation will have a ripple effect across the world, and other countries will most probably follow. In this regard, the European Commission is considering the implementation of the Digital Services Act.

Since governments are implementing or considering implementing social media regulations, consideration should be given to the following issues:

• How and what should a government regulate pertaining to social media content?

• The issue of social media intermediary liability for harmful content is contentious. A social media platform cannot escape liability where it is used, for example, for sex trafficking, but how is liability determined? In March 2021 Zuckerberg made the following statement: “platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Platforms should not be held liable if a particular piece of content evades its detection—that would be impractical for platforms with billions of posts per day—but they should be required to have adequate systems in place to address unlawful content.” Zuckerberg’s submission to the US Congress is tied to the common law standard of duty of care. In the US, businesses have a common law duty to take reasonable steps not to cause harm to their customers, as well as to take reasonable steps to prevent harm to their customers.

• It is important to establish which human rights safeguards government regulation of content should have in place. Government regulations must provide a safe and secure place for free speech while addressing harmful speech, but they should not silence all speech.

Many governments are now concluding that self-regulation of harmful content by social media companies is, for many reasons, not effective. The Internet is borderless, and government regulations of social media content must therefore comply with uniform guidelines to ensure free speech protection for the global communication environment.

Published

2022-04-28