Why We Need to Regulate Online Platforms and What States Are Missing

Digital regulation has become both increasingly talked about and increasingly implemented across the globe. In 2022, the European Union passed the Digital Services Act, while in 2023 Britain passed the Online Safety Act, which sought to shift existing liability rules and place new expectations on platforms to moderate content. In the United States, there are bipartisan attempts at both state and federal level to introduce new regulations targeting social media platforms. Yet most new regulations neither effectively recognise the global responsibility of online platforms nor go far enough in standardising content moderation.

Regulation of online platforms is something everyone should take a strong interest in. Today the world is increasingly online, with 61.5% of the world's population using a social media platform, accounting for 90% of all internet users. Social media is a crucial tool for modern society: in Britain, it has become the third most common source of people's news, behind news websites and national TV news. The high usage of, and importance placed on, social media indicate that online platforms are essential to how people interact with the world. For that reason, the fairness of these regulations and their ability to keep people safe online are paramount.

History of Digital Regulation for Platforms

Since the 1990s, online platforms have benefited from a large degree of freedom stemming from the Communications Decency Act of 1996, a United States law. Section 230 of this law established the principle that platforms are not responsible for the content that users upload or post, with a limited number of extreme exceptions. This means that a social media platform is not held liable if someone posts something libellous on its website, because the platform does not explicitly exercise editorial control over what appears there.

These regulations were created in a world vastly different to today's. In 1996, less than half of Americans had a computer, and of those who did, just a third used it to access the internet. Crucially, this law predated Twitter, Facebook, Wikipedia, YouTube, and Google – the most commonly used websites across the world today. Section 230 was therefore not designed for today's internet: lawmakers were forced to anticipate the problems of the future rather than react to them. These platforms developed their policies around the freedom afforded by Section 230, avoiding the burden of editorial responsibility. This has given their users freedom, but it has equally put them at risk. Yet the widespread use of algorithms across most social media platforms supports the argument that these platforms do exercise some degree of editorial control, because they choose the parameters of what gets promoted and what is ignored.

Why Regulation for Online Platforms Must Go Further

Platforms themselves are inherently political actors, because the decision to moderate is itself political. Social media has become a tool which empowers individuals to communicate their struggles to the world, and the Syrian Civil War is a vivid example. For Syrians trying to share videos of the violence and crimes committed by the different participants in the conflict, content moderation prevented the world from seeing their plight: algorithms and moderators on platforms such as YouTube and Facebook flagged and removed this content, leading to criticism that information which could be used to hold participants to account is being lost. Although the original legislation aimed to protect freedom of expression, the most positive aspects of that freedom are lost when content moderation is applied extensively and without appreciation of context.

Not only have online platforms undermined the voices of victims of violence, but they have also helped to perpetuate violence itself. In the years preceding the genocide of Rohingya Muslims in Myanmar, Facebook became an echo chamber of content targeting this vulnerable group. This contributed to the incitement of violence and atrocities which left almost a million refugees fleeing across the border to Bangladesh, while countless homes were burnt and thousands became victims of rape and mass killings. Facebook has itself admitted that its approach was mistaken and negligent in protecting human rights. As a result, it shifted its policy, commissioning an independent assessment and creating a company human rights strategy.

The control social media companies have over content moderation, and their own acceptance that moderation failures have negative real-world effects, indicate that regulation introducing a set of standards is essential. As the political nature of content moderation is increasingly revealed, tension grows with those who stand to lose from it. In opposition to rising moderation on Twitter/X, breakaway alternatives such as Parler and Truth Social demonstrated that while one platform can introduce changes, the internet is vast and other platforms can be created with different understandings of what is acceptable.

The lack of consensus over what level of moderation is acceptable, and the high-stakes game playing out in how online platforms self-regulate, mean that governments have a responsibility to resolve this dilemma. As these decisions are political, states are the actors who wield the political legitimacy to decide what should or should not be allowed. Given the international scope of social media, and the fact that the three most used platforms are based in liberal democracies, those democracies also have an obligation to introduce regulations that protect both domestic and global citizens. Britain, the EU, and the United States must recognise this international dimension and seek to build regulation that goes further in the obligations it places on social media companies.
