
Ovum view

Summary

The nature of online platforms presents a number of issues for governments and regulators looking to develop a robust regulatory regime. While it is clear that regulatory intervention is required to protect the public against harmful content distributed over the internet, regulators face several challenges that need to be overcome, including the scale of information generated on platforms, the sheer variety of content types, the limited involvement of platforms in content creation, the variations in the nature and features of online services, and the multinational nature of platform operators.

The UK government announced plans in April 2019 to create new laws to make the UK "the safest place in the world to be online." Since then, Ofcom has published several papers on the issue, including a report on October 28, 2019, on online market failures and harms. The report examines the challenges and opportunities in regulating online services.

Regulators must consider several challenges when regulating harmful online content

It is crucial to strike the right balance between ensuring the safety of users and protecting freedom of expression. This is central to the debate, but finding the right approach will be challenging. Introducing too many restrictions on platforms can be detrimental to users, and oversight should not amount to censorship, with regulators or platforms deciding what can and cannot be shown on the internet. It would also be unworkable to make online companies directly legally liable for harmful content published on their sites, given the large volume of content involved. This would likely result in platforms introducing many restrictions for users, and possibly automatically removing or refusing to publish content even when it complies with legal standards, thereby curtailing freedom of expression.

However, it is also important to protect the public from harm, and it is clear that companies do need to be responsible for tackling a comprehensive set of online harms involving illegal activity and content, such as terrorism, child sexual exploitation and abuse, and inciting or assisting suicide. That said, it is far more challenging to define and respond to behaviors that may not be illegal but are still very damaging to individuals or threaten the way of life in the UK. In this case, a proportionate and risk-based approach is necessary, one that prioritizes action to tackle activity or content where there is the greatest evidence or threat of harm, or where children or other vulnerable users are at risk.

Considering the broad scope of online content, it will be challenging to regulate an exhaustive list of specific harms and to ensure there are sufficient enforcement procedures and proportionate, targeted redress for all types of breaches. The focus needs to be on the systems that companies must put in place to ensure compliance. The nature and features of online services also vary widely, which could pose a problem for regulators. These differences include the level of control platforms give users over what content is seen, their relationship with users, and how quickly their services evolve. The evolving nature of platforms means regulatory approaches need a degree of flexibility to remain responsive to change and future-proof.

Additionally, regulation must be designed to support innovation rather than unintentionally restrict it. Striking the right balance between protecting users and minimizing the burden on online companies, particularly small businesses, will be vital.

Not only is the diversity of online content very high, but so is the volume of content being generated and shared on online platforms. This high volume and varied nature of content mean that extending existing frameworks, such as broadcasting regulations, would not be suitable in their entirety, even though some elements, such as the protection of minors or protection from illegal content, do have applicability. In particular, regulators will need to consider that audience expectations regarding context, accuracy, and impartiality differ between broadcast and online settings. Broadcasting standards might be undesirable or impractical to apply online, so a focus on transparency might be more suitable. Public expectations of protection relating to conversations between individuals may also be very different from those relating to content published by organizations. To overcome this challenge, regulators will need to consider context carefully to create a proportionate and effective response, and the focus should be on how quickly platforms address harmful content.

In addition, many platforms do not create content themselves, but they often do play a role in determining what users see, whether via a search query or through the algorithms behind a news feed. User-generated content is typically published the moment it is submitted, which means it is often only moderated after the fact, if it is flagged as potentially harmful by other users or by the platform's algorithms. Moderation by platforms prior to publication is unlikely to be practical, given the volume of information published online, and may not even be desirable, given the potential implications for freedom of expression.

Finally, the multinational nature of online services means that many platform operators are not based in the UK, which has implications for enforcement. Challenges include content uploaded by users abroad but delivered to UK users, the use of servers located outside the UK, and companies having no UK presence. Addressing these issues will require collaboration with regulatory authorities in other countries to ensure these services are appropriately regulated.

Appendix

Further reading

"Ireland attempts to regulate harmful online content without impacting freedom of speech," GLB005-000166 (June 2019)

"In a world first, social media platforms are to be subject to a legal "duty of care" in the UK," GLB005-000149 (April 2019)

Author

Sarah McBride, Analyst, Regulation

sarah.mcbride@ovum.com
