What the Online Safety Bill Means for Social Media

Recently we’ve been breaking down the Online Safety Bill, an updated and strengthened version of which was recently published by the government after going through parliamentary scrutiny. So far we’ve looked at an introduction to the Bill and the changes made since its original draft. Today, we’ll be taking a look at the practical implications for users, so that we can better understand how it will impact our day-to-day lives, if at all.

Firstly, we’ll be looking at the impact the Bill will have on children. Under the new law, organisations that fall within its scope will need to understand the risks their operations pose to children, and take action to stop and prevent illegal activity that threatens children’s safety. Organisations will also be required to ensure that children use their platforms safely and are protected from seeing content that is harmful to them. Examples of the types of content that should be blocked from being accessed by children are pornography and material promoting self-harm. In addition, organisations will need to prevent children from accessing harmful content, which will include things like bullying. In a future article, we’ll go into more detail on how platforms can and should carry out the requirements that the new law will set out.

One way of putting more control in the hands of children (and their parents) is the ability to much more easily report content and activity that has been harmful. Platforms will then be required to take proportionate action in response to these reports. In the case of child sexual exploitation and abuse content, organisations will have an obligation to report anything they find to the National Crime Agency, and to do their best to assist the authorities in their investigations. Clearly, when it comes to children, the Online Safety Bill is looking to drastically improve the experience children have online. Removing harmful content such as pornography, material promoting self-harm and bullying is a key goal, and allowing users to report such content simply and efficiently will be key to getting rid of the content that is so harmful to children.

The Bill will also bring more support for adults when it comes to staying safe online. More obligations will be placed on relevant organisations to tackle illegal content on their platforms and remove it. Easier reporting controls will also need to be put in place, so that adult users can report illegal content quickly and the organisation can act to remove it. At the terms-and-conditions level, companies will also need to be clearer about what they will and won’t allow on their respective platforms, and to let users complain easily should there be content that violates the terms of use. It has been clarified that the categories of content that will need to be addressed in organisations’ terms and conditions will be outlined in secondary legislation and approved by Parliament.

On larger sites and platforms, it seems that adult users will have greater control over what they see, and will be able to choose types of harmful content that they do not wish to see. For example, a user who finds content relating to self-harm particularly damaging to their mental health will be able to prevent their feed from showing them that type of content. This draws a clear distinction in how companies will need to treat illegal versus harmful content. Whilst illegal content should be removed as quickly as possible and prevented from being uploaded in the first place, the approach to harmful content seems to be giving adult users greater control and the option not to see it, as opposed to preventing users from posting the content altogether. The aim is to allow adult users to make more informed decisions about the content they see, and to ensure that organisations have the controls in place to let adult users achieve this.