Reposted from Lewis Silkin - Digital, Commerce & Creative

The time to get ready for the new Online Safety Act is NOW - and here's how

The Online Safety Act applies to a range of online services, including those that allow users to share content or interact with other users, and it sets out new rules aimed at keeping all users in the UK safe, especially children.

The rules apply to organisations big and small, even ‘micro-businesses’ with limited resources, as well as individuals who run an online service. They apply to services where people can create and share content, or interact with each other (i.e., ‘user-to-user services’); to services where people can search other websites or databases (‘search services’); and to individuals or businesses that publish or display pornographic content.

And it isn't just UK businesses that have to pay attention. In fact, it doesn’t matter where you or your business is based: the new rules will apply to you (or your business) if the service you provide has a significant number of users in the UK, or if the UK is a target market.
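
As a rough rule of thumb, the scoping question can be thought of as a simple two-part test: is the service a regulated type, and does it have links to the UK? The sketch below (in Python) is our own simplification for illustration only - the function and parameter names are not from the Act, and this is not legal advice.

    # Illustrative only: a simplified sketch of the scoping test, not legal advice.
    def in_scope(service_type: str,
                 significant_uk_users: bool,
                 uk_is_target_market: bool) -> bool:
        # The Act bites when a regulated service type has links to the UK.
        regulated_types = {"user-to-user", "search", "pornography publisher"}
        has_uk_link = significant_uk_users or uk_is_target_market
        return service_type in regulated_types and has_uk_link

    # A US-based forum targeting the UK market is in scope; a plain blog is not.
    print(in_scope("user-to-user", significant_uk_users=False, uk_is_target_market=True))  # True
    print(in_scope("blog", significant_uk_users=True, uk_is_target_market=True))           # False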

Most of the new rules come into effect in late 2024, and today, Ofcom published some useful ‘quick guides’.

1  Quick Guide to Online Safety Risk Assessments

Risk assessments are a new legal duty for most services regulated under the Online Safety Act. Ofcom is currently consulting on its draft guidance for illegal content risk assessments, so the detail is subject to change, but for the time being it gives businesses a useful steer on what they need to do to prepare for the new legal obligations that will be placed on them.

The detailed guidance is available here, but in summary the headlines are as follows.

You will need to consider how illegal harm could take place on your service. This means you will need to:

  • Step one: Understand the harms - Identify the illegal harms that need to be assessed, taking into account a list of risk factors Ofcom has published
  • Step two: Assess the risk of harm - In the context of your own service, you’ll need to: (a) Consider any other characteristics of your service that may increase or decrease risks of harm, (b) Assess the likelihood and impact of each kind of harm, (c) Assign a risk level for each kind of illegal harm, and (d) Consider additional guidance on the risks of certain harms.
  • Step three: Decide measures, implement and record - You’ll need to decide on the appropriate online safety measures for your service to reduce the risk of harm to individuals. You will also need to consider any additional measures that may be appropriate on your service to protect people, then implement all safety measures, and record the outcomes of the risk assessment.
  • Step four: Report, review and update risk assessments - You will be required to report on the risk assessment and measures via relevant governance channels, monitor the effectiveness of mitigation measures, and review (and update) your risk assessment from time to time (a rough sketch of how these outcomes might be captured follows this list).
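
For teams that want to track this work systematically, here is a minimal sketch in Python of how the outputs of the four steps above might be recorded. It is purely illustrative: the class names, fields and helper method are our own, not Ofcom's, and nothing here substitutes for the draft guidance itself.

    from dataclasses import dataclass, field
    from enum import Enum

    class RiskLevel(Enum):           # Ofcom's draft guidance uses low/medium/high
        LOW = "low"
        MEDIUM = "medium"
        HIGH = "high"

    @dataclass
    class HarmAssessment:
        """One kind of illegal harm, assessed in context (step two)."""
        harm: str                    # e.g. "fraud", "grooming", "hate offences"
        likelihood: RiskLevel        # how likely the harm is on this service
        impact: RiskLevel            # how serious it would be for users
        risk_level: RiskLevel        # overall level assigned to this harm

    @dataclass
    class RiskAssessmentRecord:
        """Outcome of the assessment (step three) plus review metadata (step four)."""
        service_name: str
        assessments: list[HarmAssessment] = field(default_factory=list)
        measures: list[str] = field(default_factory=list)
        last_reviewed: str = ""      # e.g. an ISO date, updated on each review

        def harms_needing_measures(self) -> list[str]:
            # Surface the harms assessed as medium or high risk.
            return [a.harm for a in self.assessments
                    if a.risk_level in (RiskLevel.MEDIUM, RiskLevel.HIGH)]

    record = RiskAssessmentRecord(
        service_name="example-forum",
        assessments=[
            HarmAssessment("fraud", RiskLevel.MEDIUM, RiskLevel.HIGH, RiskLevel.HIGH),
            HarmAssessment("hate offences", RiskLevel.LOW, RiskLevel.MEDIUM, RiskLevel.LOW),
        ],
        measures=["content moderation team", "user reporting channel"],
        last_reviewed="2023-11-09",
    )
    print(record.harms_needing_measures())  # ['fraud']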

2  Quick guide to online safety codes of practice

Under the Online Safety Act, providers of online services have new duties to keep people safe from harm. One way they can do that is to adopt the safety measures in Ofcom's codes of practice. Ofcom is currently consulting on its draft codes for illegal harms, so they are subject to change, but Ofcom has given a brief outline of the measures it will propose in its first codes.

The details, published today (9 November), are available here.

In outline:

Removing illegal content, managing risks, updating your terms

The safety duties for illegal content focus on keeping people safe online. Ofcom says this is about making sure you have the right measures in place to protect people from harm that could take place on your service. The approach will vary depending on the type of service:

If you have a user-to-user service, it means you will need to:

  • take proportionate steps to prevent your users encountering illegal content;
  • mitigate and manage the risk of offences taking place through your service;
  • mitigate and manage the risks identified in your illegal content risk assessment;
  • remove illegal content when you become aware of it, and minimise the time it is present on your service; and
  • explain how you’ll do this in your terms of service.

If you have a search service, it means you'll need to:

  • take proportionate steps to minimise the risk of your users encountering illegal content via search results;
  • mitigate and manage the risks identified in your illegal content risk assessment; and
  • explain how you’ll do this in a publicly available statement.

Ofcom makes clear that businesses can decide for themselves how to meet the specific legal duties, but one way to comply is to use the measures set out in Ofcom’s codes. Ofcom's draft codes for illegal content set out a range of measures in areas including content moderation, complaints, user access, design features to support users, and the governance and management of online safety risks. Some measures are targeted at addressing the risk of certain kinds of offences, such as child sexual abuse material (CSAM), grooming and fraud. Other measures help to address a variety of offences.

Ofcom's codes of practice set out a range of measures that apply to different services

The Act is clear that the safety measures services need to put in place should be proportionate. Different measures in the draft Codes would apply to different services based on factors such as the type of service (user-to-user or search); the features of the service; the number of users; and the results of the business or individual's own illegal content risk assessment.

  • Some measures apply to all services, including ensuring someone is responsible for online safety compliance and ensuring your terms of service (or publicly available statements) are easy to find.
  • Some measures apply to large services*, such as using specific automated tools to detect content which could lead to fraud or the sharing of child sexual abuse material, training staff working in content moderation and so on.

*In its Codes, Ofcom has defined a “large service” as a service which has an average user base of 7 million or more per month in the UK. This is equivalent to approximately 10% of the UK population.

  • Other measures apply to services that are medium or high risk. When completing your illegal content risk assessment, you will need to decide, as accurately as possible, whether your service is low, medium or high risk for each type of illegal harm; Ofcom has provided draft guidance on how to do this (see above). Lower risk services will have the lightest requirements. There may also be ‘single risk’ services that have only one type of potentially harmful content (such as the risk of hate and harassment offences); if that type of potential harm is medium or high risk, the business will have to comply with stringent requirements, but only for that particular type of potentially harmful content. A service with two or more types of potentially harmful content is defined by Ofcom as a ‘multi-risk service’, and additional measures will apply (see the sketch below).
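
To make the categorisation concrete, here is a short illustrative sketch of how the large-service threshold and the single-risk/multi-risk distinction fit together. The 7 million figure comes from Ofcom's draft Codes; the function names and the simplified decision logic are our own assumptions, not Ofcom's tests.

    # Illustrative only: the threshold is from Ofcom's draft Codes, but the
    # function names and the simplified decision logic are our own.

    LARGE_SERVICE_THRESHOLD = 7_000_000  # 7m+ average monthly UK users (~10% of the population)

    def is_large_service(avg_monthly_uk_users: int) -> bool:
        return avg_monthly_uk_users >= LARGE_SERVICE_THRESHOLD

    def risk_profile(harm_levels: dict[str, str]) -> str:
        """Classify a service from its per-harm risk levels ("low"/"medium"/"high")."""
        elevated = [h for h, level in harm_levels.items() if level in ("medium", "high")]
        if not elevated:
            return "low risk"
        if len(elevated) == 1:
            return f"single risk ({elevated[0]})"  # stringent measures, but only for this harm
        return "multi-risk"                        # additional measures apply

    print(is_large_service(9_500_000))                                # True
    print(risk_profile({"hate offences": "medium", "fraud": "low"}))  # single risk (hate offences)
    print(risk_profile({"CSAM": "high", "fraud": "medium"}))          # multi-risk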

Consultation

You can read Ofcom's draft codes in full, and participate in the consultation, now:

Consultation: Protecting people from illegal harms online - Ofcom

 

Tags

online safety act, online safety bill, ofcom