Reposted from Lewis Silkin - AdLaw

Ofcom issues new guidance for Video Sharing Platforms to address 'harmful' content

Ofcom has issued new guidance for video sharing platforms to better protect users from harmful content. 

This guidance is not part of the Online Safety Bill (formerly the Online Harms Bill), which is still taking shape. However, Ofcom will also be the regulator for online harms under that Bill, so the guidance may offer some useful insight into its likely approach.

Video-sharing platforms (VSPs) are a type of online video service where users can upload and share videos with other platform users. They allow people to engage with a wide range of content and social features.

VSPs established in the UK (including some big names such as Snapchat, TikTok, Twitch and Vimeo) are required by law to take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as from certain types of criminal content.

Ofcom says its best practice guidance is designed to help companies understand their new obligations and judge how best to protect their users from this kind of harmful material. It has also published a short explainer guide for industry on the new framework for video-sharing platforms.

Approach to regulation

Ofcom’s role is to enforce the rules set out in legislation and hold VSPs to account. Unlike its role in regulating broadcast content, Ofcom will not assess individual videos when regulating VSPs; the sheer volume of online content makes it impossible to prevent every instance of harm.

Instead, the laws focus on the measures VSPs must take, as appropriate, to protect their users – and they are given a good degree of flexibility in how they do that. To help them meet their obligations to protect users, Ofcom's guidance sets an expectation that VSPs should:

  • Provide clear rules around uploading content. Uploading content relating to terrorism, child sexual abuse material or racism is a criminal offence. Platforms should have clear, visible terms and conditions which prohibit this – and enforce them effectively.
  • Have easy reporting and complaint processes. VSPs should put in place tools that allow users to flag harmful videos easily. They should signpost how quickly they will respond, and be open about any action taken. Providers should also offer a route for users to formally raise concerns with the platform and to challenge its decisions. Ofcom considers this vital to protecting the rights and interests of users who upload and share content.
  • Restrict access to adult sites. VSPs that host pornographic material should have robust age-verification measures in place to prevent under-18s from accessing such material.

Through all of this, Ofcom must also balance users' right to freedom of expression.

Horizon scanning

Ofcom has also set out five priorities for the year ahead, which are:

  1. working with VSPs to reduce the risk of child sexual abuse material;
  2. tackling online hate and terror;
  3. ensuring an age-appropriate experience on platforms popular with under-18s;
  4. laying the foundations for age verification on adult sites; and
  5. ensuring VSPs’ processes for reporting harmful content are effective.
