2 minute read
Reposted from Lewis Silkin - AdLaw

In case you missed it: Stand by for a new online harms regulator...

Or rather an old regulator with a new job description, because the UK Government has now officially decided to appoint Ofcom as the UK regulator of 'online harms'.

The Government had flirted with the idea for some time, strongly hinting for the past year or so that it was "minded" to appoint Ofcom to this role, while keeping its powder dry in case it ultimately decided to establish a brand new regulator to fulfil the brief.

Now, Ofcom's role is official. It can finally update its relationship status from "it's complicated" to "engaged".

Ofcom has extensive experience of tackling harmful content and supporting freedom of expression in a different context, through its role regulating TV and radio programmes. It is also the regulator for UK-based video sharing platforms.

In due course, Ofcom will take on its new responsibilities to protect children and vulnerable people online. It has said it will invest in new technology and recruit people with 'the right skills' as it moves into this new role.

In the meantime, Ofcom will work with the Government and Parliament as they develop the necessary legislation... no mean feat!  

Ofcom has promised to set out its "initial thinking" on its approach to regulating online harms this year. It says it will focus particular attention on tackling the most serious harms, including illegal content and harms affecting children.

The thorny issue of what will count as "harmful content", beyond illegal content, and the role and powers of Ofcom as the new regulator in this space, will be addressed in the forthcoming Online Safety Bill, expected in 2021.

Ofcom has been clear that it won’t be responsible for regulating or moderating individual pieces of online content. Instead, the Government’s intention is that online platforms should have appropriate systems and processes in place to protect users. Ofcom's role will be to ensure those systems are adequate and properly applied, and to take action against online platforms if they fall short. It remains to be seen how this will be done without getting into the relative merits (and into the weeds) of particular matters of public policy or controversy.

It is expected that the new regulatory framework will apply to companies whose services host user-generated content accessible by people in the UK and/or allow users to interact with one another online. This includes the obvious 'big players' among social media platforms, as well as search engines and online marketplaces. Online journalism from news publishers' websites, as well as B2B service providers, is expected to be exempt.

Late in 2020, the Government trailed the idea of imposing a duty of 'due impartiality' on social media platforms, not unlike the duty imposed on TV and radio services, but details are few and far between, and that particular duty may well be quietly dropped.

We should know more later this year, once the New Year (and Brexit) hangovers have cleared.

"We won’t censor the web or social media. Free expression is the lifeblood of the internet and it’s central to our democracy, values and modern society. Our role in upholding broadcasting standards for TV and radio programmes means we’ve gained extensive experience of protecting audiences from harm while upholding freedom of expression. An important part of our job will be to ensure online platforms do the same with their systems and processes."

Tags

adlaw, a and m, entertainment, media, online harms, social media