What it means for local government comms and other public sector organisations
The Online Safety Bill – a world-first piece of legislation from the UK government – aims to make the internet a safer place for users, and the UK the safest place in the world to be online.
With most of the responsibility for online safety placed firmly upon big tech companies – either those with the largest user bases or those providing access to content that poses the greatest risk of “harm” to users – what are the implications for local government and other public service providers using social media to engage with their audiences?
Harmful – not illegal
The new bill hasn’t yet become law: it was put before Parliament on 17 March 2022, with a second reading on 17 April and amendments announced on 19 April, and it remains subject to further legislative scrutiny. But what do we know so far?
What’s clear is that businesses in scope – with the social media giants very much at the top of that list – will need to put in place risk assessments, policies and procedures to deter access to and dissemination of “harmful” content online, and to remove it where necessary. What’s not clear yet is the bill’s impact on and cost to businesses in scope, or a more precise definition of “harmful” beyond the fact that it covers physical and psychological harm to individuals rather than to wider society.
The bill doesn’t just cover content that’s already illegal (think hate crimes like racial abuse); it also targets content that’s “legal but harmful” – content that may cause significant harm to some audiences.
The problem for public services
The internet’s a pretty big place, so it’s likely that platforms will be forced, either by market pressure or by Ofcom as the new online safety regulator, to use automation and even increased human intervention to deter access to harmful content. But that means there’s potential for legitimate posts and pages to be banned, too – freedom of expression and user privacy are live issues in the online safety debate, along with the protection of journalism and political content.
Think of a Facebook group or page for local foster carers – a much-needed discussion to help carers support children who are abuse survivors could be targeted and removed by well-intentioned but unsophisticated bots, or even deemed too “risky” by the platforms themselves.
Local services aiming to support people with eating disorders may find themselves struggling to share vital resources with sufferers and their families, as a result of that material being confused with content designed to encourage harmful behaviours.
Legal advice
The government says that major service providers will need to be clear about what legal content is acceptable on their sites, and to provide user-friendly ways to complain about harmful content. Much more detailed guidance on how to comply with the new regime will follow once it becomes law.
If you’re in local government or the public sector, this will affect you – even if you don’t believe that interactions with service users fall within the new legislation’s scope, local government bodies will need to review how their services are accessed online to ensure that any related risks have been assessed and mitigated.
SoCrowd are an LGcomms Partner.