By Georgia Turner and Alex Waddington
The rise of fake news – a local problem

Georgia Turner, Communications Consultant at Georgia Turner Communication and LGA Associate
When I hear the phrase “fake news”, it is more often than not being spat out by a certain US politician, using it as a double bluff to dismiss and discredit mainstream news stories containing information that may actually be true but that he simply doesn’t like, let alone have an educated, evidence-based response to. It’s become a throwaway phrase, dangerously lobbed around to label as disinformation anything the utterer doesn’t agree with or like the sound of. It deliberately causes confusion and blurs the line between what is true and what is not.
Disinformation (or, more emotively, “fake news”) is the deliberate sharing of untruths with a conscious intent to mislead or cause harm. Very often, audiences convinced it is true share disinformation without malicious intent; this sharing is misinformation. It is this misinformation that the initial creators rely on to amplify and lend further credence to their fabricated content or partial truths.
Mis- and disinformation have been growing at all levels of society for many years, largely in step with the increasing accessibility and availability of the tools needed to create and share scarily compelling untruths – digital technology and social media. It’s also aligned with people’s falling trust in politicians, large organisations and the mainstream media – something explored in more detail in the LGA’s Future Comms toolkit. The toolkit explores how, by building trust from the ground up, councils can work with communities to reduce the creation of disinformation and minimise its spread as misinformation when it does occur.
The LGA’s toolkit draws on data from Edelman’s annual trust barometer report. Their 2023 study found that trust in government fell to 27%, the lowest since 2016. Only 21% of people said they trusted the government to ease the cost-of-living crisis – currently a major focus of all local authorities. Perhaps most significant in the context of this article, 68% feel politicians are more likely to lie and mislead the public.
Whilst this research provides a global and national perspective, the LGA toolkit notes that “it is not too big a leap to suggest that the global decline in trust is the single biggest challenge to effective communications across the world in the years ahead.”
Disinformation often feels more noticeable during times of crisis or incident. During a council-organised airshow, I was involved in the response to a small jet plane crash in which the pilot tragically died at the scene. Yet an individual posted on the council’s social media that he had survived and was in a local hospital – the poster knew this, they stated, because they were a nurse working on the ward where he was being cared for. It was completely untrue, desperately unhelpful and to this day I do not know the motivations for such a distracting and cruel lie.
More recently, during the Covid-19 pandemic, councils and many others worked round-the-clock to counter misinformation circulating within communities. A case study on the LGA’s website shares how the London Borough of Hounslow engaged with GPs of South East Asian and Black Afro-Caribbean heritage, as trusted and relatable voices to those communities, to counter vaccine hesitancy and in particular rumours circulating widely about the safety of the vaccine. The power of empathy – “someone like me” – was, unsurprisingly, a greater counter to disinformation than a local councillor or any council-, NHS- or government-branded message.
It’s not just at times of significant crisis that disinformation can damage society, individuals or the reputation of organisations. How many times have you seen repeated untruths posted as fact by often opaque individuals and accounts commenting on an article about your council on the local news media’s website? Or responses to your own content on social media – even positive news – that are completely untrue or at best only partially accurate?
Louisa Dean, Head of Communications for Reading Borough Council, is discussing the issues with her team and her organisation’s leadership. She says of the immediate challenges: “How do we identify the disinformation that isn’t obvious? What do we correct, and how? How do we understand what motivates malicious intent? What are our escalation and response strategies? These are questions we are actively asking ourselves in Reading as we define the capacity and capability needed to determine when, why and how to respond.”
Finding, assessing, responding
Fortunately, the Government Communication Service has published RESIST, a useful guide for civil servants that is equally relevant to those working in local government:
- Recognise mis- and disinformation – how to define it
- Early warning – how to find it
- Situational insight – how different audiences are reacting to it
- Impact analysis – how it does (or could) affect your key priorities, your communities and your reputation
- Strategic communication – how you should (or shouldn’t) respond
- Tracking effectiveness – how to measure the impact of your activities
One of the most fascinating sections of the RESIST toolkit is around intent, and how difficult that can be to pin down. Some motivations are obvious: an individual has a personal grievance with the council that they can’t let go of, or an issues-based group is diametrically opposed to a specific policy and seeks to discredit your rationale or further polarise the debate. Or someone could simply be trying to see if they can get away with it.
Whatever the motivation, having a plan in place to deal with the outputs and outcomes of such intent is now an essential part of any local government comms leader’s remit. Your plan should point to a response, if needed, that’s a fine balance of insight and evidence with empathy and openness, presented according to the needs of the assessed situation.
The high-profile Online Safety Act will put more responsibility on social media companies to deal with disinformation at source. Only time will tell how effective that is. One thing we know is certainly true – this problem isn’t going away any time soon, and it’s our job, as communications leaders within organisations, to be prepared to counter it.
Alex Waddington, Founder of Whetstone Communications and data analyst to communications teams, shares this perspective on the growing role of data – and in-house data skills – in tackling disinformation
In a digital world, having and using good data is key to countering the work of bad actors. The RESIST framework is excellent here, outlining the role data plays at different stages. Here are some practical ideas for teams looking to get started:
Given digital channels are where disinformation proliferates and gains traction as misinformation, having robust listening and monitoring tools in place is important.
This is your early warning system – without this, you may miss early opportunities to assess, plan and respond.
Stefan Rollnik, a former disinformation analyst in the Office of the First Minister of Wales, said in a recent article: “There is no substitute for listening to your audience and customers so you can better identify sources of misinformation and disinformation.”
With growing concern about disinformation (especially given the major role of AI), the big reputational risk of not having monitoring tools could be a key argument for an internal business case.
But if the money simply isn’t there, free tools like Google News or Talkwalker alerts can help.
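For teams with a little technical support available, even a free alert feed can be filtered automatically. The sketch below is a minimal, hypothetical example: it scans the item titles of an RSS feed (the format news alert services typically deliver) for a watchlist of keywords a comms team might care about. The sample feed content and watchwords are invented for illustration; a real setup would fetch a live feed URL instead of parsing a hard-coded string.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of the RSS XML a free news alert feed might deliver.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>News alert</title>
    <item><title>Council budget consultation opens next week</title></item>
    <item><title>Rumour of vaccine clinic closure denied by NHS</title></item>
    <item><title>Local football club wins derby</title></item>
  </channel>
</rss>"""

def flag_items(rss_xml, keywords):
    """Return feed item titles containing any watched keyword (case-insensitive)."""
    root = ET.fromstring(rss_xml)
    titles = [item.findtext("title", "") for item in root.iter("item")]
    lowered = [kw.lower() for kw in keywords]
    return [t for t in titles if any(kw in t.lower() for kw in lowered)]

# An illustrative watchlist a comms team might maintain for early warning.
WATCHWORDS = ["council", "vaccine", "rumour"]
for title in flag_items(SAMPLE_RSS, WATCHWORDS):
    print(title)
```

Run daily, a filter like this acts as a crude version of the “early warning” stage of RESIST: it won’t assess intent or impact, but it surfaces mentions worth a human look.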
And don’t overlook valuable on-the-ground sources like community public health teams. What examples of health misinformation are they picking up, for example? How can you tap into this intelligence?
Bringing clarity to numbers
If you’re communicating statistics you don’t understand about key issues, alarm bells should be ringing. Public outputs that lack clarity can provide opportunities for disinformation, and inadvertently lead to misinformation.
With data visualisation tools becoming ever more accessible (for example Canva), communications teams should be looking at how they can use these to increase understanding of numbers and stats. This doesn’t have to mean charts or graphs, which can add to the confusion for people. Visualisation could be used to help communicate clearly and ‘pre-bunk’ around annual budgets, for example.
Data confidence and literacy in communications is a factor here, as it isn’t always the natural territory for creatives; a survey of PR professionals in 2023 found over half lack confidence in their data skills. Recognising the skills gap, the Chartered Institute of Public Relations now offers a half-day Data Literacy Skills course.
Investing in data literacy will also boost monitoring effectiveness, helping teams spot where numbers and statistics are being misrepresented and potentially misused by bad actors.