
Misinformation comes home


Image by Pablo Jimeno from Pixabay

Last year we took a look at Covid-19 misinformation and what both central government and local government can do about it. We also surveyed councillors about their experiences and beliefs about commonly spread misinformation. Some of our findings made for grim reading, but it was also clear that trusted sources with local knowledge and local listening could make a big difference.

Since then, with vaccines against Covid now widely available in many countries, the nature of the misinformation has changed to become more focused on vaccinations and potential treatments. It’s also become clearer that those who spread misinformation of one type can spread and share misinformation about other things, too, in a wider ecosystem of distrust. Donald Trump, last year’s number one Covid misinformation spreader, has now turned into the world’s number one elections misinformation spreader. And this has everyday consequences for democracy and for the lives of those workers who support it.

The school site anti-vaccine protests

For many of us over the last year, misinformation has become more personal and closer to home. When it was my teenage son’s turn to get vaccinated against Covid, like many secondary school students in England, he had to walk past a group of determined anti-vaccine protesters standing in the narrow residential road in front of his school. Despite warnings from the school not to engage with protesters, my son walked up to them.

“I have some questions about the vaccine,” he said, and began talking to them. (He didn’t have any questions about the vaccine that they could answer. He can be an occasional mischief-stirrer, and, to be clear, he is vaccinated.)

They talked for a while. My son collected a handful of leaflets from them and – proud of his trolling – shared them with us. All of the leaflets were littered with easily disprovable claims and colourful accusations, such as the claim that providing vaccinations in educational settings was the “Nazification of our schools”. Many of the leaflets edged into other subjects such as climate change denialism and, in a nod to localism, screeds against pedestrian and cycle-friendly road modifications. But there were so many leaflets, and they were so confident in their ‘facts’. The protesters, my son said, were nice to him. He described one as “somebody’s mum”. We’d already talked with my son at home about vaccines and misinformation, so he was well-prepared (if still ill-advised) when he engaged with people who were seeking to share misinformation about the vaccine.

From furry kittens to furied insurrection

The infodemic is spreading. It’s emerged from the dark corners of the Internet into something frequently encountered in our Facebook feeds and in front of our schools. In a rare glimpse of how easily this can happen, my feed was overtaken by misinformation groups over the last couple of months. I lost my much-loved senior cat not long ago, and in search of distraction I began watching a lot of kitten videos. So Facebook began serving me many more kitten videos in a cycle of furry cuteness. This isn’t surprising; it is the way the algorithm is meant to work. I demonstrate that I like kitten videos, so Facebook gives me more kitten videos. It keeps me on the site longer and increases the chances that I’ll see and respond to advertising or sponsored content.

But I noticed that one of the kitten videos was from a known source of mad conspiracy theories, an organisation which had targeted UK councillors with a direct email campaign of Covid misinformation early in the pandemic. There was nothing wrong with the video itself – the kittens were cute, and I watched it to the end; it really was just a kitten video – and I wondered why such a notorious organisation had such innocent and adorable content. It soon became clear. Almost immediately afterwards, the content in my feed changed. I still got lots of kitten content, but I was now also getting vaccine ‘hesitant’ content, lots of pro-gun content, and even material about the evils of central banks and fiat currency. I stopped clicking and in some cases started reporting, and the number of disturbing posts went down, but they haven’t stopped. Today I was pushed a post featuring an iconic image of the US revolutionary war with a message that hinted at armed insurrection (and was probably just inside the line of ‘acceptability’).

If I hadn’t already been aware of how the Facebook algorithm works, and of the tendency of many platforms to push people into ever more contentious discussions, I might have been completely mystified. But as it was, it showed just how easily people can be sucked into groups that share misinformation and disinformation: a single misclick can lead you to some dark places. And like the clichéd boiled frog, you’re completely overwhelmed with misinformation before you know it.

No kittens harmed. Our new cat.

Tools to fight misinformation

Given the ubiquity of misinformation, it can feel hopeless. But it isn’t. There is still a lot that local and central governments can do to stop misinformation, beyond taking on the big social tech companies. One key finding of the last year is that truth can travel just as fast as misinformation, so we need to give the facts wings and give people the tools they need to identify and counter misinformation. Our latest Global Local Recap looks at local government and misinformation, highlighting tools and approaches that are working at the local level and for everyday people like you and me.

