‘It is the long-term threats that we need to be most concerned about – every day, we are feeling the corrosive impacts of misinformation, but its effect on society and democracy will only fully show in the longer term,’ believes Polis founder Thomas Barton.
Having founded Polis to empower people with awareness and fact-based knowledge of global politics, Thomas believes there is real opportunity in the fight against misinformation this year. As part of the Online Safety Bill, protections against the damage done by untruths presented as fact will be enshrined in law, but will the legislation be enough?
Why are efforts to fight misinformation so important this year in particular?
Research from Eurasia Group’s Top Risks forecast for 2023 found that disinformation is going to become even more pervasive due to disruptive technological developments, like ChatGPT. We have got to take action now to mitigate that threat.
From a public health perspective, while we are coming out of Covid, we have got to be ready for future pandemics. As part of building resilience, we need to be tackling disinformation and misinformation around vaccines – data shows that false information spread online about vaccines has had a negative impact on uptake among young people. At the start of the pandemic, vaccine hesitancy stood at 14% among younger age groups, compared with just 5% among the over-30s.
If we want to be better prepared, we need to use 2023 to actually learn the lessons of the past and protect ourselves for the future.
Alongside threats to health, a US global trends report found that the biggest threat to social trust over the next 20 years will be an inability to agree on what the facts are – we will become more polarised as a society. How can we have a conversation if we can’t agree on what constitutes the truth?
There is real opportunity now because of the Online Safety Bill. This is the first time the UK Government has tried to introduce regulation in the online space.
What initiatives has Polis been working on?
Polis has taken a two-step approach to the campaign we are running on misinformation and disinformation.
The first is to raise awareness. Even though this issue poses huge threats to our democracy – Russia has used disinformation in Ukraine with deepfake technology, for example – it is not high on the political agenda. Rightly, people are focused on the war in Ukraine and the cost-of-living crisis here in the UK, but misinformation poses systemic challenges to our society.
We are also promoting solutions for tackling misinformation. Alongside talking at universities to engage young people in the conversation, I’ve been meeting with members of Parliament and the House of Lords with amendments to improve the Bill. We will be delivering briefings, policy papers and our own research to political stakeholders.
We have had encouraging results – Polis was one of the only contributors to the Online Safety Bill pre-legislative scrutiny committee that spoke about this issue, and we made an impact: 66 of the committee’s recommendations made it into the Bill.
But the Government did not adopt all of our recommendations – there is far more work to be done.
Will the Online Safety Bill do enough?
The short answer is no. This is clearly landmark legislation, but there is opportunity to be more ambitious.
We believe that online platforms should be bound by similar conditions to ‘traditional’ broadcasters – the licensing terms of the Broadcasting Act around impartiality and ensuring factual information is put forward. If we can do it for the BBC, we should be able to copy and paste that model and apply it to the online space.
The Online Safety Bill is at an advanced stage in Parliament, so we have got one window of opportunity for someone in the Lords to table those amendments and make sure they get debated in the House of Commons, at which point we are hoping that the MPs we have briefed will agree that they need to be passed. Right now, the Bill is pretty lacklustre when it comes to fighting disinformation.
What do PR and comms people need to be aware of?
Obviously, the job of a PR is to protect the reputation of their organisation, or the organisations that they work with – corporations are not going to be immune from the onslaught of misinformation.
We cannot escape conversations around ChatGPT at the moment – any activist or online troll could use that technology to spread all sorts of content on social media to trash the reputation of a corporation. If you are a bit more sophisticated, you could use deepfakes to impersonate senior figures in business and create a PR disaster. For a listed company, bad actors could move the share price.
And I am not making this up. The Eurasia Group has forecast this as a possibility in 2023. PRs must be aware of the reputational challenges posed by actors harnessing tech for malicious ends. Misinformation touches everyone.
How much responsibility falls on social media platforms and publishers?
We have been relying on voluntary action from social media companies so far, and look where we are.
According to Full Fact, only one in five social media users who encounter misinformation on their feeds actually do something about it. Those of us with the digital literacy skills to identify mis- and disinformation online have a civic duty to report that content and protect others.
I remember Mark Zuckerberg saying Facebook wouldn’t be ‘arbiters of truth’ – that is not what we are asking. We are asking for information to be taken down when it is blatantly fake and causing damage to society, and for platforms to accept responsibility when they have opened the floodgates and given billions of people the opportunity to publish freely.
And ‘publishers’ can be individuals – anyone with a social media account is one. We have a civic duty, in my view, to read content before we share it online: to understand it, to look into the source, and not to publish something on our feeds before we have even engaged with it.
Along with regulations and legislation that come from the top down, you also need individuals – from the bottom up – to take responsibility.
What is coming up over the next year for Polis?
At the moment, we’re in the weeds on the Online Safety Bill, but we need to think about life beyond it.
The EU’s Digital Services Act is robust, but there is nothing it, or the Government, can do about misinformation being shared on WhatsApp.
However, if the person receiving false information there has the media literacy and critical thinking skills to question what is coming through – to look at the validity of the source and whether it has been produced with malicious intent to mislead, or is accidentally misleading – we can inoculate against misinformation and disinformation; people can protect themselves.
The situation with education on this is dire. A report on the digital literacy of school children found that only 2% can tell fake news from legitimate news online. The next generation is not equipped with the skills they need to protect themselves.
We will be campaigning for major changes to the curriculum in schools – young people are not getting their news by watching the BBC, they aren’t picking up a copy of The Times or The Guardian on the way to school. The sources of information young people use the most for news are Instagram, Facebook, TikTok and Twitter. If that is where they are getting their news, they need the skill set to use them.
I don’t want to live in a society where we can’t agree on what basic facts are.
For more on the Online Safety Bill, the Digital Services Act and other UK and EU regulation changes to be aware of this year, click here. You can also download the Vuelio white paper ‘Medical misinformation: How PR can stop the spread’ for a closer look at the situation within the health sector.