Lack of trust in health authorities, combined with fear and uncertainty about the disease, created fertile ground for false rumours about Covid-19 vaccines to spread. Countering the rumours may be about attitude as well as facts.
False assertions about Covid-19 vaccines have had a deadly impact – they are one reason why some people delayed being inoculated until it was too late. Some still refuse to be vaccinated.
More than two years after the start of the pandemic, false rumours continue to circulate that the vaccines do not work, cause illness and death, have not been properly tested and even contain microchips or toxic metals.
Now a study raises hopes of deflecting such falsehoods in future by changing the tone of official health messaging and building people’s trust.
In many countries, public confidence in government, media, the pharmaceutical industry and health experts was already on the wane before the pandemic. And in some cases, it deteriorated further during the rollout of Covid vaccines.
This was partly because some national campaigns said the jabs would protect people from falling ill.
Friends over facts
‘There was a lot of overpromising around the vaccine without really knowing what would happen,’ said Prof Dimitra Dimitrakopoulou, research scientist and Marie Curie Global Fellow at the Massachusetts Institute of Technology and the University of Zurich.
‘Then people started getting sick, even though they were vaccinated. That created a lack of trust in the government issuing these policies, and in the scientific community.’
Prof Dimitrakopoulou studied public perceptions of Covid vaccines and obstacles to acceptance of reliable information as part of a project called FAKEOLOGY.
She found that, when people lose faith in institutional sources, they end up relying only on themselves, close friends and family.
‘They trust their instincts, they trust what resonates with them,’ Prof Dimitrakopoulou said. That means they will search the internet, social media and other sources until they find information that reinforces the beliefs they already hold.
‘We have lived with fake news and misinformation long enough to understand that it cannot be debunked with facts,’ she said. ‘People just raise these emotional blocks.’
For example, a story about a mother whose child fell sick after getting a Covid vaccination would likely be more influential than a message containing scientific facts.
Building trust
Prof Dimitrakopoulou surveyed 3 200 parents of children under 11 years old in the United States, and conducted focus groups with 54 of them, to discuss their views about Covid vaccines for kids.
Many parents felt confused by conflicting information about the shots and had a lot of questions about their effectiveness.
She gave the parents a selection of messages to assess. They were put off by the ones that were largely factual, rigid and prescriptive – the tone of many public health campaigns.
They were more persuaded by messages that addressed their concerns about the vaccines with empathy and compassion while acknowledging that they face a difficult decision.
‘We need to be ready to answer any questions they may have and be ready to have a conversation – without expecting the conversation to end with someone getting vaccinated,’ said Prof Dimitrakopoulou.
Those exchanges will ultimately help bolster public faith in health bodies and government institutions. ‘Covid is a great opportunity for us to start building this trust,’ she said.
While a lengthy process, building these bridges could shape people’s perceptions for the rest of their lives, she said.
Fake news filter
It is also important for journalists, researchers and the general public to be able to spot and filter out fake news.
Researchers on a project called SocialTruth have developed a tool to flag fake news content on the internet and social media.
The software, called a Digital Companion, can check the reliability of a piece of information. It analyses the text, images, source and author and, within two minutes, produces a credibility score – a rating of between one and five stars.
‘This is a computer-generated score that can give a red-flag warning if the content is very similar to other types of content that have been found to be false,’ said Dr Konstantinos Demestichas, researcher at the Institute of Communication and Computer Systems in Athens and coordinator of SocialTruth.
The Digital Companion uses computer algorithms that draw on a wide variety of verification services. These include non-governmental organisations, businesses and academic institutions – all with different interests, opinions and intentions.
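To make the idea concrete, here is a minimal sketch, in Python, of how verdicts from several independent verification services might be combined into a single one-to-five star rating. Everything in it is an assumption for illustration – the article does not describe SocialTruth’s actual algorithms, and the service names, weights and score mapping are hypothetical.

```python
# Illustrative sketch only: combines verdicts from several hypothetical
# verification services into a weighted 1-5 star credibility rating.
# Names, weights and the mapping are assumptions, not SocialTruth's design.
from dataclasses import dataclass

@dataclass
class Verdict:
    service: str   # name of the verification service
    score: float   # service's credibility estimate, 0.0 (false) to 1.0 (credible)

def credibility_stars(verdicts: list[Verdict], weights: dict[str, float]) -> int:
    """Combine weighted service verdicts into a 1-5 star rating."""
    total_weight = sum(weights.get(v.service, 0.0) for v in verdicts)
    if total_weight == 0:
        raise ValueError("no trusted verdicts available")
    weighted = sum(v.score * weights.get(v.service, 0.0) for v in verdicts)
    mean = weighted / total_weight   # normalised credibility, 0.0 to 1.0
    return 1 + round(mean * 4)       # map onto the 1-5 star scale

# Example: three services with different trust weights assess one article.
verdicts = [
    Verdict("ngo_checker", 0.2),
    Verdict("academic_lab", 0.35),
    Verdict("news_agency", 0.3),
]
weights = {"ngo_checker": 1.0, "academic_lab": 1.5, "news_agency": 0.8}
print(credibility_stars(verdicts, weights))  # -> 2 stars: likely unreliable
```

Weighting the verdicts, rather than simply averaging them, reflects the point made above: the providers have different interests and intentions, so their opinions should not all count equally.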
Because of the diversity of verification-service providers, ‘we need to establish their trustworthiness by continuously evaluating their results,’ said Dr Demestichas.
To do this, the project uses blockchain to record all the scores and results produced by the verifiers. Verifiers that perform poorly lose their status – ensuring the Digital Companion can offer quality assurance, he said.
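Continuing the illustrative sketch above, the logged results could periodically be replayed against later-confirmed labels to re-weight each verifier, with chronically inaccurate services dropped entirely. Again, the error measure, shrink rule and cut-off threshold below are hypothetical, not SocialTruth’s published method.

```python
# Hedged sketch of the trust-maintenance idea: verifier outputs are logged
# (in SocialTruth's case, on a blockchain) and compared with later-confirmed
# labels; poorly performing services lose weight and, eventually, status.
def update_trust(weights: dict[str, float],
                 logged: list[tuple[str, float, float]],
                 floor: float = 0.2) -> dict[str, float]:
    """logged holds (service, predicted_score, confirmed_truth) records."""
    errors: dict[str, list[float]] = {}
    for service, predicted, truth in logged:
        errors.setdefault(service, []).append(abs(predicted - truth))
    updated = {}
    for service, weight in weights.items():
        if service not in errors:
            updated[service] = weight  # no new evidence, keep current weight
            continue
        mean_error = sum(errors[service]) / len(errors[service])
        new_weight = weight * (1.0 - mean_error)  # shrink weight by error
        if new_weight >= floor:  # below the floor, the verifier is dropped
            updated[service] = new_weight
    return updated
```

Because the underlying records are immutable once written to the ledger, a verifier’s track record cannot be quietly rewritten – which is the quality-assurance property Dr Demestichas describes.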
Digital and human fact checkers
For now, the technology has been developed to scan health science and political content. In future, it could be extended to almost any field.
Initially, it will be offered to institutions that monitor fake news and disinformation, but the aim is to enable journalists and the general public to take advantage of the resource too.
The technology ‘could really make a difference in the daily use of the internet and social media,’ said Dr Demestichas.
Still, because it will never be able to spot all fake news, ‘we need journalists, fact checkers and citizens to be well-trained to exercise their critical thinking,’ he said.
Manipulated feelings
The fight against misinformation is about more than protecting people’s health, important as that is. The well-being of democratic societies themselves is also at stake, said Dr Demestichas.
‘Fake news tries to manipulate our feelings and fears to get our “clicks” to read their content,’ he said.
Curbing it is critical ‘to defend our democracies and allow our societies to function better.’
The research in this article was funded by the EU. This article was originally published in Horizon, the EU Research and Innovation Magazine.