COVID-19 is bad enough.
But the proliferation of virus-related misinformation on social media has made a bad situation even worse.
Yifan Sun is an assistant professor in William & Mary’s Department of Computer Science. He is a member of a group of researchers from Georgia Tech, Northeastern University and Emory University that conducted a series of surveys and interviews aimed at understanding how trust, distrust and information-seeking behavior are braided together among social media consumers.
Their work is an important analysis of the shifting nature of trust and of how trust is critical to the formation of connections within a society, particularly during a pandemic.
Sun and his collaborators will present their paper, “Shifting Trust: Examining How Trust and Distrust Emerge, Transform, and Collapse in COVID-19 Information Seeking,” at the Association for Computing Machinery’s CHI Conference on Human Factors in Computing Systems, an international conference dealing with human-computer interaction.
“We found a few interesting things,” Sun said. “For example, people who do not trust authority and who tend to transmit misinformation — they tend to have very limited COVID-19 information exposure.”
At the other end of the spectrum are the people who demonstrate willingness to trust information that originates from official sources. Those folks, Sun said, tend to have more and broader information sources and look for info beyond the memeland of social media.
“Some people who have the lowest mistrust levels actually take a look at multiple information sources,” he said. “They may receive misinformation, but they can be critical and think one level deeper than the superficial value. They’re able to build up a sophisticated — but not blind — trust or distrust for authority.”
The group found a number of other interesting things. For instance, participants reported that their consumption of COVID-19 information on social media decreased over the survey period, even as their reported use of traditional media held roughly steady.
The paper notes that these findings refute earlier work that suggested reliance on social media for COVID-19 information was high during the pandemic. The paper reads, “…many of our participants ‘used’ social media for COVID-19 information consumption, but many did not necessarily ‘rely on’ nor ‘trust’ this information.”
Further, the paper points out that many of the participants fell into the category of “essential workers” during the pandemic, and may simply have been too busy or tired to spend time on social media during the survey period.
Sun explained that many users became aware of the degree of misinformation swirling around the social media platforms and cut down or cut out their social media use.
“Some other participants are disappointed by their friends or extended family members who have accepted misinformation on social media,” he said. “So some of our survey participants felt frustrated by their inability to change their friends’ minds on social media.”
“I try not to discuss this stuff on Facebook, actually intentionally avoiding Facebook feeds [about COVID-19]… I don’t want to go and fight with [my social media contacts] because the people who are not going to understand, they’re never gonna understand…”
—Survey participant
The results also showed that many respondents still used social media, but did not trust the pandemic information they saw there, no matter the source. Interestingly, the survey showed that trust and distrust shifted from the information itself to the social media platforms carrying it.
Sun pointed out that a large number of the survey participants believed various items of COVID-19 misinformation.
“About 30 percent of the participants believe that the authorities are intentionally reporting inaccurate COVID-19 data,” he said. “But the distrust of authorities is decreasing as the data becomes more consistent.”
The collaborators conducted a mixed-methods longitudinal study, consisting of surveys administered every two weeks from September to December 2020, followed by interviews with selected participants. A total of 177 people completed the full set of surveys, most of whom were classified as low or middle income.
The surveys were conducted during what turned out to be the final months of the Trump Administration, a period in which election-related information and misinformation competed with COVID-19 content on social media.
“During the election week, we recorded the least COVID-19 information exposure and also the lowest level of distrust of authorities,” Sun said. “The reason may be that people are distracted by the election news.”
Sun and the other researchers devised a list of questions to compute a Misinformation Belief Score, identifying which debunked-but-widespread COVID-19 beliefs the respondents had accepted. The questions, each rated on a five-point belief scale, included:
- The Coronavirus does not exist
- The Coronavirus was purposefully created and released as part of a conspiracy
- Hydroxychloroquine can be used in the treatment of COVID-19
- Diluted bleach can be used as a treatment for COVID
- The Coronavirus is caused by 5G technology
- The threat of Coronavirus has been exaggerated by political groups who want to damage President Trump
- The Coronavirus is being used to force a dangerous and unnecessary vaccine on Americans
- The Coronavirus outbreak is a population-control scheme developed by Bill Gates
“These questions are well studied during COVID-19 and have been validated as misinformation,” Sun explained.
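As an illustration only, since the paper’s exact scoring procedure is not described here, one simple way such five-point responses could be rolled into a single Misinformation Belief Score is to average a participant’s ratings across the items. The short Python sketch below shows that idea; the abbreviated item list, the 1-to-5 coding and the averaging scheme are assumptions made for the example, not the researchers’ published method.

```python
# Illustrative sketch only: averages 1-5 Likert ratings (1 = strongly
# disbelieve, 5 = strongly believe) into a per-participant score.
# The items and scoring scheme are assumptions, not the study's exact method.

MISINFO_ITEMS = [
    "The Coronavirus does not exist",
    "Hydroxychloroquine can be used in the treatment of COVID-19",
    "Diluted bleach can be used as a treatment for COVID",
    "The Coronavirus is caused by 5G technology",
]

def misinformation_belief_score(responses: dict[str, int]) -> float:
    """Average a participant's 1-5 belief ratings across the items answered."""
    ratings = [responses[item] for item in MISINFO_ITEMS if item in responses]
    if not ratings:
        raise ValueError("No misinformation items were answered.")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("Each rating must be on the 1-5 Likert scale.")
    return sum(ratings) / len(ratings)

# Example: a participant who rejects most items but partly believes one.
example = {
    "The Coronavirus does not exist": 1,
    "Hydroxychloroquine can be used in the treatment of COVID-19": 4,
    "Diluted bleach can be used as a treatment for COVID": 1,
    "The Coronavirus is caused by 5G technology": 1,
}
print(misinformation_belief_score(example))  # 1.75
```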
Another set of questions dealt with the respondents’ trust in authority, asking whether they believed that government agencies intentionally report incorrect information and data.
“There are a few participants who pretty much believe all this information,” he said. “Including the belief that diluted bleach can be used to treat Coronavirus.”
Sun and his co-authors acknowledge the steps that social media platforms have taken to limit the spread of misinformation, but note in their paper that typical approaches, such as using artificial intelligence to target misinformation, likely will not be sufficient.
“It is difficult, if not impossible, to find a simple solution for such a complex problem in the midst of a crisis, and thus, more research is needed to explore interventions that aid in both trust revision and tension mitigation to improve the crisis infrastructure,” the paper states.
Joseph McClain, Research Writer