Share This Story for the REAL TRUTH About Vaccines

Your feed's probably full of misinformation and misleading headlines. Do people really believe this stuff? And if not, why are they sharing it?
Fake news is on the rise, and it’s posing a serious threat to global vaccination efforts. Misleading or inaccurate headlines shared on social media have the potential to reach large audiences, leading to what many health experts are now referring to as a second pandemic: the pandemic of misinformation.

But why do people choose to share fake news, and what can we do to prevent its spread? A recent study from the University of Regina investigated these questions, finding that the majority of social media users do care about accuracy — but when it comes to social media, intuitions and emotional thinking can get in the way.

The research was led by Gordon Pennycook, an Assistant Professor of Behavioural Science at the University of Regina’s Hill/Levene Schools of Business, and published in Nature.

Separating fact from fiction

To investigate why fake news is so prominent on social media, Pennycook and his colleagues surveyed social media users on their ability to judge, and their willingness to share, fake news headlines.

The survey participants were divided into two groups. Both groups were presented with a series of true and false headlines, but one group was asked to judge the headline’s accuracy, while the other was asked if they would consider sharing it.

In both cases, some of the headlines were skewed towards different political beliefs. This allowed Pennycook and his colleagues to investigate whether political views might influence a user’s willingness to share fake news.

Overall, Pennycook and his colleagues found that users were good at determining a headline’s accuracy. Yet when it came to sharing posts, users were more likely to share extreme, sensationalist content that aligned with their political views — even if the posts themselves were false.

Despite these results, the authors found that social media users don’t consider themselves to be motivated by their political views. In a follow-up survey, more than 80% of study participants — including those who had chosen to share fake news in the original study — said that it’s more important to share accurate content than it is to share content that aligns with their political views.

This suggests that while social media users don't want to share false, politically charged information, they often end up doing so anyway.

Given this mismatch between what social media users say they care about and what they end up sharing, Pennycook and his colleagues theorized that other factors may be at play. In particular, they suggested that the engagement-based design of social media platforms might be contributing to the spread of fake news stories.

Engagement-based designs may be to blame

“To state the obvious: Social media platforms are social. This focuses our attention on social concerns, such as how much engagement our posts will get, how much our friends will enjoy them, […] and so on,” Pennycook and study co-author David Rand, an Associate Professor at the MIT Sloan School of Management, explained in an article for Scientific American.

“These considerations may distract us from even considering whether or not news content is accurate before we share it. This is surely facilitated by the fact that social media algorithms are optimized for engagement instead of truth.”

Extreme, sensational content tends to draw the most engagement on social media, which leads social media companies to prioritize these posts in their algorithms. This means that on many social media platforms, a shocking-but-false news story may be featured more prominently than a less sensational, true news headline.

The authors went on to investigate how altering the design of social media platforms could help prioritize accurate content.

Similar to their first survey, the authors presented study participants with a series of true and false news headlines. This time, however, they asked participants to assess a headline’s accuracy before deciding whether or not to share it.

A shift in attention can help

The authors found that users were less likely to share fake news if they were asked to consider the story's accuracy beforehand. Rather than letting users share a story based on intuition or emotional thinking, the accuracy prompt encouraged them to slow down and think critically before making their decision.

The authors then tested this method on thousands of Twitter users who had shared politically charged fake news stories in the past.

“[We] asked them for their opinion about the accuracy of a single nonpolitical news headline,” Rand and Pennycook explained. “[B]eing asked the single accuracy question improved the average quality of news sources the users subsequently shared on Twitter.”

Going forward, the authors suggest that social media platforms incorporate accuracy prompts into their designs. Doing so would encourage social media users to think critically, reducing the amount of fake news that gets shared.

Last year, Twitter started asking users to read articles before sharing them. More recently, Facebook has begun incorporating similar prompts into its platform as well.

Encouraging other social media platforms to follow suit could reduce the spread of fake news and help stop the pandemic of misinformation in its tracks.


Emily Deibert is a PhD student in the Department of Astronomy & Astrophysics at the University of Toronto with a passion for science outreach and communication. She earned her HBSc (Astronomy, English, and Mathematics) at the University of Toronto. She is excited about turning scientific research into stories and sharing these stories with the public.