Culture 3 min read

New Study Explores Why People Share Fake News

Brovko Serhii / Shutterstock.com

A new study suggests that people are more likely to share fake news they've encountered more than once because it feels less immoral.

A new study suggests that the more people encounter fake news, the less unethical they feel about sharing it on social media. In other words, people will share fake news even when they don’t believe the information.

Fake news refers to stories or hoaxes created to misinform or deceive readers. Tricksters usually create these stories to push a political agenda or influence public opinion.

A previous study suggests that we fall for fake news because its outlandish claims grab our attention. In most cases, people can identify this misinformation, but they still unwittingly share it on social media.

Now a study suggests that morality may play a significant role in whether we share misinformation.

In a statement, Daniel A. Effron, associate professor of organizational behavior at London Business School and author of the study, said:

“We suggest that efforts to fight misinformation should consider how people judge the morality of spreading it, not just whether they believe it.”

According to findings from the study, sharing fake news feels less immoral when we’ve seen it before.

Understanding Why We Share Fake News

In a series of experiments, the researchers asked 2,500 online survey participants how they would handle fake news.

The respondents had to state how unethical or unacceptable they thought it would be to publish a fake headline. They also had to indicate how likely they would be to share or like the misinformation, or to block and unfollow the person who shared the news.

Findings from the study suggest that participants rated fake news that they had seen before as less unethical to publish. Meanwhile, they didn’t feel the same way about the headlines that they saw for the first time.

The participants also said that they were more likely to share and “like” a previously seen fake news headline. And they were unlikely to block or unfollow the person who posted the story on social media.

Now you may argue that this could be a result of a tendency to misremember headlines. But you would be wrong.

The researchers pointed out that respondents did not rate previously seen headlines as significantly more accurate than new ones. This raises a big issue.

Distinguishing Facts May Not Prevent the Spread of Fake News

Current methods of curtailing misinformation generally focus on pushing the facts.

Social media platforms are continually adding features to help users distinguish fact from fiction. For example, Facebook introduced new guidelines for publishers looking to sign up for its news platform.

But such a strategy will ultimately fail if users are willing to share misinformation they know to be fake simply because they’ve seen it before.

According to the researchers, repeating a fake headline gives it a “ring of truthfulness.” As a result, people may give it a moral pass, whether they believe it or not.

“The results should be of interest to citizens of contemporary democracies,” Effron adds. “Misinformation can stoke political polarization and undermine democracy, so people need to understand when and why it spreads.”

Read More: AI Text Generators: The Edgy Labs Guide

Sumbo Bello

Sumbo Bello is a creative writer who enjoys creating data-driven content for news sites. In his spare time, he plays basketball and listens to Coldplay.
