People Believe Most Of The Health-Related Misinformation Online

Such has been the extent of concern about the impact misinformation on social media is having on society that Facebook boss Mark Zuckerberg has gone from one of the most admired people in the world to one of the most despised.

The extent of the misinformation challenge was highlighted by recent research from Kingston University, which revealed that people typically regard around 60% of the misinformation shared about health topics on social media as credible.

What’s more, attempts by social networks to mitigate the risk of misinformation by displaying banners warning people of the lack of credibility in a particular source were also found to be largely ineffective.

“The belief in fake news stories about healthcare is understandable. Most people do not have specialist medical knowledge, so if claims are put in a way that sounds like they make sense, why would the public not believe them?” the researchers say. “One of our most concerning findings is that prior exposure to stories increases credibility—repetition counts, so the more someone sees something, the more they believe it.”

Led astray

The findings emerged after a study involving over 1,900 people from a wide range of ages and backgrounds who were randomly assigned to one of two groups. Both groups were shown six real and six fake news stories in a Facebook-style interface and asked to indicate whether or not they would share the stories. One of the groups was shown a banner warning of the risks of fake news, whilst the other group was not.

Unfortunately, the warning appeared to have no tangible impact at all on the behavior of the volunteers, whether in terms of believing the stories or sharing them. Bizarrely, even when a story was recognized as fake, there was still a strong chance it would be shared anyway.

“Media organisations publishing fake news stories have a responsibility to act. Facebook is planning to invest in teams of experts to look at the trustworthiness of the information being shared on its platform. If a story is not reliable, we recommend a publisher should have two choices—either delete the post or use the search algorithm to ensure scientifically inaccurate stories are relegated to appearing at the end of search results,” the researchers say.

Bad at spotting fake news

A recent study from the University of Texas at Austin highlights the challenges involved, as it found Facebook users were generally very bad at identifying fake news.

Volunteers were kitted out with a wireless electroencephalography headset whilst they read various political news headlines in a Facebook-style interface. Each volunteer was asked to determine the credibility of the stories. Alarmingly, they were able to do so accurately only 44% of the time when the news was aligned with their own political beliefs.

“We all believe that we are better than the average person at detecting fake news, but that’s simply not possible,” the researchers say. “The environment of social media and our own biases make us all much worse than we think.”

All of which highlights the fundamental difficulty society faces in ensuring that citizens are as well informed as possible, whether on health-related topics or anything else.
