Facebook has recently revealed that it's downgrading content that promotes "miracle cures," as well as content that makes dubious health claims.
Big tech companies such as Google, Facebook, and Twitter are regularly criticized for promoting the distribution of fake or misleading content. Just last year, Facebook reportedly featured “homemade cancer cures” more prominently on its feed than reliable information from cancer support groups.
Since then, social media platforms have made moves to tackle fake news. One example is Facebook's plan to combat anti-vaccine misinformation.
Fighting Misinformation
The fight against digital misinformation extends beyond spurious health cures.
Earlier in the year, Google announced plans to remove conspiracy theory videos from recommendations on YouTube. This includes videos claiming that the earth is flat and that the moon landings did not happen.
However, the latest focus on health-related misinformation may be in response to an investigation by the Wall Street Journal.
The report reads:
“Facebook Inc. and YouTube are being flooded with scientifically dubious and potentially harmful information about alternative cancer treatments, which sometimes gets viewed millions of times.”
While Google’s response was to cut off advertising revenue for such videos, Facebook worked to reduce visibility.
Facebook’s product manager, Travis Yeh, wrote in a blog post:
“To help people get accurate health information and the support they need, it’s imperative that we minimize health content that is sensational or misleading.”
Downgrading Dubious Health Claims
According to Yeh, the company made two ranking updates in the past month to limit the visibility of posts promoting specific health-related remedies. How targeted are these updates?
Reports suggest Facebook designed the updates to target posts that promote products or services which promise health miracles. These range from easy weight loss to cancer cures.
“In our ongoing efforts to improve the quality of the information in News Feed, we consider ranking changes based on how they affect people, publishers, and our community as a whole,” Yeh added. “We know that people don’t like posts that are sensational or spammy, and misleading health content is particularly bad for our community.”
There's just one catch. Content creators and promoters have found ways around similar algorithmic filters in the past, and this could be no different.
In other words, this could quickly turn into a game of whack-a-mole. So, rather than curb fake news, Facebook could unintentionally help the content creators discover new creative ways to spread misinformation.