Meta is walking back its fact-checking efforts, Joel Kaplan, Meta’s chief global affairs officer, said in a January 7 post on the company’s website, raising concerns that health misinformation could increase on its social platforms. Kaplan explained that over the next few months, Meta will phase out its third-party fact-checking program—in place since 2016—and replace it with a community notes model, à la X, across Facebook, Instagram, and Threads. Under this model, users can identify and add disclaimers to posts containing misinformation, and other users’ votes determine whether the note appears next to the original post.

While Kaplan said the move encourages “free expression,” experts say it should be a wake-up call for healthcare organizations. Multiple studies have concluded that health misinformation on the internet decreases trust in conventional treatments like vaccines. For instance, a 2021 randomized controlled trial published in the journal Nature Human Behaviour suggested that some people were less willing to take a Covid-19 vaccine after viewing online misinformation; as of September 2020, only 43% of US respondents said they would “definitely” take the vaccine. A 2022 study from Indiana University, published in Scientific Reports, also linked lower vaccine uptake to higher exposure to online misinformation.

Keep reading here.—CC