Rumors and conspiracy theories around COVID-19 vaccines have undoubtedly been damaging. Research conducted by the Vaccine Confidence Project in 2020 aimed to quantify how exposure to online misinformation around COVID-19 vaccines might be affecting vaccination intent 3. As part of a randomized controlled trial conducted in the United Kingdom and United States, participants were exposed to examples of misinformation circulating on Twitter, including one post falsely claiming that a COVID-19 vaccine would alter DNA in humans and another falsely claiming that a COVID-19 vaccine would cause 97% of recipients to become infertile. The study found that, relative to factual information, these items of misinformation induced a decline in intent to vaccinate. In the United Kingdom, there was a 6.2 percentage point drop in the respondents who ‘strongly agree’ that they would get vaccinated, alongside a 6.4 percentage point drop in the same response among US respondents. Other studies have reached similar conclusions about the effect of exposure to online vaccine misinformation 4.

Until shortly before the pandemic, most social media platforms had few if any policies to address vaccine misinformation. In early 2019, in response to a series of measles outbreaks in the United States, Facebook announced for the first time that it would reduce the ranking of groups and pages promoting vaccine misinformation in its news feed and search tool 5. It further pledged to reject advertisements that included misinformation about vaccines and to stop showing or recommending such content on the Explore and hashtag pages on Instagram, which Facebook owns. Around the same time, YouTube began to prevent anti-vaccination channels from raising money through advertisements 6. However, these measures typically stopped short of removing misleading content.

Since the pandemic began, social media companies have come under increasing public and political pressure to prevent misinformation spreading on their platforms. In July 2020, an investigation by the Center for Countering Digital Hate, a UK-based campaign group, found that avowedly anti-vaccination accounts on English-language social media had a combined 58 million followers, which it estimated could be worth up to US$1 billion a year to the platforms 7. Facebook, Instagram, Twitter and YouTube all now have explicit policies regarding COVID-19 and vaccine misinformation more broadly. Typically, these involve a combination of signposting users to credible information sources, placing warning labels on potentially misleading information and removing content that has the highest risk of causing real-world harm. Facebook, the world’s largest social media platform, claimed that by August 2021 it had removed over 3,000 accounts, pages and groups since the beginning of the pandemic for repeatedly violating its rules against spreading COVID-19 and vaccine misinformation, along with 20 million individual pieces of content 8.

Recent interventions demanded of social media companies raise major questions around whether private technological monopolies have both the democratic legitimacy and the institutional competence to arbitrate the scientific merits and likely real-world consequences of speech acts within the digital public sphere. More to our point, the shared focus on technological solutions by both platforms and critics urging them to go further in removing misinformation implies that vaccine hesitancy is still widely seen as primarily an informational problem, rather than a trust problem. This diagnosis already implies a cure: reduce the supply of false information and increase the supply of accurate information.