Facebook has ramped up its content moderation dramatically. In addition to flagging false election and COVID-19 information, the platform will now remove false claims about the upcoming coronavirus vaccines.
Facebook released the statement as leading vaccine candidates near their final trial stages and approval for release.
These vaccines are key to fighting the virus, as vague hopes of naturally developed herd immunity have proven ineffective and multiple waves of infection have swept across the globe.
The company stated that the new rules will cover false claims about the "safety, efficacy, ingredients or side effects of the vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn't on the official vaccine ingredient list."
This is a bold move, as the anti-vaccine ("antivaxx") community has a strong presence on the social network, with many groups actively advocating its views.
Facebook said it would update the rules as new vaccine-related claims emerge. This is a sensible approach, as the antivaxx movement has impressive traction and PR, constantly shape-shifting its arguments to defend its incorrect position.
How effective this enforcement will be remains to be seen. The platform has had content moderation failures in the past that allowed false or harmful information to slip through.
However, Facebook has partnered with fact-checking organizations around the world to aid in this fight. In South Africa and across the continent, the company has partnered with Africa Check, the leading fact-checking website in the region.
Why incorrect information needs to be blocked
The problem with misinformation about the COVID-19 vaccine is that broad protection against the virus will only come if everyone who is able gets the FDA-approved shot.
This is how herd immunity develops. Since some people cannot take the vaccine for medical reasons, the rest must do so to protect those who cannot, and to eliminate the virus from the human population entirely.
Viruses have been eradicated by vaccination before. Smallpox was completely wiped out following its vaccine, with the last known naturally occurring case recorded in Somalia in 1977.
Moderation and freedom of speech
The antivaxx community, and others sympathetic to it, often describe this type of content moderation as a restriction on freedom of speech. In the US the debate is especially charged, as free-speech protections do not clearly spell out what private platforms may or may not moderate.
However, Facebook and other platforms have decided that stopping the spread of false information is a public-safety concern that outweighs free-speech objections.
During the pandemic, conspiracy theories and misinformation have surged significantly, from antivaxx theories to the expansion of QAnon. This has pushed websites to become more cautious about the content shared on their platforms.
Effectiveness of content moderation
A newer question is whether content moderation and warning labels are even effective.
According to Facebook's own internal data, misinformation warnings attached to Donald Trump's coronavirus posts did little to stop their spread.
This raises new concerns about how far websites must go to stop these theories. If a "misinformation" label isn't enough to deter people, is removing the content entirely the answer?
Source link : https://www.techradar.com/news/facebook-to-remove-antivaxx-posts/