
Submission to the Facebook Oversight Board in Case 2021-008-FB-FBR: Brazil, Health Misinformation and Lockdowns

Posted by Tanvi Apte and Torsha Sarkar on Jun 30, 2021
In this note, we answer the questions set out by the Board in case 2021-008-FB-FBR, which concerned a post made by a Brazilian sub-national health authority and raised questions about health misinformation and the enforcement of Facebook's community standards.

Background 

The Oversight Board is an expert body created to oversee Facebook's content moderation decisions and its enforcement of the community standards. It is independent from Facebook in its funding and administration, and it issues decisions on individual cases as well as on broader questions of policy. It can also make recommendations on Facebook's content policies. Its decisions are binding on Facebook, unless implementing them could violate the law. Accordingly, Facebook implements these decisions across identical content with parallel context, where it is technically and operationally possible to do so.

In June 2021, the Board made an announcement soliciting public comments on case 2021-008-FB-FBR, which concerned a Brazilian state-level medical council's post questioning the effectiveness of lockdowns during the COVID-19 pandemic. Specifically, the post claimed that lockdowns (i) are ineffective; (ii) lead to an increase in mental disorders, alcohol abuse, drug abuse, economic damage, etc.; (iii) are contrary to fundamental rights under the Brazilian Constitution; and (iv) are condemned by the World Health Organisation ("WHO"). These assertions were backed up by two statements: (i) an alleged quote by Dr. Nabarro of the WHO stating that "the lockdown does not save lives and makes poor people much poorer"; and (ii) an example of how the Brazilian state of Amazonas saw an increase in deaths and hospital admissions after lockdown. The post concluded that effective COVID-19 preventive measures include education campaigns about hygiene, the use of masks, social distancing, vaccination and extensive monitoring by the government, but never the decision to adopt a lockdown. The post was viewed around 32,000 times and shared over 270 times. No users reported it.

Facebook took no action against the post, as it concluded that the post did not violate its community standards. Moreover, the WHO has not advised Facebook to remove claims questioning lockdowns. Facebook nevertheless referred the case to the Oversight Board, citing its public importance.

In its announcement, the Board sought answers on the following points: 

  1. Whether Facebook’s decision to take no action against the content was consistent with its Community Standards and other policies, including the Misinformation and Harm policy (which sits within the rules on Violence and Incitement). 

  2. Whether Facebook’s decision to take no action is consistent with the company’s stated values and human rights commitments. 

  3. Whether, in this case, Facebook should have considered alternative enforcement measures to removing the content (e.g., the False News Community Standard places an emphasis on “reduce” and “inform,” including: labelling, downranking, providing additional context etc.), and what principles should inform the application of these measures. 

  4. How Facebook should treat content posted by the official accounts of national or sub-national level public health authorities, including where it may diverge from official guidance from international health organizations. 

  5. Insights on the post’s claims and their potential impact in the context of Brazil, including on national efforts to prevent the spread of COVID-19. 

  6. Whether Facebook should create a new Community Standard on health misinformation, as recommended by the Oversight Board in case decision 2020-006-FB-FBR.

Submission to the Board

Facebook’s decision to take no action against the post is consistent with (i) its Violence and Incitement community standard, read with the COVID-19 Policy Updates and Protections; and (ii) its False News community standard. Both Facebook’s website and the Board’s past decisions rely on the three-pronged test of legality, legitimate aim, and necessity and proportionality, drawn from the jurisprudence of the International Covenant on Civil and Political Rights (ICCPR), to determine violations of Facebook’s community standards. Facebook must apply the same principles to guide the use of its enforcement actions, keeping in mind the context, intent, tone and impact of the speech.

First, none of Facebook’s aforementioned rules contains an explicit prohibition on content questioning the effectiveness of lockdowns. There is nothing to indicate that “misinformation”, which is left undefined, includes within its scope information about the effectiveness of lockdowns. The World Health Organisation has also not advised against such posts. Applying the principle of legality, no person could reasonably foresee that such content is prohibited. Accordingly, Facebook’s community standards have not been violated.

Second, the post does not meet the threshold of causing “imminent” harm stipulated in the community standards. Case decision 2020-006-FB-FBR notes that an assessment of “imminence” is made with reference to factors such as context, speaker credibility and language. Here, the post’s language and tone, including its quoting of experts and case studies, indicate that its intent is to encourage informed, scientific debate on lockdown effectiveness.

Third, Facebook’s False News community standard does not contain any explicit prohibitions, so there is no question of its violation. Any decision to the contrary may go against the standard’s stated policy rationale of not stifling public discourse, and would create a chilling effect on posts questioning lockdown efficacy. This would set a problematic precedent that Facebook would be mandated to implement.

Presently, Facebook cannot remove the post, since no community standards have been violated. Facebook must also not reduce the post’s circulation, since this may stifle public discussion around lockdown effectiveness. Further, removal would violate the user’s right to freedom of opinion and expression, as guaranteed by the Universal Declaration of Human Rights (UDHR) and the ICCPR, both of which form part of Facebook’s Corporate Human Rights Policy.

Instead, Facebook can provide additional context alongside the post through its “related articles” feature, by showing fact-checked articles discussing the benefits of lockdowns. This approach is the most beneficial since (i) it is less restrictive than reducing the post’s circulation; and (ii) it balances interests better than taking no action at all, by allowing people to be informed about both sides of the lockdown debate and to make an informed assessment.

Further, Facebook’s treatment of content posted by the official accounts of national or sub-national health authorities should be circumscribed by its updated Newsworthy Content Policy and by the Board’s decision in case 2021-001-FB-FBR, which adopted the Rabat Plan of Action to determine whether a restriction on freedom of expression is required to prevent incitement. The Rabat Plan of Action proposes a six-part test that considers: a) the social and political context; b) the status of the speaker; c) the intent to incite the audience against a target group; d) the content and form of the speech; e) the extent of its dissemination; and f) the likelihood of harm, including its imminence. Beyond these factors, Facebook must perform a balancing test to determine whether the public interest in the information outweighs the risk of harm.

In its decision in case 2020-006-FB-FBR, the Board recommended that Facebook: a) set out a clear and accessible Community Standard on health misinformation; b) consolidate and clarify the existing rules in one place (including by defining key terms such as misinformation); and c) provide "detailed hypotheticals that illustrate the nuances of interpretation and application of [these] rules" to give users further clarity. Facebook has since notified its implementation measures, in which it reports having fully implemented these recommendations, thereby bringing it into compliance.

Finally, Brazil is one of the countries worst affected by the pandemic, and it has struggled to combat the spread of fake news during this period. President Bolsonaro has been criticised for curbing free speech by invoking a dictatorship-era national security law, and has been questioned over his handling of the pandemic, including his own controversial statements questioning lockdown effectiveness. In such a scenario, the post may be perceived as political rather than as an attempt at scientific discussion. However, the post is unlikely to provoke any knee-jerk reactions, since people are already familiar with the lockdown debate, on which much has already been said and done. A post like this, which merely reiterates one side of an ongoing debate, is not likely to cause people to take action in violation of lockdown measures.

For a detailed explanation of our responses to these questions, please see here.