As we all know, Facebook decided on January 7 to kick the mango-hued shitgibbon off their platform, having found the Stupid Coup was, you know, promoted by him and his trolls. They sent that decision to something they call their Oversight Board, which (I’m not kidding) is something like an appeals court for them.
The Oversight Board has upheld Facebook’s decision to suspend Mr. Trump’s access to post content on Facebook and Instagram on January 7, 2021. However, as Facebook suspended Mr. Trump’s accounts ‘indefinitely,’ the company must reassess this penalty.
Within six months of this decision, Facebook must reexamine the arbitrary penalty it imposed on January 7 and decide the appropriate penalty. This penalty must be based on the gravity of the violation and the prospect of future harm. It must also be consistent with Facebook’s rules for severe violations, which must, in turn, be clear, necessary and proportionate.
If Facebook decides to restore Mr. Trump’s accounts, the company should apply its rules to that decision, including any changes made in response to the Board’s policy recommendations below. In this scenario, Facebook must address any further violations promptly and in accordance with its established content policies.
What policies do they recommend?
In a policy advisory statement, the Board made a number of recommendations to guide Facebook’s policies in regard to serious risks of harm posed by political leaders and other influential figures.
The Board stated that it is not always useful to draw a firm distinction between political leaders and other influential users, recognizing that other users with large audiences can also contribute to serious risks of harm.
While the same rules should apply to all users, context matters when assessing the probability and imminence of harm. When posts by influential users pose a high probability of imminent harm, Facebook should act quickly to enforce its rules. Although Facebook explained that it did not apply its ‘newsworthiness’ allowance in this case, the Board called on Facebook to address widespread confusion about how decisions relating to influential users are made. The Board stressed that considerations of newsworthiness should not take priority when urgent action is needed to prevent significant harm.
Facebook should publicly explain the rules that it uses when it imposes account-level sanctions against influential users. These rules should ensure that when Facebook imposes a time-limited suspension on the account of an influential user to reduce the risk of significant harm, it will assess whether the risk has receded before the suspension ends. If Facebook identifies that the user poses a serious risk of inciting imminent violence, discrimination or other lawless action at that time, another time-bound suspension should be imposed when such measures are necessary to protect public safety and proportionate to the risk.
The Board noted that heads of state and other high officials of government can have a greater power to cause harm than other people. If a head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a period sufficient to protect against imminent harm. Suspension periods should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion.
In other recommendations, the Board proposed that Facebook:
- Rapidly escalate content containing political speech from highly influential users to specialized staff who are familiar with the linguistic and political context. These staff should be insulated from political and economic interference, as well as undue influence.
- Dedicate adequate resourcing and expertise to assess risks of harm from influential accounts globally.
- Produce more information to help users understand and evaluate the process and criteria for applying the newsworthiness allowance, including how it applies to influential accounts. The company should also clearly explain the rationale, standards and processes of the cross check review, and report on the relative error rates of determinations made through cross check compared with ordinary enforcement procedures.
- Undertake a comprehensive review of Facebook’s potential contribution to the narrative of electoral fraud and the exacerbated tensions that culminated in the violence in the United States on January 6. This should be an open reflection on the design and policy choices that Facebook has made that may allow its platform to be abused.
- Make clear in its corporate human rights policy how it collects, preserves and, where appropriate, shares information to assist in investigation and potential prosecution of grave violations of international criminal, human rights and humanitarian law.
- Explain its strikes and penalties process for restricting profiles, pages, groups and accounts in Facebook’s Community Standards and Instagram’s Community Guidelines.
- Include the number of profile, page, and account restrictions in its transparency reporting, with information broken down by region and country.
- Provide users with accessible information on how many violations, strikes and penalties have been assessed against them, and the consequences that will follow future violations.
- Develop and publish a policy that governs Facebook’s response to crises or novel situations where its regular processes would not prevent or avoid imminent harm. This guidance should set appropriate parameters for such actions, including a requirement to review its decision within a fixed time.
*Case summaries provide an overview of the case and do not have precedential value.