The Oversight Board Moment You Should’ve Been Waiting For: Facebook Responds to the First Set of Decisions

The most important determinant of whether the Facebook Oversight Board (FOB) contributes, well, anything, to content moderation is how Facebook responds to the voluntary policy recommendations the FOB makes. The scope of the FOB’s “binding” authority is extremely limited—Facebook is only required to do what the FOB says about the individual piece of content at issue in a case. This isn’t much: A single H2O molecule in the ocean of millions of content moderation decisions Facebook makes every day. The FOB’s impact therefore hinges on whether and how Facebook responds to its non-binding policy recommendations. And the moment has finally arrived where the public can start evaluating that contribution: Facebook responded to the FOB’s first set of decisions yesterday.

The good news is the response is not a big “eff you” to the FOB. Facebook made 11 commitments in response to the FOB’s recommendations (that tally is based on Facebook’s own framing; my count has it lower, which I will return to below), said it would assess the impact of five other recommendations, and refused only one: The FOB’s recommendation that Facebook should take “less intrusive measures” in response to COVID-19 misinformation where harm is identified but “not imminent.” Facebook says it is committed to its robust response during the pandemic: it built its COVID-19 rules in consultation with global public health experts and won’t let the FOB overrule them. Fair enough.

But Facebook’s responses still leave a lot to be desired. Some of the “commitments” are likely things Facebook had in train already; others are broad and vague. And while the dialogue between the FOB and Facebook has shed some light on previously opaque parts of Facebook’s content moderation processes, Facebook can do much better. Lawfare’s FOBblog has Facebook’s responses to each case here. This post offers some higher-level takeaways.

The Good Parts (And the Questions They Raise)

Facebook responded and accepted some of the broader recommendations! Well done, Facebook. Even clearing that low bar is more than many skeptics expected.

Some of the responses are substantive improvements. In the one case concerning an Instagram post, the FOB pointed out that it wasn’t clear how Facebook’s and Instagram’s rules interact. As a result, Facebook has begun clarifying their relationship and committed to providing more detail. The comprehensiveness of Facebook’s Community Standards compared with the more bare-bones Instagram Community Guidelines, and the sometimes inconsistent enforcement between the two, have always been a puzzle. The FOB has prompted progress (again, admittedly from a low baseline).

There were other promising signs. Facebook has committed to providing more detail in the notifications it sends to people whose content is removed. It will also reassess how automated moderation is used and how users are told when a decision about their content was made by AI rather than by a human. Facebook has promised more detail on its policies in a new Transparency Center, including clarification of its Dangerous Individuals and Organizations policy and of the definitions of “praise,” “support” and “representation.” It has also consolidated information about its policies on COVID-19 misinformation, which was previously dispersed across various blog posts and webpages. This is something researchers have been asking Facebook to do for a while. The FOB delivers what frustrated researchers’ tweets could not.

Facebook’s responses were also educational. I watch the content moderation space and Facebook pretty closely, and I learnt things. 

For example, Facebook explained the choice between error types it has to make when using automated tools to detect adult nudity while trying to avoid taking down images raising awareness about breast cancer (something at issue in one of the initial FOB cases).
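To make that tradeoff concrete, here is a minimal sketch of threshold-based automated removal. It is not Facebook’s actual system; the posts, confidence scores and thresholds below are all invented for illustration. It simply shows why, once a classifier’s scores for benign and violating posts overlap, no single removal threshold can eliminate both kinds of error at once.

```python
# Hypothetical illustration of the error tradeoff in automated moderation.
# Each item is (classifier_confidence_that_post_violates, actually_violates).
# All scores and labels are invented for illustration only.
posts = [
    (0.95, True),   # clear policy violation
    (0.85, False),  # breast cancer awareness post the model finds suspicious
    (0.80, True),   # violation the model is slightly less sure about
    (0.40, False),  # clearly benign post
]

def moderation_errors(posts, threshold):
    """Count both kinds of error at a given removal threshold."""
    wrongly_removed = sum(1 for score, bad in posts if score >= threshold and not bad)
    missed_violations = sum(1 for score, bad in posts if score < threshold and bad)
    return wrongly_removed, missed_violations

# A lower threshold catches more violations but sweeps in awareness posts;
# a higher threshold spares them but lets violations through.
for threshold in (0.5, 0.9):
    fp, fn = moderation_errors(posts, threshold)
    print(f"threshold={threshold}: wrongly removed={fp}, missed violations={fn}")
```

With these invented scores, a threshold of 0.5 removes the awareness post while catching every violation, and a threshold of 0.9 spares it while missing a violation; the policy question is which error to prefer.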

