Oversight Boards for everything

Evelyn Douek, who researches and writes on content moderation, joked recently about “oversight boards for all”.

For those of us working in and around this space, it is pretty amusing that an idea which has, to date, produced zero rulings on content, and is currently waiting with bated breath to see whether the ultimate test – the President of the United States’ case against his deplatforming – will come before it, should be something we’re already looking to scale up.

Now, it might have been a joke, but we think “oversight boards for everyone” has real merit, particularly if you think of them not as being “for everyone”, but “for everything” – different types of content, context, presentation and functionality.

So, instead of having just a single Facebook Oversight Board (whose scope is limited to content posted on Facebook), you’d have many specific boards overseeing different things: a newsfeed oversight board, Instagram and WhatsApp oversight boards, a groups board, a recommendations board, a video board, a news board, an elections and political advertising board* and more. Each of these could then work in much greater detail on the problems, and the range of possible solutions, in its own area. Usually they’d work alone, but they’d be welcome to collaborate. If one makes good decisions, the others can learn from them; if it makes bad ones, there’s a better chance they’ll be cancelled out elsewhere.

Furthermore, if you also accept (as you should) that the internet looks at least a little different in every corner of the world, you recognise the need to bring in more local expertise. Having many smaller boards makes it easier to plug in this (and other) contextual knowledge when you need it. 

If all that sounds too much, you could cut the number of boards by having some of them operate across several platforms (e.g. the elections board). 

The overarching goal of a system like this is to move beyond the leave up/take down/label paradigm for dealing with edge-case pieces of content, replacing it with structures for dealing with things that include, but go beyond, content: functionality, design, misuse and more.

The boards could also play a role in deciding what data should be published, setting the research agenda required to better understand the individual and social effects of the areas they’re responsible for, and so on.

Taken together, you end up with a system that looks something like the relationship between lower and higher courts, with the Supreme Court still being… the law and the actual courts. 

Perhaps this feels like an ungainly way of dumping a bunch of quasi-public infrastructure over the top of some private institutions. Perhaps the boards’ respective goals and incentives would be a nightmare to reconcile. Perhaps it sounds a bit too bureaucratic and expensive (though trillion-dollar companies with 50%+ margins would have to convince us they can’t afford a little more scrutiny).

But the test – one the Facebook Oversight Board has to some extent already failed – is whether you can create a system that’s responsive to the rapidly changing demands of the internet. And in that respect, a greater number of smaller, domain-specific boards, transparently publishing case law that anyone (particularly new entrants) can learn from, seems like a way of breaking a very big problem into more manageable pieces.


*Obviously we’re very interested in this one, and happy to help set it up and participate. Does your government or global platform need one? Send us an email.