Big Tech is outsourcing its hardest content moderation decisions
Published: 1/29/2021
Source: axios.com

Faced with the increasingly daunting task of consistent content moderation at scale, Big Tech companies are tossing their hardest decisions to outsiders, hoping to deflect some of the pressure they face over how they govern their platforms.

Why it matters: Every policy change and enforcement action, or lack thereof, prompts accusations that platforms like Facebook and Twitter are making politically motivated decisions that are either too lax or too harsh. If it works, ceding responsibility to outsiders may be the future of content moderation.

What's happening: The Facebook Oversight Board overturned the social network's decisions in four of its first five cases Thursday, meaning Facebook will have to restore four posts it had previously taken down.

  • Earlier this week, Twitter introduced Birdwatch, a pilot program that allows users to add context to what they think are misleading tweets.

Between the lines: The Oversight Board, though funded by a $130 million grant from Facebook, is asserting its independence, making clear that it will view content moderation decisions from a different perspective than Facebook's and won't be afraid to reverse the company's decisions.

  • The board hopes that having outside entities make these decisions will help inspire public confidence.

Be smart: The Oversight Board's rationale for directing Facebook to restore the posts, which included one comparing Donald Trump to Joseph Goebbels and another criticizing Muslim men for staying silent on China's treatment of Uyghurs, centered on maximizing free speech as long as it poses no risk of real-world harm.

What they're saying: "I think the thing that the Oversight Board can try to deliver is some degree of trust," Jamal Greene, the board's co-chair, said at an internet policy event this week.

  • "There are all kinds of reasons not to trust governments to regulate content. There are all kinds of reasons not to trust private companies to regulate content. And what we're trying to do with the Oversight Board ... is to try to create institutions that don't have perverse incentives."

Yes, but: In Twitter's case, offloading the final word on some moderation calls to the public rather than a jury of experts could create just those sorts of perverse incentives.

  • Abusive Twitter users could use the system to add misleading "context" to tweets that involve, for instance, already marginalized groups.

Our thought bubble: It remains to be seen whether farming out moderation will take any pressure off the companies.

  • In our highly polarized political environment, it's not clear that a board of people unfamiliar to most social media users, or a crowd of random Twitter users, will make anyone feel better about how their content is handled.