Facebook Parent Meta’s Oversight Board Can Now Apply Warning Screens to Moderate Content


Meta’s independent Oversight Board can now directly decide whether content flagged as “sensitive” or “disturbing” should carry a warning screen. Let’s understand why this is important.

When it comes to content moderation on social media and other online platforms, too much will always seem too little. Facebook parent Meta has long been fighting off criticism for not doing enough to mark sensitive content on its platforms. The latest addition to its series of efforts is the decision to let the independent Oversight Board take the call on applying a warning screen to disturbing or sensitive content when it is reported.

The Oversight Board said it would now be able to take binding decisions on appeals filed by users around the world.


In a separately released transparency report, the board highlighted that in the second quarter of 2022, from 1 April to 30 June, it received 347,000 appeals from Facebook and Instagram users worldwide. The report notes, “Since we started accepting appeals two years ago, we have received nearly two million appeals from users around the world.”

“This demonstrates the ongoing demand from users to appeal Meta’s content moderation decisions to an independent body,” the report added.

Meta’s Independent Oversight Board

Although the Oversight Board was first formed to decide a small number of content moderation appeals, its role has expanded over time. It has also been given the responsibility of advising on the social media platforms’ policies and decisions affecting users.

One of the board’s most significant recent decisions was its public objection to Facebook’s removal of a newspaper report about the Taliban. Facebook had removed the report for portraying the group “positively”. The board, however, clearly backed the users’ freedom of expression and criticised the tech giant’s reliance on automated moderation.


It must be noted that most tech companies rely heavily on artificial intelligence for content moderation, and such automated systems are not always dependable.

More On Meta’s Independent Oversight Board

Meta’s Oversight Board is an independent body formed to take content moderation decisions for Meta’s popular social media platforms, Facebook and Instagram. The initial proposal by Harvard Law School professor Noah Feldman was to create a quasi-judicial body for Facebook, an idea approved and supported by Meta CEO Mark Zuckerberg, who described it as a “Supreme Court”. According to Zuckerberg, the Oversight Board would have the right to settle, negotiate, and mediate content moderation disputes. He also specified that the board would eventually have the authority to override Meta’s own decisions as a company.

The oversight board includes eminent academics, rights experts and lawyers.

The idea of forming a dedicated oversight board was first announced in November 2018, after Zuckerberg held consultation sessions with many industry and legal experts. Following considerable industry speculation and media coverage, the first 20 founding members of the board were announced in May 2020, and the board officially began its work in October 2020.

