Meta’s Oversight Board Can Now Apply Warning Screens to Moderate Content

Meta Platforms’ independent oversight board said on Thursday that, starting this month, it can decide whether to apply warning screens that mark content as “disturbing” or “sensitive”.

The board, which already has the ability to review user appeals to remove content, said it would be able to make binding decisions to apply a warning screen when “leaving up or restoring qualifying content”, including photos and videos.

Separately, in its quarterly transparency report, the board said it received 347,000 appeals from Facebook and Instagram users around the world during the second quarter ended June 30.

“Since we started accepting appeals two years ago, we have received nearly two million appeals from users around the world,” the board report said.

“This demonstrates the ongoing demand from users to appeal Meta’s content moderation decisions to an independent body.”

The oversight board, which includes academics, rights experts and lawyers, was created by the company to rule on a small slice of thorny content moderation appeals, but it can also advise on site policies.

Last month, it objected to Facebook’s removal of a newspaper report that the company had deemed positive towards the Taliban, backing users’ freedom of expression and saying the tech company relied too heavily on automated moderation.

In other news, Meta on Wednesday said that it built an artificial intelligence system that translates Hokkien into English even though the Taiwanese language lacks a standard written form.

The Silicon Valley tech titan that owns Facebook and Instagram billed the work, part of its Universal Speech Translator project, as an effort to enable users from around the world to socialise regardless of the languages they speak.

“Spoken communications can help break down barriers and bring people together wherever they are located — even in the metaverse,” Meta said in a blog post.

© Thomson Reuters 2022