Facebook’s automated XCheck content monitoring and censoring system exempts 5.8 million ‘rich and powerful’ users, suggests new report

Facebook’s rules and policies are apparently not the same for everyone. Pic credit: Christopher/Flickr

Facebook has what amounts to a reverse content monitoring system called XCheck. It doesn’t automatically hunt down and flag potentially inappropriate, sensitive, abusive, or sexually explicit content.

Instead, XCheck protects certain high-profile, rich, and well-connected individuals. These accounts apparently get preferential treatment, claims an explosive report.

Facebook treats ‘newsworthy’, ‘influential’, ‘popular’, or ‘PR risky’ accounts differently?

Social media was supposed to be the great equalizer, truly democratizing the internet and the news. However, Facebook, it seems, holds certain individuals to a different standard than the common masses.

Facebook has a secret internal system that exempts 5.8 million users from having to follow the rules on its platform, reports the Wall Street Journal.

The publication has reportedly seen internal Facebook documents, which allegedly reveal that the social media giant does not apply the same governing principles to high-profile users as it does to the masses.

Facebook accounts belonging to users who are newsworthy, influential, popular, or ‘PR risky’ can apparently get away with more on the platform. In other words, the standard censorship rules, and perhaps even the warning system, don’t apply to certain accounts.

Moreover, Facebook “routinely makes exceptions for powerful actors,” claimed an employee in a memo. Some of the notable personalities who allegedly get preferential treatment, or are subject to lax scrutiny, include former President Donald Trump, soccer star Neymar da Silva Santos Júnior, and Sen. Elizabeth Warren.

Basically, Facebook XCheck allegedly treats 5.8 million individuals as “special cases” who get preferential treatment. These accounts reportedly sit beyond the reach of both human and AI moderation.

Facebook XCheck prevents human moderators from intervening and taking down questionable content that certain individuals post:

All 5.8 million of the accounts in question allegedly fall within the ambit of Facebook XCheck. The social media giant initially created XCheck, or Cross Check, to address the myriad shortcomings of the company’s dual human and AI moderation processes.

Gradually, the system came to sit above human moderators. Once Facebook’s executives add an account to the XCheck database, it becomes very difficult for moderators to take action against it.

WSJ further reported that Facebook cross-verified or investigated less than 10 percent of the content that XCheck flagged to the company as needing attention.

It appears several Facebook employees have the power to add users to the XCheck system; the social media giant calls this “whitelist status”. Strangely, there is allegedly no strict record-keeping, auditing, or even review process for adding or removing accounts within the XCheck system.
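To make the reported flow easier to picture, here is a minimal, purely hypothetical sketch in Python. The names XCHECK_WHITELIST and moderate are invented for illustration and do not come from Facebook’s actual code; the sketch only mirrors the behavior the WSJ describes, where flagged posts from whitelisted accounts are routed away from normal enforcement instead of being taken down.

# Hypothetical illustration of a whitelist-style moderation bypass.
# None of these names reflect Facebook's real systems; they only sketch
# the flow described in the WSJ report.

XCHECK_WHITELIST = {"account_123", "account_456"}  # assumed "whitelist status" set

def moderate(account_id: str, violates_policy: bool) -> str:
    """Return the action the pipeline would take on a flagged post."""
    if not violates_policy:
        return "no_action"
    if account_id in XCHECK_WHITELIST:
        # Whitelisted accounts skip immediate takedown and are queued for a
        # separate cross-check review that, per the report, rarely happens.
        return "queued_for_cross_check"
    return "removed"

print(moderate("account_123", violates_policy=True))  # queued_for_cross_check
print(moderate("account_999", violates_policy=True))  # removed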

Incidentally, Facebook acknowledged XCheck long ago, and it seems the company is well aware of the system’s problems. It has been gradually deprecating XCheck but apparently hasn’t yet shuttered the practice of shielding some rich and powerful individuals from the consequences of posting inappropriate content that violates Facebook’s policies.
