Facebook has a secret internal system that exempts 5.8 million users from having to follow the rules on its platform, according to The Wall Street Journal.
The paper on Monday published an investigation detailing how high-profile users of Facebook's services who are "newsworthy," "influential or popular," or "PR risky" don't face the same enforcement actions as ordinary users, citing company documents it had viewed.
A former Facebook employee said in a memo that the company “routinely makes exceptions for powerful actors,” per the Journal.
Figures like former President Donald Trump, soccer star Neymar da Silva Santos Júnior, Sen. Elizabeth Warren, and even Doug the Pug are covered by the system, nicknamed "XCheck" or "cross check." The system was created in response to the shortcomings of Facebook's dual human and AI moderation processes.
But as The Journal reported, XCheck has led to a bevy of other problems.
When users are added to it, it becomes more difficult for moderators to take action against them. Neymar, for example, posted to his Facebook and Instagram accounts his WhatsApp correspondence with a woman who had accused him of rape. The screenshots showed her name and nude photos of her.
Neymar’s sharing of “nonconsensual intimate imagery” would have prompted Facebook to delete the post, but since Neymar was covered by XCheck, moderators were blocked from removing the content, which was then seen by 56 million online users.
Less than 10% of the content that XCheck flagged to the company as needing attention was reviewed, per a document reported by the paper. Facebook spokesperson Andy Stone told the Journal that the number grew in 2020 but did not provide evidence to support that assertion. Most Facebook employees have the power to add users to the XCheck system for "whitelisting," a status applied to high-profile accounts that don't have to follow the rules. But the Journal viewed a 2019 audit that found Facebook doesn't always keep a record of whom it whitelists and why, which poses "numerous legal, compliance, and legitimacy risks for the company and harm to our community."
Facebook employees, including an executive who led its civic team, expressed disapproval of the company's practice of doling out special treatment to some users and said it was not in alignment with Facebook's values, the paper reported.
"Having different rules on speech for different people is very troubling to me," one wrote in a memo viewed by the Journal. Another employee said Facebook is "influenced by political considerations" when making content moderation decisions, the paper reported.
Facebook acknowledged XCheck and its flaws years ago and told the Journal that it's trying to terminate its whitelisting practice. Company documents also show Facebook's intention to eradicate the system: a product manager proposed a plan to stop allowing Facebook employees to add new users to XCheck as a solution. Some of the company documents will be handed over to the Securities and Exchange Commission and Congress by a person who is requesting federal whistleblower protection, per the WSJ.
Mark Zuckerberg has long touted one of his signature taglines: that Facebook's leaders don't want the platform to be the "arbiters of truth," deciding what is true or false and then leaving up or removing content accordingly.