
Facebook Oversight Board Launches Review of XCheck System


Facebook logo. The inquiry was prompted by the Wall Street Journal investigation into the social-media giant’s treatment of high-profile users.

Facebook Inc.’s Oversight Board said it is reviewing the company’s practice of holding high-profile users to separate sets of rules, citing apparent inconsistencies in the way the social-media giant makes decisions.

The inquiry follows an investigation by The Wall Street Journal into the system, known internally as “cross-check” or “XCheck.” The Oversight Board, an outside body that Facebook created to ensure the accountability of the company’s enforcement systems, said it has reached out to the company and expects a briefing in the coming days.

The XCheck program was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. It grew to include millions of accounts, according to documents viewed by the Journal. In addition, some users are “whitelisted,” meaning they are rendered immune from enforcement actions, the documents showed.

Facebook headquarters in Menlo Park, Calif. Facebook created the Oversight Board to ensure the accountability of its enforcement systems. Photo: Nina Riggio.
A 2019 internal Facebook review found that the practice of whitelisting was “not publicly defensible.” The company had previously told the Oversight Board in writing that its system for high-profile users was only used in “a small number of decisions.”

In a blog post on Tuesday, the board said it was looking into whether Facebook has “been fully forthcoming in its responses in relation to cross-check, including the practice of whitelisting.”

The post continued: “This information came to light due to the reporting of the Wall Street Journal, and we are grateful to the efforts of journalists who have shed greater light on issues that are relevant to the Board’s mission. These disclosures have drawn renewed attention to the seemingly inconsistent way that the company makes decisions, and why greater transparency and independent oversight of Facebook matter so much for users.”

A Facebook spokesman has previously said that criticism of how it executed the system was fair, but added that it was designed “for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.” He said the company is phasing out the practice of whitelisting.

A Facebook spokesperson on Tuesday declined to comment further on the topic.

The details about the XCheck program and whitelisting were part of a series of articles published in the Journal last week detailing how Facebook’s platforms have negative effects on teen mental health, how its algorithm fosters discord, and how drug cartels and human traffickers use its services openly.

Mark Zuckerberg, CEO of Facebook, testifying in court.
In response, Facebook vice president of global affairs Nick Clegg on Saturday published a blog post saying the articles “have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees.”

The post didn’t cite any factual inaccuracies. In a separate post on Tuesday, the company highlighted that it now has 40,000 employees working on safety and security and that it has invested more than $13 billion in these areas since 2016.

“How technology companies grapple with complex issues is being heavily scrutinized, and often, without important context,” said the unsigned post. “What is getting lost in this discussion is some of the important progress we’ve made as a company and the positive impact that it is having across many key areas.”

Separately, Sen. Richard Blumenthal (D., Conn.) said Tuesday that lawmakers are seeking a high-ranking representative of Facebook to testify at a Sept. 30 hearing, in part to respond to the Journal’s reporting on the company’s internal research about the effects of Instagram on teen girls.

“The simple fact of the matter is that Facebook has known for years that Instagram is directly involved in an increase in eating disorders, mental-health issues, and suicidal thoughts, especially for teenage girls,” Mr. Blumenthal said at a Senate Judiciary Committee hearing on privacy and antitrust issues, adding that he felt Facebook had misled Congress in previous statements about the impact of its platforms on mental health.

Steve Satterfield, a Facebook privacy and public-policy vice president, disagreed with Mr. Blumenthal’s characterization of the company’s statements to Congress and said it would “follow up promptly” about the request for testimony. Facebook understands “the frustration and concern that we are hearing about these reports,” he said. “The safety and well-being of teens on our platform is a top priority for the company,” he added. “This was important research.”

The criticism of Facebook at Tuesday’s hearing was bipartisan. Sen. Mike Lee (R., Utah) said he felt the Journal’s reporting showed Facebook lacked competition. “This too looks like the behavior of a monopolist, a monopolist that is so sure that its customers have nowhere else to go that it expresses a reckless disregard for quality assurance, for its own brand image, and even just being honest with its users about the obvious safety risks,” Mr. Lee said.

Mr. Satterfield said Facebook faces intense competition. The Oversight Board, in its blog post, said it planned to release details on what it heard from Facebook in October as part of its quarterly transparency report.

“The choices made by companies like Facebook have real-world consequences for the freedom of expression and human rights of billions of people across the world,” the group said. “By having clear rules and enforcing them consistently, platforms can give users the confidence that they’ll be treated fairly. Ultimately, that benefits everyone.”

The Oversight Board has previously requested information on the company’s XCheck program—asking it to explain how the system works and to share the criteria for adding pages and accounts to the system and its error rates.

In its response, Facebook provided an explanation but didn’t elaborate on criteria for adding pages and accounts to the system, and declined to provide reporting on error rates.