The company’s own body, created to make rulings on delicate matters, has published a report concluding that “significant” action needs to be taken when reviewing content from high profile users.
Billions of pieces of content are posted to Facebook and Instagram every day. Most come from ordinary users, but posts from prominent politicians and celebrities are said to carry outsize influence.
The vast majority of posts are perfectly acceptable. They are authored by individuals or groups and fall within the bounds of content permitted by the social media platform’s parent company, Meta (META).
But some posts are inevitably found to be in violation of the company’s community standards. This is content that Meta considers to contain misinformation, promote violence or other criminal behavior, compromise the safety of individuals or groups, be sexually graphic, or otherwise lack integrity or authenticity.
When the company finds content that violates its community standards, it can remove it. It can also flag the content as requiring additional information or context, allow it behind a warning screen, or restrict it so that only adults 18 and older can view it.
Meta says its primary review systems use technology to prioritize high-severity nefarious content that it recognizes as spreading quickly, or going viral.
This is where its “cross-check” system comes into play. It is this system that is under scrutiny in a report published by the company’s Oversight Board Dec. 6.
Meta’s Cross-Check System Viewed as Needing Reform
Cross-check is intended to ensure that high-importance content review decisions are made accurately, with review by people rather than entirely by technology. The company acknowledges that some of its technology-driven reviews result in what it calls “false positives,” which human review can reduce when implemented properly.
Depending on the content’s complexity, multiple levels of review may be assigned. In extremely high-visibility and high-severity cases, company leadership gets involved.
Meta says it is continually looking for ways to improve its systems. One fairly new structural change it has made is adding two components to cross-check: General Secondary Review and Early Response (ER) Secondary Review. The ER Secondary Review cases are the smaller number of the two and are the highest-priority cases.
In its new report, Meta’s Oversight Board recommends an overhaul of the cross-check system. A concern is that posts from VIPs (politicians, celebrities and other high-profile users) are, in some cases, exempted from Meta’s rules.
A VIP list exists that includes former President Donald Trump and his son Donald Trump Jr., Democratic Senator Elizabeth Warren of Massachusetts, conservative activist Candace Owens, Brazilian soccer star Neymar and even Meta CEO Mark Zuckerberg, according to a story in the Wall Street Journal.
The board is troubled that, while Meta says cross-check protects the vulnerable, in practice it benefits the powerful.
Oversight Board Finds Program Benefits ‘Business Concerns’
“In our review, we found several shortcomings in Meta’s cross-check program,” the report says. “While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns.”
“The Board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm,” it continues. “We also found that Meta has failed to track data on whether cross-check results in more accurate decisions, and we expressed concern about the lack of transparency around the program.”
The board’s conclusion was that, though Meta says its rules treat people equally, the company in fact favors VIPs over regular users.
“Correlating highest priority within cross-check to concerns about managing business relationships suggests that the consequences that Meta wishes to avoid are primarily business-related and not human rights-related,” the report said. “Cross-check grants certain users greater protection than others.”