Revealed: Facebook’s Hidden System to Rate Your ‘Trustworthiness’

Silicon Valley wunderkind Zuckerberg in the eye of the storm (AFP)

Social media giant Facebook has reportedly developed a hidden system to assign “reputation scores” to users, rating their trustworthiness according to multiple factors. This system, reminiscent of the Chinese government’s “social credit system,” came to light in a new report by the Washington Post.

The previously unreported rating system assigns each Facebook user a trustworthiness score on a sliding scale. According to the Washington Post, Facebook developed the system over the past year to measure users’ trustworthiness and single out malicious actors on the platform.

Tessa Lyons, the product manager in charge of fighting fake news on the platform, said the reputation assessment system was developed as part of Facebook’s crackdown on misinformation. Facebook previously relied on user reports to determine whether misinformation was spreading, but some users began flagging stories they merely disagreed with as false, creating problems for Facebook’s moderators.

Lyons said it was “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher.” She said the trustworthiness score, which runs between zero and one, isn’t meant as an absolute indicator of a user’s trustworthiness; rather, it is one of thousands of new behavioral measurements Facebook takes into account when reviewing reported content.

Facebook will also be noting which users have a higher propensity for reporting content and which publishers on Facebook are the most trusted amongst users. Claire Wardle, director of First Draft, a research lab within Harvard’s Kennedy School, discussed Facebook’s measurement system, saying: “Not knowing how [Facebook is] judging us is what makes us uncomfortable. But the irony is that they can’t tell us how they are judging us — because if they do, the algorithms that they built will be gamed.”

Lyons gave further insight into how Facebook measures a user’s trustworthiness, saying: “One of the signals we use is how people interact with articles. For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true.”
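The mechanism Lyons describes amounts to a simple reliability weighting: flags from users whose past reports were vindicated by fact-checkers count for more than flags from users who report indiscriminately. The Python sketch below is purely illustrative; the Reporter class, the smoothed score formula, and the prioritize helper are assumptions made for the sake of example, not Facebook’s actual implementation, which has not been disclosed.

```python
# Hypothetical sketch of a reporter-reputation signal as described above.
# All names and formulas are illustrative; Facebook's real system is not public.

from dataclasses import dataclass

@dataclass
class Reporter:
    """Tracks one user's history of false-news reports."""
    confirmed: int = 0   # reports a fact-checker later confirmed as false
    rejected: int = 0    # reports on articles that turned out to be true

    @property
    def score(self) -> float:
        """Trustworthiness in [0, 1], smoothed so new users start near 0.5."""
        return (self.confirmed + 1) / (self.confirmed + self.rejected + 2)

def record_fact_check(reporter: Reporter, article_was_false: bool) -> None:
    """Update a reporter's history once a fact-checker rules on an article."""
    if article_was_false:
        reporter.confirmed += 1
    else:
        reporter.rejected += 1

def prioritize(reports: list[Reporter]) -> float:
    """Aggregate weight of an article's flags: accurate reporters count more."""
    return sum(r.score for r in reports)

# Example: a careful reporter vs. one who flags everything indiscriminately.
careful = Reporter(confirmed=9, rejected=1)          # score ~ 0.83
indiscriminate = Reporter(confirmed=5, rejected=45)  # score ~ 0.12
print(careful.score, indiscriminate.score)
```

Under this toy scheme, a flag from the careful reporter carries roughly seven times the weight of one from the indiscriminate reporter, which matches the behavior Lyons describes.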

When asked by the Washington Post, Lyons refused to give further information on what factors were used to determine user trustworthiness, stating that she did not want to tip off malicious actors to Facebook’s methods.

“I like to make the joke that, if people only reported things that were false, this job would be so easy!” said Lyons in the interview. “People often report things that they just disagree with.”

The system is being compared to the Chinese government’s “social credit system,” which applies a rating to every citizen in the country. As explained by Breitbart News’ John Hayward, the Chinese system “hands out points to citizens for everything from traditional credit scores to surveillance data collected by millions of cameras, Internet tracking, and bonuses for performing ‘heroic acts.'”

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan_ or email him at lnolan@breitbart.com.
