Facebook revealed this week it’s trying to stem the flow of fake news by assigning trust values to users.
It insists on keeping its criteria for trustworthiness secret, though, in case untrustworthy people try to game the system — and they almost certainly will.
Tessa Lyons, Facebook’s product manager, told The Washington Post a bit more about the system, in which the company uses several signals to identify which people on the site are more trustworthy than others.
If a person consistently reports a news source as fake when Facebook itself doesn’t judge that source to be untrustworthy, the company marks the person — not the source — as less trustworthy.
Lyons implied the company takes this to mean the person reported the site out of an ideological disagreement: “I like to make the joke that, if people only reported things that were [actually] false, this job would be so easy!
People often report things that they just disagree with.”