Facebook has trust ratings for users, but it won't tell you your score

Facebook is rating users based on how "trustworthy" it thinks they are.

Users are assigned a score on a scale from zero to one that reflects whether they have a good or bad reputation – but the score is completely hidden from them.

The rating system was revealed in a report by the Washington Post, which says it's in place to "help identify malicious actors".

Facebook tracks your behaviour across its site, and uses that info to assign you a rating.

Tessa Lyons, who heads up Facebook's fight against fake news, said: "One of the signals we use is how people interact with articles.

"For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true."

Earlier this year, Facebook admitted it was rolling out trust ratings for media outlets.


This involved ranking news websites based on the quality of the news they were reporting.


This rating would then be used to decide which posts should be promoted higher in users' News Feeds.
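One rough, purely hypothetical sketch of how such an outlet rating could feed into News Feed ranking is shown below; the fields and the ranking formula are assumptions made for illustration only, not anything Facebook has disclosed.

```python
# Illustrative sketch: promote posts by combining a hypothetical engagement
# score with the outlet's 0-1 trust rating. Facebook's real ranking formula
# is not public.

from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    engagement: float    # hypothetical base relevance/engagement score
    outlet_trust: float  # hypothetical 0-1 rating of the publisher

def rank_feed(posts):
    """Order posts so items from higher-trust outlets are promoted higher."""
    return sorted(posts, key=lambda p: p.engagement * p.outlet_trust, reverse=True)

feed = rank_feed([
    Post("Viral rumour", engagement=0.9, outlet_trust=0.2),
    Post("Fact-checked report", engagement=0.6, outlet_trust=0.9),
])
print([p.headline for p in feed])  # the trusted report outranks the rumour
```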

It's not clear exactly what users' ratings are for, but they may be used in a similar way.

But Facebook hasn't revealed exactly how ratings are decided, or whether all users have a rating.

According to Lyons, a user's rating "isn't meant to be an absolute indicator of a person's credibility".

Instead, it's intended as a way of gauging how risky a user's actions may be.

It's Facebook's latest bid to tackle fake news, a growing problem for the social network.

The site, which sees 2.23 billion users log on every single month, has become a hotbed for falsified news coverage.

Earlier this year, billionaire Facebook boss Mark Zuckerberg vowed to fight fake news.

"The world feels anxious and divided, and Facebook has a lot of work to do," the 34-year-old Harvard dropout explained.

Facebook has admitted that its site has been the subject of political fakery campaigns from Russia.

After initially denying it had been complacent, the social network admitted that more than 126 million US users had viewed some form of Russian propaganda.

A congressional hearing followed, with Facebook, Twitter, and Google in the dock.


And Facebook's been grappling with the problem ever since.

Speaking in January, Samidh Chakrabarti, who heads up civic engagement at Facebook, said: "Even a handful of deliberately misleading stories can have dangerous consequences.


"We're committed to this issue of transparency because it goes beyond Russia.

"Without transparency, it can be hard to hold politicians accountable for their own words.

"Democracy then suffers because we don't get the full picture of what our leaders are promising us," he wrote, in what looks like a subtle snipe at US President Donald Trump.

"This is an even more pernicious problem than foreign interference.

"But we hope that by setting a new bar for transparency, we can tackle both of these challenges simultaneously."

Chakrabarti said that the misinformation campaigns targeting Facebook users are "professionalised, and constantly try to game the system".

"We will always have more work to do," he added.

We've asked Facebook for comment and will update this story with any response.

This article originally appeared in The Sun. 
