Facebook has begun rating its users on the basis of their trustworthiness. The company wants to know how much weight to give reports from users claiming that certain news stories are fake, so rating users helps it judge the importance of those reports. According to The Washington Post, Facebook has already begun implementing this system. It may sound unpleasant, but the company believes it is an important tool for removing false information.
Facebook’s mission to fight malicious and fake stories led the company to introduce trust ratings. Facebook partially relies on user reports to catch fake stories. If enough reports claim a story is fake, someone from the fact-checking team will look into it. But checking every reported story would be overwhelming, so Facebook uses other signals to decide what it should and should not review.
People often report things they do not like
The trust rating is based entirely on a user’s track record of reporting stories as false. For example, if a user regularly reports stories as fake and the fact-checking team confirms they are fake, his or her trust rating will go up. Similarly, if a user regularly reports stories as fake that are later found to be true, his or her trust rating will go down.
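Facebook has not published its actual formula, so as a purely hypothetical sketch, a reputation score like this could be nudged up or down each time a fact-checker rules on one of the user’s reports (all names and values here are illustrative assumptions, not Facebook’s real system):

```python
def update_trust(score, report_confirmed_fake, step=0.1):
    """Nudge a reporter's trust score (kept in [0.0, 1.0]) after a fact-check.

    Hypothetical illustration only -- Facebook's real scoring is not public.
    """
    if report_confirmed_fake:
        score += step  # the user's report was accurate: raise trust
    else:
        score -= step  # the flagged story was actually true: lower trust
    return min(1.0, max(0.0, score))  # clamp to the valid range

# A user whose reports keep checking out gradually gains trust:
score = 0.5
for confirmed in [True, True, False, True]:
    score = update_trust(score, confirmed)
```

Reports from a high-scoring user would then be prioritized in the review queue, while reports from low-scoring users could be discounted.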
Tessa Lyons, a Product Manager at Facebook, believes that people often report things they simply do not like or do not agree with.
In this way, if many people report the same story they disagree with, the fact-checking team is forced to take a look at it, only to find, in some cases, that the story is true. Facebook wants to avoid mass reports that lead to the deletion of content that may be true, so the trust rating is its way of deciding which reports matter and which do not.
Facebook says it will not use this system to rank users or discriminate among them. It exists only to protect information from mass reports flagging it as fake.
Currently, Facebook isn’t using the trust score for anything other than reviewing reports on news stories and posts.
Fact-checkers require transparency from Facebook
If the company uses these trust ratings efficiently, they could help it remove disinformation across the network. Bad or false reports will come from everywhere, since anyone can dislike a story, and that could leave Facebook’s fact-checkers wasting time on stories that are completely true.
The only drawback is that Facebook relies on third parties for fact-checking services, including PolitiFact and Snopes. So the final results ought to be trustworthy, but with a layer of Facebook’s algorithm standing in the way.
According to a report published by the Columbia Journalism Review, Facebook’s fact-checkers are demanding transparency from Facebook: they were completely unaware of how Facebook’s algorithm works. So the final result rests in Facebook’s hands, and only Facebook determines which stories get surfaced in the first place.