By TALI ARBEL, AP Technology Writer
Facebook is taking another step to try to make itself more socially beneficial, saying it will boost news sources that its users rate as trustworthy in surveys.
In a blog post and a Facebook post from CEO Mark Zuckerberg Friday, the company said it is surveying users about their familiarity with and trust in news sources. That data will influence what others see in their news feeds.
It’s the second major tweak to Facebook’s algorithm announced this month. The social media giant, a major source of news for its users, has struggled to deal with an uproar over fake news and Russian-linked posts on its platform meant to influence the 2016 U.S. elections. The company has slowly acknowledged its role in that foreign interference.
Zuckerberg has said his goal for this year is to fix Facebook, whether by protecting against foreign interference and abuse or by making users feel better about how they spend time on Facebook.
Facebook announced last week that it would show users fewer posts from publishers, businesses and celebrities, and more from friends and family. Zuckerberg said Friday that, as a result, news posts will make up 4 percent of the news feed, down from 5 percent today.
Facebook says it will start prioritizing trustworthy news sources in the U.S. and then expand the effort internationally. It says it has surveyed a “diverse and representative sample” of U.S. users and will begin testing the new prioritization next week. Publishers with lower trust scores may see a drop in their distribution across Facebook.
“There’s too much sensationalism, misinformation and polarization in the world today. Social media enables people to spread information faster than ever before, and if we don’t specifically tackle these problems, then we end up amplifying them. That’s why it’s important that News Feed promotes high quality news that helps build a sense of common ground,” Zuckerberg wrote.
Of course, there are worries that survey-takers will try to game the system, or that they just won’t be able to differentiate between high-quality and low-quality news sources — an issue made evident by the spread of many fake-news items in the past few years.
Zuckerberg says that some news organizations “are only broadly trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly.” But this is complicated.
In the U.S., there has been a growing partisan split in perceptions of the media. Roughly a third of Democrats in early 2017 said they trusted information from national news organizations a lot, while only 11 percent of Republicans did, according to the Pew Research Center; that gap had grown since early 2016.
Facebook’s move is a positive one, but it’s not clear how effective the system will be in identifying trustworthy news sources, David Chavern, CEO of the news media trade group News Media Alliance, said in a statement Friday.