There are a lot of things wrong with Facebook’s algorithms.
On the one hand, the ad-delivery algorithm has shown clear biases toward specific user groups. A study by researchers at Northeastern University found last year that Facebook showed more ads for janitor and taxi-driver jobs to minorities, more ads for nurse and secretary jobs to women, and more real-estate sale ads to white users.
On the other hand, the News Feed algorithm has been shown to exploit controversy for the sake of clicks. This, in turn, has deepened divisiveness, a problem that recently surfaced internal documents show Facebook's top executives chose to wilfully ignore.
When COVID-19 hit, divisive content became outright dangerous for users worldwide; it was no longer just a matter of removing fringe white-nationalist and anti-vax propaganda. Facebook stepped up efforts to train its AI to take down larger quantities of posts spreading coronavirus misinformation. But the underlying problem remained: individual posts could be removed, yet the ranking system that made divisive content prominent in the first place went unchanged, until now.
Facebook has announced a significant update to its News Feed algorithm that prioritizes authentic, original news reporting over dubious, divisive stories. To identify these higher-quality articles, Facebook will be:
“looking at groups of articles on a particular story topic and identifying the ones most often cited as the original source.”
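The quote above sketches a simple heuristic: within a cluster of articles covering the same story, treat the article most often cited by the others as the original source. Facebook has not published its implementation; the snippet below is a minimal illustrative sketch of that idea, with invented article names and citation pairs.

```python
# Hypothetical sketch of the "most often cited as the original source"
# heuristic described above. Article names and citations are invented.
from collections import Counter

def most_cited_original(citations):
    """Given (citing_article, cited_article) pairs within one story
    topic, return the article that is cited most often."""
    counts = Counter(cited for _, cited in citations)
    return counts.most_common(1)[0][0]

# Example: two blogs and an aggregator covering the same story.
citations = [
    ("blog-a", "wire-report"),
    ("blog-b", "wire-report"),
    ("aggregator", "blog-a"),
]
print(most_cited_original(citations))  # wire-report
```

In practice the real system would also need to cluster articles by topic and resolve links to canonical URLs before counting citations; this sketch assumes that grouping has already been done.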
To ensure that authorship is transparent, Facebook will also:
“review news articles for bylines or a staff page on the publisher’s website that lists the first and last names of reporters or other editorial staff. We’ve found that publishers who do not include this information often lack credibility to readers and produce content with clickbait or ad farms, all content people tell us they don’t want to see on Facebook.”
For now, this only applies to English-language news content. Will we stop seeing fake news altogether? Probably not. Still, any change that reduces divisiveness is welcome.
Learn more here.