Facebook, meanwhile, has announced that groups and pages that push misinformation about vaccines will get lower rankings and won’t be recommended to users.

These overdue moves illustrate the companies’ ability to identify and police false content, and they undercut a notion widely embraced in the social media industry that Facebook, Twitter, and YouTube shouldn’t be “arbiters of the truth.”

In fact, the major social media companies already play the arbiter role, just not in a systematic way.

A new report from the New York University Stern Center for Business and Human Rights urges these companies to take a more active stance in preventing disinformation from spreading online.

The NYU report focuses on domestically generated disinformation, noting that far more false and divisive online content originates within the United States than arrives from abroad.

As one example of such homegrown tactics, the report points to a covert social media campaign run during Alabama's 2017 special Senate election. One of its goals was to siphon conservative votes to a write-in candidate as part of an effort to defeat Republican Roy Moore.
