Facebook extremism and fake news: How Facebook is training us to be conspiracy theorists — Quartz

The first problem is that saying is believing. This is an old and well-studied phenomenon, though perhaps understudied in social media. So when you see a post … and you retweet or repost it, it’s not a neutral transaction. You, the reposter, don’t end that transaction unchanged.

It’s worth noting as well that, given the nature of social media, we’re more likely to share inflammatory posts than non-inflammatory ones.

From Facebook’s perspective they have two goals, and neither is about the quality of the community or the well-being of its members. The first goal is to keep you creating Facebook content in the form of shares, likes, and comments. … The second goal is to keep you on the site at all costs, since this is where they can serve you ads.

There will be a lot of talk in the coming days about this or that change Facebook is looking at. But look at these two questions to get the real story:

  • Do they promote deep reading over interaction?
  • Do they encourage you to leave the site, even when the link is not inflammatory?

The bigger problem is the far larger number of people who see the headlines but never reshare them. Why is that a problem? Because, for the most part, our brains equate “truth” with “things we’ve seen said a lot by people we trust.” The literature in this area is vast; it’s one of the reasons, for example, that so many people believe global warming is a hoax.

Source: Facebook extremism and fake news: How Facebook is training us to be conspiracy theorists — Quartz