In The News

Is Facebook Trying To Sway Public Opinion On The 2016 POTUS Primaries?

If you’re a social media wonk, or even just a casual social media user, you’ve probably heard plenty of stories about — or even experienced — retaliation from Facebook over posts espousing conservative values. Nobody likes being thrown into Facebook Jail, but how many of us notice the more subtle suggestions that the Democrat point of view is the one to follow in the upcoming POTUS primaries and elections?

For instance, how many of you saw this in your Facebook search bar over the past few days, when the Democratic debates were the feature of the leftist mainstream media ahead of the South Carolina primary?

[Screenshot: Facebook search bar prompt, Feb. 12, 2016]

I certainly don’t recall seeing this kind of messaging during any of the Republican primary debates, do you? Perhaps it’s Facebook’s way of collecting more unsolicited data?

In 2014, Forbes published this piece concerning the unethical harvesting of data from Facebook and Twitter, and just what researchers were doing with that data to sway mainstream culture.

Social media like Facebook and Twitter are far too biased to be used blindly by social science researchers, two computer scientists have warned.

Writing in today’s issue of Science, Carnegie Mellon’s Juergen Pfeffer and McGill’s Derek Ruths have warned that scientists are treating the wealth of data gathered by social networks as a goldmine of what people are thinking – but frequently they aren’t correcting for inherent biases in the dataset.

If folks didn’t already know that scientists were turning to social media for easy access to statistics on thousands of people, they found out when Facebook allowed researchers to adjust users’ news feeds to manipulate their emotions.

The poorly handled research resulted in headlines screaming about “secret experiments”, while data watchdogs in Europe launched investigations and the Electronic Privacy Information Centre filed an official complaint with the US Federal Trade Commission.

The outrage was because Facebook had allowed the researchers to conduct their experiment without explicit permission (which, of course, would have biased the results), but the situation was exacerbated by the social network’s failure to get in front of the story and offer an apology before it was forced to.

Another outlet had this to say about how Facebook manipulates the media to which you are exposed:

The long-held notion that Facebook’s algorithm leads to the creation of “echo chambers” among users isn’t exactly true, according to a report published Thursday in the journal Science.

After studying the accounts of 10 million users, data scientists at Facebook found that liberals and conservatives are regularly exposed to at least some “crosscutting” political news, meaning stories that don’t conform to their pre-existing biases.

The algorithm for Facebook’s News Feed leads conservatives to see 5% less liberal content than their friends share and liberals to see 8% less conservative content. But the biggest impact on what users see comes from what they clicked on in the past. Liberals are about 6% less likely to click on crosscutting content, according to the research, and conservatives are about 17% less likely. Facebook’s algorithm serves users stories based in part on the content they have clicked in the past.

Ultimately, the study suggests it isn’t Facebook’s algorithm that makes your profile politically one-sided; it’s your own decisions to click on or ignore certain stories. However, some observers argue the Facebook study is flawed because of sampling problems and interpretation issues.

Or perhaps Facebook is just doing us a favor by relieving us of our own ‘confirmation bias,’ as suggested in this Chicago Tribune commentary by none other than Cass Sunstein. Yeah, I don’t think so, either.

Why does misinformation spread so quickly on social media? Why doesn’t it get corrected? When the truth is so easy to find, why do people accept falsehoods?

A new study focusing on Facebook users provides strong evidence that the explanation is confirmation bias: people’s tendency to seek out information that confirms their beliefs, and to ignore contrary information.

Confirmation bias turns out to play a pivotal role in the creation of online echo chambers. This finding bears on a wide range of issues, including the current presidential campaign, the acceptance of conspiracy theories and competing positions in international disputes.

The new study, led by Michela Del Vicario of Italy’s Laboratory of Computational Social Science, explores the behavior of Facebook users from 2010 to 2014. One of the study’s goals was to test a question that continues to be sharply disputed: When people are online, do they encounter opposing views, or do they create the virtual equivalent of gated communities?

Del Vicario and her coauthors explored how Facebook users spread conspiracy theories (using 32 public web pages); science news (using 35 such pages); and “trolls,” which intentionally spread false information (using two web pages). Their data set is massive: It covers all Facebook posts during the five-year period. They explored which Facebook users linked to one or more of the 69 web pages, and whether they learned about those links from their Facebook friends.

In sum, the researchers find a lot of communities of like-minded people. Even if they are baseless, conspiracy theories spread rapidly within such communities.

More generally, Facebook users tended to choose and share stories containing messages they accept, and to neglect those they reject. If a story fits with what people already believe, they are far more likely to be interested in it and thus to spread it. Read more at the Trib.

If you want to see more conservatively based news in your feed, take matters into your own hands and spread the message yourself. Share this, and any other conservative news, in your feed and on your favorite pages. Ultimately, Facebook is a publicly held company, and in the end it’s gonna be about the dollar. So control the messaging yourself.
