A new study has found that social networks' news feeds may have less control over which news articles users read than the users' own choices do.

The research shows that users of Facebook and similar sites tend to click on articles they are likely to agree with. While the sites' algorithms, which sort and present material based on what users previously liked, strongly influence what users read, the users' own choices have an even greater impact, according to The Monterey Herald.

Researchers from Facebook and the University of Michigan's School of Information set out to measure how social media shapes the content Americans are exposed to, analyzing the activity of 10 million users and how it relates to their ideological preferences.

While the team found that exposure depends on the stories users' friends share as well as the stories users click on, it also found that users were still shown stories from different points of view, Wired reported. The results showed that Facebook's algorithm suppresses challenging opinions only about eight percent of the time for liberals and five percent of the time for conservatives, while users' own choices about which stories to click led to six percent less exposure to opposing content for liberals and 17 percent less for conservatives.

"Our work suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals," the study authors wrote, The Monterey Herald reported.

Critics of the study note that users can only click on news links the algorithm provides, and that the links a user clicks feed back into the algorithm. Wired also pointed out that the users in the study had chosen to identify themselves as either liberal or conservative in their profiles, and that they make up only about nine percent of Facebook's 1.4 billion users. Those who don't reveal their political affiliations may not make the same choices about which links to click as users who do.

The researchers also noted that Facebook users do not click only on links that match their ideology as much as people might assume: the average user who publicly shares political opinions has one online friend with opposing viewpoints for every four who share the same views, The Monterey Herald reported. That ratio suggests users are exposed to a fair amount of diverse content.

The study was published in the May 7 issue of the journal Science.