
New research published in American Psychologist showed that even neutral facial expressions can reveal someone's political beliefs. 

However, this study raises serious privacy concerns, given that facial recognition technology can function without a person's permission. 

The study found that an AI algorithm can determine a person's political convictions with accuracy comparable to how well job interviews predict job performance, or how strongly alcohol use predicts aggressiveness.

Michal Kosinski, the study's lead author, told Fox News Digital that 591 people participated in the experiment. The participants were asked to answer questions about their political orientation. 

An AI algorithm then extracted what Kosinski called a numerical "fingerprint" from photographs of the participants' faces. These fingerprints were compared against the survey responses to predict the respondents' political opinions.   
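The pipeline described above can be sketched in a few lines: reduce each face photo to a numerical embedding (the "fingerprint"), then fit a simple classifier mapping embeddings to self-reported orientation. This is a minimal illustration, not the study's actual code; the embeddings below are random stand-ins, and all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 591, 64                         # 591 matches the study's sample size; d is arbitrary
X = rng.normal(size=(n, d))            # stand-in face "fingerprints" (embeddings)
y = rng.integers(0, 2, size=n)         # 0/1 stand-in for survey-reported orientation

# Hold out part of the sample, fit a least-squares linear classifier on the rest,
# and threshold its output at 0.5 to get a binary prediction.
train, test = slice(0, 450), slice(450, n)
w, *_ = np.linalg.lstsq(X[train], y[train].astype(float), rcond=None)
pred = (X[test] @ w > 0.5).astype(int)
accuracy = float((pred == y[test]).mean())
print(round(accuracy, 2))  # should hover around chance here, since the data is random
```

On random stand-in data the classifier can do no better than chance; the study's claim is that real face embeddings carry enough signal to push this accuracy meaningfully above it.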

"I think that people don't realize how much they expose by simply putting a picture out there," Kosinski, an associate professor of organizational behavior at Stanford University's Graduate School of Business, said. 

He said it is widely recognized that people's sexual orientation, political orientation, and religious views are sensitive and should be protected. It used to be different, he added: in the past, anyone could open another person's Facebook profile and see their political views, their likes, and the pages they followed. 

Facebook closed off that access many years ago, once it became obvious to journalists, the company, and lawmakers that it was inappropriate and too risky. 

Profile pictures, however, remain public, Kosinski explained: anyone can still go to Facebook and view a person's photo, even a complete stranger who has never met them. 

Their study, he noted, suggests that exposing such a picture is equivalent to simply telling people your political orientation.