Recent advances in AI technology are now allowing machines to tell a person’s sexual orientation just by analyzing pictures.
A research study conducted at Stanford University by Michal Kosinski and Yilun Wang has shown that AI technology can infer people's sexual orientation by analyzing their faces.
According to the Economist, the machine learning software used by the duo for their research can spot signs of sexuality by referring to the subtle differences in facial structure. Kosinski suggested that with the right data sets, an AI could see patterns in behavior that humans could not.
Furthermore, the researchers claimed that this kind of AI technology could also be used to spot other intimate traits, such as intelligence quotient (IQ) and political views.
Artificial Intelligence is Quickly Changing the World
The series of innovations built on artificial intelligence and machine learning today is a clear indication of just how far AI technology has come. For years, tech giants have developed AI-based systems and devices to take advantage of the technology's many benefits.
For instance, Facebook has been using machine learning algorithms to produce maps of poor regions in extreme detail. Similarly, Google was recently reported to be updating its Google Street View cameras to incorporate AI analysis. Some medical researchers are also exploring the potential of training AI on smartphones to detect cancerous lesions.
With the way AI technology is revolutionizing our world, it comes as no surprise that it can now be used to study and understand human behavior. Kosinski and Wang's research suggests that certain facial signs unnoticeable to humans may be perfectly visible to machines.
How AI Technology Can See Through a Person
The research program spearheaded by Kosinski and Wang drew on 36,630 men and 38,593 women as subjects. From these individuals, all members of a popular American dating site, the two downloaded 130,741 images of men and 70,360 images of women.
The downloaded pictures were run through a basic facial-detection technology "to select all images which showed a single face of sufficient size and clarity to subject to analysis." The process left the researchers with 35,326 pictures of 14,776 people, with gay and straight subjects equally represented among both men and women.
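The filtering criterion described above can be sketched as a simple rule applied to a face detector's output: keep an image only if exactly one face was found and it is large enough to analyze. The detector itself, the bounding-box format, and the size threshold below are illustrative assumptions, not the study's actual parameters.

```python
# Minimal sketch of the image-filtering step: accept only images where a
# face detector (any library) returned exactly one face, and that face is
# large relative to the image. The 0.2 fraction is a hypothetical threshold.

def keep_image(face_boxes, image_width, min_face_fraction=0.2):
    """face_boxes: list of (x, y, w, h) tuples produced by a face detector."""
    if len(face_boxes) != 1:  # reject images with zero faces or multiple faces
        return False
    _, _, w, _ = face_boxes[0]
    return w >= min_face_fraction * image_width  # reject faces that are too small

# One sufficiently large face passes; two faces or a tiny face do not.
print(keep_image([(10, 10, 80, 80)], 200))                  # True
print(keep_image([(0, 0, 50, 50), (100, 0, 50, 50)], 200))  # False
print(keep_image([(10, 10, 20, 20)], 200))                  # False
```

In practice the boxes would come from an off-the-shelf detector; the point here is only the "single face of sufficient size" rule the researchers describe.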
The filtered images were then fed into a separate piece of software called VGG-Face, which converts each face into a long strip of numbers representing that person, the so-called 'faceprint.' Using a simple predictive model known as logistic regression, the researchers then searched for correlations between the features of those faceprints and their owners' sexuality as reported on the dating website.
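The pipeline just described, embed each face as a vector, then fit a logistic regression on those vectors, can be sketched as follows. The random 128-dimensional "faceprints" below are synthetic stand-ins: the real VGG-Face descriptor comes from a pretrained deep network, and the class separation injected here is purely for illustration.

```python
# Sketch of the classification step: each photo becomes a fixed-length
# embedding (the "faceprint"), and a logistic regression predicts the
# dating-site label from that vector. Embeddings here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d = 400, 128
labels = rng.integers(0, 2, size=n)

# Fake faceprints: the two classes differ slightly along a few dimensions,
# mimicking subtle, statistically detectable facial differences.
faceprints = rng.normal(size=(n, d))
faceprints[labels == 1, :5] += 0.8

clf = LogisticRegression(max_iter=1000).fit(faceprints, labels)

# predict_proba yields a continuous score that can be thresholded, or, as
# in the study, compared between pairs of faces.
scores = clf.predict_proba(faceprints)[:, 1]
print(clf.score(faceprints, labels) > 0.6)  # well above chance on this toy data
```

Logistic regression is deliberately simple: all of the representational heavy lifting is done by the pretrained face-embedding network, and the linear model only finds directions in that embedding space that correlate with the label.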
Astonishingly, the AI outperformed humans at telling homosexual and heterosexual faces apart.
Apparently, the research suggested that gay men and women tend to have 'gender-atypical' features, expressions, and grooming styles. One clear indication was that gay men appeared more feminine and vice versa, which is not exactly a startling revelation to most.
The machine learning system also identified certain trends: gay men tended to have narrower jaws, longer noses, and larger foreheads than straight men, while gay women were found to have larger jaws and smaller foreheads than straight women.
The model correctly distinguished between a gay and a straight person, each randomly chosen from the images, 81% of the time. According to the Economist:
“When shown five photos of each man, it attributed sexuality correctly 91% of the time. The model performed worse with women, telling gay and straight apart with 71% accuracy after looking at one photo, and 83% accuracy after five.
In both cases, the level of performance far outstrips human ability to make this distinction. Using the same images, people could tell gay from straight 61% of the time for men, and 54% of the time for women.”
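The headline figure, distinguishing a randomly chosen gay person from a randomly chosen straight person 81% of the time, is a pairwise comparison: the model is counted as correct whenever it assigns the higher score to the gay face in the pair. A sketch of that metric (which is equivalent to the ROC AUC) is below; the scores are made up for illustration.

```python
# Pairwise distinction metric: over all (gay, straight) pairs, the fraction
# of pairs where the model's score for the gay face is higher. Ties count
# as half-correct. Equivalent to the area under the ROC curve.
def pairwise_accuracy(pos_scores, neg_scores):
    correct = sum(p > n for p in pos_scores for n in neg_scores)
    ties = sum(p == n for p in pos_scores for n in neg_scores)
    total = len(pos_scores) * len(neg_scores)
    return (correct + 0.5 * ties) / total

# Hypothetical model scores for three gay and three straight faces:
print(round(pairwise_accuracy([0.9, 0.7, 0.6], [0.8, 0.3, 0.2]), 2))  # 0.78
```

This framing explains why "81% of the time" can coexist with quite different single-photo accuracies: it measures how often the model ranks a pair correctly, not how often a fixed threshold labels an individual correctly.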
The results of the study were claimed to support the theory that sexual orientation stems from exposure to certain hormones before birth; simply put, the researchers suggest that people are born gay and that being gay is not a choice. Furthermore, the AI's lower success rate with women could also support the view that female sexual orientation is more fluid than men's.
While the research yielded impressive results, it has limitations. First, the images taken from the dating site were more likely to be revealing of sexual orientation; the researchers believe the results would be quite different outside the lab. The study also did not include people of color and gave no consideration to transgender or bisexual people.
Despite the study's obvious limitations, this new AI technology is quite alarming. The researchers noted that billions of facial images are stored on social media sites and in government databases, and these could all be exploited to detect people's sexual orientation without their consent.
Just imagine the horror of discovering that a suspicious spouse has used the technology to determine your sexuality, or of finding teens running the algorithm on themselves or their friends. But what is really frightening is the idea of governments that persecute LGBT people using the technology to track them down.
In their defense, the two researchers said the technology already exists; all they did was expose its capabilities so that governments and companies could put regulations in place to safeguard everyone's privacy.
A report from The Guardian also raised concerns over other scenarios, such as authorities arresting people based on AI predictions that they might commit crimes, a premise popularized by the movie Minority Report.
“AI can tell you anything about anyone with enough data,” said Brian Brackeen, CEO of Kairos, a face recognition company.
“The question is as a society, do we want to know?”