
Row over AI that 'identifies gay faces'

this

I respect the science, but I can easily see the dangers of such tech if public opinion shifts, if oppressive governments want to mass-target homosexuals, or even if smaller groups of people want to make others' lives a living hell.

I don't even think it has to go that far. If this thing is accurate enough, it takes away the power for people to choose if and when they come out and to whom. Being outed to your super religious family by an AI would be a nightmare.
 

Alx

Member
For their study, the researchers trained an algorithm using the photos of more than 14,000 white Americans taken from a dating website.
(...)

"This research isn't science or news, but it's a description of beauty standards on dating sites that ignores huge segments of the LGBTQ (lesbian, gay, bisexual, transgender and queer/questioning) community

That's basically where I stand based on that info. Well, it is science, but its interpretation is hasty. The learning algorithm has found trends among self-selected photographs, taken for a specific purpose (looking good), from a specific part of the gay/hetero population. And it may be interesting from a sociological point of view, like "beauty standards favor narrower/wider jaws" or something like that.
But it doesn't seem accurate to conclude they found a reliable "gay detector" for the general population, or some proof of biological explanations/measurements of homosexuality (even disregarding whether such a thing is possible, and what the consequences would be).
 

BajiBoxer

Banned
I still don't understand why it isn't a positive overall by proving people are born gay and it's not a choice or something that can be burned out of a person somehow.

That's how it should be if proven correct, but that's not how billions of people think. There are also multiple genocidal efforts out there targeting homosexuals, backed in many cases by over 1000 years of religious teachings. This software could be used in a manner similar to Hitler's IBM-developed database, which was used to identify people of Jewish descent.
 
Personally, I actually think this has a lot of potentially good applications. The fact that it can tell if someone is gay just from an image leads me to wonder what other things it could deduce from images. For example, could it also be taught to detect people with cancer or the early stages of other diseases?

That being said, I do think there needs to be oversight on what functionality people research and develop in AI. This is one such area that obviously presents significant risks to people, and feels pretty unethical. It also kinda raises the question of why they chose to teach this AI this particular function over other applications.
 
Now imagine if this becomes software that can be sold, regardless of the cost. What if Saudi Arabia, China, or Russia get hold of this and start grouping the people it marks as gay? Even with its current accuracy of 91%, it will at least leave people under suspicion. People have to realize that LGBTQ people are persecuted, and could potentially be killed, in many parts of the world. Think about this and do some research before dismissing the consequences. It's really unethical, at the least.
 
Didn't this study ignore that bisexuality exists? So automatically it's a load of nonsense.

Exactly. Right from the get-go, it ignores the fact that human beings are sexually fluid, and looking at a goddamn picture of a human being won't tell you anything.

Which is why this whole thing is totally flawed and nonsense.
 

ttimebomb

Member
The point of the study was to show how machine learning software could be used to harm...

Anyone getting mad at Stanford didn't read the article.
 

stupei

Member
It seems like this is based on the idea that people who would self-identify as closer to a Kinsey 6 often exhibit different levels of testosterone and estrogen than a Kinsey 0, which would perhaps alter the development of certain features, particularly bone structure. But facial features tend to vary somewhat based on ethnic group, and this was explicitly created using white people from a dating site. The article uses the singular, so I imagine it was one site, one that doesn't cater to a wide range of the queer community and so wouldn't have an accurate sample of what a wider range of LGBTQ people look like. Seems like a sample with a pretty clear bias.

Does this theory just ignore the existence of bisexuals? Or are they magically caught between these biological markers that are meant to clearly identify sexuality? And this seems to ignore trans people as well.

Do asexuals just not have faces?

Can't help but think of something I read before: when gay people talk about gaydar, it's a means of forming a community, but when straight people talk about gaydar it often feels like a threat.
 
Exactly. Right from the get-go, it ignores the fact that human beings are sexually fluid, and looking at a goddamn picture of a human being won't tell you anything.

Which is why this whole thing is totally flawed and nonsense.

And yet it's 91% accurate when talking about people who identify as gay or straight.
 

Eridani

Member
Now imagine if this becomes software that can be sold, regardless of the cost. What if Saudi Arabia, China, or Russia get hold of this and start grouping the people it marks as gay? Even with its current accuracy of 91%, it will at least leave people under suspicion. People have to realize that LGBTQ people are persecuted, and could potentially be killed, in many parts of the world. Think about this and do some research before dismissing the consequences. It's really unethical, at the least.

The study used well known, freely accessible methods. There's not really much to "get a hold of" here. If countries wanted to use similar technologies to profile people, they could do so regardless of this study.
 

GeoNeo

I disagree.
Now imagine if this becomes software that can be sold, regardless of the cost. What if Saudi Arabia, China, or Russia get hold of this and start grouping the people it marks as gay? Even with its current accuracy of 91%, it will at least leave people under suspicion. People have to realize that LGBTQ people are persecuted, and could potentially be killed, in many parts of the world. Think about this and do some research before dismissing the consequences. It's really unethical, at the least.

AI research is already taking place in China and Russia; there's def an arms race around the world when it comes to AI. If this research is proven to have merit, nothing will stop other countries from developing such AI if they want to use it to hurt or kill people. Sadly, humans always twist science to hurt each other.
 
Didn't this study ignore that bisexuality exists? So automatically it's a load of nonsense.

"It's a load of nonsense". No. It isn't. Not necessarily anyway.

The potential for people either being bisexual or incorrect about their own preference is mentioned in the pre-print, and it takes more than a line of feelings-based text to rebut the scientific method.

This post a little further down is why I would defend the work these people did even though I've got no real experience or knowledge of the field.
I get why those organizations are concerned about potential implications of such software, but their response is terrible. What the hell do they know about the science behind it?

"we don't like something so let's immediately call it fake!" Works for vaccinations, works for climate change...

It was good enough to get published in a decent journal (impact factor about 5); either it will be disproven by someone else and retracted, or, until then, I have enough reason to believe it's correct.
 
There's no way they didn't have to make the case for this research to an ethics board before they started.

Don't forget that this ethics board is mostly white people on liberal campuses. Did they think of what will happen to a boy in Ghana or Chechnya if this research was successful? Did they research the level of persecution that LGBTQ people suffer in the world? For some reason I seriously doubt it. I grew up in a country where the legal punishment for anyone who is gay is death. Gay men, killers, and rapists carry the same penalty in my country of birth. Now imagine if my government had an effective method of identifying people who are gay, just like me. There would be no burden of proof needed anymore, no eyewitnesses.
 

stupei

Member
And yet it's 91% accurate when talking about people who identify as gay or straight.

It's 91% accurate with a sample of white people sucking their cheeks in on OK Cupid or whatever.

edit: Wait, that's not even correct. It's only 81% accurate with men and 71% with women.
 

sikkinixx

Member
I get why those organizations are concerned about potential implications of such software, but their response is terrible. What the hell do they know about the science behind it?

"we don't like something so let's immediately call it fake!" Works for vaccinations, works for climate change...
 
It's 91% accurate with a sample of white people sucking their cheeks in on OK Cupid or whatever.

edit: Wait, that's not even correct. It's only 81% accurate with men and 71% with women.

91% with more than one photo of the person. Either way, it's enough to dismiss any argument that it's 'totally flawed' and that 'looking at a picture won't tell you anything'.
 
The study itself is flawed.

Its sample is not representative. Out-of-the-closet white gay males on OkCupid are not representative of the gay community at large.

The software performs only slightly better than the average person. It won't do much more than what your average person can do. The difference is that the software won't be able to understand nuance: that in some cases, a guy with feminine traits won't necessarily be gay. What would be a suspicion coming from your average person, this machine transforms into an objective and undeniable fact.

Why is this software needed, again? I don't see a single reason for this trash to exist other than to discriminate.

Junk science indeed.
 

Eridani

Member
It's 91% accurate with a sample of white people sucking their cheeks in on OK Cupid or whatever.

edit: Wait, that's not even correct. It's only 81% accurate with men and 71% with women.

It's 91% accurate (or rather, the AUC is 91%) when it decides based on 5 pictures, which is something that's pretty commonly done in studies like this.
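
If anyone's wondering why extra photos move the number that much, it's mostly just noise averaging. A toy sketch (made-up scores from a pretend classifier, nothing from the actual paper):

Code:
# Toy demo: averaging noisy per-photo scores over several photos of the
# same person raises the AUC even though the classifier itself is
# unchanged. All numbers here are invented.
import numpy as np

rng = np.random.default_rng(0)

def person_score(n_photos, label):
    # Pretend each photo yields a noisy score centred on the true label.
    return rng.normal(loc=label, scale=1.0, size=n_photos).mean()

labels = rng.integers(0, 2, size=2000)
one_photo = np.array([person_score(1, y) for y in labels])
five_photos = np.array([person_score(5, y) for y in labels])

def auc(scores, labels):
    # AUC = probability a random positive outscores a random negative.
    pos, neg = scores[labels == 1], scores[labels == 0]
    return (pos[:, None] > neg[None, :]).mean()

print("AUC, 1 photo :", round(auc(one_photo, labels), 3))    # ~0.76
print("AUC, 5 photos:", round(auc(five_photos, labels), 3))  # ~0.94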

Focusing on a specific race is also something that's pretty common in facial biometry. A lot of the approaches in use work much worse on non-white faces, so people often focus on only one race for better results.

The study itself is flawed.

Its sample is not representative. Out-of-the-closet white gay males on OkCupid are not representative of the gay community at large.

The software performs only slightly better than the average person. It won't do much more than what your average person can do. The difference is that the software won't be able to understand nuance: that in some cases, a guy with feminine traits won't necessarily be gay.

Junk science indeed.

A non-representative sample doesn't invalidate the entire study. The privacy concern this raises is very real, and was one of the reasons this was published (from the abstract):

Additionally, given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women.

It also performed vastly better than humans. 91% vs 61% is a huge difference.
 

Lijik

Member
I get why those organizations are concerned about potential implications of such software, but their response is terrible. What the hell do they know about the science behind it?

"we don't like something so let's immediately call it fake!" Works for vaccinations, works for climate change...
It's a good thing that quote is exactly what they said, and that the group didn't put out a measured breakdown of why they feel it's a flawed study (it's in the OP), or else you'd look massively foolish.
 
Very easy to go against the scientific method for things you don't want to believe.

As others said, it's positive that it proves people are born that way. But the people who disagree won't really care...

Very easy to go against...
 

stupei

Member
Thanks for the correction on the percentage. My bad.

91% with more than one photo of the person. Either way, it's enough to dismiss any argument that it's 'totally flawed' and that 'looking at a picture won't tell you anything'.

It's enough to dismiss the argument that looking at pictures on a website intended to help people find dates wouldn't indicate things about one's sexual orientation. Gender identity is partially performative. People tend to choose angles and images that present their bodies and even features in particular ways when they are trying to seem sexually desirable. The idea that one set of features would be most ideal within a sexual community is not particularly strange and doesn't necessarily extend to what is average within the community as a whole.

That is to say: the queer people who don't fit into certain boxes aren't necessarily looking for dates on those sites anyway.

Very easy to go against the scientific method for things you don't want to believe.

As others said, it's positive that it proves people are born that way. But the people who disagree won't really care...

Very easy to go against...

I mean, it could also be that queer people might know a lot of people who don't fit these biological markers, as well as those who do, and are skeptical based on purely anecdotal evidence from our own lived experience within the community being studied.
 

Mr.Mike

Member
From the paper.

Finally, the predictability of sexual orientation could have serious and even life-threatening implications to gay men and women and the society as a whole. In some cultures, gay men and women still suffer physical and psychological abuse at the hands of governments, neighbors, and even their own families. Perhaps due to discrimination and stigmatization, gay people are also at a higher risk of depression, suicide, self-harm, and substance abuse (King et al., 2008). Consequently, their well-being and safety may depend on their ability to control when and to whom to reveal their sexual orientation. Press reports suggest that governments and corporations are developing and deploying face-based prediction tools aimed at intimate psycho–demographic traits, such as the likelihood of committing a crime, or being a terrorist or pedophile (Chin & Lin, 2017; Lubin, 2016). The laws in many countries criminalize same-gender sexual behavior, and in eight countries—including Iran, Mauritania, Saudi Arabia, and Yemen—it is punishable by death (UN Human Rights Council, 2015). It is thus critical to inform policymakers, technology companies and, most importantly, the gay community, of how accurate face-based predictions might be.

As far as the machine learning goes the important point here is that there is some information present in photos on dating sites that can be used to greatly improve accuracy in classification. Their theory of what exactly that information is might be wrong, but that wouldn't disprove that there's some information in photographs that can be used for this.
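
To make that concrete: the general shape of this kind of method is per-photo face descriptors fed into a simple linear classifier. A minimal sketch with entirely fake data (the descriptor values, dimensions, and signal strength below are stand-ins, not the paper's actual pipeline):

Code:
# Rough sketch of the method's shape: fixed face descriptors plus a plain
# linear classifier. Everything below is fake stand-in data; real
# descriptors would come from a pretrained face-recognition network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)                    # fake labels
X = rng.normal(size=(1000, 128)) + 0.1 * y[:, None]  # weak fake signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))

Whatever the information in the photos turns out to be, a pipeline like this will pick it up if it's there, which is exactly the point.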
 

psyfi

Banned
Yeah, I unequivocally condemn this technology until humanity gets our shit together and stops killing people for who they are and who they love.
 
It's 91% accurate (or rather, the AUC is 91%) when it decides based on 5 pictures, which is something that's pretty commonly done in studies like this.

Focusing on a specific race is also something that's pretty common in facial biometry. A lot of the approaches in use work much worse on non-white faces, so people often focus on only one race for better results.



A non-representative sample doesn't invalidate the entire study. The privacy concern this raises is very real, and was one of the reasons this was published (from the abstract):



It also performed vastly better than humans. 91% vs 61% is a huge difference.

The 61% figure is with one picture. The "scientists" didn't provide a percentage for people's accuracy with multiple pictures, did they?

And if a person / the software checks their Facebook likes, they will probably come to an accurate conclusion too. They will be wrong in enough instances, though. Both the software and the person. The software is not doing anything groundbreaking, nothing that people are not already able to do. It's trying to pass off cultural traits as biological realism.

Maybe we should also make software to identify people with little hair as baldness-prone people!
 
From the paper.



As far as the machine learning goes the important point here is that there is some information present in photos on dating sites that can be used to greatly improve accuracy in classification. Their theory of what exactly that information is might be wrong, but that wouldn't disprove that there's some information in photographs that can be used for this.

Wow, a machine that does exactly what gay people have done since forever to identify themselves.

Sometimes I forget straight males are so clueless about this type of thing.
 
There's a further detailed comment on the paper from one of the authors, which is also worth reading if you're interested enough to actually read the paper and not jump right to conclusions.
https://docs.google.com/document/d/11oGZ1Ke3wK9E3BtOFfGfUQuuaSMR8AO2WfWH3aVke6U/edit#
Particularly this one early section
We did not build a privacy-invading tool. We studied existing technologies, already widely used by companies and governments, to see whether they present a risk to the privacy of LGBTQ individuals.
which is true; anyone can buy the ability to use the FACE++ software that forms the foundation of the paper, and I'm sure there's plenty of other facial recognition software that could be used similarly.
Wow, a machine that does exactly what gay people have done since forever to identify themselves.

Sometimes I forget straight males are so clueless about this type of thing.

Hey, you're equally clueless about the scientific process, and deliberately making baseless insults about the authors, so it all balances out.
 

VegiHam

Member
Hey, you're equally clueless about the scientific process, and deliberately making baseless insults about the authors, so it all balances out.

Scientists: Yo we taught a robot to gaydar

Gays: Oh wow that's terrifying and could lead to more persecution and even death. I sure hope this is wrong somehow. I'm gunna try and rationalise this so I can sleep at night.

You: OMG science is under attack people these days just don't respect the well established scientific process.

Like you're not wrong but you could be a little more sensitive about something this frightening.
 

Zoe

Member
The point of the study was to show how machine learning software could be used to harm...

Anyone getting mad at Stanford didn't read the article.

Right:
Dr Kosinski says he conducted the research as a demonstration, and to warn policymakers of the power of machine vision. It makes further erosion of privacy “inevitable”; the dangers must be understood, he adds. Spouses might seek to know what sexuality-inferring software says about their partner (the word “gay” is 10% more likely to complete searches that begin “Is my husband…” than the word “cheating”). In parts of the world where being gay is socially unacceptable, or illegal, such software could pose a serious threat to safety. Dr Kosinski is at pains to make clear that he has invented no new technology, merely bolted together software and data that are readily available to anyone with an internet connection.

By knowing this kind of thing is possible, it allows us to take steps to counteract it.
 

Eridani

Member
The 61% figure is with one picture. The "scientists" didn't provide a percentage for people's accuracy with multiple pictures, did they?
So it's 81% vs 61% then. Still a huge, measurable improvement in performance compared to human judges.

And if a person / the software checks their Facebook likes, they will probably come to an accurate conclusion too. They will be wrong in enough instances, though. Both the software and the person. The software is not doing anything groundbreaking, nothing that people are not already able to do. It's trying to pass off cultural traits as biological realism.

Maybe we should also make software to identify people with little hair as baldness-prone people!

Not quite sure I follow. The algorithm works better than humans. It's also not always correct, because that's how machine learning works. People are much less accurate than the software in this case.
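
It's also worth spelling out what "not always correct" means once you leave a 50/50 test sample. A quick back-of-the-envelope (every number here is an assumption for illustration, not from the paper): at a low base rate, most of the people a tool like this flags would actually be straight, which is exactly why the "leaves people under suspicion" worry is real even for an accurate classifier.

Code:
# Base-rate sanity check: a classifier that's right ~81% of the time on a
# balanced sample still mostly flags straight people in a general
# population. All numbers are illustrative assumptions.
sensitivity = 0.81  # assumed true-positive rate
specificity = 0.81  # assumed true-negative rate
base_rate = 0.05    # assumed prevalence in the scanned population

true_pos = sensitivity * base_rate
false_pos = (1 - specificity) * (1 - base_rate)
precision = true_pos / (true_pos + false_pos)
print(f"Share of flagged people who are actually gay: {precision:.0%}")  # ~17%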
 
You: OMG science is under attack people these days just don't respect the well established scientific process.

Like you're not wrong but you could be a little more sensitive about something this frightening.

The authors have verbatim called the result horrifying too. I fully agree it's a horrifying result; but people disregarding a result just because it's scary is how we ended up in this climate change mess, and how people were manipulated into believing vaccines cause autism-related conditions, and I'm sure I could come up with lots of other examples. I'm going to give people shit if they disregard what is currently a valid scientific result based on feelings, whether they like it or not.

A result being terrifying doesn't make it untrue, and nobody has made a reasoned argument for it being untrue (and nobody will in this thread, because it would likely be weeks of dedicated work at minimum). The closest we've gotten is a one-line comment from a research lab in Glasgow, which isn't even close to enough to form an opinion counter to the paper, but I'm sure they will be paying very close attention to the results.
 

g11

Member
On a purely research level, it's kind of fascinating if true, but as far as real-world applications go, I can't conceive of any positive use of the tech. That said, I'm not going to blast them for their research either. Whether or not this research was ever done, facial recognition would inevitably be turned to this type of purpose. It's never the tools that are malevolent, it's the people who wield them.

That said, can we run this thing through a facebook of the Republican Party first? Just for shits and giggles?
 
This needs to be peer reviewed before any conclusions can really be made. However, the idea itself is a double-edged sword, like nearly all in science: it can be used for good and ill, and who gets to decide which is the problem, as it tends to be politicians and authorities, and some would argue they cannot be trusted.
 

mokeyjoe

Member
Very easy to go against the scientific method for things you don't want to believe.

As others said, it's positive that it proves people are born that way. But the people who disagree won't really care...

Very easy to go against...

Er, no. Doesn't prove that at all.

People can be treated differently socially based on their appearance (feminine-looking boys, masculine-looking girls), which could then factor into the way their sexuality develops through puberty.
 

Eridani

Member
This needs to be peer reviewed before any conclusions can really be made. However, the idea itself is a double-edged sword, like nearly all in science: it can be used for good and ill, and who gets to decide which is the problem, as it tends to be politicians and authorities, and some would argue they cannot be trusted.

It's going to be published in a journal, so it was already peer reviewed to a certain extent. Could still be wrong obviously, and due to the attention it's getting I expect a lot of people will be looking at it quite closely.
 
This needs to be peer reviewed before any conclusions can really be made. However, the idea itself is a double-edged sword, like nearly all in science: it can be used for good and ill, and who gets to decide which is the problem, as it tends to be politicians and authorities, and some would argue they cannot be trusted.

As far as I can tell, it's been accepted for publication already and so has been through peer review. It will now be open to criticism from the wider audience in the field, and there will likely be attempts to replicate it. Given the high-profile nature of this (i.e. a bunch of newspaper organisations picking up on it before anyone has really had a chance to look at it in detail), I expect there will be pressure from various groups to try to replicate the results. There will be comments in the journal a couple of months from now, no doubt.
 

Ms.Galaxy

Member
Interesting results that give us a likely link to how genetics and fetal development might have a say in one's sexuality. At the same time, however, this information and technology will be used against LGBT people around the world if it's that easy to acquire. All one needs is to have that type of AI scan an entire database of driver's licences and IDs to classify people as homosexual, and then countries like Russia or Saudi Arabia can start a mass exodus or genocide of gay people. Even companies could use it to ensure they don't hire gay people.

I'm conflicted on this: as someone who loves science, I appreciate the science, but as someone who's pansexual, I'm afraid of the outcome should peer review verify the results.
 

Teletraan1

Banned
Sounds like either Gay Sentinels or Gay Project Insight is just around the corner, which is terrifying, and that's what is triggering this response.
 