What happens when technology not only knows who you are but knows what you are, what you do, and what you might do, à la the movie Minority Report? Google and Facebook already collect staggering amounts of data. Facial recognition has been around for a decade and is more accurate than most people realize.
And now, a facial recognition algorithm can, with 81% accuracy for white men and 71% accuracy for white women, predict one's sexual orientation. White gay people, it turns out, have a "gay face."
The Gay & Lesbian Alliance Against Defamation (GLAAD) issued this statement:
“Technology cannot identify someone’s sexual orientation. What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated,” said Jim Halloran, GLAAD’s Chief Digital Officer. “This research isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites.”
This is an interesting assertion. First off, the algorithm looked at multiple factors, not just style and trends. But even if it had looked at those things alone and come up with the same findings, that would still be a scientific basis for a hypothesis far stronger than random chance. Second, the study was obviously attempting to control for variables, and race is one of them. What's more invasive about this finding than, say, a search history that would reveal the same thing?
Tech companies that sell data on individuals to other companies are interested in people of all persuasions so they can market to them more accurately. If a person's likes, interests, searches, and friend groups don't reveal a user's sexuality, sophisticated photo-analysis algorithms could fill the gap, and probably already are.
What happens when a similar algorithm accurately predicts sociopaths based on facial structure? How about pedophiles? How about conservatives or liberals?
The concern among pro-life people has always been that genetic testing might afford parents not only the ability to select traits they deem desirable but also to eliminate traits they don't want. That technology seems further off.
Computer algorithms may present the more pressing predictive threat. Google just fired an employee for publicly sharing his feedback about hiring practices. Imagine what Google could do with the ability to predict behavior by reading an employee's face. Now imagine it selling that ability to others.
I’m sure it’s nothing to worry about.