A beauty contest was judged by AI and the robots didn’t like dark skin

The first international beauty contest decided by an algorithm has sparked controversy after the results revealed one glaring factor linking the winners.
The first international beauty contest judged by “machines” was supposed to use objective factors such as facial symmetry to identify the most attractive contestants. After Beauty.AI launched this year, roughly 6,000 people from more than 100 countries submitted photos in the hope that artificial intelligence would determine that their faces most closely resembled “human beauty”.
Out of 44 winners, nearly all were white, a handful were Asian, and only one had dark skin.
Winners of the Beauty.AI contest in the category for women aged 18-29.
Alex Zhavoronkov, Beauty.AI’s chief science officer, said: “If you have not that many people of color within the dataset, then you might actually have biased results. When you’re training an algorithm to recognize certain patterns … you might not have enough data, or the data might be biased.”
Humans who create the algorithms have their own deeply entrenched biases. That means that despite perceptions that algorithms are uniquely objective, they can often reproduce existing prejudices.
Bernard Harcourt, a professor of law and political science at Columbia University, said the Beauty.AI results offer “the perfect illustration of the problem”: “The idea that you could come up with a culturally neutral, racially neutral conception of beauty is simply mind-boggling.”
The case is a reminder that “humans are really doing the thinking, even when it’s couched as algorithms and we think it’s neutral and scientific”.
AI does not have a concept of ‘liking’; it looks at a dataset and tries to replicate and generalize human-labelled data. But of course, simply saying the AI did not like dark skin makes for a better headline, so this very misleading story was all over the place.
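The point the commenter makes — that such a system just replicates the statistics of its labelled training data — can be sketched in a few lines of Python. This is purely illustrative: the feature names and numbers are made up and have nothing to do with Beauty.AI’s actual code. A scorer that learns only the average of its training examples will rate anyone far from that average poorly, even when the supposedly “objective” feature (symmetry) is identical:

```python
# Toy illustration (NOT the Beauty.AI code): a "beauty" scorer that
# simply learns the average of its human-labelled training examples.
# Hypothetical feature vector: (symmetry, skin_tone), both scaled 0..1.

def train_prototype(examples):
    """Average the labelled 'attractive' examples into a single prototype."""
    n = len(examples)
    dims = len(examples[0])
    return tuple(sum(face[i] for face in examples) / n for i in range(dims))

def score(prototype, face):
    """Higher = closer to the learned prototype (negative squared distance)."""
    return -sum((p - x) ** 2 for p, x in zip(prototype, face))

# Biased training set: 9 light-skinned examples, 1 dark-skinned.
training = [(0.9, 0.1)] * 9 + [(0.9, 0.9)]
proto = train_prototype(training)

light_face = (0.9, 0.1)   # same symmetry, light skin
dark_face  = (0.9, 0.9)   # same symmetry, dark skin

# Identical symmetry, yet the under-represented group scores lower,
# because the learned prototype sits near the majority of the data.
print(score(proto, light_face) > score(proto, dark_face))  # True
```

Nothing here “dislikes” anything; the imbalance in the labelled data alone produces the skewed ranking, which is exactly the dataset problem Zhavoronkov describes in the article.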
Software programs do not yet have any sense of “qualia.” A program would have to be programmed with a set of parameters defining what a human being considers “beauty”, and those parameters are entirely subjective. If the software were programmed to regard the facial characteristics of a chimpanzee as the ideal form of beauty, it would compare all human faces to the face of a chimpanzee, and the human face most like the chimpanzee’s would get the highest score. Most software programmers are, in any case, of European, Indian or Chinese descent. I would expect Chinese and Indian programmers to produce software that gives a different result; similarly with software produced by native African programmers.
So the AI programmers coded the algorithms/statistics with racism in mind… Enjoy programming…
The problem is with the data you feed it –> http://www.nytimes.com/…/artificial-intelligences-white…
I think AI is just automation on a sophisticated level. Microsoft’s AI had the same problem too… maybe we should agree it’s a product of its human creators and their perception of the world.
AI just doesn’t have good taste :)))