Researchers have developed an artificial intelligence capable of identifying, for each of us, the facial features we find attractive and, on that basis, of creating portraits that match our personal criteria.
- By building an algorithm and connecting it to EEG data from volunteers, the researchers were able to interpret brain responses and then generate attractive, personalized face images.
- With an accuracy of about 80%, the approach could also be applied to other cognitive functions such as perception and decision-making.
Although personal and subjective, beauty is a concept that artificial intelligence can decipher and reproduce. For example: if the Mona Lisa’s smile moves you, an analysis of your reactions would allow the AI to draw … the Mona Lisa!
Researchers from the University of Helsinki and the University of Copenhagen have managed to program an artificial intelligence to understand our subjective notions of what makes a face attractive. In the journal IEEE Transactions on Affective Computing, they explain that they used artificial intelligence to interpret brain signals, then combined the resulting brain-computer interface with a generative model of artificial faces. The AI was thus able to create faces matching each participant’s preferences.
As Michiel Spapé, senior researcher and professor at the Department of Psychology and Logopedics at the University of Helsinki, points out, attraction is “associated with cultural and psychological factors that probably play an unconscious role in our individual preferences”. “Indeed, it is often very difficult to explain what makes something, or someone, beautiful: beauty is in the eye of the beholder.”
An 80% reliable algorithm
To develop this new AI, the researchers tasked a generative adversarial network (GAN) with creating hundreds of artificial portraits. The images were shown, one by one, to 30 volunteers who were asked to pay attention to the faces they found attractive, while their brain responses were recorded by electroencephalography (EEG).
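The article does not give the researchers’ code, but the stimulus-generation step can be pictured as sampling random latent vectors and turning each one into a portrait. The sketch below is illustrative only: the latent size, resolution and the stand-in linear “generator” are assumptions chosen so the example runs on its own, not the actual face GAN used in the study.

```python
# Minimal sketch (not the authors' code): sampling latent vectors for stimulus faces.
# A real study would render images with a trained face GAN; a stand-in linear
# "generator" keeps this example self-contained and runnable.
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 512        # typical GAN latent size (assumption)
IMAGE_SIDE = 64         # stand-in output resolution (assumption)
N_STIMULI = 240         # "hundreds of artificial portraits"

# Stand-in for a trained GAN generator: maps a latent vector to an "image".
projection = rng.normal(size=(LATENT_DIM, IMAGE_SIDE * IMAGE_SIDE))

def generate_face(latent: np.ndarray) -> np.ndarray:
    """Placeholder for a GAN generator call; returns a fake 64x64 image."""
    return np.tanh(latent @ projection).reshape(IMAGE_SIDE, IMAGE_SIDE)

# Each stimulus face is defined by the latent vector it was generated from;
# these latents are what the brain-computer interface later operates on.
stimulus_latents = rng.normal(size=(N_STIMULI, LATENT_DIM))
stimulus_faces = np.stack([generate_face(z) for z in stimulus_latents])
print(stimulus_faces.shape)  # (240, 64, 64)
```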
The procedure was reminiscent of the “swipe” mechanism of the dating app Tinder. “Participants ‘swiped right’ when they encountered an attractive face. Here, however, they did not have to do anything but look at the images. We measured their brain’s immediate reaction to these images,” explains Professor Spapé.
The researchers then analyzed the EEG data and connected it to a generative adversarial network via a brain-computer interface. “Such a brain-computer interface is able to interpret users’ opinions about the attractiveness of a series of images. By interpreting these opinions, the AI model that reads the brain responses and the generative neural network that models the face images can together produce an entirely new face image combining what a particular person finds attractive,” explains Professor Tuukka Ruotsalo, who leads the project.
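In spirit, this step amounts to classifying each EEG response as “found attractive” or not, then combining the latent vectors of the preferred faces into a single personalized latent that the generator can turn into a new portrait. The sketch below illustrates that idea with simulated data and a simple linear discriminant classifier; all names, sizes and the averaging rule are assumptions, not the study’s actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): decode "attractive vs. not" from EEG
# features and fold the preferred faces' latent vectors into one personalized latent.
# EEG features, labels and latents are simulated; names are illustrative only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
N_STIMULI, N_EEG_FEATURES, LATENT_DIM = 240, 32, 512

# Per-face EEG features (e.g. averaged post-stimulus amplitudes) and the latent
# vector each stimulus face was generated from.
eeg_features = rng.normal(size=(N_STIMULI, N_EEG_FEATURES))
stimulus_latents = rng.normal(size=(N_STIMULI, LATENT_DIM))

# Training labels would come from calibration data in a real experiment;
# random placeholders here so the example runs.
labels = rng.integers(0, 2, size=N_STIMULI)  # 1 = response marked "attractive"

# 1) The brain-computer interface: a classifier that reads preference from EEG.
bci = LinearDiscriminantAnalysis()
bci.fit(eeg_features, labels)
predicted_attractive = bci.predict(eeg_features).astype(bool)

# 2) Combine what this person found attractive: average the latents of the faces
#    whose brain response was classified as positive; feeding this latent back
#    into the generator yields an entirely new, personalized face.
personal_latent = stimulus_latents[predicted_attractive].mean(axis=0)
print(personal_latent.shape)  # (512,) -> input for the face generator
```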
New portraits were then generated for each participant, predicted to be personally attractive to them. The researchers tested these images in a double-blind procedure and found that they matched the subjects’ preferences with more than 80% accuracy.
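The evaluation described here boils down to a simple count: when the personalized image is shown blind alongside other images, how often is it the one the participant prefers? A toy sketch, with made-up ratings rather than the study’s data:

```python
# Illustrative scoring of a double-blind preference test (simulated ratings only):
# count how often the personalized image is preferred over a control image
# whose origin is hidden from both the participant and the experimenter.
import numpy as np

rng = np.random.default_rng(2)
n_trials = 100

personalized_ratings = rng.integers(3, 6, size=n_trials)  # ratings on a 1-5 scale
control_ratings = rng.integers(1, 5, size=n_trials)

match_rate = np.mean(personalized_ratings > control_ratings)
print(f"personalized image preferred in {match_rate:.0%} of trials")
```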
A better understanding of subjective preferences
According to the authors of the study, this work could help computers become increasingly good at learning and understanding subjective preferences, through the interaction between AI models and brain-computer interfaces.
“If this is possible for something as personal and subjective as attractiveness, we could also look at other cognitive functions such as perception and decision-making, at identifying stereotypes or implicit biases, and at better understanding individual differences,” concludes Professor Spapé.