Female bots are perceived to have more positive human qualities, such as warmth, experience and emotion, than male bots, and this greater humanness leads consumers to prefer female artificial intelligence, a new study has found.
Consumers think that female AI is not only more human but also more trustworthy and more likely to consider the unique needs of users, according to a new study published March 22 in Psychology & Marketing. The large-scale study is the first of its kind about gendered AI and is key to understanding how current technology may objectify women.
The preference for female AI is evident in the market today, from virtual assistants such as Siri and Alexa to robots such as Sophia (who recently sold an NFT for nearly $700,000). Machines lack warmth and friendliness, and humanizing technology makes users more apt to trust robots — especially when the robots are cast as female.
Previous studies have shown that gendered and realistic robots are seen as more human, but researchers have largely overlooked the differences between female and male bots. Sylvie Borau, lead author of the study and a professor of ethical marketing at TBS Business School in Toulouse, France, said this distinction is important.
Borau and her colleagues used "what really makes something human as a building block [to] explain why female features are so prominent in AI settings." The authors tested human characteristics, including competence, warmth and experience, across five separate studies. Online interactions and surveys with more than 3,000 people compared gender perceptions not just in machines but in humans and animals, too.
In one study, participants were randomly assigned to either a female or a male chatbot and then answered questions about the chatbot's humanness, such as how mechanical, cold and evolved the bot was perceived to be. The male and female chatbots' ages, facial expressions and physical attractiveness were identical. Even their names, Oliver and Olivia, were similar.
In another study, participants were shown an image called "The March of Progress," which illustrates man's progression from ape to human. Though the drawing is a famous depiction of human evolution, there has never been a female version of this "Road to Homo Sapiens." The authors had to create one for their research, which measured dehumanization in humans and animals.
Participants' views on gender were shaped by characteristics such as age and parenthood, which have been shown to increase gender biases. People who endorse gender stereotypes are more likely to dehumanize women, whether human or machine, according to the researchers. The study also found that older participants were less likely to humanize bots and, perhaps unsurprisingly, that narcissists were more negative toward bots.
Overall, participants perceived human women as having more positive human qualities than human men on nearly every variable in the study. People drew both subtle and blatant links between female robots and human character. In health care services, female bots and chatbots were thought to offer more personalized treatment solely because of their gender. In all cases, male bots were perceived as less human.
But relying on female AI creates an ethical issue.
Transforming women into objects as virtual bots "could actually lead to women's objectification," Borau told The Academic Times. Using a female personality to make technology seem more human promotes an "idea that women are simple tools designed to fulfill their owners' needs," the authors said.
And most virtual programs borrow only a woman's mind, while the woman's body disappears. Just as women are "sexually objectified based on their physical appearance in advertising," female bots are "mentally objectified in AI," Borau explained.
Gender-neutral robots are one possible solution, though the researchers see their voices as a big issue. "AI engineers will have to work very hard to make gender-neutral robots sound human," because their voices tend to sound more robotic, they said.
Another solution is to achieve gender parity in AI, which will be the focus of this team's future research. While governments and workplaces often struggle to achieve equal representation of women and men, the researchers believe gender parity would be easy to reach in AI: companies choose the sex of a robot as part of its design, so there is no need for gender biases to get in the way of equality. "It's impossible that we're going to let these companies feminize all AI," Borau said.
Borau and her colleagues also plan to research how humans verbally abuse female robots. They assume that people are more likely to make sexist statements to female bots. "I don't understand why there's not more backlash on women in AI," Borau said. "AI should reflect the ideal world we want to live in, not be a selection of the current sexist biases" in our society, she said.
The study, "The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI," published March 22 in Psychology & Marketing, was authored by Sylvie Borau and Samuel Fosso Wamba, TBS Business School; Tobias Otterbring, University of Agder and the Institute of Retail Economics; and Sandra Laporte, Toulouse School of Management.