
Analyzing Gender Bias with AI

13 November 2017
Only one of the two female protagonists in "Frozen" is portrayed with high power and positive agency, according to a new analysis. Image credit: University of Washington.

Computer scientists at the University of Washington have used artificial intelligence to quantify gender-based power imbalances in Hollywood movies.

Using natural language processing tools that employ machine learning, the research team analyzed the language in nearly 800 movie scripts. They found subtle but widespread gender bias in the way male and female characters are portrayed.
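The article does not spell out the team's pipeline, but the general idea can be illustrated with a toy version: parse each character's lines and tally the verbs against a small lexicon that scores how much power and agency they connote. The Python sketch below assumes spaCy and its en_core_web_sm model are installed; the lexicon entries, character names, and lines are made up for illustration and are not the researchers' actual data.

```python
import spacy
from collections import defaultdict

# Hypothetical miniature lexicon: each verb gets a rough agency and power score.
# The researchers' real lexicons are far larger; these entries are placeholders.
VERB_SCORES = {
    "command": {"agency": 1, "power": 1},
    "demand":  {"agency": 1, "power": 1},
    "beg":     {"agency": -1, "power": -1},
    "wait":    {"agency": -1, "power": 0},
    "decide":  {"agency": 1, "power": 0},
}

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def score_character_lines(lines_by_character):
    """Tally agency/power scores for each character from the scored verbs
    that appear in that character's lines of dialogue."""
    totals = defaultdict(lambda: {"agency": 0, "power": 0, "verbs": 0})
    for character, lines in lines_by_character.items():
        for line in lines:
            for token in nlp(line):
                if token.pos_ == "VERB" and token.lemma_ in VERB_SCORES:
                    scores = VERB_SCORES[token.lemma_]
                    totals[character]["agency"] += scores["agency"]
                    totals[character]["power"] += scores["power"]
                    totals[character]["verbs"] += 1
    return dict(totals)

# Example usage with a toy script fragment (invented lines, not from the film):
script = {
    "ELSA": ["I decide my own fate.", "Leave me alone."],
    "ANNA": ["I beg you to come home.", "Maybe I should wait for help."],
}
print(score_character_lines(script))
```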

One example is “Frozen.” It appears to have two strong female protagonists -- Elsa, the elder princess with powers over snow and ice, and Anna, her sister who spends much of the film on a quest to save their kingdom.

"'Frozen' is an interesting example because Elsa really does make her own decisions and is able to drive her own destiny forward,” observed doctoral student Maarten Sap, lead author of a paper on the research. “Anna consistently fails in trying to rescue her sister, and often needs the help of a man.”

More of their findings:

  • Women were consistently portrayed in ways that reinforce gender stereotypes -- more submissive positions, less agency (control over one’s life) than men
  • Male characters spoke more in imperative sentences ("Bring me my horse"); female characters tended to hedge their statements ("Maybe I am wrong") (see the sketch after this list)
  • Male actors spent more time on screen than female actors and also spoke more, accounting for 71.8 percent of the words spoken across all movies
  • The male characters’ tendency to score higher on both power (authority over another character) and agency dimensions held true throughout all genres, and the same gender bias existed even for films with female casting directors or script writers
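The imperative-versus-hedging contrast in particular lends itself to simple linguistic heuristics. The sketch below is a rough illustration, not the team's actual classifier: it assumes spaCy's en_core_web_sm model is available, flags a sentence as imperative when its root verb is a bare base form with no explicit subject, and flags it as hedged when it contains a word from a short, made-up hedge list.

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

# A short, illustrative hedge list -- not the researchers' lexicon.
HEDGE_WORDS = {"maybe", "perhaps", "possibly", "probably", "might", "guess", "think"}

def is_imperative(sentence):
    """Rough heuristic: the root verb is in base form (tag VB) and has no
    explicit grammatical subject, as in 'Bring me my horse.'"""
    doc = nlp(sentence)
    for sent in doc.sents:
        root = sent.root
        has_subject = any(tok.dep_ in ("nsubj", "nsubjpass") for tok in root.children)
        if root.tag_ == "VB" and not has_subject:
            return True
    return False

def is_hedged(sentence):
    """Rough heuristic: the sentence contains a hedge word, as in 'Maybe I am wrong.'"""
    return any(tok.lower_ in HEDGE_WORDS for tok in nlp(sentence))

print(is_imperative("Bring me my horse."))  # True
print(is_hedged("Maybe I am wrong."))       # True
```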

The team’s work was recently presented at the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP) in Denmark. Their tool offers a much more nuanced analysis of gender bias in fictional works than the Bechdel Test, which evaluates only whether at least two female characters have a conversation about something other than a man.
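For comparison, the Bechdel Test itself is simple enough to sketch in a few lines. The toy check below assumes a script has already been segmented into conversations of (character, line) pairs and that character gender is known; the male-reference word list is illustrative only, and real preprocessing of a screenplay is considerably messier.

```python
# Naive, illustrative Bechdel Test check over a pre-parsed script.
MALE_REFERENCES = {"he", "him", "his", "man", "boyfriend", "husband"}  # illustrative only

def passes_bechdel(conversations, female_characters):
    """Return True if at least one conversation involves two or more female
    characters and none of its lines mentions a man (by this crude word list)."""
    for conversation in conversations:
        speakers = {name for name, _ in conversation}
        if len(speakers & female_characters) < 2:
            continue
        mentions_man = any(
            word.strip(".,!?").lower() in MALE_REFERENCES
            for _, line in conversation
            for word in line.split()
        )
        if not mentions_man:
            return True
    return False

# Example usage with an invented exchange:
convo = [("ELSA", "The gates are open at last."), ("ANNA", "We should celebrate together.")]
print(passes_bechdel([convo], {"ELSA", "ANNA"}))  # True
```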

The team also created a searchable online database showing subtle gender biases in hundreds of Hollywood movie scripts, ranging from cult classics like "Heathers" to romantic comedies like "My Best Friend's Wedding" to war films like "Apocalypse Now."

Next steps for the researchers include broadening the tool not only to identify gender bias, but also to correct for it -- by offering rephrasing suggestions or ways to make language more equal. The methodology could be applied to any text, including books and plays.

"We developed this tool to help people understand how they may be perpetuating these subtle but prevalent biases that are deeply integrated in our language," said senior author Yejin Choi, an associate professor at the university’s Paul G. Allen School of Computer Science & Engineering. “We believe it will help to have this diagnostic tool that can tell writers how much power they are implicitly giving to women versus men."


