
Using AI to Examine Gender Bias in Hollywood




Earlier this year, data was released highlighting the gender discrepancies in pay for Hollywood stars, with male actors significantly out-earning their female co-stars. The differences don't end with pay, however: a recent study from the University of Washington also highlights the different kinds of roles given to male and female actors.

The researchers used AI to examine the roles played by a wide range of actors in over 800 movies and uncovered subtle but widespread gender bias in the way characters are portrayed.

Gender Bias

The analysis found that women were often portrayed in movies in ways that reinforce existing gender stereotypes. For instance, they often had less agency than men and appeared in submissive positions. This showed up both in the words female characters used and in how they were portrayed in the storylines.

The researchers recruited participants from Mechanical Turk to annotate the verbs in scripts for the agency and power they imply.

"For example, if a female character 'implores' her husband, that implies the husband has a stance where he can say no. If she 'instructs' her husband, that implies she has more power," the authors say. "What we found was that men systematically have more power and agency in the film script universe."

This initial annotation was then used to build an algorithm able to trawl through the scripts and assess the 21,000 characters in those movies. The algorithm accurately identified the gender of each character from the words they spoke and autonomously assigned each one a power score and an agency score.
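The scoring idea described above can be sketched in a few lines. The real system used a large crowd-annotated verb lexicon; the tiny `AGENCY` and `POWER` dictionaries and the `score_character` helper below are purely illustrative assumptions, not the study's actual code.

```python
# Hypothetical mini-lexicon: each verb maps to an implied agency/power
# connotation for its subject (+1 high, -1 low). The real study used a
# crowd-annotated lexicon covering thousands of verbs.
AGENCY = {"instructs": 1, "decides": 1, "implores": -1, "waits": -1}
POWER = {"instructs": 1, "commands": 1, "implores": -1, "obeys": -1}

def score_character(verbs):
    """Average the agency/power connotations of the verbs a character
    is the subject of; unknown verbs count as neutral (0)."""
    agency = sum(AGENCY.get(v, 0) for v in verbs) / len(verbs)
    power = sum(POWER.get(v, 0) for v in verbs) / len(verbs)
    return agency, power

# A character who "instructs" and "decides" scores higher than one who
# "implores" and "waits".
print(score_character(["instructs", "decides"]))  # (1.0, 0.5)
print(score_character(["implores", "waits"]))     # (-1.0, -0.5)
```

In practice the verbs would be extracted from a dependency parse of the script, so each character is linked to the verbs they govern as grammatical subject.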

The team believes that their tool provides a richer and more nuanced analysis of gender bias in fictional works than simpler measures such as the Bechdel Test, which asks only whether a film features at least two female characters who talk to each other about something other than a man.
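For contrast, the Bechdel Test's pass/fail criterion is simple enough to express directly. The scene representation below (speakers plus a topic label) is a hypothetical data model chosen for illustration; real checks would need to extract this from dialogue.

```python
def passes_bechdel(conversations, female_characters):
    """Return True if any conversation involves at least two female
    characters discussing something other than a man."""
    for convo in conversations:
        women = set(convo["speakers"]) & female_characters
        if len(women) >= 2 and convo["topic"] != "a man":
            return True
    return False

# Toy example: two named women discuss the alien, so the film passes.
scenes = [
    {"speakers": {"Ripley", "Lambert"}, "topic": "the alien"},
    {"speakers": {"Ripley", "Dallas"}, "topic": "the mission"},
]
print(passes_bechdel(scenes, {"Ripley", "Lambert"}))  # True
```

The binary outcome makes the contrast with the researchers' approach clear: the test says nothing about *how* the characters who pass it are portrayed, which is exactly the gap the power and agency scores aim to fill.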

The results are certainly interesting: the higher male scores were consistent across every genre of film, whether comedy or horror, sci-fi or drama. Indeed, the bias persisted even when the casting director or screenwriter was female.

"We controlled for this. Even when women play a significant role in shaping a film, implicit gender biases are still there in the script," the authors reveal.

The team hopes to further develop the tool so that in addition to analyzing scripts for gender bias, they can also propose corrections to rectify matters.

"We developed this tool to help people understand how they may be perpetuating these subtle but prevalent biases that are deeply integrated into our language," they conclude. "We believe it will help to have this diagnostic tool that can tell writers how much power they are implicitly giving to women versus men."


Topics:
ai, pay gap, data analytics, bias

Published at DZone with permission of

Opinions expressed by DZone contributors are their own.
