Using AI to Compare Bias in Bollywood Movies

The study uses AI to compare gender and skin-colour bias in Bollywood movies with that in Hollywood movies and quantify the differences

India may be fighting hard against racism, but the country’s obsession with fair skin is hard to ignore. Bollywood, the Indian equivalent of Hollywood and the largest film industry in the country, has a long-running infatuation with lighter skin and everything male. A recent AI-powered study quantifies the gender and skin-colour bias that Bollywood movies have perpetuated. Ingrained gender stereotypes are another pest that has infested the industry from its beginning. “A girl who is alone is like an open treasure” is a roughly translated line from a high-grossing Bollywood movie, and its sexism speaks for itself. This is only the tip of the iceberg; dig deeper and you will find many more cases of discrimination and under-representation in movies.

The study was conducted by researchers Kunal Khadilkar and Ashiqur R. KhudaBukhsh at Carnegie Mellon University’s (CMU) Language Technologies Institute and is co-authored by Tom Mitchell, a Founders University Professor at CMU. The research paper, titled “Gender Bias, Social Bias And Representation: 70 Years Of B(H)ollywood”, puts the gender bias into numbers. According to the study, the AI-driven pipeline analyzed data sets comprising the subtitles of almost 1,400 Bollywood movies from the last seven decades, along with a comparable set of Hollywood movies. NLP tools provided the statistical analysis, and the researchers believe the study can help us understand the cultural messages movies carry.

The experiment revealed gender bias in Bollywood by measuring the prevalence of male characters with two tools: the Male Pronoun Ratio (MPR) and the Word Embedding Association Test (WEAT). MPR is a metric that counts the occurrences of male pronouns in subtitles and compares the count with that of female pronouns.
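The two metrics can be illustrated with a short sketch. This is a minimal illustration, not the authors' code: the subtitle snippet, the toy two-dimensional "embeddings", and the function names below are all invented for the example. MPR simply counts male versus female pronouns, and a WEAT-style association score compares how strongly a word's vector leans toward one attribute set (e.g. male terms) versus another via cosine similarity.

```python
import math
import re

MALE_PRONOUNS = {"he", "him", "his", "himself"}
FEMALE_PRONOUNS = {"she", "her", "hers", "herself"}

def male_pronoun_ratio(subtitle_text):
    """MPR: share of male pronouns among all gendered pronouns."""
    words = re.findall(r"[a-z']+", subtitle_text.lower())
    male = sum(w in MALE_PRONOUNS for w in words)
    female = sum(w in FEMALE_PRONOUNS for w in words)
    return male / (male + female) if (male + female) else 0.0

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def weat_association(word_vec, attr_a, attr_b):
    """WEAT-style s(w, A, B): mean cosine to set A minus mean cosine to set B."""
    mean_a = sum(cosine(word_vec, a) for a in attr_a) / len(attr_a)
    mean_b = sum(cosine(word_vec, b) for b in attr_b) / len(attr_b)
    return mean_a - mean_b

# Toy example: made-up 2-d embeddings for a target word and two attribute sets.
career = [1.0, 0.1]
male_attrs = [[0.9, 0.2], [1.0, 0.0]]
female_attrs = [[0.1, 1.0], [0.0, 0.9]]

print(round(male_pronoun_ratio("He said she would meet him."), 2))   # 0.67
print(weat_association(career, male_attrs, female_attrs) > 0)        # True
```

A positive association score means the target word sits closer to the first attribute set in embedding space; applied over whole subtitle corpora, such scores expose systematic associations rather than individual lines.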

Another technology used in the study is BERT, a language model that can perform Cloze (fill-in-the-blank) tests. Here, the researchers fed the model the sentence “A beautiful woman should have _____ skin”. The paper reports that the base BERT model suggested ‘soft’, while a BERT model fine-tuned on the text of these movies suggested ‘fair’. This is just one part of the study; the research also analyzed subtler biases in dialogue, religious and geographical representation, and more.
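The mechanics of that experiment can be mimicked without BERT itself, which is far too heavy to reproduce here. The toy scorer below is a stand-in of my own devising, not the study's setup: it "trains" on a corpus by counting word frequencies and fills the blank with the most frequent candidate. The two miniature corpora are invented, but the sketch shows the key idea, namely that the same Cloze prompt yields different answers depending on what text the model was trained on.

```python
import re
from collections import Counter

def train_counts(corpus):
    """Count word frequencies in a corpus (a crude stand-in for a language model)."""
    return Counter(re.findall(r"[a-z]+", corpus.lower()))

def cloze_fill(counts, candidates):
    """Fill the blank with the candidate word the 'model' has seen most often."""
    return max(candidates, key=lambda w: counts[w])

# Invented miniature corpora standing in for base vs. movie-subtitle training text.
base_corpus = "a soft voice and soft skin and a soft smile"
film_corpus = "fair skin fair bride fair complexion praised again and again"
candidates = ["soft", "fair"]

print(cloze_fill(train_counts(base_corpus), candidates))  # soft
print(cloze_fill(train_counts(film_corpus), candidates))  # fair
```

A real masked language model scores candidates in context rather than by raw frequency, but the contrast is the same: the fill-in reflects the biases of the training text.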

The study also tracked the words associated with dowry and how they evolved across generations. In addition, the experiment noted a considerable shift in the ratio of babies born in films: movies from the 1960s through the 1990s overwhelmingly depicted the birth of male children, whereas films from the 2000s onward show an almost equal number of male and female births.

Disruptive technology has worked its way into every part of our lives. Using AI and machine-learning models to analyze films for gender bias is a big step, since it projects the inherent biases statistically and helps us understand the problem better.

On Similar Notes

The lack of gender diversity, and bias more broadly, is not unique to Bollywood; Hollywood and world cinema carry the same toxic perceptions. The Geena Davis Institute on Gender in Media collaborated with Google and USC Viterbi School of Engineering’s Signal Analysis and Interpretation Laboratory (SAIL) to build machine-learning software that identifies and measures women’s share of screen presence. The tool, known as the Geena Davis Inclusion Quotient (GD-IQ), can identify a character’s gender, their screen time, and how long they speak. The award-winning actor Geena Davis initiated the project after recognizing the under-representation of women and the overwhelming presence of men in movies. The tool analyzed the highest-grossing movies of 2014-16, and the results showed that men are seen and heard far more than women, and that Academy Award-winning movies barely featured women at all. The tool is capable of accurate, automated analysis of large volumes of footage.

A Texas-based company, StoryFit, developed an AI tool in 2018 that analyzes movie scripts to identify characters, their emotions, and gender biases. The tool revealed the stereotypical, biased portrayal of women and their personalities in Hollywood movies: women are depicted as agreeable and less outgoing, and they talk less about the core subject of the movie. Machine learning, sentiment analysis, and AI are the basic technologies behind the software.

Apart from these research efforts, Hollywood is also leveraging AI to gauge the value of movies and stars. Last year, Warner Bros. signed a deal with Cinelytic to use its AI-driven tool, which helps filmmakers understand the value of a star and predict how a movie will perform on release. It analyzes a project and provides insights on potential risks and on how to manage the release, distribution, promotion, and so on.

All these experiments and initiatives show the impact of AI and other modern technologies on the entertainment industry: they can quantify biases and audience demands accurately, making the problems easier to understand and leaving less room for controversy and debate.
