Study conducted by Women’s World Banking finds that fintech firms in emerging markets are missing out on an opportunity to reach 1 billion new customers.
New Delhi: Artificial intelligence systems that predict individuals’ credit scores, used by global financial technology companies in emerging markets, often discriminate against women, a study has found. These companies, which also operate in Indian markets, end up excluding women from loans and other financial services as a result, the report said.
A credit score, calculated on the basis of income and loan and repayment history, among other factors, reflects the creditworthiness of a consumer.
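Actual scoring formulas are proprietary and vary by lender, but a purely hypothetical sketch in Python shows the basic idea of weighting such factors into a single number. The weights, inputs and 300-850 scale below are invented for illustration and do not come from the study:

```python
# Purely hypothetical illustration: real credit-scoring formulas are
# proprietary and far more complex. Weights and the 300-850 scale are
# invented for this example.

def toy_credit_score(repayment_ratio: float, utilization: float,
                     income_stability: float) -> int:
    """Combine normalised inputs (each in [0, 1]) into a 300-850 score."""
    weighted = (0.5 * repayment_ratio        # share of loans repaid on time
                + 0.3 * (1 - utilization)    # lower credit utilisation is better
                + 0.2 * income_stability)    # steadier income is better
    return round(300 + 550 * weighted)

print(toy_credit_score(repayment_ratio=0.95, utilization=0.30, income_stability=0.8))
# 765: a strong score under this toy weighting
```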
The study was conducted by Women’s World Banking, which said financial technology companies in emerging markets are missing out on an opportunity to reach 1 billion new customers and are contributing to the existing “$17 billion gender credit gap”.
Titled ‘Algorithmic Bias, Financial Inclusion and Gender’, the study was funded by the Visa Foundation. It aimed to decode the limitations of digital tools in opening up credit to working women and women entrepreneurs, and it highlighted the biases within AI systems and why they work against women.
While the report was released in February, Women’s World Banking shared the findings in a statement Tuesday.
“Leveraging digital finance is critical to promoting women’s financial inclusion and economic empowerment, and the financial services industry and policymakers in emerging markets need to act immediately to address sexism in credit scoring technology not only because it’s the right thing to do but also to better equip the industry to attract 1 billion new female customers who are currently underbanked,” Mary Ellen Iskenderian, CEO of Women’s World Banking, said in the statement.
Women’s World Banking is a non-profit organisation that “designs and invests” in financial solutions, institutions, and policy environments in emerging markets “to create greater economic stability and prosperity for women”.
‘Algorithms are often biased’
For the purpose of the study, researchers at Women’s World Banking examined the data that digital credit providers collect to build their algorithms. They also conducted interviews with “thought leaders and practitioners across the digital credit space”, including data scientists, academics, entrepreneurs, app developers and coders.
The study found that these algorithms are often tilted in favour of men because the unconscious bias of those creating the systems seeps in. Another reason, it said, could be the “incomplete, faulty, prejudicial” data sets that eventually inform the algorithm. Further, a major chunk of data sources is susceptible to “gender-based bias”, the report noted.
Finally, the study found that those developing algorithms are typically based in the US, male and highly paid, characteristics that are not representative of the people these algorithms are used to assess.
The study also offers recommendations, which it said are inexpensive and easy to implement, to help financial institutions reduce this bias.
For instance, it said, regularly evaluating gender-based discrepancies in data could help reduce this bias. The report also asserted that addressing bias must be made “everyone’s responsibility”, from data scientists to the CEOs of these financial institutions.
It also recommended “de-biasing scoring models by creating audits or checks to sit alongside the algorithm, and/or running post-processing calculations to consider whether outputs are fair”.
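The report does not prescribe a particular implementation, but the post-processing check it describes can be sketched in a few lines of Python. The example below uses a hypothetical decisions table with `gender` and `approved` columns and flags cases where the gap in approval rates between groups exceeds a chosen threshold; the threshold itself is a policy choice, not something the study specifies:

```python
# Illustrative sketch of a post-processing fairness audit; the column
# names, schema and threshold are hypothetical, not taken from the study.

import pandas as pd

def approval_rate_audit(decisions: pd.DataFrame, threshold: float = 0.05) -> dict:
    """Compare approval rates across genders in a table of loan decisions."""
    rates = decisions.groupby("gender")["approved"].mean()  # approval rate per group
    gap = rates.max() - rates.min()                         # largest disparity
    return {
        "approval_rates": rates.to_dict(),
        "gap": gap,
        "needs_review": gap > threshold,  # flag the model for a closer look
    }

# Example: 200 decisions in which women are approved less often than men
data = pd.DataFrame({
    "gender": ["F"] * 100 + ["M"] * 100,
    "approved": [True] * 55 + [False] * 45 + [True] * 70 + [False] * 30,
})
print(approval_rate_audit(data))
# {'approval_rates': {'F': 0.55, 'M': 0.7}, 'gap': ~0.15, 'needs_review': True}
```

A check of this kind “sits alongside the algorithm”, in the report’s phrasing, rather than requiring the scoring model itself to be rebuilt.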