Gendered credit scoring increases women's chance of credit acceptance

Following news of the probe into Apple Card’s creditworthiness algorithm after claims of sexism, research from the University of Edinburgh Business School reveals that including gender in credit scoring models improves women’s chances of being granted credit.

The researchers analysed 79,386 customers who were issued a car loan by a major European bank between 2003 and 2009, using four different credit scoring models, and found gender to be a statistically significant variable.

However, the research finds that when gender is left out of the credit scoring model, outcomes for men and women are still not equal, and the omission in fact disadvantages women.

The use of gender in decision making is prohibited in most developed countries. In the EU, the European Equal Treatment in Goods and Services Directive ensures that protected characteristics such as gender are not used as variables.

Including gender does not affect the predictive power of the models, nor does it affect lenders, who can maintain similar levels of bad debt. When gender was included, female loan applicants, who had lower default rates, benefited: they were given extra points for being good risks.

Under this model, female applicants were more likely to be accepted for credit, and their rejection rates were lower than men’s, showing that equality law can in fact disadvantage women in algorithmic credit decisions.
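To make the mechanism concrete, here is a minimal sketch, not the study’s actual models: synthetic applicants are scored by a logistic regression fitted with and without a gender flag, then accepted when their predicted default probability falls below a fixed cut-off. All feature names and numbers are hypothetical.

```python
# Minimal sketch of a scorecard with and without gender; all data synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000

female = rng.integers(0, 2, n)     # 1 = female (hypothetical flag)
income = rng.normal(30, 8, n)      # income in £k (stand-in predictor)

# Assumption mirroring the article: at the same income, women default less.
p_default = 1 / (1 + np.exp(0.1 * (income - 25) + 0.5 * female))
default = rng.random(n) < p_default

X_with = np.column_stack([income, female])
X_without = income.reshape(-1, 1)

m_with = LogisticRegression().fit(X_with, default)
m_without = LogisticRegression().fit(X_without, default)

# Accept applicants whose predicted default probability is below a cut-off.
cutoff = 0.35
acc_with = m_with.predict_proba(X_with)[:, 1] < cutoff
acc_without = m_without.predict_proba(X_without)[:, 1] < cutoff

for label, acc in [("gender included", acc_with), ("gender excluded", acc_without)]:
    print(f"{label}: women {acc[female == 1].mean():.1%}, "
          f"men {acc[female == 0].mean():.1%} accepted")
```

With the flag included, the fitted model gives women the lower default probability their group actually has, so more of them clear the cut-off; with it excluded, both groups are scored identically on income alone and women lose that advantage.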

Dr Galina Andreeva, senior lecturer in management science at the University of Edinburgh Business School, says the aim was to investigate the consequences of legal restrictions in situations of automated decision making.

“Our findings highlight the inconsistencies of the existing regulations as they don’t ensure equality of outcome for consumers. From the data that we analysed we were able to prove that women can benefit from gender being included in credit scoring models as this improves chances for them to be granted credit.

“Nevertheless, we would caution against extrapolating our findings to all protected groups and all credit products. It is possible that in other credit portfolios protected characteristics show different patterns.

“Our research is an illustration to show that equal treatment does not automatically translate into equal outcome. It also demonstrates that this happens because of correlation between gender and other predictors that remain in the models.

“We hope that our study will inspire the development of better and more effective regulations for automated decisions and solutions for both consumers and lenders to make sure that everyone has equal and fair opportunities when applying for credit, no matter their gender or background.”
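Dr Andreeva’s point about correlated predictors can be sketched in the same hypothetical setting: even with gender removed from the model, a predictor that correlates with gender, here an assumed loan size, keeps outcomes unequal between the groups. Again, all numbers are made up for illustration.

```python
# Sketch of "equal treatment is not equal outcome"; data synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 50_000

female = rng.integers(0, 2, n)
# Assumption: loan size (in £k) correlates with gender.
loan = rng.normal(12 - 2 * female, 3, n)

# True default risk depends on loan size only; the process is gender-blind.
p_default = 1 / (1 + np.exp(-(loan - 12) / 3))
default = rng.random(n) < p_default

model = LogisticRegression().fit(loan.reshape(-1, 1), default)  # no gender
accepted = model.predict_proba(loan.reshape(-1, 1))[:, 1] < 0.5

print(f"women accepted: {accepted[female == 1].mean():.1%}")
print(f"men accepted:   {accepted[female == 0].mean():.1%}")
```

Although the model never sees gender, the acceptance rates differ because loan size carries gender information; the direction and size of the gap depend entirely on how the correlated predictor behaves in a given portfolio.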

Following the Twitter firestorm ignited by tech entrepreneur David Heinemeier Hansson and fuelled by Apple co-founder Steve Wozniak, Goldman Sachs is now embroiled in a regulatory probe and facing scrutiny under the Equal Credit Opportunity Act.

As reported by Bloomberg, computerised decision making could lead to unfair lending, a point echoed by a number of industry leaders, notably Linda Lacewell, superintendent of the New York Department of Financial Services, who refers to algorithms as a ‘black box’ and who launched the investigation into the Apple Card.

FTC commissioner Rohit Chopra also said in October: “Algorithms are not only nonpublic, they are actually treated as proprietary trade secrets by many companies.

“To make matters worse, machine learning means that algorithms can evolve in real time with no paper trail on the data, inputs, or equations used to develop a prediction.

“Victims of discriminatory algorithms seldom if ever know they have been victimized.”
