Standard Chartered is teaming up with US outfit Truera to help tackle unjust bias in AI-assisted decision making.
The bank describes itself as an "active proponent" of the use of AI and data analytics, but notes that machine learning models are built using complex automated algorithms, meaning they can act like a "black box".
In order to ensure that data is used ethically, Standard Chartered has enlisted Truera and its model intelligence platform to help identify and eliminate unjust biases in the decision-making process.
Truera worked with the bank's retail analytics, risk, digital and technology teams on a pilot focused on a challenger credit decisioning algorithm, which uses a combination of traditional data and, with clients' consent, alternative data.
The technology can pinpoint the specific variables that influence risk scoring and look for correlations between seemingly impartial variables that can act as proxies for demographic indicators such as race or gender; such correlations could introduce unjust bias and result in unfair decisions.
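To illustrate the proxy-detection idea in its simplest form, the sketch below flags features whose correlation with a protected attribute exceeds a threshold. The data, feature names, and threshold are all hypothetical, and this is a bare-bones illustration of the concept, not Truera's actual methodology.

```python
# Minimal sketch of proxy-variable screening: flag model features whose
# correlation with a protected demographic attribute exceeds a threshold.
# All data and the 0.5 cutoff are hypothetical, for illustration only.
import statistics


def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


def flag_proxies(features, protected, threshold=0.5):
    """Return names of features strongly correlated with the protected attribute."""
    return [name for name, values in features.items()
            if abs(pearson(values, protected)) >= threshold]


# Toy example: "postcode_score" closely tracks the protected attribute
# (a potential proxy), while "income" does not.
protected = [0, 0, 1, 1, 0, 1, 0, 1]
features = {
    "postcode_score": [0.1, 0.2, 0.9, 0.8, 0.15, 0.95, 0.05, 0.85],
    "income": [30, 55, 42, 38, 61, 45, 52, 40],
}
print(flag_proxies(features, protected))  # → ['postcode_score']
```

A production system would go well beyond pairwise correlation (for example, testing whether combinations of features jointly predict the protected attribute), but the principle is the same: a variable that looks neutral can still encode demographic information.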
Following the pilot, Standard Chartered says it plans to work with Truera to further develop the software and explore its application in other areas.
Vishu Ramachandran, group head, retail banking, Standard Chartered, says: "Ensuring transparency and explainability in AI-based decision making is not just a competitive advantage for us, but also the right thing to do by our clients.
"Our partnership with Truera will help us better explain and justify our models, support us in building a stronger and more sustainable business as well as give confidence to both customers and regulators in the fairness of our data-driven processes and outcomes."