Artificial intelligence (AI) will produce more accurate, reliable and transparent credit decisions than human-based systems within five years, according to capital markets professionals surveyed by Intertrust.
Intertrust, a global leader in providing expert administrative services to clients operating and investing in the international business environment, surveyed over 500 capital markets executives to identify the impact that disruptive technology is having on jobs and skills. Of these, one in seven (14%) believe that AI has already surpassed human-based systems.
In recent years, the data sources used in credit decision-making have become increasingly broad and non-traditional, now including social media activity, retail spending habits and even political inclinations.
The research revealed a division in the industry over the impact of using such data on the quality of decision-making. While three in ten (30%) respondents believe that using a broader range of data reduces subjectivity, nearly a fifth (18%) think AI exacerbates existing prejudices in the credit decision-making process.
Intertrust’s study also highlighted privacy concerns regarding expanded data sets. Although almost a third (31%) of respondents think that the use of non-traditional data and personalised algorithms leads to better credit decisions than relying on detached data alone, 36% believe tighter legislation is required to protect borrowers’ rights when they apply for funding and to restrict the information included in the assessment. A fifth (20%) suggested that the use of non-traditional data has already overstepped the ethical line and needs to be better controlled.
Cliff Pearce, Global Head of Capital Markets at Intertrust, said: “The use of AI in credit decision-making has become increasingly commonplace, with the potential to make quicker, more accurate credit decisions based on an expanded set of available data.
“A challenge in this area is that AI systems are only as good as the information programmed into them. For example, while a prospect may look like a poor risk at first sight, there may be extenuating circumstances overlooked by the system that a human would have noted. Put simply, AI underlines the contrast between the prime and more specialised non-conforming lending markets.”