
BBVA builds gender-neutral global chatbot

BBVA has kicked against the trend of assigning female voices to artificial intelligence assistants with the launch of Blue, a gender-neutral chatbot trained to answer customers' everyday banking queries.


Editorial

This content has been selected, created and edited by the Finextra editorial team based upon its relevance and interest to our community.

A recent Unesco report analyses the role of education in helping to remedy gender bias in technology. The United Nations entity maintains that virtual assistants’ feminine nature and the subservience they express is a clear example of how technology contributes to the perpetuation of these biases.

The study also notes that the trend of assigning virtual assistants a feminine gender occurs in a context of gender imbalance in technology companies, where men account for between 60 and 75 percent of the sector’s total workforce.

BBVA's Blue, an attempt to unite the multiple virtual assistants employed by the bank across its global network, is deliberately built to liberate the AI from unjustified social stereotypes.

Blue never tries to pass itself off as a human, says the bank. Neither does it totally identify itself as a robot.

"The reason we chose not to assign Blue a gender is when we were defining its personality, we were clear that it was non-human," explains BBVA design manager, Julián García Ruiz. "Being able to reflect Blue’s non-human nature while creating a balance between inclusive language and the need for clarity and space limitations helped us hone the uniqueness of our assistant."

The task of creating Blue fell to 12 designers, six product owners, three program managers, and 20 developers across BBVA's global network, and took almost two years to come to fruition.

Blue is currently operational in Spain and Mexico, and will be gradually rolled out across the bank's global footprint and online channels.



Comments: (7)

David Abashidze, Adviser/Investor in FinTech

Missed opportunity.

People respond better to a female voice as their assistant. I have seen it many times in customer testing research.

Generally, females are perceived as more trustworthy. This perception is backed by data: over 80% of criminals and harmful-activity addicts globally are male.

Arguably, being perceived as more trustworthy could be one of the reasons why female voices perform better in product testing.

So this was not the correct product design decision.

A Finextra member 

I disagree - the technology is still stuck way too deep in the uncanny valley to be humanized. It is true that people tend to treat a humanized bot better (more polite to Alexa vs. Google Assistant, for instance), but the negative emotional impact of a bot's failure to deliver is also that much greater. Not to plug my article on the subject, but linking it is a lot easier than restating the arguments here. https://www.linkedin.com/pulse/advice-humanizing-your-virtual-assistant-dont-yet-jay-tkachuk/

David Abashidze, Adviser/Investor in FinTech

OK, I hear your arguments and have read your article.

So humanised bots get more polarised reactions: if they work well, the reactions are good, but if they screw up, the negative response is also more intense. Makes sense.

But the logic here is that the technology is not advanced enough for bots to work well, and if they don't work well they should be dehumanized to soften customers' anger.

Well, when I constructed my argument I assumed that the BBVA bot works well. And if it does not work well, it should not be used in financial services at all. Money management is a very sensitive topic for people, and nothing annoys them more than non-functioning help from their bank. So putting a poorly working bot in a bank is a mistake, especially in the post-MVP phase, when the product is live with users who are not beta testers.

Now, assuming bots work well, years in retail banking have demonstrated that clients respond to female bank operators far better, and trust them more (for example, they do not recount money handed over by a female cash operator, while male operators have queues as customers recount money all the time).

So, given that the bot performs well, the better product design would be to make it female; and if it does not work well, it should go back to the developer and data science teams.

A Finextra member 

"Works well" is the rub. My point is that we should not expect the level of service and experience we have gotten accustomed through science fiction from the current NLP/NLU platforms - they simply can't deliver it. VA's work well for simple tasks, informational and transactional, and will drive a better, responsive to voice digital user experience in just a few short years. But until major breakthroughs are made towards general AI and actual language comprehension vs. mimicking it, HAL remains out of reach, and thus shouldn't be humanized. 

David Abashidze, Adviser/Investor in FinTech

Clear, hear your point. Makes sense.

As specifically concerns financial services: when people call their bank they do expect HAL, and if they don't get it, they freak out. Interacting with the bank where your money is stored is not the same as interacting with an internet provider when the line has problems.
So either bring HAL, or pay for well-trained people if you don't have it.

A Finextra member 

Well, I wouldn't go that far - for instance, a large segment of the VA user base uses it for navigational shortcuts. Also, such tools are great for short conversations of one to three intent-response exchanges. After that, the limitations start showing themselves. So, just like Alexa/Siri/Google Assistant, such platforms are good for a certain number of use cases, deflecting calls and answering questions better and faster than, say, a standalone knowledge base. So there are good uses for them, but HAL they are not.

David Abashidze, Adviser/Investor in FinTech

OK, I acknowledge the argument here too.
If used properly with a limited set of simple tasks, I agree they can create value. And I hear the argument that a dehumanized bot is better for these simple versions.
