“Web3 is the ‘perfect platform’ to be compensated for one’s own data,” explains Rumi Morales, a founding director at Navigate, a Web3 platform developing an AI-powered map from crowdsourced data.
The AI race between tech giants continues to be one of the hottest stories. Just this week, Amazon announced it will be rolling out generative AI in Alexa, its consumer virtual assistant. This ramp-up in AI interest is exciting, but it does not negate the existing problems with gendered bias in AI.
One of the ways to address gendered bias in AI is to employ people of diverse genders within the industry, yet according to the World Economic Forum, only 30% of those working in AI are women, a share that has risen by just 4% since 2016 and is growing more slowly than in other tech sectors.
Finextra spoke with Morales about her involvement in the AI space and some of the ways the industry can improve the gender bias which plagues AI models.
Morales posits: “As a senior woman in AI leadership, I've seen firsthand the challenges that women face in this field. Gender bias can creep into AI systems at every stage of development, from the data that's used to train the models to the people who are involved in building and deploying them.”
She offered four key ways to guard against bias before an AI model is fully formed:
- Make sure that AI teams are diverse and inclusive of all people
- Be transparent about how our AI systems work and how we're mitigating bias
- Be accountable for the outcomes of our systems and take steps to address any unintended consequences
- Educate the public, AI practitioners and policymakers about AI bias
She argued: “Above all, it's critically important to actively involve women in every stage of the AI development process - from ideation to design to deployment. We need to build a culture that values safety and respect in the AI community.”
Morales continued: “Historically - for me at least - the AI community has felt a bit closed off and intimidating for those eager to learn. Maybe that was more understandable during the earlier stages of AI development when it was less accessible. But now, as AI tools become more widely available, it's crucial that we create an environment that welcomes diverse input.”
AI models that have already been trained are not necessarily beyond repair. Morales offered some points on how they can be improved: “Monitoring systems in real-world settings is crucial to detect and address emerging biases and issues. Additionally, regularly updating AI models with fresh data and knowledge is vital for eliminating bias and enhancing fairness. This can involve retraining the model using improved datasets or implementing techniques like re-weighting or re-sampling to balance gender representation and mitigate bias. Most importantly, we must actively encourage and involve more women in AI research, welcoming their perspectives in this ever-evolving landscape.”
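The re-sampling technique Morales mentions can be illustrated in a few lines. Below is a minimal, hypothetical sketch that oversamples under-represented groups in a training set so each group appears equally often; the dataset, the `gender` field, and the `balance_by_group` helper are illustrative assumptions, not anything from Navigate or a specific library.

```python
import random

def balance_by_group(rows, key="gender", seed=0):
    """Naive re-sampling: oversample under-represented groups so each
    group appears as often as the largest one in the training data."""
    rng = random.Random(seed)
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Pad smaller groups with randomly duplicated rows.
        balanced.extend(rng.choice(members) for _ in range(target - len(members)))
    return balanced

# Made-up, deliberately imbalanced training rows: 2 "f" vs 6 "m".
data = [{"gender": "f", "label": 1}] * 2 + [{"gender": "m", "label": 0}] * 6
balanced = balance_by_group(data)
```

After balancing, both groups contribute six rows each, so a model retrained on `balanced` no longer sees one gender six times less often. Re-weighting achieves the same effect without duplication, by giving each under-represented row a larger loss weight during training.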
However, Morales also argued that Web3 may offer a whole new space in which these biased-data problems can be solved. She argues that in Web3, data isn't controlled by any single entity, which means:
- Users have more control over their data and how it's used
- Users can see exactly how their data is being used and who's using it, holding companies accountable
- Users own their data and decide who can use it and how
- Companies need permission to use data, so they're willing to pay for it
Morales demonstrated this in her own work: “Navigate is building an AI-powered map platform where users earn rewards every time they contribute image data to the platform. Those rewards allow users to redeem gift cards from the Navigate Marketplace, choosing from hundreds of global brands, such as Airbnb, EA, Under Armour, and Uber Eats. When it comes to platforms like Waze or other mapping providers, oftentimes their success is due to the efforts of individuals who contribute to them. But those people typically see very little rewards in return and their data becomes owned by Waze (which is owned by Google). With Navigate, users continue to own their own data but moreover receive rewards for helping improve it.”