
Is there a mother of economics? What AI reveals about gender bias

University students in a library. AI tools, depending on how they are designed, can perpetuate, reinforce, or even widen existing gender gaps.

Photo credit: Pool

What you need to know:

  • A student’s search for the “mother of economics” reveals how AI systems mirror and magnify gender inequality.
  • AI bias is not accidental; it is baked into data and design, and experts say fixing it requires intentional diversity.


Brian Omwanza, a second-year undergraduate, was struck by his history of economics lecturer’s constant reference to Adam Smith as the “father of economics”.

Since his aunt works in the gender and development space and often talks about gender inequalities, he asked her what she thought about the lecturer’s framing.

“My aunt told me to search online if there is a mother of economics. So I turned to my AI (artificial intelligence) assignment tool and discovered that indeed Adam Smith is widely known as the father of modern economics, but Joan Robinson is sometimes referred to as the mother of economics, though nowhere near as prominently as Adam,” he says.

“When I told my aunt what I found, she said: ‘You see now, that is gender bias. Unfortunately, the AI tool is reinforcing it. But now you know, there was also a woman who was the mother of modern economics, only that society chose not to give her the same prominence. And AI is simply reflecting that bias.’”

Brian’s experience highlights a deeper problem: AI tools, depending on how they are designed, can perpetuate, reinforce, or even widen existing gender gaps. AI is about data, but most of the data is biased, explains Dr Almaz Yohannis Mbathi, a lecturer at the Department of Computing and Informatics at the University of Nairobi.

But these technologies, which allow computers to perform complex tasks faster than humans, are also quickly transferring the masculine biases embedded in their datasets back into human society.

Genevieve Smith, associate director at the Center for Equity, Gender and Leadership at UC Berkeley Haas School of Business, and her colleague Ishita Rustagi, a business strategies and operations analyst at the same centre, put it plainly in a 2021 article: “AI systems are biased because they are human creations. Who makes decisions informing AI systems and who is on the team developing AI systems shapes their development.”

Yet women remain a minority in this field. According to the World Economic Forum, the tech workforce is still male-dominated, with women holding only a quarter of all tech jobs and a small fraction of leadership roles.

The gender bias is also evident in the outcomes of AI systems. The Berkeley Haas Center tracked 133 publicly available cases of AI bias across industries between 1988 and 2021. Their analysis found that 44.2 per cent demonstrated gender bias, while 25.7 per cent exhibited both gender and racial bias.

Genevieve and Ishita note in their article that humans generate, collect and label the data fed into AI systems. They also decide the datasets, variables and rules that algorithms learn from. As a result, any bias at these stages becomes embedded in AI.

Scientists say the solution lies in feeding AI systems with gender-sensitive datasets curated by professionals who are alert to gender bias. “The only way we can overcome that is through awareness, mentorship for women, and creating content for Africa specifically,” Dr Mbathi suggests.

“For instance, if you ask AI to show you a woman walking down the street, it shows you a white woman, not a black woman. That’s because of the data it has. Unless we are intentional, AI will keep ignoring women, minorities, and the Global South.”

Her concerns echo those of Zinnya del Villar, Director of Data, Technology, and Innovation at Data-Pop Alliance, in an interview with UN Women. “To reduce gender bias in AI, it’s crucial that the data used to train AI systems is diverse and represents all genders, races, and communities,” stresses Zinnya.

“This means actively selecting data that reflects different social backgrounds, cultures and roles, while removing historical biases, such as those that associate specific jobs or traits with one gender.”

Zinnya says the solution also depends on the diversity of teams developing the systems. “AI systems should be created by development teams that include people of different genders, races, ages, abilities, and cultural backgrounds. This brings a wide range of perspectives into the process and helps reduce blind spots that can lead to biased AI systems,” she says.