Dr Rita Udor (middle), Gender Inclusivity Officer at the Kwame Nkrumah University of Science and Technology, making a point during the panel discussion. With her is Selaseh Pashur Akaho (right), a Statistician at the Ghana Statistical Service, and Animwaa Anim Addo (left), the moderator

Activists caution against gender bias in use of AI

Panellists at a workshop on artificial intelligence (AI) have called on the public to build their capacities to avoid being threatened by AI.


They explained that AI exists to fine-tune work, not to put people out of jobs.

They, however, confirmed the existence of gender bias in the use of AI due to how data was collected and how AI models were designed.

The panellists were a Gender Inclusivity Officer at the Kwame Nkrumah University of Science and Technology (KNUST), Dr Rita Udor; a Statistician at the Ghana Statistical Service, Selaseh Pashur Akaho; and a Machine Learning (ML) Researcher at Saarland University, Germany, Deborah Dormah Kanubala.

Workshop

They were speaking during a workshop organised by the Ghana Centre for Democratic Development (CDD-Ghana) to disseminate and discuss findings of a study on AI in Accra on Wednesday.

The study, conducted by AidData and CDD-Ghana, was aimed at evaluating gender bias in AI applications used to measure wealth, drawing on data from the Ghana Demographic and Health Survey (DHS).

The study also utilised geospatial data alongside the DHS data as its foundation to examine the nuances of gender bias in wealth estimates generated by AI.

AidData was awarded funding for the project through USAID's Equitable AI Challenge, designed to fund approaches that will increase the accountability and transparency of AI systems used in global development contexts.

Findings from the study revealed that some AI application models that use household survey data to generate wealth estimates unintentionally produce results that are biased against women.

Dr Udor acknowledged the usefulness of AI for the country's development but stressed that it must be used cautiously.

She said although there was currently no law regulating AI, some guidance was needed to prevent AI from amplifying gender inequality.

 “We must work to make sure that whatever innovation coming up doesn’t amplify inequality,” she said.

Ms Kanubala commended the research team for considering local content in the research and said it was a step in the right direction.

Mr Akaho, on the other hand, emphasised the need to train people in how to work with AI models so they could interpret results and make the needed changes.

The Director of Research of CDD-Ghana, Dr Edem Selormey, said recent research has shed light on instances where AI systems inadvertently perpetuate ethnic, disability and gender biases, amplifying societal inequalities.

She said the workshop was significant as it provided a platform to explore the relationship between AI and gender bias, as well as some of the tools needed to foster equitable and unbiased AI technologies.

“From the study, while AI models demonstrate their strength and usefulness, any level of bias, no matter how small, can cast shadows on accuracy.”

“For instance, we notice that the digital realm can mirror and amplify the inequalities ingrained in our societies.

Gender bias is insidiously woven into the algorithms that underpin AI systems,” she said.

She expressed the hope that AI would be harnessed for the betterment of all, and become a driving force for empowerment rather than one that perpetuates disparities.


A Research Scientist at AidData, Rachel Sayers, said that where there were enough resources to measure outcomes, AI had helped to predict outcomes in other areas more frequently.

She called for such approaches to be implemented to evaluate the results of policies and programmes in the country. 
