As China’s AI bots show gender bias, developers point to flawed real-life model
- Forum told sexist AI results may stem from unbalanced data or the gender assigned to a character in a digital file, or may be shaped by the developers themselves
This problem has long been noted by Chinese researchers and industry professionals.
Last month, at a forum in Ningbo, Zhejiang province, academics, industry observers and AI developers discussed gender bias in AI models and AI-generated content (AIGC), calling for more balanced data and better design, as well as for more equal rights in the real world.
In recent years, China has been accelerating its AI development as the country seeks to leverage ChatGPT-like technology to drive economic growth and compete with the United States.
Gao Yang, an associate professor at the Beijing Institute of Technology, told the forum that gender bias in AIGC stemmed from training AI models on data containing existing biases, such as fixed gender roles or discriminatory descriptions.
“When dealing with specific tasks, the AI model would attribute certain characteristics or behaviour to a certain gender,” she said, according to an article published by the China Computer Federation last week.