
As China’s AI bots show gender bias, developers point to flawed real-life model

  • Forum told sexist AI results may stem from unbalanced data or the gender of a character in a digital file, or may be shaped by the developers

When asked to “generate a picture of a nurse taking care of the elderly”, China’s Ernie Bot produced this picture of a woman. Photo: Ernie Bot
Phoebe Zhang in Shenzhen
According to Ernie Bot, China’s first answer to OpenAI’s ChatGPT, women and men play different roles in society.
When a Post reporter typed in the command “generate a picture of a nurse taking care of the elderly”, the chatbot, developed by internet search giant Baidu, showed a woman with a ponytail and a stethoscope around her neck.
But when asked to depict a professor teaching mathematics or a boss reprimanding employees, these characters turned out to be men. Similar gender biases can be found in other AI models on China’s internet, such as search engines or medical assistants.
The tech and AI industry, including developers, is calling for more balanced data and better AI design, as well as for equal rights in the real world. Photo: Ernie Bot

This problem has long been noted by Chinese researchers and industry professionals.

Last month, at a forum in Ningbo, Zhejiang province, academics, industry observers and AI developers discussed gender bias in AI models and AI-generated content (AIGC), calling for more balanced data and better design, as well as for more equal rights in the real world.

In recent years, China has been pushing ahead with AI development as it seeks to leverage ChatGPT-like technology to drive economic growth and compete with the United States.


Gao Yang, an associate professor at the Beijing Institute of Technology, told the forum that gender bias in AIGC came from training AI models using data with existing biases, such as fixed gender roles or discriminatory descriptions.

“When dealing with specific tasks, the AI model would attribute certain characteristics or behaviour to a certain gender,” she said, according to an article published by the China Computer Federation last week.
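The mechanism Gao describes can be seen in a deliberately simple sketch: a model that learns nothing but co-occurrence statistics from a skewed corpus will reproduce that skew when it generates. The data and behaviour below are invented for illustration only and do not reflect how Ernie Bot or any production system is actually built.

```python
# Toy illustration of bias inherited from training data (hypothetical corpus,
# not any real model): professions paired more often with one gendered pronoun.
from collections import Counter, defaultdict

# Deliberately skewed "training data": (profession, pronoun) pairs.
corpus = [
    ("nurse", "she"), ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
    ("professor", "he"), ("professor", "he"), ("professor", "she"),
    ("boss", "he"), ("boss", "he"), ("boss", "he"), ("boss", "she"),
]

# "Training": count how often each pronoun appears with each profession.
counts = defaultdict(Counter)
for profession, pronoun in corpus:
    counts[profession][pronoun] += 1

# "Generation": pick the most frequent pronoun seen in training, so the
# imbalance in the data hardens into a fixed gender role in the output.
for profession, pronoun_counts in counts.items():
    predicted = pronoun_counts.most_common(1)[0][0]
    print(f"{profession}: {predicted}  (training counts: {dict(pronoun_counts)})")
```

Run as written, the sketch always outputs “she” for nurse and “he” for professor and boss, even though both genders appear in the data, mirroring the fixed gender roles Gao says biased training data can produce.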
