Exclusive | PolyU’s top AI scientist Yang Hongxia seeks to revolutionise LLM development in Hong Kong
- Yang Hongxia said she wants to change the resource-intensive process of training LLMs into a decentralised, machine-learning paradigm
Wency Chen in Shanghai
Mainland Chinese artificial intelligence (AI) scientist Yang Hongxia, the newly named professor at Hong Kong Polytechnic University (PolyU), is on a mission to revolutionise the development of large language models (LLMs) – the technology underpinning generative AI services like ChatGPT – from her base in the city.
In an interview with the South China Morning Post, Yang – who previously worked on AI models at ByteDance and Alibaba Group Holding’s Damo Academy – said she wants to change the existing resource-intensive process of training LLMs into a decentralised, machine-learning paradigm. Alibaba owns the South China Morning Post.
“The rapid advances in generative AI, spearheaded by OpenAI’s GPT series, have unlocked immense possibilities,” she said. “Yet this progress comes with significant disparities in resource allocation.”
Yang said LLM development has so far relied mostly on deploying advanced and expensive graphics processing units (GPUs), from the likes of Nvidia and Advanced Micro Devices, in data centres for projects involving vast amounts of raw data – a situation that has given deep-pocketed Big Tech companies and well-funded start-ups a major advantage.
Yang said she and her colleagues propose a “model-over-models” approach to LLM development. That calls for a decentralised paradigm in which developers train smaller models across thousands of specific domains, including code generation, advanced data analysis and specialised AI agents.
These smaller models would then evolve into a large and comprehensive LLM, also known as a foundation model. Yang pointed out that this approach could reduce the computational demands at each stage of LLM development.