Major Chinese tech companies last week announced extensive price cuts for their large language model (LLM) products used in generative artificial intelligence, a move that experts say is likely to speed up the adoption of AI models in the domestic market and in research. Price wars are not uncommon in the world of LLMs, but this one comes as Chinese developers have made steady progress in the commercialisation of AI technology in recent months. By Amber Wang.
Lin Yonghua, deputy director and chief engineer of the Beijing Academy of Artificial Intelligence (BAAI), a leading non-profit in AI research and development, said a “bottleneck” in training data is looming large.
The article then explains:
- Need for localised models
- Filling the data gap
- The GPU bottleneck
Industry discussions also focus on the strong demand for computing power: a shortage of compute, industry figures say, holds back the implementation of LLMs and delays product deployment.