According to industry analyst Patrick Moorhead, China has achieved a remarkable feat by developing a single generative AI (GAI) model trained across multiple data centers. This accomplishment is particularly impressive given the complexity of using different GPUs within a single data center, let alone coordinating servers across multiple geographic locations. Moorhead, Chief Analyst at Moor Insights & Strategy, mentioned the development in passing while referencing an NDA meeting on an unrelated topic.
The ability to train GAI models across disparate locations and architectures is crucial for China's ambitious AI agenda, especially as American sanctions have cut off its access to the latest, most powerful chips needed to drive research and development. In response, Nvidia developed the less capable H20 AI chip to comply with Washington's performance limits. However, rumors suggest that even these downgraded chips may soon be banned, underscoring the uncertainty Chinese tech companies face in the current geopolitical climate.
Faced with this challenge, Chinese researchers have been working on integrating GPUs from different brands into a single training cluster. This approach lets them combine their limited supplies of sanctioned high-performance chips, such as the Nvidia A100, with more readily available but less powerful GPUs like Huawei's Ascend 910B or Nvidia's H20. While mixing hardware in this way has historically caused significant efficiency losses, the reported single GAI model spanning multiple data centers suggests China has found ways to mitigate the problem.
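The article does not describe how these mixed clusters are scheduled, but one well-known source of the efficiency loss is easy to illustrate: if every device in a heterogeneous cluster receives an equal share of each training batch, the fast chips sit idle waiting for the slow ones. A common mitigation is to shard work in proportion to each device's measured throughput. The sketch below is purely illustrative — the device names and throughput figures are made-up placeholders, not measured numbers for any real chip, and this is not the method China is reported to use.

```python
# Illustrative sketch: proportional batch sharding across heterogeneous
# accelerators. Throughput values are hypothetical placeholders.

def shard_batch(global_batch: int, throughputs: dict[str, float]) -> dict[str, int]:
    """Split a global batch across devices in proportion to throughput.

    A naive equal split leaves fast devices idle while slow ones finish
    each step; proportional sharding narrows that gap.
    """
    total = sum(throughputs.values())
    # Provisional proportional shares, rounded down.
    shards = {d: int(global_batch * t / total) for d, t in throughputs.items()}
    # Hand out any leftover samples to the fastest devices first.
    leftover = global_batch - sum(shards.values())
    for d in sorted(throughputs, key=throughputs.get, reverse=True):
        if leftover == 0:
            break
        shards[d] += 1
        leftover -= 1
    return shards

# Hypothetical cluster mixing fast and slow chips (relative throughput units).
cluster = {"A100-0": 4.0, "A100-1": 4.0, "Ascend910B-0": 2.5, "H20-0": 1.5}
print(shard_batch(1024, cluster))
```

In practice, real systems must also cope with differing memory capacities, interconnect speeds, and software stacks across vendors, which is why heterogeneous training has historically been so lossy; this sketch only addresses the load-balancing dimension.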
Although details about this GAI system remain scarce, it demonstrates the lengths to which Chinese researchers are willing to go to keep the country's AI ambitions moving forward. As Huawei has stated, China will find ways to advance its AI development despite the American sanctions. Necessity, as the saying goes, is the mother of invention.