
Instead of relying solely on Microsoft, as it did at the outset, OpenAI will now use multiple cloud service providers, including Google Cloud, CoreWeave, and Oracle. Analysts view this diversification as a strategic move to relieve the "bottleneck" in GPU resources and to reduce the risk of depending on a single provider.
Google has long trailed Amazon and Microsoft in cloud computing. Now, however, it is reported to have established itself as an infrastructure partner operating ChatGPT and OpenAI's APIs in several regions, including the US, Japan, and Europe.
Notably, although the two companies are rivals in the AI race, each developing its own models (Google's Gemini and OpenAI's GPT), they are still willing to cooperate on infrastructure. This clearly reflects the trend of "coopetition".
The expansion comes as CEO Sam Altman has publicly admitted that it is difficult to source enough GPUs to keep large-scale AI models running. "If anyone has 100,000 GPUs, please call us," he wrote on the social network X in April.
Previously, OpenAI also signed a five-year contract worth nearly $12 billion with CoreWeave and announced a three-way partnership with Microsoft and Oracle to share cloud computing resources.
Although Microsoft remains a long-term strategic partner with exclusive rights tied to OpenAI's APIs, the broadening of OpenAI's infrastructure network shows that the company is entering a phase of global growth, one that demands flexibility and the ability to optimize resources to the fullest.