These AI chips mark a transformation for Qualcomm, which has previously focused on semiconductors for wireless connectivity and mobile devices rather than large data centers.
Qualcomm said that both chips, the AI200 (expected to launch in 2026) and the AI250 (expected in 2027), can be integrated into liquid-cooled server systems.
Qualcomm is reportedly competing with Nvidia and AMD, two companies that make graphics processing units (GPUs). Qualcomm's data center chips build on the AI components in its smartphone chips, known as the Hexagon neural processing unit, or NPU.
Qualcomm's move into the data center world puts it in a new competition in the fastest-growing technology market: equipment for new AI-focused server clusters. An estimated $6.7 trillion in capital will be spent on data centers by 2030, mostly on systems built around AI chips.
Nvidia currently dominates the industry, with its GPUs holding more than 90% of the market to date, and sales that have pushed the company's market capitalization past $4.5 trillion. Nvidia's chips were used to train OpenAI's GPT family, the large language models that power ChatGPT.
But companies like OpenAI have been looking for alternatives: in early October 2025, OpenAI announced plans to buy chips from AMD, the second-largest GPU maker, and may take a stake in the company. Others, such as Google, Amazon, and Microsoft, are also developing their own AI accelerators for their cloud services.
Qualcomm said its chips focus on inference, that is, running AI models, rather than training, the process by which labs like OpenAI create new AI capabilities by churning through terabytes of data.
Other AI chip companies such as Nvidia and AMD could even become customers for some of Qualcomm's data center products.
In May 2025, Qualcomm announced a partnership with Saudi Arabia's Humain to supply AI inference chips to data centers in the region.
Qualcomm says its AI chips have an advantage in power consumption and cost, among other factors. According to the company, its AI chips support 768 gigabytes of memory, more than comparable Nvidia and AMD products.