Qualcomm has announced plans to launch new artificial intelligence accelerator chips, bringing fresh competition to Nvidia, which currently leads the AI semiconductor market.
The announcement marks a major shift for Qualcomm, which has mainly focused on chips for mobile devices and wireless connectivity. The new products are designed for large-scale data centres.
Qualcomm said the AI200 chip will be available in 2026, with the AI250 to follow in 2027. Both chips can be deployed in a full, liquid-cooled server rack. These systems allow many chips to function together as one powerful computer, the level of performance AI labs require to run advanced models.
The company said the new data centre chips are based on the Hexagon neural processing units inside Qualcomm smartphone processors.
Durga Malladi, General Manager for data centre and edge at Qualcomm, said: “We first wanted to prove ourselves in other domains, and once we built our strength over there, it was pretty easy for us to go up a notch into the data center level.”
Qualcomm will now compete in the fastest-growing segment of the technology market. AI data centre infrastructure is expected to attract about 6.7 trillion dollars in capital investment by 2030, and most of this spending will support systems built around AI chips.
Nvidia currently holds more than 90 percent of the AI GPU market. Its chips have been used to train models such as OpenAI's GPTs, which power ChatGPT. Strong demand for Nvidia products has pushed the company's market value above 4.5 trillion dollars. Even so, customers such as OpenAI are exploring alternatives and recently announced plans to buy chips from AMD. Cloud providers including Google, Amazon and Microsoft are also working on their own AI accelerators.
Qualcomm said its new chips are designed for inference, meaning they run trained AI models rather than train them. The company believes its systems will cost less to operate. A single rack draws 160 kilowatts of power, similar to the power needs of Nvidia's rack-scale systems.
Malladi said Qualcomm will also offer its chips and parts separately, adding that other AI chip makers could become customers for Qualcomm's data centre components. Malladi said: “What we have tried to do is make sure that our customers are in a position to either take all of it or say, I am going to mix and match.”
Qualcomm did not reveal pricing details or how many NPUs will fit in each rack. The company recently partnered with a technology firm in Saudi Arabia to supply AI inferencing chips for new data centres in the region.
Qualcomm said its chips offer advantages in power consumption, cost of ownership and memory handling. Its AI cards support 768 gigabytes of memory, more than current products from Nvidia and AMD.
About us:
The Mainstream, formerly known as CIO News, is a premier platform dedicated to delivering the latest news, updates, and insights from the tech industry. With its strong foundation of intellectual property and thought leadership, the platform is well-positioned to stay ahead of the curve and lead conversations about how technology shapes our world. From its early days as CIO News to its rebranding as The Mainstream on November 28, 2024, it has been expanding its global reach, targeting key markets in the Middle East & Africa, ASEAN, the USA, and the UK. The Mainstream's vision is to put technology at the centre of every conversation, inspiring professionals and organizations to embrace the future of tech.