Wednesday, July 23, 2025

Nvidia picks Micron for massive SOCAMM rollout

Nvidia is preparing to open a new phase in the memory market, with plans to deploy between 600,000 and 800,000 SOCAMM modules in 2025. The initiative positions SOCAMM as a potential successor to high-bandwidth memory (HBM). Although the initial deployment volumes are modest compared with HBM, industry analysts suggest the move could trigger a wider transformation in the memory and substrate sectors.

According to reports, Nvidia has confirmed its intent to integrate SOCAMM into its next-generation AI products, sharing projected order quantities with key memory and substrate suppliers. The company’s upcoming GB300 “Blackwell” platform will be among the first to adopt SOCAMM, alongside the Digits AI PC unveiled at Nvidia’s GTC 2025 conference in March.

Micron secures first-mover advantage

Nvidia initially tapped Samsung Electronics, SK Hynix, and Micron to co-develop SOCAMM. However, Micron has emerged as the first memory maker to receive approval for volume production, outpacing its South Korean rivals in the race to support Nvidia’s latest architectures.

Designed for low-power, high-bandwidth AI computing, SOCAMM leverages LPDDR DRAM and offers a significant upgrade over conventional notebook DRAM modules such as LPCAMM. The new module boosts input/output speeds and data transfer rates while maintaining a compact, upgrade-friendly form factor.

Micron claims its SOCAMM delivers 2.5 times the bandwidth of the RDIMM modules traditionally used in servers, at one-third of their size and power consumption.

From servers to PCs

While Nvidia’s initial SOCAMM deployment will focus on AI servers and workstations, the inclusion of the module in the Digits AI PC signals broader ambitions for the consumer market. Industry players believe this crossover potential will be key to scaling adoption.

Though the projected 600,000–800,000 units are dwarfed by Nvidia’s planned procurement of 9 million HBM units in 2025, analysts say SOCAMM’s introduction marks an inflection point. The module’s appeal lies in bridging the gap between cost-effective, scalable memory and the performance demands of AI workloads.
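
For a rough sense of that gap, the short sketch below (illustrative Python using only the unit figures quoted above; the percentage share is back-of-the-envelope arithmetic, not a figure reported by Nvidia) compares the projected SOCAMM volume with the planned HBM procurement:

```python
# Back-of-the-envelope comparison of projected 2025 unit volumes.
# Figures are those quoted in the article; the share calculation is illustrative.
socamm_range = (600_000, 800_000)   # projected SOCAMM module deployment
hbm_units = 9_000_000               # Nvidia's planned HBM unit procurement

for socamm in socamm_range:
    share = socamm / hbm_units * 100
    print(f"{socamm:,} SOCAMM modules ≈ {share:.1f}% of planned HBM units")
# Roughly 6.7%–8.9% of HBM volume — a starting point rather than a displacement of HBM.
```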

Substrate makers eye growth opportunity

SOCAMM’s rise is also reshaping dynamics in the substrate sector. The module requires custom-designed PCBs, creating an entirely new category of demand. With Micron already in mass production and both Samsung and SK Hynix actively negotiating supply partnerships, the stage is set for intensifying competition among the world’s top DRAM vendors.

Substrate suppliers, too, are preparing for an inflection in demand. Industry insiders note that while early volumes are limited, a wave of large-scale orders could follow if Nvidia’s SOCAMM strategy gains traction, potentially triggering a fierce scramble among PCB vendors to capture the new business.
