🟦 NVIDIA Deploys Self-Designed SOCAMM in Large Numbers in AI Product Portfolio

Corporate Analysis

NVIDIA has begun rolling out "SOCAMM," a new memory module built to its own specification, and is reported to be procuring up to 800,000 units by the end of the year. The technology is drawing industry attention for combining AI-class performance with low power consumption.


NVIDIA has decided to procure up to 800,000 units of its own next-generation memory module, "SOCAMM (Small Outline Compression Attached Memory Module)," in 2025, and is reportedly coordinating the supply chain with memory and substrate manufacturers. The module will be installed mainly in NVIDIA's AI server products and its AI PC "DIGITS."

  • SOCAMM is a DRAM module built from stacked LPDDR5X; Micron handles initial mass production, and Samsung and SK hynix are also in supply discussions
  • It expands I/O (input/output) performance over the LPCAMM modules used in notebook PCs, and is designed for high bandwidth, a small footprint, and easy replacement
  • According to figures published by Micron, it delivers 2.5 times the bandwidth, one third the power consumption, and one third the footprint (14 × 90 mm) of DDR5 RDIMM
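The published ratios above can be combined to gauge efficiency: more bandwidth at lower power multiplies into bandwidth per watt. A minimal sketch, using only the ratios quoted in the article (the derived 7.5× figure is our arithmetic, not a Micron claim):

```python
# Micron's published SOCAMM-vs-DDR5-RDIMM ratios, as quoted in the article
bandwidth_ratio = 2.5   # SOCAMM bandwidth relative to DDR5 RDIMM
power_ratio = 1 / 3     # SOCAMM power draw relative to DDR5 RDIMM
size_ratio = 1 / 3      # SOCAMM footprint relative to DDR5 RDIMM (14 x 90 mm)

# Bandwidth per watt compounds: 2.5x the bandwidth at 1/3 the power
bandwidth_per_watt_gain = bandwidth_ratio / power_ratio

print(f"Bandwidth per watt vs. DDR5 RDIMM: {bandwidth_per_watt_gain:.1f}x")
# → Bandwidth per watt vs. DDR5 RDIMM: 7.5x
```

This is why the module targets AI servers: the efficiency gain shows up most clearly when bandwidth and power are considered together rather than separately.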

The current "SOCAMM 1" is in its initial rollout phase; demand is expected to grow in earnest with "SOCAMM 2," scheduled to appear in 2026 or later.

🟦 Is SOCAMM “post-HBM”?

As AI models grow more demanding, traditional DDR5 RDIMM and HBM are hitting limits in power consumption and scalability. SOCAMM emerged as a form factor promoted by NVIDIA: an innovative memory standard that combines power savings, scalability, and serviceability.

In addition to AI servers and workstations, its compact, highly efficient structure makes it well suited to AI PCs. SOCAMM is in fact used in "DIGITS," the AI PC NVIDIA announced at GTC 2025.


Shipments in 2025 are expected to total roughly 600,000–800,000 units, well below HBM's estimated annual volume of about 9 million units, but that is considered a sufficient "catalyst" for creating a new market.
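To put those volumes in perspective, the projected SOCAMM shipments can be expressed as a share of HBM's estimated annual volume. A minimal sketch using the article's figures (the percentage range is our arithmetic, not a figure from the report):

```python
# Projected 2025 unit volumes from the article
socamm_low, socamm_high = 600_000, 800_000  # SOCAMM shipment range
hbm_annual = 9_000_000                      # estimated annual HBM shipments

# SOCAMM's share of HBM volume at the low and high ends of the range
share_low = socamm_low / hbm_annual
share_high = socamm_high / hbm_annual

print(f"SOCAMM share of HBM volume: {share_low:.0%}-{share_high:.0%}")
# → SOCAMM share of HBM volume: 7%-9%
```

Even a single-digit share of HBM's volume is notable for a standard in its first year, which is why the article frames it as a catalyst rather than a replacement.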

🟦 Summary

NVIDIA's adoption of SOCAMM is about more than memory procurement: it presents a new standard for AI infrastructure. With Micron beginning mass production and Samsung and SK hynix weighing entry, SOCAMM is opening up a new LPDDR-based AI memory market.

HBM and SOCAMM complement each other, and together they point to a future in which memory architectures for the AI era become more diverse.
