
Nvidia's homegrown memory design is almost standardized and ready for everyone to use — JEDEC says SOCAMM2 compact DRAM module for AI servers boasts higher speeds and broader compatibility

Source: Tom's Hardware

TL;DR (AI generated)

Nvidia's SOCAMM2 memory design, now nearing JEDEC standardization, will use LPDDR5X memory and adds upgrades such as an SPD (Serial Presence Detect) profile and faster transfer rates. The new memory type is slated for integration into next-generation AI servers. Nvidia initially faced technical challenges with the original SOCAMM, which led to overheating issues in its servers. The updated SOCAMM2 is expected to see broader adoption by hardware manufacturers beyond Nvidia. The finalized design is anticipated in the coming months, leaving time for production and testing before deployment in next year's Rubin platform.