
Samsung introduces SOCAMM2 LPDDR5X memory module for AI data centers — new standard set to offer reduced power consumption and double the bandwidth versus DDR5 RDIMMs

Source: Tom's Hardware

TL;DR

Samsung has introduced SOCAMM2, an LPDDR5X-based memory module for AI data centers that promises reduced power consumption and roughly double the bandwidth of DDR5 RDIMMs. Unlike the soldered LPDDR typically used in servers, the module is detachable, and it aligns with an emerging JEDEC standard for AI-focused systems. Samsung is collaborating with Nvidia on infrastructure built around SOCAMM2 to address rising memory power costs and serviceability concerns in large-scale deployments. The module is intended to complement HBM and DDR5 DIMMs in the memory hierarchy, delivering high bandwidth at lower power for AI systems. Despite potential latency trade-offs, SOCAMM2's efficiency and bandwidth advantages make it a promising fit for inference-heavy AI deployments.