
Server memory prices to double year-over-year in 2026, LPDDR5X prices could follow — 'seismic shift' means even smartphone-class memory isn't safe from AI-induced crunch

Source: Tom's Hardware
TL;DR (AI Generated)

DRAM prices have surged due to shortages, with server-grade DDR5 prices expected to double by late 2026. Nvidia's adoption of LPDDR5X memory in its CPUs is also driving up prices, putting pressure on smartphone-class memory. A lack of new DRAM manufacturing capacity and rising demand for HBM memory are contributing to the price hikes. This trend is likely to affect several industries, including the DIY PC, automotive, server, and smartphone manufacturing sectors. The shortage could drive a significant increase in memory prices, hitting budget smartphones first and potentially raising bill-of-materials costs for mid-tier and premium devices.


Similar Articles

Samsung and SK hynix warn AI-driven memory shortages could last until 2027 and beyond, as HBM demand explodes — customers already reserving supply years ahead, while the wider DRAM market begins to tighten

Samsung and SK hynix are warning of AI-driven memory shortages potentially lasting until 2027 and beyond, with HBM demand surging. The companies are struggling to meet demand as customers reserve supply years in advance, impacting the broader DRAM market. The shortages are fueled by the need for high-speed memory in AI infrastructure, particularly HBM, which is challenging to manufacture. Despite efforts to develop alternative memory technologies, the demand for existing memory remains overwhelming, prompting companies to invest in expanding production capacity. The memory crunch is part of a larger trend of resource shortages in the tech industry due to the rapid growth of AI infrastructure.

Tom's Hardware
With $1 Cyberattacks on the Rise, Durable Defenses Pay Off

As cyberattacks costing as little as $1 become more prevalent, robust cybersecurity defenses matter more than ever. The article emphasizes the value of writing memory-safe code over relying solely on patching vulnerabilities. Experts Evan Johnson and Justin Cappos of New York University stress the need for durable defenses in the face of rapid, powerful cyberattacks facilitated by large language models such as Anthropic's Claude. They argue that effective cyberdefense requires a comprehensive approach that goes beyond generative AI.

IEEE Spectrum
Commodore backs down over FPGA firmware lockdown — it will no longer try to block third-party firmware installs, but will stand firm against replacing bricked modded units

Commodore has reversed its decision to block third-party firmware installs on the C64 Ultimate computer, allowing users to experiment freely. However, the company will not provide support or replacements for modded units that become bricked. The initial plan to restrict non-Commodore FPGA firmware caused a divide among fans, leading to heated discussions on social media and forums. Commodore now emphasizes user freedom but warns that using community-installed firmware is at the owner's risk, with no free support or warranty service provided for damaged units.

Tom's Hardware
CEO Interview with Xianxin Guo of Lumai

Xianxin Guo, CEO of Lumai, discusses the company's optical computing technology for AI and data center acceleration, aiming to address power efficiency and scalability limitations of traditional silicon-based approaches. Lumai's hybrid optical-electronic design enhances compute efficiency by leveraging light for key operations, reducing energy consumption and breaking through AI system bottlenecks. The technology is well-suited for high-throughput AI inference workloads in data centers, offering a more cost-effective and scalable solution. By focusing on optical compute, Lumai differentiates itself from competitors and aims to redefine AI compute efficiency for long-term scalability and performance gains. The company engages with customers through collaborative discussions and partnership-driven approaches to integrate optical computing seamlessly into existing AI infrastructure.

SemiWiki
