
Asus, MSI, other manufacturers panic-buying RAM stocks, while major memory chipmakers rake in profits — massive demand for HBM and RDIMM for data centers driving shortage

Source: Tom's Hardware

TL;DR (AI Generated)

PC hardware manufacturers such as Asus and MSI are panic-buying RAM stocks amid a shortage of DRAM and NAND chips. Surging demand for HBM and RDIMM modules from AI data centers is squeezing the supply of consumer memory products, driving up prices and limiting availability. Memory chipmakers are profiting from the crunch, with some raising prices by up to 60%. The pricing pressure could persist for years, as manufacturers remain cautious about investing in new fabs given market volatility and concerns that an AI bubble could burst.


Similar Articles

Samsung and SK hynix warn AI-driven memory shortages could last until 2027 and beyond, as HBM demand explodes — customers already reserving supply years ahead, while the wider DRAM market begins to tighten

Samsung and SK hynix are warning of AI-driven memory shortages potentially lasting until 2027 and beyond, with HBM demand surging. The companies are struggling to meet demand as customers reserve supply years in advance, impacting the broader DRAM market. The shortages are fueled by the need for high-speed memory in AI infrastructure, particularly HBM, which is challenging to manufacture. Despite efforts to develop alternative memory technologies, the demand for existing memory remains overwhelming, prompting companies to invest in expanding production capacity. The memory crunch is part of a larger trend of resource shortages in the tech industry due to the rapid growth of AI infrastructure.

Tom's Hardware
Six AI data centers proposed for a small town of 7,000, equal to 51 Walmart Supercenters in 17 square mile area — four out of the seven town council members have resigned from their positions as town fights back

Developers have proposed building six AI data centers in Archbald, Pennsylvania, a small town with a population of 7,000. The data centers would cover 14% of the town's 17-square-mile area and are equivalent in size to 51 Walmart Supercenters. The community is pushing back over concerns about the impact on local utilities, noise, and light pollution. The fight has led to resignations from the town council and delays to the projects, with President Donald Trump urging hyperscalers to address community concerns.

Tom's Hardware
Commodore backs down over FPGA firmware lockdown — it won't now try to block third-party firmware installs but will stand firm against bricked modded units

Commodore has reversed its decision to block third-party firmware installs on the C64 Ultimate computer, allowing users to experiment freely. However, the company will not provide support or replacements for modded units that become bricked. The initial plan to restrict non-Commodore FPGA firmware caused a divide among fans, leading to heated discussions on social media and forums. Commodore now emphasizes user freedom but warns that using community-installed firmware is at the owner's risk, with no free support or warranty service provided for damaged units.

Tom's Hardware
CEO Interview with Xianxin Guo of Lumai

Xianxin Guo, CEO of Lumai, discusses the company's optical computing technology for AI and data center acceleration, which aims to overcome the power-efficiency and scalability limits of traditional silicon-based approaches. Lumai's hybrid optical-electronic design uses light for key compute operations, cutting energy consumption and easing AI system bottlenecks. The technology targets high-throughput AI inference workloads in data centers, where it promises a more cost-effective and scalable solution. By focusing on optical compute, Lumai differentiates itself from competitors and aims to redefine AI compute efficiency for long-term scalability and performance gains. The company works with customers through collaborative, partnership-driven engagements to integrate optical computing into existing AI infrastructure.

SemiWiki
