
Dell and Lenovo set to increase server and PC costs by as much as 15% as soon as this month, according to industry sources — DRAM and AI demand create tight market for businesses and consumers

Source: Tom's Hardware

TL;DR (AI Generated)

Dell and Lenovo are expected to raise server and PC prices by up to 15% due to a tight memory market driven by surging AI demand for DRAM. Lenovo has already warned clients of an impending price hike in early 2026, while Dell may raise prices by 15-20% as early as mid-December. The component crisis, accelerated by AI demand, is rippling through the entire tech industry, causing supply chain disruptions and price increases. Major players such as Samsung, LG, Dell, HP, and Lenovo are reevaluating their product roadmaps amid the ongoing market challenges.


Similar Articles

Samsung and SK hynix warn AI-driven memory shortages could last until 2027 and beyond, as HBM demand explodes — customers already reserving supply years ahead, while the wider DRAM market begins to tighten

Samsung and SK hynix are warning of AI-driven memory shortages potentially lasting until 2027 and beyond, with HBM demand surging. The companies are struggling to meet demand as customers reserve supply years in advance, impacting the broader DRAM market. The shortages are fueled by the need for high-speed memory in AI infrastructure, particularly HBM, which is challenging to manufacture. Despite efforts to develop alternative memory technologies, the demand for existing memory remains overwhelming, prompting companies to invest in expanding production capacity. The memory crunch is part of a larger trend of resource shortages in the tech industry due to the rapid growth of AI infrastructure.

Tom's Hardware

Framework's new RTX 5070 12GB graphics module costs a whopping $1,199 — 72% more expensive than $699 8GB version, says pricing is beyond its control

Nvidia released a new 12GB version of the RTX 5070 mobile GPU with upgraded memory chips, increasing memory throughput. Framework introduced a new graphics module for its Framework Laptop 16 featuring this GPU, priced at $1,199, a significant increase from the $699 8GB version. The high cost is attributed to the expensive GDDR7 memory and the ongoing global memory shortage. Framework clarified that the pricing is influenced by external factors and not within its control, highlighting the challenges faced by consumers due to the current market conditions.

Tom's Hardware

Commodore backs down over FPGA firmware lockdown — it won’t now try and block third-party firmware installs but will stand firm against bricked modded units

Commodore has reversed its decision to block third-party firmware installs on the C64 Ultimate computer, allowing users to experiment freely. However, the company will not provide support or replacements for modded units that become bricked. The initial plan to restrict non-Commodore FPGA firmware caused a divide among fans, leading to heated discussions on social media and forums. Commodore now emphasizes user freedom but warns that using community-installed firmware is at the owner's risk, with no free support or warranty service provided for damaged units.

Tom's Hardware

CEO Interview with Xianxin Guo of Lumai

Xianxin Guo, CEO of Lumai, discusses the company's optical computing technology for AI and data center acceleration, which aims to overcome the power-efficiency and scalability limits of traditional silicon-based approaches. Lumai's hybrid optical-electronic design uses light for key operations, reducing energy consumption and easing AI system bottlenecks. The technology is well suited to high-throughput AI inference workloads in data centers, where it offers a more cost-effective and scalable alternative. By focusing on optical compute, Lumai differentiates itself from competitors and aims to redefine AI compute efficiency for long-term scalability and performance gains. The company works with customers through collaborative, partnership-driven engagements to integrate optical computing into existing AI infrastructure.

SemiWiki
