
Why does OpenAI need six giant data centers?

Source: Ars Technica

TL;DR (AI generated)

OpenAI, Oracle, and SoftBank are expanding their Stargate AI infrastructure project with five new US data center sites, bringing planned capacity to nearly 7 gigawatts and total investment to more than $400 billion over three years. The buildout is meant to serve ChatGPT's 700 million weekly users and to train future AI models, with the partners aiming to reach their full $500 billion, 10-gigawatt commitment by the end of 2025. The new sites, located in Texas, New Mexico, and the Midwest, are expected to create over 25,000 jobs and deliver more than 5.5 gigawatts of capacity, drawing enormous amounts of electricity when running at full load. Critics question whether this massive investment structure is sustainable.
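The capacity and spending figures above can be sanity-checked with some back-of-the-envelope arithmetic. This sketch uses only the article's approximate numbers (nearly 7 GW planned, 5.5 GW from the new sites, a 10 GW / $500B commitment, $400B+ invested); the remainders it derives are rough implications, not reported project data.

```python
# Back-of-the-envelope check of the Stargate figures cited above.
# All values are the article's approximate numbers, not exact project data.

PLANNED_CAPACITY_GW = 7.0    # nearly 7 GW across announced sites
NEW_SITES_CAPACITY_GW = 5.5  # delivered by the five new sites
COMMITMENT_GW = 10.0         # full Stargate capacity commitment
COMMITMENT_USD_BN = 500      # $500 billion total commitment
INVESTED_USD_BN = 400        # investment announced so far

# Capacity implied for the earlier sites (e.g. the Abilene flagship)
existing_gw = PLANNED_CAPACITY_GW - NEW_SITES_CAPACITY_GW

# What would still need to be announced to hit the full commitment
remaining_gw = COMMITMENT_GW - PLANNED_CAPACITY_GW
remaining_usd_bn = COMMITMENT_USD_BN - INVESTED_USD_BN

print(f"Capacity from earlier sites: ~{existing_gw:.1f} GW")
print(f"Remaining toward commitment: ~{remaining_gw:.1f} GW, ~${remaining_usd_bn}B")
```

By these rough numbers, roughly 3 GW and $100B of the stated commitment remain unannounced.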


Similar Articles

Zuckerberg's Meta will beam sunlight from space to power AI data centers, solar-collecting satellites will orbit 22,000 miles above Earth — firm reserves 1 Gigawatt of orbital solar energy and 100 Gigawatt-hours of long-duration storage


Meta, led by Zuckerberg, plans to power its AI data centers with sunlight beamed from space using solar-collecting satellites in geosynchronous orbit 22,000 miles above Earth. The company has reserved 1 Gigawatt of orbital solar energy and 100 Gigawatt-hours of long-duration storage to address the increasing energy demands of its AI infrastructure. This move is part of Meta's strategy to secure long-term energy supplies for its expanding AI operations, with a first orbital demonstration planned for 2028 and potential commercial delivery by 2030. The partnerships with Overview Energy and Noon Energy aim to tackle the challenges of intermittency and long-duration energy storage in renewable energy systems.
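One way to read the reserved figures is to ask how long the storage could sustain the orbital supply's full output. This sketch divides the article's two numbers (1 GW of orbital solar, 100 GWh of long-duration storage); the resulting coverage window is a rough implication that ignores conversion and transmission losses, not a figure from Meta or its partners.

```python
# Rough sense-check of the reported reservations (article figures,
# not confirmed specs): 1 GW of orbital solar, 100 GWh of storage.

ORBITAL_SOLAR_GW = 1.0  # continuous power reserved
STORAGE_GWH = 100.0     # long-duration storage reserved

# Hours the storage could cover the full 1 GW draw (e.g. during an
# eclipse period), ignoring conversion and transmission losses.
coverage_hours = STORAGE_GWH / ORBITAL_SOLAR_GW
print(f"Storage covers ~{coverage_hours:.0f} hours at full 1 GW draw")
```

That roughly four-day buffer is why long-duration storage is paired with the intermittency concerns the article mentions.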

Tom's Hardware
Maine governor vetoes bill that bans large new data centers — says legislature should’ve exempted one particular well-supported data center


Maine Governor Janet Mills vetoed a bill that would have imposed a moratorium on new data center projects exceeding 20 MW until 2027. Mills said she supported the moratorium, which was driven by concerns about data centers' environmental impact and effect on electricity rates, but objected that the legislature had not exempted a well-supported project in Jay with a positive local impact. The legislature may still override the veto, which could affect Mills' political standing. She plans to create a council to study data center impacts and has signed a separate bill preventing data centers from accessing state tax incentives. Resident opposition to data centers, fueled by rising costs and power quality issues, continues to grow.

Tom's Hardware
CPU requirements for AI workloads are multiplying, driving intensifying shortages and price hikes — Intel already shifting production from consumer chips to Xeon as inference workloads drive server CPU ratios back toward parity with GPUs


Intel is experiencing a surge in demand for server CPUs due to the increasing requirements for AI workloads, particularly in inference tasks. This shift has led to shortages and price hikes, with server CPU prices rising by up to 20% since March. The company is redirecting production from consumer chips to Xeon to meet the growing demand for data center chips. The ratio of CPUs to GPUs in data centers is expected to reach parity as AI workloads evolve, driving the need for more powerful CPUs. Intel anticipates further price increases in the second half of 2026 as demand continues to rise.

Tom's Hardware
Global memory shortage expected to get worse before it gets better


The global memory shortage is expected to worsen, with reports indicating that DRAM shortages may persist until the end of the decade. Major manufacturers like Samsung, SK Hynix, and Micron are investing in expanding production facilities, but the additional capacity won't be fully operational until 2027 or later, leading to a multi-year supply gap. The rise in AI infrastructure demand for high-bandwidth memory is prioritizing production over traditional DRAM used in consumer devices, causing further supply constraints. Analysts predict a shortfall in production growth compared to demand, potentially extending the memory shortage until 2030, resulting in continued high prices for consumers.

TweakTown
