Technology

Three big things we still don’t know about AI’s energy burden

Source

MIT Technology Review

TL;DR (AI-generated)

Researchers have struggled to pin down the energy consumption of leading AI models such as ChatGPT and Gemini, with companies like Google and OpenAI initially withholding the data. Recent disclosures have shed some light on the matter: a single ChatGPT query reportedly uses 0.34 watt-hours of energy. Even so, large gaps remain in understanding AI's full energy impact, especially beyond chatbot interactions. Companies investing heavily in AI must balance growing energy demands against sustainability goals, and it is unclear whether AI adoption will meet industry projections or fall short. The future trajectory of AI's energy burden depends on factors such as efficiency gains in the technology and societal adoption rates.
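As a back-of-envelope illustration of how the disclosed per-query figure scales, the snippet below multiplies 0.34 Wh by a hypothetical daily query volume. The volume is an assumption chosen for illustration, not a number from the article:

```python
# Scale the disclosed per-query energy figure to a daily total.
WH_PER_QUERY = 0.34               # disclosed figure: Wh per ChatGPT query
QUERIES_PER_DAY = 1_000_000_000   # hypothetical volume, not a disclosed number

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000   # Wh -> kWh
daily_mwh = daily_kwh / 1000                        # kWh -> MWh
print(f"{daily_mwh:,.0f} MWh/day at {QUERIES_PER_DAY:,} queries/day")
```

At a billion queries a day, the per-query figure compounds to hundreds of megawatt-hours daily, which is why the missing usage data matters so much for any grid-level estimate.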

Similar Articles

Inventor showcases 3D printer filament dryer that mines Bitcoins and dries filament with waste heat, capable of 6 TH/s at 140W — joins Bitcoin-mining 3D printer in hobbyist-focused miner lineup

An inventor has created a filament dryer that not only dries filament for 3D printing but also mines Bitcoin, using the miner's waste heat to dry the filament. The prototype is capable of 6 TH/s at 140W and joins a lineup of hobbyist-focused mining devices. The dryer helps prevent common printing defects caused by moisture absorption, offering a proactive way to maintain filament quality. The device likely shares underlying technology with the inventor's previous 3D printer that doubled as a Bitcoin miner, hinting at further iteration in this niche market.
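The stated specs make the mining efficiency easy to compute: joules per terahash is simply power draw divided by hashrate. A quick check using only the figures from the article:

```python
# Mining efficiency from the article's stated specs.
POWER_W = 140.0      # stated power draw, watts (joules per second)
HASHRATE_THS = 6.0   # stated hashrate, terahashes per second

j_per_th = POWER_W / HASHRATE_THS   # joules per terahash
print(f"{j_per_th:.1f} J/TH")       # ~23.3 J/TH
```

For context, that is far less efficient than current dedicated ASIC miners, which is consistent with the device being a hobbyist novelty whose real payoff is the reclaimed waste heat.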

Tom's Hardware

The Smart Advantage: How Artificial Intelligence Is Transforming Inspection And Metrology In Semiconductor Manufacturing

Artificial Intelligence (AI) is revolutionizing semiconductor inspection and metrology by enhancing defect detection processes with automation, speed, and adaptability. AI-driven systems leverage Big Data to uncover patterns and anomalies that traditional methods may miss, leading to improved accuracy and efficiency. AI-integrated platforms like Nordson's SQ3000 Multi-Function System can detect microscopic flaws with unparalleled speed and efficiency, surpassing traditional methods. AI's real-time, in-line inspection capabilities enable rapid data processing without compromising production speed, while machine learning models adjust quickly to new production requirements. The advancement of Machine Learning (ML) in inspection systems is transforming defect detection by creating self-teaching AI systems that become smarter and more adaptable with each interaction.

SemiEngineering
US Commerce Department confirms harsh new AI export rules, shoots down reports over the return of Biden-era AI Diffusion rule — DoC to formalize a new approach to strategic AI accelerator export controls

The US Commerce Department is planning to implement new AI export rules that require buyers of large quantities of AI accelerators to invest in US AI infrastructure. The proposed rules introduce a multi-level licensing structure based on computing capacity, with different requirements for small, medium, and large shipments. The Department clarified that these new rules are not a return to the burdensome AI Diffusion Rule from the Biden era. The regulations aim to promote secure exports of American tech while potentially making AI hardware more expensive for certain countries. The final version of the rules is still pending, and some requirements may change.

Tom's Hardware
Nvidia-backed trial shows AI data centers can flexibly adjust power use in near real time, with global implications for energy consumption — suggests hyperscalers can reduce consumption as necessary, ensuring grid isn’t overloaded during peak demand

A recent U.K. trial involving Nvidia and other partners demonstrated that AI data centers can adjust power consumption in near real time, reducing strain on the grid during peak demand. Participating hyperscalers were able to modify power usage quickly, soaking up surplus renewable energy during periods of low demand. Although hyperscalers generally want to run their hardware at maximum utilization, this flexibility could speed data center deployment by easing grid pressure. If data centers and grid operators collaborate on such adaptive systems, they could optimize power usage and connect new infrastructure to the grid faster.
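The demand-response behavior described above can be sketched as a simple control rule: cap the data center's power draw whenever the grid reports peak load. The function name, power levels, and threshold below are illustrative assumptions, not details from the trial:

```python
# Minimal sketch of grid-aware power curtailment for a data center.
# All names and numbers here are illustrative assumptions.

def target_power_mw(grid_load_pct: float,
                    normal_mw: float = 100.0,
                    curtailed_mw: float = 75.0,
                    peak_threshold_pct: float = 90.0) -> float:
    """Return the data center's power cap for the current grid load."""
    if grid_load_pct >= peak_threshold_pct:
        return curtailed_mw   # shed load during peak demand
    return normal_mw          # run at full draw otherwise

print(target_power_mw(95.0))  # curtailed during a grid peak
print(target_power_mw(60.0))  # full power off-peak
```

A real deployment would react to a grid operator's demand-response signal rather than a raw load percentage, and would ramp workloads (e.g., pausing preemptible training jobs) rather than switching between two fixed levels, but the control loop has the same shape.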

Tom's Hardware
