
The Download: Google’s AI energy expenditure, and handing over DNA data to the police

Source: MIT Technology Review
TL;DR (AI generated)

Google has released data on the energy consumption of its AI prompts, revealing that the median Gemini prompt uses 0.24 watt-hours of electricity. It is the first such disclosure from a major tech company about a flagship AI product. In a separate story, a writer discusses handing over their DNA data to police through a genealogical database, highlighting the role DNA plays in solving crimes. Additionally, an upcoming academic conference will feature work primarily researched and presented by AI, sparking debate about AI's capabilities in scientific research.


Similar Articles

MIT Technology Review

The Download: AI’s energy future

The article discusses AI's impact on energy consumption, highlighting how AI's growing popularity is driving up electricity demand and reshaping the grid. Despite concerns about increased energy consumption, some argue that AI could benefit the grid by enabling more efficient power systems and predicting failures. The piece also touches on the challenges of understanding AI's energy burden and Google DeepMind's efforts to enhance AI interpretability. Additionally, the article includes other tech news stories, such as Meta's suppression of research and advancements in AI technology like real-time language translation with Apple's new AirPods.

MIT Technology Review

Three big things we still don’t know about AI’s energy burden

Tech researchers have struggled to determine the energy consumption of leading AI models like ChatGPT and Gemini, with companies like Google and OpenAI initially withholding this crucial data. However, recent disclosures have shed some light on the matter, revealing that a single ChatGPT query uses 0.34 watt-hours of energy. Despite this progress, there are still gaps in understanding the full energy impact of AI, especially beyond chatbot interactions. Companies investing heavily in AI are grappling with balancing energy demands with sustainability goals, with uncertainties surrounding whether AI adoption will meet industry projections or falter. The future trajectory of AI's energy burden remains uncertain, dependent on factors like technological advancements and societal adoption rates.

MIT Technology Review

In a first, Google has released data on how much energy an AI prompt uses

Google has released a technical report revealing that the median energy consumption of its Gemini AI apps for a single query is 0.24 watt-hours, equivalent to running a microwave for one second. The report offers insights into water consumption and carbon emissions associated with AI prompts. The study highlights the energy demand breakdown, showing that AI chips account for 58% of the total electricity demand, with other infrastructure components also playing significant roles. Google's transparency in sharing this data is seen as a significant step in understanding AI energy usage, with experts praising the comprehensive analysis provided. The company's efforts to optimize energy efficiency in AI models have led to a significant reduction in energy consumption over time.
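The microwave comparison above can be checked with simple unit arithmetic. This is a minimal sketch; the ~900 W microwave power rating is an assumption for illustration, not a figure from the report:

```python
# Sanity check of the report's comparison: 0.24 Wh vs. one second of microwave use.
# The 900 W microwave rating is an assumed typical value, not from the source.
WH_TO_JOULES = 3600.0            # 1 watt-hour = 3,600 joules
median_prompt_wh = 0.24          # median Gemini prompt, per Google's report
microwave_watts = 900.0          # assumed microwave power draw

energy_joules = median_prompt_wh * WH_TO_JOULES      # 864 J
microwave_seconds = energy_joules / microwave_watts  # ~0.96 s, i.e. about one second
print(f"{microwave_seconds:.2f} s of microwave use")
```

At an assumed 900 W, 0.24 Wh powers a microwave for roughly 0.96 seconds, consistent with the report's "one second" framing.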

SemiWiki

Dr. L.C. Lu on TSMC Advanced Technology Design Solutions

Dr. L.C. Lu, a key figure at TSMC, focuses on design-technology co-optimization, packaging innovations, and AI-driven methodologies for next-gen semiconductor systems. TSMC emphasizes DTCO and DDCL innovations for scaling from N5 to A14 nodes, with NanoFlex and NanoFlex Pro architectures offering efficiency gains. N2P and N2U nodes incorporate advanced DTCO and power delivery optimizations, with hybrid dual-rail architectures achieving significant energy savings. The company collaborates with EDA partners on AI integration, enhancing productivity and design quality. Advanced packaging technologies like CoWoS and SoIC play a crucial role in enabling AI scaling, with memory bandwidth and interconnect performance scaling aggressively. TSMC also addresses power delivery and thermal management challenges in AI systems, and its advancements in design methodologies and AI-driven automation promise improved productivity and scalability in chip-package co-design.
