Data Centers May House AI—But Operators Don’t Trust AI (Yet)
Maine Governor Janet Mills vetoed a bill that would have blocked new data center projects exceeding 20 MW until 2027, a moratorium driven by concerns over environmental impact and electricity rates. She said she supported the moratorium in principle but wanted an exemption for a planned data center in Jay because of its positive local impact. The legislature may still override the veto, which could affect Mills' political standing. Mills plans to create a council to study data center impacts, and she signed a separate bill barring data centers from state tax incentives. Resident opposition to data centers is growing over rising electricity costs and power-quality issues.
Intel is experiencing a surge in demand for server CPUs driven by AI workloads, particularly inference. The shift has led to shortages and price hikes, with server CPU prices up by as much as 20% since March. The company is redirecting production capacity from consumer chips to Xeon server parts to meet growing data center demand. As AI workloads evolve, the CPU-to-GPU ratio in data centers is expected to approach parity, driving the need for more powerful CPUs. Intel anticipates further price increases in the second half of 2026 as demand continues to rise.
Artificial Intelligence (AI) is reshaping semiconductor inspection and metrology by adding automation, speed, and adaptability to defect detection. AI-driven systems mine large volumes of inspection data for patterns and anomalies that traditional rule-based methods can miss, improving both accuracy and efficiency. AI-integrated platforms such as Nordson's SQ3000 Multi-Function System can detect microscopic flaws faster than conventional approaches. Real-time, in-line inspection lets these systems process data without slowing production, while machine learning models adapt quickly to new production requirements, in effect becoming more capable with each inspection cycle.
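As a loose illustration of the statistical anomaly detection such inspection systems build on, here is a minimal sketch: learn a per-feature baseline from known-good measurements, then flag features whose z-score exceeds a threshold. All names, features, and thresholds here are hypothetical, not Nordson's actual pipeline.

```python
from statistics import mean, stdev

def fit_baseline(good_samples):
    """Learn per-feature (mean, stdev) from known-good inspection measurements."""
    features = list(zip(*good_samples))
    return [(mean(f), stdev(f)) for f in features]

def flag_defects(sample, baseline, z_threshold=4.0):
    """Return indices of features whose z-score exceeds the threshold."""
    flagged = []
    for i, (x, (mu, sigma)) in enumerate(zip(sample, baseline)):
        if sigma > 0 and abs(x - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Hypothetical data: 3 features per inspected unit (e.g. solder height, width, offset)
good = [(1.00, 0.50, 0.02), (1.02, 0.49, 0.01), (0.98, 0.51, 0.03),
        (1.01, 0.50, 0.02), (0.99, 0.52, 0.01)]
baseline = fit_baseline(good)
print(flag_defects((1.00, 0.50, 0.02), baseline))  # → [] (in spec)
print(flag_defects((1.60, 0.50, 0.02), baseline))  # → [0] (feature 0 far out of spec)
```

Production systems replace the z-score with learned models, but the structure — fit on known-good data, score new units in-line — is the same.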
Researchers at Technische Universität Berlin published a technical paper on Silent Data Corruption (SDC) in Large Language Model (LLM) training. As LLMs grow, hardware-induced faults such as SDC can bypass detection mechanisms, with severe consequences during training. The study examines how intermittent SDC affects LLM pretraining, showing that the impact is sensitive to factors such as affected bit positions and kernel functions. The authors propose a lightweight method for detecting harmful parameter updates and show that recomputing the training step upon detection effectively mitigates the corruption.
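The detect-and-recompute idea summarized above can be loosely sketched as follows: track the norm of each parameter update, and when an update is an extreme outlier relative to recent history — one possible signature of silent corruption — discard it and recompute the step. Everything here (the toy update, the windowing, the threshold factor) is a hypothetical illustration, not the authors' actual method.

```python
from collections import deque
from math import sqrt

def update_norm(old, new):
    """L2 norm of the parameter update (new - old)."""
    return sqrt(sum((n - o) ** 2 for o, n in zip(old, new)))

class SDCGuard:
    """Flag parameter updates whose norm is an extreme outlier vs. recent steps."""
    def __init__(self, window=50, factor=10.0):
        self.history = deque(maxlen=window)
        self.factor = factor

    def is_suspicious(self, norm):
        if len(self.history) < 5:          # not enough history yet: accept
            self.history.append(norm)
            return False
        baseline = sum(self.history) / len(self.history)
        if norm > self.factor * baseline:  # possible silent data corruption
            return True                    # caller should recompute the step
        self.history.append(norm)
        return False

guard = SDCGuard()
params = [0.0, 0.0]
for step in range(20):
    new_params = [p + 0.01 for p in params]   # toy, well-behaved update
    if step == 15:                            # inject a corrupted update
        new_params = [p + 100.0 for p in params]
    if guard.is_suspicious(update_norm(params, new_params)):
        # Discard the suspect update and recompute the step cleanly
        new_params = [p + 0.01 for p in params]
    params = new_params
```

The corrupted step 15 is caught and recomputed, so training proceeds as if the fault never occurred; the cost is one extra recomputation per detected anomaly.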