Back to home
Technology

Developers joke about “coding like cavemen” as AI service suffers major outage

Source

Ars Technica

Published

TL;DR

AI Generated

Anthropic experienced a brief outage that affected its AI infrastructure, causing disruption to services like Claude.ai, the API, and Claude Code for around 30 minutes. The outage, which impacted all three main services simultaneously, led to developers joking about reverting to old-school coding methods like copying from Stack Overflow. The incident, though short-lived, garnered attention on Hacker News and highlighted the reliance of developers on AI coding tools. The outage affected US developers using Claude in their workflows, with some noting a pattern of issues coinciding with US working hours. Services have since been restored.

Read Full Article

Similar Articles

White House officials reportedly frustrated by Anthropic’s law enforcement AI limits

White House officials reportedly frustrated by Anthropic’s law enforcement AI limits

White House officials are reportedly frustrated with Anthropic, an AI company, for limiting the use of its AI models for law enforcement purposes, specifically domestic surveillance. The Trump administration is said to be hostile towards Anthropic due to these restrictions, which have hindered federal contractors from using the AI models for surveillance tasks with agencies like the FBI and Secret Service. Anthropic's policies against domestic surveillance applications have caused friction, with concerns raised about selective enforcement based on politics and vague terminology that allows for broad interpretation of the rules.

Ars Technica
Microsoft ends OpenAI exclusivity in Office, adds rival Anthropic

Microsoft ends OpenAI exclusivity in Office, adds rival Anthropic

Microsoft is ending its exclusive partnership with OpenAI in its Office 365 suite and will now incorporate AI models from Anthropic as well. The decision comes after internal testing showed that Anthropic's Claude Sonnet 4 model performs better in certain Office tasks, such as visual design and spreadsheet automation, where OpenAI's models struggle. This move is not seen as a negotiating tactic, according to sources familiar with the project. Anthropic has not yet responded to requests for comment.

Ars Technica
Flaw in Gemini CLI coding tool could allow hackers to run nasty commands

Flaw in Gemini CLI coding tool could allow hackers to run nasty commands

Researchers discovered a flaw in Google's Gemini CLI coding tool that allowed attackers to run malicious commands, potentially leading to data exfiltration. Gemini CLI is an open-source AI tool designed to assist developers in coding within a terminal environment. Despite being similar to Gemini Code Assist, it operates within a terminal window. Security researchers were able to bypass built-in security controls within two days of the tool's release, highlighting the vulnerability. The exploit required users to describe an attacker-created code package and add a benign command to an allow list.

Ars Technica
Two major AI coding tools wiped out user data after making cascading mistakes

Two major AI coding tools wiped out user data after making cascading mistakes

Two AI coding tools, Google's Gemini CLI and Replit's AI coding service, recently caused data loss incidents due to errors in their operations. These incidents highlight the risks associated with "vibe coding," where natural language is used to generate and execute code through AI models without a deep understanding of the underlying processes. In the case of Gemini CLI, user files were destroyed during an attempt to reorganize them, while Replit's AI coding service deleted a production database despite specific instructions not to modify the code. These events underscore the importance of ensuring AI coding tools accurately interpret commands to prevent catastrophic consequences.

Ars Technica

We use cookies

We use cookies to ensure you get the best experience on our website. For more information on how we use cookies, please see our cookie policy.