Technology

White House officials reportedly frustrated by Anthropic’s law enforcement AI limits

Source: Ars Technica
TL;DR (AI Generated)

White House officials are reportedly frustrated with Anthropic, an AI company, over its limits on law enforcement uses of its AI models, specifically domestic surveillance. The restrictions are said to have left the Trump administration hostile toward the company, since they prevent federal contractors from using the models for surveillance work with agencies such as the FBI and Secret Service. Officials reportedly worry that Anthropic enforces its usage policies selectively based on politics, and that vague terminology leaves the rules open to broad interpretation.


Similar Articles


The Download: animal welfare gets AGI-pilled, and the White House unveils its AI policy

Animal welfare advocates and AI researchers are exploring the potential of artificial general intelligence (AGI) to prevent animal suffering, with discussions ranging from using AI in advocacy work to cultivating meat with AI tools. The White House has revealed its AI policy blueprint, aiming to codify a light-touch framework into law and block state limits on AI, sparking a brewing war over AI regulation in the US. Elon Musk has been found liable for misleading Twitter investors, while the Pentagon is adopting Palantir AI as the core US military system. OpenAI plans to show ads to all US users of the free version of ChatGPT to generate revenue amid rising computing costs.

MIT Technology Review
Wi-Fi signals can now create accurate images of a room with the help of pre-trained AI — 'LatentCSI' leverages Stable Diffusion 3 to turn Wi-Fi data into a digital paintbrush

Researchers have developed LatentCSI, a method that combines a pre-trained image model with Wi-Fi channel state information (CSI) to efficiently produce high-resolution images of a room. An encoder translates the Wi-Fi data into the latent space of Stable Diffusion 3, which then renders a detailed image reflecting the room's layout. While LatentCSI depends on pre-trained models, it can provide real-time information about a room's occupants and layout, raising privacy concerns over the technology's surveillance implications.
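
The pipeline is easier to see in a sketch. Below is a minimal, hypothetical PyTorch example of the idea as summarized above: a small encoder trained to map a Wi-Fi CSI measurement into the latent space of a pre-trained image model, whose decoder would then render the scene. The CSIEncoder class, the tensor sizes, and the stubbed decoding step are illustrative assumptions, not the researchers' actual implementation.

    # Hypothetical sketch of the LatentCSI idea (not the paper's code).
    import torch
    import torch.nn as nn

    class CSIEncoder(nn.Module):
        """Maps a flattened Wi-Fi CSI snapshot to an image-model latent."""
        def __init__(self, csi_dim=2048, latent_shape=(4, 64, 64)):
            super().__init__()
            self.latent_shape = latent_shape
            out_dim = latent_shape[0] * latent_shape[1] * latent_shape[2]
            self.net = nn.Sequential(
                nn.Linear(csi_dim, 4096),
                nn.ReLU(),
                nn.Linear(4096, out_dim),
            )

        def forward(self, csi):
            z = self.net(csi)
            return z.view(-1, *self.latent_shape)

    encoder = CSIEncoder()
    csi = torch.randn(1, 2048)   # one CSI measurement (dimensions assumed)
    latent = encoder(csi)        # predicted latent for the scene
    # In the real system, a pre-trained diffusion model's decoder would
    # turn this latent into an RGB image of the room; stubbed out here.
    print(latent.shape)          # torch.Size([1, 4, 64, 64])

The key design point the summary describes is that the heavy image synthesis is delegated to an off-the-shelf pre-trained model, so only the comparatively small CSI-to-latent encoder has to be trained.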

Tom's Hardware

Shoplifters could soon be chased down by drones

Flock Safety, known for the drones it supplies to police, is now offering them for private security, targeting businesses that want to combat shoplifting. Companies can install drone docking stations on their premises and, with FAA waivers, fly the drones within a few miles. The drones can track suspects and transmit video to security staff or police, and they are being pitched to retailers, hospitals, warehouses, and more. The expansion into private-sector security has raised concerns about privacy, overpolicing, and the erosion of Fourth Amendment rights.

MIT Technology Review
Developers joke about “coding like cavemen” as AI service suffers major outage

Anthropic experienced an outage that took down Claude.ai, its API, and Claude Code simultaneously for around 30 minutes, prompting developers to joke about reverting to old-school methods like copying code from Stack Overflow. Though short-lived, the incident drew attention on Hacker News and highlighted how heavily developers now rely on AI coding tools. Some US developers who use Claude in their workflows noted a pattern of issues coinciding with US working hours. Services have since been restored.

Ars Technica
