SambaNova introduces new AI accelerator, partners with Intel to deploy Xeon CPUs for inferencing and agentic workloads — SambaNova claims SN50 chip is three times more efficient than Nvidia B200
TL;DR
SambaNova has introduced its SN50 AI processor for agentic inference and is partnering with Intel to deploy Xeon CPUs alongside it for inferencing workloads. The company claims the SN50 is three times more efficient than Nvidia's B200, positioning the accelerator around low latency and low power consumption for AI inference. The collaboration with Intel aims to offer Xeon-based AI systems to enterprises and governments building large-scale AI inference infrastructure. SambaNova also secured $350 million in Series E funding to expand its manufacturing and cloud capacity.