
SambaNova introduces new AI accelerator, partners with Intel to deploy Xeon CPUs for inferencing and agentic workloads — SambaNova claims SN50 chip is three times more efficient than Nvidia B200

Source: Tom's Hardware


TL;DR


SambaNova has introduced the SN50, an AI processor built for agentic inference that the company claims is three times more efficient than Nvidia's B200, with an emphasis on low latency and low power consumption. Alongside the chip, SambaNova announced a partnership with Intel to pair Xeon CPUs with its accelerators, offering Xeon-based AI systems to enterprises and governments building large-scale AI inference infrastructure. The company also secured $350 million in Series E funding to expand its manufacturing and cloud capacity.