
Meta reveals four new MTIA chips built for AI inference — to be released on a six-month cadence

Source

Tom's Hardware

TL;DR (AI Generated)

Meta has unveiled four new generations of its Meta Training and Inference Accelerator (MTIA) chips, developed in collaboration with Broadcom and set to be released on a six-month cadence. The new chips, MTIA 300, 400, 450, and 500, focus on AI inference tasks and deliver significant improvements in HBM bandwidth and compute FLOPS over previous models. Meta's strategy emphasizes rapid development, an inference-first approach, and adherence to industry standards. The company plans to deploy the chips in its data centers for a range of AI workloads, aiming to reduce its AI infrastructure's reliance on Nvidia GPUs.