AMD adds ultra-fast memory to flagship AI Instinct accelerator as it looks to next-generation CDNA 4 architecture – Instinct MI325X accelerator has 2x memory and 30% more bandwidth compared to Nvidia’s H200

AMD has unveiled new CPU, NPU and GPU architectures aimed at “powering end-to-end AI infrastructure from the data center to PCs,” alongside an expanded AMD Instinct accelerator roadmap and a new Instinct MI325X accelerator, which the company says will be available in Q4 2024.

The new Instinct MI325X offers 288 GB of HBM3E memory and 6 TB/s of memory bandwidth. AMD says this gives it 2x the memory capacity and 1.3x the memory bandwidth of “the competition” (by which it means Nvidia’s H200), as well as 1.3x better compute performance.
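
For readers who want to sanity-check those multipliers, here is a minimal sketch in Python assuming Nvidia’s publicly listed H200 figures of 141 GB of HBM3e and 4.8 TB/s of memory bandwidth, which do not appear in AMD’s announcement:

```python
# Rough check of AMD's claimed multipliers against Nvidia's H200.
# The H200 numbers (141 GB HBM3e, 4.8 TB/s) are Nvidia's published specs,
# assumed here for illustration; they are not part of AMD's announcement.

mi325x = {"memory_gb": 288, "bandwidth_tb_s": 6.0}
h200 = {"memory_gb": 141, "bandwidth_tb_s": 4.8}

memory_ratio = mi325x["memory_gb"] / h200["memory_gb"]           # ~2.04x
bandwidth_ratio = mi325x["bandwidth_tb_s"] / h200["bandwidth_tb_s"]  # 1.25x

print(f"Memory capacity: {memory_ratio:.2f}x the H200")
print(f"Memory bandwidth: {bandwidth_ratio:.2f}x the H200")
```

On those assumed figures the capacity ratio works out to roughly 2x and the bandwidth ratio to 1.25x, which lines up with AMD’s rounded “2x” and “1.3x” claims.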