AMD Radeon Instinct MI200 Accelerator Gets 128GB HBM2e

AMD’s Radeon Instinct MI200 data center accelerator appears to be getting 128GB of HBM2e memory. That is four times as much as the current MI100. The accelerator is also probably the first AMD GPU that, like the Ryzen processors, is built from multiple chiplets.

The Australian Pawsey Supercomputing Centre is working on a new supercomputer called Setonix, which will contain ‘MI-next’ GPUs with 128GB of VRAM per GPU, HPCwire writes. AMD hasn’t announced its Instinct MI200 card yet, but details about it have surfaced before.

Last week, Twitter user Locuza published a diagram of the GPU, which is based on AMD’s CDNA2 architecture. It would be a multi-chip module (MCM) design, in which multiple chips are combined, just as the Ryzen processors use chiplets that contain the CPU cores.

According to the diagram, AMD would use a total of eight stacks of 16GB HBM2e, and the MI200 would consist of two chiplets. Each chiplet is said to contain 128 compute units, which would add up to 16,384 stream processors, but it’s not yet clear whether all of them will be enabled.
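The rumored stream processor count follows from the CDNA layout, in which each compute unit holds 64 stream processors; that same ratio yields the MI100’s 7680 cores from its 120 compute units. A quick sketch of the arithmetic (the per-chiplet CU count comes from the leaked diagram and is not officially confirmed):

```python
# Assumption: 64 stream processors per compute unit, as on CDNA1
# (MI100: 120 CUs x 64 SPs = 7680 stream processors).
SP_PER_CU = 64

# MI100, for reference
mi100_sps = 120 * SP_PER_CU
print(mi100_sps)            # 7680

# Rumored MI200: two chiplets of 128 compute units each
cus_per_chiplet = 128       # per the leaked diagram, unconfirmed
chiplets = 2
mi200_sps = cus_per_chiplet * chiplets * SP_PER_CU
print(mi200_sps)            # 16384
```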

The MI200 is the first accelerator based on the CDNA2 architecture, which was previously on the roadmap for 2022. The GPU would bear the code name Aldebaran and is the intended successor to Arcturus, the Radeon Instinct MI100. AMD’s Instinct accelerators are used in data centers and supercomputers; they cannot be used as video cards.

AMD Instinct Accelerators
Specification     | AMD Instinct MI100     | AMD Instinct MI200*
Architecture      | 7nm CDNA1 (GFX908)     | CDNA2 (GFX90A)
GPU name          | Arcturus               | Aldebaran (Multi-Chip-Module)
GPU cores         | 7680                   | TBA
GPU speed         | ~1500MHz               | TBA
FP16 compute      | 185 TFLOPS             | TBA
FP32 compute      | 23.1 TFLOPS            | TBA
FP64 compute      | 11.5 TFLOPS            | TBA
VRAM              | 32GB HBM2              | 128GB HBM2e
Memory speed      | 1200MHz                | TBA
Memory bus        | 4096-bit               | TBA
Memory bandwidth  | 1.23TB/s               | TBA
Form factor       | Dual slot, full length | OAM
Cooling           | Passive                | TBA
TDP               | 300W                   | TBA

*Specifications not yet officially confirmed
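The MI100’s memory bandwidth figure in the table is consistent with its clock and bus width: HBM2 transfers data twice per clock, so a 1200MHz clock on a 4096-bit bus works out to roughly 1.23TB/s. A minimal check of that arithmetic:

```python
# MI100 memory figures from the table above
clock_hz = 1200e6   # 1200MHz memory clock
bus_bits = 4096     # 4096-bit memory bus

# HBM2 is double data rate: two transfers per clock cycle
bandwidth_bytes = clock_hz * 2 * bus_bits / 8
print(round(bandwidth_bytes / 1e12, 2))  # 1.23 (TB/s)
```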
