Nvidia now supplies Tesla V100 accelerator with 32GB HBM2


Nvidia now supplies all variants of its Volta-based Tesla V100 accelerator with 32GB of HBM2 instead of 16GB. The upgrade is the result of improved availability of 8GB HBM2 stacks.

AnandTech reports that the upgrade applies to both the SXM modules and the PCIe plug-in cards. Nvidia has tweaked production and now uses 8GB HBM2 stacks instead of the 4GB HBM2 stacks that the Tesla V100 accelerator was initially equipped with. The 16GB versions are no longer made; all newly sold products have 32GB of HBM2. OEMs that use the accelerator will probably also switch to the 32GB versions soon.

The upgrade to 32GB of HBM2 is possible because memory manufacturers are now able to supply enough stacks with a capacity of 8GB. Each stack consists of eight layers, and Nvidia combines four stacks for a total of 32GB. Samsung announced production of its new 8GB HBM2 packages at the beginning of this year, and SK Hynix also has such chips in production.
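The stack arithmetic above can be sketched as a quick sanity check (the constants come from the article; variable names are illustrative, not from any Nvidia specification):

```python
# Capacity math for the 32GB Tesla V100 as described in the article:
# four HBM2 stacks, each an eight-layer stack totalling 8GB.
STACKS_PER_GPU = 4        # HBM2 stacks placed next to the GPU die
CAPACITY_PER_STACK_GB = 8  # one 8GB stack
LAYERS_PER_STACK = 8       # eight memory layers per stack

capacity_per_layer_gb = CAPACITY_PER_STACK_GB / LAYERS_PER_STACK
total_gb = STACKS_PER_GPU * CAPACITY_PER_STACK_GB

print(f"{capacity_per_layer_gb:.0f} GB per layer, {total_gb} GB total")
# → 1 GB per layer, 32 GB total
```

The original 16GB variant used the same four-stack layout with 4GB stacks, which is why swapping in the denser stacks doubles the total without changing the board design.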

Nvidia announced the Tesla V100 in May last year; it was the first product with the Volta GPU. Apart from the doubled memory capacity, the chip remains the same. Nvidia has not disclosed new prices.

Nvidia has said nothing about the Titan V, which also has a Volta GPU and is currently equipped with 12GB of HBM2. This workstation card could be equipped with 24GB of HBM2 using the new stacks, but it is not yet clear whether such a variant will actually come out.

PCIe version of the Nvidia Tesla V100 accelerator
