Nvidia presents Tesla T4 accelerator with Turing-GPU


Nvidia has announced the Tesla T4 accelerator, which is equipped with a Turing GPU with Tensor cores and 16GB of GDDR6 memory. The card is intended for data centers where deep learning is used.

The Tesla T4 is a PCIe x16 add-in card equipped with a Turing GPU with 2560 CUDA cores and 320 Tensor cores. Nvidia does not mention any RT cores, so it appears to be a modified GPU in which the ray-tracing cores are absent. The Tesla T4 accelerator has a memory bandwidth of 320GB/s and consumes 75 watts.

Nvidia gives few details about the GPU, but the number of CUDA cores suggests it is a small version of the Turing GPU, like the TU106 used in the RTX 2070 video card. That variant of the GPU has 2304 CUDA cores and 288 Tensor cores, but also 36 RT cores. The Quadro RTX cards have at least 3072 CUDA cores and are also equipped with RT cores.

According to Nvidia, the card is suited to running inference on trained machine learning models. The GPU maker is also releasing TensorRT 5, an inference optimizer and runtime engine with support for the Tensor cores of the Turing GPU. Nvidia bundles the whole under the name TensorRT Hyperscale Platform.
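
As an illustration of what that looks like in practice, below is a minimal sketch of how a trained model might be converted into an fp16 engine with the TensorRT 5-era Python API, so that the Turing Tensor cores can be used during inference. The ONNX model path, batch size and workspace size are placeholder assumptions, not values from the announcement, and the exact calls reflect the TensorRT 5 Python API rather than anything Nvidia states here.

    import tensorrt as trt  # TensorRT 5 Python API (assumption: installed with Python bindings)

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

    def build_fp16_engine(onnx_path="model.onnx", max_batch_size=8):
        # Build a TensorRT inference engine from an ONNX model, enabling
        # fp16 mode so the Turing Tensor cores can accelerate the layers.
        with trt.Builder(TRT_LOGGER) as builder, \
             builder.create_network() as network, \
             trt.OnnxParser(network, TRT_LOGGER) as parser:
            builder.max_batch_size = max_batch_size
            builder.max_workspace_size = 1 << 30  # 1GB scratch space (placeholder)
            builder.fp16_mode = True              # run eligible layers in half precision
            with open(onnx_path, "rb") as f:
                if not parser.parse(f.read()):
                    raise RuntimeError("failed to parse ONNX model")
            return builder.build_cuda_engine(network)

The resulting engine can then be serialized and deployed on the T4 in a data center; fp16 mode is what maps the model's matrix operations onto the Tensor cores mentioned above.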

In the announcement, the company says that the Google Cloud Platform will soon support the Tesla T4. Server manufacturers such as Cisco UCS, Dell EMC, Fujitsu, HPE and IBM Cognitive Systems also say they will support the TensorRT Hyperscale Platform.

Nvidia has not announced the price of the Tesla T4 accelerator. It is the successor to the Tesla P4, which was announced in 2016 and can currently be found in the Pricewatch for about 2,500 euros.
