Cerebras makes Wafer Scale chip with 2.6 trillion transistors and 850,000 AI cores


Cerebras is working on its second Wafer Scale Engine, a chip so large that it takes up a full 7nm wafer from TSMC. The new chip will have 2.6 trillion transistors and 850,000 cores for artificial intelligence computation.

Cerebras revealed the first details of its second-generation Wafer Scale Engine at Hot Chips 2020, AnandTech writes. The company had planned to present the chip in full, but has postponed that presentation until later this year, so few further specifics are known.

The huge chip is the successor to the Wafer Scale Engine that American chipmaker Cerebras unveiled at Hot Chips last year. That chip was made on a 16nm process from TSMC and contained 1.2 trillion transistors and 400,000 cores.

That first chip measured 21.5×21.5cm, the maximum size that can be obtained from a 300mm wafer. The new chip is likely to be roughly the same size, since TSMC also produces its 7nm chips on 300mm wafers. Thanks to the smaller process node, however, far more transistors, and therefore more cores, fit in the same area.
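As a rough illustration of what that node shrink implies, the sketch below compares transistor density across the two generations, assuming the second-generation die keeps the same 462.3cm² area; that area figure is an assumption on our part and has not been confirmed by Cerebras.

```python
# Back-of-the-envelope transistor density comparison.
# The 2nd-gen die area is assumed equal to the 1st-gen area (not confirmed).
WSE1_TRANSISTORS = 1.2e12   # first-generation Wafer Scale Engine, 16nm
WSE2_TRANSISTORS = 2.6e12   # second-generation figure announced at Hot Chips 2020, 7nm
DIE_AREA_CM2 = 21.5 * 21.5  # 462.25 cm², the largest square that fits on a 300mm wafer

wse1_density = WSE1_TRANSISTORS / DIE_AREA_CM2
wse2_density = WSE2_TRANSISTORS / DIE_AREA_CM2

print(f"WSE-1: {wse1_density / 1e9:.1f} billion transistors per cm²")
print(f"WSE-2: {wse2_density / 1e9:.1f} billion transistors per cm² (assumed same area)")
print(f"Density increase: {wse2_density / wse1_density:.2f}x")
```

Under that assumption the density roughly doubles, which lines up with the jump from 1.2 to 2.6 trillion transistors on a die of the same size.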

In the first Wafer Scale Engine, Cerebras gave each core 48 kilobytes of SRAM, for a total of 18GB of SRAM across the chip. The cores themselves are fairly simple and built for artificial intelligence calculations. Cerebras also builds its own 254kg computer that houses the enormous chip.
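A quick sanity check shows how the per-core figure adds up to the chip-wide total; treating each kilobyte as 1024 bytes is an assumption here, since Cerebras does not specify which convention it uses.

```python
# Verify that 400,000 cores x 48 KB of SRAM per core lands near the quoted 18 GB total.
CORES = 400_000
SRAM_PER_CORE_KIB = 48  # assuming 1 KB = 1024 bytes

total_bytes = CORES * SRAM_PER_CORE_KIB * 1024
total_gib = total_bytes / 1024**3

print(f"Total on-chip SRAM: {total_gib:.1f} GiB")  # about 18.3 GiB
```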

Chip             Cerebras Wafer Scale Engine (2nd gen.)   Cerebras Wafer Scale Engine
Process          7nm, TSMC                                16nm, TSMC
Format           Whole wafer (300mm)                      Whole wafer (300mm)
Chip dimensions  Not yet known                            21.5×21.5cm (462.3cm²)
Transistors      2.6 trillion                             1.2 trillion
Number of cores  850,000                                  400,000
SRAM             Not yet known                            48KB per core, 18GB total
