TSMC: Shortage of Nvidia AI data center GPUs will last until the end of 2024


Nvidia’s AI GPUs will remain scarce until at least the end of 2024, TSMC chairman Mark Liu said in an interview with Nikkei. Liu attributes the shortage to limited Chip-on-Wafer-on-Substrate (CoWoS) packaging capacity.

TSMC expects the shortage to last another eighteen months, as demand for AI GPUs continues to rise while the capacity of TSMC’s CoWoS packaging technology is being expanded only slowly. Liu told Nikkei that capacity should be expanded enough within a year and a half to meet the full demand; currently, TSMC says it can meet about eighty percent of it.

TSMC is the only company that manufactures Nvidia’s H100 and A100 AI data center GPUs, the hardware that powers many AI applications, including ChatGPT. Like most AI chips, these two Nvidia GPUs rely on CoWoS. TSMC says demand for products using this advanced packaging technique skyrocketed unexpectedly and is now roughly three times higher than in the same period last year. The company is installing new tools at its existing packaging plants to boost capacity, but this will take time.

