As the Demand for HBM Explodes, SK Hynix Is Expected to Benefit
The demand for high-bandwidth memory (HBM) is set to explode in the coming quarters and years due to the broader adoption of artificial intelligence in general and generative AI in particular. SK Hynix will likely be the primary beneficiary of the HBM rally, as it leads shipments of this type of memory with a 50% share in 2022, according to TrendForce.
Analysts from TrendForce believe that shipments of AI servers equipped with compute GPUs such as Nvidia's A100 or H100 increased by roughly 9% year-over-year in 2022, though they do not specify whether they mean unit or dollar shipments. They now estimate that the rise of generative AI will catalyze demand for AI servers, with the market growing by 15.4% in 2023 and then at a compound annual growth rate of 12.2% through 2027.
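As a rough illustration of what a 12.2% CAGR through 2027 implies, the sketch below compounds an arbitrary 2023 index value of 100; the baseline and yearly figures are hypothetical and not drawn from TrendForce's report.

```python
# Hypothetical illustration of a 12.2% CAGR for AI-server shipments
# through 2027. The 2023 baseline of 100 is an arbitrary index value,
# not a figure from TrendForce.
CAGR = 0.122
base_2023 = 100.0

for year in range(2023, 2028):
    index = base_2023 * (1 + CAGR) ** (year - 2023)
    print(f"{year}: {index:.1f}")

# 2023: 100.0, 2024: 112.2, 2025: 125.9, 2026: 141.2, 2027: 158.5
# i.e., roughly 1.6x growth in the AI-server market over four years.
```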
The upsurge in AI server usage will also increase demand for all types of memory, including commodity DDR5 SDRAM, HBM2e and HBM3 for compute GPUs, and 3D NAND for high-performance, high-capacity storage devices.
TrendForce estimates that while a general-purpose server packs 500 GB – 600 GB of commodity memory, an AI server uses 1.2 TB – 1.7 TB. In addition, such machines use compute GPUs equipped with 80 GB or more of HBM2e/HBM3 memory each. Since each AI machine carries multiple compute GPUs, the total HBM content per box is now 320 GB – 640 GB, and it is only set to grow further as accelerators like AMD's Instinct MI300 and Nvidia's H100 NVL carry more HBM3 memory.
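For context, the 320 GB – 640 GB range is consistent with typical 4- and 8-GPU server configurations at 80 GB of HBM per accelerator; the sketch below simply works through that arithmetic and is not TrendForce's methodology.

```python
# Back-of-the-envelope check of the HBM-per-server range cited above,
# assuming 80 GB of HBM2e/HBM3 per compute GPU (e.g., an 80 GB A100/H100)
# and typical 4- or 8-GPU AI-server configurations. The GPU counts are
# assumptions for illustration only.
HBM_PER_GPU_GB = 80

def hbm_per_server(gpu_count: int, hbm_per_gpu_gb: int = HBM_PER_GPU_GB) -> int:
    """Total HBM capacity in GB for a server with the given number of GPUs."""
    return gpu_count * hbm_per_gpu_gb

for gpus in (4, 8):
    print(f"{gpus} GPUs x {HBM_PER_GPU_GB} GB = {hbm_per_server(gpus)} GB of HBM")

# Prints 320 GB and 640 GB, matching the quoted 320 GB - 640 GB range.
```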
Speaking of HBM3 adoption, it is worth noting that SK Hynix is currently the only maker mass-producing this type of memory, according to TrendForce. As demand for HBM3 grows, the company therefore stands to benefit the most. Last year SK Hynix commanded 50% of HBM shipments, followed by Samsung with 40% and Micron with 10%. This year the company will solidify its position and control 53% of HBM shipments, whereas Samsung's and Micron's shares will decline to 38% and 9%, respectively, TrendForce claims.
Today, AI servers are used primarily by the leading U.S. cloud service providers, including AWS, Google, Meta, and Microsoft. As more companies launch generative AI products, they will inevitably have to run AI servers either on premises or through cloud providers such as AWS or Microsoft. Baidu and ByteDance, for example, plan to introduce generative AI products and services in the coming quarters.
Source: TrendForce
from AnandTech https://ift.tt/4n9SyrN