
Generative AI, High Performance Computing to Fuel High Bandwidth Memory Market Growth

This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group.

The market for high bandwidth memory (HBM) is poised for rapid growth over the next five years, Yole Group announces in its latest analyses. According to the analysts, this growth is led by the continuous expansion of data-intensive artificial intelligence (AI) and high-performance computing (HPC) applications. As a result, the HBM sector will largely outgrow the overall DRAM market and remain undersupplied throughout 2024.

What are the latest innovations? What impact will they have on the ecosystem? Simone Bertolazzi and Emilie Jolivet, respectively Principal Analyst and Director of More Moore activities at Yole Group, offer a snapshot of this industry today.

This article is based on the key results of the Next-Generation DRAM 2024 – Focus on HBM and 3D DRAM report published this month. More information about Yole Group’s memory products is available on the Yole Group website, including specific teardowns: Nvidia H100 Tensor Core GPU and AMD 3D V-Cache with TSMC SoIC 3D Packaging.

The rapid rise of generative AI has boosted demand for high-speed DDR5 DRAM and HBM technologies in the data center market. AI workloads are driving the need for higher bandwidth to increase data transfer rates between devices and processing units.

Hyperscalers and original equipment manufacturers (OEMs) are increasing their server capacity to support model training and inference, requiring more AI accelerators. This is in turn driving strong growth in HBMs associated with these accelerators. Demand for data center accelerators exceeded four million units in 2023 and is poised to nearly double in 2024.

At Yole Group, Simone Bertolazzi and Emilie Jolivet estimate that, following an impressive 93% year-on-year increase in bit shipments, data center DRAM bit demand could grow at a compound annual growth rate (CAGR) of 25% over 2023-2029, driven by 39% growth in DRAM for AI servers over that period.

The share of HBM in overall DRAM bit shipments is forecast to rise from approximately 2% in 2023 to 6% by 2029, as AI server demand outpaces other applications. And because HBM is priced significantly higher than DDR5, HBM revenue is anticipated to climb from $14 billion in 2024 to $38 billion in 2029 – after having soared by more than 150% year on year from around $5.5 billion in 2023.
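The revenue figures above can be sanity-checked with a simple compound-annual-growth-rate calculation (a minimal sketch; the dollar figures are taken from the report, while the `cagr` helper function and variable names are our own illustration):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values over a number of years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# HBM revenue ~$5.5B (2023) -> $14B (2024): more than 150% year on year
yoy_2024 = 14.0 / 5.5 - 1.0

# $14B (2024) -> $38B (2029): roughly a 22% CAGR over five years
hbm_revenue_cagr = cagr(14.0, 38.0, 5)
```

This shows the report's forecast is internally consistent: a 2024 jump above 150% followed by a still-strong but decelerating ~22% annual growth rate through 2029.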

Competition intensifies as suppliers vie for leadership

“To take advantage of the new generative AI wave and to speed up the market recovery process, Samsung, SK Hynix and Micron have started diverting more of their wafer capacity to address HBM opportunities, leading to an overall bit production slowdown and accelerating the shift to undersupply for non-HBM products.”
Simone Bertolazzi, PhD.
Principal Analyst for Memory, Yole Group

Memory suppliers have ramped up their HBM wafer production, which Yole Group estimates grew from 44,000 wafers per month (WPM) in 2022 to 74,000 WPM in 2023 and could reach 151,000 WPM in 2024.

SK Hynix is leading the development and commercialization of HBM, but its competition with Samsung is becoming more intense. Micron has a relatively small market share compared with the South Korean companies but is ramping up its production to capitalize on the market opportunity.

AI demand accelerates rollout of new HBM generations

The need to increase bandwidth for HPC applications is accelerating suppliers’ roadmaps for the development and commercialization of new HBM generations. Companies are striving to meet their goals for the launch of each generation to be well-positioned for the strong demand to come.

SK Hynix gained a significant advantage with the introduction of HBM3 in the second half of 2022, and while this generation and its extended version HBM3E are still in the early stages of deployment, all three key players are planning to introduce HBM4 in 2026.

In addition to the anticipated growth in orders from customers such as NVIDIA, the likes of AMD, Google and Amazon plan to start manufacturing their own AI accelerators to power their AI-based applications.

“All the buyers are competing to be AI ready, to make sure they have the infrastructure to serve the needs of AI. That is why we see them spending a lot of money and buying servers equipped with AI chips and HBM memory at prices that might overestimate their value – they are strong enough to deal with the supply chain on their own.”
Emilie Jolivet
Business Line Director for More Moore activities, Yole Group

Rapid growth clouds market visibility

With existing suppliers expanding capacity and multiple new entrants buying HBM to leverage generative AI for new solutions, the market is changing rapidly – making it challenging to quantify.

The next several quarters will be a time for suppliers to recover from the extended market downturn. Yole Group expects that the full year 2024 and part of 2025 will be marked by undersupply and rising prices. Yole Group will continue monitoring the market to see how it evolves.

Stay tuned!
