As hyperscale data centers rewrite the rules of the memory market, shortages could persist until 2027.
Strong server DRAM demand for AI data centers is driving memory prices higher throughout the market, as customers scramble to secure supply for their production needs amid fears of future shortages.
The DRAM market is in an AI-driven upcycle, with hyperscale data centers soaking up supply and pushing prices higher since Q3 2025. Because AI servers require far more DDR5 (and HBM) per system than traditional servers, availability is tightening across PCs, smartphones, and other end markets.
In this context, John Lorenz, Director, Memory & Computing at Yole Group, highlights a key driver of today’s price dynamics: fear of future scarcity. As DRAM manufacturers prioritize higher-margin HBM and server-grade DDR5, other segments react defensively, often buying ahead, which amplifies shortages and pushes spot prices higher.

At Yole Group, the memory team tracks these structural changes across the value chain, from technology roadmaps covering DDR5, LPDDR, HBM and more, to supply capacity, pricing mechanisms and end-market demand. Drawing on perspectives from leading memory experts, Yole Group’s related analyses quantify how hyperscaler behavior, manufacturing constraints and long fab lead times could sustain market tightness and elevated pricing well into 2027. Enjoy reading this snapshot!
The latest price upswing started during the third quarter of 2025, when DRAM prices climbed by 13.5% quarter over quarter. While the DRAM market can be volatile, with price changes of 15-20% in the past, the rally came on top of a strong rebound from 2023 through late 2024 and early 2025. That suggested the market had reached a cyclical peak and was poised for a downturn. Instead, early signals from company earnings suggest prices may have jumped a further 30% in the fourth quarter.
Spot prices for DDR5 used in servers have surged by as much as 100% in some cases. PC makers are already feeling the impact: Hewlett Packard and Dell have warned they may remove certain laptop models from their line-ups next year, either because DRAM has become too expensive or because they are concerned they will not be able to procure enough.
AI infrastructure is redrawing the DRAM demand curve
At the heart of the imbalance is the AI infrastructure buildout. Data center operators are buying AI accelerators at scale, along with the general-purpose servers needed to run them. AI accelerators rely on high-bandwidth memory (HBM), while the host servers consume large volumes of standard DDR5.
A single AI server configured with eight accelerators, each with 200GB of HBM, contains around 1.6TB of HBM and roughly 3TB of DDR5. By comparison, a typical non-AI server built in 2025 uses less than 1TB of DRAM in total. This rapid increase in memory content per system is outpacing supply.
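As a back-of-the-envelope illustration of that arithmetic, using only the figures quoted above (the 3TB DDR5 and sub-1TB non-AI figures are the article's approximate estimates, not measured values):

```python
# Illustrative arithmetic only, built from the figures quoted in the article.
ACCELERATORS_PER_SERVER = 8
HBM_PER_ACCELERATOR_GB = 200       # GB of HBM per accelerator
DDR5_PER_AI_SERVER_TB = 3.0        # article's rough estimate for the host server
NON_AI_SERVER_DRAM_TB = 1.0        # upper bound for a typical 2025 non-AI server

hbm_tb = ACCELERATORS_PER_SERVER * HBM_PER_ACCELERATOR_GB / 1000
total_ai_server_tb = hbm_tb + DDR5_PER_AI_SERVER_TB

print(f"HBM per AI server:        {hbm_tb:.1f} TB")
print(f"Total DRAM per AI server: {total_ai_server_tb:.1f} TB")
print(f"Content multiplier vs non-AI server: "
      f">{total_ai_server_tb / NON_AI_SERVER_DRAM_TB:.1f}x")
```

On these numbers, one AI server carries over 4.6x the DRAM content of a conventional server, which is the multiplier driving the supply squeeze.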
HBM further distorts the market: it commands far higher prices and margins than DDR5, giving manufacturers strong incentives to prioritize it. Producing HBM can take up to four times as many wafers per gigabyte as DDR5, meaning that shifting capacity to increase HBM output reduces the capacity available for conventional server memory.
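That tradeoff can be sketched as follows; the 4x wafer intensity is the article's upper bound, and the 100-unit wafer allocation is a hypothetical figure chosen purely for illustration:

```python
# Hypothetical sketch of HBM's wafer intensity (up to ~4x wafers per GB vs DDR5).
HBM_WAFER_INTENSITY = 4.0  # article's upper-bound ratio, wafers/GB vs DDR5

# Suppose a fab reallocates wafers that previously yielded 100 units of DDR5 bits.
ddr5_bit_units_forgone = 100.0
hbm_bit_units_gained = ddr5_bit_units_forgone / HBM_WAFER_INTENSITY

print(f"Reallocating wafers worth {ddr5_bit_units_forgone:.0f} DDR5 bit-units "
      f"yields only {hbm_bit_units_gained:.0f} HBM bit-units.")
```

In other words, at the 4x extreme, every gigabyte of HBM produced removes up to four gigabytes of potential DDR5 from the market.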
The effects are rippling into other end markets. Automotive applications typically use LPDDR4 and LPDDR5, the same memory found in smartphones, tablets and laptops. But automotive remains a strategic market for memory suppliers, particularly as self-driving vehicles push memory content higher, so they are unlikely to cut the industry off. They do, however, hold the upper hand, and can charge automotive customers more for continued supply.
That dynamic helps explain strategic moves such as Micron’s decision to wind down its Crucial consumer business, reflecting a focus on higher-margin, AI-driven demand rather than direct-to-consumer products.
Outside the data center, smartphones account for around 25% of global DRAM bit demand, while PCs represent roughly 10–11%. Consumer electronics, beyond phones and PCs, including gaming devices and wearables, add another 6%. Automotive accounts for about 5%, and industrial, medical and military uses combined roughly 4%.
Data centers dominate, representing around 50% of total DRAM bit demand. AI workloads alone account for roughly 30% of that total (HBM and non-HBM), giving them outsized influence over pricing.
Hyperscaler demand increasingly sets DRAM pricing
History shows how quickly DRAM cycles can turn. Between 2014 and 2016, prices fell in response to flat demand, prompting Android-based smartphone manufacturers, especially in China, to compete by increasing memory content. That additional demand absorbed excess supply and pushed prices higher, until costs squeezed margins and vendors paused content growth or shifted toward lower-spec models.
This time, the usual self-correcting mechanism, where high prices trigger pullbacks in demand, has not yet materialized. Hyperscalers and server manufacturers are far less price-sensitive than consumer device makers and are willing to pay up to secure DRAM supply to remain competitive in the AI race, keeping prices elevated for everyone else.
On the supply side, relief is structurally constrained by long lead times. Building or expanding a DRAM fab typically takes 2-3 years to reach volume production. Some incremental supply is expected in 2026, but its impact will be limited.
China’s CXMT is adding capacity but mainly serves domestic customers and has yet to meet the requirements of leading global buyers. Samsung is adding equipment at its P4 facility but is prioritizing HBM rather than broader DRAM supply. SK hynix’s M15X fab should begin contributing output in the second half of 2026, with more meaningful volumes in 2027, while Micron’s new Boise fab is also expected to add supply in 2027.
Until then, pricing pressure is unlikely to ease ahead of large-scale capacity additions unless smartphone and PC makers slow memory content growth or AI infrastructure spending moderates.
As AI infrastructure continues to reshape memory demand, DRAM pricing will remain a key watchpoint for the entire electronics ecosystem, well beyond the data center. Understanding how technology transitions, supply allocation, and hyperscaler procurement strategies interact is essential to anticipate risk and opportunity across markets.
To stay ahead, follow Yole Group and explore the memory-focused products and analyses for data-driven perspectives on pricing, capacity, and end-market impacts. And stay tuned throughout 2026: analysts will be sharing fresh insights via Yole Group’s events program, new articles, and expert webinars, bringing you timely updates, deep dives, and actionable takeaways as the market evolves!
About the author
John Lorenz is Director, Memory & Computing at Yole Group.
He leads the growth of the team’s technical expertise and market intelligence, while managing key business relationships with industry leaders. John also drives the development of Yole Group’s market research and strategy consulting activities focused on memory and computing technologies and markets.
Having joined Yole Group’s computing team in 2019, John brings deep insight into leading-edge semiconductor manufacturing to the division, which has delivered over 100 market and technology analyses for industrial groups, start-ups, and research institutes.
Before joining Yole Group, John spent 15 years at Micron Technology in R&D/manufacturing, engineering, and strategic planning roles, gaining experience across the memory and computing industries.
He holds a Bachelor of Science in Mechanical Engineering from the University of Illinois Urbana-Champaign (USA), where he specialized in MEMS devices.

