AI’s Insatiable Appetite: Why Memory Chip Prices Are Soaring
If it feels like everything in technology revolves around AI these days, you’re not wrong. Nowhere is this more apparent than in the market for DRAM (Dynamic Random-Access Memory), the type of computer memory crucial for powering GPUs and other accelerators in AI data centers. Demand is skyrocketing, diverting supply from other sectors and causing prices to surge.
The DRAM Price Explosion: A Current Snapshot
DRAM prices have already risen 80-90% this quarter, according to Counterpoint Research. This trend is expected to continue, with potential increases of 30% in Q4 2025 and another 20% in early 2026. This follows a 50% price increase earlier in 2025. The largest AI hardware companies have secured their chip supply well into the future – some as far out as 2028 – leaving PC and consumer electronics manufacturers scrambling for limited availability and facing inflated costs.
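To see how these back-to-back increases compound, here is a quick back-of-the-envelope sketch in Python. It treats the article's figures as sequential, independent increases applied to a normalized baseline price (a simplifying assumption; the quoted ranges may overlap in practice):

```python
# Illustrative compounding of the DRAM price increases cited above.
# Percentages are the article's rough estimates, not exact market data.
increases = [0.50, 0.85, 0.30, 0.20]  # early 2025, ~80-90% this quarter (midpoint), Q4 2025, early 2026

price = 1.0  # normalized baseline price
for pct in increases:
    price *= 1 + pct

print(f"Cumulative multiplier: {price:.2f}x")  # prints "Cumulative multiplier: 4.33x"
```

In other words, if every projected increase lands, a buyer could be paying more than four times the pre-2025 baseline by early 2026.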
HBM: The Core of the Problem
The primary driver behind this surge is high-bandwidth memory (HBM). HBM utilizes 3D chip packaging, stacking multiple DRAM dies to increase bandwidth and performance. Each HBM chip can contain up to 12 thinned-down DRAM chips, connected by microscopic solder balls. This complex technology is positioned close to the GPU or AI accelerator, minimizing the “memory wall” – the bottleneck in transferring data between the processor and memory.
HBM is significantly more expensive than other types of memory, generally costing three times as much and accounting for 50% or more of the total GPU cost. As AI models grow in size, the importance of HBM increases, further straining supply and driving up prices.
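A rough bill-of-materials sketch shows why that cost share matters. The per-gigabyte ratio and cost fraction below come from the article; the 144 GB capacity figure is a hypothetical stand-in for a modern accelerator, not a quoted spec:

```python
# Hypothetical accelerator cost breakdown using the article's ratios:
# HBM costs ~3x conventional DRAM and makes up ~50% of total GPU cost.
dram_cost_per_gb = 1.0                      # normalized unit cost
hbm_cost_per_gb = 3.0 * dram_cost_per_gb    # ~3x conventional DRAM

hbm_capacity_gb = 144                       # assumed on-package HBM capacity
hbm_cost = hbm_capacity_gb * hbm_cost_per_gb

total_gpu_cost = hbm_cost / 0.5             # if HBM is ~50% of the total
print(f"HBM cost: {hbm_cost:.0f} units, implied total GPU cost: {total_gpu_cost:.0f} units")
```

The takeaway: when memory is half the bill of materials, every swing in DRAM pricing flows almost directly into accelerator prices.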
A History of Boom and Bust, Amplified by AI
The DRAM industry is historically cyclical, characterized by periods of boom and bust. Building new fabrication plants (fabs) costs upwards of $15 billion and takes over a year and a half to become operational, meaning capacity often lags behind demand. The current situation is a collision of this cyclical nature and the unprecedented scale of the AI infrastructure build-out.
The COVID-19 pandemic initially triggered a chip supply panic, leading hyperscalers like Amazon, Google, and Microsoft to stockpile memory and storage, boosting prices. However, a subsequent slowdown in 2022 and 2023 caused prices to plummet, even prompting some companies like Samsung to cut production by 50% to avoid losses. Investment in new production capacity remained limited throughout 2024 and most of 2025.
The Data Center Boom Fuels Demand
Currently, nearly 2,000 new data centers are planned or under construction globally, representing a potential 20% increase in overall capacity. McKinsey predicts that companies will spend $7 trillion by 2030, with $5.2 trillion dedicated to AI-focused data centers. This massive investment is heavily reliant on GPUs and HBM.
Nvidia, a leading GPU manufacturer, has seen its data center revenue soar from barely $1 billion in late 2019 to $51 billion in October 2025. Each generation of its GPUs demands more DRAM and more HBM chips. Micron, a major DRAM producer, reported that HBM and other cloud-related memory accounted for nearly 50% of its DRAM revenue in 2025, up from 17% in 2023.
What Does the Future Hold for DRAM Supply?
Addressing the supply shortage requires both innovation and increased production capacity. Micron, Samsung, and SK Hynix are all investing in new fabs and facilities, but these won’t significantly impact prices for several years. Micron is building an HBM fab in Singapore (production in 2027), retooling a fab in Taiwan (production in late 2027), and constructing a DRAM fab complex in New York (full production by 2030). Samsung plans to start production at a new plant in South Korea in 2028, and SK Hynix is building HBM and packaging facilities in Indiana (production by 2028) and an HBM fab in Cheongju (completion in 2027).
Intel CEO Lip-Bu Tan recently stated that there will be “no relief until 2028.” Further gains will come from process learning, better DRAM stacking efficiency, and tighter coordination between memory suppliers and AI chip designers.
Technologies like advanced packaging, hybrid bonding, and increasing the number of dies per HBM stack (potentially up to 16 or even 20) are also being explored to improve performance and efficiency.
FAQ: DRAM Shortage and Pricing
Q: How long will DRAM prices remain high?
A: Experts predict high prices will persist until at least 2028, with potential for continued increases in the short term.
Q: What is HBM and why is it so important?
A: HBM (High-Bandwidth Memory) is a 3D-stacked memory technology that provides significantly faster data transfer rates, crucial for AI applications.
Q: Will new fabs solve the shortage?
A: New fabs are being built, but they take years to become fully operational, meaning they won’t provide immediate relief.
Q: What can consumers expect?
A: Consumers can expect to see higher prices for devices that rely on DRAM, such as PCs and gaming consoles.
Did you know? The DRAM industry’s cyclical nature means periods of oversupply often follow periods of shortage, but the current AI-driven demand is unprecedented.
Pro Tip: Keep an eye on announcements from major DRAM manufacturers like Micron, Samsung, and SK Hynix for updates on production capacity and technology advancements.
Stay informed about the latest developments in the memory market. Explore our other articles on semiconductor technology and the impact of AI on the tech industry.
