Newsy Today
news of today
AI Boom Drives DRAM Shortage & Skyrocketing Memory Prices | IEEE Spectrum

by Chief Editor February 11, 2026

AI’s Insatiable Appetite: Why Memory Chip Prices Are Soaring

If it feels like everything in technology revolves around AI these days, you’re not wrong. Nowhere is this more apparent than in the market for DRAM (Dynamic Random-Access Memory), the type of computer memory crucial for powering GPUs and other accelerators in AI data centers. Demand is skyrocketing, diverting supply from other sectors and causing prices to surge.

The DRAM Price Explosion: A Current Snapshot

DRAM prices have already risen 80-90% this quarter, according to Counterpoint Research. This trend is expected to continue, with potential increases of 30% in Q4 2025 and another 20% in early 2026. This follows a 50% price increase earlier in 2025. The largest AI hardware companies have secured their chip supply well into the future – some as far out as 2028 – leaving PC and consumer electronics manufacturers scrambling for limited availability and facing inflated costs.
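Those sequential increases compound rather than add. A quick sketch using the figures cited above (the 80% low end of this quarter's range is assumed for illustration; this is arithmetic on reported numbers, not a forecast):

```python
# Illustrative compounding of the price moves cited in the article:
# +50% earlier in 2025, +80% this quarter (low end of the 80-90% range),
# a potential +30% in Q4 2025, and another +20% in early 2026.
increases = [0.50, 0.80, 0.30, 0.20]

price = 1.0  # normalized starting price
for pct in increases:
    price *= 1 + pct  # each rise applies to the already-raised price

print(f"Cumulative multiplier: {price:.2f}x")  # prints "Cumulative multiplier: 4.21x"
```

In other words, if all four moves land, a buyer would pay roughly four times the starting price, not the ~180% a naive sum of the percentages suggests.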

HBM: The Core of the Problem

The primary driver behind this surge is high-bandwidth memory (HBM). HBM utilizes 3D chip packaging, stacking multiple DRAM dies to increase bandwidth and performance. Each HBM chip can contain up to 12 thinned-down DRAM chips, connected by microscopic solder balls. This complex technology is positioned close to the GPU or AI accelerator, minimizing the “memory wall” – the bottleneck in transferring data between the processor and memory.
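The bandwidth advantage of that wide, stacked interface is easy to estimate: peak bandwidth is roughly interface width times per-pin data rate. A minimal sketch comparing nominal HBM3-class and DDR5-class figures (the per-pin rates are typical spec values used as assumptions here; real products vary):

```python
def stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# HBM3-class stack: 1024-bit interface at ~6.4 Gb/s per pin.
# Typical DDR5 module: 64-bit interface at ~4.8 Gb/s per pin.
hbm3 = stack_bandwidth_gbs(1024, 6.4)  # ~819 GB/s per stack
ddr5 = stack_bandwidth_gbs(64, 4.8)    # ~38 GB/s per module

print(f"HBM3 stack: {hbm3:.0f} GB/s, DDR5 module: {ddr5:.0f} GB/s")
```

The 1024-bit interface, made practical by stacking dies directly beside the processor, is what lets one HBM stack deliver on the order of twenty times the bandwidth of a conventional memory module.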

HBM is significantly more expensive than other types of memory, generally costing three times as much and accounting for 50% or more of the total GPU cost. As AI models grow in size, the importance of HBM increases, further straining supply and driving up prices.

A History of Boom and Bust, Amplified by AI

The DRAM industry is historically cyclical, characterized by periods of boom and bust. Building new fabrication plants (fabs) costs upwards of $15 billion and takes over a year and a half to become operational, meaning capacity often lags behind demand. The current situation is a collision of this cyclical nature and the unprecedented scale of the AI infrastructure build-out.

The COVID-19 pandemic initially triggered a chip supply panic, leading hyperscalers like Amazon, Google, and Microsoft to stockpile memory and storage, boosting prices. However, a subsequent slowdown in 2022 and 2023 caused prices to plummet, even prompting some companies like Samsung to cut production by 50% to avoid losses. Investment in new production capacity remained limited throughout 2024 and most of 2025.

The Data Center Boom Fuels Demand

Currently, nearly 2,000 new data centers are planned or under construction globally, representing a potential 20% increase in overall capacity. McKinsey predicts that companies will spend $7 trillion by 2030, with $5.2 trillion dedicated to AI-focused data centers. This massive investment is heavily reliant on GPUs and HBM.

Nvidia, a leading GPU manufacturer, has seen its data center revenue soar from barely $1 billion in late 2019 to $51 billion in October 2025. Its GPUs are demanding ever more DRAM and more HBM chips. Micron, a major DRAM producer, reported that HBM and other cloud-related memory accounted for nearly 50% of its DRAM revenue in 2025, up from 17% in 2023.

What Does the Future Hold for DRAM Supply?

Addressing the supply shortage requires both innovation and increased production capacity. Micron, Samsung, and SK Hynix are all investing in new fabs and facilities, but these won’t significantly impact prices for several years. Micron is building an HBM fab in Singapore (production in 2027), retooling a fab in Taiwan (production in late 2027), and constructing a DRAM fab complex in New York (full production by 2030). Samsung plans to start production at a new plant in South Korea in 2028, and SK Hynix is building HBM and packaging facilities in Indiana (production by 2028) and an HBM fab in Cheongju (completion in 2027).

Intel CEO Lip-Bu Tan recently stated that there will be “no relief until 2028.” Further gains will come from process learning, better DRAM stacking efficiency, and tighter coordination between memory suppliers and AI chip designers.

Technologies like advanced packaging, hybrid bonding, and increasing the number of dies per HBM stack (potentially up to 16 or even 20) are also being explored to improve performance and efficiency.
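Stack height translates directly into per-stack capacity. A rough sketch, assuming 24-gigabit DRAM dies (a common current density, used here as an assumption; actual die densities vary by generation and vendor):

```python
def stack_capacity_gb(dies: int, die_density_gbit: int) -> float:
    """Capacity of one HBM stack in gigabytes: dies x die density (Gbit) / 8."""
    return dies * die_density_gbit / 8

# Capacity at today's common stack heights and a prospective 16-high stack,
# assuming 24-Gbit dies throughout.
for dies in (8, 12, 16):
    print(f"{dies}-high stack: {stack_capacity_gb(dies, 24):.0f} GB")
# prints:
# 8-high stack: 24 GB
# 12-high stack: 36 GB
# 16-high stack: 48 GB
```

This is why moving from 12-high to 16-high stacks matters so much for large AI models: every extra die in the stack adds capacity without widening the GPU package footprint.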

FAQ: DRAM Shortage and Pricing

Q: How long will DRAM prices remain high?
A: Experts predict high prices will persist until at least 2028, with potential for continued increases in the short term.

Q: What is HBM and why is it so important?
A: HBM (High-Bandwidth Memory) is a 3D-stacked memory technology that provides significantly faster data transfer rates, crucial for AI applications.

Q: Will new fabs solve the shortage?
A: New fabs are being built, but they take years to become fully operational, meaning they won’t provide immediate relief.

Q: What can consumers expect?
A: Consumers can expect to see higher prices for devices that rely on DRAM, such as PCs and gaming consoles.

Did you know? The DRAM industry’s cyclical nature means periods of oversupply often follow periods of shortage, but the current AI-driven demand is unprecedented.

Pro Tip: Keep an eye on announcements from major DRAM manufacturers like Micron, Samsung, and SK Hynix for updates on production capacity and technology advancements.

Stay informed about the latest developments in the memory market. Explore our other articles on semiconductor technology and the impact of AI on the tech industry.

Business

SK Hynix overtakes Samsung in annual profits for the first time

by Chief Editor January 29, 2026

SK Hynix’s AI Memory Triumph: A Glimpse into the Future of Chipmaking

The recent news that SK Hynix surpassed Samsung in operating profit for 2025, largely fueled by its dominance in High-Bandwidth Memory (HBM), isn’t just a South Korean tech story – it’s a bellwether for the future of the semiconductor industry. This shift highlights the growing importance of specialized memory chips in the age of Artificial Intelligence (AI). But what does this mean for the broader tech landscape, and what challenges and opportunities lie ahead?

The HBM Advantage: Why AI Needs Specialized Memory

Traditional memory chips aren’t optimized for the massive data processing demands of AI. HBM, however, is designed to deliver significantly faster speeds and greater bandwidth. This is crucial for AI applications like large language models (LLMs), machine learning, and high-performance computing. Nvidia, a leading AI chip designer, is a key driver of HBM demand, and SK Hynix has secured a significant portion of Nvidia’s HBM supply – a strategic advantage that’s paying off handsomely.

Did you know? HBM is stacked vertically, allowing for more memory in a smaller space and reducing the distance data needs to travel, resulting in faster processing speeds.

Beyond HBM3: The Race to HBM4 and Beyond

The current generation, HBM3, is already a game-changer. However, the industry is rapidly moving towards HBM4, promising even greater performance gains. Samsung is actively working to catch up, aiming to address quality issues that hampered its previous efforts and deliver competitive HBM4 products this year. Analysts predict a tighter race between SK Hynix and Samsung in the HBM4 arena, with Micron also attempting to gain ground.

The evolution doesn’t stop at HBM4. Research is already underway on future generations, exploring new materials and architectures to further enhance memory performance. Expect to see innovations like 3D stacking and advanced packaging techniques becoming increasingly important.

The Broader DRAM Market: A Shifting Landscape

SK Hynix’s success isn’t limited to HBM. The company also outperformed Samsung in the broader DRAM market, which encompasses the memory chips used in PCs, servers, and data centers. This suggests a broader trend of SK Hynix gaining market share across multiple memory segments. This is partially attributable to SK Hynix’s focused strategy – concentrating almost exclusively on memory chips, allowing for greater specialization and efficiency compared to Samsung’s diversified portfolio.

Competition Heats Up: Micron’s Role and Emerging Players

While SK Hynix and Samsung currently dominate the HBM market, Micron is actively investing in HBM technology and is expected to become a more significant competitor. Furthermore, Chinese memory chip manufacturers are also making strides, albeit facing challenges related to technology access and geopolitical factors. This increased competition will likely drive down prices and accelerate innovation.

Pro Tip: Keep an eye on advancements in chiplet technology. Chiplets – small, modular chips – can be combined to create more powerful and customized processors, potentially reducing reliance on monolithic HBM solutions.

The Impact on AI Infrastructure and Cloud Computing

The availability of high-performance memory like HBM is critical for the continued growth of AI infrastructure. Cloud providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud are heavily investing in AI-optimized servers, driving demand for HBM. This, in turn, is fueling the growth of the entire AI ecosystem, from software development to data analytics.

The demand for HBM is also impacting the design of data centers. Data centers are increasingly adopting liquid cooling and other advanced cooling technologies to manage the heat generated by high-performance processors and memory.

Future Trends to Watch

  • Computational Memory: Integrating processing capabilities directly into memory chips, reducing data movement and improving performance.
  • Persistent Memory: Combining the speed of memory with the non-volatility of storage, enabling faster boot times and improved application performance.
  • Advanced Packaging: Developing new packaging techniques to improve chip density and reduce latency.
  • AI-Driven Chip Design: Utilizing AI algorithms to optimize chip design and improve performance.

FAQ

Q: What is HBM?
A: High-Bandwidth Memory is a high-performance RAM interface for 3D-stacked synchronous dynamic random-access memory (SDRAM). It’s designed for applications requiring high bandwidth, like AI and high-performance computing.

Q: Why is HBM important for AI?
A: AI models require massive amounts of data to be processed quickly. HBM provides the necessary bandwidth and speed to handle these workloads efficiently.

Q: What is the difference between HBM3 and HBM4?
A: HBM4 offers significantly higher bandwidth and capacity compared to HBM3, enabling even more powerful AI applications.

Q: Who are the major players in the HBM market?
A: Currently, SK Hynix and Samsung are the leading players, with Micron also gaining traction.

Q: Will HBM become more affordable in the future?
A: Increased competition and advancements in manufacturing processes are expected to drive down HBM prices over time.

What are your thoughts on the future of AI memory? Share your insights in the comments below!

Explore more articles on semiconductor technology and artificial intelligence on our website.

Subscribe to our newsletter for the latest updates on the tech industry!

