2024-12-26
HBM4 race heats up among memory giants: customization and DRAM scaling at the crux

Micron is aggressively expanding its footprint in the high-bandwidth memory (HBM) market, capitalizing on surging AI chip demand to accelerate growth. The expansion is backed by up to US$6.165 billion in CHIPS Act funding that the company is set to receive from the US Department of Commerce by the end of 2024. To strengthen its market position, it plans to expand production in the US and Malaysia, aiming for a 20% share of the HBM market by 2025.

Meanwhile, Broadcom's custom chips are attracting attention as AI inference demands surge. Tesla has reportedly requested HBM4 samples from Samsung and SK Hynix, while Samsung is developing tailored HBM4 solutions for Meta and Microsoft.

  • HBM4 era: custom solutions and strategic partnerships redefine AI chip design

The transition from HBM3E to HBM4 and HBM4E marks a new standard in AI chip design, driven primarily by customized HBM solutions. SK Hynix and Micron have bolstered their competitive positions by securing TSMC's advanced process nodes for base die production.

While HBM3E uses DRAM-based 2.5D/3D-stacked dies, HBM4 adopts a 3D-stacked logic-die architecture, allowing customers to integrate proprietary IP for enhanced performance and customization. Notably, base dies for HBM4 are now produced by wafer foundries, not memory makers. Experts highlight 3D die stacking and customization as pivotal for success in the HBM4 era.

In a significant development in December, Marvell partnered with Micron, Samsung, and SK Hynix to create a custom HBM architecture, designed to enhance memory bandwidth and chip capacity while reducing power consumption.

  • TSMC partnerships bolster SK Hynix and Micron in HBM4 race

SK Hynix and Micron are collaborating with TSMC to develop HBM4 and customized HBM4E using 3nm base dies. In contrast, Samsung is reportedly relying on its in-house 4nm foundry process for HBM4 base dies.

Analysts believe TSMC's partnerships with SK Hynix and Micron could give them a competitive edge, highlighting the need for closer collaboration between Samsung's memory division and Samsung Foundry.

  • HBM4: SK Hynix, Micron, and Samsung vie for leadership

SK Hynix aims to debut its HBM4 prototype by March 2025, with mass production for Nvidia set for later that year. Meanwhile, Micron is likely to adopt fluxless bonding, while Samsung is focusing on 1c-process DRAM.

According to TrendForce, Micron is testing fluxless bonding for HBM4 with partners to address DRAM spacing challenges and increase stack layers, targeting revenue contributions by 2026.

Samsung has adopted 10nm-class 1c DRAM for HBM4, prioritizing yield improvements, while SK Hynix and Micron have opted for 10nm-class 1b DRAM. Analysts see Samsung's focus on 1c DRAM yields as crucial to reclaiming competitiveness in the HBM market.

  • HBM3E: SK Hynix and Micron lead the charge

SK Hynix and Micron have emerged as leaders in the HBM3E market. Micron has ramped up output, reaching mass production and delivery of 12-layer HBM3E, while growing ASIC demand positions the company to expand its customer base.

Micron's first-quarter fiscal 2025 report shows HBM revenue more than doubled compared to the previous quarter, with full-year revenue expected to reach US$5 billion. Shipments to its second- and third-largest HBM customers began in December and will continue into early 2025.

SK Hynix, which has outpaced Samsung's slower HBM3E progress, unveiled its 48GB 16-layer HBM3E at the SK AI Summit 2024. Sample deliveries are expected in early 2025, paving the way for accelerated 16-layer HBM4 production.

If Samsung brings HBM3E and HBM4 to mass production faster, or if SK Hynix continues to outperform, competition could intensify and pressure Micron's market position and profitability.
