Jukan@jukan05
[EXCLUSIVE] Samsung Electronics Breaks Into OpenAI — Sole Supplier of 800 Million Gb HBM4
Samsung Electronics will become the first and sole supplier of next-generation High Bandwidth Memory 4 (HBM4) to OpenAI, the world’s largest artificial intelligence company. OpenAI plans to integrate Samsung’s HBM4 into its first-generation in-house AI chip, codenamed “Titan.” With this win following its earlier HBM4 supply agreement with NVIDIA, Samsung is being credited with cementing its leadership in the advanced AI chip market.
According to industry sources on the 19th, Samsung Electronics has agreed to supply OpenAI with up to 800 million gigabits (Gb) of HBM4 (12-layer product) in the second half of this year. That volume represents approximately 7% of Samsung’s total planned HBM output for the year (over 11 billion Gb), and roughly 15% of its HBM4-specific production (approximately 5.5 billion Gb). The allocation is understood to be the third largest after NVIDIA and AMD.
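The reported shares follow directly from the volumes in the article; a minimal sketch checking that arithmetic (all figures are the article's, none are assumed):

```python
# Sanity check of the reported supply shares, using the article's figures:
# 800 million Gb supplied to OpenAI, out of ~11 billion Gb of total planned
# HBM output and ~5.5 billion Gb of HBM4-specific production.
supply_gb = 800e6          # OpenAI allocation, in gigabits
total_hbm_gb = 11e9        # total planned HBM output for the year
hbm4_gb = 5.5e9            # HBM4-specific production

share_total = supply_gb / total_hbm_gb   # fraction of total HBM output
share_hbm4 = supply_gb / hbm4_gb         # fraction of HBM4 production

print(f"{share_total:.1%} of total HBM, {share_hbm4:.1%} of HBM4")
# → 7.3% of total HBM, 14.5% of HBM4
```

Both values round to the "approximately 7%" and "roughly 15%" stated above.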
The HBM4 Samsung delivers will be paired with OpenAI’s first in-house AI chip, Titan Gen 1, developed in partnership with Broadcom. TSMC is expected to begin production in Q3, with a launch targeted for year-end.
The deal is seen as particularly significant given that OpenAI — the company that ignited the generative AI boom with the launch of ChatGPT in 2022 — selected Samsung as its inaugural HBM supplier. OpenAI also sits at the center of the United States’ Stargate Project, a planned $500 billion AI infrastructure initiative.
First Fruits After the JY Lee–Altman Meeting… AI Chip Orders Keep Coming
OpenAI operates hundreds of thousands of AI chips across its data centers to deliver generative AI services, having relied heavily on NVIDIA’s general-purpose AI semiconductors. More recently, the company concluded that it needed its own custom chips optimized for inference workloads — a trend that has become central to next-generation AI development.
An industry insider noted that “OpenAI has been investing heavily in R&D to successfully mass-produce its Titan chip,” adding that “Samsung satisfied the stringent HBM4 requirements that OpenAI set out, which is what made this deal possible.”
Given that OpenAI has chosen Samsung as its first-ever HBM supplier, observers believe Samsung HBM is likely to be incorporated into future Titan generations as well.
HBM stacks multiple DRAM dies vertically — similar to floors in an apartment building — delivering greater capacity and faster data transfer speeds than conventional DRAM, making it the memory of choice for AI applications. Micron Technology has projected that the global HBM market will grow from approximately $35 billion in revenue last year to around $100 billion by 2028.
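To put the stacking description and the 800 million Gb figure in perspective, here is a back-of-the-envelope sketch. The per-die density is an assumption on my part (the article does not state it); 24 Gb is a commonly cited figure for current-generation HBM DRAM dies, and the 12-layer stack height is from the article:

```python
# Back-of-the-envelope: what a 12-layer stack holds, and roughly how many
# stacks the reported allocation corresponds to.
die_gb = 24                 # gigabits per DRAM die (ASSUMED, not from article)
layers = 12                 # 12-layer stack, per the article

stack_gb = die_gb * layers  # capacity of one stack, in gigabits
stack_gbytes = stack_gb / 8 # same capacity in gigabytes

supply_gb = 800e6           # OpenAI allocation from the article, in gigabits
stacks = supply_gb / stack_gb

print(f"one stack: {stack_gb} Gb ({stack_gbytes:.0f} GB)")
print(f"~{stacks / 1e6:.1f} million stacks in the allocation")
```

Under that assumed die density, one 12-layer stack works out to 36 GB, and the allocation to roughly 2.8 million stacks; with a different die density the stack count scales accordingly.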
Samsung’s HBM business had a difficult stretch through last year, suffering back-to-back failures in NVIDIA’s qualification tests for HBM3 and HBM3E — a significant blow to the pride of the world’s top memory chipmaker. The turnaround came in May 2024, when Vice Chairman Jeon Young-hyun made the bold decision to redesign the DRAM at the core of Samsung’s HBM.
This year, Samsung passed NVIDIA’s HBM4 qualification without a single design revision, enabling direct shipment of mass-production units. On March 18th, AMD announced it had designated Samsung as its preferred HBM4 supplier. Demand for the prior-generation HBM3E is also said to be surging. Samsung is reportedly targeting over 5 billion Gb of HBM3E for Google’s Tensor Processing Units (TPUs) — also developed in partnership with Broadcom — in addition to its NVIDIA supply commitments.
Behind the OpenAI HBM4 deal, JY Lee, Chairman of Samsung Electronics, reportedly played a pivotal role. In October of last year, Lee met with OpenAI CEO Sam Altman and exchanged a Letter of Intent (LOI) covering the supply of cutting-edge AI memory including HBM, laying the groundwork for the agreement that has now come to fruition.