Rob’s Educated Guesses
@RobEducated
Old hedge fund manager who swapped fundamental analysis for narrative watching. Data point junkie first, long-term investor next.

NVIDIA's Vera Rubin to Use Only Samsung and SK Hynix HBM4

Samsung Electronics and SK Hynix's sixth-generation High Bandwidth Memory (HBM4) will be incorporated into NVIDIA's next-generation AI accelerator "Vera Rubin," slated for release in the second half of this year. Micron, the world's third-largest memory company, has been excluded from the HBM4 supply chain for Vera Rubin — a result widely interpreted as recognition of Korean memory makers' technological superiority in key HBM4 performance metrics such as operating speed and bandwidth.

According to industry sources on March 8, Samsung Electronics and SK Hynix have been confirmed as suppliers on NVIDIA's Vera Rubin vendor list. The two companies have been provisionally selected as HBM4 suppliers for the flagship next-gen AI accelerator, whose performance hinges critically on this component.

HBM4 is an AI server memory chip manufactured by stacking 8 to 16 layers of advanced DRAM built on 11–13nm process nodes atop a base die, enabling rapid delivery of large volumes of data to the GPU in AI accelerators.

Micron, which currently supplies HBM3E to NVIDIA, has been dropped from the Vera Rubin HBM4 vendor list. The company is expected to supply HBM4 for mid-tier AI accelerators optimized for inference — such as the "Rubin CPX" — rather than for Vera Rubin itself.

The selection of Samsung and SK Hynix as the sole HBM4 suppliers reflects their ability to meet NVIDIA's performance and yield requirements. Samsung has effectively passed NVIDIA's bifurcated HBM4 qualification tests targeting operating speeds of 10Gb/s and 11Gb/s. SK Hynix is in the process of optimizing its product for the 11Gb/s test.

Samsung's foundry division has also won an order to produce NVIDIA's GPU "RTX 3060," which the company has decided to resume manufacturing, with production set to begin shortly on an 8nm process.
Only Samsung and SK Hynix at the Heart of NVIDIA's "Monster AI Chip" — Micron Eliminated

Since around 2022, when the AI era began, NVIDIA has been the company that determines the fate of memory semiconductor firms. Those that made it into NVIDIA's HBM supply chain became central players in the AI industry; those that didn't saw their standing diminish. Samsung was a prime example — the two-year crisis narrative that plagued Samsung Semiconductor was triggered by delays in HBM3 supply to NVIDIA and only subsided last September, when Samsung passed NVIDIA's HBM3E 12-layer qualification test.

Against this backdrop, the selection of Samsung and SK Hynix — and the exclusion of Micron — as Vera Rubin HBM4 suppliers is being assessed as highly significant. It opens the door to large-scale NVIDIA shipments over the next one to two years, securing their HBM dominance.

High-Spec HBM4 Orders

According to industry sources on March 8, Vera Rubin hardware will be unveiled for the first time at NVIDIA's developer conference GTC 2026, scheduled for March 16 in Silicon Valley. While no official launch date has been set, a second-half 2026 release is expected.

NVIDIA is putting everything into making Vera Rubin's performance more than five times greater than its predecessor's, in order to decisively outpace rivals including AMD and Broadcom. More than 80 partner companies worldwide are aligning with NVIDIA to support the launch of this "monster AI accelerator."

Since last year, NVIDIA has identified HBM4 as the critical component underpinning Vera Rubin's success, pushing memory companies to develop high-performance products. It demanded operating speeds exceeding 10Gb/s — well above the 8Gb/s JEDEC standard — for Vera Rubin's HBM4. Capacity has also been scaled up: Vera Rubin will incorporate 16 HBM4 stacks totaling 576GB, surpassing the 432GB HBM4 capacity in AMD's competing next-gen AI accelerator, the MI450.
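The capacity and speed figures above lend themselves to quick back-of-envelope arithmetic. The sketch below uses the article's numbers (16 stacks, 576GB total, 10Gb/s per-pin speed); the 2048-bit per-stack interface width comes from the JEDEC HBM4 specification, not from the article, so the bandwidth figures are illustrative only.

```python
# Back-of-envelope HBM4 math for Vera Rubin, using figures from the article
# plus one outside assumption: the JEDEC HBM4 interface width of 2048 bits
# per stack.

STACKS = 16             # HBM4 stacks per Vera Rubin (from the article)
TOTAL_GB = 576          # total HBM4 capacity in GB (from the article)
PIN_SPEED_GBPS = 10     # per-pin speed NVIDIA demanded, Gb/s (from the article)
PINS_PER_STACK = 2048   # JEDEC HBM4 interface width (assumption, not in article)

gb_per_stack = TOTAL_GB / STACKS                     # capacity per stack
stack_bw_gbs = PIN_SPEED_GBPS * PINS_PER_STACK / 8   # GB/s per stack
total_bw_tbs = stack_bw_gbs * STACKS / 1000          # aggregate TB/s

print(gb_per_stack)   # 36.0 GB per stack
print(stack_bw_gbs)   # 2560.0 GB/s per stack
print(total_bw_tbs)   # 40.96 TB/s aggregate
```

At 36GB per stack, each package would plausibly be a 12-high stack of 24Gb DRAM dies, consistent with the 8-to-16-layer range the article describes.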
Micron Eliminated from Vera Rubin

Global memory companies competed fiercely for Vera Rubin HBM4 supply contracts. Winning over NVIDIA — which commands over 80% share in the AI accelerator market, where HBM is heavily integrated — means both technological validation and guaranteed strong earnings.

The race for Vera Rubin HBM4 has narrowed to Samsung and SK Hynix. Micron does not appear on the vendor list. "Micron isn't even being discussed as a Vera Rubin HBM4 supplier," one industry source said.

Between the two, Samsung has recently pulled ahead. Samsung has effectively passed NVIDIA's bifurcated qualification tests for both 10Gb/s and 11Gb/s variants of HBM4, and began shipping finished products to NVIDIA last month, albeit in limited volumes. SK Hynix is continuing product optimization with NVIDIA to pass the 11Gb/s test. Given that the process from HBM4 DRAM wafer input to final packaging takes over six months, both companies are expected to begin full-scale HBM4 production as early as this month.

Micron is not entirely out of the HBM4 picture, however — it is likely to supply HBM4 for mid-tier products in the Rubin series rather than for Vera Rubin itself.

Commodity DRAM Pricing Is a Wild Card

Volume allocations and pricing for Vera Rubin HBM4 have yet to be finalized. Some industry observers suggest that while SK Hynix will retain more than half of NVIDIA's total HBM shipments — including HBM3E — this year, Samsung may emerge as the largest supplier when it comes to Vera Rubin HBM4 specifically. Samsung recently expressed strong confidence, stating that its HBM revenue this year will be triple that of last year.

A key variable is commodity DRAM pricing, which has been roughly doubling quarter-over-quarter. The per-Gb price of server DRAM modules such as SOCAMM2 has reportedly risen to approximately $1.30 — approaching the level of HBM3E, the flagship HBM product.
From Samsung's perspective, producing commodity DRAM — which does not require the additional expensive stacking processes that HBM4 demands — may be more profitable. Jensen Huang's meeting with SK Hynix engineers in Silicon Valley to encourage HBM4 development is also being interpreted as a move to check Samsung's growing negotiating leverage. "Samsung holds HBM4, commodity DRAM, and other products, giving it a diverse set of negotiating cards it can play with NVIDIA," one industry observer noted. $MU
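To make the pricing dynamic concrete: at roughly $1.30 per Gb (the article's figure), a commodity server module is already expensive, and continued quarter-over-quarter doubling compounds quickly. The module capacity and the number of projected quarters below are hypothetical, chosen only to illustrate the arithmetic.

```python
# Illustration of the commodity DRAM pricing dynamic described above.
# $1.30 per Gb is from the article; the 128GB module size and the
# two-quarter projection are hypothetical.

PRICE_PER_GBIT = 1.30   # USD per gigabit (from the article)
MODULE_GB = 128         # hypothetical server module capacity, gigabytes

# 8 gigabits per gigabyte
module_price = MODULE_GB * 8 * PRICE_PER_GBIT
print(round(module_price, 2))   # 1331.2 USD for the memory alone

# Per-Gb price if "roughly doubling quarter-over-quarter" continues
price = PRICE_PER_GBIT
for _ in range(2):   # two more quarters
    price *= 2
print(price)   # 5.2 USD per Gb
```

The point of the comparison in the article is that commodity DRAM at these prices approaches HBM3E economics without HBM's costly stacking steps, which is why it may be the more profitable product for Samsung.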

Pretty big backwardation in VIX futures on Friday. As for what this means, look back in the chart at the other times this spread indicator has gone below zero. Draw your own conclusions. I have shared this chart recently in my newsletter and my Daily Edition.
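The exact construction of the chart's spread indicator isn't specified in the post, so the sketch below assumes one common definition: second-month VIX future minus front-month. A negative reading means the near contract trades above the farther one, i.e. the term structure is in backwardation. All prices here are made up.

```python
# Minimal sketch of a VIX futures term-structure spread indicator,
# assuming it is defined as second-month minus front-month future.
# Negative spread => backwardation (near contract above farther contract).

def vix_spread(front: float, second: float) -> float:
    """Second-month minus front-month futures price."""
    return second - front

# Hypothetical Friday settlement prices
front, second = 21.5, 20.2
spread = vix_spread(front, second)
print(spread < 0)   # True: the curve is inverted (backwardation)
```

In a normal (contango) market the spread is positive; the post's observation is that readings below zero have historically been rare and worth studying on the chart.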



