Waleed Chaudhry

719 posts

@ser_kingslayer_

Joined April 2017
726 Following · 148 Followers
Jim Liu
Jim Liu@jiahanjimliu·
@Jespabe TPU benchmarks are all good, but when you look at actual model adoption among enterprises, OpenAI > Gemini, and Anthropic is losing ground as it gets compute constrained on the inference side.
Jim Liu
Jim Liu@jiahanjimliu·
Part of why Anthropic miscalculated its compute needs is how much it can get out of TPUs in practice. TPUs look strong on paper with all their white papers, but Nvidia is engineering that works in practice. People in engineering understand that there can be a large gap between theory and practice. Last year, when Gemini was all the hype, there was some signal on X that Blackwells would bring Nvidia back into a singular monopoly position. Now, with Gemini falling off and Anthropic compute constrained due to its heavy TPU mix, both are showing. Nvidia is the undisputed king again. White papers, benchmarks, and theory fall flat in the face of know-how and real-world performance.
Midnight Capital LLC@Midnight_Captl

Dario did wayyyy too many TPUs, and wayyyy too few Blackwell racks huh

Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@wecandobetter0 @ProfessorPape Trump is the President. He doesn't call for anything. If he wants to escalate, he'll escalate. The professor, however, needs to sell some substack subscriptions, so he's gotta manufacture some drama
Mark Miller
Mark Miller@wecandobetter0·
@ser_kingslayer_ @ProfessorPape And Trump is calling for escalation. Thanks for proving the Professor right. Also, can you trolls get a new script? This is the same one on thousands of feeds.
Robert A. Pape
Robert A. Pape@ProfessorPape·
Iran’s military is ready for major escalation. Get ready for rapidly mounting damage in: —Iran —Gulf states —World economy
Robert A. Pape tweet media
Degen Lane Capital LLC
Degen Lane Capital LLC@DegenLaneCap·
@GavinSBaker was right, just early on $P
Jukan@jukan05

"The Next Bottleneck After HBM Is HBF"... A Computing Pioneer's Prediction

"I have been consistently paying close attention to High Bandwidth Flash (HBF). I'm also collaborating with semiconductor companies on this. HBF is highly likely to stand at the center of the next bottleneck — a surge in demand."

David Patterson, professor at UC Berkeley, Turing Award laureate, and widely recognized as the architect of RISC (Reduced Instruction Set Computing — an approach that simplifies instructions to improve processing efficiency), made these remarks on April 30 (local time) when he met with reporters in San Francisco immediately after delivering a keynote at the Dreamy Next event.

Asked about what comes after HBM (High Bandwidth Memory), which is currently in a supply-constrained bottleneck, Professor Patterson answered that HBF will emerge as the next focus. Specifically, he said, "Although a number of technical challenges still remain, the HBF being developed by companies such as SK hynix and SanDisk is a meaningful alternative in that it can deliver large capacity with low power consumption," adding, "Going forward, how efficiently data can be stored and delivered will become the critical variable."

This past March, SK hynix announced that it had joined hands with U.S. flash memory company SanDisk to drive the global standardization of HBF. Unlike HBM, which stacks DRAM, HBF is built by stacking NAND flash — a non-volatile memory. Their roles are also distinct: while HBM serves as a fast computation aid, HBF is focused on storing, at high capacity, the vast amounts of data that AI processes.

HBF is drawing attention as the AI inference market grows. The AI market is broadly divided into learning (training) and inference. Training is the process of feeding massive amounts of data to teach an AI model. Inference is the stage in which results are derived based on the trained data.

In inference AI, the ability to continuously store and retrieve vast amounts of intermediate data — such as prior conversations, judgment outcomes, and task context — is crucial. This is because AI carries out reasoning by remembering context and building upon it. The problem is that all of this data is difficult to fit into HBM. Since HBM is optimized for handling data used immediately, its capacity is inherently limited. Moreover, given its high price, processing the enormous amounts of context data generated during inference using HBM alone would impose significant cost burdens. As a result, an environment has formed in which both HBM and HBF are needed simultaneously — a kind of division of labor.

Domestic experts in Korea also anticipate that the importance of HBF will grow. At an HBF research and technology development strategy briefing held this past February, Kim Jung-ho, professor in the School of Electrical and Electronic Engineering at KAIST, stated, "If the central processing unit (CPU) was the core in the PC era and low-power technology was the core in the smartphone era, memory will be the core of the AI era," adding, "What determines speed is HBM, and what determines capacity is HBF." He further predicted, "From 2038 onward, demand for HBF will surpass that of HBM."
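The division of labor described in the article (a small, fast tier for immediately used data; a large capacity tier for accumulated context) can be sketched as a toy two-tier key-value store. The capacities, the LRU spill policy, and the dict-backed tiers are purely illustrative assumptions, not a model of how HBM or HBF actually work.

```python
# Toy sketch of the HBM/HBF division of labor: a small "hot" tier
# (HBM-like) backed by a large "capacity" tier (HBF-like).
from collections import OrderedDict

class TieredKVStore:
    def __init__(self, hot_capacity):
        self.hot_capacity = hot_capacity  # items the fast tier can hold
        self.hot = OrderedDict()          # HBM-like: small, fast, LRU-ordered
        self.cold = {}                    # HBF-like: large, holds the spillover

    def put(self, key, value):
        self.hot[key] = value
        self.hot.move_to_end(key)
        if len(self.hot) > self.hot_capacity:
            # Evict the least-recently-used item to the capacity tier.
            evicted_key, evicted_val = self.hot.popitem(last=False)
            self.cold[evicted_key] = evicted_val

    def get(self, key):
        if key in self.hot:               # fast path
            self.hot.move_to_end(key)
            return self.hot[key]
        value = self.cold.pop(key)        # slow path: promote on access
        self.put(key, value)
        return value

# A growing conversation quickly overflows the fast tier.
store = TieredKVStore(hot_capacity=2)
for turn in range(5):
    store.put(f"turn-{turn}", f"ctx-{turn}")
print(len(store.hot), len(store.cold))    # prints: 2 3
```

Accessing an old turn via `get` pulls it back into the hot tier, which is the access pattern (re-reading prior context during inference) the article says HBF is meant to serve cheaply.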

Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@mzuhair123 Do you think it's possible for prices to drop as the LTAs kick in starting this quarter?
Muhammad Zuhair
Muhammad Zuhair@mzuhair123·
Okay, so a question: Zuhair, what's one signal after which you might talk about the memory cycle cooling off? Well, my best bet would be that as soon as hyperscaler CapEx starts to see flat-line growth, or a potential drop, the downstream supply chain will see a drop in demand. It's important to see the AI industry as a two-way supply flow. Top-to-bottom indicates the overall market sentiment. Bottom-to-top shows us the constraints in the AI industry, such as the photonics narrative, power semis, and similar aspects.
Muhammad Zuhair@mzuhair123

Confused about the memory cycle and how things will evolve moving ahead? Here are quick pointers.

Regarding the memory demand thesis, and whether the industry's growth will continue, there is a lot of skepticism. This mainly comes from the fact that YTD figures for the likes of $MU, Samsung, and SK hynix have blown up, giving the perception that the stocks have already made their move. People have made this mistake throughout the infrastructure buildout, because they shifted their focus away from supply chain signals.

The best way to judge whether memory stocks are still a worthwhile bet is to look at DRAM contract prices and their rate of expansion, along with the change in hyperscaler CapEx. These signals are publicly available and easy for anyone to access. From a more technical angle, the best approach is to look at how memory-per-rack and memory-per-accelerator figures are evolving, and how the role of memory in AI workloads has changed.

For me, the thesis is simple: as long as agentic AI persists, memory will be a hot commodity. And based on my experience with expansion timelines and LTA knowledge, I believe the cycle won't cool off until 2028, with that cool-off mainly attributable to new capacity coming online at 20% YoY growth.

Let me know if you folks need a deeper dive on how to position the respective memory companies and their narratives. Sharing an infographic from @SemiAnalysis_ for a more macro view of supply/demand.
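The public signals named above, the rate of expansion of DRAM contract prices and the change in hyperscaler CapEx, reduce to simple growth-rate arithmetic. A minimal sketch, where all the series values are made-up placeholders and the 2% flat-line threshold is my own assumption, not anything from the tweet:

```python
# Sketch of the two tracking signals: DRAM contract price expansion rate
# and hyperscaler CapEx growth. Numbers are hypothetical, not market data.

def growth_rates(series):
    """Period-over-period growth (e.g. QoQ) as fractions."""
    return [(b - a) / a for a, b in zip(series, series[1:])]

def is_flatlining(series, threshold=0.02):
    """True when the latest period's growth falls below `threshold`,
    the kind of roll-over treated here as a cool-off signal."""
    return growth_rates(series)[-1] < threshold

dram_contract_price = [3.0, 3.3, 3.9, 4.8]  # $/Gb, hypothetical quarters
hyperscaler_capex   = [50, 58, 67, 68]      # $B, hypothetical quarters

print([round(g, 3) for g in growth_rates(dram_contract_price)])
print(is_flatlining(hyperscaler_capex))  # latest growth ~1.5%: True
```

The point of the thesis is the direction of the second derivative: accelerating contract-price growth argues the cycle has room; CapEx growth rolling toward zero is the cited cool-off trigger.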

Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@mzuhair123 Also, where do you see DRAM contract prices publicly? TrendForce gives a small summary. Is there a different place to look?
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@mzuhair123 Hynix is already projected to make like 300B in FY 2027. Is there more money that can still be spent there?
Midnight Capital LLC
Midnight Capital LLC@Midnight_Captl·
$NVDA down 4% on the news that all the hyperscalers are massively supply constrained and will all be raising CapEx “significantly” next year. Lol. Never change, Wall Street… never change
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@Midnight_Captl I don't get why Dario doesn't scale on Blackwells and Rubins. Jensen would invest more to have him buy the compute. Why isn't he doing it?
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@Midnight_Captl $NVDA and $META for sure. Not sure I'd put $GOOG in the same category. TPUs aren't as good as the media thinks they are, and search will be agentic in the future
Midnight Capital LLC
Midnight Capital LLC@Midnight_Captl·
The safest investments for the coming decade are Nvidia, Google, and Meta and I don’t think anything else comes close
Midnight Capital LLC
Midnight Capital LLC@Midnight_Captl·
And there it is… Officially. Finally. At long last $NVDA new ATHs Congrats to everyone who’s been onboard for the journey 💚
Midnight Capital LLC tweet media
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@BenBajarin NVDA is catching up to where it should have been 2 months ago. This is a 260 dollar stock at a minimum
Ben Bajarin
Ben Bajarin@BenBajarin·
I imagine $NVDA is setting up for what the big three hyperscalers will announce this week (Wed) with earnings that should only further the confidence in the durability of the cycle and unprecedented amount of compute demand for AI.
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@iruletheworldmo I don't get why Dario is still doing deals with GOOG/AMZN. NVDA has the supply. Do more compute deals with them.
🍓🍓🍓
🍓🍓🍓@iruletheworldmo·
i find it surprising that people continue to count out xai. if there's anyone that has the engineering nous to build the super clusters that will be needed for agi, it's elon. it's not the age of research, it's the age of hard engineering problems and energy shortages. the clear favorites to win this race have to be xai and openai. google should be a contender but they lack conviction, and dario forgot to buy compute and there's none left sooooooo. that's anthropic out.
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@taobanker $META is so ridiculously cheap for the growth it has, and will have, it's nuts. Cleanest ROI story in AI outside the semis, and somehow only up 2.5% YTD
taobanker
taobanker@taobanker·
I hate it how every time I say big tech is cheap I'm 100% right but just lack the confidence -- it's actually absurd how easy this shit is. At least I got in $meta this time.
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@taobanker For sure. Long term Meta will continue to be a fantastic business. But we have so much ADHD in investing now, no one invests longer than like 3 months
taobanker
taobanker@taobanker·
@ser_kingslayer_ I'm fine with fading that bear case, it's something with a 0% chance of being a real long-term issue imo. Absolute worst case, the capex just comes down
taobanker
taobanker@taobanker·
$META bear argument (chronological edition): 1) Zuckerberg clearly overcommitted to capex 2) Wait, revenue was up 22% last year? 3) That's clearly just a heavier ad load -- no need for me to investigate this further 4) LALALLAA I CANT HEAR YOU WITH MY FINGERS IN MY EARS
Waleed Chaudhry
Waleed Chaudhry@ser_kingslayer_·
@taobanker Now if Zuck could just give up on model development, license Gemini/Anthropic, slash headcount due to AI efficiencies, the stock actually moons. The core business is fantastic but Zuck going crazy on model development risk is what's keeping it down