

Richard Medina
460 posts

@RichardMInvest
Daily breakdowns on $NVDA $TSM $HOOD $TSLA | Semiconductors · Fintech · Robotics 📈




Price/Perf of NVIDIA GB200 vs Google Ironwood TPU v7 patreon.com/posts/15596024…


“Anthropic trains its models only on $AMZN Trainium”

That’s a clickbait story. Not reality.

Anthropic runs its models across:
• $AMZN Trainium
• $GOOGL TPUs
• $NVDA GPUs

This hybrid approach is deliberate. Two reasons:

1. $AMZN and $GOOGL both own >10% of Anthropic, and those deals included commitments to heavily use their chips
2. Anthropic doesn’t want to depend on a single supplier

This does NOT mean $NVDA is losing its edge. Its chips are still the best.

What’s changing is behavior: AI labs don’t want to rely entirely on Nvidia and keep paying a ~10x markup on every chip in their data centers.

Implications:
• $NVDA will lose some market share, but in a market growing ~60% annually it’s not a real problem
• $AMD will capture part of that share
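The share-loss-vs-growth argument above is easy to sanity-check with quick arithmetic. A minimal sketch, where the ~60% market growth comes from the post but the share figures are purely illustrative assumptions:

```python
# Illustrative: even if Nvidia's share of the AI-chip market falls,
# revenue can still grow when the whole market expands ~60%/yr.
# The share numbers below are assumptions, not estimates.

market = 100.0          # index current market size to 100
growth = 0.60           # ~60% annual market growth (from the post)
share_now = 0.90        # assumed current Nvidia share
share_next = 0.80       # assumed share after ceding ground to rivals

rev_now = market * share_now                   # 90.0
rev_next = market * (1 + growth) * share_next  # 128.0

print(rev_now, rev_next)  # revenue still grows despite the lost share
```

With these assumed inputs, losing 10 points of share in a market growing 60% still leaves revenue up roughly 40% year over year, which is the point the post is making.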




Introducing Claude Design by Anthropic Labs: make prototypes, slides, and one-pagers by talking to Claude. Powered by Claude Opus 4.7, our most capable vision model. Available in research preview on the Pro, Max, Team, and Enterprise plans, rolling out throughout the day.


OpenAI said it had 1.9 GW of capacity available in 2025, expects to add capacity in the “low-double-digit range” this year, and ultimately plans to scale to around 30 GW by 2030. It believes Anthropic had 1.4 GW available in 2025 and will have 7–8 GW this year. $MSFT $AMZN $GOOG $NVDA
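The implied pace of that buildout can be checked with a quick compound-growth calculation. A sketch using only the figures from the post (1.9 GW in 2025, ~30 GW by 2030):

```python
# Implied compound annual growth rate (CAGR) of OpenAI's capacity,
# going from 1.9 GW in 2025 to ~30 GW by 2030 (figures from the post).

start_gw, target_gw = 1.9, 30.0
years = 2030 - 2025  # 5 years

cagr = (target_gw / start_gw) ** (1 / years) - 1
print(f"{cagr:.1%}")  # implied annual capacity growth over 2025-2030
```

That works out to roughly a 70%+ annual increase in capacity, which puts the "low-double-digit" GW additions this year in context.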

