Systems Monk
71 posts

Systems Monk
@systemsmonk
Helping you go from 0 → $10k/month using AI systems, offers, and ruthless discipline. Daily breakdowns, no fluff. Follow for daily AI systems + money threads.
Your brain treats your imagination as reality.



All the best programmers I know are starting to write code by hand again

Google just broke a decade-long tradition. At Cloud Next 2026, the company unveiled not one but two new AI chips: the TPU 8t for training and the TPU 8i for inference. For the first time, Google is splitting its custom silicon into specialized architectures instead of relying on a one-size-fits-all design.

The TPU 8t superpod packs 9,600 liquid-cooled chips delivering 121 FP4 exaFLOPS of peak compute, roughly a 3x leap over the previous generation. The TPU 8i delivers 80% better performance-per-dollar than its predecessor, with triple the on-chip memory and a new Boardfly topology that cuts network latency in half.

The bigger story: Anthropic, Meta, and now OpenAI are buying multi-gigawatt allocations of TPU capacity. OpenAI booking Google silicon is the first visible crack in NVIDIA's grip on frontier AI training.

Broadcom co-designed the TPU 8t, while MediaTek handles the TPU 8i; both are fabbed by TSMC. NVIDIA still holds 81% of the AI chip market, but the era of serious competition has officially begun.


SITUATION DETECTED: Former Google DeepMind researcher David Silver has raised $1.1B at a $5.1B valuation for Ineffable Intelligence, a company building AI that can teach itself without human-generated data. The round was led by Sequoia and Lightspeed, with NVIDIA, Google, and the British government also participating.

Research shows your mind spends ~47% of its waking time somewhere other than the present moment, and that wandering makes you less happy. So stop letting your mind run side quests and get back to the moment you’re actually in.
