Nick Gates
@ngates94
77 posts
Joined September 2009
192 Following · 69 Followers
Tyler Norris@tylerhnorris·
Today marks a significant milestone in the history of demand flexibility, as @Google announces it has contracted one gigawatt of data center demand response capacity into long-term energy contracts with multiple utilities across the US. Very proud of this team for pioneering a new way to utilize data centers as grid-responsive assets – and while there are limits to how flexible a given data center can be, Google is committed to continuing to develop this capability and modernizing power system planning to help realize its potential.
Nick Gates@ngates94·
@ShanuMathew93 The pace/scale of construction is hard to understand until you see it in person. It’s awesome.
Shanu Mathew@ShanuMathew93·
Nothing short of incredible. Started off as a flared gas bitcoin mining play. Now, has transformed into a leader in scaling up AI HPC. $10bn valuation, top caliber investor team, marquee partnerships in AI. Standing up Stargate phase I in ~12-18 months. ~20 GWs in the pipe. Doing crazy things like standing up their own switch gear capacity. Kudos to this team! Many shrugged off the pivot or whether there was enduring value to the accelerated development/time to power thesis, but they are very much a real player in the market it seems.
Chase Lochmiller@ChaseLochmiller

I’m excited to announce that @CrusoeAI has closed our Series E round of financing valuing the company at $10.4B to help us build the infrastructure of intelligence. This round was led by our incredible partners at Valor Equity Partners and @MubadalaCapital. Solving the scaling needs of AI is one of the greatest challenges of our generation. If you’re inspired by working on big and difficult problems, come and join us! We had an amazing group of investors in the round including @137ventures, @1789Capital, Activate Capital, @AltimeterCap, @Atreidesmgmt, BAM Elevate, DPR Construction, @OraGlobal, @Fidelity, @foundersfund, @FTI_US, @GalvanizeLLC, @LongJourneyVC, @lowercarbon, M37, MCJ, @nvidia, @RadicalVentures, @RibbitCapital, @SalesforceVC, @saquon, @sparkcapital, @stepstonegroup, @Supermicro, @TRowePrice, Tiger Global, @upper__90, @winklevosscap, @ziggcap crusoe.ai/resources/news…

Shanu Mathew@ShanuMathew93·
Alright, I know others have covered the @karpathy interview on the @dwarkesh_sp podcast in a million better ways, but it was genuinely one of the most enjoyable episodes I've listened to this year. Karpathy is a gifted teacher and orator, while Dwarkesh is a great interviewer who pulls on the right threads. I learned a ton and wanted to organize my notes and share.

I thought the most amazing parts of the interview were the analogs to how humans learn, compared with what we have today in current LLMs. So much of what we consider intelligence comes to us or to animals in ways we haven't really figured out structurally yet (e.g., why does a zebra run after its mother right after being born? It wasn't fed that intelligence at birth). I come away realizing how much of the current setup is inefficiently jamming more memory/context/parameters into models to get pattern recognition, vs. truly teaching them how to reason and develop knowledge. Here were my highlights:

1. Pragmatic Optimism - The Decade-Long Path to AGI: Contrary to the popular clips from the pod calling for popping the AI bubble, Karpathy is genuinely bullish on achieving AGI. However, he rejects the hype cycle's compressed timelines. Problems are tractable and solvable, just not overnight. We have impressive early agents (Claude and Codex are only a year old) and functions (e.g., autocomplete) that he uses daily, but "they just don't work" as autonomous employees yet.
>Missing: continual learning, full multimodality, sufficient intelligence. It will take a decade to "work through all of those issues."
>"The problems are tractable, they're surmountable, but they're still difficult. If I just average it out, it just feels like a decade to me."

2. Realist's Critique - Current AI is Sophisticated Slop: Despite progress, current LLMs have serious structural limitations that the industry isn't confronting honestly, often due to fundraising/commercial motivations. In practice, when he was building Nanochat, models couldn't understand his custom code structure, kept trying to use standard patterns, added bloated try-catch statements, and used deprecated APIs.
>What we have are models that "cognitively feel like a kindergarten or elementary school student" despite passing PhD tests.
>"The models are amazing. They still need a lot of work. For now, autocomplete is my sweet spot... I feel like the industry is making too big of a jump and is trying to pretend like this is amazing, and it's not. It's slop."
>"They're savant kids. They have perfect memory of all this stuff. They can convincingly create all kinds of slop that looks really good. But I still think they don't really know what they're doing."

3. Reinforcement Learning is "Terrible" By Design: Current RL approaches are fundamentally noisy and inefficient, rewarding wrong behaviors simply because they appeared in successful trajectories. Example: models find adversarial outputs like "thethethethe" that fool judges into giving 100% reward, and training can't scale beyond 10-20 RL steps without breaking.
>i) Try 100 different solution paths and check the answer at the end; ii) every token in successful paths gets upweighted, even the wrong turns; iii) "Every single little piece of the solution that you made that arrived at the right answer was the correct thing to do, which is not true."
>"You're sucking supervision through a straw. You've done all this work that could be a minute of rollout, and you're sucking the bits of supervision of the final reward signal through a straw and you're broadcasting that across the entire trajectory and using that to upweight or downweight that trajectory. It's just stupid and crazy."
>"A human would never do hundreds of rollouts. When a person finds a solution, they will have a pretty complicated process of review of, 'Okay, I think these parts I did well, these parts I did not do that well.'"

4. Memory vs. Intelligence - The Cognitive Core Vision: Current trillion-parameter models are bloated with memory, when we actually want small (~1B parameter) models that reason rather than recall.
>The vision: in 20 years, billion-parameter cognitive cores where "if you ask it some factual question, it might have to look it up, but it knows that it doesn't know and it will just do all the reasonable things."
>Today's internet training data is "total garbage" - random documents are "stock tickers, symbols, huge amount of slop."
>"What I think we have to do going forward is figure out ways to remove some of the knowledge and to keep what I call this cognitive core. It's this intelligent entity that is stripped from knowledge but contains the algorithms and contains the magic of intelligence and problem-solving."
>"Most of that compression is memory work instead of cognitive work."

5. What Intelligence Actually Is - Human vs. LLM Learning: LLMs and humans learn through fundamentally different mechanisms, and understanding these differences reveals current AI's limitations. Example: the sleep/consolidation process. Humans distill daily experiences into brain weights during sleep, whereas LLMs "always restart from scratch where they were."
>Humans learn generalizable patterns because we're bad at memorization (we can't recite random number sequences).
>LLMs also lack culture - they can't "write books for other LLMs" or build shared knowledge repositories like humans can.
>"We're not building animals. We're building ghosts or spirits or whatever people want to call it, because we're not doing training by evolution. We're doing training by imitation of humans and the data that they've put on the Internet."

6. Model Collapse - The Entropy Problem: LLMs suffer from "silent collapse." When AI models generate their own training data, they gradually lose creativity and diversity, getting stuck repeating the same limited set of outputs.
>Ask ChatGPT to tell you a joke multiple times. You'll notice it basically has "like three jokes" - it keeps giving you variations of the same few responses instead of the full breadth of possible jokes that exist.
>"Humans collapse over time. This is why children haven't overfit yet. They will say stuff that will shock you... But we're collapsed. We end up revisiting the same thoughts, saying more of the same stuff, learning rates go down."
>Individual samples look good - any one generated example seems reasonable. But ask 10 times and you get essentially the same thing; the distribution is "terrible" even though individual examples look fine. Training on this makes it worse - "if you continue training on too much of your own stuff, you actually collapse" further.
Shanu Mathew@ShanuMathew93·
Anyone going to RE+ in Vegas next month? I'll be there Mon-Wed (8-10th) for a Data Center conference, Yotta. Holler at me if you'll be at either and want to catch up!
Nick Gates@ngates94·
@ShanuMathew93 Interesting comments on run rate + backlog. Curious how this might impact annualized new capacity forecasts. Maybe we’ll see some supply-constrained compression through 2030… or will the forecasts plug the difference with alt. bridge gen sources…
Shanu Mathew@ShanuMathew93·
GEV call highlights imo:
-9 GW of new contracts in Q2 - mostly went to slot reservations (7 GW) vs. immediate orders (2 GW), showing a strong future pipeline
-Total pipeline now ~55 GW (29 GW backlog + 25 GW slot reservations), up from 50 GW last quarter; targeting 60 GW by YE
-Services growth isn't just gas - broad-based demand across power fleet types (hydro uprates +60%, steam service +30%)
-Synchronous condensers: $5B annual market opportunity emerging from a niche technology as grids need stability
-Data centers: already at $500M of orders in H1 2025 vs. $600M for all of 2024
-Power margins >16%, Electrification approaching 15% in 2Q25. "Real opportunity to continue to accrete margins higher." In Electrification, 9 percentage points of margin expansion in backlog over the past 2 years.
-Combined cycle orders will dominate H2 vs. simple cycle in H1
-20 GW run-rate target by H2 2026 - current focus on execution; needs 80-100 GW backlog (4-5 years' worth) before major capacity expansion
-Aeroderivatives are the 'bridge power' solution: fast commissioning, premium pricing, a bridge to future grid connection (backup once the plant gets its interconnect in 3-6 yrs)
Shanu Mathew@ShanuMathew93

Big jump in the quarterly orders for aeros. Heavy-duty gas turbines remain strong too. Won't increase capacity (20 GW run rate by 2H26) until they have 4-5 years of backlog (80-100 GW). They will get to 60 GW backlog by YE25. Will continue to drive price in the meantime...

Nick Gates@ngates94·
@ShanuMathew93 Are these firm PPAs or all corporate offtake (i.e., including synthetic offtake)?
Shanu Mathew@ShanuMathew93·
YTD there's been ~8GW of corporate solar PPA issuance in the US and ~75% of that is hyperscalers. Last two months were 1.6GW/each and were 100% hyperscalers.
Duncan S. Campbell@duncancampbell·
Jony and Sam really outdid themselves with their latest product
Nick Gates@ngates94·
@ShanuMathew93 @NREL Cool map that’s getting a lot of attention… but missing what is arguably the most important data layer: gas pipeline infrastructure & ownership.
Shanu Mathew@ShanuMathew93·
Remarkable visualization of all the data center and associated infrastructure in the contiguous US: transmission, fiber lines, data centers by capacity, operational status, and water availability. h/t @NREL (Billy Roberts)
Nick Gates retweeted
Thaaat Colin@ThaaatColin·
There’s a lot of buzz about traditional VC firms getting behind American “reindustrialization.” But I don’t see how that fits their model, maybe I’m missing something… The typical VC expects an exit in 5–7 years, via IPO or acquisition. That works great (in theory) for SaaS. But rebuilding American industry? That’s a 10–20 year play, especially when building businesses with significant physical assets & infrastructure.
Nick Gates@ngates94·
@ShanuMathew93 Have to imagine an HPC is going to have to open its balance sheet to support a ‘mega project.’ No GW-scale power is going to get built and no ESA approved without meaningful credit/investment from the HPC. The load bubble pops quickly without upstream investment.
Shanu Mathew@ShanuMathew93·
Bingo. "Big tech companies are prone to rapid changes in capital allocation, oftentimes with only 6-9 months of visibility into HPC/cloud demand for end customers, which is highly mismatched to the 3-5 year timeline needed for construction of HPC datacenters."
Shanu Mathew@ShanuMathew93·
NEE outlines three approaches for powering data centers: on-grid for normal loads (<400 MW), on-grid for very large loads (1-3 GW), and off-grid islanded solutions (3-5 GW). NEE will offer solutions based on needs, but:
-Customers care about time first, then money, and ideally want to be FTM vs. BTM
-Will keep talking to hyperscalers but will work predominantly for utilities
-Off-grid is a higher price point and competition is intense right now
-Optionality makes on-grid more desirable
Nick Gates@ngates94·
@BruinCap8 @_LatitudeMedia Where can I find the interview? I was posting about this ‘redundant demand’ last week — glad to hear the idea is substantiated. Medium term load growth probably looks like 50 GWs of new data centers… tops.
BruinCap@BruinCap8·
Brian Fitzsimons of GridUnity interview with @_LatitudeMedia: “For one of our utilities, which is one of the largest utilities in the United States, nearly 30% of all the applications submitted in 2024 were canceled…Additionally, when large tech companies decide to build 30 data centers for 100 megawatts each [these are random numbers] they go to multiple transmission owners across the country with the same data centers, asking if they have the generation capacity to serve their needs. Then, the tech companies will decide that they’re only going to build five out of those 30 data centers this year, and they only select five locations. But in the meantime, they’ve sent a signal across North America that has everyone thinking they might have 100 megawatts in their backyard this year. And you get an inflated load forecast for the nation”
Nick Gates@ngates94·
@ShanuMathew93 Nobody’s talking about it, because nobody’s developing it, because nobody’s going to finance it…
Ben James@BenJames_____·
Who would like to build a teeny Solar Data Center at Edge Esmeralda in June? Completely off-grid w solar, batteries, cooling, starlink all integrated. Will put together a squad if there’s interest. @JoinEdgeCity
Nick Gates@ngates94·
@ShanuMathew93 Project specific investment for Louisiana mega site capex, or is this separate?
Shanu Mathew@ShanuMathew93·
“Apollo is in talks to lead a roughly $35 billion financing package for Meta Platforms Inc. to help develop data centers in the US. The funding conversations are at an early stage and there’s no guarantee a deal will be completed, with KKR & Co. also part of the investor group.”
Shanu Mathew@ShanuMathew93

Seen via BBG but Information reporting: Meta plans $200B+ data center for AI. Campus could require 5-7 GW of power and would significantly exceed the $10B Louisiana facility previously announced. This massive investment aims to support AI features across Meta's apps and compete with OpenAI's expansion.

Nick Gates@ngates94·
@SimonMahan I’d posit that gas can’t do it alone, regardless of manufacturing ramps. In both BTM and FTM scenarios, gas serving GW-scale loads will more than likely require new pipeline buildout, co-located renewables, and massive development-stage credit to get built. Many parts make the whole.
Simon Mahan@SimonMahan·
Research paper idea for someone else (not me) to do:
-Current gas turbine manufacturing capacity
-Compared against load growth demand forecasts
-Inflationary impact of extreme gas demand
Basically, can gas do it alone? What are the costs of going that route?
Duncan S. Campbell@duncancampbell·
Nuclear friends, do you agree with this statement? That objectively the only thing the solar industry has been great at is essentially grifting. Solar friends, please refrain from responding to this one.
Andrew Follett@AndrewCFollett

@clawrence @duncancampbell Let me put it this way...the ONLY thing impressive about the solar industry is its remarkable ability to retain top level PR, lawyers, and lobbyists. This is objectively true. And the fact that you ran away offended within seconds of hearing this truth just confirms it.
