Jordan MacLeod
@newcurrency

18.6K posts

Author of New Currency: How Money Changes The World As We Know It. Automaticity in monetary policy: achieving Friedman's Optimum without deflation. Sculptor 🌱

Joined April 2009
836 Following · 1.8K Followers
Pinned Tweet
Jordan MacLeod @newcurrency
In 1969, Milton Friedman proposed what became known as the "Friedman Rule": a monetary policy in which the nominal interest rate is set to zero, eliminating the opportunity cost of holding money and optimizing monetary efficiency. 🧵 1/15
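For context, the rule in one line via the standard Fisher identity (a textbook gloss I'm adding, not part of the thread):

$$ i = r + \pi^e, \qquad i = 0 \;\Rightarrow\; \pi^e = -r $$

With a positive real rate r, a zero nominal rate requires steady deflation at rate r, which is why "achieving Friedman's Optimum without deflation" (the bio above) is the nontrivial part.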
Jordan MacLeod reposted
tobi lutke @tobi
@JsonBasedman @Shopify Get a few good Waterloo interns access to the ground-truth historical payroll data and the union contracts, plus a well-funded Anthropic account, and they can solve this problem with Claude Code in a few weekends. You don’t need us.
Jordan MacLeod reposted
Elan Barenholtz @ebarenholtz
Cool new JEPA paper from @ylecun's group. But calling this a "world model" (or "LeWorld") is a stretch.

Here's what LeWM actually does: compress frames into 192-dim vectors, then learn to predict where an action takes you in that compressed space. No scene representation, no causal structure, no physics engine. Just a smooth manifold over action-conditioned transitions, stabilized by a Gaussian regularizer. This is not modeling the world. It's learning the structure the world stamps into the data stream. A very different thing.

"World models," to most people, means inferring and modeling a causal mechanism that generated the data. Think predictive processing frameworks in neuroscience, where the brain maintains explicit probabilistic beliefs about hidden causes and updates them via Bayesian inference. Or classical Kalman filters, which maintain an explicit state estimate and transition model that gets updated with each new observation. In both cases there is a separable internal structure that represents the world, can be queried independently of the immediate input, and supports genuine counterfactual reasoning.

Despite effective branding, the JEPA approach is not fundamentally different from standard generative models, except that the data stream and generative space are compressed. This is not a model of how the world generates structured data; it's a model of the structured data itself. I've argued this is true of cognition generally: substack.com/@generativebrain/note/p-187130246

So why do the authors call it a world model? Partly the architecture, partly a result they call "physical latent probing": a separately trained linear probe recovers physical quantities like block location from the latent space. The implication is that the model has internalized physical structure.

But this is the wrong inference. A probe measures linear covariance between a representation and some external quantity. It tells you nothing about whether the model explicitly represents that quantity. Physical quantities are recoverable because they determined the pixels that determined the compression. The causal chain is: world → pixels → compression → probe-recoverable covariance. The model doesn't represent block location. It inherits covariance with block location from the statistics of its training data.

I made this argument recently in a different domain. In a new paper (arxiv.org/abs/2603.04317), I showed that GloVe and Word2Vec, models with no architecture for spatial reasoning, recover geographic structure via linear probe at essentially the same level as models being credited with rich world models. The structure was in the statistics all along. The probe found the covariance. Nobody would say Word2Vec has a world model.

The probe shows the world left fingerprints in the stream. It doesn't show the model built a map.

LeWM works, and works elegantly. But what it learned is competence over continuation in a compressed action-conditioned space. Which, as I've argued, may be all cognition ever was.

Link to paper: arxiv.org/html/2603.1931…
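To make the probe argument concrete, here is what "a separately trained linear probe" amounts to in general. The data, dimensions, and the synthetic "block location" below are made-up stand-ins: a sketch of the generic technique, not the paper's evaluation code.

```python
# Fit a linear map from frozen latents to an external quantity, then score
# held-out R^2. Success certifies linear covariance with that quantity,
# not an explicit internal representation. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(1000, 192))                       # frozen 192-dim latents
W_true = rng.normal(size=(192, 2)) * 0.1
pos = Z @ W_true + rng.normal(size=(1000, 2)) * 0.01   # hypothetical "block location"

Z_tr, Z_te, y_tr, y_te = Z[:800], Z[800:], pos[:800], pos[800:]
W, *_ = np.linalg.lstsq(Z_tr, y_tr, rcond=None)        # ordinary least-squares probe
pred = Z_te @ W
r2 = 1 - ((y_te - pred) ** 2).sum() / ((y_te - y_te.mean(0)) ** 2).sum()
print(f"held-out R^2 = {r2:.3f}")
```

A high held-out R² says only that the quantity is linearly decodable from the latents, which is exactly the inference the thread argues is too weak to license "world model."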
Jordan MacLeod reposted
Lucas Maes @lucasmaes_
JEPAs are finally easy to train end-to-end without any tricks! Excited to introduce LeWorldModel: a stable, end-to-end JEPA that learns world models directly from pixels, no heuristics. 15M params, 1 GPU, and full planning in <1 second. 📑: le-wm.github.io
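For readers who want the shape of the objective, here is a minimal sketch of a generic action-conditioned JEPA training step in PyTorch. The encoder, predictor, regularizer, and sizes are placeholder assumptions, not LeWorldModel's actual architecture.

```python
# One gradient step of latent-space prediction: encode o_t and o_{t+1},
# predict z_{t+1} from (z_t, a_t), and regularize latent moments toward a
# unit Gaussian as one simple anti-collapse term (an assumption here; the
# paper's regularizer may differ).
import torch
import torch.nn as nn

LATENT = 192  # latent width mentioned in the thread above

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, LATENT))
predictor = nn.Sequential(nn.Linear(LATENT + 4, 256), nn.ReLU(), nn.Linear(256, LATENT))

obs = torch.randn(8, 3, 64, 64)       # frames o_t (random stand-ins)
next_obs = torch.randn(8, 3, 64, 64)  # frames o_{t+1}
act = torch.randn(8, 4)               # actions a_t

z, z_next = encoder(obs), encoder(next_obs)
z_pred = predictor(torch.cat([z, act], dim=-1))

pred_loss = ((z_pred - z_next) ** 2).mean()
reg = (z.mean(0) ** 2).mean() + ((z.var(0) - 1) ** 2).mean()
(pred_loss + 0.1 * reg).backward()
```

Planning can then be done by searching over action sequences in the latent space, which is presumably how it stays under a second at 15M params.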
Jordan MacLeod reposted
Pratap Ranade @PratapRanade
Today, I’m excited to introduce @arenaphysica. For the past few years, we’ve been quietly partnering with companies pushing the frontier of hardware, like @AMD, @anduriltech and @SiversSemicond, deep in the guts of their most complex machines, where applied physics, specifically the laws of electromagnetism, dictates performance.

Electromagnetism is a domain poorly suited to LLMs, and a domain I spent most of my physics PhD trying to understand. At Arena Physica, we are in pursuit of electromagnetic superintelligence. We believe that a new class of foundation model will let humans push farther into our understanding of physics, and will let us wield forces like EM that shape our world but are fundamentally unintuitive to humans.

It was an honor to partner with my favorite essayist, @packyM, to explain how electromagnetism secretly runs the world.
Quoted: Packy McCormick @packyM
The future is electromagnetic. One challenge is that there are ~ten people in the world who can deeply intuit electromagnetism. RF engineering is "black magic." Arena Physica thinks machines can intuit EM better. CEO Pratap Ranade & I on AI for EM: notboring.co/p/electromagne…
Jordan MacLeod reposted
Elon Musk @elonmusk
Mass drivers on the Moon!
Jordan MacLeod reposted
Rand Group @cryptorand
Visa was founded in 1958. Mastercard in 1966. Combined they move $24.8T a year. Stablecoins have existed for 6 years and have already passed them both. Let that sink in.
Jordan MacLeod reposted
Steve Jurvetson @FutureJurvetson
First view of the 100 kW AI Mini Sat with solar panels and heat radiator, to scale. "And that’s just the Mini version. We expect future versions to go to the megawatt range." — Elon

Notes from the talk (Austin, TX):
The key missing ingredient is a terawatt of AI compute. A fully integrated fab with recursive improvement locally; will explore non-traditional computing.
Optimus robots: 1-10 billion units/year.
D3 chip optimized for space, designed to run hotter to minimize radiator mass. It will be the vast majority of the compute: 100-200 GW/yr on Earth, plus 1 TW/yr in space because of power constraints on Earth.
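A quick scale check on those figures (my arithmetic, not in the post):

$$ \frac{1\ \mathrm{TW/yr}}{100\ \mathrm{kW/sat}} = 10^{7}\ \mathrm{sats/yr}, \qquad \frac{1\ \mathrm{TW/yr}}{1\ \mathrm{MW/sat}} = 10^{6}\ \mathrm{sats/yr} $$

Even at megawatt-class satellites, +1 TW/yr implies production volumes in the millions of units per year, which is presumably why the fully integrated fab is treated as the key ingredient.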
Jordan MacLeod reposted
Richa Sharma @richa_lq
@lreyzin physics and computer science are morphing into each other.
Jordan MacLeod reposted
Richa Sharma @richa_lq
EVERYTHING IS INFORMATION: Ever since Demis' Lex interview, I haven't been able to shake this idea: information isn't just in the universe. It is the universe. More fundamental than matter or energy.

And it has a wild implication most people miss: P=NP is actually a physics question. If the universe runs on information processing, then the limits of computation are the limits of reality itself: what the universe can and cannot do.
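For reference, the standard complexity-theory statement being gestured at (textbook definitions, not from the tweet):

$$ \mathrm{P} = \bigcup_{k \ge 1} \mathrm{TIME}(n^{k}), \qquad \mathrm{NP} = \bigcup_{k \ge 1} \mathrm{NTIME}(n^{k}) $$

P = NP asks whether every problem whose solutions can be verified in polynomial time can also be solved in polynomial time; read physically, it becomes a claim about what processes the universe can realize efficiently.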
Jordan MacLeod reposted
Demis Hassabis @demishassabis
@elonmusk Actually it can all be thought of as information.
Jordan MacLeod reposted
Jordan MacLeod @newcurrency
I'm thinking about your quote (2022): "In QI, the information is not deleted, but you simply can't see some of it while accelerating. Stop the acceleration, and you can again see it." Game-engine terminology actually fits perfectly. Occlusion culling (extreme view-dependent compression) mirrors the Rindler horizon: it skips processing the inaccessible bits behind you. LOD (level of detail) adds the nuance for distant cosmic horizons: gradually reducing "resolution" on far-off Unruh/vacuum modes as distance grows, saving even more "processing" at low accelerations and naturally lowering inertial mass. Nothing is truly deleted, just culled from the current "frustum" or downgraded until acceleration stops and full visibility/detail returns.
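A toy version of the two mechanisms named above, to make the analogy concrete (the thresholds and LOD schedule are illustrative assumptions, not physics):

```python
# Frustum/occlusion culling skips objects the viewer cannot access
# (cf. the Rindler horizon); level of detail (LOD) degrades with distance
# (cf. reduced "resolution" on far-off modes). Nothing is deleted: culled
# objects come back when the view (acceleration) changes.
import math
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    x: float     # signed position along the view axis; x <= 0 is behind the horizon
    dist: float  # distance from the viewer

def visible(o: Obj, horizon: float = 0.0) -> bool:
    return o.x > horizon

def lod(o: Obj) -> int:
    return max(0, int(math.log2(1 + o.dist)))  # coarser detail farther away

scene = [Obj("near block", 1.0, 2.0), Obj("distant galaxy", 4.0, 1e9), Obj("behind you", -3.0, 3.0)]
for o in scene:
    if visible(o):
        print(f"{o.name}: rendered at LOD {lod(o)}")
    else:
        print(f"{o.name}: culled this frame (restored when the view changes)")
```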
Jordan MacLeod reposted
Garry Tan @garrytan
Everyone will code and it will be glorious
Quoted: Todd Saunders @toddsaunders

I know Silicon Valley startups don't want to hear this... But the combination of someone in the trades with deep domain expertise and Claude Code will run circles around your generic software.

I talked to Cory LaChance this morning, a mechanical engineer in industrial piping construction in Houston. He normally works with chemical plants and refineries, but now he also works with the terminal. He reached out in a DM a few days ago, and I was so fired up by his story that I asked him if we could record the conversation and share it.

He built a full application that industrial contractors are using every day. It reads piping isometric drawings and automatically extracts every weld count, every material spec, every commodity code. Work that took 10 minutes per drawing now takes 60 seconds. It can do 100 drawings in five minutes, saving days of time. His co-workers are all mind-blown, and when he talks to them, it's like they are speaking different languages.

His fabrication shop uses it daily, and he built the entire thing in 8 weeks. During those 8 weeks he also had to learn everything about Claude Code, the terminal, VS Code, everything. My favorite quote from him was when he said, "I literally did this with zero outside help other than the AI. My favorite tools are screenshots, step-by-step instructions and asking Claude to explain things like I'm five."

Every trades worker with deep expertise and a willingness to sit down with Claude Code for a few weekends is now a potential software founder. I can't wait to meet more people like Cory.

Jordan MacLeod reposted
Satya Nadella @satyanadella
We’ve trained a multimodal AI model to turn routine pathology slides into spatial proteomics, with the potential to reduce time and cost while expanding access to cancer care.