
Progress achieved in 1 day in May 2026 is equal to:
> 19 days in 2000
> 1.6 years in 1900

Derived from a geometric mean across 8 metrics: compute per dollar, DNA sequencing throughput per dollar, frontier AI training scale, papers published per day, patents filed per day, internet-connected humans, global GDP, and drugs in the clinical pipeline. The number is domain-dependent: in AI or biology it's closer to 3,000x, while in courts and real estate it's closer to 3x.

I'm not saying the above numbers and formulation are correct. In fact, there are a bunch of things wrong with them. For example, GDP and DNA sequencing are not the same kind of thing. Patent counts in 2026 are inflated by defensive filing and AI-generated applications. Research output per researcher is declining about 5% per year in many fields. Drug pipelines have long timelines that compute can't altogether eliminate. A large share of current compute and papers goes to maintaining existing complexity, not creating new capability.

I'm mostly just curious: has anyone built a formal model of progress density as a function of time that helps build intuition?
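
For concreteness, here's a minimal sketch (Python) of the kind of calculation I mean: a composite "progress density" built as the geometric mean of per-metric annual growth factors, plus a back-of-envelope check of what constant annual rate the equivalences above would imply. The growth factors in the dictionary are placeholders I made up for illustration, not the inputs behind the 19-day / 1.6-year numbers.

```python
import math

# Placeholder annual growth factors for the 8 metrics (illustrative only,
# NOT the values behind the 19-day / 1.6-year figures above).
ANNUAL_GROWTH = {
    "compute_per_dollar": 1.30,
    "dna_sequencing_throughput_per_dollar": 1.40,
    "frontier_ai_training_scale": 1.60,
    "papers_published_per_day": 1.04,
    "patents_filed_per_day": 1.05,
    "internet_connected_humans": 1.03,
    "global_gdp": 1.03,
    "drugs_in_clinical_pipeline": 1.06,
}

def composite_growth(factors: dict[str, float]) -> float:
    """Geometric mean of the per-metric annual growth factors:
    how much the composite 'progress density' multiplies per year."""
    return math.exp(sum(math.log(f) for f in factors.values()) / len(factors))

def equivalent_days(reference_year: int, base_year: int = 2026) -> float:
    """Days in `reference_year` carrying the same progress as 1 day in
    `base_year`, assuming the composite grows at a constant annual rate."""
    return composite_growth(ANNUAL_GROWTH) ** (base_year - reference_year)

def implied_annual_rate(equiv_days: float, years_back: int) -> float:
    """Back out the constant annual growth factor a quoted equivalence implies."""
    return equiv_days ** (1.0 / years_back)

if __name__ == "__main__":
    print(f"composite annual growth: {composite_growth(ANNUAL_GROWTH):.3f}x")
    print(f"1 day in 2026 ~ {equivalent_days(2000):.0f} days in 2000 (placeholder rates)")

    # What the quoted figures imply if growth were a single constant exponential:
    print(f"'19 days in 2000'   -> ~{(implied_annual_rate(19, 26) - 1) * 100:.0f}%/yr since 2000")
    print(f"'1.6 years in 1900' -> ~{(implied_annual_rate(1.6 * 365, 126) - 1) * 100:.0f}%/yr since 1900")
```

Even this crude version makes one thing obvious: the two equivalences imply very different constant rates (roughly 12%/yr since 2000 vs roughly 5%/yr since 1900), so a useful model of progress density probably needs an accelerating, time-varying rate rather than a single exponential.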











