Gregor
@GregorMakes
230 posts

Current hyperfocus: tinkering with Healthtech, Apps, Mobility, AI. I am home in "Layer 5". Projects: LongResearch (https://t.co/9G3M1I0CSb), Infrahub (B2G), Kalo

Berlin · Joined March 2026
99 Following · 34 Followers
bchoor
bchoor@iambchoor·
@ClaudeDevs Doesn’t work for me, didn’t get the extra 50%. I’m on Max. I’m still at 100% so I switched to Codex until it resets. And I’m really liking GPT-5.5; it doesn’t need skills to work right.
bchoor tweet media
15
2
87
19.5K
ClaudeDevs
ClaudeDevs@ClaudeDevs·
Claude Code weekly limits are increasing 50%, now through July 13. Live now for all Pro, Max, Team, and seat-based Enterprise users.
ClaudeDevs tweet media
1.3K
2K
21.9K
2.5M
Gregor
Gregor@GregorMakes·
Can't hide it: I've figured out Boris Cherny's Claude Code autoloop git management for my projects.
Gregor tweet media
0
0
1
36
Gregor
Gregor@GregorMakes·
@itsolelehmann This only matters if you can’t collaborate and need to think in national silos.
0
0
1
19
Ole Lehmann
Ole Lehmann@itsolelehmann·
I'm German. Germany's ENTIRE AI data center capacity is less than 1/2 of just one site being built in Texas.

We have 530 megawatts of AI data center capacity in the entire country. The US has 8.2 gigawatts. That's 15x more compute in a country with only 4x the people. Per German, the US has roughly 4x the AI infrastructure. One university computer at MIT is 4x faster than Germany's most important commercial AI facility.

The obvious reaction here is "so what, German companies can just rent compute from AWS." But that's the same logic Germany applied to Russian gas for two decades. Roughly 70% of German enterprise AI today runs on American cloud providers like AWS, Microsoft, and Google. Which means it runs under American law. Every AI tool running in German hospitals, courts, ministries, banks, and factories sits on a foreign platform.

Here's why this can actually become problematic. Imagine these scenarios:
> The next GPU generation launches and American companies get access first because they own the data centers. German firms wait 12 months and pay 2-3x more for what's left.
> A frontier AI model gets released and US export controls block it from being deployed in Germany. SAP and Siemens watch American competitors integrate it for a year before they can.
> And in the worst case, a US president decides to use AI access as leverage in a trade dispute. German companies get cut off from the models their American competitors are still running.

All of them are compounding problems that will negatively impact the German economy (and everyone's standard of living, jobs, etc.). None of this is hypothetical:
> The US pulled Starlink as leverage with Ukraine in March 2025
> Chip exports to China have been throttled for three years
> And the CLOUD Act lets the US demand any data stored by American cloud providers (even when the customer is a German company and the servers are physically in Germany)

Germany doesn't have an answer for any of those scenarios today because the infrastructure that would make those answers possible isn't built yet.

Now look at why this is actually happening on the ground. In the last 3 months Germany rejected 3 AI data center projects in a row:
> Groß-Gerau, February: Vantage Data Centers, €2.5 billion, 174 MW. Voted down 18-14 by the local council
> Maintal: EdgeConnex, €1 billion, 170 MW. Blocked over a backup gas generator the developer needed because grid connections in Germany take 7-10 years and a data center is built in 2
> Freyenstein, Brandenburg, April: 700 MW AI campus. Killed by protests before construction

€3.5 billion in AI infrastructure turned away in one quarter.

And the situation is more urgent than it looks because compute is getting harder to access, not easier. NVIDIA's Blackwell GPUs are already allocated through the second half of 2027. The American hyperscalers locked in the bulk of new production with forward orders placed in 2025. TSMC's advanced packaging lines (the actual bottleneck) are sold out through 2026. Germany has no hyperscaler of its own. That means German industry sits at the back of the queue, and the gap compounds every quarter that goes by.

Where Germany is falling short right now comes down to three things:
> Public backlash, because the case for what AI data centers actually do for a country has never been made to the people voting on them
> Industrial electricity at €0.16-0.18 per kWh vs about $0.08 in Texas. For a 1 GW campus that's $700-900 million extra per year just for power
> Grid connections taking 7-10 years for large facilities when the data center itself is built in 2. No serious operator runs on math where the wait is longer than the build

And the first one is the biggest. Electricity policy and grid timelines are fixable. Public consent isn't, until someone makes the case that this infrastructure isn't nice-to-have. It's the foundation everything else runs on.

The average person only feels the downside (noise, rising electricity cost, terror attack vector). We have a big messaging and marketing problem around data centers and why they are critical for everyone's future.

Germany still has the foundation to win this if it moves now. Germany adopted its first national data center strategy in March 2026: 28 concrete measures, annual progress reports, doubling overall capacity and quadrupling AI capacity by 2030. The plan exists. The Industriestrompreis launched on January 1st of this year. It targets 5 cents per kWh for half of an industrial user's annual consumption. If data centers get cleanly pulled into that framework, the electricity cost gap with Texas gets significantly smaller. Deutsche Telekom turned on 10,000 NVIDIA Blackwell GPUs in Munich in Q1. One facility increased Germany's available AI compute by roughly 50% overnight.

And the demand is already domestic. SAP, Siemens, BMW, BASF. The German industrial anchors that benefit most from AI are German companies. The customers are at home; the infrastructure should be at home too.

And this is the thing that most people forget. Germany won the second industrial revolution. By 1900 German chemical output had passed Britain's, Siemens was wiring the world, and BASF and Bayer were inventing industries that didn't exist before they built them. The companies that came out of those decisions are still the largest employers in Germany 130 years later. Germany sat out the third industrial revolution, the software one, and that was survivable because software didn't run factories. But AI runs factories. It runs hospitals, logistics, courts, and financial markets. This one is infrastructure in the same category as railways and chemical plants.

The plan is written and the money is ready. The only question left is whether the country will let it get built. There's a lot of work left to do, but I'm staying optimistic.
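The "$700-900 million extra per year" figure in the thread checks out as simple arithmetic. A minimal sketch, assuming 24/7 utilization and treating EUR and USD as roughly at par:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours, assuming the campus runs 24/7

def annual_extra_power_cost(capacity_gw, price_high_per_kwh, price_low_per_kwh):
    """Extra yearly power bill for a campus paying price_high instead of price_low."""
    kwh_per_year = capacity_gw * 1_000_000 * HOURS_PER_YEAR  # 1 GW = 1e6 kW
    return kwh_per_year * (price_high_per_kwh - price_low_per_kwh)

# 1 GW campus, German industrial rates (0.16-0.18/kWh) vs Texas (~0.08/kWh):
low = annual_extra_power_cost(1, 0.16, 0.08)   # ~700 million per year
high = annual_extra_power_cost(1, 0.18, 0.08)  # ~876 million per year
```

With currency conversion the spread lands inside the quoted $700-900 million range.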
Ole Lehmann tweet media
160
185
894
64.5K
Louise de Sadeleer
Louise de Sadeleer@LouiseDSadeleer·
my biggest flex is that this was my first month on YouTube with @TellaHQ. Wanna know how I did this?
Louise de Sadeleer tweet media
20
6
125
14.2K
Gregor
Gregor@GregorMakes·
@ClaudeDevs Nope, hasn’t increased here either (Max 5x, Germany)
Gregor tweet media
3
0
5
4K
ClaudeDevs
ClaudeDevs@ClaudeDevs·
Details:
- Applies everywhere you use Claude Code — CLI, IDE extensions, desktop, and the web
- Live now, runs through July 13 at 6PM PDT / 1AM GMT
- Nothing to opt into, it’s already applied to your account
- This stacks with the 2x increase to 5-hour limits announced last week
74
34
1.4K
168.4K
Vlad Sazonau
Vlad Sazonau@vladsazonau·
I want to connect with people who are into:
➡️ LLMs
➡️ AI workflows & automation
➡️ Building with Cursor/Claude

Drop your project below or just say hi 👋
Vlad Sazonau tweet media
54
2
39
1.9K
Gregor
Gregor@GregorMakes·
@liamsLCjourney Afaik, subagents get a cleaned context limited to their specific need. When they are done they automatically clear too.
0
0
0
10
Liam's LC/ME Journey
Liam's LC/ME Journey@liamsLCjourney·
@GregorMakes I'm curious why it's better for token burn - is it that each agent isn't dealing with the entire context of the previous prompt?
1
0
0
20
Gregor
Gregor@GregorMakes·
100% of code is written by Claude for months now; that's what Boris Cherny told us last week. I studied and rebuilt the Claude Code automator method from him and Jarred Sumner from their live session at the Claude Code dev conference, and it immediately 10x'd my coding output. Here is what they do and how you can do it too (drop me a comment for the skill):
Gregor tweet media
1
0
0
87
Gregor
Gregor@GregorMakes·
I have all this packaged up as a skill. Will DM this to whoever is interested. <3
0
0
0
59
Gregor
Gregor@GregorMakes·
With this method this is how you build your projects:

1/ Tell one Claude Code session "bug report: X" or "feature request: Y" — it files a structured GitHub issue from the matching template.
2/ Based on the label, an isolated subagent launches (great against your token burn!): fix-bug-worker for bugs, implement-issue-worker for features. Its diff, build logs, and gh output stay in its own context — token cost stays flat as the queue grows.
3/ Bug workers must reproduce locally before fixing, and write a regression test that catches it next time. Then they branch, implement, open a PR, and fix their own red CI until green.
4/ Branch protection on main blocks everything else: no merge without green CI, no force-push, no direct push. Even admins.
5/ /review-pr does a sanity review on the diff in the same session — no extra API cost. You click merge. Only human step.
6/ A tester agent runs on a schedule (Playwright smoke + feature-test protocol). Finds regressions, files new bug issues — the loop closes itself.

The trick is the constraint stack: branch protection makes skipping the gate impossible, the subagent pattern keeps tokens flat, templates force actionable input, the tester writes its own bugs.
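The label-to-worker dispatch in step 2 can be sketched in a few lines. This is my illustration of the pattern, not the actual skill; the label names are assumptions, only the worker names come from the thread:

```python
# Map GitHub issue labels to the subagent that handles them.
# Worker names are from the thread; the label keys are assumed.
WORKER_FOR_LABEL = {
    "bug": "fix-bug-worker",
    "feature": "implement-issue-worker",
}

def pick_worker(labels):
    """Return the worker for the first recognized label, or None.

    Each worker runs in its own isolated context, so its diff and
    build logs never leak into the main session — the reason token
    cost stays flat as the issue queue grows.
    """
    for label in labels:
        worker = WORKER_FOR_LABEL.get(label)
        if worker is not None:
            return worker
    return None
```

An issue labeled only "question" would fall through and get no worker, which keeps the loop from burning tokens on non-actionable input.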
2
0
0
96
Gregor
Gregor@GregorMakes·
@JackHadfield14 Hard to be impolite when cortisol is your enemy No. 1.
0
0
2
147
Jack | amatica health
Jack | amatica health@JackHadfield14·
HIV activists ACT UP seized the FDA. They dumped patients ashes on the White House lawn. They marched coffins through DC. And it worked. Long Covid patients are being told to wait politely. Polite has never moved the world on a health crisis.
24
314
1.3K
18.7K
Ole Lehmann
Ole Lehmann@itsolelehmann·
i feel intense fomo not owning any sandisk probably means it's gonna pullback shortly lmao
4
1
10
2.6K
Gregor
Gregor@GregorMakes·
@cfs_research @mecfsskeptic With 4-5 prominent subgroups, 160 patients means n=32 for each subgroup. Way too small. Should have prefiltered for PEM and neuroinflam and then used n=100 for that specific target group.
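The arithmetic behind the reply, as a quick sketch (even split across subgroups assumed):

```python
def per_subgroup_n(total_patients, subgroups):
    """Patients per subgroup when a trial cohort splits evenly."""
    return total_patients // subgroups

per_subgroup_n(160, 5)  # 32 patients per subgroup
```

With only 32 patients per arm-within-subgroup, any subgroup-specific effect is easily averaged away in the pooled analysis, which is the point being made.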
0
0
0
12
Gregor
Gregor@GregorMakes·
@cfs_research @mecfsskeptic Just read that their results are leading them to further analyse for subgroups, since that's what the data seems to show. I'd guess exactly that. They measured the wrong thing (fatigue, not PEM) and averaged out subgroups. Which is what we could have told them from the start. 🫠
2
0
0
24
Gregor
Gregor@GregorMakes·
@cfs_research @mecfsskeptic But.. isn't "not being suitably powered" why Luis Nacul + Others were asking themselves these questions?
Gregor tweet media
1
0
0
30
ME/CFS Research
ME/CFS Research@cfs_research·
@GregorMakes @mecfsskeptic If the trial is suitably powered, there will still be a difference in the combined group. The only reason it is being tested is because of anecdotal reports, but we know that those are subject to the placebo effect.
1
0
0
35