Hans Gruber’s Parachute

880 posts


@jon_duffy

I am a figment of my own imagination. LLMs, Generative AI, Productivity, Growth, Progress. https://t.co/6JoILZK1Ry

Joined May 2008
1.1K Following · 240 Followers
Hans Gruber’s Parachute retweeted
Ruth Husko
Ruth Husko@dank_ackroyd·
Hello British Transport Police? I’ve seen something that doesn’t look right
Ruth Husko tweet media
22
124
5.6K
133.6K
Hans Gruber’s Parachute retweeted
Pablo Senabre
Pablo Senabre@pablosenabre·
The last thing I expected from my week was realizing that my professor has been putting prompt injection into the PDFs for the practical assignments to catch the people doing them entirely with AI… and that absolutely everyone was falling for it
57
272
11.7K
1M
Hans Gruber’s Parachute retweeted
Boris Cherny
Boris Cherny@bcherny·
👋 1h prompt cache is nuanced actually. It costs more for cache writes, and less for cache reads. Whether you benefit from cheaper cache reads depends on your usage pattern -- context window size, whether the query is the main agent or subagent, etc.

We have been testing a number of heuristics to give subscribers better prompt cache hit rates, which means lower token usage and lower latency, when it works. But this effect is far from uniform due to the nuance above. Say you use 1h cache for an agent, but only used the agent to make a single query -- in this case the 1h cache would be wasted and you'd be overcharged.

At this point we have rolled out 1h prompt cache by default in a number of places for subscribers to optimize cache duration based on real usage patterns, but we actually keep it at 5m for many queries also (eg. subagents, which are rarely resumed, so you'd be paying for them even though they do not benefit from 1h). We also are not defaulting API customers to 1h yet -- this needs more testing to make sure it's a net improvement on average.

Separately, when we do this kind of experimentation, we use experiment gates that are cached client-side. When you turn off telemetry we also disable experiment gates -- we do not call home when telemetry is off -- so Claude reads the default value, which is 5m. We will soon be changing the client-side default to 1h for a few queries, since we now feel good that it is a small token savings on average for those queries. We will also give you env vars to force 1h and 5m.

In any case, the token savings is nowhere near 12x unfortunately. It is a small win though, that we have been in the process of rolling out to everyone. Hope the explanation helps. More here: platform.claude.com/docs/en/build-…
44
53
1K
307K
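The economics Boris describes can be sketched numerically. The multipliers below are assumptions based on Anthropic's published prompt-caching pricing at the time of writing (5m cache writes at 1.25× the base input price, 1h writes at 2×, cache reads at 0.1×); the context size and turn counts are made up for illustration.

```python
# Illustrative break-even sketch for 5m vs 1h prompt caching.
# Assumed multipliers (from Anthropic's published pricing):
#   5m cache write: 1.25x base input price, 1h write: 2x, read: 0.1x.
# Base input price is normalized to 1.0 per token.

def cost(cached_tokens: int, follow_up_reads: int, write_mult: float,
         read_mult: float = 0.1) -> float:
    """Total input cost: one cache write plus N cached follow-up reads."""
    return cached_tokens * (write_mult + follow_up_reads * read_mult)

context = 100_000  # tokens held in the prompt cache (assumed)

# A single query that is never resumed: the pricier 1h write is pure loss.
single_5m = cost(context, follow_up_reads=0, write_mult=1.25)
single_1h = cost(context, follow_up_reads=0, write_mult=2.0)

# A long-running agent whose turns outlive the 5m window: every turn
# after a 5m expiry pays the write premium again, while the 1h cache
# keeps serving cheap reads.
turns = 10
resumed_5m = turns * cost(context, follow_up_reads=0, write_mult=1.25)
resumed_1h = cost(context, follow_up_reads=turns - 1, write_mult=2.0)

print(f"single query : 5m={single_5m:,.0f}  1h={single_1h:,.0f}")
print(f"{turns} slow turns: 5m={resumed_5m:,.0f}  1h={resumed_1h:,.0f}")
```

Under these assumptions the 1h cache costs 60% more for a one-shot query but roughly a quarter as much across ten slow turns, which matches the tweet's point that subagents (rarely resumed) should stay at 5m while resumed main-agent sessions benefit from 1h.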
Hans Gruber’s Parachute retweeted
Bogáta Timár
Bogáta Timár@BogataTimar·
okay I guess I have to talk about Péter Magyar here. Let me just start with saying, in a very unladylike way, that you guys seem to have zero clue what happened in Hungary in the last two years, you completely miss the point, and you're a disappointing bunch. Let's go.
502
4.7K
22.2K
2.7M
Hans Gruber’s Parachute
Hans Gruber’s Parachute@jon_duffy·
I wonder if @anthropic @claudeai now asking this quite regularly is linked to the rollout of memory: having visibility of all the recent work that is added to memory and preserved between sessions...
Hans Gruber’s Parachute tweet media
0
0
0
7
Hans Gruber’s Parachute retweeted
Alec Stapp
Alec Stapp@AlecStapp·
Thinking about this Bono quote again
Alec Stapp tweet media
129
1.4K
11.8K
1.6M
Fleur Elizabeth
Fleur Elizabeth@fleurmeston·
I was very restrained
Fleur Elizabeth tweet media (two images)
23
1
232
16.1K
Hans Gruber’s Parachute retweeted
“paula”
“paula”@paularambles·
anthropic: “we have finished training the ultimate god model exposing zero-day vulnerabilities in all software including linux and ffmpeg and we also made ten billion dollars last month” openai: “we have acquired TBPN”
48
96
3.6K
196.9K
Hans Gruber’s Parachute retweeted
FFmpeg
FFmpeg@FFmpeg·
@aakashxsh They did
24
126
4K
144.2K
Hans Gruber’s Parachute retweeted
Trevor Levin
Trevor Levin@trevposts·
I did the math a couple weeks ago and it turns out a vegan prompting a frontier LLM *every second, 24/7* consumes less water than the average omnivore who never uses AI.
mary@theoceanblooms

people are trying to argue with this and it’s literally the truth. a kilogram of beef requires over 15,000 litres of water to produce. a vegan who uses chatgpt every day is living a more sustainable lifestyle than someone who regularly eats beef while boycotting AI.

30
204
3K
327K
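The claim above is back-of-envelope checkable. The 15,000 L/kg beef figure comes from the quoted tweet; the per-query water cost and annual beef consumption below are loudly assumed placeholders (published per-query estimates vary by orders of magnitude), so treat this as a sketch of the arithmetic, not a verified result.

```python
# Back-of-envelope check of the water-footprint comparison.
# Assumed inputs:
#   - water per LLM query: 2 mL (a guess; published estimates vary widely)
#   - water footprint of beef: 15,000 L/kg (figure from the quoted tweet)
#   - annual beef consumption: 26 kg/person/year (rough US-style figure)

SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000
WATER_PER_QUERY_L = 0.002            # assumed 2 mL per prompt
BEEF_WATER_L_PER_KG = 15_000         # from the quoted tweet
BEEF_KG_PER_YEAR = 26                # assumed

# Prompting once per second, every second of the year:
prompting_water = SECONDS_PER_YEAR * WATER_PER_QUERY_L   # ~63,000 L/yr
# Water embedded in a year of beef alone:
beef_water = BEEF_KG_PER_YEAR * BEEF_WATER_L_PER_KG      # 390,000 L/yr

print(f"prompting every second, all year: {prompting_water:,.0f} L")
print(f"beef alone at 26 kg/yr:           {beef_water:,.0f} L")
```

Under these assumptions, nonstop prompting comes to roughly a sixth of the water footprint of beef consumption alone, which is the direction of the tweet's claim; different per-query estimates shift the ratio but not easily the sign.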
Hans Gruber’s Parachute retweeted
rohit
rohit@krishnanrohit·
Was looking for buildings inspired by Escher and came across this Chongqing bookstore.
rohit tweet media
22
122
1.7K
56.8K
Hans Gruber’s Parachute retweeted
Haseeb >|<
Haseeb >|<@hosseeb·
This is wild. Google Research demonstrates a ~20x more efficient implementation of Shor's algorithm that could break ECDSA keys within minutes with ~500K physical qubits. Google is now more confident in a 2029 post-quantum transition. We are no longer looking at the mid 2030s; we could have quantum computers of this scale by the end of the decade.

They believe this result is so severe that they are not publishing the actual circuits. They instead published a ZKP proving that they know of a quantum circuit with these properties. This is very atypical, showing Google thinks this is serious shit.

All blockchains need a transition plan ASAP. Post-quantum is no longer a drill.
Haseeb >|< tweet media
nic carter@nic_carter

Many are wondering "what Google saw" that caused them to revise their post-quantum cryptography transition deadline to 2029 last week. It was this: research.google/blog/safeguard…

261
625
5K
1M