AlphaMonad 🦇🔊🏴‍☠️

1.1K posts

@AlphaMonad

Dev || musician || full-time bagholder || meditator || imaginary entity || https://t.co/gEpaWn7ws1

Joined April 2012
712 Following · 356 Followers
AlphaMonad 🦇🔊🏴‍☠️
@TheVixhal i would argue it splits into specify, implement, verify. models are pretty good at implementing if (big if) you have a solid spec. once the AI has implemented something, how do you know the implementation conforms to the spec? everyone seems to be ignoring this
1 · 0 · 2 · 587
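One concrete answer to the verify question in the tweet above: keep the spec around as an executable predicate and check the AI's implementation against it on many generated inputs. A minimal sketch with a toy sorting spec (every name here is hypothetical, stdlib only):

```python
import random
from collections import Counter

# Hypothetical spec for a sorting routine, written as a predicate rather
# than an algorithm: the output must be a permutation of the input, in
# non-decreasing order.
def conforms_to_spec(xs, ys):
    return Counter(xs) == Counter(ys) and all(a <= b for a, b in zip(ys, ys[1:]))

# Stand-in for the AI-written implementation under test.
def implementation(xs):
    return sorted(xs)

# The verify step: check the implementation against the spec on many
# randomly generated inputs (fixed seed keeps runs reproducible).
def verify(trials=1000):
    rng = random.Random(0)
    for _ in range(trials):
        xs = [rng.randint(-100, 100) for _ in range(rng.randint(0, 20))]
        if not conforms_to_spec(xs, implementation(xs)):
            return False
    return True
```

This catches conformance bugs mechanically, but only over the inputs sampled; the spec predicate itself still has to be written and understood by a human.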
vixhaℓ @TheVixhal
Software engineering has always contained two distinct modes of work. The first is developmental: taking a clearly specified concept and translating it into a reliable, working system. This is no longer the bottleneck. AI tools like Claude Code and Codex have effectively solved it. The second mode is research. Here, the problem itself is undefined. The task is not to implement a solution, but to discover what the solution should be: new abstractions, algorithms, architectures, and ways of reasoning about computation. This layer resists automation because it depends on framing, taste, and deep conceptual synthesis rather than procedural construction. While AI can assist exploration, it does not yet originate the governing questions that drive genuine breakthroughs. For that reason, software engineering is unlikely to disappear. Instead, its center shifts toward the research frontier.
56 · 78 · 496 · 42.3K
AlphaMonad 🦇🔊🏴‍☠️
@MichaelArnaldi so a human still needs to understand all the code the AI wrote, or at least the tests for it? i guess what i'm getting at is that the understanding part seems to have always been the bottleneck, not writing code. some ppl seem to think the understanding can be abstracted too
0 · 0 · 3 · 222
AlphaMonad 🦇🔊🏴‍☠️
@dataalways wouldn't be all that informative without other relays supporting it, as there'd be no way of counting ties and solo wins. afaik most relays don't limit header responses to proposers only so it'd be some work to implement too.
0 · 0 · 0 · 35
AlphaMonad 🦇🔊🏴‍☠️
@dataalways unfortunately the relayscan market share metric is quite flawed because it only looks at payload publishes. to rank relay performance we should really look at "header winrate" i.e. how often the relay returned the winning bid from the header endpoint.
2 · 0 · 1 · 115
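The "header winrate" metric described above could be computed from recorded header responses roughly as follows. A sketch under assumed data shapes (the slot records and relay names are made up; real data would come from each relay's header endpoint):

```python
from collections import defaultdict

# Hypothetical per-slot records: relay name -> best bid (arbitrary units)
# that the relay's header endpoint returned for that slot.
slots = [
    {"relayA": 105, "relayB": 100},  # relayA returned the winning bid
    {"relayA": 90, "relayB": 90},    # tie: both returned the winning bid
    {"relayB": 120},                 # solo win: only one relay responded
]

def header_winrate(slot_records):
    """Fraction of slots in which each relay returned the winning (max) bid.
    Ties credit every relay that matched the top bid."""
    wins = defaultdict(int)
    for bids in slot_records:
        top = max(bids.values())
        for relay, bid in bids.items():
            if bid == top:
                wins[relay] += 1
    return {relay: count / len(slot_records) for relay, count in wins.items()}
```

Crediting every relay that matches the top bid is what makes the ties and solo wins from the neighbouring reply countable at all, which is why the metric needs multiple relays' header data to be informative.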
Sila 'n Soul @SpiritDirtbag
@AlphaMonad @4NobleTruths_ @TVachaW It goes to what Vacha said - it’s about the intention. Firefighters and bullies both rely on muscles. So it isn’t the muscles (samadhi) that are the wrong type, but the intention behind them.
1 · 0 · 1 · 20
Vacha @TVachaW
Relatedly, this is why Buddhist meditation practices (Jhana, Vipassana etc) can be dangerous if we haven’t yet developed some form of Right Intention. Jhana gives us more agency. Vipassana gives us more freedom. If we pursue misaligned goals with more agency and freedom, we cause more harm, more quickly, to ourselves and others. This applies, I believe, to non-Buddhist paths too. Clarifying our views and our values should always precede practicing our practices. Before focusing on accelerating our paths, it’s best to ensure we’re pointing in the right direction.
River Kenna @the_wilderless

if i'd had more agency when I was younger, I would have acted on some truly terrible decisions that I may never have recovered from ...which makes me suspicious of agency as the latest all-good panacea

11 · 18 · 271 · 20.2K
goodalexander @goodalexander
You really owe it to yourself to try Codex, OpenAI’s coding agent. Not only is it very good, it’s getting better with every release. It has much better attention to detail than Claude, can one-shot features, and has an effective bug-fixing mode. It will actually replace engineers
24 · 9 · 317 · 28.8K
AlphaMonad 🦇🔊🏴‍☠️
@ben_a_adams @DuckDegen does nethermind implement the flashbots_validateBuilderSubmission endpoints? for context: the screenshot i shared is total RPC roundtrip time for a load balanced set of reth nodes, doing 50-100 simulations per second.
1 · 0 · 0 · 75
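For context on roundtrip figures like the one discussed above, one way to produce them is to time each JSON-RPC call client-side and summarize the percentiles. A stdlib-only sketch (the URL, method name, and params are placeholders, not a real deployment):

```python
import json
import time
import urllib.request
from statistics import quantiles

def time_rpc_call(url, method, params):
    """Return the total roundtrip time in ms for one JSON-RPC call."""
    payload = json.dumps({"jsonrpc": "2.0", "id": 1,
                          "method": method, "params": params}).encode()
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/json"})
    start = time.perf_counter()
    urllib.request.urlopen(req).read()
    return (time.perf_counter() - start) * 1000

def summarize(latencies_ms):
    """p50/p99 roundtrip time across a batch of simulations."""
    qs = quantiles(latencies_ms, n=100)  # 99 percentile cut points
    return {"p50": qs[49], "p99": qs[98], "n": len(latencies_ms)}
```

Usage would look like `summarize([time_rpc_call(url, method, params) for _ in range(100)])`, with `method` set to whatever block-simulation endpoint the node actually exposes.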
AlphaMonad 🦇🔊🏴‍☠️
@DuckDegen i haven't tried. this is a custom RPC endpoint for block simulation though, not a normal node. also we don't really need more performance out of this, was just happy to see a version bump give some free gains.
1 · 0 · 1 · 34
Paolo Rebuffo 🦦🐛 @PaoloRebuffo
@drakefjustin Hello @drakefjustin, thank you for this wonderful post. I would like to ask you, though: what is the maths that leads the @ultrasoundmoney site to give a 23.9 Gwei gas price as the threshold for blocks that burn ether to be produced (at about a 36,000,000 gas limit)?
6 · 0 · 12 · 5K
Justin Drake @drakefjustin
My bat signal 🦇🔊 will return when ETH is ultra sound again, soon enough™. ETH supply currently grows 0.5%/year. That's 1%/year of issuance minus 0.5%/year of burn. To become ultra sound again, either issuance has to decrease or the burn has to increase. I believe both will happen, let me explain :)

ETH vs BTC

Before diving into Ethereum's issuance and burn, a quick interlude on ETH vs BTC. Internet-native money is an enormous opportunity, think tens of trillions of dollars. Monetary premium rarely accrues at scale. You need a truly attractive asset with outstanding properties for society to coordinate around. At first approximation moneyness is a zero-sum game. Gold is primed for demonetisation in the internet age. There are only two candidates to supplant it and win internet money—BTC and ETH. Nothing else comes close. IMO the determining Schelling points are credible neutrality, security, and scarcity.

Since the merge, ETH is definitely scarcer than BTC. It's remarkable BTC supply grew 666K BTC, worth $66B, all while ETH supply stayed flat. Today BTC supply grows 0.83%/year, 66% faster than ETH. And for those looking ahead, as I explain below, ETH supply is poised to decrease again.

Scarcity is important, but ultimately the fight for internet money will likely be settled by security. Ironically, the famous 21M BTC cap is to blame. BTC issuance is going to zero—that's Bitcoin's strongest social contract. In a few halvings, issuance will be so small as to be irrelevant. Here's a shocking stat: in the last 7 days only 1% of miner revenue came from Bitcoin fees. Yes, 99% came from issuance. And that's despite 4 halvings that reduced issuance by 16x, and despite 15 years of search for transactional utility on Bitcoin.

IMO the Bitcoin blockchain is cooked. It takes roughly $10B and access to 10GW to permanently 51% attack Bitcoin. The cost is peanuts for nation states. As for the power, Texas—a single state of a single country—can produce 80GW. The BTC security ratio is 200-to-1: it's a $2T asset secured by $10B of economic security. Any shortable instrument correlated to BTC mining incentivises a 51% attack. There's $20B of Bitcoin mining stocks—those would insta-nuke. There's $40B of open interest on BTC perps—direct short exposure. Not to mention potential short exposure through the $100B in ETFs and the $100B in MSTR.

Will BitVM solve the fee problem? Any BitVM bridge is an incentive to 51% attack Bitcoin. Indeed, a 51% attacker can censor fraud proofs over the challenge period and drain BitVM bridges. Ironically, BitVM is arguably a direct attack on Bitcoin. And no, Bitcoin doesn't have social slashing to recover from 51% attacks.

What if the BTC price grows by 10x, flipping gold, is Bitcoin safe then? Let's say this happens in the next 11 years. BTC would be a $20T asset but issuance would shrink 8x because of the three halvings. The security ratio would grow beyond 1000-to-1. IMO this is untenable, especially as BTC institutionalises, becomes more liquid, and ultimately becomes easier to short in size. Imagine $1T of perp open interest but just $10B of economic security.

Can Bitcoin somehow fix itself before it's too late? Bitcoin is the epitome of blockchain ossification. Can it have 1%/year tail issuance? Ha, good luck fighting the 21M cap! Maybe Bitcoin can switch to PoS and rely on minimal fees? PoS is sacrilege. Maybe Bitcoin can change to another PoW algorithm? Nope, that nuclear option won't help. Maybe Bitcoin can have big blocks and sell data availability at scale? Ser, a holy war was fought over small blocks.

If you made it this far and understood the above, congrats. Even today few appreciate how screwed Bitcoin PoW is long term and what the ramifications are for BTC the asset. This is a frontrunnable opportunity but it requires patience. The time frame is not 1 month or even 1 year—it's 10 years.

Talking about long time frames, the Lummis proposal to lock BTC for 20 years is kinda insane—Bitcoin will be smoked by then. Worse, if the US were to hold trillions in BTC it would directly incentivise US enemies to muster a 51% attack. Contrary to popular belief, Bitcoin is not remotely resistant to nation states—China and Russia can pull off a 51% attack with ease.

ETH issuance

Ok, back to ETH :) The current issuance curve is a trap. Unfortunately, like Bitcoin's issuance, Ethereum's issuance was misdesigned. It guarantees 2% tail APR, even if 100% of ETH is staked. Every rational ETH holder is incentivised to stake, as staking costs are significantly lower than 2%. We all lose when most ETH stakes:

→ ETH displacement: Liquid staking tokens like stETH and cbETH displace pristine ETH as the unit of collateral. This injects systemic risks—custodial risks, slashing risks, governance risks, smart contract risks—into the core of defi. This displacement also erodes ETH as a unit of account, with further knock-on effects to monetary premium.

→ real yields and taxes: Real yield, i.e. the yield adjusted for supply growth, decreases as more ETH stakes. When 100% of ETH stakes, all ETH holders get equally diluted. Worse, income taxes are drawn on nominal yield. It would be a tragedy of the commons for no staker to enjoy positive real yield and for all ETH holders to suffer billions of dollars per year of tax sell pressure.

IMO the issuance curve should drive discovery of a fair issuance rate through staker competition—no arbitrary 2% floor. This means the issuance curve must eventually decline and return to zero with increased ETH stake. My suggestion is "croissant issuance". Croissant issuance is a simple half-oval with two parameters:

→ soft cap: The staking fraction where issuance returns to zero. To me a 50% staking soft cap feels credibly neutral and pragmatic. In particular it's large enough to address discouragement attacks.

→ peak issuance: The theoretically-maximal issuance borne by ETH holders. An arbitrary round number like 1%/year will do, as ultimately the equilibrium rate would be market-set.

EF researchers have studied issuance for years—IMO there's rough consensus the current curve is broken and needs to change. Navigating the social layer to change issuance won't be easy. This is an opportunity for a champion to rise to the occasion and coordinate change to mainnet over the next couple of years.

ETH burn

IMO the sustainable way to burn vast amounts of ETH is to scale data availability. It's much more lucrative to have 10M TPS with each transaction paying $0.001 in DA than it is to have 100 TPS at $100/tx. Yes, the data availability supply shock from EIP-4844 that introduced blobs temporarily lowered total burn. This is the nature of supply and demand. When demand for DA catches up, expect the blobs to burn hard.

The Pectra hard fork, in a couple of months, will double blob count. The short-term goal is growth and I expect lots of it. For the next couple of years it will be a cat-and-mouse game between supply and demand as full danksharding is deployed. I wouldn't be surprised if this year we see hundreds of ETH per day of blob burn, and then that burn suddenly collapsing again with peer DAS in the Fusaka fork.

Zooming out, we're here to build infrastructure for the next decades and centuries. Fundamentals will play out over years. Whether it's Bitcoin security, ETH issuance, or the ETH burn, stay patient and have conviction :)
480 · 462 · 2.6K · 1M
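The supply arithmetic in the thread above, plus one plausible reading of the "croissant issuance" half-oval, can be sanity-checked in a few lines. The half-oval formula below is my guess at the intended shape, not a published spec; the peak and soft-cap values are the ones floated in the thread:

```python
import math

# Net supply growth = issuance - burn (both in %/year), per the thread:
# 1%/year issuance minus 0.5%/year burn = 0.5%/year growth.
def net_growth(issuance_pct, burn_pct):
    return issuance_pct - burn_pct

# BTC at 0.83%/yr vs ETH at 0.5%/yr: 0.83 / 0.5 - 1 ≈ 66% faster growth.

# Croissant issuance: a half-oval over the staking fraction, returning to
# zero at the soft cap. Assumed parameterization (hypothetical):
#   issuance(s) = peak * sqrt(1 - (2s/cap - 1)^2)  for 0 <= s <= cap
def croissant_issuance(staked_fraction,
                       peak=1.0,      # %/year, the round number in the thread
                       soft_cap=0.5): # 50% staking soft cap from the thread
    if staked_fraction >= soft_cap:
        return 0.0
    x = 2 * staked_fraction / soft_cap - 1
    return peak * math.sqrt(1 - x * x)
```

With these parameters issuance peaks at 1%/year when 25% of ETH is staked and falls back to zero at the 50% soft cap, so the market-set equilibrium would sit somewhere on the curve's declining side.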
RED3 @REDavidson3
@joodalooped Hey now. Emacs has packages for AI integration.
1 · 0 · 1 · 50
judah @joodalooped
thread of types of reactions from programmers to LLM progress. 1. the scaling law believer: it’s all over, just a matter of time. might as well enjoy these last few years. apps and UI will be a solved problem in a couple of years, we’ll maybe stick around as testing nannies
37 · 85 · 791 · 84K
AlphaMonad 🦇🔊🏴‍☠️ retweeted
Free_Ross @Free_Ross
FREEDOM!!!!
4.5K · 13.9K · 119.1K · 8.5M
AlphaMonad 🦇🔊🏴‍☠️ retweeted
Ruben Laukkonen @RubenLaukkonen
2025, the year liberation goes mainstream: "...if you walk away from this with anything, it should be that in the past few years, a breakthrough has begun sweeping across meditation research, delivering science’s first “general theory of meditation.” That means very exciting days — and more to the point, scientifically refined meditation frameworks and practices — are not too far ahead." Another beautiful write-up by @OshanJarow
11 · 39 · 232 · 19K