aupeach

43 posts

@aupeachmo

Building open source things. AI, privacy & infosec. Shipping in public.

Joined January 2026
37 Following · 4 Followers
aupeach
aupeach@aupeachmo·
Gave a talk at MLAI Melbourne last Tuesday on aigogo - I was fed up with there being no good way to reuse AI agents between projects so I built a package manager for them. Turns out other people have the same problem. Good crowd and cool meetup. github.com/aupeachmo/aigo…
0
0
1
22
aupeach
aupeach@aupeachmo·
aigogo v0.0.8 released: github.com/aupeachmo/aigo…
What's New:
- Improved handling and grouping of aigogo-controlled dependencies in your project manifest files (pyproject.toml, package.json)
1
0
1
86
aupeach
aupeach@aupeachmo·
Does anyone else find that Claude Code will sometimes corrupt their `.git/index` - if so, what did you do to resolve it?
1
0
2
101
aupeach
aupeach@aupeachmo·
I'd love any feedback or thoughts, feel free to try it and contribute!
0
0
0
5
aupeach
aupeach@aupeachmo·
So how do I get a gist-like experience in my development flow but with all the benefits of a package manager? I've built aigogo to try and answer this.
1
0
0
4
aupeach
aupeach@aupeachmo·
Hi all, Today I'm releasing a packaging tool for AI Agents I've been working on called #aigogo Build once, share everywhere. No duplication or publishing overheads. Just `aigg build`, `aigg push`, `aigg install`. Works with any Docker V2 registry. Supports Python & JS (for now). github.com/aupeachmo/aigo… 🧵 about why I started this project...
2
0
2
192
aupeach
aupeach@aupeachmo·
@_wlfp Coming very soon, until then DM me your GitHub and I will add you to the repo.
1
0
0
10
aupeach
aupeach@aupeachmo·
Distribution for aigogo is getting some love right now. I want to make it easy for people to package *and* distribute their agents and artifacts. If you have opinions on how this should work, comment below.
1
0
0
22
aupeach
aupeach@aupeachmo·
I was wondering what #moltbook is costing in terms of energy consumption and power footprint. After some research into token consumption and its environmental impact, I started wondering whether the project is worth the environmental cost versus its productivity/novelty. I don't have token counts, server specs, or power grid data, just the agent count and the message count from moltbook.com (at the time of writing approx 1.5 million agents and 300,000 combined posts + comments). I think I can work with this though...

THE HONESTY PART - every estimate has a range:
- All assumptions low = ~5.3 kg CO₂
- All assumptions high = ~720 kg CO₂
- Our central: ~62 kg
That's about two orders of magnitude of uncertainty.
Low: 200 tokens, 0.0004 Wh/token, PUE 1.1, 200 gCO₂/kWh
High: 800 tokens, 0.003 Wh/token, PUE 1.6, 600 gCO₂/kWh

SOURCES
- de Vries (2023), Joule: AI energy footprint projections
- Luccioni et al. (2024), FAccT: inference energy benchmarks
- Epoch AI (2025): updated per-query estimates
- Uptime Institute (2024): global PUE data
- IEA (2025): global carbon intensity: 445 gCO₂/kWh
- Ember (2025): country-level electricity data
0
0
0
15
aupeach
aupeach@aupeachmo·
@karpathy RSS is a wonderful thing and the more people talking about it the better!
0
0
0
198
Andrej Karpathy
Andrej Karpathy@karpathy·
Finding myself going back to RSS/Atom feeds a lot more recently. There's a lot more high-quality longform and a lot less slop intended to provoke. Any product that happens to look a bit different today but that has fundamentally the same incentive structures will eventually converge to the same black hole at the center of the gravity well. We should bring back RSS - it's open, pervasive, hackable. Download a client, e.g. NetNewsWire (or vibe code one). Cold start: as an example of getting off the ground, here is a list of 92 RSS feeds of blogs that were most popular on HN in 2025: gist.github.com/emschwartz/e6d… Works great and you will lose a lot fewer brain cells. I don't know, something has to change.
544
928
9.2K
1.3M
aupeach
aupeach@aupeachmo·
I think it's giving a lot of people a quantifiable boost in their productivity, which is great, and I've seen some interesting use cases. But the project is moving very fast, and people seem to be giving it access to a lot of their systems and data, creating the "lethal trifecta" simonwillison.net/2025/Jun/16/th… (maybe without fully realising the implications). I'm still playing with it in my sandbox and will share more as I go; my approach is to start small, understand, and then add more. There's a big community around it which I'm starting to dabble in as well.
0
0
1
16
aupeach
aupeach@aupeachmo·
Today I was wondering what #moltbook is costing in terms of energy consumption and power footprint. I don't have token counts, server specs, or power grid data, just the agent count and the message count from moltbook.com (at the time of writing approx 1.5 million agents and 300,000 combined posts + comments). I think I can work with this though...
1
1
1
54
aupeach
aupeach@aupeachmo·
SOURCES
- de Vries (2023), Joule: AI energy footprint projections
- Luccioni et al. (2024), FAccT: inference energy benchmarks
- Epoch AI (2025): updated per-query estimates
- Uptime Institute (2024): global PUE data
- IEA (2025): global carbon intensity: 445 gCO₂/kWh
- Ember (2025): country-level electricity data
0
0
0
27
aupeach
aupeach@aupeachmo·
THE HONESTY PART - every estimate has a range:
- All assumptions low = ~5.3 kg CO₂
- All assumptions high = ~720 kg CO₂
- Our central: ~62 kg
That's about two orders of magnitude of uncertainty.
Low: 200 tokens, 0.0004 Wh/token, PUE 1.1, 200 gCO₂/kWh
High: 800 tokens, 0.003 Wh/token, PUE 1.6, 600 gCO₂/kWh
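The arithmetic behind these bounds can be sketched in a few lines. This is a back-of-envelope check using only the numbers quoted in the thread; taking the geometric mean of the bounds as the central figure is my reading of how ~62 kg falls between them, not something the post states.

```python
# Back-of-envelope CO2 estimate for moltbook, using only the public
# message count and the assumption ranges quoted in the thread.

MESSAGES = 300_000  # combined posts + comments from moltbook.com at time of writing

def co2_kg(tokens_per_msg, wh_per_token, pue, grid_g_per_kwh):
    """Messages -> tokens -> Wh -> kWh (with PUE overhead) -> kg CO2."""
    kwh = MESSAGES * tokens_per_msg * wh_per_token * pue / 1000
    return kwh * grid_g_per_kwh / 1000

low  = co2_kg(200, 0.0004, 1.1, 200)  # ~5.3 kg
high = co2_kg(800, 0.003,  1.6, 600)  # ~691 kg (the post rounds to ~720)
central = (low * high) ** 0.5         # geometric mean of the bounds, ~60 kg

print(f"low ≈ {low:.1f} kg, central ≈ {central:.0f} kg, high ≈ {high:.0f} kg")
```

Note that the spread is driven almost entirely by the per-token energy figure (0.0004 vs 0.003 Wh/token, a 7.5× range), which is why the bounds end up two orders of magnitude apart.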
2
0
0
19