Bitduke

47.2K posts

@bitcoinduke

It all comes down to stacking sats | exploring @polymarket | @Lighter_xyz supporter | claw agent @thebutlershipd

Won’t trap me · Joined November 2013
1.2K Following · 2.8K Followers
Bitduke
Bitduke@bitcoinduke·
@tempo nice integration, didn’t notice it before
Tempo
Tempo@tempo·
Rabby Wallet is live on Tempo
With native support for Tempo Transactions, Rabby users get features including:
→ Approve & swap in a single transaction
→ Fees paid in any stablecoin
→ Native fee sponsorship
Rabby Wallet@Rabby_io

Rabby now offers full native support for the Tempo chain. @tempo
Unlock Tempo-native smart transaction features without extra authorization:
🔁 Batch Approve & Swap in a single transaction
⛽ Native Gas Sponsorship
🪙 Pay gas with any token

Kaito
Kaito@KaiXCreator·
Be Honest: Can you CODE without AI...?
TylerD 🧙‍♂️
TylerD 🧙‍♂️@Tyler_Did_It·
A security team used Claude Mythos to exploit Apple's macOS - in 5 days of dev time 🥶🥶
TylerD 🧙‍♂️
TylerD 🧙‍♂️@Tyler_Did_It·
It appears market conditions are just fine for AI IPOs $CBRS
xK0neko | LinaBell
xK0neko | LinaBell@xK0neko·
The biggest alpha for @Polymarket ? Be a community member! Find a group of people that are sharp and you enjoy working with especially if you're new. You will learn much faster with a team and hopefully you can get insight or counter points. You can share larger markets once you have reached your sizing limit and others will too. Even better if you find friends that have different niches.
Andy
Andy@andyyy·
Discuss.
ClaudeDevs
ClaudeDevs@ClaudeDevs·
Useful tip to cut time-to-first-token on longer prompts in the API: pre-warm the prompt cache. Send your system prompt before the user prompt. Claude writes it to the cache, but skips generating any output. When the real user request lands, it'll hit a warm cache.
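The pre-warming idea above can be sketched against Anthropic's Messages API, which exposes prompt caching via a `cache_control` field on system blocks. This is a minimal sketch, assuming the trick described in the post: fire one cheap request to write the long system prompt into the cache, then send the real request with an identical prefix. `LONG_SYSTEM_PROMPT` and the model id are illustrative stand-ins.

```python
# Hypothetical sketch of pre-warming the prompt cache. The cache_control
# field is the documented Messages API mechanism; the warm-up call itself
# is the trick from the post, not an official recommendation.

LONG_SYSTEM_PROMPT = "You are a support agent. " * 200  # stand-in for a big prompt

def build_request(user_text: str, max_tokens: int) -> dict:
    """Build a Messages API payload whose system block is marked cacheable."""
    return {
        "model": "claude-sonnet-4-5",  # assumed model id
        "max_tokens": max_tokens,
        "system": [{
            "type": "text",
            "text": LONG_SYSTEM_PROMPT,
            # marks this block as a prompt-cache breakpoint
            "cache_control": {"type": "ephemeral"},
        }],
        "messages": [{"role": "user", "content": user_text}],
    }

# 1) Warm-up call: tiny max_tokens, so almost no output is generated.
warmup = build_request("ping", max_tokens=1)
# 2) Real call: identical system prefix, so it should hit the warm cache.
real = build_request("Summarize my last three tickets.", max_tokens=1024)

# With the official SDK this would be sent as:
#   client = anthropic.Anthropic()
#   client.messages.create(**warmup)  # writes the cache
#   client.messages.create(**real)    # reads it (lower time-to-first-token)

# The cache hit depends on the cached prefix being byte-identical.
assert warmup["system"] == real["system"]
```

The key constraint is that the cached prefix must match exactly between the warm-up and the real request; any edit to the system prompt invalidates the cache entry.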
Bitduke retweeted
pash
pash@pashmerepat·
Your ChatGPT subscription now powers an OpenClaw agent that genuinely feels magical to talk to. Previous OpenClaw releases had OpenAI models running, but they never quite let the models reach their full potential. That changes today. Personality is now deliberate, tool calls land exactly where they should, and your agent actually follows through on what it says it will do.

OpenClaw is now running on top of the Codex harness by default. In handing the inner loop to OpenAI's native Codex harness, we eliminated the conflicting instructions and duplicate tools that used to make the model hesitate.

What we stripped out under the hood:
- Duplicate tools (no more guessing between Codex native vs OpenClaw versions)
- Conflicting instructions (no more NO_REPLY vs message tool ambiguity)
- Leaked context (heartbeat logic only appears on actual heartbeat turns)

Less context bloat. More room for the agent to think.

And here's what we inherited for free, thanks to the Codex App Server:
- Searchable dynamic tools. Roughly 5,500 fewer upfront tokens per turn, which means faster and cheaper.
- Auto-Review mode using the built-in Codex guardian.
- OpenAI's native plugins (Calendar, Email, Drive) running in the same thread.

For you, the result is a personal agent that actually feels personal. It picks up where you left off across any channel, handles things before they hit your radar, and only breaks your flow when it has something genuinely worth showing you. For developers, the result is stability. Because the inner loop runs on OpenAI’s native Codex harness, every upstream improvement lands in your agent automatically.

To get started, paste this in terminal:

> openclaw onboard

That is the whole setup.
Peter Steinberger 🦞@steipete

We've been working really hard on performance, reliability, security, and stability. Invented whole new automation flows with crabbox, automated video QA and are spending insane amounts of CPU cycles on CI. It's a good release.

Bitduke retweeted
Atlantis liquidity
Atlantis liquidity@Atlantislq·
no way profile views actually matter for the airdrop
head of growth keeps dropping small hints about the criteria
maybe this is the “???” from the deleted tweet I posted earlier
OGs and good traders are already getting profile views on Polymarket
another reason to put your Polymarket profile in bio. otherwise why even add this metric?..
Bold
Bold@boldleonidas·
Bitduke
Bitduke@bitcoinduke·
nah, the opensea comparison doesn’t really hold up.
opensea is a marketplace for a very narrow, very new nft market that mostly depended on one bubble
polymarket is basically betting on real-world events, and that behavior has been alive forever like, sports, macro, geopolitics, crypto, etc.
the valuation could look crazy, but the underlying demand feels way broader than "please trade jpegs here"
Beanie
Beanie@beaniemaxi·
Polymarket will be this cycle's OpenSea.
Permanently delayed token launch.
Huge raises at 11 figure valuations.
Product category name definer.
Perpetually buggy and often broken user interface.
No real technological moat.
No sustainable revenue model.
When bubble pops so will it.
Bitduke retweeted
flip
flip@trevor_flipper·
it is pretty wild that arguably a top 3 team in crypto by engineering talent density cant catch a bid bc of how bad their initial pa was

^ now you have to go sell and do ir for quarters on end to build trust with the hopes of a few analysts thinking the chart has bottomed and taking a punt

^^ we have seen same thing happen in tradfi and i think applovin is the best example
Citrini
Citrini@citrini·
Have gotten 4 different calls today from funds who are watching CBRS trade on @tradexyz for price discovery, pretty surreal
Bitduke retweeted
The Humanoid Hub
The Humanoid Hub@TheHumanoidHub·
Figure just set a new standard in unedited humanoid demos. They completed 24 hours of autonomous work on a live stream, where three Figure 03 robots took turns processing delivery packages. At a rate of ~21 packages/minute, that is less than 3 seconds per package.
Figure@Figure_robot

Day 2 is Live: Watch humanoid robots Bob, Frank, and Gary running 24/7. This is fully autonomous running Helix-02 x.com/i/broadcasts/1…

Bitduke retweeted
Jason Ginsberg
Jason Ginsberg@JasonBud·
Grok Build is a fully interactive CLI, which means you can actually use your mouse to click. No flickers. Especially useful as I find myself running 5+ agents at a time and jumping between plans.
xAI@xai

An early beta of Grok Build, an agentic CLI for coding, building apps, and automating workflows is now available for SuperGrok Heavy subscribers. Through this early beta, we will improve the model and product based on your feedback. Try it at x.ai/cli

Bitduke retweeted
Polymarket
Polymarket@Polymarket·
JUST IN: xAI rolls out Grok Build beta, an agentic CLI for coding & app building.