Adam Stacoviak

12.4K posts

@adamstac

Founder/EIC @Changelog and 😇 to @supabase & @calcom

Austin, TX · Joined March 2007
1.5K Following · 4.7K Followers
Adam Stacoviak@adamstac·
Next I need to try it out with some coding tasks.
Adam Stacoviak@adamstac·
Seriously impressed by Gemma 4. Impressive responses. Open source. Fast as all get out. Wow.
Adam Stacoviak@adamstac·
@garrytan It’s more of a compartmentalization, not a shutdown. And they’ve been SUPER clear why/how.
Garry Tan@garrytan·
Should have been more precise: OpenClaw now requires the full metered API key and cannot be used with the $200 Max Subscription
Garry Tan@garrytan·
Anthropic shutting down OpenClaw may turn out to be a strategic blunder, or strategic genius. The OpenClaw community will determine whether it is A or B. It's an interesting moment in history. Personally I never bet against open source.
Adam Stacoviak@adamstac·
Came back from Spring Break and my @AmpCode free is gone now 😭
Adam Stacoviak@adamstac·
Real is rare. Fake is everywhere.
Nicholas C. Zakas@slicknet·
Never thought buying a new oven would involve an extensive internet search for "disable wifi on oven."
Adam Stacoviak@adamstac·
Legit considering 8TB this time.
John Nunemaker@jnunemaker·
Just committed by hand without Claude. Felt weird.
Adam Stacoviak@adamstac·
Shannon Entropy Formula

H = -Σ(p(x) * log₂(p(x)))

Where:
H = entropy score
p(x) = probability (frequency) of each character
Σ = sum across all unique characters

In plain English: count how often each character appears, then calculate how "surprising" that distribution is. @mattzcarey ^
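The formula above translates directly into a few lines of Python. This is a minimal sketch (the function name and structure are mine, not from the thread):

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Shannon entropy H = -Σ p(x) * log₂(p(x)) over the unique
    characters in data, in bits per character."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    # p * log2(1/p) is the same as -p * log2(p), and avoids a -0.0 result
    return sum((c / total) * math.log2(total / c) for c in counts.values())

# A single repeated character is perfectly predictable: entropy 0.
print(shannon_entropy("aaaa"))  # → 0.0
# Four equally likely characters: 2 bits per character.
print(shannon_entropy("abcd"))  # → 2.0
```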
Adam Stacoviak@adamstac·
Quick pre-processing trick. Run Shannon Entropy on a chunk of input before you do anything with it. The score will tell you a lot! Is this text, binary, compressed, code? Saves you from doing expensive work on data that was never what you expected.
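That pre-processing trick can be sketched like this. The entropy score is in bits per byte (0 to 8); plain English text and source code usually land well below the ceiling, while compressed or encrypted data pushes toward 8. The thresholds and labels below are illustrative assumptions, not from the thread:

```python
import math
from collections import Counter

def byte_entropy(chunk: bytes) -> float:
    """Shannon entropy of a byte chunk, in bits per byte (0.0-8.0)."""
    if not chunk:
        return 0.0
    counts = Counter(chunk)
    total = len(chunk)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

def guess_kind(chunk: bytes) -> str:
    """Rough triage of an input chunk by entropy score.
    Thresholds are illustrative; tune them for your own data."""
    h = byte_entropy(chunk)
    if h < 1.0:
        return "padding/repetitive"
    if h < 5.5:
        return "text/code"
    return "compressed/encrypted/binary"

print(guess_kind(b"\x00" * 64))                     # padding/repetitive
print(guess_kind(b"def main():\n    return 42\n"))  # text/code
```

Running this over the first few KB of an input before parsing it is cheap, and catches the "this was never text" case early.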
Adam Stacoviak@adamstac·
Shannon entropy measures randomness or unpredictability in a string of characters.
Adam Stacoviak@adamstac·
We're now in the era of AX (Agent Experience)
Adam Stacoviak@adamstac·
AI is faster than every human and smarter than 99.9% of us. The next iteration of software development has to be built on AI. Building with AI means learning how to flow with AI to create software products. It's not about prompt and pray, it's about flowing with confidence.
Jeff Weinstein@jeff_weinstein·
who is building the next great ci startup optimized for agent-generated code? (there are about to be _extreme_ ci needs)