doffy
@Doffee_
1.9K posts
San Diego, CA · Joined December 2014
575 Following · 215 Followers
Aman @Amank1412
@madhujith12 yeah this kinda changes things fr
1 reply · 0 reposts · 4 likes · 3.6K views
Aman @Amank1412
SOMEONE JUST RAN LLAMA 70B LOCALLY ON A MACBOOK FOR 11 HOURS ON A FLIGHT. no wifi. no API. no subscriptions. cleared his entire client queue before landing. local AI is not a hobby anymore.
269 replies · 257 reposts · 9.2K likes · 995.4K views
doffy @Doffee_
@baoskee That’s a mistake, as it’s not a taxable event. You’re chilling, it’s excluded, but I’m not a tax advisor.
0 replies · 0 reposts · 0 likes · 24 views
baoskee @baoskee
TurboTax classified my $15k USD -> USDC conversion as a capital gain. Prime example of how the poor stay poor.
19 replies · 6 reposts · 1.1K likes · 73.5K views
Kenn @ghettokenn
@ViktorBunin Previous AI winters had impressive demos and no revenue. This wave has companies cutting headcount because the tools actually work. Wrong cycle to pattern-match against.
1 reply · 0 reposts · 3 likes · 307 views
Viktor Bunin 🛡️🇺🇸 @ViktorBunin
it is so fucking obvious we're not going to have AGI in the next few years. it's all a big psyop to make AI builders feel important and to justify the extravagant spend and valuations. everyone should calm down.
33 replies · 3 reposts · 129 likes · 11.9K views
doffy @Doffee_
@benjamincowen What happened to all the educational crypto influencers? The ones that taught the basics? I guess the interest just isn’t there
0 replies · 0 reposts · 0 likes · 11 views
R A W S A L E R T S @rawsalerts
🚨#BREAKING: According to new reports, documents tied to the Epstein files reveal that the FBI’s New York office was hacked in 2023 on the night of the Super Bowl, resulting in the reported deletion of approximately 100 terabytes of evidence data.
918 replies · 2.8K reposts · 20.6K likes · 1.7M views
doffy @Doffee_
@froggy_dl You also don’t need a bunch of gimmicky stuff to give movement a high skill ceiling. Look at CS and the movement disparity between even pro players. I’m not talking about fancy ladder jumps or kz either.
0 replies · 0 reposts · 0 likes · 26 views
froggy @froggy_dl
im glad they're patching movement stuff, as much as i would love deadlock to have apex movement on crack, i know i'm in the minority and it'll be better for the game
13 replies · 3 reposts · 301 likes · 21.4K views
doffy @Doffee_
@benjamincowen Unfortunately people only see the second one. The first one means CBDCs. That’s the view from normies.
2 replies · 0 reposts · 0 likes · 17 views
Benjamin Cowen @benjamincowen
Being "pro-crypto" and "pro-how-can-I-extract-as-much-value-out-of-crypto" are two very different things.
118 replies · 75 reposts · 1.3K likes · 65.1K views
doffy @Doffee_
@CoinbaseDuck Would make tracking cost basis easier too
0 replies · 0 reposts · 0 likes · 4 views
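The cost-basis point above is easy to make concrete: for tax purposes a sale is usually matched against purchase lots, FIFO by default. A minimal sketch, assuming FIFO lot matching and invented prices (the `fifo_cost_basis` helper is hypothetical, not any exchange's API):

```python
from collections import deque

def fifo_cost_basis(buys, sell_qty, sell_price):
    """Match a sale against the earliest purchase lots (FIFO) and
    return (cost_basis, capital_gain) for the sold quantity.

    buys: list of (qty, unit_price) lots in purchase order.
    """
    lots = deque(buys)
    basis = 0.0
    remaining = sell_qty
    while remaining > 0:
        qty, price = lots.popleft()
        used = min(qty, remaining)
        basis += used * price
        remaining -= used
        if used < qty:  # partially consumed lot: push the remainder back
            lots.appendleft((qty - used, price))
    gain = sell_qty * sell_price - basis
    return basis, gain

# Toy numbers: 1.0 unit bought at $20k, 1.0 at $30k; sell 1.5 units at $40k.
basis, gain = fifo_cost_basis([(1.0, 20_000), (1.0, 30_000)], 1.5, 40_000)
print(basis, gain)  # 35000.0 25000.0
```

With per-lot records like this, a transfer between a wallet and an exchange just moves lots; only an actual disposal triggers the matching.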
CBduck @CoinbaseDuck
Sending some stables from my wallet to Coinbase to set up more buys. Would be nice if I never had to access DeFi via a wallet; I could just do it within Coinbase and never touch a wallet as a normie.
5 replies · 0 reposts · 24 likes · 1.6K views
doffy @Doffee_
@GenAI_is_real AI slop post. You don’t even try to hide it
0 replies · 0 reposts · 0 likes · 12 views
Chayenne Zhao @GenAI_is_real
17k tok/s is a shiny benchmark, but "hard-wiring" weights into silicon is a fundamental bet against the speed of ai research. we’re moving from static inference to dynamic agentic loops. in sglang, we need the flexibility to optimize kernels and manage memory on the fly as models evolve. baking llama 3.1 into a chip in 2026 is like building a high-speed railway that only leads to a ghost town—by the time it's finished, everyone has moved on to the next architecture. the real moat isn't raw speed on a frozen model; it's the reprogrammable efficiency of the serving stack. i’d rather have 5 lines of optimized, flexible code than a billion-dollar brick that can't change its mind.
Quoting Wildminder @wildmindai:
17,000 tokens per second!! Read that again! LLM is hard-wired directly into silicon. no HBM, no liquid cooling, just raw specialized hardware. 10x faster and 20x cheaper than a B200. the "waiting for the LLM to think" era is dead. Code generates at the speed of human thought. Transition from brute-force GPU clusters to actual AI appliances. taalas.com/the-path-to-ub…
8 replies · 3 reposts · 43 likes · 8K views
Shayan @InboxZero
@mikubeam @chribjel But you can work backwards by training your own model. We don’t know what prompts they used, to be fair
3 replies · 0 reposts · 0 likes · 391 views
Shayan @InboxZero
@chribjel It means copying the model at a foundational level then selling it
5 replies · 0 reposts · 1 like · 6.6K views
doffy @Doffee_
@ratechange1 @atmoio It’s easy to over-extrapolate technologies we don’t understand
0 replies · 0 reposts · 0 likes · 29 views
Ratechange @ratechange1
@atmoio everything happens on the margin and on the margin, white collar is fucked and will only increasingly be more fucked...but sure everyone you know will be fine
1 reply · 0 reposts · 0 likes · 691 views
doffy @Doffee_
@AnthropicAI Honestly I’m just going to use whatever model I want
0 replies · 0 reposts · 0 likes · 4 views
Anthropic @AnthropicAI
We’ve identified industrial-scale distillation attacks on our models by DeepSeek, Moonshot AI, and MiniMax. These labs created over 24,000 fraudulent accounts and generated over 16 million exchanges with Claude, extracting its capabilities to train and improve their own models.
7.2K replies · 6.3K reposts · 54.8K likes · 33.7M views
Jamshid Hormuzdiar @JamshidHormuz
Bayesian models also predict future states probabilistically. A transformer modeling P(next token | context) is doing the same thing. Different architecture doesn't change the objective: it's still prediction. And improving prediction requires accurate models of how the world works.
1 reply · 0 reposts · 0 likes · 87 views
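The P(next token | context) objective above can be demonstrated at toy scale: a bigram model is the one-token-of-context version of the same prediction task. A minimal sketch over a toy corpus (`train_bigram` is a hypothetical helper, not a real library call):

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Estimate P(next | current) from raw bigram counts."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    # Normalize each row of counts into a conditional distribution.
    return {cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
            for cur, nxts in counts.items()}

corpus = "the cat sat on the mat and the cat slept".split()
model = train_bigram(corpus)
print(model["the"])  # "cat" follows "the" 2 of 3 times, "mat" 1 of 3
```

A transformer replaces the count table with a learned function of the whole context, but the training objective is this same conditional distribution.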
Ab Homine Deus @AbHomineDeus
"AI isn't thinking! It's just pattern matching and predicting the most likely outcome." My Brother in Christ what do you think thinking is?
188 replies · 104 reposts · 999 likes · 40.7K views
doffy @Doffee_
@AbHomineDeus So the cure to cancer will be found by predicting the most likely next token? Definitely not. Otherwise we would’ve found it by now.
0 replies · 0 reposts · 0 likes · 6 views
doffy @Doffee_
@JamshidHormuz @AbHomineDeus It’s definitely not predicting the future. Look up Bayesian models. Directed acyclic graphs can predict outcomes. ChatGPT is feed-forward, and it’s more like walking on a footpath through the nature reserve. Actual thinking involves walking off the beaten path.
1 reply · 0 reposts · 1 like · 22 views
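For reference, the Bayesian-network prediction invoked above works by summing the DAG's joint distribution over parent states. A toy rain/sprinkler/wet-grass sketch with invented probabilities (the numbers are illustrative only):

```python
from itertools import product

# Toy DAG: Rain -> Sprinkler, and (Sprinkler, Rain) -> WetGrass.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True:  {True: 0.01, False: 0.99},   # keyed by rain
               False: {True: 0.4,  False: 0.6}}
P_wet = {(True, True): 0.99, (True, False): 0.9,   # keyed by (sprinkler, rain)
         (False, True): 0.8, (False, False): 0.0}

def p_wet_grass():
    """Exact P(wet grass) by enumerating all parent configurations."""
    total = 0.0
    for rain, sprinkler in product([True, False], repeat=2):
        total += P_rain[rain] * P_sprinkler[rain][sprinkler] * P_wet[(sprinkler, rain)]
    return total

print(round(p_wet_grass(), 5))  # 0.44838
```

Enumeration like this is exponential in the number of parents, which is why real Bayesian-network inference uses variable elimination or sampling instead.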
Jamshid Hormuzdiar @JamshidHormuz
@AbHomineDeus Next token prediction = predicting the future. The craziest part is that they are actually saying "AI isn't thinking, it just figured out how to predict the future" like this isn't incredibly impressive
5 replies · 0 reposts · 14 likes · 1.5K views
doffy @Doffee_
Ya, it’s tough for me to hop on the AI replacement train. As someone who’s coded generative feed-forward neural networks since 2023, the limitations are pretty obvious: worse customer support, can’t work in large codebases. Individuals can be more empowered, but that’s not job replacement.
2 replies · 0 reposts · 1 like · 37 views
Benjamin Cowen @benjamincowen
Interesting read, certainly something to consider. Even somewhat aligns with the business cycle views I expressed here: youtube.com/watch?v=2ehTz4… I think there is a reason to believe the current business cycle will end within the next 2-3 years from a liquidity point of view.
Quoting Citrini @citrini:
JUNE 2028. The S&P is down 38% from its highs. Unemployment just printed 10.2%. Private credit is unraveling. Prime mortgages are cracking. AI didn’t disappoint. It exceeded every expectation. What happened? citriniresearch.com/p/2028gic
50 replies · 49 reposts · 721 likes · 335.5K views
Benjamin Cowen @benjamincowen
Anyone remember back when the replies were crypto scam bots? Now they are just AI bots
114 replies · 27 reposts · 1K likes · 80.1K views
doffy @Doffee_
@mattshumer_ @Curlh1 LLM performance is compute-bound, O(N^2) in context length. We’ve known this since 2017
0 replies · 0 reposts · 0 likes · 5 views
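The O(N^2) figure above comes from self-attention in the 2017 Transformer paper: the QK^T score matrix pairs every token with every other, so that term's cost grows with the square of sequence length. A minimal FLOP-count sketch (hypothetical helper, illustrative dimensions):

```python
def attention_score_flops(seq_len, d_model):
    """Multiply-adds for the QK^T score matrix alone: each of seq_len
    queries is dotted with each of seq_len keys of size d_model."""
    return seq_len * seq_len * d_model

base = attention_score_flops(1_000, 64)
doubled = attention_score_flops(2_000, 64)
print(doubled / base)  # 4.0 — doubling context quadruples the score cost
```

This counts only the score matrix; the feed-forward layers are linear in sequence length, which is why long contexts are dominated by the attention term.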
Matt Shumer @mattshumer_
@Curlh1 Probably engagement farming... well, it worked :) Gets him likes, but will hurt people who believe him and ignore the (obvious) progress.
3 replies · 0 reposts · 18 likes · 2.2K views