Dokmai IC
1.8K posts

Dokmai IC retweeted

"The next wave of AI-powered cybersecurity attacks will be like nothing we’ve seen before."
Internet Computer technology creates tamperproof clouds using mathematically secure networks.
cnn.com/2026/04/03/tec…
Dokmai IC retweeted

🚨 Why the hell is $ICP still ranked #51?!
Because people are fucking stupid and don’t understand how $ICP actually works or how insanely powerful it is!
Wake up, idiots. The real ones already know. 🔥

Dokmai IC retweeted

Rebuilding the Internet: Pierre Samaties on Web3, AI & Digital Sovereignty youtu.be/kFgQl1xxM5k?si… via @YouTube

Dokmai IC retweeted

@scottsummers @idunno_maan Exactly. The most important point, though, is that within the competitive sweet spot (70B open-weights models initially), both speed and cost are competitive with cloud, with cost often significantly lower, while verifiable inference is extra value = decentralization as real *benefits*
Dokmai IC retweeted

@BobbyO_ @bighab247 Bought at launch ($350-$450) and never looked back. Just kept DCA-ing through the volatility. Proudly compounding my 8-year stake and haven’t sold once! 🔥♾️

@bighab247 I think my first was August 2021.
Glad to see a lot of people who came in 2023-24 still here. Y'all have survived an entire bear market imo, bc we've essentially had two consecutive bear markets.
Dokmai IC retweeted

NVIDIA showcasing new "inference-only tech" @ GTC in the next few days (building on the Groq acquisition).
The pending earthquake I predicted in 2024: inference will become key to training, and many that are investing heavily in traditional GPU infra will get caught out by this coming shift.
Why? There is a shortage of fresh training data to help with scaling intelligence through pretraining.
To solve this, CoT reasoning will be applied within agentic architectures to perform research that axiomatically generates discoveries and insights — with such frameworks validating empirical assumptions and reasoning to prevent the generation of training data based on hallucinations or faulty thinking.
New frontier knowledge insights will be formulated for use as training data in ways that both help embed new knowledge in model weights, and increase embedded "intuition," which directly supports the notional "intelligence quotient" that can be ascribed to textual synthesis by LLMs.
Inference-generated training data will prove instrumental to advancing frontier LLM capabilities. Meanwhile, its generation will involve vast amounts of inference computation, and inference-only ASICs (application-specific integrated circuits) will be required to do it competitively.
The only question is when agentic frameworks and LLM capabilities will advance sufficiently to make this possible. My own experiences applying agentic frameworks at work in recent months make me think that agentic generation of training data at scale may not be that far off.
This is definitely something tech firms investing heavily in do-it-all GPU chips and data centers specialized to host them should think about.
Once end-users get a taste for inference on ASICs, which can be 10X faster compared to the latest GPUs, they're also not going to want to do inference on anything else, at least for chat and code generation, which are very time-sensitive tasks — for example, caffeine.ai plans to adopt ASICs as soon as possible.
It's a shame NVIDIA bought Groq, because competition is needed to advance this sector, so all eyes on Cerebras and their dinner-plate-sized chips!
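The generate-then-validate loop sketched in this thread can be illustrated with a minimal Python skeleton. Everything here is hypothetical: `propose_insights` stands in for a CoT reasoning model and `validate` stands in for the empirical checks the thread describes; the only real logic is the filter that keeps validated traces out of a hallucinated pool.

```python
# Hypothetical sketch of the agentic training-data pipeline described above:
# a reasoning model proposes candidate (claim, chain-of-thought) pairs, a
# validator re-checks each one, and only validated traces become synthetic
# training data. All function names and data shapes are illustrative.

def propose_insights(topic: str) -> list[dict]:
    """Stand-in for a CoT reasoning model emitting candidate insights."""
    return [
        {"claim": f"{topic}: 2 + 2 = 4", "steps": ["2 + 2", "= 4"]},
        {"claim": f"{topic}: 2 + 2 = 5", "steps": ["2 + 2", "= 5"]},  # hallucination
    ]

def validate(candidate: dict) -> bool:
    """Stand-in for empirical validation: re-derive the claim independently."""
    return candidate["claim"].endswith("= 4")

def generate_training_data(topics: list[str]) -> list[dict]:
    """Keep only candidates whose reasoning survives validation."""
    dataset = []
    for topic in topics:
        for candidate in propose_insights(topic):
            if validate(candidate):  # drop faulty or hallucinated traces
                dataset.append(candidate)
    return dataset

data = generate_training_data(["arithmetic"])
```

In a real system the proposer and validator would each be inference calls, which is why the thread argues the loop is dominated by inference compute.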

@MastrXYZ #InternetComputer ❤ [ASCII-art sign]
Dokmai IC retweeted

#Mission70 has moved into the realm of implementation. The 1st ICP pull request has hit GitHub. More developments coming 🔥
github.com/dfinity/ic/pul…