lwi
@0tisticwizard

9.6K posts

tech pragmatic | midwit 1st hater

Paris, France · Joined February 2018
1.2K Following · 239 Followers
lwi @0tisticwizard
@thismacapital Concretely, what's the use case for openclaw besides reading emails and organizing a workflow?
1 reply · 0 reposts · 1 like · 1.2K views
THISMA @thismacapital
I just took out an additional $200 OpenAI Codex subscription to go unlimited on my openclaw, because I can't live without it anymore. So for dev: Claude Code, $200. For agentic: Codex, $200. I think if there were a more complete Claude Code Max subscription, I'd take it even at $1,000 a month, it's that profitable in terms of time saved.
22 replies · 1 repost · 68 likes · 23.4K views
lwi retweeted
moon @MoonOverlord
so haha what did you guys build with all those mac minis haha
118 replies · 67 reposts · 2.5K likes · 168.3K views
lwi retweeted
Ryan Watkins @RyanWatkins_
“Dude I have 10 agents running while I sleep. No one is prepared for AGI in 2 years man.”
“So what are you building?”
“Bro all my smartest friends are vibe coding until 3am every night. It’s all about agency. Intelligence is a commodity man.”
“So what are you building?”
“Do you even study exponentials? Have you seen the latest METR chart? You’re going to be stuck in the permanent underclass bro.”
“So what are you building?”
“Did you even set up OpenClaw? I’m maxing out my token budget every day man.”
“So what are you building?”
“I promise you I’m 10x more productive bro! You just don’t understand! Please bro just…. I know you use this stuff every day too, but you must not be prompting it right! Please broo…”
541 replies · 1K reposts · 14.6K likes · 1.4M views
lwi retweeted
Dean W. Ball @deanwball
This is probably the most believable piece of AI scenario modeling, positive or negative, I have ever read. Plenty of contestable assumptions, of course, but undoubtedly worth your time.

Quoting Citrini @citrini:
JUNE 2028. The S&P is down 38% from its highs. Unemployment just printed 10.2%. Private credit is unraveling. Prime mortgages are cracking. AI didn’t disappoint. It exceeded every expectation. What happened? citriniresearch.com/p/2028gic

48 replies · 70 reposts · 1.7K likes · 958.1K views
Melko @MelkoXMR
15 mac minis to make API calls 😂😂😂
1 reply · 0 reposts · 26 likes · 2.7K views
lwi retweeted
blockgraze @blockgraze
"bro have you tried clawdbot it's so insane"
"a little but not much, what do you use it for"
"it's crazy man you can do anything with it"
"what are you doing with it"
"you gotta try it"
"try it for what"
"don't get left behind man"
381 replies · 971 reposts · 23.3K likes · 768.2K views
lwi retweeted
Aakash Gupta @aakashgupta
The math on this project should mass-humble every AI lab on the planet.

1 cubic millimeter. One-millionth of a human brain. Harvard and Google spent 10 years mapping it. The imaging alone took 326 days. They sliced the tissue into 5,000 wafers, each 30 nanometers thick, ran them through a $6 million electron microscope, then needed Google’s ML models to stitch the 3D reconstruction because no human team could process the output.

The result: 57,000 cells, 150 million synapses, 230 millimeters of blood vessels, compressed into 1.4 petabytes of raw data. For context, 1.4 petabytes is roughly 1.4 million gigabytes. From a speck smaller than a grain of rice.

Now scale that. The full human brain is one million times larger. Mapping the whole thing at this resolution would produce approximately 1.4 zettabytes of data. That’s roughly equal to all the data generated on Earth in a single year. The storage alone would cost an estimated $50 billion and require a 140-acre data center, which would make it the largest on the planet.

And they found things textbooks don’t contain. One neuron had over 5,000 connection points. Some axons had coiled themselves into tight whorls for completely unknown reasons. Pairs of cell clusters grew in mirror images of each other. Jeff Lichtman, the Harvard lead, said there’s “a chasm between what we already know and what we need to know.”

This is why the next step isn’t a human brain. It’s a mouse hippocampus, 10 cubic millimeters, over the next five years. Because even a mouse brain is 1,000x larger than what they just mapped, and the full mouse connectome is the proof of concept before anyone attempts the human one.

We’re building AI systems that loosely mimic neural networks while still unable to fully read the wiring diagram of a single cubic millimeter of the thing we’re trying to imitate. The original is 1.4 petabytes per millionth of its volume. Every AI model on Earth fits in a fraction of that.

The brain runs on 20 watts and fits in your skull. The data center required to merely describe one-millionth of it would span 140 acres.

Quoting All day Astronomy @forallcurious:
🚨: Scientists mapped 1 mm³ of a human brain ─ less than a grain of rice ─ and a microscopic cosmos appeared.

1.2K replies · 12.1K reposts · 64.1K likes · 4.6M views
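A quick sanity check on the tweet's scaling arithmetic, assuming a straight linear scale-up from the 1 mm³ sample (1.4 petabytes) to a full human brain at roughly one million times that volume, and the same per-mm³ data rate for the 10 mm³ mouse hippocampus target:

```python
# Sanity-check the scaling arithmetic from the tweet above.
mm3_data_bytes = 1.4e15           # 1.4 petabytes for the mapped 1 cubic millimeter
brain_scale = 1_000_000           # full human brain ~ one million times the mapped volume

full_brain_bytes = mm3_data_bytes * brain_scale
print(f"{full_brain_bytes / 1e21:.1f} zettabytes")    # 1.4 zettabytes, as claimed

# The next target, a mouse hippocampus, is 10 mm^3 at the same resolution:
print(f"{mm3_data_bytes * 10 / 1e15:.0f} petabytes")  # 14 petabytes
```

The numbers check out: linear scaling of 1.4 PB/mm³ over ~10⁶ mm³ gives the 1.4 zettabytes the thread cites.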
lwi @0tisticwizard
SillyCUDA is going to be the project of the century, I just have to stop jerking off
0 replies · 0 reposts · 1 like · 1.8K views