Raul Pino
@p1nox

5.6K posts

Coder. Coffee lover. Researcher wannabe. Opinions are my own. 🇻🇪

Joined May 2009
1.3K Following · 868 Followers

Pinned Tweet
Raul Pino @p1nox ·
Already been a year since I presented my talk at @pyconit 2023, youtube.com/watch?v=6M0lUv…, having so much FOMO this year 🥲🇮🇹 , thanks so much to the organization for making these events possible, the best of luck this year to everyone (waiting for the videos 👨‍💻) 👏🐍
Raul Pino retweeted
Kristin Fisher @KristinFisher ·
Did y’all know that Radiohead made this song specifically for this moment?
Raul Pino retweeted
Feross @feross ·
🤨 People keep asking how to protect yourself. #1: set min-release-age=7 in .npmrc #2: install Socket for GitHub (it's free!) to protect PRs from bad dependencies: socket.dev/features/github #3: install Socket Firewall (also free!) to protect your laptop: socket.dev/features/firew…
Feross@feross

🚨 CRITICAL: Active supply chain attack on axios -- one of npm's most depended-on packages. The latest axios@1.14.1 now pulls in plain-crypto-js@4.2.1, a package that did not exist before today. This is a live compromise. This is textbook supply chain installer malware.

axios has 100M+ weekly downloads. Every npm install pulling the latest version is potentially compromised right now. Socket AI analysis confirms this is malware. plain-crypto-js is an obfuscated dropper/loader that:
• Deobfuscates embedded payloads and operational strings at runtime
• Dynamically loads fs, os, and execSync to evade static analysis
• Executes decoded shell commands
• Stages and copies payload files into OS temp and Windows ProgramData directories
• Deletes and renames artifacts post-execution to destroy forensic evidence

If you use axios, pin your version immediately and audit your lockfiles. Do not upgrade.
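The first mitigation in the tweet above (a minimum release age) is a config fragment. A minimal sketch, assuming the key name and day-based unit exactly as given in the tweet -- note the equivalent setting in pnpm is `minimumReleaseAge` and is expressed in minutes, so verify against your own client's documentation:

```
# .npmrc -- refuse to install package versions published
# in the last 7 days (key/units as stated in the tweet;
# check your npm/pnpm version before relying on this)
min-release-age=7
```

Pinning, the other mitigation, means using an exact version in package.json (e.g. `"axios": "1.13.2"` rather than `"^1.14.1"`) so a freshly published release can never be pulled in silently; the specific version shown here is only a placeholder for whatever version you last audited.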

Raul Pino retweeted
Jesus Lara @phenobarbital ·
Just to put the figure in proportion: going by historical estimates, the gold Maduro moved to Switzerland (not counting other destinations such as England or Russia) is comparable to all the gold extracted by the Spanish empire over three centuries of history. That is the magnitude of the damage done to the nation.
Raul Pino retweeted
Aníbal Rojas @anibal ·
@paoalbornozf and @esluisbenitez created vzla.fyi, a site that lays out the anatomy of Venezuela's 🇻🇪 collapse in a factual, clean, and orderly way. With confirmed sources, presented in neutral language and fully transparent. AN INDISPENSABLE REFERENCE.
Raul Pino retweeted
Andrej Karpathy @karpathy ·
New post: nanochat miniseries v1

The correct way to think about LLMs is that you are not optimizing for a single specific model but for a family of models controlled by a single dial (the compute you wish to spend) to achieve monotonically better results. This allows you to do careful science of scaling laws, and ultimately this is what gives you the confidence that when you pay for "the big run", the extrapolation will work and your money will be well spent.

For the first public release of nanochat my focus was on an end-to-end pipeline that runs the whole LLM workflow with all of its stages. Now, after YOLOing a few runs earlier, I'm coming back around to flesh out some of the parts that I sped through, starting of course with pretraining, which is both computationally heavy and critical as the foundation of intelligence and knowledge in these models.

After locally tuning some of the hyperparameters, I swept out a number of models fixing the FLOPs budget. (For every FLOPs target you can train a small model for a long time, or a big model for a short time.) It turns out that nanochat obeys very nice scaling laws, basically reproducing the Chinchilla paper plots (a baby version of the Chinchilla plot). Very importantly and encouragingly, the exponent on N (parameters) and D (tokens) is equal at ~0.5, so just like Chinchilla we get a single (compute-independent) constant that relates the model size to token training horizons. In Chinchilla, this was measured to be 20. In nanochat it seems to be 8!

Once we can train compute-optimal models, I swept out a miniseries from d10 to d20, which are nanochat sizes that can do 2**19 ~= 0.5M batch sizes on an 8XH100 node without gradient accumulation. We get pretty, non-intersecting training plots for each model size. Then the fun part is relating this miniseries v1 to the GPT-2 and GPT-3 miniseries so that we know we're on the right track.

Validation loss has many issues and is not comparable, so instead I use the CORE score (from the DCLM paper). I calculated it for GPT-2 and estimated it for GPT-3, which finally lets us put nanochat nicely on the same scale. The total cost of this miniseries is only ~$100 (~4 hours on 8XH100). These experiments give us confidence that everything is working fairly nicely and that if we pay more (turn the dial), we get increasingly better models.

TLDR: we can train compute-optimal miniseries and relate them to GPT-2/3 via objective CORE scores, but further improvements are desirable and needed. E.g., matching GPT-2 currently needs ~$500, but imo should be possible to do in <$100 with more work.

Full post with a lot more detail is here: github.com/karpathy/nanoc… All of the tuning and code is pushed to master, and people can reproduce these with the scaling_laws.sh and miniseries.sh bash scripts.
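The compute-optimal relation described in the thread can be written down directly. Under the standard approximation C ≈ 6ND (training FLOPs ≈ 6 × parameters × tokens) and a fixed tokens-per-parameter ratio r (≈20 measured in Chinchilla, ≈8 measured here for nanochat), substituting D = rN gives C = 6rN², hence N = √(C/6r). A minimal sketch (the helper name is mine, not from the repo):

```python
import math

def compute_optimal(flops_budget: float, tokens_per_param: float):
    """Split a FLOPs budget C into model size N and token count D.

    Uses the standard approximation C ~= 6*N*D together with a fixed
    ratio D = r*N, so C = 6*r*N**2  =>  N = sqrt(C / (6*r)), D = r*N.
    r ~= 20 for Chinchilla; the thread measures r ~= 8 for nanochat.
    """
    n_params = math.sqrt(flops_budget / (6 * tokens_per_param))
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Same 1e18-FLOPs budget under the two ratios: the nanochat ratio
# allocates a larger model trained on fewer tokens than Chinchilla's.
for r in (8, 20):
    N, D = compute_optimal(1e18, r)
    print(f"r={r}: N ~ {N:.3g} params, D ~ {D:.3g} tokens")
```

This is what "a single compute-independent constant" buys you: for any budget on the dial, the N/D split falls out of one measured ratio.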
Raul Pino retweeted
Jesus Lara @phenobarbital ·
So basically we sent them 400 thousand barrels of oil PER DAY, and in exchange they send us soldiers, technicians, sports coaches, and internal-medicine doctors (and to top it off, jet fuel, plane included and paid for: PDVSA's YV1128 is used by Diaz-Canel for his private travel). Priced at market rates, Cuba's debt would come to 20 billion dollars.
Raul Pino retweeted
swyx 🇸🇬 @swyx ·
excited to kick off the year by dropping @trq212's full Claude Agent SDK workshop from AIE CODE. POV video to give you an idea of how insanely packed this one was. also peek at the incredible venue at @datadoghq -- was very grateful to have their support, esp since they were right above the conf venue + accepted our badges!
AI Engineer @aiDotEngineer

🆕 Claude Agent SDK [Full Workshop] youtube.com/watch?v=TqC1qO… For our first big drop of the year, excited to bring you @trq212's full 2 hour workshop covering all of @AnthropicAI's agentic SDK (formerly known as Claude Code SDK). By far the most popular workshop of AIE CODE! Now published online for free (sorry for AV/delay issues)... long story

Raul Pino retweeted
Luis Carlos 🏴‍☠️ One Piece ·
Appearance on Spanish public television, TVE, about what happened in Venezuela. As you know, Spain has the particular problem that members of the government and its allies are tied to chavismo's corruption schemes, on top of an internal polarization that muddies everything.
Raul Pino retweeted
Giuseppe Gangi @ggangix ·
For the accomplices of the dictatorship who talk about the self-determination of peoples.
Raul Pino retweeted
Jacky ₿ ✨🐇 @imjackyrivero ·
Almost 10 years later. 2017 - 2026
Raul Pino retweeted
Aníbal Rojas @anibal ·
Let's carry on with this horrendous spectator sport that we Venezuelans were forced to live through. We voted, we signed, we protested, we negotiated; we supplied the prisoners, the kidnapped, the tortured, and the dead. The international community split between the comfort of ignoring the cancer and complicity with a dictatorship decades in the making, because democracy doesn't matter if there is some ideological resemblance. And so here we are: dawn will come (again) and we shall see.
Raul Pino retweeted
Andrej Karpathy @karpathy ·
nanoGPT - the first LLM to train and inference in space 🥹. It begins.
Adi Oltean @AdiOltean

We have just used the @Nvidia H100 onboard Starcloud-1 to train the first LLM in space! We trained the nano-GPT model from Andrej @Karpathy on the complete works of Shakespeare and successfully ran inference on it. We have also run inference on a preloaded Gemma model, and we plan to try more exciting models in the future. Getting the first H100 to work in space required a lot of innovation and hard work from the incredible Starcloud team to make this breakthrough. This is a significant first step toward moving almost all computing off Earth to reduce the burden on our energy supplies and take advantage of abundant solar energy in space! 🚀

Raul Pino retweeted
Jorge Glem @jorgeglem ·
Yesterday I lived one of the most meaningful moments of my life: playing the National Anthem in Oslo, right before the Torchlight March, as part of the Nobel Peace Prize for @MariaCorinaYA. More than a thousand people, flags, emotion, and a country beating in tricolor from afar. Viva 🇻🇪
Raul Pino retweeted
Melanio Escobar @MelanioBar ·
8 pages of a speech that sum up 26 years of tragedy.
Raul Pino @p1nox ·
@anibal So when's that Gestalt session appointment? I'll bring empanadas and malta 👍
Aníbal Rojas @anibal ·
Is it normal to cry during the Nobel Peace Prize ceremony, or should I make an appointment with a doctor?