Alex Bodner
@AlexBodner_
3.7K posts

AI engineering at @UdeSA🇦🇷 | open source @roboflow. Posting on AI progress and my own projects.

http://alexbodner.com · Joined May 2016
935 Following · 1K Followers

Pinned Tweet
Alex Bodner @AlexBodner_
I’m thrilled to present the KAN Convolutional Layer, a promising neural architecture for image processing. Last week KANs came out as an alternative to the MLP. We extended this idea to Convolutional Layers, creating the KAN Convolution. Join me in this thread to learn more about it 🧵
32 · 149 · 1.2K · 199.7K
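The core idea can be sketched in a few lines: where a standard convolution multiplies each pixel in its receptive field by a scalar weight, a KAN convolution applies a learnable univariate function to it and sums the results. A minimal NumPy illustration, assuming RBF-basis functions for the learnable curves; the names `KANConv2D` and `rbf_basis` are illustrative, not the project's actual API:

```python
import numpy as np

def rbf_basis(x, centers, width=1.0):
    # Evaluate Gaussian radial basis functions at every entry of x.
    return np.exp(-((x[..., None] - centers) ** 2) / (2 * width**2))

class KANConv2D:
    """Sketch of a 2D convolution where each kernel element is a
    learnable univariate function phi_ij(x) = sum_k c_ijk * B_k(x)
    instead of a scalar weight (single channel, no padding)."""
    def __init__(self, kernel_size=3, n_basis=5, seed=0):
        rng = np.random.default_rng(seed)
        self.k = kernel_size
        self.centers = np.linspace(-2, 2, n_basis)
        # One coefficient vector per kernel position.
        self.coef = rng.normal(0, 0.1, (kernel_size, kernel_size, n_basis))

    def __call__(self, img):
        k, H, W = self.k, *img.shape
        out = np.zeros((H - k + 1, W - k + 1))
        for i in range(k):
            for j in range(k):
                patch = img[i:i + H - k + 1, j:j + W - k + 1]
                # Apply the learned univariate function for position (i, j).
                out += rbf_basis(patch, self.centers) @ self.coef[i, j]
        return out

conv = KANConv2D()
y = conv(np.random.default_rng(1).normal(size=(8, 8)))
print(y.shape)  # (6, 6), same spatial reduction as a plain 3x3 conv
```

In a real implementation the basis coefficients would be trained by backprop; the point here is only the structural change relative to a standard convolution.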
Aksel @akseljoonas
ml-intern is fully on mobile now. You can launch 8 A100s from your phone: on the couch, while commuting, wherever. I just did this while biking. Same sessions as your desktop too: start a run on your laptop, check on it from your phone, it's all there. The research lab is now just wherever you are.
Aksel@akseljoonas

Introducing ml-intern, the agent that just automated the post-training team @huggingface. It's an open-source implementation of the real research loop that our ML researchers do every day. You give it a prompt; it researches papers, goes through citations, implements ideas in GPU sandboxes, iterates, and builds deeply research-backed models for any use case. All built on the Hugging Face ecosystem.

It can pull off crazy things: we made it train the best model for scientific reasoning. It went through citations from the official benchmark paper, found OpenScience and NemoTron-CrossThink, added 7 difficulty-filtered dataset variants from ARC/SciQ/MMLU, and ran 12 SFT runs on Qwen3-1.7B. This pushed the score from 10% to 32% on GPQA in under 10h. Claude Code's best: 22.99%.

In healthcare settings it inspected the available datasets, concluded they were too low quality, and wrote a script to generate 1100 synthetic data points from scratch for emergencies, hedging, multilingual, etc., then upsampled 50x for training. It beat Codex on HealthBench by 60%.

For competitive mathematics, it wrote a full GRPO script, launched training with A100 GPUs on hf.co/spaces, watched rewards climb and then collapse, and ran ablations until it succeeded. All fully backed by papers, autonomously.

How does it work? ml-intern makes full use of the HF ecosystem:
- finds papers on arxiv and hf.co/papers, reads them fully, walks citation graphs, pulls datasets referenced in methodology sections and on hf.co/datasets
- browses the Hub, reads recent docs, inspects datasets and reformats them before training so it doesn't waste GPU hours on bad data
- launches training jobs on HF Jobs if no local GPUs are available, monitors runs, reads its own eval outputs, diagnoses failures, retrains

ml-intern deeply embodies how researchers work and think. It knows what data should look like and what good models feel like. We're releasing it today as a CLI and a web app you can use from your phone or desktop.

CLI: github.com/huggingface/ml…
Web + mobile: huggingface.co/spaces/smolage…

And the best part? We also provisioned $1k of GPU resources and Anthropic credits for the quickest among you to use.

12 · 21 · 171 · 20.3K
echo.hive @hive_echo
@tom_doerr These things remind me of the olden days of the mid-to-late 2010s :)
1 · 0 · 0 · 292
Alex Bodner retweeted
Yann LeCun @ylecun
Actually, AI already saves lives.

In several countries, mammograms are examined by AI and radiologists. Reliability is improved.

In the EU, every car sold must be equipped with Automatic Emergency Braking Systems. That's AI. They reduce frontal collisions by 40%.

Modern MRI machines are equipped with AI technology that reduces the time of imaging by 4x or more. You can now get a full-body MRI in 40 minutes for about $1000. Reduced time -> reduced cost -> more/earlier detection.

And that's not counting the progress in medicine enabled by modern AI, including Nobel Prize-winning protein structure prediction.
44 · 162 · 1.7K · 113.9K
Alex Bodner @AlexBodner_
Wow
SpaceX @SpaceX

SpaceX AI and @cursor_ai are now working closely together to create the world’s best coding and knowledge work AI. The combination of Cursor’s leading product and distribution to expert software engineers with SpaceX’s million H100 equivalent Colossus training supercomputer will allow us to build the world’s most useful models. Cursor has also given SpaceX the right to acquire Cursor later this year for $60 billion or pay $10 billion for our work together.

0 · 0 · 0 · 68
Alex Bodner @AlexBodner_
Look at that ball tracking! 🔥 Improving object tracking using IoU variants, soon to be released open source
2 · 1 · 6 · 629
Fleetwood @fleetwood___
The models, they just want to learn (their current task and literally nothing else). Training a toy transformer on 3-digit addition, sorting, reversal, and modular addition. Complete lobotomy at every task transition.
37 · 21 · 590 · 111.2K
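The effect described here, catastrophic forgetting at every task transition, shows up even in models far simpler than a toy transformer. A minimal sketch with a one-parameter linear model, purely illustrative of the phenomenon and not the author's setup:

```python
import numpy as np

def train(w, xs, ys, lr=0.05, steps=200):
    # Plain gradient descent on mean squared error for y = w * x.
    for _ in range(steps):
        grad = np.mean(2 * (w * xs - ys) * xs)
        w -= lr * grad
    return w

def mse(w, xs, ys):
    return float(np.mean((w * xs - ys) ** 2))

rng = np.random.default_rng(0)
xs = rng.normal(size=100)
task_a = 2.0 * xs   # task A: learn y = 2x
task_b = -3.0 * xs  # task B: learn y = -3x

w = 0.0
w = train(w, xs, task_a)
loss_a_before = mse(w, xs, task_a)  # near zero after training on A

w = train(w, xs, task_b)            # now train only on task B
loss_a_after = mse(w, xs, task_a)   # task A performance collapses

print(loss_a_before < 1e-3, loss_a_after > 1.0)  # True True
```

With a single shared parameter, fitting task B necessarily destroys task A; overparameterized networks have room to keep both, but plain sequential training still overwrites the old task, which is what the tweet observes.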
Tomas @tomasholtz_
We made LLMs 2.3x faster and 63% cheaper on top of @aisdk 👇
Nacho Gorriti @nacho_gorriti_

At the @AnthropicAI hackathon we've been testing a way of adding muscle memory to LLMs on top of the @aisdk. You follow a recipe the first 10 times, then you don't need it. Same with repetitive workflows: tool calls get automated and skip the LLM entirely. Less cost & latency.

4 · 2 · 35 · 6.2K
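One way to read "muscle memory" here: once the same kind of request has produced the same tool-call sequence enough times, cache that sequence and replay it without invoking the model at all. A generic sketch of that idea; this is not the actual @aisdk implementation, and all names are hypothetical:

```python
import hashlib
import json

class ToolMemory:
    """Promote a request's tool-call sequence to a cache after it has
    been seen `promote_after` times; cached requests skip the LLM."""
    def __init__(self, promote_after=3):
        self.counts = {}
        self.cache = {}
        self.promote_after = promote_after

    def key(self, request):
        # Canonicalize so equivalent requests share one cache entry.
        canon = json.dumps(request, sort_keys=True)
        return hashlib.sha256(canon.encode()).hexdigest()

    def lookup(self, request):
        return self.cache.get(self.key(request))

    def record(self, request, tool_calls):
        k = self.key(request)
        self.counts[k] = self.counts.get(k, 0) + 1
        if self.counts[k] >= self.promote_after:
            self.cache[k] = tool_calls  # future calls skip the LLM

def handle(request, memory, llm_plan):
    cached = memory.lookup(request)
    if cached is not None:
        return cached, "cache"      # no LLM call: lower cost and latency
    plan = llm_plan(request)        # expensive path: ask the model to plan
    memory.record(request, plan)
    return plan, "llm"

mem = ToolMemory(promote_after=2)
fake_llm = lambda req: [("search_flights", req["city"]), ("book", "cheapest")]
req = {"task": "book_flight", "city": "BA"}
for _ in range(3):
    plan, source = handle(req, mem, fake_llm)
print(source)  # "cache" on the third call
```

A production version would also need cache invalidation and a guard against replaying a plan whose side effects are no longer valid; the speedup comes purely from skipping the planning call on repeats.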
Alex Bodner @AlexBodner_
@trajektoriePL Interesting take, I wonder if we will be able to make a theory of general intelligence in our lifetime
0 · 0 · 1 · 571
Michał Podlewski @trajektoriePL
Terence Tao proposes what he calls a "Copernican view of intelligence". Instead of buying into the common, one-dimensional narrative that artificial intelligence will simply evolve from "subhuman" to "superhuman" and ultimately make humanity entirely redundant, Tao urges us to look at the bigger picture.

Much like the Copernican revolution proved the Earth is not the center of the universe, Tao suggests we need to realize that human intelligence isn't the only, or necessarily the highest, form of intellect. Historically, we have treated other forms of storing or creating knowledge (like animals, books, and computers) as secondary. However, we actually exist within a much richer universe of intelligence.

Both human intelligence and computer intelligence possess their own distinct strengths and weaknesses. The true potential lies not in viewing them as direct competitors, but rather in focusing on collaboration. By working together, humans and computers can achieve additional things that neither could accomplish on their own, requiring us to think in much wider terms than just what humans or computers can do alone.
140 · 608 · 4.1K · 599.5K
Fausto @FaustoR0
I'm intrigued to see what will happen when tokens stop being subsidized and people are extremely dependent on AI. I'm neither pessimistic nor optimistic. I'm still thinking it over.
202 · 39 · 860 · 81K
Alex Bodner retweeted
Aakash Gupta @aakashgupta
There's a physicist at Stanford named Safi Bahcall who modeled this exact principle and the math is wild. He calls it "phase transitions in human networks."

When you're stationary, your probability of a lucky event is limited to your existing surface area: the people you already know, the places you already go, the ideas you've already been exposed to. Your opportunity window is fixed.

When you move, your collision rate with new nodes in a network increases nonlinearly. Double your movement (new conversations, new cities, new projects) and your probability of a serendipitous encounter doesn't double. It roughly quadruples. Because each new node connects you to their entire network, not just to them.

Richard Wiseman ran a 10-year study at the University of Hertfordshire tracking self-described "lucky" and "unlucky" people. The single biggest differentiator wasn't IQ, education, or family money. Lucky people scored significantly higher on one trait: openness to experience. They talked to strangers more, varied their routines more, and said yes to invitations at nearly twice the rate.

The "unlucky" group followed the same routes, ate at the same restaurants, and talked to the same 5 people. Their networks were closed loops. No new inputs, no new collisions.

Luck isn't random. Luck is surface area. And surface area is a function of movement.

The lobster emoji is doing more work than most people realize. Lobsters grow by shedding their shell when it gets too tight. The growth requires a period of total vulnerability. No protection, no armor, soft body exposed to the ocean.

That's the cost of movement nobody posts about. You have to be uncomfortable first. The new shell only hardens after you've already moved.
@d9vidson

a moving man will meet his luck 🥀

506 · 14K · 67.5K · 4.5M
Luciano Neimark @neimarkl
After a weekend at full speed at Hackitba (@cs_itba), here's the project we built with @MatiMutz_ and @MatiasOstrower. It's called Alias: a skill + an MCP that lets any agent (@openclaw, ChatGPT, Claude, etc.) make transfers to any CBU/ALIAS. How? You create an account, fund it (thanks @talo_pay), and your agents can use those pesos to transfer to any account. The use cases are truly endless. This unlocks a huge problem agents have today: even though they keep getting smarter, they can't pay (let alone in pesos$$$). Huh? Yes, you heard that right ;) Demo in the video: …ant-clarity-production.up.railway.app
15 · 15 · 116 · 14.3K
Alex Bodner @AlexBodner_
In Trackers we are starting to add different kinds of Intersection over Union (IoU), like GIoU or DIoU. Are you already using them? We want to know what you are building 🔥
0 · 0 · 1 · 100
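For reference, GIoU extends plain IoU with a penalty based on the smallest enclosing box, which keeps the metric informative even when boxes don't overlap (where IoU is always exactly zero). A standalone sketch of the math, not the Trackers implementation:

```python
def iou_giou(box_a, box_b):
    """Boxes as (x1, y1, x2, y2). Returns (IoU, GIoU), where
    GIoU = IoU - |C minus union| / |C| and C is the smallest
    axis-aligned box enclosing both inputs."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (clamped to zero if boxes are disjoint).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    iou = inter / union
    # Smallest enclosing box C.
    cx1, cy1 = min(ax1, bx1), min(ay1, by1)
    cx2, cy2 = max(ax2, bx2), max(ay2, by2)
    area_c = (cx2 - cx1) * (cy2 - cy1)
    giou = iou - (area_c - union) / area_c
    return iou, giou

# Identical boxes: the enclosing box adds no slack, so GIoU == IoU.
print(iou_giou((0, 0, 2, 2), (0, 0, 2, 2)))  # (1.0, 1.0)
# Disjoint boxes: IoU is 0, but GIoU goes negative, so it still
# signals how far apart the boxes are.
print(iou_giou((0, 0, 1, 1), (2, 2, 3, 3)))
```

DIoU follows the same pattern but penalizes the normalized distance between box centers instead of the enclosing-box slack, which is why these variants help track fast-moving objects whose boxes briefly stop overlapping.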
DataCanaya @Data_Canaya
@dataref_ar Hahaha, I'll just order it from the Chinese for 30 thousand pesos, same quality
1 · 0 · 0 · 365
dataref @dataref_ar
💰🇦🇷 The PRICES of the new Argentina national team jersey on the Adidas website:
👉 FAN version, no printing: $150,000
👉 FAN version with Messi printing: $180,000
👉 PLAYER version, no printing: $220,000
👉 PLAYER version with Messi printing: $250,000
👉 PLAYER version, long sleeve, no printing: $250,000
dataref @dataref_ar

🔍🇦🇷 These are the main details of the Argentina national team's NEW AWAY JERSEY for the 2026 World Cup:
◉ Black predominates, with white logos and sky-blue details.
◉ It is inspired by the fileteado porteño that decorates signs and city buses.
◉ It uses the Adidas trefoil logo.
◉ World champion patch.
◉ Sun of May on the back of the neck next to the word ‘Argentina’.

27 · 24 · 376 · 52.4K
Alex Bodner @AlexBodner_
@LearnOpenCV Thanks for the shoutout! More cool open source stuff to come🤝
0 · 0 · 1 · 18
Satya Mallick @LearnOpenCV
🔍 Mastering Multi-Object Tracking with Roboflow & OpenCV 🏀🚗 From tracking basketball players to monitoring traffic, detection alone isn’t enough; you need Multi-Object Tracking (MOT). With Roboflow Trackers + OpenCV, you can assign persistent IDs to objects across frames, even in high-speed or occluded scenarios. 👉 Learn how SORT & ByteTrack make MOT practical and powerful in real-world pipelines. 🔗 Read the full blog: vist.ly/4vc73 #ComputerVision #OpenCV #Roboflow #MultiObjectTracking #AI #DeepLearning #SportsAnalytics #DroneTech #TrafficMonitoring #ByteTrack #SORT
1 · 1 · 7 · 328
Erick @ErickSky
⚽️ A year ago we saw THIS video and many of us were left wanting to try it… Today you can CLONE the repo, open it with Claude Code, and tell it: “make this repo work with the website that TEBAS DOESN’T LIKE”. It's magic. REPOOO👇
7 · 11 · 253 · 41.9K
Alex Bodner @AlexBodner_
oh, and you can track objects by adding fewer than 5 lines of code ⚡️
Alex Bodner @AlexBodner_

🔥𝗡𝗲𝘄 𝗧𝗿𝗮𝗰𝗸𝗲𝗿 in @roboflow 𝗧𝗿𝗮𝗰𝗸𝗲𝗿𝘀 library 🚀 After ByteTrack, we are adding 𝗢𝗖-𝗦𝗢𝗥𝗧: a tracker built for what most trackers fail at: non-linear motion + occlusion. Think: dancers, athletes, animals, chaos. SORT loses track when occluded; OC-SORT doesn’t.👇

0 · 0 · 12 · 255