Aitor Ormazabal

82 posts

@aormazabalo

Member of Technical Staff @RekaAILabs. Prev. PhD @ixaGroup, @Aiatmeta FAIR

Joined June 2019
167 Following · 266 Followers
Aitor Ormazabal retweeted
Reka @RekaAILabs
98ms time to first token (faster than human visual reaction time). 65% faster throughput than leading 8B models. Reka Edge is a 7B VLM built for latency-sensitive apps: real-time video analysis, agentic workflows, on-device deployment. reka.ai/news/reka-edge…
2 replies · 8 reposts · 43 likes · 2.7K views
Aitor Ormazabal retweeted
Reka @RekaAILabs
Meet Reka Edge – Our next-generation vision language model for physical AI. Uses 3x fewer input tokens and achieves 65% faster throughput compared to leading 8B models. Image understanding, video analysis, object detection, and tool use. Built for Action. Fast enough for production, deployable anywhere. Read more: reka.ai/news/reka-edge…
8 replies · 46 reposts · 189 likes · 51.3K views
Aitor Ormazabal retweeted
Reka @RekaAILabs
Introducing Reka Speech: an efficient and accurate transcription & translation model. 🗣️ On modern GPUs (e.g. H100), it runs 8x–35x faster than existing solutions for batch processing.
10 replies · 28 reposts · 141 likes · 25.2K views
Aitor Ormazabal retweeted
Reka @RekaAILabs
🚨 New benchmark release 🚨 We're introducing Research-Eval: a diverse, high-quality benchmark for evaluating search-augmented LLMs 👉 Blogpost: reka.ai/news/introduci… 👉 Dataset + code: github.com/reka-ai/resear… 🧵 Here's why this matters:
7 replies · 24 reposts · 91 likes · 26.4K views
Yi Tay @YiTayML
First official Gold medal at the IMO from DeepMind 🥇 with Gemini Deep Think. A general-purpose text-in, text-out model achieving a gold medal was quite unthinkable just about one year ago, and here we are! The frontier of AI is incredibly exciting! Happy to have co-led / captained model training! 😎 A fun fact is that I got roped into this effort thinking it would be a fun little side quest; little did I know it would turn out to be such a huge breakthrough. And yes, the results are certified by the IMO! 😉
Google DeepMind @GoogleDeepMind

An advanced version of Gemini with Deep Think has officially achieved gold medal-level performance at the International Mathematical Olympiad. 🥇 It solved 5️⃣ out of 6️⃣ exceptionally difficult problems, involving algebra, combinatorics, geometry and number theory. Here’s how 🧵

29 replies · 30 reposts · 524 likes · 90.3K views
Aitor Ormazabal retweeted
Sharath Raparthy @sharathraparthy
Looking for an agent that excels at questions requiring dozens of sources and delivers accurate responses with reasoning traces in a few minutes? Reka Research is here for you. Start building today: docs.reka.ai/quick-start
0 replies · 5 reposts · 18 likes · 1.4K views
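For anyone who wants to try Reka Research from code, the sketch below shows one way a call might look. It assumes the Reka API exposes an OpenAI-compatible chat-completions endpoint at https://api.reka.ai/v1 and that Reka Research is reachable under a model id like "reka-flash-research"; both are illustrative assumptions, so check docs.reka.ai/quick-start for the actual endpoint and model names.

```python
# Minimal sketch of calling Reka Research via an OpenAI-compatible client.
# ASSUMPTIONS: the base URL "https://api.reka.ai/v1" and the model name
# "reka-flash-research" are illustrative; see docs.reka.ai/quick-start
# for the real endpoint and model identifiers.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.reka.ai/v1",   # assumed OpenAI-compatible endpoint
    api_key=os.environ["REKA_API_KEY"],  # your Reka API key
)

response = client.chat.completions.create(
    model="reka-flash-research",  # hypothetical model id for Reka Research
    messages=[
        {
            "role": "user",
            "content": "Summarize the main open-source LLM releases of the "
                       "last month, citing your sources.",
        }
    ],
)

print(response.choices[0].message.content)
```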
Eneko Agirre @eagirre.bsky.social
Among the impressive developments at Reka, I'm particularly thankful to @RekaAILabs for open-sourcing the much-needed quantization library. Congrats @aormazabalo on the great work! We are very lucky to have you co-supervising PhDs at @Hitz_zentroa
Aitor Ormazabal @aormazabalo

For our third day of releases we are open sourcing some of our building blocks! I'm particularly happy to be open-sourcing RekaQuant 🗜️, part of our internal quantization stack that I led last year. Short thread on our approach to quantization 🧵1/n

1 reply · 1 repost · 13 likes · 634 views
Aitor Ormazabal @aormazabalo
Instead, RekaQuant quantizes tensors one by one progressively through training, and does direct self-distillation for the unquantized tensors in between. This allows us to use any expensive quantization approach (LDLQ!), and is highly effective.
2 replies · 0 reposts · 4 likes · 248 views
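The loop below is a toy Python sketch of the idea in the tweet above: tensors are committed to low precision one at a time, and in between the still-unquantized tensors are briefly self-distilled against the frozen full-precision model. It is not the RekaQuant implementation (see the open-sourced repo for that); the round-to-nearest quantizer standing in for LDLQ, the KL objective, and the hyperparameters are illustrative assumptions.

```python
# Toy sketch of progressive per-tensor quantization with self-distillation.
# NOT the RekaQuant implementation; the round-to-nearest quantizer (a stand-in
# for LDLQ), the KL loss, and the hyperparameters are illustrative assumptions.
import copy
import itertools
import torch
import torch.nn.functional as F

def quantize_tensor(w: torch.Tensor, bits: int = 4) -> torch.Tensor:
    """Round-to-nearest quantization of a single tensor. Because each tensor is
    handled in isolation, a more expensive per-tensor method could be plugged in here."""
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    return (w / scale).round().clamp(-qmax - 1, qmax) * scale

def progressive_quantize(model, calib_batches, bits=4, steps_per_tensor=16, lr=1e-5):
    teacher = copy.deepcopy(model).eval()            # frozen full-precision reference
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    batches = itertools.cycle(calib_batches)         # calibration input_ids tensors

    for name, p in list(model.named_parameters()):   # quantize tensors one by one
        with torch.no_grad():
            p.copy_(quantize_tensor(p.data, bits))   # commit this tensor to low precision
        p.requires_grad_(False)                      # ...and freeze it from now on

        # Self-distill the remaining (still unquantized) tensors toward the teacher.
        for _ in range(steps_per_tensor):
            batch = next(batches)
            with torch.no_grad():
                t_logits = teacher(batch).logits     # assumes an HF-style causal LM
            s_logits = model(batch).logits
            loss = F.kl_div(
                F.log_softmax(s_logits, dim=-1),
                F.softmax(t_logits, dim=-1),
                reduction="batchmean",
            )
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```

The property this sketch tries to illustrate is that each already-quantized tensor is never touched again, so an arbitrarily expensive solver can be applied to every tensor exactly once while the rest of the network absorbs the error.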
Aitor Ormazabal @aormazabalo
For our third day of releases we are open sourcing some of our building blocks! I'm particularly happy to be open-sourcing RekaQuant 🗜️, part of our internal quantization stack that I led last year. Short thread on our approach to quantization 🧵1/n
Reka @RekaAILabs

📢 We are open sourcing ⚡Reka Flash 3.1⚡ and 🗜️Reka Quant🗜️. Reka Flash 3.1 is a much improved version of Reka Flash 3 that stands out on coding due to significant advances in our RL stack. 👩‍💻👨‍💻 Reka Quant is our state-of-the-art quantization technology. It achieves near-lossless compression of Reka Flash 3.1 to 3.5 bits. 💻

1 reply · 10 reposts · 35 likes · 6.6K views
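For anyone who wants to poke at the open-sourced model, here is a minimal loading sketch with Hugging Face transformers. The repository id "RekaAI/reka-flash-3.1" and the bfloat16/device_map settings are assumptions; the release post and the RekaQuant repo are the authoritative sources for the exact weights, including the 3.5-bit quantized artifacts.

```python
# Minimal sketch for loading the open-sourced Reka Flash 3.1 weights with
# Hugging Face transformers. The repo id "RekaAI/reka-flash-3.1" is assumed;
# check the official release for the exact location and quantized variants.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RekaAI/reka-flash-3.1"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```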
Aitor Ormazabal retweeted
Mikel Artetxe @artetxem
We are open sourcing Reka Flash 3.1, our new 21B reasoning model! Big gains over the previous version, go build with it 🔥🔥🔥 And... we are also open sourcing Reka Quant, our new quantization method achieving SOTA compression to 3.5 bits!
Reka @RekaAILabs

📢 We are open sourcing ⚡Reka Flash 3.1⚡ and 🗜️Reka Quant🗜️. Reka Flash 3.1 is a much improved version of Reka Flash 3 that stands out on coding due to significant advances in our RL stack. 👩‍💻👨‍💻 Reka Quant is our state-of-the-art quantization technology. It achieves near-lossless compression of Reka Flash 3.1 to 3.5 bits. 💻

1 reply · 7 reposts · 49 likes · 2.5K views
Aitor Ormazabal retweeted
Sharath Raparthy @sharathraparthy
🎉🚀 Release Day 3: It's #OpenSource time! We're pleased to open-source Reka Flash 3.1, our coding model achieving state-of-the-art performance on LiveCodeBench and other benchmarks. 📈🔍 🛠️✨ We're also open-sourcing Reka Quant, our cutting-edge quantization tech that compresses Reka Flash 3.1 to just 3.5 bits with near-lossless fidelity 🪄 Congrats to the team (@aormazabalo, Xiaonan and co) who worked on this.
Reka @RekaAILabs

📢 We are open sourcing ⚡Reka Flash 3.1⚡ and 🗜️Reka Quant🗜️. Reka Flash 3.1 is a much improved version of Reka Flash 3 that stands out on coding due to significant advances in our RL stack. 👩‍💻👨‍💻 Reka Quant is our state-of-the-art quantization technology. It achieves near-lossless compression of Reka Flash 3.1 to 3.5 bits. 💻

0 replies · 2 reposts · 7 likes · 499 views
Aitor Ormazabal retweeted
Reka @RekaAILabs
📢 We are open sourcing ⚡Reka Flash 3.1⚡ and 🗜️Reka Quant🗜️. Reka Flash 3.1 is a much improved version of Reka Flash 3 that stands out on coding due to significant advances in our RL stack. 👩‍💻👨‍💻 Reka Quant is our state-of-the-art quantization technology. It achieves near-lossless compression of Reka Flash 3.1 to 3.5 bits. 💻
12 replies · 50 reposts · 206 likes · 735.5K views