

Martin Grebhahn
@MartinGrebhahn
Building practical AI products. Product ops by trade. Focused on AI automation, S&P 500 trading & designing green energy systems! For builders & operators!





MiMo-V2-Pro, Omni & TTS are out. Our first full-stack model family built truly for the Agent era.

I call this a quiet ambush — not because we planned it, but because the shift from the Chat to the Agent paradigm happened so fast that even we barely believed it. Somewhere in between was a process that was thrilling, painful, and fascinating all at once.

The 1T base model started training months ago. The original goal was long-context reasoning efficiency. Hybrid Attention carries real innovation without overreaching — and it turns out to be exactly the right foundation for the Agent era. A 1M context window. MTP inference for ultra-low latency and cost. These architectural decisions weren't trendy; they were a structural advantage we built before we needed it.

What changed everything was experiencing a complex agentic scaffold — what I'd call orchestrated Context — for the first time. I was shocked on day one. I tried to convince the team to use it. That didn't work. So I gave a hard mandate: anyone on the MiMo Team with fewer than 100 conversations by tomorrow can quit. It worked. Once the team's imagination was ignited by what agentic systems could do, that imagination converted directly into research velocity.

People ask why we move so fast. I saw it firsthand building DeepSeek R1. My honest summary:
— Backbone and infra research has long cycles. You need strategic conviction a year before it pays off.
— Posttrain agility is a different muscle: product intuition driving evaluation, iteration cycles compressed, paradigm shifts caught early.
— And the constant: curiosity, sharp technical instinct, decisive execution, full commitment — and something that's easy to underestimate: a genuine love for the world you're building for.

We will open-source — when the models are stable enough to deserve it.

From Beijing, very late, not quite awake.



An Australian breakfast radio show spent this morning seeing how far they could drive their Tesla past 0%. Probably the most entertaining piece of Tesla content I’ve ever watched 😂


I heard about a guy in a small town in England who turned his OpenClaw into a short-form video marketing machine: millions of views, steady app downloads, and revenue coming in every day. I needed to find out how he was doing it.

1. Spin up an AI "employee" using OpenClaw.
2. Give it one job, like "grow your app with TikTok."
3. Give it access to TikTok analytics, a browser to research, and image/video tools to create content.
4. The OpenClaw studies your niche and starts generating slideshows and videos.
5. Every post feeds performance data back into the system:
views → hook quality
downloads → CTA quality
revenue → funnel quality

The OpenClaw then iterates on
- new hooks
- new formats
- new CTAs
until it finds winners. One of his posts hit 170k+ views, and the system keeps improving because the analytics loop feeds back into the content generation, so the agent slowly learns what works.

What I like about this is the framing. Most people think about AI tools. This is different: you spin up an AI employee, you give it a job, and you let it run the loop.

Thanks to @oliverhenry for coming on the @startupideaspod today. More like this soon; I will share the most interesting stories and gatekeep nothing. This episode was dripping in sauce. I gotta try this and see if it works. Kinda wild if it does. Watch
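The loop in steps 1–5 can be sketched as code. A hedged sketch under stated assumptions: `fake_analytics`, `METRIC_TO_COMPONENT`, and `iterate` are hypothetical names, the metrics are stubbed, and a real setup would call TikTok's analytics API and a content generator instead of these stand-ins.

```python
# Sketch of the analytics feedback loop: post a content variant, read back
# metrics, map each metric to a quality signal (views -> hook, downloads ->
# CTA, revenue -> funnel), then swap in a new variant for whichever
# component scores worst. The scoring rule here is a naive invention.
import random

def fake_analytics(post):
    """Stand-in for platform analytics; deterministic per post content."""
    key = f"{post['hook']}|{post['cta']}|{post['funnel']}"
    rng = random.Random(sum(ord(c) for c in key))
    return {
        "views": rng.randint(100, 200_000),
        "downloads": rng.randint(0, 5_000),
        "revenue": rng.uniform(0, 500),
    }

METRIC_TO_COMPONENT = {"views": "hook", "downloads": "cta", "revenue": "funnel"}

def iterate(post, variants, rounds=20, seed=0):
    """Each round: read metrics, normalize them so they are comparable,
    find the weakest signal, and retry the component behind it."""
    rng = random.Random(seed)
    best, best_views = dict(post), 0
    for _ in range(rounds):
        m = fake_analytics(post)
        if m["views"] > best_views:  # remember the best-performing variant
            best, best_views = dict(post), m["views"]
        scores = {"views": m["views"] / 200_000,
                  "downloads": m["downloads"] / 5_000,
                  "revenue": m["revenue"] / 500}
        weakest = min(scores, key=scores.get)
        comp = METRIC_TO_COMPONENT[weakest]
        post = dict(post, **{comp: rng.choice(variants[comp])})
    return best, best_views
```

The design point matches the post: the agent is not a one-shot tool but a loop, and the analytics readback is what lets it "slowly learn what works."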

Seven times more expensive than renewable energy. Pure lobbying and ideology.













