Bodega One @BodegaOneAI
92 posts

We got tired of $20/month for a chat app. So we built a full IDE + AI agent you can own. No subscription. Local-first. Ships May 2026.

bodegaone.ai · Joined March 2026
11 Following · 9 Followers
Pinned Tweet
Bodega One @BodegaOneAI
We just got featured on BetaList! Take a look: betalist.com/startups/bodeg… We built the AI coding environment we wanted to use:
- Any LLM
- Pay once
- Monaco editor
- Air Gap mode
- No cloud dependency
- No silent usage nerfs
Sign up at bodegaone.ai
0 replies · 2 reposts · 0 likes · 181 views
Bodega One @BodegaOneAI
@jun_song We think this is the best course of growth, though. Why pay monthly for all those subscriptions (we know you don't have just one!) when you could invest early in your own hardware and use it for more than just local LLMs to make that ROI back?
0 replies · 0 reposts · 0 likes · 168 views
송준 Jun Song @jun_song
One issue with getting into local LLMs: you begin with affordable hardware, but give it time and you'll be constantly looking for upgrades. Most people I know ended up doing exactly this. Fair warning! 😂
[image]
44 replies · 7 reposts · 196 likes · 16.3K views
Manish Kumar @Manixh02
Hey guys, is there a way to make VS Code look like this? It looks so good.
[image]
117 replies · 157 reposts · 6.4K likes · 726.4K views
Gurpreet Singh @gurpreet671
Builders only. Show what you're building. Skip the pitch and just drop the link. 50k builders watching ↓
381 replies · 2 reposts · 181 likes · 14.9K views
Bodega One @BodegaOneAI
@thdxr Honestly leaning towards 2, just for the ability to plug and play. We don't believe in locking users into specific agents and LLMs. That's why our IDE was built for multi-model swapping. Nothing worse than having to start a fresh session to swap models and losing your context.
0 replies · 0 reposts · 0 likes · 554 views
dax @thdxr
if you look around you can see everyone is completely confused about whether
1. every product needs an agent
2. every product needs to plug into an agent users are already using
everyone picking 1 or 2 and building infra for that and praying they're right
88 replies · 20 reposts · 584 likes · 47.1K views
Bodega One @BodegaOneAI
Interested in customizing your entire flow for specific models? Check out our model settings here!
0 replies · 0 reposts · 0 likes · 159 views
Bodega One @BodegaOneAI
The biggest frustration with multi-LLM swapping right now is having to start a new session when you want to use, say, Qwen3.6 for lighter tasks and Claude Opus for the heavy lifting. Well, we fix that.
1 reply · 0 reposts · 5 likes · 45.9K views
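The tweet above describes swapping models mid-session without losing context. A minimal sketch of that idea, assuming a single shared message history handed to whichever model takes the next turn (all names here are illustrative, not Bodega One's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """One conversation whose history survives model swaps."""
    history: list = field(default_factory=list)

    def ask(self, model: str, prompt: str, call_fn) -> str:
        # The full accumulated history goes to whichever model is
        # chosen for this turn, so swapping models never resets context.
        self.history.append({"role": "user", "content": prompt})
        reply = call_fn(model, self.history)
        self.history.append({"role": "assistant", "content": reply})
        return reply

# Stub backend standing in for a real LLM call (illustrative only).
def fake_llm(model: str, messages: list) -> str:
    return f"{model} saw {len(messages)} messages"

s = Session()
s.ask("qwen3-light", "rename this variable", fake_llm)   # light task
s.ask("claude-heavy", "refactor the module", fake_llm)   # heavy task, same context
print(len(s.history))  # both turns accumulated: 4 entries
```

The design point is simply that context lives in the session, not in any one model's connection, so the "heavy" model sees everything the "light" model already did.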
Bodega One @BodegaOneAI
We asked Bodega One to build a browser dino game from scratch, then change up the style. You don't always need to be cloud-dependent. Watch what the agent does.
0 replies · 2 reposts · 2 likes · 173 views
Bodega One @BodegaOneAI
5/5 Frontier (multi-GPU or 96GB+ unified): GLM-5.1 (744B-A40B) or DeepSeek V4 (1.6T-A49B). Both MIT. Both ranked top of open-weight leaderboards last week. Local stopped being "the cheaper option." For several use cases it's now the better one. #LocalLLM
0 replies · 0 reposts · 0 likes · 50 views
Bodega One @BodegaOneAI
4/5 24-32GB → Qwen3.6-35B-A3B (MoE) or Devstral Small 24B for purpose-built agent work. 48GB+ → Llama 4 Scout subset, GLM-5.1 partial loads.
1 reply · 0 reposts · 0 likes · 64 views
Bodega One @BodegaOneAI
1/5 Updated VRAM-to-model map for April 2026. The advice changed materially in the last 90 days.
1 reply · 0 reposts · 1 like · 39 views
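The sizing thread above maps VRAM tiers to models. A rough back-of-envelope for the weights side, assuming a simple quantized-bits × parameter-count estimate plus a flat allowance for KV cache and runtime buffers (real usage varies with context length, runtime, and MoE active-parameter layout):

```python
def vram_estimate_gb(params_b: float, quant_bits: int = 4, overhead_gb: float = 2.0) -> float:
    """Rough weights-only VRAM estimate for a dense model.

    params_b:    parameter count in billions.
    quant_bits:  quantization width (4 = Q4, 8 = Q8, 16 = fp16).
    overhead_gb: flat allowance for KV cache and runtime buffers.
    """
    weights_gb = params_b * quant_bits / 8  # 1B params at 8 bits is ~1 GB
    return weights_gb + overhead_gb

# A 35B model at Q4: roughly 19.5 GB, which lands in the 24-32GB tier above.
print(round(vram_estimate_gb(35, 4), 1))
```

This is only a floor: MoE models stream fewer active parameters per token but still need all experts resident, and long contexts push the KV cache well past a flat 2 GB.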
Bodega One @BodegaOneAI
@0xSero Bodega One hits 5/6: 15+ provider presets plus a custom endpoint, real BYOK (keys go direct), Monaco desktop app, runs local or cloud, not a terminal. No built-in browser yet; the agent uses web_fetch instead. Beta May 1st, sign up for the waitlist: Bodegaone.ai
0 replies · 0 reposts · 0 likes · 557 views
0xSero @0xSero
Help, I'm looking for a desktop app with the following features:
- Can add custom models/providers
- Has a slick UI like this
- Can bring my own keys
- Performs well
- Is not a terminal
- Has a browser
Here's what I know of:
- Claude Desktop
- Cursor Glass
- OpenCode Desktop
- Factory Desktop
- Goose??
- T3Code
- Aider
The best ones are Cursor and Codex, but Codex only supports GPT models (unless someone knows how I can change that?) and Cursor doesn't support BYOK at all :P
[image]
124 replies · 9 reposts · 300 likes · 175.6K views
Bodega One reposted
Xiaomi MiMo @XiaomiMiMo
Xiaomi MiMo-V2.5 Series: Pushing Open-Source Agents Forward
🔸 MiMo-V2.5-Pro, our strongest model yet. A major leap from MiMo-V2-Pro in general agentic capabilities, complex software engineering, and long-horizon tasks, now matching frontier models like Claude Opus 4.6 and GPT-5.4 across most benchmarks (SWE-bench Pro 57.2, Claw-Eval 63.8, τ3-Bench 72.9). It can autonomously complete professional tasks involving 1,000+ tool calls, work that would take human experts days. Tech Blog: mimo.xiaomi.com/blog/mimo-v2.5…
🔸 MiMo-V2.5, native omnimodal with strong agentic capabilities. Pro-level agent performance at roughly half the cost. Improved multimodal perception across image and video understanding, native 1M-token context window, and significantly more efficient inference. Tech Blog: mimo.xiaomi.com/blog/mimo-v2.5
🔗 API & Token Plan: platform.xiaomimimo.com/token-plan
[image]
135 replies · 279 reposts · 2.6K likes · 354.9K views
Bodega One @BodegaOneAI
Only 2 more weeks until beta license keys drop for everyone who signed up on the waitlist! If you haven't signed up yet, head over to Bodegaone.ai and enter your name and email to be sent a beta key on May 1st!
0 replies · 0 reposts · 0 likes · 23 views
Bodega One @BodegaOneAI
Everyone's always talking about Claude and ChatGPT, but have you actually tried a local LLM? If so, what have you tried, and what's your hardware setup? If not, what's stopping you?
0 replies · 0 reposts · 0 likes · 19 views
Bodega One @BodegaOneAI
What's your local AI setup right now? Ollama? LM Studio? Something else entirely? Genuinely asking, always curious what people are running.
0 replies · 0 reposts · 0 likes · 27 views