Osaurus

815 posts


@OsaurusAI

Own your AI. Agents that remember, execute code in isolated VMs, and stay reachable from anywhere -- all on your Mac. Any model. No cloud required. Open source.

Joined May 2025
15 Following · 6.1K Followers
Pinned Tweet
Osaurus
Osaurus@OsaurusAI·
This is what native feels like. No Electron. No Python runtime. Just Swift on Apple Silicon.
119
150
3.5K
453.8K
Osaurus
Osaurus@OsaurusAI·
@icefrog_sol @TechCrunch Local-first everything, inference is local/cloud agnostic. There's no backend, everything is running locally.
0
0
0
10
Osaurus
Osaurus@OsaurusAI·
@PsudoMike @TechCrunch You can choose either as the default. Osaurus also ships with its own router, which is hosted locally.
0
0
1
39
PsudoMike 🇨🇦
PsudoMike 🇨🇦@PsudoMike·
@TechCrunch The real question is the routing logic. Most local plus cloud apps default to cloud and use local as fallback. Flipping that default changes battery and latency assumptions overnight. Curious which way Osaurus leans.
1
0
0
150
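The local-first default PsudoMike asks about is simple to state in code. A minimal sketch, assuming a router that tries the preferred backend first and falls back to the other; this is illustrative only, not Osaurus's actual routing logic:

```python
def pick_backend(local_available: bool, cloud_available: bool,
                 prefer_local: bool = True):
    """Toy router: try the preferred backend first, fall back to the
    other. Backend names and signature are illustrative assumptions."""
    order = ["local", "cloud"] if prefer_local else ["cloud", "local"]
    available = {"local": local_available, "cloud": cloud_available}
    for backend in order:
        if available[backend]:
            return backend
    return None  # nothing reachable

# A local-first default keeps the latency and battery assumptions on-device:
assert pick_backend(local_available=True, cloud_available=True) == "local"
assert pick_backend(local_available=False, cloud_available=True) == "cloud"
```

Flipping `prefer_local` is exactly the design decision the tweet describes: cloud-first with local fallback, or local-first with cloud fallback.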
h
h@fhreire·
@OsaurusAI wait. when did we change the dinosaur???
1
0
0
32
Osaurus
Osaurus@OsaurusAI·
Two new primitives in Osaurus:
Encrypted SQLite per agent. Agent owns the schema and data.
Self-scheduling. Agent picks when to call inference next. A daily tracker picks check-in time. A weekly report picks Friday.
Osaurus tweet media
1
0
4
225
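The two primitives above can be sketched in a few lines of Python: a per-agent database whose schema the agent defines, plus a self-chosen next-run time. The standard-library `sqlite3` module has no encryption, so the encrypted layer (e.g. something like SQLCipher) is an assumption here, as are all names below:

```python
import sqlite3
from datetime import datetime, timedelta

class AgentStore:
    """Sketch of the two primitives. NOTE: stdlib sqlite3 is
    unencrypted; an encrypted build (e.g. SQLCipher) is assumed
    for the real thing. Schema and names are illustrative."""

    def __init__(self, path: str = ":memory:"):
        self.db = sqlite3.connect(path)
        # The agent owns its schema: here, a calorie log it designed itself.
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS entries (ts TEXT, calories INTEGER)")
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS schedule (next_run TEXT)")

    def log(self, calories: int) -> None:
        self.db.execute("INSERT INTO entries VALUES (?, ?)",
                        (datetime.now().isoformat(), calories))

    def schedule_next(self, hours: float) -> datetime:
        # Self-scheduling: the agent records when inference should run next.
        next_run = datetime.now() + timedelta(hours=hours)
        self.db.execute("DELETE FROM schedule")
        self.db.execute("INSERT INTO schedule VALUES (?)",
                        (next_run.isoformat(),))
        return next_run
```

A daily tracker would call `schedule_next(24)`; a weekly report would compute the hours until Friday and schedule that instead.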
Osaurus
Osaurus@OsaurusAI·
The agent decides when to run. Told it to track my calories. It designed the schema, scheduled its own daily check-ins, logged the entry, built a dashboard. One sentence. Encrypted SQLite on my Mac.
Osaurus tweet media
1
1
16
751
Osaurus retweeted
Leonardo Moura
Leonardo Moura@lfsmoura·
🧵 1/4 You can run Hindsight (an AI memory layer) entirely locally — no cloud LLM required. Here's how to wire it up with @OsaurusAI, a local OpenAI-compatible model server. 🔒
2
1
3
526
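Wiring a client to a local OpenAI-compatible server like the thread describes boils down to POSTing a standard chat-completions payload to localhost. A minimal sketch; the port, path prefix, and model name are illustrative assumptions, not Osaurus defaults:

```python
import json
import urllib.request

def chat_completion_request(prompt: str,
                            base_url: str = "http://localhost:8080/v1",
                            model: str = "local-model"):
    """Build an OpenAI-compatible chat-completions request aimed at a
    local server. base_url and model are placeholders; point them at
    whatever your local server actually exposes."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (only works when a server is actually listening):
# with urllib.request.urlopen(chat_completion_request("hi")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the wire format matches OpenAI's, a memory layer like Hindsight only needs the base URL swapped to talk to the local server instead of the cloud.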
Osaurus retweeted
Luke
Luke@CuriousLuke93x·
If you want to beat Claude at Legal AI, make your SaaS work on local AI inference. Clients' data then never leaves their control. I told @MaxJunestrand to create a "Legora Server" earlier this year. Offline = inevitable. Happy to advise anyone on how to do it.
9
2
68
8.7K
Osaurus
Osaurus@OsaurusAI·
@v0idm4in No it doesn’t. Model only knows about the browser state and everything is controlled and held within your local machine. It only exposes the interface which the model uses to work with the secure session. Nothing leaves your device without your consent
1
0
1
57
v0id
v0id@v0idm4in·
@OsaurusAI is the browser session sent to the model?
1
0
0
36
Osaurus
Osaurus@OsaurusAI·
@v0idm4in Once it's secured the session, it no longer needs to log in again. I had it log in for the demo, but it remembers the browser session for subsequent sessions (and background tasks!)
1
0
1
42
Osaurus
Osaurus@OsaurusAI·
@v0idm4in I typed it myself. The model invoked the browser for me to create the secure session, which it then used in the background to place the order on my behalf.
1
0
2
66
Osaurus
Osaurus@OsaurusAI·
@InsiderPresider Sometimes you need that box of tissues when you make breakthroughs
0
0
1
59
Vincent
Vincent@InsiderPresider·
@OsaurusAI It is impressive that the AI finally mastered the art of solving your minor inconvenience while the rest of the world waits for it to solve the big ones.
1
0
2
67