Tome

16 posts


@get_tome

A magical desktop app that puts the power of LLMs and MCP in the hands of everyone.

Bay Area · Joined June 2023
10 Following · 21 Followers
Tome retweeted
Runebook@runebookai·
We're joining @KeycardLabs! More details here: blog.runebook.ai/taking-mcp-fur…
Keycard@KeycardLabs

Keycard has acquired Runebook! 🎉 @peteraldehyde & @mattenoble have been on the front lines of MCP from day one - building tooling and seeing the trust + security gaps up close. To ship agents into production, teams need to know:
• Which MCP servers are safe
• What an agent can do, and on whose behalf
• How to audit every tool call
• How to prevent data & credential leakage
Keycard provides the missing layer: cryptographic agent identities, dynamic authorization, and full audit lineage. Customers are already moving MCP from sandbox → production. Peter & Matte know that “easy to connect” means nothing without “safe to deploy.” Together we’ll continue to make MCP production-ready by default. 🧵 Read the full story in the thread →

Tome@get_tome·
By popular demand - our docs are live! 🏄 Dive in: docs.gettome.app We'll be adding more over the next few weeks - let us know what tutorials or guides you'd love to see!
Tome@get_tome·
🚀 New in Tome v0.10.0: Relays 📡
Chat with your LLMs + MCP servers directly from Telegram.
On a plane? On a mountain? Under the ocean? Fire off Claude Code or toggle your smart lights from anywhere using Tome's new Telegram integration.
More here: blog.runebook.ai/tome-relays-ch…
Tome@get_tome·
🧙 New in Tome: Scheduled Tasks
Tome now lets you run any #LLM + #MCP + prompt on an hourly or daily schedule - no cron jobs, no agent chains. Automatic price checks, Slack digests into Notion, etc., without writing a single line of code. blog.runebook.ai/scheduled-task…
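For contrast, here is the kind of crontab entry a daily automated check would otherwise require (the script path is hypothetical; the five-field syntax is standard cron):

```
# m h dom mon dow   command
# Run a hypothetical price-check script every day at 09:00
0 9 * * * /usr/bin/python3 /home/me/price_check.py
```

Scheduled Tasks replace this with a prompt plus a schedule picker inside the app.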
Tome@get_tome·
Changelog for 0.7.0 - First external contributor alert 🚨 Thanks to aristideubertas for submitting a PR to add a customizable system prompt to the settings menu. It's a lighter week but we've got a fun new feature in the works, stay tuned! blog.runebook.ai/changelog-0-7-…
Tome retweeted
peter cho@peteraldehyde·
🥳 Tome 0.5.0 adds Windows + OpenAI + Gemini support
demo: grabs random MTG cards, then uses Gemini to write songs in the style of Sum 41, then writes a script that returns a song at random
download here: github.com/runebookai/tom…
Tome@get_tome·
Introducing Tome, a magical LLM client! We're the easiest way to play with local LLMs and MCP on macOS (Windows and cloud model support coming later this week!). We wrote here about what Tome is, why we built it, and where we're headed: blog.runebook.ai/introducing-to…
Tome@get_tome·
🎵 New Release Friday v0.4.0 adds in-app updates - No more redownloading and reinstalling the app, always stay up to date with the latest and greatest! github.com/runebookai/tom…
Tome retweeted
Henry Mao@Calclavia·
Another Smithery integration released this week: You can now install Smithery MCPs on Tome ( @get_tome )! Tome allows anyone to get started with local LLMs easily. Now, extensible with Smithery MCPs! @mattenoble @peteraldehyde
peter cho@peteraldehyde

Very hyped for this - If you've been wanting to dabble with local models and MCP, this is the simplest and quickest way to get started! @ollama + @get_tome + @SmitheryDotAI = local LLMs making tool calls in minutes 🤗 github.com/runebookai/tome

Tome@get_tome·
📢 As of v0.3.0 you can now browse, search, and one-click install thousands of MCP servers via our Smithery integration! @ollama + Tome + @SmitheryDotAI = instant local LLM/MCP server playground 🤗 github.com/runebookai/tom…
Tome@get_tome·
🎉 New Release Friday! v0.2.0 lets you configure context windows and we have better visualization of MCP tool calls in chat github.com/runebookai/tom…
Tome@get_tome·
Our first technical preview is live on GitHub! Tome is an open source local LLM client that lets you easily connect @ollama to MCP servers without managing uv/npm or any json config. We've got some fun features in the works, check it out here! github.com/runebookai/tome
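For context, connecting an MCP server by hand typically means editing a JSON config like the one below (this is the format used by common MCP clients such as Claude Desktop; the filesystem server shown is a published example, and the directory path is a placeholder). Tome handles this wiring, plus the uv/npm runtime management, for you:

```
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
    }
  }
}
```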