Tilantra

28 posts

Tilantra banner

Tilantra
@tilantra

Joined December 2025
5 Following · 2 Followers

Tilantra reposted
Uneed
Uneed@UneedLists·
Here's yesterday's podium, congratulations 🙌🏻 🥇 Remly (@remly_it) - 117 points 🥈 RandomMockups.com (@txh) - 93 points 🥉 CapsuleHub (@tilantra) - 72 points
English
2
4
4
143
Tilantra
Tilantra@tilantra·
We are looking for early stage founders and people looking to solve the context problem across verticals. Dm's are open.
English
0
1
2
9
Tilantra
Tilantra@tilantra·
AI isn’t expensive. Re-processing the same context is. Every summary, doc, or “quick insight” often re-reads the same data and bills you like it’s new. That’s how output ends up costing more than input. Not a model problem, but a context problem. So, we have built Capsules: store context once, reuse it anywhere. And soon, not just prompts… but everything around them too. Feels obvious in hindsight. Are others seeing this? #AI
Tilantra tweet media
English
1
0
2
28
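The billing argument in the post above (re-reading the same context gets charged like new input) can be sketched with a toy cache. Everything here is illustrative: `ContextStore` and its word-count "billing" are assumptions for the sketch, not Tilantra's actual product.

```python
import hashlib

# Toy sketch of "store context once, reuse it anywhere": hash the context
# and pay the processing cost only the first time it is seen.
class ContextStore:
    def __init__(self):
        self.cache = {}          # digest -> processed context
        self.tokens_billed = 0   # stand-in for API token charges

    def process(self, context: str) -> str:
        digest = hashlib.sha256(context.encode()).hexdigest()
        if digest not in self.cache:
            # First sight: pay the "token" cost once (words as a proxy).
            self.tokens_billed += len(context.split())
            self.cache[digest] = context.upper()  # stand-in for real work
        return self.cache[digest]

store = ContextStore()
doc = "quarterly report with the same background data"
store.process(doc)
store.process(doc)   # second read is served from the store, billed zero
```

The point of the sketch is only the accounting: without the cache, every summary or "quick insight" over `doc` would add its word count to the bill again.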
Tilantra
Tilantra@tilantra·
@cto_ya_know @rohanpaul_ai Well, the delivery happens through an extension so users can adopt it easily, but it's much more than just that! Really love the "fancy ragdb" term though! We not only perform RAG but also version, share, and plug context into essentially any tool of your choice.
English
0
0
1
9
Rohan Paul
Rohan Paul@rohanpaul_ai·
The paper says the best way to manage AI context is to treat everything like a file system.

Today, a model's knowledge sits in separate prompts, databases, tools, and logs, so context engineering pulls this into a coherent system. The paper proposes an agentic file system where every memory, tool, external source, and human note appears as a file in a shared space.

A persistent context repository separates raw history, long-term memory, and short-lived scratchpads, so the model's prompt holds only the slice needed right now. Every access and transformation is logged with timestamps and provenance, giving a trail for how information, tools, and human feedback shaped an answer.

Because large language models see only limited context each call and forget past ones, the architecture adds a constructor to shrink context, an updater to swap pieces, and an evaluator to check answers and update memory.

All of this is implemented in the AIGNE framework, where agents remember past conversations and call services like GitHub through the same file-style interface, turning scattered prompts into a reusable context layer.

Paper Link: arxiv.org/abs/2512.05470
Paper Title: "Everything is Context: Agentic File System Abstraction for Context Engineering"
Rohan Paul tweet media
English
64
184
1.1K
81.5K
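The file-system abstraction the thread above describes (every memory and note as a file, every access logged with provenance) can be sketched in a few lines. `ContextFS` and the example paths are illustrative assumptions, not the AIGNE framework's actual API.

```python
import time

# Minimal sketch of an agentic "context file system": memories, notes, and
# scratchpads live at paths, and every read/write is logged with a timestamp
# so there is a provenance trail for how context shaped an answer.
class ContextFS:
    def __init__(self):
        self.files = {}   # path -> content
        self.log = []     # (timestamp, op, path) provenance entries

    def write(self, path: str, content: str) -> None:
        self.files[path] = content
        self.log.append((time.time(), "write", path))

    def read(self, path: str) -> str:
        self.log.append((time.time(), "read", path))
        return self.files[path]

fs = ContextFS()
fs.write("/memory/long_term/user_prefs", "prefers concise answers")
fs.write("/scratch/current_task", "summarize Q3 metrics")
# The prompt holds only the slice needed right now, not the whole repository:
prompt_slice = fs.read("/scratch/current_task")
```

The separation the paper argues for shows up in the path layout: long-term memory and short-lived scratchpads sit in different directories, and the log records which slices actually reached the prompt.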
Tilantra
Tilantra@tilantra·
Hi! Love this, and it's exactly what we are solving! We are building 💊Capsules, which are basically artifacts that can transfer context from one tool to another as easily as a drag and drop, while being versionable and shareable. It's persistent-state context that is dynamically injected, prevents context rot and hallucinations, and fetches context as needed! Imagine this: you "capsule" an email, drop it into GPT for ideas, drop the same versioned capsule into Figma to create decks, and finally into Cursor. Connect to ANY agent using our MCPs. Without ever repeating yourself. Check us out! We are already live! Product: chromewebstore.google.com/detail/capsule…
English
0
0
0
8
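The capsule behavior described above (versionable, shareable, portable across tools) can be sketched as an immutable artifact type. The `Capsule` class and `new_version` method are hypothetical names for illustration, not Tilantra's actual API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a "capsule": an immutable, versioned context
# artifact whose content can be handed to any tool without re-explaining.
@dataclass(frozen=True)
class Capsule:
    name: str
    content: str
    version: int = 1
    history: tuple = field(default=())

    def new_version(self, content: str) -> "Capsule":
        # Each edit yields a new version; earlier ones remain shareable.
        return Capsule(self.name, content,
                       self.version + 1,
                       self.history + (self.content,))

email = Capsule("launch-email", "Draft announcement for v2")
email_v2 = email.new_version("Final announcement for v2")
# email_v2.content can now go to a chat model, a design tool, or an editor;
# email (v1) is untouched and still usable by anyone it was shared with.
```

Immutability is the design choice doing the work here: because old versions are never mutated, the same capsule can be dropped into several tools concurrently without the copies drifting apart.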
Victoria Slocum
Victoria Slocum@victorialslocum·
Prompt engineering is dead. Long live 𝗖𝗼𝗻𝘁𝗲𝘅𝘁 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴

(Well, not quite dead - but it's definitely evolving into something way more powerful)

Meet 𝗖𝗼𝗻𝘁𝗲𝘅𝘁 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴 - the art of building dynamic systems that give LLMs exactly what they need to succeed.

As we move from simple chatbots to complex AI agents, we're realizing that clever prompts aren't enough. What matters is orchestrating an entire ecosystem of information that flows into your LLM.

So what exactly does that mean? It's about building dynamic systems to provide the right information and tools in the right format such that the LLM can plausibly accomplish the task.

The anatomy of a context-engineered system includes:
📊 𝗨𝘀𝗲𝗿 𝗜𝗻𝗳𝗼𝗿𝗺𝗮𝘁𝗶𝗼𝗻: Preferences, history, and personalization data
🔧 𝗧𝗼𝗼𝗹 𝗨𝘀𝗲: APIs, calculators, search engines - whatever the LLM needs to get the job done
🔍 𝗥𝗔𝗚 𝗖𝗼𝗻𝘁𝗲𝘅𝘁: Retrieved information from vector databases like Weaviate
💬 𝗨𝘀𝗲𝗿 𝗜𝗻𝗽𝘂𝘁: The actual query or task at hand
🧠 𝗔𝗴𝗲𝗻𝘁 𝗥𝗲𝗮𝘀𝗼𝗻𝗶𝗻𝗴: The LLM's thought process and decision-making chain
📜 𝗖𝗵𝗮𝘁 𝗛𝗶𝘀𝘁𝗼𝗿𝘆: Previous interactions that provide continuity

But what about the memory architecture?
• 𝗦𝗵𝗼𝗿𝘁-𝘁𝗲𝗿𝗺 𝗺𝗲𝗺𝗼𝗿𝘆: Lives in the context window, handling current conversations
• 𝗟𝗼𝗻𝗴-𝘁𝗲𝗿𝗺 𝗺𝗲𝗺𝗼𝗿𝘆: Stored in vector databases (like Weaviate), persisting user preferences and past interactions across sessions

Why does this matter? Because when agentic systems fail, it's rarely because the model isn't smart enough. It's because we haven't given it the right context.

The format matters too. A well-structured error message beats a massive JSON blob every time. Just like humans, LLMs need clear, digestible communication.
Victoria Slocum tweet media
English
18
223
1.1K
80.5K
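The anatomy in the post above (user info, tools, RAG, history, input, all assembled under a context-window budget) can be sketched as a small prompt builder. The function name, section labels, and word-based budget are assumptions made for the sketch, not any library's API.

```python
# Toy sketch of context engineering: combine the components the post lists
# into one structured prompt, trimming to a fixed budget so the slice that
# reaches the LLM stays inside the context window.
def build_context(user_info, tool_results, rag_chunks, history, user_input,
                  budget_words=50):
    sections = [
        ("User", user_info),
        ("Tools", tool_results),
        ("Retrieved", " ".join(rag_chunks)),
        ("History", history),
        ("Task", user_input),
    ]
    out, used = [], 0
    for name, text in sections:
        words = text.split()
        if used + len(words) > budget_words:
            words = words[: budget_words - used]  # trim to fit the budget
        used += len(words)
        out.append(f"## {name}\n" + " ".join(words))
    return "\n".join(out)

ctx = build_context(
    user_info="prefers bullet points",
    tool_results="calculator: 42",
    rag_chunks=["Weaviate doc chunk one", "chunk two"],
    history="asked about revenue yesterday",
    user_input="summarize this quarter",
)
```

Real systems would rank and select pieces by relevance rather than trimming in a fixed order, but the shape is the same: a dynamic assembly step sits between your data sources and the model call.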
Tilantra
Tilantra@tilantra·
Brilliant viewpoint! We agree 100%, and WE SOLVE THIS! Files? No. Capsules? Yes. We are building 💊Capsules, which are basically artifacts that can transfer context from one tool to another as easily as a drag and drop, while being versionable and shareable. We take care of the "when" using Dynamic Context Injection! Imagine this: you "capsule" an email, drop it into GPT for ideas, drop the same versioned capsule into Figma to create decks, and finally into Cursor. Without ever repeating yourself. Check us out! We are already live! Product: chromewebstore.google.com/detail/capsule…
English
1
0
1
8
alex morris 🔥
alex morris 🔥@cto_ya_know·
ok so "what if context was just files" has been proposed roughly 47 times since 2023

the paper's insight: LLMs have fragmented context (prompts, RAG, tools, memory) and this is messy. true!

their solution: unify it under filesystem semantics. files, directories, read/write ops, timestamps

the problem is filesystems are the wrong abstraction: files are static, context is dynamic. context relevance changes MID-GENERATION. a filesystem snapshot at prompt construction time is already stale. you'd need continuous re-evaluation, which defeats the point

"everything is a file" failed before. unix tried this, then we got sockets, pipes, procfs, sysfs, etc. turns out different things have different access patterns. forcing tool calls into file semantics is impedance mismatch city

the components they describe already exist separately:
"constructor to shrink context" = every RAG system
"updater to swap pieces" = sliding window attention
"evaluator to check answers" = RLHF / constitutional AI
"persistent repository" = literally just a database
packaging these as "filesystem" doesn't add capability, it adds a metaphor

provenance/logging is good but not novel. langsmith, langfuse, phoenix, etc already do this. the filesystem framing doesn't make lineage tracking easier; if anything it obscures the actual dependency graph

AIGNE framework claims:
"agents remember past conversations" = memory module (everyone has this)
"call github through file interface" = MCP with extra steps
"reusable context layer" = that's just... a library

the real problem with context engineering isn't the abstraction, it's WHAT to include and WHEN. this paper proposes WHERE to store it, which is the easy part

also suspicious: no benchmarks comparing to existing context management approaches. just "we built a thing and it has these components"

tldr: filesystem metaphor is elegant but leaky. the hard problems (relevance, freshness, coherence) aren't solved by renaming "retrieve" to "read". it's a systems paper cosplaying as an AI paper
English
1
0
0
53
Tilantra
Tilantra@tilantra·
Files? No. Capsules? Yes. We are building 💊Capsules, which are basically artifacts that can transfer context from one tool to another as easily as a drag and drop, while being versionable and shareable. Exactly as you mentioned: persistent context state that is never lost! Imagine this: you "capsule" an email, drop it into GPT for ideas, drop the same versioned capsule into Figma to create decks, and finally into Cursor. Without ever repeating yourself. Check us out! We are already live! Product: chromewebstore.google.com/detail/capsule…
English
0
0
0
7
Tilantra
Tilantra@tilantra·
This is great! But with multiple AI tools out there, even with the best prompts you still have to re-explain yourself, and you get context rot and hallucinations. We are building 💊Capsules, which are basically artifacts that can transfer context from one tool to another as easily as a drag and drop, while being versionable and shareable. Imagine this: you "capsule" an email, drop it into GPT for ideas, drop the same versioned capsule into Figma to create decks, and finally into Cursor. Without ever repeating yourself. Connect to ANY agent using MCPs. Check us out! We are already live! Product: chromewebstore.google.com/detail/capsule…
English
0
0
0
4
Alex Prompter
Alex Prompter@alex_prompter·
R.I.P generic prompting. Context engineering is the new king. Anthropic, OpenAI, and Google engineers don't write prompts like everyone else. They engineer context. Here are 8 ways to use context in your prompts to get pro-level output from every LLM out there:
Alex Prompter tweet media
English
86
265
2.1K
563.7K
Tilantra
Tilantra@tilantra·
Hi! We are building 💊Capsules, which are basically artifacts that can transfer context from one tool to another as easily as a drag and drop, while being versionable and shareable. Imagine this: you "capsule" an email, drop it into GPT for ideas, drop the same versioned capsule into Figma to create decks, and finally into Cursor. Without ever repeating yourself. Connect to ANY agent using MCPs! Check us out! We are already live! Product: chromewebstore.google.com/detail/capsule…
English
0
0
1
60
Kaito
Kaito@KaiXCreator·
Drop your project URL
Let's drive some traffic
Curious to know what you all are building 👇🏼
Kaito tweet media
English
420
4
172
13.8K
Mike Scully
Mike Scully@Mike_Scully_·
Sell me your service in 4 words
English
1K
13
433
58.7K
Tilantra
Tilantra@tilantra·
Hi! We are building 💊Capsules, which are basically artifacts that can transfer context from one tool to another as easily as a drag and drop, while being versionable and shareable. Imagine this: you "capsule" an email, drop it into GPT for ideas, drop the same versioned capsule into Figma to create decks, and finally into Cursor. Without ever repeating yourself. Check us out! We are already live! Product: chromewebstore.google.com/detail/capsule… Feature Video: drive.google.com/file/d/1p6DR1K…
English
0
0
0
6
Ruben
Ruben@rdominguezibar·
I'd love to angel invest in a handful of startups this month. Pre-seed and seed. Ideally AI or VC-adjacent, but open to all.

My value add:
▪️ 500K+ newsletter subscribers across The VC Corner and The AI Corner
▪️ 300K+ LinkedIn followers, 2–4M weekly impressions
▪️ a16z speedrun scout
▪️ Network of top VCs, operators and founders

👉 Pitch in comments
Ruben@rdominguezibar

the PITCH DECKS💰 that raised billions are now public. Study them before your next raise:

1️⃣ 26 pitch decks that raised $400M in 2026 → thevccorner.com/p/26-pitch-dec…
2️⃣ Anthropic's 2022 pitch deck just leaked: 10 slides, no product, now worth $380B → thevccorner.com/p/anthropic-20…
3️⃣ 16 unicorn pitch decks: the actual slides before the billions → thevccorner.com/p/unicorn-pitc…
4️⃣ Peter Thiel only explained once how to raise money. Here it is → thevccorner.com/p/peter-thiel-…
5️⃣ SpaceX: how to build and pitch the most ambitious company of our time → thevccorner.com/p/spacex-strat…
6️⃣ Synthesia turned down Adobe's $3B offer. Here's the 18-slide deck that raised $180M → thevccorner.com/p/inside-synth…
7️⃣ How Brex raised $57M and rebuilt startup banking → thevccorner.com/p/how-brex-rai…
8️⃣ 50 real pitch decks from startups that raised $380M+ → thevccorner.com/p/50-real-star…
9️⃣ 200+ pitch decks that raised over $50 billion → thevccorner.com/p/200-startup-…
🔟 153 startups fundraising right now with their actual decks → thevccorner.com/p/153-startups…

Bookmark this. The best founders study what worked before they pitch.

How much does a pitch deck actually matter vs the founder behind it?

English
348
27
443
59.9K
Ryan K. Rigney
Ryan K. Rigney@RKRigney·
If you're interested in chatting with a partner from the a16z @speedrun team, my calendar is open this Thurs and Fri morning! applications for the next batch open later this month, so come AMA 1:1 first come, first served. drop a comment + like and I'll DM you a calendly link
Troy Kirwin@tkexpress11

i'm opening my calendar for 15 min slots this Friday! a16z speedrun application opening is just weeks away - come AMA 1:1 first come, first served... drop a comment / like and I'll send you a link to schedule! (if there's space remaining)

English
186
8
486
66.3K
Tilantra
Tilantra@tilantra·
@atShruti You could solve this with Capsules, right? Correct me if I'm wrong!
English
0
0
0
6
Tilantra
Tilantra@tilantra·
@atShruti Founders: ex-Samsung Research, Microsoft, and Amazon engineers
English
0
0
2
21
Tilantra
Tilantra@tilantra·
@atShruti Hi! We are building Capsule Hub in Bangalore, context infrastructure for AI systems. Capsules are lightweight, versionable memory artifacts that plug into any AI stack, shareable across teams. MVP stage, would love to connect!
GIF
English
2
0
2
91
Shruti Gandhi / Array VC preseed rounds
In Bangalore/Mumbai, starting a new company that we should invest in? We invest $250k-$2m at inception. We value your time & decide within days. Looking for: vertical AI, security, energy, health & everything in between! Last investment was in AI security last week! DM us!
English
150
46
886
65.7K
Tilantra
Tilantra@tilantra·
Your AI needs the right context, at the right time. Not more context. With Capsule Hub v2: Drop a Capsule → scene set. If more info is needed, it GLOWS and fetches it. No prompt bloat. No wasted tokens. Coming soon. #AI #GenAI To view V1 & capsules: capsulehub.tilantra.com
Tilantra tweet media
English
0
0
1
41
Tilantra
Tilantra@tilantra·
@trq212 Love this direction. Now imagine those memories weren’t model-locked but were portable, versioned & team-shareable. That’s what we’re building with Capsules. If this resonates, try it out: Capsule Hub by Tilantra - Chrome Web Store.
English
0
0
0
20
Thariq
Thariq@trq212·
We've rolled out a new auto-memory feature. Claude now remembers what it learns across sessions — your project context, debugging patterns, preferred approaches — and recalls it later without you having to write anything down.
English
852
1.1K
15.9K
3.2M
Tilantra
Tilantra@tilantra·
@y_nizan 100% it’s a context fragmentation problem. Capsules make it portable across tools via MCP. If you’re already experimenting with browser-aware flows, would love for you to try it out and share feedback. Curious how it fits into your current setup 👀
English
0
0
0
12
Yaniv Nizan
Yaniv Nizan@y_nizan·
@tilantra Browser-aware context in Cursor and Claude is huge. The biggest friction in AI coding workflows is context switching — this collapses that gap. Watching how MCP evolves this year closely.
English
1
0
1
20
Tilantra
Tilantra@tilantra·
The bridge between browser & code is officially OPEN. 🌉 With our new MCP support, your ChatGPT chats now know which bug you're fixing in Antigravity. Web context, now natively in Cursor & Claude. 🚀 Get your API key: capsulehub.tilantra.com #MCP #CursorAI #BuildInPublic
English
2
0
1
105