
Heinrich
@arscontexta
vibe note-taking with @molt_cornelius


This is Act II. Act I was about making an anonymity layer for LLMs (a VPN for intelligence). Act II is about building a deeply personalized, private assistant on top of it. The idea is that your context (all your files, messages, deepest desires) is owned and managed by you. For any query, a local/TEE model reads the context to determine what *subset* of context to pull in, and invokes closed frontier models on that subset (if open models aren't good enough). With the anonymity layer, different invocations are not linked. So your context can hold both your tax information and your health records, but you never allow any model provider to link the two, despite having a unified assistant interface. The vision of deeply personalized assistants is obvious right now. It is less obvious that you can achieve it privately.
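A minimal sketch of that flow, in Python. Everything here is a hypothetical stand-in, not a real API: the context store, the subset selector, and the anonymity layer are all simulated. The point is only the shape of the design: selection happens locally, and each frontier-model invocation goes out under a fresh, unlinkable identity.

```python
# Hypothetical sketch of the Act II flow: local subset selection +
# unlinkable per-query invocations. All names are illustrative.
import secrets

# The user's unified context: owned locally, never shipped wholesale.
CONTEXT = {
    "taxes":  ["2023 W-2 from employer", "estimated quarterly payments"],
    "health": ["allergy list", "last physical exam summary"],
}

def select_subset(query: str) -> list[str]:
    """Stand-in for the local/TEE model: pull in only the context
    slice this query actually needs (here, naive keyword matching)."""
    return [item for topic, items in CONTEXT.items()
            if topic in query.lower() for item in items]

def anonymous_invoke(query: str, subset: list[str]) -> dict:
    """Stand-in for calling a closed frontier model through the
    anonymity layer: a fresh identity is minted per invocation, so
    the provider never sees two queries under the same principal."""
    session_id = secrets.token_hex(8)  # fresh unlinkable identity per call
    prompt = f"{query}\n\nContext:\n" + "\n".join(subset)
    return {"session": session_id, "prompt": prompt}

# Two queries touching disjoint context slices: whoever answers the
# tax question never sees health records, and vice versa, and the two
# sessions cannot be linked to each other.
tax_call = anonymous_invoke("help with my taxes",
                            select_subset("help with my taxes"))
health_call = anonymous_invoke("question about my health",
                               select_subset("question about my health"))
```

The design choice this illustrates: the only component that ever reads the full context is the local/TEE selector; remote providers each see one disjoint slice under one throwaway identity.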