EverMind

195 posts


EverMind

@evermind

Give Your AI Agent Self-evolving Memory. GitHub: https://t.co/ZsoTOVW2sf Discord: https://t.co/DgVdDw3x6B

United States · Joined November 2025
21 Following · 3K Followers
Pinned Tweet
EverMind
EverMind@evermind·
One repository. 18 use cases, with many more on the way. Multiple architecture methods and benchmarks for each. Every kind of agent, every memory pattern, in one place.
7
16
169
1.6M
EverMind
EverMind@evermind·
This is how we work with memory.

Someone in the community turned a to-do app into an RPG. Tasks become quests, finishing them earns XP, every day brings new events, every week ends with a summary, and the system even generates a character portrait that evolves with you.

Gamified to-dos are nothing new. What sets Earth Online apart is that it remembers. Before the assistant replies, it digests your recent progress, so suggestions are grounded in the specific task you almost finished yesterday rather than recycled encouragement.

The memory layer is EverOS, EverMind's open-source memory repo. Earth Online is now part of the EverOS use case collection, where you can see who else is building on it.
EverMind tweet media
2
0
7
245
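The retrieve-then-reply pattern described above can be sketched in a few lines. EverOS's real API is not shown in the tweet, so every name below (`MemoryStore`, `build_prompt`, the sample quest) is a hypothetical illustration; the only point is that recent progress gets digested into the prompt before the assistant answers.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class MemoryEvent:
    timestamp: datetime
    text: str

@dataclass
class MemoryStore:
    events: list = field(default_factory=list)

    def record(self, text, when=None):
        self.events.append(MemoryEvent(when or datetime.now(), text))

    def recent(self, days=2):
        # Digest: only progress from the last couple of days.
        cutoff = datetime.now() - timedelta(days=days)
        return [e.text for e in self.events if e.timestamp >= cutoff]

def build_prompt(store, user_message):
    # Ground the reply in specific recent tasks instead of
    # answering from the user's message alone.
    context = "\n".join(f"- {t}" for t in store.recent())
    return (
        "Recent player progress:\n"
        f"{context}\n\n"
        f"Player: {user_message}\nAssistant:"
    )

store = MemoryStore()
store.record("Quest 'Ship the report' at 90%, abandoned near the end")
print(build_prompt(store, "What should I do today?"))
```

With this shape, swapping the in-process list for a persistent store changes only `MemoryStore`, not the prompt-building step.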
Sebastian Buzdugan
Sebastian Buzdugan@sebuzdugan·
@evermind one repo is cute until versioning and plugin compatibility quietly eat your velocity
1
0
2
64
EverMind
EverMind@evermind·
We're giving AI agents long-term memory. We're hiring product managers, agent engineers, DevRel engineers, and UX/UI designers to build self-evolving memory for AI agents. Come build the part of the stack that makes agents actually useful.
5
0
10
53.1K
EverMind retweeted
邓亚峰
邓亚峰@LongTermMemoryE·
EverMind is Hiring: Technical PM (Agent OS & Memory)
Based in Silicon Valley | Shanghai | Beijing
What we need:
- Tech + Product: Deeply understand Agent execution mechanics under the hood (OpenClaw/Hermes Agent experience is a plus).
- AI-Native: Fluent in vibe coding. You can spin up working demos yourself to validate concepts.
- LTM Focus: Intensely driven to build an Agent OS with true long-term memory.
Flexible setup: Open to part-time/intern as a trial. DM me to chat!
14
10
44
34.1K
EverMind
EverMind@evermind·
Some of our favorite people in the open-source world have become EverMind Ambassadors: @li9292, @zstmfhy, @Synslius, @xiqingongzi, @Tz_2022

Open source has never been only about code. It's built by people who care enough to contribute their time, energy, ideas, and passion, often simply because they believe in what a community can become. That spirit means everything to us.

Our ambassadors aren't defined by whether they write code. They contribute in some of the ways that matter most: organizing events, sharing thoughtful feedback, supporting users, running communities, connecting people, and helping ideas spread. That's the kind of contribution that makes open source real.

We've had the chance to work alongside truly generous and talented people, and we're proud to be building more closely with them. With these ambassadors and the wider EverMind community, we're excited about what we'll build together, online and off.
5
1
18
229.5K
EverMind
EverMind@evermind·
We're conducting product research to collect feedback that will help us improve the next version of our product. If you have a moment, please complete the survey; it would help us a lot. Here is the link: forms.office.com/Pages/Response…
EverMind tweet media
0
0
7
696
艾略特
艾略特@elliotchen100·
Genuinely cool. Spending $29M on 12M context proves one thing from the side: the whole industry is starting to believe sparse attention is the antidote to dense attention.

SubQ's bet is "retrain a model from scratch": vertical integration, high risk and high reward. @evermind's MSA bets on "add memory to mainstream models": horizontal embedding that works with anyone's model.

Also, SubQ API and SubQ Code (a CLI agent similar to Claude Code) are built around stuffing an entire codebase into context at once; you can tell from the API that the use case is already very fixed.

EverMind walked this same road to 100M two months earlier, with the paper and code fully open source. Same direction, different bets. And EverMind's model is open source; SubQ's is not.
Alexander Whedon@alex_whedon

Introducing SubQ - a major breakthrough in LLM intelligence. It is the first model built on a fully sub-quadratic sparse-attention architecture (SSA), and the first frontier model with a 12 million token context window, which is:
- 52x faster than FlashAttention at 1MM tokens
- Less than 5% the cost of Opus
Transformer-based LLMs waste compute by processing every possible relationship between words (standard attention). Only a small fraction actually matter. @subquadratic finds and focuses only on the ones that do. That's nearly 1,000x less compute and a new way for LLMs to scale.

8
10
100
22.4K
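Neither SSA's nor MSA's internals appear in the thread, but its core claim (only a small fraction of attention scores actually matter) can be illustrated with a generic top-k sparse-attention sketch. Everything below is a hypothetical minimal example, not either system's actual architecture.

```python
import numpy as np

def topk_sparse_attention(q, k, v, top_k):
    # Dense attention scores every query against every key: O(n^2).
    # A top-k sparse variant keeps only each query's top_k scores
    # and masks the rest to -inf before the softmax.
    scores = q @ k.T / np.sqrt(q.shape[-1])                 # (n, n)
    idx = np.argpartition(scores, -top_k, axis=-1)[:, -top_k:]
    mask = np.full_like(scores, -np.inf)
    np.put_along_axis(mask, idx,
                      np.take_along_axis(scores, idx, axis=-1), axis=-1)
    # Numerically stable softmax over the surviving scores.
    weights = np.exp(mask - mask.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
out = topk_sparse_attention(q, q, q, top_k=2)  # each query attends to 2 keys
```

With `top_k` equal to the sequence length the masking is a no-op and this reduces to ordinary dense softmax attention, which is a convenient sanity check.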
EverMind
EverMind@evermind·
The design is pluggable along 3 axes: Agent × Domain × Self-Evolving Method. Filter the leaderboard by any combination to see how a specific agent performs on a specific domain when paired with a specific evolution method. 114 configurations evaluated so far.
1
0
1
698
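The three-axis filtering described above can be sketched as a flat results table keyed by (agent, domain, method) tuples. The axis values and scores below are made up for illustration and are not EverMind's actual leaderboard data; only the filtering pattern is the point.

```python
from itertools import product

# Hypothetical axis values, far fewer than the 114 real configurations.
agents = ["react", "planner"]
domains = ["coding", "research", "ops"]
methods = ["reflection", "memory-distillation"]

# One (made-up) score per (agent, domain, method) cell.
results = {
    (a, d, m): round(0.50 + 0.01 * i, 2)
    for i, (a, d, m) in enumerate(product(agents, domains, methods))
}

def filter_leaderboard(agent=None, domain=None, method=None):
    # Any axis left as None matches everything, so the one table
    # answers "agent X on domain Y with method Z" for any combination.
    return {
        key: score for key, score in results.items()
        if (agent is None or key[0] == agent)
        and (domain is None or key[1] == domain)
        and (method is None or key[2] == method)
    }

print(filter_leaderboard(agent="react", domain="coding"))
```

Keeping the table flat means adding a fourth axis later is a key-shape change, not a schema redesign.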
EverMind
EverMind@evermind·
A couple of weeks ago, we released EvoAgentBench, a benchmark for testing both your agent's raw capabilities and its self-evolving capabilities. Since release, it's been downloaded over 730 times, ranking it the #2 agent benchmark on Hugging Face. What it actually tests 🧵
1
2
9
12.8K