

Beyang
@beyang
Building @ampcode @sourcegraph


You can use OpenAI GPT-5.5 in Amp's deep mode with this in ~/.config/amp/settings.json:

{ "amp.internal.model": { "deep": "openai:gpt-5.5" } }

It will be the default soon. Why not yet? Because, for now, usage of GPT-5.5 falls under OpenAI's Safety Retention Policy, which means non-zero data retention by OpenAI in the (~0.05%) case of inputs that OpenAI's classifier flags as a severe cybersecurity abuse risk. We're working with OpenAI to lift this requirement, which as far as we know also applies to GPT-5.5 use in other agents. (Has anyone seen this disclosure from other agents?)
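If you already have other keys in settings.json, a blind overwrite would clobber them. A minimal sketch of a safe merge, assuming the path and the "amp.internal.model" key from the post; the helper functions here are our own illustration, not part of Amp:

```python
import json
from pathlib import Path

# Path from the post; adjust if your Amp config lives elsewhere.
SETTINGS_PATH = Path.home() / ".config" / "amp" / "settings.json"


def set_deep_model(settings: dict, model: str) -> dict:
    """Return a copy of settings with amp.internal.model.deep set,
    preserving any other keys already present."""
    merged = dict(settings)
    model_cfg = dict(merged.get("amp.internal.model", {}))
    model_cfg["deep"] = model
    merged["amp.internal.model"] = model_cfg
    return merged


def update_settings_file(path: Path = SETTINGS_PATH,
                         model: str = "openai:gpt-5.5") -> None:
    """Read settings.json if it exists, merge the deep-mode model in,
    and write it back."""
    existing = json.loads(path.read_text()) if path.exists() else {}
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(set_deep_model(existing, model), indent=2) + "\n")
```

Calling update_settings_file() creates the file (and parent directories) if missing, and otherwise only touches the one nested key.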



We built the fastest browser automation stack for AI agents: 464 KB, 3 ms cold start, 16% cheaper per agent loop than anything else out there. Here's how it started and why it exists. Shoutout to @AmpCode for the credits that kickstarted this project!


GPT-5.4 now powers Amp’s deep mode. Out of the box it talked too much for what we wanted, so we tuned it to act more like Codex: a model that goes off, reads, thinks, and comes back with work done. Try deep^3. Or deep if you’re feeling impatient.


Issue tracking is dead. 🪦 We’ve been trapped in a handoff model where process became the work and engineering time was wasted on ceremony. That era is over. Today, we’re launching the Linear Agent, Skills and Automations. The future is execution, not overhead: linear.app/next
