Tom Lienard
3.9K posts

Tom Lienard
@tomlienard
Compute @vercel, prev @scaleway @fig. Founder @lagonapp (acquired by @vercel) https://t.co/UNJcyK9OvJ https://t.co/1mrd14AyZH
🇫🇷 living in 🇬🇧 London · Joined July 2020
400 Following · 4.8K Followers

just-bash is the lowest latency option to give bash and a filesystem to your agent.
But tbh, I'm much more excited about the Vercel Sandbox team absolutely cooking on latency reduction on the full VMs.

ComputeSDK@computesdk
👀 we have a new person in first place?

@gerynugrh > Using sandbox does help but the dx suck.
Are you referring to Vercel Sandbox? If yes, what could we improve on the DX side?

@mikepunzano @rauchg We’re working on it, more improvements coming soon
Tom Lienard@tomlienard
nice

What service should we build next, with deep care and investment into its security, availability, and durability?
Guillermo Rauch@rauchg
Queues are one of the most requested services since I started Vercel. They're now here. It's just two APIs: 𝚜𝚎𝚗𝚍 and 𝚑𝚊𝚗𝚍𝚕𝚎𝙲𝚊𝚕𝚕𝚋𝚊𝚌𝚔 😌. The use-cases are basically infinite. Notably: queues can make agents and AI apps reliable. Quality and reliability are top of mind for everyone now, including our own team. We went through 3 iterations of the infrastructure while in private beta, and we're excited for you to build 'unbreakable software' with it.

@nicoalbanese10 @i_am_brennan @hp_arora It’s based on the average size in GB of all your snapshots, per month. You can still delete snapshots that you don’t need anymore, which would stop contributing to your bill
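The billing model described in this reply (average snapshot size in GB, per month) is easy to sketch. The $0.08/GB/month rate comes from a later reply in this feed; treat the sampling scheme (one total per day) as an illustrative assumption:

```typescript
// Sketch of "billed on the average size in GB of all your snapshots, per
// month". Rate from elsewhere in this thread; daily sampling is assumed.
const RATE_PER_GB_MONTH = 0.08;

// dailySnapshotTotalsGb: total stored snapshot GB, sampled once per day.
function monthlySnapshotCost(dailySnapshotTotalsGb: number[]): number {
  const avgGb =
    dailySnapshotTotalsGb.reduce((sum, gb) => sum + gb, 0) /
    dailySnapshotTotalsGb.length;
  return avgGb * RATE_PER_GB_MONTH;
}

// 10 GB stored for half a 30-day month, then 0 after deleting the
// snapshots: the average is 5 GB, so the bill is 5 * $0.08 = $0.40.
const days = [...Array(15).fill(10), ...Array(15).fill(0)];
const cost = monthlySnapshotCost(days);
```

This also shows why deleting unneeded snapshots stops them contributing to the bill: they simply drop out of the average from that day on.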

@__morse @tobiaslins On Sandboxes we support 4GB to 16GB of memory. We allocate 2GB of memory per vCPU: vercel.com/docs/vercel-sa…#resource-limits

@tobiaslins is the limit still 2GB of RAM? increasing that to something like 8 would be amazing

@tomlienard @vercel_dev nice. will try and circle back, do you guys plan on increasing the 5hr limit?
Tom Lienard retweeted

Vercel Sandboxes now support filesystem snapshots.
Capture complete state with 𝚜𝚊𝚗𝚍𝚋𝚘𝚡.𝚜𝚗𝚊𝚙𝚜𝚑𝚘𝚝() to skip repeating slow setup steps like git clone and dependency installation.
vercel.com/changelog/file…
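Why snapshots cut latency can be modeled as caching the result of slow setup work. The sketch below uses hypothetical names and an in-memory map standing in for a restored filesystem; it is not the Vercel Sandbox SDK:

```typescript
// Toy model of snapshot-based setup skipping: pay for the slow setup
// (clone + install) once, then restore from the cached state on later
// runs. Hypothetical names; NOT the Vercel Sandbox SDK.
let setupRuns = 0;

function slowSetup(): string {
  setupRuns += 1; // stands in for `git clone` + dependency installation
  return "filesystem-state";
}

// Map of snapshot id -> captured state (the sandbox.snapshot() analogue).
const snapshots = new Map<string, string>();

function getSandboxState(snapshotId: string): string {
  const cached = snapshots.get(snapshotId);
  if (cached !== undefined) return cached; // restore: setup is skipped
  const state = slowSetup();               // first run: pay the cost once
  snapshots.set(snapshotId, state);        // capture for future runs
  return state;
}

getSandboxState("repo@main");
getSandboxState("repo@main"); // restored from snapshot; setup not re-run
```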

@sionemart @vercel_dev $0.08/GB/month. Let us know what you think!
vercel.com/docs/vercel-sa…#pricing

@vercel_dev wow this is huge. any information regarding pricing?
only reason i was using daytona over vercel sandbox was persistent storage

@skovorodan @MattIPv4 @vercel Looking at the latest Node.js releases, it seems this hasn't been released yet. It's currently staged for v25.4.0: github.com/nodejs/node/pu…

@tomlienard @MattIPv4 @vercel See also github.com/nodejs/node/pu…
This is now marked stable in the current form.
The reasoning why this was disabled in AWS no longer holds.

Why on earth am I hitting `ERR_REQUIRE_ESM` errors (`Error: require() of ES Module abc.js from xyz.js not supported.`) on @vercel with their Node.js 24.11.0 runtime? Are they actively setting a flag to opt out of default functionality (nodejs.org/docs/v24.11.0/…#loading-ecmascript-modules-using-require)?

Bypass tokens only apply to deployment protection; they don't affect the firewall. You can whitelist specific IPs on Pro, but I'd recommend just running 10 RPS for a few minutes to get a good baseline
The underlying infra for Node.js and Bun on Vercel is the same, so any difference in performance would be from the runtimes themselves

@tomlienard @_pi0_ Any recommendations / settings for wrk so it's not as stressful for the system? I'm not planning to stress-test often, only when changing the main arch/approach, to check that it's viable.

@tomhacks @_pi0_ Were they all 200 responses? You've likely tripped the firewall (vercel.com/YOURTEAM/YOURP…); we don't allow unapproved load tests

wrk -t12 -c400 -d30s
Node.js 22.x -> ~6,800 req/s
Bun -> ~5,400 req/s
Architecture: x86_64, ARM64
-
Repro: TanStack Start bare example & config:
nitro: {
  preset: "vercel",
  vercel: {
    functions: {
      runtime: "nodejs22.x", // switching to "bun"
      architecture: "arm64",
      // environment: { ...bun_vars }
    },
  },
  compressPublicAssets: {
    brotli: true,
    gzip: true,
  },
  minify: true,
},

@tomhacks That's not normal. It would be nice if you could benchmark nitro+bun alone on Vercel; perf should be close to native Bun, since Nitro adds no wrapper.
Also feel free to report an issue/repro for us to investigate.
/cc @tomlienard

@pariscestchiant @vercel @rauchg @cramforce Quick update: it's been released for builds and is currently rolling out for functions. Feel free to DM me your team id if you want to get the update immediately

@vercel would it be possible to bump the Bun version used in the function runners to 1.3.6 (latest) please?
all of our function executions are crashing atm, and the fix is in that version 🫠
cc @rauchg @tomlienard @cramforce

Célia@pariscestchiant
can't release any of this until @bunjavascript 1.3.6 comes out 🫠

We've completed the acquisition of the 𝟷𝟺𝟹.𝟷𝟹.𝟶.𝟶/𝟷𝟼 and 𝟷𝟻𝟻.𝟷𝟸𝟷.𝟶.𝟶/𝟷𝟼 IPv4 blocks for @vercel CDN and Fluid compute.
Fun fact: these are a piece of internet history, which we acquired from P&G, as part of the "legacy space".
ARIN legacy IPv4 ranges are address blocks originally issued before the inception of the regional Internet registry system in 1997. These larger blocks were originally given to corporations like Procter & Gamble and Ford (!) with fewer restrictions and regulations.
These days the world is transitioning to the much larger IPv6 space, but IPv4 is still incredibly valuable for interoperability with billions of devices and huge amounts of infrastructure.
High quality and high reputation IPs like these are scarce and expensive. They're like a rare, borderless PSA 10 Charizard. Each day there are fewer of them.
Security and availability are the top priority at @vercel. Our teams do a lot of work behind the scenes to make sure your visitors get the best global access to your projects, with your infra on autopilot 🫡
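For scale, the prefix lengths in the announcement above translate directly into address counts: a /16 spans 2^(32-16) IPv4 addresses. A quick check of that arithmetic (nothing here is Vercel-specific):

```typescript
// IPv4 CIDR arithmetic: a /N prefix leaves (32 - N) free host bits,
// so it covers 2^(32 - N) addresses.
function addressesInCidr(prefixLength: number): number {
  return 2 ** (32 - prefixLength);
}

const perBlock = addressesInCidr(16); // each /16: 65,536 addresses
const total = 2 * perBlock;           // both acquired /16 blocks combined
```

So the two blocks together cover 131,072 addresses, which is why legacy /16s are such sought-after assets on the transfer market.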

@simonfarshid @rauchg Elysia/TypeBox for example use the Function constructor, which I believe isn't required but would still be a breaking change on our side

@tomlienard @rauchg I wonder, any popular libs that depend on dynamic evals? 😯

@rauchg @tomlienard That’d be great! I sleep happy knowing workerd / Cloudflare Workers doesn’t support string evals at all — would be good to have it on our Vercel compute endpoints as well
