Pinned Tweet
Yash Singhal
491 posts

Yash Singhal
@yash_25log
Eng @ IFAI | Ex-InteligenAI, CrossTower | Sharing backend, design & dev experiments | Hackathons | 22 | Helping devs grow
Delhi, IND · Joined May 2021
1.6K Following · 210 Followers

@kirat_tw @jarredsumner Jarred speedran "build startup → get acquired" faster than Bun installs dependencies

@kirat_tw Reading it on Twitter hits different than experiencing it at 3AM in production.

@mehulmpt Bro didn't just warn us, he basically posted the spoiler alert for the whole outage arc.

Well this didn't take long
Mehul Mohan @mehulmpt
X API is proxied by cloudflare, cloudflare going down will take out X (anytime)

@arpit_bhayani Wild! How an amoeba explains range partitioning better than half the system design textbooks

Range-based partitioning is like an amoeba.
You split when a partition gets hot, you merge when things cool down, and you move the partition across nodes (and transitively the data, if needed) when you want to grow beyond a single machine.
The best part of range-based partitioning is that it sits in the middle between hash and static approaches. It avoids the randomness of hash ownership and the heavy metadata burden of static ownership.
That's why so many systems operating at scale prefer range-based ownership for stateful workloads.
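The split-when-hot / merge-when-cool mechanics above can be sketched in a few lines. This is a toy model with hypothetical names, not any specific database's implementation:

```python
# Toy range-based partitioning: each partition owns a contiguous key
# range; split a hot range at its midpoint, merge a cool one with its
# right neighbour.
from bisect import bisect_right

class RangePartitions:
    def __init__(self, max_key):
        # Start with one partition owning [0, max_key).
        self.bounds = [0, max_key]   # sorted split points
        self.load = {0: 0}           # start-of-range -> request count

    def route(self, key):
        """Return the start key of the partition that owns `key`."""
        i = bisect_right(self.bounds, key) - 1
        return self.bounds[i]

    def record(self, key):
        self.load[self.route(key)] += 1

    def split(self, start):
        """Split a hot partition at its midpoint."""
        i = self.bounds.index(start)
        mid = (self.bounds[i] + self.bounds[i + 1]) // 2
        self.bounds.insert(i + 1, mid)
        self.load[start] = self.load[mid] = 0  # fresh counters after split

    def merge(self, start):
        """Merge a cool partition with its right neighbour."""
        i = self.bounds.index(start)
        right = self.bounds.pop(i + 1)
        self.load[start] += self.load.pop(right)

p = RangePartitions(100)
p.split(0)                                # hot -> [0,50) and [50,100)
assert p.route(10) == 0 and p.route(75) == 50
p.merge(0)                                # cooled -> back to [0,100)
assert p.route(75) == 0
```

Moving a partition across nodes is then just handing one `(start, end)` range, and the data under it, to another machine; the routing table stays tiny compared to static per-key ownership.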

@archiexzzz Absolutely. JSON is a token tax - every { , : " } is money

JSON is a token-hungry parasitic dinosaur. If you look at openai/tiktoken, the pre-trained BPE vocabulary has most JSON structural characters - brackets, commas, quotes - as individual tokens in the vocabulary. This means even a small JSON can emit >1000 tokens, since every {, }, ", :, and , burns a separate token. Now imagine your use case: you're ingesting some massive JSON response from an MCP server tool, and you need to shove it all into your LLM for inference. Your cost just went through the roof.
I've been there. We were constantly blowing past 1M-token context windows just inferring over these giant JSON strings. We ended up with all these hacky % operations on the token length just to figure out how many LLM calls we needed to digest the whole thing. It's a nightmare.
Your options are trash:
> either you throw away data, which is a hard no because you don't want data loss, or
> you accept that you're just going to burn more VC money because every provider charges you per token.
TOON is a decent alternative, but it's not perfectly accurate. The thing is, LLMs were pre-trained on a lot of JSON. You switch everything to TOON, and your accuracy takes a hit. It's a trade-off.
If you're ranking formats for accuracy, it's roughly:
Markdown-KV > XML > YAML > HTML > JSON > TOON
TOON is getting better, but as an AI engineer, the most optimized way to solve this right now is to figure out your personal trade-off between latency <> cost <> model accuracy. If your task is simple and doesn't live or die by accuracy, try TOON -> Run evals on a small dataset -> If the numbers look good, stick with it -> If they're bad, fall back to JSON or XML.
But really, the smart move is to think before the LLM call:
> Can you filter that JSON down to just the fields the LLM actually needs?
> Can you summarize big text blobs with a cheaper model first?
> Can you break the task into a conversation instead of one giant context dump?
That's where you win. The format is just one part of the puzzle.
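The "filter before the LLM call" move can be sketched like this. Field names and the pipe-row layout are hypothetical, just to show the shape of the idea; measuring actual token counts needs a tokenizer like tiktoken:

```python
import json

def project(records, fields):
    """Keep only the fields the LLM actually needs from each record."""
    return [{k: r[k] for k in fields if k in r} for r in records]

def to_rows(records, fields):
    """Flatten uniform records into one header plus pipe-delimited rows,
    dropping the per-record {, ", :, } characters that each burn a token."""
    lines = ["|".join(fields)]
    lines += ["|".join(str(r.get(k, "")) for k in fields) for r in records]
    return "\n".join(lines)

# Hypothetical MCP-style tool response with fields the LLM never uses.
raw = [
    {"id": 1, "name": "alpha", "trace": "long stack trace...", "ts": 1700000000},
    {"id": 2, "name": "beta",  "trace": "long stack trace...", "ts": 1700000001},
]
slim = project(raw, ["id", "name"])
compact = to_rows(slim, ["id", "name"])
# Fewer characters is a decent proxy here: less structure, fewer tokens.
assert len(compact) < len(json.dumps(slim)) < len(json.dumps(raw))
```

Same data the model needs, a fraction of the structural overhead, and you never touched the format-accuracy trade-off at all.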



@LowLevelTweets Relax, that's just bots knocking on every door on the internet hoping someone left .env outside. Ours is locked.

The takeaway:
For global businesses, depending on a robust CDN isn't optional - it's survival.
A single point of failure at this scale creates a cascading crisis for the whole ecosystem.
Are we too centralized? What's the solution?
#Tech #Cloudflare #CDN #Cybersecurity #DevOps

/1
Cloudflare is the world's largest CDN.
What it does:
- Closer content: uses global PoPs & edge servers to cut latency.
- Faster connections: terminates TLS handshakes at the edge.
- Caching: protects your origin server from massive load.
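The caching point is the load-shielding part. A toy edge cache (illustrative only, nothing like Cloudflare's real stack) shows why the origin barely notices repeat traffic:

```python
import time

class EdgeCache:
    """Toy CDN edge cache: serve repeat requests from the edge so the
    origin sees each URL at most once per TTL window."""
    def __init__(self, origin_fetch, ttl=60.0):
        self.origin_fetch = origin_fetch   # callable: url -> body
        self.ttl = ttl
        self.store = {}                    # url -> (expires_at, body)
        self.origin_hits = 0

    def get(self, url, now=None):
        now = time.monotonic() if now is None else now
        hit = self.store.get(url)
        if hit and hit[0] > now:
            return hit[1]                  # served from the edge
        self.origin_hits += 1              # cache miss: go to origin
        body = self.origin_fetch(url)
        self.store[url] = (now + self.ttl, body)
        return body

cache = EdgeCache(lambda url: f"<page {url}>", ttl=60)
for _ in range(1000):
    cache.get("/home", now=0.0)
assert cache.origin_hits == 1   # origin shielded from 999 repeat requests
```

Which is also the flip side of the outage story: when the layer doing this for half the internet goes down, every origin behind it goes dark together.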

When X, OpenAI, Spotify, & Canva all vanish at once, it's not a simple server crash - it's an INFRASTRUCTURE FAILURE.
The recent Cloudflare outage was a $100B reminder of how one company powers the internet.
Why?
A deep dive into the tech:

6.
That's the first half of the story.
Static → SSR → SPA → BFF
Next up in Part 2:
How BFF, SSR & React Server Components changed everything
#frontend #webdev #nextjs #react #softwarearchitecture

5.
SPAs solved interactivity but not complexity.
So next: the Backend for Frontend (BFF) pattern.
Each frontend (web, mobile) gets its own API.
✓ Cleaner data
✓ Faster iteration
✓ More infra to manage
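A minimal sketch of the BFF idea from this thread (all names hypothetical): one shared domain service, and one thin shaper per client so each frontend gets exactly the payload it wants:

```python
def product_core(product_id):
    # Stand-in for the shared domain service behind both BFFs.
    return {"id": product_id, "name": "Widget", "price_cents": 1999,
            "description": "A very long product description...",
            "images": ["a.jpg", "b.jpg"]}

def web_bff(product_id):
    # Web wants the full product page payload.
    p = product_core(product_id)
    return {"id": p["id"], "name": p["name"],
            "price": p["price_cents"] / 100,
            "description": p["description"], "images": p["images"]}

def mobile_bff(product_id):
    # Mobile wants a slim card: one thumbnail, no long description.
    p = product_core(product_id)
    return {"id": p["id"], "name": p["name"],
            "price": p["price_cents"] / 100, "thumb": p["images"][0]}

assert "description" not in mobile_bff(7)   # cleaner data per client
assert web_bff(7)["images"] == ["a.jpg", "b.jpg"]
```

Each shaper can ship on its own frontend team's schedule (faster iteration), but each is one more service to deploy and monitor (more infra to manage).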