Pinned Tweet
Paras
785 posts

Paras
@parasdevlife013
Building @Neatlogs | Write. Break. Debug. Repeat.
India · Joined September 2023
121 Following · 40 Followers

Love the design that feels first? Want to solve a real problem with an amazing team? Then, come build with us
neatlogs@neatlogs
we’re hiring for a UI/UX designer (remote)
someone who is:
• design + feel first
• strong craft, ships fast
• solid with tools (figma/framer)
• uses AI in their workflow
bonus:
• experience designing devtools products
drop your best work or DM @BetterSayAJ your portfolio
Paras retweeted

@RoundtableSpace I don't think deepseek v3.2 special matches the gemini 3.1 pro level

The @shadcn ecosystem just got stronger ⚔️
New open source library: shadcnspace.com
Really impressive blocks & templates.
Definitely trying this out in a real project soon.

We just released react-best-practices, a repo for coding agents.
React performance rules and evals to catch regressions, like accidental waterfalls and growing client bundles.
How we collected them and how to install the skill ↓
vercel.com/blog/introduci…


I literally laugh when I see people comparing TanStack Start and Next.js as if one is automatically better. Most devs follow hype; most humans follow hype.
Disclaimer: I love both @tan_stack Start and @nextjs and use them daily, but comparing them superficially is nonsense. Here I want to look at both sides of the coin for both frameworks.
Let me make a hardcore, naked rant about this. First, people compare them on:
Bundle size. People always start here. Stop. That is surface-level nonsense. TanStack Start gzipped is smaller, 100–120 kB. Next.js bigger, 150–176 kB. Smaller doesn’t automatically mean faster.
If your SSR isn’t optimized, your tiny bundle doesn’t help at all. Next.js ships more JS because it runs the RSC runtime, streaming, and hydration logic. That’s intentional, not bloat.
Comparing bundles here is like comparing a bicycle to a freight truck. One is small, one actually carries the load.
A smaller bundle size in Start ≠ faster first paint if SSR isn’t optimized.
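As an aside, "gzipped size" just means the byte count after gzip compression, and you can sanity-check numbers like the ones above yourself with Node's built-in zlib. A minimal sketch (the bundle string here is a stand-in; in practice you'd read the built .js file from disk):

```typescript
import { gzipSync } from "node:zlib";

// Stand-in for a built bundle; real code compresses far better than
// random bytes, which is why gzipped numbers are the ones quoted.
const fakeBundle = "export const x = 1;\n".repeat(5000);

const rawBytes = Buffer.byteLength(fakeBundle);
const gzippedBytes = gzipSync(Buffer.from(fakeBundle)).length;

console.log(`raw: ${rawBytes} B, gzipped: ${gzippedBytes} B`);
```

The point stands either way: the number you measure this way says nothing about how fast the page becomes interactive.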
// Component names below are placeholders for illustration
export default function Page() {
  return (
    <>
      <Header />      {/* server-only */}
      <Article />     {/* server-only */}
      <LikeButton />  {/* client-only, hydrated */}
    </>
  )
}
Partial hydration in Next.js means only the interactive parts ship JS. Tiny bundle doesn’t automatically equal speed. Real users care about time to interact, not gzip stats.
SSR and hydration. TanStack Start gives explicit SSR control per route. You know exactly what runs where. Dev experience is predictable, clean.
Next.js uses RSC and hybrid SSR. Only the client parts that need interactivity hydrate. Smaller payload, faster TTI, but async boundaries and Suspense can be confusing. Debugging can feel like your stack traces just vanished. Community benchmarks show TanStack Start FCP/TTI at 1100/1600 ms and Next.js RSC at 1050/1200 ms. So Next.js can feel faster in real apps even if the bundle is bigger.
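For reference, the "explicit SSR control per route" point looks roughly like this with TanStack Start's selective SSR option. A sketch from memory (route and component names are made up; check the current docs, as option names have shifted across versions):

```typescript
// routes/dashboard.tsx -- hypothetical route file
import { createFileRoute } from "@tanstack/react-router";

export const Route = createFileRoute("/dashboard")({
  // Opt this route out of SSR entirely: it renders client-side only.
  // TanStack Start also supports 'data-only' to SSR loader data
  // without server-rendering the component itself.
  ssr: false,
  component: Dashboard,
});

function Dashboard() {
  return null; // render the client-only dashboard here
}
```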
Type safety and server functions. TanStack Start is sexy here. You get end-to-end type safety for routes, loaders, server functions. Next.js requires manual validation and can cause runtime errors.
import { createServerFn } from '@tanstack/react-start'
import { zodValidator } from '@tanstack/zod-adapter'
import { z } from 'zod'

export const getTodos = createServerFn({ method: 'GET' })
  .inputValidator(zodValidator(z.object({ userId: z.string() })))
  .handler(async ({ data }) => db.todos.findMany({ where: { userId: data.userId } }))

const todos = await getTodos({ data: { userId: '123' } })
That is predictable. Next.js doesn’t do this automatically yet.
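To make the "manual validation" point concrete: in a Next.js route handler or server action you typically have to check the input shape yourself before touching it. The sketch below uses a hand-rolled check instead of a schema library so it stands alone; `parseUserId` is a hypothetical helper, not a Next.js API:

```typescript
// What "manual validation" looks like without a schema library.
// parseUserId is a made-up helper for illustration.
function parseUserId(input: unknown): string {
  if (
    typeof input !== "object" ||
    input === null ||
    typeof (input as { userId?: unknown }).userId !== "string"
  ) {
    throw new Error("Invalid input: expected { userId: string }");
  }
  return (input as { userId: string }).userId;
}

// Forgetting a check like this is exactly where runtime errors creep in.
console.log(parseUserId({ userId: "123" }));
```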
Developer experience. TanStack Start has a lightning-fast Vite dev server, instant HMR, explicit SSR. Predictable local dev makes coding joyful.
Next.js uses Turbopack or Webpack. HMR is slower, heavier dev tooling, but it comes with powerful production defaults: automatic image optimization, caching, SEO, ISR.
Reality: TanStack Start equals dev-first happiness, Next.js equals production-first performance. Both are valid; hype-driven debates are useless.
Caching and data fetching. TanStack Start gives explicit caching per route. You can fine-tune everything, but one misconfiguration and you’re chasing ghosts. Next.js has built-in caching through ISR and streaming. Less manual work, but you need to understand it or you get surprising bugs.
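The Next.js side of that caching point is mostly declarative. An ISR-style sketch (route and URL are made up; the `revalidate` export and the `next.revalidate` fetch option are the documented knobs):

```typescript
// app/posts/page.tsx -- hypothetical route
// Re-generate this page in the background at most every 60 seconds (ISR).
export const revalidate = 60;

export default async function PostsPage() {
  // Per-request cache control: Next.js extends fetch with a `next` option
  // so this data is also revalidated every 60 seconds.
  const res = await fetch("https://example.com/api/posts", {
    next: { revalidate: 60 },
  });
  const posts = await res.json();
  return null; // render posts here
}
```

This is the "less manual work" side: two small annotations, but surprising bugs if you don't know which cache layer is serving you.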
Hydration and streaming complexity. TanStack Start is explicit, predictable, clean. Easy to debug. Next.js streams server components, uses Suspense, smaller interactive payloads, faster TTI, but debugging is harder. Async boundaries will bite you if you don’t fully understand streaming logic.
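Streaming itself is easy to reason about outside any framework: instead of waiting for the slowest piece before sending anything, you flush each chunk as it resolves. A tiny framework-free sketch of the difference (function names are made up; this is an analogy for what Suspense boundaries enable, not Next.js internals):

```typescript
const delay = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// "Blocking SSR": nothing is sent until the slowest part finishes.
async function renderBlocking(parts: Array<() => Promise<string>>) {
  const chunks = await Promise.all(parts.map((p) => p()));
  return chunks.join("");
}

// "Streaming SSR": each chunk is flushed as soon as it's ready,
// so the shell appears fast and slow parts fill in later.
async function* renderStreaming(parts: Array<() => Promise<string>>) {
  for (const part of parts) {
    yield await part();
  }
}

const parts = [
  async () => "<header/>",
  async () => {
    await delay(50);
    return "<slow-widget/>";
  },
];

for await (const chunk of renderStreaming(parts)) {
  console.log("flushed:", chunk);
}
```

The debugging pain comes from the same property: by the time a slow chunk throws, the response has already started going out.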
Production vs dev tradeoffs. TanStack Start is lightweight, dev-first, predictable, makes you happy while coding. Next.js is production-first, bigger bundle, hybrid SSR complexity, optimized for user experience.
Stop judging these frameworks by hype tweets or gzip numbers. Look at SSR control, hydration, developer experience, caching, type safety, and real production performance. That’s the real comparison.

Just switched @neatlogs linting & formatting from ESLint + Prettier to @biomejs + Ultracite V7 🚀
14k+ lines of code changed, but definitely worth it.
Shoutout to @haydenbleasel for building Ultracite - setup was quick


@breathMessi21 @arpit_bhayani It's good for daily use, and I don't think he is someone who is always glued to his phone or plays games on it.
So the Pixel fits his priorities

@arpit_bhayani Mid-range performance for a one-lakh-plus phone 😂😂

I think going into 2026, as a developer, you'll have to accept that AI is "a thing" and won't go away.
Take advantage of it or ignore it (if you can) but don't hope for it to disappear.
I think there's a bubble that will burst at some point but that does not imply that the technology will disappear.
It does have its use and it does change things.
But just to be clear: AI is NOT that thing that will make you as a developer obsolete. Vibe coding is not the future for devs. It may be good for one-off, throwaway software but that's it.
Instead, I'm convinced that using & controlling AI assistants is the future, just as we used auto-completion before AI.
Obviously, the amount of usage will depend on the problem you're tackling, and it's easy to rely on AI too much.
I've said it before: Don't limit your skill level to that of the AI you're using. Instead, combine your skill & knowledge with AI assistants.
At least, that does work for me.