Dev Patel
@devnp2007

CSE'29 @ Nirma 🇮🇳 | Robotics | Hardware | Software | AI ML | Devops | Chips

India · Joined May 2023
93 Following · 62 Followers
1.1K posts

Pinned Tweet
Dev Patel @devnp2007 ·
Pick one vertical first, then gain breadth, not the other way around; learning broadly and only specialising later always ends up feeling like wasted time. So for the next 3 months I have decided to pick "AI/ML" as my only major vertical: from scratch > agents > deployment.
Dev Patel @devnp2007 ·
My feed is full of cracked people celebrating papers accepted at top conferences like ICML, etc. I must commit to getting one published, or being part of one, in the next 18 months. Remind me every month (as I tend to forget).
Mingchen Zhuge @MingchenZhuge ·
The terminology "Neural Computers" is now officially endorsed by Jürgen Schmidhuber @SchmidhuberAI (JS) and Andrej Karpathy @karpathy (AK)! Yesterday at Sequoia's AI Ascent, the host asked: "What's something obviously important in 2026 but still in its infancy, like websites in the 90s, mobile apps in 2010, or SaaS in the cloud era?" AK answered: completely Neural Computers! Watch @karpathy's talk: youtu.be/96jN2OCOfLs?si… | My blog: metauto.ai/neuralcomputer/ | Paper: arxiv.org/pdf/2604.06425
Hero Of Justice @herooffjustice ·
@sharpeye_wnl Don't. Just give it to someone and pay them. I'm doing it now, and I should've done it long before.
ShaRPeyE @sharpeye_wnl ·
My whole life is going to pass just writing assignments.
Dev Patel @devnp2007 ·
@DirhousssiAmine What laptop specs do you recommend, so I neither overspend nor underspend?
Dev Patel @devnp2007 ·
AM I DOING GOOD?
Dan Kornas @DanKornas ·
The Kaggle Book, Second Edition by Luca Massaron (@lucamassaron), Bojan Tunguz (@tunguz), and Konrad Banachewicz is a practical guide to getting better at machine learning through competitive data science.

The reason Kaggle still matters is simple: it forces honesty. Your model either generalizes or it doesn't. Your validation setup either matches the real problem or it doesn't. Your metric either rewards the behavior you care about or it quietly pushes you in the wrong direction.

That is why this book is useful for AI engineers, not just Kaggle competitors. A lot of AI work now happens behind clean APIs and strong foundation models. But the hard parts are still the same: evaluation, data quality, leakage, error analysis, experimentation, and knowing whether a model actually improved or just looked better in a narrow test.

This book focuses on those muscles. You learn how to think through:
- validation without fooling yourself
- public vs private performance
- messy datasets and leakage
- evaluation metrics and objective functions
- tabular modeling with feature engineering, gradient boosting, tabular deep learning, and AutoML
- hyperparameter tuning, blending, stacking, and ensembling
- computer vision, NLP, time series, simulation, optimization, and GenAI / LLM competitions
- Kaggle Notebooks, Datasets, Models, and Discussions as a learning workflow

The second edition is 708 pages and includes strategies from 30+ Kaggle Masters and Grandmasters. That matters because a lot of ML skill is not learned from clean tutorials. It is learned by seeing how strong practitioners debug, compare models, read failure cases, and decide what to try next. That transfers directly into real AI system work.

The tools keep changing. Models get stronger. APIs get easier. But engineers still need the same fundamentals: evaluation discipline, data intuition, careful experiments, and the ability to improve a system without lying to themselves about why it improved. That is the muscle Kaggle builds.

If you are an AI engineer, ML engineer, data scientist, or builder who wants more practical reps, this is a strong resource to add to your learning stack. It is a practical way to sharpen the instincts you bring back to real ML work. Thanks to @PacktPublishing for the book.
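The "validation without fooling yourself" point above can be made concrete without any ML library. This is a generic illustration (not from the book) of a leakage-free k-fold split: every row is held out exactly once, and no row ever appears in both the train and test side of the same fold.

```python
# Minimal sketch of k-fold cross-validation indices, pure Python.
# kfold_indices is a hypothetical helper, not a library function.

def kfold_indices(n_rows, k):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation.

    The first n_rows % k folds get one extra row, so all rows are
    covered exactly once across the k test folds.
    """
    fold_sizes = [n_rows // k + (1 if i < n_rows % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test_idx = list(range(start, start + size))
        # Everything outside the test fold is training data: no overlap,
        # so no leakage from test rows into the fit.
        train_idx = [i for i in range(n_rows) if i < start or i >= start + size]
        yield train_idx, test_idx
        start += size

for train, test in kfold_indices(10, 3):
    assert not set(train) & set(test)  # train/test never share a row
```

The same discipline applies with real libraries (e.g. scikit-learn's splitters): the check that matters is not the API, it's that preprocessing and feature statistics are fit only on the train side of each fold.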
Yash Panditrao @yashpanditrao ·
Someone in my network has an AI Research Intern role open. Remote, paid (up to 1.5L per month), building agents, evals, and RL environments for trading and similar use cases. If you're actually solid with LLMs and Python and have solid projects, I'm happy to make the intro for the right person.
initlayers @initlayers ·
We're looking for a Research Assistant (Embedded AI Systems) at the Language Technologies Research Center, IIITH. This role is focused on building and optimizing real-world on-device AI systems.

The work involves developing embedded Linux pipelines (Buildroot-based devices), handling real-time audio/video data transfer, and running AI models directly on edge hardware. On the mobile side, you'll work with on-device models (Gemma with LiteRT, MLKit, speech systems), and upcoming work includes optimizing inference using Qualcomm's QNN stack on NPUs/GPUs.

This is hands-on systems work. You should be interested in low-level programming (C/C++), comfortable exploring Linux/embedded systems, and curious about performance optimization and hardware-accelerated AI. Prior experience helps, but willingness to learn and build matters more.

The work is tied to building AI-enabled accessibility systems for visually and speech-impaired users, so there is clear real-world impact alongside the technical depth.

Duration is 2 months (on-campus), with possible extension. Stipend is ₹10,000/month. If this aligns with what you want to work on, reach out.
Dev Patel @devnp2007 ·
@kevinxu But 20 years ago Elon was not like us; he was a millionaire building SpaceX and Tesla.
Kevin Xu @kevinxu ·
Today these 4 men decide the fate of humanity, but 20 years ago they were just like you and me.
Dev Patel @devnp2007 ·
@ThisIsBhandari The only issue is the initial screening round: due to AI, many people fork GitHub projects or copy from YouTube, so the importance of CGPA might increase since you cannot game that metric. But strong fundamentals and knowing your profile well are enough, I guess?
Devaansh Bhandari @ThisIsBhandari ·
This is exactly what I've been saying. People are grinding LeetCode and DSA, but many startups have already shifted to system design and live coding rounds. As we move deeper into the AI era, being a strong builder and understanding real-world systems matters far more.

DSA and on-campus prep have been overhyped for years, especially in the Indian engineering ecosystem. You really see this when you try to switch or explore off-campus opportunities. Most of what you prepare for on campus doesn't translate well to the actual market.

In the global market, it's even clearer. Companies don't care much about your college, branch, or CGPA. What actually matters:
• Skills
• Projects
• Proof of work

If you're in your 1st or 2nd year, focus on building. Ship projects. Share them publicly. Document your journey. Make your LinkedIn, X, GitHub, and resume strong. In today's market, that will take you much further than just a degree.
Quoting R𝛼m 🦅 @rambuilds_ :
If you're looking for a switch, you should really read this Reddit post; quite an insightful read.
Dev Patel @devnp2007 ·
@ThisIsBhandari The possibilities are endless. Once you get into the field you can go into ML compilers, kernel engineering, etc., or even transition to chip design, robotics, and frontier tech, rather than chasing CM on Codeforces, where most people cheat anyway. Even 10% of that time spent on Kaggle will give you more return, plus a goated community.