Intelligent Robotics and Vision Lab @ UTDallas

77 posts


@IRVLUTD

Welcome to the X page of the Intelligent Robotics and Vision Lab @UT_Dallas

ECSS 4.222, UT Dallas, TX · Joined October 2021
17 Following · 380 Followers
Intelligent Robotics and Vision Lab @ UTDallas reposted
Jishnu Jaykumar Padalunkal @jishnu_jaykumar
Had the pleasure of meeting @rao2z during #ECSAIDays2026 at our demo booth. I’ve been following his work since my undergraduate days, so this was a genuinely special moment. Really appreciated the time he took to stop by, engage with our demo, and share thoughts. Moments like these make the long hours behind the scenes feel worth it. @IRVLUTD @UT_Dallas @UTDCompSci @UTDResearch #Robotics #AI #Research
Intelligent Robotics and Vision Lab @ UTDallas reposted
Sai Haneesh Allu @saihaneesh_allu
Excited to show our HRT1 framework live today at #ECSAIDays2026. If you're curious how robots mimic human actions, especially outside controlled setups, we’ll be running real demos and sharing some of the practical challenges behind it. Visit us at: 📍 ECS West 2.100, UT Dallas 🗓️ April 30 from 11 AM @UT_Dallas @IRVLUTD
Jishnu Jaykumar Padalunkal @jishnu_jaykumar

Stepping out of the lab and into the real world. Today at #ECSAIDays2026, we’ll be presenting a live demo of #HRT1 — a system that transfers human demonstrations to robot actions for mobile manipulation.
🔗 irvlutd.github.io/HRT1/
This short timelapse is a behind-the-scenes glimpse of what it actually takes. From moving the robot across campus, navigating buildings, setting up hardware, testing repeatedly, to making everything work outside controlled environments — a lot goes into what eventually looks like a “simple demo.” Especially in smaller labs, it’s all hands-on, end-to-end effort.
Also, when it’s just two of us running a two-person setup and trying to film it, the camera doesn’t always cooperate 😄 — but the live demo will be much more fun.
Grateful to be building this with @saihaneesh_allu at the @IRVLUTD. If you’re around, drop by and see it live.
📍 ECSW, UT Dallas
⏳ Apr 30, 2026
Thanks to @Sriraam_UTD, @VibhavGogate, @YuXiang_IRVL and Tyler Summers for the opportunity and support. @UT_Dallas @UTDCompSci @UTDResearch

Intelligent Robotics and Vision Lab @ UTDallas reposted
Jishnu Jaykumar Padalunkal @jishnu_jaykumar
🛠️ While working on #iTeach (lnkd.in/gj9s2eJR), where unseen object instance segmentation (UOIS) was a key task, I kept hitting the same wall: every UOIS dataset had its own loader and quirks. Getting data ready for training meant writing custom glue for each — time that should have gone into the models themselves.
✨ So I put together uois_toolkit — a small PyTorch library that wraps 5 popular UOIS datasets (Tabletop, OCID, OSD, Robot Pushing, iTeach-HumanPlay) behind one API.
🚀 Features:
📦 Load any dataset in 3 lines
📊 Compute F1, IoU, Precision, and Recall with a single call
⚡ Plug directly into PyTorch Lightning
🤖 Works out of the box with robotics pipelines
💡 The idea is simple — take the friction out of data pipelines so more time can go into model building.
🎉 It has picked up around 2K downloads on PyPI since release, which was a nice surprise.
🙏 Sharing it a bit more openly now in case others find it helpful.
❤️ Huge thanks to the original dataset authors, whose open codebases made this possible. And to Avaya Aggarwal and Animesh Maheshwari for testing and feedback along the way.
🔗 GitHub: lnkd.in/grPU7rx5
📥 PyPI: lnkd.in/gugFzGWr
@IRVLUTD @UT_Dallas #Robotics #RobotPerception #ComputerVision #PyTorch #Lightning #OpenSource
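The metrics the post lists (F1, IoU, precision, recall) all reduce to simple counts over binary masks. A toy illustration in plain Python — this is not the uois_toolkit API, just the underlying arithmetic:

```python
# Segmentation metrics on binary masks, from true/false positive/negative
# counts. Illustrative only; uois_toolkit computes these with a single call.

def mask_metrics(pred, gt):
    """Precision, recall, F1, and IoU for two flat binary masks (0/1 lists)."""
    tp = sum(1 for p, g in zip(pred, gt) if p == 1 and g == 1)  # hit
    fp = sum(1 for p, g in zip(pred, gt) if p == 1 and g == 0)  # false alarm
    fn = sum(1 for p, g in zip(pred, gt) if p == 0 and g == 1)  # miss
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    iou = tp / (tp + fp + fn) if tp + fp + fn else 0.0
    return {"precision": precision, "recall": recall, "f1": f1, "iou": iou}

# Predicted mask hits 3 of 4 ground-truth foreground pixels, with 1 false alarm
pred = [1, 1, 1, 0, 1, 0]
gt   = [1, 1, 1, 1, 0, 0]
m = mask_metrics(pred, gt)
# precision = 3/4, recall = 3/4, IoU = 3/5
```

The same counts generalize per object instance for UOIS-style evaluation, where masks are matched between prediction and ground truth before scoring.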
Intelligent Robotics and Vision Lab @ UTDallas reposted
Yu Xiang @YuXiang_IRVL
Great to have @Jesse_Y_Zhang visiting us @IRVLUTD today! He shared his journey toward generalist robotics reward models (RoboCLIP, ReWiND, Robometer), followed by a great buffet with the lab.
Intelligent Robotics and Vision Lab @ UTDallas reposted
Jishnu Jaykumar Padalunkal @jishnu_jaykumar
🤖 Robots don't fail in the lab. They fail in the wild — clutter, occlusion, constantly changing environments. The real question: Can robots learn directly from these failures during deployment? How about teaching robots the way we'd teach a child — by showing them where they went wrong? 🧵👇
Intelligent Robotics and Vision Lab @ UTDallas
Congratulations to Felipe!!
Luis Felipe Casas @lfcasas7

Thrilled to have won the Louis Beecherl, Jr. Graduate Fellowship for 2025–2026 from the Erik Jonsson School at @UT_Dallas! This prestigious merit-based award has been a game-changer for my PhD in Computer Science and research at the Intelligent Robotics and Vision Lab — real financial support and recognition. Grateful to the committee! Current/future UTD grad students: apply for the 2026–2027 graduate fellowships! It eases the journey and rewards excellence. Don't miss out. 🚀 #UTDallas #Fellowship

Intelligent Robotics and Vision Lab @ UTDallas reposted
Jishnu Jaykumar Padalunkal @jishnu_jaykumar
After ~2 years in stealth, I’m making #Robokit public. 🔗 github.com/jishnujayakuma… A small toolkit that ended up supporting much of @IRVLUTD's research. Supports: CLIP, GroundingDINO, MobileSAM, DepthAnything, FeatUp, SAMv2. Coming soon: SAM3, SAM3D + more FM tools. #Robotics
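A toolkit that fronts many foundation models behind one interface is often built around a lazy-loading registry: models are registered by name and instantiated only on first use, so importing the toolkit stays cheap. A minimal sketch of that design idea — the names here (ModelRegistry, register, load) are illustrative, not Robokit's actual API:

```python
# Lazy-loading model registry: loaders are registered up front, but a model
# is only constructed the first time it is requested, then cached and reused.

class ModelRegistry:
    """Map model names to zero-argument loader functions."""

    def __init__(self):
        self._loaders = {}  # name -> callable that builds the model
        self._cache = {}    # name -> constructed model instance

    def register(self, name, loader):
        self._loaders[name] = loader

    def load(self, name):
        # Construct on first use; subsequent calls return the same instance.
        if name not in self._cache:
            self._cache[name] = self._loaders[name]()
        return self._cache[name]

registry = ModelRegistry()
# In a real toolkit these loaders would build CLIP, GroundingDINO, etc.;
# here a placeholder string stands in for a heavyweight model object.
registry.register("clip", lambda: "clip-model-instance")

model = registry.load("clip")
```

This keeps heavyweight imports (checkpoints, CUDA initialization) off the critical path until a pipeline actually needs a given model.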
Intelligent Robotics and Vision Lab @ UTDallas
(10/11) The standout aspect: no training needed. Full mobile manipulation with no learned policy — one shot, no RL, no fine-tuning. One human demo → real robot execution. And while most progress in the field is on tabletop tasks, #HRT1 shows how far unified mobility + manipulation can go.