Pedro Lopes
@plopesresearch

10.3K posts

Assoc. Prof. of Comp. Science @uchicagocs / Human Computer Integration Lab / XR, haptics, hw / HCI lab: https://t.co/lNuu6qMtRX / music: https://t.co/XcUgF18GAu

Chicago, IL · Joined December 2010
4.1K Following · 8.7K Followers
Pinned Tweet
Pedro Lopes @plopesresearch
I get a bunch of emails/comments about the single-take videos we do to showcase our lab's work, so we gathered them on one page: lab.plopes.org/videos.html (including an annotated history of how we started making these, and how they transformed over the years to become more improv!)
1 reply · 4 reposts · 34 likes · 3.1K views
Pedro Lopes @plopesresearch
@avizurlo For human objects, yes; for some other objects, other animal hands might be better grippers!
0 replies · 0 reposts · 0 likes · 13 views
avi @avizurlo
The terminal state of UMI grippers is human hands
4 replies · 0 reposts · 14 likes · 604 views
Jacy Reese Anthis @jacyanthis
Thrilled to be joining @GoogleDeepMind as a student researcher in SF! We're building a multi-agent system to scale AI safety research and ensure pluralistic alignment. I think this is a crucial piece of safe AGI development for cooperation across many diverse human and AI agents!
6 replies · 2 reposts · 65 likes · 5K views
Pedro Lopes @plopesresearch
@AndrewAsksHow I suspect a lot of CS labs would buy it. Consumers... Maybe unlikely?
0 replies · 0 reposts · 0 likes · 6 views
Andrew Forte @AndrewAsksHow
Would you buy this for $4,700?
34 replies · 1 repost · 142 likes · 188.9K views
Pedro Lopes @plopesresearch
@kanair I thought the question was perfectly posed (despite some weird answers you got). It's a really good one, simple and deep.
0 replies · 0 reposts · 0 likes · 4 views
Ryota Kanai @kanair
I asked this question because it's now common to talk with non-human agents (i.e. AI), and I thought maybe we can also talk with animals if there is a technological solution for a better inter-species interface.
Ryota Kanai @kanair

A random question. Could we ever ask a mouse a simple Yes/No question? I don't mean training them to press a "Yes" or "No" button for a reward. I mean genuinely asking something like "Are you hungry right now?" and getting a real answer. Is that possible?

5 replies · 0 reposts · 10 likes · 1.7K views
Dhruv Shah @shahdhruv_
I am excited to serve as a Sponsorship Chair for the upcoming @corl_conf in Austin, TX: the largest and most talent-dense gathering of robot learning researchers ever! 🤠 Please reach out to sponsorship@corl.org for details re: booths, demos, talent program, recruiting, ...
Conference on Robot Learning @corl_conf

Calling all researchers! 🤖 The CoRL 2026 website is officially live at corl.org with key dates for your submissions:
🗓 May 25: Abstract Submission
🗓 May 28: Full Paper Submission
🗓 Nov 9-12: Conference in Austin, TX
Send us your coolest work! #RobotLearning

3 replies · 4 reposts · 49 likes · 3.7K views
Pedro Lopes @plopesresearch
@lauriewired Awesome explanations as always! Indeed snares and most drum sounds are terrible through this algorithm
0 replies · 0 reposts · 0 likes · 139 views
LaurieWired @lauriewired
Telephone hold music sounds really bad, mostly because it's mapping complex instruments to a human throat. Many phone lines these days use CELP, Code-Excited Linear Prediction algorithms. Music breaks down in weird ways when you turn a piano into… speech.
29 replies · 26 reposts · 525 likes · 30.3K views
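To hear roughly what she means, here is a minimal sketch that forces music through a speech-style LPC analysis/resynthesis at telephone bandwidth. It illustrates the linear-prediction family that CELP belongs to, not an actual CELP codec; the input filename and all parameters are assumptions.

```python
# Speech-model coding applied to music: fit an all-pole ("vocal tract")
# filter per frame, crudely quantize the excitation, and resynthesize.
# Speech survives this; polyphonic spectra and drum transients do not.
import numpy as np
from scipy.io import wavfile
from scipy.signal import lfilter, resample_poly

def lpc(frame, order):
    """All-pole coefficients via the autocorrelation method (Levinson-Durbin)."""
    n = len(frame)
    r = np.correlate(frame, frame, "full")[n - 1:n + order]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0] + 1e-9
    for i in range(1, order + 1):
        prev = a.copy()
        k = -(r[i] + prev[1:i] @ r[1:i][::-1]) / err
        a[1:i] = prev[1:i] + k * prev[i - 1:0:-1]
        a[i] = k
        err *= 1.0 - k * k
    return a

fs_in, x = wavfile.read("input.wav")          # hypothetical music file
x = x.astype(float)
if x.ndim > 1:
    x = x.mean(axis=1)                        # fold to mono
x /= np.abs(x).max() + 1e-12
x = resample_poly(x, 8000, int(fs_in))        # narrowband telephone rate

hop, order = 160, 10                          # 20 ms frames, 10-pole model
win = np.hanning(hop)
out = np.zeros_like(x)
for s in range(0, len(x) - hop + 1, hop):
    frame = x[s:s + hop]
    a = lpc(frame * win, order)
    resid = lfilter(a, [1.0], frame)          # inverse filter -> excitation
    scale = np.abs(resid).max() + 1e-12
    resid = np.round(resid / scale * 3) / 3 * scale   # crude 7-level excitation
    out[s:s + hop] = lfilter([1.0], a, resid) # all-pole resynthesis

wavfile.write("hold_music.wav", 8000,
              (np.clip(out, -1, 1) * 32767).astype(np.int16))
```

Snares fare worst here because a short noise burst is exactly what the coarse excitation throws away, which lines up with Pedro's reply above.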
Loren @murloren
I am very happy to share the result of my internship at FAIR (Meta): V-JEPA 2.1: Unlocking Dense Features in Video Self-Supervised Learning, with @ylecun @AdrienBardes. Our approach learns dense, spatially coherent features from video while preserving strong global understanding.
5 replies · 12 reposts · 77 likes · 5.3K views
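As a purely schematic illustration of the dense-vs-global distinction in that announcement (not V-JEPA 2.1 code; every shape and name below is an assumption): a video transformer emits one token per space-time patch, which you can keep as a spatially organized feature map or pool into one global embedding.

```python
# Dense features: one vector per space-time patch (spatially coherent map).
# Global feature: the same tokens pooled into a single clip-level vector.
import torch

B, T, H, W, D = 2, 8, 14, 14, 768        # batch, frames, patch grid, channels
tokens = torch.randn(B, T * H * W, D)    # stand-in for a video encoder output

dense = tokens.reshape(B, T, H, W, D)    # per-patch features, e.g. for segmentation
global_emb = tokens.mean(dim=1)          # pooled embedding, e.g. for classification

print(dense.shape, global_emb.shape)     # (2, 8, 14, 14, 768) and (2, 768)
```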
Pedro Lopes @plopesresearch
@gabriberton Indeed, they literally agreed not to use LLMs and then... used them
0 replies · 0 reposts · 0 likes · 153 views
Gabriele Berton @gabriberton
ICML found a very smart way to detect highly negligent reviewers, and desk-rejected their papers. And yet I saw posts complaining about it??? These are reviewers who fed the paper to an LLM and copy-pasted its output without reading it, despite agreeing not to use LLMs! I wish ICML named these reviewers so I'd know who to never choose as a co-author. Desk rejecting is too kind.
ICML Conference @icmlconf

To ensure compliance with peer-review policies, ICML has removed 795 reviews (1% of the total) by reviewers who used LLMs when they had explicitly agreed not to. Consequently, 497 papers (2% of all submissions) by these (reciprocal) reviewers have been desk rejected. Details in the blog post 👇

4 replies · 2 reposts · 54 likes · 5.2K views
Pedro Lopes @plopesresearch
New AI tool/system that leverages vision models and helps plan physical movements: not with a robot, but by stimulating the body with electrical muscle stimulation to perform challenging physical tasks! Read more below!
Romain Nith @romainnith

"What if AI could show you how to perform a physical task through your own body?" I am excited to share our latest #CHI2026 Best Paper where we demonstrate an Embodied-AI for physical tasks using muscle stimulation 🔗 embodied-ai.tech @yun_yh_ @plopesresearch

0 replies · 0 reposts · 3 likes · 192 views
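For readers new to EMS systems like this one, the usual pattern in this line of work is a closed control loop: compare a planned pose against the tracked one and modulate stimulation intensity within a per-user calibrated safe range. A minimal sketch of that pattern follows; it is not the paper's implementation, and read_wrist_angle / set_stimulation_ma are hypothetical stand-ins for real tracker and stimulator APIs.

```python
# Proportional closed-loop EMS sketch: drive a joint toward planned
# waypoints by scaling stimulation with the remaining angular error.
import time

KP = 0.15                  # assumed gain: mA per degree of error
I_MIN, I_MAX = 0.0, 12.0   # assumed per-user calibrated comfort/safety bounds (mA)

def ems_step(target_deg, read_wrist_angle, set_stimulation_ma):
    """One control tick; returns the remaining error in degrees."""
    error = target_deg - read_wrist_angle()
    # Stimulate the agonist proportionally; never exceed the calibrated range.
    intensity = max(I_MIN, min(I_MAX, KP * max(error, 0.0)))
    set_stimulation_ma(intensity)
    return error

def follow_trajectory(waypoints_deg, read_wrist_angle, set_stimulation_ma,
                      tolerance_deg=3.0, tick_s=0.02):
    """Step through a planned joint trajectory at a 50 Hz control rate."""
    for target in waypoints_deg:
        while ems_step(target, read_wrist_angle, set_stimulation_ma) > tolerance_deg:
            time.sleep(tick_s)
    set_stimulation_ma(0.0)   # always release the muscle when done
```

In the tweet's system the waypoints would come from the vision model's plan; this sketch only shows the actuation half of that loop.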
Pedro Lopes reposted
University of Chicago Neuroscience Institute
Next Thursday, March 26, we will host a Special Seminar with Dr. Angelos Barmpoutis from the University of Florida. Join us! “Building Real-World AI Infrastructure for Diagnosis in Parkinsonism: From Multisite Clinical Data Collection to Commercial and Regulatory Readiness”
0 replies · 2 reposts · 1 like · 112 views
Arata Jingu @artjng
Excited to share my final PhD paper, "HapticPipette," which I presented remotely at #AugmentedHumans2026 this week!

While my previous research focused on haptic rendering and generation, this project focuses on their fundamental issue: the "measurement bottleneck." The success of large vision and audio models is built on collecting massive amounts of real-world data via ubiquitous cameras and microphones. In contrast, acquiring physical properties like compliance (the non-linear force-displacement relationship that strongly contributes to perceived softness) requires active contact and precise recording. Because measuring this macroscopic deformation typically relies on bulky instruments, we lack the mobile tools needed to scale real-world haptic data collection.

HapticPipette introduces an "egocentric force-displacement measurement" approach that turns the everyday action of pressing an object into a sensing interaction. By combining a thin, flexible finger-worn force sensor with a head-worn RGB-D camera, even novice users can capture the complex non-linear compliance of everyday objects entirely in situ, as shown in our studies. Crucially, this setup is well compatible with other natural finger interactions, such as existing MR hand gestures.

As AI advances in haptics (like my prior work, Scene2Hap), scaling real-world data collection is the next major bottleneck. Given its simplicity, our approach could eventually enable in-situ robotic data collection to truly scale this process, unlocking both realistic haptic generation and broader physical AI capabilities, such as training robotic hands to grasp non-rigid objects effectively.

A huge shoutout to my co-authors, Jianhui Yan, Maja Fehlberg, Roland Bennewitz, and my advisor, Jürgen Steimle!
2 replies · 1 repost · 24 likes · 1.5K views
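To make the force-displacement idea concrete: once the finger-worn sensor and the head-worn RGB-D camera yield synchronized force and press-depth streams, a compliance estimate can be as simple as fitting a non-linear loading curve to one press. The sketch below is illustrative only, not the paper's method; the power-law model F = k·d^n and all numbers are assumptions.

```python
# Fit a power-law loading curve F = k * d^n to one press of an object,
# given synchronized displacement (mm) and force (N) samples.
import numpy as np

def fit_compliance(displacement_mm, force_n):
    """Return (k, n) of F = k * d^n, fitted in log-log space."""
    d = np.asarray(displacement_mm, dtype=float)
    f = np.asarray(force_n, dtype=float)
    mask = (d > 0.05) & (f > 0.01)            # drop pre-contact noise
    n, log_k = np.polyfit(np.log(d[mask]), np.log(f[mask]), 1)
    return np.exp(log_k), n

# Synthetic press on a soft object (true k = 0.8, n = 1.6) with sensor noise.
rng = np.random.default_rng(0)
d = np.linspace(0.0, 5.0, 200)                # press depth in mm
f = 0.8 * d ** 1.6 + rng.normal(0.0, 0.02, d.size)
k, n = fit_compliance(d, f)
print(f"k = {k:.2f}, n = {n:.2f}")            # higher k -> stiffer object
```

The non-linearity (n ≠ 1) is exactly the part the tweet notes a single spring constant would miss.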
Pedro Lopes reposted
Romain Nith @romainnith
"What if AI could show you how to perform a physical task through your own body?" I am excited to share our latest #CHI2026 Best Paper where we demonstrate an Embodied-AI for physical tasks using muscle stimulation 🔗 embodied-ai.tech @yun_yh_ @plopesresearch
Pedro Lopes @plopesresearch

Text or video is not the only form AI can take. In embodied-ai.tech (#CHI2026, Best Paper 🏆) we create an embodied AI that acts via muscle stimulation to perform physical tasks: e.g., place a bike on a bus rack and more! youtube.com/watch?v=pJM2Z8… by @yun_yh_ @romainnith

0 replies · 0 reposts · 4 likes · 394 views
Pedro Lopes @plopesresearch
@OshoDayo Was a great talk and paper. Congratulations once more!
1 reply · 0 reposts · 0 likes · 91 views
Pedro Lopes reposted
Akihiko Murai 🔩 @OshoDayo
At AHs2026, "Induced Human-Driven Actuation: Controlling Timing of Human Movement via Constraint-Release Mechanism" received the BEST PAPER AWARD. Thank you! And congratulations, Suzuki-kun! > augmented-humans.org #AHs2026
2 replies · 4 reposts · 22 likes · 1.2K views
Pedro Lopes @plopesresearch
The highlight of #AugmentedHumans2026 was definitely Masa's incredible performance while DJing and VJing using his #BCI, eye-tracking devices, and robotic arms! Super energetic performance and a great panel at the end. #AHs2026 #AHs26
Masatane Muto (武藤将胤) WITH ALS / EYE VDJ MASA @Masatane_Muto

[On 3/18 I performed as EYE VDJ MASA (Masatane Muto) at the international conference Augmented Humans 2026, currently being held in Okinawa! It was a huge success, and I am truly grateful to everyone!]

I performed BRAIN BODY JOCKEY, a live VDJ performance driven by my gaze while controlling a robotic arm with my brain waves!! Since much of the audience was from overseas, the energy was amazing and I was thrilled! I also joined a talk session in English with the international guests, using cross-lingual speech-synthesis technology!

After arriving in Okinawa yesterday we went straight into rehearsal with the whole team until late, so I headed into the show without much of an Okinawa feeling, but the conference was held in a wonderful hall surrounded by nature! I was delighted that so many researchers from abroad got to know about our challenge and were amazed by it! Thank you to everyone involved! I plan to enjoy Okinawa until I head back to Tokyo tomorrow!

augmented-humans.org #AugmentedHumans2026 #AHS2026 #BRAINBODYJOCKEY #B2J #WITHALS #ALS #EYEVDJMASA #武藤将胤

0 replies · 3 reposts · 25 likes · 2.9K views