

Christopher L Dutton (@ChrisLDutton)
Asst Professor of Ecology (Ecosystems & Animals) @UFBiology | Postdoc @UF Biology/Anthropology | PhD @yale_eeb

Simulated lab worlds straight from pixels. We scan the lab scene with depth cameras and reconstruct the environment. Each object already has a high-quality mesh in our object library, produced by our mesh scanning system. A localization pipeline places those meshes into a shared coordinate frame, creating a digital version of the lab.

This works well for science automation because the environment is constrained: the set of objects is limited, we have accurate models of them, and we don’t need to solve the full open-world perception problem.

With this digital lab in place we can:
• Drive safe motion planning around real geometry
• Track state changes by re-localizing mesh parts (e.g. a lid opening)
• Attach objects to the robot when picked, so the planner accounts for them and avoids collisions

Built by @BastotdeHeijden @BlerimAbdullai @TahirMello
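The core bookkeeping behind two of the steps above — placing library meshes into a shared world frame, and attaching a picked object so it moves rigidly with the robot — comes down to composing homogeneous transforms. Here is a minimal sketch in plain NumPy; the object names, poses, and the `pose` helper are all hypothetical, standing in for whatever the actual localization pipeline and planner use:

```python
import numpy as np

def pose(R=None, t=(0.0, 0.0, 0.0)):
    """Build a 4x4 homogeneous transform from an optional rotation and a translation."""
    T = np.eye(4)
    if R is not None:
        T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical output of the localization pipeline: each library mesh gets an
# estimated pose in the shared world frame of the reconstructed lab scene.
world = {
    "beaker_250ml": pose(t=(0.40, 0.10, 0.90)),
    "centrifuge_lid": pose(t=(0.75, -0.20, 1.05)),
}

# Attaching a picked object: record its pose relative to the gripper at grasp
# time, so the planner can treat gripper + object as one rigid body and run
# collision checks on the combined geometry.
T_world_gripper = pose(t=(0.40, 0.10, 1.00))
T_gripper_obj = np.linalg.inv(T_world_gripper) @ world["beaker_250ml"]

# For any later gripper pose, the attached object's world pose follows rigidly:
T_world_gripper_new = pose(t=(0.55, 0.00, 1.10))
world["beaker_250ml"] = T_world_gripper_new @ T_gripper_obj
```

Re-localizing a mesh part (the lid example) is the same operation in reverse: a new estimated pose overwrites the stored world transform for that part, and the planner picks up the state change on its next query.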



