Pushing the boundaries of how machines perceive and interact with the world.
Fundamental research in scene understanding, 3D reconstruction, and robust detection. We develop algorithms that can handle complex lighting, occlusions, and dynamic environments.
Predicting human behavior through multi-modal trajectory forecasting. We model social and physical constraints to understand how humans move and interact in crowded spaces.
Developing robotic systems capable of autonomous navigation and task planning. Our research integrates perception directly into the control loop for intelligent interaction.
JRDB is a novel dataset and benchmark for egocentric robot visual perception, collected using a mobile robot platform. It features over 60 minutes of data with consistent identity tracking, 3D bounding boxes, and human activity labels.