Title: Sensing Touch from Vision for Humans and Robots
Date: Tuesday, November 28, 2023
Time: 2:00pm-4:00pm EST
Location: Klaus 1116
Zoom: https://gatech.zoom.us/j/96467976963?pwd=MkxSUDFnaFJ6eFp0dXBkMHNQU3BtZz09
Patrick Grady
Robotics PhD Student
School of Electrical and Computer Engineering
Georgia Institute of Technology
Committee:
Dr. James Hays (Advisor) – School of Interactive Computing, Georgia Tech
Dr. Charlie Kemp (Advisor) – CTO, Hello Robot
Dr. Seth Hutchinson – School of Interactive Computing, Georgia Tech
Dr. Animesh Garg – School of Interactive Computing, Georgia Tech
Dr. Chengcheng Tang – Meta Reality Labs
Abstract:
To affect their environment, humans and robots use their hands and grippers to push, pick up, and manipulate the world around them. At the core of this interaction is physical contact, which determines the underlying mechanics of the grasp. While contact is useful for understanding manipulation, it is difficult to measure. In this thesis, we explore methods to estimate contact between humans, robots, and objects using easy-to-collect imagery. First, we demonstrate a method that leverages subtle visual changes to infer the pressure between a human hand and a surface from RGB images. We initially explore this work in a constrained laboratory setting, but also develop a weakly supervised data collection technique to estimate hand pressure in less constrained settings. A parallel approach allows us to estimate the pressure and force that soft robotic grippers apply to their environments, enabling precise closed-loop control of a robot. Finally, we develop a joint pose and contact estimator which may generalize to internet-scale images. Our model leverages multiple heterogeneously labeled datasets as well as images with contact labeled by human annotators. Overall, this thesis makes progress towards understanding human and robot manipulation from visual sensing alone.