Title: Sensing Touch from Images for Humans and Robots

 

Date: Friday, June 16, 2023

Time: 10am - Noon ET

In-Person Location: Klaus 1447 

Virtual Link: https://gatech.zoom.us/j/95907899998?pwd=WGNwN29JaDlyOVBNenZsRDBrZVZsZz09

 

Patrick Grady

Robotics Ph.D. Student

School of Electrical and Computer Engineering
Georgia Institute of Technology

 

Committee

Dr. Charlie Kemp (Advisor), Department of Biomedical Engineering, Georgia Tech

Dr. James Hays, School of Interactive Computing, Georgia Tech

Dr. Seth Hutchinson, School of Interactive Computing, Georgia Tech

Dr. Animesh Garg, School of Interactive Computing, Georgia Tech

Dr. Chengcheng Tang, Meta Reality Labs

 

Abstract

To affect their environment, humans and robots use their hands and grippers to push, pick up, and manipulate the world around them. At the core of this interaction is physical contact, which determines the underlying mechanics of a grasp. While contact is highly useful for understanding manipulation, it is difficult to measure. In this proposal, we explore methods to estimate contact between humans, robots, and objects using easy-to-collect imagery. First, we demonstrate a method that leverages subtle visual changes to infer the pressure between a human hand and a surface using RGB images. We initially explore this work in a constrained laboratory setting, but also develop a weakly-supervised data collection technique to estimate hand pressure in less constrained settings. A parallel approach allows us to estimate the pressure and force that soft robotic grippers apply to their environments, enabling precise closed-loop control of a robot. Finally, I propose extending the hand pressure estimation work by leveraging data labeled by human annotators to build a robust egocentric hand contact estimator. This estimator will be used to analyze human behavior in egocentric datasets and to identify patterns in how people interact with their environment.