Title: Human-Centered Embodied Intelligence for Assistive Robots
Date: Friday, December 12, 2025
Time: 2:00 - 4:00 PM EST
Location: Coda C1215 Midtown (Zoom Link)
J. Taery Kim
Ph.D. Student in Computer Science
School of Interactive Computing
Georgia Institute of Technology
https://taery-kim.notion.site/
Committee Members
Dr. Sehoon Ha (Advisor) - School of Interactive Computing, Georgia Institute of Technology
Dr. Greg Turk - School of Interactive Computing, Georgia Institute of Technology
Dr. Bruce N. Walker - School of Psychology & School of Interactive Computing, Georgia Institute of Technology
Dr. Wenhao Yu - Staff Research Scientist at Google DeepMind Robotics
Dr. Donghyun Kim - CICS, University of Massachusetts Amherst
Abstract
Robots intended to assist or collaborate with people must not only perceive their environments but also understand human needs, social context, and interaction dynamics. Yet current embodied systems often address these challenges in isolation, limiting their ability to align with human expectations in real-world settings. This proposal presents a human-centered embodied intelligence framework that integrates user-grounded insights, interaction modeling, and context-adaptive navigation for assistive robots. First, I investigate the expectations and design preferences of blind or visually impaired individuals through two user-centered studies. These findings identify essential functional, aesthetic, and interaction requirements, establishing a grounded foundation for designing assistive robotic systems that integrate into users’ daily lives and broader social contexts. Next, I formalize the collaborative guidance task between humans and assistive robots and develop data-driven interaction models that capture human-robot dynamics during teamed navigation. These models enable safe, user-aligned guiding behaviors in real-world scenarios. Finally, I propose a context-aware object navigation framework for indoor environments to investigate how semantic and situational cues can shape navigation behavior and improve alignment with human expectations. Together, this work advances human-centered embodied intelligence for robots operating alongside people.