Read about our current human-robot interaction projects below.
We aim to better understand use cases of mobile robots that connect people with far-away environments through videoconferencing and navigation.
Robot expressiveness, from LED light displays to motion, shapes how people perceive and interact with robots. We work to better understand these expressive channels.
Limited options are available for young children who require motion interventions. We view assistive robots as a key opportunity for improvement in this space. This work is funded by the NSF National Robotics Initiative (CMMI-2024950) and the Caplan Foundation.
As robots become more common in everyday spaces, studying robotic art and other robots in the wild can help us understand perceptions of these systems.
Jon the Robot, our lab’s robotic stand-up comedian, helps us study autonomous humor skills to make your future robots and virtual assistants better.
Touch is an important part of human development and connection, but the touch abilities of robots are underdeveloped. We help robots feel things.
On-body systems that deliver haptic feedback have powerful potential for applications ranging from improving navigation aids to assisting in anxiety management.
We are studying how small and low-cost socially assistive robots can encourage frequent computer users to take breaks and be more active during the workday.