Read about our current human-robot interaction projects below.
We aim to better understand use cases of mobile robots that connect people with far-away environments through videoconferencing and navigation.
Robot expressiveness, from LED light displays to motion, shapes how people perceive and interact with robots. We work to better understand these effects.
Limited options are available for young children who require motion interventions. We view assistive robots as a key opportunity for improvement in this space. This work is funded by the NSF National Robotics Initiative (under award CMMI-2024950) and the Caplan Foundation.
As the population of older adults rises, we are investigating ways that technology can support healthy aging in the home and wellness in skilled nursing facilities. This work is supported by the NSF (under award IIS-2112633) and NIH (under award 1R01AG078124).
Jon the Robot, our lab’s robotic stand-up comedian, helps us study autonomous humor skills to make your future robots and virtual assistants better.
Touch is an important part of human development and connection, but the touch abilities of robots are underdeveloped. We help robots feel things. This work is funded by the Oregon Manufacturing Innovation Center (OMIC).
On-body systems that deliver haptic feedback have powerful potential for applications ranging from improving navigation aids to assisting in anxiety management. This work is funded by the OSU Advantage Accelerator.
We are studying how small and low-cost socially assistive robots can encourage frequent computer users to take breaks and be more active during the workday.