Read about our current human-robot interaction projects below.
We aim to better understand use cases of mobile robots that connect people with far-away environments through videoconferencing and navigation.
Few options are available for young children who need motion interventions, or for preteens who need accessible, one-on-one study skills support. We view assistive robots as a key opportunity for improvement in this space. This work is funded by the NSF National Robotics Initiative (under award CMMI-2024950) and the Caplan Foundation.
Robot expressiveness, from LED light displays to motion, shapes how people perceive and interact with robots. We work to better understand these topics.
As the population of older adults rises, we are investigating ways that technology can support healthy aging in the home and wellness in skilled nursing facilities. This work is supported by the NSF (under award IIS-2112633) and NIH (under award 1R01AG078124).
Jon the Robot, our lab’s robotic stand-up comedian, helps us study autonomous humor skills to make your future robots and virtual assistants better.
We are studying how small and low-cost socially assistive robots can encourage frequent computer users to take breaks and be more active during the workday.
Everyday robots are here! We are studying how to facilitate better policymaking related to robots and how to help the general public understand the realities of robotic systems.
Touch is an important part of human development and connection, but the touch abilities of robots are underdeveloped. We help robots feel things. This work is funded by the Oregon Manufacturing Innovation Center (OMIC).
On-body systems that deliver haptic feedback have powerful potential for applications ranging from improving navigation aids to assisting with anxiety management. This work is funded by the OSU Advantage Accelerator.