Expressive Robots

Robot expressiveness, from LED light displays to motion, shapes how people perceive and interact with robots. Our work aims to better understand this expressive space.


During interactions with other people, speech is only one part of how we communicate; a rich, multidimensional space of nonverbal expression is also essential. Nuanced nonverbal expression is likewise important in interactions with robots, but the research community has yet to fully understand this space and harness the full potential of nonverbal communication in human-robot interaction (HRI).


In studies involving robot light signals, sound, and facial expressions, we have begun to investigate human responses to and perceptions of different nonverbal robot signals. One approach to unlocking the potential of nonverbal robot signaling is to quantify human responses to these behaviors and share open-source resources from the explored space. Another is to create and validate models that can extend to future HRI work.


People

  • Brian Zhang (PhD Student)
  • Lilian Chan (Undergraduate Researcher)


Publications

  • Elizabeth Cha, Naomi T. Fitter, Yunkyung Kim, Terry Fong, and Maja Matarić, "Generating expressive light signals for appearance-constrained robots," Presented at the International Symposium on Experimental Robotics and awaiting publication, Buenos Aires, Argentina, 2018.
  • Elizabeth Cha, Naomi T. Fitter, Yunkyung Kim, Terry Fong, and Maja Matarić, "Effects of robot sound on auditory localization in human-robot collaboration," Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), Chicago, IL, USA, 2018.
  • Naomi T. Fitter and Katherine J. Kuchenbecker, "Designing and assessing expressive open-source faces for the Baxter robot," Proceedings of the International Conference on Social Robotics (ICSR), Kansas City, MO, USA, 2016.