Defining Cognitive Science
Multimodal Social Signal Processing for Human-Robot Interaction
Dr. Angelica Lim, Simon Fraser University
Date: Tuesday, November 5th, 1:00pm - 2:00pm
Location: RCB 6152
Abstract: Science fiction has long promised us interfaces and robots that interact with us as smoothly as humans do: Rosie the Robot from The Jetsons, C-3PO from Star Wars, and Samantha from Her. Today, interactive robots and voice user interfaces are moving us closer to effortless, human-like interactions in the real world. In this talk, I will discuss the opportunities and challenges in finely analyzing, detecting, and generating non-verbal communication in context, including gestures, gaze, auditory signals, and facial expressions. Specifically, I will discuss how we might allow robots and virtual agents to understand human social signals (including emotions, mental states, and attitudes) across cultures, as well as recognize and generate expressions with controllability, transparency, and diversity in mind.
Biography: Dr. Angelica Lim is the Director of the Rosie Lab and an Assistant Professor in the School of Computing Science at Simon Fraser University (SFU). Previously, she led the Emotion and Expressivity teams for the Pepper humanoid robot at SoftBank Robotics. She received her B.Sc. in Computing Science with an Artificial Intelligence Specialization from SFU, and her Ph.D. and M.Sc. in Computer Science (Intelligence Science) from Kyoto University, Japan. She and her team have received Best Paper Awards in Entertainment Robotics and Cognitive Robotics at IROS 2011 and 2022, and Best Demo and Late-Breaking Report Awards at HRI 2021 and 2023. She has been featured on the BBC and at TEDx, hosted a TV documentary on robotics, and was recently named one of Forbes' 20 Leading Women in AI. Her research interests include multimodal machine learning, affective computing, and human-robot interaction.