ACII 2022 Workshop on

Affective Human-Robot Interaction (AHRI)

Program

  • 08:45-09:00 Welcome

  • Prof. Johnathan Mell - Conflict Resolution and Negotiation with Social Virtual Agents

  • Prof. Xiaofeng Liu - On the road of affective computing in human-robot interactions

  • Robot-assisted personalized upper limb rehabilitation training based on reachable workspace

  • Multiple attention convolutional-recurrent neural networks for speech emotion recognition

  • [Best Paper] Non-parallel Controllable Cross-gender Voice Conversion Model with CycleGAN and Transformer for Social Robot

  • Shape parameters of UGV delivery robots that affect subject’s perception of safety

  • Affective Interaction in Domestic Service System: Emotion Evaluation and Regulation for the Older Adults

  • Perception of Multimodal Hedges in Communicative Behavior of a Companion Robot

  • Prof. Aiguo Song - Force Feedback Teleoperation Robot and Its Application in Rehabilitation

  • Prof. Hatice Gunes - Affective Computing for Humanoid Service Robotics

  • Prof. Adriana Tapus - Socially Acceptable Robots with Humour and Personalizable Behaviours based on User's Profile


In recent years, robotic applications have entered many aspects of our lives, especially healthcare services. In these applications, a user commonly interacts directly with a robot. In Human-Robot Interaction (HRI), trust and mutual adaptation are established and maintained through a positive social relationship between the robot and the human interactor, and rely on the robot's perceived competence on the social-emotional dimension. How a user perceives a robot's social intelligence and their social relationship with the robot can directly influence the outcomes of an HRI system, for example, whether the user decides to accept a recommendation from the robot. Moreover, in many HRI applications, social-emotional interaction with the intended users is the main goal of the system or a core strategy for achieving the desired outcomes. For example, HRI techniques can be applied to understanding human external behaviours and internal states, such as gesture and facial expression recognition, emotion and dimensional affect analysis, and the recognition of mental health conditions (e.g., depression, anxiety, bipolar disorder) and personality. Such affective HRI applications therefore require emotion awareness and social-emotional competence in the robot's functions to deliver acceptable services. In addition, for robots deployed in shared spaces with humans, even when direct HRI is not expected to occur, the social intelligence of robots, such as the ability to follow social norms or to predict human intentions, is key to safe and effective deployment.

This workshop provides a communication and collaboration platform for researchers from the human-robot interaction (HRI), emotion recognition, affective computing, deep learning, and healthcare communities. It will focus on the following research questions:

  1. How can unimodal or multimodal affective human behaviour be perceived adaptively and accurately in HRI?

  2. How can natural and affective robot behaviour be generated efficiently in HRI?

  3. How can affective HRI applications facilitate human users’ mental and physical well-being?


Our workshop will bring together researchers from HRI, affective computing, healthcare, and related fields to facilitate discussion and future collaborations. We expect this to advance affective HRI research and the benefits of affective computing systems for healthcare applications.


Important Dates

  • Submission deadline: 22 July 2022

  • Notification of acceptance: 10 Aug 2022

  • Camera-ready: 22 Aug 2022 (extended from 15 Aug)

  • Workshop date: 17 Oct 2022 JST (virtual)




Organizers

Dr. Chuang Yu

University of Manchester

Dr. Siyang Song

University of Cambridge

Dr. Leimin Tian

Monash University

Dr. Zhao Han

Colorado School of Mines

Prof. Xiaofeng Liu

Hohai University

Prof. Aiguo Song

Southeast University

Prof. Adriana Tapus

Institut Polytechnique of Paris