We are living in a world where cognitive robots can act as human assistants and have the potential to offer social experiences through human-robot interaction. To foster this coexistence, a robot should be able to infer human emotions, allowing for more meaningful and enriched interactions between robots and humans. This project aims to design an emotion recognition component that will help establish an affective loop between humans and robots.

To achieve this, I plan to benchmark classical machine learning algorithms and deep learning based approaches on well-known, publicly available state-of-the-art image and video datasets. Based on these evaluations, I propose to integrate the best-performing trained model into the emotion recognition component in the RoboComp codebase.
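As an illustration of the kind of evaluation harness the benchmarking step implies, the sketch below scores a trivial majority-class baseline with an accuracy metric. The label set, the synthetic data split, and the helper names (`accuracy`, `majority_baseline`) are all hypothetical placeholders, not part of the actual project code; a real run would substitute a dataset such as the ones to be benchmarked and the candidate models under comparison.

```python
import random
from collections import Counter

# Hypothetical emotion label set for illustration only.
EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground-truth labels."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def majority_baseline(train_labels, test_size):
    """Predict the most frequent training label for every test sample."""
    most_common = Counter(train_labels).most_common(1)[0][0]
    return [most_common] * test_size

# Synthetic stand-in for a real train/test split of labelled frames.
random.seed(0)
train = [random.choice(EMOTIONS) for _ in range(800)]
test = [random.choice(EMOTIONS) for _ in range(200)]

preds = majority_baseline(train, len(test))
print(f"majority-class accuracy: {accuracy(test, preds):.3f}")
```

Candidate models (classical or deep) would replace `majority_baseline`, and per-class metrics such as a confusion matrix would be reported alongside accuracy.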



Palash Agarwal


  • Luis J. Manso
  • Daniel Rodriguez Criado
  • Diego R. Faria