One of the robot's key capabilities is efficient communication with humans, through voice or gestures. EBO is currently equipped with components that enable social tasks such as face detection, emotion recognition, and people identification. The aim of this project is to build upon these existing capabilities by integrating a ‘Hand Gesture Recognition Component’. Hand gestures play a significant role in human-robot interaction, and this component will serve as a tool for communication between robots and humans. Furthermore, it will enable the robot to understand American Sign Language (ASL), providing an effective channel of communication.
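
As a rough illustration of what such a component might do, the sketch below detects hand landmarks in webcam frames and maps them to a coarse gesture label. The use of MediaPipe Hands and the simple finger-counting heuristic are assumptions made for this example only; they are not prescribed by the project.

```python
# Minimal sketch (illustrative only): detect hand landmarks from a webcam
# and classify a simple "open palm" vs "fist" gesture by counting extended
# fingers. MediaPipe Hands is an assumed detector, not the project's choice.
import cv2
import mediapipe as mp

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertip landmarks
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def classify(hand_landmarks):
    # A finger counts as extended if its tip lies above its PIP joint
    # (smaller y in normalized image coordinates, assuming an upright hand).
    extended = sum(
        hand_landmarks.landmark[tip].y < hand_landmarks.landmark[pip].y
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
    )
    if extended >= 4:
        return "open palm"
    if extended == 0:
        return "fist"
    return "other"

def main():
    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.5)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            print(classify(results.multi_hand_landmarks[0]))
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break
    cap.release()
    hands.close()

if __name__ == "__main__":
    main()
```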

RoboComp is also working on a Human Activity Recognition project. Hand gestures are a key element of both natural and social human activities, so integrating this component with the Human Activity Recognition component opens up further use cases, such as detecting commands given by hand, which adds value to this project.

Organization

RoboComp

Student

Kanav

Mentors

  • Francisco Andrés
  • Aditya Aggarwal

2020