Currently, the Liquid Galaxy project can only be controlled with a Space Navigator, a touch screen, or voice. It would be interesting to have more ways to interact with Google Earth, so my proposal is to develop a controller based on hand gestures.

The controller will be built with TensorFlow Lite and Python: the software classifies the user's gestures and sends the corresponding command to a local Node.js server, which actually interacts with Liquid Galaxy. In addition, Dialogflow and JavaScript will be used to create the Google Assistant app. A rough sketch of the classifier-to-server flow is shown below.
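To make the flow concrete, here is a minimal Python sketch of that pipeline. The model file name, the gesture labels, and the Node.js endpoint are all hypothetical placeholders for illustration, not details from the proposal:

```python
# Minimal sketch: classify one preprocessed frame with a TFLite model
# and forward the result to a local Node.js server.
# "gestures.tflite", the LABELS list, and the /command endpoint are
# assumed placeholders. tf.lite.Interpreter works the same way as
# the tflite_runtime interpreter used here.
import numpy as np
import requests
import tflite_runtime.interpreter as tflite

LABELS = ["zoom_in", "zoom_out", "rotate_left", "rotate_right"]  # placeholder gesture set

interpreter = tflite.Interpreter(model_path="gestures.tflite")
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
output_detail = interpreter.get_output_details()[0]

def classify(frame: np.ndarray) -> str:
    """Run one frame through the model and return the most likely gesture."""
    batch = frame[np.newaxis, ...].astype(np.float32)  # add batch dimension
    interpreter.set_tensor(input_detail["index"], batch)
    interpreter.invoke()
    scores = interpreter.get_tensor(output_detail["index"])[0]
    return LABELS[int(np.argmax(scores))]

def send_command(action: str) -> None:
    """Forward the recognized gesture to the local Node.js server."""
    requests.post("http://localhost:8080/command", json={"action": action}, timeout=1)
```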

The smartphone running the app will watch the user's hand movements through its camera and, using the trained model, check whether they match any of the gestures in the dataset. All of this will happen in real time; a possible capture loop is sketched below.
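A real-time loop could look roughly like the following sketch. It reuses the hypothetical classify() and send_command() helpers from the previous snippet, assumes a 224x224 model input size, and uses OpenCV for frame capture as a desktop stand-in for the smartphone camera:

```python
# Sketch of a real-time capture loop. The 224x224 input size and the
# camera index are assumptions; the actual app would read frames from
# the smartphone camera instead of OpenCV.
import cv2

def run(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    last_action = None
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Resize to the model's expected input and scale pixels to [0, 1].
            resized = cv2.resize(frame, (224, 224)) / 255.0
            action = classify(resized)
            # Only forward a command when the gesture changes, so the
            # server is not flooded on every frame.
            if action != last_action:
                send_command(action)
                last_action = action
    finally:
        cap.release()
```

Debouncing on gesture changes (rather than posting every frame) is one simple way to keep the Node.js server from receiving dozens of identical commands per second.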

Organization

Liquid Galaxy Project

Student

Bruno Faé Faion

Mentors

  • Marc Gonzalez
  • Andreu Ibanez
  • Iván Santos González
  • María Luz Mosteiro

2020