Interactive Web Demos using the MediaPipe Machine Learning Library
- Mentors
- Jen Person
- Organization
- TensorFlow
- Technologies
- JavaScript, HTML, TensorFlow, React, Tailwind CSS, MediaPipe
- Topics
- web, computer vision, accessibility, MediaPipe, gesture detection, interactivity
The COVID-19 pandemic has increased awareness of the hygiene risks associated with touchscreens, with reports indicating that 80% of people consider them unhygienic. Intuitive, touchless gesture-based systems can reduce transmission in public settings and workplaces while offering a seamless and convenient experience. Touchless technology is expected to remain popular across industries such as retail, healthcare, and hospitality.
In this proposal, I suggest developing an interactive web app that uses the MediaPipe Hands JS Solution API and simple human gestures to enable fully contactless interaction with interfaces. The app will showcase an augmented transaction panel previewed on the screen, letting users perform essential CRUD operations on items through simple custom gestures, without physical touch. Both custom-defined gestures and the pre-trained gestures from MediaPipe's tasks-vision API will be used to classify gestures and trigger events on the interface.
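The gesture-to-event flow described above could be sketched as follows. The recognizer setup (shown only in comments, since it requires a browser and a camera feed) assumes the pre-trained `GestureRecognizer` from `@mediapipe/tasks-vision`; the `GESTURE_ACTIONS` table and the `dispatchGesture` helper are illustrative assumptions, not part of the proposal itself.

```javascript
// Browser-side recognizer setup would look roughly like this
// (shown as a comment; it needs the @mediapipe/tasks-vision package,
// a WASM asset path, and a <video> element fed by getUserMedia):
//
//   import { FilesetResolver, GestureRecognizer } from "@mediapipe/tasks-vision";
//   const vision = await FilesetResolver.forVisionTasks(/* wasm asset path */);
//   const recognizer = await GestureRecognizer.createFromOptions(vision, {
//     baseOptions: { modelAssetPath: /* gesture_recognizer.task URL */ },
//     runningMode: "VIDEO",
//   });
//   const result = recognizer.recognizeForVideo(videoEl, performance.now());

// Hypothetical mapping from gesture labels to the panel's CRUD actions.
// Labels like "Open_Palm" and "Thumb_Up" are among those the pre-trained
// gesture model emits, but these action assignments are placeholders.
const GESTURE_ACTIONS = {
  Open_Palm: "create",
  Thumb_Up: "read",
  Victory: "update",
  Closed_Fist: "delete",
};

// Return the action for the top-ranked gesture on the first detected hand,
// or null when no gesture clears the confidence threshold.
function dispatchGesture(result, threshold = 0.6) {
  const top = result.gestures?.[0]?.[0]; // first hand, best-ranked category
  if (!top || top.score < threshold) return null;
  return GESTURE_ACTIONS[top.categoryName] ?? null;
}
```

In practice, `dispatchGesture` would run once per video frame inside a `requestAnimationFrame` loop, with the returned action driving the transaction panel's UI events.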
The project targets most platforms, primarily large screens, and may also run on select mobile devices with a camera module for the input feed. Inference runs entirely on the client side, and frames from the input video feed are discarded as soon as results are returned, supporting GDPR compliance. Once completed, the web app will be uploaded to CodePen and/or deployed on Vercel.