Contributor
Matt Wiese

Face Tracking to Improve Accessibility and Interaction in the Metaverse with Satellite


Mentors
Christian Frisson (SAT), Emmanuel Durand, Fay Askari
Organization
Society for Arts and Technology (SAT)
Technologies
Python, JavaScript, MediaPipe, Mozilla Hubs, Satellite, LivePose
Topics
virtual reality, web, machine learning, artificial intelligence, computer vision, accessibility, metaverse, telepresence, face tracking, alternate reality
The "metaverse" has attracted exceptional interest from both the tech industry and the wider market. However, metaverse software currently on the market, from Second Life to Horizon Worlds, either limits input to traditional devices such as mice and keyboards, or requires expensive gear in the form of VR headsets and specialized controllers. In this project we aim to integrate interaction techniques that rely on human facial and body gestures, similar to those employed in high-fidelity telepresence systems such as Google's Starline, and to apply them to a hybrid presence system running on low-cost hardware, improving both accessibility and affordability. This GSoC project performs much-needed research into face tracking as a viable means of improving the accessibility of 3D virtual spaces like those found in the metaverse. The result benefits all users, whether or not they have specific accessibility needs: improved interaction vectors and empathy channels benefit everyone. Face tracking software built upon open-source technologies will be integrated with Satellite/Mozilla Hubs to provide this crucial feature, creating a practical testbed for future innovation in hybrid telepresence interaction.