Vrgineers and ManoMotion unveil AI hand-tracking algorithm for pilot training


Vrgineers, Inc., in partnership with ManoMotion, has developed an AI hand-tracking algorithm designed to enhance the immersive experience in pilot training scenarios within virtual and mixed-reality environments. This remarkable feature, exclusive to XTAL Mixed Reality headsets, enables real-time hand-tracking functionality that is specifically optimized for cockpit settings.


“As a virtual and mixed reality simulation specialist, we always try to find new innovative ways to increase the immersion of our solutions. Vrgineers was the first company to fully integrate infrared hand-tracking into a virtual reality headset. And now, together with ManoMotion AI recognition using visible light, we are doing the same for the mixed reality,” says Marek Polcak, CEO of Vrgineers, and adds: “By supporting both technologies, we can combine their strengths to deliver the very best existing hand-tracking available in any headset on the market.”

https://youtu.be/kLkxlfnoPDE
The XTAL headset in action. This headset will integrate with the new AI hand-tracking technology being developed.

The key differentiator of this solution is its combination of pass-through RGB cameras with an AI segmentation algorithm. The hand visualization algorithm removes the background from the camera feed so that only the hands are displayed within the virtual environment.

ManoMotion uses the RGB camera stream to visualize the hands in both virtual reality (VR) and mixed reality (MR) simulations. The development was tailored directly to the requirements of Vrgineers’ pilot cockpit tracking.

The overarching goal of the mixed reality cockpit hand tracking project is to create a proof-of-concept, AI-driven hand visualization functionality for pilot cockpit training. By utilizing the real-time RGB camera stream, the system detects and isolates the pilot’s hands, seamlessly overlaying them onto the virtual reality scene.

During development, Vrgineers and ManoMotion gathered training data in the form of video feeds of users interacting with aircraft cockpit instrument panels. At runtime, the real-time video stream is processed by a pre-trained hand-tracking model that separates the hands from the live feed and renders them as a visual overlay.
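As a rough illustration of that segment-and-composite idea, the sketch below shows how a hand mask extracted from a pass-through camera frame can be overlaid onto a rendered scene. It is only a minimal Python/OpenCV sketch: a crude skin-tone threshold stands in for ManoMotion’s pre-trained segmentation model, and the camera index and the synthetic “VR frame” are assumptions for illustration, not part of the actual XTAL pipeline.

```python
# Minimal sketch: segment likely hand pixels from a pass-through camera
# frame and composite them over a rendered scene. A skin-tone threshold
# is used as a stand-in for the real pre-trained hand-segmentation model.

import cv2
import numpy as np


def segment_hands_placeholder(bgr_frame: np.ndarray) -> np.ndarray:
    """Return a binary mask (0/255) of likely hand pixels.

    Stand-in for the pre-trained model: a simple skin-tone threshold in
    YCrCb colour space, cleaned up with morphological open/close.
    """
    ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask


def composite_hands(camera_frame: np.ndarray, vr_frame: np.ndarray) -> np.ndarray:
    """Overlay segmented hand pixels from the camera onto the rendered frame."""
    mask = segment_hands_placeholder(camera_frame)
    mask_3c = cv2.merge([mask, mask, mask]) // 255  # 0/1 per channel
    # Keep hand pixels from the camera, everything else from the render.
    return camera_frame * mask_3c + vr_frame * (1 - mask_3c)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # pass-through camera (index is an assumption)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Placeholder "VR render": a flat synthetic background colour.
        vr_frame = np.zeros_like(frame)
        vr_frame[:] = (80, 40, 20)
        cv2.imshow("hands over rendered scene", composite_hands(frame, vr_frame))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

In the actual product, the binary mask would come from the learned model trained on the cockpit video feeds described above, and the composite would be rendered per eye inside the headset rather than in a desktop window.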

What do you think of this new AI hand-tracking technology from Vrgineers and ManoMotion? Please share your thoughts on any of the social media pages listed below. You can also comment on our MeWe page by joining the MeWe social network. And subscribe to our RUMBLE channel for more trailers and tech videos.
