What is hand tracking?
Hand tracking, or hand gesture recognition, is a set of vision-based techniques for human-computer interaction. Hand tracking lets you use natural hand movements to control, move, hold, and touch objects without bulky controllers.
Sign language, robot control, human–computer interaction (HCI), home automation, medical applications, and virtual reality are just a few of the applications where hand tracking can be used.
Hand tracking systems have adopted many different techniques, including those based on instrumented sensor technology and computer vision.
Hand gestures can be static (a single posture) or dynamic (a sequence of postures). Static gestures require less processing than dynamic ones and are therefore better suited to real-time applications. A recognition system can extract gesture features either from camera images of the hand or from dedicated hardware such as instrumented data gloves.
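As a minimal sketch of the static (posture) case, the snippet below classifies a single frame of 2D fingertip landmarks as a "pinch" or "open" posture from the thumb-to-index distance. The landmark indices, coordinates, and threshold are illustrative assumptions, not taken from any particular tracking SDK.

```python
import math

# Hypothetical landmark indices for fingertips, as a vision-based
# hand tracker might emit (assumed numbering, for illustration only).
THUMB_TIP = 4
INDEX_TIP = 8

def distance(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_static_gesture(landmarks, pinch_threshold=0.05):
    """Classify one frame (a static posture): 'pinch' when the thumb
    and index fingertips are close together, otherwise 'open'."""
    d = distance(landmarks[THUMB_TIP], landmarks[INDEX_TIP])
    return "pinch" if d < pinch_threshold else "open"

# Made-up frames in normalized image coordinates (only the two
# fingertip landmarks matter for this toy classifier).
open_hand = {THUMB_TIP: (0.30, 0.50), INDEX_TIP: (0.60, 0.20)}
pinching  = {THUMB_TIP: (0.42, 0.41), INDEX_TIP: (0.44, 0.40)}

print(classify_static_gesture(open_hand))  # open
print(classify_static_gesture(pinching))   # pinch
```

A dynamic gesture recognizer would apply a step like this per frame and then match the resulting sequence of postures over time, which is where the extra processing cost comes from.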
Why is gesture recognition necessary?
An example of how gestures are used in our daily lives is when flight attendants use gestures to demonstrate the safety instructions to passengers before takeoff, because some passengers, especially the elderly and the hearing-impaired, may not understand the spoken language.
Gestures are also used to communicate between humans and machines. Building natural human-computer interaction requires an accurate hand gesture recognition system as the interface, where the recognized gestures might be used to control a robot, manipulate virtual objects, or convey meaningful information.
Gesture-based interaction has entered our daily lives as a result of the growing popularity of VR headsets and smartglasses, but we have yet to fully realize its potential.
Gestures give the user a new way to interact with the technology that is similar to how they interact in real life. They feel natural and unobtrusive. They also don't confine the user to a single point of input, instead providing a variety of options.
Unlike traditional buttons and menus, gestures do not force users to move their hand to the location of a command, interrupting their activity. Instead, gestures can be performed right at the cursor's current position.
However, gestures bring challenges that are not present in standard input techniques. Because gestures must be learned and remembered, designers need guides that promote their discoverability and memorability while also handling input and recognition errors. Another consideration is the design of the motions themselves, which should be memorable as well as simple and comfortable to perform.
Why hand tracking?
With the rapid advancement of machine vision technology, particularly image processing and recognition, attention is no longer limited to improving traditional human-computer input methods. Instead, using human biological characteristics to build more natural interaction technologies, so that humans and computers can interact directly, has become a key focus for extended reality developers.
Gesture recognition, facial expression recognition, face recognition, lip reading, limb movement tracking, eye gaze tracking, and pose recognition are all part of the current advancement in human-computer interaction technology. Among these possible mediums for human-computer interaction, hand gestures are vivid, intuitive, and information-rich. Because they have expressive power comparable to natural languages such as spoken and written language, they can serve as a natural means of communication between humans and machines, and they play an important role in human-computer interaction. Hand tracking, however, has proven to be a difficult technology due to the complexity, diversity, ambiguity, and uncertainty of hand gestures.
Hand tracking in VR
Hand tracking is currently regarded as a technology substantially engaged in the virtual reality industry. Ultraleap, a major business focused on hand-tracking interfaces, has revealed an $82 million USD Series D investment, with the purpose of growing its hand-tracking and mid-air haptic technology in the XR space and beyond.
Incumbents in this field include:
- Ultraleap Gemini, which uses infrared cameras and LEDs to provide input data in a variety of lighting conditions.
- ManoMotion, whose hand tracking SDK is made for developing AR experiences on both Android and iOS mobile platforms.
- Interhaptics provides the Interaction Builder for hand tracking device developers.
- ClayAir, which provides hand-tracking and gesture recognition solutions for high-end and custom controls in the automotive and enterprise sectors.
- HaptX, with advanced haptic feedback gloves that track the user's hand movement and let them feel the texture and temperature of virtual objects.
- Pebbles Interfaces, an Israeli company working on high-definition hand-tracking technology, which was acquired by Oculus in 2015.
Other companies offering gesture recognition solutions for HCI include uSens, Nod, Crunchfish, Manus VR, Striker VR, and Tactical Haptics, among others.
Hand tracking is considered a promising development direction by VR/AR headset developers:
- Varjo's VR-3 and XR-3 headsets come with a built-in Ultraleap Stereo IR 170 hand tracking camera.
- Oculus Quest 2 and Quest also include a hand tracking feature, which uses the headsets' inside-out cameras to let users use their hands instead of Touch controllers.
Elixir, The Line, Richie's Plank Experience, Vacation Simulator, Hand Physics Lab, Waltz of the Wizard, Cubism and Unplugged are among the best Oculus Quest hand tracking games.
Virtual reality adult games, a sensitive but in-demand and rarely mentioned market, are also being explored for deployment of the hand tracking feature. Captain Hardcore is one of the most popular VR porn games being developed with support for VR hand tracking.
Author: Nam Pham