Hand Tracking & Wrist-Based Haptics

Hi Everyone! My name’s Prachi Mandil. I’m a Computer Science and Engineering student at the Indian Institute of Technology (IIT) Kharagpur, India. I interned at exiii as a software developer, and my project was to incorporate full hand tracking into EXOS Wrist.


EXOS Wrist is designed to be used with position-tracked controllers such as Oculus Touch and the Vive Controller, so users cannot move and feel each finger separately in VR. We hypothesized that combining haptic feedback with hand tracking would make virtual object interaction much more natural and closer to the physical world, so I built several prototypes to experiment with.




Selection of Hand Tracking Tech

There are various hand tracking devices on the market, which can be broadly classified into two types: vision-based and sensor-based. Among vision-based devices, I tried Leap Motion; among sensor-based devices, I tried the Noitom Hi5 VR Glove.

Leap Motion

The main challenge in using any vision-based hand tracking with EXOS Wrist is that the device blocks a large part of the dorsal side of the hand from the camera’s view, making it difficult for optical devices like Leap Motion to detect the hand. To overcome this problem, I placed the Leap Motion device on the desk instead of mounting it on the VR headset, and attached a Vive Tracker to track its position in the VR scene. This approach gave better hand tracking results, although the Leap Motion’s field of view limited the play area to the desktop.
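In case it helps to picture the setup: converting a Leap Motion hand position into the VR scene is just a change of coordinate frames using the Vive Tracker’s pose. The snippet below is a minimal sketch of that math; the function and variable names are my own, the actual prototype was built in a game engine, and the fixed offset between the tracker and the sensor is assumed to be measured once at calibration time.

```python
import numpy as np

def leap_to_world(hand_pos_leap, tracker_pos, tracker_rot, leap_offset):
    """Transform a hand position from the Leap Motion's local frame
    into VR world space using the Vive Tracker's pose.

    tracker_rot is a 3x3 world-from-tracker rotation matrix;
    leap_offset is the fixed translation from the tracker origin to
    the Leap Motion origin, expressed in the tracker's local frame
    (a hypothetical calibration value, measured once at setup).
    """
    # Where the Leap Motion device sits in world space
    leap_origin_world = tracker_pos + tracker_rot @ leap_offset
    # Rotate the hand position out of the sensor frame, then translate
    return leap_origin_world + tracker_rot @ hand_pos_leap
```

Because the tracker pose is re-read every frame, the desk-mounted sensor can even be nudged mid-session and the virtual hands stay registered to the scene.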


Noitom Hi5

To expand the user experience to room scale, I also tried the Noitom Hi5 VR Glove.

The picture shows the Noitom Hi5 together with a 3D-printed, hacked version of EXOS Wrist. With this setup, we successfully expanded the play area and achieved stable hand tracking.


As you can imagine from the picture, however, this approach took much more time to put on and set up all the devices compared to Leap Motion. I guess this is the trade-off we have to live with today…hoping hand tracking technology will advance and solve this in the near future :)



Touching an Object and Pushing a Button

We started with the simplest action – touching an object. Even though EXOS Wrist cannot give haptic feedback directly to your fingers, applying a proper counter-force to your wrist, synced with the moment a finger makes contact, simulated a very natural feeling. You can probably see in the animation below (although it’s very subtle in the video) that EXOS Wrist gives reactive feedback every time a finger touches a virtual object.
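To make the idea concrete, here is a minimal sketch of how such a counter-force could be computed. Everything here is illustrative – I’m assuming a simple spring model against a spherical object, which is not necessarily the exact force model used in the prototype:

```python
import numpy as np

def wrist_counter_force(finger_tip, obj_center, obj_radius, stiffness=40.0):
    """Counter-force to render at the wrist when a fingertip penetrates
    a spherical virtual object (hypothetical spring model).

    The force points from the object toward the fingertip and scales
    with penetration depth, so the wrist feels resistance the moment
    the finger crosses the surface.
    """
    offset = finger_tip - obj_center
    dist = np.linalg.norm(offset)
    depth = obj_radius - dist
    if depth <= 0.0:             # no contact -> no feedback
        return np.zeros(3)
    normal = offset / dist       # contact normal (object -> fingertip)
    return stiffness * depth * normal
```

The force becoming non-zero exactly at surface contact is what keeps the wrist feedback in sync with the finger-touch timing.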

One of the most requested use cases for EXOS is assembly simulation in VR. For example, car manufacturers want to test whether a new car design not only looks awesome but is also easy to assemble. As a proof of concept, we placed a button in a hard-to-reach position surrounded by obstacles and tried to push it. We wanted to see whether the ability to change the hand’s shape helps in assembly simulation.

Sensor-based hand tracking, combined with Lighthouse position tracking, can precisely track our hands even when they are outside the field of view. Together with proper haptic feedback, this let us tell whether we were touching an object or not. It will eventually allow users to take the right actions even in situations where their hands are not visible.


For interaction with virtual buttons, we previously wrote a blog post about a similar experiment with EXOS Gripper. This time, we applied a similar haptic feedback pattern to Wrist + hand tracking, and it worked really well.




Grabbing, Holding, Releasing an Object

In addition to simple object touch, I implemented a few other object interactions, such as grabbing and holding an object. With conventional hand tracking devices, it is difficult for users to tell the grabbed and released states apart in VR. We wanted to improve this experience by adding haptic feedback. Here is what I implemented after multiple rounds of trial and error:


  1. Simple touch feedback when a user is touching an object
  2. Vertically downward gravity feedback when the object is grabbed and lifted
  3. Inertial feedback when the object is being moved around
  4. Removal of all the feedback when the object is released
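
The four stages above map naturally onto a small state machine. The sketch below is purely illustrative – the state names, the simple F = m·a inertia model, and the function signature are my assumptions, not the exact production implementation:

```python
import numpy as np

GRAVITY = np.array([0.0, -9.8, 0.0])  # m/s^2

def grab_feedback(state, obj_mass, obj_accel, touch_force):
    """Combine the four feedback stages into one wrist force
    (illustrative sketch, not the production code)."""
    if state == "released":
        return np.zeros(3)           # 4. all feedback removed on release
    if state == "touching":
        return touch_force           # 1. simple touch feedback
    if state == "grabbed":
        weight = obj_mass * GRAVITY  # 2. downward gravity feedback
        inertia = -obj_mass * obj_accel  # 3. inertial feedback while moving
        return weight + inertia
    raise ValueError(f"unknown state: {state}")
```

Holding a still object then renders only its weight, while shaking it adds a force opposing the motion – which is roughly what makes the grabbed state feel distinct from the released one.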


This flow dramatically improved the hand tracking user experience and made it much closer to real life. In the future, we would like to measure how much haptic feedback improves the efficiency of this kind of task, which could be applied to VR simulations of assembly lines in factories, zero-gravity experiments in spaceships, and so on.





In this blog post, I introduced how I implemented various object interactions – touching, grabbing, and holding an object – with haptic feedback in VR. Hand tracking and haptic feedback devices are still in the early stages of development, and they are a bit bulky and difficult to use. But we are pretty sure this area of technology (including exiii!) will improve at a very high pace. While my internship has concluded, exiii will continue to research next-generation human-computer interaction in VR.