Virtual UI and Haptics

Hi exiii community, it’s Naoya here. I’m a software engineer at exiii.

 

Along with hardware development, exiii continuously explores use cases of haptics in the virtual world. In this blog post, I would like to introduce our experimental dev sprint on virtual UI control with our haptic wearable device, EXOS Gripper.

 

While EXOS Gripper is a device designed to "grab an object," this time I used it as a touch tool for virtual UI.

 

Touchable Virtual UI

 

When it comes to UI control in 3D space, we have seen various methods in the past, including 6DOF hand controllers, eye tracking, and so on. However, these new input methods can easily confuse users, and they are therefore hard to design well. By using EXOS Gripper, we can potentially make interaction far more intuitive by importing real-life UI control into the VR space, where users can simply reach out, touch, and get feedback, just as they always do.

 

There are many kinds of user interfaces that have been in use for a long time (both real and virtual), including buttons, sliders, knobs, and so on. In this blog post, I will focus on button control, which is probably the most frequently used input method.

 

Buttons are everywhere: on your keyboard, mouse, remote, and so on. They all have different shapes, colors, and sizes, but they have one thing in common: "click feeling." It is one of the most important elements in button design, because it lets users know whether and when they have sent a command. Button clicks also play a big role in creating a pleasant UX for a product. When using virtual buttons in VR/AR, properly simulating click feelings can dramatically improve UI control.

 

 

Simulating Button Clicks

 

To start with, I simply tried simulating the behavior of real-world physical buttons in VR. When you push a physical button in, the counter-force grows, but once you reach a certain point, the counter-force suddenly drops and then climbs back up shortly after, creating a "counter-force valley." This valley is the source of a button's click feeling.

 

Graph 1. As you push a button in, you hit the "counter-force valley," and point B gives you the click feeling. (http://www.sitech-corp.com/blog/types-tactile-silicone-keypads/)
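To make the curve in Graph 1 concrete, here is a minimal sketch of a piecewise-linear force-displacement model with a counter-force valley. All breakpoints and force values are hypothetical, chosen only to illustrate the shape; they are not measurements from any real button.

```python
def physical_button_force(stroke):
    """Counter-force (N) felt at a given push-in stroke (mm).

    Hypothetical piecewise-linear model of Graph 1: force rises,
    drops into the "counter-force valley," then climbs back up.
    """
    PEAK_STROKE = 1.0    # stroke where force peaks, just before the valley
    VALLEY_STROKE = 1.5  # point B: bottom of the valley -> click feeling
    END_STROKE = 2.5     # full travel
    PEAK_FORCE = 2.0
    VALLEY_FORCE = 0.8
    END_FORCE = 2.5

    if stroke <= PEAK_STROKE:
        # Force rises steadily as you push in.
        return PEAK_FORCE * stroke / PEAK_STROKE
    if stroke <= VALLEY_STROKE:
        # Sudden drop: the "counter-force valley."
        t = (stroke - PEAK_STROKE) / (VALLEY_STROKE - PEAK_STROKE)
        return PEAK_FORCE + (VALLEY_FORCE - PEAK_FORCE) * t
    # Force climbs back up toward the end of travel.
    t = (stroke - VALLEY_STROKE) / (END_STROKE - VALLEY_STROKE)
    return min(VALLEY_FORCE + (END_FORCE - VALLEY_FORCE) * t, END_FORCE)
```

Sampling this function along the stroke reproduces the rise, dip, and recovery of the graph.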

 

I applied the same mechanism to EXOS Gripper, applying counter-force as you push in and creating the valley at a certain point. However, as you can probably imagine, it was not that simple. EXOS Gripper became really unstable and kept oscillating in actual tests. Changing the counter-force based on how far you push in (the stroke) turned out to be a poor approach given the hardware characteristics of EXOS Gripper.

 

How much you push in = Stroke

 

Next, as an alternative approach, I applied a constant counter-force increase along the push stroke and created an instantaneous pulse at a certain point, where you feel a sudden change of counter-force. I also made this pulse occur only when pushing the button in (the push stroke), not when releasing it back to its original position (the return stroke), in order to keep EXOS Gripper as stable as possible.

 

Graph 2. A pulse occurs when the stroke reaches point A. It occurs only on the push stroke, not on the return stroke.
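The push-only trigger logic can be sketched as a small state machine: a constant force ramp plus a one-shot pulse that fires only while the stroke is increasing past point A, re-arming once the button returns. The class name, thresholds, and stiffness value here are hypothetical illustrations, not the actual EXOS implementation.

```python
class PulseButton:
    """Hypothetical sketch of the pulse-on-push-stroke-only mechanism."""

    def __init__(self, pulse_stroke=1.2, stiffness=1.5):
        self.pulse_stroke = pulse_stroke  # point A on Graph 2 (mm)
        self.stiffness = stiffness        # constant force increase per mm
        self._prev_stroke = 0.0
        self._pulse_fired = False

    def update(self, stroke):
        """Return (base_force, fire_pulse) for the current stroke sample."""
        pushing_in = stroke > self._prev_stroke
        fire = False
        if pushing_in and not self._pulse_fired and stroke >= self.pulse_stroke:
            fire = True               # pulse only on the push stroke
            self._pulse_fired = True  # one-shot: do not re-fire while held
        if stroke < self.pulse_stroke * 0.5:
            self._pulse_fired = False  # re-arm after the return stroke
        self._prev_stroke = stroke
        return self.stiffness * stroke, fire
```

Driving `update()` with a push-release-push stroke sequence fires the pulse exactly once per push and never on the way back out, which is the stability property described above.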

 

With this approach, I could effectively reduce the Gripper's unstable behavior. But since this mechanism differs greatly from a real-world physical button, I needed to design the pulse carefully to create a pleasant click feeling.

After much trial and error, I found that shaping the waveform as 1) a sharp drop followed by 2) a sharp rise was the best way to replicate a button's click feeling. I assume that feedback from this waveform creates a sensation similar to the touch force of a physical button (point B in Graph 1).

Graph 3. The waveform of the pulse is designed with 1) a sharp drop and 2) a sharp rise of counter-force. This pulse replicates the "touch force" of physical buttons.
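As a rough sketch of the Graph 3 shape, the pulse can be generated as a short burst of force samples that dip sharply below the base force and rise sharply back, assuming the force command is updated at a fixed rate. The drop depth and sample count are hypothetical placeholders for values that would be tuned by feel.

```python
def click_pulse(base_force, drop_depth=0.8, samples=6):
    """Return force samples for the click pulse: sharp drop, then sharp rise.

    Hypothetical sketch of Graph 3; drop_depth and samples stand in for
    hand-tuned values.
    """
    half = samples // 2
    waveform = []
    for i in range(half):
        # 1) sharp drop of counter-force below the base level
        t = (i + 1) / half
        waveform.append(base_force - drop_depth * t)
    for i in range(half):
        # 2) sharp rise back up to the base force
        t = (i + 1) / half
        waveform.append(base_force - drop_depth * (1 - t))
    return waveform
```

Playing this short waveform on top of the constant force ramp at point A would momentarily release and restore the counter-force, mimicking the valley of a physical click.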

 

The deliverables from this dev sprint are now used in every EXOS demo and have effectively improved virtual UI control.

 

 

In this blog post, I introduced an example of touchable UI. Interfaces we are familiar with in the real world cannot simply be transferred to VR; we need to tweak and customize the mechanism depending on various factors (in this case, the hardware characteristics). While I focused on button control this time, exiii will continue experimenting with other input methods as well, such as the sliders and knobs we are all familiar with, or perhaps totally new methods that are only possible in a VR environment.