Natural User Interface for 3D Voxel-based Modeling
Abstract
In this study, we propose an interaction method for 3D voxel-based modeling in immersive virtual environments. We assume that all tasks must be performed through NUI (Natural User Interface) interactions rather than conventional input devices such as a keyboard or mouse. To evaluate the usefulness of the proposed interaction methods, we created an immersive virtual reality application in which users can interact with three-dimensional objects using only their hands and arms. The modeling process requires interactions for two types of tasks: menu selection and control tasks, and voxel manipulation tasks. We defined motion interactions appropriate for each task so that all actions are possible using only the hands and arms. To confirm how effective the proposed methods are, we produced 3D objects of various shapes. Based on the results, we examined the feasibility of the NUI-based interactions proposed in this study.
Introduction
With the advent of the Oculus Rift in 2012 and the subsequent introduction of various virtual reality content built on it, HMD-based immersive virtual reality has attracted great interest. On the other hand, the development of input devices for interacting with virtual reality has lagged behind the development of HMDs. Early content relied on conventional devices such as keyboards, mice, and game controllers for interaction between users and the virtual environment. However, these input devices act as obstacles to the user's immersion in virtual reality. For this reason, devices that allow more natural interaction have recently been announced, and in the near future, virtual reality that does not require cumbersome devices is expected to become widespread. If users are free to interact with both arms, they will feel more immersed, and task performance in virtual environments will be much higher.
The GUI (Graphical User Interface) we use today, now taken for granted, evolved from the earlier CLI (Command Line Interface) into the interactive WIMP (Windows, Icons, Menus, Pointer) environment. Moreover, in recent years, the transition to the NUI (Natural User Interface) environment has been progressing gradually with the development of various virtual reality technologies. We started this study to prepare for the NUI-based computing environment that will unfold in the near future.
The structure of this paper is as follows. In Chapter 2, we describe the HMD as an immersive VR device and the NUI-based interactions that are most suitable for a virtual environment. In Chapter 3, we introduce research related to this study. In Chapter 4, we describe the hardware configuration and software functions of our 3D voxel-based modeling tool. Chapter 5 explains an example modeling process, and Chapter 6 concludes.
Conclusion
In this study, we investigated NUI-based interaction. To do this, we developed a 3D voxel-based modeling tool for immersive virtual environments using HMDs, and modeled various 3D objects with the developed tool.
As a result of the modeling sessions, most of the operations could be performed without difficulty in the virtual space. A typical example is the pinch operation, which was used for calling menus and for moving and rotating voxels. The double-tap action for creating a voxel is not common in everyday life, but since it is a frequently issued command, the simplicity of the operation made it rather comfortable. The touch operation used to select menu items could be performed naturally, since pressing a smartphone screen is a frequent action in daily life. The pinch-and-drag action used to move a fixed menu could also be performed naturally.
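The gesture-to-operation mapping described above can be summarized as a simple dispatch table. The following is a minimal sketch of such a mapping; all names (`Gesture`, `VoxelEditor`, `handle`) are illustrative assumptions, not the tool's actual API.

```python
# Hypothetical sketch of dispatching recognized hand gestures to modeling
# commands: pinch (menu call), double-tap (create voxel), touch (select menu),
# pinch-and-drag (move a fixed menu). Names are illustrative only.

from enum import Enum, auto

class Gesture(Enum):
    PINCH = auto()       # menu call / voxel move and rotate
    DOUBLE_TAP = auto()  # create a voxel
    TOUCH = auto()       # select a menu item
    PINCH_DRAG = auto()  # move a fixed menu

class VoxelEditor:
    def __init__(self):
        self.voxels = set()   # occupied grid cells, e.g. (x, y, z) tuples
        self.menu_open = False

    def handle(self, gesture, cell=None):
        """Map a recognized gesture to a modeling action."""
        if gesture is Gesture.DOUBLE_TAP and cell is not None:
            self.voxels.add(cell)            # create a voxel at the grid cell
            return f"created voxel at {cell}"
        if gesture is Gesture.PINCH:
            self.menu_open = not self.menu_open
            return "menu toggled"
        if gesture is Gesture.TOUCH:
            return "menu item selected" if self.menu_open else "no menu shown"
        if gesture is Gesture.PINCH_DRAG:
            return "menu repositioned"
        return "ignored"

editor = VoxelEditor()
print(editor.handle(Gesture.DOUBLE_TAP, cell=(1, 2, 3)))  # created voxel at (1, 2, 3)
print(editor.handle(Gesture.PINCH))                       # menu toggled
```

A table-driven design like this keeps each gesture bound to exactly one task group (menu control vs. voxel manipulation), which mirrors the two task types identified in the paper.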
On the other hand, some problems were pointed out as disadvantages. When designing a complex object, the same operation had to be repeated many times. This shows that with NUI-based interaction, natural behavior is not always an advantage, however natural it feels: interaction behaviors should be natural but not physically fatiguing. Some users also commented that the touch operation felt incongruous, because there is no physical feedback of the kind a smartphone or touch screen provides. In recent years, devices with haptic capabilities that can provide such feedback have begun to be released, so this problem is expected to be solved in the near future.