A Novel Pointing Method Based on Bare-Hand Interactions Using the Palmar Surface in XR

Authors

  • Su Young Kim, BioComputing Lab, Department of Computer Science and Engineering, Korea University of Technology and Education (KOREATECH), Cheonan 31253, Republic of Korea
  • Yu Jin Lee, BioComputing Lab, Department of Computer Science and Engineering, Korea University of Technology and Education (KOREATECH), Cheonan 31253, Republic of Korea
  • Yoon Sang Kim, BioComputing Lab, Institute for Bioengineering Application Technology, Department of Computer Science and Engineering, KOREATECH, Cheonan 31253, Republic of Korea

Keywords

augmented reality, bare-hand interaction, extended reality, pointing

Abstract

Much of today's content (photos, videos, etc.) is presented through windows, the basic unit of the 2D graphical user interface. In extended reality (XR), it is common to render such content on a 3D plane (hereafter, a 3D window). XR, an emerging interactive environment, has been developed to support bare-hand input, and studies on interacting with 3D windows in this environment have primarily focused on pointing methods. However, conventional methods can cause issues such as fatigue and discomfort owing to their use of mid-air gestures. To address these problems, this paper proposes a novel pointing method. The proposed method augments a virtual pad on the palmar surface of one hand and uses the other hand to operate the pad for pointing. The pointing performance of the proposed method was evaluated against three representative conventional pointing methods: Gaze&Gesture (GG), Handray&Gesture (HG), and virtual pad (VP). The quantitative analysis showed that the proposed method performed better than VP but worse than GG and HG. However, qualitative analysis confirmed that the proposed method achieved accurate pointing, fast perceived operation speed, and low fatigue. A survey on social acceptance further showed that the proposed method was the most preferred in public environments.
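The abstract describes the technique only at a high level. The sketch below illustrates the underlying idea of indirect pointing, in which a touch point on a small pad anchored to the palm is mapped to a cursor on a larger 3D window, together with Fitts' index of difficulty, the standard measure in pointing evaluations (Fitts, 1954, cited in the references). This is a minimal illustration under assumed pad and window dimensions and coordinate frames, not the authors' implementation; the function names pad_to_window and fitts_id are hypothetical.

```python
import numpy as np

def pad_to_window(touch_uv, pad_size, window_size):
    """Map a touch point on the palm-anchored virtual pad to a cursor
    position on the 3D window (both in local 2D coordinates, metres)."""
    # Normalise the touch location to [0, 1] within the pad...
    u = np.clip(touch_uv[0] / pad_size[0], 0.0, 1.0)
    v = np.clip(touch_uv[1] / pad_size[1], 0.0, 1.0)
    # ...then scale to the window, so the small pad on the palm
    # indirectly drives a cursor across the full window surface.
    return np.array([u * window_size[0], v * window_size[1]])

def fitts_id(distance, width):
    """Fitts' index of difficulty (Shannon formulation), commonly used
    to quantify the difficulty of a pointing task in bits."""
    return np.log2(distance / width + 1.0)

# Example: an 8 cm x 8 cm palm pad drives a 60 cm x 40 cm 3D window.
cursor = pad_to_window((0.04, 0.02), pad_size=(0.08, 0.08),
                       window_size=(0.60, 0.40))
print(cursor)               # [0.3 0.1]
print(fitts_id(0.3, 0.05))  # ~2.81 bits for a 30 cm move to a 5 cm target
```

In practice, touch_uv would come from projecting the tracked fingertip of the operating hand onto the pad's plane, which the hand-tracking runtime anchors to the palmar surface of the other hand.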

References

Smith, D. C., Irby, C., Kimball, R., Verplank, B., & Harslem, E. (1982). Designing the Star user interface. Byte, 7, 242-282. [Online] Available: https://www.researchgate.net/publication/234781794_Designing_the_Star_user_interface_1982

Ens, B., Hincapié-Ramos, J. D., & Irani, P. (2014). Ethereal planes: a design framework for 2D information space in 3D mixed reality environments. Proceedings of the 2nd ACM symposium on Spatial user interaction, 2-12. DOI: 10.1145/2659766.2659769

Lin, J. L., Zheng, M. C., & Zhong, C. R. (2022). Research on the interactive behaviour of pointing devices: Is the Magic Mouse easy to use? 2022 IEEE International Conference on Consumer Electronics-Taiwan, 443-444. DOI: 10.1109/ICCE-Taiwan55306.2022.9869289

Milgram, P., Takemura, H., Utsumi, A., & Kishino, F. (1995). Augmented reality: A class of displays on the reality-virtuality continuum. Telemanipulator and telepresence technologies, 2351, 282-292. DOI: 10.1117/12.197321

Milgram, P., & Colquhoun, H. (1999). A taxonomy of real and virtual world display integration. Mixed reality: Merging real and virtual worlds, 1-26. DOI: 10.1007/978-3-642-87512-0_1

Chuah, S. H. W. (2019). Wearable XR-technology: literature review, conceptual framework and future research directions. International Journal of Technology Marketing, 13(3-4), 205-259. DOI: 10.1504/IJTMKT.2019.104586

Lee, J. H., An, S. G., Kim, Y., & Bae, S. H. (2018). Projective windows: bringing windows in space to the fingertip. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1-8. DOI: 10.1145/3173574.3173792

Feiner, S., MacIntyre, B., Haupt, M., & Solomon, E. (1993). Windows on the world: 2D windows for 3D augmented reality. Proceedings of the 6th annual ACM symposium on User interface software and technology, 145-155. DOI: 10.1145/168642.168657

Kern, F., Niebling, F., & Latoschik, M. E. (2023). Text Input for Non-Stationary XR Workspaces: Investigating Tap and Word-Gesture Keyboards in Virtual and Augmented Reality. IEEE Transactions on Visualization and Computer Graphics, 29(5), 2658-2669. DOI: 10.1109/TVCG.2023.3247098

Lee, T. H., & Lee, H. J. (2018). A new virtual keyboard with finger gesture recognition for AR/VR devices. Human-Computer Interaction. Interaction Technologies: 20th International Conference, 56-67. DOI: 10.1007/978-3-319-91250-9_5

Brasier, E., Chapuis, O., Ferey, N., Vezien, J., & Appert, C. (2020). ARPads: Mid-air indirect input for augmented reality. 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 332-343. DOI: 10.1109/ISMAR50242.2020.00060

Hincapié-Ramos, J. D., Guo, X., Moghadasian, P., & Irani, P. (2014). Consumed endurance: A metric to quantify arm fatigue of mid-air interactions. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1063-1072. DOI: 10.1145/2556288.2557130

Zhang, Y., Kienzle, W., Ma, Y., Ng, S. S., Benko, H., & Harrison, C. (2019). ActiTouch: Robust touch detection for on-skin AR/VR interfaces. Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, 1151-1159. DOI: 10.1145/3332165.3347869

Lindeman, R. W., Sibert, J. L., & Hahn, J. K. (1999). Towards usable VR: an empirical study of user interfaces for immersive virtual environments. Proceedings of the SIGCHI conference on Human factors in computing systems, 64-71. DOI: 10.1145/302979.302995

Mine, M. R. (1995). Virtual environment interaction techniques. Technical report, Department of Computer Science, University of North Carolina at Chapel Hill. [Online] Available: https://www.researchgate.net/publication/2812583_Virtual_Environment_Interaction_Techniques

Lystbæk, M. N., Rosenberg, P., Pfeuffer, K., Grønbæk, J. E., & Gellersen, H. (2022). Gaze-hand alignment: Combining eye gaze and mid-air pointing for interacting with menus in augmented reality. Proceedings of the ACM on Human-Computer Interaction, 6(ETRA), 1-18. DOI: 10.1145/3530886

Wagner, U., Lystbæk, M. N., Manakhov, P., Grønbæk, J. E., Pfeuffer, K., & Gellersen, H. (2023). A Fitts’ Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3544548.3581423

Poupyrev, I., Billinghurst, M., Weghorst, S., & Ichikawa, T. (1996). The go-go interaction technique: non-linear mapping for direct manipulation in VR. Proceedings of the 9th annual ACM symposium on User interface software and technology, 79-80. DOI: 10.1145/237091.237102

Pierce, J. S., Forsberg, A. S., Conway, M. J., Hong, S., Zeleznik, R. C., & Mine, M. R. (1997). Image plane interaction techniques in 3D immersive environments. Proceedings of the 1997 symposium on Interactive 3D graphics, 39-43. DOI: 10.1145/253284.253303

Akamatsu, M., & MacKenzie, I. S. (2002). Changes in applied force to a touchpad during pointing tasks. International Journal of Industrial Ergonomics, 29(3), 171-182. DOI: 10.1016/S0169-8141(01)00063-4

Dudley, J. J., Benko, H., Wigdor, D., & Kristensson, P. O. (2019). Performance envelopes of virtual keyboard text input strategies in virtual reality. 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 289-300. DOI: 10.1109/ISMAR.2019.00027

Zhang, Z. (2003). Vision-based Interaction with Fingers and Papers. Proceedings International Symposium on the CREST Digital Archiving Project, 83-106. [Online] Available: https://www.microsoft.com/en-us/research/publication/vision-based-interaction-fingers-papers/

Malik, S., & Laszlo, J. (2004). Visual touchpad: a two-handed gestural input device. Proceedings of the 6th international conference on Multimodal interfaces, 289-296. DOI: 10.1145/1027933.1027980

Edwin, G., & Supriana, I. (2011). Hand detection for virtual touchpad. Proceedings of the 2011 International Conference on Electrical Engineering and Informatics, 1-5. DOI: 10.1109/ICEEI.2011.6021588

Harrison, C., Benko, H., & Wilson, A. D. (2011). OmniTouch: wearable multitouch interaction everywhere. Proceedings of the 24th annual ACM symposium on User interface software and technology, 441-450. DOI: 10.1145/2047196.2047255

Hincapié-Ramos, J. D., Ozacar, K., Irani, P. P., & Kitamura, Y. (2015). GyroWand: IMU-based raycasting for augmented reality head-mounted displays. Proceedings of the 3rd ACM Symposium on Spatial User Interaction, 89-98. DOI: 10.1145/2788940.2788947

Liao, Y. C., Chen, Y. C., Chang, L., & Chen, B. Y. (2017). Dwell+: Multi-level mode selection using vibrotactile cues. Proceedings of the 30th annual ACM symposium on User interface software and technology, 5-16. DOI: 10.1145/3126594.3126627

Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47(6), 381-391. DOI: 10.1037/h0055392

Ahlström, D., Hasan, K., & Irani, P. (2014). Are you comfortable doing that? Acceptance studies of around-device gestures in and for public settings. Proceedings of the 16th international conference on Human-computer interaction with mobile devices & services, 193-202. DOI: 10.1145/2628363.2628381

Published

02.02.2024

How to Cite

Kim, S. Y., Lee, Y. J., & Kim, Y. S. (2024). A Novel Pointing Method Based on Bare-Hand Interactions Using the Palmar Surface in XR. International Journal of Intelligent Systems and Applications in Engineering, 12(14s), 98–107. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/4641

Section

Research Article