Head-Controlled Menu in Mixed Reality with a HMD - Human-Computer Interaction - INTERACT 2019 - Part IV
Conference paper, 2019

Abstract

We present a design space and three new techniques for head-based interaction with menus in Mixed Reality (MR) with a Head-Mounted Display (HMD). Common input modalities such as hand gestures and voice commands are not suitable in noisy MR contexts where users have both hands occupied, as in augmented surgery and machine maintenance. To address the two issues of noisy MR contexts and hands-free interaction, we systematically explore the design space of head-controlled menu interaction along two design factors: (1) head-controlled menu versus head-controlled cursor; (2) virtual targets versus mixed targets anchored on physical objects. Based on this design space, we present three novel menu techniques that we compared with a baseline head-controlled cursor technique. Experimental results suggest that head-controlled menu and head-controlled cursor techniques offer similar performance. The study also found that mixed targets do not affect ultimate user performance once users are sufficiently trained, but they do improve the learning phase: with virtual targets, users continued to progress after the training phase, reducing their mean selection time by 0.84 s, whereas with mixed targets the further improvement was limited to 0.3 s.

Dates and versions

hal-02877647, version 1 (22-06-2020)

Cite

François Leitner, Laurence Nigay, Charles Bailly. Head-Controlled Menu in Mixed Reality with a HMD. 17th IFIP Conference on Human-Computer Interaction (INTERACT), Sep 2019, Paphos, Cyprus. pp.395-415, ⟨10.1007/978-3-030-29390-1_22⟩. ⟨hal-02877647⟩