HoloBar: Rapid Command Execution for Head-Worn AR Exploiting Around the Field-of-View Interaction


Inefficient menu interfaces make system and application commands tedious to execute in immersive environments. HoloBar is a novel approach to ease interaction with multi-level menus in immersive environments: with HoloBar, the hierarchical menu is split between the field of view (FoV) of the Head-Mounted Display and the smartphone (SP). Command execution is based on around-the-FoV interaction with the SP and touch input on the SP display. HoloBar offers a unique combination of features, namely rapid mid-air activation, implicit selection of top-level items and a preview of second-level items on the SP, ensuring rapid access to commands. In a first study we validate its activation method, which consists in bringing the SP within an activation distance of the FoV. In a second study, we compare HoloBar to two alternatives, including the standard HoloLens menu. Results show that HoloBar shortens each step of a multi-level menu interaction (menu activation, top-level item selection, second-level item selection and validation), with a high success rate. A follow-up study confirms that these results hold when compared with the two validation mechanisms of the HoloLens (Air-Tap and clicker).
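The activation method described above (bringing the smartphone within an activation distance of the FoV) can be illustrated with a minimal geometric sketch. This is purely hypothetical code, not from the paper: the function names, the angular activation margin, and the half-FoV value are all illustrative assumptions.

```python
import math

HALF_FOV_DEG = 17.0          # assumed half of the headset's horizontal FoV
ACTIVATION_MARGIN_DEG = 10.0  # assumed angular activation distance around the FoV edge

def holobar_active(phone_pos, head_pos, gaze_dir):
    """Return True when the phone is close enough to the FoV edge to
    activate the HoloBar (illustrative geometry only)."""
    # Normalised vector from the head to the phone.
    v = [p - h for p, h in zip(phone_pos, head_pos)]
    norm = math.sqrt(sum(c * c for c in v))
    v = [c / norm for c in v]
    # Angle between the gaze direction and the head-to-phone vector.
    dot = sum(a * b for a, b in zip(v, gaze_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    # Active when the phone sits inside the FoV or just outside its edge.
    return angle <= HALF_FOV_DEG + ACTIVATION_MARGIN_DEG
```

Under these assumptions, a phone held slightly to the side of the visible FoV would trigger activation, while one held far to the side would not.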

Houssem Saidi
IRIT - Elipse, Toulouse, France
Emmanuel Dubois
IRIT - Elipse, Toulouse, France
Marcos Serrano
IRIT - Elipse, Toulouse, France





Conference: CHI 2021

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2021.acm.org/)

Session: Input / Spatial Interaction / Practice Support

[A] Paper Room 10, 2021-05-11 17:00:00~2021-05-11 19:00:00 / [B] Paper Room 10, 2021-05-12 01:00:00~2021-05-12 03:00:00 / [C] Paper Room 10, 2021-05-12 09:00:00~2021-05-12 11:00:00
Paper Room 10: 13 presentations