Immersion and Interaction in Visualization

Conference name
CHI 2022
AvatAR: An Immersive Analysis Environment for Human Motion Data Combining Interactive 3D Avatars and Trajectories
Abstract

Analysis of human motion data can reveal valuable insights about the utilization of space and interaction of humans with their environment. To support this, we present AvatAR, an immersive analysis environment for the in-situ visualization of human motion data that combines 3D trajectories, virtual avatars of people’s movement, and a detailed representation of their posture. Additionally, we describe how to embed visualizations directly into the environment, showing what a person looked at or what surfaces they touched, and how the avatar’s body parts can be used to access and manipulate those visualizations. AvatAR combines an AR HMD with a tablet to provide both mid-air and touch interaction for system control, as well as an additional overview to help users navigate the environment. We implemented a prototype and present several scenarios to show that AvatAR can enhance the analysis of human motion data by making data not only explorable, but experienceable.

Authors
Patrick Reipschläger
Autodesk Research, Toronto, Ontario, Canada
Frederik Brudy
Autodesk Research, Toronto, Ontario, Canada
Raimund Dachselt
Technische Universität Dresden, Dresden, Germany
Justin Matejka
Autodesk Research, Toronto, Ontario, Canada
George Fitzmaurice
Autodesk Research, Toronto, Ontario, Canada
Fraser Anderson
Autodesk Research, Toronto, Ontario, Canada
Paper URL

https://dl.acm.org/doi/abs/10.1145/3491102.3517676

Video
ReLive: Bridging In-Situ and Ex-Situ Visual Analytics for Analyzing Mixed Reality User Studies
Abstract

The nascent field of mixed reality is seeing an ever-increasing need for user studies and field evaluation, which are particularly challenging given device heterogeneity, diversity of use, and mobile deployment. Immersive analytics tools have recently emerged to support such analysis in situ, yet the complexity of the data also warrants an ex-situ analysis using more traditional non-immersive visual analytics setups. To bridge the gap between both approaches, we introduce ReLive: a mixed-immersion visual analytics framework for exploring and analyzing mixed reality user studies. ReLive combines an in-situ virtual reality view with a complementary ex-situ desktop view. While the virtual reality view allows users to relive interactive spatial recordings replicating the original study, the synchronized desktop view provides a familiar interface for analyzing aggregated data. We validated our concepts in a two-step evaluation consisting of a design walkthrough and an empirical expert user study.

Authors
Sebastian Hubenschmid
University of Konstanz, Konstanz, Germany
Jonathan Wieland
University of Konstanz, Konstanz, Germany
Daniel Immanuel Fink
University of Konstanz, Konstanz, Germany
Andrea Batch
University of Maryland, College Park, College Park, Maryland, United States
Johannes Zagermann
University of Konstanz, Konstanz, Germany
Niklas Elmqvist
University of Maryland, College Park, College Park, Maryland, United States
Harald Reiterer
University of Konstanz, Konstanz, Germany
Paper URL

https://dl.acm.org/doi/abs/10.1145/3491102.3517550

Video
A Design Space For Data Visualisation Transformations Between 2D And 3D In Mixed-Reality Environments
Abstract

As mixed-reality (MR) technologies become more mainstream, the delineation between data visualisations displayed on screens or other surfaces and those floating in space becomes increasingly blurred. Rather than the choice of using either a 2D surface or the 3D space for visualising data being a dichotomy, we argue that users should have the freedom to transform visualisations seamlessly between the two as needed. However, the design space for such transformations is large, and practically uncharted. To explore this, we first establish an overview of the different states that a data visualisation can take in MR, followed by how transformations between these states can facilitate common visualisation tasks. We then describe a design space of how these transformations function, in terms of the different stages throughout the transformation, and the user interactions and input parameters that affect it. This design space is then demonstrated with multiple exemplary techniques based in MR.
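
To make the idea of such a transformation slightly more concrete, here is a minimal, hypothetical sketch (not taken from the paper) of a single input parameter driving a visualisation's pose between a surface-anchored 2D state and a free-floating 3D state. It assumes Python with NumPy and SciPy; all names and values are illustrative only.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def transition_pose(surface_pos, surface_rot, floating_pos, floating_rot, progress):
    """Blend a visualisation's pose between a surface-anchored 2D state
    (progress = 0) and a free-floating 3D state (progress = 1).
    Positions are 3-vectors, rotations are scipy Rotation objects, and
    progress could be driven by a drag gesture, gaze, or a slider."""
    pos = (1.0 - progress) * np.asarray(surface_pos) + progress * np.asarray(floating_pos)
    key_rots = Rotation.from_quat([surface_rot.as_quat(), floating_rot.as_quat()])
    rot = Slerp([0.0, 1.0], key_rots)(progress)  # spherical interpolation of orientation
    return pos, rot

# Example: lift a chart from a tabletop to eye height, tilting it upright
on_table = (np.array([0.0, 0.9, 0.5]), Rotation.from_euler("x", -90, degrees=True))
in_air   = (np.array([0.0, 1.5, 0.8]), Rotation.identity())
pos, rot = transition_pose(*on_table, *in_air, progress=0.5)
```

In the paper's terms, the `progress` parameter stands in for the stage of the transformation, while the two endpoint poses stand in for two of the states a visualisation can occupy in MR.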

Award
Honorable Mention
Authors
Benjamin Lee
Monash University, Melbourne, Victoria, Australia
Maxime Cordeil
Monash University, Melbourne, Australia
Arnaud Prouzeau
Inria, Bordeaux, France
Bernhard Jenny
Monash University, Melbourne, Australia
Tim Dwyer
Monash University, Melbourne, Australia
Paper URL

https://dl.acm.org/doi/abs/10.1145/3491102.3501859

Video
HAExplorer: Understanding Interdependent Biomechanical Motions with Interactive Helical Axes
Abstract

The helical axis is a common tool used in biomechanical modeling to parameterize the motion of rigid objects. It encodes an object's rotation around and translation along a unique axis. Visualizations of helical axes have helped to make kinematic data tangible. However, the analysis process often remains tedious, especially if complex motions are examined. We identify multiple key challenges: the absence of interactive tools for the computation and handling of helical axes, visual clutter in axis representations, and a lack of contextualization. We solve these issues by providing the first generalized framework for kinematic analysis with helical axes. Axis sets can be computed on-demand, interactively filtered, and explored in multiple coordinated views. We iteratively developed and evaluated the HAExplorer with active biomechanics researchers. Our results show that the techniques we introduce open up the possibility to analyze non-planar, compound, and interdependent motion data.
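
As a side note on the underlying math, the following is a minimal sketch (not from the paper) of how a helical (screw) axis can be extracted from a rigid-body transform given as a rotation matrix R and translation vector t: the axis direction and angle come from R, the translation along the axis from projecting t onto that direction, and a point on the axis from solving the remaining linear system. It assumes NumPy and a rotation angle not close to zero; the function name is illustrative.

```python
import numpy as np

def helical_axis(R, t):
    """Decompose a rigid transform (R, t) into its helical (screw) axis:
    a unit direction n, the rotation angle theta around it, the translation
    d along it, and a point c lying on the axis.
    Assumes R is a proper rotation matrix and theta is not close to 0."""
    # Rotation angle from the trace: trace(R) = 1 + 2*cos(theta)
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    # Axis direction from the skew-symmetric part of R
    n = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    # Translation along the axis; the remainder of t is perpendicular to n
    d = float(n @ t)
    t_perp = t - d * n
    # A point c on the axis satisfies (I - R) c = t_perp. The matrix is
    # singular along n, so take the minimum-norm least-squares solution.
    c, *_ = np.linalg.lstsq(np.eye(3) - R, t_perp, rcond=None)
    return n, theta, d, c

# Example: a 90-degree rotation about z combined with an offset in x and z
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([1.0, 0.0, 0.5])
n, theta, d, c = helical_axis(R, t)  # n ~ (0, 0, 1), theta ~ pi/2, d = 0.5
```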

Authors
Pepe Eulzer
Friedrich-Schiller University of Jena, Jena, Germany
Robert Rockenfeller
University of Koblenz-Landau, Koblenz, Germany
Kai Lawonn
University of Jena, Jena, Germany
Paper URL

https://dl.acm.org/doi/abs/10.1145/3491102.3501841

Video