115. Ubiquitous, smelly & immersive visualization

Start time
Immediately after the previous session
Duration
2 min 30 s
Presenter

Concluded session

This study session has ended. Thank you for participating.

Techniques for Flexible Responsive Visualization Design
Description

Responsive visualizations adapt to effectively present information based on the device context. Such adaptations are essential for news content that is increasingly consumed on mobile devices. However, existing tools provide little support for responsive visualization design. We analyze a corpus of 231 responsive news visualizations and discuss formative interviews with five journalists about responsive visualization design. These interviews motivate four central design guidelines: enable simultaneous cross-device edits, facilitate device-specific customization, show cross-device previews, and support propagation of edits. Based on these guidelines, we present a prototype system that allows users to preview and edit multiple visualization versions simultaneously. We demonstrate the utility of the system features by recreating four real-world responsive visualizations from our corpus.

InChorus: Designing Consistent Multimodal Interactions for Data Visualization on Tablet Devices
Description

While tablet devices are a promising platform for data visualization, supporting consistent interactions across different types of visualizations on tablets remains an open challenge. In this paper, we present multimodal interactions that function consistently across different visualizations, supporting common operations during visual data analysis. By considering standard interface elements (e.g., axes, marks) and grounding our design in a set of core concepts including operations, parameters, targets, and instruments, we systematically develop interactions applicable to different visualization types. To exemplify how the proposed interactions collectively facilitate data exploration, we employ them in a tablet-based system, InChorus, that supports pen, touch, and speech input. Based on a study with 12 participants performing replication and fact-checking tasks with InChorus, we discuss how participants adapted to using multimodal input and highlight considerations for future multimodal visualization systems.

Scents and Sensibility: Evaluating Information Olfactation
Description

Olfaction---the sense of smell---is one of the least explored of the human senses for conveying abstract information. In this paper, we conduct a comprehensive perceptual experiment on information olfactation: the use of olfactory and cross-modal sensory marks and channels to convey data. More specifically, following the example from graphical perception studies, we design an experiment that studies the perceptual accuracy of four cross-modal sensory channels---scent type, scent intensity, airflow, and temperature---for conveying three different types of data---nominal, ordinal, and quantitative. We also present details of a 24-scent multi-sensory display and its software framework that we designed in order to run this experiment. Our results yield a ranking of olfactory and cross-modal sensory channels that follows similar principles as classic rankings for visual channels.

Urban Mosaic: Visual Exploration of Streetscapes Using Large-Scale Image Data
Description

Urban planning is increasingly data driven, yet the challenge of designing with data at a city scale and remaining sensitive to the impact at a human scale is as important today as it was for Jane Jacobs. We address this challenge with Urban Mosaic, a tool for exploring the urban fabric through a spatially and temporally dense data set of 7.7 million street-level images from New York City, captured over the period of a year. Working in collaboration with professional practitioners, we use Urban Mosaic to investigate questions of accessibility and mobility, and preservation and retrofitting. In doing so, we demonstrate how tools such as this might provide a bridge between the city and the street, by supporting activities such as visual comparison of geographically distant neighborhoods, and temporal analysis of unfolding urban development.

Towards an Understanding of Augmented Reality Extensions for Existing 3D Data Analysis Tools
Description

We present an observational study with domain experts to understand how augmented reality (AR) extensions to traditional PC-based data analysis tools can help particle physicists to explore and understand 3D data. Our goal is to allow researchers to integrate stereoscopic AR-based visual representations and interaction techniques into their tools, and thus ultimately to increase the adoption of modern immersive analytics techniques in existing data analysis workflows. We use Microsoft's HoloLens as a lightweight and easily maintainable AR headset and replicate existing visualization and interaction capabilities on both the PC and the AR view. We treat the AR headset as a second yet stereoscopic screen, allowing researchers to study their data in a connected multi-view manner. Our results indicate that our collaborating physicists appreciate a hybrid data exploration setup with an interactive AR extension to improve their understanding of particle collision events.
