Automotive & pedestrian interfaces

Paper session

Conference name
CHI 2020
Toward Immersive Self-Driving Simulations: Reports from a User Study across Six Platforms
Abstract

As self-driving car technology matures, autonomous vehicle research is moving toward building more human-centric interfaces and accountable experiences. Driving simulators avoid many of the ethical and regulatory concerns surrounding self-driving cars and play a key role in testing new interfaces and autonomous driving scenarios. However, apart from validity studies for manual driving simulation, the ability of driving simulators to replicate the experience of self-driving cars has not been widely investigated. In this paper, we build six self-driving simulation platforms with varying levels of visual and motion fidelity, ranging from a screen-based in-lab simulator to the mixed-reality on-road simulator we propose. In a user study, we compare the sense of presence and simulator sickness elicited by each simulator configuration, as well as its visual and motion fidelity. Our novel mixed-reality automotive driving simulator, named MAXIM, showed the highest fidelity and presence. Our findings suggest how visual and motion configurations affect the experience in autonomous driving simulators.
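
The abstract does not say which instruments the authors used; purely as a hedged illustration of how simulator sickness is commonly quantified in driving-simulator studies, the Python sketch below computes Simulator Sickness Questionnaire (SSQ) composite scores from raw subscale sums using the standard Kennedy et al. weights. The function name and input format are assumptions, not the authors' code.

    # Illustrative only (not from the paper): SSQ composite scoring after
    # Kennedy et al. (1993). Inputs are the raw sums of the 0-3 symptom
    # ratings that load on each subscale; names here are hypothetical.

    def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
        """Return weighted SSQ subscale scores and the total score."""
        nausea = nausea_raw * 9.54
        oculomotor = oculomotor_raw * 7.58
        disorientation = disorientation_raw * 13.92
        total = (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74
        return {"N": nausea, "O": oculomotor, "D": disorientation, "TS": total}

    # Example: raw subscale sums of 3, 4, and 2 yield a total score of 33.66.
    print(ssq_scores(3, 4, 2))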

Keywords
Autonomous driving
driving simulator
user studies
Immersive technology
mixed reality
on-road simulation
Authors
Dohyeon Yeo
Gwangju Institute of Science and Technology, Gwangju, Republic of Korea
Gwangbin Kim
Gwangju Institute of Science and Technology, Gwangju, Republic of Korea
Seungjun Kim
Gwangju Institute of Science and Technology, Gwangju, Republic of Korea
DOI

10.1145/3313831.3376787

Paper URL

https://doi.org/10.1145/3313831.3376787

Video
Chase Lights in the Peripheral View: How the Design of Moving Patterns on an LED Strip Influences the Perception of Speed in an Automotive Context
Abstract

LEDs on a strip, when turned on and off in a specific order, produce the perception of apparent motion (i.e., beta movement). In the automotive domain, such chase lights have been used to alter drivers' perception of driving speed by manipulating the pixel speed of the LEDs. We argue that the perceived velocity of beta movement in the peripheral view is based not only on the actual pixel speed but can also be influenced by other factors such as the frequency, width, and brightness of the lit LED segments. We conducted a velocity-matching experiment (N=25) in a vehicle mock-up, systematically varying these three properties to determine their influence on participants' perceived velocity. Results show that higher frequency and stronger brightness increased perceived velocity, whereas segment width had no influence. We discuss how these findings may be applied when designing systems that use beta movement to influence the perception of ambient light velocity.
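
As a rough sketch of this kind of stimulus (not the study apparatus), the Python snippet below generates brightness frames for an LED-strip chase pattern in which segment width, update frequency, and brightness can be varied independently of the nominal pixel speed. The function name, parameters, and the choice to derive the per-update shift as pixel_speed / frequency are assumptions for illustration.

    # Illustrative sketch: frames for a chase-light pattern on an LED strip.
    # A lit segment of `width` LEDs at `brightness` advances along the strip;
    # `frequency` is the update rate in Hz and `pixel_speed` the intended
    # motion in LEDs per second, so each update shifts the segment by
    # pixel_speed / frequency LEDs.

    def chase_frames(n_leds=60, width=4, brightness=0.8,
                     frequency=30.0, pixel_speed=15.0, duration=2.0):
        """Yield (timestamp, frame) pairs; a frame is a list of LED intensities."""
        step = pixel_speed / frequency          # LEDs advanced per update
        n_updates = int(duration * frequency)
        position = 0.0
        for i in range(n_updates):
            start = int(position) % n_leds
            frame = [0.0] * n_leds
            for offset in range(width):
                frame[(start + offset) % n_leds] = brightness
            yield i / frequency, frame
            position += step

    # Example: print a short, low-resolution pattern as text. Varying
    # frequency, width, or brightness while holding pixel_speed fixed mirrors
    # the kind of manipulation the experiment describes.
    for t, frame in chase_frames(n_leds=12, width=3, frequency=10.0,
                                 pixel_speed=6.0, duration=0.5):
        print(f"{t:4.2f}s", "".join("#" if v > 0 else "." for v in frame))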

Keywords
Chase lights
peripheral vision
moving patterns
velocity perception
Authors
Alexander Meschtscherjakov
University of Salzburg, Salzburg, Austria
Christine Döttlinger
University of Salzburg, Salzburg, Austria
Tim Kaiser
University of Salzburg, Salzburg, Austria
Manfred Tscheligi
University of Salzburg & AIT, Salzburg & Vienna, Austria
DOI

10.1145/3313831.3376203

Paper URL

https://doi.org/10.1145/3313831.3376203

Pedestrian Detection with Wearable Cameras for the Blind: A Two-way Perspective
Abstract

Blind people have limited access to information about their surroundings, which is important for ensuring one's safety, managing social interactions, and identifying approaching pedestrians. With advances in computer vision, wearable cameras can provide equitable access to such information. However, the always-on nature of these assistive technologies poses privacy concerns for parties that may be recorded. We explore this tension from both perspectives, those of sighted passersby and of blind users, taking into account camera visibility, in-person versus remote experience, and the visual information extracted. We conduct two studies: an online survey with MTurkers (N=206) and an in-person experience study between pairs of blind (N=10) and sighted (N=40) participants, in which blind participants wear a working prototype for pedestrian detection and pass by sighted participants. Our results suggest that both perspectives, those of users and of bystanders, as well as the factors above, need to be carefully considered to mitigate potential social tensions.
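
The paper's prototype is not described at the code level in this listing; as an illustrative baseline only, and not the authors' system, the sketch below runs OpenCV's stock HOG person detector on frames from a camera, which is one common way to prototype pedestrian detection on a wearable device. The camera index and function name are assumptions.

    # Illustrative baseline, not the authors' prototype: detect pedestrians in
    # a camera frame with OpenCV's pre-trained HOG + linear SVM person detector.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    def detect_pedestrians(frame):
        """Return bounding boxes (x, y, w, h) of people detected in a BGR frame."""
        boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8),
                                               padding=(8, 8), scale=1.05)
        return list(boxes)

    # Example: read one frame from the default camera and report detections.
    cap = cv2.VideoCapture(0)          # camera index 0 is an assumption
    ok, frame = cap.read()
    if ok:
        for (x, y, w, h) in detect_pedestrians(frame):
            print(f"pedestrian at x={x}, y={y}, size={w}x{h}")
    cap.release()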

Keywords
wearable camera
accessibility
social acceptance
pedestrian detection
face recognition
crowdsourcing
Authors
Kyungjun Lee
University of Maryland, College Park, MD, USA
Daisuke Sato
Carnegie Mellon University, Pittsburgh, PA, USA
Saki Asakawa
New York University, New York City, NY, USA
Hernisa Kacorri
University of Maryland, College Park, MD, USA
Chieko Asakawa
Carnegie Mellon University & IBM, Pittsburgh, PA, USA
DOI

10.1145/3313831.3376398

Paper URL

https://doi.org/10.1145/3313831.3376398

Video
CARoma Therapy: Pleasant Scents Promote Safer Driving, Better Mood, and Improved Well-Being in Angry Drivers
Abstract

Driving is a task that is often affected by emotions. The effect of emotions on driving has been studied extensively, with anger dominating such investigations. Despite the known strong links between scents and emotions, few studies have explored the effect of olfactory stimulation in the context of driving. As a result, HCI practitioners have very little knowledge of how to design for emotions using olfactory stimulation in the car. We carried out three studies to select scents of different valence and arousal levels (i.e., rose, peppermint, and civet) and anger-eliciting stimuli (i.e., affective pictures and on-road events). We used this knowledge to conduct a fourth user study investigating how the selected scents change the emotional state, well-being, and driving behaviour of drivers in an induced angry state. Our findings enable better decisions on which scents to choose when designing interactions for angry drivers.

Keywords
Perception
Smell
Odour Stimulation
Multimodal Interfaces
Notification Systems
In-Car User Interfaces
Emotions
Authors
Dmitrijs Dmitrenko
University of Sussex, Brighton, United Kingdom
Emanuela Maggioni
University of Sussex, Brighton, United Kingdom
Giada Brianza
University of Sussex, Brighton, United Kingdom
Brittany E. Holthausen
Georgia Institute of Technology, Atlanta, GA, USA
Bruce N. Walker
Georgia Institute of Technology, Atlanta, GA, USA
Marianna Obrist
University of Sussex, Brighton, United Kingdom
DOI

10.1145/3313831.3376176

Paper URL

https://doi.org/10.1145/3313831.3376176

Eyes on the Road: Detecting Phone Usage by Drivers Using On-Device Cameras
Abstract

Using a phone while driving is distracting and dangerous; it increases the chance of an accident by 400%. Several techniques have been proposed in the past to detect driver distraction due to phone usage. However, such techniques usually require instrumenting the user or the car with custom hardware. While phone usage in a car can be detected using the phone's GPS, it is harder to identify whether the phone is being used by the driver or by one of the passengers. In this paper, we present a lightweight, software-only solution that uses the phone's camera to observe the car's interior geometry and infer the phone's position and orientation. We then use this information to distinguish between driver and passenger phone use. We collected data in 16 different cars with 33 different users and achieved an overall accuracy of 94% when the phone is held in hand and 92.2% when the phone is docked (≤1 sec. delay). With just a software upgrade, this work can enable smartphones to proactively adapt to the user's context in the car and substantially reduce distracted driving incidents.
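
The abstract describes the approach only at a high level (camera view of the interior geometry, then driver-versus-passenger classification); the features and model used in the paper are not given here. The sketch below is a generic, hypothetical stand-in rather than the paper's pipeline: it downsamples grayscale interior frames and fits an off-the-shelf logistic-regression classifier to separate driver-side from passenger-side views. All names, labels, and the synthetic training data are assumptions so the example is self-contained.

    # Hypothetical stand-in, not the paper's method: classify whether a phone-
    # camera view of a car interior looks driver-side or passenger-side by
    # training a simple model on downsampled grayscale frames.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def to_feature(frame_gray, size=(32, 24)):
        """Downsample a 2-D grayscale frame to a fixed-size, flattened vector."""
        h, w = frame_gray.shape
        ys = np.linspace(0, h - 1, size[1]).astype(int)
        xs = np.linspace(0, w - 1, size[0]).astype(int)
        return frame_gray[np.ix_(ys, xs)].astype(np.float32).ravel() / 255.0

    # Real training data would be labelled frames captured in actual cars; here
    # random arrays are used purely so the sketch runs end to end.
    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 256, size=(480, 640)) for _ in range(40)]
    labels = [0] * 20 + [1] * 20          # 0 = passenger side, 1 = driver side

    X = np.stack([to_feature(f) for f in frames])
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    print("driver probability for first frame:", clf.predict_proba(X[:1])[0, 1])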

Keywords
driver detection
position sensing
in-car behavior
situational impairments
Authors
Rushil Khurana
Carnegie Mellon University, Pittsburgh, PA, USA
Mayank Goel
Carnegie Mellon University, Pittsburgh, PA, USA
DOI

10.1145/3313831.3376822

Paper URL

https://doi.org/10.1145/3313831.3376822

Video