HeadReach: Utilizing Head Tracking to Address Reachability Issues on Mobile Touch Devices
Description

People often operate their smartphones with only one hand, using just their thumb for touch input. With today's larger smartphones, this leads to a reachability issue: users can no longer comfortably reach every part of the screen without changing their grip. We investigate using the head tracking built into modern smartphones to address this reachability issue. We developed three interaction techniques, pure head (PH), head + touch (HT), and head area + touch (HA), to select targets beyond the reach of one's thumb. In two user studies, we found that selecting targets with HT and HA had higher success rates than default direct touch (DT) while standing (by about 9%) and walking (by about 12%), at the cost of being moderately slower. HT and HA were also faster than one of the best existing techniques, BezelCursor (BC), by about 20% while standing and 6% while walking, while achieving the same success rate.
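
To make the head-to-cursor mapping concrete, here is a minimal Python sketch of how head rotation could drive an on-screen cursor, in the spirit of the head + touch (HT) technique. The gain constant, screen dimensions, and function names are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: mapping head rotation to an on-screen cursor for
# reachability, in the spirit of HeadReach's head + touch (HT) technique.
# Screen size and the gain constant are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Screen:
    width_px: int = 1080
    height_px: int = 2340

def head_pose_to_cursor(yaw_deg: float, pitch_deg: float,
                        screen: Screen, gain: float = 40.0) -> tuple[int, int]:
    """Map head yaw/pitch (degrees, 0 = facing the screen) to pixel
    coordinates, anchored at the screen centre. `gain` is pixels per
    degree of head rotation (hypothetical tuning constant)."""
    x = screen.width_px / 2 + gain * yaw_deg
    y = screen.height_px / 2 - gain * pitch_deg  # looking up moves cursor up
    # Clamp to screen bounds so the cursor always stays selectable.
    x = min(max(x, 0), screen.width_px - 1)
    y = min(max(y, 0), screen.height_px - 1)
    return int(x), int(y)

# In HT, a touch anywhere would then confirm the target under this cursor.
print(head_pose_to_cursor(yaw_deg=8.0, pitch_deg=-5.0, screen=Screen()))
```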

EarBuddy: Enabling On-Face Interaction via Wireless Earbuds
Description

Past research on on-body interaction has typically required custom sensors, limiting its scalability and generalizability. We propose EarBuddy, a real-time system that leverages the microphone in commercial wireless earbuds to detect tapping and sliding gestures near the face and ears. We developed a design space of 27 valid gestures and conducted a user study (N=16) to select the eight gestures that were optimal for both user preference and microphone detectability. We collected a dataset of those eight gestures (N=20) and trained deep learning models for gesture detection and classification. Our optimized classifier achieved an accuracy of 95.3%. Finally, we conducted a user study (N=12) to evaluate EarBuddy's usability. Our results show that EarBuddy can facilitate novel interactions and that users feel very positively about the system. EarBuddy provides a new eyes-free, socially acceptable input method that is compatible with commercial wireless earbuds and has the potential for scalability and generalizability.
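
As a concrete illustration of the classification stage, the following Python sketch shows a small CNN over log-mel spectrograms of earbud microphone audio. The architecture, input shape, and layer sizes are assumptions for illustration only; the paper's actual models are not reproduced here.

```python
# Minimal sketch of a gesture classifier in the spirit of EarBuddy:
# a small CNN over log-mel spectrograms of earbud microphone audio.
# Architecture and input size are illustrative assumptions.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, n_classes: int = 8):  # eight selected gestures
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_mels, n_frames) log-mel spectrogram window
        h = self.features(x)
        return self.classifier(h.flatten(1))

# One ~1 s window at 64 mel bands x 100 frames (assumed, not from the paper).
model = GestureCNN()
logits = model(torch.randn(1, 1, 64, 100))
print(logits.argmax(dim=1))  # predicted gesture index
```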

Nailz: Sensing Hand Input with Touch Sensitive Nails
Description

Touches between the fingers of an unencumbered hand represent a ready-to-use, eyes-free, and expressive input space suitable for interacting with wearable devices such as smart glasses or watches. While prior work has focused on touches to the inner surface of the hand, touches to the nails, a practical site for mounting sensing hardware, have been comparatively overlooked. We extend prior implementations of a single touch-sensing nail to a full set of five and explore their potential for wearable input. We present design ideas and an input space of 144 touches (taps, flicks, and swipes) derived from an ideation workshop. We complement this with data from two studies characterizing the subjective comfort and objective characteristics (task time, accuracy) of each touch. We conclude by synthesizing this material into a set of 29 viable nail touches, assessing their performance in a final study, and illustrating how they could be used by presenting, and qualitatively evaluating, two example applications.
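
The following Python sketch illustrates one simple way the three touch types in this input space could be distinguished from a touch trace on a nail sensor, using contact duration and displacement. The thresholds and data format are illustrative assumptions, not the paper's recognizer.

```python
# Minimal sketch: distinguishing taps, flicks, and swipes from a touch
# trace on one nail. Thresholds are illustrative assumptions.
import math

def classify_touch(trace: list[tuple[float, float, float]]) -> str:
    """trace: list of (t_seconds, x_mm, y_mm) samples on a nail sensor."""
    duration = trace[-1][0] - trace[0][0]
    dx = trace[-1][1] - trace[0][1]
    dy = trace[-1][2] - trace[0][2]
    distance = math.hypot(dx, dy)
    if distance < 2.0:      # barely moved during contact -> tap
        return "tap"
    if duration < 0.15:     # fast, short contact with movement -> flick
        return "flick"
    return "swipe"          # slower, deliberate movement -> swipe

# A slow 6 mm drag across the nail over 0.4 s:
print(classify_touch([(0.0, 0.0, 0.0), (0.4, 6.0, 0.0)]))  # -> "swipe"
```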

FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments
Description

This paper introduces FaceHaptics, a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional, and movable haptic cues in the form of wind, warmth, moving and single-point touch events, and water spray to dedicated parts of the face not covered by the head-mounted display. The system is easily extensible, however, and can in principle mount any type of compact haptic actuator or object. User study 1 showed that users appreciate the directional resolution of the cues and can judge wind direction well, especially when they move their head and the wind direction is adjusted dynamically to compensate for head rotations. Study 2 showed that adding FaceHaptics cues to a VR walkthrough can significantly improve user experience, presence, and emotional responses.
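
To illustrate the head-rotation compensation from study 1, here is a minimal Python sketch that keeps a wind cue world-fixed by re-aiming the head-mounted nozzle as the head turns. It handles only yaw, and all names and sign conventions are assumptions; the paper's control scheme is not reproduced here.

```python
# Minimal sketch of head-rotation compensation for a wind cue:
# re-aim the nozzle in the head frame so the wind stays world-fixed.
# Only yaw is handled; conventions are illustrative assumptions.
def nozzle_angle_deg(wind_world_deg: float, head_yaw_deg: float) -> float:
    """Angle the nozzle must take in the head frame so that the wind
    still appears to come from `wind_world_deg` in world coordinates."""
    return (wind_world_deg - head_yaw_deg) % 360.0

# Wind fixed at 0 deg in the world; the user turns their head 30 deg left:
print(nozzle_angle_deg(0.0, -30.0))  # -> 30.0, nozzle swings right
```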

HapBead: On-Skin Microfluidic Haptic Interface using Tunable Bead
Description

Thin, flexible on-skin haptic interfaces made of soft elastomers have improved significantly in recent years. Many focus on vibrotactile feedback, which requires complicated parameter tuning. Another approach uses mechanical forces, created via piezoelectric devices and other methods, for non-vibratory haptic sensations such as stretching and twisting; these devices are often bulky, and their electronic components and drivers are complex, with limited control over timing and precision. This paper proposes HapBead, a new on-skin haptic interface capable of rendering vibration-like tactile feedback using microfluidics. HapBead leverages a microfluidic channel to precisely and agilely oscillate a small bead via liquid flow, generating various motion patterns in the channel that create highly tunable haptic sensations on the skin. We developed a proof-of-concept design to implement a thin, flexible, and affordable HapBead platform, and verified its haptic rendering capabilities by attaching it to users' fingertips. A study confirmed that participants could accurately distinguish six different haptic patterns rendered by HapBead. HapBead enables new wearable display applications with multiple integrated functionalities, such as on-skin haptic doodles, visuo-haptic displays, and haptic illusions.
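
As a concrete sketch of the actuation idea, the Python snippet below generates a pump drive waveform that oscillates the bead back and forth in the channel at a pattern-specific frequency and amplitude. The pattern table and the sinusoidal drive are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: a pump drive waveform that oscillates the bead in the
# microfluidic channel. Pattern parameters and the sinusoidal drive are
# illustrative assumptions, not the paper's actuation details.
import math

# Hypothetical pattern table: name -> (oscillation Hz, relative amplitude)
PATTERNS = {
    "gentle_pulse": (5.0, 0.3),
    "sharp_buzz":   (40.0, 1.0),
}

def drive_samples(pattern: str, seconds: float = 0.5,
                  sample_rate: int = 1000) -> list[float]:
    """Flow-rate commands in [-1, 1]; sign flips push the bead
    alternately toward each end of the channel."""
    freq, amp = PATTERNS[pattern]
    n = int(seconds * sample_rate)
    return [amp * math.sin(2 * math.pi * freq * i / sample_rate)
            for i in range(n)]

samples = drive_samples("sharp_buzz")
print(len(samples), round(max(samples), 2))
```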
