PinpointFly: An Egocentric Position-control Drone Interface using Mobile AR

Abstract

Accurate drone positioning is challenging because pilots have only a limited perception of a flying drone's position and direction from their own viewpoint. This makes conventional joystick-based speed control inaccurate and complicated, and significantly degrades piloting performance. We propose PinpointFly, an egocentric drone interface that allows pilots to arbitrarily position and rotate a drone through position-control direct interactions on a see-through mobile AR view, where the drone's position and direction are visualized with a virtual cast shadow (i.e., the drone's orthogonal projection onto the floor). Pilots can point to the next position or draw the drone's flight trajectory by manipulating the virtual cast shadow and the direction/height slider bar on the touchscreen. We design and implement a prototype of PinpointFly for indoor, visual-line-of-sight scenarios, comprising real-time and predefined motion-control techniques. We conduct two user studies with simple positioning and inspection tasks. Our results demonstrate that PinpointFly makes drone positioning and inspection operations faster, more accurate, and simpler, with lower workload, than a conventional joystick interface using a speed-control method.
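The cast-shadow idea in the abstract reduces to simple plane geometry: the shadow is the drone's orthogonal projection onto the floor, and a point selected on the floor combines with the height-slider value to yield the next waypoint. The sketch below illustrates this mapping in Python; the coordinate convention (y up), the function names, and the slider parameter are illustrative assumptions rather than the paper's implementation.

    import numpy as np

    def cast_shadow(drone_pos, floor_y=0.0):
        # Orthogonal projection of the drone onto the horizontal floor plane y = floor_y.
        x, _, z = drone_pos
        return np.array([x, floor_y, z])

    def waypoint_from_shadow(shadow_point, slider_height, floor_y=0.0):
        # Combine the shadow point dragged on the floor with the height-slider value
        # to obtain the drone's next 3D target position.
        x, _, z = shadow_point
        return np.array([x, floor_y + slider_height, z])

    # Example: the drone hovers at 1.2 m; the user drags its shadow 0.5 m forward
    # and sets the height slider to 1.5 m.
    drone = np.array([0.0, 1.2, 2.0])
    shadow = cast_shadow(drone)                    # [0.0, 0.0, 2.0]
    dragged = shadow + np.array([0.0, 0.0, 0.5])   # shadow moved on the floor
    target = waypoint_from_shadow(dragged, 1.5)    # [0.0, 1.5, 2.5]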

Authors
Linfeng Chen
Tohoku University, Sendai, Japan
Kazuki Takashima
Tohoku University, Sendai, Japan
Kazuyuki Fujita
Tohoku University, Sendai, Japan
Yoshifumi Kitamura
Tohoku University, Sendai, Japan
DOI

10.1145/3411764.3445110

Paper URL

https://doi.org/10.1145/3411764.3445110

Video

Conference: CHI 2021

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2021.acm.org/)

Session: Human-AI, Automation, Vehicles & Drones / Trust & Explainability

[A] Paper Room 15, 2021-05-13 17:00:00~2021-05-13 19:00:00 / [B] Paper Room 15, 2021-05-14 01:00:00~2021-05-14 03:00:00 / [C] Paper Room 15, 2021-05-14 09:00:00~2021-05-14 11:00:00