AccessLens: Auto-detecting Inaccessibility of Everyday Objects

Abstract

In our increasingly diverse society, everyday physical interfaces often present barriers that affect individuals across a variety of contexts. From small cabinet knobs to identical wall switches, the same object can pose different challenges depending on the situation, highlighting a pressing need for solutions. Low-cost 3D-printed augmentations such as knob magnifiers and tactile labels are promising, yet discovering unrecognized barriers remains difficult because disability is context-dependent. We introduce AccessLens, an end-to-end system that identifies inaccessible interfaces in everyday objects and recommends 3D-printable augmentations to enhance accessibility. Our approach trains a detector on the novel AccessDB dataset, designed to automatically recognize 21 distinct Inaccessibility Classes (e.g., bar-small and round-rotate) across 6 common object categories (e.g., handle and knob). AccessMeta provides a robust way to build a comprehensive dictionary linking these inaccessibility classes to open-source 3D augmentation designs. Experiments demonstrate our detector's performance in detecting inaccessible objects.
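The abstract's pipeline (detect an inaccessibility class, then look up matching 3D-printable augmentations in a dictionary) can be sketched as follows. This is a hypothetical illustration only: the class names mirror examples from the abstract, but the dictionary entries and function names are invented for illustration and are not taken from AccessMeta or the paper's implementation.

```python
# Hypothetical sketch of the class-to-design lookup idea from the abstract.
# Dictionary entries below are illustrative placeholders, not real AccessMeta data.
AUGMENTATION_DICTIONARY = {
    "round-rotate": ["lever-adapter"],   # e.g., a round knob that must be rotated
    "bar-small": ["handle-extender"],    # e.g., a small bar handle
}

def recommend_augmentations(detected_classes):
    """Map detected inaccessibility classes to candidate augmentation designs."""
    return {cls: AUGMENTATION_DICTIONARY.get(cls, []) for cls in detected_classes}

print(recommend_augmentations(["round-rotate", "bar-small"]))
```

In the actual system, the keys would come from the detector's 21 Inaccessibility Classes and the values from open-source 3D augmentation designs indexed by AccessMeta.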

Authors
Nahyun Kwon
Texas A&M University, College Station, Texas, United States
Qian Lu
Texas A&M University, College Station, Texas, United States
Muhammad Hasham Qazi
Texas A&M University, College Station, Texas, United States
Joanne Liu
Texas A&M University, College Station, Texas, United States
Changhoon Oh
Yonsei University, Seoul, Republic of Korea
Shu Kong
Texas A&M University, College Station, Texas, United States
Jeeeun Kim
Texas A&M University, College Station, Texas, United States
Paper URL

doi.org/10.1145/3613904.3642767

Video

Conference: CHI 2024

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2024.acm.org/)

Session: Universal Accessibility A

314
5 presentations
2024-05-15 01:00:00 – 2024-05-15 02:20:00