Rob2HanD: LLM-Driven Robotic Arm for IMU Interaction Dataset Generation

Abstract

Fine-grained hand-interaction sensing with Inertial Measurement Units (IMUs) and machine learning offers a low-cost and effective input solution. However, the robustness and generalizability of machine learning models are highly dataset-dependent. Existing datasets for interaction design are typically constructed through extensive real-user data collection, which limits interaction diversity and personalization. To address these challenges, we propose Rob2HanD, a novel data-generation tool that uses large language models (LLMs) to control the motion of a robotic arm and rapidly construct IMU datasets. Rob2HanD can generate large, usable IMU interaction datasets under few-shot or zero-shot conditions, expanding the potential for diverse and personalized fine-grained hand interactions. Using a real human dataset, we evaluate machine learning models trained on Rob2HanD-generated data and validate the tool's usability. In real-world applications, models trained on Rob2HanD-generated datasets demonstrate strong performance across a variety of customized interaction tasks.

Authors
Jiangyuan Liu
Zhejiang University, Ningbo, China
Chicheng Yu
Zhejiang University, Ningbo, China
Xinli Chen
Zhejiang University, Hangzhou, China
Jiajun Bu
Zhejiang University, Hangzhou, Zhejiang, China
Limin Zeng
Zhejiang University, Hangzhou, China

Conference: CHI 2026

ACM CHI Conference on Human Factors in Computing Systems

Session: Physical Tasks & Robots

P1 - Room 115
7 presentations
2026-04-17, 18:00–19:30