Prototyping is a critical phase in the industrial conceptual design process, enabling exploration of the problem space and identification of solutions. Recent advances in large-scale generative models have enabled AI to become a co-creator in this process. However, designers often find generative AI challenging because it requires following computer-centered interaction rules that diverge from their familiar design materials and languages. Physical prototyping is a commonly used design method, offering unique benefits such as intuitive understanding and tangible testing. In this study, we propose ProtoDreamer, a mixed-prototype tool that synergizes generative AI with physical prototyping to support conceptual design. ProtoDreamer allows designers to construct preliminary prototypes from physical materials, while AI recognizes these forms, together with vocal input, to generate diverse design alternatives. The tool empowers designers to interact tangibly with prototypes, convey design intentions to AI intuitively, and continuously draw inspiration from the generated artifacts. An evaluation study confirms ProtoDreamer’s utility and its strengths in time efficiency, creativity support, defect exposure, and facilitation of detailed thinking.
https://doi.org/10.1145/3654777.3676399
Flywheels are unique, versatile actuators that store kinetic energy and convert it to torque, and they are widely used in aerospace, robotics, haptics, and beyond. However, prototyping interactions with flywheels is not trivial due to safety concerns, unintuitive operation, and implementation challenges. We present TorqueCapsules: self-contained, fully encapsulated flywheel actuation modules that make flywheel actuators easy to control, safe to interact with, and quick to reconfigure and customize. Because each module fully encapsulates its actuator with a wireless microcontroller, a battery, and other components, it can be readily attached, embedded, or stuck to everyday objects, worn on people’s bodies, or combined with other devices. With our custom GUI, both novice and expert users can easily control multiple modules to design and prototype movements and kinesthetic haptics unique to flywheel actuation. We demonstrate various applications, including actuated everyday objects, wearable haptics, and expressive robots. We also conducted workshops in which novices and experts employed TorqueCapsules, collecting qualitative feedback and further application examples.
https://doi.org/10.1145/3654777.3676364
We introduce AniCraft, a mixed reality system for prototyping 3D character animation using physical proxies crafted from everyday objects. Unlike existing methods that require specialized equipment to support the use of physical proxies, AniCraft requires only affordable markers, webcams, and everyday objects and materials. AniCraft allows creators to prototype character animations through three key stages: selecting virtual characters, fabricating physical proxies, and manipulating these proxies to animate the characters. This authoring workflow is underpinned by diverse physical proxies, manipulation types, and mapping strategies, which ease the process of posing virtual characters and of mapping user interactions with physical proxies to animated movements. We provide a range of cases and potential applications to demonstrate how diverse physical proxies can inspire user creativity. User experiments show that our system can outperform traditional animation methods for rapid prototyping. Furthermore, we provide insights into the benefits and usage patterns of different materials, leading to design implications for future research.
https://doi.org/10.1145/3654777.3676325
Olfactory interfaces are pivotal in HCI, yet their development is hindered by limited application scenarios, stifling the discovery of new research opportunities. This challenge stems primarily from existing design tools focusing on odor display devices and the creation of standalone olfactory experiences, rather than enabling rapid adaptation to various contexts and tasks. To address this, we introduce Mul-O, a task-oriented development platform crafted to help semi-professionals navigate the diverse requirements of potential application scenarios and prototype ideas effectively. Mul-O facilitates the swift association and integration of olfactory experiences into functional designs, system integrations, and concept validations. Comprising a web UI for task-oriented development, an API server for seamless third-party integration, and wireless olfactory display hardware, Mul-O significantly enhances the ideation and prototyping process in multisensory tasks. We verified this in a 15-day workshop attended by 30 participants, which produced seven innovative projects, underscoring Mul-O's efficacy in fostering olfactory innovation.
https://doi.org/10.1145/3654777.3676387