Materials are a key part of our daily experiences. Recently, researchers have been devising new ways to use materials directly from our physical world for the design of objects and interactions. We present a new fabrication technique that enables control of the position and size of CO2 bubbles within carbonated liquids. Instead of soap bubbles, boiling water, or droplets, we show the creation of patterns, images, and text through sessile bubbles that exhibit a lifetime of several days. Surfaces with mixed-wettability regions are created on glass and plastic using ceramic coatings or plasma projection, leading to patterns that are relatively invisible to the human eye. Different regions react to liquids differently. Nucleation is activated when carbonated liquid is poured onto the surface: bubbles nucleate in the hydrophobic regions, adhere strongly to the surface, and can be controlled in size from 0.5 mm to 6.5 mm. During CO2 supersaturation, bubbles initially pop or become buoyant, then stabilize at their positions within minutes. Technical evaluation shows stabilization under various conditions. Our design software allows users to import images and convert them into parametric pixelation forms conducive to fabrication, resulting in bubbles nucleating at the required positions. Various applications are presented to demonstrate aspects that may be harnessed for a wide range of uses in daily life. Through this work, we enable the use of carbonation bubbles as a new design material for designers and researchers.
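As a rough illustration of the image-to-bubble-pattern step described above, the sketch below downsamples a grayscale image onto a coarse grid and emits the centers and diameters of cells that should receive a bubble-nucleating (hydrophobic) patch. This is not the authors' software; the function name, cell pitch, threshold, and darkness-to-diameter mapping are all assumptions, with only the 0.5–6.5 mm size range taken from the abstract.

```python
# Hypothetical sketch: convert an image into a coarse "bubble pixel" grid.
# Cell size, pitch, and threshold are illustrative values, not the paper's.
import numpy as np
from PIL import Image

def image_to_bubble_sites(path, cell_px=20, pitch_mm=2.0, threshold=0.5):
    """Return (x_mm, y_mm, diameter_mm) for grid cells dark enough to get a bubble."""
    img = np.asarray(Image.open(path).convert("L"), dtype=float) / 255.0
    h, w = img.shape
    sites = []
    for r in range(0, h - cell_px + 1, cell_px):
        for c in range(0, w - cell_px + 1, cell_px):
            darkness = 1.0 - img[r:r + cell_px, c:c + cell_px].mean()
            if darkness > threshold:
                # Map darkness onto the reported 0.5-6.5 mm bubble size range.
                diameter = 0.5 + darkness * (6.5 - 0.5)
                sites.append((c // cell_px * pitch_mm, r // cell_px * pitch_mm, diameter))
    return sites
```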
We introduce Thermotion, a novel method that uses thermofluidic composites to design and display thermochromic animation effects on object surfaces. With fluidic channels embedded under the object surfaces, the composites use thermofluidic flows to dynamically control the surface temperature as an actuator for thermochromic paints, enabling researchers and designers, for the first time, to create animations not only on two- and three-dimensional surfaces but also on surfaces made of a few flexible everyday materials. We report the design space with six animation primitives and two modification effects, and we demonstrate the design and fabrication workflow with a customized software platform for design and simulation. A range of applications is shown that leverages the objects' dynamic displays both visually and thermally, including dynamic artifacts, teaching aids, and ambient displays. We envision an opportunity to extend thermofluidic composites to other heat-related practices for further dynamic and programmable interactions with temperature.
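To make the idea of simulating a thermofluidic animation concrete, here is a minimal sketch of previewing when each segment of a channel-heated surface crosses a thermochromic activation temperature as warm fluid advances through it. This is not the Thermotion platform; the first-order heating model, flow rate, and temperatures are assumed placeholder values.

```python
# Minimal sketch (assumed parameters, not the Thermotion software): warm fluid
# advances through a 1-D channel; report when each surface segment exceeds a
# thermochromic activation temperature.
import numpy as np

def preview_activation(length_mm=100, seg_mm=1.0, flow_mm_s=5.0,
                       t_fluid=45.0, t_ambient=22.0, t_activate=31.0,
                       dt=0.05, duration_s=30.0):
    n = int(length_mm / seg_mm)
    temp = np.full(n, t_ambient)
    activate_at = np.full(n, np.nan)
    front = 0.0  # position of the warm-fluid front along the channel (mm)
    for step in range(int(duration_s / dt)):
        front = min(front + flow_mm_s * dt, length_mm)
        wetted = int(front / seg_mm)
        # Segments behind the front relax toward the fluid temperature;
        # the rest cool toward ambient (simple first-order lag).
        temp[:wetted] += 0.5 * dt * (t_fluid - temp[:wetted])
        temp[wetted:] += 0.1 * dt * (t_ambient - temp[wetted:])
        newly_on = np.isnan(activate_at) & (temp >= t_activate)
        activate_at[newly_on] = step * dt
    return activate_at  # seconds at which each segment changes color; NaN if never
```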
Surface I/O is a novel interface approach that functionalizes the exterior surface of devices to provide haptic and touch sensing without dedicated mechanical components. Achieving this requires a unique combination of surface features spanning the macro scale (5 cm to 1 mm), meso scale (1 mm to 200 μm), and micro scale (<200 μm). This approach simplifies interface creation, allowing designers to iterate on form geometry, haptic feel, and sensing functionality without the limitations of mechanical mechanisms. We believe this can contribute to the concept of "invisible ubiquitous interactivity at scale", where the simplicity and ease of implementation of the technique allow it to blend with the objects around us. While we prototyped our designs using 3D printers and laser cutters, our technique is applicable to mass-production methods, including injection molding and stamping, enabling passive goods with new levels of interactivity.
In this paper, we propose SwellSense, a fabrication technique that screen-prints stretchable circuits onto a special micro-capsule paper, creating localized swelling patterns with sensing capabilities. This simple technique allows users to create a wide range of paper-based tactile interactive devices, which mostly maintain a 2D planar form factor but can also be curved or folded into 3D interactive artifacts. We first present design guidelines to support various tactile interaction designs, including basic tactile graphic geometries, patterns with directional density, and finer interactive textures with embedded sensing such as touch sensors, pressure sensors, and mechanical switches. We then provide a design editor to enable users to design more creatively with the SwellSense technique. We provide a technical evaluation and a user evaluation to validate the basic performance of SwellSense. Lastly, we demonstrate several application examples and conclude with a discussion of current limitations and future work.
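As a hedged illustration of how readings from a printed pressure sensor like those mentioned above might be interpreted, the following sketch classifies normalized ADC samples into idle, touch, and press states. It is not SwellSense firmware; the voltage-divider assumption, thresholds, and function name are placeholders.

```python
# Hypothetical sketch (not the SwellSense implementation): classify readings
# from a printed pressure sensor wired as a voltage divider.
def classify_pressure(samples, baseline=None, touch_delta=0.05, press_delta=0.20):
    """samples: normalized ADC readings in [0, 1]; returns 'idle', 'touch', or 'press'."""
    if baseline is None:
        baseline = sum(samples[:10]) / min(len(samples), 10)  # resting level
    level = sum(samples[-5:]) / min(len(samples), 5)          # recent average
    delta = level - baseline
    if delta >= press_delta:
        return "press"
    if delta >= touch_delta:
        return "touch"
    return "idle"
```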
Enabling computing systems to understand user interactions with everyday surfaces and objects can drive a wide range of applications. However, existing vibration-based sensors (e.g., accelerometers) lack the sensitivity to detect light touch gestures or the bandwidth to recognize activity containing high-frequency components. Conversely, microphones are highly susceptible to environmental noise, degrading performance. Each time an object impacts a surface, Surface Acoustic Waves (SAWs) are generated that propagate along the air-to-surface boundary. This work repurposes a Voice PickUp Unit (VPU) to capture SAWs on surfaces (including smooth surfaces, odd geometries, and fabrics) over long distances and in noisy environments. Our custom-designed signal acquisition, processing, and machine learning pipeline demonstrates utility in both interactive and activity recognition applications, such as classifying trackpad-style gestures on a desk and recognizing 16 cooking-related activities, all with >97% accuracy. Ultimately, SAWs offer a unique signal that can enable robust recognition of user touch and on-surface events.
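To illustrate what a SAW-based recognition pipeline of this kind can look like, here is a minimal sketch that band-pass filters a captured signal, extracts log-spectrogram features, and trains an off-the-shelf classifier. The filter band, feature choice, sampling rate, and SVM model are assumptions for illustration, not the paper's actual pipeline.

```python
# Illustrative pipeline only (the paper's exact filtering, features, and model
# are not reproduced here): band-pass the SAW signal, take log-spectrogram
# features averaged over time, and fit a standard classifier.
import numpy as np
from scipy.signal import butter, sosfiltfilt, spectrogram
from sklearn.svm import SVC

FS = 48_000  # assumed sampling rate of the sensor capture

def saw_features(signal, lo=100, hi=20_000):
    sos = butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
    filtered = sosfiltfilt(sos, signal)
    _, _, sxx = spectrogram(filtered, fs=FS, nperseg=1024, noverlap=512)
    return np.log1p(sxx).mean(axis=1)  # average log power per frequency bin

def train_gesture_model(windows, labels):
    """windows: list of 1-D arrays, one per touch event; labels: gesture names."""
    X = np.stack([saw_features(w) for w in windows])
    return SVC(kernel="rbf").fit(X, labels)
```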
In the past few years, the widespread use of 3D printing technology has enabled the growth of the market for 3D-printed products. On Etsy, a website focused on handmade items, hundreds of individual entrepreneurs are selling their 3D-printed products. Inspired by the positive effects of machine-readable tags, such as barcodes, on daily product marketing, we propose AnisoTag, a novel tagging method that encodes data on the 2D surface of 3D-printed objects based on reflection anisotropy. AnisoTag has an unobtrusive appearance and much lower extraction computational complexity, contributing to a lightweight, low-cost tagging system for individual entrepreneurs. In AnisoTag, data are encoded by the proposed tool as reflective anisotropic microstructures, which reflect distinct illumination patterns when irradiated by a collimated laser. Based on this, we implement a real-time detection prototype with inexpensive hardware that determines the reflected illumination pattern and decodes the data according to their mapping. We evaluate AnisoTag with various 3D printer brands, filaments, and printing parameters, demonstrating its superior usability, accessibility, and reliability for practical usage.
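As a rough sketch of the decoding idea, suppose each printed region's anisotropic microstructure reflects the laser into a bright line whose orientation encodes one symbol; measured orientations can then be quantized back into bits. This is not the AnisoTag decoder; the symbol count, angle quantization, and function names are assumptions for illustration only.

```python
# Hypothetical decoder sketch (not the AnisoTag implementation): map a measured
# reflection-line orientation to one of 2**bits symbols, then concatenate bits.
def angle_to_symbol(angle_deg, bits=3):
    """Quantize a line orientation in [0, 180) degrees to an integer symbol."""
    n_symbols = 2 ** bits
    step = 180.0 / n_symbols
    return int((angle_deg % 180.0) // step)

def decode_tag(angles_deg, bits=3):
    """Concatenate per-region symbols into a bit string."""
    return "".join(format(angle_to_symbol(a, bits), f"0{bits}b") for a in angles_deg)
```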