The ability to assess dog personality can support, for example, matching shelter dogs with prospective owners and personalizing dog activities. Such assessments typically rely on experts or on psychological scales administered to dog owners, both of which are costly. To tackle this challenge, we built ``Patchkeeper'', a device that can be strapped to a pet's chest and measures activity through an accelerometer and a gyroscope. In an in-the-wild deployment involving 12 healthy dogs, we collected 1300 hours of sensor activity data along with dog personality test results from two validated questionnaires. By matching these two datasets, we trained ten machine learning classifiers that predicted dog personality from activity data, achieving AUC scores between 0.63 and 0.90, which suggests the value of tracking pets' psychological signals with wearable technologies.
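As an illustration only (not the published Patchkeeper pipeline), a personality-from-activity classifier of this kind could be sketched as follows. The windowing scheme, the summary features, the random-forest model, and the synthetic "calm vs. high-energy" data are all assumptions made for the sketch; the paper's actual features and classifiers may differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def window_features(acc, win=100):
    # Split a tri-axial accelerometer stream into fixed-size windows and
    # compute simple per-axis statistics (mean, std, mean absolute jerk).
    n = len(acc) // win
    feats = []
    for i in range(n):
        w = acc[i * win:(i + 1) * win]
        feats.append(np.concatenate([
            w.mean(axis=0),
            w.std(axis=0),
            np.abs(np.diff(w, axis=0)).mean(axis=0),
        ]))
    return np.array(feats)

# Synthetic stand-in data: a "high-energy" trait is simulated as
# higher-variance motion than a "calm" trait.
calm = rng.normal(0, 0.5, (5000, 3))
energetic = rng.normal(0, 1.5, (5000, 3))
X = np.vstack([window_features(calm), window_features(energetic)])
y = np.array([0] * 50 + [1] * 50)  # one binary personality label per window

Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
print(f"AUC: {auc:.2f}")
```

On this deliberately easy synthetic data the AUC is near-perfect; the reported 0.63-0.90 range reflects the much harder real-world setting.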
Skin-based electronics are an emerging genre of interactive technologies. In this paper, we leverage the natural uses of lotions and propose them as mediators for driving novel, low-power, quasi-bistable, and bio-degradable electrochromic displays on the skin and other surfaces. We detail the design, fabrication, and evaluation of one such "Lotion Interface," including how it can be customized using low-cost everyday materials and technologies to trigger various visual and temporal effects – some lasting up to fifteen minutes when unpowered. We characterize different fabrication techniques and lotions to demonstrate various visual effects on a variety of skin types and tones. We highlight the safety of our design for humans and the environment. Finally, we report findings from an exploratory user study and present a range of compelling applications for Lotion Interfaces that expand the on-skin and surface interaction landscapes to include the familiar and often habitual practice of applying lotion.
Ground surfaces are often carefully designed and engineered with various textures to fit the functionalities of human environments and thus could contain rich context information for smart wearables. Ground surface detection could power a wide array of applications including activity recognition, mobile health, and context-aware computing, and potentially provide an additional channel of information for many existing kinesiology approaches such as gait analysis. To facilitate the detection of ground surfaces, we present LaserShoes, a texture-sensing system based on laser speckle imaging that can be retrofitted to shoes. Our system captures videos of speckle patterns induced on ground surfaces and uses pre-processing to identify ideal images with clear speckle patterns, collected when users' feet are in contact with ground surfaces. We demonstrated our technique with a ResNet-18 model and achieved real-time inference. We conducted an evaluation under different conditions and report results that verify the feasibility of our approach.
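As an illustration only (not the published LaserShoes pipeline), the pre-processing step that keeps frames with clear speckle patterns could be sketched with the standard speckle-contrast statistic. The contrast threshold and the synthetic frames below are assumptions made for the sketch.

```python
import numpy as np

def speckle_contrast(frame):
    # Speckle contrast K = sigma / mean of pixel intensities.
    # A fully developed, sharply imaged speckle pattern approaches K ~ 1;
    # motion blur during the swing phase washes the pattern out (K << 1).
    f = frame.astype(float)
    return f.std() / f.mean()

def select_frames(frames, k_min=0.3):
    # Keep only frames whose contrast suggests foot-ground contact,
    # i.e. a crisp speckle pattern worth feeding to the classifier.
    return [f for f in frames if speckle_contrast(f) >= k_min]

rng = np.random.default_rng(1)
# Synthetic stand-ins: exponentially distributed intensities mimic a fully
# developed speckle pattern, while a near-uniform frame mimics a blurred one.
sharp = rng.exponential(100, (64, 64))
blurred = np.full((64, 64), 100.0) + rng.normal(0, 5, (64, 64))

kept = select_frames([sharp, blurred])
print(len(kept))  # only the sharp frame survives filtering
```

Frames passing such a filter would then go to the texture classifier (a ResNet-18 in the paper).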
We present EchoSpeech, a minimally obtrusive silent speech interface (SSI) powered by low-power active acoustic sensing. EchoSpeech uses speakers and microphones mounted on a glasses frame and emits inaudible sound waves towards the skin. By analyzing echoes from multiple paths, EchoSpeech captures subtle skin deformations caused by silent utterances and uses them to infer silent speech. In a user study with 12 participants, we demonstrate that EchoSpeech can recognize 31 isolated commands and 3-6-digit connected sequences with 4.5% (std 3.5%) and 6.1% (std 4.2%) Word Error Rate (WER), respectively. We further evaluated EchoSpeech under scenarios including walking and noise injection to test its robustness. We then demonstrated EchoSpeech in real-time demo applications operating at 73.3 mW, where the real-time pipeline was implemented on a smartphone with only 1-6 minutes of training data. We believe EchoSpeech takes a solid step towards minimally obtrusive wearable SSIs for real-life deployment.
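The Word Error Rate metric quoted above is the standard word-level Levenshtein distance normalized by reference length. A minimal reference implementation (not from the paper) for readers who want to reproduce the metric:

```python
def word_error_rate(ref, hyp):
    # WER = (substitutions + insertions + deletions) / number of reference
    # words, computed with the standard Levenshtein dynamic program.
    r, h = ref.split(), hyp.split()
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(h) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = d[i - 1][j - 1] + (r[i - 1] != h[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(r)][len(h)] / len(r)

print(word_error_rate("open the door", "open door"))  # one deletion -> 1/3
```

A 4.5% WER thus means roughly one word-level error per 22 reference words.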
There is a growing interest in sustainable fabrication approaches, including the exploration of material conservation and the utilisation of waste materials. In particular, recent work has applied organic myco-materials, made from fungi, to develop tangible, interactive devices. However, a systematic approach to 3D fabrication using myco-materials remains under-explored. In this paper, we present a parametric design tool and a fabrication pipeline to grow 3D designs using the mycelia of edible fungi species, such as Reishi or Oyster mushrooms. The proposed tool is designed based on empirical results from a series of technical evaluations of the geometric and material qualities of 3D-grown myco-objects. Furthermore, the paper introduces an easy-to-replicate fabrication process that can recycle different organic waste material combinations, such as sawdust and coffee grounds, to grow mycelia. Through a series of demonstration applications, we identify the challenges and opportunities of working with myco-materials in the HCI context.
Play-dough is a brightly colored, easy-to-make, and familiar material. We have developed and tested custom play-dough materials that can be employed in 3D printers designed for clay. This paper introduces a set of recipes for 3D-printable play-dough along with an exploration of these materials' print characteristics. We explore the design potential of play-dough as a sustainable fabrication material, highlighting its recyclability, compostability, and repairability. We demonstrate how custom-color prints can be designed and constructed and describe how play-dough can be used as a support material for clay 3D prints. We also present a set of example artifacts made from play-dough and discuss opportunities for future research.