This study session has ended. Thank you for participating.
Historians rely on spatio-temporal navigation to model and study historical events and their evolution. Their findings can then be exhibited in cultural mediation centers or museums. The latter are now exploiting new technologies, both to facilitate the transmission of knowledge and to make their exhibitions more attractive. Digital systems allow visitors, among other things, to navigate spatially and temporally through virtual reconstructions of historical environments. We propose to combine these virtual representations with a tangible interface to provide visitors with an immersive experience and engaging interactions. To do so, we set up a co-design process involving cultural mediation actors (museum directors, historians, etc.). The result is SABLIER, a tangible interactor for navigating space and time, built on the interaction metaphors and natural affordances of an hourglass. Finally, we conducted an acceptability evaluation of our interactor, with positive results.
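To make the hourglass metaphor concrete, a minimal sketch (our own illustration, assuming the interactor reports its tilt angle from an embedded orientation sensor; this is not SABLIER's actual mapping) could translate how far the hourglass is turned into the direction and speed of temporal navigation:

```python
import math

def tilt_to_time_rate(tilt_deg, max_years_per_s=50.0):
    """Hypothetical mapping from hourglass orientation to a time-scrubbing rate:
    0 deg (upright) scrubs forward at full speed, 90 deg (on its side, sand at
    rest) pauses time, and 180 deg (flipped) scrubs backward at full speed."""
    rate = math.cos(math.radians(tilt_deg))   # +1 upright, 0 horizontal, -1 flipped
    return rate * max_years_per_s

print(tilt_to_time_rate(0))    # ->  50.0 years/s forward
print(tilt_to_time_rate(180))  # -> -50.0 years/s backward
```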
Consumer electronics are increasingly using everyday materials to blend into home environments, often using LEDs or symbol displays under textile meshes. Our surveys (n=1499 and n=1501) show interest in interactive graphical displays for hidden interfaces --- however, covering such displays significantly limits brightness, material possibilities and legibility.
To overcome these limitations, we leverage parallel rendering to enable ultrabright graphics that can pass through everyday materials. We unlock expressive hidden interfaces using rectilinear graphics on low-cost, mass-produced passive-matrix OLED displays. A technical evaluation across materials, shapes and display techniques suggests a 3.6--40X brightness increase compared to more complex active-matrix OLEDs.
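As a rough illustration of where the brightness gain comes from, the sketch below (our own simplification, not the paper's display driver) groups the rows of a rectilinear frame that share an identical pixel pattern so they can be driven in the same scan slot, and estimates the resulting duty-cycle gain over conventional row-by-row passive-matrix scanning.

```python
# Illustrative sketch: on a passive-matrix OLED, scanning N rows one at a time
# limits each row to a 1/N duty cycle. Rectilinear graphics have few distinct
# row patterns, so identical rows can share a scan slot and stay lit longer.
from collections import OrderedDict

def group_identical_rows(frame):
    """Group row indices by their pixel pattern (frame: list of 0/1 tuples)."""
    groups = OrderedDict()
    for y, row in enumerate(frame):
        groups.setdefault(tuple(row), []).append(y)
    return groups

def duty_cycle_gain(frame):
    """Estimated brightness gain: scan slots needed row-by-row vs. grouped."""
    sequential_slots = len(frame)                    # one slot per row
    parallel_slots = len(group_identical_rows(frame))  # one slot per distinct pattern
    return sequential_slots / parallel_slots

# A toy 8-row frame showing a thick "=" glyph: only two distinct row patterns.
frame = [(0,)*8, (1,)*8, (1,)*8, (0,)*8, (1,)*8, (1,)*8, (0,)*8, (0,)*8]
print(duty_cycle_gain(frame))  # -> 4.0x duty-cycle gain for this toy frame
```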
We present interactive prototypes that blend into wood, textile, plastic and mirrored surfaces. Survey feedback (n=1572) on our prototypes suggests that smart mirrors are particularly desirable. A lab evaluation (n=11) reinforced these findings and also allowed us to characterize performance from hands-on interaction with different content and materials, and under varying lighting conditions.
Head-mounted augmented reality (AR) displays allow for the seamless integration of virtual visualisation with contextual tangible references, such as physical (tangible) globes. We explore the design of immersive geospatial data visualisation with AR and tangible globes. We investigate the ``tangible-virtual interplay'' of tangible globes with virtual data visualisation, and propose a conceptual approach for designing immersive geospatial globes. We demonstrate a set of use cases, such as augmenting a tangible globe with virtual overlays, using a physical globe as a tangible input device for interacting with virtual globes and maps, and linking an augmented globe to an abstract data visualisation.
We gathered qualitative feedback from experts about our use case visualisations, and compiled a summary of key takeaways as well as ideas for future improvements. The proposed design space, example visualisations and lessons learned aim to guide the design of tangible globes for data visualisation in AR.
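One building block of using a physical globe as a tangible input device is converting a tracked point on the globe's surface into geographic coordinates for the virtual overlay. The sketch below is a minimal, hypothetical version of that mapping (not the system described above), assuming the point is expressed in the globe's local frame with the y-axis through the poles and the prime meridian on the +x axis.

```python
import math

def surface_point_to_lat_lon(x, y, z):
    """Map a 3D point on (or near) the globe surface, in the globe's local
    frame (y-axis through the poles, prime meridian on +x), to degrees of
    latitude and longitude for positioning a virtual overlay."""
    r = math.sqrt(x * x + y * y + z * z)
    lat = math.degrees(math.asin(y / r))   # -90 (south pole) .. +90 (north pole)
    lon = math.degrees(math.atan2(z, x))   # -180 .. +180 around the equator
    return lat, lon

# e.g. a tracked fingertip at the front of the globe on the equator:
print(surface_point_to_lat_lon(1.0, 0.0, 0.0))  # -> (0.0, 0.0)
```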
(Dis)Appearables is an approach for actuated Tangible User Interfaces (TUIs) to appear and disappear. This technique is supported by \textit{Stages}: physical platforms inspired by theatrical stages. Self-propelled TUIs autonomously move between the front and back stage, allowing them to dynamically appear and disappear from users' attention. This platform opens up a novel interaction design space for expressive displays with dynamic physical affordances.
We demonstrate and explore this approach through a proof-of-concept implementation using two-wheeled robots, together with multiple stage design examples. We have implemented a stage design pipeline that allows users to plan and design stages composed of front and back stages and transition portals such as trap doors or lifts. The pipeline also includes control of the robots, guiding them on and off stage. With this proof-of-concept prototype, we demonstrated a range of applications including interactive mobility simulation, self-reconfiguring desktops, remote hockey, and storytelling/gaming. Inspired by theatrical stage designs, this is a new take on `controlling the existence of matter' for user experience design.
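To sketch how such a pipeline might route a robot on and off stage, the hypothetical example below (our illustration, not the authors' implementation) models a stage as front and back regions joined by transition portals and sends a robot through a portal's waypoints to appear or disappear; `robot.go_to` stands in for whatever motion command the two-wheeled platform exposes.

```python
# Hypothetical sketch of the appear/disappear control: a stage holds portals
# (e.g. a trap door or lift), each defined by a waypoint path from backstage
# to front stage; a robot is driven along that path to enter or leave view.
from dataclasses import dataclass, field

@dataclass
class Portal:
    name: str        # e.g. "trap_door", "lift"
    waypoints: list  # (x, y) path from back stage to front stage

@dataclass
class Stage:
    portals: dict = field(default_factory=dict)

    def route(self, portal_name, appear=True):
        path = self.portals[portal_name].waypoints
        return path if appear else list(reversed(path))

def drive(robot, path):
    """Send the robot along the path via a placeholder motion command."""
    for x, y in path:
        robot.go_to(x, y)

stage = Stage(portals={"trap_door": Portal("trap_door", [(0, -1), (0, 0), (1, 1)])})
# drive(my_robot, stage.route("trap_door", appear=True))   # robot appears on stage
# drive(my_robot, stage.route("trap_door", appear=False))  # robot retreats backstage
```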
Autonomous actuated interfaces provide a unique research opportunity for shared-control interfaces, as the human and the interface collaborate through the physical interaction modality, manipulating the same physical elements at the same time. Prior studies show that sharing control with physical-modality interfaces often results in frustration and a low sense of control. We designed and implemented adaptive behavior for shared-control actuated interfaces that extends prior work by giving humans the ability to anticipate the autonomous action and then accept or override it. Results from a controlled study with 24 participants indicate better collaboration in the Adaptive condition compared with the Non-adaptive one, with improved sense of control, feelings of teamwork, and overall collaboration quality. Our work contributes to shared-control tangible, shape-change, and actuated interfaces. We show that leveraging minimal non-verbal social cues to physically communicate the actuated interface's intent, coupled with giving the human the autonomy to physically accept or override the shift in control, improves shared-control collaboration.
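One way to read this adaptive behavior is as a simple signal-then-wait control loop: the interface physically hints at its intended motion, leaves the human a short window to accept or push back, and only then commits. The sketch below is our own schematic of that idea, with hypothetical sensing and actuation calls rather than the study's implementation.

```python
import time

# Hypothetical control loop for an adaptive shared-control actuated interface:
# a small cue movement communicates intent, then the human can accept
# (do nothing) or override (push against the element) before the full action.

CUE_FRACTION = 0.15        # fraction of the motion used as the non-verbal cue
DECISION_WINDOW_S = 1.5    # how long the human has to accept or override

def adaptive_move(element, target_position):
    element.move_toward(target_position, fraction=CUE_FRACTION)   # signal intent
    deadline = time.time() + DECISION_WINDOW_S
    while time.time() < deadline:
        if element.human_force() > element.OVERRIDE_THRESHOLD:    # human pushes back
            element.yield_control()                               # hand control to the human
            return "overridden"
        time.sleep(0.01)
    element.move_toward(target_position, fraction=1.0)            # accepted: complete the move
    return "executed"
```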