ReCompFig: Designing Dynamically Reconfigurable Kinematic Devices Using Compliant Mechanisms and Tensioning Cables

2021-2022

From creating input devices to rendering tangible information, the field of HCI is interested in using kinematic mechanisms to create human-computer interfaces. Yet, due to fabrication and design challenges, it is often difficult to create kinematic devices that are compact and have multiple reconfigurable motional degrees of freedom (DOFs) depending on the interaction scenario. In this work, we combine compliant mechanisms with tensioning cables to create dynamically reconfigurable kinematic mechanisms. The devices' kinematics (DOFs) are enabled and determined by the layout of bendable rods. The additional cables function as on-demand motion constraints that can dynamically lock or unlock the mechanism's DOFs as they are tightened or loosened. We provide algorithms and a design tool prototype to help users design such kinematic devices. We also demonstrate various HCI use cases, including a kinematic haptic display, a haptic proxy, and a multimodal input device.

By Humphrey Yang, Tate Johnson, Ke Zhong, Dinesh K. Patel, Gina Olson, Carmel Majidi, Mohammad Islam, Lining Yao

In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '22), 2022 | DOI | PDF

🏆 Best Paper Honorable Mention Award at CHI 2022

Devices that Move and Reconfigure

Compliant mechanisms are mechanical devices made of flexible materials. They produce motion by bending, are easy to make, and are widely used across scales, from microelectronics to architecture. We leverage these features to create kinematic devices that provide versatile kinematic functions and haptics without requiring complex mechanical parts or manual assembly. Tensioning cables are added to the devices to let them reconfigure between distinct modes, each offering a unique hand feel and interactivity.

Computational Kinematics Design

Despite their advantages, compliant mechanisms are inherently challenging to design. We built a design tool based on screw theory and the freedom and constraint topology (FACT) method to assist users in creating these kinematic mechanisms. The tool takes the desired kinematic mode(s) as input and provides real-time, step-by-step visual and textual guidance to help users model a viable design. In this workflow, the user and the tool each handle what they are best at: the tool suggests geometric design strategies, and the user carries out the design with physical factors (e.g., aesthetics, fabricability, interactivity) in mind.

The design tool is available here.
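To make the screw-theory reasoning concrete, here is a minimal sketch of the reciprocity test that decides whether a candidate motion survives a set of rod or cable constraints. It assumes NumPy, idealizes each thin rod or taut cable as a pure-force constraint along its axis, and uses our own function names rather than the design tool's actual API.

```python
# A minimal sketch of the screw-theory check behind FACT-style constraint reasoning.
# The 6-vector conventions are standard; the function names and the idealized
# "pure force along the axis" flexure model are ours, not the tool's implementation.
import numpy as np

def rod_wrench(point, direction):
    """Idealized thin rod or taut cable: a pure-force constraint along its axis.
    Returned as a 6-vector wrench (force; moment about the origin)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    p = np.asarray(point, float)
    return np.concatenate([d, np.cross(p, d)])

def twist(omega, v):
    """A candidate motion as a 6-vector twist (angular velocity; velocity at origin)."""
    return np.concatenate([np.asarray(omega, float), np.asarray(v, float)])

def is_permitted(t, wrenches, tol=1e-9):
    """A twist survives the constraints if it does no virtual work against any
    wrench, i.e. the reciprocal product omega·tau + v·f vanishes for all of them."""
    W = np.asarray(wrenches, float)
    recip = W[:, 3:] @ t[:3] + W[:, :3] @ t[3:]
    return bool(np.all(np.abs(recip) < tol))

def free_dof_count(wrenches):
    """Remaining DOFs of the stage = 6 minus the rank of the constraint wrenches."""
    return 6 - np.linalg.matrix_rank(np.asarray(wrenches, float))

# Example: four parallel vertical rods at the corners of a square platform.
rods = [rod_wrench([x, y, 0], [0, 0, 1]) for x, y in [(-1, -1), (1, -1), (1, 1), (-1, 1)]]
print(free_dof_count(rods))                             # 3 (x, y translation + rotation about z)
print(is_permitted(twist([0, 0, 0], [1, 0, 0]), rods))  # True:  in-plane translation is free
print(is_permitted(twist([0, 0, 0], [0, 0, 1]), rods))  # False: the rods block motion along their axes
```

In FACT terms, the set of twists with zero reciprocal product against every constraint wrench is the freedom space that the compliant stage retains; tightening a cable simply adds one more wrench to the set.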

Multimodal Input Device

Graphical user interfaces on a screen can change from one mode to another in a split second. Unlike their digital counterparts, however, physical input devices like keyboards, mice, and joysticks each offer only a single input mode, so users often have to switch between them for different interaction scenarios. Here, we demonstrate a ReCompFig input device that acts as a multimodal interface, supporting several kinds of kinematic interaction in one device. It can switch between three kinematic modes on demand, each recreating the haptic experience of a familiar interface (i.e., a joystick, a slider, or a twisting knob).
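As a hedged illustration of how such on-demand switching might be driven, the sketch below maps each mode to the subset of cables to tension; the number of cables, their indices, and the mode-to-cable mapping are hypothetical placeholders, not the prototype's actual routing.

```python
# A hedged illustration of on-demand mode switching: each mode is realized by
# tightening a different subset of cables. Cable count, indices, and the
# mapping are hypothetical, not the prototype's actual configuration.
NUM_CABLES = 4
MODE_TABLE = {
    "joystick": {0, 1},     # leave the tilting DOFs free
    "slider":   {0, 2, 3},  # leave one sliding translation free
    "knob":     {1, 2, 3},  # leave one twisting rotation free
}

def cable_commands(mode):
    """Per-cable command: True = wind the motor to tension, False = release."""
    taut = MODE_TABLE[mode]
    return [i in taut for i in range(NUM_CABLES)]

print(cable_commands("knob"))  # [False, True, True, True]
```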

Kinematic Display for Tangible Rendering

In addition to visual input, haptics, the kinematic feel of objects in our hands, also makes up an essential part of how we experience the world. Here, we leverage the compactness and reconfigurability of ReCompFig mechanisms to create a kinematic material device. Unlike conventional displays that render images or shape displays that physicalize geometries, this kinematic display tangibilizes the kinematic freedoms of a piece of material, i.e., how an object deforms when touched by hand. The display comprises a 4x4 grid of individually addressable kinematic bits, each of which can provide varying deformability depending on how its cables are tightened.
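The sketch below illustrates one way such per-bit addressing could look: tensioning a bit's cable locks it, while releasing the cable leaves it deformable. The set_cable_tension stub and the example lock pattern are hypothetical, not the display's actual control code.

```python
# A hedged sketch of addressing the 4x4 display: tightening a bit's cable makes
# it rigid, releasing it leaves the bit free to deform when pressed.
# set_cable_tension() is a placeholder for the real actuator interface.
import numpy as np

GRID = (4, 4)

def set_cable_tension(bit, tension):
    """Placeholder for the per-bit actuator (e.g., one cable motor per bit)."""
    print(f"bit {bit}: cable tension -> {tension:.1f}")

def render_deformability(lock_mask):
    """lock_mask[i, j] == True tensions the cable of bit (i, j) to lock it;
    False releases the cable so the bit yields under the fingertip."""
    for i in range(GRID[0]):
        for j in range(GRID[1]):
            set_cable_tension((i, j), 1.0 if lock_mask[i, j] else 0.0)

# Example pattern: a rigid border around a soft, pressable 2x2 centre.
mask = np.ones(GRID, dtype=bool)
mask[1:3, 1:3] = False
render_deformability(mask)
```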

Haptic Proxies for V/AR

The versatility and reconfigurability of ReCompFig make it ideal for dynamic interaction contexts that emphasize the sensory-motor experience, such as virtual or augmented reality. The device shown here can be worn on the user's hand to simulate the weight of an object. Depending on how the cables are configured, the weight attached to the device can shift and swing in different ways to recreate the hand feel of liquids, sheets, and solids.
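One way to read this demo is as a small mode table choosing which cables to slacken for each simulated material; the cable names and the mapping below are our assumption, not the exact configuration reported in the paper.

```python
# A hedged reading of the proxy as a mode table: which (hypothetically named)
# cables to slacken so the attached weight moves in a way that evokes each material.
PROXY_MODES = {
    "solid":  [],                  # all cables taut: weight rigidly coupled to the hand
    "sheet":  ["hinge"],           # weight can flop about one axis
    "liquid": ["hinge", "swing"],  # weight can swing freely, like sloshing fluid
}

def configure_proxy(material):
    slack = PROXY_MODES[material]
    print(f"{material}: slacken {slack if slack else 'no cables'}")

configure_proxy("liquid")  # liquid: slacken ['hinge', 'swing']
```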

Acknowledgment

This research is partially supported by Carnegie Mellon University's (CMU) Center for Machine Learning and Health and by National Science Foundation grant ID 052184116. We also thank CMU's College of Engineering for supporting this work through the Moonshot program. The authors also thank the reviewers and colleagues for their comments, which helped improve the paper.
