Overview of Fusion, a wearable-backpack body-sharing platform.
The operator side controls the surrogate robot remotely using a head-mounted display and hand controllers.
Fusion as an assistive and skill-transfer system, with potential in rehabilitation and embodied learning.
Shared point-of-view collaboration and remote assistance.
The possibility of sharing experiences with remote users, with full upper-body embodied actions.
What it does
Fusion is a body surrogacy system that aims to reshape the way we communicate and collaborate remotely. It explores a hybrid approach to telepresence robots, integrating them with our bodies to enable shared point-of-view collaboration.
Advances in telepresence robotics research have helped define avatar systems that substitute for and replicate some of our bodies' functions beyond spatial limits. However, such approaches have mainly focused on replicating an egocentric experience for the user through an artificial body, and on the way users operate that new body. Fusion presents a body surrogacy system that treats telepresence systems as wearables, enabling embodied action sharing with a remote user. This approach opens new possibilities for collaborative systems to enhance the quality of collective tasks and skill transfer.
How it works
Fusion explores the idea of the body as a host (or surrogate) for others: it not only enables collaborative actions, but can also mediate embodied interactions from one person to another. Fusion consists of a "surrogate", a wearable backpack robot worn by one person, and an "operator", who uses a head-mounted display and hand controllers or data gloves. The operator controls the surrogate robot and receives visual, auditory, and haptic feedback, thus experiencing being teleported into the robot. The surrogate robot supports its wearer from the same point of view, providing shared-body collaboration. Three main functions were explored: 1) Directed, where a pair of humanoid hands can assist or instruct the surrogate host; 2) Enforced, where the arms act as an exoskeleton so that the operator can exercise direct physical control over the surrogate host; 3) Induced, where the remote operator forcibly directs the surrogate host by guiding them around.
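The three sharing modes above can be pictured as one control loop that maps the operator's tracked input onto the robot arms, with the mode deciding how tightly the wearer's own body is coupled in. The sketch below is a minimal illustration under our own assumptions; all names (`Mode`, `OperatorInput`, `arm_targets`) are hypothetical and do not come from the Fusion system itself.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    """The three sharing modes described in the text."""
    DIRECTED = auto()  # robot hands assist/instruct alongside the wearer
    ENFORCED = auto()  # arms act as an exoskeleton on the wearer's arms
    INDUCED = auto()   # operator steers the wearer by guiding their body

@dataclass
class OperatorInput:
    """One frame of tracked operator state (fields are illustrative)."""
    head_orientation: tuple   # yaw, pitch, roll from the HMD
    left_hand_pose: tuple     # from hand controllers or data gloves
    right_hand_pose: tuple

def arm_targets(inp: OperatorInput, mode: Mode) -> dict:
    """Map one frame of operator input to robot-arm targets.

    In Enforced and Induced modes the robot's motion is physically
    coupled to the wearer; in Directed mode the hands move freely
    beside the wearer's own arms.
    """
    return {
        "left": inp.left_hand_pose,
        "right": inp.right_hand_pose,
        "coupled_to_wearer": mode in (Mode.ENFORCED, Mode.INDUCED),
    }
```

For example, the same hand poses yield `coupled_to_wearer=True` in Enforced mode and `False` in Directed mode, which is the essential difference between assisting next to the wearer and moving the wearer's body directly.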
The idea of body sharing emerged from a project named "MetaLimbs" that we conducted in 2017, which gave a person multiple additional limbs embodied into their own body. After we successfully demonstrated MetaLimbs, we began to imagine what would happen if these limbs were controlled by someone else, and whether our bodies could thus be shared with others. We mainly designed Fusion to support the way we work and collaborate, sharing not only the visual experience but also the embodied, physical one. Several iterations and prototypes were built before reaching its current state. We developed a three-axis humanoid head that matches the user's head sensory and motor properties, as well as lightweight limbs that can be customized for the target application of the system (e.g., humanoid hands). Through these prototypes, we experienced outcomes of body surrogacy beyond the initial goal. For example, using Fusion, physically disabled people can assist and work remotely with others without needing to be physically present in the workspace, which could sustain a better level of well-being in society. This has the potential to enable a wide range of users to transfer their knowledge and experience.
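Since the three-axis head tracks the operator's head, its basic job is to mirror the HMD's yaw, pitch, and roll within the mechanism's range of motion. The sketch below is only an illustrative assumption: the function name, angle convention, and the symmetric ±90° limit are ours, not specifications of the actual hardware.

```python
def hmd_to_head_servos(yaw, pitch, roll, limits=(-90.0, 90.0)):
    """Clamp HMD Euler angles (degrees) to an assumed mechanical
    range before sending them to the three head servos."""
    lo, hi = limits

    def clamp(angle):
        return max(lo, min(hi, angle))

    return {"yaw": clamp(yaw), "pitch": clamp(pitch), "roll": clamp(roll)}
```

With this mapping, a head turn beyond the mechanism's reach simply saturates at the joint limit instead of commanding an impossible pose.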
How it is different
Most of the telepresence robots available today are appealing because they offer some form of mobile agency. Such robots are great when the user's purpose is to explore a remote location, communicate with others, handle hazardous materials, or investigate dangerous sites. However, their design limits their usefulness when the goal is to help others through collaborative tasks that require physical interaction. Collaboration, especially instruction and guidance, often depends on the physical act of one person showing another how to do something, and even if a telepresence robot has an arm or two, it may not be at all intuitive for a remote user to have effective direct interactions, especially when it comes to fully understanding the difficulty the other person is facing. This is why a body-sharing robot proves useful: it allows one to inhabit the surrogate and assist them directly with manipulation tasks.
We are currently working on plans to deploy Fusion as a product in the next two to three years, and we are conducting user tests toward that goal. The overall weight of the system is only 12 kg including the battery, and we aim to reduce it further by using lighter actuators and materials. From a research perspective, we are interested in exploring enforced body surrogacy for rehabilitation and physical therapy, in which a therapist can dive into the patient's body and directly assist and support their motion. For this, it is important to collaborate with doctors to validate its feasibility.
Fusion was recognized at the YouFab Global Creative Awards 2018 and the ACC Tokyo Creative Awards 2018, and was invited to the United Nations AI for Good Summit 2019 in Geneva, Switzerland.