2024 21st International Conference on Ubiquitous Robots (UR)
June 24-27, 2024
This paper presents a system for supporting the remote operation of a mobile robot arm through a mixed reality (MR)-based user interface (UI). The system is built on Robot Operating System 2 (ROS 2) and combines newly developed and existing software packages for gesture-based control of the mobile base and the robotic arm. An experimental case study was designed to evaluate system-level integration and usability. In the study, seven participants completed a simple sequence of remote operation tasks using two UI modalities: the MR device and a conventional computer interface (i.e., a 2D display and a keyboard). The results suggest that the MR-based UI may be perceived by participants as more intuitive than the conventional control interface, although limitations such as gesture sensitivity and increased task load due to unfamiliarity were also identified.
