
Adaptive Robotics & Technology Lab

Texas A&M University College of Engineering

Adaptive Mixed-Reality Sensorimotor Interface for Human-Swarm Teaming: Persons with Limb Loss Case Study and Field Experiments

C. Zhao, C. Zheng, L. Roldan, T. Shkurti, A. Nahari, W. Newman, D. Tyler, K. Lee, and M. Fu

Field Robotics

February 2023


This paper presents the design, evaluation, and field experiments of the Adaptable Human-Swarm Teaming (α-SWAT) interface, developed to support military field operations. Human-swarm teaming requires collaboration between a team of humans and a team of robotic agents for strategic decision-making and task performance. α-SWAT allows multiple human team members with different roles, physical capabilities, or preferences to interact with the swarm through a configurable, multimodal user interface (UI). The system embeds a task allocation algorithm that rapidly assigns tasks created by the mission planner to the swarm. The multimodal UI supports swarm visualization via a mixed-reality display or a conventional 2D display, human gesture input via a camera or an electromyography device, tactile feedback via a vibration motor or an implanted peripheral nerve interface, and audio feedback. In particular, interfacing the UI with implanted electrodes through a neural interface enables gesture detection and tactile feedback for individuals with upper-limb amputation, allowing them to participate in human-swarm teaming. The multimodality of α-SWAT’s UI adapts to the needs of three different human team roles: Swarm Planner, Swarm Tactician Rear, and Swarm Tactician Forward. A case study evaluated the functionality and usability of α-SWAT by enabling a participant with limb loss and an implanted neural interface to assign three tasks to a simulated swarm of 150 robotic agents. α-SWAT was also used to visualize live telemetry from 40 physical robotic agents for multiple simultaneous human participants at a field experiment.
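The abstract references an embedded task allocation algorithm for dispatching mission-planner tasks to the swarm but does not describe it. As a minimal sketch only, assuming a greedy nearest-idle-agent policy over planar agent positions, the Python below illustrates the general shape of such an allocator; the names (Agent, Task, allocate_tasks) and the distance-based cost are hypothetical and not taken from the paper.

# Illustrative sketch, not the paper's algorithm: greedily assign each
# mission-planner task to the nearest idle swarm agent in the plane.
from dataclasses import dataclass
import math

@dataclass
class Agent:  # hypothetical agent model
    agent_id: int
    x: float
    y: float
    busy: bool = False

@dataclass
class Task:  # hypothetical mission-planner task
    task_id: int
    x: float
    y: float

def allocate_tasks(tasks, agents):
    """Map each task_id to the agent_id of the nearest idle agent."""
    assignments = {}
    for task in tasks:
        idle = [a for a in agents if not a.busy]
        if not idle:
            break  # more tasks than idle agents; leave the rest unassigned
        nearest = min(idle, key=lambda a: math.hypot(a.x - task.x, a.y - task.y))
        nearest.busy = True
        assignments[task.task_id] = nearest.agent_id
    return assignments

# Example mirroring the case study's scale: three tasks, 150 simulated agents.
swarm = [Agent(i, float(i % 10), float(i // 10)) for i in range(150)]
tasks = [Task(0, 2.0, 3.0), Task(1, 7.5, 1.0), Task(2, 4.0, 9.0)]
print(allocate_tasks(tasks, swarm))

A deployed allocator would more plausibly solve a global assignment problem (e.g., via the Hungarian algorithm) and weigh agent capabilities and task priorities; the greedy pass above only demonstrates the handoff from planner-created tasks to the swarm.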
