

Vision-based Ascending Staircase Detection with Interpretable Classification Model for Stair Climbing Robots

Kangneoung Lee, Vishnu Kalyanram, Chuanqi Zheng, Siddharth Sane, and Kiju Lee

IEEE International Conference on Robotics and Automation (ICRA), 2022

May 25, 2022

Robots capable of traversing flights of stairs play an important role in both indoor and outdoor applications. The ability to accurately identify a staircase is one of the vital technical functions of such robots. This paper presents a vision-based ascending stair detection algorithm that applies an interpretable model to RGB-Depth (RGB-D) data. The method follows four steps: 1) pre-processing the RGB images for line extraction by applying dilation and Canny filters followed by the probabilistic Hough line transform, 2) defining regions of interest (ROIs) via k-means clustering, 3) training an initial model based on a support vector machine (SVM) using three extracted features (i.e., gradient, continuity factor, and deviation cost), and 4) building an interpretable model for stair classification by determining the decision boundary conditions. The developed method was evaluated on our dataset, achieving 85% sensitivity and 94% specificity. When the same model was tested on a different test set, the sensitivity and specificity decreased slightly to 80% and 90%, respectively. By shifting the boundary conditions using only a small subset of the new dataset, without rebuilding the model, performance improved to 90% sensitivity and 96% specificity. The presented method is also compared with existing SVM- and neural-network-based methods.
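
For concreteness, below is a minimal Python sketch of the pipeline's structure, assuming OpenCV and scikit-learn. All thresholds, cluster counts, and function names are illustrative assumptions rather than the authors' implementation, and the three-feature extraction (gradient, continuity factor, deviation cost) is paper-specific, so it is only stubbed out here.

```python
# Minimal sketch of the four-step pipeline, assuming OpenCV and scikit-learn.
# Thresholds, cluster counts, and function names are illustrative assumptions,
# not the authors' implementation.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def extract_lines(rgb_image):
    """Step 1: dilation and Canny filtering followed by the probabilistic
    Hough line transform, returning candidate segments (x1, y1, x2, y2)."""
    gray = cv2.cvtColor(rgb_image, cv2.COLOR_BGR2GRAY)
    dilated = cv2.dilate(gray, np.ones((3, 3), np.uint8))
    edges = cv2.Canny(dilated, 50, 150)  # assumed edge thresholds
    segs = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                           minLineLength=40, maxLineGap=10)
    return np.empty((0, 4), int) if segs is None else segs[:, 0, :]

def cluster_rois(segments, n_rois=3):
    """Step 2: group line-segment midpoints into regions of interest
    with k-means (assumed cluster count)."""
    mids = np.column_stack([(segments[:, 0] + segments[:, 2]) / 2,
                            (segments[:, 1] + segments[:, 3]) / 2])
    return KMeans(n_clusters=n_rois, n_init=10).fit_predict(mids)

# Steps 3-4: train an SVM on the three per-ROI features (gradient,
# continuity factor, deviation cost); a linear kernel keeps the decision
# boundary readable, which is what the interpretable model builds on.
# Feature extraction from the RGB-D data is paper-specific and omitted.
svm = SVC(kernel="linear")
# svm.fit(features, labels)  # features: (n_samples, 3), labels: 0/1
```

A linear kernel is used in the sketch because the paper's final step derives explicit decision boundary conditions; a boundary that is linear in the three features is straightforward to read off and shift, matching the abstract's description of adapting the model without retraining.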

Click here to view the ICRA poster: ICRA2022_Poster_FINAL
