
Adaptive Robotics & Technology Lab

Texas A&M University College of Engineering

Publications

Comparison of GPS Collars and Solar-Powered GPS Ear Tags for Animal Movement Studies

Dylan G Stewart, Egleu DM Mendes, Kiju Lee, Marcus E Blum, Luis O Tedeschi, Stephen L Webb

Smart Agricultural Technology, 11: 101021

Animal-borne tracking systems have provided unique insights into when, where, why, and how animals move and interact with the environment. GPS neck collars have been the standard for animal tracking studies, especially for mid- to large-sized mammals. However, new technological developments have helped miniaturize tracking devices (e.g., GPS ear tags), including their batteries, while extending battery longevity (e.g., using solar panels). We initiated this study to quantify the difference in horizontal error and data loss between solar-powered mOOvement version 1 GPS ear tags and Vectronic Aerospace GPS collars during stationary testing and while deployed on beef cows (Bos taurus). Mean horizontal error was 41 m (± 1.8 SE) and 2 m (± 0.1 SE) for GPS ear tags and collars during stationary testing, respectively; during animal testing, the distance between paired ear tag and collar locations was 59.2 m (± 3.3 SE). Fix acquisition was 99.3% ± 0.3 SE for ear tags and 99.8% ± 0.2 SE for collars during stationary testing. During animal deployment, fix acquisition changed to 30.7% (± 9.1 SE) and 100% for ear tags and collars, respectively. Lower acquisition rates, driven by loss of battery life, and greater horizontal error of GPS ear tags while on animals may introduce bias into estimates of movement and space use; GPS collars appear to be less sensitive to these forms of bias. However, mOOvement GPS ear tag systems are more economical than commercially manufactured GPS collars. Therefore, budgetary constraints, data resolution, and study objectives will dictate which technology to use.
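The abstract reports its accuracy metrics as mean ± standard error (SE). As a minimal illustration of the metric only, with hypothetical horizontal-error values rather than the study's data, the SE of a sample is the sample standard deviation divided by √n:

```python
import math

def mean_and_se(values):
    """Sample mean and standard error (s / sqrt(n))."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean, math.sqrt(var) / math.sqrt(n)

errors_m = [38.0, 44.5, 40.2, 41.8, 39.9]  # hypothetical horizontal errors (m)
mean, se = mean_and_se(errors_m)
print(f"mean horizontal error = {mean:.1f} m (± {se:.2f} SE)")
```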

Low-Cost, Compact Mobile Robot for Autonomous Soil Monitoring in Crop Fields

Shrikrishna Gad, Muthukumar Bagavathiannan, Mahendra Bhandari, John Cason, Robert Hardin, Juan Landivar, Kiju Lee

2025 22nd International Conference on Ubiquitous Robots (UR)

June 30, 2025

This paper presents the development and evaluation of a mobile robotic platform for autonomous crop field scouting and soil sensing. The system combines a durable commercial chassis kit with custom 3D-printed casings, enabling reliable operation across diverse outdoor field environments. The robot features encoder-controlled motors and a swivel-mounted front frame, allowing versatile and agile navigation through narrow crop rows and uneven terrain, as demonstrated in field trials conducted in cotton and peanut fields. A soil sensing mechanism, driven by a 360° servo motor and employing a linear gear-and-rack mechanism, enables consistent soil penetration. Integrated with a low-cost 7-in-1 soil sensor, the platform provides real-time mapping of key soil parameters (nitrogen, phosphorus, potassium, electrical conductivity, pH, temperature, and moisture) to support data-driven farm management decisions. Preliminary experiments evaluated the robot’s field navigation and soil sensing performance. Results demonstrate the potential of the platform for low-cost, mobile soil sensing, while also highlighting limitations in the current sensor’s accuracy.
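The rack travel produced by the servo-driven pinion follows directly from the pinion's pitch-circle circumference: the rack advances one circumference per full rotation. A small sketch where the gear module, tooth count, and rotation count are illustrative assumptions, not values from the paper:

```python
import math

def rack_travel_mm(pinion_module_mm, teeth, rotations):
    """Linear rack travel for a pinion of given module and tooth count.

    Pitch-circle circumference = pi * module * teeth; the rack moves
    one circumference per full pinion rotation.
    """
    return math.pi * pinion_module_mm * teeth * rotations

# Hypothetical geometry: module-1 pinion with 20 teeth, 2 full rotations
print(f"rack travel = {rack_travel_mm(1.0, 20, 2):.1f} mm")
```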

Hardware Prototype and System Apparatus of an Autonomous Robotic Harvesting Cell

Neha Vemula, Dugan Um, Mahendra Bhandari, Kiju Lee

2025 22nd International Conference on Ubiquitous Robots (UR)

June 30, 2025

This paper presents a novel autonomous robotic harvesting cell designed for hydroponic plant cultivation within a compact, self-contained system. The system integrates a robotic manipulator mounted on a gantry mechanism, enabling precise and automated pruning and harvesting operations. An RGB-D camera supports real-time object detection, while a custom-designed cutter, integrated directly into the manipulator, eliminates the need for additional actuators. The hardware prototype serves as a small-scale autonomous farming unit, demonstrating the feasibility of fully automated crop management in controlled environments. As an initial application, the system incorporates vision-based detection of tomato fruits and suckers. To facilitate continued development, a simulation model was created in Unity 3D, supporting virtual prototyping, system implementation, and algorithm evaluation prior to physical deployment. By combining advanced automation with a compact, modular design, the proposed robotic harvesting cell offers a scalable solution for indoor farming, urban agriculture, and small-scale food production.

Multi-Robot Shepherding: A CLF-CBF Approach

Abdulaziz Farhan Alharbi, Kiju Lee

2025 22nd International Conference on Ubiquitous Robots (UR)

June 30 - July 2, 2025

This paper presents a control scheme for multi-robot shepherding of non-cooperative agents, using a framework based on control Lyapunov functions (CLFs) and control barrier functions (CBFs). The proposed control design guarantees the feasibility of the CLF-CBF quadratic optimization problem, even when the number of non-cooperative agents significantly exceeds the number of robots. This scheme supports distributed implementation with distributed sensing, leveraging the consensus alternating direction method of multipliers (ADMM). Unlike previous works, the presented method does not impose constraints on the maximum velocities or sensing ranges of non-cooperative agents. Simulation results demonstrate that the controller can successfully shepherd large numbers of non-cooperative agents, even without aggregation behaviors, using teams as small as two robots.
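The CLF-CBF quadratic program named in the abstract can be illustrated, in its simplest single-constraint form, as a safety filter that minimally modifies a nominal single-integrator input. The closed-form projection below is a generic textbook sketch, not the paper's multi-robot formulation; the safe set and gains are hypothetical:

```python
import numpy as np

def cbf_filter(u_nom, grad_h, h, alpha=1.0):
    """Minimally modify u_nom so that h_dot >= -alpha * h (one CBF).

    Closed-form solution of: min ||u - u_nom||^2  s.t.  grad_h . u >= -alpha*h,
    i.e. a QP with a single affine constraint.
    """
    a = np.asarray(grad_h, dtype=float)
    slack = a @ u_nom + alpha * h
    if slack >= 0.0:                       # nominal input already safe
        return np.asarray(u_nom, dtype=float)
    return u_nom - (slack / (a @ a)) * a   # project onto constraint boundary

# Hypothetical safe set h(x) = 1 - x[0]; robot at x[0] = 0.9 driving in +x
u = cbf_filter(np.array([1.0, 0.0]), np.array([-1.0, 0.0]), h=0.1)
print(u)  # velocity is scaled back near the boundary
```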

Unmanned aerial system and machine learning driven Digital-Twin framework for in-season cotton growth forecasting

Pankaj Pal, Juan Landivar-Bowles, Jose Landivar-Scott, Nick Duffield, Kevin Nowka, Jinha Jung, Anjin Chang, Kiju Lee, Lei Zhao, Mahendra Bhandari

Computers and Electronics in Agriculture, 228: 109589

January 2025

In the past decade, Unmanned Aerial Systems (UAS) have made a significant impact on various sectors, including precision agriculture, by enabling remote monitoring of crop growth and development. Monitoring and managing crops effectively throughout the growing season are crucial for optimizing crop yield. The integration of UAS-monitored data and machine learning has greatly advanced crop production management, resulting in improvements in key areas such as irrigation scheduling, crop termination analysis, and yield prediction. This study presents the development of a Digital Twin (DT) for cotton crops using UAS-captured RGB data. The primary objective of this DT is to forecast various cotton crop features during the growing season, including Canopy Cover (CC), Canopy Height (CH), Canopy Volume (CV), and Excess Greenness (EXG). Predictive analytics as part of DT development employs machine learning regression to extract crop feature growth patterns from UAS data collected from 2020 to 2023. During the current season, real-time UAS data and historical growth patterns are combined to generate growth patterns using a novel hybrid model generation strategy for forecasting. Comparisons of the DT-based forecasts to actual data demonstrated low RMSE for CC, CH, CV, and EXG. The proposed DT framework, which accurately forecasts cotton crop features up to 30 days into the future starting 80 days after sowing, was found to outperform existing forecasting methods. Notably, the RRMSE for CC, CH, CV, and EXG was measured to be 9, 13, 14, and 18 percent, respectively. Furthermore, the potential applications of forecasted data in biomass estimation and yield prediction are highlighted, emphasizing their significance in optimizing agricultural practices.
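The RRMSE figures quoted in the abstract are relative root-mean-square errors. A minimal sketch of the metric, using hypothetical canopy-cover values rather than the study's data:

```python
import math

def rrmse(actual, predicted):
    """Relative RMSE: RMSE divided by the mean of the actual values."""
    n = len(actual)
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return rmse / (sum(actual) / n)

canopy_cover = [40.0, 55.0, 70.0, 82.0]  # hypothetical observed CC (%)
forecast     = [38.0, 58.0, 66.0, 85.0]  # hypothetical forecast (%)
print(f"RRMSE = {100 * rrmse(canopy_cover, forecast):.1f}%")
```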

Terrain-aware path planning via semantic segmentation and uncertainty rejection filter with adversarial noise for mobile robots

Kangneoung Lee, Kiju Lee

Journal of Field Robotics, 42(1): 287-301

January 2025

In ground mobile robots, effective path planning relies on their ability to assess the types and conditions of the surrounding terrains. Neural network-based methods, which primarily use visual images for terrain classification, are commonly employed for this purpose. However, the reliability of these models can vary due to inherent discrepancies between the training images and the actual environment, leading to erroneous classifications and operational failures. Retraining models with additional images from the actual operating environment may enhance performance, but obtaining these images is often impractical or impossible. Moreover, retraining requires substantial offline processing, which cannot be performed online by the robot within an embedded processor. To address this issue, this paper proposes a neural network-based terrain classification model, trained using an existing data set, with a novel uncertainty rejection filter (URF) for terrain-aware path planning of mobile robots operating in unknown environments. A robot, equipped with a pretrained model, initially collects a small number of images (10 in this work) from its current environment to set the target uncertainty ratio of the URF. The URF then dynamically adjusts its sensitivity parameters to identify uncertain regions and assign associated traversal costs. This process occurs entirely online, without the need for offline procedures. The presented method was evaluated through simulations and physical experiments, comparing the point-to-point trajectories of a mobile robot equipped with (1) the neural network-based terrain classification model combined with the presented adaptive URF, (2) the classification model without the URF, and (3) the classification model combined with a nonadaptive version of the URF. 
Path planning performance was measured using the Hausdorff distance between the desired and actual trajectories; the results revealed that the adaptive URF significantly improved performance in both simulations and physical experiments (conducted 10 times for each setting). Statistical analysis via t-tests confirmed the significance of these results.
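The Hausdorff distance used as the performance measure can be computed between two trajectories sampled as point sets. A small self-contained sketch with illustrative coordinates:

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two point sets (n x d arrays)."""
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)  # pairwise distances
    return max(D.min(axis=1).max(), D.min(axis=0).max())

desired = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])  # illustrative waypoints
actual  = np.array([[0.0, 0.2], [1.0, 0.3], [2.0, 0.1]])
print(hausdorff(desired, actual))
```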

Techniques for Canopy to Organ Level Plant Feature Extraction via Remote and Proximal Sensing: A Survey and Experiments

Prasad Nethala, Dugan Um, Neha Vemula, Oscar Fernandez Montero, Kiju Lee, Mahendra Bhandari

Remote Sensing, 16(23), 4370

2024

This paper presents an extensive review of techniques for plant feature extraction and segmentation, addressing the growing need for efficient plant phenotyping, which is increasingly recognized as a critical application for remote sensing in agriculture. As understanding and quantifying plant structures become essential for advancing precision agriculture and crop management, this survey explores a range of methodologies, both traditional and cutting-edge, for extracting features from plant images and point cloud data, as well as segmenting plant organs. The importance of accurate plant phenotyping in remote sensing is underscored, given its role in improving crop monitoring, yield prediction, and stress detection. The review highlights the challenges posed by complex plant morphologies and data noise, evaluating the performance of various techniques and emphasizing their strengths and limitations. The insights from this survey offer valuable guidance for researchers and practitioners in plant phenotyping, advancing the fields of plant science and agriculture. The experimental section focuses on three key tasks: 3D point cloud generation, 2D image-based feature extraction, and 3D shape classification, feature extraction, and segmentation. Comparative results are presented using collected plant data and several publicly available datasets, along with insightful observations and inspiring directions for future research.

CLAW: Cycloidal Legs-Augmented Wheels for Stair and Obstacle Climbing in Mobile Robots

Yuan Wei, Kiju Lee

IEEE/ASME Transactions on Mechatronics, 30(2): 1536-1546

August 22, 2024

In this article, we introduce a cycloidal legs-augmented wheel (CLAW), a novel wheel-leg mechanism for stair and obstacle climbing. CLAW integrates three leg segments with a wheel using a specialized passive bar mechanism inspired by a cycloidal rotor design. These legs extend outward to reach heights from the front and retract back within the wheel boundary as they approach the ground. This unique mechanism enables the legs to generate cycloidal trajectories as the wheel rotates without additional actuators, thereby preserving the operation and control simplicity of conventional wheeled robots while significantly improving climbing abilities. Experimental tests with CLAWbot, a fixed-axis four-wheeled robot equipped with the CLAW mechanisms, demonstrated obstacle climbing up to 2.6 times the wheel radius and reliable traversing of different staircases, obstacles, and diverse terrains, such as a concrete floor, grass, a ramp, and rocky surfaces. It also exhibited smooth trajectories during turns and climbing.
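The cycloidal trajectories referenced above follow the standard cycloid parametrization for a point on the rim of a wheel rolling on flat ground; the sketch below uses an illustrative wheel radius, and the actual CLAW leg kinematics are more involved:

```python
import math

def cycloid_point(r, theta):
    """Point traced by a rim point of a wheel of radius r rolling on flat
    ground, after the wheel has rotated by theta radians (standard cycloid)."""
    x = r * (theta - math.sin(theta))
    y = r * (1.0 - math.cos(theta))
    return x, y

# Height peaks at 2r when theta = pi (rim point at the top of the wheel)
x, y = cycloid_point(0.1, math.pi)  # hypothetical 0.1 m wheel radius
print(f"x = {x:.4f} m, y = {y:.4f} m")
```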

Autonomous field navigation of mobile robots for data collection and monitoring in agricultural crop fields

Yuan Wei, Kangneoung Lee, Kiju Lee

2024 21st International Conference on Ubiquitous Robots (UR)

June 24-27, 2024

This paper introduces a mobile ground robot designed for autonomous navigation and data collection in agricultural fields, utilizing precise localization through an extended Kalman filter (EKF) that integrates data from GPS, an inertial measurement unit (IMU), and wheel encoders. We propose a novel method based on an artificial electric potential field (AEPF) for reliable and autonomous navigation in these robots. Implemented on a four-wheeled robot, our experiments showed that AEPF-based navigation processed data more quickly than the traditional Nav2 local path planner. Additionally, the robot reliably collected RGB and depth images while navigating crop rows, highlighting the method’s effectiveness and its potential for extensive applications in autonomous crop monitoring. A graphical user interface was also developed to enable users to define target areas, assign tasks, and monitor the robot’s performance in real time.
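The abstract does not detail the AEPF formulation; as a generic point of reference, a classic attractive-plus-repulsive potential field velocity command (a Khatib-style stand-in with hypothetical gains and positions, not the paper's method) looks like:

```python
import numpy as np

def apf_velocity(pos, goal, obstacles, k_att=1.0, k_rep=0.5, d0=1.0):
    """Velocity command from an attractive-plus-repulsive potential field.

    The attractive term pulls toward the goal; each obstacle within
    influence radius d0 adds a repulsive term pushing the robot away.
    """
    v = k_att * (goal - pos)                           # attractive gradient
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0 < d < d0:                                 # inside influence radius
            v += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    return v

pos = np.array([0.0, 0.0])
goal = np.array([5.0, 0.0])
obstacles = [np.array([0.5, 0.3])]                     # hypothetical obstacle
print(apf_velocity(pos, goal, obstacles))
```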

Mixed reality-based teleoperation of mobile robotic arm: system apparatus and experimental case study

Annalisa Jarecki, Kiju Lee

2024 21st International Conference on Ubiquitous Robots (UR)

June 24-27, 2024

This paper presents a system apparatus for supporting the remote operation of a mobile robot arm using a mixed reality (MR)-based user interface (UI). The presented system is based on Robot Operating System 2, utilizing newly developed and existing software packages for gesture-based control of the mobile base and the robotic arm. An experimental case study was designed to evaluate the system-level integration and usability. The case study involved seven participants completing a simple sequence of remote operation tasks using two different UI modalities, the MR device and a conventional computer interface (i.e., a 2D display and a keyboard). The results showed that the MR-based UI might be perceived by participants as more intuitive than the conventional control interfaces, while some limitations, such as gesture sensitivity and increased task load due to unfamiliarity, were also identified.



© 2016–2026 Adaptive Robotics & Technology Lab
