Fig. 1 from the paper. The trajectories of the new method, TerraPN (green, yellow, and red, corresponding to fast, intermediate, and slow speeds, respectively), versus previous methods: DWA [11] (blue), TERP [1] (orange), OCRNet-based (pink), and PSPNet-based (purple) navigation schemes in unstructured terrain. TerraPN's trajectories keep the robot on smooth surfaces (asphalt) as much as possible at high velocities and adapt the velocity to different surfaces based on their navigability costs. Other methods drive the robot directly toward its goal at maximum velocity. In some cases (PSPNet-based), the trajectories wander because the segmentation falters when the surface becomes too unstructured.


Autonomous robots are currently being used for food and grocery delivery, agriculture, surveillance, and planetary exploration. A robot’s ability to navigate depends in part on the geometry and surface properties of the terrain it encounters. For example, a surface’s texture influences the kind of traction the robot will be able to attain, while its bumpiness determines the vibrations the robot experiences. Whether or not a robot might get stuck in sand or mud depends on a terrain property called “deformability.” That’s why being able to both perceive and learn about surface properties is important for robots’ smooth navigation.

Recently, advances in computer vision and other areas have made better terrain navigation possible. Now, a team of students and faculty in the University of Maryland's GAMMA Lab has developed a new way for autonomous robots to improve their navigating abilities by learning the surface characteristics of complex outdoor terrains. "TerraPN: Unstructured Terrain Navigation through Online Self-Supervised Learning" was developed by Adarsh Jagan Sathyamoorthy, Kasun Weerakoon, Tianrui Guan, Jing Liang and ISR-affiliated Professor Dinesh Manocha (CS/ECE/UMIACS).

TerraPN is a computationally light, online, self-supervised, learning-based method to compute a surface navigability cost map. It trains a neural network to learn a terrain's surface properties, then computes a robot-specific 2D navigability cost map for the terrain. The method uses RGB image patches cropped from a full-sized image and the robot's velocities as inputs, and processed 6-DOF IMU measurements and odometry errors as labels. The predicted cost map is a concatenation of n×n patches of costs corresponding to different input RGB patches.
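The patch-to-cost-map assembly described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `predict_patch_cost` is a hypothetical stand-in for TerraPN's learned network (which would take an RGB patch and the robot's velocity, trained online against IMU and odometry-error labels), and here it simply uses mean image intensity as a proxy cost.

```python
import numpy as np

def predict_patch_cost(patch, velocity):
    # Stand-in for the learned network: in TerraPN this would be a small
    # CNN conditioned on the robot's velocity and trained online against
    # processed 6-DOF IMU measurements and odometry errors.
    # Here, mean intensity serves as an illustrative surface cost.
    return float(patch.mean()) / 255.0

def build_cost_map(image, patch_size, velocity):
    """Tile the image into fixed-size patches, predict one cost per
    patch, and concatenate the per-patch costs into a 2D cost map."""
    h, w = image.shape[:2]
    rows, cols = h // patch_size, w // patch_size
    cost_map = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = image[i * patch_size:(i + 1) * patch_size,
                          j * patch_size:(j + 1) * patch_size]
            cost_map[i, j] = predict_patch_cost(patch, velocity)
    return cost_map

# Synthetic 120x160 grayscale "terrain image": dark (smooth) left half,
# bright (rough) right half.
img = np.zeros((120, 160), dtype=np.uint8)
img[:, 80:] = 200
cmap = build_cost_map(img, patch_size=40, velocity=(0.5, 0.0))
print(cmap.shape)  # (3, 4)
```

In the resulting map, patches over the dark half receive low costs and patches over the bright half receive high costs, mirroring how TerraPN assigns low costs to asphalt and higher costs to rougher surfaces.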

TerraPN learns to predict navigability costs for five different surfaces in approximately 20 minutes, compared to the 3-4 hours of training required by previous scene segmentation methods, and it also has a lower inference time. TerraPN additionally outperforms previous works in terms of vibration costs and generates robot velocities suitable for different surfaces.

TerraPN includes a new extension to the Dynamic-Window Approach (DWA-O) that accounts for a surface's navigability cost while computing robot trajectories. DWA-O favors velocities that incur low surface navigability costs for the robot, leading to smoother trajectories and a reduction in the vibration cost the robot experiences.
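The idea of folding a navigability term into a DWA-style objective can be sketched as below. The weights and the exact form of the objective are illustrative assumptions, not the paper's formulation; `dwa_o_score` and `pick_velocity` are hypothetical names. The point is that a candidate velocity crossing high-cost terrain is penalized even if it heads more directly toward the goal.

```python
import math

def dwa_o_score(v, w, goal_heading_err, clearance, nav_cost,
                alpha=1.0, beta=0.5, gamma=0.2, delta=1.5):
    """Illustrative DWA-style objective extended with a surface
    navigability term. Smaller heading error, larger obstacle
    clearance, and higher speed raise the score; the nav_cost term
    (assumed here, weighted by delta) penalizes rough surfaces."""
    return (alpha * (math.pi - abs(goal_heading_err))
            + beta * clearance
            + gamma * v
            - delta * nav_cost)

def pick_velocity(candidates):
    # candidates: list of (v, w, heading_err, clearance, nav_cost)
    # tuples sampled from the robot's dynamic window.
    return max(candidates, key=lambda c: dwa_o_score(*c))

# Two candidates with equal clearance: one aims straight at the goal
# but crosses rough terrain (nav_cost 0.9); the other detours slightly
# but stays on asphalt (nav_cost 0.1).
rough = (1.0, 0.0, 0.1, 2.0, 0.9)
smooth = (1.0, 0.2, 0.3, 2.0, 0.1)
best = pick_velocity([rough, smooth])
print(best == smooth)  # True
```

With these weights, the smoother-surface candidate wins despite its larger heading error, which is the qualitative behavior Fig. 1 shows: TerraPN detours onto asphalt rather than cutting straight across rough ground.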

More information is available at the GAMMA Lab website. A video demonstrating TerraPN is also available.




April 20, 2022

