Autonomous mobile robot navigation in sparse lidar feature environments

Phuc Thanh Thien Nguyen, Shao Wei Yan, Jia Fu Liao, Chung Hsien Kuo*

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

17 Scopus citations


In industrial environments, Autonomous Guided Vehicles (AGVs) generally run on a planned route. Among trajectory-tracking algorithms for unmanned vehicles, the Pure Pursuit (PP) algorithm is prevalent in many real-world applications because of its simple and easy implementation. However, it is challenging to decelerate the AGV appropriately when turning on a path with large curvature. Moreover, this paper addresses the kidnapped-robot problem occurring in sparse LiDAR environments. This paper proposes an improved Pure Pursuit algorithm that lets the AGV predict the trajectory and decelerate before turning, thus increasing the accuracy of path tracking. To solve the kidnapped-robot problem, we use a learning-based classifier that detects repetitive-pattern scenarios (e.g., long corridors) from 2D LiDAR features and switches the localization system between a Simultaneous Localization And Mapping (SLAM) method and an odometry method. Experimental results in practice show that the improved Pure Pursuit algorithm reduces the tracking error while performing more efficiently. Moreover, the learning-based localization selection strategy helps the robot navigation task achieve stable performance, with a completion rate 36.25% higher than using SLAM alone. The results demonstrate that the proposed method is feasible and reliable under actual conditions.
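To illustrate the idea of curvature-dependent deceleration described above, the following is a minimal sketch of one pure-pursuit control step. The function name, the deceleration law `v = v_max / (1 + k_dec * |kappa|)`, and all parameter names are illustrative assumptions, not the paper's actual formulation:

```python
import math

def pure_pursuit_step(pose, lookahead_pt, v_max=1.0, k_dec=2.0):
    """One pure-pursuit control step with curvature-based deceleration.

    pose:         (x, y, heading) of the robot in the world frame.
    lookahead_pt: (x, y) point on the path, one lookahead distance ahead.
    Returns (v, omega): forward and angular velocity commands.
    """
    x, y, th = pose
    gx, gy = lookahead_pt
    # Transform the lookahead point into the robot frame.
    dx, dy = gx - x, gy - y
    xl = math.cos(th) * dx + math.sin(th) * dy    # forward offset
    yl = -math.sin(th) * dx + math.cos(th) * dy   # lateral offset
    L2 = xl * xl + yl * yl                        # squared lookahead distance
    if L2 < 1e-9:
        return 0.0, 0.0                           # already at the point
    # Classic pure-pursuit curvature of the arc through the lookahead point.
    kappa = 2.0 * yl / L2
    # Illustrative deceleration: slow down on tighter curves so the
    # tracking error shrinks in turns (assumed law, not the paper's).
    v = v_max / (1.0 + k_dec * abs(kappa))
    omega = v * kappa
    return v, omega
```

For example, a lookahead point straight ahead yields zero curvature and full speed, while a point offset to the side yields nonzero curvature and a reduced speed command.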

Original language: English
Article number: 5963
Journal: Applied Sciences (Switzerland)
Issue number: 13
State: Published - 01 07 2021
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2021 by the authors. Licensee MDPI, Basel, Switzerland.


Keywords

  • Deep learning
  • Path planning
  • Pure pursuit controller
  • Robot kidnapping detection
  • Trajectory tracking


