Hard-to-Detect Obstacle Mapping by Fusing LIDAR and Depth Camera

Sidharth Jeyabal, W. K.R. Sachinthana, S. M. Bhagya, P. Samarakoon, Mohan Rajesh Elara, Bing J. Sheu

Research output: Contribution to journal › Journal article › peer-review


In the era of autonomy, intelligent systems capable of navigating and perceiving their surroundings have become ubiquitous. Many sensors have been developed for environmental perception, with LIDAR emerging as a preeminent technology for precise obstacle detection. However, LIDAR has inherent limitations: it cannot detect obstacles that lie below its scanning plane or that its beams pass through. Typical robot deployment environments often contain such obstacles, which can cause collisions and entanglements, degrading robot performance. This research addresses these limitations by recognizing obstacles that traditionally challenge LIDAR’s detection capabilities. Objects such as glass, carpets, wires, and ramps are identified as Hard-to-Detect Objects by LIDAR (HDOL). YOLOv8 is used to detect HDOL with a depth camera, and the detected HDOL objects are incorporated into the environmental map, circumventing the constraints posed by LIDAR. Further, HDOL-aware coverage path planning is proposed using boustrophedon motion with the A* algorithm to navigate the robot safely in an environment. Real-world experiments validate the applicability of the proposed method for ensuring robot safety.
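The abstract's core idea — injecting camera-detected HDOL objects into the map so the planner routes around obstacles LIDAR misses — can be illustrated at grid level. The sketch below is not the authors' pipeline: the detection format (hypothetical `(row, col, radius)` cells) and the 4-connected A* with a Manhattan heuristic are illustrative assumptions standing in for the paper's actual fusion and planning stages.

```python
import heapq

def mark_hdol(grid, detections):
    """Write HDOL detections into an occupancy grid (0 = free, 1 = occupied).
    Each detection is a hypothetical (row, col, radius) tuple, e.g. a glass
    panel projected from the depth camera; LIDAR alone would leave it free."""
    rows, cols = len(grid), len(grid[0])
    for r, c, rad in detections:
        for dr in range(-rad, rad + 1):
            for dc in range(-rad, rad + 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    grid[rr][cc] = 1
    return grid

def astar(grid, start, goal):
    """4-connected A* with a Manhattan-distance heuristic over the grid.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came = {}
    best_g = {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came:          # lazy deletion: skip stale queue entries
            continue
        came[cur] = parent
        if cur == goal:          # reconstruct path by walking parents back
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None

# A LIDAR-blind obstacle (e.g. glass at cell (2, 2)) is added by the
# camera stage, and the planner detours around it.
grid = [[0] * 5 for _ in range(5)]
mark_hdol(grid, [(2, 2, 0)])
path = astar(grid, (0, 0), (4, 4))
```

In the full method, boustrophedon motion would generate the coverage sweep and A* would connect sweep segments; only the obstacle-aware grid search is sketched here.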

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Sensors Journal
State: Accepted/In press - 2024


  • Cameras
  • coverage path planning
  • Glass
  • Laser radar
  • mapping
  • Navigation
  • obstacle detection
  • robot safety
  • Robots
  • Sensor fusion
  • Sensors


