Abstract
In the era of autonomy, intelligent systems capable of navigating and perceiving their surroundings have become ubiquitous. Many sensors have been developed for environmental perception, with LIDAR emerging as a preeminent technology for precise obstacle detection. However, LIDAR has inherent limitations: it cannot detect obstacles located below its mounting height or obstacles that its rays pass through. Typical robot deployment environments often contain such obstacles, which can cause collisions and entanglements, leading to performance degradation. This research addresses these limitations by recognizing obstacles that traditionally challenge LIDAR's detection capabilities. Objects such as glass, carpets, wires, and ramps are identified as hard-to-detect objects by LIDAR (HDOL). YOLOv8 is used to detect HDOL with a depth camera, and the detected HDOL objects are incorporated into the environmental map, circumventing the constraints posed by LIDAR. Furthermore, HDOL-aware coverage path planning (CPP) is proposed, combining boustrophedon motion with an A∗ algorithm to navigate the robot safely through the environment. Real-world experiments validate the applicability of the proposed method for ensuring robot safety.
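The abstract's planning step, fusing HDOL detections into the map and routing around them with A∗, can be illustrated with a minimal sketch. This is not the paper's implementation: the 4-connected occupancy grid, the obstacle coordinates standing in for a rasterized HDOL detection (e.g. a glass panel the LIDAR missed), and the unit step cost are all illustrative assumptions.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None
    if the goal is unreachable. Uses a Manhattan-distance heuristic,
    which is admissible for 4-connected motion with unit step cost.
    """
    rows, cols = len(grid), len(grid[0])

    def h(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_heap = [(h(start, goal), start)]   # (f = g + h, cell)
    g_cost = {start: 0}
    came_from = {start: None}

    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:
            # Reconstruct the path by walking parent links back to start.
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt, goal), nxt))
    return None

# Free 4x4 grid; then mark a hypothetical HDOL detection as occupied
# so the planner routes around cells the LIDAR alone would not flag.
grid = [[0] * 4 for _ in range(4)]
grid[1][1] = grid[1][2] = 1  # fused HDOL obstacle cells
path = astar(grid, (0, 0), (3, 3))
```

In a full HDOL-aware CPP pipeline, the boustrophedon sweep would generate the coverage waypoints and a planner of this kind would connect them while avoiding the fused obstacle cells.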
| Original language | English |
|---|---|
| Pages (from-to) | 24690-24698 |
| Number of pages | 9 |
| Journal | IEEE Sensors Journal |
| Volume | 24 |
| Issue number | 15 |
| DOIs | |
| State | Published - 2024 |
Bibliographical note
Publisher Copyright: © 2001-2012 IEEE.
Keywords
- Cameras
- Coverage path planning (CPP)
- Glass
- Laser radar
- Mapping
- Navigation
- Obstacle detection
- Robot safety
- Robots
- Sensor fusion
- Sensors