Robotics

How mobile robots can navigate using 3D LiDAR technology

30th April 2024
Harry Fowle

Demand for mobile robotic systems that can operate seamlessly day and night to high safety standards is growing across diverse fields, spanning logistical operations, autonomous delivery services, and comprehensive site inspections. Regardless of the specific application, a fundamental prerequisite for a mobile robot to execute its mission effectively is precise, real-time awareness of its position within its operating environment.

This article originally appeared in the March'24 magazine issue of Electronic Specifier Design – see ES's Magazine Archives for more featured publications.

Localising a mobile robot to the high degree of accuracy required for guidance and intricate manoeuvres is a challenging problem. Relying solely on exteroceptive positioning solutions such as the Global Navigation Satellite System (GNSS) proves unreliable wherever the sky is occluded, notably in urban environments dominated by towering buildings and on industrial sites featuring obstructions such as cranes. In light of these challenges, the integration of vision sensors becomes imperative.

Positioning with vision sensors

Vision sensors such as cameras and LiDARs play a pivotal role in the positioning of mobile robots. Employed independently, these sensors allow the relative motion from an arbitrary starting position to be computed, and they can also be fused with inertial measurement units (IMUs). They commonly rely on Simultaneous Localisation and Mapping (SLAM) algorithms with an Extended Kalman Filter (EKF) backend. The output position derived from these methods, commonly referred to as odometry, tends to drift over time. This drift necessitates fusion algorithms that incorporate GNSS measurements to constrain the output position and give it global consistency.
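
As an illustration of the fusion step described above, here is a minimal sketch of correcting drifting odometry with intermittent GNSS fixes using a position-only Kalman filter. The class name, state layout, and noise values are illustrative assumptions rather than any particular vendor's implementation.

```python
# Minimal sketch: constraining drifting odometry with GNSS fixes via a
# position-only Kalman filter. Noise values and structure are illustrative
# assumptions, not a production fusion stack.
import numpy as np

class OdometryGnssFusion:
    def __init__(self, q_odom=0.05, r_gnss=1.0):
        self.x = np.zeros(2)          # estimated position (x, y) in metres
        self.P = np.eye(2) * 1e-3     # position covariance
        self.Q = np.eye(2) * q_odom   # odometry (process) noise per step
        self.R = np.eye(2) * r_gnss   # GNSS (measurement) noise

    def predict(self, odom_delta):
        """Propagate the estimate with a relative odometry increment."""
        self.x += odom_delta
        self.P += self.Q              # uncertainty grows -> this is the drift

    def update(self, gnss_xy):
        """Correct the estimate with an absolute GNSS position fix."""
        S = self.P + self.R                       # innovation covariance
        K = self.P @ np.linalg.inv(S)             # Kalman gain
        self.x += K @ (gnss_xy - self.x)          # pull estimate towards GNSS
        self.P = (np.eye(2) - K) @ self.P         # shrink uncertainty

fusion = OdometryGnssFusion()
for step in range(100):
    fusion.predict(np.array([0.10, 0.0]))         # odometry: 10 cm forward
    if step % 10 == 0:                            # sparse GNSS fixes
        fusion.update(np.array([0.10 * (step + 1), 0.0]))
print(fusion.x, np.diag(fusion.P))
```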

However, the fused output then inherits the same weaknesses as GNSS-based solutions.

An innovative approach to computing the absolute position of a system with precision, without depending on unreliable GNSS measurements, is to use a 3D map of the operating environment. This 3D map serves as a robust geometrical reference against which the real-time measurements from a 3D LiDAR mounted on the robot are registered, allowing its position to be computed with an accuracy better than 2cm.
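
As a rough sketch of this map-based localisation idea, the snippet below registers a live LiDAR scan against a pre-built reference map using Open3D's point-to-plane ICP. The file names, voxel size, and correspondence threshold are placeholders; a real system would crop the map around the current pose estimate and run a far more optimised pipeline at sensor rate.

```python
# Rough sketch: localising a live LiDAR scan inside a pre-built 3D map with
# Open3D ICP. File names, voxel sizes and thresholds are placeholders.
import numpy as np
import open3d as o3d

map_cloud = o3d.io.read_point_cloud("site_map.pcd")    # prior 3D map (assumed file)
scan = o3d.io.read_point_cloud("live_scan.pcd")        # current LiDAR frame (assumed file)

# Downsample to keep registration tractable; normals are needed for point-to-plane ICP.
map_ds = map_cloud.voxel_down_sample(voxel_size=0.10)
scan_ds = scan.voxel_down_sample(voxel_size=0.10)
map_ds.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=0.5, max_nn=30))

# Initial guess, e.g. the previous pose or a GNSS-derived prior.
init_pose = np.eye(4)

result = o3d.pipelines.registration.registration_icp(
    scan_ds, map_ds,
    max_correspondence_distance=0.5,
    init=init_pose,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane(),
)
print("Robot pose in map frame:\n", result.transformation)
print("Fitness:", result.fitness, "RMSE:", result.inlier_rmse)
```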

Creating 3D maps of the operating area of a mobile robot

The creation of 3D maps tailored for the operational needs of mobile robots presents two primary challenges. The first revolves around achieving local accuracy, mandating that the 3D map be highly accurate within a local context (<2cm). This precision is crucial for ensuring the exact positioning of the system within the map. The second challenge is attaining global accuracy, a requisite for guiding the robot along its designated trajectory, especially in autonomous applications. To meet this need, the 3D map must exhibit global accuracy of less than 5cm, allowing it to be used redundantly with GNSS-based solutions.

Existing methodologies for generating such high-precision maps include static scanners and mobile mapping systems. Static scanners excel at creating accurate maps, but they scale poorly and the process is time-consuming. Mobile mapping systems, while accurate, are prohibitively expensive and lack global consistency, particularly at intersections, where they are unable to enforce 'loop closure' constraints. Recent advancements have introduced innovative solutions to these challenges. Notably, Exwayz, a French deeptech startup, has pioneered the Exwayz 3D Mapping software, which harnesses 3D LiDAR data and GNSS measurements in post-processing to create city-scale maps.

Exwayz 3D Mapping incorporates cutting-edge innovations in 3D vision technology, delivering local accuracy through SLAM and efficient loop closure algorithms, and global accuracy through post-processing trajectory optimisation. This approach seamlessly integrates local LiDAR SLAM measurements with global GNSS measurements, offering a holistic solution to the challenges of mapping for mobile robotic systems.
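
To give a flavour of what post-processing trajectory optimisation means, the toy example below adjusts a drifting relative trajectory so that it remains consistent with both its own odometry constraints and a handful of absolute GNSS fixes, using a simple least-squares formulation. It is deliberately simplified to 2D positions with no rotations or loop closures, and the weights and noise levels are assumptions.

```python
# Toy illustration of post-processing trajectory optimisation: adjust a drifting
# SLAM trajectory so it stays consistent with its own relative motion AND with
# sparse absolute GNSS fixes. Simplified to 2D positions (no rotations).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n = 200
true_traj = np.cumsum(np.tile([1.0, 0.2], (n, 1)), axis=0)           # ground-truth path
odom = np.diff(true_traj, axis=0) + rng.normal(0, 0.05, (n - 1, 2))  # drifting relative motion
gnss_idx = np.arange(0, n, 25)                                       # sparse GNSS fixes
gnss = true_traj[gnss_idx] + rng.normal(0, 0.5, (len(gnss_idx), 2))

def residuals(flat):
    traj = flat.reshape(n, 2)
    # Relative (SLAM/odometry) constraints between consecutive poses, weighted tightly.
    rel = ((traj[1:] - traj[:-1]) - odom) / 0.05
    # Absolute (GNSS) constraints at the fix indices, weighted by GNSS noise.
    absolute = (traj[gnss_idx] - gnss) / 0.5
    return np.concatenate([rel.ravel(), absolute.ravel()])

init = np.vstack([gnss[0], gnss[0] + np.cumsum(odom, axis=0)])       # dead-reckoned initial guess
sol = least_squares(residuals, init.ravel())
optimised = sol.x.reshape(n, 2)
print("Mean error before:", np.linalg.norm(init - true_traj, axis=1).mean())
print("Mean error after: ", np.linalg.norm(optimised - true_traj, axis=1).mean())
```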

Navigating in a 3D map

Navigating within a 3D map using a 3D LiDAR introduces complexities stemming primarily from the expansive size of the maps, which often cover dozens of kilometres and contain vast amounts of geometric data. Additional challenges arise from the potential disparities between the objects measured by the LiDAR and the geometry encoded in the map, particularly in the presence of non-static objects such as parked cars, moving pedestrians, trucks, buses, and industrial site elements like cranes or containers.

Further complexity is introduced by the sheer quantity of 3D data measured by contemporary 3D LiDARs. This data must be processed rapidly, more than 20 times per second, to provide the robot with frequent estimates of its position within the environment. The positioning itself is achieved through registration algorithms, whose primary objective is to align the current observations from the 3D LiDAR with the 3D geometry encoded in the map.

This alignment is typically accomplished through feature matching and the Iterative Closest Point (ICP) algorithm, a family of methods that iteratively minimise the point-to-point distance between the LiDAR measurements and the map. In response to these challenges, Exwayz has developed Exwayz SLAM, a software solution compatible with all LiDAR sensors available on the market. Used in conjunction with Exwayz 3D Mapping, it computes the position of a 3D LiDAR within a map with a precision better than 2cm. This technology unlocks a myriad of autonomous applications, particularly in logistics, where precise navigation and positioning are paramount.
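
To make the ICP family described above concrete, the fragment below is a bare-bones point-to-point ICP loop built on a k-d tree for correspondence search and an SVD-based rigid alignment step. It is a didactic sketch, not the production registration used in any commercial SLAM product.

```python
# Bare-bones point-to-point ICP: repeatedly match each scan point to its nearest
# map point, then solve for the rigid transform that minimises the matched
# point-to-point distances. Didactic sketch only; no outlier rejection.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """SVD solution of the rigid transform mapping src onto dst (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(scan, map_pts, iters=30):
    tree = cKDTree(map_pts)             # nearest-neighbour search in the map
    R_total, t_total = np.eye(3), np.zeros(3)
    current = scan.copy()
    for _ in range(iters):
        _, idx = tree.query(current)    # match each scan point to closest map point
        R, t = best_rigid_transform(current, map_pts[idx])
        current = current @ R.T + t     # apply the incremental correction
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total             # scan-to-map pose

# Tiny synthetic check: recover a known offset between scan and map.
map_pts = np.random.default_rng(1).uniform(-5, 5, (2000, 3))
scan = map_pts[:500] + np.array([0.3, -0.2, 0.05])
R, t = icp(scan, map_pts)
print("Recovered translation:", -t)     # should be close to the applied offset
```

Production registration pipelines typically replace the point-to-point cost with point-to-plane or generalised variants and add outlier rejection, which converge faster on structured environments.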

As autonomous systems rapidly integrate into urban landscapes, ensuring safety becomes a critical priority. The democratisation of LiDAR sensors represents a revolutionary leap forward in robotics. The data generated by LiDAR sensors can be utilised not only for perception but also for navigation, providing a comprehensive solution for enhancing the safety and efficiency of robotics systems in diverse applications.
