CN117554990A - Laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof


Info

Publication number
CN117554990A
Authority
CN
China
Prior art keywords
information
pose
slam
laser
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210930399.8A
Other languages
Chinese (zh)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Hydrogen Source Intelligent Technology Co ltd
Original Assignee
Beijing Hydrogen Source Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Hydrogen Source Intelligent Technology Co ltd
Priority to CN202210930399.8A
Publication of CN117554990A
Legal status: Pending


Classifications

    • G PHYSICS
      • G01 MEASURING; TESTING
        • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
          • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
            • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
            • G01S 17/88 Lidar systems specially adapted for specific applications
              • G01S 17/93 for anti-collision purposes
                • G01S 17/933 of aircraft or spacecraft
        • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
          • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
            • G01C 21/10 by using measurements of speed or acceleration
              • G01C 21/12 executed aboard the object being navigated; Dead reckoning
                • G01C 21/16 by integrating acceleration or speed, i.e. inertial navigation
                  • G01C 21/165 combined with non-inertial navigation instruments
                    • G01C 21/1652 with ranging devices, e.g. LIDAR or RADAR
                    • G01C 21/1656 with passive imaging devices, e.g. cameras
            • G01C 21/38 Electronic maps specially adapted for navigation; Updating thereof
              • G01C 21/3804 Creation or updating of map data
                • G01C 21/3833 characterised by the source of data
                  • G01C 21/3841 Data obtained from two or more sources, e.g. probe vehicles
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
      • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
        • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
          • Y02T 10/00 Road transport of goods or passengers
            • Y02T 10/10 Internal combustion engine [ICE] based vehicles
              • Y02T 10/40 Engine management systems

Abstract

The invention discloses a laser radar SLAM (simultaneous localization and mapping) positioning navigation method and an unmanned aerial vehicle system thereof. In the method, a laser information acquisition unit acquires information for a three-dimensional laser SLAM system; a laser information processing unit filters and preprocesses the acquired information and performs resolving processing on the filtered and preprocessed data; the resulting motion pose information is fused with three-dimensional map model information to generate the current pose and motion information; and a flight control unit acquires position information designated by a user, generates an optimal route according to the user-designated position information, the map updating information and the fused pose and motion information, and displays and controls the positioning and navigation of the unmanned aerial vehicle. The method and the system provided by the invention can fly indoors or under any condition without satellite GPS signal reception, are safe and reliable, are not easily disturbed in flight by wireless-signal interference, and can fly autonomously without a communication connection.

Description

Laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof
Technical Field
The invention belongs to the technical field of unmanned aerial vehicles, and particularly relates to a laser radar SLAM positioning navigation method and an unmanned aerial vehicle system thereof.
Background
Unmanned aircraft, commonly called unmanned aerial vehicles, first appeared at the beginning of the 20th century and were subsequently developed to meet military needs. An unmanned aerial vehicle is operated by a radio remote control device and its own program control device, or flies fully or intermittently autonomously under an onboard computer. Because of its maneuverability, low dependence on ground terrain, low manufacturing cost and absence of casualties, the unmanned aerial vehicle is widely applied in many fields such as post-disaster search and rescue, aerial photography, crop monitoring and military operations. Different fields place different requirements on the unmanned aerial vehicle, for example requiring it to perform complex tasks such as positioning, obstacle avoidance, tracking and trajectory planning.
In recent years, the quadrotor unmanned aerial vehicle, with its low cost and simple operation, has become an important research direction in the unmanned aerial vehicle industry. Its application as a flying robot increasingly requires high-precision, intelligent and autonomous flight, and the basis for these capabilities is accurate state estimation and environment sensing. At present, GPS/RTK is commonly used for positioning various types of aircraft; however, in some cases, such as indoor environments or crowded areas, GPS/RTK accuracy degrades or even fails and is insufficient to support autonomous flight of the unmanned aerial vehicle. Fixed-point hovering that relies on satellite navigation has a non-negligible error, and stable hovering within a fixed area is difficult to achieve. If, however, a binocular vision system is carried on the unmanned aerial vehicle, autonomous control can be achieved without external signal transmission.
Most current unmanned aerial vehicles depend on GPS/RTK signals for flight, but GPS/RTK signals cannot be obtained in indoor environments, so enabling obstacle-avoidance flight when the GPS/RTK signal is lost remains a main difficulty and a major challenge for flight safety. Research and development of autonomous flight for multi-rotor unmanned aerial vehicle flight systems is therefore of great significance.
Disclosure of Invention
The embodiments of the invention provide a laser radar SLAM positioning navigation method and an unmanned aerial vehicle system thereof, with which an unmanned aerial vehicle can still perform obstacle-avoidance flight without depending on GPS/RTK signals.
In one aspect, an embodiment of the invention provides a method for positioning and navigating an unmanned aerial vehicle based on laser radar SLAM, which comprises the following steps:
(1) Acquiring IMU pose and motion information of the current frame acquired by the IMU unit, and acquiring point cloud data information of the current frame acquired by the three-dimensional laser radar;
(2) Filtering and preprocessing the acquired point cloud data information;
(3) The filtered point cloud data is resolved to obtain the pose and motion information of the laser radar of the current frame and the local map information of the surrounding environment of the current frame in the laser SLAM system;
(4) Updating the map in the laser SLAM system according to the local map information, and fusing all the local maps to generate a three-dimensional global map;
(5) Combining the pose and the motion information of the laser radar of the previous frame with the pose and the motion information of the laser radar of the current frame to generate a first motion track;
(6) Generating a second motion trail by combining the IMU pose and the motion information of the previous frame and the IMU pose and the motion information of the current frame;
(7) Comparing the first motion track with the second motion track to obtain a comparison error, and comparing the comparison error with a laser SLAM calibration error preset by a system;
(8) Fusing pose and motion information of the IMU and the laser radar according to a comparison result of the comparison error and the laser SLAM calibration error, and combining a current updated global map model to obtain fused pose and motion information of a current frame;
(9) Acquiring the position information designated by the user, generating an optimal route according to the user-designated position information, the map updating information and the fused pose and motion information, and displaying and controlling the positioning and navigation of the unmanned aerial vehicle.
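To make the data flow of steps (1) through (9) easier to follow, the sketch below arranges them as a single per-frame loop in Python. It is only an illustrative skeleton under the assumption that poses can be represented as 3-D vectors: every function, class name and stand-in body (for example filter_and_preprocess, solve_scan, plan_route) is a hypothetical placeholder for the corresponding unit described above, not the patented implementation.

    # Hypothetical per-frame skeleton of steps (1)-(9); all names and the trivial
    # stand-in bodies are placeholders, not the claimed algorithms.
    import numpy as np

    def filter_and_preprocess(cloud):                 # step (2): placeholder filter
        return cloud[np.isfinite(cloud).all(axis=1)]  # drop invalid returns only

    def solve_scan(cloud, global_map):                # step (3): placeholder "resolving"
        lidar_pose = cloud.mean(axis=0)               # fake pose: centroid of the scan
        return lidar_pose, cloud                      # local map = the filtered scan

    def fuse(imu_pose, lidar_pose, global_map):       # step (8): placeholder fusion
        return 0.5 * (imu_pose + lidar_pose)

    def plan_route(pose, global_map, goal):           # step (9): placeholder planner
        return np.vstack([pose, goal])

    class LidarSlamPipeline:
        def __init__(self, calib_error=0.05):
            self.calib_error = calib_error            # preset laser SLAM calibration error (step 7)
            self.global_map = np.empty((0, 3))        # fused local maps (step 4)
            self.prev_lidar = None
            self.prev_imu = None

        def process_frame(self, imu_pose, raw_cloud, goal):             # step (1): inputs
            cloud = filter_and_preprocess(raw_cloud)                    # step (2)
            lidar_pose, local_map = solve_scan(cloud, self.global_map)  # step (3)
            self.global_map = np.vstack([self.global_map, local_map])   # step (4)
            if self.prev_lidar is not None:                             # steps (5)-(7)
                err = np.linalg.norm((lidar_pose - self.prev_lidar)
                                     - (imu_pose - self.prev_imu))
                if err > self.calib_error:                              # step (7-1)
                    imu_pose = lidar_pose.copy()                        # else step (7-2): keep the IMU pose
            fused_pose = fuse(imu_pose, lidar_pose, self.global_map)    # step (8)
            route = plan_route(fused_pose, self.global_map, goal)       # step (9)
            self.prev_lidar, self.prev_imu = lidar_pose, imu_pose
            return fused_pose, route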
Preferably, the filtering and preprocessing of the collected point cloud data in step (2) further includes the following steps:
(2-1) filtering the acquired point cloud data;
(2-2) preprocessing the filtered point cloud data;
(2-3) updating the temporary local map according to the preprocessed point cloud data.
Preferably, comparing the comparison error with the laser SLAM calibration error in step (7) further includes the following steps:
(7-1) if the comparison error is larger than the laser SLAM calibration error, replacing the IMU pose and the motion information of the current frame with the laser radar pose and the motion information of the current frame;
and (7-2) if the comparison error is smaller than the laser SLAM calibration error, storing the IMU pose and motion information of the current frame.
Preferably, the step 8 further includes: when the GPS signal and/or the RTK signal exist, the fusion pose and the motion information of the current frame are combined with the GPS signal and/or the RTK signal to obtain the fusion pose and the motion information with the earth coordinate system.
On the other hand, the embodiment of the invention provides an unmanned aerial vehicle system based on laser radar SLAM positioning navigation, which comprises a three-dimensional laser SLAM system and a flight control unit, wherein the three-dimensional laser SLAM system provides unmanned aerial vehicle positioning and navigation information, and the flight control unit controls unmanned aerial vehicle positioning and navigation flight according to the positioning and navigation information of the three-dimensional laser SLAM system;
the three-dimensional laser SLAM system also comprises a laser information acquisition unit and a laser information processing unit, wherein,
the laser information acquisition unit is used for acquiring information for a three-dimensional laser SLAM system and comprises an IMU unit and at least one multi-line three-dimensional laser radar module, wherein,
the IMU unit is used for providing predicted pose and motion information for the three-dimensional laser SLAM system;
the laser radar is used for collecting point cloud data of surrounding environments;
the laser information processing unit is used for processing the information acquired by the laser information acquisition unit, and comprises:
the point cloud processing module is used for filtering and preprocessing the point cloud data;
the SLAM data processing module is used for carrying out resolving processing on the filtered and preprocessed data and outputting motion pose information and a three-dimensional map model; and
the multi-sensor fusion module, which is used for fusing the motion pose information obtained by the SLAM data processing module with the three-dimensional map model information to generate the current pose and motion information.
Preferably, the SLAM data processing module further comprises a radar resolving module, a track generating module, a track comparing module, a calibrating module and a laser radar fusion module, wherein,
the radar resolving module is used for resolving the filtered point cloud data to obtain corresponding laser radar pose and motion information and local map information of the surrounding environment of the current frame;
the track generation module is used for generating a first movement track of the unmanned aerial vehicle by combining the pose and the movement information of the laser radar of two continuous frames and generating a second movement track of the unmanned aerial vehicle by combining the pose and the movement information of the IMU of two continuous frames;
the track comparison module is used for comparing the first motion track and the second motion track to obtain a comparison error, and comparing the obtained comparison error with a preset laser SLAM calibration error;
the laser radar fusion module fuses the pose and motion information output by the IMU unit and the laser radar according to the comparison result of the track comparison module, and outputs the current pose and motion information of the unmanned aerial vehicle.
Preferably, the SLAM data processing module further includes:
the map updating module is used for updating the map information of the three-dimensional laser SLAM system in real time according to the local map information of the surrounding environment of the current frame; and
and the global map fusion module is used for fusing all the local map information to generate the global three-dimensional map model information, so that the unmanned aerial vehicle plans a navigation path in the global map information.
Preferably, the unmanned aerial vehicle system based on laser radar SLAM positioning navigation further comprises a calibration module, wherein the calibration module is used for correcting the IMU unit error in real time according to the comparison result of the track comparison module.
Preferably, when a GPS and/or RTK signal exists, the multi-sensor fusion module fuses pose information and motion information, GPS and/or RTK position information and current updated map information output by the laser radar fusion module to obtain fusion pose and motion information with earth coordinates.
Preferably, providing the predicted pose and motion information for the three-dimensional laser SLAM system further includes the IMU unit first collecting odometry information, converting the odometry information into pose change information of the unmanned aerial vehicle through an inertial-odometry kinematic model of the unmanned aerial vehicle, and sending the pose change information to a Bayesian filter to preliminarily calculate the predicted pose and motion information; the filtering includes noise reduction, outlier removal and reduction of redundant point cloud data.
The method and the system provided by the invention fuse the inertial navigation data of the IMU and use the direction information measured by inertial navigation as input to the SLAM system, which avoids the problem that the current direction of motion cannot be obtained without GPS signals and improves the reliability of autonomous positioning. Meanwhile, the invention constructs and updates a three-dimensional map in real time from multi-line three-dimensional laser scanning data through the SLAM data system, for use in unmanned aerial vehicle navigation. The beneficial effects of the invention are as follows:
1. The unmanned aerial vehicle can fly indoors or under any condition without satellite GPS signal reception, is safe and reliable, is not easily disturbed in flight by wireless-signal interference, and can fly autonomously without a communication connection.
2. The unmanned aerial vehicle can hover and fly autonomously with high reliability, and can autonomously perform rapid route planning and fly to a destination on its own.
3. The laser radar of the system is an actively illuminated sensor, so the unmanned aerial vehicle can fly in a completely dark environment.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a general flowchart of a method for laser radar SLAM positioning navigation in an embodiment of the present invention.
Fig. 2 is a block diagram of an unmanned aerial vehicle system based on laser radar SLAM positioning navigation in an embodiment of the present invention.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
In one aspect, referring to FIG. 1, an embodiment of the invention provides a method for positioning and navigating an unmanned aerial vehicle based on laser radar SLAM, which comprises the following steps:
(1) Acquiring IMU pose and motion information of the current frame acquired by the IMU unit, and acquiring point cloud data information of the current frame acquired by the three-dimensional laser radar;
(2) Filtering and preprocessing the acquired point cloud data information;
(3) The filtered point cloud data is resolved to obtain the pose and motion information of the laser radar of the current frame and the local map information of the surrounding environment of the current frame in the laser SLAM system;
(4) Updating the map in the laser SLAM system according to the local map information, and fusing all the local maps to generate a three-dimensional global map;
(5) Combining the pose and the motion information of the laser radar of the previous frame with the pose and the motion information of the laser radar of the current frame to generate a first motion track;
(6) Generating a second motion trail by combining the IMU pose and the motion information of the previous frame and the IMU pose and the motion information of the current frame;
(7) Comparing the first motion track with the second motion track to obtain a comparison error, and comparing the comparison error with a laser SLAM calibration error preset by a system;
(8) Fusing pose and motion information of the IMU and the laser radar according to a comparison result of the comparison error and the laser SLAM calibration error, and combining a current updated global map model to obtain fused pose and motion information of a current frame;
(9) Acquiring the position information designated by the user, generating an optimal route according to the user-designated position information, the map updating information and the fused pose and motion information, and displaying and controlling the positioning and navigation of the unmanned aerial vehicle.
In the embodiment of the present invention, the filtering and preprocessing of the collected point cloud data in step (2) further includes the following steps:
(2-1) filtering the acquired point cloud data;
(2-2) preprocessing the filtered point cloud data;
(2-3) updating the temporary local map according to the preprocessed point cloud data.
In the embodiment of the present invention, comparing the comparison error with the laser SLAM calibration error in step (7) further includes the following steps:
(7-1) if the comparison error is larger than the laser SLAM calibration error, replacing the IMU pose and the motion information of the current frame with the laser radar pose and the motion information of the current frame;
and (7-2) if the comparison error is smaller than the laser SLAM calibration error, storing the IMU pose and motion information of the current frame.
In an embodiment of the present invention, the step 8 further includes: when the GPS signal and/or the RTK signal exist, the fusion pose and the motion information of the current frame are combined with the GPS signal and/or the RTK signal to obtain the fusion pose and the motion information with the earth coordinate system.
On the other hand, the embodiment of the invention provides an unmanned aerial vehicle system based on laser radar SLAM positioning navigation, and FIG. 2 is a schematic diagram of the unmanned aerial vehicle system based on laser radar SLAM positioning navigation in the embodiment of the invention, which comprises a three-dimensional laser SLAM system 100 and a flight control unit 200, wherein the three-dimensional laser SLAM system 100 provides unmanned aerial vehicle positioning and navigation information, and the flight control unit 200 controls unmanned aerial vehicle positioning and navigation flight according to the positioning and navigation information of the three-dimensional laser SLAM system;
the three-dimensional laser SLAM system 100 further includes a laser information acquisition unit 110 and a laser information processing unit 120, wherein,
the laser information acquisition unit 110 is configured to acquire information for the three-dimensional laser SLAM system 100, and includes an IMU unit 111 and at least one multi-line three-dimensional laser radar module 112, where,
the IMU unit 111 is configured to provide predicted pose and motion information for the three-dimensional laser SLAM system 100;
the laser radar 112 is used for collecting point cloud data of the surrounding environment;
the laser information processing unit 120 is configured to process information collected by the laser information collecting unit, and includes:
the point cloud processing module 130 is used for filtering and preprocessing the point cloud data;
the SLAM data processing module 140 is configured to perform a resolving process on the filtered and preprocessed data, and output motion pose information and a three-dimensional map model;
and the multi-sensor fusion module 180, which is used for fusing the motion pose information obtained by the SLAM data processing module with the three-dimensional map model information to generate the current pose and motion information.
In the embodiment of the present invention, the SLAM data processing module 140 further includes a radar resolving module 141, a track generating module 142, a track comparing module 143, a calibrating module 144, and a lidar fusion module 145, where,
the radar resolving module 141 is configured to resolve the filtered point cloud data to obtain corresponding pose and motion information of the laser radar and local map information of the surrounding environment of the current frame;
the track generation module 142 is configured to generate a first motion track of the unmanned aerial vehicle by combining the pose and the motion information of the laser radar of two continuous frames, and generate a second motion track of the unmanned aerial vehicle by combining the pose and the motion information of the IMU of two continuous frames;
the track comparison module 143 is configured to compare the first motion track and the second motion track to obtain a comparison error, and compare the obtained comparison error with a preset laser SLAM calibration error;
the laser radar fusion module 145 fuses the pose and motion information output by the IMU unit 111 and the laser radar according to the comparison result of the track comparison module, and outputs the current pose and motion information of the unmanned aerial vehicle.
In an embodiment of the present invention, the SLAM data processing module further includes:
the map updating module 160 is configured to update map information of the three-dimensional laser SLAM system 100 in real time according to local map information of a surrounding environment of the current frame; and
the global map fusion module 170 is configured to fuse all the local map information to generate the global three-dimensional map model information, so that the unmanned aerial vehicle plans a navigation path in the global map information.
In the embodiment of the present invention, the unmanned aerial vehicle system based on laser radar SLAM positioning navigation further includes a calibration module 144, configured to correct the error of the IMU unit 111 in real time according to the comparison result of the track comparison module.
In the embodiment of the present invention, when there is a GPS and/or RTK signal, the multi-sensor fusion module 180 further fuses pose information and motion information output by the laser radar fusion module, GPS and/or RTK position information, and current updated map information, to obtain fused pose and motion information with earth coordinates.
Preferably, providing the predicted pose and motion information for the three-dimensional laser SLAM system 100 further includes the IMU unit 111 first collecting odometry information, converting the odometry information into pose change information of the unmanned aerial vehicle through an inertial-odometry kinematic model of the unmanned aerial vehicle, and sending the pose change information to a Bayesian filter to preliminarily calculate the predicted pose and motion information; the filtering includes noise reduction, outlier removal and reduction of redundant point cloud data.
In the embodiment of the present invention, the point cloud processing module 130 further includes a filtering module 131 and a preprocessing module 132, where,
the filtering module 131 is configured to filter the point cloud data collected by the lidar 112;
the preprocessing module 132 is configured to perform preliminary processing on the filtered point cloud data to obtain temporary local map information.
In an embodiment of the present invention, the flight control unit 200 further includes a global planning module 210, a local planning module 220, and an underlying control module 230, wherein,
the global planning module 210 is configured to plan a navigation optimal path of the global map;
the local planning module 220 is configured to plan a global optimal path for the real-time local map information obtained after the preprocessing;
the bottom layer control module 230 is configured to perform control allocation for the unmanned aerial vehicle.
The unmanned aerial vehicle system based on laser radar SLAM positioning navigation in the embodiment of the invention operates as follows:
The IMU unit 111 acquires odometry information for the three-dimensional laser SLAM system 100, converts it into unmanned aerial vehicle pose change information through the inertial-odometry kinematic model of the unmanned aerial vehicle, sends this information to a Bayesian filter to preliminarily calculate the IMU pose and motion information of the current frame, and outputs the result to the track generation module 142; the multi-line three-dimensional laser scanning radar acquires the point cloud data of the current frame and outputs it to the filtering module 131.
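The text above does not specify the inertial kinematic model or the Bayesian filter that is used. As one common possibility, the prediction step of such a filter can be a strapdown propagation of attitude, velocity and position from gyroscope and accelerometer readings, sketched below in Python; the function names, the gravity convention and the use of raw IMU rates in place of the odometry information are assumptions for illustration only.

    # Illustrative strapdown propagation step (one possible prediction model for a
    # Bayesian filter); not the filter described in the patent.
    import numpy as np

    def skew(w):
        return np.array([[0.0, -w[2], w[1]],
                         [w[2], 0.0, -w[0]],
                         [-w[1], w[0], 0.0]])

    def so3_exp(w):
        """Rotation matrix for a rotation vector w (Rodrigues' formula)."""
        theta = np.linalg.norm(w)
        if theta < 1e-12:
            return np.eye(3)
        K = skew(w / theta)
        return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

    def propagate(R, p, v, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
        """One step: body-frame angular rate gyro [rad/s] and specific force accel [m/s^2]."""
        R_new = R @ so3_exp(gyro * dt)                # attitude update
        a_world = R @ accel + g                       # remove gravity in the world frame
        p_new = p + v * dt + 0.5 * a_world * dt ** 2  # position update
        v_new = v + a_world * dt                      # velocity update
        return R_new, p_new, v_new

A filter would wrap this propagation with process noise and later correct it with the laser radar pose, which is what the fusion steps described below accomplish.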
The filtering module 131 performs filtering such as noise reduction, outlier removal and reduction of the amount of redundant point cloud data on the acquired point cloud data, and outputs the filtered point cloud data to the radar resolving module 141 and the preprocessing module 132 respectively.
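As an illustration of the kind of operations the filtering module performs (reducing redundant points and removing abnormal points), the following sketch shows a voxel-grid downsample and a statistical outlier removal using numpy and scipy; the parameter values and function names are assumptions, not the module's actual implementation.

    # Illustrative point cloud filtering: voxel downsampling plus statistical
    # outlier removal; parameter values are placeholders.
    import numpy as np
    from scipy.spatial import cKDTree

    def voxel_downsample(points, voxel=0.2):
        """Keep one representative point (the centroid) per occupied voxel."""
        keys = np.floor(points / voxel).astype(np.int64)
        _, inv = np.unique(keys, axis=0, return_inverse=True)
        inv = inv.ravel()
        counts = np.bincount(inv).astype(float)
        out = np.zeros((counts.size, 3))
        for d in range(3):
            out[:, d] = np.bincount(inv, weights=points[:, d]) / counts
        return out

    def remove_outliers(points, k=10, std_ratio=2.0):
        """Drop points whose mean distance to their k nearest neighbours is unusually large."""
        dists, _ = cKDTree(points).query(points, k=k + 1)   # column 0 is the point itself
        mean_d = dists[:, 1:].mean(axis=1)
        keep = mean_d < mean_d.mean() + std_ratio * mean_d.std()
        return points[keep]

    # Example: filtered = remove_outliers(voxel_downsample(raw_scan))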
The preprocessing module 132 performs preliminary processing on the filtered point cloud data to obtain temporary local map information and outputs it to the local planning module 220.
The radar resolving module 141 resolves the filtered point cloud data to obtain the laser radar pose and motion information of the current frame and the local map information of the surrounding environment of the current frame in the laser SLAM system, outputs the laser radar pose and motion information of the current frame to the track generation module 142, and outputs the local map information of the surrounding environment of the current frame to the map updating module 160.
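The patent does not name the algorithm used by the radar resolving module to recover the lidar pose from the filtered point cloud. A common choice for this kind of scan-to-map registration is the iterative closest point (ICP) method; the sketch below is a basic point-to-point ICP in Python (numpy and scipy) and is offered only as an assumed illustration of the step, not as the claimed solver.

    # Illustrative point-to-point ICP: aligns the current scan to the local map
    # and returns the rigid transform (R, t), i.e. a pose estimate for the scan.
    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid_transform(src, dst):
        """Least-squares R, t mapping src onto dst (Kabsch / SVD)."""
        cs, cd = src.mean(axis=0), dst.mean(axis=0)
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        return R, cd - R @ cs

    def icp(scan, local_map, iters=30, tol=1e-6):
        tree = cKDTree(local_map)
        R_total, t_total = np.eye(3), np.zeros(3)
        cur, prev_err = scan.copy(), np.inf
        for _ in range(iters):
            _, idx = tree.query(cur)                        # nearest-neighbour correspondences
            R, t = best_rigid_transform(cur, local_map[idx])
            cur = cur @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
            err = np.linalg.norm(cur - local_map[idx], axis=1).mean()
            if abs(prev_err - err) < tol:
                break
            prev_err = err
        return R_total, t_total                             # scan pose relative to the map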
The map updating module 160 updates the local map information of two consecutive frames in real time and outputs it to the global map fusion module 170.
The global map fusion module 170 fuses all local map information to generate global map information, which can be reused directly the next time the unmanned aerial vehicle takes off from the same position, and outputs the global map information to the laser radar fusion module 145 and the global planning module 210.
The track generation module 142 generates a first motion track by combining the laser radar pose and motion information of the previous frame with that of the current frame, generates a second motion track by combining the IMU pose and motion information of the previous frame with that of the current frame, and outputs both motion tracks to the track comparison module 143.
The track comparison module 143 compares the first motion track with the second motion track to obtain a first comparison error, compares the first comparison error with the preset laser SLAM calibration error, and outputs the comparison result to the calibration module 144.
If the first comparison error is greater than the laser SLAM calibration error, the calibration module 144 replaces the IMU pose and motion information of the current frame with the laser radar pose and motion information of the current frame; if the first comparison error is smaller than the laser SLAM calibration error, the IMU pose and motion information of the current frame is retained.
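A minimal sketch of this comparison and calibration step is shown below, treating each pose as a 3-D position so that the per-frame motion increments can be subtracted directly; the function name and the threshold value are placeholders, and a real system would also compare orientation.

    # Illustrative check of IMU drift against the lidar estimate (steps 5-7).
    import numpy as np

    def calibrate_imu_pose(prev_lidar, cur_lidar, prev_imu, cur_imu, calib_error=0.05):
        track_lidar = cur_lidar - prev_lidar       # first motion track  (lidar increment)
        track_imu = cur_imu - prev_imu             # second motion track (IMU increment)
        error = np.linalg.norm(track_lidar - track_imu)
        if error > calib_error:                    # comparison error too large:
            return cur_lidar.copy(), error         #   replace the IMU pose with the lidar pose
        return cur_imu, error                      # otherwise keep the IMU pose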
The pose and motion information of the IMU and the laser radar are then output to the laser radar fusion module 145, which fuses them according to a fusion algorithm to obtain the fused pose and motion information of the current frame and outputs the result to the multi-sensor fusion module 180. The multi-sensor fusion module 180 fuses the received pose and motion information with the updated global three-dimensional map information and, when external GPS and/or RTK signals are available, simultaneously fuses the GPS and/or RTK information to obtain fused pose and motion information in the earth coordinate system, which is output to the global planning module 210.
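Neither the fusion algorithm of the laser radar fusion module nor the way earth coordinates are attached is specified in the text. The sketch below shows one simple possibility, a fixed-weight blend of the two position estimates followed by a flat-earth conversion of the local east/north/up offset to latitude and longitude; both the weighting and the small-area approximation are assumptions for illustration (a practical system would more likely use a Kalman-type filter and a full geodetic transform).

    # Illustrative pose blending and local-to-geodetic conversion; all parameters
    # (weight, reference origin) are placeholders.
    import math
    import numpy as np

    EARTH_RADIUS = 6378137.0   # WGS-84 equatorial radius in metres

    def fuse_positions(imu_pos, lidar_pos, w_lidar=0.8):
        """Fixed-weight blend of the two position estimates (not the claimed algorithm)."""
        return w_lidar * np.asarray(lidar_pos) + (1.0 - w_lidar) * np.asarray(imu_pos)

    def enu_to_geodetic(pos_enu, lat0_deg, lon0_deg, alt0):
        """Flat-earth approximation: local east/north/up offset -> latitude, longitude, altitude."""
        east, north, up = pos_enu
        lat0 = math.radians(lat0_deg)
        lat = lat0_deg + math.degrees(north / EARTH_RADIUS)
        lon = lon0_deg + math.degrees(east / (EARTH_RADIUS * math.cos(lat0)))
        return lat, lon, alt0 + up

    # Example (hypothetical takeoff origin): enu_to_geodetic(fuse_positions(imu_p, lidar_p), 39.9, 116.4, 50.0)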
The global planning module 210 plans an optimal navigation path from the unmanned aerial vehicle to the target point according to the position designated by the user, the map updating information and the fused pose and motion information, that is, a navigation path that is shortest and free of obstacles along the way, and outputs all data to the local planning module 220.
The local planning module 220 plans a locally optimal navigation path according to the real-time local map information sent by the preprocessing module in the laser information processing unit and the data output by the global planning module 210, compares it with the global optimal navigation path to obtain the optimal path, and outputs the optimal path to the bottom layer control module 230.
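The global and local planners are described only by their objective (the shortest obstacle-free route to the target). On a map discretized into an occupancy grid, one common way to realize such an objective is A* search, sketched below; the grid representation, connectivity and cost model are assumptions for illustration, and the patent does not state which planning algorithm is actually used.

    # Illustrative A* path search on a 2-D occupancy grid (0 = free, 1 = obstacle).
    import heapq
    import itertools
    import numpy as np

    def astar(grid, start, goal):
        rows, cols = grid.shape
        h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan heuristic
        tie = itertools.count()                                 # tie-breaker for the heap
        open_set = [(h(start, goal), next(tie), 0, start, None)]
        came_from, g_cost = {}, {start: 0}
        while open_set:
            _, _, g, node, parent = heapq.heappop(open_set)
            if node in came_from:                               # already expanded
                continue
            came_from[node] = parent
            if node == goal:                                    # reconstruct the path
                path = [node]
                while came_from[path[-1]] is not None:
                    path.append(came_from[path[-1]])
                return path[::-1]
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # 4-connected moves
                nxt = (node[0] + dr, node[1] + dc)
                if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt] == 0:
                    ng = g + 1
                    if ng < g_cost.get(nxt, float("inf")):
                        g_cost[nxt] = ng
                        heapq.heappush(open_set, (ng + h(nxt, goal), next(tie), ng, nxt, node))
        return None                                             # no obstacle-free route exists

    # Example: astar(np.zeros((20, 20), dtype=int), (0, 0), (19, 19))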
The bottom layer control module 230 performs control allocation for the unmanned aerial vehicle according to the optimal path, controlling information such as the flight speed, angle and heading of the unmanned aerial vehicle.
The unmanned aerial vehicle receives the commands and begins the flight.
The method and the system provided by the present application fuse the inertial navigation data of the IMU and use the direction information measured by inertial navigation as input to the SLAM system, which avoids the problem that the current direction of motion cannot be obtained when no GPS signal exists and improves the reliability of autonomous positioning. Meanwhile, the invention constructs and updates a three-dimensional map in real time from multi-line three-dimensional laser scanning data through the SLAM data system, for use in unmanned aerial vehicle navigation.
Meanwhile, the invention can realize autonomous obstacle-avoidance flight of the unmanned aerial vehicle based on the map model scanned by the three-dimensional laser radar. Compared with a common planar single-line laser radar, the three-dimensional laser radar can acquire three-dimensional topographic information of the environment, and the amount of three-dimensional information is several orders of magnitude larger than that of two-dimensional information, which means that a SLAM positioning algorithm based on three-dimensional data can obtain more accurate and stable position information. In addition, the coordinates of any point in three-dimensional space can be acquired, so the unmanned aerial vehicle has the ability to autonomously generate a flight trajectory and fly autonomously: as long as the user specifies the target position that the unmanned aerial vehicle needs to reach, a flight path can be generated automatically and rapidly. Because the flight path is constructed on an accurate map scanned from the surrounding environment, the generated path is very safe, and the algorithm can ensure that the flight path is maximally efficient.
In addition, when GPS and/or RTK signals exist, the method and the system provided by the invention fuse the absolute position coordinates of the GPS or RTK through the multi-sensor fusion module, so that when the GPS or RTK signal is valid the system converts the relative coordinates into global standard earth coordinates. The generated coordinate data can then be displayed directly in the earth coordinate system, and the generated route can be used by other application systems, achieving the goal of data sharing. Unlike a route expressed in a relative coordinate system, where all data become useless once the origin is lost, such a route remains permanently valid, like GPS coordinates. Furthermore, the system corrects the GPS and/or RTK, three-dimensional laser radar and IMU data into a single set of states at the same moment, so that the multi-sensor fusion can be switched seamlessly, the sensors compensate for one another's shortcomings, and the performance of each sensor is exploited to the maximum.
The method and system for positioning and navigating an unmanned aerial vehicle using laser radar SLAM provided by the invention have been described in detail above. Those skilled in the art may, following the ideas of the embodiments of the invention, vary the specific implementation and the scope of application; accordingly, this disclosure should not be construed as limiting the invention.

Claims (10)

1. A method for positioning and navigating an unmanned aerial vehicle based on laser radar SLAM, characterized by comprising the following steps:
(1) Acquiring IMU pose and motion information of the current frame acquired by the IMU unit, and acquiring point cloud data information of the current frame acquired by the three-dimensional laser radar;
(2) Filtering and preprocessing the acquired point cloud data information;
(3) The filtered point cloud data is resolved to obtain the pose and motion information of the laser radar of the current frame and the local map information of the surrounding environment of the current frame in the laser SLAM system;
(4) Updating the map in the laser SLAM system according to the local map information, and fusing all the local maps to generate a three-dimensional global map;
(5) Combining the pose and the motion information of the laser radar of the previous frame with the pose and the motion information of the laser radar of the current frame to generate a first motion track;
(6) Generating a second motion trail by combining the IMU pose and the motion information of the previous frame and the IMU pose and the motion information of the current frame;
(7) Comparing the first motion track with the second motion track to obtain a comparison error, and comparing the comparison error with a laser SLAM calibration error preset by a system;
(8) Fusing pose and motion information of the IMU and the laser radar according to a comparison result of the comparison error and the laser SLAM calibration error, and combining a current updated global map model to obtain fused pose and motion information of a current frame;
(9) Acquiring the position information designated by the user, generating an optimal route according to the user-designated position information, the map updating information and the fused pose and motion information, and displaying and controlling the positioning and navigation of the unmanned aerial vehicle.
2. The method for positioning and navigating an unmanned aerial vehicle based on laser radar SLAM according to claim 1, wherein the filtering and preprocessing of the collected point cloud data in step (2) further comprises the following steps:
(2-1) filtering the acquired point cloud data;
(2-2) preprocessing the filtered point cloud data;
(2-3) updating the temporary local map according to the preprocessed point cloud data.
3. The method for positioning and navigating an unmanned aerial vehicle based on laser radar SLAM according to claim 2, wherein comparing the comparison error with the laser SLAM calibration error in step (7) further comprises the following steps:
(7-1) if the comparison error is larger than the laser SLAM calibration error, modifying the IMU pose and the motion information of the current frame into the laser radar pose and the motion information of the current frame;
and (7-2) if the comparison error is smaller than the laser SLAM calibration error, storing the IMU pose and motion information of the current frame.
4. The method for positioning and navigating an unmanned aerial vehicle based on laser radar SLAM according to claim 3, wherein step (8) further comprises: when the GPS signal and/or the RTK signal exist, the fusion pose and the motion information of the current frame are combined with the GPS signal and/or the RTK signal to obtain the fusion pose and the motion information with the earth coordinate system.
5. The unmanned aerial vehicle system based on laser radar SLAM positioning navigation is characterized by comprising a three-dimensional laser SLAM system (100) and a flight control unit (200), wherein the three-dimensional laser SLAM system (100) provides unmanned aerial vehicle positioning and navigation information, the flight control unit (200) controls the unmanned aerial vehicle to fly in positioning and navigation according to the positioning and navigation information of the three-dimensional laser SLAM system,
the three-dimensional laser SLAM system (100) also comprises a laser information acquisition unit (110) and a laser information processing unit (120), wherein,
the laser information acquisition unit (110) is used for acquiring information for the three-dimensional laser SLAM system (100), and comprises an IMU unit (111) and at least one multi-line three-dimensional laser radar module (112), wherein,
the IMU unit (111) is used for providing predicted pose and motion information for the three-dimensional laser SLAM system (100);
the laser radar (112) is used for acquiring point cloud data of the surrounding environment;
the laser information processing unit (120) is used for processing information acquired by the laser information acquisition unit, and comprises:
the point cloud processing module (130) is used for filtering and preprocessing the point cloud data;
the SLAM data processing module (140) is used for carrying out resolving processing on the filtered and preprocessed data and outputting motion pose information and a three-dimensional map model; and
the multi-sensor fusion module (180), which is used for fusing the motion pose information obtained by the SLAM data processing module with the three-dimensional map model information to generate the current pose information.
6. The unmanned aerial vehicle system based on lidar SLAM positioning navigation of claim 5, wherein:
the SLAM data processing module (140) also comprises a radar resolving module (141), a track generating module (142), a track comparing module (143), a calibrating module (144) and a laser radar fusion module (145), wherein,
the radar resolving module (141) is used for resolving the filtered point cloud data to obtain corresponding laser radar pose and motion information and local map information of the surrounding environment of the current frame;
the track generation module (142) is used for generating a first movement track of the unmanned aerial vehicle by combining the pose and the movement information of the laser radar of two continuous frames and generating a second movement track of the unmanned aerial vehicle by combining the pose and the movement information of the IMU of two continuous frames;
the track comparison module (143) is used for comparing the first motion track and the second motion track to obtain a comparison error and comparing the obtained comparison error with a preset laser SLAM calibration error;
the laser radar fusion module (145) fuses the pose and motion information of the IMU unit (111) and the laser radar according to the comparison result of the track comparison module, and outputs the current pose and motion information of the unmanned aerial vehicle.
7. The unmanned aerial vehicle system based on lidar SLAM positioning navigation of claim 6, wherein the SLAM data processing module further comprises:
the map updating module (160) is used for updating the map information of the three-dimensional laser SLAM system (100) in real time according to the local map information of the surrounding environment of the current frame; and
and the global map fusion module (170) is used for fusing all the local map information to generate the global three-dimensional map model information so that the unmanned aerial vehicle plans a navigation path in the global map information.
8. The unmanned aerial vehicle system based on laser radar SLAM positioning navigation of claim 7, further comprising a calibration module (144) for correcting the IMU unit (111) error in real time according to the comparison result of the trajectory comparison module.
9. The unmanned aerial vehicle system based on laser radar SLAM positioning navigation of claim 6, wherein when there is a GPS and/or RTK signal, the multi-sensor fusion module (180) further fuses pose information and motion information output by the laser radar fusion module, GPS and/or RTK position information, and current update map information to obtain fused pose information with earth coordinates.
10. The unmanned aerial vehicle system based on laser radar SLAM positioning navigation of claim 7, wherein providing predicted pose and motion information for the three-dimensional laser SLAM system (100) further comprises an IMU unit (111) acquiring mileage information first, converting the mileage information into unmanned aerial vehicle pose change information through a model of unmanned aerial vehicle inertial odometer kinematics, sending the unmanned aerial vehicle pose change information to a bayesian filter, and primarily calculating the predicted pose and motion information, wherein the filtering process comprises noise reduction, abnormal point removal and redundant point cloud data reduction of point cloud data.
CN202210930399.8A (priority date 2022-08-03, filing date 2022-08-03): Laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof; status Pending; published as CN117554990A (en)

Priority Applications (1)

Application Number: CN202210930399.8A; Priority Date: 2022-08-03; Filing Date: 2022-08-03; Title: Laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof


Publications (1)

Publication Number: CN117554990A; Publication Date: 2024-02-13

Family

ID=89811630

Family Applications (1)

Application Number: CN202210930399.8A; Title: Laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof; Priority Date: 2022-08-03; Filing Date: 2022-08-03

Country Status (1)

Country Link
CN (1) CN117554990A (en)

Similar Documents

Publication Publication Date Title
US10824170B2 (en) Autonomous cargo delivery system
Al-Kaff et al. Survey of computer vision algorithms and applications for unmanned aerial vehicles
US20210247764A1 (en) Multi-sensor environmental mapping
EP2177965B1 (en) High integrity coordination for multiple off-road machines
CN101598557B (en) Integrated navigation system applied to pilotless aircraft
US20180102058A1 (en) High-precision autonomous obstacle-avoidance flying method for unmanned aerial vehicle
US8639408B2 (en) High integrity coordination system for multiple off-road vehicles
CN109696663A (en) A kind of vehicle-mounted three-dimensional laser radar scaling method and system
CN106927059A (en) A kind of unmanned plane landing method and device based on monocular vision
CN110716549A (en) Autonomous navigation robot system for map-free area patrol and navigation method thereof
Liu et al. A survey of computer vision applied in aerial robotic vehicles
CN114485619A (en) Multi-robot positioning and navigation method and device based on air-ground cooperation
CN113156998A (en) Unmanned aerial vehicle flight control system and control method
CN116047565A (en) Multi-sensor data fusion positioning system
Andert et al. Improving monocular SLAM with altimeter hints for fixed-wing aircraft navigation and emergency landing
CN117554990A (en) Laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof
CN117572879A (en) Unmanned aerial vehicle based on laser radar SLAM positioning navigation
CN114281109A (en) Multi-machine cooperation control system guided by unmanned aerial vehicle
CN113110602A (en) Neural network's automatic system of patrolling and examining of high robustness unmanned aerial vehicle power equipment
Zhang et al. An integrated UAV navigation system based on geo-registered 3D point cloud
CN117572459A (en) Unmanned aerial vehicle capable of automatically switching navigation system
CN117554989A (en) Visual fusion laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof
US20230150543A1 (en) Systems and methods for estimating cuboid headings based on heading estimations generated using different cuboid defining techniques
CN115686045A (en) Indoor autonomous aircraft obstacle avoidance device based on 3DVFH algorithm
Nonami et al. Guidance and navigation systems for small aerial robots

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination