CN113763549B - Simultaneous localization and mapping method and device fusing a laser radar and an IMU, and storage medium

Info

Publication number
CN113763549B
CN113763549B
Authority
CN
China
Prior art keywords
imu, map, sub, initial, point cloud
Prior art date
Legal status
Active
Application number
CN202110953344.4A
Other languages
Chinese (zh)
Other versions
CN113763549A (en)
Inventor
张林
王忠
沈莹
赵生捷
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202110953344.4A priority Critical patent/CN113763549B/en
Publication of CN113763549A publication Critical patent/CN113763549A/en
Application granted granted Critical
Publication of CN113763549B publication Critical patent/CN113763549B/en

Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/05 Geographic models (under G06T17/00, Three-dimensional [3D] modelling, e.g. data description of 3D objects)
    • G06T3/06 Topological mapping of higher-dimensional structures onto lower-dimensional surfaces (under G06T3/00, Geometric image transformations in the plane of the image)
    • G06T7/33 Image registration using feature-based methods (under G06T7/00 Image analysis; G06T7/30, Determination of transform parameters for the alignment of images)
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a simultaneous localization and mapping (SLAM) method, device and storage medium fusing a laser radar and an IMU. After the laser radar point cloud data and the IMU data are preprocessed, the system is initialized. The preprocessed point cloud is registered against the existing map by a direct method, and the relevant parameters are calculated to construct a factor graph, from which an initial 3D sub-map is obtained. The 3D sub-map is projected to a 2D sub-map, feature points are extracted, 2D rigid-body transformation parameters are calculated and an initial 3D coordinate value is derived; the continually updated 3D coordinates are then iteratively optimized, and finally the 3D coordinates and 3D map that meet the accuracy requirement are output. Compared with the prior art, the method registers the point cloud against the existing map by a direct method and obtains the final output through continuous iteration, giving it advantages such as adaptability to multi-radar systems and high detection efficiency.

Description

Simultaneous localization and mapping method and device fusing a laser radar and an IMU, and storage medium
Technical Field
The invention relates to the field of simultaneous localization and mapping (SLAM), and in particular to a SLAM method, device and storage medium fusing a laser radar and an IMU.
Background
At present, simultaneous localization and mapping (SLAM) is a fundamental technology in autonomous mapping, robotics and related fields. SLAM provides real-time position estimates for a carrier while simultaneously building a map of the environment, enabling higher-level applications such as autonomous navigation and path planning. In complex environments, however, occlusion by tall buildings, trees and the like often renders GPS and similar systems unavailable, so SLAM systems must rely mainly on active sensors such as lidar and IMU. Existing lidar-IMU fusion schemes compute directly from raw IMU measurements, which limits real-time performance and accuracy, and because feature-point-based registration is tightly coupled to the point-acquisition pattern of the lidar sensor, they are difficult to extend to multi-radar systems. Existing localization and mapping methods therefore still have problems of accuracy and adaptability.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a simultaneous localization and mapping method, device and storage medium fusing a laser radar and an IMU, thereby solving the problems of low accuracy and poor adaptability in existing localization and mapping methods.
The aim of the invention can be achieved by the following technical scheme:
a simultaneous localization mapping method integrating a laser radar and an IMU comprises the following steps:
s1, acquiring point cloud data and IMU data, and preprocessing the point cloud data and the IMU data.
S2, initializing a positioning mapping system by using the preprocessed point cloud data and the IMU data, registering the preprocessed point cloud data with the existing map according to the initialized positioning mapping system to obtain an initial registration value, and calculating a current gravity vector, an initial gesture and an IMU integral value.
And S3, constructing a factor graph model according to the gravity vector, the registration value and the IMU pre-integration.
S4, inputting the preprocessed point cloud data into a factor graph model to obtain real-time 3D carrier coordinates, and generating an initial 3D sub-map according to the 3D carrier coordinates and the preprocessed point cloud data.
And S5, projecting the initial 3D sub-map in the horizontal direction according to the initial gesture to obtain a test 2D sub-map.
And S6, extracting characteristic points of the test 2D sub-map, matching the characteristic points of the test 2D sub-map with the characteristic points of the existing 2D sub-map through a positioning mapping system, and calculating 2D rigid body transformation parameters according to the matched characteristic points when the number of the matched characteristic points exceeds a quantity threshold value.
S7, calculating to obtain a 3D coordinate initial value according to the preprocessed point cloud data, the initial 3D sub-map and the 2D rigid body transformation parameters, performing sliding registration in the Z direction by taking the 3D coordinate initial value as a starting point to obtain a progressive registration value, and calculating a 3D coordinate conversion parameter according to the progressive registration value; and establishing a pose graph according to the 3D coordinate conversion parameters, and obtaining a further pose. The method comprises the steps of carrying out a first treatment on the surface of the
S8, judging that the difference of the 3D coordinate conversion parameters of two adjacent times is smaller than a parameter threshold, and if yes, executing the step S10; if not, step S9 is performed.
And S9, keeping the initial 3D sub-map unchanged, remapping to obtain a new 2D sub-map according to the advanced gesture, and executing the step S6.
S10, taking the 3D coordinates corresponding to the current 3D coordinate conversion parameters as final 3D coordinates, importing the final 3D sub-map and generating the final 3D sub-map.
Further, the initialization in step S2 includes static initialization or dynamic initialization.
Further, the static initialization includes: calculating the rotation matrix between the mean of the IMU acceleration measurements and the gravity vector of the world coordinate system to determine the initial attitude, setting the mean of the IMU angular-velocity measurements as the initial IMU bias, and setting the IMU velocity and coordinates to 0.
Further, the dynamic initialization includes: establishing an initial-frame coordinate system and calculating the point-cloud relative rotation and the pre-integrated relative rotation to obtain the initial IMU bias; then calculating the point-cloud relative displacement and the pre-integrated relative displacement to obtain the IMU velocity, coordinates, initial attitude and initial gravity vector, and transforming all values into the world coordinate system according to the initial gravity vector.
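To illustrate how the initial gyroscope bias can be recovered from the two sets of relative rotations, the following is a minimal sketch, not the patented implementation: it assumes the point-cloud relative rotations R_pc, the zero-bias pre-integrated rotations R_pre and the pre-integration Jacobians J_bg with respect to the gyro bias are already available as NumPy arrays.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def solve_gyro_bias(R_pc, R_pre, J_bg):
    """Least-squares gyro bias from matched relative rotations (a sketch)."""
    A, b = [], []
    for Rp, Ri, J in zip(R_pc, R_pre, J_bg):
        # The bias correction must explain the residual rotation:
        # Exp(J @ db) = Ri^T @ Rp  =>  J @ db = Log(Ri^T @ Rp)
        A.append(J)
        b.append(Rotation.from_matrix(Ri.T @ Rp).as_rotvec())
    A, b = np.vstack(A), np.concatenate(b)
    db, *_ = np.linalg.lstsq(A, b, rcond=None)
    return db  # initial gyroscope bias estimate
```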
Further, the feature points of the test 2D sub-map in step S6 are SURF feature points.
Further, the preprocessing in step S1 includes: pre-integrating the IMU measurements whose time stamps are earlier than the main-radar time stamp; taking the main-radar time stamp as the reference, reordering by time stamp all laser points in the auxiliary radars whose sampling time stamps fall within that interval; and transforming each laser point into the IMU coordinate system, calculating its relative motion and correcting motion distortion to obtain the preprocessed point cloud data.
Further, the quantity threshold in step S6 ranges from 4 to 6.
Further, in step S1, point cloud data and IMU data are acquired by a plurality of sixteen-line lidars and IMUs.
A simultaneous localization and mapping device fusing a laser radar and an IMU comprises a memory and a processor; the memory is used for storing a computer program, and the processor, when executing the computer program, implements the simultaneous localization and mapping method fusing a laser radar and an IMU described above.
A computer-readable storage medium has a computer program stored thereon which, when executed by a processor, implements the steps of the simultaneous localization and mapping method fusing a laser radar and an IMU described above.
Compared with the prior art, the invention has the following advantages:
1. The preprocessed point cloud is registered against the existing map by a direct method, which makes the system more adaptable: parameter changes are reflected in real time and the system is easy to extend to multi-radar input. Projecting the 3D sub-map to a 2D sub-map, extracting feature points and calculating the 2D rigid-body transformation parameters improves detection efficiency and provides initial coordinates; the coordinates are finally refined by cyclic iteration, making the result more accurate.
2. Both static and dynamic initialization are supported, giving the system a wide range of application; a world coordinate system is introduced during initialization, so the system can interoperate with applications that use that coordinate system.
Drawings
FIG. 1 is a schematic flow chart of the present invention.
Fig. 2 is a schematic diagram of feature point processing of a 2D sub-map according to the present invention.
Fig. 3 is a schematic diagram illustrating the conversion of a spatial relationship from a 2D sub-map to a 3D sub-map according to the present invention.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific embodiments. The embodiment is implemented on the premise of the technical scheme of the invention and gives a detailed implementation and a specific operating process, but the protection scope of the invention is not limited to the following example.
This embodiment provides a simultaneous localization and mapping method fusing a laser radar and an IMU; the flow is shown in Fig. 1 and proceeds as follows:
step S1, acquiring point cloud data and IMU data of a multi-radar system, and preprocessing: and pre-integrating IMU readings with time stamps smaller than the time stamp of the main radar to obtain a pre-integrated measured value and an initial estimated value of the relative motion. Taking the time stamps of the laser points in all the auxiliary radars as references, taking all the points of the interval where the sampling time stamps of the laser points fall, and reordering the points according to the time stamps to obtain a point cloud growing in time sequence. And (5) performing standard sitting on each radar point and changing the standard sitting point into an IMU coordinate system. And calculating relative motion of each radar point from an integral value of the IMU according to the time stamp of each radar point, and correcting motion distortion of each radar point to obtain point cloud data after distortion correction under the main radar time stamp.
Step S2: initialize the localization and mapping system with the preprocessed point cloud and IMU data. Initialization is either static or dynamic. Static initialization aligns the mean of the IMU acceleration measurements with the gravity vector of the world coordinate system, computes the rotation matrix between the two vectors to determine the IMU attitude at the initial moment, and takes the mean of the IMU angular-velocity measurements as the initial value of the IMU gyroscope bias; the IMU position and velocity are initialized to 0. Dynamic initialization registers the multi-frame point clouds within a window using the Normal Distributions Transform (NDT) algorithm, computes the relative motion between point clouds in the coordinate system of the window's starting frame, and computes the IMU pre-integration between adjacent frames. The IMU gyroscope bias is solved by combining the point-cloud relative rotation with the pre-integrated relative rotation, and the IMU position, velocity, attitude and gravity vector within the initial window are solved by combining the point-cloud relative displacement with the pre-integrated relative displacement. According to the solved gravity vector and states, the state values of all frames in the window are transformed into the world coordinate system. The preprocessed point cloud data is then registered against the existing map to obtain an initial registration value, and the current gravity vector, initial attitude and IMU integral value are calculated using data from at least the past 3 frames.
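A minimal sketch of the static initialization described above follows, using SciPy to compute the rotation that aligns the mean accelerometer reading with the world gravity direction; note that yaw is unobservable from gravity alone, and the sign convention used here is an assumption.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def static_init(acc_samples, gyro_samples, up_world=np.array([0.0, 0.0, 1.0])):
    """Initial attitude, gyro bias, velocity and position from a static segment."""
    a_mean = acc_samples.mean(axis=0)           # at rest the specific force points up
    R0, _ = Rotation.align_vectors([up_world],
                                   [a_mean / np.linalg.norm(a_mean)])
    bias_gyro = gyro_samples.mean(axis=0)       # stationary: mean angular rate = bias
    vel0, pos0 = np.zeros(3), np.zeros(3)       # velocity and coordinates set to 0
    return R0.as_matrix(), bias_gyro, vel0, pos0
```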
Step S3: construct the factor graph model from the gravity vector, the registration value and the IMU pre-integration, and update parameters such as the IMU bias.
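As a hedged sketch of what such a factor graph looks like, the snippet below uses the GTSAM library (assumed available, with poses given as gtsam.Pose3 objects); for brevity the IMU pre-integration term is reduced to a relative-pose BetweenFactor, whereas a full implementation would use gtsam.ImuFactor with preintegrated measurements and explicit bias variables.

```python
import numpy as np
import gtsam

def build_graph(registration_poses, imu_relative_poses):
    """Fuse scan-to-map registration and IMU odometry in a factor graph."""
    graph, initial = gtsam.NonlinearFactorGraph(), gtsam.Values()
    prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
    meas_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-1))
    graph.add(gtsam.PriorFactorPose3(0, registration_poses[0], prior_noise))
    initial.insert(0, registration_poses[0])
    for i, (reg, dT) in enumerate(zip(registration_poses[1:], imu_relative_poses)):
        graph.add(gtsam.BetweenFactorPose3(i, i + 1, dT, meas_noise))  # IMU term
        graph.add(gtsam.PriorFactorPose3(i + 1, reg, meas_noise))      # registration term
        initial.insert(i + 1, reg)
    return gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
```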
Step S4: input the preprocessed point cloud data into the factor graph model to obtain real-time 3D carrier coordinates, and generate the initial 3D sub-map from the 3D carrier coordinates and the preprocessed point cloud data.
Step S5: project the initial 3D sub-map in the horizontal direction according to the initial attitude to obtain the test 2D sub-map.
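A minimal sketch of this projection, assuming the initial attitude is available as a rotation matrix R_init that levels the cloud, with an assumed grid resolution for rasterizing the projected points into an image:

```python
import numpy as np

def project_to_2d(points, R_init, resolution=0.1):
    """Level the 3D sub-map with the initial attitude and rasterize it to 2D."""
    leveled = points @ R_init.T                 # align the Z axis with gravity
    xy = leveled[:, :2]                         # drop height: horizontal projection
    ij = np.floor((xy - xy.min(axis=0)) / resolution).astype(int)
    h, w = ij.max(axis=0) + 1
    img = np.zeros((h, w), dtype=np.uint8)
    img[ij[:, 0], ij[:, 1]] = 255               # mark occupied cells
    return img
```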
Step S6: extract the SURF feature points of the test 2D sub-map and compute their feature descriptors, then match them against the feature points of the existing 2D sub-maps, as shown in Fig. 2. When the number of matched feature points exceeds 5, the two sub-maps are considered to contain a loop, and the 2D rigid-body transformation parameters are calculated from the feature points of the looped sub-maps.
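A hedged OpenCV sketch of this loop check follows. SURF lives in the opencv-contrib package and requires the non-free modules; cv2.estimateAffinePartial2D estimates a similarity transform (rotation, translation and uniform scale), used here as a stand-in for the 2D rigid-body transform; the threshold of 5 matches follows the text above.

```python
import cv2
import numpy as np

def match_submaps(img_new, img_old, min_matches=5):
    """SURF matching between two 2D sub-maps; returns a 2D transform or None."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    k1, d1 = surf.detectAndCompute(img_new, None)
    k2, d2 = surf.detectAndCompute(img_old, None)
    matches = cv2.BFMatcher().knnMatch(d1, d2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # ratio test
    if len(good) <= min_matches:
        return None                             # no loop between the sub-maps
    src = np.float32([k1[m.queryIdx].pt for m in good])
    dst = np.float32([k2[m.trainIdx].pt for m in good])
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return M                                    # 2x3 transform estimated with RANSAC
```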
Step S7: as shown in Fig. 3, calculate the initial 3D coordinate value from the spatial relationship between the original frame and the initial 3D sub-map together with the 2D rigid-body transformation parameters. Perform sliding registration in the Z direction starting from the initial 3D coordinate value to obtain a progressive registration value, carry out local iterative optimization according to the progressive registration value, and calculate the 3D coordinate transformation parameters; then establish a pose graph from the 3D coordinate transformation parameters and obtain the advanced pose by joint optimization.
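A minimal sketch of the Z-direction sliding registration, assuming T_init is the 4x4 initial transform lifted from the 2D parameters and map_kdtree is a KD-tree over the existing map (e.g. scipy.spatial.cKDTree); the mean nearest-neighbour distance is an assumed stand-in for the registration score.

```python
import numpy as np

def slide_in_z(T_init, scan, map_kdtree, z_range=2.0, step=0.1):
    """Score candidate Z offsets around the initial value; keep the best one."""
    best_T, best_err = T_init, np.inf
    for dz in np.arange(-z_range, z_range + step, step):
        T = T_init.copy()
        T[2, 3] += dz                           # slide along Z only
        pts = scan @ T[:3, :3].T + T[:3, 3]
        err = map_kdtree.query(pts)[0].mean()   # mean nearest-neighbour distance
        if err < best_err:
            best_T, best_err = T, err
    return best_T, best_err                     # progressive registration value
```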
Step S8: judge whether the difference between two consecutive 3D coordinate transformation parameters is smaller than the parameter threshold; if so, execute step S10, otherwise execute step S9.
Step S9: keep the initial 3D sub-map unchanged, re-project it according to the advanced pose to obtain a new 2D sub-map, and return to step S6 (updating the values of the 3D transformation coordinates and the advanced pose).
Step S10: take the 3D coordinates corresponding to the current 3D coordinate transformation parameters as the final 3D coordinates, and import them into the 3D sub-map to generate the final 3D map.
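Putting steps S6 to S10 together, the sketch below shows the shape of the iteration; match_and_register is an assumed helper bundling the matching and Z registration of steps S6 and S7, returning the 3D transformation parameters and the advanced pose.

```python
import numpy as np

def iterate_until_converged(submap_3d, pose, project_to_2d, match_and_register,
                            eps=1e-3, max_iter=20):
    """Repeat S5/S9 projection and S6-S7 registration until parameters settle."""
    prev = None
    for _ in range(max_iter):
        img = project_to_2d(submap_3d, pose)    # S5/S9: re-project with the pose
        params, pose = match_and_register(img)  # S6-S7: match, slide, optimize
        if prev is not None and np.linalg.norm(params - prev) < eps:
            break                               # S8: consecutive difference small
        prev = params
    return params, pose                         # S10: final coordinates and pose
```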
This embodiment also provides a simultaneous localization and mapping device fusing a laser radar and an IMU, comprising a memory and a processor: the memory stores a computer program, and the processor, when executing the program, implements the simultaneous localization and mapping method described above.
This embodiment also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the steps of the simultaneous localization and mapping method fusing a laser radar and an IMU described above.
The foregoing describes preferred embodiments of the invention in detail. It should be understood that a person of ordinary skill in the art can make numerous modifications and variations according to the concept of the invention without creative effort. Therefore, all technical solutions that a person skilled in the art can obtain by logical analysis, reasoning or limited experimentation based on the inventive concept and the prior art shall fall within the scope of protection defined by the claims.

Claims (10)

1. A simultaneous localization and mapping method fusing a laser radar and an IMU, characterized by comprising the following steps:
S1, acquiring point cloud data and IMU data, and preprocessing the point cloud data and the IMU data;
S2, initializing a localization and mapping system with the preprocessed point cloud data and IMU data, registering the preprocessed point cloud data against the existing map according to the initialized system to obtain an initial registration value, and calculating a current gravity vector, an initial attitude and an IMU integral value;
S3, constructing a factor graph model from the gravity vector, the registration value and the IMU pre-integration;
S4, inputting the preprocessed point cloud data into the factor graph model to obtain real-time 3D carrier coordinates, and generating an initial 3D sub-map from the 3D carrier coordinates and the preprocessed point cloud data;
S5, projecting the initial 3D sub-map in the horizontal direction according to the initial attitude to obtain a test 2D sub-map;
S6, extracting feature points of the test 2D sub-map, matching them against the feature points of the existing 2D sub-maps through the localization and mapping system, and, when the number of matched feature points exceeds a quantity threshold, calculating 2D rigid-body transformation parameters from the matched feature points;
S7, calculating an initial 3D coordinate value from the preprocessed point cloud data, the initial 3D sub-map and the 2D rigid-body transformation parameters, performing sliding registration in the Z direction starting from the initial 3D coordinate value to obtain a progressive registration value, and calculating 3D coordinate transformation parameters from the progressive registration value; establishing a pose graph according to the 3D coordinate transformation parameters and obtaining an advanced pose;
S8, judging whether the difference between two consecutive 3D coordinate transformation parameters is smaller than a parameter threshold; if so, executing step S10, otherwise executing step S9;
S9, keeping the initial 3D sub-map unchanged, re-projecting it according to the advanced pose to obtain a new 2D sub-map, and returning to step S6;
S10, taking the 3D coordinates corresponding to the current 3D coordinate transformation parameters as the final 3D coordinates, and importing them into the 3D sub-map to generate the final 3D map.
2. The simultaneous localization and mapping method of claim 1, wherein the initialization in step S2 comprises static initialization or dynamic initialization.
3. The simultaneous localization and mapping method of claim 2, wherein the static initialization comprises: calculating the rotation matrix between the mean of the IMU acceleration measurements and the gravity vector of the world coordinate system to determine the initial attitude, setting the mean of the IMU angular-velocity measurements as the initial IMU bias, and setting the IMU velocity and coordinates to 0.
4. The simultaneous localization and mapping method of claim 2, wherein the dynamic initialization comprises: establishing an initial-frame coordinate system and calculating the point-cloud relative rotation and the pre-integrated relative rotation to obtain the initial IMU bias; then calculating the point-cloud relative displacement and the pre-integrated relative displacement to obtain the IMU velocity, coordinates, initial attitude and initial gravity vector, and transforming all values into the world coordinate system according to the initial gravity vector.
5. The simultaneous localization and mapping method of claim 1, wherein the feature points of the test 2D sub-map in step S6 are SURF feature points.
6. The simultaneous localization and mapping method of claim 1, wherein the preprocessing in step S1 comprises: pre-integrating the IMU measurements whose time stamps are earlier than the main-radar time stamp; taking the main-radar time stamp as the reference, reordering by time stamp all laser points in the auxiliary radars whose sampling time stamps fall within that interval; and transforming each laser point into the IMU coordinate system, calculating its relative motion and correcting motion distortion to obtain the preprocessed point cloud data.
7. The simultaneous localization and mapping method of claim 1, wherein the quantity threshold in step S6 ranges from 4 to 6.
8. The simultaneous localization and mapping method of claim 1, wherein the point cloud data and IMU data in step S1 are acquired by a plurality of sixteen-line laser radars and IMUs.
9. A simultaneous localization and mapping device fusing a laser radar and an IMU, characterized by comprising a memory and a processor; the memory is used for storing a computer program; the processor is configured, when executing the computer program, to implement the simultaneous localization and mapping method fusing a laser radar and an IMU according to any of claims 1 to 8.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the simultaneous localization and mapping method fusing a laser radar and an IMU according to any of claims 1 to 8.
CN202110953344.4A 2021-08-19 2021-08-19 Simultaneous positioning and mapping method and device integrating laser radar and IMU and storage medium Active CN113763549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110953344.4A CN113763549B (en) 2021-08-19 2021-08-19 Simultaneous positioning and mapping method and device integrating laser radar and IMU and storage medium

Publications (2)

Publication Number Publication Date
CN113763549A CN113763549A (en) 2021-12-07
CN113763549B (en) 2023-07-07

Family

ID=78790411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110953344.4A Active CN113763549B (en) 2021-08-19 2021-08-19 Simultaneous positioning and mapping method and device integrating laser radar and IMU and storage medium

Country Status (1)

Country Link
CN (1) CN113763549B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114046792B (en) * 2022-01-07 2022-04-26 陕西欧卡电子智能科技有限公司 Unmanned ship water surface positioning and mapping method, device and related components
CN114593724A (en) * 2022-01-21 2022-06-07 北京邮电大学 Cluster fusion positioning method and device
CN116878488B (en) * 2023-09-07 2023-11-28 湘江实验室 Picture construction method and device, storage medium and electronic device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11567201B2 (en) * 2016-03-11 2023-01-31 Kaarta, Inc. Laser scanner with real-time, online ego-motion estimation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020155616A1 (en) * 2019-01-29 2020-08-06 浙江省北大信息技术高等研究院 Digital retina-based photographing device positioning method
CN111207774A (en) * 2020-01-17 2020-05-29 山东大学 Method and system for laser-IMU external reference calibration
CN113066105A (en) * 2021-04-02 2021-07-02 北京理工大学 Positioning and mapping method and system based on fusion of laser radar and inertial measurement unit

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on a laser and vision mapping method for mobile robots based on an improved ICP algorithm; Zhang Jie; Zhou Jun; Mechanical & Electrical Engineering (12); full text *
Indoor navigation method for nursing robots based on a hybrid map; Zhang Lizhi; Chen Diansheng; Liu Weihui; Journal of Beijing University of Aeronautics and Astronautics (05); full text *

Also Published As

Publication number Publication date
CN113763549A (en) 2021-12-07

Similar Documents

Publication Publication Date Title
CN113763549B (en) Simultaneous positioning and mapping method and device integrating laser radar and IMU and storage medium
CN110070615B (en) Multi-camera cooperation-based panoramic vision SLAM method
CN109522832B (en) Loop detection method based on point cloud segment matching constraint and track drift optimization
CN113066105A (en) Positioning and mapping method and system based on fusion of laser radar and inertial measurement unit
CN110285806A (en) The quick Precision Orientation Algorithm of mobile robot based on the correction of multiple pose
JP5627325B2 (en) Position / orientation measuring apparatus, position / orientation measuring method, and program
CN110570449B (en) Positioning and mapping method based on millimeter wave radar and visual SLAM
CN112634451A (en) Outdoor large-scene three-dimensional mapping method integrating multiple sensors
CN112987065B (en) Multi-sensor-integrated handheld SLAM device and control method thereof
CN110187375A (en) A kind of method and device improving positioning accuracy based on SLAM positioning result
CN111366153B (en) Positioning method for tight coupling of laser radar and IMU
WO2020063878A1 (en) Data processing method and apparatus
CN113763548B (en) Vision-laser radar coupling-based lean texture tunnel modeling method and system
CN114013449B (en) Data processing method and device for automatic driving vehicle and automatic driving vehicle
CN115272596A (en) Multi-sensor fusion SLAM method oriented to monotonous texture-free large scene
CN113933818A (en) Method, device, storage medium and program product for calibrating laser radar external parameter
Pan et al. Voxfield: Non-projective signed distance fields for online planning and 3d reconstruction
CN107463871A (en) A kind of point cloud matching method based on corner characteristics weighting
CN117367412A (en) Tightly-coupled laser inertial navigation odometer integrating bundle set adjustment and map building method
CN115950414A (en) Adaptive multi-fusion SLAM method for different sensor data
Kupervasser et al. Robust positioning of drones for land use monitoring in strong terrain relief using vision-based navigation
CN113495281B (en) Real-time positioning method and device for movable platform
CN114777775A (en) Multi-sensor fusion positioning method and system
Hu et al. Efficient Visual-Inertial navigation with point-plane map
CN113960614A (en) Elevation map construction method based on frame-map matching

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant