CN113850864B - GNSS/LIDAR loop detection method for outdoor mobile robot - Google Patents


Info

Publication number
CN113850864B
CN113850864B
Authority
CN
China
Prior art keywords
loop
gnss
point cloud
frame
similarity
Prior art date
Legal status
Active
Application number
CN202111071687.4A
Other languages
Chinese (zh)
Other versions
CN113850864A (en)
Inventor
陈明松
张驰洲
王冠强
蔺永诚
Current Assignee
Central South University
Original Assignee
Central South University
Priority date
Filing date
Publication date
Application filed by Central South University filed Critical Central South University
Priority to CN202111071687.4A priority Critical patent/CN113850864B/en
Publication of CN113850864A publication Critical patent/CN113850864A/en
Application granted granted Critical
Publication of CN113850864B publication Critical patent/CN113850864B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G06T3/067
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02 Preprocessing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Abstract

The invention discloses a GNSS/laser radar loop detection method for a mobile robot, which comprises the following steps. Step 1: acquire a point cloud frame with a laser radar and filter it. Step 2: obtain the distances between the current point cloud frame and the candidate point cloud frame under the odometer and under the GNSS, and calculate the distance feature loop similarity. Step 3: project the input point cloud onto a two-dimensional plane to calculate the intensity-based global feature loop similarity. Step 4: calculate the point-based fast feature loop similarity using FPFH features. Step 5: combine the distance feature, the point cloud global feature, and the fast point local feature to obtain the total loop similarity. Step 6: if the total loop similarity is greater than the set threshold, judge that the current frame and the candidate frame form a loop constraint; otherwise they do not. The method effectively solves the loop detection problem of mobile robots in large-scale environments, reduces missed detections, and improves detection accuracy.

Description

GNSS/LIDAR loop detection method for outdoor mobile robot
Technical Field
The invention relates to a robust loop detection method that combines the global and local features of the point cloud with GNSS information, and belongs to the technical field of simultaneous localization and mapping.
Background
The invention relates to the field of laser simultaneous localization and mapping, and in particular to a laser point cloud loop detection method for outdoor mobile robots. With the development of outdoor mobile robot applications, such as campus mobile robots and outdoor inspection robots, high-precision simultaneous localization and mapping has become central to intelligent obstacle avoidance and path planning. Outdoor mobile robots move over a wide range in complex and changeable environments, and after long operation the positioning system accumulates a large error that must be corrected through loop detection.
Patent document CN112907491A discloses a laser point cloud loop detection method and system suitable for underground roadways. The method obtains the three-dimensional pose of the positioning system with a laser odometry method, describes point cloud keyframes with a viewpoint-based feature histogram, and calculates the similarity of keyframe point clouds with a vector correlation formula. However, this technical solution still has the following problems: (1) the loop detection process considers only the global features of the point cloud, ignoring other features (intensity information) and local features, so missed detections are common; (2) the viewpoint-based histogram feature used as the loop similarity criterion does not account for the translational invariance of the point cloud, so detection accuracy is low.
Therefore, to realize high-accuracy, high-quality loop detection suited to the working environment of outdoor mobile robots, a loop detection system and method are urgently needed that detect whether the current frame and a candidate frame form a loop constraint, so as to reduce the accumulated positioning error of the mobile robot.
Disclosure of Invention
The invention aims to provide a GNSS/LIDAR loop detection method for an outdoor mobile robot which, using hardware such as an embedded computing platform, a multi-line laser radar, and a GNSS positioning module, effectively solves the loop detection problem of outdoor mobile robots in large-scale environments, reduces missed detections, and improves detection accuracy.
The technical scheme adopted to solve this technical problem is as follows:
a GNSS/LIDAR loop detection method for an outdoor mobile robot, comprising the following steps:
step 1: acquiring point cloud frames in two scenes with a multi-line laser radar, filtering both frames, including ground point cloud removal, outlier removal, and downsampling, and then obtaining the two preprocessed laser point cloud frames to be detected;
step 2: obtaining the distances of the current point cloud frame and the candidate point cloud frame under the odometer and under the GNSS respectively, and calculating the GNSS-information-based distance feature loop similarity;
step 3: projecting each input point cloud frame onto a two-dimensional plane under the bird's-eye view, represented in polar coordinates, and calculating the intensity-based global feature loop similarity;
step 4: calculating the point-based fast feature loop similarity using FPFH features;
step 5: combining the distance feature of the GNSS information, the intensity-based global feature, and the fast point local feature to obtain the total loop similarity;
step 6: if the total loop similarity is greater than the set threshold, judging that the current frame and the candidate frame form a loop constraint and that a loop is detected; otherwise the current frame and the candidate frame do not form a loop constraint.
Further, the computing platform that detects the loop constraint between the current frame and the candidate frame processes and judges the two frames in no more than 15 milliseconds.
Further, in a GNSS-denied environment, for example when an obstacle occludes the GNSS signal, the method can still accurately determine whether the current frame and the candidate frame form a loop constraint.
The beneficial effects of the invention are as follows:
1. the invention selects the laser radar and the GNSS as sensors, so enough effective information can still be captured in dark and weakly textured outdoor environments, which guarantees the accuracy of loop detection;
2. the invention uses intensity information and fast point feature information as the global and local features respectively, and introduces GNSS distance information as a feature constraint to address the descriptors' lack of distance invariance, so matching accuracy is high; meanwhile, compared with deep learning scene recognition algorithms, memory usage is small;
3. after the similarity calculation between the candidate frame and the current frame in loop detection combines the three feature constraints, the algorithm is more robust and accumulated error can be effectively eliminated.
In summary, the GNSS/LIDAR loop detection system and method for mobile robots realize high-precision loop detection; after the current frame and the candidate frame are judged to form a loop constraint, the relative pose of the two frames can be calculated with the Iterative Closest Point (ICP) or Normal Distributions Transform (NDT) algorithm to reduce the accumulated positioning error.
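As a hedged illustration of this pose refinement step, the sketch below estimates the relative pose of two loop frames with point-to-point ICP using the Open3D library; the voxel size and correspondence distance are assumed tuning values, not parameters taken from the patent.

```python
import numpy as np
import open3d as o3d

def relative_pose_icp(source_pts, target_pts, voxel=0.4, max_dist=1.0):
    """Estimate the rigid transform taking the source frame onto the target
    frame with point-to-point ICP; voxel/max_dist are assumed values."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_pts))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(target_pts))
    src, tgt = src.voxel_down_sample(voxel), tgt.voxel_down_sample(voxel)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 homogeneous relative pose
```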
Drawings
FIG. 1 is an overall block diagram of the mobile robot-oriented GNSS/LIDAR system;
FIG. 2 is a flow chart of the mobile robot-oriented GNSS/LIDAR loop detection method.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
As shown in fig. 1, the GNSS/LIDAR loop detection system for an outdoor mobile robot of this embodiment includes an embedded computing platform, a multi-line laser radar, and a GNSS positioning module. The embedded computing platform receives the laser radar point cloud data and the absolute positioning data from the GNSS positioning module and processes them centrally; the multi-line laser radar senses surrounding environment information and generates point clouds in real time; the GNSS positioning module comprises a mobile-station core board, a GNSS antenna, a 4G module, and a mobile-station power supply, and receives satellite navigation signals through the GNSS antenna. In addition, the GNSS positioning module can generate a pulse-per-second signal (1PPS) and 1 Hz GPRMC/GPGGA messages for time synchronization with the lidar data.
Fig. 2 shows the flowchart of the GNSS/LIDAR loop detection method for mobile robots, which comprises the following steps.
Global loop detection based on the laser point cloud judges, without requiring prior pose information, whether the current scene is a similar scene visited before. If so, the relative pose between the current scene and the historical scene is calculated, and the accumulated error of the laser odometer is corrected to improve the positioning accuracy of the mobile robot.
Specifically:
step 1: acquiring point cloud frames in two scenes by adopting a multi-line laser radar;
step 2: filter the point cloud frames in the two scenes, including ground point cloud removal, outlier removal, and downsampling; then obtain the two preprocessed laser point cloud frames to be detected, P = {p_1, p_2, ..., p_n} and P' = {p'_1, p'_2, ..., p'_n};
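A minimal sketch of this preprocessing with the Open3D library is shown below; the plane-fit, outlier, and voxel parameters are assumed values, since the patent does not specify them.

```python
import numpy as np
import open3d as o3d

def preprocess(points):
    """Ground removal, outlier removal and downsampling for one lidar frame.
    points: (N, 3) array; all thresholds are assumptions, not patent values."""
    pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
    # Ground removal: fit the dominant plane with RANSAC and drop its inliers.
    _, ground_idx = pcd.segment_plane(distance_threshold=0.2,
                                      ransac_n=3, num_iterations=100)
    pcd = pcd.select_by_index(ground_idx, invert=True)
    # Outlier removal: drop points far from their neighbours.
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    # Downsampling to a uniform density.
    pcd = pcd.voxel_down_sample(voxel_size=0.2)
    return np.asarray(pcd.points)
```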
Step 3: then the GNSS-information-based distance feature, the intensity-based global feature descriptor, and the fast point local feature description are calculated respectively.
Step 4: distance-based feature description calculation:
obtain the distances d_1 and d_2 of the point cloud frames P and P' under the odometer and under the GNSS respectively. The odometer can be a laser odometer or the like, through which the relative coordinates of the two frames, (x_1, y_1, z_1) and (x_2, y_2, z_2), are obtained; the distance under the odometer is calculated as:
d_1(P,P') = ((x_1 - x_2)^2 + (y_1 - y_2)^2 + (z_1 - z_2)^2)^(1/2)   (1)
The longitude-latitude-altitude coordinates of the two point cloud frames are obtained under the GNSS, namely (lat_1, lng_1, alt_1) and (lat_2, lng_2, alt_2), and the height-compensated distance is calculated with the haversine great-circle formula:
d_2(P,P') = ((2r·arcsin((sin^2((lat_1 - lat_2)/2) + cos(lat_1)·cos(lat_2)·sin^2((lng_1 - lng_2)/2))^(1/2)))^2 + (alt_1 - alt_2)^2)^(1/2)   (2)
where r (km) is the earth radius.
The distance profile is expressed by the difference of the two, as follows:
Δd(P,P')=|d 1 (P,P')-d 2 (P,P')| (3)
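A minimal numpy sketch of equations (1)-(3) follows; the height-compensated haversine form used for d_2 is an assumption reconstructed from the surrounding text rather than a formula confirmed by the patent.

```python
import numpy as np

EARTH_RADIUS_M = 6371000.0  # assumed value of r, in metres

def distance_feature(odo1, odo2, gnss1, gnss2):
    """Delta-d of equation (3): |odometer distance - GNSS distance|.
    odo1, odo2:   (x, y, z) odometer coordinates, in metres.
    gnss1, gnss2: (lat, lng, alt), angles in radians, altitude in metres."""
    # Equation (1): Euclidean distance under the odometer.
    d1 = np.linalg.norm(np.asarray(odo1) - np.asarray(odo2))
    (lat1, lng1, alt1), (lat2, lng2, alt2) = gnss1, gnss2
    # Equation (2), assumed form: great-circle distance plus altitude term.
    h = (np.sin((lat1 - lat2) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lng1 - lng2) / 2) ** 2)
    ground = 2 * EARTH_RADIUS_M * np.arcsin(np.sqrt(h))
    d2 = np.hypot(ground, alt1 - alt2)
    # Equation (3): the distance feature.
    return abs(d1 - d2)
```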
step 5: for intensity-based global feature computation:
(1) Firstly, each input point cloud frame is projected onto a two-dimensional plane under the bird's-eye view and represented in polar coordinates; each three-dimensional point p_k = {x_k, y_k, z_k, I_k} is converted according to:
ρ_k = (x_k^2 + y_k^2)^(1/2),  θ_k = arctan(y_k / x_k)   (4)
The point cloud can then be distributed into an M×N two-dimensional matrix subspace in x-y polar coordinates, and each block can be expressed as:
W_ij = {p_k | (i - 1)·L_max/M ≤ ρ_k < i·L_max/M,  (j - 1)·2π/N ≤ θ_k < j·2π/N}   (5)
where i ∈ {1, 2, ..., M}, j ∈ {1, 2, ..., N}, θ_k is taken in [0, 2π), and L_max is the furthest scanning distance on the horizontal plane of the laser point cloud frame.
Each element of the matrix is then described with the intensity information as:
η_ij = (1/C)·Σ_{p_k ∈ W_ij} I_k   (6)
where C is the total number of points in W_ij.
(2) The intensity similarity between the two frames is calculated column by column with the cosine distance:
S_I(P,P') = (1/N)·Σ_{i=1}^{N} (V_i · V'_i)/(‖V_i‖·‖V'_i‖)   (7)
where V_i and V'_i are the i-th column vectors of the two-dimensional subspace matrices of the point cloud frames P and P', respectively.
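A sketch of equations (4)-(7) in numpy is given below; the M×N resolution is an assumed value, and the mean-intensity encoding follows equation (6).

```python
import numpy as np

def intensity_matrix(points, M=20, N=60):
    """Equations (4)-(6): bird's-eye-view polar matrix of mean intensities.
    points: (K, 4) array of (x, y, z, intensity); M, N are assumed values."""
    x, y, inten = points[:, 0], points[:, 1], points[:, 3]
    rho = np.hypot(x, y)                      # equation (4)
    theta = np.arctan2(y, x) % (2 * np.pi)    # angle wrapped to [0, 2*pi)
    l_max = rho.max()                         # furthest horizontal distance
    ring = np.minimum((rho / l_max * M).astype(int), M - 1)
    sector = np.minimum((theta / (2 * np.pi) * N).astype(int), N - 1)
    total = np.zeros((M, N))
    count = np.zeros((M, N))
    np.add.at(total, (ring, sector), inten)   # sum of intensities per block
    np.add.at(count, (ring, sector), 1)       # C: number of points per block
    return np.divide(total, count, out=np.zeros((M, N)), where=count > 0)

def intensity_similarity(mat_a, mat_b, eps=1e-9):
    """Equation (7): mean column-wise cosine similarity of two matrices."""
    num = (mat_a * mat_b).sum(axis=0)
    den = np.linalg.norm(mat_a, axis=0) * np.linalg.norm(mat_b, axis=0) + eps
    return float(np.mean(num / den))
```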
Step 6: for local feature computation based on fast point feature descriptions:
point fast feature histogram (FastPoint Feature Histogram, FPFH) is colloquially a feature representing three-dimensional pointsSIFT, ORB features, etc. in similar two-dimensional images carry some specific information. The FPFH implementation uses 11 statistical spaces, and feature histograms are calculated separately and combined to obtain a 33-element feature vector. Thus, the feature vector H of the two-frame point cloud is calculated i And H j The similarity of (2) is as follows:
where N ranges over the set of elements of the feature vector H; H_i(N) denotes any element of the feature vector H_i; H_j(N) denotes any element of the feature vector H_j; H̄_i denotes the element mean of the feature vector H_i; and H̄_j denotes the element mean of the feature vector H_j.
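A sketch using the Open3D library follows; the normal and FPFH search radii are assumed tuning values, and equation (8) is implemented as the correlation of one pair of 33-element vectors.

```python
import numpy as np
import open3d as o3d

def fpfh_features(points, radius=1.0):
    """Compute 33-dimensional FPFH vectors for every point of one frame.
    points: (K, 3) array; radii are assumed values. Returns a (33, K) matrix."""
    pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=radius, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=radius, max_nn=100))
    return np.asarray(fpfh.data)

def fpfh_similarity(h_i, h_j, eps=1e-9):
    """Equation (8): correlation of two 33-element FPFH feature vectors."""
    a, b = h_i - h_i.mean(), h_j - h_j.mean()
    return float((a * b).sum() / (np.sqrt((a**2).sum() * (b**2).sum()) + eps))
```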
Step 7: the total loop similarity is obtained by combining the distance feature of the GNSS information, the intensity-based global feature descriptor, and the fast point local feature description. When Δd(P,P') is greater than the set threshold, the distance estimated by the odometer deviates far from the distance estimated by the GNSS, and the GNSS may be failing; for this GNSS-denied case the weight value w = 0, otherwise w takes a set constant. λ is the intensity-information global feature weight constant.
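The fusion can be sketched as below; the exponential kernel that turns Δd into a similarity and the residual weight on the FPFH term are assumptions, since only the weights w and λ and the gating of w are stated in the text.

```python
import numpy as np

def total_similarity(delta_d, s_intensity, s_fpfh,
                     w=0.3, lam=0.4, d_thresh=5.0, sigma=2.0):
    """Hedged sketch of step 7: fuse the three feature similarities.
    delta_d: equation (3); s_intensity: equation (7); s_fpfh: equation (8).
    The exp(-delta_d/sigma) kernel and the (1 - w - lam) FPFH weight are
    assumptions; only w, lam and the gating of w come from the patent text."""
    if delta_d > d_thresh:   # odometer and GNSS disagree: treat as GNSS-denied
        w = 0.0
    s_dist = np.exp(-delta_d / sigma)
    return w * s_dist + lam * s_intensity + (1 - w - lam) * s_fpfh
```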
Step 8: if the total loop similarity of the current frame P and the candidate frame P' is greater than the set value, a loop is considered detected, and a loop constraint is formed between the two frames.
According to the GNSS/LIDAR loop detection method for mobile robots disclosed by the invention, the laser radar and the GNSS are selected as sensors, so enough effective information can still be captured in dark and weakly textured outdoor environments, guaranteeing the accuracy of loop detection. The invention uses intensity information and fast point feature information as the global and local features respectively, and introduces GNSS distance information as a feature constraint to address the descriptors' lack of distance invariance, so matching accuracy is high; meanwhile, compared with deep learning scene recognition algorithms, memory usage is small and no large dataset is needed. After the similarity calculation between the candidate frame and the current frame in loop detection combines the three feature constraints, the algorithm is more robust and accumulated error can be effectively eliminated.
In conclusion, the GNSS/LIDAR loop detection method for mobile robots realizes high-precision loop detection and lays a foundation for high-precision positioning of the mobile robot.
While the invention has been described with reference to the preferred embodiments, it is not limited thereto, and various changes and modifications may be made by those skilled in the art without departing from the spirit and scope of the invention, and therefore the scope of the invention is defined in the appended claims.

Claims (3)

1. A GNSS/laser radar loop detection method for a mobile robot, characterized in that global features and local features of the point cloud and GNSS information are adopted to realize high-precision point cloud scene recognition, the method comprising the following steps:
step 1: acquiring point cloud frames in two scenes with a multi-line laser radar, and simultaneously filtering the point cloud frames in the two scenes, the filtering comprising ground point cloud removal, outlier removal, and downsampling; then obtaining the two preprocessed laser point cloud frames to be detected, P = {p_1, p_2, ..., p_n} and P' = {p'_1, p'_2, ..., p'_n};
step 2: obtaining the distances d_1 and d_2 of the current point cloud frame P and the candidate point cloud frame P' under the odometer and under the GNSS respectively, for calculating the GNSS-information-based distance feature loop similarity;
Specifically:
the relative coordinates of the two frames, (x_1, y_1, z_1) and (x_2, y_2, z_2), are obtained by the odometer, and the distance under the odometer is calculated as:
d_1(P,P') = ((x_1 - x_2)^2 + (y_1 - y_2)^2 + (z_1 - z_2)^2)^(1/2)   (1)
and obtaining the longitude-latitude-altitude coordinates of the two point cloud frames under the GNSS, namely (lat_1, lng_1, alt_1) and (lat_2, lng_2, alt_2), the height-compensated distance being calculated with the haversine great-circle formula:
d_2(P,P') = ((2r·arcsin((sin^2((lat_1 - lat_2)/2) + cos(lat_1)·cos(lat_2)·sin^2((lng_1 - lng_2)/2))^(1/2)))^2 + (alt_1 - alt_2)^2)^(1/2)   (2)
wherein r (km) is the earth radius;
the distance feature being expressed as the difference of the two, as follows:
Δd(P,P') = |d_1(P,P') - d_2(P,P')|   (3)
step 3: projecting each input point cloud frame onto a two-dimensional plane under the bird's-eye view, and calculating the intensity-based global feature loop similarity using a polar coordinate representation;
specifically, each three-dimensional point p_k = {x_k, y_k, z_k, I_k} is converted according to:
ρ_k = (x_k^2 + y_k^2)^(1/2),  θ_k = arctan(y_k / x_k)   (4)
the point cloud can be distributed into an M×N two-dimensional matrix subspace in polar coordinates, and each block can be expressed as:
W_ij = {p_k | (i - 1)·L_max/M ≤ ρ_k < i·L_max/M,  (j - 1)·2π/N ≤ θ_k < j·2π/N}   (5)
where i ∈ {1, 2, ..., M}, j ∈ {1, 2, ..., N}, θ_k is taken in [0, 2π), and L_max is the furthest scanning distance on the horizontal plane of the laser point cloud frame;
each element of the matrix is described with the intensity information as:
η_ij = (1/C)·Σ_{p_k ∈ W_ij} I_k   (6)
where C is the total number of points in W_ij;
the intensity similarity between the two frames is calculated as follows:
S_I(P,P') = (1/N)·Σ_{i=1}^{N} (V_i · V'_i)/(‖V_i‖·‖V'_i‖)   (7)
where V_i and V'_i are the i-th column vectors of the two-dimensional subspace matrices of the point cloud frames P and P', respectively;
step 4: calculating the point-based fast feature loop similarity using the FPFH feature;
the similarity of the feature vectors H_i and H_j of the two point cloud frames is calculated as follows:
s(H_i,H_j) = Σ_N (H_i(N) - H̄_i)·(H_j(N) - H̄_j) / (Σ_N (H_i(N) - H̄_i)^2 · Σ_N (H_j(N) - H̄_j)^2)^(1/2)   (8)
where N represents the set of elements in the feature vector H; H_i(N) represents any element of the feature vector H_i; H_j(N) represents any element of the feature vector H_j; H̄_i represents the element mean of the feature vector H_i; H̄_j represents the element mean of the feature vector H_j; and K represents the total number of points in the point cloud;
step 5: combining the distance feature of the GNSS information, the intensity-based global feature, and the fast point local feature to obtain the total loop similarity;
Specifically:
when Δd(P,P') is greater than the set threshold, the distance estimated by the odometer deviates far from the distance estimated by the GNSS, and the GNSS may be failing; for the GNSS-denied case the weight value w = 0, otherwise w takes a set constant; and λ is the intensity-information global feature weight constant;
step 6: if the total loop similarity is greater than the set threshold, judging that the current frame and the candidate frame form a loop constraint and that a loop is detected; otherwise the current frame and the candidate frame do not form a loop constraint.
2. The mobile robot-oriented GNSS/lidar loop detection method of claim 1, wherein: the computing platform that detects the loop constraint between the current frame and the candidate frame processes and judges the two frames in no more than 150 milliseconds.
3. The mobile robot-oriented GNSS/lidar loop detection method of claim 1, wherein: in an environment with missing GNSS data, whether the current frame and the candidate frame form a loop constraint can still be accurately judged.
CN202111071687.4A 2021-09-14 2021-09-14 GNSS/LIDAR loop detection method for outdoor mobile robot Active CN113850864B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111071687.4A CN113850864B (en) 2021-09-14 2021-09-14 GNSS/LIDAR loop detection method for outdoor mobile robot


Publications (2)

Publication Number Publication Date
CN113850864A CN113850864A (en) 2021-12-28
CN113850864B (en) 2024-04-12

Family

ID=78974056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111071687.4A Active CN113850864B (en) 2021-09-14 2021-09-14 GNSS/LIDAR loop detection method for outdoor mobile robot

Country Status (1)

Country Link
CN (1) CN113850864B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797425B (en) * 2023-01-19 2023-06-16 中国科学技术大学 Laser global positioning method based on point cloud aerial view and coarse-to-fine strategy


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110849367A (en) * 2019-10-08 2020-02-28 杭州电子科技大学 Indoor positioning and navigation method based on visual SLAM fused with UWB
CN112665575A (en) * 2020-11-27 2021-04-16 重庆大学 SLAM loop detection method based on mobile robot
CN112396167A (en) * 2020-12-30 2021-02-23 桂林电子科技大学 Loop detection method for fusing appearance similarity and spatial position information
CN112907491A (en) * 2021-03-18 2021-06-04 中煤科工集团上海有限公司 Laser point cloud loopback detection method and system suitable for underground roadway
CN112950781A (en) * 2021-03-19 2021-06-11 中山大学 Point cloud map construction method for multi-sensor dynamic weighting fusion of special scene
CN112990040A (en) * 2021-03-25 2021-06-18 北京理工大学 Robust loopback detection method combining global and local features
CN113358112A (en) * 2021-06-03 2021-09-07 北京超星未来科技有限公司 Map construction method and laser inertia odometer

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang H., et al. "Intensity Scan Context: Coding Intensity and Geometry Relations for Loop Closure Detection." 2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 2095-2101. *

Also Published As

Publication number Publication date
CN113850864A (en) 2021-12-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant