CN118010008A - Binocular SLAM and inter-machine loop optimization-based double unmanned aerial vehicle co-location method

Binocular SLAM and inter-machine loop optimization-based double unmanned aerial vehicle co-location method

Info

Publication number
CN118010008A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle, eye, matching, points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410414180.1A
Other languages
Chinese (zh)
Other versions
CN118010008B (en)
Inventor
胡劲文
冯玖松
雷毅飞
张德腾
徐钊
韩军伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority to CN202410414180.1A
Publication of CN118010008A
Application granted
Publication of CN118010008B
Legal status: Active

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a dual unmanned aerial vehicle (UAV) co-location method based on binocular SLAM and inter-UAV loop closure optimization, comprising the following steps: obtaining first-UAV left-eye feature points and second-UAV left-eye feature points; obtaining successfully matched first-UAV left-eye matching points and second-UAV left-eye matching points; obtaining, from the first-UAV left-eye matching points, the first-UAV right-eye matching points matched with them; obtaining three-dimensional landmark points of the first UAV; calculating the relative pose of the second UAV with respect to the first UAV; and updating the poses of the first and second UAVs as they move, with loop closure optimization and updating, to finally obtain a positioning map. By extracting ORB features and AprilTag corner points within the crossed fields of view of multiple UAVs, matching them between vehicles, and solving the relative pose to complete multi-vehicle initialization, the invention solves the instability and frequent failure of multi-vehicle initialization.

Description

Binocular SLAM and inter-machine loop optimization-based double unmanned aerial vehicle co-location method
Technical Field
The invention belongs to the technical field of multi-vehicle collaborative autonomous positioning, and particularly relates to a dual unmanned aerial vehicle (UAV) co-location method based on binocular SLAM and inter-UAV loop closure optimization.
Background
SLAM (Simultaneous Localization and Mapping) denotes simultaneous localization and map construction. When performing tasks such as search and rescue or situational awareness, an unmanned aerial vehicle (UAV) must often complete its flight where the global positioning system is unavailable, and it typically carries vision sensors for autonomous positioning and navigation; it is therefore important to develop mature and effective visual positioning algorithms that improve the accuracy and robustness of UAV autonomous positioning. A single UAV performing a task suffers from small coverage, poor robustness, and limited observation capability. With the maturation of 5G communication and single-UAV technology, multi-UAV platforms are ever more widely applied: multiple UAVs executing a task cooperatively can enhance environmental perception, reduce attrition cost, and improve space utilization. The first problem to be solved in multi-UAV cooperation is multi-vehicle positioning.
To exploit the advantages of multi-UAV cooperation, much research has addressed the core problem of multi-UAV co-location, including how to share data for inter-UAV association and how to perform global pose optimization effectively. In the prior art, multi-UAV visual autonomous positioning based on pure features and SfM (Structure from Motion) is unstable and prone to failure during multi-vehicle initialization.
Disclosure of Invention
The invention aims to provide a dual-UAV co-location method based on binocular SLAM and inter-UAV loop closure optimization, so as to solve the instability and frequent failure, during multi-vehicle initialization, of multi-UAV visual autonomous positioning based on pure features and SfM.
The invention adopts the following technical scheme: a dual-UAV co-location method based on binocular SLAM and inter-UAV loop closure optimization, comprising the following steps:
Step 1: extracting ORB feature points from the images containing the same AprilTag captured respectively by the left eye of the first UAV and the left eye of the second UAV, obtaining first-UAV left-eye feature points and second-UAV left-eye feature points;
Step 2: matching the first-UAV left-eye feature points against the second-UAV left-eye feature points by Hamming distance, obtaining successfully matched first-UAV left-eye matching points and second-UAV left-eye matching points;
Step 3: obtaining, from the first-UAV left-eye matching points, first-UAV right-eye matching points matched with them;
Step 4: computing AprilTag corner points in the images captured respectively by the left eye and the right eye of the first UAV, and combining them with the first-UAV left-eye matching points and first-UAV right-eye matching points to calculate three-dimensional landmark points of the first UAV;
Step 5: computing AprilTag corner points in the image captured by the left eye of the second UAV, and combining them with the three-dimensional landmark points of the first UAV and the second-UAV left-eye matching points to obtain the relative pose of the second UAV with respect to the first UAV;
Step 6: updating the poses of the first UAV and the second UAV as they move, and performing loop closure optimization and updating to finally obtain a positioning map.
Further, in step 5, the relative pose of the second UAV with respect to the first UAV is calculated by the formula:
$$\hat{T}_{21} = \arg\min_{T} \left( \sum_{i=1}^{n} \left\| p_i - \pi\!\left(K_2\, T\, P_i\right) \right\|^2 + \sum_{j} \left\| c_j - \pi\!\left(K_2\, T\, C_j\right) \right\|^2 \right)$$
where $\hat{T}_{21}$ is the relative pose of the second UAV with respect to the first UAV; $n$ is the number of successfully matched first-UAV and second-UAV left-eye matching points; $p_i$ is the $i$-th second-UAV left-eye matching point; $K_2$ is the camera intrinsic matrix of the second UAV; $P_i$ is the three-dimensional landmark point of the first UAV corresponding to the $i$-th ORB feature; $c_j$ is the $j$-th AprilTag corner point in the image captured by the left eye of the second UAV; $C_j$ is the corresponding three-dimensional AprilTag corner point of the first UAV; and $\pi(\cdot)$ denotes perspective projection onto the image plane.
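By way of illustration only, a minimal sketch of this optimization in Python with SciPy follows; the axis-angle parameterization, function names, and the plain least-squares solver are assumptions of this sketch, not prescribed by the invention:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(K, T, P):
    """Project Nx3 world points P into the image through world-to-camera pose T."""
    Pc = T[:3, :3] @ P.T + T[:3, 3:4]   # 3xN points in the camera frame
    uvw = K @ Pc
    return (uvw[:2] / uvw[2:]).T        # Nx2 pixel coordinates

def residuals(x, K2, P, p_obs):
    """Reprojection errors for pose parameters x = (rotation vector, translation)."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(x[:3]).as_matrix()
    T[:3, 3] = x[3:]
    return (project(K2, T, P) - p_obs).ravel()

def solve_relative_pose(K2, landmarks, pixels, x0=None):
    """Minimize sum_i ||p_i - pi(K2 T P_i)||^2 over the 6-DoF pose T_21.

    landmarks: Nx3 first-UAV 3D points (ORB landmarks stacked with tag corners)
    pixels:    Nx2 matched pixels in the second UAV's left-eye image
    """
    x0 = np.zeros(6) if x0 is None else x0
    sol = least_squares(residuals, x0, args=(K2, landmarks, pixels))
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    T[:3, 3] = sol.x[3:]
    return T
```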
Further, step 6 specifically comprises:
extracting ORB feature points from the keyframes of the first and second UAVs, building a dictionary-tree index and a bag-of-words model, and constructing a loop closure database; the sources of the current keyframe and its matching frame are then determined: when the keyframe and the matching frame both come from the first UAV or both from the second UAV, loop closure optimization is performed on that UAV's own poses; when they come from the first UAV and the second UAV respectively, loop closure optimization is performed on the poses between the UAVs; the map is updated to finally obtain the positioning map.
The beneficial effects of the invention are as follows:
By extracting ORB features and AprilTag corner points within the crossed fields of view of multiple UAVs, matching them between vehicles, and solving the relative pose to complete multi-vehicle initialization, the invention solves the instability and frequent failure of multi-vehicle initialization;
Because pure ORB-feature matching combined with SfM fails easily, the invention adds AprilTag detection to assist initialization on top of ORB feature matching: after AprilTag detection, ORB features are extracted at the AprilTag, so the feature matching success rate is high and failure caused by too large an observation gap is prevented; moreover, owing to the properties of AprilTag, its corner points can be extracted stably, completing the matching.
Detailed Description
The present invention will be described in detail with reference to the following embodiments.
Multi-UAV autonomous positioning estimates, at consecutive instants, the pose transformation of the camera coordinate system rigidly attached to each UAV relative to the world coordinate system. Initialization solves the association at the initial instant, namely the transformation of each UAV's initial camera coordinate system relative to the world coordinate system, unifying the UAVs into the same world coordinate system.
Aiming at the instability and frequent failure of pure-feature and SfM approaches during multi-vehicle initialization, the relative pose of the first and second UAVs with crossed fields of view is computed based on ORB feature points and AprilTag corner points. That is: first, ORB features and AprilTag corner points are extracted from the image captured by the left eye of the first UAV and matched against those extracted from the image captured by the left eye of the second UAV. Then, the binocular camera of the first UAV triangulates the ORB points and AprilTag corner points to generate the three-dimensional landmark points of the first UAV; in theory, if the successfully matched points correspond to the same three-dimensional landmark points in space, a visual association based on common observation can be established, and the relative pose is obtained by nonlinear optimization. With more than two UAVs, as long as the initial pose of one UAV is known and each pair of UAVs has crossed fields of view, the initial poses can be computed pairwise from the solved relative poses, completing initialization.
Exploiting the relevant properties of AprilTag, AprilTag corner points are extracted by AprilTag detection and combined with ORB features extracted from the UAV images for inter-UAV relative pose estimation, completing the initialization of the multi-UAV platform. For corner extraction, an AprilTag is placed within the crossed fields of view; AprilTag detection is the visual detection of a two-dimensional barcode, and detecting an AprilTag of known size placed in front of the camera yields the ID of the calibration board and the AprilTag detection frame, from which the four corner points on the AprilTag plane are obtained. AprilTag detection is added to assist initialization on top of ORB feature matching: ORB features are extracted at the AprilTag, these features are commonly observed matching points with a high success rate, and the AprilTag corner points can be extracted stably to complete the matching.
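As one possible realization (a sketch under the assumption that OpenCV ≥ 4.7 with its aruco module is used; the 36h11 tag family and the file name are illustrative, the invention does not prescribe a particular detector):

```python
import cv2

def detect_apriltag_corners(gray):
    """Detect AprilTag 36h11 markers; return tag IDs and the 4 corner pixels of each."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)
    return ids, corners   # corners: list of 1x4x2 float arrays, one per detected tag

# usage sketch:
# gray = cv2.imread("uav1_left.png", cv2.IMREAD_GRAYSCALE)
# ids, corners = detect_apriltag_corners(gray)
```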
Therefore, the invention discloses a dual-UAV co-location method based on binocular SLAM and inter-UAV loop closure optimization, comprising the following steps:
Step 1: extracting ORB feature points from the images containing the same AprilTag captured respectively by the left eye of the first UAV and the left eye of the second UAV, obtaining first-UAV left-eye feature points and second-UAV left-eye feature points;
Step 2: matching the first-UAV left-eye feature points against the second-UAV left-eye feature points by Hamming distance, obtaining successfully matched first-UAV left-eye matching points and second-UAV left-eye matching points;
Step 3: obtaining, from the first-UAV left-eye matching points, first-UAV right-eye matching points matched with them;
Step 4: computing AprilTag corner points in the images captured respectively by the left eye and the right eye of the first UAV, and combining them with the first-UAV left-eye matching points and first-UAV right-eye matching points to calculate three-dimensional landmark points of the first UAV;
Step 5: computing AprilTag corner points in the image captured by the left eye of the second UAV, and combining them with the three-dimensional landmark points of the first UAV and the second-UAV left-eye matching points to calculate the relative pose of the second UAV with respect to the first UAV;
Step 6: updating the poses of the first UAV and the second UAV as they move, and performing loop closure optimization and updating to finally obtain a positioning map.
In step 5, the relative pose of the second UAV with respect to the first UAV is calculated by the formula:
$$\hat{T}_{21} = \arg\min_{T} \left( \sum_{i=1}^{n} \left\| p_i - \pi\!\left(K_2\, T\, P_i\right) \right\|^2 + \sum_{j} \left\| c_j - \pi\!\left(K_2\, T\, C_j\right) \right\|^2 \right)$$
where $\hat{T}_{21}$ is the relative pose of the second UAV with respect to the first UAV; $n$ is the number of successfully matched first-UAV and second-UAV left-eye matching points; $p_i$ is the $i$-th second-UAV left-eye matching point; $K_2$ is the camera intrinsic matrix of the second UAV; $P_i$ is the three-dimensional landmark point of the first UAV corresponding to the $i$-th ORB feature; $c_j$ is the $j$-th AprilTag corner point in the image captured by the left eye of the second UAV; $C_j$ is the corresponding three-dimensional AprilTag corner point of the first UAV; and $\pi(\cdot)$ denotes perspective projection onto the image plane.
Step 6 specifically comprises:
extracting ORB feature points from the keyframes of the first and second UAVs, building a dictionary-tree index and a bag-of-words model, and constructing a loop closure database; the sources of the current keyframe and its matching frame are then determined: when the keyframe and the matching frame both come from the first UAV or both from the second UAV, loop closure optimization is performed on that UAV's own poses; when they come from the first UAV and the second UAV respectively, loop closure optimization is performed on the poses between the UAVs; the map is updated to finally obtain the positioning map.
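A minimal sketch of the bookkeeping this step implies (the dataclass layout, the placeholder bag-of-words similarity, and the threshold are assumptions of this sketch; a production system would use a dictionary-tree vocabulary such as DBoW2):

```python
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    uav_id: int        # 1 or 2: which UAV produced this keyframe
    frame_id: int
    bow_vector: dict   # word id -> tf-idf weight from the dictionary tree

def bow_similarity(a: dict, b: dict) -> float:
    """Histogram-intersection similarity on sparse BoW vectors (placeholder scoring)."""
    return sum(min(a[w], b[w]) for w in set(a) & set(b))

@dataclass
class LoopDatabase:
    keyframes: list = field(default_factory=list)

    def query(self, kf: Keyframe, min_score: float = 0.3):
        """Return the best earlier keyframe whose BoW similarity exceeds min_score."""
        best, best_score = None, min_score
        for cand in self.keyframes:
            score = bow_similarity(kf.bow_vector, cand.bow_vector)
            if score > best_score:
                best, best_score = cand, score
        return best

    def handle(self, kf: Keyframe) -> str:
        """Classify a detected loop as intra-UAV or inter-UAV and store the keyframe."""
        match = self.query(kf)
        self.keyframes.append(kf)
        if match is None:
            return "no loop"
        # same source -> optimize that UAV's own poses; different -> inter-UAV poses
        return "intra-UAV loop" if match.uav_id == kf.uav_id else "inter-UAV loop"
```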
Examples
In this embodiment, the relative pose between the UAVs is solved on the basis of the crossed fields of view: visual associations are established and solved from the measurements generated by the crossed fields of view of the first and second UAVs on the same landmark points.
Assume the image containing the AprilTag captured by the left eye of the first UAV is $I_1^L$, the image containing the AprilTag captured by the right eye of the first UAV is $I_1^R$, the image containing the AprilTag captured by the left eye of the second UAV is $I_2^L$, and the image containing the AprilTag captured by the right eye of the second UAV is $I_2^R$.
Step 1: extract ORB feature points from the images containing the same AprilTag captured respectively by the left eye of the first UAV and the left eye of the second UAV, obtaining the first-UAV left-eye feature points and the second-UAV left-eye feature points.
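For illustration, the extraction might look as follows in OpenCV (a sketch; the file names and the feature budget are assumptions):

```python
import cv2

orb = cv2.ORB_create(nfeatures=1000)  # ORB detector/descriptor

img1 = cv2.imread("uav1_left.png", cv2.IMREAD_GRAYSCALE)  # first UAV, left eye
img2 = cv2.imread("uav2_left.png", cv2.IMREAD_GRAYSCALE)  # second UAV, left eye

kp1, des1 = orb.detectAndCompute(img1, None)  # first-UAV left-eye feature points
kp2, des2 = orb.detectAndCompute(img2, None)  # second-UAV left-eye feature points
```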
Step 2: match the first-UAV left-eye feature points against the second-UAV left-eye feature points by Hamming distance, obtaining the successfully matched first-UAV left-eye matching points $\{p_i^{1L}\}$ and second-UAV left-eye matching points $\{p_i^{2L}\}$, $i = 1, \dots, n$. During matching, a point pair is kept as a correct match only if its Hamming distance is smaller than twice the minimum distance among all matches; pairs exceeding this threshold are regarded as mismatches and removed.
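Continuing the sketch above (des1, des2 are the two descriptor sets), the Hamming-distance matching with the twice-the-minimum filter; the 30-unit floor is a common heuristic added here, not part of the patent:

```python
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)

min_dist = min(m.distance for m in matches)
# keep a pair only if its Hamming distance is below twice the minimum distance
# (a small floor guards against a near-zero minimum rejecting everything)
good = [m for m in matches if m.distance < max(2 * min_dist, 30.0)]
```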
Step 3: from the first-UAV left-eye matching points, obtain the first-UAV right-eye matching points matched with them, denoting the left-eye and right-eye matching points of the first UAV as $\{p_i^{1L}\}$ and $\{p_i^{1R}\}$ respectively.
Because the images $I_1^L$ and $I_1^R$ captured by the left and right eyes of the first UAV differ little and correct stereo matching is established, for each first-UAV left-eye matching point $p_i^{1L}$, the mutually matched right-eye point is found as the first-UAV right-eye matching point $p_i^{1R}$.
Step 4: compute the AprilTag corner points in the images captured respectively by the left eye and the right eye of the first UAV, denoted $\{c_j^{1L}\}$ and $\{c_j^{1R}\}$.
From the binocular camera model, matched point pairs between the left-eye and right-eye images determine the corresponding three-dimensional target points. That is, given the set of successfully matched left-eye points of the first UAV (left-eye matching points together with AprilTag corner points) $\{p_i^{1L}\} \cup \{c_j^{1L}\}$, the corresponding right-eye set $\{p_i^{1R}\} \cup \{c_j^{1R}\}$, the camera intrinsics $K_1$, and the right-eye extrinsics $T_{RL}$, the three-dimensional landmark points of the first UAV, $\{P_i\}$ (from the ORB matches) and $\{C_j\}$ (from the tag corners), are obtained by triangulation.
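A sketch of the binocular triangulation (assuming numpy/OpenCV; placing the left camera at the origin of the first UAV's frame is our convention for the sketch, not stated in the patent):

```python
import numpy as np
import cv2

def triangulate(K1, T_RL, pts_left, pts_right):
    """Triangulate matched left/right pixels into 3D points in the left-camera frame.

    K1:        3x3 intrinsic matrix of the first UAV's cameras
    T_RL:      4x4 left-to-right extrinsics (the stereo baseline)
    pts_left:  Nx2 pixels in the left-eye image (ORB matches and tag corners)
    pts_right: Nx2 corresponding pixels in the right-eye image
    """
    P_left = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])  # left camera at origin
    P_right = K1 @ T_RL[:3, :]                              # right-camera projection
    X_h = cv2.triangulatePoints(P_left, P_right,
                                pts_left.T.astype(np.float64),
                                pts_right.T.astype(np.float64))
    return (X_h[:3] / X_h[3]).T                             # Nx3 Euclidean landmarks
```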
Step 5: compute the AprilTag corner points $\{c_j^{2L}\}$ in the image captured by the left eye of the second UAV.
According to $\{p_i^{2L}\}$, $\{c_j^{2L}\}$, the landmark points $\{P_i\}$ and $\{C_j\}$, and the intrinsics $K_2$, the relative pose of the second UAV with respect to the first UAV is obtained:
$$\hat{T}_{21} = \arg\min_{T} \left( \sum_{i=1}^{n} \left\| p_i^{2L} - \pi\!\left(K_2\, T\, P_i\right) \right\|^2 + \sum_{j} \left\| c_j^{2L} - \pi\!\left(K_2\, T\, C_j\right) \right\|^2 \right)$$
Taking the initial camera frame of the first UAV as the world frame, the initial poses of the first and second UAVs, $T_1$ and $T_2$, are thus obtained as $T_1 = I$ and $T_2 = \hat{T}_{21}$, which completes the initialization.
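In practice the relative pose is often seeded with a RANSAC PnP solve over these 2D-3D correspondences before the nonlinear refinement above; a sketch (the solver choice is an assumption, not prescribed by the patent):

```python
import numpy as np
import cv2

def initial_relative_pose(landmarks3d, pixels2d, K2):
    """Seed T_21 by RANSAC PnP over first-UAV 3D landmarks and second-UAV left pixels."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        landmarks3d.astype(np.float64), pixels2d.astype(np.float64),
        K2.astype(np.float64), None)
    if not ok:
        raise RuntimeError("PnP failed: too few consistent 2D-3D correspondences")
    R, _ = cv2.Rodrigues(rvec)          # rotation vector -> rotation matrix
    T21 = np.eye(4)
    T21[:3, :3], T21[:3, 3] = R, tvec.ravel()
    return T21                          # first-UAV world frame -> second-UAV camera

# Initialization then follows the text above: T1 = identity (the first UAV's frame
# is taken as the world frame) and T2 = T21 from this solve, refined by optimization.
```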
Step 6: update the poses of the first and second UAVs as they move, and perform loop closure optimization and updating to finally obtain the positioning map.
The foregoing description of the preferred embodiments is not intended to limit the invention to the precise form disclosed; any modifications, equivalents, and alternatives falling within the spirit and scope of the invention are intended to be included within its scope.

Claims (3)

1. A dual unmanned aerial vehicle (UAV) co-location method based on binocular SLAM and inter-UAV loop closure optimization, characterized by comprising the following steps:
Step 1: extracting ORB feature points from the images containing the same AprilTag captured respectively by the left eye of a first UAV and the left eye of a second UAV, obtaining first-UAV left-eye feature points and second-UAV left-eye feature points;
Step 2: matching the first-UAV left-eye feature points against the second-UAV left-eye feature points by Hamming distance, obtaining successfully matched first-UAV left-eye matching points and second-UAV left-eye matching points;
Step 3: obtaining, from the first-UAV left-eye matching points, first-UAV right-eye matching points matched with them;
Step 4: computing AprilTag corner points in the images captured respectively by the left eye and the right eye of the first UAV, and combining them with the first-UAV left-eye matching points and first-UAV right-eye matching points to calculate three-dimensional landmark points of the first UAV;
Step 5: computing AprilTag corner points in the image captured by the left eye of the second UAV, and combining them with the three-dimensional landmark points of the first UAV and the second-UAV left-eye matching points to obtain the relative pose of the second UAV with respect to the first UAV;
Step 6: updating the poses of the first UAV and the second UAV as they move, and performing loop closure optimization and updating to finally obtain a positioning map.
2. The dual-UAV co-location method based on binocular SLAM and inter-UAV loop closure optimization according to claim 1, wherein the relative pose of the second UAV with respect to the first UAV in step 5 is calculated by the formula:
$$\hat{T}_{21} = \arg\min_{T} \left( \sum_{i=1}^{n} \left\| p_i - \pi\!\left(K_2\, T\, P_i\right) \right\|^2 + \sum_{j} \left\| c_j - \pi\!\left(K_2\, T\, C_j\right) \right\|^2 \right)$$
where $\hat{T}_{21}$ is the relative pose of the second UAV with respect to the first UAV; $n$ is the number of successfully matched first-UAV and second-UAV left-eye matching points; $p_i$ is the $i$-th second-UAV left-eye matching point; $K_2$ is the camera intrinsic matrix of the second UAV; $P_i$ is the three-dimensional landmark point of the first UAV corresponding to the $i$-th ORB feature; $c_j$ is the $j$-th AprilTag corner point in the image captured by the left eye of the second UAV; $C_j$ is the corresponding three-dimensional AprilTag corner point of the first UAV; and $\pi(\cdot)$ denotes perspective projection onto the image plane.
3. The dual-UAV co-location method based on binocular SLAM and inter-UAV loop closure optimization according to claim 1, wherein step 6 specifically comprises:
extracting ORB feature points from the keyframes of the first and second UAVs, building a dictionary-tree index and a bag-of-words model, and constructing a loop closure database; determining the sources of the current keyframe and its matching frame: when the keyframe and the matching frame both come from the first UAV or both from the second UAV, performing loop closure optimization on that UAV's own poses; when they come from the first UAV and the second UAV respectively, performing loop closure optimization on the poses between the UAVs; and updating to finally obtain the positioning map.
CN202410414180.1A 2024-04-08 2024-04-08 Binocular SLAM and inter-machine loop optimization-based double unmanned aerial vehicle co-location method Active CN118010008B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410414180.1A CN118010008B (en) 2024-04-08 2024-04-08 Binocular SLAM and inter-machine loop optimization-based double unmanned aerial vehicle co-location method

Publications (2)

Publication Number Publication Date
CN118010008A (en) 2024-05-10
CN118010008B CN118010008B (en) 2024-06-07

Family

ID=90956665

Country Status (1)

Country Link
CN (1) CN118010008B (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
CN111242996A (en) * 2020-01-08 2020-06-05 郭轩 SLAM method based on Apriltag and factor graph
CN114332360A (en) * 2021-12-10 2022-04-12 深圳先进技术研究院 Collaborative three-dimensional mapping method and system
WO2023104207A1 (en) * 2021-12-10 2023-06-15 深圳先进技术研究院 Collaborative three-dimensional mapping method and system
CN115265534A (en) * 2022-08-02 2022-11-01 江苏省农业科学院 Multi-sensor fusion positioning navigation method, device and system based on AprilTag code
CN115790571A (en) * 2022-11-25 2023-03-14 中国科学院深圳先进技术研究院 Simultaneous positioning and map construction method based on mutual observation of heterogeneous unmanned system
CN116907477A (en) * 2023-07-19 2023-10-20 同济大学 High-precision laser SLAM method and device based on visual road sign assistance

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Rublee, E.; Rabaud, V.; Konolige, K.; et al.: "ORB: An efficient alternative to SIFT or SURF", International Conference on Computer Vision (ICCV), 2011 *
Zhang Zhen; Zheng Hong; Zhou Xuan; Zhang Shengqun: "An RGB-D SLAM algorithm combining ORB features and a visual dictionary", Computer Engineering and Applications, no. 12, June 2018 *
Zhao Tong; Liu Jieyu; Li Zhuo: "Three-stage local binocular bundle-adjustment visual odometry", Opto-Electronic Engineering, no. 11, November 2018 *

Also Published As

Publication number Publication date
CN118010008B (en) 2024-06-07

Similar Documents

Publication Publication Date Title
CN110068335B (en) Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
CN109579843B (en) Multi-robot cooperative positioning and fusion image building method under air-ground multi-view angles
CN107990899B (en) Positioning method and system based on SLAM
CN112419494B (en) Obstacle detection and marking method and device for automatic driving and storage medium
Nistér et al. Visual odometry
US9031809B1 (en) Method and apparatus for generating three-dimensional pose using multi-modal sensor fusion
CN110125928A (en) A kind of binocular inertial navigation SLAM system carrying out characteristic matching based on before and after frames
CN112304307A (en) Positioning method and device based on multi-sensor fusion and storage medium
CN109579825B (en) Robot positioning system and method based on binocular vision and convolutional neural network
CN111127524A (en) Method, system and device for tracking trajectory and reconstructing three-dimensional image
CN108534782A (en) A kind of instant localization method of terrestrial reference map vehicle based on binocular vision system
CN104933718A (en) Physical coordinate positioning method based on binocular vision
CN112219087A (en) Pose prediction method, map construction method, movable platform and storage medium
CN112634451A (en) Outdoor large-scene three-dimensional mapping method integrating multiple sensors
CN104281148A (en) Mobile robot autonomous navigation method based on binocular stereoscopic vision
US20210183100A1 (en) Data processing method and apparatus
KR101544021B1 (en) Apparatus and method for generating 3d map
CN112258409A (en) Monocular camera absolute scale recovery method and device for unmanned driving
CN111307146B (en) Virtual reality wears display device positioning system based on binocular camera and IMU
CN111998862B (en) BNN-based dense binocular SLAM method
CN116255992A (en) Method and device for simultaneously positioning and mapping
CN110889873A (en) Target positioning method and device, electronic equipment and storage medium
CN112802096A (en) Device and method for realizing real-time positioning and mapping
CN116222543B (en) Multi-sensor fusion map construction method and system for robot environment perception
CN116468786B (en) Semantic SLAM method based on point-line combination and oriented to dynamic environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant