CN109579840A - Tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion - Google Patents
Tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion
- Publication number
- CN109579840A (application CN201811250049.7A)
- Authority
- CN
- China
- Prior art keywords
- point-line
- imu
- key frame
- line feature
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to a tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion, comprising the following steps: determining the transformation between the camera coordinate system and the inertial sensor coordinate system; establishing a point-line-feature-plus-IMU tracking thread, which solves the initial 3D coordinates of the point and line features, predicts the feature positions with the IMU after IMU initialization, establishes correct feature data associations, and solves the pose transformation between successive frames by combining the IMU error term with the point-line reprojection error terms; establishing a point-line-feature-plus-IMU local bundle adjustment thread, which optimizes the point-line 3D coordinates, the keyframe poses, and the IMU states within a local keyframe window; and establishing a point-line feature loop-closure detection thread, which detects loops by computing a bag-of-words score weighted over the point and line features and then performs global state optimization. The present invention guarantees stable, high-precision operation when few feature points are available and when the camera moves quickly.
Description
Technical field
The present invention relates to the field of visual-inertial simultaneous localization and mapping (SLAM), and more particularly to a tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion.
Background art
Simultaneous localization and mapping (Simultaneous Localization And Mapping, SLAM) is the principal means by which a robot determines its own position and perceives its surroundings; it is a fundamental problem and a research hotspot in the field of autonomous navigation. It addresses how a robot that has entered an unfamiliar environment can use the data obtained from sensors (e.g., a stereo camera or an inertial measurement unit) to compute a map representation of the surroundings while simultaneously determining its own position within that map.
Stereo vision sensors offer advantages such as light weight, easy mounting, low cost, and determinate metric scale, which has made stereo visual SLAM a research hotspot. An inertial measurement unit (Inertial Measurement Unit, IMU) provides the platform's own motion information (e.g., acceleration and angular velocity), from which position and attitude are computed by integration. The IMU outputs data at a high rate and responds immediately to changes in position and attitude, but its integration accumulates severe error; visual information, although lower in rate, is rich, and its accumulated error can be eliminated by loop-closure detection. Visual-inertial SLAM methods improve the robustness and accuracy of the whole system by fusing the two kinds of sensor data.
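The IMU dead reckoning described above can be illustrated with a minimal planar integration; this is a sketch with assumed sample values and a constant sample rate, not part of the patent, and it shows why IMU-only estimates drift (each sample's noise is integrated twice, acceleration to velocity to position).

```python
import math

def dead_reckon(samples, dt, x=0.0, y=0.0, yaw=0.0, vx=0.0, vy=0.0):
    """Integrate planar IMU samples (ax, ay in body frame, wz yaw rate).

    Every sample's error is accumulated twice
    (acceleration -> velocity -> position), hence unbounded drift.
    """
    for ax_b, ay_b, wz in samples:
        yaw += wz * dt                      # integrate angular rate -> heading
        c, s = math.cos(yaw), math.sin(yaw)
        ax = c * ax_b - s * ay_b            # rotate body accel into world frame
        ay = s * ax_b + c * ay_b
        vx += ax * dt                       # integrate accel -> velocity
        vy += ay * dt
        x += vx * dt                        # integrate velocity -> position
        y += vy * dt
    return x, y, yaw

# Constant forward acceleration of 1 m/s^2 for 1 s, no rotation:
# the discrete sum gives x close to 0.5*a*t^2.
pose = dead_reckon([(1.0, 0.0, 0.0)] * 100, dt=0.01)
```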
Conventional visual SLAM methods rely mainly on feature points in the environment, tracking them to perform localization and mapping. When image feature points are sparse, illumination varies strongly, or motion is violent, the accuracy of the algorithm is easily degraded and the algorithm may even fail. Visual-inertial SLAM methods can use the inertial sensor to assist image feature association while also optimizing the inertial sensor data; however, when the image feature association is wrong or the inertial sensor initialization estimate is erroneous, the measurement error propagates into the solution of the entire algorithm, and the accuracy of the whole visual-inertial fusion algorithm degrades severely.
Summary of the invention
The technical problem to be solved by the invention is to provide a tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion that remains stable and highly accurate when few feature points are available and when the camera moves quickly.
The technical solution adopted by the present invention to solve this technical problem is to provide a tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion, comprising the following steps:
determining the transformation between the stereo camera coordinate system and the inertial sensor coordinate system, initializing the SLAM system, and establishing a point-line-feature-plus-IMU tracking thread, a point-line-feature-plus-IMU local bundle adjustment thread, and a point-line feature loop-closure detection thread;
the point-line-feature-plus-IMU tracking thread computes the 3D coordinates of the initial point and line features and computes the IMU preintegration between consecutive frames; after IMU initialization it predicts the point-line features, matches features between consecutive frames and against local map frames, and determines correct feature associations; it jointly optimizes the point-line reprojection errors and the IMU increment error to solve the pose transformation between consecutive frames; and it decides from the point-line feature distribution and the IMU measurements whether to insert a new keyframe, and stores it;
the point-line-feature-plus-IMU local bundle adjustment thread waits until the number of keyframes inserted by the tracking thread reaches a set threshold before it starts operating; when the keyframe count reaches the set value, it starts the IMU initialization procedure and estimates the accelerometer and gyroscope biases of the IMU data; at the same time it maintains the point-line 3D map, deleting low-quality point-line 3D map points, and builds a local keyframe window and a covisible keyframe window; it performs local bundle adjustment on the keyframes in the local keyframe window together with the corresponding local point-line 3D map and IMU states, and culls redundant keyframes; after IMU initialization, the IMU error term is added to the optimization, which simultaneously solves the keyframe poses, the point-line 3D coordinates, and the IMU states;
the point-line feature loop-closure detection thread compares the current frame against the keyframes in the loop-detection keyframe sequence through the point-line bag-of-words model and computes the bag-of-words scores of the current frame and the keyframes; after a loop keyframe is detected, it performs a global point-line bundle adjustment with the IMU error term, simultaneously optimizing the IMU states, the keyframe poses, and the point-line 3D coordinates.
The point-line-feature-plus-IMU tracking thread receives the rectified left and right stereo images of the input, extracts point features and line features from each, stereo-matches the left and right point-line features, computes the 3D coordinates of the point-line features, and creates the initial feature map; meanwhile, it computes the IMU preintegration up to the arrival of the current image. When the next image frame arrives, it predicts the point-line feature positions and matches features near the predicted positions; with the established frame-to-frame feature position associations it optimizes the point-line reprojection error, and after IMU initialization the IMU error term is optimized together with the reprojection error to solve the rotation and translation between consecutive frames. By building a local feature map it obtains more feature associations, and it then decides from the point-line feature distribution and the IMU measurements whether to add a keyframe, and stores it.
When the point-line-feature-plus-IMU local bundle adjustment thread builds the local keyframe window and the covisible keyframe window, the most recent N keyframes form the local keyframe window, and the keyframes that share covisibility with these N keyframes but are not among them, together with the (N+1)-th keyframe, form the covisible keyframe window. In local BA, the point-line 3D coordinates, the IMU states, and the keyframe poses inside the local keyframe window are all optimized; the keyframe poses in the covisible keyframe window are added to the objective function as fixed, and their point-line 3D coordinates and IMU states are not added. Keyframe selection is decided jointly by the point and line features.
The bag-of-words scores of the point features and the line features in the point-line feature loop-closure detection thread are s_p and s_l respectively, and their weighted sum serves as the final bag-of-words score.
Beneficial effects
Owing to the adoption of the above technical solution, the present invention has the following advantages and positive effects over the prior art: it maintains continuous, stable, high-precision real-time pose output even where image feature points are scarce, particularly in man-made structured environments, so the range of working environments it can adapt to is substantially expanded and the robustness of the method is significantly enhanced. The present invention further provides a loop-closure detection method combining the IMU with point and line features, which fully exploits point-line feature fusion to detect loops more accurately and robustly.
Detailed description of the invention
Fig. 1 is a schematic diagram of the tightly-coupled point-line feature and IMU SLAM of the invention;
Fig. 2 is a flow diagram of the point-line-feature-plus-IMU tracking thread of the invention;
Fig. 3 is a flow diagram of the point-line-feature-plus-IMU local bundle adjustment thread of the invention;
Fig. 4 is a flow diagram of the point-line feature loop-closure detection of the invention.
Specific embodiments
The present invention is further described below with reference to specific embodiments. It should be understood that these embodiments serve only to illustrate the invention and not to limit its scope. Furthermore, after reading the teachings of the present invention, those skilled in the art may make various changes or modifications to it, and such equivalent forms likewise fall within the scope defined by the claims appended to this application.
Embodiments of the present invention relate to a tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion. The method is an extension and improvement of traditional SLAM methods: as shown in Fig. 1, it extracts image point features and line features, matches them to determine feature associations, and simultaneously optimizes the IMU and visual error terms to solve the poses and build the point-line 3D map.
The main flow of the method is implemented in C++, and the optimization of the error functions is carried out with the g2o library. The method comprises the following steps:
1) Calibrate the coordinate transformation between the camera coordinate system and the inertial sensor coordinate system, determining the rotation R and the translation t between the two coordinate systems; calibrate the intrinsic and extrinsic parameters of the stereo camera, determining the intrinsic matrix K and the baseline length b of the stereo camera.
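For illustration only (not part of the patent; the pinhole model, rectified images, and all numeric values are assumptions), the 3D coordinates of a stereo-matched point feature follow directly from the intrinsics K and the baseline b calibrated in step 1):

```python
def triangulate_stereo(u_l, v_l, u_r, fx, fy, cx, cy, b):
    """Back-project a rectified stereo match into left-camera coordinates.

    (u_l, v_l) and (u_r, v_l) are pixel coordinates of the same point in
    the rectified left/right images; b is the baseline in metres.
    """
    disparity = u_l - u_r
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    z = fx * b / disparity          # depth from disparity
    x = (u_l - cx) * z / fx         # back-project through the left pinhole
    y = (v_l - cy) * z / fy
    return x, y, z

# A point 10 px left of the principal point with 20 px disparity,
# fx = fy = 500 px, baseline 0.12 m:
p = triangulate_stereo(310.0, 240.0, 290.0, 500.0, 500.0, 320.0, 240.0, 0.12)
```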
2) Initialize the SLAM system and establish three threads: the point-line-feature-plus-IMU tracking thread, the point-line-feature-plus-IMU local bundle adjustment thread, and the point-line feature loop-closure detection thread.
3) As shown in Fig. 2, the point-line-feature-plus-IMU tracking thread receives the hardware-synchronized IMU data and the rectified stereo image pair, stereo-matches the point and line features of the left and right images in parallel, and obtains the initial point-line 3D map. Each time a new image frame arrives, it computes the current IMU preintegration; if the IMU has completed initialization, the pose is predicted using the IMU, and the predicted pose information determines the approximate position of each current feature in the next frame. The point-line features of consecutive frames are then matched near these positions. At the same time, local map information is built: the 3D point-line features of the local map are back-projected into the current frame to find more point-line feature matches and establish more data associations. A BA that solves only the pose information is then performed; its objective comprises the IMU error term and the point-line reprojection error terms, and the pose information of the current frame is solved by nonlinear optimization. Whether the current frame is accepted as a keyframe is determined according to the fused distribution of the point and line features.
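The predict-then-match step above can be sketched as follows; this is a deliberately simplified illustration (planar yaw-plus-translation motion instead of full SE(3), and an assumed search radius), not the patent's implementation:

```python
import math

def predict_feature_position(point_cam_prev, d_yaw, d_t, fx, fy, cx, cy):
    """Predict where a tracked 3D point (previous-frame camera coords)
    will appear in the next frame, given the IMU-predicted inter-frame
    motion, simplified here to a yaw rotation d_yaw and translation d_t.
    """
    x, y, z = point_cam_prev
    c, s = math.cos(d_yaw), math.sin(d_yaw)
    # Rotate about the camera's vertical axis, then subtract the translation.
    xn = c * x + s * z - d_t[0]
    yn = y - d_t[1]
    zn = -s * x + c * z - d_t[2]
    return (fx * xn / zn + cx, fy * yn / zn + cy)   # pinhole projection

def in_search_window(pred_uv, cand_uv, radius=15.0):
    """Gate candidate matches to a window around the predicted pixel."""
    return math.hypot(pred_uv[0] - cand_uv[0], pred_uv[1] - cand_uv[1]) <= radius

# Point 4 m ahead on the optical axis; camera moves 1 m straight forward.
uv = predict_feature_position((0.0, 0.0, 4.0), 0.0, (0.0, 0.0, 1.0),
                              500.0, 500.0, 320.0, 240.0)
```

Only detections inside the gate around `uv` are considered as matches, which is what keeps frame-to-frame data association cheap and robust.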
(4) As shown in Fig. 3, the point-line-feature-plus-IMU local bundle adjustment thread starts once the number of keyframes selected by the tracking thread reaches the set threshold. It first performs the IMU initialization operation, estimating the accelerometer and gyroscope biases of the IMU and further estimating the gravity direction and velocity information, while at the same time it maintains the point-line 3D map, deleting low-quality point-line 3D map points. It builds two optimization windows, the local keyframe window and the covisible keyframe window: the local keyframe window consists of the most recent N keyframes, and the covisible keyframe window consists of the keyframes that share covisibility with these N keyframes but are not among them, together with the (N+1)-th keyframe. Local bundle adjustment is performed on the keyframes in the local keyframe window together with the corresponding local point-line 3D map and IMU states, and the keyframe poses in the covisible window are added to the objective function of formula (1). The function to be optimized comprises three terms: the point reprojection error, the line reprojection error, and the IMU error term. The three error terms are given in formula (2): the point reprojection error is the coordinate difference between the feature point detected in the image plane and the back-projection of the 3D point; the line reprojection error is the distance from the two projected endpoints of the 3D line segment to the detected line; and the IMU error term comprises the rotation increment, velocity increment, displacement increment, and bias error. The vertices and edges are defined through g2o, the Jacobians of the error terms with respect to the variables to be optimized are computed, and the update is determined and solved by the optimizer. The keyframe poses, 3D point coordinates, 3D line coordinates, and IMU velocities and biases in the local window are thereby estimated.
Wherein formula (1) is the objective function

C = Σ_{i,k} ρ(‖e_point^{i,k}‖²) + Σ_{i,k} ρ(‖e_line^{i,k}‖²) + Σ_i ‖e_imu^i‖²,   (1)

and formula (2) gives the three error terms

e_point^{i,k} = x_{i,k} − π(P_k),
e_line^{i,k} = ( l_{i,k}ᵀ π(S_{i,k}), l_{i,k}ᵀ π(E_{i,k}) )ᵀ,
e_imu = ( e_R, e_v, e_p, e_b ),   (2)

where ρ is the robust kernel function; e_point^{i,k} is the point reprojection error, x_{i,k} is the coordinate of the feature point detected in the image plane, and π(P_k) is the coordinate of the back-projected 3D point; e_line^{i,k} is the line reprojection error, l_{i,k} is the equation of the line segment detected in the image, π(S_{i,k}) is the reprojected position of the segment's start point, and π(E_{i,k}) is the reprojected position of the segment's end point; e_imu is the IMU error term, with e_R the rotation increment, e_v the velocity increment, e_p the displacement increment, and e_b the bias error.
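A numeric sketch of the point and line error terms described above; the Huber kernel, the line parameterization a·u + b·v + c = 0, and all values are illustrative assumptions rather than the patent's implementation:

```python
import math

def huber(sq_err, delta=1.0):
    """Robust kernel rho(.) applied to a squared error: quadratic near
    zero, linear beyond delta, so outlier matches are down-weighted."""
    e = math.sqrt(sq_err)
    return sq_err if e <= delta else 2.0 * delta * e - delta * delta

def point_reproj_error(detected_uv, reprojected_uv):
    """Coordinate difference between the detected feature point and the
    back-projected 3D point: e_point = x_ik - pi(P_k)."""
    return (detected_uv[0] - reprojected_uv[0],
            detected_uv[1] - reprojected_uv[1])

def line_reproj_error(line_abc, proj_start_uv, proj_end_uv):
    """Signed distances of the projected 3D segment endpoints to the
    detected image line a*u + b*v + c = 0 (normal normalized)."""
    a, b, c = line_abc
    n = math.hypot(a, b)
    dist = lambda uv: (a * uv[0] + b * uv[1] + c) / n
    return (dist(proj_start_uv), dist(proj_end_uv))

# Detected point at (101, 99), reprojection at (100, 100):
ep = point_reproj_error((101.0, 99.0), (100.0, 100.0))
cost_p = huber(ep[0] ** 2 + ep[1] ** 2)
# Horizontal detected line v = 50; projected endpoints 2 px and 3 px above it:
el = line_reproj_error((0.0, 1.0, -50.0), (10.0, 52.0), (90.0, 53.0))
```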
(5) As shown in Fig. 4, the point-line feature loop-closure detection thread compares the current frame against the keyframes of the loop-detection keyframe sequence through the point-line bag-of-words model and computes the bag-of-words score between the current frame and each keyframe. The bag-of-words scores for the point features and the line features, s_p and s_l, are combined into the final weighted score s_k = ω_p·s_p + ω_l·s_l, where ω_p is the weight of the point features and ω_l is the weight of the line features. The ratio of the two weights is formed from the ratio of the feature counts and the standard deviations of the feature coordinates, where n_l and n_p are the numbers of line features and point features and d_l and d_p are the standard deviations of the x and y coordinates of the line features and point features respectively; the final computed score determines whether the candidate is a loop closure. After a loop keyframe is detected, a global point-line bundle adjustment with the IMU error term is performed, simultaneously optimizing the IMU states, the keyframe poses, and the point-line 3D coordinates.
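A small sketch of the weighted score; the normalization chosen for ω_p and ω_l below (weights proportional to feature count times coordinate spread, summing to 1) is one plausible reading assumed for illustration, since the patent's exact weight formula is not reproduced in the text:

```python
import statistics

def loop_score(sp, sl, pts_xy, lines_xy):
    """Weighted point/line bag-of-words score s_k = wp*sp + wl*sl.

    Weights are built from feature counts and coordinate spread and
    normalized to sum to 1 -- an assumption for this sketch.
    """
    def spread(xy):
        xs, ys = zip(*xy)
        return statistics.pstdev(xs) + statistics.pstdev(ys)

    wp_raw = len(pts_xy) * spread(pts_xy)
    wl_raw = len(lines_xy) * spread(lines_xy)
    wp = wp_raw / (wp_raw + wl_raw)
    return wp * sp + (1.0 - wp) * sl

s = loop_score(
    sp=0.8, sl=0.4,
    pts_xy=[(0, 0), (10, 0), (0, 10), (10, 10)],   # many well-spread points
    lines_xy=[(5, 5), (6, 5)],                     # few clustered line midpoints
)
```

With well-spread, numerous points, the point score dominates the final score, so a weak line score does not suppress a genuine loop candidate.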
It is thus apparent that the present invention maintains continuous, stable, high-precision real-time pose output even where image feature points are scarce, particularly in man-made structured environments, so the range of working environments it can adapt to is substantially expanded and the robustness of the method is significantly enhanced. The present invention provides a loop-closure detection method combining the IMU with point and line features, which fully exploits point-line feature fusion to detect loops more accurately and robustly.
Claims (4)
1. A tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion, characterized by comprising the following steps: determining the transformation between the stereo camera coordinate system and the inertial sensor coordinate system, initializing the SLAM system, and establishing a point-line-feature-plus-IMU tracking thread, a point-line-feature-plus-IMU local bundle adjustment thread, and a point-line feature loop-closure detection thread;
the point-line-feature-plus-IMU tracking thread computing the 3D coordinates of the initial point and line features and the IMU preintegration between consecutive frames, predicting the point-line features after IMU initialization, matching features between consecutive frames and against local map frames, and determining correct feature associations; jointly optimizing the point-line reprojection errors and the IMU increment error to solve the pose transformation between consecutive frames; and deciding from the point-line feature distribution and the IMU measurements whether to insert a new keyframe, and storing it;
the point-line-feature-plus-IMU local bundle adjustment thread starting only after the number of keyframes inserted by the tracking thread reaches a set threshold, at which point it starts the IMU initialization procedure and estimates the accelerometer and gyroscope biases of the IMU data; simultaneously maintaining the point-line 3D map, deleting low-quality point-line 3D map points, building a local keyframe window and a covisible keyframe window, performing local bundle adjustment on the keyframes in the local keyframe window together with the corresponding local point-line 3D map and IMU states, and culling redundant keyframes; and, after IMU initialization, adding the IMU error term to the optimization and simultaneously solving the keyframe poses, the point-line 3D coordinates, and the IMU states;
the point-line feature loop-closure detection thread comparing the current frame against the keyframes in the loop-detection keyframe sequence through the point-line bag-of-words model, computing the bag-of-words scores of the current frame and the keyframes, and, after a loop keyframe is detected, performing a global point-line bundle adjustment with the IMU error term while simultaneously optimizing the IMU states, the keyframe poses, and the point-line 3D coordinates.
2. The tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion according to claim 1, characterized in that the point-line-feature-plus-IMU tracking thread receives the rectified left and right stereo images of the input, extracts point features and line features from each, stereo-matches the left and right point-line features, computes the 3D coordinates of the point-line features, and creates the initial feature map, while computing the IMU preintegration up to the arrival of the current image; when the next image frame arrives, it predicts the point-line feature positions, matches features near the predicted positions, optimizes the point-line reprojection error using the established frame-to-frame feature position associations, and, after IMU initialization, optimizes the IMU error term together with the reprojection error to solve the rotation and translation between consecutive frames; it obtains more feature associations by building a local feature map, and then decides from the point-line feature distribution and the IMU measurements whether to add a keyframe, and stores it.
3. The tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion according to claim 1, characterized in that, when building the local keyframe window and the covisible keyframe window, the point-line-feature-plus-IMU local bundle adjustment thread forms the local keyframe window from the most recent N keyframes, and forms the covisible keyframe window from the keyframes that share covisibility with these N keyframes but are not among them, together with the (N+1)-th keyframe; in local BA, the point-line 3D coordinates, the IMU states, and the keyframe poses in the local keyframe window are all optimized, the keyframe poses in the covisible keyframe window are added to the objective function as fixed, and their point-line 3D coordinates and IMU states are not added; keyframe selection is decided jointly by the point and line features.
4. The tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion according to claim 1, characterized in that the bag-of-words scores of the point features and the line features in the point-line feature loop-closure detection thread are s_p and s_l respectively, and their weighted sum serves as the final bag-of-words score.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811250049.7A CN109579840A (en) | 2018-10-25 | 2018-10-25 | Tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811250049.7A CN109579840A (en) | 2018-10-25 | 2018-10-25 | Tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion
Publications (1)
Publication Number | Publication Date |
---|---|
CN109579840A (en) | 2019-04-05 |
Family
ID=65920788
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811250049.7A Pending CN109579840A (en) | 2018-10-25 | 2018-10-25 | Tightly-coupled stereo visual-inertial SLAM method with point-line feature fusion
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109579840A (en) |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109465832A (en) * | 2018-12-18 | 2019-03-15 | 哈尔滨工业大学(深圳) | High-precision vision and the tight fusion and positioning method of IMU and system |
CN110095111A (en) * | 2019-05-10 | 2019-08-06 | 广东工业大学 | A kind of construction method of map scene, building system and relevant apparatus |
CN110132306A (en) * | 2019-05-20 | 2019-08-16 | 广州小鹏汽车科技有限公司 | The correcting method and system of vehicle location error |
CN110189390A (en) * | 2019-04-09 | 2019-08-30 | 南京航空航天大学 | A kind of monocular vision SLAM method and system |
CN110296702A (en) * | 2019-07-30 | 2019-10-01 | 清华大学 | Visual sensor and the tightly coupled position and orientation estimation method of inertial navigation and device |
CN110445982A (en) * | 2019-08-16 | 2019-11-12 | 深圳特蓝图科技有限公司 | A kind of tracking image pickup method based on six degree of freedom equipment |
CN110490085A (en) * | 2019-07-24 | 2019-11-22 | 西北工业大学 | The quick pose algorithm for estimating of dotted line characteristic visual SLAM system |
CN110726413A (en) * | 2019-10-25 | 2020-01-24 | 中国人民解放军国防科技大学 | Multi-sensor fusion and data management mechanism facing large-scale SLAM |
CN110763251A (en) * | 2019-10-18 | 2020-02-07 | 华东交通大学 | Method and system for optimizing visual inertial odometer |
CN110782494A (en) * | 2019-10-16 | 2020-02-11 | 北京工业大学 | Visual SLAM method based on point-line fusion |
CN110838145A (en) * | 2019-10-09 | 2020-02-25 | 西安理工大学 | Visual positioning and mapping method for indoor dynamic scene |
CN110986968A (en) * | 2019-10-12 | 2020-04-10 | 清华大学 | Method and device for real-time global optimization and error loop judgment in three-dimensional reconstruction |
CN111156984A (en) * | 2019-12-18 | 2020-05-15 | 东南大学 | Monocular vision inertia SLAM method oriented to dynamic scene |
CN111207774A (en) * | 2020-01-17 | 2020-05-29 | 山东大学 | Method and system for laser-IMU external reference calibration |
CN111260661A (en) * | 2020-01-15 | 2020-06-09 | 江苏大学 | Visual semantic SLAM system and method based on neural network technology |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018048353A1 (en) * | 2016-09-09 | 2018-03-15 | Nanyang Technological University | Simultaneous localization and mapping methods and apparatus |
CN108665540A (en) * | 2018-03-16 | 2018-10-16 | 浙江工业大学 | Robot localization based on binocular vision feature and IMU information and map structuring system |
CN108682027A (en) * | 2018-05-11 | 2018-10-19 | 北京华捷艾米科技有限公司 | VSLAM implementation method and system based on point and line feature fusion |
2018-10-25: Application filed — CN201811250049.7A (China), published as CN109579840A; status: Pending
Non-Patent Citations (4)
Title |
---|
PEILIANG LI: "Monocular Visual-Inertial State Estimation for Mobile Augmented Reality", 2017 IEEE International Symposium on Mixed and Augmented Reality * |
RAÚL MUR-ARTAL: "Visual-Inertial Monocular SLAM with Map Reuse", IEEE Robotics and Automation Letters * |
YIJIA HE: "PL-VIO: Tightly-Coupled Monocular Visual–Inertial Odometry Using Point and Line Features", Sensors * |
CHEN Huiyan: "Theory and Application of Intelligent Vehicles", 31 July 2018, Beijing Institute of Technology Press * |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109465832A (en) * | 2018-12-18 | 2019-03-15 | 哈尔滨工业大学(深圳) | High-precision vision-IMU tight fusion positioning method and system |
CN110189390A (en) * | 2019-04-09 | 2019-08-30 | 南京航空航天大学 | A kind of monocular vision SLAM method and system |
CN110095111A (en) * | 2019-05-10 | 2019-08-06 | 广东工业大学 | A kind of construction method of map scene, building system and relevant apparatus |
CN110132306B (en) * | 2019-05-20 | 2021-02-19 | 广州小鹏汽车科技有限公司 | Method and system for correcting vehicle positioning error |
CN110132306A (en) * | 2019-05-20 | 2019-08-16 | 广州小鹏汽车科技有限公司 | The correcting method and system of vehicle location error |
CN110490085A (en) * | 2019-07-24 | 2019-11-22 | 西北工业大学 | Fast pose estimation algorithm for a point-line feature visual SLAM system |
CN110490085B (en) * | 2019-07-24 | 2022-03-11 | 西北工业大学 | Fast pose estimation algorithm for a point-line feature visual SLAM system |
CN110296702A (en) * | 2019-07-30 | 2019-10-01 | 清华大学 | Tightly-coupled pose estimation method and device for a visual sensor and inertial navigation |
CN110445982B (en) * | 2019-08-16 | 2021-01-12 | 深圳特蓝图科技有限公司 | Tracking shooting method based on six-degree-of-freedom equipment |
CN110445982A (en) * | 2019-08-16 | 2019-11-12 | 深圳特蓝图科技有限公司 | A kind of tracking shooting method based on six-degree-of-freedom equipment |
CN110838145A (en) * | 2019-10-09 | 2020-02-25 | 西安理工大学 | Visual positioning and mapping method for indoor dynamic scene |
CN110838145B (en) * | 2019-10-09 | 2020-08-18 | 西安理工大学 | Visual positioning and mapping method for indoor dynamic scene |
CN110986968A (en) * | 2019-10-12 | 2020-04-10 | 清华大学 | Method and device for real-time global optimization and erroneous loop-closure detection in three-dimensional reconstruction |
CN110986968B (en) * | 2019-10-12 | 2022-05-24 | 清华大学 | Method and device for real-time global optimization and erroneous loop-closure detection in three-dimensional reconstruction |
CN110782494A (en) * | 2019-10-16 | 2020-02-11 | 北京工业大学 | Visual SLAM method based on point-line fusion |
CN110763251B (en) * | 2019-10-18 | 2021-07-13 | 华东交通大学 | Method and system for optimizing visual inertial odometer |
CN110763251A (en) * | 2019-10-18 | 2020-02-07 | 华东交通大学 | Method and system for optimizing visual inertial odometer |
CN110726413A (en) * | 2019-10-25 | 2020-01-24 | 中国人民解放军国防科技大学 | Multi-sensor fusion and data management mechanism for large-scale SLAM |
CN111156984A (en) * | 2019-12-18 | 2020-05-15 | 东南大学 | Monocular vision inertia SLAM method oriented to dynamic scene |
CN111260661A (en) * | 2020-01-15 | 2020-06-09 | 江苏大学 | Visual semantic SLAM system and method based on neural network technology |
CN111207774A (en) * | 2020-01-17 | 2020-05-29 | 山东大学 | Method and system for laser-IMU extrinsic parameter calibration |
CN113701766A (en) * | 2020-05-20 | 2021-11-26 | 浙江欣奕华智能科技有限公司 | Robot map construction method, robot positioning method and device |
CN112115980A (en) * | 2020-08-25 | 2020-12-22 | 西北工业大学 | Binocular vision odometer design method based on optical flow tracking and point line feature matching |
CN112577494A (en) * | 2020-11-13 | 2021-03-30 | 上海航天控制技术研究所 | SLAM method, electronic device and storage medium suitable for lunar vehicle |
CN112432653B (en) * | 2020-11-27 | 2024-02-23 | 北京工业大学 | Monocular visual-inertial odometry method based on point-line features |
CN112432653A (en) * | 2020-11-27 | 2021-03-02 | 北京工业大学 | Monocular visual-inertial odometry method based on point-line features |
CN112509044A (en) * | 2020-12-02 | 2021-03-16 | 重庆邮电大学 | Binocular vision SLAM method based on point-line feature fusion |
WO2022120698A1 (en) * | 2020-12-10 | 2022-06-16 | Intel Corporation | Method and system of image processing with increased subjective quality 3d reconstruction |
CN112734839A (en) * | 2020-12-31 | 2021-04-30 | 浙江大学 | Monocular vision SLAM initialization method for improving robustness |
CN112945233B (en) * | 2021-01-15 | 2024-02-20 | 北京理工大学 | Global drift-free autonomous robot simultaneous positioning and map construction method |
CN112945233A (en) * | 2021-01-15 | 2021-06-11 | 北京理工大学 | Global drift-free autonomous robot simultaneous positioning and map building method |
CN112950709B (en) * | 2021-02-21 | 2023-10-24 | 深圳市优必选科技股份有限公司 | Pose prediction method, pose prediction device and robot |
CN112950709A (en) * | 2021-02-21 | 2021-06-11 | 深圳市优必选科技股份有限公司 | Pose prediction method, pose prediction device and robot |
CN113012196B (en) * | 2021-03-05 | 2023-03-24 | 华南理工大学 | Positioning method based on information fusion of binocular camera and inertial navigation sensor |
CN113012196A (en) * | 2021-03-05 | 2021-06-22 | 华南理工大学 | Positioning method based on information fusion of binocular camera and inertial navigation sensor |
CN113514058A (en) * | 2021-04-23 | 2021-10-19 | 北京华捷艾米科技有限公司 | Visual SLAM positioning method and device integrating MSCKF and graph optimization |
CN113192140A (en) * | 2021-05-25 | 2021-07-30 | 华中科技大学 | Binocular vision inertial positioning method and system based on point-line characteristics |
CN113298796B (en) * | 2021-06-10 | 2024-04-19 | 西北工业大学 | Line feature SLAM initialization method based on maximum a posteriori IMU |
CN113298796A (en) * | 2021-06-10 | 2021-08-24 | 西北工业大学 | Line feature SLAM initialization method based on maximum a posteriori IMU |
CN113532431A (en) * | 2021-07-15 | 2021-10-22 | 贵州电网有限责任公司 | Visual inertia SLAM method for power inspection and operation |
CN114485640A (en) * | 2022-01-20 | 2022-05-13 | 河北工业职业技术学院 | Monocular vision inertia synchronous positioning and mapping method and system based on point-line characteristics |
CN115526811A (en) * | 2022-11-28 | 2022-12-27 | 电子科技大学中山学院 | Adaptive vision SLAM method suitable for variable illumination environment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109579840A (en) | A kind of tightly-coupled binocular visual-inertial SLAM method with point-line feature fusion | |
CN112649016B (en) | Visual-inertial odometry method based on point-line initialization | |
CN109307508B (en) | Panoramic inertial navigation SLAM method based on multiple key frames | |
CN108375370B (en) | A kind of integrated navigation system for intelligent patrol UAVs | |
CN111024066A (en) | Unmanned aerial vehicle vision-inertia fusion indoor positioning method | |
US9990726B2 (en) | Method of determining a position and orientation of a device associated with a capturing device for capturing at least one image | |
CN108682027A (en) | VSLAM implementation method and system based on point and line feature fusion | |
CN110044354A (en) | A kind of binocular vision indoor positioning and mapping method and device | |
CN108036785A (en) | A kind of aircraft position and orientation estimation method based on direct method and inertial navigation fusion | |
CN108406731A (en) | A kind of positioning device, method and robot based on deep vision | |
CN109166149A (en) | A kind of positioning and three-dimensional wireframe reconstruction method and system fusing a binocular camera and IMU | |
CN109540126A (en) | A kind of inertial-visual integrated navigation method based on the optical flow method | |
CN105953796A (en) | Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone | |
CN109991636A (en) | Map constructing method and system based on GPS, IMU and binocular vision | |
CN110095116A (en) | A kind of localization method combining LIFT-based visual positioning and inertial navigation | |
CN109084732A (en) | Positioning and air navigation aid, device and processing equipment | |
CN109465832A (en) | High-precision vision-IMU tight fusion positioning method and system | |
CN109579844A (en) | Localization method and system | |
CN107909614A (en) | A kind of patrol robot localization method in a GPS-failure environment | |
CN110398245A (en) | Indoor pedestrian navigation attitude estimation method based on a foot-mounted inertial measurement unit | |
CN113091738B (en) | Mobile robot map construction method based on visual inertial navigation fusion and related equipment | |
CN108700946A (en) | System and method for fault detection and recovery for parallel odometry and mapping | |
CN108981693A (en) | Fast joint VIO initialization method based on a monocular camera | |
CN208323361U (en) | A kind of positioning device and robot based on deep vision | |
CN206990800U (en) | A kind of positioning system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | | Application publication date: 20190405 |