CN105371840A - Method for combined navigation of inertia/visual odometer/laser radar - Google Patents
Method for combined navigation of inertia/visual odometer/laser radar Download PDFInfo
- Publication number: CN105371840A (application CN201510727853.XA)
- Authority
- CN
- China
- Legal status: Granted
Classifications
- G01C21/005 — Navigation; navigational instruments not provided for in groups G01C1/00–G01C19/00, with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/165 — Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
- G01S17/58 — Velocity or trajectory determination systems; sense-of-movement determination systems
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
Abstract
The invention belongs to the field of navigation methods and particularly relates to an inertial/visual-odometer/lidar integrated navigation method. The method comprises (1) establishing a state model, (2) measuring velocity with a visual odometer based on feature information, (3) establishing a measurement equation and obtaining measurement values, (4) Kalman filtering, and (5) correcting system errors. The method uses machine-vision autonomous navigation: from the difference between consecutive frames, a monocular camera can measure the carrier's velocity when the distance to the scene is known, and the lidar accurately measures the distance to the observed point, enabling that velocity measurement. The measured velocity is combined with the inertial navigation velocity, so that high-accuracy navigation is finally achieved without any external reference information.
Description
Technical Field
The invention belongs to the field of navigation methods, and particularly relates to an inertial/visual-odometer/lidar integrated navigation method.
Background
As aircraft capabilities for long-duration, long-distance flight continue to grow, so do the requirements on the precision and autonomy of the navigation system. A pure inertial navigation system cannot fully meet practical requirements because of its inherent disadvantage: navigation errors accumulate over time. There are two approaches to this problem. The first is to improve the accuracy of the inertial navigation system itself, mainly by adopting new materials, new processes, and new technologies to refine the inertial devices, or by developing novel high-precision inertial devices. This, however, demands substantial manpower and funding, and the achievable improvement in device precision is limited.

The second is integrated navigation: additional navigation information sources outside the inertial system are used to improve its accuracy through software. Many sources of aiding information are available. Combining all of them could meet or even exceed the accuracy requirement, but the computational load would be prohibitive in practice; combining only a subset means that the type of information selected, the order of combination, and the specific combination scheme all strongly affect the resulting accuracy. The prior art offers no combination scheme that both bounds the computational load and keeps the combined content to a minimum.
Disclosure of Invention
To address the shortcomings of the prior art, the invention provides an inertial/visual-odometer/lidar integrated navigation method.
The invention is realized by the following steps: an inertial/visual odometer/lidar integrated navigation method, comprising the steps of:
(1) establishing the state model of the inertial/visual-odometer/lidar integrated navigation system as follows:
Ẋ(t) = F(t)X(t) + G(t)W(t)
In the formula: X(t) is the system state vector; W(t) is system white noise; the coefficient matrices F(t) and G(t) are derived from the error equations,
X(t) = [δVn, δVu, δVe, δL, δh, δλ, φn, φu, φe, ∇x, ∇y, ∇z, εx, εy, εz, δVn_OV, δVe_OV]
δVn, δVu, δVe respectively represent the north, up, and east velocity errors of the strapdown inertial navigation system;
δL, δh, δλ respectively represent the latitude, altitude, and longitude errors of the strapdown inertial navigation system;
φn, φu, φe respectively represent the north, up, and east misalignment angles in the navigation coordinate system of the strapdown inertial navigation system;
∇x, ∇y, ∇z respectively represent the accelerometer zero biases along the X, Y, Z directions of the carrier coordinate system of the strapdown inertial navigation system;
εx, εy, εz respectively represent the gyro drifts along the X, Y, Z directions of the carrier coordinate system of the strapdown inertial navigation system;
δVn_OV, δVe_OV respectively represent the north and east velocity errors of the visual odometer;
(2) visual odometer velocity measurement based on feature information
a) Selecting a suitable matching area
Selecting the geometric center position of the image as a matching area;
b) extracting image feature information
Intercepting the matching area in the image of the current frame and the image of the previous frame, extracting SIFT feature points of the intercepted images,
c) feature point matching
Matching the feature points in the current frame image and the previous frame image using K-D tree fast feature-point matching, obtaining a series of matched pairs,
d) selecting the same feature point to calculate the speed
The velocity of the feature point is calculated by dividing the change in the lidar-measured distance of the matched point between the two frames by the frame interval;
e) outputting the current frame velocity
Each feature point yields a velocity estimate; these velocities are processed and then output;
(3) establishment of measurement equation and acquisition of measurement value
The Kalman filter measurement equation is of the form:
Z = HX + V
the measurement value Z is the difference between the velocities given by the inertial navigation system and by the visual odometer, which is in fact the difference between their errors:
Z = [Vn_INS − Vn_OV, Ve_INS − Ve_OV]^T = [δVn − δVn_OV, δVe − δVe_OV]^T
where V is the measurement noise, considered white noise,
the H matrix derived from the above formula is the 2 × 17 matrix with H(1,1) = 1, H(1,16) = −1, H(2,3) = 1, H(2,17) = −1, and zeros elsewhere;
(4) Kalman filtering
According to the system equation and the measurement equation of the inertial/visual-odometer/lidar integrated navigation, the state one-step transition matrix for each Kalman filtering period is calculated by the formula:
Φk,k-1 ≈ ∏(i=1…N) (I + F(ti)·Tn)
In the formula: Tn is the navigation period, N·Tn is the Kalman filtering period, F(ti) is the system matrix of the ith navigation period within one Kalman filtering period, and I is the identity matrix,
state one-step prediction
State estimation
Filter gain matrix
One-step prediction error variance matrix
Estimation error variance matrix
Pk=[I-KkHk]Pk,k-1
wherein X̂k,k-1 is the one-step state prediction, X̂k is the state estimate, Φk,k-1 is the state one-step transition matrix, Hk is the measurement matrix, Zk is the measurement, Kk is the filter gain matrix, Rk is the observation noise matrix, Pk,k-1 is the one-step prediction error variance matrix, Pk is the estimation error variance matrix, Γk,k-1 is the system noise driving matrix, and Qk-1 is the system noise matrix;
a series of error values can then be calculated using the state equation of step (1) together with the above filtering equations;
(5) correcting systematic errors
And (4) correcting the system output value by using the error value calculated in the step (4).
In the inertial/visual-odometer/lidar integrated navigation method described above, the processing in step e) of step (2) is the RANSAC method or averaging.
The invention has the following effects: the method uses machine-vision autonomous navigation; from the difference between consecutive frames, the monocular camera can measure the carrier's velocity when the distance is known; the lidar accurately measures the distance to the observed point, so the carrier's velocity can be measured; finally, by using the measured velocity together with the inertial navigation velocity for integrated navigation, high-precision navigation is achieved without any external reference input.
Detailed Description
An inertial/visual odometer/lidar integrated navigation method, comprising the steps of:
(1) establishing the state model of the inertial/visual-odometer/lidar integrated navigation system as follows:
Ẋ(t) = F(t)X(t) + G(t)W(t)
In the formula: X(t) is the system state vector; W(t) is system white noise; the coefficient matrices F(t) and G(t) are derived from the error equations.
X(t) = [δVn, δVu, δVe, δL, δh, δλ, φn, φu, φe, ∇x, ∇y, ∇z, εx, εy, εz, δVn_OV, δVe_OV]
δVn, δVu, δVe respectively represent the north, up, and east velocity errors of the strapdown inertial navigation system;
δL, δh, δλ respectively represent the latitude, altitude, and longitude errors of the strapdown inertial navigation system;
φn, φu, φe respectively represent the north, up, and east misalignment angles in the navigation coordinate system of the strapdown inertial navigation system;
∇x, ∇y, ∇z respectively represent the accelerometer zero biases along the X, Y, Z directions of the carrier coordinate system of the strapdown inertial navigation system;
εx, εy, εz respectively represent the gyro drifts along the X, Y, Z directions of the carrier coordinate system of the strapdown inertial navigation system;
δVn_OV, δVe_OV respectively represent the north and east velocity errors of the visual odometer;
(2) Visual odometer velocity measurement based on feature information
a) Selecting a suitable matching area
In order to improve the speed measurement range of the visual odometer, a suitable position must be selected as the matching area by combining the velocity and attitude information output by the inertial navigation. The geometric center of the image is generally selected as the matching region, with a region size of 30 × 30 pixels.
b) Extracting image feature information
And intercepting the matching area in the image of the current frame and the image of the previous frame, and extracting SIFT feature points of the intercepted image.
c) Feature point matching
A suitable threshold is set, and K-D tree fast feature-point matching is used to match the feature points in the current frame image and the previous frame image, obtaining a series of matched pairs.
d) Selecting the same feature point to calculate the speed
As can be seen from the velocity calculation formula of the visual odometer, the distance and the velocity must be computed for the same point, so the same feature point needs to be found in both the current frame image and the previous frame image. The lidar provides the distance between the feature point and the camera; the relative velocity between the feature point and the camera is then calculated from the feature point's positions in the matched pair of the current and previous frame images.
e) Outputting the current frame velocity
Each feature point yields a velocity estimate; these velocities are then processed (e.g. by the RANSAC method or by averaging), and finally the velocity measurement result of the visual odometer is output.
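As an illustration of steps a)–e), the following minimal numpy sketch computes a per-feature velocity from matched pixel coordinates and lidar ranges, and uses a median in place of the RANSAC/averaging step. The function name, the small-angle pinhole approximation, and the calibration parameter `focal_px` are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def vo_velocity(prev_pts, curr_pts, ranges, focal_px, dt):
    """Per-feature velocity estimate for a range-aided visual odometer.

    prev_pts, curr_pts : (N, 2) matched pixel coordinates in the previous
        and current frame (hypothetical output of SIFT + K-D tree matching).
    ranges   : (N,) lidar range to each feature point, metres (assumed).
    focal_px : focal length in pixels (assumed known from calibration).
    dt       : frame interval, seconds.
    """
    # Pixel displacement scaled by range/focal gives metric displacement
    # (small-angle, known-distance approximation).
    disp_m = (curr_pts - prev_pts) * (ranges[:, None] / focal_px)
    v = disp_m / dt                    # (N, 2) per-feature velocities
    # Robust combination: a median here stands in for RANSAC or averaging.
    return np.median(v, axis=0)
```

The median makes the output tolerant of a few mismatched pairs, which is the same motivation the patent gives for the RANSAC/averaging step.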
(3) Establishment of measurement equation and acquisition of measurement value
The Kalman filter measurement equation is of the form:
Z = HX + V
the measurement value Z is the difference between the velocities given by the inertial navigation system and by the visual odometer, which is in fact the difference between their errors:
Z = [Vn_INS − Vn_OV, Ve_INS − Ve_OV]^T = [δVn − δVn_OV, δVe − δVe_OV]^T
where V is the measurement noise, considered white noise.
The H matrix derived from the above formula is the 2 × 17 matrix with H(1,1) = 1, H(1,16) = −1, H(2,3) = 1, H(2,17) = −1, and zeros elsewhere.
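Given the 17-element state vector of step (1), the measurement matrix can be written down directly. The numpy sketch below assumes a state ordering matching the list in step (1); the variable names are illustrative.

```python
import numpy as np

# Assumed state ordering, following the 17-state vector of step (1):
# [dVn, dVu, dVe, dL, dh, dlam, phi_n, phi_u, phi_e,
#  grad_x, grad_y, grad_z, eps_x, eps_y, eps_z, dVn_OV, dVe_OV]
H = np.zeros((2, 17))
H[0, 0], H[0, 15] = 1.0, -1.0   # Z1 = dVn - dVn_OV (north velocity errors)
H[1, 2], H[1, 16] = 1.0, -1.0   # Z2 = dVe - dVe_OV (east velocity errors)
```

Only four entries are nonzero because the measurement compares two velocity errors and involves none of the attitude or sensor-bias states directly.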
(4) Kalman filtering
According to the system equation and the measurement equation of the inertial/visual-odometer/lidar integrated navigation, the state one-step transition matrix for each Kalman filtering period is calculated by the formula:
Φk,k-1 ≈ ∏(i=1…N) (I + F(ti)·Tn)
In the formula: Tn is the navigation period, N·Tn is the Kalman filtering period, F(ti) is the system matrix of the ith navigation period within one Kalman filtering period, and I is the identity matrix.
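The product over navigation periods can be sketched as follows (numpy; `one_step_phi` is a hypothetical helper name, assuming the per-period system matrices F(ti) are available as a list):

```python
import numpy as np

def one_step_phi(F_list, Tn):
    """Phi_{k,k-1} ~ product over the N navigation periods of (I + F_i*Tn).

    F_list : list of N system matrices, one per navigation period.
    Tn     : navigation period, seconds.
    """
    n = F_list[0].shape[0]
    phi = np.eye(n)
    for F_i in F_list:
        # Left-multiply so earlier periods are applied first.
        phi = (np.eye(n) + F_i * Tn) @ phi
    return phi
```

This is the first-order discretisation of the continuous state equation, accumulated over one Kalman filtering period of N navigation periods.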
State one-step prediction
X̂k,k-1 = Φk,k-1X̂k-1
State estimation
X̂k = X̂k,k-1 + Kk(Zk − HkX̂k,k-1)
Filter gain matrix
Kk = Pk,k-1Hk^T(HkPk,k-1Hk^T + Rk)^-1
One-step prediction error variance matrix
Pk,k-1 = Φk,k-1Pk-1Φk,k-1^T + Γk,k-1Qk-1Γk,k-1^T
Estimation error variance matrix
Pk = [I − KkHk]Pk,k-1
wherein X̂k,k-1 is the one-step state prediction, X̂k is the state estimate, Φk,k-1 is the state one-step transition matrix, Hk is the measurement matrix, Zk is the measurement, Kk is the filter gain matrix, Rk is the observation noise matrix, Pk,k-1 is the one-step prediction error variance matrix, Pk is the estimation error variance matrix, Γk,k-1 is the system noise driving matrix, and Qk-1 is the system noise matrix.
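The five filtering equations above form one recursion. A textbook numpy sketch of that recursion (illustrative names, no patent-specific tuning) is:

```python
import numpy as np

def kalman_step(x, P, Phi, Gamma, Q, H, R, z):
    """One Kalman filter cycle: predict with (Phi, Gamma, Q), update with (H, R, z)."""
    # State one-step prediction
    x_pred = Phi @ x
    # One-step prediction error variance matrix
    P_pred = Phi @ P @ Phi.T + Gamma @ Q @ Gamma.T
    # Filter gain matrix
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    # State estimation
    x_new = x_pred + K @ (z - H @ x_pred)
    # Estimation error variance matrix
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

Each line maps one-to-one onto the equations listed above, with `Gamma` standing for the system noise driving matrix Γk,k-1.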
(5) Correcting systematic errors
The system output values are corrected using the error values calculated in step (4).
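The correction in step (5) is a feedback subtraction of the estimated errors from the corresponding outputs. A minimal sketch (pure Python; the function name and index convention are hypothetical, assuming the positions of the relevant error states in the estimate are known):

```python
def correct_output(ins_output, x_hat, idx):
    """Subtract the estimated errors x_hat[i] (from the Kalman filter of
    step (4)) from the corresponding INS output quantities.

    ins_output : list of raw INS outputs (e.g. velocities, position).
    x_hat      : estimated error-state vector.
    idx        : index into x_hat of the error matching each output.
    """
    return [o - x_hat[i] for o, i in zip(ins_output, idx)]
```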
Claims (2)
1. An inertial/visual odometer/lidar integrated navigation method, comprising the steps of:
(1) establishing the state model of the inertial/visual-odometer/lidar integrated navigation system as follows:
Ẋ(t) = F(t)X(t) + G(t)W(t)
In the formula: X(t) is the system state vector; W(t) is system white noise; the coefficient matrices F(t) and G(t) are derived from the error equations,
δVn, δVu, δVe respectively represent the north, up, and east velocity errors of the strapdown inertial navigation system;
δL, δh, δλ respectively represent the latitude, altitude, and longitude errors of the strapdown inertial navigation system;
φn, φu, φe respectively represent the north, up, and east misalignment angles in the navigation coordinate system of the strapdown inertial navigation system;
∇x, ∇y, ∇z respectively represent the accelerometer zero biases along the X, Y, Z directions of the carrier coordinate system of the strapdown inertial navigation system;
εx, εy, εz respectively represent the gyro drifts along the X, Y, Z directions of the carrier coordinate system of the strapdown inertial navigation system;
δVn_OV, δVe_OV respectively represent the north and east velocity errors of the visual odometer;
(2) visual odometer velocity measurement based on feature information
a) Selecting a suitable matching area
Selecting the geometric center position of the image as a matching area;
b) extracting image feature information
Intercepting the matching area in the image of the current frame and the image of the previous frame, extracting SIFT feature points of the intercepted images,
c) feature point matching
Matching the feature points in the current frame image and the previous frame image using K-D tree fast feature-point matching, obtaining a series of matched pairs,
d) selecting the same feature point to calculate the speed
The velocity of the feature point is calculated by dividing the change in the lidar-measured distance of the matched point between the two frames by the frame interval;
e) outputting the current frame velocity
Each feature point yields a velocity estimate; these velocities are processed and then output;
(3) establishment of measurement equation and acquisition of measurement value
The Kalman filter measurement equation is of the form:
Z = HX + V
the measurement value Z is the difference between the velocities given by the inertial navigation system and by the visual odometer, which is in fact the difference between their errors:
Z = [Vn_INS − Vn_OV, Ve_INS − Ve_OV]^T = [δVn − δVn_OV, δVe − δVe_OV]^T
where V is the measurement noise, considered white noise,
the H matrix derived from the above formula is the 2 × 17 matrix with H(1,1) = 1, H(1,16) = −1, H(2,3) = 1, H(2,17) = −1, and zeros elsewhere;
(4) Kalman filtering
According to the system equation and the measurement equation of the inertial/visual-odometer/lidar integrated navigation, the state one-step transition matrix for each Kalman filtering period is calculated by the formula:
Φk,k-1 ≈ ∏(i=1…N) (I + F(ti)·Tn)
In the formula: Tn is the navigation period, N·Tn is the Kalman filtering period, F(ti) is the system matrix of the ith navigation period within one Kalman filtering period, and I is the identity matrix,
State one-step prediction
X̂k,k-1 = Φk,k-1X̂k-1
State estimation
X̂k = X̂k,k-1 + Kk(Zk − HkX̂k,k-1)
Filter gain matrix
Kk = Pk,k-1Hk^T(HkPk,k-1Hk^T + Rk)^-1
One-step prediction error variance matrix
Pk,k-1 = Φk,k-1Pk-1Φk,k-1^T + Γk,k-1Qk-1Γk,k-1^T
Estimation error variance matrix
Pk = [I − KkHk]Pk,k-1
wherein X̂k,k-1 is the one-step state prediction, X̂k is the state estimate, Φk,k-1 is the state one-step transition matrix, Hk is the measurement matrix, Zk is the measurement, Kk is the filter gain matrix, Rk is the observation noise matrix, Pk,k-1 is the one-step prediction error variance matrix, Pk is the estimation error variance matrix, Γk,k-1 is the system noise driving matrix, and Qk-1 is the system noise matrix;
a series of error values can then be calculated using the state equation of step (1) together with the above filtering equations;
(5) correcting systematic errors
And (4) correcting the system output value by using the error value calculated in the step (4).
2. The integrated inertial/visual-odometer/lidar navigation method of claim 1, wherein the processing in step e) of step (2) is the RANSAC method or averaging.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510727853.XA CN105371840B (en) | 2015-10-30 | 2015-10-30 | A kind of inertia/visual odometry/laser radar Combinated navigation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105371840A true CN105371840A (en) | 2016-03-02 |
CN105371840B CN105371840B (en) | 2019-03-22 |
Family
ID=55374248
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510727853.XA Active CN105371840B (en) | 2015-10-30 | 2015-10-30 | A kind of inertia/visual odometry/laser radar Combinated navigation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105371840B (en) |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106153074A (en) * | 2016-06-20 | 2016-11-23 | 浙江大学 | A kind of optical calibrating system and method for the dynamic navigation performance of IMU |
CN106803262A (en) * | 2016-12-21 | 2017-06-06 | 上海交通大学 | The method that car speed is independently resolved using binocular vision |
CN107402012A (en) * | 2016-05-20 | 2017-11-28 | 北京自动化控制设备研究所 | A kind of Combinated navigation method of vehicle |
CN107402005A (en) * | 2016-05-20 | 2017-11-28 | 北京自动化控制设备研究所 | One kind is based on inertia/odometer/RFID high-precision integrated navigation method |
CN107588769A (en) * | 2017-10-17 | 2018-01-16 | 北京航天发射技术研究所 | A kind of vehicle-mounted inertial navigation, odometer and altimeter Combinated navigation method |
CN107861501A (en) * | 2017-10-22 | 2018-03-30 | 北京工业大学 | Underground sewage treatment works intelligent robot automatic positioning navigation system |
CN107990893A (en) * | 2017-11-24 | 2018-05-04 | 南京航空航天大学 | The detection method that environment is undergone mutation is detected in two-dimensional laser radar SLAM |
CN108151713A (en) * | 2017-12-13 | 2018-06-12 | 南京航空航天大学 | A kind of quick position and orientation estimation methods of monocular VO |
CN108519615A (en) * | 2018-04-19 | 2018-09-11 | 河南科技学院 | Mobile robot autonomous navigation method based on integrated navigation and Feature Points Matching |
CN108536163A (en) * | 2018-03-15 | 2018-09-14 | 南京航空航天大学 | A kind of kinetic model/laser radar Combinated navigation method under single-sided structure environment |
CN108562289A (en) * | 2018-06-07 | 2018-09-21 | 南京航空航天大学 | Quadrotor laser radar air navigation aid in continuous polygon geometry environment |
CN108731670A (en) * | 2018-05-18 | 2018-11-02 | 南京航空航天大学 | Inertia/visual odometry combined navigation locating method based on measurement model optimization |
CN109387198A (en) * | 2017-08-03 | 2019-02-26 | 北京自动化控制设备研究所 | A kind of inertia based on sequential detection/visual odometry Combinated navigation method |
CN109444911A (en) * | 2018-10-18 | 2019-03-08 | 哈尔滨工程大学 | A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion |
CN110017850A (en) * | 2019-04-19 | 2019-07-16 | 小狗电器互联网科技(北京)股份有限公司 | A kind of gyroscopic drift estimation method, device and positioning system |
CN110501024A (en) * | 2019-04-11 | 2019-11-26 | 同济大学 | A kind of error in measurement compensation method of vehicle-mounted INS/ laser radar integrated navigation system |
CN110849362A (en) * | 2019-11-28 | 2020-02-28 | 湖南率为控制科技有限公司 | Laser radar and vision combined navigation algorithm based on vehicle-mounted inertia |
CN111121768A (en) * | 2019-12-23 | 2020-05-08 | 深圳市优必选科技股份有限公司 | Robot pose estimation method and device, readable storage medium and robot |
CN111521176A (en) * | 2020-04-27 | 2020-08-11 | 北京工业大学 | Visual auxiliary inertial navigation method fusing laser |
CN111595333A (en) * | 2020-04-26 | 2020-08-28 | 武汉理工大学 | Modularized unmanned vehicle positioning method and system based on visual inertial laser data fusion |
CN111829552A (en) * | 2019-04-19 | 2020-10-27 | 北京初速度科技有限公司 | Error correction method and device for visual inertial system |
CN111947652A (en) * | 2020-08-13 | 2020-11-17 | 北京航空航天大学 | Inertia/vision/astronomy/laser ranging combined navigation method suitable for lunar lander |
CN112985416A (en) * | 2021-04-19 | 2021-06-18 | 湖南大学 | Robust positioning and mapping method and system based on laser and visual information fusion |
CN114637302A (en) * | 2022-04-15 | 2022-06-17 | 安徽农业大学 | Automatic advancing obstacle avoidance method and system based on computer vision |
CN115855042A (en) * | 2022-12-12 | 2023-03-28 | 北京自动化控制设备研究所 | Pedestrian visual navigation method based on laser radar cooperative assistance |
CN117968680A (en) * | 2024-03-29 | 2024-05-03 | 西安现代控制技术研究所 | Inertial-radar integrated navigation limited frame measurement variable weight updating method |
CN118243094A (en) * | 2024-04-10 | 2024-06-25 | 北京自动化控制设备研究所 | Inertial vision integrated navigation self-adaptive filtering method based on quality factors |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090248304A1 (en) * | 2008-03-28 | 2009-10-01 | Regents Of The University Of Minnesota | Vision-aided inertial navigation |
CN102538781A (en) * | 2011-12-14 | 2012-07-04 | 浙江大学 | Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method |
CN103033189A (en) * | 2012-12-26 | 2013-04-10 | 北京航空航天大学 | Inertia/vision integrated navigation method for deep-space detection patrolling device |
CN103162687A (en) * | 2013-03-07 | 2013-06-19 | 中国人民解放军国防科学技术大学 | Image/inertial navigation combination navigation method based on information credibility |
CN103424112A (en) * | 2013-07-29 | 2013-12-04 | 南京航空航天大学 | Vision navigating method for movement carrier based on laser plane assistance |
CN104833352A (en) * | 2015-01-29 | 2015-08-12 | 西北工业大学 | Multi-medium complex-environment high-precision vision/inertia combination navigation method |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107402012A (en) * | 2016-05-20 | 2017-11-28 | 北京自动化控制设备研究所 | A kind of Combinated navigation method of vehicle |
CN107402005A (en) * | 2016-05-20 | 2017-11-28 | 北京自动化控制设备研究所 | One kind is based on inertia/odometer/RFID high-precision integrated navigation method |
CN106153074A (en) * | 2016-06-20 | 2016-11-23 | 浙江大学 | A kind of optical calibrating system and method for the dynamic navigation performance of IMU |
CN106803262A (en) * | 2016-12-21 | 2017-06-06 | 上海交通大学 | The method that car speed is independently resolved using binocular vision |
CN109387198A (en) * | 2017-08-03 | 2019-02-26 | 北京自动化控制设备研究所 | A kind of inertia based on sequential detection/visual odometry Combinated navigation method |
CN109387198B (en) * | 2017-08-03 | 2022-07-15 | 北京自动化控制设备研究所 | Inertia/vision milemeter combined navigation method based on sequential detection |
CN107588769A (en) * | 2017-10-17 | 2018-01-16 | 北京航天发射技术研究所 | A kind of vehicle-mounted inertial navigation, odometer and altimeter Combinated navigation method |
CN107861501A (en) * | 2017-10-22 | 2018-03-30 | 北京工业大学 | Underground sewage treatment works intelligent robot automatic positioning navigation system |
CN107990893A (en) * | 2017-11-24 | 2018-05-04 | 南京航空航天大学 | The detection method that environment is undergone mutation is detected in two-dimensional laser radar SLAM |
CN108151713A (en) * | 2017-12-13 | 2018-06-12 | 南京航空航天大学 | A kind of quick position and orientation estimation methods of monocular VO |
CN108536163A (en) * | 2018-03-15 | 2018-09-14 | 南京航空航天大学 | A kind of kinetic model/laser radar Combinated navigation method under single-sided structure environment |
CN108519615A (en) * | 2018-04-19 | 2018-09-11 | 河南科技学院 | Mobile robot autonomous navigation method based on integrated navigation and Feature Points Matching |
CN108519615B (en) * | 2018-04-19 | 2021-11-26 | 河南科技学院 | Mobile robot autonomous navigation method based on combined navigation and feature point matching |
CN108731670B (en) * | 2018-05-18 | 2021-06-22 | 南京航空航天大学 | Inertial/visual odometer integrated navigation positioning method based on measurement model optimization |
CN108731670A (en) * | 2018-05-18 | 2018-11-02 | 南京航空航天大学 | Inertia/visual odometry combined navigation locating method based on measurement model optimization |
CN108562289A (en) * | 2018-06-07 | 2018-09-21 | 南京航空航天大学 | Quadrotor laser radar air navigation aid in continuous polygon geometry environment |
CN108562289B (en) * | 2018-06-07 | 2021-11-26 | 南京航空航天大学 | Laser radar navigation method for four-rotor aircraft in continuous multilateral geometric environment |
CN109444911B (en) * | 2018-10-18 | 2023-05-05 | 哈尔滨工程大学 | Unmanned ship water surface target detection, identification and positioning method based on monocular camera and laser radar information fusion |
CN109444911A (en) * | 2018-10-18 | 2019-03-08 | 哈尔滨工程大学 | A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion |
CN110501024A (en) * | 2019-04-11 | 2019-11-26 | 同济大学 | A kind of error in measurement compensation method of vehicle-mounted INS/ laser radar integrated navigation system |
CN110017850A (en) * | 2019-04-19 | 2019-07-16 | Xiaogou Electric Internet Technology (Beijing) Co., Ltd. | Gyroscope drift estimation method, device, and positioning system |
CN111829552A (en) * | 2019-04-19 | 2020-10-27 | Beijing Chusudu Technology Co., Ltd. | Error correction method and device for a visual-inertial system |
CN110849362B (en) * | 2019-11-28 | 2022-01-04 | Hunan Shuaiwei Control Technology Co., Ltd. | Laser radar and vision integrated navigation algorithm based on vehicle-mounted inertial navigation |
CN110849362A (en) * | 2019-11-28 | 2020-02-28 | Hunan Shuaiwei Control Technology Co., Ltd. | Laser radar and vision integrated navigation algorithm based on vehicle-mounted inertial navigation |
CN111121768A (en) * | 2019-12-23 | 2020-05-08 | UBTECH Robotics Corp., Ltd. | Robot pose estimation method and device, readable storage medium, and robot |
CN111595333A (en) * | 2020-04-26 | 2020-08-28 | Wuhan University of Technology | Modular unmanned vehicle positioning method and system based on visual-inertial-laser data fusion |
CN111521176A (en) * | 2020-04-27 | 2020-08-11 | Beijing University of Technology | Vision-aided inertial navigation method with laser fusion |
CN111947652A (en) * | 2020-08-13 | 2020-11-17 | Beihang University | Inertial/visual/celestial/laser-ranging integrated navigation method suitable for a lunar lander |
CN111947652B (en) * | 2020-08-13 | 2022-09-20 | Beihang University | Inertial/visual/celestial/laser-ranging integrated navigation method suitable for a lunar lander |
CN112985416A (en) * | 2021-04-19 | 2021-06-18 | Hunan University | Robust positioning and mapping method and system based on laser and visual information fusion |
CN112985416B (en) * | 2021-04-19 | 2021-07-30 | Hunan University | Robust positioning and mapping method and system based on laser and visual information fusion |
CN114637302A (en) * | 2022-04-15 | 2022-06-17 | Anhui Agricultural University | Computer-vision-based automatic travel and obstacle avoidance method and system |
CN114637302B (en) * | 2022-04-15 | 2022-10-18 | Anhui Agricultural University | Computer-vision-based automatic travel and obstacle avoidance method and system |
CN115855042A (en) * | 2022-12-12 | 2023-03-28 | Beijing Institute of Automatic Control Equipment | Pedestrian visual navigation method with cooperative laser radar assistance |
CN117968680A (en) * | 2024-03-29 | 2024-05-03 | Xi'an Modern Control Technology Research Institute | Variable-weight measurement update method with a limited number of frames for inertial/radar integrated navigation |
CN118243094A (en) * | 2024-04-10 | 2024-06-25 | Beijing Institute of Automatic Control Equipment | Quality-factor-based adaptive filtering method for inertial/visual integrated navigation |
Also Published As
Publication number | Publication date |
---|---|
CN105371840B (en) | 2019-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105371840B (en) | Method for combined navigation of inertia/visual odometer/laser radar | |
US10295365B2 (en) | State estimation for aerial vehicles using multi-sensor fusion | |
CN103697889B (en) | Unmanned aerial vehicle autonomous navigation and positioning method based on multi-model distributed filtering | |
EP2372656B1 (en) | Method and apparatus for vision aided navigation using image registration | |
CN101598556B (en) | Unmanned aerial vehicle vision/inertia integrated navigation method in unknown environment | |
EP2112630A1 (en) | Method and system for real-time visual odometry | |
CN110084832A (en) | Camera pose correction method, device, system, equipment, and storage medium |
Chien et al. | Visual odometry driven online calibration for monocular lidar-camera systems | |
CN111707261A (en) | High-speed sensing and positioning method for micro unmanned aerial vehicle | |
CN113466890B (en) | Lightweight laser radar/inertial integrated positioning method and system based on key feature extraction |
CN114693754B (en) | Unmanned aerial vehicle autonomous positioning method and system based on monocular vision inertial navigation fusion | |
CN109387198B (en) | Inertia/visual odometer integrated navigation method based on sequential detection |
EP2322902B1 (en) | System and method for determining heading | |
Khoshelham et al. | Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry | |
CN113052897A (en) | Positioning initialization method and related device, equipment and storage medium | |
RU2564379C1 (en) | Strapdown inertial attitude-and-heading reference system |
CN114690229A (en) | GPS-fused mobile robot visual inertial navigation method | |
Wu et al. | AFLI-Calib: Robust LiDAR-IMU extrinsic self-calibration based on adaptive frame length LiDAR odometry | |
Mostafa et al. | Optical flow based approach for vision aided inertial navigation using regression trees | |
CN112819711B (en) | Monocular vision-based vehicle reverse positioning method utilizing road lane line | |
Chu et al. | Performance comparison of tight and loose INS-Camera integration | |
US11037018B2 (en) | Navigation augmentation system and method | |
Zhang et al. | INS assisted monocular visual odometry for aerial vehicles | |
Alekseev et al. | Visual-inertial odometry algorithms on the base of thermal camera | |
CN113946151A (en) | Data processing method and device for an autonomous driving vehicle, and autonomous driving vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||