CN105371840B - An inertial/visual odometry/lidar integrated navigation method - Google Patents

An inertial/visual odometry/lidar integrated navigation method

Info

Publication number
CN105371840B
CN105371840B · Application CN201510727853.XA
Authority
CN
China
Prior art keywords
error
speed
matrix
visual odometry
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510727853.XA
Other languages
Chinese (zh)
Other versions
CN105371840A (en)
Inventor
孙伟
李海军
郭元江
徐海刚
李群
郑辛
张忆欣
刘冲
裴玉锋
原润
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Automation Control Equipment Institute BACEI
Original Assignee
Beijing Automation Control Equipment Institute BACEI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Automation Control Equipment Institute BACEI
Priority to CN201510727853.XA
Publication of CN105371840A
Application granted
Publication of CN105371840B
Active legal status
Anticipated expiration legal status


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/50 Systems of measurement based on relative movement of target
    • G01S17/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Abstract

The invention belongs to the field of navigation methods, and in particular relates to an inertial/visual odometry/lidar integrated navigation method. It includes: (1) establishment of the state model; (2) velocity measurement by visual odometry based on feature information; (3) establishment of the measurement equation and acquisition of the measurement value; (4) Kalman filtering; (5) correction of the system errors. The effect of the invention is as follows: the patent uses machine-vision autonomous navigation technology. From the difference between successive image frames, a monocular camera can measure the velocity of the carrier when the distance is known; the lidar accurately measures the distance to the observation point, from which the velocity of the carrier is obtained. The velocity obtained by measurement is combined with the inertial navigation velocity for integrated navigation, so that high-precision navigation is finally achieved without any input of external reference information.

Description

An inertial/visual odometry/lidar integrated navigation method
Technical field
The invention belongs to the field of navigation methods, and in particular relates to an inertial/visual odometry/lidar integrated navigation method.
Background technique
As the long-endurance, long-range capabilities of aircraft continue to improve, the requirements on the precision and autonomy of navigation systems also keep rising. The drawback of a pure inertial navigation system is that its inherent navigation errors accumulate over time, so it cannot fully meet the needs of practical applications. There are two classes of approaches to this problem. The first is to improve the precision of the inertial navigation system itself, relying mainly on new materials, new processes, and new technologies to improve the precision of inertial devices or to develop new high-precision inertial devices; but this requires a great deal of manpower and funding, and the achievable improvement in inertial device precision is limited. The second is to use integrated navigation technology: additional navigation information sources outside the inertial system are used to improve its precision, raising navigation accuracy through software techniques. However, many kinds of combinable information are available. If all of this information were combined, the precision could reach or even exceed the requirement, but the computational load would be enormous and completely impractical. If only part of the information is combined, then the choice of information types, the order of combination, and the specific combination scheme all have a huge influence on the precision of the result. The prior art lacks a comparatively complete combination that both keeps the computational load acceptable and minimizes the combined content.
Summary of the invention
In view of the drawbacks of the prior art, the present invention provides an inertial/visual odometry/lidar integrated navigation method.
The present invention is implemented as follows. An inertial/visual odometry/lidar integrated navigation method includes the following steps:
(1) Establishment of the state model of the inertial/visual odometry/lidar integrated navigation system, as in the following formula:
Ẋ(t) = F(t)X(t) + G(t)W(t)
In the formula, X(t) is the system state vector; W(t) is the system white noise; the coefficient matrices F(t) and G(t) are obtained from the error equations,
X(t) = [δVn, δVu, δVe, δL, δh, δλ, φn, φu, φe, ∇x, ∇y, ∇z, εx, εy, εz, δVn_ov, δVe_ov]
δVn, δVu, δVe denote the north, up, and east velocity errors of the strapdown inertial navigation system, respectively;
δL, δh, δλ denote the latitude error, height error, and longitude error of the strapdown inertial navigation system, respectively;
φn, φu, φe denote the misalignment angles about the north, up, and east directions of the navigation coordinate frame of the strapdown inertial navigation system, respectively;
∇x, ∇y, ∇z denote the accelerometer biases along the X, Y, and Z axes of the body coordinate frame of the strapdown inertial navigation system, respectively;
εx, εy, εz denote the gyro drifts about the X, Y, and Z axes of the body coordinate frame of the strapdown inertial navigation system, respectively;
δVn_ov, δVe_ov denote the north and east velocity errors of the visual odometry, respectively;
(2) Velocity measurement by visual odometry based on feature information
a) Selecting a suitable matching area
The geometric center of the image is chosen as the matching area;
b) Extracting image feature information
The matching areas of the current frame and the previous frame are cropped out, and SIFT feature points are extracted from the cropped images;
c) Feature point matching
K-D tree fast feature point matching is used to match the current frame with the feature points of the previous frame, obtaining a series of matched pairs;
d) Selecting the same feature point and calculating its speed
The lidar provides the distances to the two members of a matched pair; the speed of the feature point is the difference of these distances divided by the time interval;
e) Outputting the current frame speed
Each feature point yields one speed; these speeds are processed and the result is output;
(3) Establishment of the measurement equation and acquisition of the measurement value
The Kalman filter measurement equation has the form:
Z = HX + V
The measurement value Z is the difference between the velocities given by the inertial navigation system and the visual odometry, which is actually the difference of the two errors:
Z = [Vn_INS − Vn_OV, Ve_INS − Ve_OV]ᵀ = [δVn − δVn_ov, δVe − δVe_ov]ᵀ + V
In the formula, V is the measurement noise, treated as white noise.
From the above equation, the matrix H (2 × 17) is obtained as:
H = [ 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 −1 0
      0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 −1 ]
(4) Kalman filtering
According to the system equation and the measurement equation of the inertial/visual odometry/lidar integrated navigation system, the one-step state transition matrix at the arrival of each Kalman filter period is computed as:
Φk,k-1 = (I + F1·Tn)(I + F2·Tn)…(I + FN·Tn)
In the formula: Tn is the navigation period, N·Tn is the Kalman filtering period, Fi is the system matrix of the i-th navigation period within one Kalman filtering period, and I is the identity matrix.
State one-step prediction:
X̂k,k-1 = Φk,k-1·X̂k-1
State estimate:
X̂k = X̂k,k-1 + Kk·(Zk − Hk·X̂k,k-1)
Filter gain matrix:
Kk = Pk,k-1·Hkᵀ·(Hk·Pk,k-1·Hkᵀ + Rk)⁻¹
One-step prediction error covariance matrix:
Pk,k-1 = Φk,k-1·Pk-1·Φk,k-1ᵀ + Γk,k-1·Qk-1·Γk,k-1ᵀ
Estimation error covariance matrix:
Pk = [I − Kk·Hk]·Pk,k-1
where X̂k,k-1 is the one-step state prediction, X̂k is the state estimate, Φk,k-1 is the one-step state transition matrix, Hk is the measurement matrix, Zk is the measurement, Kk is the filter gain matrix, Rk is the measurement noise matrix, Pk,k-1 is the one-step prediction error covariance matrix, Pk is the estimation error covariance matrix, Γk,k-1 is the system noise driving matrix, and Qk-1 is the system noise matrix;
Using the equations of step (1) and of this step, a series of error values can be calculated;
(5) Correcting the system errors
The system output values are corrected with the error values calculated in step (4).
In the inertial/visual odometry/lidar integrated navigation method described above, the processing referred to in step e) of step (2) is the RANSAC method or averaging.
The effect of the invention is as follows. This patent uses machine-vision autonomous navigation technology: from the difference between successive image frames, a monocular camera can measure the velocity of the carrier when the distance is known. The lidar accurately measures the distance to the observation point, from which the velocity of the carrier is obtained. The velocity obtained by measurement is combined with the inertial navigation velocity for integrated navigation, so that high-precision navigation is finally achieved without any input of external reference information.
Specific embodiment
An inertial/visual odometry/lidar integrated navigation method includes the following steps:
(1) Establishment of the state model of the inertial/visual odometry/lidar integrated navigation system, as in the following formula:
Ẋ(t) = F(t)X(t) + G(t)W(t)
In the formula, X(t) is the system state vector; W(t) is the system white noise; the coefficient matrices F(t) and G(t) are obtained from the error equations.
X(t) = [δVn, δVu, δVe, δL, δh, δλ, φn, φu, φe, ∇x, ∇y, ∇z, εx, εy, εz, δVn_ov, δVe_ov]
δVn, δVu, δVe denote the north, up, and east velocity errors of the strapdown inertial navigation system, respectively;
δL, δh, δλ denote the latitude error, height error, and longitude error of the strapdown inertial navigation system, respectively;
φn, φu, φe denote the misalignment angles about the north, up, and east directions of the navigation coordinate frame of the strapdown inertial navigation system, respectively;
∇x, ∇y, ∇z denote the accelerometer biases along the X, Y, and Z axes of the body coordinate frame of the strapdown inertial navigation system, respectively;
εx, εy, εz denote the gyro drifts about the X, Y, and Z axes of the body coordinate frame of the strapdown inertial navigation system, respectively;
δVn_ov, δVe_ov denote the north and east velocity errors of the visual odometry, respectively;
(2) Velocity measurement by visual odometry based on feature information
a) Selecting a suitable matching area
To extend the velocity-measurement range of the visual odometry, the velocity and attitude information output by the inertial navigation must be combined to select a suitable position as the matching area. In general, the geometric center of the image is chosen as the matching area, with an area size of 30×30 pixels.
b) Extracting image feature information
The matching areas of the current frame and the previous frame are cropped out, and SIFT feature points are extracted from the cropped images.
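The interception of the matching region can be sketched as a simple center crop; the 30×30 patch size comes from the text above, while the frame dimensions are illustrative, and the SIFT extraction itself is omitted:

```python
import numpy as np

def center_crop(img, size=30):
    """Cut a size x size matching region around the image's geometric center."""
    h, w = img.shape[:2]
    top, left = (h - size) // 2, (w - size) // 2
    return img[top:top + size, left:left + size]

# hypothetical 480x640 grayscale frames
prev_frame = np.zeros((480, 640), dtype=np.uint8)
curr_frame = np.zeros((480, 640), dtype=np.uint8)
prev_patch = center_crop(prev_frame)   # SIFT features would be extracted here
curr_patch = center_crop(curr_frame)
print(prev_patch.shape)
```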
C) Feature Points Matching
Suitable threshold value is set, the matching of K-D tree rapid characteristic points is applicable in, by the spy in current frame image and previous frame image Sign point is matched, and obtains a series of matching pair.
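A minimal sketch of the K-D tree matching step, using synthetic 128-dimensional SIFT-like descriptors and `scipy.spatial.cKDTree`; the descriptor counts, noise level, and the 0.5 distance threshold are assumptions, not values from the patent:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
prev_desc = rng.random((50, 128))                              # previous-frame descriptors
curr_desc = prev_desc[:40] + rng.normal(0.0, 0.01, (40, 128))  # noisy re-detections

tree = cKDTree(prev_desc)            # K-D tree over the previous frame's features
dist, idx = tree.query(curr_desc, k=1)

threshold = 0.5                      # assumed matching threshold
matches = [(i, int(j)) for i, (d, j) in enumerate(zip(dist, idx)) if d < threshold]
print(len(matches))                  # number of matched pairs
```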
D) same characteristic point calculating speed is selected
By the speed calculation formula of visual odometry it is found that the calculating of distance and speed is for same point, therefore It needs to find the same characteristic point in current frame image, previous frame image.This feature point and camera are obtained using laser radar Distance, then information of the matching in by this feature point in current frame image and previous frame image calculates this feature point With the relative velocity of camera.
E) current frame speed is exported
Each characteristic point can calculate a speed, by these speed carry out certain processing (such as RANSAC method or It is average), finally export the result that tests the speed of visual odometry.
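Steps d) and e) can be sketched as follows. The pinhole conversion from pixel shift to metric displacement, the focal length, the lidar ranges, and the use of a median in place of a full RANSAC are all illustrative assumptions:

```python
import numpy as np

def point_speed(range_m, px_shift, focal_px, dt):
    """Speed of one matched feature: the lidar range scales the pixel shift
    between frames into metres (pinhole model), divided by the frame interval."""
    return (px_shift * range_m / focal_px) / dt

# hypothetical per-feature measurements
ranges = np.array([10.0, 10.2, 9.9, 10.1])   # lidar ranges to the features (m)
shifts = np.array([12.0, 11.8, 12.1, 30.0])  # pixel shifts; the last is an outlier
speeds = point_speed(ranges, shifts, focal_px=800.0, dt=0.05)

v_out = np.median(speeds)   # robust aggregation standing in for RANSAC/averaging
print(round(float(v_out), 4))
```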
(3) Establishment of the measurement equation and acquisition of the measurement value
The Kalman filter measurement equation has the form:
Z = HX + V
The measurement value Z is the difference between the velocities given by the inertial navigation system and the visual odometry, which is actually the difference of the two errors:
Z = [Vn_INS − Vn_OV, Ve_INS − Ve_OV]ᵀ = [δVn − δVn_ov, δVe − δVe_ov]ᵀ + V
In the formula, V is the measurement noise, treated as white noise.
From the above equation, the matrix H (2 × 17) is obtained as:
H = [ 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 −1 0
      0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 −1 ]
(4) Kalman filtering
According to the system equation and the measurement equation of the inertial/visual odometry/lidar integrated navigation system, the one-step state transition matrix at the arrival of each Kalman filter period is computed as:
Φk,k-1 = (I + F1·Tn)(I + F2·Tn)…(I + FN·Tn)
In the formula: Tn is the navigation period, N·Tn is the Kalman filtering period, Fi is the system matrix of the i-th navigation period within one Kalman filtering period, and I is the identity matrix.
State one-step prediction:
X̂k,k-1 = Φk,k-1·X̂k-1
State estimate:
X̂k = X̂k,k-1 + Kk·(Zk − Hk·X̂k,k-1)
Filter gain matrix:
Kk = Pk,k-1·Hkᵀ·(Hk·Pk,k-1·Hkᵀ + Rk)⁻¹
One-step prediction error covariance matrix:
Pk,k-1 = Φk,k-1·Pk-1·Φk,k-1ᵀ + Γk,k-1·Qk-1·Γk,k-1ᵀ
Estimation error covariance matrix:
Pk = [I − Kk·Hk]·Pk,k-1
where X̂k,k-1 is the one-step state prediction, X̂k is the state estimate, Φk,k-1 is the one-step state transition matrix, Hk is the measurement matrix, Zk is the measurement, Kk is the filter gain matrix, Rk is the measurement noise matrix, Pk,k-1 is the one-step prediction error covariance matrix, Pk is the estimation error covariance matrix, Γk,k-1 is the system noise driving matrix, and Qk-1 is the system noise matrix.
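The filter equations of this step can be sketched as one predict/update cycle. The 17-state layout follows step (1); the transition matrix, noise levels, and measurement values below are placeholder assumptions, and H picks out δVn − δVn_ov and δVe − δVe_ov:

```python
import numpy as np

def kf_step(x, P, Phi, Q, z, H, R):
    """One Kalman cycle: one-step prediction, gain, update, covariance."""
    x_pred = Phi @ x                              # state one-step prediction
    P_pred = Phi @ P @ Phi.T + Q                  # one-step prediction covariance
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)  # filter gain matrix
    x_new = x_pred + K @ (z - H @ x_pred)         # state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P_pred     # estimation error covariance
    return x_new, P_new

n = 17                          # 17 error states as listed in step (1)
x, P = np.zeros(n), np.eye(n)
Phi = np.eye(n)                 # placeholder one-step transition matrix
Q = 1e-6 * np.eye(n)            # placeholder process noise (Gamma*Q*Gamma^T)
H = np.zeros((2, n))
H[0, 0], H[0, 15] = 1.0, -1.0   # deltaVn - deltaVn_ov
H[1, 2], H[1, 16] = 1.0, -1.0   # deltaVe - deltaVe_ov
R = 0.01 * np.eye(2)            # placeholder measurement noise
z = np.array([0.5, -0.3])       # hypothetical INS-minus-VO velocity differences
x, P = kf_step(x, P, Phi, Q, z, H, R)
```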
(5) Correcting the system errors
The system output values are corrected with the error values calculated in step (4).
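Step (5) amounts to subtracting the estimated error values from the corresponding system outputs; the velocities and error estimates below are hypothetical:

```python
import numpy as np

# hypothetical INS-indicated velocities (north/up/east, m/s) and the
# velocity-error estimates produced by the Kalman filter in step (4)
v_ins  = np.array([5.00, 0.10, 3.00])
dv_est = np.array([0.02, -0.01, 0.03])

v_corr = v_ins - dv_est     # corrected system output
print(v_corr)
```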

Claims (2)

1. An inertial/visual odometry/lidar integrated navigation method, characterized in that it includes the following steps:
(1) Establishment of the state model of the inertial/visual odometry/lidar integrated navigation system, as in the following formula:
Ẋ(t) = F(t)X(t) + G(t)W(t)
In the formula, X(t) is the system state vector; W(t) is the system white noise; the coefficient matrices F(t) and G(t) are obtained from the error equations,
X(t) = [δVn, δVu, δVe, δL, δh, δλ, φn, φu, φe, ∇x, ∇y, ∇z, εx, εy, εz, δVn_ov, δVe_ov]
δVn, δVu, δVe denote the north, up, and east velocity errors of the strapdown inertial navigation system, respectively;
δL, δh, δλ denote the latitude error, height error, and longitude error of the strapdown inertial navigation system, respectively;
φn, φu, φe denote the misalignment angles about the north, up, and east directions of the navigation coordinate frame of the strapdown inertial navigation system, respectively;
∇x, ∇y, ∇z denote the accelerometer biases along the X, Y, and Z axes of the body coordinate frame of the strapdown inertial navigation system, respectively;
εx, εy, εz denote the gyro drifts about the X, Y, and Z axes of the body coordinate frame of the strapdown inertial navigation system, respectively;
δVn_ov, δVe_ov denote the north and east velocity errors of the visual odometry, respectively;
(2) Velocity measurement by visual odometry based on feature information
a) Selecting a suitable matching area
The geometric center of the image is chosen as the matching area;
b) Extracting image feature information
The matching areas of the current frame and the previous frame are cropped out, and SIFT feature points are extracted from the cropped images;
c) Feature point matching
K-D tree fast feature point matching is used to match the current frame with the feature points of the previous frame, obtaining a series of matched pairs;
d) Selecting the same feature point and calculating its speed
The lidar provides the distances to the two members of a matched pair; the speed of the feature point is the difference of these distances divided by the time interval;
e) Outputting the current frame speed
Each feature point yields one speed; these speeds are processed and the result is output;
(3) Establishment of the measurement equation and acquisition of the measurement value
The Kalman filter measurement equation has the form:
Z = HX + V
The measurement value Z is the difference between the velocities given by the inertial navigation system and the visual odometry, which is actually the difference of the two errors:
Z = [Vn_INS − Vn_OV, Ve_INS − Ve_OV]ᵀ = [δVn − δVn_ov, δVe − δVe_ov]ᵀ + V
In the formula, V is the measurement noise, treated as white noise.
From the above equation, the matrix H (2 × 17) is obtained as:
H = [ 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 −1 0
      0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 −1 ]
(4) Kalman filtering
According to the system equation and the measurement equation of the inertial/visual odometry/lidar integrated navigation system, the one-step state transition matrix at the arrival of each Kalman filter period is computed as:
Φk,k-1 = (I + F1·Tn)(I + F2·Tn)…(I + FN·Tn)
In the formula: Tn is the navigation period, N·Tn is the Kalman filtering period, Fi is the system matrix of the i-th navigation period within one Kalman filtering period, and I is the identity matrix.
State one-step prediction:
X̂k,k-1 = Φk,k-1·X̂k-1
State estimate:
X̂k = X̂k,k-1 + Kk·(Zk − Hk·X̂k,k-1)
Filter gain matrix:
Kk = Pk,k-1·Hkᵀ·(Hk·Pk,k-1·Hkᵀ + Rk)⁻¹
One-step prediction error covariance matrix:
Pk,k-1 = Φk,k-1·Pk-1·Φk,k-1ᵀ + Γk,k-1·Qk-1·Γk,k-1ᵀ
Estimation error covariance matrix:
Pk = [I − Kk·Hk]·Pk,k-1
where X̂k,k-1 is the one-step state prediction, X̂k is the state estimate, Φk,k-1 is the one-step state transition matrix, Hk is the measurement matrix, Zk is the measurement, Kk is the filter gain matrix, Rk is the measurement noise matrix, Pk,k-1 is the one-step prediction error covariance matrix, Pk is the estimation error covariance matrix, Γk,k-1 is the system noise driving matrix, and Qk-1 is the system noise matrix;
Using the equations of step (1) and of this step, a series of error values can be calculated;
(5) Correcting the system errors
The system output values are corrected with the error values calculated in step (4).
2. The inertial/visual odometry/lidar integrated navigation method according to claim 1, characterized in that the processing referred to in step e) of step (2) is the RANSAC method or averaging.
CN201510727853.XA 2015-10-30 2015-10-30 An inertial/visual odometry/lidar integrated navigation method Active CN105371840B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510727853.XA CN105371840B (en) 2015-10-30 2015-10-30 An inertial/visual odometry/lidar integrated navigation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510727853.XA CN105371840B (en) 2015-10-30 2015-10-30 An inertial/visual odometry/lidar integrated navigation method

Publications (2)

Publication Number Publication Date
CN105371840A CN105371840A (en) 2016-03-02
CN105371840B true CN105371840B (en) 2019-03-22

Family

ID=55374248

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510727853.XA Active CN105371840B (en) 2015-10-30 2015-10-30 An inertial/visual odometry/lidar integrated navigation method

Country Status (1)

Country Link
CN (1) CN105371840B (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107402005A (en) * 2016-05-20 2017-11-28 北京自动化控制设备研究所 One kind is based on inertia/odometer/RFID high-precision integrated navigation method
CN107402012A (en) * 2016-05-20 2017-11-28 北京自动化控制设备研究所 A kind of Combinated navigation method of vehicle
CN106153074B (en) * 2016-06-20 2023-05-05 浙江大学 Optical calibration system and method for inertial measurement combined dynamic navigation performance
CN106803262A (en) * 2016-12-21 2017-06-06 上海交通大学 The method that car speed is independently resolved using binocular vision
CN109387198B (en) * 2017-08-03 2022-07-15 北京自动化控制设备研究所 Inertia/vision milemeter combined navigation method based on sequential detection
CN107588769B (en) * 2017-10-17 2020-01-03 北京航天发射技术研究所 Vehicle-mounted strapdown inertial navigation, odometer and altimeter integrated navigation method
CN107861501A (en) * 2017-10-22 2018-03-30 北京工业大学 Underground sewage treatment works intelligent robot automatic positioning navigation system
CN107990893B (en) * 2017-11-24 2020-07-24 南京航空航天大学 Detection method for sudden change of detection environment in two-dimensional laser radar SLAM
CN108151713A (en) * 2017-12-13 2018-06-12 南京航空航天大学 A kind of quick position and orientation estimation methods of monocular VO
CN108536163B (en) * 2018-03-15 2021-04-16 南京航空航天大学 Dynamic model/laser radar combined navigation method in single-sided structure environment
CN108519615B (en) * 2018-04-19 2021-11-26 河南科技学院 Mobile robot autonomous navigation method based on combined navigation and feature point matching
CN108731670B (en) * 2018-05-18 2021-06-22 南京航空航天大学 Inertial/visual odometer integrated navigation positioning method based on measurement model optimization
CN108562289B (en) * 2018-06-07 2021-11-26 南京航空航天大学 Laser radar navigation method for four-rotor aircraft in continuous multilateral geometric environment
CN109444911B (en) * 2018-10-18 2023-05-05 哈尔滨工程大学 Unmanned ship water surface target detection, identification and positioning method based on monocular camera and laser radar information fusion
CN110501024B (en) * 2019-04-11 2023-03-28 同济大学 Measurement error compensation method for vehicle-mounted INS/laser radar integrated navigation system
CN111829552B (en) * 2019-04-19 2023-01-06 北京魔门塔科技有限公司 Error correction method and device for visual inertial system
CN110017850B (en) * 2019-04-19 2021-04-20 小狗电器互联网科技(北京)股份有限公司 Gyroscope drift estimation method and device and positioning system
CN110849362B (en) * 2019-11-28 2022-01-04 湖南率为控制科技有限公司 Laser radar and vision combined navigation algorithm based on vehicle-mounted inertia
CN111121768B (en) * 2019-12-23 2021-10-29 深圳市优必选科技股份有限公司 Robot pose estimation method and device, readable storage medium and robot
CN111595333B (en) * 2020-04-26 2023-07-28 武汉理工大学 Modularized unmanned vehicle positioning method and system based on visual inertia laser data fusion
CN111521176A (en) * 2020-04-27 2020-08-11 北京工业大学 Visual auxiliary inertial navigation method fusing laser
CN111947652B (en) * 2020-08-13 2022-09-20 北京航空航天大学 Inertia/vision/astronomy/laser ranging combined navigation method suitable for lunar lander
CN112985416B (en) * 2021-04-19 2021-07-30 湖南大学 Robust positioning and mapping method and system based on laser and visual information fusion
CN114637302B (en) * 2022-04-15 2022-10-18 安徽农业大学 Automatic advancing obstacle avoidance method and system based on computer vision

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN103033189A (en) * 2012-12-26 2013-04-10 北京航空航天大学 Inertia/vision integrated navigation method for deep-space detection patrolling device
CN103162687A (en) * 2013-03-07 2013-06-19 中国人民解放军国防科学技术大学 Image/inertial navigation combination navigation method based on information credibility
CN103424112A (en) * 2013-07-29 2013-12-04 南京航空航天大学 Vision navigating method for movement carrier based on laser plane assistance
CN104833352A (en) * 2015-01-29 2015-08-12 西北工业大学 Multi-medium complex-environment high-precision vision/inertia combination navigation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9766074B2 (en) * 2008-03-28 2017-09-19 Regents Of The University Of Minnesota Vision-aided inertial navigation

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN103033189A (en) * 2012-12-26 2013-04-10 北京航空航天大学 Inertia/vision integrated navigation method for deep-space detection patrolling device
CN103162687A (en) * 2013-03-07 2013-06-19 中国人民解放军国防科学技术大学 Image/inertial navigation combination navigation method based on information credibility
CN103424112A (en) * 2013-07-29 2013-12-04 南京航空航天大学 Vision navigating method for movement carrier based on laser plane assistance
CN104833352A (en) * 2015-01-29 2015-08-12 西北工业大学 Multi-medium complex-environment high-precision vision/inertia combination navigation method

Also Published As

Publication number Publication date
CN105371840A (en) 2016-03-02

Similar Documents

Publication Publication Date Title
CN105371840B (en) An inertial/visual odometry/lidar integrated navigation method
CN110706279B (en) Global position and pose estimation method based on information fusion of global map and multiple sensors
EP3364153B1 (en) Method for updating all attitude angles of agricultural machine on the basis of nine-axis mems sensor
US10295365B2 (en) State estimation for aerial vehicles using multi-sensor fusion
CN109540126A (en) A kind of inertia visual combination air navigation aid based on optical flow method
CN106017463B (en) A kind of Aerial vehicle position method based on orientation sensing device
CN103411609B (en) A kind of aircraft return route planing method based on online composition
CN108362281B (en) Long-baseline underwater submarine matching navigation method and system
CN102538781B (en) Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN103697889B (en) A kind of unmanned plane independent navigation and localization method based on multi-model Distributed filtering
CN103674021A (en) Integrated navigation system and method based on SINS (Strapdown Inertial Navigation System) and star sensor
CN105698822B (en) Initial Alignment Method between autonomous type inertial navigation based on reversed Attitude Tracking is advanced
CN107909614B (en) Positioning method of inspection robot in GPS failure environment
Strydom et al. Visual odometry: autonomous uav navigation using optic flow and stereo
CN103175524B (en) A kind of position of aircraft without view-based access control model under marking environment and attitude determination method
CN106772524B (en) A kind of agricultural robot integrated navigation information fusion method based on order filtering
CN107796391A (en) A kind of strapdown inertial navigation system/visual odometry Combinated navigation method
CN113781582B (en) Synchronous positioning and map creation method based on laser radar and inertial navigation combined calibration
CN108871336A (en) A kind of vehicle location estimating system and method
CN108759823A (en) The positioning of low speed automatic driving vehicle and method for correcting error in particular link based on images match
CN106352897B (en) It is a kind of based on the silicon MEMS gyro estimation error of monocular vision sensor and bearing calibration
CN103438890B (en) Based on the planetary power descending branch air navigation aid of TDS and image measurement
CN107144278A (en) A kind of lander vision navigation method based on multi-source feature
CN109387198A (en) A kind of inertia based on sequential detection/visual odometry Combinated navigation method
CN106595637A (en) Visual navigation method for agricultural machine

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant