CN109916394A - Integrated navigation algorithm fusing optical flow position and velocity information - Google Patents

Integrated navigation algorithm fusing optical flow position and velocity information

Info

Publication number
CN109916394A
CN109916394A (application CN201910270669.5A)
Authority
CN
China
Prior art keywords
optical flow
carrier
formula
coordinate system
height
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910270669.5A
Other languages
Chinese (zh)
Inventor
李德辉
王冠林
唐宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Zhiyi Aviation Technology Co Ltd
Original Assignee
Shandong Zhiyi Aviation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Zhiyi Aviation Technology Co Ltd filed Critical Shandong Zhiyi Aviation Technology Co Ltd
Priority to CN201910270669.5A priority Critical patent/CN109916394A/en
Publication of CN109916394A publication Critical patent/CN109916394A/en
Pending legal-status Critical Current


Abstract

The present invention provides an integrated navigation algorithm that fuses optical flow position and velocity information. Using the position and velocity output of an optical flow sensor together with the data of a MEMS IMU, a magnetometer, a barometric altimeter and a laser ranging sensor, data fusion is completed with an extended Kalman filter to compute the position, velocity and attitude of the carrier. The algorithm takes the topocentric coordinate system as the navigation frame and resolves in real time the position of the carrier relative to its initial point. It enables navigation and positioning of the carrier under GNSS-denied conditions, provides accurate heading, attitude and velocity information for the carrier, and effectively slows the drift of the pure strapdown inertial positioning error.

Description

Integrated navigation algorithm fusing optical flow position and velocity information
Technical field
The present invention relates to the technical field of integrated navigation algorithms, and in particular to an integrated navigation algorithm that fuses optical flow position and velocity information.
Background art
Inertial navigation based on a low-precision MEMS IMU cannot by itself provide usable navigation data for a carrier, so GNSS data are commonly used to suppress the divergence of inertial navigation errors. GNSS, however, cannot provide positioning indoors or in environments such as urban canyons, under bridges and in tunnels. In these GNSS-denied environments a "pseudo-GNSS" is needed to assist inertial navigation, that is, a source that partially or fully substitutes for GNSS by measuring velocity or position, so that it can be fused with inertial navigation and other sensor data to realise navigation and positioning under GNSS denial.
Summary of the invention
The technical problem to be solved by the present invention is navigation and positioning in GNSS-denied environments. To that end, the present invention provides an integrated navigation algorithm that fuses optical flow position and velocity information: the algorithm fuses the data of an optical flow sensor, a MEMS IMU, a magnetometer, a barometric altimeter and a laser ranging sensor, completes the data fusion with an extended Kalman filter, and computes the position, velocity and attitude of the carrier in the topocentric coordinate system.
The technical solution adopted by the present invention to solve this problem is as follows:
The integrated navigation system according to the present invention comprises an optical flow sensor, a MEMS IMU (hereafter IMU), a magnetometer, a barometric altimeter and a laser ranging sensor mounted on the carrier. The camera coordinate system of the optical flow sensor, the IMU coordinate system and the magnetometer coordinate system coincide with the right-front-up carrier coordinate system; the optical flow sensor and the laser ranging sensor are mounted on the bottom of the carrier, with the measurement direction of the laser ranging sensor opposite to the carrier's up axis, wherein:
The IMU contains three orthogonal gyroscopes and three orthogonal accelerometers, measuring angular rate and acceleration (specific force) respectively; the magnetometer is a three-axis orthogonal magnetometer used to measure the magnetic field; the barometric altimeter measures barometric altitude; the optical flow sensor measures the pixel displacement between two valid image frames in the pixel coordinate system; the laser ranging sensor measures the one-dimensional distance between the sensor and the reflection point.
The coordinate systems involved in the present invention are the carrier coordinate system, the topocentric coordinate system, the east-north-up (ENU) geographic coordinate system, the navigation coordinate system, the geomagnetic ENU coordinate system, the camera coordinate system and the pixel coordinate system. The carrier coordinate system is the right-front-up frame of the carrier on which the navigation system is mounted; the sensors with their own frames (IMU, magnetometer, optical flow sensor) are installed so that their frames coincide with the carrier frame. The topocentric coordinate system is the ENU geographic frame whose origin is the starting point of navigation, and the position estimate of the integrated navigation is expressed in it. The origin of the ENU geographic frame is the carrier's centre of mass; the velocity estimate of the integrated navigation is expressed in this frame and is taken to be equal to its expression in the topocentric frame. In this algorithm the navigation coordinate system is the topocentric coordinate system. The geomagnetic ENU frame takes magnetic north as its north axis; rotating the ENU geographic frame about its up axis by the magnetic declination brings it into coincidence with the geomagnetic ENU frame. The camera coordinate system is the right-front-up frame of the camera. The origin of the pixel coordinate system is the upper-left corner of the image, its unit is the pixel, its u axis is aligned with the right axis of the camera frame and its v axis with the forward axis of the camera frame.
An integrated navigation algorithm fusing optical flow position and velocity information comprises the following steps:
S1: The navigation computer reads the data of the optical flow sensor, IMU, magnetometer, barometric altimeter and laser ranging sensor mounted on the carrier. From the optical flow sensor it reads the pixel displacement between two valid image frames along the u and v axes of the pixel coordinate system; from the IMU, angular rate and acceleration (specific force); from the magnetometer, the magnetic field strength; from the barometric altimeter, the barometric altitude; and from the laser ranging sensor, the laser ranging height.
S2: A variable-weight method fuses the barometric altitude and the laser ranging height obtained in step S1 into a single fused height, which serves as the height measurement of the extended Kalman filter (EKF) data fusion.
S3: The navigation algorithm checks the optical flow update flag to decide whether new optical flow data are available. If the optical flow data have not been updated, go to step S4; if they have, go to step S5.
S4: When there is no optical flow update, the navigation algorithm performs pure strapdown inertial recursion and computes the position, velocity and attitude of the carrier.
S5: When there is an optical flow update, the two-dimensional pixel displacement output by the optical flow sensor is compensated for angular motion to obtain the pixel displacement caused by translational motion; the camera resolution and the physical distance between the camera and the imaged plane are then used to convert this pixel displacement into right and forward displacements, in metres, in the camera coordinate system, and the right and forward velocities in the camera frame are obtained from the interval between two valid optical flow outputs. Because the installation guarantees that the optical flow sensor frame coincides with the carrier frame, the right and forward displacement and velocity in the camera frame are also the right and forward displacement and velocity of the carrier in the carrier frame.
S6: Using the result of step S5, the extended Kalman filter fuses the carrier-frame position (displacement) and velocity derived from the optical flow data with the IMU data, the magnetometer data and the fused height, and computes the position, velocity and attitude of the carrier. A sketch of this overall loop is given after these steps.
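For illustration only, the loop of steps S1–S6 can be sketched in Python as follows. The sensor-reading methods and the helper callables (fuse_height, flow_to_body_motion, strapdown_propagate, ekf_update) are hypothetical names standing in for the processing detailed in the sections below; this is a sketch, not the claimed implementation.

```python
def navigation_step(sensors, nav_state, ekf,
                    fuse_height, flow_to_body_motion,
                    strapdown_propagate, ekf_update):
    """One pass of the S1-S6 loop; the processing steps are passed in as callables."""
    # S1: read all sensors mounted on the carrier
    imu = sensors.read_imu()                  # angular rate + specific force
    mag = sensors.read_magnetometer()         # magnetic field
    baro = sensors.read_barometer()           # barometric altitude
    laser = sensors.read_laser_rangefinder()  # slant range to the ground
    flow = sensors.read_optical_flow()        # None when no new frame pair (S3 flag)

    # S2: variable-weight barometric/laser height fusion (formulas (1)-(3))
    h_fused = fuse_height(baro, laser, nav_state)

    if flow is None:
        # S4: no optical flow update -> pure strapdown inertial recursion
        return strapdown_propagate(nav_state, imu)

    # S5: angular-motion compensation and metric scaling of the flow (formulas (4)-(6))
    dp_body, v_body = flow_to_body_motion(flow, nav_state, h_fused)
    # S6: EKF fusion of flow displacement/velocity, magnetometer and fused height
    return ekf_update(ekf, nav_state, imu, mag, h_fused, dp_body, v_body)
```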
In step S2, the barometric altitude and the laser ranging height are combined into a single fused height, which is used as the height measurement of the extended Kalman filter (EKF) data fusion. The fusion proceeds as follows:
(1) Determine the initial height H_Baro0.
At the initial navigation instant, the barometric altitude output by the barometric altimeter is recorded as the initial barometric altitude H_Baro_T0, and the laser ranging height output by the laser ranging sensor is recorded as the initial laser ranging height H_Laser_T0. The initial height H_Baro0 used for computing the barometric altitude change is then obtained, ignoring the non-verticality of the laser ranging height caused by the carrier's horizontal attitude angles at the initial instant:
H_Baro0 = H_Baro_T0 - H_Laser_T0    (1)
(2) Determine the vertical laser ranging height H_Laser_vertical.
Height in the navigation algorithm means vertical height, but because the laser ranging sensor is mounted with its measurement direction parallel to the carrier Z axis, it measures a slanted distance whenever the carrier's horizontal attitude angles are not zero. The laser ranging height H_Laser output by the sensor is therefore converted to a vertical height H_Laser_vertical using the carrier's horizontal attitude angles:
H_Laser_vertical = H_Laser * cos θ * cos γ    (2)
where θ is the carrier pitch angle in radians and γ is the carrier roll angle in radians.
(3) Compute the fused height H.
With the barometric altitude H_Baro output by the barometric altimeter and the results of formulas (1) and (2), the fused height H used as the height measurement of the navigation data fusion is:
H = H_Laser_vertical * W + (1 - W) * (H_Baro - H_Baro0)    (3)
In formula (3), W is a weight coefficient in the range 0 to 1. Laser ranging sensors usually output a health parameter in the range 0 to 1: when the health parameter is low, W is set to 0, and the higher the health parameter, the larger W. The health parameter of the laser ranging sensor is determined jointly from the noise variance of the output range values and from a count of the range values that exceed the sensor's measurement range.
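For illustration only, a minimal Python sketch of the variable-weight height fusion of formulas (1)–(3); the function and argument names are hypothetical, and the mapping from the sensor health parameter to the weight W is left to the caller because it is sensor-specific.

```python
import math

def initial_height(h_baro_t0, h_laser_t0):
    # Formula (1): reference height for computing the barometric altitude change
    return h_baro_t0 - h_laser_t0

def fuse_height(h_baro, h_laser, pitch_rad, roll_rad, w, h_baro0):
    """Variable-weight fusion of barometric and laser heights.

    w in [0, 1] is derived from the laser rangefinder health parameter
    (0 when the health parameter is low, larger as it improves).
    """
    # Formula (2): project the slanted laser range onto the vertical
    h_laser_vertical = h_laser * math.cos(pitch_rad) * math.cos(roll_rad)
    # Formula (3): weighted combination used as the EKF height measurement
    return h_laser_vertical * w + (1.0 - w) * (h_baro - h_baro0)
```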
Further, in step S5 the optical flow sensor data, i.e. the pixel displacements along the u and v axes of the pixel coordinate system, are compensated for angular motion and converted to physical scale as follows:
In this algorithm the optical flow sensor is mounted on the bottom of the carrier; the camera coordinate system of the optical flow sensor, the IMU coordinate system and the magnetometer coordinate system are installed so as to coincide with the right-front-up carrier coordinate system, the u axis of the pixel coordinate system is parallel to the right axis of the carrier frame, and the v axis is parallel to the forward axis. The optical flow sensor directly outputs the u-axis pixel displacement OpFlowX and the v-axis pixel displacement OpFlowY in the pixel coordinate system. Both contain a component produced by translational motion and a component produced by angular motion; to extract the translational motion, the pixel displacement produced by the carrier's angular motion must be compensated:
In formula (4), OpFlowX_transmotion and OpFlowY_transmotion are the pixel displacements produced by translational motion; γ_last and θ_last are the carrier roll and pitch angles, in radians, at the previous valid optical flow output; K is the angular-motion compensation parameter of the optical flow sensor. K is a factory parameter given in the optical flow sensor handbook, and it can also be obtained experimentally by having the sensor perform pure angular motion at a fixed height.
After the translational pixel displacement of formula (4) is obtained, it is converted into a displacement, in metres, in the camera coordinate system:
In formula (5), OpFlowP_x and OpFlowP_y are the right and forward displacements, in metres, of the optical flow sensor in the camera coordinate system between two valid optical flow outputs; because the camera frame coincides with the carrier frame, they are also the displacements of the optical flow sensor in the carrier frame. Resolution is the resolution of the optical flow sensor, which can be taken from the sensor handbook or calibrated, for example against GPS velocity during translational motion, against the velocity obtained by integrating the accelerometer output, or by reciprocating linear motion over a fixed distance. P_U is the height obtained from the integrated navigation data fusion.
From the metre-scaled displacement of formula (5), the velocity of the optical flow sensor in the carrier frame, in metres per second, is computed:
In formula (6), OpFlowV_x and OpFlowV_y are the right and forward velocities of the optical flow sensor in the carrier coordinate system; t_now is the output time, in seconds, of the current optical flow frame and t_last is the output time, in seconds, of the previous optical flow frame.
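Formulas (4) and (5) themselves are not reproduced above, so the Python sketch below only assumes a linear attitude-change compensation with gain K and a height-over-resolution scale factor, which is one common parameterisation of such sensors; formula (6) is the displacement divided by the frame interval. All names are hypothetical.

```python
def flow_to_body_motion(op_x, op_y, roll, pitch, roll_last, pitch_last,
                        k, resolution, p_u, t_now, t_last):
    """Angular-motion compensation and metric scaling of raw optical flow.

    op_x, op_y: raw pixel displacements OpFlowX, OpFlowY between two valid frames;
    roll/pitch(_last): attitude in radians at the current and previous flow outputs;
    k: angular-motion compensation parameter; resolution: sensor resolution;
    p_u: fused height above the imaged plane; t_now, t_last in seconds.
    """
    # Assumed form of formula (4): subtract the pixel shift due to attitude change
    dx_px = op_x - k * (roll - roll_last)
    dy_px = op_y - k * (pitch - pitch_last)

    # Assumed form of formula (5): pixels -> metres via height and resolution
    op_flow_p_x = dx_px * p_u / resolution
    op_flow_p_y = dy_px * p_u / resolution

    # Formula (6): right/forward velocity in the carrier frame
    dt = t_now - t_last
    return (op_flow_p_x, op_flow_p_y), (op_flow_p_x / dt, op_flow_p_y / dt)
```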
Further, when the navigation data fusion of step S6 processes the magnetometer data, the normalised geomagnetic vector [m_E m_N m_U]^T in the local east-north-up (ENU) geographic coordinate system is computed as follows:
The normalised raw magnetometer output [m_x m_y m_z]^T in the carrier coordinate system is transformed into the geographic coordinate system:
where [m_E1 m_N1 m_U1]^T is the magnetic vector in the geographic frame obtained directly through the attitude matrix, and the attitude matrix from the carrier coordinate system to the navigation coordinate system is formed from the quaternion.
Using the geographic-frame magnetic vector [m_E1 m_N1 m_U1]^T of formula (7), the geomagnetic vector [0 m_N2 m_U2]^T in the geomagnetic ENU coordinate system is reconstructed:
The geomagnetic vector [0 m_N2 m_U2]^T in the geomagnetic ENU frame of formula (8) is then converted, by compensating the magnetic declination, into the geomagnetic vector [m_E m_N m_U]^T in the geographic coordinate system:
where Mag_dec is the magnetic declination in radians, obtained by look-up from latitude and longitude.
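Formulas (7)–(9) are not reproduced above; the Python sketch below assumes the usual reconstruction in which the horizontal magnitude of the rotated magnetometer vector is assigned to the magnetic-north axis and then rotated by the declination (an east-positive declination sign convention is assumed).

```python
import numpy as np

def reference_mag_enu(mag_body_unit, c_body_to_nav, mag_dec):
    """Normalised reference geomagnetic vector in the local ENU frame.

    mag_body_unit: normalised magnetometer output in the carrier frame;
    c_body_to_nav: carrier-to-navigation attitude matrix formed from the quaternion;
    mag_dec: magnetic declination in radians (looked up from latitude/longitude).
    """
    # Formula (7): rotate the body-frame measurement into the geographic frame
    m_e1, m_n1, m_u1 = c_body_to_nav @ np.asarray(mag_body_unit)

    # Formula (8): reconstruct the geomagnetic-ENU vector [0, m_n2, m_u2]
    m_n2 = np.hypot(m_e1, m_n1)
    m_u2 = m_u1

    # Formula (9): compensate the declination to return to the geographic frame
    return np.array([m_n2 * np.sin(mag_dec), m_n2 * np.cos(mag_dec), m_u2])
```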
Based on the topocentric coordinate system, the integrated navigation algorithm that fuses the optical flow sensor position and velocity with the EKF has the state equation:
where the state vector comprises the 3-dimensional position, 3-dimensional velocity, 4-dimensional attitude quaternion, 3-dimensional gyroscope bias and 3-dimensional accelerometer bias, 16 dimensions in total, and the system noise comprises the 3-dimensional gyroscope white noise and the 3-dimensional accelerometer white noise, 6 dimensions in total. The state differential equation (10) is written with the quaternion multiplication matrix, the acceleration output by the IMU accelerometers and the angular rate output by the IMU gyroscopes, and the one-step state prediction is obtained by solving it.
The measurement equation of the integrated navigation algorithm is:
where the measurement, the measurement prediction and the measurement noise appear in formula (11).
In formula (11), the measurement noise comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional measurement noise of the normalised magnetometer output and the 1-dimensional fused height measurement noise.
The measurement in formula (11) is 8-dimensional and can be written as formula (12):
In formula (12), the first two components are the carrier-frame right and forward displacements OpFlowP_x and OpFlowP_y output by the optical flow processing, the next two are the carrier-frame right and forward velocities OpFlowV_x and OpFlowV_y, the following three are the normalised magnetometer measurement, and H is the 1-dimensional fused height of the barometric altitude and the laser ranging height.
In formula (11), the nonlinear function relating the measurement to the state can be expressed as formula (13):
The optical-flow-related terms of formula (13) are computed by formulas (14) and (15):
In formula (14), the ENU position in the state is transformed into the right and forward displacement of the carrier frame, i.e. P_x and P_y of the carrier-frame displacement [P_x P_y P_z]^T; in formula (15), the ENU velocity in the state is transformed into the right and forward velocity of the carrier frame, i.e. V_x and V_y of the carrier-frame velocity [V_x V_y V_z]^T. P_U is the height in the state. The transformation also uses the ENU position estimate at the last valid optical flow fusion and the normalised geomagnetic vector in the local ENU geographic frame. Substituting the one-step state prediction into formula (13) yields the measurement prediction.
As noted, the measurement noise of formula (11) comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional normalised magnetometer measurement noise and the 1-dimensional fused height measurement noise.
The state transition matrix Φ, the system noise driving matrix Γ and the measurement matrix H are obtained by computing the Jacobians of formulas (10) and (13).
The state transition matrix Φ is computed as:
The system noise driving matrix Γ is computed as:
In formulas (16) and (17), T is the navigation period and I is the identity matrix.
The measurement matrix H is computed as:
The one-step prediction of the state is obtained by solving the differential equation:
In formula (19), the starting value is the state estimate of the previous navigation period.
The extended Kalman filter then completes the data fusion and computes the position, velocity and attitude of the carrier.
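Since the concrete expressions of formulas (10)–(19) are not reproduced above, the following Python sketch shows only the structure of the EKF cycle they describe (16-dimensional state, 8-dimensional measurement); the state model f, the measurement model h and the Jacobian-based matrices Φ, Γ and H must be supplied by the caller and are not implemented here.

```python
import numpy as np

class NavigationEKF:
    """Structural sketch of the EKF of formulas (10)-(19); the models are supplied by the caller."""

    def __init__(self, x0, p0, q_noise, r_noise):
        self.x = x0          # 16-state: position, velocity, quaternion, gyro/accel bias
        self.P = p0          # state covariance
        self.Q = q_noise     # system noise covariance (gyro + accel white noise)
        self.R = r_noise     # measurement noise covariance (flow, magnetometer, height)

    def predict(self, f, phi, gamma, dt):
        # Formula (19): one-step state prediction by integrating the state equation (10)
        self.x = self.x + f(self.x) * dt
        # Covariance propagation with the Jacobian-based matrices of (16) and (17)
        self.P = phi @ self.P @ phi.T + gamma @ self.Q @ gamma.T

    def update(self, z, h, h_jac):
        # Innovation: 8-dimensional measurement (12) minus the prediction from (13)
        innovation = z - h(self.x)
        s = h_jac @ self.P @ h_jac.T + self.R
        k_gain = self.P @ h_jac.T @ np.linalg.inv(s)
        self.x = self.x + k_gain @ innovation
        self.P = (np.eye(self.P.shape[0]) - k_gain @ h_jac) @ self.P
        return self.x
```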
The beneficial effects of the present invention are as follows. The integrated navigation algorithm fusing optical flow position and velocity information provided by the invention uses the position and velocity output of the optical flow sensor and the data of the MEMS IMU, magnetometer, barometric altimeter and laser ranging sensor, completes the data fusion with an extended Kalman filter, and computes the position, velocity and attitude of the carrier. The algorithm takes the topocentric coordinate system as the navigation frame and resolves in real time the position of the carrier relative to its initial point. It enables navigation and positioning of the carrier under GNSS-denied conditions, provides accurate heading, attitude and velocity measurements for the carrier, and effectively slows the drift of the pure strapdown inertial positioning error.
Brief description of the drawings
The present invention will be further explained below with reference to the accompanying drawings and embodiments.
Fig. 1 is a functional block diagram of the algorithm of the invention.
Fig. 2 is a flow chart of the algorithm of the invention.
Specific embodiments
The present invention is now described in detail with reference to the accompanying drawings. The figures are simplified schematic diagrams that illustrate the basic structure of the invention and therefore show only the components relevant to the invention.
As shown in Fig. 1, the integrated navigation system according to the present invention comprises an optical flow sensor, a MEMS IMU (hereafter IMU), a magnetometer, a barometric altimeter and a laser ranging sensor mounted on the carrier; the camera coordinate system of the optical flow sensor, the IMU coordinate system and the magnetometer coordinate system coincide with the right-front-up carrier coordinate system, wherein:
The IMU contains three orthogonal gyroscopes and three orthogonal accelerometers, measuring angular rate and acceleration (specific force) respectively; the magnetometer is a three-axis orthogonal magnetometer used to measure the magnetic field; the barometric altimeter measures barometric altitude; the optical flow sensor measures the pixel displacement between two valid image frames in the pixel coordinate system; the laser ranging sensor measures the one-dimensional distance between the sensor and the reflection point.
As shown in Fig. 2, an integrated navigation algorithm fusing optical flow position and velocity information comprises the following steps:
S1: The navigation computer reads the data of the optical flow sensor, IMU, magnetometer, barometric altimeter and laser ranging sensor mounted on the carrier. From the optical flow sensor it reads the pixel displacement between two valid image frames along the u and v axes of the pixel coordinate system; from the IMU, angular rate and acceleration (specific force); from the magnetometer, the magnetic field strength; from the barometric altimeter, the barometric altitude; and from the laser ranging sensor, the laser ranging height.
S2: A variable-weight method fuses the barometric altitude and the laser ranging height obtained in step S1 into a single fused height, which serves as the height measurement of the extended Kalman filter (EKF) data fusion.
S3: The navigation algorithm checks the optical flow update flag to decide whether new optical flow data are available. If the optical flow data have not been updated, go to step S4; if they have, go to step S5.
S4: When there is no optical flow update, the navigation algorithm performs pure strapdown inertial recursion and computes the position, velocity and attitude of the carrier.
S5: When there is an optical flow update, the two-dimensional pixel displacement output by the optical flow sensor is compensated for angular motion to obtain the pixel displacement caused by translational motion; the camera resolution and the physical distance between the camera and the imaged plane are then used to convert this pixel displacement into right and forward displacements, in metres, in the camera coordinate system, and the right and forward velocities in the camera frame are obtained from the interval between two valid optical flow outputs. Because the installation guarantees that the optical flow sensor frame coincides with the carrier frame, the right and forward displacement and velocity in the camera frame are also the right and forward displacement and velocity of the carrier in the carrier frame.
S6: Using the result of step S5, the extended Kalman filter fuses the carrier-frame position (displacement) and velocity derived from the optical flow data with the IMU data, the magnetometer data and the fused height, and computes the position, velocity and attitude of the carrier.
In step S2, the barometric altitude and the laser ranging height are combined into a single fused height, which is used as the height measurement of the extended Kalman filter (EKF) data fusion. The fusion proceeds as follows:
(1) Determine the initial height H_Baro0.
At the initial navigation instant, the barometric altitude output by the barometric altimeter is recorded as the initial barometric altitude H_Baro_T0, and the laser ranging height output by the laser ranging sensor is recorded as the initial laser ranging height H_Laser_T0. The initial height H_Baro0 used for computing the barometric altitude change is then obtained, ignoring the non-verticality of the laser ranging height caused by the carrier's horizontal attitude angles at the initial instant:
H_Baro0 = H_Baro_T0 - H_Laser_T0    (1)
(2) Determine the vertical laser ranging height H_Laser_vertical.
Height in the navigation algorithm means vertical height, but because the laser ranging sensor is mounted with its measurement direction parallel to the carrier Z axis, it measures a slanted distance whenever the carrier's horizontal attitude angles are not zero. The laser ranging height H_Laser output by the sensor is therefore converted to a vertical height H_Laser_vertical using the carrier's horizontal attitude angles:
H_Laser_vertical = H_Laser * cos θ * cos γ    (2)
where θ is the carrier pitch angle in radians and γ is the carrier roll angle in radians.
(3) Compute the fused height H.
With the barometric altitude H_Baro output by the barometric altimeter and the results of formulas (1) and (2), the fused height H used as the height measurement of the navigation data fusion is:
H = H_Laser_vertical * W + (1 - W) * (H_Baro - H_Baro0)    (3)
In formula (3), W is a weight coefficient in the range 0 to 1. Laser ranging sensors usually output a health parameter in the range 0 to 1: when the health parameter is low, W is set to 0, and the higher the health parameter, the larger W. The health parameter of the laser ranging sensor is determined jointly from the noise variance of the output range values and from a count of the range values that exceed the sensor's measurement range.
Further, in step S5 the optical flow sensor data, i.e. the pixel displacements along the u and v axes of the pixel coordinate system, are compensated for angular motion and converted to physical scale as follows:
In this algorithm the optical flow sensor is mounted on the bottom of the carrier; the camera coordinate system of the optical flow sensor, the IMU coordinate system and the magnetometer coordinate system are installed so as to coincide with the right-front-up carrier coordinate system, the u axis of the pixel coordinate system is parallel to the right axis of the carrier frame, and the v axis is parallel to the forward axis. The optical flow sensor directly outputs the u-axis pixel displacement OpFlowX and the v-axis pixel displacement OpFlowY in the pixel coordinate system. Both contain a component produced by translational motion and a component produced by angular motion; to extract the translational motion, the pixel displacement produced by the carrier's angular motion must be compensated:
In formula (4), OpFlowX_transmotion and OpFlowY_transmotion are the pixel displacements produced by translational motion; γ_last and θ_last are the carrier roll and pitch angles, in radians, at the previous valid optical flow output; K is the angular-motion compensation parameter of the optical flow sensor. K is a factory parameter given in the optical flow sensor handbook, and it can also be obtained experimentally by having the sensor perform pure angular motion at a fixed height.
After the translational pixel displacement of formula (4) is obtained, it is converted into a displacement, in metres, in the camera coordinate system:
In formula (5), OpFlowP_x and OpFlowP_y are the right and forward displacements, in metres, of the optical flow sensor in the camera coordinate system between two valid optical flow outputs; because the camera frame coincides with the carrier frame, they are also the displacements of the optical flow sensor in the carrier frame. Resolution is the resolution of the optical flow sensor, which can be taken from the sensor handbook or calibrated, for example against GPS velocity during translational motion, against the velocity obtained by integrating the accelerometer output, or by reciprocating linear motion over a fixed distance. P_U is the height obtained from the integrated navigation data fusion.
From the metre-scaled displacement of formula (5), the velocity of the optical flow sensor in the carrier frame, in metres per second, is computed:
In formula (6), OpFlowV_x and OpFlowV_y are the right and forward velocities of the optical flow sensor in the carrier coordinate system; t_now is the output time, in seconds, of the current optical flow frame and t_last is the output time, in seconds, of the previous optical flow frame.
Further, when the navigation data fusion of step S6 processes the magnetometer data, the normalised geomagnetic vector [m_E m_N m_U]^T in the local east-north-up (ENU) geographic coordinate system is computed as follows:
The normalised raw magnetometer output [m_x m_y m_z]^T in the carrier coordinate system is transformed into the geographic coordinate system:
where [m_E1 m_N1 m_U1]^T is the magnetic vector in the geographic frame obtained directly through the attitude matrix, and the attitude matrix from the carrier coordinate system to the navigation coordinate system is formed from the quaternion.
Using the geographic-frame magnetic vector [m_E1 m_N1 m_U1]^T of formula (7), the geomagnetic vector [0 m_N2 m_U2]^T in the geomagnetic ENU coordinate system is reconstructed:
The geomagnetic vector [0 m_N2 m_U2]^T in the geomagnetic ENU frame of formula (8) is then converted, by compensating the magnetic declination, into the geomagnetic vector [m_E m_N m_U]^T in the geographic coordinate system:
where Mag_dec is the magnetic declination in radians, obtained by look-up from latitude and longitude.
Based on the topocentric coordinate system, the integrated navigation algorithm that fuses the optical flow sensor position and velocity with the EKF has the state equation:
where the state vector comprises the 3-dimensional position, 3-dimensional velocity, 4-dimensional attitude quaternion, 3-dimensional gyroscope bias and 3-dimensional accelerometer bias, 16 dimensions in total, and the system noise comprises the 3-dimensional gyroscope white noise and the 3-dimensional accelerometer white noise, 6 dimensions in total. The state differential equation (10) is written with the quaternion multiplication matrix, the acceleration output by the IMU accelerometers and the angular rate output by the IMU gyroscopes, and the one-step state prediction is obtained by solving it.
The measurement equation of the integrated navigation algorithm is:
where the measurement, the measurement prediction and the measurement noise appear in formula (11).
In formula (11), the measurement noise comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional measurement noise of the normalised magnetometer output and the 1-dimensional fused height measurement noise.
The measurement in formula (11) is 8-dimensional and can be written as formula (12):
In formula (12), the first two components are the carrier-frame right and forward displacements OpFlowP_x and OpFlowP_y output by the optical flow processing, the next two are the carrier-frame right and forward velocities OpFlowV_x and OpFlowV_y, the following three are the normalised magnetometer measurement, and H is the 1-dimensional fused height of the barometric altitude and the laser ranging height.
In formula (11), the nonlinear function relating the measurement to the state can be expressed as formula (13):
The optical-flow-related terms of formula (13) are computed by formulas (14) and (15):
In formula (14), the ENU position in the state is transformed into the right and forward displacement of the carrier frame, i.e. P_x and P_y of the carrier-frame displacement [P_x P_y P_z]^T; in formula (15), the ENU velocity in the state is transformed into the right and forward velocity of the carrier frame, i.e. V_x and V_y of the carrier-frame velocity [V_x V_y V_z]^T. P_U is the height in the state. The transformation also uses the ENU position estimate at the last valid optical flow fusion and the normalised geomagnetic vector in the local ENU geographic frame. Substituting the one-step state prediction into formula (13) yields the measurement prediction.
As noted, the measurement noise of formula (11) comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional normalised magnetometer measurement noise and the 1-dimensional fused height measurement noise.
The state transition matrix Φ, the system noise driving matrix Γ and the measurement matrix H are obtained by computing the Jacobians of formulas (10) and (13).
The state transition matrix Φ is computed as:
The system noise driving matrix Γ is computed as:
In formulas (16) and (17), T is the navigation period and I is the identity matrix.
The measurement matrix H is computed as:
The one-step prediction of the state is obtained by solving the differential equation:
In formula (19), the starting value is the state estimate of the previous navigation period.
The extended Kalman filter then completes the data fusion and computes the position, velocity and attitude of the carrier.
Taking the above preferred embodiment of the present invention as a starting point, and in light of the above description, those skilled in the art can make various changes and modifications without departing from the scope of the present invention. The technical scope of this invention is not limited to the content of the specification and must be determined according to the scope of the claims.

Claims (5)

1. An integrated navigation algorithm fusing optical flow position and velocity information, characterised in that the navigation system comprises an optical flow sensor, an IMU, a magnetometer, a barometric altimeter and a laser ranging sensor, and the algorithm comprises the following steps:
S1: the navigation computer reads the data of the optical flow sensor, IMU, magnetometer, barometric altimeter and laser ranging sensor mounted on the carrier, wherein it reads from the optical flow sensor the pixel displacement between two valid image frames along the u and v axes of the pixel coordinate system, from the IMU the angular rate and acceleration data, from the magnetometer the magnetic field strength data, from the barometric altimeter the barometric altitude data, and from the laser ranging sensor the laser ranging height data;
S2: a variable-weight method is used to fuse the barometric altitude and the laser ranging height in the data obtained in step S1, a fused height is computed, and the fused height is used as the height measurement of the extended Kalman filter data fusion;
S3: the navigation algorithm checks the optical flow update flag to judge whether the optical flow sensor data have been updated; if not, go to step S4; if so, go to step S5;
S4: when the optical flow sensor data have not been updated, the navigation algorithm performs pure strapdown inertial recursion and computes the position, velocity and attitude of the carrier;
S5: when the optical flow sensor data have been updated, the two-dimensional pixel displacement output by the optical flow sensor is compensated for angular motion to obtain the pixel displacement caused by translational motion, the camera resolution and the physical distance between the camera and the imaged plane are used to convert this pixel displacement into right and forward displacements, in metres, in the camera coordinate system, and the right and forward velocities in the camera coordinate system are computed from the interval between two valid optical flow outputs;
S6: according to the result of step S5, the extended Kalman filter fuses the carrier-frame position and velocity derived from the optical flow sensor data with the IMU data, the magnetometer data and the fused height, and computes the position, velocity and attitude of the carrier.
2. The integrated navigation algorithm fusing optical flow position and velocity information according to claim 1, characterised in that the barometric altitude and the laser ranging height are fused in step S2 to compute the fused height as follows:
(1) Determine the initial height H_Baro0.
At the initial navigation instant, the barometric altitude output by the barometric altimeter is recorded as the initial barometric altitude H_Baro_T0 and the laser ranging height output by the laser ranging sensor is recorded as the initial laser ranging height H_Laser_T0; the initial height H_Baro0 used for computing the barometric altitude change is then obtained, ignoring the non-verticality of the laser ranging height caused by the carrier's horizontal attitude angles at the initial instant:
H_Baro0 = H_Baro_T0 - H_Laser_T0    (1)
(2) Determine the vertical laser ranging height H_Laser_vertical.
Height in the navigation algorithm means vertical height, but because the laser ranging sensor is mounted with its measurement direction parallel to the carrier Z axis, it measures a slanted distance whenever the carrier's horizontal attitude angles are not zero; the laser ranging height H_Laser output by the sensor is therefore converted to a vertical height H_Laser_vertical using the carrier's horizontal attitude angles:
H_Laser_vertical = H_Laser * cos θ * cos γ    (2)
where θ is the carrier pitch angle in radians and γ is the carrier roll angle in radians;
(3) Compute the fused height H.
With the barometric altitude H_Baro output by the barometric altimeter and the results of formulas (1) and (2), the fused height H used as the height measurement of the navigation data fusion is:
H = H_Laser_vertical * W + (1 - W) * (H_Baro - H_Baro0)    (3)
In formula (3), W is a weight coefficient in the range 0 to 1; laser ranging sensors usually output a health parameter in the range 0 to 1: when the health parameter is low, W is set to 0, and the higher the health parameter, the larger W; the health parameter of the laser ranging sensor is determined jointly from the noise variance of the output range values and from a count of the range values that exceed the sensor's measurement range.
3. The integrated navigation algorithm fusing optical flow position and velocity information according to claim 2, characterised in that the optical flow sensor data in step S5, i.e. the pixel displacements along the u and v axes of the pixel coordinate system, are compensated for angular motion and converted to physical scale as follows:
In this algorithm the optical flow sensor is mounted on the bottom of the carrier; the camera coordinate system of the optical flow sensor, the IMU coordinate system and the magnetometer coordinate system are installed so as to coincide with the right-front-up carrier coordinate system, the u axis of the pixel coordinate system is parallel to the right axis of the carrier frame, and the v axis is parallel to the forward axis; the optical flow sensor directly outputs the u-axis pixel displacement OpFlowX and the v-axis pixel displacement OpFlowY in the pixel coordinate system, both of which contain a component produced by translational motion and a component produced by angular motion, so that, to extract the translational motion, the pixel displacement produced by the carrier's angular motion must be compensated:
In formula (4), OpFlowX_transmotion and OpFlowY_transmotion are the pixel displacements produced by translational motion; γ_last and θ_last are the carrier roll and pitch angles, in radians, at the previous valid optical flow output; K is the angular-motion compensation parameter of the optical flow sensor;
After the translational pixel displacement of formula (4) is obtained, it is converted into a displacement, in metres, in the camera coordinate system:
In formula (5), OpFlowP_x and OpFlowP_y are the right and forward displacements, in metres, of the optical flow sensor in the camera coordinate system between two valid optical flow outputs; Resolution is the resolution of the optical flow sensor; P_U is the height obtained from the integrated navigation data fusion;
From the metre-scaled displacement of formula (5), the velocity of the optical flow sensor in the carrier frame, in metres per second, is computed:
In formula (6), OpFlowV_x and OpFlowV_y are the right and forward velocities of the optical flow sensor in the carrier coordinate system; t_now is the output time, in seconds, of the current optical flow frame and t_last is the output time, in seconds, of the previous optical flow frame.
4. The integrated navigation algorithm fusing optical flow position and velocity information according to claim 3, characterised in that in step S6, when the navigation data fusion processes the magnetometer data, the normalised geomagnetic vector [m_E m_N m_U]^T in the local east-north-up (ENU) geographic coordinate system is computed as follows:
The normalised raw magnetometer output [m_x m_y m_z]^T in the carrier coordinate system is transformed into the geographic coordinate system:
where [m_E1 m_N1 m_U1]^T is the magnetic vector in the geographic frame obtained directly through the attitude matrix, and the attitude matrix from the carrier coordinate system to the navigation coordinate system is formed from the quaternion;
Using the geographic-frame magnetic vector [m_E1 m_N1 m_U1]^T of formula (7), the geomagnetic vector [0 m_N2 m_U2]^T in the geomagnetic ENU coordinate system is reconstructed:
The geomagnetic vector [0 m_N2 m_U2]^T in the geomagnetic ENU frame of formula (8) is then converted, by compensating the magnetic declination, into the geomagnetic vector [m_E m_N m_U]^T in the geographic coordinate system:
where Mag_dec is the magnetic declination in radians, obtained by look-up from latitude and longitude.
5. The integrated navigation algorithm fusing optical flow position and velocity information according to claim 4, characterised in that, based on the topocentric coordinate system, the integrated navigation algorithm fusing the optical flow sensor position and velocity with the EKF has the state equation:
where the state vector comprises the 3-dimensional position, 3-dimensional velocity, 4-dimensional attitude quaternion, 3-dimensional gyroscope bias and 3-dimensional accelerometer bias, 16 dimensions in total, and the system noise comprises the 3-dimensional gyroscope white noise and the 3-dimensional accelerometer white noise, 6 dimensions in total; the state differential equation (10) is written with the quaternion multiplication matrix, the acceleration output by the IMU accelerometers and the angular rate output by the IMU gyroscopes, and the one-step state prediction is obtained by solving it;
The measurement equation of the integrated navigation algorithm is:
where the measurement, the measurement prediction and the measurement noise appear in formula (11);
In formula (11), the measurement noise comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional measurement noise of the normalised magnetometer output and the 1-dimensional fused height measurement noise;
The measurement in formula (11) is 8-dimensional and can be written as formula (12):
In formula (12), the first two components are the carrier-frame right and forward displacements OpFlowP_x and OpFlowP_y output by the optical flow processing, the next two are the carrier-frame right and forward velocities OpFlowV_x and OpFlowV_y, the following three are the normalised magnetometer measurement, and H is the 1-dimensional fused height of the barometric altitude and the laser ranging height;
In formula (11), the nonlinear function relating the measurement to the state can be expressed as formula (13):
The optical-flow-related terms of formula (13) are computed by formulas (14) and (15):
In formula (14), the ENU position in the state is transformed into the right and forward displacement of the carrier frame, i.e. P_x and P_y of the carrier-frame displacement [P_x P_y P_z]^T; in formula (15), the ENU velocity in the state is transformed into the right and forward velocity of the carrier frame, i.e. V_x and V_y of the carrier-frame velocity [V_x V_y V_z]^T; P_U is the height in the state; the transformation also uses the ENU position estimate at the last valid optical flow fusion and the normalised geomagnetic vector in the local ENU geographic frame; substituting the one-step state prediction into formula (13) yields the measurement prediction;
As noted, the measurement noise of formula (11) comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional normalised magnetometer measurement noise and the 1-dimensional fused height measurement noise;
The state transition matrix Φ, the system noise driving matrix Γ and the measurement matrix H are obtained by computing the Jacobians of formulas (10) and (13);
The state transition matrix Φ is computed as:
The system noise driving matrix Γ is computed as:
In formulas (16) and (17), T is the navigation period and I is the identity matrix;
The measurement matrix H is computed as:
The one-step prediction of the state is obtained by solving the differential equation:
In formula (19), the starting value is the state estimate of the previous navigation period;
The extended Kalman filter then completes the data fusion and computes the position, velocity and attitude of the carrier.
CN201910270669.5A 2019-04-04 2019-04-04 Integrated navigation algorithm fusing optical flow position and velocity information Pending CN109916394A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910270669.5A CN109916394A (en) 2019-04-04 2019-04-04 Integrated navigation algorithm fusing optical flow position and velocity information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910270669.5A CN109916394A (en) 2019-04-04 2019-04-04 Integrated navigation algorithm fusing optical flow position and velocity information

Publications (1)

Publication Number Publication Date
CN109916394A true CN109916394A (en) 2019-06-21

Family

ID=66968658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910270669.5A Pending CN109916394A (en) 2019-04-04 2019-04-04 A kind of Integrated Navigation Algorithm merging optical flow position and velocity information

Country Status (1)

Country Link
CN (1) CN109916394A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110428452A (en) * 2019-07-11 2019-11-08 北京达佳互联信息技术有限公司 Detection method, device, electronic equipment and the storage medium of non-static scene point
CN110873813A (en) * 2019-12-02 2020-03-10 中国人民解放军战略支援部队信息工程大学 Water flow velocity estimation method, integrated navigation method and device
CN111445491A (en) * 2020-03-24 2020-07-24 山东智翼航空科技有限公司 Three-neighborhood maximum difference value edge detection narrow lane guidance algorithm for micro unmanned aerial vehicle
CN112254721A (en) * 2020-11-06 2021-01-22 南京大学 Attitude positioning method based on optical flow camera
CN112284380A (en) * 2020-09-23 2021-01-29 深圳市富临通实业股份有限公司 Nonlinear estimation method and system based on fusion of optical flow and IMU (inertial measurement Unit)
CN112923924A (en) * 2021-02-01 2021-06-08 杭州电子科技大学 Method and system for monitoring attitude and position of anchored ship
CN114018241A (en) * 2021-11-03 2022-02-08 广州昂宝电子有限公司 Positioning method and device for unmanned aerial vehicle
CN114184194A (en) * 2021-11-30 2022-03-15 中国电子科技集团公司第二十九研究所 Unmanned aerial vehicle autonomous navigation positioning method in rejection environment
CN114216454A (en) * 2021-10-27 2022-03-22 湖北航天飞行器研究所 Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS rejection environment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110206236A1 (en) * 2010-02-19 2011-08-25 Center Jr Julian L Navigation method and aparatus
CN103344218A (en) * 2013-06-18 2013-10-09 桂林理工大学 System and method for measuring altitude of low-altitude unmanned plane
US20170212529A1 (en) * 2013-11-27 2017-07-27 The Trustees Of The University Of Pennsylvania Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (mav)
CN106989744A (en) * 2017-02-24 2017-07-28 中山大学 A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor
CN107074360A (en) * 2016-11-22 2017-08-18 深圳市大疆创新科技有限公司 Control method, flight controller and the unmanned vehicle of unmanned vehicle
CN109540126A (en) * 2018-12-03 2019-03-29 哈尔滨工业大学 A kind of inertia visual combination air navigation aid based on optical flow method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110206236A1 (en) * 2010-02-19 2011-08-25 Center Jr Julian L Navigation method and aparatus
CN103344218A (en) * 2013-06-18 2013-10-09 桂林理工大学 System and method for measuring altitude of low-altitude unmanned plane
US20170212529A1 (en) * 2013-11-27 2017-07-27 The Trustees Of The University Of Pennsylvania Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (mav)
CN107074360A (en) * 2016-11-22 2017-08-18 深圳市大疆创新科技有限公司 Control method, flight controller and the unmanned vehicle of unmanned vehicle
CN106989744A (en) * 2017-02-24 2017-07-28 中山大学 A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor
CN109540126A (en) * 2018-12-03 2019-03-29 哈尔滨工业大学 A kind of inertia visual combination air navigation aid based on optical flow method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨天雨 (Yang Tianyu) et al., "Application of inertial/optical-flow/magnetic integrated navigation technology in a quadrotor aircraft", 《传感器与微系统》 (Transducer and Microsystem Technologies) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110428452B (en) * 2019-07-11 2022-03-25 北京达佳互联信息技术有限公司 Method and device for detecting non-static scene points, electronic equipment and storage medium
CN110428452A (en) * 2019-07-11 2019-11-08 北京达佳互联信息技术有限公司 Detection method, device, electronic equipment and the storage medium of non-static scene point
CN110873813A (en) * 2019-12-02 2020-03-10 中国人民解放军战略支援部队信息工程大学 Water flow velocity estimation method, integrated navigation method and device
CN111445491A (en) * 2020-03-24 2020-07-24 山东智翼航空科技有限公司 Three-neighborhood maximum difference value edge detection narrow lane guidance algorithm for micro unmanned aerial vehicle
CN111445491B (en) * 2020-03-24 2023-09-15 山东智翼航空科技有限公司 Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle
CN112284380A (en) * 2020-09-23 2021-01-29 深圳市富临通实业股份有限公司 Nonlinear estimation method and system based on fusion of optical flow and IMU (inertial measurement Unit)
CN112254721A (en) * 2020-11-06 2021-01-22 南京大学 Attitude positioning method based on optical flow camera
CN112923924A (en) * 2021-02-01 2021-06-08 杭州电子科技大学 Method and system for monitoring attitude and position of anchored ship
CN114216454A (en) * 2021-10-27 2022-03-22 湖北航天飞行器研究所 Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS rejection environment
CN114216454B (en) * 2021-10-27 2023-09-08 湖北航天飞行器研究所 Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS refusing environment
CN114018241A (en) * 2021-11-03 2022-02-08 广州昂宝电子有限公司 Positioning method and device for unmanned aerial vehicle
CN114018241B (en) * 2021-11-03 2023-12-26 广州昂宝电子有限公司 Positioning method and device for unmanned aerial vehicle
CN114184194A (en) * 2021-11-30 2022-03-15 中国电子科技集团公司第二十九研究所 Unmanned aerial vehicle autonomous navigation positioning method in rejection environment

Similar Documents

Publication Publication Date Title
CN109916394A (en) A kind of Integrated Navigation Algorithm merging optical flow position and velocity information
CN104736963B (en) mapping system and method
CN109931926B (en) Unmanned aerial vehicle seamless autonomous navigation method based on station-core coordinate system
CN111678538B (en) Dynamic level error compensation method based on speed matching
CN109540126A (en) A kind of inertia visual combination air navigation aid based on optical flow method
Ladetto et al. Digital magnetic compass and gyroscope integration for pedestrian navigation
CN109269471A (en) A kind of novel GNSS receiver inclinometric system and method
CN109186597B (en) Positioning method of indoor wheeled robot based on double MEMS-IMU
CN101290229A (en) Silicon micro-navigation attitude system inertia/geomagnetism assembled method
CN107270893A (en) Lever arm, time in-synchronization error estimation and the compensation method measured towards real estate
CN107490378B (en) Indoor positioning and navigation method based on MPU6050 and smart phone
CN201955092U (en) Platform type inertial navigation device based on geomagnetic assistance
CN107270898B (en) Double particle filter navigation devices and method based on MEMS sensor and VLC positioning fusion
CN111024070A (en) Inertial foot binding type pedestrian positioning method based on course self-observation
CN104181573B (en) Big Dipper inertial navigation deep integrated navigation micro-system
CN102575933A (en) System that generates map image integration database and program that generates map image integration database
CN109916395A (en) A kind of autonomous Fault-tolerant Integrated navigation algorithm of posture
CN103512584A (en) Navigation attitude information output method, device and strapdown navigation attitude reference system
Wahdan et al. Three-dimensional magnetometer calibration with small space coverage for pedestrians
Hartmann et al. Indoor 3D position estimation using low-cost inertial sensors and marker-based video-tracking
CN109540135A (en) The method and device that the detection of paddy field tractor pose and yaw angle are extracted
CN108458714A (en) The Eulerian angles method for solving of acceleration of gravity is free of in a kind of attitude detection system
CN103162677A (en) Digital geological compass and method for measuring geological occurrence
CN107015259A (en) The tight integration method of pseudorange/pseudorange rates is calculated using Doppler anemometer
CN109579836A (en) A kind of indoor pedestrian's bearing calibration method based on MEMS inertial navigation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20230609

AD01 Patent right deemed abandoned