CN110081881B - Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology - Google Patents

Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology

Info

Publication number
CN110081881B
CN110081881B (application CN201910317175.8A)
Authority
CN
China
Prior art keywords
matrix
sensor
quaternion
camera
measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910317175.8A
Other languages
Chinese (zh)
Other versions
CN110081881A (en)
Inventor
熊洪睿
黄捷
冯山
韩婵
邹阳
王新华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Aircraft Industrial Group Co Ltd
Original Assignee
Chengdu Aircraft Industrial Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Aircraft Industrial Group Co Ltd filed Critical Chengdu Aircraft Industrial Group Co Ltd
Priority to CN201910317175.8A priority Critical patent/CN110081881B/en
Publication of CN110081881A publication Critical patent/CN110081881A/en
Application granted granted Critical
Publication of CN110081881B publication Critical patent/CN110081881B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations

Abstract

The invention discloses a carrier landing guiding method based on an unmanned aerial vehicle multi-sensor information fusion technology, which mainly comprises the following steps. Step S100: information synchronization based on the least square method. Step S200: carrier landing guide sensor fusion based on Bayesian estimation. Step S300: IMU/airborne visual attitude fusion based on Kalman filtering; IMU experimental data are acquired, attitude angle data are obtained by quaternion resolving, and the intrinsic and extrinsic parameters of the camera are obtained by processing the captured images; the attitude angle data and the extrinsic parameters are then fused by Kalman filtering. The invention fuses the measurement information of multiple sensors and performs filtering estimation, thereby realizing accurate landing guidance. Compared with single-sensor guidance, the actual position of the aircraft is measured more accurately, the compatibility and fault tolerance of the guidance system are improved, the advantages of multi-sensor fusion are fully exploited, and the accuracy and stability of carrier landing are ensured to the greatest extent.

Description

Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle navigation, and particularly relates to a landing guide method based on an unmanned aerial vehicle multi-sensor information fusion technology.
Background
An automatic carrier landing task requires accurate guidance information so that a smooth landing trajectory can be planned and the aircraft can touch down precisely on the deck landing point. In addition, the limited deck space of the ship places higher requirements on guidance accuracy. Multi-sensor information fusion provides a feasible approach to accurate guidance: the complementary properties of multiple sensors can be exploited in the estimation to obtain a more accurate result. On this basis, in order to ensure the compatibility of the guidance system and improve its fault tolerance, with the support of sensors such as GPS, a visual guidance system and a laser radar, the factors that may cause the guidance scheme to change are comprehensively analyzed and corresponding solutions are prepared in a targeted manner, so that the accuracy and stability of carrier landing are ensured to the greatest extent.
The landing guidance system adopts a redundant multi-information-source design that fully considers the complex marine environment and other application environments; the information sources complement one another, so that high reliability and guidance precision can be maintained under various kinds of interference.
Disclosure of Invention
The invention aims to provide a landing guide method based on an unmanned aerial vehicle multi-sensor information fusion technology, so that the actual position of the aircraft is measured more accurately, the compatibility and fault tolerance of the guidance system are improved, and accurate fixed-point landing of the unmanned aerial vehicle becomes possible.
The invention is mainly realized by the following technical scheme: a landing guide method based on an unmanned aerial vehicle multi-sensor information fusion technology mainly comprises the following steps:
step S100: carrying out information synchronization based on a least square method:
step S200: carrier landing guide sensor fusion based on Bayesian estimation; after the information synchronization of step S100, the data of each sensor in the kth measurement interval are acquired, the maximum a posteriori of the kth sensor data is calculated, the local average LA of the kth measurement is calculated, and the global average GA is then calculated; the measurement closeness of each sensor is calculated, and if the measurement closeness is greater than or equal to a preset threshold D, a single fused value is calculated as the similarity output:
[fusion formula given as an image in the original]
wherein e ∈ {1, 2} is the number of sensors selected after the threshold logic comparison and x_total is the single fused value;
step S300: IMU/airborne visual attitude fusion based on Kalman filtering; IMU experimental data are acquired, attitude angle data are obtained by quaternion resolving, and the intrinsic and extrinsic parameters of the camera are obtained by processing the captured images; the attitude angle data and the extrinsic parameters are then fused by Kalman filtering.
In order to better implement the present invention, further, the step S300 mainly includes the following steps:
step S301: the acquisition frequency of IMU data is 200 Hz;
step S302: the attitude angle is solved by quaternion:
step S303: calibrating the intrinsic parameters of the camera, wherein the intrinsic parameters comprise the camera focal length f, the principal point coordinates and the pixel size; the intrinsic calibration is performed by the chessboard calibration method;
step S304: resolving the extrinsic parameters of the camera: solving the extrinsic parameters means solving the rotation matrix R and the translation matrix t, after which the coordinate transformation can be performed;
step S305: fusing the IMU data and the airborne visual data: the IMU and visual signals are simulated with experimentally acquired data, and the attitude of the camera is changed during shooting; relative to the background there are an upward viewing angle, a right viewing angle and a frontal viewing angle; continuous pictures and the corresponding IMU data are collected, the data are aligned by time stamps, and the attitude angles are resolved on the matlab and VS2013 platforms.
In order to better implement the present invention, further, the step S302 mainly includes the following steps:
R'=qRq'
wherein R represents the vector to be rotated, R' represents the vector after rotation by the quaternion, q represents the quaternion and q' represents its conjugate; the scalar part of the quaternion is the cosine of half the rotation angle, and its vector part gives the direction of the instantaneous rotation axis n; the following equations indicate that the vector R is rotated through a rotation angle θ with respect to the reference coordinate system, the direction of the rotation axis is determined by the imaginary part of the quaternion, and cos α, cos β and cos γ are the direction cosines between the rotation axis n and the axes of the reference coordinate system:
q = λ + p1·i + p2·j + p3·k
λ = cos(θ/2)
p1 = sin(θ/2)·cos α
p2 = sin(θ/2)·cos β
p3 = sin(θ/2)·cos γ
in the formula, λ represents the real part of the quaternion and p1, p2 and p3 represent its imaginary part;
the quaternion attitude matrix is calculated as follows:
① determining the initial quaternion, whose input is the initial attitude angle: λ(0), p1(0), p2(0) and p3(0) are obtained from θ0, ψ0 and γ0 by the standard Euler-angle-to-quaternion conversion [initialization formula given as an image in the original];
θ0, ψ0 and γ0 represent the initial values of the pitch angle, yaw angle and roll angle;
② real-time calculation of the quaternion scalar part and vector part λ, p1, p2, p3: the input signal is the digital output of the gyroscope,
Δθδ = ∫ ωδb dt
wherein δ is any one of x, y and z, ωδb is the triaxial angular rate and Δθ is the angular increment obtained by integrating the gyroscope output; the calculation adopts the second-order Runge-Kutta method:
K1 = (1/2)Ωb(t)q(t),  Y = q(t) + T·K1,  K2 = (1/2)Ωb(t+T)·Y,  q(t+T) = q(t) + (T/2)(K1 + K2)
where T is the time interval, t represents the current time, Ωb(t) is the antisymmetric matrix of the gyroscope angular rates, and K1, K2 and Y are intermediate variables; the formulas indicate that the quaternion at the next time instant is obtained from the quaternion at the previous time instant and an estimated slope;
③ real-time calculation of the attitude matrix: the attitude matrix T(n) = [Tij(n)] is determined from the inputs λ(n), p1(n), p2(n) and p3(n) [quaternion form of the attitude matrix given as an image in the original]; T(n) represents the rotation matrix from the geodetic (navigation) frame to the body frame;
④ calculation of the carrier attitude angles: the attitude angles θ, ψ and γ are determined, with inputs T11(n), T12(n), T13(n), T23(n) and T33(n):
θ = -arcsin(T13(n))
ψ = arctan(T12(n)/T11(n))
γ = arctan(T23(n)/T33(n))
The difference between computing the attitude matrix by the direction cosine method and by the quaternion method lies mainly in how the attitude matrix is described, as shown by the direction cosine form of the attitude matrix [given as an image in the original]; the direction cosine matrix differential equation to be solved is
Ċ = C·Ω
and once the direction cosine matrix C has been obtained the attitude angles can be extracted, where Ω is the antisymmetric matrix of the rotational angular velocities.
In order to better implement the present invention, the rotation matrix R in step S304 is solved as follows:
① the epipolar geometric relationship is computed from the feature matching points of two consecutive pictures to obtain the fundamental matrix F:
for any pair of matching points x ↔ x' in the two images, x'ᵀFx = 0; the coordinates of the matching points in the two images are written as x = (x, y, 1)ᵀ and x' = (x', y', 1)ᵀ, and fij is the element in row i and column j of the fundamental matrix F, so that:
(x', y', 1)·F·(x, y, 1)ᵀ = 0
after expansion this gives:
x'xf11+x'yf12+x'f13+y'xf21+y'yf22+y'f23+xf31+yf32+f33=0
writing the fundamental matrix F in the form of a column vector f, then:
[x'x x'y x' y'x y'y y' x y 1]f=0
randomly selecting 8 pairs from the matching point pairs, denoted (xi, yi) ↔ (x'i, y'i), i = 1, …, 8, gives the stacked linear system:
Af = 0, where each row of A is [x'ixi x'iyi x'i y'ixi y'iyi y'i xi yi 1]
letting the matrix on the left be A, the fundamental matrix F is obtained from Af = 0 by a linear solution; F is then used for a consistency check, and the number n' of point pairs that pass the check is counted;
another 8 point pairs are selected at random and the above steps are repeated; the F with the largest n' is taken as the final fundamental matrix F;
② the fundamental matrix F is converted under normalized image coordinates to obtain the essential matrix E:
assume the camera extrinsic matrix P = [R | t] and x = PX, where R represents the rotation between the coordinate systems, t represents the translation between the two pictures, x represents coordinates in the camera coordinate system and X represents coordinates in the world coordinate system; the camera calibration matrix K is known, and applying the inverse of K to the point x gives the point x̂ = K⁻¹x, so that
x̂ = [R | t]X
x̂ is the representation of an image point in normalized coordinates, and the essential matrix is the fundamental matrix under normalized image coordinates;
the relation between the essential matrix E and the fundamental matrix F is obtained as:
E = K'ᵀFK
so that the fundamental matrix F can be converted into the essential matrix E when the camera intrinsic matrix is known;
③ singular value decomposition of the essential matrix E to solve the rotation matrix R:
once the essential matrix E is known, the extrinsic matrix of the shipborne-aircraft onboard camera can be recovered; let E = [t]×R = SR, where S is an antisymmetric matrix; the rotation matrix R can then be solved; the solution of the onboard-camera extrinsic matrix P for which the photographed object lies in front of the onboard camera is selected, and the rotation matrix is thereby obtained.
In order to better implement the present invention, in step S305 the Euler angle is used as the state quantity xk and the result of the monocular vision measurement as the observed quantity yk; Ak and Ck are both taken as identity matrices, and w and v are regarded as independently distributed white noise satisfying a normal distribution, giving the following state and measurement equations respectively:
xk=xk-1+wk-1
yk=xk+vk
p(w) ~ N(0, Q) and p(v) ~ N(0, R), where Q is the variance of the process noise, R is the variance of the measurement noise, and p denotes the probability density of the noise; Q and R vary with time but are here set to constants; because the two kinds of data are acquired at different frequencies, in an experiment of less than 50 s the IMU measurement yields 9162 data points while the vision system acquires 687 pictures, so 9162 attitude angles have to be put into correspondence with 687 attitude angles; to make the two sets of attitude angles correspond, every 40 of the 9162 attitude angles are averaged and every 3 of the 687 attitude angles are averaged, so that 224 data points can be put into correspondence and fused in the Kalman filtering program.
In order to better implement the present invention, further, the step S200 mainly includes the following steps:
step S201: suppose that the vision sensor and the laser radar are installed below the origin of the unmanned aerial vehicle navigation coordinate system, i.e. on the OZn axis, with a relative distance D between them; the measurement closeness between the sensors can then be defined as:
di = [closeness formula given as an image in the original]
wherein GA is the average of the local means of the two sensors' measurements, which can also be called the global mean, and σi is the standard deviation of sensor i;
step S202: defining the local average of the kth measurement of each sensor,
LA(i) = the mean of the measurements of sensor i within the kth measurement interval [formula given as an image in the original],
the global mean can be expressed as
GA = (1/n)·Σ LA(i)
step S203: (max i) has the expression:
x̂i = arg max P(si|xi)
where x̂i is the maximum a posteriori of the sensor; the posterior probability of the ith sensor is P(si|xi), conditioned on the measured value xi, and si is the state quantity of the ith sensor;
step S204: the posterior probability of each sensor can be obtained from the Bayesian criterion:
P(s|x) = P(x|s)·P(s)/P(x)
wherein s is the state quantity and x is an observation obeying a normal distribution; given the measured value x, the probability density of s obeys a normal distribution N(μ, σ²), i.e. the posterior probability density, where μ is the mean and σ is the standard deviation; the mean of the likelihood function is as follows:
[likelihood-mean formula given as an image in the original]
it can then be derived that P(x) = ΣP(x|s)·P(s); the right-hand side accumulates over all observations, and dividing by this sum gives the posterior P(s|x), so each new observation is processed by setting a prior probability and performing an iterative calculation, and the maximum a posteriori probability is extracted in each iteration;
step S205: a single fusion value is calculated.
In order to better implement the present invention, further, in step S100, a relational expression of the sensor data fusion time is as follows:
C = [fused-measurement formula given as an image in the original]
Var(C) = [fused-variance formula given as an image in the original]
where C and Var(C) are the target measurement value and the measurement variance of the sensor at the alignment time; Ci is the ith measurement taken while the sensor times are still unaligned; σ1 is the variance of the laser radar in the unfused state and σ2 is the variance of the vision sensor in the unfused state, the subscripts 1 and 2 denoting the laser radar and the vision sensor respectively; n represents the number of measurements of the sensor.
The invention has the beneficial effects that:
(1) the invention fuses the measurement information of multiple sensors and carries out filtering estimation, thereby realizing accurate landing guidance.
(2) Compared with single-sensor guidance, the actual position of the aircraft is measured more accurately, the compatibility and fault tolerance of the guidance system are improved, the advantages of multi-sensor fusion are fully exploited, the accuracy and stability of carrier landing are ensured to the greatest extent, and accurate fixed-point landing of the unmanned aerial vehicle becomes possible.
(3) After fusion, the observation error of the fused depression-angle and azimuth-angle observations is at most 10 mrad and the relative distance error is at most 1 m; the fused result is clearly better than the measurement of any single sensor.
(4) The IMU and visual signals are fused and the multi-sensor information fusion is studied with an extended Kalman filtering algorithm, which realizes multi-sensor information fusion and improves the precision and reliability of the guidance information.
Drawings
FIG. 1 is a time-aligned sampling diagram of a fusion system of the present invention;
FIG. 2 is a schematic diagram of the transformation of pixel coordinates to world coordinates according to the present invention;
FIG. 3 is a flow chart of a position sensor fusion algorithm of the present invention;
FIG. 4 is a flow chart of the attitude fusion algorithm of the present invention;
FIG. 5 is a flow chart of Kalman filtering according to the present invention.
Detailed Description
Example 1:
a landing guide method based on an unmanned aerial vehicle multi-sensor information fusion technology mainly comprises the following steps:
step S100: carrying out information synchronization based on a least square method:
step S200: carrier landing guide sensor fusion based on Bayesian estimation; after the information synchronization of step S100, the data of each sensor in the kth measurement interval are acquired, the maximum a posteriori of the kth sensor data is calculated, the local average LA of the kth measurement is calculated, and the global average GA is then calculated; the measurement closeness of each sensor is calculated, and if the measurement closeness is greater than or equal to a preset threshold D, a single fused value is calculated as the similarity output:
[fusion formula given as an image in the original]
wherein e ∈ {1, 2} is the number of sensors selected after the threshold logic comparison and x_total is the single fused value;
step S300: IMU/airborne visual attitude fusion based on Kalman filtering; IMU experimental data are acquired, attitude angle data are obtained by quaternion resolving, and the intrinsic and extrinsic parameters of the camera are obtained by processing the captured images; the attitude angle data and the extrinsic parameters are then fused by Kalman filtering.
The invention fuses the measurement information of multiple sensors and carries out filtering estimation, thereby realizing accurate landing guidance. Compared with single sensor guidance, the method has the advantages that the actual position measurement of the airplane is more accurate, the compatibility and the fault tolerance rate of the guidance system are improved, the advantage of multi-sensor fusion is fully exerted, and the accuracy and the stability of carrier landing are ensured to the greatest extent.
Example 2:
In this embodiment, optimization is performed on the basis of embodiment 1. The time-aligned sampling of the fusion system is shown in fig. 1. In step S100, the sampling periods of the laser radar and the vision sensor are set to T1 and T2 respectively, with T1/T2 = m/n; the sampling period of the tracking system is chosen as T, the least common multiple of the two sampling periods, i.e. the time in which the laser radar samples n times and the vision system samples m times. With the support of the least square method, the relation for the sensor data fusion time can be expressed as:
C = [fused-measurement formula given as an image in the original]
Var(C) = [fused-variance formula given as an image in the original]
In the above formulas, C and Var(C) are the target measurement value and the measurement variance of the sensor at the alignment time, Ci is the ith measurement taken while the sensor times are still unaligned, σ1 is the variance of the laser radar in the unfused state, σ2 is the variance of the vision sensor in the unfused state, the subscripts 1 and 2 denote the laser radar and the vision sensor respectively, and n represents the number of measurements of the sensor.
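As a concrete illustration of this time-registration step, the sketch below resamples each sensor stream onto the common fusion time by a least-squares fit and then combines the aligned values by inverse-variance weighting; the function names, the first-order fitting model and the inverse-variance weighting rule are illustrative assumptions, since the patent gives C and Var(C) only as images.

```python
import numpy as np

def align_to_fusion_time(t_samples, c_samples, t_fuse, order=1):
    """Least-squares fit of one sensor's samples within a fusion interval,
    evaluated at the common alignment time t_fuse (assumed linear model)."""
    coeffs = np.polyfit(t_samples, c_samples, order)   # LS fit C(t) ~ a*t + b
    return np.polyval(coeffs, t_fuse)

def fuse_aligned(c1, var1, c2, var2):
    """Inverse-variance weighted combination of the two time-aligned
    measurements (one plausible form of C and Var(C))."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    c = (w1 * c1 + w2 * c2) / (w1 + w2)
    var_c = 1.0 / (w1 + w2)
    return c, var_c

# Example: lidar sampled every T1 = 0.02 s, vision every T2 = 0.05 s,
# fusion period T = 0.1 s (lidar samples n = 5 times, vision m = 2 times).
t_lidar = np.arange(0.0, 0.1, 0.02)
c_lidar = 100.0 - 2.0 * t_lidar + 0.05 * np.random.randn(5)
t_vis = np.arange(0.0, 0.1, 0.05)
c_vis = 100.0 - 2.0 * t_vis + 0.20 * np.random.randn(2)
c1 = align_to_fusion_time(t_lidar, c_lidar, 0.1)
c2 = align_to_fusion_time(t_vis, c_vis, 0.1)
C, VarC = fuse_aligned(c1, 0.05**2, c2, 0.20**2)
print(C, VarC)
```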
Other parts of this embodiment are the same as those of embodiment 1, and thus are not described again.
Example 3:
in this embodiment, optimization is performed on the basis of embodiment 2, and step S200 mainly includes the following steps:
The sensor measurement values of the plurality of sensors are read respectively; after the information synchronization of step 1, the data of each sensor in the kth measurement interval are acquired and the maximum a posteriori of the kth sensor data is calculated; the local average LA of the kth measurement is then calculated from the read data and the global average GA is calculated; whether to reject a sensor's data is judged from the closeness of each sensor, and finally the average
(1/n)·Σ LA(i)
is calculated, where LA(i) is the local average of the kth measurement of each sensor and n is the number of sensors whose data are judged to be correct.
As shown in fig. 3, the following are the steps of the sensor fusion algorithm based on bayesian estimation:
(1) Assumption: the vision sensor and the laser radar are installed below the origin of the unmanned aerial vehicle navigation coordinate system, i.e. on the OZn axis, with a relative distance D between them; the measurement closeness between the sensors refers to the relative distance between the two sensors along the same coordinate axis computed from their measurements, and can be defined as:
di = [closeness formula given as an image in the original]
In the above formula, GA is the average of the local means of the two sensors' measurements, which may also be called the global mean, and σi is the standard deviation of sensor i.
(2) Define the local mean of the kth measurement of each sensor:
LA(i) = the mean of the measurements of sensor i within the kth measurement interval [formula given as an image in the original],
then the global mean can be expressed as
GA = (1/n)·Σ LA(i)
(3) (max i) refers to the best estimate, among the sensor measurements over the measurement interval, of the measurement closest to the true state, and its expression is:
x̂i = arg max P(si|xi)
where x̂i is the maximum a posteriori of the sensor; the posterior probability of the ith sensor is P(si|xi), conditioned on the measured value xi, and si is the state quantity of the ith sensor.
(4) The posterior probability of each sensor can be obtained from the Bayesian criterion:
P(s|x) = P(x|s)·P(s)/P(x)
In the formula, s is the state quantity and x is an observation obeying a normal distribution; given the measured value x, the probability density of s obeys a normal distribution N(μ, σ²), i.e. the posterior probability density, where μ is the mean and σ is the standard deviation; the mean of the likelihood function is as follows:
[likelihood-mean formula given as an image in the original]
It can then be derived that P(x) = ΣP(x|s)·P(s); the right-hand side accumulates over all observations, and dividing by this sum gives the posterior P(s|x), so each new observation is processed by setting a prior probability and performing an iterative calculation, and the maximum a posteriori probability is extracted in each iteration.
(5) After di has been calculated for each sensor, it can be compared with a threshold D set in advance (D is assumed to be known from prior calibration) to decide whether the measurement of that sensor is used. Finally, a single fused value is obtained as the similarity output:
[fusion formula given as an image in the original]
In the formula, e ∈ {1, 2} is the number of sensors selected after the threshold logic comparison and x_total is the single fused value.
(6) After fusion, the observation error of the fused depression-angle and azimuth-angle observations is at most 10 mrad and the relative distance error is at most 1 m; the fused result is clearly better than the measurement of any single sensor.
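A minimal sketch of steps (1)-(5) for two synchronized measurement streams is given below. The closeness measure d_i = |LA_i − GA|/σ_i, the comparison direction against the threshold D and the equal-weight averaging of the selected sensors' local means are illustrative assumptions, since the corresponding formulas appear only as images in the original.

```python
import numpy as np

def bayes_threshold_fusion(meas_lidar, meas_vision, sigma, D):
    """Threshold-logic fusion of one measurement interval (illustrative)."""
    sensors = [np.asarray(meas_lidar, float), np.asarray(meas_vision, float)]
    la = np.array([s.mean() for s in sensors])          # local averages LA^(i)
    ga = la.mean()                                       # global average GA
    d = np.abs(la - ga) / np.asarray(sigma, float)       # assumed closeness measure d_i
    selected = [la[i] for i in range(2) if d[i] <= D]    # keep sensors judged consistent
    if not selected:                                     # fall back to the closer sensor
        selected = [la[int(np.argmin(d))]]
    e = len(selected)                                    # e in {1, 2}
    x_total = sum(selected) / e                          # single fused value (assumed form)
    return x_total, e

# Example: relative-distance measurements in one interval
x_total, e = bayes_threshold_fusion(
    meas_lidar=[50.2, 50.1, 50.3], meas_vision=[50.9, 51.1],
    sigma=[0.2, 0.5], D=3.0)
print(x_total, e)
```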
Other parts of this embodiment are the same as embodiment 2, and thus are not described again.
Example 4:
in this embodiment, optimization is performed on the basis of embodiment 3, and as shown in fig. 4, the step S300 mainly includes the following steps:
step S301: the acquisition frequency of IMU data is 200 Hz;
step S302: the attitude angle is solved by quaternion:
step S303: calibrating the intrinsic parameters of the camera, wherein the intrinsic parameters comprise the camera focal length f, the principal point coordinates and the pixel size; the intrinsic calibration is performed by the chessboard calibration method;
step S304: resolving the extrinsic parameters of the camera: solving the extrinsic parameters means solving the rotation matrix R and the translation matrix t, after which the coordinate transformation can be performed;
step S305: fusing the IMU data and the airborne visual data: the IMU and visual signals are simulated with experimentally acquired data, and the attitude of the camera is changed during shooting; relative to the background there are an upward viewing angle, a right viewing angle and a frontal viewing angle; continuous pictures and the corresponding IMU data are collected, the data are aligned by time stamps, and the attitude angles are resolved on the matlab and VS2013 platforms.
The other parts of this embodiment are the same as those of embodiment 3, and thus are not described again.
Example 5:
This embodiment is optimized on the basis of embodiment 4: IMU experimental data are collected and, at the same time, the captured images are processed to obtain the intrinsic and extrinsic parameters of the camera; the attitude angles are resolved to obtain their variation curves, and finally Kalman filtering is performed to fuse them.
Compared with single sensor guidance, the method has the advantages that the actual position of the airplane is measured more accurately, the compatibility and fault tolerance of a guidance system are improved, the advantage of multi-sensor fusion is fully exerted, the accuracy and stability of carrier landing are ensured to the greatest extent, and the possibility is provided for realizing accurate carrier landing at a fixed point of the unmanned aerial vehicle.
The IMU and visual signals are fused, and the multi-sensor information fusion is studied with an extended Kalman filtering algorithm so as to realize multi-sensor information fusion and improve the precision and reliability of the guidance information; the algorithm flow chart is shown in fig. 4.
The method comprises the following specific steps of IMU/visual attitude fusion technology based on Kalman filtering:
(1) IMU data import
Firstly, data are acquired through experiments, the data acquisition frequency of an IMU is 200Hz, the IMU data are stored in a txt file, the file is imported into matlab, information in the file is named one by one, and the file is exported to be a mat file for later use.
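A minimal equivalent of this import step is sketched below with numpy rather than matlab; the column layout of the txt log and the file names are placeholder assumptions.

```python
import numpy as np

# Placeholder column layout: time, wx, wy, wz, ax, ay, az (200 Hz log)
data = np.loadtxt("imu_log.txt")
imu = {
    "t":    data[:, 0],
    "gyro": data[:, 1:4],   # triaxial angular rate
    "acc":  data[:, 4:7],   # triaxial acceleration
}
np.save("imu_log.npy", data)   # saved for later use (analogue of the .mat file)
```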
(2) Resolving attitude angle by quaternion
R'=qRq'
In the formula, R represents the vector to be rotated, R' represents the vector after rotation by the quaternion, q represents the quaternion and q' represents its conjugate; the scalar part of the quaternion is the cosine of half the rotation angle, and its vector part gives the direction of the instantaneous rotation axis n. The above formula indicates that the vector R is rotated through a rotation angle θ with respect to the reference coordinate system, the direction of the rotation axis is determined by the imaginary part of the quaternion, and cos α, cos β and cos γ are the direction cosines between the rotation axis n and the axes of the reference coordinate system. The quaternion has the form:
q = λ + p1·i + p2·j + p3·k
λ = cos(θ/2)
p1 = sin(θ/2)·cos α
p2 = sin(θ/2)·cos β
p3 = sin(θ/2)·cos γ
In the formula, λ represents the real part of the quaternion and p1, p2 and p3 represent its imaginary part.
The quaternion attitude matrix is calculated as follows:
① Determine the initial quaternion, whose input is the initial attitude angle: λ(0), p1(0), p2(0) and p3(0) are obtained from θ0, ψ0 and γ0 by the standard Euler-angle-to-quaternion conversion [initialization formula given as an image in the original].
θ0, ψ0 and γ0 represent the initial values of the pitch, yaw and roll angles.
② Real-time calculation of the quaternion scalar part and vector part λ, p1, p2, p3. The input signal is the digital output of the gyroscope,
Δθδ = ∫ ωδb dt
where δ is any one of x, y and z, ωδb is the triaxial angular rate and Δθ is the angular increment obtained by integrating the gyroscope output. The calculation adopts the second-order Runge-Kutta method:
K1 = (1/2)Ωb(t)q(t),  Y = q(t) + T·K1,  K2 = (1/2)Ωb(t+T)·Y,  q(t+T) = q(t) + (T/2)(K1 + K2)
T is the time interval, t represents the current time, Ωb(t) is the antisymmetric matrix of the gyroscope angular rates, and K1, K2 and Y are intermediate variables; the formulas indicate that the quaternion at the next time instant is obtained from the quaternion at the previous time instant and an estimated slope.
③ Real-time calculation of the attitude matrix: the attitude matrix T(n) = [Tij(n)] is determined from the inputs λ(n), p1(n), p2(n) and p3(n) [quaternion form of the attitude matrix given as an image in the original]; T(n) represents the rotation matrix from the geodetic (navigation) frame to the body frame.
④ Calculation of the carrier attitude angles: the attitude angles θ, ψ and γ are determined, with inputs T11(n), T12(n), T13(n), T23(n) and T33(n):
θ = -arcsin(T13(n))
ψ = arctan(T12(n)/T11(n))
γ = arctan(T23(n)/T33(n))
The difference between computing the attitude matrix by the direction cosine method and by the quaternion method lies mainly in how the attitude matrix is described, as shown by the direction cosine form of the attitude matrix [given as an image in the original]; the direction cosine matrix differential equation to be solved is
Ċ = C·Ω
and once the direction cosine matrix C has been obtained the attitude angles can be extracted (Ω is the antisymmetric matrix of the rotational angular velocities).
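The sketch below implements steps ① to ④ numerically: a quaternion is propagated with the second-order Runge-Kutta scheme using the angular-rate matrix Ωb(t), the attitude matrix is formed and the Euler angles are extracted. The quaternion-to-matrix sign convention follows the standard aerospace yaw-pitch-roll definition and is an assumption where the original gives the matrix only as an image.

```python
import numpy as np

def omega_matrix(w):
    """4x4 skew-symmetric matrix of body angular rates for q_dot = 0.5*Omega*q."""
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [wx,  0.0,  wz, -wy],
                     [wy, -wz,  0.0,  wx],
                     [wz,  wy, -wx,  0.0]])

def rk2_quat_update(q, w_t, w_tT, T):
    """Second-order Runge-Kutta propagation of q over time step T."""
    k1 = 0.5 * omega_matrix(w_t) @ q
    y = q + T * k1
    k2 = 0.5 * omega_matrix(w_tT) @ y
    q_next = q + 0.5 * T * (k1 + k2)
    return q_next / np.linalg.norm(q_next)     # renormalize

def attitude_matrix(q):
    """Rotation matrix from the navigation (geodetic) frame to the body frame."""
    l, p1, p2, p3 = q
    return np.array([
        [l*l+p1*p1-p2*p2-p3*p3, 2*(p1*p2+l*p3),        2*(p1*p3-l*p2)],
        [2*(p1*p2-l*p3),        l*l-p1*p1+p2*p2-p3*p3, 2*(p2*p3+l*p1)],
        [2*(p1*p3+l*p2),        2*(p2*p3-l*p1),        l*l-p1*p1-p2*p2+p3*p3]])

def euler_from_T(Tm):
    """theta = -arcsin(T13), psi = atan2(T12, T11), gamma = atan2(T23, T33)."""
    theta = -np.arcsin(Tm[0, 2])
    psi = np.arctan2(Tm[0, 1], Tm[0, 0])
    gamma = np.arctan2(Tm[1, 2], Tm[2, 2])
    return theta, psi, gamma

# Example: start level, rotate about body z at 0.1 rad/s, sampled at 200 Hz
q = np.array([1.0, 0.0, 0.0, 0.0])
w = np.array([0.0, 0.0, 0.1])
for _ in range(200):                       # one second of gyro data
    q = rk2_quat_update(q, w, w, 1.0 / 200.0)
print(euler_from_T(attitude_matrix(q)))    # yaw psi ~ 0.1 rad, pitch and roll ~ 0
```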
(3) Camera intrinsic parameter calibration
Since the key to determining the pose of the target by vision is the relationship between the pixel coordinates of the feature points and their world coordinates, which is determined by the geometric imaging model of the camera, the camera model parameters are critical.
The internal parameters of the camera include the camera focal length f, the principal point coordinates, the pixel size, and the like.
The calibration method adopted is the chessboard calibration method, also called Zhang Zhengyou's calibration method. It is a camera calibration method based on a two-dimensional planar calibration template proposed by Zhang Zhengyou: the camera is required to photograph the same calibration template from at least two different orientations, and the calibration template and the camera may move freely without any motion parameters of the camera or the target being known; however, the correspondence between the coordinates of the feature points on the calibration template in the world coordinate system and the coordinates of their projections in the pixel coordinate system must be determined, after which the intrinsic parameters of the camera can be solved.
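As an illustration of this intrinsic-calibration step, the sketch below uses the checkerboard functions of OpenCV (an implementation choice assumed here, not named in the patent); the board size, square size and image directory are placeholders.

```python
import glob
import cv2
import numpy as np

BOARD = (9, 6)          # inner corners per row/column (placeholder)
SQUARE = 0.025          # square edge length in metres (placeholder)

# 3-D coordinates of the board corners in the board's own (world) frame
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):            # placeholder image directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# K contains the focal length and principal point; dist holds the distortion terms
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix K:\n", K)
```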
(4) Extrinsic parameter resolution
The extrinsic parameters of the camera describe the pose of the camera in three-dimensional space, the transformation from pixel coordinates to world coordinates, as shown in fig. 2. The solution of the extrinsic parameters is the solution of the rotation matrix R and the translation matrix t, and then the transformation of the coordinates can be performed.
The solution for the rotation matrix R is as follows:
① the epipolar geometric relationship is computed from the feature matching points of two consecutive pictures to obtain the fundamental matrix F:
for any pair of matching points x ↔ x' in the two images, x'ᵀFx = 0; the coordinates of the matching points in the two images are written as x = (x, y, 1)ᵀ and x' = (x', y', 1)ᵀ, and fij is the element in row i and column j of the fundamental matrix F, so that:
(x', y', 1)·F·(x, y, 1)ᵀ = 0
after expansion this gives:
x'xf11+x'yf12+x'f13+y'xf21+y'yf22+y'f23+xf31+yf32+f33=0
writing the fundamental matrix F in the form of a column vector f, then:
[x'x x'y x' y'x y'y y' x y 1]f=0
randomly selecting 8 pairs from the matching point pairs, denoted (xi, yi) ↔ (x'i, y'i), i = 1, …, 8, gives the stacked linear system:
Af = 0, where each row of A is [x'ixi x'iyi x'i y'ixi y'iyi y'i xi yi 1]
letting the matrix on the left be A, the fundamental matrix F is obtained from Af = 0 by a linear solution; F is then used for a consistency check, and the number n' of point pairs that pass the check is counted.
Another 8 point pairs are selected at random and the above steps are repeated; the F with the largest n' is taken as the final fundamental matrix F.
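A minimal numerical sketch of this eight-point procedure follows: the stacked constraint matrix A is built from eight matching pairs, Af = 0 is solved by SVD, and candidate F matrices are scored by how many point pairs satisfy the epipolar constraint within a tolerance. The function names, the tolerance value and the rank-2 enforcement step are implementation assumptions.

```python
import numpy as np

def eight_point_F(x1, x2):
    """x1, x2: (8, 2) arrays of matched pixel coordinates. Solves A f = 0."""
    x, y = x1[:, 0], x1[:, 1]
    xp, yp = x2[:, 0], x2[:, 1]
    A = np.column_stack([xp*x, xp*y, xp, yp*x, yp*y, yp, x, y, np.ones(8)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)              # f = last right singular vector
    U, S, Vt2 = np.linalg.svd(F)          # enforce rank 2 (assumed step)
    return U @ np.diag([S[0], S[1], 0.0]) @ Vt2

def count_inliers(F, x1, x2, tol=1e-2):
    """Number n' of pairs passing the check |x'^T F x| < tol."""
    x1h = np.column_stack([x1, np.ones(len(x1))])
    x2h = np.column_stack([x2, np.ones(len(x2))])
    err = np.abs(np.sum((x2h @ F) * x1h, axis=1))
    return int(np.sum(err < tol))

def best_F(x1_all, x2_all, trials=200, rng=np.random.default_rng(0)):
    """Repeat the random 8-pair selection and keep the F with the largest n'."""
    best, n_best = None, -1
    for _ in range(trials):
        idx = rng.choice(len(x1_all), 8, replace=False)
        F = eight_point_F(x1_all[idx], x2_all[idx])
        n = count_inliers(F, x1_all, x2_all)
        if n > n_best:
            best, n_best = F, n
    return best, n_best
```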
② the fundamental matrix F is converted under normalized image coordinates to obtain the essential matrix E:
assume the camera extrinsic matrix P = [R | t] and x = PX, where R represents the rotation between the coordinate systems, t represents the translation between the two pictures, x represents coordinates in the camera coordinate system and X represents coordinates in the world coordinate system. The camera calibration matrix K is known, and applying the inverse of K to the point x gives the point x̂ = K⁻¹x, so that
x̂ = [R | t]X
x̂ is the representation of an image point in normalized coordinates, and the essential matrix is the fundamental matrix under normalized image coordinates.
The relation between the essential matrix E and the fundamental matrix F is obtained as:
E = K'ᵀFK
so that the fundamental matrix F can be converted into the essential matrix E when the camera intrinsic matrix is known.
③ Singular value decomposition of the essential matrix E to solve the rotation matrix R:
once the essential matrix E is known, the extrinsic matrix of the shipborne-aircraft onboard camera can be recovered; let E = [t]×R = SR, where S is an antisymmetric matrix; the rotation matrix R can then be solved. The solution of the onboard-camera extrinsic matrix P for which the photographed object lies in front of the onboard camera is selected, and the rotation matrix is thereby obtained.
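The conversion F → E and the decomposition of E into R and t can be sketched as follows; the classical W-matrix SVD decomposition and the point-in-front-of-camera (cheirality) test are standard steps assumed here, and K, K' are the calibration matrices of the two views (identical for a single onboard camera).

```python
import numpy as np

def essential_from_fundamental(F, K1, K2):
    """E = K2^T F K1 (K1 = K2 = K for the single onboard camera)."""
    return K2.T @ F @ K1

def decompose_essential(E):
    """SVD of E = [t]x R = S R; returns the four (R, t) candidates.
    The correct pair is the one for which the reconstructed object points
    lie in front of the onboard camera (cheirality test)."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:
        U = -U                             # keep proper rotations
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]                            # translation up to scale
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

# Example usage (F from the eight-point sketch above, K from the calibration step):
# E = essential_from_fundamental(F, K, K)
# candidates = decompose_essential(E)
# R, t = the candidate that places the matched points in front of the camera,
#        e.g. selected with cv2.recoverPose(E, pts1, pts2, K) in OpenCV.
```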
(5) Fusing IMU data and airborne visual data
The Kalman filter algorithm flowchart is shown in fig. 5. The IMU and visual signals are simulated with experimentally acquired data, and the attitude of the camera is changed during shooting; relative to the background there are an upward viewing angle, a right viewing angle and a frontal viewing angle. Continuous pictures and the corresponding IMU data are acquired and aligned by time stamps. The working environment is as follows: the attitude angles are resolved on the matlab and VS2013 platforms.
The Euler angle is used as the state quantity xk and the result of the monocular vision measurement as the observed quantity yk; Ak and Ck are both taken as identity matrices, and w and v are regarded as independently distributed white noise satisfying a normal distribution, giving the following state and measurement equations respectively:
xk=xk-1+wk-1
yk=xk+vk
p(w) ~ N(0, Q) and p(v) ~ N(0, R), where Q is the variance of the process noise, R is the variance of the measurement noise, and p denotes the probability density of the noise. Q and R should vary with time but are here set to constants.
Because the two kinds of data are acquired at different frequencies, in an experiment of less than 50 s the IMU measurement yields 9162 data points while the vision system acquires 687 pictures, so 9162 attitude angles have to be put into correspondence with 687 attitude angles. To make the two sets of attitude angles correspond, every 40 of the 9162 attitude angles are averaged and every 3 of the 687 attitude angles are averaged, so that 224 data points can be put into correspondence and fused in the Kalman filtering program.
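A minimal sketch of this rate alignment and of the identity-matrix Kalman filter is given below. The block-averaging of 40 IMU attitude angles and 3 vision attitude angles per fused sample follows the text; the Q and R values, the initial covariance and the use of the IMU angle increment to drive the prediction step are assumptions, since the patent does not spell out how the IMU sequence enters the state equation.

```python
import numpy as np

def block_average(x, block):
    """Average consecutive blocks of `block` samples (trailing remainder dropped)."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // block) * block
    return x[:n].reshape(-1, block).mean(axis=1)

def kalman_fuse(imu_angles, vis_angles, Q=1e-4, R=1e-2):
    """x_k = x_{k-1} + w, y_k = x_k + v with A = C = 1 (scalar per angle).
    The IMU sequence drives the prediction, the vision sequence is the measurement."""
    x = vis_angles[0]            # initial state
    P = 1.0                      # initial covariance (placeholder)
    fused = []
    for k in range(1, len(vis_angles)):
        x_pred = x + (imu_angles[k] - imu_angles[k - 1])   # prediction step
        P_pred = P + Q
        K = P_pred / (P_pred + R)                          # Kalman gain
        x = x_pred + K * (vis_angles[k] - x_pred)          # update with vision
        P = (1.0 - K) * P_pred
        fused.append(x)
    return np.array(fused)

# 9162 IMU attitude angles -> average every 40; 687 vision angles -> average every 3
imu_avg = block_average(np.random.randn(9162) * 0.01, 40)   # placeholder data
vis_avg = block_average(np.random.randn(687) * 0.05, 3)     # placeholder data
n = min(len(imu_avg), len(vis_avg))   # aligned samples (the patent reports 224)
fused = kalman_fuse(imu_avg[:n], vis_avg[:n])
```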
The invention fuses the IMU and visual signals and adopts an extended Kalman filtering algorithm to realize multi-sensor information fusion, improving the precision and reliability of the guidance information.
The other parts of this embodiment are the same as those of embodiment 4, and thus are not described again.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way, and all simple modifications and equivalent variations of the above embodiments according to the technical spirit of the present invention are included in the scope of the present invention.

Claims (5)

1. A landing guide method based on an unmanned aerial vehicle multi-sensor information fusion technology is characterized by mainly comprising the following steps:
step S100: carrying out information synchronization based on a least square method:
step S200: carrier landing guide sensor fusion based on Bayesian estimation; after the information synchronization of step S100, the data of each sensor in the kth measurement interval are acquired, the maximum a posteriori of the kth sensor data is calculated, the local average LA of the kth measurement is calculated, and the global average GA is then calculated; the measurement closeness of each sensor is calculated, and if the measurement closeness is greater than or equal to a preset threshold D, a single fused value is calculated as the similarity output:
[fusion formula given as an image in the original]
wherein e ∈ {1, 2} is the number of sensors selected after the threshold logic comparison and x_total is the single fused value;
step S300: IMU/airborne visual attitude fusion based on Kalman filtering; IMU experimental data are acquired, attitude angle data are obtained by quaternion resolving, and the intrinsic and extrinsic parameters of the camera are obtained by processing the captured images; the attitude angle data and the extrinsic parameters are fused by Kalman filtering;
the step S200 includes the steps of:
step S201: suppose that the vision sensor and the laser radar are installed below the origin of the unmanned aerial vehicle navigation coordinate system, i.e. on the OZn axis, with a relative distance D between them; the measurement closeness between the sensors can then be defined as:
di = [closeness formula given as an image in the original]
wherein GA is the average of the local means of the two sensors' measurements, which can also be called the global mean, and σi is the standard deviation of sensor i;
step S202: defining the local average of the kth measurement of each sensor,
LA(i) = the mean of the measurements of sensor i within the kth measurement interval [formula given as an image in the original],
the global mean can be expressed as
GA = (1/n)·Σ LA(i)
step S203: (max i) has the expression:
x̂i = arg max P(si|xi)
where x̂i is the maximum a posteriori of the sensor; the posterior probability of the ith sensor is P(si|xi), conditioned on the measured value xi, and si is the state quantity of the ith sensor;
step S204: the posterior probability of each sensor can be obtained from the Bayesian criterion:
P(s|x) = P(x|s)·P(s)/P(x)
wherein s is the state quantity and x is an observation obeying a normal distribution; given the measured value x, the probability density of s obeys a normal distribution N(μ, σ²), i.e. the posterior probability density, where μ is the mean and σ is the standard deviation; the mean of the likelihood function is as follows:
[likelihood-mean formula given as an image in the original]
with ΣP(s|x) = 1 [further formula given as an image in the original];
it can then be derived that P(x) = ΣP(x|s)·P(s); the right-hand side accumulates over all observations, and dividing by this sum gives the posterior P(s|x), so each new observation is processed by setting a prior probability and performing an iterative calculation, and the maximum a posteriori probability is extracted in each iteration;
step S205: calculating a single fused value;
the relation for the sensor data fusion time in step S100 is:
C = [fused-measurement formula given as an image in the original]
Var(C) = [fused-variance formula given as an image in the original]
where C and Var(C) are the target measurement value and the measurement variance of the sensor at the alignment time; Ci is the ith measurement taken while the sensor times are still unaligned; σ1 is the variance of the laser radar in the unfused state and σ2 is the variance of the vision sensor in the unfused state, the subscripts 1 and 2 denoting the laser radar and the vision sensor respectively; n represents the number of measurements of the sensor.
2. The landing guide method based on the unmanned aerial vehicle multi-sensor information fusion technology according to claim 1, wherein the step S300 mainly includes the steps of:
step S301: the acquisition frequency of IMU data is 200 Hz;
step S302: the attitude angle is solved by quaternion:
step S303: calibrating the intrinsic parameters of the camera, wherein the intrinsic parameters comprise the camera focal length f, the principal point coordinates and the pixel size; the intrinsic calibration is performed by the chessboard calibration method;
step S304: resolving the extrinsic parameters of the camera: solving the extrinsic parameters means solving the rotation matrix R and the translation matrix t, after which the coordinate transformation can be performed;
step S305: fusing the IMU data and the airborne visual data: the IMU and visual signals are simulated with experimentally acquired data, and the attitude of the camera is changed during shooting; relative to the background there are an upward viewing angle, a right viewing angle and a frontal viewing angle; continuous pictures and the corresponding IMU data are collected, the data are aligned by time stamps, and the attitude angles are resolved on the matlab and VS2013 platforms.
3. The landing guide method based on the unmanned aerial vehicle multi-sensor information fusion technology according to claim 2, wherein the step S302 mainly includes the steps of:
R'=qRq'
wherein R represents the vector to be rotated, R' represents the vector after rotation by the quaternion, q represents the quaternion and q' represents its conjugate; the scalar part of the quaternion is the cosine of half the rotation angle, and its vector part gives the direction of the instantaneous rotation axis n; the following equations indicate that the vector R is rotated through a rotation angle θ with respect to the reference coordinate system, the direction of the rotation axis is determined by the imaginary part of the quaternion, and cos α, cos β and cos γ are the direction cosines between the rotation axis n and the axes of the reference coordinate system:
q = λ + p1·i + p2·j + p3·k
λ = cos(θ/2)
p1 = sin(θ/2)·cos α
p2 = sin(θ/2)·cos β
p3 = sin(θ/2)·cos γ
in the formula, λ represents the real part of the quaternion and p1, p2 and p3 represent its imaginary part;
the quaternion attitude matrix is calculated as follows:
① determining the initial quaternion, whose input is the initial attitude angle: λ(0), p1(0), p2(0) and p3(0) are obtained from θ0, ψ0 and γ0 by the standard Euler-angle-to-quaternion conversion [initialization formula given as an image in the original];
θ0, ψ0 and γ0 represent the initial values of the pitch angle, yaw angle and roll angle;
② real-time calculation of the quaternion scalar part and vector part λ, p1, p2, p3: the input signal is the digital output of the gyroscope,
Δθδ = ∫ ωδb dt
wherein δ is any one of x, y and z, ωδb is the triaxial angular rate and Δθ is the angular increment obtained by integrating the gyroscope output; the calculation adopts the second-order Runge-Kutta method:
K1 = (1/2)Ωb(t)q(t),  Y = q(t) + T·K1,  K2 = (1/2)Ωb(t+T)·Y,  q(t+T) = q(t) + (T/2)(K1 + K2)
where T is the time interval, t represents the current time, Ωb(t) is the antisymmetric matrix of the gyroscope angular rates, and K1, K2 and Y are intermediate variables; the formulas indicate that the quaternion at the next time instant is obtained from the quaternion at the previous time instant and an estimated slope;
③ real-time calculation of the attitude matrix: the attitude matrix T(n) = [Tij(n)] is determined from the inputs λ(n), p1(n), p2(n) and p3(n) [quaternion form of the attitude matrix given as an image in the original]; T(n) represents the rotation matrix from the geodetic (navigation) frame to the body frame;
④ calculation of the carrier attitude angles: the attitude angles θ, ψ and γ are determined, with inputs T11(n), T12(n), T13(n), T23(n) and T33(n):
θ = -arcsin(T13(n))
ψ = arctan(T12(n)/T11(n))
γ = arctan(T23(n)/T33(n))
The difference between computing the attitude matrix by the direction cosine method and by the quaternion method lies mainly in how the attitude matrix is described, as shown by the direction cosine form of the attitude matrix [given as an image in the original]; the direction cosine matrix differential equation to be solved is
Ċ = C·Ω
and once the direction cosine matrix C has been obtained the attitude angles can be extracted, where Ω is the antisymmetric matrix of the rotational angular velocities.
4. The landing guide method based on unmanned aerial vehicle multi-sensor information fusion technology according to claim 3, wherein the rotation matrix R is solved in step S304 as follows:
① the epipolar geometric relationship is computed from the feature matching points of two consecutive pictures to obtain the fundamental matrix F:
for any pair of matching points x ↔ x' in the two images, x'ᵀFx = 0; the coordinates of the matching points in the two images are written as x = (x, y, 1)ᵀ and x' = (x', y', 1)ᵀ, and fij is the element in row i and column j of the fundamental matrix F, so that:
(x', y', 1)·F·(x, y, 1)ᵀ = 0
after expansion this gives:
x'xf11+x'yf12+x'f13+y'xf21+y'yf22+y'f23+xf31+yf32+f33=0
writing the fundamental matrix F in the form of a column vector f, then:
[x'x x'y x' y'x y'y y' x y 1]f=0
randomly selecting 8 pairs from the matching point pairs, denoted (xi, yi) ↔ (x'i, y'i), i = 1, …, 8, gives the stacked linear system:
Af = 0, where each row of A is [x'ixi x'iyi x'i y'ixi y'iyi y'i xi yi 1]
letting the matrix on the left be A, the fundamental matrix F is obtained from Af = 0 by a linear solution; F is then used for a consistency check, and the number n' of point pairs that pass the check is counted;
another 8 point pairs are selected at random and the above steps are repeated; the F with the largest n' is taken as the final fundamental matrix F;
② the fundamental matrix F is converted under normalized image coordinates to obtain the essential matrix E:
assume the camera extrinsic matrix P = [R | t] and x = PX, where R represents the rotation between the coordinate systems, t represents the translation between the two pictures, x represents coordinates in the camera coordinate system and X represents coordinates in the world coordinate system; the camera calibration matrix K is known, and applying the inverse of K to the point x gives the point x̂ = K⁻¹x, so that
x̂ = [R | t]X
x̂ is the representation of an image point in normalized coordinates, and the essential matrix is the fundamental matrix under normalized image coordinates;
the relation between the essential matrix E and the fundamental matrix F is obtained as:
E = K'ᵀFK
so that the fundamental matrix F can be converted into the essential matrix E when the camera intrinsic matrix is known;
③ singular value decomposition of the essential matrix E to solve the rotation matrix R:
once the essential matrix E is known, the extrinsic matrix of the shipborne-aircraft onboard camera can be recovered; let E = [t]×R = SR, where S is an antisymmetric matrix; the rotation matrix R can then be solved; the solution of the onboard-camera extrinsic matrix P for which the photographed object lies in front of the onboard camera is selected, and the rotation matrix is thereby obtained.
5. The landing guide method based on unmanned aerial vehicle multi-sensor information fusion technology as claimed in claim 4, wherein in step S305 the Euler angle is used as the state quantity xk and the result of the monocular vision measurement as the observed quantity yk; Ak and Ck are both taken as identity matrices, and w and v are regarded as independently distributed white noise satisfying a normal distribution, giving the following state and measurement equations respectively:
xk=xk-1+wk-1
yk=xk+vk
p(w) ~ N(0, Q) and p(v) ~ N(0, R), where Q is the variance of the process noise, R is the variance of the measurement noise, and p denotes the probability density of the noise; Q and R vary with time but are here set to constants; because the two kinds of data are acquired at different frequencies, in an experiment of less than 50 s the IMU measurement yields 9162 data points while the vision system acquires 687 pictures, so 9162 attitude angles have to be put into correspondence with 687 attitude angles; to make the two sets of attitude angles correspond, every 40 of the 9162 attitude angles are averaged and every 3 of the 687 attitude angles are averaged, so that 224 data points can be put into correspondence and fused in the Kalman filtering program.
CN201910317175.8A 2019-04-19 2019-04-19 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology Active CN110081881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910317175.8A CN110081881B (en) 2019-04-19 2019-04-19 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910317175.8A CN110081881B (en) 2019-04-19 2019-04-19 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology

Publications (2)

Publication Number Publication Date
CN110081881A CN110081881A (en) 2019-08-02
CN110081881B true CN110081881B (en) 2022-05-10

Family

ID=67415664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910317175.8A Active CN110081881B (en) 2019-04-19 2019-04-19 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology

Country Status (1)

Country Link
CN (1) CN110081881B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110968023B (en) * 2019-10-14 2021-03-09 北京航空航天大学 Unmanned aerial vehicle vision guiding system and method based on PLC and industrial camera
CN111679680A (en) * 2019-12-31 2020-09-18 华东理工大学 Unmanned aerial vehicle autonomous landing method and system
CN111833281B (en) * 2020-06-08 2024-04-09 上海宇航系统工程研究所 Multi-vision sensor data fusion method oriented to recycling of reusable rocket
CN111949123B (en) * 2020-07-01 2023-08-08 青岛小鸟看看科技有限公司 Multi-sensor handle controller hybrid tracking method and device
CN112712565B (en) * 2020-12-28 2024-03-01 中国民航大学 Aircraft skin damage unmanned aerial vehicle winding verification positioning method based on vision and IMU fusion
CN114489101B (en) * 2022-01-19 2023-09-29 成都飞机工业(集团)有限责任公司 Terminal guidance control method and system for unmanned aerial vehicle
CN115629164B (en) * 2022-12-21 2023-04-14 天津飞眼无人机科技有限公司 Unmanned aerial vehicle carbon flux monitoring data acquisition equipment and processing method
CN115857520B (en) * 2023-02-15 2023-05-30 北京航空航天大学 Unmanned aerial vehicle landing state monitoring method based on combination of vision and ship state
CN116189272B (en) * 2023-05-05 2023-07-07 南京邮电大学 Facial expression recognition method and system based on feature fusion and attention mechanism
CN116594080B (en) * 2023-07-17 2023-12-01 中国海洋大学 Underwater target detection system and detection method


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69117661T2 (en) * 1990-11-06 1996-07-18 Fujitsu Ten Ltd Direction sensor with an earth magnetism sensor and a rotation speed gyro sensor and navigation system, which contains this direction sensor
CN102052078B (en) * 2010-11-04 2012-09-26 华中科技大学 Real-time guide system of multi-sensor data fusion shield machine
EP3470787B1 (en) * 2013-11-27 2020-04-15 The Trustees Of The University Of Pennsylvania Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (mav)
WO2015143173A2 (en) * 2014-03-19 2015-09-24 Neurala, Inc. Methods and apparatus for autonomous robotic control
CN105492985B (en) * 2014-09-05 2019-06-04 深圳市大疆创新科技有限公司 A kind of system and method for the control loose impediment in environment

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103424114A (en) * 2012-05-22 2013-12-04 同济大学 Visual navigation/inertial navigation full combination method
CN103777220A (en) * 2014-01-17 2014-05-07 西安交通大学 Real-time and accurate pose estimation method based on fiber-optic gyroscope, speed sensor and GPS
EP3158293A1 (en) * 2015-05-23 2017-04-26 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN105021184A (en) * 2015-07-08 2015-11-04 西安电子科技大学 Pose estimation system and method for visual carrier landing navigation on mobile platform
CN106708066A (en) * 2015-12-20 2017-05-24 中国电子科技集团公司第二十研究所 Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation
CN106679648A (en) * 2016-12-08 2017-05-17 东南大学 Vision-inertia integrated SLAM (Simultaneous Localization and Mapping) method based on genetic algorithm
CN107014371A (en) * 2017-04-14 2017-08-04 东南大学 UAV integrated navigation method and apparatus based on the adaptive interval Kalman of extension
CN108052110A (en) * 2017-09-25 2018-05-18 南京航空航天大学 UAV Formation Flight method and system based on binocular vision
CN107833249A (en) * 2017-09-29 2018-03-23 南京航空航天大学 A kind of carrier-borne aircraft landing mission attitude prediction method of view-based access control model guiding
CN107592442A (en) * 2017-10-10 2018-01-16 南京瑞安腾企业管理咨询有限公司 A kind of marine exploration multi-sensor amalgamation device

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
"A visual/inertial integrated landing guidance method for UAV landing on the ship";Meng, Y等;《AEROSPACE SCIENCE AND TECHNOLOGY》;20190102;第85卷;474-480 *
"Small carrier UAV lateral autonomous landing system(Article)";Zheng, F.a等;《Nanjing Hangkong Hangtian Daxue Xuebao/Journal of Nanjing University of Aeronautics and Astronautics》;20131231;第45卷(第1期);82-87 *
"单目视觉惯性融合方法在无人机位姿估计中的应用";茹祥宇等;《控制理论与应用》;20181231(第6期);50-58 *
"地面在线飞行仿真系统设计";余长贵等;《系统仿真学报》;20180228;第30卷(第2期);513-520 *
"基于嵌入式的四旋翼飞行器姿态控制设计";赵世荣;《中国优秀博硕士学位论文全文数据库(硕士)工程科技Ⅱ辑》;20170115(第01期);第C031-29页第3.1节 *
"基于改进粒子滤波的多传感器融合空间目标跟踪";薛广月等;《空天资源的可持续发展——第一届中国空天安全会议论文集》;20161231;第1-5节 *
"增强现实环境中的三维重建技术研究";袁钰杰;《中国优秀博硕士学位论文全文数据库(硕士)信息科技辑》;20091015(第10期);第I138-492页第3.4-4.3节 *
"融合 IMU 与单目视觉的无人机自主定位方法";陈丁等;《系统仿真学报》;20171231;第29卷;9-14 *

Also Published As

Publication number Publication date
CN110081881A (en) 2019-08-02

Similar Documents

Publication Publication Date Title
CN110081881B (en) Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN107741229B (en) Photoelectric/radar/inertia combined carrier-based aircraft landing guiding method
CN102741706B (en) The geographical method with reference to image-region
CN110032201A (en) A method of the airborne visual gesture fusion of IMU based on Kalman filtering
CN108917753B (en) Aircraft position determination method based on motion recovery structure
Liu et al. An off-board vision system for relative attitude measurement of aircraft
CN112529957A (en) Method and device for determining pose of camera device, storage medium and electronic device
Veth et al. Stochastic constraints for efficient image correspondence search
US9816786B2 (en) Method for automatically generating a three-dimensional reference model as terrain information for an imaging device
Kaufmann et al. Shadow-based matching for precise and robust absolute self-localization during lunar landings
Xian et al. Fusing stereo camera and low-cost inertial measurement unit for autonomous navigation in a tightly-coupled approach
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN110411449B (en) Aviation reconnaissance load target positioning method and system and terminal equipment
Del Pizzo et al. Reliable vessel attitude estimation by wide angle camera
CN112902957B (en) Missile-borne platform navigation method and system
CN113962057B (en) Remote missile active section motion parameter correction method based on time sequence intersection
CN113301248B (en) Shooting method and device, electronic equipment and computer storage medium
CN114842224A (en) Monocular unmanned aerial vehicle absolute vision matching positioning scheme based on geographical base map
Yan et al. Horizontal velocity estimation via downward looking descent images for lunar landing
Alix et al. Error characterization of flight trajectories reconstructed using Structure from Motion
CN111674573B (en) Non-parallel gravitational field deep space impact control method and system based on proportional guidance
Klionovska et al. Experimental analysis of measurements fusion for pose estimation using PMD sensor
Arai et al. Fast vision-based localization for a mars airplane
Alix Error characterization of flight trajectories reconstructed using structure from motion
CN116257134B (en) Method, device, equipment and medium for tracking handle helmet under non-inertial reference system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant