CN107833249A - Method for estimating attitude of shipboard aircraft in landing process based on visual guidance - Google Patents

Method for estimating attitude of shipboard aircraft in landing process based on visual guidance

Info

Publication number
CN107833249A
CN107833249A (application CN201710904602.3A)
Authority
CN
China
Prior art keywords
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710904602.3A
Other languages
Chinese (zh)
Other versions
CN107833249B (en)
Inventor
彭聪
曾聪
甄子洋
王新华
江驹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201710904602.3A priority Critical patent/CN107833249B/en
Publication of CN107833249A publication Critical patent/CN107833249A/en
Application granted granted Critical
Publication of CN107833249B publication Critical patent/CN107833249B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/207 - Analysis of motion for motion estimation over a hierarchy of resolutions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/277 - Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Measuring Magnetic Variables (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a vision-guided attitude estimation method for the shipboard aircraft landing process. Feature points of two consecutive frames captured by the shipboard aircraft are first extracted and matched. From the matched point pairs and their pixel coordinates, the epipolar geometry relation between the two frames is computed to obtain the fundamental matrix; the essential matrix is then determined from the correspondence between the fundamental matrix and the essential matrix; the rotation matrix is solved by singular value decomposition of the essential matrix; and the rotation matrix is converted into Euler angles, thereby estimating the attitude information of the shipboard aircraft during its descent. The present invention needs only two or more observed frames to estimate the attitude of the shipboard aircraft and therefore offers strong flexibility; the vision-guided approach is low-cost, highly accurate, and strongly resistant to interference.

Description

Method for estimating attitude of shipboard aircraft in landing process based on visual guidance
Technical field
The present invention relates to a vision-guided attitude estimation method for the shipboard aircraft landing process, and belongs to the technical field of computer vision.
Background technology
Shipboard aircraft are the main combat weapon of an aircraft carrier, and whether their performance can be brought into full play has a decisive effect on the carrier's combat capability. Many difficulties remain to be overcome, among them the landing problem: its danger level is high, landing is difficult, and landing accidents have caused heavy losses. Guiding shipboard aircraft to land safely has therefore always been a research focus worldwide.
Obtaining accurate, high-precision attitude information during the landing process is the key to the safe landing of shipboard aircraft, and this attitude information is provided by the navigation system. Traditional navigation technologies include inertial navigation, radar navigation, and GPS navigation, each with its own problems: inertial navigation places high demands on the inertial elements, and its error grows over time, degrading navigation accuracy; the positioning precision of radar navigation is limited by the radar itself, and a dedicated radar recovery station increases cost; GPS navigation is currently the most widely used mode, but because GPS relies on satellite positioning it is easily subject to electronic interference. Visual navigation, as a new navigation technology, has attracted wide attention because it is unaffected by terrain, offers high precision and a high degree of autonomy, and is relatively cheap.
Visual navigation is based on image processing: the images captured by the camera are processed to estimate the pose information of the target during its motion, which then supports further control of the target. Within visual navigation, obtaining the attitude information of the target from image information has always been a focus and a difficulty of research.
Content of the invention
To solve the above problems, the present invention builds on existing navigation theory and on the feature-point method of visual odometry, and proposes a vision-guided attitude estimation method for the shipboard aircraft landing process. The method obtains corresponding feature points from the image sequence of the descent captured by the shipboard aircraft camera, solves the relative motion parameters between frames from the change between two adjacent frames, and thereby obtains the pose information of the shipboard aircraft, which in turn enables motion control of the aircraft. The specific technical scheme is as follows:
A vision-guided attitude estimation method for the shipboard aircraft landing process comprises the following steps (a minimal end-to-end sketch of the five steps is given after this list):
S1. Extract and match feature points in two consecutive frames captured by the airborne camera of the shipboard aircraft to obtain matched point pairs;
S2. Compute the epipolar geometry relation between the two frames from the matched point pairs to obtain the fundamental matrix F;
S3. Convert the fundamental matrix F into the essential matrix E via normalized image coordinates;
S4. Solve the rotation matrix R by singular value decomposition of the essential matrix E;
S5. Convert the rotation matrix R into Euler angles, which describe the estimated pose information of the shipboard aircraft.
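For orientation, the following is a minimal sketch of steps S1-S5 in Python with OpenCV, assuming two grayscale frames and a calibrated intrinsic matrix K; the function name, the thresholds, and the use of cv2.recoverPose in place of a hand-written SVD are illustrative choices, not part of the patent text.

```python
import cv2
import numpy as np

def estimate_attitude(frame1, frame2, K):
    """Sketch of S1-S5: SIFT matching -> F -> E -> R -> Euler angles."""
    # S1: extract and match SIFT feature points in both frames
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(frame1, None)
    kp2, des2 = sift.detectAndCompute(frame2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # S2: fundamental matrix from the epipolar constraint, via RANSAC
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    inliers = mask.ravel() == 1

    # S3: essential matrix E = K'^T F K (one camera, so K' = K)
    E = K.T @ F @ K

    # S4: recover R (and the translation direction) from E; recoverPose
    # performs the SVD and the front-of-camera test internally
    _, R, t, _ = cv2.recoverPose(E, pts1[inliers], pts2[inliers], K)

    # S5: convert R into x-y-z Euler angles (formulas of step S5)
    theta_x = np.arctan2(R[2, 1], R[2, 2])
    theta_y = np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2]))
    theta_z = np.arctan2(R[1, 0], R[0, 0])
    return theta_x, theta_y, theta_z
```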
As a preferred scheme, S1 also includes judging whether matching succeeds: if the number of matched points falls below an expected value N, or no matches are found at all, the feature-point positions of the current frame are predicted from the previous frame's information, and the matched point pairs are updated in combination with the previous frame's feature points; the value of N is not less than 8.
As a preferred scheme, the feature-point positions of the current frame are predicted from the previous frame's information using a Kalman filter.
As a preferred scheme, step S1 performs feature-point extraction and matching with the SIFT algorithm.
As a preferred scheme, step S1 includes:
S11. Build the difference-of-Gaussians pyramid: construct the Gaussian pyramid, subtract adjacent layers within each group of the completed Gaussian pyramid, and arrange the resulting series of difference images by group; these difference images form the difference-of-Gaussians pyramid.
S12. Preliminary screening of key points: find the local extremum points of the difference-of-Gaussians pyramid and define them as key points.
S13. Accurate localization of key points: further screen the key points found, removing low-contrast key points and unstable edge-response points.
S14. Assignment of key-point orientation: compute the gradient information of the neighborhood pixels of each key point, and assign each accurately localized key point an orientation according to that gradient information.
S15. Key-point feature description: build a feature descriptor for each accurately localized key point, obtaining the feature vector of the key point from the gradient information.
S16. Feature-point matching: measure the similarity of two sets of feature points by the Euclidean distance between their feature vectors; a pair of matched points is accepted when the Euclidean distance between the two feature vectors is below a set distance threshold.
As a preferred scheme, in step S12, when searching for local extremum points, a sample point is compared with its 8 neighbors in the current image and the 9 nearest points in each adjacent layer; every point is traversed in turn, and a sample point is selected as a key point when it is larger or smaller than all of its neighbors.
As a preferred scheme, in step S14, after the gradient computation, the gradient directions of the neighborhood pixels are accumulated in a histogram; the peak direction of the histogram is taken as the main direction of the key point, and any direction exceeding 80% of the peak is retained as an auxiliary direction of the key point.
As a preferred scheme, step S2 computes the fundamental matrix with the RANSAC algorithm.
The RANSAC computation proceeds as follows:
For any pair of matched points $x \leftrightarrow x'$ in the two frames captured by the airborne camera, the fundamental matrix F satisfies the condition $x'^{T}Fx = 0$. Writing the matched point coordinates of the two frames as $x = (x, y, 1)^{T}$ and $x' = (x', y', 1)^{T}$, and letting $f_{ij}$ denote the element in row i, column j of F, we have:
$(x',\ y',\ 1)\begin{pmatrix} f_{11} & f_{12} & f_{13} \\ f_{21} & f_{22} & f_{23} \\ f_{31} & f_{32} & f_{33} \end{pmatrix}\begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = 0$
After expansion:
$x'xf_{11} + x'yf_{12} + x'f_{13} + y'xf_{21} + y'yf_{22} + y'f_{23} + xf_{31} + yf_{32} + f_{33} = 0$
Writing the fundamental matrix F as a column vector f, we have:
$[x'x \;\; x'y \;\; x' \;\; y'x \;\; y'y \;\; y' \;\; x \;\; y \;\; 1]\,f = 0$
Selecting 8 groups at random from the matched point pairs, denoted $x_i \leftrightarrow x_i'$, $i = 1, \ldots, 8$, gives:
$\begin{bmatrix} x_1'x_1 & x_1'y_1 & x_1' & y_1'x_1 & y_1'y_1 & y_1' & x_1 & y_1 & 1 \\ \vdots & & & & & & & & \vdots \\ x_8'x_8 & x_8'y_8 & x_8' & y_8'x_8 & y_8'y_8 & y_8' & x_8 & y_8 & 1 \end{bmatrix} f = 0$
Denoting the left-hand matrix by A gives Af = 0; a linear solve yields the fundamental matrix F. F is then used for verification, and the number n of point pairs that pass the check is counted. Another 8 groups of points are selected at random and the above steps repeated; the F that maximizes n is taken as the final fundamental matrix F.
As a preferred scheme, step S3 includes:
Let the extrinsic parameter matrix of the shipboard aircraft's airborne camera be P = [R | t], so that x = PX. With the calibration matrix K of the airborne camera known, applying the inverse of K to the point x gives the point $\hat{x} = K^{-1}x$; $\hat{x}$ is then the expression of the image point in normalized coordinates, and the essential matrix is the fundamental matrix under normalized image coordinates.
For a vector $a = (a_1, a_2, a_3)^{T}$, its corresponding antisymmetric matrix is:
$[a]_{\times} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix}$
The essential matrix then has the form:
$E = [t]_{\times}R$
With the point pair expressed in normalized image coordinates as $\hat{x} \leftrightarrow \hat{x}'$, the defining equation of the essential matrix is:
$\hat{x}'^{T}E\hat{x} = 0$
Substituting $\hat{x} = K^{-1}x$ and $\hat{x}' = K'^{-1}x'$ into the above gives $x'^{T}K'^{-T}EK^{-1}x = 0$.
Comparing with $x'^{T}Fx = 0$ yields the relation between the essential matrix E and the fundamental matrix F:
$E = K'^{T}FK$
Under the condition that the intrinsic matrix of the airborne camera is known, the fundamental matrix F can thus be converted into the essential matrix E by the above formula.
As a preferred scheme, step S4 includes:
With the essential matrix E known, the extrinsic parameters of the shipboard aircraft's airborne camera can be recovered. Let $E = [t]_{\times}R = SR$, where S is an antisymmetric matrix; the block decomposition of an antisymmetric matrix gives $S = kUZU^{T}$. Using the matrices:
$W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}, \quad Z = \begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$
up to sign, $Z = \mathrm{diag}(1,1,0)\,W$, so $S = U\,\mathrm{diag}(1,1,0)\,WU^{T}$, which yields the singular value decomposition of E, i.e.:
$E = SR = U\,\mathrm{diag}(1,1,0)\,(WU^{T}R)$
Assuming the singular value decomposition of E is $U\,\mathrm{diag}(1,1,0)\,V^{T}$ and $R = UXV^{T}$, we have:
$U\,\mathrm{diag}(1,1,0)\,V^{T} = E = SR = (UZU^{T})(UXV^{T}) = U(ZX)V^{T}$
Hence $ZX = \mathrm{diag}(1,1,0)$, which gives $X = W$ or $X = W^{T}$.
Moreover, since $St = [t]_{\times}t = 0$, it follows that $t = U(0,0,1)^{T} = u_3$, i.e. t equals the last column of the factor matrix U. The extrinsic matrix P of the airborne camera, composed of the rotation matrix and the translation vector, therefore has four possible solutions, i.e.:
$P = [UWV^{T} \mid u_3];\; [UWV^{T} \mid -u_3];\; [UW^{T}V^{T} \mid u_3];\; [UW^{T}V^{T} \mid -u_3]$
The solution for which the photographed object lies in front of the airborne camera of the shipboard aircraft is selected, giving the extrinsic matrix P of the airborne camera and hence the rotation matrix.
As a preferred scheme, step S5 includes:
Let the rotation matrices of the shipboard aircraft rotating alone through an angle θ about the x, y, and z axes be $R_x(\theta)$, $R_y(\theta)$, $R_z(\theta)$ respectively:
$R_x(\theta) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix}, \quad R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix}, \quad R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$
If the shipboard aircraft rotates in the order x-axis, y-axis, z-axis, and the sines and cosines of the three Euler angles about x, y, z are written $s_x, c_x, s_y, c_y, s_z, c_z$, the composed rotation matrix is:
$R = R_z \cdot R_y \cdot R_x = \begin{pmatrix} c_y c_z & c_z s_x s_y - c_x s_z & s_x s_z + c_x c_z s_y \\ c_y s_z & c_x c_z + s_x s_y s_z & c_x s_y s_z - c_z s_x \\ -s_y & c_y s_x & c_x c_y \end{pmatrix}$
Writing the rotation matrix as:
$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}$
and letting the Euler angles of the shipboard aircraft about the x, y, and z axes be $\theta_x, \theta_y, \theta_z$, the Euler angle values follow from the rotation matrix expression:
$\theta_x = \mathrm{atan2}(r_{32}, r_{33})$
$\theta_y = \mathrm{atan2}(-r_{31}, \sqrt{r_{32}^{2} + r_{33}^{2}})$
$\theta_z = \mathrm{atan2}(r_{21}, r_{11})$
In this way, with visual guidance, the attitude information of the shipboard aircraft during its descent is estimated.
Compared with the prior art, the vision-guided attitude estimation method for the shipboard aircraft landing process proposed by the present invention has the following advantages:
(1) The vision-guided approach of the present invention needs only two or more observed frames to estimate the attitude of the shipboard aircraft; it is highly flexible, strongly resistant to interference, widely applicable, highly accurate, and cheap.
(2) Because the matched point pairs far exceed 8 pairs, the present invention computes the fundamental matrix with the RANSAC algorithm, making full use of all matched point pairs; this substantially improves the accuracy of the fundamental-matrix computation and effectively avoids the influence of erroneous data on the overall result, yielding more accurate pose information for the shipboard aircraft.
(3) The ideas of the feature-point method used in the present invention are also an important reference for solving camera pose estimation, simultaneous localization and mapping, and similar problems.
Brief description of the drawings
Fig. 1 is the flow chart of the vision-guided attitude estimation method for the shipboard aircraft landing process;
Fig. 2 is a schematic diagram of the generation of the difference-of-Gaussians pyramid;
Fig. 3 is a schematic diagram of spatial extremum point detection;
Fig. 4 is a schematic diagram of epipolar geometry;
Fig. 5 is a schematic diagram of the geometric interpretation of the four solutions of the camera matrix.
Embodiment
From the image sequence captured by the airborne camera of the shipboard aircraft during the landing process, features are extracted from and matched between frames, and the pixel coordinates of the matched point pairs at the same instant are computed; the coordinate values of the matched point pairs of two successive frames are then used to estimate the attitude information of the shipboard aircraft, enabling motion control of the aircraft.
Feature points are extracted from and matched between two consecutive frames captured by the airborne camera; if the matched point pairs are too few, or no matches are found, the feature-point positions of the current frame are predicted from the previous frame's information and the matched point pairs are updated with the previous frame's feature points.
The fundamental matrix is solved from the matched point pairs, the essential matrix is determined, and the rotation matrix is computed, thereby estimating the attitude information of the shipboard aircraft.
The present invention is described in detail below with the specific implementation steps and in conjunction with the accompanying drawings.
As shown in Fig. 1, the specific implementation of the present invention is as follows:
Step 1: Extract and match feature points in two consecutive frames captured by the airborne camera of the shipboard aircraft to obtain matched point pairs.
The present invention performs feature-point extraction and matching with the SIFT algorithm. The SIFT algorithm proceeds as follows:
(1) Generating the difference-of-Gaussians pyramid (DoG pyramid)
To build the difference-of-Gaussians pyramid, the Gaussian pyramid must be built first. The Gaussian pyramid is constructed as follows: first, the original images gathered during the landing of the shipboard aircraft are downsampled, with large images arranged below and small images above, layer by layer, forming a pyramid-like model; then each layer is blurred with Gaussians of different scales, and the set of Gaussian-blurred images of one layer at different scales is called a group. Together these images constitute the Gaussian pyramid.
As shown in Fig. 2, once the Gaussian pyramid is complete, the difference-of-Gaussians pyramid can be built on top of it: adjacent layers within the same group of the Gaussian pyramid are subtracted, and the resulting series of difference images, arranged by group, constitutes the difference-of-Gaussians pyramid. (A short construction sketch follows.)
(2) Detecting spatial extremum points (preliminary screening of key points)
The local extremum points of the difference-of-Gaussians pyramid serve as key points, and they are searched for on the basis of the difference-of-Gaussians pyramid.
As shown in Fig. 3, when searching for local extremum points, a sample point is compared with its 8 neighbors in the current image and the 9 nearest points in each of the layers above and below. Every point is traversed in turn; when a selected sample point is larger or smaller than all its neighbors, it is selected as a key point, as in the comparison sketch below.
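A minimal comparison sketch, assuming three adjacent DoG images of equal size and an interior pixel (r, c); the >=/<= convention is an implementation choice.

```python
import numpy as np

def is_local_extremum(dog_below, dog_same, dog_above, r, c):
    """Compare the sample with its 8 neighbours in the same image and
    the 9 nearest points in each adjacent layer (26 comparisons)."""
    cube = np.stack([dog_below[r-1:r+2, c-1:c+2],
                     dog_same[r-1:r+2, c-1:c+2],
                     dog_above[r-1:r+2, c-1:c+2]])
    value = dog_same[r, c]
    # the sample itself sits inside `cube`, so >= max / <= min is exactly
    # "larger or smaller than all neighbours" up to ties
    return value >= cube.max() or value <= cube.min()
```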
(3) Accurate localization of key points
The local extremum points detected in the previous step must be screened further.
The Taylor expansion of the scale-space function D(x, y, σ) (keeping terms up to second order) is:
$D(X) = D + \frac{\partial D^{T}}{\partial X}X + \frac{1}{2}X^{T}\frac{\partial^{2} D}{\partial X^{2}}X$
where $X = (x, y, \sigma)^{T}$, (x, y) is a point on the image, and σ is the scale-space factor.
Differentiating the above with respect to X and setting the derivative to zero gives the extremum point:
$\hat{X} = -\left(\frac{\partial^{2} D}{\partial X^{2}}\right)^{-1}\frac{\partial D}{\partial X}$
Substituting the extremum point back into the Taylor expansion of the scale-space function gives:
$D(\hat{X}) = D + \frac{1}{2}\frac{\partial D^{T}}{\partial X}\hat{X}$
Key points with $|D(\hat{X})| < 0.03$ are removed, eliminating the low-contrast key points.
Unstable edge-response points must also be removed. The Hessian matrix of a key point is:
$H = \begin{bmatrix} D_{xx} & D_{xy} \\ D_{xy} & D_{yy} \end{bmatrix}$
where $D_{xx}$ denotes the second derivative of the image at the same scale in the x direction, $D_{yy}$ the second derivative in the y direction, and $D_{xy}$ the derivative first in x and then in y. The principal curvatures are computed through the Hessian matrix. Let the large eigenvalue of the Hessian be α and the small one β, with α = rβ; with Tr(H) and Det(H) the trace and determinant of H:
$\mathrm{Tr}(H) = D_{xx} + D_{yy} = \alpha + \beta, \qquad \mathrm{Det}(H) = D_{xx}D_{yy} - D_{xy}^{2} = \alpha\beta$
If
$\frac{\mathrm{Tr}(H)^{2}}{\mathrm{Det}(H)} < \frac{(r+1)^{2}}{r}$
the key point is retained.
With the above two steps, the accurate localization of key points is complete; a compact check is sketched below.
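A compact sketch of the two screening tests, assuming a single DoG image normalised to [0, 1] and finite differences for the Hessian entries; the thresholds mirror the formulas above, and the interpolation to the exact extremum is omitted for brevity.

```python
import numpy as np

def keep_keypoint(dog, r, c, contrast_thresh=0.03, edge_r=10.0):
    """Low-contrast test |D| >= threshold, then the edge-response test
    Tr(H)^2 / Det(H) < (r+1)^2 / r on the 2x2 Hessian."""
    if abs(dog[r, c]) < contrast_thresh:          # low contrast: reject
        return False
    dxx = dog[r, c+1] + dog[r, c-1] - 2.0 * dog[r, c]
    dyy = dog[r+1, c] + dog[r-1, c] - 2.0 * dog[r, c]
    dxy = (dog[r+1, c+1] - dog[r+1, c-1]
           - dog[r-1, c+1] + dog[r-1, c-1]) / 4.0
    tr, det = dxx + dyy, dxx * dyy - dxy * dxy
    if det <= 0 or tr * tr / det >= (edge_r + 1.0) ** 2 / edge_r:
        return False                              # edge response: reject
    return True
```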
(4) Assignment of key-point orientation
For each key point, the gradient information of its neighborhood pixels is computed by formula, and each key point is assigned an orientation according to this gradient information.
Let the gradient magnitude be m(x, y), the gradient direction θ(x, y), and L(x, y) the scale-space value at the scale of the key point.
The gradient magnitude formula is:
$m(x, y) = \sqrt{(L(x+1, y) - L(x-1, y))^{2} + (L(x, y+1) - L(x, y-1))^{2}}$
The gradient direction formula is:
$\theta(x, y) = \tan^{-1}\left(\frac{L(x, y+1) - L(x, y-1)}{L(x+1, y) - L(x-1, y)}\right)$
After the gradient computation, the gradient directions of the neighborhood pixels are accumulated in a histogram. The peak direction of the histogram is taken as the main direction of the key point, and any direction exceeding 80% of the peak is retained as an auxiliary direction of the key point, as in the sketch below.
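A histogram sketch under the formulas above, assuming L is one scale-space image and (r, c) an interior key point; the 36-bin resolution and the square window are illustrative simplifications (the Gaussian weighting of contributions is omitted).

```python
import numpy as np

def assign_orientations(L, r, c, radius=8, n_bins=36):
    """Accumulate gradient magnitudes by direction; the peak bin is the
    main direction, bins above 80% of the peak are auxiliary directions."""
    hist = np.zeros(n_bins)
    for i in range(r - radius, r + radius + 1):
        for j in range(c - radius, c + radius + 1):
            dy = L[i, j + 1] - L[i, j - 1]
            dx = L[i + 1, j] - L[i - 1, j]
            m = np.hypot(dx, dy)                       # gradient magnitude
            theta = np.degrees(np.arctan2(dy, dx)) % 360.0
            hist[int(theta * n_bins / 360.0) % n_bins] += m
    peak = hist.max()
    return [b * 360.0 / n_bins for b, v in enumerate(hist)
            if v >= 0.8 * peak]
```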
(5) Key-point feature description
A descriptor is built for each key point. The image region needed to compute the descriptor is determined, and the coordinate axes are rotated to the direction of the key point to guarantee rotational invariance. The sample points in the neighborhood are assigned to their corresponding sub-regions, the gradient values of each sub-region are distributed over 8 directions, and their weights are computed; the gradients of the eight directions of each seed point are obtained by interpolation. The 128 gradient values counted in this way form the feature vector of the key point.
(6) Feature-point matching
Based on the SIFT algorithm, the extracted feature points are matched. Matching measures the similarity of two sets of feature points by the Euclidean distance between their feature vectors. A feature point is represented by a 128-dimensional feature vector; if the Euclidean distance between two feature points is L and the two descriptors are $X_i$ and $Y_i$, then:
$L = \sqrt{\sum_{i=1}^{128}(X_i - Y_i)^{2}}$
A distance threshold is set; a pair of matched points is accepted when the Euclidean distance L between the two feature vectors is below the threshold, as sketched below.
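A matching sketch by the distance rule above, assuming two arrays of 128-dimensional descriptors; the threshold value is illustrative and would be tuned in practice.

```python
import numpy as np

def match_descriptors(des1, des2, dist_thresh=200.0):
    """Accept the nearest neighbour of each descriptor when the Euclidean
    distance between the two 128-D feature vectors is below the threshold."""
    # pairwise L2 distances, shape (len(des1), len(des2))
    d = np.linalg.norm(des1[:, None, :] - des2[None, :, :], axis=2)
    pairs = []
    for i in range(des1.shape[0]):
        j = int(np.argmin(d[i]))
        if d[i, j] < dist_thresh:
            pairs.append((i, j))
    return pairs
```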
Step 2: Judge whether matching succeeds, and handle the two cases differently.
After feature matching of the two frames captured by the shipboard aircraft, if the number of matched point pairs is large, matching is judged successful and the pixel coordinates of the matched point pairs are output directly. If fewer than 8 matched point pairs are obtained, or matching fails entirely, the match is judged unsuccessful: the feature-point coordinates of the current frame are predicted from the feature-point coordinates of the previous image with a Kalman filter, matched point pairs are formed with the feature points of the previous image, and the corresponding pixel coordinates are output for subsequent processing. The specific implementation is as follows:
Because the time between two adjacent frames is very short, the motion of the shipboard aircraft between adjacent frames can be approximated as uniform, so the system is a linear dynamic model whose state and observation equations are:
X(k) = AX(k-1) + BU(k) + w(k)
Z(k) = HX(k) + v(k)
where X(k) is the system state at time k, U(k) the control input to the system at time k, A the state-transition matrix, B the control-input matrix, w(k) the process noise, Z(k) the system measurement at time k, H the observation matrix, and v(k) the measurement noise.
The state vector X(k) is set to:
$X(k) = [x(k)\;\; y(k)\;\; x'(k)\;\; y'(k)]^{T}$
where x(k), y(k) are the horizontal and vertical coordinates of the feature point and x'(k), y'(k) the velocities in the x and y directions; X(k) thus carries both the position and the velocity information of the feature point.
The state-transition matrix A is set to:
$A = \begin{bmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$
where Δt is the interval between the two adjacent frames.
The observation matrix H is set to:
$H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}$
In addition, the process-noise covariance matrix Q, the measurement-noise covariance matrix R, and the minimum mean-square-error matrix P can all be set to diagonal matrices. The state and observation equations of the shipboard aircraft motion are thus defined, allowing the feature-point coordinates of the current image to be predicted, combined with the feature points of the previous frame into matched point pairs, and the corresponding pixel coordinates derived. (A prediction-step sketch follows.)
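A prediction-step sketch under the constant-velocity model above (no control input, so the BU(k) term vanishes); the process-noise scale q is illustrative.

```python
import numpy as np

def predict_feature_point(x_prev, P_prev, dt, q=1e-2):
    """One Kalman prediction step for the state X = [x, y, x', y']^T."""
    A = np.array([[1.0, 0.0, dt, 0.0],      # state-transition matrix
                  [0.0, 1.0, 0.0, dt],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    Q = q * np.eye(4)                        # process-noise covariance
    x_pred = A @ x_prev                      # predicted state
    P_pred = A @ P_prev @ A.T + Q            # predicted error covariance
    return x_pred, P_pred                    # x_pred[:2] is the pixel position
```

When a measurement does arrive, the correction step would use the observation matrix H = [[1, 0, 0, 0], [0, 1, 0, 0]] defined above.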
Step 3: Compute the epipolar geometry relation between the two frames from the matched point pairs, i.e. the fundamental matrix F.
The preferred embodiment computes the fundamental matrix with the RANSAC algorithm. The concept of epipolar geometry is outlined first. As shown in Fig. 4, C and C' are the centers of the two cameras; for a point X in space, the projections onto the two views are x and x' respectively, and the camera baseline is the straight line CC' connecting the two camera centers. The camera centers C and C', the space point X, and the image points x and x' lie on a common plane π. The camera baseline intersects the image planes at the two epipoles e and e'. Any plane π containing the baseline is an epipolar plane and intersects the image planes in the epipolar lines l and l'. For a point x in one image there is a corresponding epipolar line l' in the other image, and any point matching x necessarily lies on l'; there thus exists a mapping from points to their corresponding epipolar lines:
x → l'
For a vector $e' = (e_1', e_2', e_3')^{T}$, its corresponding antisymmetric matrix is:
$[e']_{\times} = \begin{bmatrix} 0 & -e_3' & e_2' \\ e_3' & 0 & -e_1' \\ -e_2' & e_1' & 0 \end{bmatrix}$
Given the points x' and e', x' lies on the epipolar line l'. Moreover, x and x' are projectively equivalent, so a homography matrix $H_{\pi}$ can be found such that $x' = H_{\pi}x$; then:
$l' = e' \times x' = [e']_{\times}x' = [e']_{\times}H_{\pi}x = Fx$
It follows that the fundamental matrix F represents the projective mapping x → l', and for any pair of corresponding points $x \leftrightarrow x'$ in the two images, the fundamental matrix satisfies the condition $x'^{T}Fx = 0$.
The fundamental matrix F is now solved. Since the number of matched point pairs exceeds 8, RANSAC is used so that all of these pairs are exploited in finding F.
Write the point coordinates as $x = (x, y, 1)^{T}$ and $x' = (x', y', 1)^{T}$, and let $f_{ij}$ be the element in row i, column j of F; then:
$(x',\ y',\ 1)\begin{pmatrix} f_{11} & f_{12} & f_{13} \\ f_{21} & f_{22} & f_{23} \\ f_{31} & f_{32} & f_{33} \end{pmatrix}\begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = 0$
After expansion:
$x'xf_{11} + x'yf_{12} + x'f_{13} + y'xf_{21} + y'yf_{22} + y'f_{23} + xf_{31} + yf_{32} + f_{33} = 0$
Writing the matrix F as a column vector f:
$[x'x \;\; x'y \;\; x' \;\; y'x \;\; y'y \;\; y' \;\; x \;\; y \;\; 1]\,f = 0$
Selecting 8 groups of points at random, denoted $x_i \leftrightarrow x_i'$, $i = 1, \ldots, 8$:
$\begin{bmatrix} x_1'x_1 & x_1'y_1 & x_1' & y_1'x_1 & y_1'y_1 & y_1' & x_1 & y_1 & 1 \\ \vdots & & & & & & & & \vdots \\ x_8'x_8 & x_8'y_8 & x_8' & y_8'x_8 & y_8'y_8 & y_8' & x_8 & y_8 & 1 \end{bmatrix} f = 0$
Denote the left-hand matrix by A, so that Af = 0; a linear solve yields the fundamental matrix F. F is used for verification, and the number n of point pairs that pass the check is counted. Another 8 groups of points are selected at random and the steps repeated; the F that maximizes n is the final fundamental matrix F. (A compact sketch of this loop follows.)
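A compact sketch of the linear solve and the RANSAC loop above, assuming pts1 and pts2 are (N, 2) pixel-coordinate arrays; the residual tolerance is illustrative, and in practice the coordinates would also be conditioned (Hartley normalisation) before the linear solve.

```python
import numpy as np

def eight_point(p1, p2):
    """Solve Af = 0 for one 8-point sample via SVD, then enforce rank 2."""
    A = np.array([[x2*x1, x2*y1, x2, y2*x1, y2*y1, y2, x1, y1, 1.0]
                  for (x1, y1), (x2, y2) in zip(p1, p2)])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)   # smallest singular vector
    U, S, Vt = np.linalg.svd(F)
    return U @ np.diag([S[0], S[1], 0.0]) @ Vt

def ransac_fundamental(pts1, pts2, n_iter=500, tol=1e-2):
    """Repeat: draw 8 random pairs, fit F, count pairs with x'^T F x ~ 0."""
    h1 = np.column_stack([pts1, np.ones(len(pts1))])
    h2 = np.column_stack([pts2, np.ones(len(pts2))])
    best_F, best_n = None, -1
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        idx = rng.choice(len(pts1), size=8, replace=False)
        F = eight_point(pts1[idx], pts2[idx])
        residual = np.abs(np.einsum('ij,jk,ik->i', h2, F, h1))
        n = int((residual < tol).sum())
        if n > best_n:                           # keep the F maximising n
            best_F, best_n = F, n
    return best_F
```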
Step 4: From the relation between the fundamental matrix F and the essential matrix E, obtain the essential matrix E given the intrinsic matrix of the airborne camera.
Let the airborne camera matrix be P = [R | t], so that x = PX. With the calibration matrix K of the airborne camera known, applying the inverse of K to the point x gives the point $\hat{x} = K^{-1}x$; $\hat{x}$ is the representation of the image point in normalized coordinates. The fundamental matrix corresponding to the normalized camera matrix is called the essential matrix.
The essential matrix has the form:
$E = [t]_{\times}R$
With the point pair expressed in normalized image coordinates as $\hat{x} \leftrightarrow \hat{x}'$, the defining equation of the essential matrix is:
$\hat{x}'^{T}E\hat{x} = 0$
Substituting $\hat{x} = K^{-1}x$ and $\hat{x}' = K'^{-1}x'$ into the above gives $x'^{T}K'^{-T}EK^{-1}x = 0$. Comparing with $x'^{T}Fx = 0$ yields the relation between E and F:
$E = K'^{T}FK$
Under the condition that the intrinsic matrix of the airborne camera is known, F can thus be converted into E by the above formula, as in the one-line sketch below.
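The conversion itself is one line; a sketch assuming a single calibrated camera, so K' = K:

```python
import numpy as np

def essential_from_fundamental(F, K):
    # E = K'^T F K with K' = K for the single airborne camera
    return K.T @ F @ K
```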
Step 5: Recover the extrinsic parameter matrix of the camera from the computed essential matrix E, obtaining the rotation matrix R.
With the essential matrix E known, the airborne camera matrix can be recovered. Let $E = [t]_{\times}R = SR$, where S is an antisymmetric matrix; the block decomposition of an antisymmetric matrix gives $S = kUZU^{T}$. Using the matrices:
$W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}, \quad Z = \begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$
up to sign, $Z = \mathrm{diag}(1,1,0)\,W$ and $S = U\,\mathrm{diag}(1,1,0)\,WU^{T}$, so:
$E = SR = U\,\mathrm{diag}(1,1,0)\,(WU^{T}R)$
which is exactly the singular value decomposition of E.
Assuming the singular value decomposition (SVD) of E is $U\,\mathrm{diag}(1,1,0)\,V^{T}$ with $R = UXV^{T}$:
$U\,\mathrm{diag}(1,1,0)\,V^{T} = E = SR = (UZU^{T})(UXV^{T}) = U(ZX)V^{T}$
Hence $ZX = \mathrm{diag}(1,1,0)$, from which $X = W$ or $X = W^{T}$. Moreover, since $St = [t]_{\times}t = 0$, it follows that $t = U(0,0,1)^{T} = u_3$, equal to the last column of U. But the sign of E is unknown, so the sign of t cannot be determined, and the extrinsic matrix P of the airborne camera has four possible solutions:
$P = [UWV^{T} \mid u_3];\; [UWV^{T} \mid -u_3];\; [UW^{T}V^{T} \mid u_3];\; [UW^{T}V^{T} \mid -u_3]$
The geometric interpretations of the four solutions are shown in Fig. 5. In Fig. 5(1) the space point X lies in front of the cameras, matching the real situation, so the solution $P = [UWV^{T} \mid u_3]$ is selected, where the rotation matrix R is $UWV^{T}$ and the translation vector T is $u_3$; that is, the extrinsic matrix comprises the rotation matrix R and the translation vector T. The space point here is the photographed object, and the common-sense fact that the photographed object must lie in front of the camera determines the unique solution, as the selection sketch below illustrates.
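A selection sketch, assuming matched pixel points x1, x2 (length-2 arrays) and intrinsic matrix K; it decomposes E into the four candidates and keeps the one placing a triangulated test point in front of both cameras. The determinant sign fix is a standard numerical safeguard, not spelled out in the patent.

```python
import cv2
import numpy as np

W = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])

def decompose_essential(E):
    """Four candidate [R|t] from the SVD E = U diag(1,1,0) V^T."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U @ Vt) < 0:       # keep the rotations proper
        Vt = -Vt
    R1, R2, t = U @ W @ Vt, U @ W.T @ Vt, U[:, 2]
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

def pick_solution(E, K, x1, x2):
    """Return the R, t for which the scene point has positive depth."""
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    for R, t in decompose_essential(E):
        P2 = K @ np.hstack([R, t.reshape(3, 1)])
        X = cv2.triangulatePoints(P1, P2,
                                  np.asarray(x1, float).reshape(2, 1),
                                  np.asarray(x2, float).reshape(2, 1))
        X = (X[:3] / X[3]).ravel()      # dehomogenise
        if X[2] > 0 and (R @ X + t)[2] > 0:   # in front of both cameras
            return R, t
    return None
```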
Step 6: Convert the computed rotation matrix into Euler angles, estimating the attitude information of the shipboard aircraft during its descent.
Let the rotation matrices for rotating alone through an angle θ about the x, y, and z axes be $R_x(\theta)$, $R_y(\theta)$, $R_z(\theta)$:
$R_x(\theta) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix}, \quad R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix}, \quad R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$
If the shipboard aircraft rotates in the order x-axis, y-axis, z-axis, and the sines and cosines of the three Euler angles about x, y, z are written $s_x, c_x, s_y, c_y, s_z, c_z$, the composed rotation matrix is:
$R = R_z \cdot R_y \cdot R_x = \begin{pmatrix} c_y c_z & c_z s_x s_y - c_x s_z & s_x s_z + c_x c_z s_y \\ c_y s_z & c_x c_z + s_x s_y s_z & c_x s_y s_z - c_z s_x \\ -s_y & c_y s_x & c_x c_y \end{pmatrix}$
Writing the rotation matrix as:
$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}$
and letting the Euler angles about the x, y, and z axes be $\theta_x, \theta_y, \theta_z$, the Euler angle values, which describe the estimated pose information of the shipboard aircraft, follow from the rotation matrix expression (and are transcribed in code below):
$\theta_x = \mathrm{atan2}(r_{32}, r_{33})$
$\theta_y = \mathrm{atan2}(-r_{31}, \sqrt{r_{32}^{2} + r_{33}^{2}})$
$\theta_z = \mathrm{atan2}(r_{21}, r_{11})$
In summary, following the vision-guided idea of the present invention, the inter-frame images collected by the shipboard aircraft are feature-matched with the SIFT algorithm, and from the pixel coordinates of the matched point pairs a series of computations yields the rotation matrix, from which the current pose information of the shipboard aircraft is estimated.
The foregoing are only preferred embodiments of the present invention and are not intended to limit the invention; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent substitution, improvement, etc. made within the spirit and principles of the present invention shall be included in the scope of protection.

Claims (10)

  1. A vision-guided attitude estimation method for the shipboard aircraft landing process, characterized by comprising the following steps:
    S1. Extract and match feature points in two consecutive frames captured by the airborne camera of the shipboard aircraft to obtain matched point pairs;
    S2. Compute the epipolar geometry relation between the two frames from the matched point pairs to obtain the fundamental matrix F;
    S3. Convert the fundamental matrix F into the essential matrix E via normalized image coordinates;
    S4. Solve the rotation matrix R by singular value decomposition of the essential matrix E;
    S5. Convert the rotation matrix R into Euler angles, which describe the estimated pose information of the shipboard aircraft.
  2. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 1, characterized in that: S1 also includes judging whether matching succeeds; if the number of matched points falls below an expected value N, or no matches are found, the feature-point positions of the current frame are predicted from the previous frame's information and the matched point pairs are updated in combination with the previous frame's feature points; the value of N is not less than 8.
  3. The vision-guided attitude estimation method for the shipboard aircraft landing process according to any one of claims 1 to 2, characterized in that: step S1 performs feature-point extraction and matching with the SIFT algorithm.
  4. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 3, characterized in that said step S1 includes:
    S11. Build the difference-of-Gaussians pyramid: construct the Gaussian pyramid, subtract adjacent layers within each group of the completed Gaussian pyramid, and arrange the resulting series of difference images by group; these difference images form the difference-of-Gaussians pyramid;
    S12. Preliminary screening of key points: find the local extremum points of the difference-of-Gaussians pyramid and define them as key points;
    S13. Accurate localization of key points: further screen the key points found, removing low-contrast key points and unstable edge-response points;
    S14. Assignment of key-point orientation: compute the gradient information of the neighborhood pixels of each key point, and assign each accurately localized key point an orientation according to the gradient information;
    S15. Key-point feature description: build a feature descriptor for each accurately localized key point, obtaining the feature vector of the key point from the gradient information;
    S16. Feature-point matching: measure the similarity of two sets of feature points by the Euclidean distance between their feature vectors; a pair of matched points is accepted when the Euclidean distance between the two feature vectors is below a set distance threshold.
  5. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 4, characterized in that: in step S12, when searching for local extremum points, a sample point is compared with its 8 neighbors in the current image and the 9 nearest points in each adjacent layer; every point is traversed in turn, and a sample point is selected as a key point when it is larger or smaller than all of its neighbors.
  6. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 4, characterized in that: in step S14, after the gradient computation, the gradient directions of the neighborhood pixels are accumulated in a histogram; the peak direction of the histogram is taken as the main direction of the key point, and any direction exceeding 80% of the peak is retained as an auxiliary direction of the key point.
  7. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 1, characterized in that: said step S2 computes the fundamental matrix with the RANSAC algorithm, as follows. For any pair of matched points $x \leftrightarrow x'$ in the two frames captured by the airborne camera of the shipboard aircraft, the fundamental matrix F satisfies the condition $x'^{T}Fx = 0$; writing the matched point coordinates of the two frames as $x = (x, y, 1)^{T}$ and $x' = (x', y', 1)^{T}$, and letting $f_{ij}$ be the element in row i, column j of F:
    $(x',\ y',\ 1)\begin{pmatrix} f_{11} & f_{12} & f_{13} \\ f_{21} & f_{22} & f_{23} \\ f_{31} & f_{32} & f_{33} \end{pmatrix}\begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = 0$
    After expansion:
    $x'xf_{11} + x'yf_{12} + x'f_{13} + y'xf_{21} + y'yf_{22} + y'f_{23} + xf_{31} + yf_{32} + f_{33} = 0$
    Writing the fundamental matrix F as a column vector f:
    $[x'x \;\; x'y \;\; x' \;\; y'x \;\; y'y \;\; y' \;\; x \;\; y \;\; 1]\,f = 0$
    Selecting 8 groups at random from the matched point pairs, denoted $x_i \leftrightarrow x_i'$, $i = 1, \ldots, 8$:
    $\begin{bmatrix} x_1'x_1 & x_1'y_1 & x_1' & y_1'x_1 & y_1'y_1 & y_1' & x_1 & y_1 & 1 \\ \vdots & & & & & & & & \vdots \\ x_8'x_8 & x_8'y_8 & x_8' & y_8'x_8 & y_8'y_8 & y_8' & x_8 & y_8 & 1 \end{bmatrix} f = 0$
    Denoting the left-hand matrix by A gives Af = 0; a linear solve yields the fundamental matrix F, which is used for verification, counting the number n of point pairs that pass the check;
    Another 8 groups of points are selected at random and the above steps repeated; the F that maximizes n is taken as the final fundamental matrix F.
  8. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 1, characterized in that said step S3 includes:
    Let the extrinsic parameter matrix of the shipboard aircraft's airborne camera be P = [R | t], so that x = PX; with the calibration matrix K of the airborne camera known, applying the inverse of K to the point x gives the point $\hat{x} = K^{-1}x$; $\hat{x}$ is then the representation of the image point in normalized coordinates, and the essential matrix is the fundamental matrix under normalized image coordinates;
    For a vector $a = (a_1, a_2, a_3)^{T}$, its corresponding antisymmetric matrix is:
    $[a]_{\times} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix}$
    The essential matrix then has the form:
    $E = [t]_{\times}R$
    With the point pair expressed in normalized image coordinates as $\hat{x} \leftrightarrow \hat{x}'$, the defining equation of the essential matrix is:
    $\hat{x}'^{T}E\hat{x} = 0$
    Substituting $\hat{x} = K^{-1}x$ and $\hat{x}' = K'^{-1}x'$ into the above gives $x'^{T}K'^{-T}EK^{-1}x = 0$;
    Comparing with $x'^{T}Fx = 0$ gives the relation between the essential matrix E and the fundamental matrix F:
    $E = K'^{T}FK$
    Under the condition that the intrinsic matrix of the airborne camera is known, the fundamental matrix F can be converted into the essential matrix E by the above formula.
  9. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 1, characterized in that said step S4 includes:
    With the essential matrix E known, the extrinsic parameters of the airborne camera of the shipboard aircraft can be recovered; let $E = [t]_{\times}R = SR$, where S is an antisymmetric matrix, whose block decomposition gives $S = kUZU^{T}$; using the matrices:
    $W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}, \quad Z = \begin{bmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$
    up to sign, $Z = \mathrm{diag}(1,1,0)\,W$ and $S = U\,\mathrm{diag}(1,1,0)\,WU^{T}$, which yields the singular value decomposition of E, i.e.:
    $E = SR = U\,\mathrm{diag}(1,1,0)\,(WU^{T}R)$
    Assuming the singular value decomposition of E is $U\,\mathrm{diag}(1,1,0)\,V^{T}$ with $R = UXV^{T}$, then:
    $U\,\mathrm{diag}(1,1,0)\,V^{T} = E = SR = (UZU^{T})(UXV^{T}) = U(ZX)V^{T}$
    Hence $ZX = \mathrm{diag}(1,1,0)$, from which $X = W$ or $X = W^{T}$;
    Further, since $St = [t]_{\times}t = 0$, it follows that $t = U(0,0,1)^{T} = u_3$, i.e. the last column of the factor matrix U; the extrinsic matrix P of the airborne camera, composed of the rotation matrix and the translation vector, therefore has four possible solutions, i.e.:
    $P = [UWV^{T} \mid u_3];\; [UWV^{T} \mid -u_3];\; [UW^{T}V^{T} \mid u_3];\; [UW^{T}V^{T} \mid -u_3]$
    The solution for which the photographed object lies in front of the airborne camera of the shipboard aircraft is selected, giving the extrinsic matrix P of the airborne camera and hence the rotation matrix.
  10. The vision-guided attitude estimation method for the shipboard aircraft landing process according to claim 1, characterized in that said step S5 includes:
    Let the rotation matrices of the shipboard aircraft rotating alone through an angle θ about the x, y, and z axes be $R_x(\theta)$, $R_y(\theta)$, $R_z(\theta)$ respectively; then:
    $R_x(\theta) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix}, \quad R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix}, \quad R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$
    If the shipboard aircraft rotates in the order x-axis, y-axis, z-axis, and the sines and cosines of the three Euler angles about x, y, z are written $s_x, c_x, s_y, c_y, s_z, c_z$;
    The composed rotation matrix is:
    $R = R_z \cdot R_y \cdot R_x = \begin{pmatrix} c_y c_z & c_z s_x s_y - c_x s_z & s_x s_z + c_x c_z s_y \\ c_y s_z & c_x c_z + s_x s_y s_z & c_x s_y s_z - c_z s_x \\ -s_y & c_y s_x & c_x c_y \end{pmatrix}$
    Writing the rotation matrix as:
    $R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}$
    and letting the Euler angles of the shipboard aircraft about the x, y, and z axes be $\theta_x, \theta_y, \theta_z$, the Euler angle values follow from the rotation matrix expression:
    $\theta_x = \mathrm{atan2}(r_{32}, r_{33})$
    $\theta_y = \mathrm{atan2}(-r_{31}, \sqrt{r_{32}^{2} + r_{33}^{2}})$
    $\theta_z = \mathrm{atan2}(r_{21}, r_{11})$
    In this way, with visual guidance, the attitude information of the shipboard aircraft during its descent is estimated.
CN201710904602.3A 2017-09-29 2017-09-29 Method for estimating attitude of shipboard aircraft in landing process based on visual guidance Active CN107833249B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710904602.3A CN107833249B (en) 2017-09-29 2017-09-29 Method for estimating attitude of shipboard aircraft in landing process based on visual guidance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710904602.3A CN107833249B (en) 2017-09-29 2017-09-29 Method for estimating attitude of shipboard aircraft in landing process based on visual guidance

Publications (2)

Publication Number Publication Date
CN107833249A true CN107833249A (en) 2018-03-23
CN107833249B CN107833249B (en) 2020-07-07

Family

ID=61647541

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710904602.3A Active CN107833249B (en) 2017-09-29 2017-09-29 Method for estimating attitude of shipboard aircraft in landing process based on visual guidance

Country Status (1)

Country Link
CN (1) CN107833249B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109902695A (en) * 2019-03-01 2019-06-18 辽宁工程技术大学 Line feature correction and purification method for image pair linear feature matching
CN110081881A (en) * 2019-04-19 2019-08-02 成都飞机工业(集团)有限责任公司 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN110516516A (en) * 2018-05-22 2019-11-29 北京京东尚科信息技术有限公司 Robot pose measurement method and device, electronic equipment, storage medium
CN110874818A (en) * 2018-08-31 2020-03-10 阿里巴巴集团控股有限公司 Image processing and virtual space construction method, device, system and storage medium
CN111027010A (en) * 2019-11-14 2020-04-17 武汉天恒信息技术有限公司 Steel member cylinder fitting algorithm
CN111461998A (en) * 2020-03-11 2020-07-28 中国科学院深圳先进技术研究院 Environment reconstruction method and device
CN111899180A (en) * 2019-05-05 2020-11-06 上海闻通信息科技有限公司 Image key pixel direction positioning method
CN112068168A (en) * 2020-09-08 2020-12-11 中国电子科技集团公司第五十四研究所 Visual error compensation-based geological disaster unknown environment combined navigation method
CN112363528A (en) * 2020-10-15 2021-02-12 北京理工大学 Unmanned aerial vehicle anti-interference cluster formation control method based on airborne vision
CN112396662A (en) * 2019-08-13 2021-02-23 杭州海康威视数字技术股份有限公司 Method and device for correcting conversion matrix
CN113160315A (en) * 2021-04-16 2021-07-23 广东工业大学 Semantic environment map representation method based on dual quadric surface mathematical model
CN113643365A (en) * 2021-07-07 2021-11-12 紫东信息科技(苏州)有限公司 Camera pose estimation method, device, equipment and readable storage medium
CN115205564A (en) * 2022-09-16 2022-10-18 山东辰升科技有限公司 Unmanned aerial vehicle-based hull maintenance inspection method
CN116252581A (en) * 2023-03-15 2023-06-13 吉林大学 System and method for estimating vertical and pitching motion information of vehicle body under straight running working condition
CN117372924A (en) * 2023-10-18 2024-01-09 中国铁塔股份有限公司 Video detection method and device


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101770568A (en) * 2008-12-31 2010-07-07 南京理工大学 Target automatically recognizing and tracking method based on affine invariant point and optical flow calculation
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN103759716A (en) * 2014-01-14 2014-04-30 清华大学 Dynamic target position and attitude measurement method based on monocular vision at tail end of mechanical arm
CN107341814A (en) * 2017-06-14 2017-11-10 宁波大学 The four rotor wing unmanned aerial vehicle monocular vision ranging methods based on sparse direct method

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110516516A (en) * 2018-05-22 2019-11-29 北京京东尚科信息技术有限公司 Robot pose measurement method and device, electronic equipment, storage medium
CN110874818A (en) * 2018-08-31 2020-03-10 阿里巴巴集团控股有限公司 Image processing and virtual space construction method, device, system and storage medium
CN110874818B (en) * 2018-08-31 2023-06-23 阿里巴巴集团控股有限公司 Image processing and virtual space construction method, device, system and storage medium
CN109902695B (en) * 2019-03-01 2022-12-20 辽宁工程技术大学 Line feature correction and purification method for image pair linear feature matching
CN109902695A (en) * 2019-03-01 2019-06-18 辽宁工程技术大学 Line feature correction and purification method for image pair linear feature matching
CN110081881B (en) * 2019-04-19 2022-05-10 成都飞机工业(集团)有限责任公司 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN110081881A (en) * 2019-04-19 2019-08-02 成都飞机工业(集团)有限责任公司 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN111899180A (en) * 2019-05-05 2020-11-06 上海闻通信息科技有限公司 Image key pixel direction positioning method
CN111899180B (en) * 2019-05-05 2023-11-17 上海闻通信息科技有限公司 Image key pixel direction positioning method
CN112396662A (en) * 2019-08-13 2021-02-23 杭州海康威视数字技术股份有限公司 Method and device for correcting conversion matrix
CN112396662B (en) * 2019-08-13 2024-05-24 杭州海康威视数字技术股份有限公司 Conversion matrix correction method and device
CN111027010A (en) * 2019-11-14 2020-04-17 武汉天恒信息技术有限公司 Steel member cylinder fitting algorithm
CN111027010B (en) * 2019-11-14 2023-09-22 武汉天恒信息技术有限公司 Steel member cylinder fitting method
CN111461998A (en) * 2020-03-11 2020-07-28 中国科学院深圳先进技术研究院 Environment reconstruction method and device
WO2021179745A1 (en) * 2020-03-11 2021-09-16 中国科学院深圳先进技术研究院 Environment reconstruction method and device
CN112068168A (en) * 2020-09-08 2020-12-11 中国电子科技集团公司第五十四研究所 Visual error compensation-based geological disaster unknown environment combined navigation method
CN112363528A (en) * 2020-10-15 2021-02-12 北京理工大学 Unmanned aerial vehicle anti-interference cluster formation control method based on airborne vision
CN113160315A (en) * 2021-04-16 2021-07-23 广东工业大学 Semantic environment map representation method based on dual quadric surface mathematical model
CN113643365A (en) * 2021-07-07 2021-11-12 紫东信息科技(苏州)有限公司 Camera pose estimation method, device, equipment and readable storage medium
CN113643365B (en) * 2021-07-07 2024-03-19 紫东信息科技(苏州)有限公司 Camera pose estimation method, device, equipment and readable storage medium
CN115205564A (en) * 2022-09-16 2022-10-18 山东辰升科技有限公司 Unmanned aerial vehicle-based hull maintenance inspection method
CN115205564B (en) * 2022-09-16 2022-12-13 山东辰升科技有限公司 Unmanned aerial vehicle-based hull maintenance inspection method
CN116252581A (en) * 2023-03-15 2023-06-13 吉林大学 System and method for estimating vertical and pitching motion information of vehicle body under straight running working condition
CN116252581B (en) * 2023-03-15 2024-01-16 吉林大学 System and method for estimating vertical and pitching motion information of vehicle body under straight running working condition
CN117372924A (en) * 2023-10-18 2024-01-09 中国铁塔股份有限公司 Video detection method and device
CN117372924B (en) * 2023-10-18 2024-05-07 中国铁塔股份有限公司 Video detection method and device

Also Published As

Publication number Publication date
CN107833249B (en) 2020-07-07

Similar Documents

Publication Publication Date Title
CN107833249A (en) Method for estimating attitude of shipboard aircraft in landing process based on visual guidance
CN106407315B (en) Vehicle autonomous positioning method based on a street-view image database
CN107709928B (en) Method and device for real-time mapping and positioning
CN103149939B (en) Vision-based dynamic target tracking and positioning method for unmanned aerial vehicles
CN103954283B (en) Inertial integrated navigation method based on scene matching/visual odometry
CN105021184B (en) Pose estimation system and method for vision-based carrier landing navigation on a mobile platform
CN102607526B (en) Target posture measuring method based on binocular vision under double mediums
CN102435188B (en) Monocular vision/inertia autonomous navigation method for indoor environment
CN104748750B (en) Model-constrained in-orbit attitude estimation method and system for three-dimensional targets
CN106529538A (en) Method and device for positioning aircraft
CN110726406A (en) Improved nonlinear optimization monocular inertial navigation SLAM method
CN106096621B (en) Random feature point selection method for landing position detection based on vector constraints
CN103745458A (en) A robust method for estimating the rotation axis and mass center of a spatial target based on a binocular optical flow
CN109871739B (en) Automatic target detection and space positioning method for mobile station based on YOLO-SIOCTL
CN104281148A (en) Mobile robot autonomous navigation method based on binocular stereoscopic vision
CN110044374A (en) Method and odometer for measuring mileage by monocular vision based on image features
CN109596121A (en) Automatic target detection and spatial positioning method for a mobile station
Lin et al. An automatic key-frame selection method for monocular visual odometry of ground vehicle
CN105004337A (en) Straight line matching based autonomous navigation method for agricultural unmanned aerial vehicle
Zhao et al. RTSfM: Real-time structure from motion for mosaicing and DSM mapping of sequential aerial images with low overlap
Jianbang et al. Real-time monitoring of physical education classroom in colleges and universities based on open IoT and cloud computing
CN114693754A (en) Unmanned aerial vehicle autonomous positioning method and system based on monocular vision inertial navigation fusion
Oreifej et al. Horizon constraint for unambiguous UAV navigation in planar scenes
CN106767841A (en) Visual navigation method based on adaptive cubature Kalman filtering and single-point random sampling
Kuang et al. A Real-time and Robust Monocular Visual Inertial SLAM System Based on Point and Line Features for Mobile Robots of Smart Cities Toward 6G

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant