CN113790719B - Unmanned aerial vehicle inertial/visual landing navigation method based on line characteristics - Google Patents


Info

Publication number: CN113790719B
Authority: CN (China)
Prior art keywords: coordinate system, runway, line, image, navigation
Legal status: Active (granted)
Application number: CN202110928170.6A
Other languages: Chinese (zh)
Other versions: CN113790719A (en)
Inventors: 尚克军, 扈光锋, 王大元, 裴新凯, 段昊雨, 明丽, 庄广琛, 刘崇亮, 王海军, 焦浩, 李茜茜
Current and original assignee: Beijing Automation Control Equipment Institute (BACEI)
Application filed by Beijing Automation Control Equipment Institute (BACEI)
Priority to CN202110928170.6A; published as CN113790719A, granted as CN113790719B


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656: Inertial navigation combined with passive imaging devices, e.g. cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a line-feature-based unmanned aerial vehicle inertial/visual landing navigation method, which comprises: first, collecting images of the airport runway and extracting real-time features of the runway edge lines to obtain the line equations of the edge lines and the center line; computing the Plücker coordinates of the edge-line equations from the pre-stored airport runway width and the camera intrinsic matrix; computing the coordinates of the vanishing point at infinity and the vanishing-line equation from two equidistant parallel lines, solving the attitude transfer matrix between the real-time world coordinate system and the camera coordinate system of the unmanned aerial vehicle from the simultaneous equations, and solving the attitude and the lateral and vertical positions; and using the position information computed by the visual landing system as the observation, fused through a Kalman filter with the navigation information output by inertial navigation, to realize a continuous and autonomous navigation and positioning function. The invention addresses the accumulation of inertial navigation errors over time and the large noise of the visual navigation solution.

Description

Unmanned aerial vehicle inertial/visual landing navigation method based on line characteristics
Technical Field
The invention belongs to the technical field of navigation, and particularly relates to an unmanned aerial vehicle landing navigation method.
Background
The line features used in the pose-solving process of unmanned aerial vehicle visual landing are the two edge lines, the center line, and the far vanishing line of an airport runway. Unlike point features, line features are not easily affected by illumination, flight distance, height, and similar factors, and are therefore more robust. However, the vanishing line is not salient enough in the image to be identified by ordinary feature extraction, so a new scheme is needed to obtain the vanishing-line equation in the image plane. At the same time, the pose solution of a purely visual navigation mode is not smooth and can jump during rapid maneuvers of the unmanned aerial vehicle, so a filtering method is needed to fuse the inertial and visual navigation results.
Disclosure of Invention
The invention provides a line-feature-based unmanned aerial vehicle inertial/visual landing navigation method, which addresses the accumulation of inertial navigation errors over time and the large noise of the visual navigation solution.
An unmanned aerial vehicle inertial/visual landing navigation method based on line characteristics comprises the following steps:
(1) Airport runway data acquisition
Establishing an airport coordinate system, a visual coordinate system, a world coordinate system, a camera coordinate system and an image coordinate system; collecting images of the airport runway and extracting real-time features of the runway edge lines to obtain the line equations of the edge lines and the center line;
(2) Plücker coordinate representation
Computing the Plücker coordinates of the edge-line equations from the pre-stored airport runway width and the camera intrinsic matrix;
(3) Visual pose measurement
After the Plücker coordinates of the runway edge lines are obtained, computing the equations of the vanishing point and vanishing line at infinity from two equidistant parallel lines, solving the attitude transfer matrix C_w^c between the real-time world coordinate system and the camera coordinate system of the unmanned aerial vehicle from the simultaneous equations, and solving the attitude and the lateral and vertical positions;
(4) Combined navigation based on inertial/visual fusion
Using the position information computed by the visual landing system as the observation and fusing it, through a Kalman filter, with the navigation information output by inertial navigation, realizing a continuous and autonomous navigation and positioning function.
Further, establishing the airport coordinate system, visual coordinate system, world coordinate system, camera coordinate system and image coordinate system in step (1) includes:
an airport coordinate system, denoted as the a-system: the intersection of the initial line of the runway landing end and the runway center line is taken as the origin o_a; the x_a axis points forward along the runway center line; the y_a axis is perpendicular to the runway plane, positive upward; the z_a axis coincides with the initial line of the runway, positive to the right; o_a x_a y_a z_a forms a right-handed coordinate system; the coordinates of a point in the airport coordinate system are written (x_a, y_a, z_a);
a visual coordinate system, denoted as the v-system: the image-side principal point of the optical system is taken as the origin o_v; the x_v axis is parallel to the optical axis, positive forward; the y_v axis is parallel to the transverse axis of the imaging-plane coordinate system, positive upward; the z_v axis forms a right-handed coordinate system with the x_v and y_v axes, positive to the right;
a world coordinate system, denoted as the w-system: the intersection of the initial line of the runway aiming point and the runway center line is taken as the origin o_w; the x_w axis coincides with the initial line of the runway, positive to the right; the y_w axis is perpendicular to the runway plane, positive downward; the z_w axis points forward along the runway center line; o_w x_w y_w z_w forms a right-handed coordinate system; the coordinates of a point in the world coordinate system are written (x_w, y_w, z_w);
a camera coordinate system, denoted as the c-system: the image-side principal point of the optical system is taken as the origin o_c; looking toward the optical system, the x_c axis is parallel to the horizontal axis of the imaging-plane coordinate system, positive to the left; the y_c axis is parallel to the vertical axis of the imaging-plane coordinate system, positive downward; the z_c axis points toward the observer and forms a right-handed coordinate system with the x_c and y_c axes;
an image coordinate system, denoted as the i-system: a two-dimensional coordinate system established in the plane of the photosensitive surface of the camera, with the upper-left corner of the image as the origin, the x_i axis pointing rightward along the horizontal direction of the image and the y_i axis pointing downward along the vertical direction of the image; the units of the image coordinate system are pixels.
Further, the Plücker coordinate representation specifically includes:
the line equation in the image-space image coordinate system can be written as
ax_i + by_i + c = 0
so a line can be represented by a three-dimensional vector:
l = [a, b, c]^T
Let the three-dimensional coordinates of two object-space points A and B in the world coordinate system be Ã and B̃; their homogeneous coordinates are A = [Ã^T, 1]^T and B = [B̃^T, 1]^T. The line through these two points can be represented by a 4×4 antisymmetric homogeneous matrix L, called the Plücker matrix:
L = AB^T - BA^T
In addition, the line L can be expressed by its direction vector v together with a moment m; this pair is called the Plücker coordinates and is written
L_pl = [v^T, m^T]^T
where v = B̃ - Ã is the direction vector of the line and the moment m = Ã × B̃ is the normal vector of the plane determined by the line and the origin, so that m ⊥ v.
The relation between the Plücker matrix and the Plücker coordinates is thus (up to a homogeneous scale factor):
L ≃ [ [m]_×  v ; -v^T  0 ]
Under the camera mapping P, a line L defined by a Plücker matrix maps to the image l of the corresponding line in the image coordinate system:
[l]_× = s^2 P L P^T,  with  P = K [C_w^c | -C_w^c t_w]
where K is the camera intrinsic matrix
K = [ f_x 0 u_0 ; 0 f_y v_0 ; 0 0 1 ]
C_w^c is the attitude transfer matrix from the world coordinate system to the camera coordinate system, and t_w is the position vector of the origin of the camera coordinate system in the world coordinate system. Since s^2 is only a common scale factor of the line parameters, [l]_× can be simplified to
[l]_× ≃ P L P^T
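As a hedged numeric sketch of the Plücker representation above (the points Ã and B̃ and all numeric values are illustrative assumptions, not taken from the patent), the direction and moment vectors can be computed and checked for the defining orthogonality:

```python
import numpy as np

# Illustrative points on a 3-D line (assumed values, not from the patent).
A_t = np.array([1.0, 2.0, 3.0])            # inhomogeneous coordinates of A
B_t = np.array([4.0, 6.0, 3.0])            # inhomogeneous coordinates of B

A = np.append(A_t, 1.0)                    # homogeneous 4-vectors
B = np.append(B_t, 1.0)

L = np.outer(A, B) - np.outer(B, A)        # 4x4 antisymmetric Pluecker matrix

v = B_t - A_t                              # direction vector of the line
m = np.cross(A_t, B_t)                     # moment: normal of plane through O

# The moment is orthogonal to the direction, and L is antisymmetric.
assert abs(float(np.dot(v, m))) < 1e-9
assert np.allclose(L, -L.T)
```

Choosing a different pair of points on the same line rescales (v, m) by a common factor, which is why the representation is homogeneous.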
further, the equation for calculating the infinity point and the blanking line in step (3) includes:
in the unmanned plane landing process, the visual landing system extracts runway line characteristics, wherein left and right side lines and central lines of the runway are a group of parallel lines, the runway line characteristics can be used for calculating coordinates of infinite distance blanking points and blanking line equations, and three equidistant parallel lines on the runway in an object space are L 0w 、L 1w 、L 2w Which is imaged in the image plane as l 0i 、l 1i 、l 2i Then like the airInter-blanking point coordinates can be solved by the following relationship:
the blanking line equation is:
l ∞i =[(l 0i ×l 2i ) T (l 1i ×l 2i )]l 1i +2[(l 0i ×l 1i ) T (l 2i ×l 1i )]l 2i
assume four-dimensional homogeneous coordinates of a point A in object spaceThen the point A is crossed and the direction isCan be expressed as:
when the parameter lambda changes from 0 to +.:
the relation between the blanking point and the unmanned plane attitude transfer matrix is obtained according to an image conjugation equation:
and (5) further finishing an equation to obtain:
let us assume an image space blanking line l ∞i The upper point is x, and the back projection of the point in the object space is a direction ofIs a straight line of (2); from the point x on a straight line, it is possible to:
x T l ∞i =0
by means ofNormal vector n to plane π The orthogonality can be obtained:
the direction in the object space is utilized asIs the point x of the image space, yielding:
the transposed transformation is carried out on the above components to obtain:
in conjunction with the foregoing, the spatial blanking line equation:
the method solves the gesture transfer matrix between the real-time world coordinate system and the camera coordinate system of the unmanned aerial vehicle through the simultaneous equationsAnd calculate the attitude and lateral and vertical positionsThe solution specifically comprises the following steps:
the set of equations that can be found by combining the above equations is as follows:
in the formula ,for the pose transfer matrix:
in the scene of an airport runway,is the unit direction vector of the center line of the object space runway, < + >>n π Is the unit normal vector of the plane of the object space runway, n π =[0,1,0] T Similarly, can get->
Let K -1 p =[g 1 ,g 2 ,g 3 ] T ,K T l =[h 1 ,h 2 ,h 3 ] T ,(K -1 p )×(K T l )=[e 1 ,e 2 ,e 3 ] T Simultaneous pose transfer matrix C w c is an antisymmetric array, the sum of squares of elements in each row and each column is 1, and the sum can be solved by the formula:
three attitude angles are thus solved:
calculating relative position using line equations:
wherein :
the runway edge equation is taken into the above:
straight line of same reason L 2 And (3) determining:
solving alpha from the above two 0 、α 2
Finally solving the vertical and lateral positions t of the unmanned aerial vehicle in the world coordinate system y 、t x
Further, in step (4), the continuous state equation of the Kalman filter model of the inertial/visual integrated navigation system is
Ẋ(t) = F(t) X(t) + G(t) W(t)
where F(t) is the state-transition matrix of the continuous state equation at time t and W(t) is the random noise vector of the system at time t;
the filter state quantities are the velocity errors (north, sky, east), the latitude error, the altitude error, the longitude error, the misalignment-angle errors (north, sky, east), the gyro drifts in the X, Y, Z directions of the body frame, and the accelerometer biases in the X, Y, Z directions of the body frame;
the system state-transition matrix F(t) is constructed from the standard inertial-navigation error model (formula omitted);
the observation equation is defined as follows:
Z(t) = H(t) X(t) + V(t)
where V(t) is the observation noise vector;
the observation of the integrated navigation system is the difference between the vertical and lateral positions output by inertial navigation in the airport coordinate system and the navigation result of the visual landing system:
H(t) = [0_{3×3}  M_{3×3}  0_{3×9}]
where M_{3×3} maps the position-error states to the vertical and lateral position errors in the airport coordinate system.
the invention provides an infinity element and a Plucker coordinate representation method thereof, which solve the problem that the intersection point coordinates of parallel lines in an object space on an image plane cannot be solved. And solving a blanking line equation at infinity by a group of equidistant parallel lines on the left and right side lines and the central line of the runway, thereby solving the problem that the blanking line cannot be identified by a characteristic extraction method. Finally, solving the pose of the unmanned aerial vehicle through a hidden line eliminating equation and fusing the pose with inertia, thereby solving the problems of accumulated divergence of inertial navigation errors along with time and larger noise of a visual navigation resolving result.
Drawings
FIG. 1 is a schematic diagram of the coordinate systems;
FIG. 2 is a schematic diagram of a group of parallel lines intersecting at a vanishing point;
FIG. 3 is a schematic representation of the Plücker coordinates of a straight line.
Detailed Description
The invention is described in further detail below with reference to the accompanying drawings.
Aiming at the autonomous landing navigation problem of unmanned aerial vehicles under satellite-denied conditions, the invention develops a line-feature-based inertial/visual landing navigation method. It first introduces elements at infinity and their Plücker coordinate representation, solving the problem that the image-plane intersection coordinates of object-space parallel lines cannot otherwise be computed. The vanishing-line equation at infinity is then solved from the group of equidistant parallel lines formed by the left and right edge lines and the center line of the runway, which overcomes the fact that the vanishing line cannot be identified by feature extraction. Finally, the pose of the unmanned aerial vehicle is solved through the vanishing-line equations and fused with inertial navigation, addressing the accumulation of inertial navigation errors over time and the large noise of the visual navigation solution.
1. Infinity element and Plucker representation
(1) Coordinate system definition
As shown in fig. 1, an airport coordinate system, a visual coordinate system, a world coordinate system, a camera coordinate system, and an image coordinate system are established.
Wherein the airport coordinate system (a-system): the intersection of the initial line of the runway landing end and the runway center line is taken as the origin o_a; the x_a axis points forward along the runway center line; the y_a axis is perpendicular to the runway plane, positive upward; the z_a axis coincides with the initial line of the runway, positive to the right; o_a x_a y_a z_a forms a right-handed coordinate system; the coordinates of a point in the airport coordinate system are written (x_a, y_a, z_a).
Visual coordinate system (v-system): the coordinate system of the landing visual navigation system, called the visual coordinate system for short; the image-side principal point of the optical system is taken as the origin o_v; the x_v axis is parallel to the optical axis, positive forward; the y_v axis is parallel to the transverse axis of the imaging-plane coordinate system, positive upward; the z_v axis forms a right-handed coordinate system with the x_v and y_v axes, positive to the right.
World coordinate system (w-system): the intersection of the initial line of the runway aiming point and the runway center line is taken as the origin o_w; the x_w axis coincides with the initial line of the runway, positive to the right; the y_w axis is perpendicular to the runway plane, positive downward; the z_w axis points forward along the runway center line; o_w x_w y_w z_w forms a right-handed coordinate system; the coordinates of a point in the world coordinate system are written (x_w, y_w, z_w).
Camera coordinate system (c-system): the image-side principal point of the optical system is taken as the origin o_c; looking toward the optical system, the x_c axis is parallel to the horizontal axis of the imaging-plane coordinate system, positive to the left; the y_c axis is parallel to the vertical axis of the imaging-plane coordinate system, positive downward; the z_c axis points toward the observer and forms a right-handed coordinate system with the x_c and y_c axes.
Image coordinate system (i-system): a two-dimensional plane coordinate system established in the plane of the photosensitive surface of the camera, with the upper-left corner of the image as the origin, the x_i axis pointing rightward along the horizontal direction of the image and the y_i axis pointing downward along the vertical direction of the image; the units of the image coordinate system are pixels.
(2) Infinity element
In object space, two parallel lines never intersect. By introducing elements at infinity, a projective space is built on top of Euclidean space, and a group of parallel lines in a plane then intersects at a unique point at infinity, called the vanishing point (as shown in FIG. 2). The position of this point on the image plane depends only on the attitude of the camera, not on its position.
A vanishing point represents the direction of the corresponding parallel lines; non-parallel lines have different points at infinity, and all the points at infinity of a plane form a line, called the vanishing line. The vanishing line is the unique intersection, at infinity, of a family of parallel planes in space.
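The statement that the vanishing point depends only on the camera attitude can be illustrated with a small sketch (the intrinsics, attitude, and translations below are assumed values, not from the patent): projecting the point at infinity of a direction d with P = K[R | t] makes the translation column multiply zero, so t drops out.

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                        # camera attitude (identity for simplicity)
d = np.array([0.0, 0.0, 1.0])        # common direction of a set of parallels

def vanishing_point(t):
    # Project the point at infinity [d; 0] with P = K[R | t]; the
    # translation column multiplies the trailing zero and vanishes.
    P = np.hstack([K @ R, t.reshape(3, 1)])
    x = P @ np.append(d, 0.0)
    return x / x[2]

v1 = vanishing_point(np.array([0.0, 0.0, 0.0]))
v2 = vanishing_point(np.array([100.0, -50.0, 7.0]))

assert np.allclose(v1, v2)                   # independent of camera position
assert np.allclose(v1[:2], [320.0, 240.0])   # the image of K @ R @ d
```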
(3) Plücker representation
The line equation in the image-space image coordinate system can be written as
ax_i + by_i + c = 0
so a line can be represented by a three-dimensional vector:
l = [a, b, c]^T
Let the three-dimensional coordinates of two object-space points A and B in the world coordinate system be the 3×1 vectors Ã and B̃; their homogeneous coordinates are A = [Ã^T, 1]^T and B = [B̃^T, 1]^T. The line through these two points can be represented by a 4×4 antisymmetric homogeneous matrix L, called the Plücker matrix:
L = AB^T - BA^T
In addition, the line L can be expressed by its direction vector v together with a moment m; this pair is called the Plücker coordinates and is written
L_pl = [v^T, m^T]^T
where v = B̃ - Ã is the direction vector of the line and the moment m = Ã × B̃ (which characterizes the area of the triangle OAB, and hence the distance from the origin O to the line L) is the normal vector of the plane determined by the line and the origin (as shown in FIG. 3).
The relation between the Plücker matrix and the Plücker coordinates is thus (up to a homogeneous scale factor):
L ≃ [ [m]_×  v ; -v^T  0 ]
Under the camera mapping P, a line L defined by a Plücker matrix maps to the image l of the corresponding line in the image coordinate system:
[l]_× = s^2 P L P^T,  with  P = K [C_w^c | -C_w^c t_w]
where K is the camera intrinsic matrix
K = [ f_x 0 u_0 ; 0 f_y v_0 ; 0 0 1 ]
C_w^c is the attitude transfer matrix from the world coordinate system to the camera coordinate system, and t_w is the position vector of the origin of the camera coordinate system in the world coordinate system. Since s^2 is only a common scale factor of the line parameters, [l]_× can be simplified to
[l]_× ≃ P L P^T
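A hedged numeric check of the line-projection relation [l]_× ≃ P L P^T (the world points, intrinsics, and pose are illustrative assumptions; the projective scale s is absorbed into the recovered line):

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P = np.hstack([K, np.zeros((3, 1))])       # camera at the world origin, R = I

A = np.array([0.0, 1.0, 10.0, 1.0])        # homogeneous points on one line
B = np.array([0.0, 1.0, 20.0, 1.0])
L = np.outer(A, B) - np.outer(B, A)        # 4x4 Pluecker matrix of the line

M = P @ L @ P.T                            # antisymmetric 3x3, equals s*[l]x
l = np.array([M[2, 1], M[0, 2], M[1, 0]])  # read l off the skew entries

assert np.allclose(M, -M.T)
for X in (A, B):                           # both projections must lie on l
    x = P @ X
    assert abs(float(np.dot(x, l))) < 1e-6 * np.linalg.norm(l) * np.linalg.norm(x)
```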
2. vision measurement pose calculation
(1) Vanishing point and vanishing line imaging equations
During the landing of the unmanned aerial vehicle, the visual landing system extracts the runway line features, among which the left and right edge lines and the center line of the runway form a group of equidistant parallel lines that can be used to calculate the coordinates of the vanishing point at infinity and the vanishing-line equation. Let the three equidistant parallel lines on the runway in object space be L_0w, L_1w, L_2w, imaged in the image plane as l_0i, l_1i, l_2i. The image-space vanishing point coordinates can then be solved from the relation
x_∞i ≃ l_0i × l_1i ≃ l_0i × l_2i
The vanishing-line equation is:
l_∞i = [(l_0i × l_2i)^T (l_1i × l_2i)] l_1i + 2[(l_0i × l_1i)^T (l_2i × l_1i)] l_2i
Assume a point A in object space with four-dimensional homogeneous coordinates A = [Ã^T, 1]^T. The line through the point A with unit direction d (a three-dimensional unit column vector) can be expressed as
X(λ) = [Ã^T, 1]^T + λ[d^T, 0]^T
When the parameter λ goes from 0 to ∞, X(λ) tends to the point at infinity [d^T, 0]^T.
The relation between the vanishing point and the attitude transfer matrix of the unmanned aerial vehicle is obtained from the imaging equation:
x_∞i = s K C_w^c d
Rearranging the equation further gives:
d = s (C_w^c)^T K^{-1} x_∞i
Assume x is a point on the image-space vanishing line l_∞i; the back-projection of this point into object space is a line whose direction is d = (C_w^c)^T K^{-1} x. Since the point x lies on the line:
x^T l_∞i = 0
Using the orthogonality of the direction d to the normal vector n_π of the plane:
n_π^T (C_w^c)^T K^{-1} x = 0
Transposing the above for the image-space point x whose object-space direction is d gives:
x^T K^{-T} C_w^c n_π = 0
Combining the foregoing, the spatial vanishing-line equation is:
l_∞i = s K^{-T} C_w^c n_π
(2) Unmanned aerial vehicle pose calculation
Combining the equations of the preceding sections gives the following system:
K^{-1} x_∞i ≃ C_w^c d,  K^T l_∞i ≃ C_w^c n_π
where C_w^c is the attitude transfer matrix.
In the airport runway scene, d is the unit direction vector of the object-space runway center line, d = [0, 0, 1]^T, and n_π is the unit normal vector of the object-space runway plane, n_π = [0, 1, 0]^T; similarly, d × n_π = [-1, 0, 0]^T is obtained.
Let K^{-1} x_∞i = [g_1, g_2, g_3]^T, K^T l_∞i = [h_1, h_2, h_3]^T and (K^{-1} x_∞i) × (K^T l_∞i) = [e_1, e_2, e_3]^T. Since the attitude transfer matrix C_w^c is an orthonormal matrix (the sum of the squares of the elements of each row and each column is 1), its columns are obtained by normalization: the third column is K^{-1} x_∞i normalized, the second column is K^T l_∞i normalized, and the first column is the cross product of the second and third columns. The three attitude angles are then extracted from the elements of C_w^c by the standard Euler-angle decomposition.
The relative position is calculated using the line equations. The object-space edge lines L_0 and L_2 lie in the runway plane at a known lateral offset of half the pre-stored runway width on either side of the center line, so their Plücker coordinates are known in the world coordinate system. Substituting the runway edge-line equation into the simplified line-projection relation [l]_× ≃ P L P^T gives a scalar equation for the measured image line l_0i with an unknown scale factor α_0; the line L_2 is handled in the same way, giving a second equation in α_2. Solving the two equations for α_0 and α_2 finally yields the vertical and lateral positions t_y and t_x of the unmanned aerial vehicle in the world coordinate system.
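The attitude step can be sketched numerically: build a synthetic vanishing point K C_w^c d and vanishing line K^{-T} C_w^c n_π from an assumed "true" attitude, then recover the rotation column by column (all values below are illustrative assumptions, not the patent's data):

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def rot_x(a):
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a), np.cos(a)]])

def rot_y(a):
    return np.array([[np.cos(a), 0.0, np.sin(a)],
                     [0.0, 1.0, 0.0],
                     [-np.sin(a), 0.0, np.cos(a)]])

C_true = rot_y(0.05) @ rot_x(-0.1)           # assumed "true" attitude C_w^c

d = np.array([0.0, 0.0, 1.0])                # runway center-line direction (w)
n_pi = np.array([0.0, 1.0, 0.0])             # runway-plane normal (w)

p_inf = K @ C_true @ d                       # synthetic vanishing point
l_inf = np.linalg.inv(K).T @ C_true @ n_pi   # synthetic vanishing line

g = np.linalg.inv(K) @ p_inf                 # ~ C_w^c d    -> third column
h = K.T @ l_inf                              # ~ C_w^c n_pi -> second column
c3 = g / np.linalg.norm(g)
c2 = h / np.linalg.norm(h)
c1 = np.cross(c2, c3)                        # rotations preserve cross products

C_rec = np.column_stack([c1, c2, c3])
assert np.allclose(C_rec, C_true, atol=1e-9)
```

The position step is not reproduced here, since it additionally requires the runway width and the measured edge-line equations.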
3. Inertia/vision fusion method
The continuous state equation of the Kalman filter model of the inertial/visual integrated navigation system is
Ẋ(t) = F(t) X(t) + G(t) W(t)
where F(t) is the state-transition matrix of the continuous state equation at time t and W(t) is the random noise vector of the system at time t.
The filter state quantities are the velocity errors (north, sky, east; unit: m/s), the latitude error (unit: rad), the altitude error (unit: m), the longitude error (unit: rad), the misalignment-angle errors (north, sky, east; unit: rad), the gyro drifts in the X, Y, Z directions of the body frame (unit: rad/s), and the accelerometer biases in the X, Y, Z directions of the body frame (unit: m/s²).
The system state-transition matrix F(t) is constructed from the standard inertial-navigation error model (formula omitted).
The observation equation is defined as follows:
Z(t) = H(t) X(t) + V(t)
where V(t) is the observation noise vector.
The observation of the integrated navigation system is the difference between the vertical and lateral positions output by inertial navigation in the airport coordinate system and the navigation result of the visual landing system:
H(t) = [0_{3×3}  M_{3×3}  0_{3×9}]
where M_{3×3} maps the position-error states to the vertical and lateral position errors in the airport coordinate system.
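As a much-simplified sketch of the fusion idea (not the 15-state error model described above: a 2-state constant-velocity Kalman filter with assumed noise levels, observing only a noisy "visual" position fix), the predict/update cycle looks like:

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.02
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
H = np.array([[1.0, 0.0]])              # observe position only
Q = np.diag([1e-4, 1e-3])               # assumed process noise
R = np.array([[0.25]])                  # assumed visual measurement noise (m^2)

x = np.array([0.0, 0.0])                # filter state estimate
P = np.eye(2)                           # state covariance
true_p, true_v = 0.0, 2.0               # simulated truth

for _ in range(500):
    true_p += true_v * dt
    # Predict (stands in for inertial propagation).
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the noisy visual position fix.
    z = true_p + rng.normal(0.0, 0.5)
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R
    Kg = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + Kg @ y
    P = (np.eye(2) - Kg @ H) @ P

assert abs(x[0] - true_p) < 0.5         # fused position tracks the truth
assert abs(x[1] - true_v) < 0.5         # velocity is estimated as well
```

The real system applies the same predict/update structure to the inertial error states, with the visual position difference as the innovation.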
in summary, an inertial/visual navigation method using three equidistant parallel lines of left and right side lines and center line of runway in the unmanned aerial vehicle landing stage is provided.
The invention realizes the unmanned aerial vehicle autonomous landing by the following 4 processes:
(1) Airport runway data acquisition:
the method comprises the steps of acquiring images of an airport runway through a front-view infrared visual navigation system arranged on a nose of an unmanned aerial vehicle, extracting real-time characteristics of the runway side line, and acquiring a linear equation of the side line and a central line.
(2) Plucker coordinates:
and calculating Plucker coordinates of the sideline equation according to the pushed formula through the width of the airport runway and the camera internal reference matrix which are bound in advance.
(3) Visual measurement pose solution:
after Plucker coordinates of runway edge lines are obtained, equations of infinity blanking points and blanking lines are calculated by two equidistant parallel lines, and an attitude transfer matrix between a real-time world coordinate system and a camera coordinate system of the unmanned aerial vehicle is calculated by a simultaneous equation setAnd solving the posture and the lateral and vertical positions.
(4) Combined navigation based on inertial/visual fusion:
and taking the position information calculated by the visual landing system as an observed quantity, and fusing the observed quantity with a Kalman filter constructed by the navigation information output by inertial navigation to realize a continuous and autonomous navigation positioning function.

Claims (5)

1. A line-feature-based unmanned aerial vehicle inertial/visual landing navigation method, characterized by comprising the following steps:
(1) Airport runway data acquisition
Establishing an airport coordinate system, a visual coordinate system, a world coordinate system, a camera coordinate system and an image coordinate system; collecting images of the airport runway and extracting real-time features of the runway edge lines to obtain the line equations of the edge lines and the center line;
(2) Plücker coordinate representation
Computing the Plücker coordinates of the edge-line equations from the pre-stored airport runway width and the camera intrinsic matrix;
(3) Visual pose measurement
After the Plücker coordinates of the runway edge lines are obtained, computing the equations of the vanishing point and vanishing line at infinity from two equidistant parallel lines, solving the attitude transfer matrix C_w^c between the real-time world coordinate system and the camera coordinate system of the unmanned aerial vehicle from the simultaneous equations, and solving the attitude and the lateral and vertical positions;
(4) Combined navigation based on inertial/visual fusion
Using the position information computed by the visual landing system as the observation and fusing it, through a Kalman filter, with the navigation information output by inertial navigation, realizing a continuous and autonomous navigation and positioning function; the observation is the difference between the vertical and lateral positions output by inertial navigation in the airport coordinate system and the navigation result of the visual landing system.
2. The unmanned aerial vehicle inertial/visual landing navigation method based on line features of claim 1, wherein establishing an airport coordinate system, a visual coordinate system, a world coordinate system, a camera coordinate system, an image coordinate system in step (1) comprises;
the airport coordinate system, denoted as the a-system: the intersection of the runway threshold line and the runway centre line is taken as the origin o_a; the x_a axis points along the runway centre line, positive forward; the y_a axis is perpendicular to the runway plane, positive upward; the z_a axis coincides with the runway threshold line, positive to the right; o_a x_a y_a z_a forms a right-handed coordinate system; the coordinates of a point in the airport coordinate system are denoted (x_a, y_a, z_a);
the visual coordinate system, denoted as the v-system: the image-side principal point of the optical system is taken as the origin o_v; the x_v axis is parallel to the optical axis, positive forward; the y_v axis is parallel to the horizontal axis of the imaging-plane coordinate system, positive upward; the z_v axis forms a right-handed coordinate system with the x_v and y_v axes, positive to the right;
the world coordinate system, denoted as the w-system: the intersection of the runway aiming-point line and the runway centre line is taken as the origin o_w; the x_w axis coincides with that line, positive to the right; the y_w axis is perpendicular to the runway plane, positive downward; the z_w axis points along the runway centre line, positive forward; o_w x_w y_w z_w forms a right-handed coordinate system; the coordinates of a point in the world coordinate system are denoted (x_w, y_w, z_w);
the camera coordinate system, denoted as the c-system: the image-side principal point of the optical system is taken as the origin o_c; viewed facing the optical system, the x_c axis is parallel to the horizontal axis of the imaging-plane coordinate system, positive to the left; the y_c axis is parallel to the vertical axis of the imaging-plane coordinate system, positive downward; the z_c axis points toward the observer and forms a right-handed coordinate system with the x_c and y_c axes;
the image coordinate system, denoted as the i-system: the image coordinate system is established in the plane of the photosensitive surface of the camera, with the top-left corner of the image as the origin; the x_i axis of the image coordinate system points right along the image horizontal direction, the y_i axis points down along the image vertical direction, and the units of the image coordinate system are pixels.
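The a-frame and w-frame defined above differ by an axis permutation, plus an origin shift along the centreline between the threshold line and the aiming-point line. A minimal sketch of the conversion, assuming the frames read as above; `airport_to_world` and the `aim_dist` parameter are illustrative names, not from the patent:

```python
import numpy as np

def airport_to_world(p_a, aim_dist=0.0):
    """Convert a point from the airport (a) frame to the world (w) frame.

    a-frame: x forward along the centreline, y up, z right along the threshold.
    w-frame: x right, y down, z forward along the centreline.
    aim_dist is the assumed distance from the threshold line to the
    aiming-point line along the centreline (the two origins differ by it).
    """
    x_a, y_a, z_a = p_a
    # Axis permutation (a proper rotation, determinant +1) plus origin shift.
    return np.array([z_a, -y_a, x_a - aim_dist])

p_w = airport_to_world((1.0, 2.0, 3.0))
```

The permutation matrix [[0, 0, 1], [0, -1, 0], [1, 0, 0]] has determinant +1, so both frames remain right-handed, as the claim requires.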
3. The unmanned aerial vehicle inertial/visual landing navigation method based on line features according to claim 2, characterized in that the Plücker coordinate representation specifically comprises:
a straight line in the image-space image coordinate system can be described by the equation:
ax_i + by_i + c = 0
thus, a straight line can be represented by a three-dimensional vector:
l = [a, b, c]^T
the three-dimensional coordinates of two points A, B in the world coordinate system of the object space are respectivelyTheir homogeneous coordinates are:the straight line passing through these two points can be represented by a 4 x 4 antisymmetric homogeneous matrix L, which is called the plucker matrix:
L=AB T -BA T
in addition, the straight line L can use its direction vectorExpressed as a moment m, referred to as Plucker coordinates, denoted:
wherein ,is the direction vector of a straight line, and the moment m is the normal vector of the straight line and the origin determining plane, i.e
The relation between the Plucker matrix and the Plucker coordinates is thus obtained:
under the action of the camera mapping T, a straight line L defined by a Plucker matrix is used for representing an image L of a corresponding straight line in an image coordinate system:
wherein K is a camera reference matrix:
for the pose transfer matrix of the world coordinate system to the camera coordinate system,/for the camera coordinate system>A position vector of an origin of a camera coordinate system in a world coordinate system; s is(s) 2 Only the common coefficients of the parameters in the straight line, therefore [ l ]] × Can be simplified into:
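The Plücker quantities of claim 3 (direction d = B_w - A_w, moment m = A_w × B_w, and the 4×4 matrix L = AB^T - BA^T) follow a standard construction; the sketch below is a generic illustration of it, with the hypothetical function name `plucker`:

```python
import numpy as np

def plucker(A_w, B_w):
    """Plücker representation of the 3D line through points A_w and B_w.

    Returns (d, m, L): direction d = B_w - A_w, moment m = A_w x B_w, and
    the 4x4 antisymmetric Plücker matrix L = A B^T - B A^T built from the
    homogeneous coordinates A = [A_w, 1], B = [B_w, 1].
    """
    A = np.append(np.asarray(A_w, dtype=float), 1.0)
    B = np.append(np.asarray(B_w, dtype=float), 1.0)
    L = np.outer(A, B) - np.outer(B, A)
    d = B[:3] - A[:3]
    m = np.cross(A[:3], B[:3])
    return d, m, L

# A line parallel to the z axis, offset one unit along x (a runway edge, say).
d, m, L = plucker([1.0, 0.0, 0.0], [1.0, 0.0, 5.0])
# The moment is normal to the plane through the line and the origin,
# hence orthogonal to the direction vector.
```

The antisymmetry of L and the orthogonality of d and m are the two invariants the claim relies on.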
4. The unmanned aerial vehicle inertial/visual landing navigation method based on line features according to claim 3, characterized in that calculating the equations of the infinity blanking point and the blanking line in step (3) comprises:
during the landing of the unmanned aerial vehicle, the visual landing system extracts the runway line features; the left and right edge lines and the centre line of the runway form a group of parallel lines, which can be used to calculate the coordinates of the infinity blanking point and the blanking line equation; let the three equidistant parallel lines on the runway in object space be L_0w, L_1w, L_2w, imaged in the image plane as l_0i, l_1i, l_2i; since the images of parallel object-space lines share a common intersection, the blanking point coordinates in image space can be solved by the relation:
p_∞i = l_0i × l_1i = l_0i × l_2i (up to scale)
the blanking line equation is:
l_∞i = [(l_0i × l_2i)^T (l_1i × l_2i)] l_1i + 2[(l_0i × l_1i)^T (l_2i × l_1i)] l_2i
assume that the four-dimensional homogeneous coordinates of a point A in object space are A = [A_w^T, 1]^T; then the straight line through the point A with direction d can be expressed as:
X(λ) = A + λ[d^T, 0]^T
when the parameter λ changes from 0 to +∞, the image of X(λ) tends to the blanking point of the direction d; the relation between the blanking point and the unmanned aerial vehicle attitude transfer matrix is obtained according to the image conjugation equation:
p_∞i = lim_{λ→+∞} P X(λ) = K C_w^c d (up to scale)
further rearranging the equation yields:
K^{-1} p_∞i ∝ C_w^c d
assume a point x on the image-space blanking line l_∞i; the back-projection of the point x in object space is a straight line with direction d_x; since the point x lies on the line l_∞i, it follows that:
x^T l_∞i = 0
since the back-projected direction d_x lies in the runway plane π, orthogonality with the normal vector n_π of the plane π gives:
n_π^T d_x = 0
using the object-space direction d_x of the ray back-projected through the image-space point x:
d_x = (C_w^c)^T K^{-1} x
the transposed transformation of the above yields:
x^T K^{-T} C_w^c n_π = 0
in conjunction with the foregoing, the image-space blanking line equation is:
l_∞i = K^{-T} C_w^c n_π (up to scale)
the method solves the gesture transfer matrix between the real-time world coordinate system and the camera coordinate system of the unmanned aerial vehicle through the simultaneous equationsAnd solving the posture and the lateral and vertical positions specifically comprises:
the set of equations that can be found by combining the above equations is as follows:
in the formula ,for the pose transfer matrix:
in the scene of an airport runway,is the unit direction vector of the center line of the object space runway, < + >>n π Is the unit normal vector of the plane of the object space runway, n π =[0,1,0] T Similarly, can get->
Let K -1 p =[g 1 ,g 2 ,g 3 ] T ,K T l =[h 1 ,h 2 ,h 3 ] T ,(K -1 p )×(K T l )=[e 1 ,e 2 ,e 3 ] T Simultaneous pose transfer matrixFor an antisymmetric array, the sum of squares of each row and column element is 1, and can be solved by the above formula:
three attitude angles are thus solved:
calculating the relative position using the line equations:
s [l]_× = P L P^T, with P = K [ C_w^c | t ], t = [t_x, t_y, t_z]^T
substituting the runway edge-line equation of the straight line L_0 into the above, and likewise for the straight line L_2, yields two equations; α_0 and α_2 are solved from these two equations, and finally the vertical and lateral positions t_y, t_x of the unmanned aerial vehicle in the world coordinate system are solved.
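A sketch of the claim-4 attitude recovery under the stated relations (K^{-1}p parallel to the third column of the attitude matrix, K^T l to the second, their cross product to the first); the synthetic blanking point and line are generated from a known attitude so the round trip can be checked. Function and variable names are illustrative:

```python
import numpy as np

def pose_from_vanishing(K, p_inf, l_inf):
    """Recover the world-to-camera attitude matrix from the runway
    blanking (vanishing) point p_inf and blanking line l_inf.

    K^-1 p_inf is parallel to the third column (runway direction
    d = [0,0,1]^T), K^T l_inf to the second (plane normal
    n_pi = [0,1,0]^T), and their cross product to the first.
    """
    g = np.linalg.solve(K, p_inf)   # proportional to the third column
    h = K.T @ l_inf                 # proportional to the second column
    r3 = g / np.linalg.norm(g)
    r2 = h / np.linalg.norm(h)
    r1 = np.cross(r2, r3)           # completes the orthonormal triad
    return np.column_stack([r1, r2, r3])

# Synthetic round trip: generate p_inf and l_inf from a known attitude.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
th = 0.1  # a small rotation about the camera x axis
C_true = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(th), -np.sin(th)],
                   [0.0, np.sin(th), np.cos(th)]])
p_inf = K @ C_true[:, 2]                    # image of the runway direction
l_inf = np.linalg.solve(K.T, C_true[:, 1])  # K^-T times the plane normal
C_est = pose_from_vanishing(K, p_inf, l_inf)
```

The normalization step uses the orthonormality of the attitude matrix's columns, the same property the claim invokes when it states that the sum of squares of each row and column is 1.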
5. The unmanned aerial vehicle inertial/visual landing navigation method based on line features according to claim 4, characterized in that in step (4) the continuous-time state equation of the Kalman filter model of the inertial/visual integrated navigation system is:
Ẋ(t) = F(t)X(t) + w(t)
wherein F(t) is the state transition matrix of the continuous state equation at time t, and w(t) is the system random noise vector at time t;
the filter state quantities are the north and east velocity errors, the latitude error, the altitude error, the longitude error, the north and east misalignment angle errors, the gyro drifts along the X, Y and Z directions of the carrier frame, and the accelerometer biases along the X, Y and Z directions of the carrier frame;
system state transition matrix
wherein :
the observation equation is defined as follows:
is an observation noise array;
the observed quantity of the combined navigation system is the difference value between the vertical lateral position of the inertial navigation output of the airport coordinate system and the navigation result of the visual landing system:
H(t)=[0 3×3 M 3×3 0 3×9 ]
wherein ,
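Claim 5 fuses the INS output with the vision result through a Kalman filter whose observation is the INS-minus-vision position difference. A deliberately reduced sketch, assuming only the three observed position-error states (the patented filter carries the full error-state vector with gyro and accelerometer terms); all numbers and names are illustrative:

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """One standard Kalman measurement-update step."""
    S = H @ P @ H.T + R                  # innovation covariance
    Kg = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + Kg @ (z - H @ x)
    P_new = (np.eye(len(x)) - Kg @ H) @ P
    return x_new, P_new

# Observation as in the claim: INS vertical/lateral position minus the
# visual landing-system navigation result (numbers are made up).
p_ins = np.array([10.2, -3.9, 55.0])
p_vision = np.array([10.0, -4.0, 55.5])
z = p_ins - p_vision                     # position-error observation

x = np.zeros(3)                          # position-error state estimate
P = np.eye(3) * 4.0                      # state covariance
H = np.eye(3)                            # the M block, taken as identity here
R = np.eye(3) * 0.25                     # measurement noise covariance
x, P = kf_update(x, P, z, H, R)
```

With a large prior covariance and a small measurement noise, the update pulls the error estimate close to the observed INS-vision difference and shrinks the covariance, which is the corrective behaviour the integrated navigation step relies on.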
CN202110928170.6A 2021-08-13 2021-08-13 Unmanned aerial vehicle inertial/visual landing navigation method based on line characteristics Active CN113790719B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110928170.6A CN113790719B (en) 2021-08-13 2021-08-13 Unmanned aerial vehicle inertial/visual landing navigation method based on line characteristics

Publications (2)

Publication Number Publication Date
CN113790719A CN113790719A (en) 2021-12-14
CN113790719B true CN113790719B (en) 2023-09-12

Family

ID=79181750

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110928170.6A Active CN113790719B (en) 2021-08-13 2021-08-13 Unmanned aerial vehicle inertial/visual landing navigation method based on line characteristics

Country Status (1)

Country Link
CN (1) CN113790719B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115690205B (en) * 2022-10-09 2023-12-05 北京自动化控制设备研究所 Visual relative pose measurement error estimation method based on point-line comprehensive characteristics
CN116380057B (en) * 2023-06-05 2023-08-29 四川腾盾科技有限公司 Unmanned aerial vehicle autonomous landing positioning method under GNSS refusing environment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3546116A1 (en) * 1985-12-24 1987-06-25 Heinz Mueller Method for displaying required and actual data on the flight conditions before and during landing, for a pilot
CN104517291A (en) * 2014-12-15 2015-04-15 大连理工大学 Pose measuring method based on coaxial circle characteristics of target
CN109341700A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision assists landing navigation method under a kind of low visibility
CN109544696A (en) * 2018-12-04 2019-03-29 中国航空工业集团公司西安航空计算技术研究所 A kind of airborne enhancing Synthetic vision actual situation Image Precision Registration of view-based access control model inertia combination
CN110634162A (en) * 2019-04-01 2019-12-31 青岛鑫慧铭视觉科技有限公司 Calibration method of structured light vision sensor based on concentric circles
CN111275764A (en) * 2020-02-12 2020-06-12 南开大学 Depth camera visual mileage measurement method based on line segment shadow

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Ji Shengyu; Xu Guili; Feng Wenling; Liu Xiaoxia. Research on cooperative targets for UAV autonomous carrier landing based on infrared vision. Infrared Technology, 2007, No. 10, pp. 593-597. *

Similar Documents

Publication Publication Date Title
CN105021184B (en) It is a kind of to be used for pose estimating system and method that vision under mobile platform warship navigation
CN106708066B (en) View-based access control model/inertial navigation unmanned plane independent landing method
US11086324B2 (en) Structure from motion (SfM) processing for unmanned aerial vehicle (UAV)
CN102768042B (en) Visual-inertial combined navigation method
CN107194989B (en) Traffic accident scene three-dimensional reconstruction system and method based on unmanned aerial vehicle aircraft aerial photography
CN113790719B (en) Unmanned aerial vehicle inertial/visual landing navigation method based on line characteristics
Haala et al. Quality of 3D point clouds from highly overlapping UAV imagery
CN106017463A (en) Aircraft positioning method based on positioning and sensing device
CN106447766B (en) A kind of scene reconstruction method and device based on mobile device monocular camera
KR20190051704A (en) Method and system for acquiring three dimentional position coordinates in non-control points using stereo camera drone
JP2008186145A (en) Aerial image processing apparatus and aerial image processing method
CN102840852A (en) Aerial photograph image pickup method and aerial photograph image pickup apparatus
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
Haala et al. Dense multiple stereo matching of highly overlapping UAV imagery
CN109997091B (en) Method for managing 3D flight path and related system
CN207068060U (en) The scene of a traffic accident three-dimensional reconstruction system taken photo by plane based on unmanned plane aircraft
CN106780337A (en) Unmanned plane based on two dimensional image warship visual simulation method
Sai et al. Geometric accuracy assessments of orthophoto production from uav aerial images
CN110223233B (en) Unmanned aerial vehicle aerial photography image building method based on image splicing
JP3808833B2 (en) Aerial photogrammetry
Moore et al. A stereo vision system for uav guidance
CN110382358A (en) Holder attitude rectification method, holder attitude rectification device, holder, clouds terrace system and unmanned plane
CN112785686A (en) Forest map construction method based on big data and readable storage medium
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
CN116957360A (en) Space observation and reconstruction method and system based on unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant