CN108665499B - Near distance airplane pose measuring method based on parallax method - Google Patents

Near distance airplane pose measuring method based on parallax method

Info

Publication number
CN108665499B
Authority
CN
China
Prior art keywords
airplane
dimensional
cameras
plane
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810420236.9A
Other languages
Chinese (zh)
Other versions
CN108665499A (en)
Inventor
刘震
高扬
张婧毓
杨守波
石博文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201810420236.9A priority Critical patent/CN108665499B/en
Publication of CN108665499A publication Critical patent/CN108665499A/en
Application granted granted Critical
Publication of CN108665499B publication Critical patent/CN108665499B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/77Determining position or orientation of objects or cameras using statistical methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a near-distance aircraft pose measurement method based on the parallax method, which locates the position and solves the attitude of an aircraft under high-altitude, near-distance conditions and can be used to solve the pose of the receiver aircraft during aerial refueling. The method comprises the following steps: calibrating the internal and external parameters of two parallel cameras and correcting the images; after an initial positioning area of the aircraft is obtained through detection and tracking algorithms, performing stereo matching and solving the three-dimensional point cloud; classifying and identifying two-dimensional feature points by methods such as edge features, forming several three-dimensional feature point sets by combining the three-dimensional point cloud, and solving the plane equation of each three-dimensional point set; after filtering background points and noise points, solving the spatial attitude of the aircraft by combining the plane equations with the shape characteristics of the aircraft. The method combines image features and morphological features, can effectively realize near-distance aircraft pose measurement, reduces the time consumed by the stereo-vision matching algorithm, and requires no cooperative markers on the aircraft; it is fast, effective, convenient to implement, highly practical, easy to port, and easy to use in practical projects.

Description

Near distance airplane pose measuring method based on parallax method
Technical Field
The invention relates to an aircraft pose measurement method, in particular to an aircraft pose measurement method based on the parallax method that combines two-dimensional images with three-dimensional point clouds.
Background
The parallax method is one of the algorithms commonly used in stereoscopic vision. It is a method of passive distance perception by computer that simulates the principle of human vision: an object is observed from two or more viewpoints, images are acquired under the different viewing angles, and, by the triangulation principle, the offset between pixels is calculated from the pixel matching relationship between the images to obtain the three-dimensional information of the object.
When the parallax method matches corresponding pixels, matching errors occur easily because the disparity search range is difficult to determine. An inappropriate disparity search range tends to trap the stereo matching algorithm in local minima: a preset range that is too large or too small leads to matching errors. Taking the maximum disparity as an example, if the given maximum disparity is too large, mismatches increase and more computation time and memory are consumed; if it is too small, the correct disparity cannot be calculated. Meanwhile, in many applications the disparity range of a scene cannot be known in advance and must be calibrated and estimated manually, which is unsuitable for practical stereoscopic vision systems.
After the three-dimensional point cloud is obtained from the disparity map, there is still no fast and effective point-cloud matching method for real-time aircraft pose solving. Although ICP (the iterative closest point method) proposed by Besl and McKay has been continuously improved and supplemented over years of development and has become one of the main means of solving free-form-surface registration, it still requires a large amount of computation and many iterations, and has no advantage in computation time.
Disclosure of Invention
The problem solved by the invention: overcoming the defects of the prior art, a near-distance aircraft pose measurement method based on the parallax method is provided, which achieves relatively accurate aircraft pose measurement while guaranteeing real-time performance under actual complex background conditions.
In order to achieve the purpose, the technical scheme of the method is realized as follows:
1. a near distance airplane pose measuring method based on a parallax method comprises the following steps:
a. For two cameras placed in parallel, a camera coordinate system is defined with the optical center of the left camera as the origin of the coordinate axes. The internal parameters of the two cameras and the external parameters between them (namely the rotation matrix and translation vector) are obtained through calibration, and the two-dimensional images shot by the two cameras are corrected using these internal parameters and the external parameters between the cameras.
b. Thousands of two-dimensional images containing aircraft, taken under actual use conditions, are shot in advance (the number can be adjusted to actual requirements; three thousand was determined through experiments). The two-dimensional images containing only aircraft regions are marked as the positive sample set, and the two-dimensional images of non-aircraft regions as the negative sample set. An aircraft classifier is trained on the positive and negative sample sets and is used to detect the aircraft region in the corrected two-dimensional images of the two cameras respectively. The detected aircraft region is substituted into a tracking algorithm to track the aircraft region in the two-dimensional images, and the initial aircraft positioning areas in the two-dimensional images of the two cameras are obtained by weighting the aircraft regions obtained by the detection algorithm and the tracking algorithm with their corresponding algorithm weights;
c. stereo matching is carried out on the aircraft initial positioning area in the two-dimensional images of the two cameras to obtain a disparity map, and the disparity map is solved to obtain three-dimensional point cloud of the aircraft initial positioning area in the two-dimensional image of the left camera;
d. c, performing point screening on an aircraft initial positioning area in a two-dimensional image of a left camera, combining the three-dimensional point cloud in the step c to obtain a three-dimensional point set of key points in a plane where the head and the empennage of the aircraft are located and a three-dimensional point set of key points in an aircraft wing plane, and solving a plane equation of each three-dimensional point set by using RANSAC to obtain a plane equation of the plane where the head and the empennage of the aircraft are located and the plane equation of the aircraft wing plane;
e. according to the plane equation of the plane where the head and the empennage of the airplane are located and the plane of the wings of the airplane obtained in the step d, filtering background points and noise points by combining the shape of the airplane and three-dimensional data to obtain a three-dimensional point set near each plane, namely a point set in each plane, further iterating to obtain a new plane equation, determining a local coordinate system of the airplane by using the new two plane equations, and solving the space pose of the airplane.
2. The steps of calibrating internal and external parameters of a stereoscopic vision system consisting of two cameras and correcting binocular images in the step a are as follows:
(1) shooting a target picture, extracting angular points in the target picture, and calibrating internal parameters of the two cameras;
(2) according to the internal parameters and the target pictures of the two cameras, external parameters (namely a rotation matrix and a translation vector) of the two cameras are calibrated;
(3) and correcting the two-dimensional images shot by the two cameras according to the internal parameters of the cameras and the external parameters of the two cameras.
3. The step b of detecting and tracking the airplane to obtain the airplane initial positioning area on the two-dimensional image comprises the following implementation steps:
(1) extracting a two-dimensional image only containing an airplane area from a two-dimensional image shot in advance to form a positive sample set; the two-dimensional image of the aircraft-free region serves as a negative sample set. Acquiring rectangular features of the airplane through the positive and negative sample sets, and training an airplane classifier;
(2) positioning an airplane area in the initial frame image by using an airplane classifier, and initializing a tracking algorithm by using the airplane area;
(3) and in the subsequent two-dimensional image sequence, respectively operating a detection algorithm and a tracking algorithm, and performing strategic fusion on the airplane areas and the algorithm weights obtained by the two algorithms to obtain the airplane initial positioning areas in the two-dimensional images shot by the two cameras.
4. In the step c, stereo matching is carried out on the initial positioning area of the airplane in the two-dimensional images shot by the two cameras to obtain a disparity map, and the three-dimensional point cloud is solved by the following implementation steps:
(1) calculating the matching degree of the area around each pixel point in the initial positioning area of the airplane in the two-dimensional images shot by the two cameras in the step b, and respectively calculating the matching degree of the areas around the corresponding pixels in the initial positioning area of the airplane in the two-dimensional images shot by the two cameras to obtain an initial parallax map;
(2) filtering the initial disparity map, and screening out mismatching points to obtain a disparity map;
(3) based on a stereoscopic vision model and a camera pinhole imaging principle, and in combination with a parallax map, the three-dimensional coordinates of each two-dimensional image point in the aircraft initial positioning area in the two-dimensional image of the left camera can be calculated, and three-dimensional point cloud of the aircraft initial positioning area is formed.
5. In the step d, the plane equation of the plane where the head and the empennage of the airplane are located and the plane of the wings of the airplane is solved by the following steps:
(1) and in an initial airplane positioning area in the two-dimensional image, screening and extracting two-dimensional points of the airplane in the image by combining the outline and the skeleton information, and performing classification marking on the extracted key feature points. Dividing the key feature points into a key feature point set a on the head and the empennage of the airplane and a key feature point set b on the wings of the airplane according to the distribution positions;
(2) screening corresponding three-dimensional points of the key characteristic points from the three-dimensional point cloud according to the one-to-one corresponding relation between the points of the two-dimensional image and the points in the parallax image to form a three-dimensional point set A on the head and the tail of the airplane and a three-dimensional point set B on the wings of the airplane;
(3) and respectively solving the equations of the planes of the two three-dimensional point sets by using a RANSAC algorithm, namely solving the plane equations of the planes of the head and the empennage of the airplane and the plane of the wings of the airplane.
6. In step e, the three-dimensional point cloud obtained in step c is filtered to remove noise points and background points according to the plane equations solved in step d, and the straight line where the aircraft fuselage lies can be solved from the plane equations of the plane where the nose and tail lie and of the wing plane. The spatial pose of the aircraft in the camera coordinate system is solved by combining the size proportions and shape characteristics of the actual aircraft model. Through mutual transformation of the coordinate systems, the spatial pose of the aircraft in the geodetic coordinate system can be solved.
(1) Using the plane equations solved in step d, three-dimensional point screening is performed on the three-dimensional point cloud of the initial aircraft positioning area solved in step c. Background points and noise points are filtered by the distance from each point to the planes. The straight line of the aircraft fuselage is determined from the plane where the nose and tail lie and the wing plane.
(2) The spatial pose of the aircraft in the camera coordinate system is solved according to the size proportions and shape characteristics of the actual aircraft model.
(3) The spatial pose of the aircraft in the geodetic coordinate system is solved through the mutual conversion relations between the coordinate systems.
Compared with the prior art, the invention has the following advantages: the three-dimensional pose measurement of an aircraft can be completed with only two cameras once they are calibrated. No cooperative markers on the aircraft are needed during implementation, the calculation process is easy to implement, no complex preparation or demanding environment is required, the universality is strong, and the calculation speed is high. The method is therefore suitable for measuring the three-dimensional spatial pose of an aircraft against a complex high-altitude background at near distance, such as measuring the three-dimensional spatial pose of the receiver aircraft during aerial refueling.
Drawings
FIG. 1 is a flow chart of a near distance aircraft pose measurement method based on a parallax method of the invention;
FIG. 2 is a schematic view of a principle of a parallax method;
FIG. 3 is a schematic view of an aircraft profile and skeleton;
FIG. 4 is a local coordinate system and attitude angle of the aircraft;
FIG. 5 is a diagram illustrating the relationship between coordinate systems in the attitude measurement.
Detailed Description
The basic idea of the invention is as follows: the initial aircraft area is located in the images of the two cameras using a detection and tracking method. The aircraft three-dimensional point cloud is then acquired by the parallax method from the initial positioning areas of the two cameras, and the position and attitude of the aircraft are solved by combining the image feature points, the three-dimensional point cloud information and the aircraft morphological characteristics.
The method is further described below in connection with the actual process.
As shown in FIG. 1, the near distance aircraft pose measurement method based on the parallax method mainly comprises the following steps:
step 11: and carrying out internal and external reference calibration and image correction on the two camera systems.
Here, each camera of the binocular system is first calibrated, i.e., the intrinsic parameters of each camera are solved, and the two-camera vision system is calibrated, i.e., the extrinsic parameters between the two cameras are solved. The specific solving method is described in detail in Zhang Zhengyou's article "A Flexible New Technique for Camera Calibration [R]. Microsoft Corporation, MSR-TR-98-71, 1998".
After the internal and external parameters of the two-camera system are obtained, the images acquired by the left and right cameras are rectified, correcting the image deformation caused by camera distortion and the like.
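The following is a minimal Python/OpenCV sketch of this calibration and rectification step; the function layout and parameter choices are illustrative assumptions rather than the patent's prescription:

```python
import cv2

def calibrate_and_rectify(obj_pts, img_pts_l, img_pts_r, image_size):
    """obj_pts: per-view (N, 3) target corner coordinates;
    img_pts_l / img_pts_r: matching (N, 2) image corners per view."""
    # Intrinsic calibration of each camera (Zhang's planar-target method).
    _, K1, D1, _, _ = cv2.calibrateCamera(obj_pts, img_pts_l, image_size, None, None)
    _, K2, D2, _, _ = cv2.calibrateCamera(obj_pts, img_pts_r, image_size, None, None)
    # Extrinsic calibration: rotation R and translation T between the two cameras.
    _, K1, D1, K2, D2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, img_pts_l, img_pts_r, K1, D1, K2, D2, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    # Rectification so that epipolar lines become corresponding image rows.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, image_size, R, T)
    maps_l = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
    maps_r = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)
    return maps_l, maps_r, Q

# Usage on each frame pair: rect_l = cv2.remap(img_l, *maps_l, cv2.INTER_LINEAR)
```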
Step 12: the initial aircraft area is located in the images obtained by the two cameras through a detection algorithm and a tracking algorithm. This area is the input of the subsequent calculations; it reduces the calculation region for the subsequent algorithms and thus increases their calculation speed.
Here, three thousand (the number may be adjusted in practice) two-dimensional images containing only the aircraft area are first acquired as positive samples, and six thousand (likewise adjustable) two-dimensional images containing no aircraft area as negative samples. Feature information capable of distinguishing the aircraft is screened out by analyzing the rectangular features in the positive and negative samples. The feature information is combined in a cascade, so that the area where the aircraft is located can be accurately identified in the image through the cascaded rectangular features.
The aircraft area is then substituted into a kernelized correlation filter (KCF) tracking algorithm to track the aircraft area in the two-dimensional images shot by the left and right cameras, and the initial aircraft positioning areas in the two-dimensional images of the two cameras are obtained by weighting the aircraft areas obtained by the detection algorithm and the tracking algorithm with the corresponding algorithm weights.
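A compact Python/OpenCV sketch of this detection-plus-tracking fusion is given below; the cascade file name, detector parameters and fusion weights are illustrative assumptions (the patent only states that the two results are combined with corresponding weights), and cv2.TrackerKCF requires the opencv-contrib build:

```python
import cv2

class AircraftLocator:
    """Fuses cascade detection and KCF tracking with fixed weights."""
    def __init__(self, cascade_path="aircraft_cascade.xml", w_det=0.6, w_trk=0.4):
        self.detector = cv2.CascadeClassifier(cascade_path)  # hypothetical model file
        self.tracker = None
        self.w_det, self.w_trk = w_det, w_trk                # assumed fusion weights

    def locate(self, frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        boxes = self.detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
        det = max(boxes, key=lambda b: b[2] * b[3]) if len(boxes) else None
        if self.tracker is None:                   # initialize tracking on first detection
            if det is None:
                return None
            self.tracker = cv2.TrackerKCF_create() # cv2.legacy.TrackerKCF_create() on some builds
            self.tracker.init(frame, tuple(int(v) for v in det))
            return tuple(int(v) for v in det)
        ok, trk = self.tracker.update(frame)
        if det is None:
            return tuple(int(v) for v in trk) if ok else None
        if not ok:
            return tuple(int(v) for v in det)
        # Weighted fusion of the detector and tracker rectangles (x, y, w, h).
        return tuple(int(self.w_det * d + self.w_trk * t) for d, t in zip(det, trk))
```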
Step 13: stereo matching is carried out on the located aircraft areas in the images acquired by the left and right cameras through the SGBM stereo matching algorithm to acquire a disparity map, and the three-dimensional point cloud of the initial aircraft positioning area is solved by combining the internal and external parameters of the system calibrated in step 11.
Here, since the images have been rectified, matching is constrained by the epipolar constraint to the corresponding image rows, reducing the possibility of false matches. According to the constraint relation between the value of each pixel and its surrounding pixels, the pixel position with the minimum cost in the corresponding row of the other image is calculated, and the disparity of the point is solved from the corresponding pixel pair. Traversing the whole area yields the initial local disparity map of the area image. The initial local disparity map is filtered and mismatched points are screened out to obtain the disparity map.
the stereo matching algorithm can be mainly divided into the following four steps:
(1) Preprocessing:
using a horizontal Sobel operator to perform the following processing on each point of the aircraft initial positioning area in the two-dimensional image:
Sobel(x,y) = 2[P(x+1,y) - P(x-1,y)] + [P(x+1,y-1) - P(x-1,y-1)] + [P(x+1,y+1) - P(x-1,y+1)] (1)
wherein Sobel(x, y) represents the horizontal Sobel response at the point, and P(x, y) represents the pixel value of the currently calculated pixel (x, y).
Each pixel of the processed area is then passed through a mapping function to form a new two-dimensional image area.
The mapping function is as follows:
P_N = 0, if P ≤ -preFilterCap
P_N = P + preFilterCap, if -preFilterCap < P < preFilterCap (2)
P_N = 2·preFilterCap, if P ≥ preFilterCap
wherein preFilterCap is a constant parameter, generally taken as 63; P is the (Sobel-filtered) pixel value of the current calculation point in the initial aircraft positioning area of the two-dimensional image, and P_N represents the pixel value corresponding to P in the new two-dimensional image area produced by the mapping function.
The preprocessing actually obtains the gradient information of the initially located aircraft region in the two-dimensional image. The new two-dimensional image area formed after preprocessing the initial positioning area is stored for the cost calculation.
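The preprocessing of equations (1) and (2) can be sketched in numpy as follows; the clamped mapping form is an assumption consistent with the stated constant preFilterCap = 63:

```python
import numpy as np

def prefilter(img, pre_filter_cap=63):
    P = img.astype(np.int32)
    sobel = np.zeros_like(P)
    # Equation (1): horizontal Sobel operator on the interior pixels.
    sobel[1:-1, 1:-1] = (2 * (P[1:-1, 2:] - P[1:-1, :-2])
                         + (P[:-2, 2:] - P[:-2, :-2])
                         + (P[2:, 2:] - P[2:, :-2]))
    # Equation (2): clamp the response into [0, 2 * preFilterCap].
    return np.clip(sobel + pre_filter_cap, 0, 2 * pre_filter_cap).astype(np.uint8)
```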
(2) Cost calculation:
the cost of each pixel is calculated using the sum of absolute errors algorithm.
(3) Global optimization:
and (4) performing global optimization by adopting a dynamic programming mode, namely solving a minimum cost path. For each point p in the area, 8 paths are set at intervals of 45 ° around p, and the minimum cost path L is calculated from the 8 pathsr(p,d)。
Figure BDA0001650511150000061
Wherein, P1,P2For dynamic parameter planning, according to the actual use ringAdjusting the environment; l isr(p, d) represents the minimum cost path along the current direction (i.e. from left to right) when the disparity of the current calculation pixel point p takes the value of d.
Figure BDA0001650511150000062
Indicating that along the current direction (i.e., from left to right), the disparity value of the current calculation pixel point p is smaller than the minimum value of the minimum cost path at k.
(4) Post-processing:
After the matching of each pixel in the initial aircraft positioning area of the left camera image is completed, each pixel in the initial aircraft positioning area of the right camera image is matched back against the left camera image. If the disparities obtained from the two matchings for a pixel differ, the pixel is regarded as an invalid match.
Through the above process, the disparity map of the initial aircraft positioning area in the two-dimensional image can be calculated.
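OpenCV's SGBM implementation follows this same preprocessing / cost calculation / path aggregation / consistency check pipeline, so step 13 can be sketched as below; the disparity range, window size and uniqueness ratio are illustrative assumptions:

```python
import cv2

def disparity_map(rect_left, rect_right, num_disp=128, block=9):
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=num_disp,     # search range; must be a multiple of 16
        blockSize=block,
        preFilterCap=63,             # the constant of equation (2)
        P1=8 * block * block,        # penalty for small disparity changes
        P2=32 * block * block,       # penalty for large changes (P2 > P1)
        uniquenessRatio=10,
        disp12MaxDiff=1,             # left-right consistency check of step (4)
    )
    # OpenCV returns fixed-point disparities scaled by 16.
    return sgbm.compute(rect_left, rect_right).astype('float32') / 16.0
```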
According to the stereo vision model and the camera pinhole imaging principle, the schematic diagram of the parallax method is shown in fig. 2: P in FIG. 2 is a point in space; P_l and P_r are the imaging points of P on the left and right camera image planes; f is the focal length; and O_l and O_r are the optical centers of the left and right cameras. The distance between O_l and O_r is the binocular baseline, and the distance from P to the line connecting O_l and O_r is the actual distance. The optical axes of the left and right cameras are parallel. x_l and x_r are the distances of the two imaging points from the left edge of the image on the left and right camera image planes.
Since the two cameras are calibrated, the epipolar lines are parallel and the two optical-axis directions are also parallel. The relationship between parallax and object depth is as follows:
(b - (x_l - x_r)) / b = (Z - f) / Z (4)
wherein x_l and x_r are the distances of the two imaging points from the left edge of the image on the left and right camera image planes, b is the distance between the optical centers O_l and O_r of the left and right cameras, and Z is the actual distance from point P to the camera.
It can be deduced that:
Z = f·b / (x_l - x_r) = f·b / d (5)
wherein x_l and x_r are the distances of the two imaging points from the left edge of the image on the left and right camera image planes, d is the parallax between the imaging points, i.e., d = x_l - x_r, b is the distance between the optical centers O_l and O_r of the left and right cameras, and f is the focal length of the camera.
Through the above formula and the solved disparity map of the initial aircraft positioning area, the three-dimensional point cloud of the initial aircraft positioning area in the two-dimensional image can be obtained.
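A minimal numpy sketch of this back-projection, assuming the focal length f (in pixels), the baseline b and the principal point (cx, cy) are known from the calibration of step 11:

```python
import numpy as np

def disparity_to_points(disp, f, b, cx, cy):
    ys, xs = np.nonzero(disp > 0)          # keep only valid disparities
    d = disp[ys, xs]
    Z = f * b / d                          # equation (5)
    X = (xs - cx) * Z / f                  # pinhole model
    Y = (ys - cy) * Z / f
    return np.stack([X, Y, Z], axis=1)     # (N, 3) point cloud
```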
Step 14: and screening two characteristic point sets in the two-dimensional image initial positioning area, and combining three-dimensional point cloud data to obtain two corresponding three-dimensional point sets.
According to its overall appearance, the aircraft can be simplified into an object consisting of two planes, namely the plane formed by the nose and tail and the plane formed by the wings. Therefore, the measured attitude of the aircraft can be obtained by solving the equations of the plane where the nose and tail lie and of the wing plane.
In the initial positioning area of the two-dimensional image, an edge extraction algorithm is used to obtain the edge features inside the area. The edge features are combined with the three-dimensional point cloud to obtain the aircraft contour information, forming the aircraft silhouette area (see fig. 3, a schematic diagram of the aircraft: the two shaded planes are the plane formed by the nose and tail and the wing plane, the black edge is the aircraft contour information, the middle line is the aircraft skeleton information, and the points highlighted on the skeleton are the extracted aircraft key feature points). A skeleton extraction algorithm is adopted to extract the skeleton within the aircraft silhouette area, and the skeleton is topologically reconstructed with a binary tree structure according to its characteristics. A skeleton branch pruning method adapted to the characteristics of the aircraft target skeleton is established to complete the extraction of the target main skeleton (shown in fig. 3).
From the main skeleton of the aircraft, combined with its shape characteristics, the nose, tail and two wings can be located in the image. Combining the extracted edge features, key points on the nose, tail and two wings are extracted from the image, defined as key feature points, and labeled and classified: the feature points on the nose and tail form key feature point set a, and the feature points on the two wings form key feature point set b.
According to the relation between the two-dimensional image and the disparity map, the three-dimensional point corresponding to each two-dimensional point in the key feature point sets can be extracted from the three-dimensional point cloud, forming the two corresponding three-dimensional feature point sets A and B.
Step 15: the plane equations for each three-dimensional point set obtained in step 14 are solved using the RANSAC method.
In the invention, the RANSAC method iterates a fixed number of times over each of the three-dimensional point sets A and B extracted in step 14 to obtain a good model (namely a plane equation) as the equation of the plane where each point set lies, i.e., the plane formed by the nose and tail of the aircraft and the aircraft wing plane are obtained.
RANSAC is an abbreviation for "RANdom SAmple Consensus". It can iteratively estimate the parameters of the mathematical model from a set of observed data sets comprising "outliers".
The input to the RANSAC algorithm is a set of observations (the three-dimensional point sets A and B solved in step 14), and the plane is arrived at by iteratively selecting random subsets of the data. The selected subset is assumed to consist of inliers and is verified by the following method:
(1) A plane model is fitted to the assumed inliers, i.e., all plane model parameters are calculated from them.
(2) The remaining observation data are tested against the plane model obtained in (1): the distance between each remaining three-dimensional point and the plane is calculated, and a point whose distance is less than a set threshold δ is also regarded as an inlier.
(3) If at least 50% of the points in the input observations (the three-dimensional point sets A and B solved in step 14) are classified as inliers, the calculated plane model is considered reasonable.
(4) The plane model is then recalculated from all of the obtained inliers.
(5) Finally, the plane model is evaluated by estimating the error rate of the inliers with respect to the model.
The above process is repeated a fixed number of times; each resulting plane model is either discarded because it has too few inliers or selected because it is better than the existing plane model. Finally, the two plane equations, of the plane formed by the nose and tail of the aircraft and of the aircraft wing plane, are solved respectively.
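A compact numpy sketch of this RANSAC plane fit follows; δ, the iteration count and the 50% inlier ratio are the quantities named above, with otherwise illustrative values:

```python
import numpy as np

def ransac_plane(points, delta=0.05, iters=200, min_inlier_ratio=0.5):
    """points: (N, 3) array. Returns (a, b, c, d) with a*x + b*y + c*z + d = 0."""
    rng = np.random.default_rng(0)
    best_plane, best_count = None, 0
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)        # normal of the sampled plane
        if np.linalg.norm(n) < 1e-9:          # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        dist = np.abs((points - p0) @ n)      # point-to-plane distances
        inliers = dist < delta                # step (2): threshold test
        if inliers.sum() > best_count and inliers.mean() >= min_inlier_ratio:
            # Step (4): refit the plane on all inliers by least squares (SVD).
            c = points[inliers].mean(axis=0)
            normal = np.linalg.svd(points[inliers] - c)[2][-1]
            best_plane = (*normal, -normal @ c)
            best_count = inliers.sum()
    return best_plane
```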
Step 16: and filtering background and noise points in the three-dimensional point cloud through a plane equation, and solving the spatial pose of the airplane by combining the morphological characteristics of the airplane.
The three-dimensional attitude of the aircraft is actually its attitude angles in the air relative to a ground measurement coordinate system. Therefore, three conversions are needed: from the aircraft local coordinate system (the coordinate system fixed to the aircraft itself) to the camera coordinate system, and then to the ground measurement coordinate system. The coordinates and angles of the camera coordinate system relative to the ground measurement coordinate system can be solved by means such as a photoelectric theodolite. Therefore, the absolute attitude of the aircraft can be solved as long as the attitude of the aircraft in the camera coordinate system (i.e., the pose of the aircraft local coordinate system) is determined in the sequence images.
Using the equations of the plane formed by the nose and tail and of the wing plane obtained in step 15, the points near the two planes, namely the point sets within each plane, are obtained from the three-dimensional point cloud of step 13. Meanwhile, using the relationship that the two planes are perpendicular to each other, noise points are filtered, and more accurate plane equations are obtained by further iteration. Then, from the two newly solved plane equations, the actual shape and size of the aircraft and the three-dimensional point cloud data, the intersection line of the two planes is the straight line where the fuselage lies. The center position of the fuselage can be located from the position of the nose and the actual size of the aircraft and is taken as the origin of the aircraft local coordinate system. The direction of the two wings (namely the direction of the line from the left wing to the right wing) is defined as the X axis; the direction in which the nose points (namely the direction of the line from the tail to the nose, or the longitudinal axis of the fuselage) is the Y axis; and the normal of the aircraft fuselage plane (or the plane where the cabin floor lies), i.e., the vertical direction, is the Z axis. The aircraft local coordinate system can therefore be determined from the origin and the two plane equations, and the attitude of the aircraft in the camera coordinate system is solved.
As shown in FIG. 5, the camera coordinate system O_c x_c y_c z_c and the aircraft local coordinate system O_p x_p y_p z_p are established.
The camera coordinate system O_c x_c y_c z_c takes the optical center of the left camera of the stereo vision system as its origin: the horizontal direction pointing toward the right camera is the O_c x_c axis, the direction perpendicular to the ground is the O_c y_c axis, and the direction pointing horizontally to the front of the left camera is the O_c z_c axis.
The plane equations of the plane where the nose and tail of the aircraft lie and of the aircraft wing plane are combined simultaneously:
A1·x + B1·y + C1·z + D1 = 0
A2·x + B2·y + C2·z + D2 = 0 (6)
wherein A1, B1, C1, D1, A2, B2, C2, D2 are the two sets of plane equation parameters solved in step 15. The straight line where the fuselage lies is known from this system of equations, and the center point of the aircraft is also on this straight line.
Combining the spatial line equation (formula 6) and the aircraft main skeleton information obtained in step 14 with the actual size of the aircraft, the center point (centroid) of the aircraft is solved. The aircraft local coordinate system O_p x_p y_p z_p is thus defined: the center point (centroid) of the aircraft is taken as the origin O_p of the aircraft local coordinate system, the wing direction (pointing to the right wing) is defined as the O_p x_p axis, the nose direction as the O_p y_p axis, and the direction perpendicular to the fuselage plane as the O_p z_p axis. Since this coordinate system varies with the attitude of the aircraft, finding the attitude of the aircraft is equivalent to finding this coordinate system. The initial state of the aircraft is assumed to be that the aircraft local coordinate system coincides with the camera coordinate system, at which moment the initial attitude is 0 degrees.
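As an illustrative sketch of how the local coordinate system can be built from the two refined plane normals (n1 for the nose/tail plane, n2 for the wing plane; the nose reference point used to fix the sign of the y axis, and the handling of the up/down sign of z, are assumptions):

```python
import numpy as np

def aircraft_frame(n1, n2, origin, nose_point):
    """Columns of the returned 3x3 matrix are the O_p x_p, y_p, z_p axes
    expressed in the camera coordinate system; it is the inverse (transpose)
    of the transfer matrix C_c^p of equation (7) below."""
    n1 = n1 / np.linalg.norm(n1)             # normal of the nose/tail plane
    n2 = n2 / np.linalg.norm(n2)             # normal of the wing plane
    y = np.cross(n1, n2)                     # plane intersection = fuselage line
    y = y / np.linalg.norm(y)
    if y @ (nose_point - origin) < 0:        # make +y point toward the nose
        y = -y
    z = n2 - (n2 @ y) * y                    # wing-plane normal, re-orthogonalized
    z = z / np.linalg.norm(z)                # (up/down sign needs one more reference)
    x = np.cross(y, z)                       # right-handed frame: x = y × z
    return np.column_stack([x, y, z])
```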
In the inertial system, the aircraft attitude is described mainly by three Euler angles: pitch, yaw and roll. Pitch is rotation about the X axis, also called the pitch angle θ; yaw is rotation about the Z axis, also called the yaw angle ψ; and roll is rotation about the Y axis, also called the roll angle γ. When Euler angles are used to describe the rotation of an object, not only the angles but also the rotation order must be given; the general rotation order is yaw, pitch, roll. Applied to the coordinate axes defined here, this means rotating first about the Z axis, then about the X axis, and finally about the Y axis.
The transformation between coordinate systems is generally expressed by a direction cosine matrix:
r_p = C_c^p · r_c (7)
wherein C_c^p is the transfer matrix from the camera coordinate system O_c x_c y_c z_c to the aircraft local coordinate system O_p x_p y_p z_p; r_p is a three-dimensional coordinate in the aircraft local coordinate system, and r_c is a three-dimensional coordinate in the camera coordinate system.
From the definitions of the Euler angles and the cosine matrix, the direction cosine matrix can be decomposed into single rotations about the three axes. Rotating from the camera coordinate system O_c x_c y_c z_c to intermediate coordinate system one O_1 x_1 y_1 z_1 (the coordinate system obtained after rotating the camera coordinate system about the O_c z_c axis), then to intermediate coordinate system two O_2 x_2 y_2 z_2 (the coordinate system obtained after rotating intermediate coordinate system one about the O_c x_c axis), and finally to the aircraft local coordinate system O_p x_p y_p z_p, the three single rotations are:
C_c^1(ψ) = [cos ψ, sin ψ, 0; -sin ψ, cos ψ, 0; 0, 0, 1]
C_1^2(θ) = [1, 0, 0; 0, cos θ, sin θ; 0, -sin θ, cos θ] (8)
C_2^p(γ) = [cos γ, 0, -sin γ; 0, 1, 0; sin γ, 0, cos γ]
In summary, it can be seen that:
C_c^p = C_2^p(γ) · C_1^2(θ) · C_c^1(ψ) (9)
wherein C_c^1 is the rotation matrix from the camera coordinate system O_c x_c y_c z_c to intermediate coordinate system one O_1 x_1 y_1 z_1 (the coordinate system obtained after rotating the camera coordinate system about the O_c z_c axis); C_1^2 is the rotation matrix from intermediate coordinate system one to intermediate coordinate system two O_2 x_2 y_2 z_2 (the coordinate system obtained after rotating intermediate coordinate system one about the O_c x_c axis); and C_2^p is the rotation matrix from intermediate coordinate system two to the aircraft local coordinate system O_p x_p y_p z_p. ψ, θ and γ are the rotation angles corresponding to the three single rotations. θ and ψ can be obtained by solving from the straight line where the fuselage lies (formula 6) relative to the camera coordinate system; γ can be obtained by solving the included angle between the aircraft wing plane and the plane formed by the fuselage line and the Z axis of the camera coordinate system.
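As a cross-check of the rotation chain, the following numpy sketch composes the direction cosine matrix of equation (9) and recovers the three attitude angles from it, assuming the non-degenerate case cos θ ≠ 0:

```python
import numpy as np

def dcm(psi, theta, gamma):
    """Equation (9): C_c^p = C_2^p(gamma) @ C_1^2(theta) @ C_c^1(psi),
    i.e., rotate about z (yaw), then x (pitch), then y (roll)."""
    cz, sz = np.cos(psi), np.sin(psi)
    cx, sx = np.cos(theta), np.sin(theta)
    cy, sy = np.cos(gamma), np.sin(gamma)
    Cz = np.array([[cz, sz, 0], [-sz, cz, 0], [0, 0, 1]])   # C_c^1(psi)
    Cx = np.array([[1, 0, 0], [0, cx, sx], [0, -sx, cx]])   # C_1^2(theta)
    Cy = np.array([[cy, 0, -sy], [0, 1, 0], [sy, 0, cy]])   # C_2^p(gamma)
    return Cy @ Cx @ Cz

def angles_from_dcm(C):
    """Recover (psi, theta, gamma) from C = Cy @ Cx @ Cz."""
    theta = np.arcsin(C[1, 2])              # C[1, 2] = sin(theta)
    psi = np.arctan2(-C[1, 0], C[1, 1])     # C[1, 0] = -cos(theta)·sin(psi)
    gamma = np.arctan2(-C[0, 2], C[2, 2])   # C[0, 2] = -sin(gamma)·cos(theta)
    return psi, theta, gamma
```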
The above examples are provided only to describe the present invention and are not intended to limit its scope, which is defined by the appended claims. Various equivalent substitutions and modifications can be made without departing from the spirit and principles of the invention and are intended to fall within its scope.

Claims (4)

1. A near distance aircraft pose measurement method based on a parallax method is characterized by comprising the following steps:
a. for two cameras which are placed in parallel, defining a camera coordinate system which takes the optical center of any one of the two cameras as the origin of a coordinate axis, obtaining internal parameters of the two cameras and external parameters of the two cameras, namely a rotation matrix and a translation vector, through calibration, and correcting two-dimensional images shot by the two cameras by adopting the internal and external parameters and the external parameters between the cameras;
b. shooting thousands of actually used two-dimensional images containing airplanes in advance, marking out the two-dimensional images only containing airplane areas as a positive sample set, and marking out the two-dimensional images of non-airplane areas as a negative sample set; training an airplane classifier through a positive sample set and a negative sample set, and respectively detecting airplane areas in the corrected two-dimensional images of the two cameras by using the airplane classifier; substituting the airplane area detected in the two-dimensional image into a tracking algorithm, tracking the airplane area in the two-dimensional image, and obtaining an airplane initial positioning area in the two-dimensional image of the two cameras by combining the airplane area and the algorithm corresponding weight respectively obtained by the detection algorithm and the tracking algorithm;
c. stereo matching is carried out on the aircraft initial positioning area in the two-dimensional images of the two cameras to obtain a disparity map, and the disparity map is solved to obtain a three-dimensional point cloud of the aircraft initial positioning area in the two-dimensional image of the camera with the optical center defined as the origin;
d. point screening is carried out in an initial positioning area of the airplane in a two-dimensional image of a camera with an optical center defined as an origin, three-dimensional point sets of key feature points in the plane where the head and the empennage of the airplane are located and three-dimensional point sets of key feature points in the plane of the wings of the airplane are obtained by combining the three-dimensional point clouds in the step c, a plane equation of each three-dimensional point set is solved by using RANSAC, and a plane equation of the plane where the head and the empennage of the airplane are located and the plane equation of the plane of the wings of the airplane are obtained;
e. according to the plane equation of the plane where the head and the empennage of the airplane are located and the plane of the wings of the airplane obtained in the step d, filtering background points and noise points by combining the shape of the airplane and three-dimensional data to obtain a three-dimensional point set near each plane, namely a point set in each plane, further iterating to obtain a new plane equation, determining a local coordinate system of the airplane by using the new two plane equations, and solving the space pose of the airplane;
in the step d, the implementation steps of solving the plane equation of the plane where the head and the empennage of the airplane are located and the plane of the wings of the airplane are as follows:
(1) in an initial positioning area of the airplane in the two-dimensional image, screening and extracting two-dimensional points of the airplane in the image by combining contour and skeleton information, classifying and marking the extracted key feature points, and dividing the key feature points into a key feature point set a on the head and the tail of the airplane and a key feature point set b on the wings of the airplane according to distribution positions;
(2) screening corresponding three-dimensional points of the key characteristic points from the three-dimensional point cloud according to the one-to-one corresponding relation between the points of the two-dimensional image and the points in the parallax image to form a three-dimensional point set A on the head and the tail of the airplane and a three-dimensional point set B on the wings of the airplane;
(3) respectively solving equations of planes where the two three-dimensional point sets are located by using a RANSAC algorithm, namely solving plane equations of planes where the head and the empennage of the airplane are located and planes of wings of the airplane are located;
the step e is specifically realized as follows:
(1) d, screening three-dimensional points of the three-dimensional point cloud of the aircraft initial positioning area in the two-dimensional image solved in the step c by using each plane equation solved in the step d, and filtering background points and noise points according to the distance from the points to the plane; determining the straight line of the airplane body according to the plane of the head and the empennage of the airplane and the plane of the wings of the airplane;
(2) solving the space pose of the airplane under a camera system coordinate system according to the size proportion and the shape characteristics of the actual model of the airplane;
(3) and solving the space pose of the airplane under the geodetic coordinate system through the mutual conversion relation between the coordinate systems.
2. The near distance aircraft pose measurement method based on the parallax method according to claim 1, characterized in that: in the step a, the step of obtaining the internal parameters of the two cameras and the external parameters of the two cameras by calibration comprises the following steps:
(1) shooting a target picture, extracting angular points in the target picture, and calibrating internal parameters of the two cameras;
(2) according to the internal parameters of the two cameras and the target pictures, external parameters of the two cameras, namely a rotation matrix and a translation vector, are calibrated;
(3) and correcting the two-dimensional images shot by the two cameras according to the internal parameters of the cameras and the external parameters of the two cameras.
3. The near distance aircraft pose measurement method based on the parallax method according to claim 1, characterized in that: in the step b, the implementation steps of obtaining the initial positioning area of the airplane in the two-dimensional images in the two cameras are as follows:
(1) extracting a two-dimensional image only containing an airplane area from a two-dimensional image shot in advance to form a positive sample set; taking a two-dimensional image without an airplane area as a negative sample set, acquiring airplane rectangular characteristics through the positive and negative sample set, and training an airplane classifier;
(2) positioning an airplane area in the initial frame image by using an airplane classifier, and initializing a tracking algorithm by using the airplane area;
(3) and in the subsequent two-dimensional image sequence, respectively operating a detection algorithm and a tracking algorithm, and performing strategic fusion on the airplane areas and the algorithm weights obtained by the two algorithms to obtain the airplane initial positioning areas in the two-dimensional images shot by the two cameras.
4. The near distance aircraft pose measurement method based on the parallax method according to claim 1, characterized in that: in the step c, solving the disparity map to obtain a three-dimensional point cloud of an aircraft initial positioning area in a two-dimensional image of a camera with an optical center defined as an origin is realized by the following steps:
(1) calculating the matching degree of the area around each pixel point in the initial positioning area of the airplane in the two-dimensional images shot by the two cameras in the step b, and respectively calculating the matching degree of the areas around the corresponding pixels in the initial positioning area of the airplane in the two-dimensional images shot by the two cameras to obtain an initial parallax map;
(2) filtering the initial disparity map, and screening out mismatching points to obtain a disparity map;
(3) based on a stereoscopic vision model and a camera pinhole imaging principle, the three-dimensional coordinates of each two-dimensional image point in the aircraft initial positioning area in the two-dimensional image of the left camera are calculated by combining a parallax map, and three-dimensional point cloud of the aircraft initial positioning area is formed.
CN201810420236.9A 2018-05-04 2018-05-04 Near distance airplane pose measuring method based on parallax method Active CN108665499B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810420236.9A CN108665499B (en) 2018-05-04 2018-05-04 Near distance airplane pose measuring method based on parallax method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810420236.9A CN108665499B (en) 2018-05-04 2018-05-04 Near distance airplane pose measuring method based on parallax method

Publications (2)

Publication Number Publication Date
CN108665499A CN108665499A (en) 2018-10-16
CN108665499B true CN108665499B (en) 2021-08-10

Family

ID=63781859

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810420236.9A Active CN108665499B (en) 2018-05-04 2018-05-04 Near distance airplane pose measuring method based on parallax method

Country Status (1)

Country Link
CN (1) CN108665499B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111288956B (en) * 2018-12-07 2022-04-22 顺丰科技有限公司 Target attitude determination method, device, equipment and storage medium
CN110246212B (en) * 2019-05-05 2023-02-07 上海工程技术大学 Target three-dimensional reconstruction method based on self-supervision learning
CN111238441B (en) * 2020-02-14 2022-10-04 天津时空经纬测控技术有限公司 Angular deviation measuring method, angular deviation measuring device, and storage medium
CN111966953A (en) * 2020-08-07 2020-11-20 四川泛华航空仪表电器有限公司 Aircraft attitude angle calculation method based on coordinate transformation
CN112150532A (en) * 2020-08-25 2020-12-29 北京迈格威科技有限公司 Image processing method and device, electronic equipment and computer readable medium
CN112037282B (en) * 2020-09-04 2021-06-15 北京航空航天大学 Aircraft attitude estimation method and system based on key points and skeleton
CN112862862B (en) * 2021-02-10 2023-11-17 中国飞行试验研究院 Aircraft autonomous oil receiving device based on artificial intelligence visual tracking and application method
CN114166137B (en) * 2021-11-26 2024-09-10 沪东中华造船(集团)有限公司 Ship-to-ship filling interval intelligent detection system and method
CN114355957A (en) * 2021-12-29 2022-04-15 深圳市镭神智能系统有限公司 Unmanned aerial vehicle autonomous landing method and system and unmanned aerial vehicle
CN116112769A (en) * 2023-01-18 2023-05-12 江南大学 Shooting control method and system for camera outside vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102506757A (en) * 2011-10-10 2012-06-20 南京航空航天大学 Self-positioning method of binocular stereo measuring system in multiple-visual angle measurement
CN104317391A (en) * 2014-09-24 2015-01-28 华中科技大学 Stereoscopic vision-based three-dimensional palm posture recognition interactive method and system
CN106898022A (en) * 2017-01-17 2017-06-27 徐渊 A kind of hand-held quick three-dimensional scanning system and method
CN107945220A (en) * 2017-11-30 2018-04-20 华中科技大学 A kind of method for reconstructing based on binocular vision

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894366B (en) * 2009-05-21 2014-01-29 北京中星微电子有限公司 Method and device for acquiring calibration parameters and video monitoring system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102506757A (en) * 2011-10-10 2012-06-20 南京航空航天大学 Self-positioning method of binocular stereo measuring system in multiple-visual angle measurement
CN104317391A (en) * 2014-09-24 2015-01-28 华中科技大学 Stereoscopic vision-based three-dimensional palm posture recognition interactive method and system
CN106898022A (en) * 2017-01-17 2017-06-27 徐渊 A kind of hand-held quick three-dimensional scanning system and method
CN107945220A (en) * 2017-11-30 2018-04-20 华中科技大学 A kind of method for reconstructing based on binocular vision

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A new flexible target calibration method for stereo vision sensors; Liu Zhen; Acta Optica Sinica; 2013-03-31; Vol. 33, No. 03; pp. 205-211 *

Also Published As

Publication number Publication date
CN108665499A (en) 2018-10-16

Similar Documents

Publication Publication Date Title
CN108665499B (en) Near distance airplane pose measuring method based on parallax method
CN108955685B (en) Refueling aircraft taper sleeve pose measuring method based on stereoscopic vision
CN110426051B (en) Lane line drawing method and device and storage medium
US11120560B2 (en) System and method for real-time location tracking of a drone
CN107314762B (en) Method for detecting ground object distance below power line based on monocular sequence images of unmanned aerial vehicle
CN101598556B (en) Unmanned aerial vehicle vision/inertia integrated navigation method in unknown environment
CN110296691A (en) Merge the binocular stereo vision measurement method and system of IMU calibration
CN113850126A (en) Target detection and three-dimensional positioning method and system based on unmanned aerial vehicle
CN107316325A (en) A kind of airborne laser point cloud based on image registration and Image registration fusion method
CN109360240A (en) A kind of small drone localization method based on binocular vision
CN107300377B (en) A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion
CN109724586B (en) Spacecraft relative pose measurement method integrating depth map and point cloud
CN115187798A (en) Multi-unmanned aerial vehicle high-precision matching positioning method
CN112461204B (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
CN113624231A (en) Inertial vision integrated navigation positioning method based on heterogeneous image matching and aircraft
Xinmei et al. Passive measurement method of tree height and crown diameter using a smartphone
CN116309798A (en) Unmanned aerial vehicle imaging positioning method
CN115423863A (en) Camera pose estimation method and device and computer readable storage medium
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system
CN113740864A (en) Self-pose estimation method for soft landing tail segment of detector based on laser three-dimensional point cloud
CN107063191B (en) A kind of method of photogrammetric regional network entirety relative orientation
CN113340272A (en) Ground target real-time positioning method based on micro-group of unmanned aerial vehicle
CN117115271A (en) Binocular camera external parameter self-calibration method and system in unmanned aerial vehicle flight process
CN112489118B (en) Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant