CN109405835B - Relative pose measurement method based on non-cooperative target straight line and circular monocular image - Google Patents

Relative pose measurement method based on non-cooperative target straight line and circular monocular image

Info

Publication number
CN109405835B
CN109405835B (application CN201710776372.7A)
Authority
CN
China
Prior art keywords
coordinate system
image
camera
circle
straight line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710776372.7A
Other languages
Chinese (zh)
Other versions
CN109405835A (en)
Inventor
孟偲
孙宏超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201710776372.7A priority Critical patent/CN109405835B/en
Publication of CN109405835A publication Critical patent/CN109405835A/en
Application granted granted Critical
Publication of CN109405835B publication Critical patent/CN109405835B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/24 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a relative pose measurement method based on straight-line and circular features of a non-cooperative target in a monocular image, comprising the following steps: A. calibrating the internal parameter matrix K of the measuring camera, and taking the camera coordinate system as the world coordinate system, denoting the projection matrix of the camera by M; B. detecting in the image the projections of a selected circle and straight line on the non-cooperative spacecraft target, the image of the circle being an elliptic curve whose quadratic form is denoted C_q, and the image of the line being a straight line whose homogeneous coordinates are denoted l̄; C. obtaining, from the projection of the circle, the position and normal direction of the circle, including a virtual (false) solution; D. eliminating the virtual solution and determining the rotation angle of the circle from the projection of the straight line. The method uses the circular contour and straight-line edge contour of structural components on the non-cooperative target aircraft, together with their images in a calibrated camera, to recover the pose of the target, thereby reducing the difficulty of identifying and screening the required features in the image, improving the reliability and robustness of the method, and providing accurate pose information of the target for operations on the non-cooperative target.

Description

Relative pose measurement method based on non-cooperative target straight line and circular monocular image
Technical Field
The invention relates to relative navigation technology for space non-cooperative targets, and in particular to a relative pose measurement method based on straight-line and circular features of a non-cooperative target in a monocular image.
Background
The targets of space attack-and-defence confrontation or space on-orbit servicing can be classified into cooperative and non-cooperative targets according to whether a cooperation marker providing effective cooperation information can be installed on the target. Internationally, research on space rendezvous, docking and capture control based on cooperative targets is already mature. The Advanced Video Guidance Sensor (AVGS) developed by NASA in the United States is currently the most advanced short-range monocular pose measurement system based on cooperative markers and has been verified in space missions. China has also carried out on-orbit demonstration and verification through the Shenzhou-8 and Tiangong-1 missions. Space rendezvous and control technology for non-cooperative targets is currently an international research hotspot, and demonstration and verification of non-cooperative rendezvous and control has been or is being carried out by the United States, Europe and other space powers.
At present, under the traction of engineering model development, domestic research on space-platform capture, recognition, matching and measurement of space non-cooperative targets and on formation-flight control has achieved breakthroughs in the navigation and control methods for the long-range phase. The visual image navigation technology required for capturing and controlling a space non-cooperative target in the close-range and ultra-close-range phases is a core technology that must urgently be mastered in developing an intelligent space maneuvering platform, and the pose measurement of space non-cooperative targets based on target features is a key technology within it.
In the visual pose measurement of space missions, the imaging conditions in space are poor and the distance between the tracking spacecraft and the target spacecraft varies greatly, so the image of the target spacecraft formed on the CCD sensor of the tracking spacecraft has low brightness, poor contrast, heavy noise, and large variations in scale and image detail. Detecting target features in such images rapidly, robustly and stably is therefore the biggest difficulty in realizing pose measurement, and screening points and straight lines out of a complex image is a corresponding classic problem in computer vision. Selecting stable, easily identified image features for pose measurement is thus an important basis of space non-cooperative target pose measurement technology.
Disclosure of Invention
In view of this, the main objective of the present invention is to provide a relative pose measurement method based on a non-cooperative target straight line and circular monocular image, so as to improve the robustness and reliability of relative pose measurement for space non-cooperative targets and to reduce the difficulty of identifying and screening the required features in the image.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
a relative pose measurement method based on a non-cooperative target straight line and a round monocular image comprises the following steps:
A. calibrating the internal parameter matrix K of the measuring camera, and taking the camera coordinate system as the world coordinate system, denoting the projection matrix of the camera by M;
B. the images of a selected circle and a selected straight line on the object are detected in the image; the image of the circle is an elliptic curve whose quadratic form is denoted C_q, and the image of the line is still a straight line whose homogeneous coordinates are denoted l̄ = (l_a, l_b, l_c)^T; the radius R of the circle is known;
C. according to the projection of the circle, two groups of positions and normal directions of the circle in a camera coordinate system are obtained, wherein one group is a virtual solution; the method specifically comprises the following steps:
C1, determining the spatial elliptic cone having the camera center as its vertex and passing through the elliptic curve, whose quadratic form is Q = K^T·C_q·K, a real symmetric matrix;
C2, transforming the general elliptic cone into a standard elliptic cone with the same vertex, i.e. diagonalizing the real symmetric matrix Q with an orthogonal matrix P, where P satisfies P^T·Q·P = P^(-1)·Q·P = diag{λ1, λ2, λ3};
C3, according to the properties of the standard elliptic cone, reordering the eigenvalues λ1, λ2, λ3 of the matrix Q and their eigenvectors so that λ1 and λ2 have the same sign and |λ1| ≥ |λ2|, and obtaining the corresponding orthogonal matrix P;
C4, solving the circle-center position and the normal vector of the cutting plane in the standard elliptic-cone coordinate system using the elliptic cone, which gives two groups of solutions:
solution one: center (x'_o1, y'_o1, z'_o1) and normal (n'_1x, n'_1y, n'_1z);
solution two: center (x'_o2, y'_o2, z'_o2) and normal (n'_2x, n'_2y, n'_2z);
both given in closed form in terms of λ1, λ2, λ3 and the known radius R;
C5, converting the circle-center position and normal direction into the camera coordinate system to obtain the center position O_Bm^{C} of the space circle and the normal direction n_m^{C} of its supporting plane in the camera coordinate system {C}:
O_Bm^{C} = P·(x'_om, y'_om, z'_om)^T,  n_m^{C} = P·(n'_mx, n'_my, n'_mz)^T,  m = 1, 2;
the above expression contains two groups of circle positions and normal directions in the camera coordinate system, of which one group is the true solution and the other is a false solution;
D. according to the straight line projection, eliminating the virtual solution and determining the rotation angle of the circle; the method specifically comprises the following steps:
D1, calculating the plane π1 from the camera projection matrix M and the image straight line: its homogeneous coordinates are π1 = M^T·l̄, and its normal direction n_{π1} is the vector of the first three components of π1;
D2, keeping the normal direction n_m^{C} pointing consistently with the camera Z_C axis, i.e. at an acute included angle, and keeping n_{π1} pointing consistently with the camera X_C axis, i.e. at an acute included angle; two groups of pose solutions of the object coordinate system are then obtained as formula (1), each group consisting of the circle-center position O_B and the axis directions X_B, Y_B, Z_B, in which Z_B is the circle normal n_m^{C}, the direction of the straight line L is given by the cross product of n_{π1} and Z_B, and the remaining axis completes the right-handed frame;
D3, for either group of pose solutions, obtaining the transformation matrix {B}T{C} from the object coordinate system {B} to the camera coordinate system {C}, whose rotation part has the axis vectors X_B, Y_B, Z_B as columns and whose translation part is O_B;
D4, selecting a point P on the straight line L, transforming its coordinates in the object coordinate system {B} into its coordinates P^{C} in the camera coordinate system {C}, projecting P^{C} onto the image plane, and eliminating the virtual solution according to whether the projection lies on the image straight line;
wherein:
X_B^{C}, Y_B^{C}, Z_B^{C} are the directions of the space vectors X_B, Y_B, Z_B in the camera coordinate system;
n_{π1} is the spatial normal vector direction of the plane π1;
(x_o, y_o, z_o) are the three coordinate components of O_B in the camera coordinate system;
(n_x, n_y, n_z) are the three coordinate components of the normal vector n in the camera coordinate system;
two groups of poses of the space target in the camera coordinate system are determined by formula (1), in which the position is determined by O_B and the attitude by the three vectors X_B, Y_B, Z_B;
P is an orthogonal matrix, determined by step C2. The non-cooperative target has visible circles and straight lines on it, and the sizes and coordinates of these circles and straight lines in the target's own reference coordinate system are known; the non-cooperative target refers to an aircraft that carries no cooperative marker but whose three-dimensional model data are known.
The non-cooperative target is a space vehicle that carries no cooperative marker but whose three-dimensional model data are known, or a landing zone in which a circular pattern marks a helicopter landing area; the structural parameters of the target are known, and the target has a circular structure and a straight-line edge parallel to the plane of that circle.
The selected circular and straight-line edges on the non-cooperative target spacecraft are visible in the image and can be detected, identified and accurately extracted from the image.
The circular contour selected from the structural components on the non-cooperative target aircraft is specifically the satellite-rocket docking ring of the spacecraft, a rocket thruster nozzle, or a circular-ring pattern marking the helicopter landing area on a landing zone.
The linear edge profile selected by the structural component on the non-cooperative target aircraft is an edge of a solar wing of the aerospace aircraft, an edge of a satellite body, or an edge of an antenna mast or other visible linear edges, or an arrow pattern, an H pattern or a cross pattern which marks the landing direction of the helicopter on a landing zone.
The camera is a monocular camera. The monocular camera is a pinhole imaging model camera, and its internal parameters can be obtained through calibration.
The relative pose measurement method based on the non-cooperative target straight line and circular monocular image has the following advantages:
The method takes as the basis of pose measurement the images, in a calibrated camera, of the circular contour and the straight-line edge on the structure of the non-cooperative target aircraft. (1) The circle and the straight line are fixed structural features and are more stable than isolated point features; (2) pose computation from the image of only one circle and one parallel straight line involves less feature-screening difficulty than selecting the required features from among many similar ones; (3) the full position and attitude of the non-cooperative target aircraft can be determined.
Drawings
FIG. 1 is a schematic diagram of a spatial relationship of a relative pose measurement method based on a non-cooperative target straight line and a round monocular image according to the present invention;
FIG. 2 is a schematic diagram of a satellite model based on a non-cooperative target straight line and circle monocular image relative pose measurement method.
[ description of main symbols ]
1) O_C-X_CY_CZ_C: the camera coordinate system {C};
2) O_B-X_BY_BZ_B: the object coordinate system {B}, with the circle center as origin; the normal vector of the circle support plane π2 is the Z_B axis, and Z_B points so that its included angle with the camera Z_C axis is acute;
3) o-xy: the image-plane physical coordinate system;
4) o-uv: the image-plane pixel coordinate system;
5) L: the straight line on an edge of the space target, parallel to the circle support plane π2;
6) π2: the plane of the circle on the space target;
7) π1: the plane determined by the straight line L and the camera optical center O_C;
8) l: the image straight line formed by the space straight line L on the image plane;
9) Q: the circle selected on the space target, lying in the π2 plane of the object coordinate system {B}, with radius R;
10) q: the elliptic curve that is the projected image of the selected space circle on the image plane;
11) the elliptic cone with the camera center O_C as vertex, defined by the elliptic curve q.
Detailed Description
FIG. 1 is a schematic diagram of a spatial relationship of a relative pose measurement method based on a non-cooperative target straight line and a round monocular image according to the present invention; FIG. 2 is a schematic diagram of a satellite model based on a non-cooperative target straight line and circle monocular image relative pose measurement method.
Referring to fig. 1, in conjunction with the satellite model shown in fig. 2, the method of the present invention measures the relative pose of the non-cooperative target with respect to the camera as follows: the circular contour and the straight-line edge contour selected from structural components on the non-cooperative target are detected and their parameters are extracted, and these are combined with the camera internal parameter matrix to obtain the pose of the non-cooperative target relative to the camera.
Wherein the non-cooperative target has visible circles and straight lines thereon, and the sizes and coordinate information of the circles and the straight lines in the reference coordinate system of the non-cooperative target are known. The non-cooperative target, i.e. the non-cooperative target aircraft, may be an aerospace aircraft which is not provided with the cooperative mark but has known three-dimensional model data information, or a landing zone which marks a helicopter landing area by using a circular pattern.
For example, the spatial pose of the target is restored based on a selected one of the circular and straight edges on the spacecraft and its image on the calibration camera. The spacecraft target structure parameters are known, and the target has a circular structure and straight line edges parallel to the circular structure. Selected circular and straight edges on the non-cooperative target are visible on the image and can be detected and identified.
Two groups of circle positions and normal directions satisfying the projection relation in camera coordinates are recovered from the single circle and its image in the calibrated camera. The position and normal direction of the circle are then combined with the recovered direction of the space straight line to determine the transformation matrix from the space target coordinate system to the camera coordinate system; a point on the space straight line is transformed into the camera coordinate system and projected onto the image plane, and the false solution is eliminated according to whether the projection lies on the image of the straight line, yielding the true pose information of the target.
The circular contour selected by the structural component on the non-cooperative target can be a spacecraft satellite-rocket docking ring or a rocket thruster nozzle, or a circular pattern on a landing zone which marks the landing zone of the helicopter. The linear edge profile selected by the structural component on the non-cooperative target can be the edge of a solar wing of the aerospace vehicle, the edge of a satellite body, an antenna mast or other visible linear edges, or an arrow pattern, an H pattern or a cross pattern which marks the landing direction of the helicopter on a landing zone, but the linear edge is parallel to the plane of the circular profile. The camera is a monocular camera, such as a traditional pinhole imaging model camera, and the internal parameters of the camera can be obtained through calibration.
The following describes in detail the process of the method for measuring the relative pose based on the non-cooperative target straight line and the circular monocular image according to the present invention by using a specific embodiment. The method comprises the following specific steps:
step 11: and calibrating an internal parameter matrix K of the measuring camera, and taking a projection matrix of a coordinate system of the measuring camera as a world coordinate system as M.
Camera calibration here means calibrating the pose-measuring camera with an existing camera calibration method to obtain its internal parameter matrix K.
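With the camera coordinate system taken as the world coordinate system, the projection matrix reduces to M = K·[I | 0]. A minimal sketch in Python/NumPy (the numeric intrinsics are illustrative placeholders, not values from the patent):

```python
import numpy as np

# Illustrative intrinsic parameters (fx, fy, cx, cy are placeholder values
# that would normally come from camera calibration).
fx, fy, cx, cy = 1200.0, 1200.0, 640.0, 512.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# With the camera coordinate system as the world coordinate system,
# the 3x4 projection matrix is M = K [I | 0].
M = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
```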
Step 12: detecting the image features, including elliptic curve detection and straight line detection. Specifically: the images of the selected circle and straight line on the non-cooperative spacecraft target are detected in the image; the image of the circle is an elliptic curve whose quadratic form is denoted C_q, and the image of the line is still a straight line whose homogeneous coordinates are denoted l̄ = (l_a, l_b, l_c)^T. The radius R of the circle is known.
In the satellite model image shown in fig. 2, the ellipse q and the straight line l are detected.
The curve equation for the elliptic curve q is:
a·u² + b·v² + c·u·v + d·u + e·v + f = 0    (1)
wherein (a, b, c, d, e, f) are the fitted parameters of the elliptic curve, which can be rewritten in matrix form as
[u v 1]·C_q·[u v 1]^T = 0
wherein C_q is the algebraic quadratic-form matrix of the elliptic curve:
C_q = [[a, c/2, d/2], [c/2, b, e/2], [d/2, e/2, f]]
the equation for the line l is:
l_a·u + l_b·v + l_c = 0
wherein l̄ = (l_a, l_b, l_c)^T is the homogeneous-coordinate representation of the straight line l.
Step 13: from the projection of the circle, the positions (x'_om, y'_om, z'_om) and normal directions (n'_mx, n'_my, n'_mz) of the circle, including the virtual solution, are obtained, where m = 1 or 2.
The step 13 further comprises:
step 131: byThe camera center being the vertex and the elliptic curve defining a spatial elliptic cone whose quadratic form is denoted Q,Q=KTCAnd K is a real symmetric matrix.
Step 132: convert the general elliptic cone into a standard elliptic cone, i.e. diagonalize the real symmetric matrix Q with an orthogonal matrix P, where P satisfies P^T·Q·P = P^(-1)·Q·P = diag{λ1, λ2, λ3}.
Step 133: according to the properties of the standard elliptic cone, reorder the eigenvalues λ1, λ2, λ3 of the matrix Q and their eigenvectors so that λ1 and λ2 have the same sign and |λ1| ≥ |λ2|, and obtain the corresponding orthogonal matrix P.
Step 134: solve for the circle-center position and the normal vector of the cutting plane in the standard coordinate system using the elliptic cone; there are two groups of solutions, solution one and solution two, given in closed form by equations (11)-(14) below.
the calculation process of the pose solution is as follows. The quadratic form of the elliptic curve q is represented as:
[u v 1]·C_q·[u v 1]^T = 0    (2)
wherein:
C_q = [[a, c/2, d/2], [c/2, b, e/2], [d/2, e/2, f]]    (3)
space point P{C}=[x y z]TAnd image point p ═ u v on the image]TThe corresponding relation is as follows:
z·[u v 1]^T = K·[x y z]^T    (4)
wherein K is the camera internal parameter matrix.
Substituting equation (4) into equation (2) gives the quadratic representation of the elliptic cone in the camera coordinate system:
[x y z]·K^T·C_q·K·[x y z]^T = 0    (5)
Let Q = K^T·C_q·K    (6)
The representation of the elliptic cone in the camera coordinate system is complicated and inconvenient for calculation, so the elliptic cone is converted into a standard coordinate system for calculation, and the result is converted back into the camera coordinate system afterwards. Let (x', y', z') be the coordinate representation in the standard coordinate system. Since the camera coordinate system and the standard coordinate system share the same origin, the conversion between the two coordinate systems is a pure rotation. The matrix Q is real and symmetric, so there must exist an orthogonal matrix P that diagonalizes it, i.e.
P^T·Q·P = P^(-1)·Q·P = diag{λ1, λ2, λ3}    (7)
Let P·[x' y' z']^T = [x y z]^T    (8)
Substituting equation (8) into equation (5) gives:
[x' y' z']·P^T·Q·P·[x' y' z']^T = 0    (9)
After the transformation, the equation of the elliptic cone is:
λ1·x'² + λ2·y'² + λ3·z'² = 0    (10)
since the above formula represents an elliptic cone, λ can be known from the expression of a standard cone123Wherein 2 values are identical in sign and different in sign from one another. When a space circle is imaged as a circle, the elliptical cone becomes a cone, and 2 values of the same sign are equal.
To facilitate the subsequent calculation, λ1, λ2, λ3 and the matrix P = [e1 e2 e3] also need to be processed. Suppose the eigenvalues and normalized eigenvectors of the matrix Q are (μ1, μ2, μ3) and (f1, f2, f3) respectively. If
① μ1 and μ2 have the same sign, and
② ||μ1|| ≥ ||μ2||,
then λ1 = μ1, λ2 = μ2, λ3 = μ3. If (μ1, μ2, μ3) do not satisfy conditions ① and ②, their order is adjusted so that they do, and (f1, f2, f3) are exchanged correspondingly. If f3·[0 0 1]^T > 0, then e3 = f3, else e3 = -f3; e2 = f2 and e1 = e2 × e3.
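A sketch of this eigenvalue reordering and of the construction of P, assuming the cone matrix Q = K^T·C_q·K from equation (6):

```python
import numpy as np

def order_cone_eigensystem(Q):
    """Reorder the eigenvalues/eigenvectors of the real symmetric cone matrix Q so
    that lambda1 and lambda2 share a sign with |lambda1| >= |lambda2| and lambda3
    has the opposite sign, then assemble the orthogonal matrix P = [e1 e2 e3]."""
    mu, F = np.linalg.eigh(Q)                      # eigenvalues mu, unit eigenvectors F[:, i]
    pos = [i for i in range(3) if mu[i] > 0]
    neg = [i for i in range(3) if mu[i] < 0]
    same, other = (pos, neg) if len(pos) == 2 else (neg, pos)   # two eigenvalues share a sign
    i1, i2 = sorted(same, key=lambda i: -abs(mu[i]))            # |lambda1| >= |lambda2|
    i3 = other[0]
    lam = np.array([mu[i1], mu[i2], mu[i3]])
    f2, f3 = F[:, i2], F[:, i3]
    e3 = f3 if f3[2] > 0 else -f3                  # keep e3 pointing along the optical axis
    e2 = f2
    e1 = np.cross(e2, e3)
    P = np.column_stack([e1, e2, e3])
    return lam, P

# Usage (C_q and K as defined in the earlier snippets):
# Q = K.T @ C_q @ K
# lam, P = order_cone_eigensystem(Q)
```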
Then, the circle-center position and the normal vector of the cutting plane are solved using the elliptic cone in the standard coordinate system, giving two groups of solutions:
solution one: circle center (x'_o1, y'_o1, z'_o1) and cutting-plane normal (n'_1x, n'_1y, n'_1z)    (11), (12)
solution two: circle center (x'_o2, y'_o2, z'_o2) and cutting-plane normal (n'_2x, n'_2y, n'_2z)    (13), (14)
Both are closed-form expressions in λ1, λ2, λ3 and R.
wherein: r is the radius of the space circle Q.
Step 135: convert the circle-center position and normal direction into the camera coordinate system to obtain the center position of the space circle and the normal direction of its supporting plane in the camera coordinate system {C}.
Converting the pose results of equations (11)-(14) into the camera coordinate system gives the center position O_Bm^{C} of the space circle and the normal direction n_m^{C} of its supporting plane in the camera coordinate system {C}:
O_Bm^{C} = P·(x'_om, y'_om, z'_om)^T,  n_m^{C} = P·(n'_mx, n'_my, n'_mz)^T,  m = 1, 2    (15)
Equation (15) contains two groups of circle positions and normal directions in the camera coordinate system; one group is the true solution and the other is a false solution.
For example: the coordinates P^{C} = [x y z]^T of any point P on the space circle Q in the camera reference coordinate system {C} and its image point p = [u v]^T satisfy the projection correspondence of equation (4). Substituting into the quadratic equation of the elliptic curve gives:
[x y z]·K^T·C_q·K·[x y z]^T = 0
Let Q = K^T·C_q·K.
The matrix Q is real and symmetric. Decomposing the eigenvalues of Q gives its eigenvalues and normalized eigenvectors (μ1, μ2, μ3) and (f1, f2, f3) respectively.
The pairs [μi, fi] (i = 1, 2, 3) are reordered so that μ1·μ2 > 0 and |μ1| ≥ |μ2|; the new order is denoted [μ'i, f'i] (i = 1, 2, 3).
Let the orthogonal matrix be P = [e1 e2 e3].
If f'3·[0 0 1]^T > 0, then e3 = f'3, else e3 = -f'3.
Let e2 = f'2 and e1 = e2 × e3.
The center position O_Bm^{C} of the space circle and the normal direction n_m^{C} of its supporting plane in the camera coordinate system {C} then follow as in equation (15), which contains two groups of circle positions and normal directions in the camera coordinate system: one group is the true solution and the other is a false solution.
Note: the center position O_Bm^{C} of the space circle and the normal direction n_m^{C} of its supporting plane are obtained in the camera coordinate system {C}; n_m^{C} is kept pointing consistently with the camera Z_C axis, i.e. at an acute included angle, and is negated otherwise.
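Continuing the sketch, the candidates are rotated into the camera coordinate system with the orthogonal matrix P, and each normal is flipped if necessary so that it makes an acute angle with the camera Z_C axis (using Z_C as the reference direction follows the definition of the object coordinate system in the description of the drawings):

```python
import numpy as np

def candidates_in_camera_frame(P, candidates):
    """Rotate the (center, normal) candidates from the standard cone frame into
    the camera frame (equation (15)) and enforce a consistent normal orientation."""
    z_axis = np.array([0.0, 0.0, 1.0])           # camera Z_C axis
    out = []
    for center_std, normal_std in candidates:
        center_cam = P @ center_std
        normal_cam = P @ normal_std
        if normal_cam @ z_axis < 0.0:            # keep an acute angle with Z_C
            normal_cam = -normal_cam
        out.append((center_cam, normal_cam))
    return out
```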
Step 14: according to the projection of the straight line, eliminate the virtual solution and determine the rotation angle of the circle.
As shown in fig. 1, the space straight line L is parallel to the supporting plane π2 of the space circle. L' = π1 ∩ π2 is the intersection line of plane π1 and plane π2; obviously the straight line L is parallel to L'.
Step 141: determine the normal direction of the plane π1.
The camera internal parameter matrix is K and the world coordinate system {W} coincides with the camera coordinate system {C}, so the projection matrix M of the camera is:
M = K·[I | 0]    (16)
The homogeneous coordinates of the plane π1 are:
π1 = M^T·l̄    (17)
Thus the normal direction of the plane π1 is:
n_{π1} = (π_{1x}, π_{1y}, π_{1z})^T, i.e. the vector of the first three components of π1    (18)
note that: holding plane pi1In the normal direction of
Figure GDA00027021397500001210
And
Figure GDA00027021397500001211
the included angle being less than an acute angle, i.e. maintaining a consistent orientation (pi)1x> 0), otherwise make
Figure GDA00027021397500001212
And (6) taking the inverse.
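A sketch of equations (16)-(18), using the projection matrix M and the detected image line l̄ from the earlier snippets:

```python
import numpy as np

def line_backprojection_plane_normal(M, l_img):
    """Normal of the plane pi1 through the camera center and the space line L:
    pi1 = M^T * l_img (equation (17)); its normal is the first three components,
    with the sign fixed so that pi1_x > 0 (equation (18) and the note above)."""
    pi1 = M.T @ l_img                 # homogeneous plane coordinates (4-vector)
    n_pi1 = pi1[:3]
    if n_pi1[0] < 0.0:                # keep a consistent orientation (pi1_x > 0)
        n_pi1 = -n_pi1
    return n_pi1 / np.linalg.norm(n_pi1)
```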
Step 142: checking the pose of the circle to eliminate false solutions.
As can be seen from fig. 1, the circle normal n_m^{C} gives the Z_B axis of the object coordinate system, and the straight line L lies in the plane π1 while being parallel to the plane π2; the direction of L is therefore perpendicular both to n_{π1} and to Z_B, and is given by their cross product. Taking the circle normal as Z_B, taking the line direction as one of the remaining object axes, completing the right-handed frame by a further cross product, and combining with equation (15), the complete pose solution of the space target is obtained as:
two groups {O_Bm; X_Bm, Y_Bm, Z_Bm}, m = 1, 2    (19)
(a code sketch of this construction is given after the symbol list below).
wherein:
X_B^{C}, Y_B^{C}, Z_B^{C} refer to the directions of the space vectors X_B, Y_B, Z_B in the camera coordinate system;
n_{π1} refers to the spatial normal vector direction of the plane π1;
the direction vector of the spatial straight line L is obtained from the cross product described above;
(x_o, y_o, z_o) denote the three coordinate components of O_B in the camera coordinate system;
(n_x, n_y, n_z) refer to the three coordinate components of the normal vector n in the camera coordinate system.
Equation (19) determines two groups of poses of the space target in the camera coordinate system, in which the position is determined by O_B and the attitude by the three vectors X_B, Y_B, Z_B.
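A sketch of assembling one candidate object-to-camera transform from a (center, normal) pair and n_{π1}; assigning the recovered line direction to the X_B axis is an illustrative assumption, since the exact axis labelling of equation (19) is not reproduced above:

```python
import numpy as np

def assemble_pose(center_cam, normal_cam, n_pi1):
    """Candidate transform {B}T{C}: Z_B is the circle normal, the line direction
    (assumed here to be X_B) is the cross product of n_pi1 and Z_B, and Y_B
    completes the right-handed frame; the translation is the circle center."""
    z_b = normal_cam / np.linalg.norm(normal_cam)
    x_b = np.cross(n_pi1, z_b)
    x_b = x_b / np.linalg.norm(x_b)
    y_b = np.cross(z_b, x_b)
    T = np.eye(4)
    T[:3, :3] = np.column_stack([x_b, y_b, z_b])
    T[:3, 3] = center_cam
    return T
```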
Step 143: for either group of pose solutions, the transformation matrix {B}T{C} from the object coordinate system {B} to the camera coordinate system {C} can be obtained:
{B}T{C} = [ X_B  Y_B  Z_B  O_B ; 0  0  0  1 ]    (20)
wherein the axis vectors X_B, Y_B, Z_B (as columns) and O_B are expressed by their components in the camera coordinate system    (21)
step 144: at any point P on the straight line L, the coordinate of the point P in the object coordinate system { B } is P{B}The coordinate in the camera coordinate system { C } is P{C}. According to the coordinate system transformation relationship, there are:
Figure GDA0002702139750000142
then P projects P on the image as:
Figure GDA0002702139750000143
because P ∈ L, there must be P ∈ L according to the invariance of the projective transformation, i.e.:
Figure GDA0002702139750000144
p to be determined by the above formula (22)iIf (i) is 1 or 2 and is substituted for equation (23), equation (23) is satisfied as a correct solution, and if not satisfied, it is a false solution.
Therefore, not only is the false solution of the pose of the circle eliminated, but also the complete pose of the circle is determined.
In the present invention, the specific implementation process of the step of excluding the virtual solution and determining the rotation angle of the circle according to the linear projection is as follows:
assuming that the positions and normal directions of the first set of circles are correct poses, namely:
O_B = O_B1^{C} and Z_B = n_1^{C}.
Let the direction of the straight line L be the cross product of n_{π1} and Z_B, and obtain the remaining object axis by a further cross product; all of the vectors X_B, Y_B, Z_B are then normalized. This gives the transformation matrix {B}T{C} from the object coordinate system {B} to the camera coordinate system {C}:
{B}T{C} = [ X_B  Y_B  Z_B  O_B ; 0  0  0  1 ]
wherein the axis vectors and O_B are expressed by their components in the camera coordinate system.
at any point P on the straight line L, the coordinate of the point P in the object coordinate system { B } is P{B}And converting the coordinate system into a camera coordinate system { C }, and obtaining:
P^{C} = {B}T{C}·P^{B}
then the image homogeneous coordinates of P projected on the image are:
p = K·P^{C} (up to a scale factor)
it is judged whether p falls on the straight line l.
If it is not
Figure GDA0002702139750000155
The first set of solutions is the correct solution; otherwise, the second set of solutions is the correct solution. If the second set of solutions is correct, recalculating
Figure GDA0002702139750000156
And vector normalization processing is performed.
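A sketch of this disambiguation, reusing assemble_pose from the previous snippet; P_B is a hypothetical point of the straight line L whose coordinates in the object coordinate system are assumed known from the target model:

```python
import numpy as np

def select_true_solution(candidates, n_pi1, K, l_img, P_B, tol=2.0):
    """Return the candidate transform whose reprojection of the known line point
    P_B (object-frame coordinates) falls on the detected image line l_img."""
    for center_cam, normal_cam in candidates:
        T = assemble_pose(center_cam, normal_cam, n_pi1)
        P_C = T @ np.append(P_B, 1.0)                       # point in the camera frame (homogeneous)
        p = K @ P_C[:3]
        p = p / p[2]                                        # homogeneous image point [u, v, 1]
        dist = abs(l_img @ p) / np.linalg.norm(l_img[:2])   # point-to-line distance in pixels
        if dist < tol:
            return T
    return None
```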
The invention uses the circular contour and straight-line edge contour of structural components on the non-cooperative target aircraft, together with their images in a calibrated camera, to recover the pose of the target; it eliminates the ambiguity of single-circle projection positioning and the uncertainty of the rotation angle, reduces the difficulty of identifying and screening the required features in the image, and improves the reliability and robustness of the method.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (8)

1. The relative pose measurement method based on the non-cooperative target straight line and the circular monocular image is characterized by comprising the following steps of:
A. calibrating the internal parameter matrix K of the measuring camera, and taking the camera coordinate system as the world coordinate system, denoting the projection matrix of the camera by M;
B. the images of a selected circle and a selected straight line on the object are detected in the image; the image of the circle is an elliptic curve whose quadratic form is denoted C_q, and the image of the line is still a straight line whose homogeneous coordinates are denoted l̄ = (l_a, l_b, l_c)^T; the radius R of the circle is known;
C. according to the projection of the circle, two groups of positions and normal directions of the circle in a camera coordinate system are obtained, wherein one group is a virtual solution; the method specifically comprises the following steps:
C1, determining a spatial elliptic cone from the camera center as the vertex and the elliptic curve, the quadratic form of which is Q = K^T·C_q·K, a real symmetric matrix;
C2, transforming the general elliptic cone into a standard elliptic cone with the same vertex, i.e. diagonalizing the real symmetric matrix Q with an orthogonal matrix P, where P satisfies P^T·Q·P = P^(-1)·Q·P = diag{λ1, λ2, λ3};
C3, according to the properties of the standard elliptic cone, reordering the eigenvalues λ1, λ2, λ3 of the matrix Q and their eigenvectors so that λ1 and λ2 have the same sign and |λ1| ≥ |λ2|, and obtaining the corresponding orthogonal matrix P;
C4, solving the circle-center position and the normal vector of the cutting plane in the standard elliptic-cone coordinate system using the elliptic cone, which gives two groups of solutions:
solution one: center (x'_o1, y'_o1, z'_o1) and normal (n'_1x, n'_1y, n'_1z);
solution two: center (x'_o2, y'_o2, z'_o2) and normal (n'_2x, n'_2y, n'_2z);
both given in closed form in terms of λ1, λ2, λ3 and the known radius R;
C5, converting the circle-center position and normal direction into the camera coordinate system to obtain the center position O_Bm^{C} of the space circle and the normal direction n_m^{C} of its supporting plane in the camera coordinate system {C}:
O_Bm^{C} = P·(x'_om, y'_om, z'_om)^T,  n_m^{C} = P·(n'_mx, n'_my, n'_mz)^T,  m = 1, 2;
the above expression contains two groups of circle positions and normal directions in the camera coordinate system, of which one group is the true solution and the other is a false solution;
D. according to the straight line projection, eliminating the virtual solution and determining the rotation angle of the circle; the method specifically comprises the following steps:
D1, calculating the plane π1 from the camera projection matrix M and the image straight line: its homogeneous coordinates are π1 = M^T·l̄, and its normal direction n_{π1} is the vector of the first three components of π1;
D2, keeping the normal direction n_m^{C} pointing consistently with the camera Z_C axis, i.e. at an acute included angle, and keeping n_{π1} pointing consistently with the camera X_C axis, i.e. at an acute included angle; two groups of pose solutions of the object coordinate system are then obtained as formula (1), each group consisting of the circle-center position O_B and the axis directions X_B, Y_B, Z_B, in which Z_B is the circle normal n_m^{C}, the direction of the straight line L is given by the cross product of n_{π1} and Z_B, and the remaining axis completes the right-handed frame;
D3, for either group of pose solutions, obtaining the transformation matrix {B}T{C} from the object coordinate system {B} to the camera coordinate system {C}, whose rotation part has the axis vectors X_B, Y_B, Z_B as columns and whose translation part is O_B;
D4, selecting a point P on the straight line L, transforming its coordinates in the object coordinate system {B} into its coordinates P^{C} in the camera coordinate system {C}, projecting P^{C} onto the image plane, and eliminating the virtual solution according to whether the projection lies on the image straight line;
wherein:
X_B^{C}, Y_B^{C}, Z_B^{C} are the directions of the space vectors X_B, Y_B, Z_B in the camera coordinate system;
n_{π1} is the spatial normal vector direction of the plane π1;
(x_o, y_o, z_o) are the three coordinate components of O_B in the camera coordinate system;
(n_x, n_y, n_z) are the three coordinate components of the normal vector n in the camera coordinate system;
two groups of poses of the space target in the camera coordinate system are determined by the formula (1), in which the position is determined by O_B and the attitude by the three vectors X_B, Y_B, Z_B;
P is an orthogonal matrix.
2. The method for measuring the relative pose based on the monocular image of the non-cooperative target of claim 1, wherein the non-cooperative target has visible circles and straight lines thereon, and the size and coordinate information of the circles and the straight lines in the reference coordinate system of the non-cooperative target itself are known.
3. The method for measuring the relative pose based on the linear and circular monocular images of the non-cooperative target according to claim 1, wherein the non-cooperative target is an aerospace vehicle which is not provided with a cooperative sign but has known three-dimensional model data information, or a landing zone for marking a landing area of a helicopter by using a circular pattern.
4. The method according to claim 3, wherein the selected circular and straight line edges on the non-cooperative target are visible on the image and can be detected and identified.
5. The method for measuring the relative pose based on the linear and circular monocular images of the non-cooperative target according to claim 1, wherein the circular contour selected from the structural components on the non-cooperative target is the satellite-rocket docking ring of the spacecraft, a rocket thruster nozzle, or a circular-ring pattern marking the helicopter landing area on a landing zone.
6. The method for measuring the relative pose based on the linear and circular monocular images of the non-cooperative target according to claim 1, wherein the linear edge profile selected by the structural component on the non-cooperative target is the edge of the sun wing of the aerospace vehicle, the edge of the satellite body, the mast of the antenna or other visible linear edges, or an arrow pattern, an H pattern or a cross pattern on the landing zone for marking the landing direction of the helicopter.
7. The non-cooperative target straight line and circle monocular image based relative pose measurement method of claim 1, wherein the camera is a monocular camera.
8. The method for measuring the relative pose based on the non-cooperative target straight line and the circular monocular image according to claim 7, wherein the monocular camera is a pinhole imaging model camera, and internal parameters thereof can be obtained through calibration.
CN201710776372.7A 2017-08-31 2017-08-31 Relative pose measurement method based on non-cooperative target straight line and circular monocular image Active CN109405835B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710776372.7A CN109405835B (en) 2017-08-31 2017-08-31 Relative pose measurement method based on non-cooperative target straight line and circular monocular image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710776372.7A CN109405835B (en) 2017-08-31 2017-08-31 Relative pose measurement method based on non-cooperative target straight line and circular monocular image

Publications (2)

Publication Number Publication Date
CN109405835A CN109405835A (en) 2019-03-01
CN109405835B true CN109405835B (en) 2020-11-13

Family

ID=65463295

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710776372.7A Active CN109405835B (en) 2017-08-31 2017-08-31 Relative pose measurement method based on non-cooperative target straight line and circular monocular image

Country Status (1)

Country Link
CN (1) CN109405835B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109949367B (en) * 2019-03-11 2023-01-20 中山大学 Visible light imaging positioning method based on circular projection
CN110647156B (en) * 2019-09-17 2021-05-11 中国科学院自动化研究所 Target object docking ring-based docking equipment pose adjusting method and system
CN111104890B (en) * 2019-12-13 2023-09-29 上海宇航系统工程研究所 Method and device for identifying and reconstructing generalized model of spacecraft
CN111862126B (en) * 2020-07-09 2022-09-20 北京航空航天大学 Non-cooperative target relative pose estimation method combining deep learning and geometric algorithm
CN112378383B (en) * 2020-10-22 2021-10-19 北京航空航天大学 Binocular vision measurement method for relative pose of non-cooperative target based on circle and line characteristics
CN112381884B (en) * 2020-11-12 2022-04-19 北京航空航天大学 RGBD camera-based space circular target pose measurement method
CN113129371A (en) * 2021-03-15 2021-07-16 北京航空航天大学 Image feature-based spacecraft monocular vision attitude estimation method
CN113295171B (en) * 2021-05-19 2022-08-16 北京航空航天大学 Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN113610763A (en) * 2021-07-09 2021-11-05 北京航天计量测试技术研究所 Rocket engine structural member pose motion compensation method in vibration environment
CN114596355B (en) * 2022-03-16 2024-03-08 哈尔滨工业大学 High-precision pose measurement method and system based on cooperative targets
CN114926526B (en) * 2022-05-23 2023-05-05 南京航空航天大学 Pose measurement method based on zoom camera

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN103791889A (en) * 2014-01-14 2014-05-14 南京航空航天大学 Cross structure light assisted monocular vision pose measurement method
CN104101331A (en) * 2014-07-24 2014-10-15 合肥工业大学 Method used for measuring pose of non-cooperative target based on complete light field camera
CN104154918A (en) * 2014-07-14 2014-11-19 南京航空航天大学 Fault processing method for monocular vision navigation feature point losing
CN104517291A (en) * 2014-12-15 2015-04-15 大连理工大学 Pose measuring method based on coaxial circle characteristics of target
CN104778716A (en) * 2015-05-05 2015-07-15 西安电子科技大学 Truck carriage volume measurement method based on single image
CN105606025A (en) * 2016-02-01 2016-05-25 西安交通大学 Method for measuring spherical object geometric parameters by use of laser and monocular camera
US9482575B1 (en) * 2014-05-21 2016-11-01 Jeffrey D Barchers System and method for low signal knife edge wavefront sensing in an adaptive optical system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN103791889A (en) * 2014-01-14 2014-05-14 南京航空航天大学 Cross structure light assisted monocular vision pose measurement method
US9482575B1 (en) * 2014-05-21 2016-11-01 Jeffrey D Barchers System and method for low signal knife edge wavefront sensing in an adaptive optical system
CN104154918A (en) * 2014-07-14 2014-11-19 南京航空航天大学 Fault processing method for monocular vision navigation feature point losing
CN104101331A (en) * 2014-07-24 2014-10-15 合肥工业大学 Method used for measuring pose of non-cooperative target based on complete light field camera
CN104517291A (en) * 2014-12-15 2015-04-15 大连理工大学 Pose measuring method based on coaxial circle characteristics of target
CN104778716A (en) * 2015-05-05 2015-07-15 西安电子科技大学 Truck carriage volume measurement method based on single image
CN105606025A (en) * 2016-02-01 2016-05-25 西安交通大学 Method for measuring spherical object geometric parameters by use of laser and monocular camera

Also Published As

Publication number Publication date
CN109405835A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
CN109405835B (en) Relative pose measurement method based on non-cooperative target straight line and circular monocular image
CN108012325B (en) Navigation positioning method based on UWB and binocular vision
CN105091744B (en) The apparatus for detecting position and posture and method of a kind of view-based access control model sensor and laser range finder
CN108955685B (en) Refueling aircraft taper sleeve pose measuring method based on stereoscopic vision
Zhang et al. Vision-based pose estimation for textureless space objects by contour points matching
Peng et al. A pose measurement method of a space noncooperative target based on maximum outer contour recognition
CN107063261B (en) Multi-feature information landmark detection method for precise landing of unmanned aerial vehicle
Štěpán et al. Vision techniques for on‐board detection, following, and mapping of moving targets
CN104880176A (en) Moving object posture measurement method based on prior knowledge model optimization
CN108845335A (en) Unmanned aerial vehicle ground target positioning method based on image and navigation information
Wang et al. Real-time drogue recognition and 3D locating for UAV autonomous aerial refueling based on monocular machine vision
Coutard et al. Visual detection and 3D model-based tracking for landing on an aircraft carrier
CN110083177A (en) A kind of quadrotor and control method of view-based access control model landing
CN110160528B (en) Mobile device pose positioning method based on angle feature recognition
Zheng et al. Robust and accurate monocular visual navigation combining IMU for a quadrotor
CN108225273A (en) A kind of real-time runway detection method based on sensor priori
Dotenco et al. Autonomous approach and landing for a low-cost quadrotor using monocular cameras
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system
Deng et al. A binocular vision-based measuring system for UAVs autonomous aerial refueling
Xiao-Hong et al. UAV's automatic landing in all weather based on the cooperative object and computer vision
CN100582653C (en) System and method for determining position posture adopting multi- bundle light
Zhang et al. Tracking and position of drogue for autonomous aerial refueling
CN110490934A (en) Mixing machine vertical blade attitude detecting method based on monocular camera and robot
CN108731683B (en) Unmanned aerial vehicle autonomous recovery target prediction method based on navigation information
CN115131433A (en) Non-cooperative target pose processing method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant