CN111366162B - Small celestial body detector pose estimation method based on solar panel projection and template matching - Google Patents


Info

Publication number
CN111366162B
CN111366162B (application CN202010176131.0A)
Authority
CN
China
Prior art keywords
image
navigation
detector
coordinate system
projection
Prior art date
Legal status: Active
Application number
CN202010176131.0A
Other languages: Chinese (zh)
Other versions: CN111366162A (en)
Inventor
邵巍 (Shao Wei)
王博宁 (Wang Boning)
姚文龙 (Yao Wenlong)
秦浩华 (Qin Haohua)
郗洪良 (Xi Hongliang)
王光泽 (Wang Guangze)
Current Assignee: Qingdao University of Science and Technology
Original Assignee
Qingdao University of Science and Technology
Priority date: 2020-03-13
Filing date: 2020-03-13
Publication date: 2021-09-14
Application filed by Qingdao University of Science and Technology
2020-03-13: Priority to CN202010176131.0A
2020-07-03: Publication of CN111366162A
2021-09-14: Application granted; publication of CN111366162B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/24 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Astronomy & Astrophysics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a pose estimation method for a small celestial body detector based on solar panel projection and template matching, which addresses pose estimation of the detector during the descent and landing phase. A navigation coordinate system and a projection coordinate system are first established; the changes in the detector's roll and pitch angles are then solved from the solar panel projection information extracted from the navigation image; the navigation image of the previous moment is corrected into a template according to the solved angles; finally, template matching gives the rotation angle and position change between the corrected template image and the image to be matched, yielding the real-time pose of the detector. The method makes full use of the projection of the detector's solar panel, cast onto the surface of the small celestial body and imaged by the navigation camera. It avoids the problems that surface features near touchdown are too few for navigation positioning and that traditional navigation landmarks are difficult and slow to extract and match, improves the real-time performance of the navigation algorithm, and ensures accurate estimation of the detector's position and attitude.

Description

Small celestial body detector pose estimation method based on solar panel projection and template matching
Technical Field
The invention belongs to the field of autonomous navigation of deep space exploration, is applied to a small celestial body exploration landing process, and particularly relates to a small celestial body detector pose estimation method based on solar panel projection and template matching.
Background
In recent years, the exploration of small celestial bodies has become a difficult but scientifically significant deep space exploration task. Compared with planetary exploration missions, however, small celestial body missions are characterized by distant targets, long flight times, dynamically changing environments, and long communication delays between the ground measurement and control station and the detector. The small celestial body detector therefore needs to complete the deep space exploration task autonomously, relying only on the equipment it carries.
With the development of high-performance spaceborne computers and imaging sensors, navigation images have reached ever higher resolutions, and visual autonomous navigation systems based on optical information have become a research hotspot. Reviewing existing visual navigation algorithms for small celestial body exploration, most use feature points, craters, or reflective markers carried by the detector itself as navigation landmarks.
Taking feature points as navigation landmarks, Johnson et al. studied relative navigation algorithms based on feature point tracking and matching as early as 1999, obtaining inter-frame relative motion of the detector by tracking feature points in image sequences with the Shi-Tomasi-Kanade algorithm (Johnson, Andrew E., Larry H. Matthies. Precise Image-Based Motion Estimation for Autonomous Small Body Exploration. Proceedings of the 5th International Symposium on Artificial Intelligence, Robotics and Automation in Space, Noordwijk, Netherlands, 1-3 June 1999: 627-634.). The algorithm has strong autonomy but is prone to mismatches or missed matches. Many feature extraction and matching algorithms with scale and rotation invariance, such as SIFT, SURF, ASIFT and Camshift, were developed subsequently. Experiments show, however, that increasing the number of correctly matched feature points does not obviously improve navigation accuracy, while the computational complexity grows: hundreds of megabytes of memory are needed just to store feature descriptors, and multi-dimensional data must be screened and matched, so the performance of current spaceborne computers can hardly meet the memory and speed requirements of these algorithms, and real-time operation is difficult to guarantee. In short, feature point extraction and matching involve many points, heavy computation and long running times; moreover, feature points generally have no absolute coordinates and can only be used for relative navigation.
As for craters as navigation landmarks, although scholars at home and abroad have carried out some research on extracting and matching craters for navigation, small celestial bodies have far smaller mass than large bodies such as the Moon or Mars: the craters distributed on their surfaces are sparse, crater extraction and matching are strongly affected by changes in illumination angle, and prior information is insufficient, so using craters as visual navigation landmarks for small bodies is difficult. Likewise, when craters are used as landmarks they are hard to extract and match and mismatches occur easily; moreover, craters are unevenly distributed over the surface, and the algorithm can hardly estimate the lander position when the landing area contains few or no craters.
The Hayabusa and Hayabusa 2 detectors developed by the Japan Aerospace Exploration Agency dropped reflective target markers onto the surface to serve as navigation landmarks (see Tsuda Yuichi, Yoshikawa Makoto, Abe Masanao, et al. System design of the Hayabusa 2-Asteroid sample return mission to 1999 JU3. Acta Astronautica, 2013, 91: 356-362.). The method is simple and effective, but it increases the detector's own weight and the cost of the mission; meanwhile, because the gravity of a small celestial body is weak, the landing position of the dropped marker is difficult to control accurately, and navigation accuracy is directly affected if the reflective marker is occluded by rocks or has a poor geometric configuration.
In summary, whether feature points, craters or self-carried reflective markers are used, each kind of navigation landmark has its limitations. During the landing of a small celestial body detector, little feature information about the body is available before landing, and the shape model and physical parameters are obtained mainly from the fly-around phase, so accurate modelling is not possible. Because of its small mass, a small celestial body generally cannot retain an atmosphere; during descent the shadow of the solar panel cast by the Sun is therefore projected clearly onto the surface, and both the solar panel and the shape of its projection are regular and known, which facilitates the subsequent extraction, modelling and calculation. As the detector descends, the projection of its solar panel becomes more and more distinct in the navigation image. If this projection information can be used as a navigation landmark in the navigation algorithm, the shortcomings of traditional landmarks can be overcome and the navigation accuracy during landing improved.
Disclosure of Invention
The invention provides a pose estimation method for a small celestial body detector based on solar panel projection and template matching, aiming at the problem of estimating the detector pose during the descent and landing phase. It provides a new visual autonomous pose estimation algorithm for small celestial body landing without increasing the detector's own weight, while meeting the real-time and running-speed requirements of the on-board computer.
The invention is realized by adopting the following technical scheme: a small celestial body detector pose estimation method based on solar panel projection and template matching comprises the following steps:
step A, establishing the navigation and projection coordinate systems;
selecting the landing-site coordinate system as the world coordinate system $O_S\text{-}X_SY_SZ_S$, which describes the position of the navigation camera and its carrier in space; the detector body coordinate system is assumed to coincide with the navigation camera coordinate system $O_C\text{-}X_CY_CZ_C$, and the transformation from the landing-site coordinate system to the navigation camera coordinate system is written as $[R\;\;t]$, where $R$ is the rotation matrix, $t$ is the translation vector, and the navigation camera imaging model is an ideal pinhole model;
after the navigation camera acquires an image, the homogeneous coordinates of an image point in the image coordinate system are written $u=[u\;\;v\;\;1]^T$; the homogeneous coordinates of the object point in the landing-site coordinate system are $X_S=[x\;\;y\;\;z\;\;1]^T$, and in the navigation camera coordinate system $X_C=[x_C\;\;y_C\;\;z_C\;\;1]^T$; the image coordinates of an object point expressed in the world coordinate system satisfy
$$z_C\,u = K_{3\times 3}\,[R\;\;t]\,X_S,$$
where the matrix $K_{3\times 3}$ is the known intrinsic parameter matrix of the navigation camera;
step B, solving the changes of the detector roll angle and pitch angle from the solar panel projection information of the detector extracted from the navigation image;
step C, correcting the navigation image of the previous moment into a template according to the angle change information obtained in step B, to obtain the corrected template image;
step D, obtaining the rotation angle and position change between the corrected template image and the image to be matched by template matching, and obtaining the real-time pose information of the detector.
Further, in step B, the roll angle and pitch angle of the detector at different moments are solved as follows:
(1) the edge of the solar panel of the small celestial body detector is assumed to be rectangular, with the side parallel to the navigation camera $O_CX_C$ axis of length n metres and the side parallel to the navigation camera $O_CY_C$ axis of length m metres; an ideal attitude $\Sigma_0$ of the detector is defined by roll angle $\gamma = 0$ and pitch angle $\varphi = 0$, i.e. the navigation camera plane $X_CO_CY_C$ is parallel to the landing-site coordinate plane $X_SO_SY_S$; the shape and size of the detector's projection on the surface are then known, with $|L_{m0}| = m$ and $|L_{n0}| = n$;
(2) the attitude of the detector at time $t_i$ is denoted $\Sigma_i$; the projection length $|l_{ni}|$ of the long edge of the solar panel of the detector in the navigation image and the angle $\theta_i$ between the two projected edges are obtained, and $|l_{ni}|$ and $\theta_i$ are substituted into the constraint equations (formula (10) of the detailed description, given as an image in the published document) to solve the roll angle $\gamma_{0,i}$ and pitch angle $\varphi_{0,i}$ of the detector attitude at $t_i$ relative to the ideal attitude $\Sigma_0$, where $h_i$ is the height of the detector above the surface and $f$ is the focal length of the navigation camera;
(3) similarly, $\gamma_{0,i+1}$ and $\varphi_{0,i+1}$ at time $t_{i+1}$ are calculated and substituted into formula (11) to obtain the changes of the detector roll angle and pitch angle from time $t_i$ to time $t_{i+1}$.
Further, in step C, the image correction is performed as follows:
(1) an image-correction coordinate system is introduced to rotationally correct the navigation image at time $t_i$; the conversion relation between the image-correction coordinate system and the original image coordinate system is
$$[x_1\;\;y_1\;\;1]^T = R'\,[x_0\;\;y_0\;\;1]^T,$$
where $R'$ is the coordinate-transformation rotation matrix;
(2) the angle changes $\varphi_{i,i+1}$ and $\gamma_{i,i+1}$ solved in step B are substituted into the above relation, and $\psi_{i,i+1} = 0$ is set, giving the concrete correction transform (formula (13) of the detailed description, given as an image in the published document);
if the coordinates of a point on the navigation image at time $t_i$ are $(x_0, y_0)$, the coordinates of the corresponding point on the corrected image are $(x_1, y_1)$; computing this for every point yields the corrected image, which is taken as the template image.
Further, in step C, if the computed transform parameters are not integers, an interpolation operation is performed when the image at time $t_i$ is converted with the coordinate-conversion formula:
let $f(x, y)$ denote the pixel value of the corrected image at $(x, y)$, and let $Q_{11}(x_1, y_1)$, $Q_{12}(x_1, y_2)$, $Q_{21}(x_2, y_1)$, $Q_{22}(x_2, y_2)$ be the four known neighbouring pixels of the point $f(x, y)$; then
$$f(x,y) = \frac{f(Q_{11})(x_2-x)(y_2-y) + f(Q_{21})(x-x_1)(y_2-y) + f(Q_{12})(x_2-x)(y-y_1) + f(Q_{22})(x-x_1)(y-y_1)}{(x_2-x_1)(y_2-y_1)},$$
which completes the correction of the navigation image at time $t_i$.
Further, in step D, the yaw angle and displacement of the detector are solved by template matching as follows:
(1) there are translation, rotation and scaling relationships between the corrected navigation image $f_1(x, y)$ at time $t_i$ and the navigation image $f_2(x, y)$ acquired at time $t_{i+1}$, namely
$$f_2(x, y) = f_1[a(x\cos\psi + y\sin\psi) - \Delta x,\; a(-x\sin\psi + y\cos\psi) - \Delta y];$$
after Fourier transformation, extraction of the frequency-domain magnitude spectra and transformation to log-polar coordinates, the cross-power spectrum of the two is obtained (formula (19) of the detailed description); its inverse Fourier transform gives an impulse response function $\delta(w-\mu, \alpha-\psi)$, and by searching for the coordinates of the peak of $\delta(w-\mu, \alpha-\psi)$ the translation $(\mu, \psi)$ in the log-polar coordinate system is obtained, which gives the rotation angle $\psi$, the scaling factor $a$ and $\Delta z = (a-1)h_i$;
(2) $f_2(x, y)$ is corrected for rotation and scaling so that only a translation remains between it and $f_1(x, y)$; the translation $(\Delta x, \Delta y)$ is then obtained from the cross-power spectrum; repeating steps B to D yields the real-time pose parameters of the small celestial body detector.
Compared with the prior art, the invention has the following advantages and positive effects:
The scheme performs autonomous visual navigation by combining the projection information of the detector's solar panel in the navigation image with template matching, and has the following advantages:
(1) the shape characteristics and distribution of surface features need not be considered; the projection of the detector's solar panel on the surface of the small celestial body, imaged by the navigation camera, is fully exploited, and matching and analysis are based entirely on image content, avoiding the problems that surface features near touchdown are too few for navigation positioning and that traditional navigation landmarks are difficult and slow to extract and match;
(2) the shadow information is easy to extract and is robust to image rotation, scaling, blurring, viewpoint change, noise and so on; the computation and memory footprint are small, since only the shape and size of the solar panel projection need to be extracted for visual navigation, which reduces the running time of the algorithm, improves the real-time performance of the navigation algorithm and ensures accurate estimation of the detector pose.
Drawings
FIG. 1 is a flow chart of a visual navigation method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a coordinate system constructed according to an embodiment of the present invention, (1) is a schematic diagram of a navigation coordinate system; (2) is a schematic diagram of a local coordinate system;
FIG. 3 is a schematic diagram of the template image and the image to be matched at time $t_i$ in an embodiment of the present invention, where (a) is the template image and (b) is the image to be matched;
FIG. 4 is a schematic diagram of an error curve for estimating the pose of a detector according to an embodiment of the present invention.
Detailed Description
In order to make the above objects, features and advantages of the present invention more clearly understood, the present invention will be further described with reference to the accompanying drawings and examples. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those described herein, and thus, the present invention is not limited to the specific embodiments disclosed below.
A method for estimating the pose of a small celestial body detector based on the matching of solar panel projection and a template is shown in figure 1 and comprises the following steps:
step one, establishing a navigation and projection coordinate system;
step two, calculating the change angles of the rolling angle and the pitch angle of the detector based on the projection information of the solar panel of the detector extracted from the navigation image;
thirdly, performing template correction on the navigation image at the previous moment based on the angle change information obtained in the second step to obtain a corrected template image;
and fourthly, obtaining the rotation angle and the position change between the corrected template image and the image to be matched by utilizing template matching, and obtaining the real-time pose information of the detector.
Specifically, the method comprises the following steps:
the method comprises the following steps: navigation coordinate system and projection model establishment
In the navigation process, the navigation problem involves multiple coordinate systems, and a relative position and attitude relationship exists between any two of them; to achieve accurate navigation, the relationships between the different coordinate systems must be modelled rigorously.
In this embodiment, the landing-site coordinate system is selected as the world coordinate system $O_S\text{-}X_SY_SZ_S$ and is used to describe the position of the navigation camera and its carrier in space; the position and attitude parameters of the lander in this embodiment are described in this coordinate system. As shown in FIG. 2, the origin $O_S$ of this coordinate system is the landing site selected for the landing mission; the direction vector from the centroid of the celestial body to the landing site defines the $O_SZ_S$ axis; the direction vector pointing towards the south pole of the body along the tangent of the meridian through the landing site defines the $O_SX_S$ axis; and the $O_SY_S$ axis forms a right-handed system with the $O_SZ_S$ and $O_SX_S$ axes.
The detector body coordinate system is assumed to coincide with the navigation camera coordinate system $O_C\text{-}X_CY_CZ_C$: the optical centre of the navigation camera is selected as the origin $O_C$, the plane formed by the $O_CX_C$ and $O_CY_C$ axes is parallel to the image plane, and the line from the optical centre to the centre of the image plane defines the $O_CZ_C$ axis. The transformation from the landing-site coordinate system to the navigation camera coordinate system is therefore written as $[R\;\;t]$, where $R$ is the rotation matrix and $t$ is the translation vector.
The rotation matrix is obtained from the Euler angles as the product of the elementary rotations about the three axes (formula (1); its explicit matrix form appears only as an image in the published document), where the roll angle $\gamma$ is the rotation angle about the $O_SX_S$ axis, the pitch angle $\varphi$ is the rotation angle about the $O_SY_S$ axis, and the yaw angle $\psi$ is the rotation angle about the $O_SZ_S$ axis, with the counter-clockwise direction defined as positive.
The navigation camera imaging model used by the detector in this embodiment is an ideal pinhole model, and the camera type and intrinsic parameters are known, i.e. a linear model. After the navigation camera acquires an image, the homogeneous coordinates of an image point in the image pixel coordinate system are $u=[u\;\;v\;\;1]^T$, the homogeneous coordinates of the object point in the landing-site coordinate system are $X_S=[x\;\;y\;\;z\;\;1]^T$, and the homogeneous coordinates in the navigation camera coordinate system are $X_C=[x_C\;\;y_C\;\;z_C\;\;1]^T$.
The transformation between the navigation camera coordinate system and the world coordinate system is
$$X_C = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} X_S,$$
and the image pixel coordinates of an object point expressed in the world coordinate system satisfy
$$z_C\,u = K_{3\times 3}\,[R\;\;t]\,X_S = P\,X_S,$$
where the matrix $K_{3\times 3}$ is the known intrinsic parameter matrix of the navigation camera (its explicit form appears only as an image in the published document) and the matrix $P$ is called the navigation camera projection matrix. The rotation matrix $R$ in fact contains only 3 independent variables, which together with the three elements of the translation vector $t$ make six parameters in total. These six parameters determine the position and attitude of the navigation camera in the landing-site coordinate system and are independent of the camera intrinsics, so the matrix $[R\;\;t]$ is called the extrinsic parameter matrix of the navigation camera; it is also the quantity that must be solved in the navigation problem.
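The pinhole relation above is easy to exercise numerically. The short Python sketch below is not part of the patent; the numeric values of K, R and t are illustrative assumptions only. It projects a point expressed in the landing-site coordinate system into the navigation image.

```python
import numpy as np

def project_point(X_S, K, R, t):
    """Project world point X_S (3-vector, landing-site frame) to pixel (u, v)."""
    X_C = R @ X_S + t          # landing-site frame -> navigation camera frame
    u_h = K @ X_C              # homogeneous image coordinates, scaled by z_C
    return u_h[:2] / u_h[2]    # divide by z_C to obtain (u, v)

# Assumed intrinsics: focal length expressed in pixels and principal point (u0, v0).
f_px, u0, v0 = 1000.0, 512.0, 512.0
K = np.array([[f_px, 0.0,  u0],
              [0.0,  f_px, v0],
              [0.0,  0.0,  1.0]])

R = np.eye(3)                       # camera axes aligned with the landing-site axes
t = np.array([0.0, 0.0, 150.0])     # camera 150 m from the landing site (illustrative)

print(project_point(np.array([2.0, 1.0, 0.0]), K, R, t))   # -> approx. [525.3, 518.7]
```

With $R$ equal to the identity and the panel plane parallel to the image plane, this sketch reproduces the simple length relation $|l| = (f/h)|L|$ used in step two below.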
When deriving the constraint equations of the algorithm that computes the detector pose from the projection information, a projection local coordinate system $O_P\text{-}X_PY_PZ_P$ is established to simplify the derivation. Because the edge of the solar panel of a small celestial body detector is mostly rectangular, the front-left end point of the panel edge is taken as the origin $O_P$, the extension of the long edge as the $O_PX_P$ axis and the extension of the short edge as the $O_PY_P$ axis; the $O_PZ_P$ axis forms a right-handed system with the $O_PX_P$ and $O_PY_P$ axes, and the plane $X_PO_PY_P$ is parallel to the navigation camera plane $X_CO_CY_C$. The straight-line projection of the detector edge in the navigation image is denoted $l_i = (s_i, e_i)$, where $s_i$ and $e_i$ are the two end points of the projected edge line, and $L_i$ is the real projection of the detector edge line on the surface of the small celestial body.
Step two: solving the roll angle and pitch angle of the detector at different moments from the projection information during descent
Assume the edge of the solar panel of the small celestial body detector is rectangular, with the side parallel to the navigation camera $O_CX_C$ axis of length n metres and the side parallel to the $O_CY_C$ axis of length m metres. When the height of the detector is h metres, the actual projection length $|L_i|$ and the length $|l_i|$ of the corresponding edge in the navigation image taken by the navigation camera are related by formula (4):
$$|l_i| = \frac{f}{h}\,|L_i|, \tag{4}$$
where f is the focal length of the navigation camera. The ideal attitude $\Sigma_0$ of the detector is defined by roll angle $\gamma = 0$ and pitch angle $\varphi = 0$, i.e. the navigation camera plane $X_CO_CY_C$ is parallel to the landing-site coordinate plane $X_SO_SY_S$. The shape and size of the detector's projection on the surface are then known, with $|L_{m0}| = m$ and $|L_{n0}| = n$, and the coordinates of the three panel vertices lying on the coordinate axes of the local coordinate system are, in order, $X_1 = (0, 0, 0)^T$, $X_2 = (0, m, 0)^T$, $X_3 = (n, 0, 0)^T$. Meanwhile, the edge lengths $|l_{m0}|$ and $|l_{n0}|$ in the navigation image are obtained from formula (4).
The attitude of the detector at time $t_i$ is denoted $\Sigma_i$, and the projection length $|l_{ni}|$ of the long edge of the solar panel in the acquired navigation image and the angle $\theta_i$ between the two projected edges are known. Denoting the coordinates of the three panel vertices lying on the coordinate axes of the local coordinate system, in order, as $X_1'$, $X_2'$, $X_3'$, we have
$$X_i' = R_{0i}\,X_i, \tag{5}$$
where $R_{0i}$ is the transformation matrix of the detector from $\Sigma_0$ to $\Sigma_i$. At the same time, a coordinate transformation matrix $A$ is defined (its explicit form appears only as an image in the published document), giving the length constraint
$$|l_{ni}| = a\,\bigl|(A X_1')(A X_3')\bigr|. \tag{6}$$
Substituting formulas (1) and (5) gives the expanded length constraint (formula (7), given as an image in the published document).
In the navigation image, the direction vectors corresponding to the projections of the edges $X_1'X_3'$ and $X_1'X_2'$ are written out (formula (8), given as an image in the published document), and the angle constraint between them is expressed by formula (9) (likewise given as an image).
Combining formulas (7) and (9) yields the constraint equations of the detector roll angle and pitch angle at time $t_i$ (formula (10), given as an image in the published document). Whether a rotation angle is positive or negative is judged with the inertial measurement unit, the counter-clockwise direction being positive.
By inputting navigation images taken at different moments, i.e. the long-edge projection length $|l_{ni}|$ and the inter-edge angle $\theta_i$ of the detector in the navigation image at time $t_i$, the roll angle $\gamma_{0,i}$ and pitch angle $\varphi_{0,i}$ of the detector at the corresponding moment relative to the ideal attitude $\Sigma_0$ can be solved.
Similarly, $\gamma_{0,i+1}$ and $\varphi_{0,i+1}$ at time $t_{i+1}$ are calculated and substituted into formula (11),
$$\gamma_{i,i+1} = \gamma_{0,i+1} - \gamma_{0,i}, \qquad \varphi_{i,i+1} = \varphi_{0,i+1} - \varphi_{0,i}, \tag{11}$$
which gives the changes of the detector roll angle and pitch angle from time $t_i$ to time $t_{i+1}$.
Step three: correction of the navigation image at time $t_i$
If template matching is applied directly between the navigation images acquired at times $t_i$ and $t_{i+1}$ to solve the detector pose, then the larger the change in the relative viewing angle of the navigation camera, the larger the distortion of the navigation image.
This scheme therefore first applies a rotation correction to the navigation image acquired at time $t_i$. An image-correction coordinate system is introduced; the conversion relation between the image-correction coordinate system and the image coordinate system is given by formula (12),
$$[x_1\;\;y_1\;\;1]^T = R'\,[x_0\;\;y_0\;\;1]^T, \tag{12}$$
where $R'$ is the coordinate-transformation rotation matrix. Substituting the angle changes $\varphi_{i,i+1}$ and $\gamma_{i,i+1}$ solved from formula (11), and setting $\psi_{i,i+1} = 0$, gives the concrete correction transform (formula (13), given as an image in the published document).
If the coordinates of a point on the navigation image at time $t_i$ are $(x_0, y_0)$, the coordinates of the corresponding point on the corrected image are $(x_1, y_1)$; the coordinates of every point on the corrected image are computed from formula (12).
In general the computed transform parameters are not integers; when the image at time $t_i$ is converted with the coordinate-conversion formula, the output pixel coordinates are usually mapped to non-integer positions in the new coordinate system, and non-integer coordinates cannot be used on the discrete image data, so interpolation is required.
Let $f(x, y)$ denote the pixel value of the corrected image at $(x, y)$, and let $Q_{11}(x_1, y_1)$, $Q_{12}(x_1, y_2)$, $Q_{21}(x_2, y_1)$, $Q_{22}(x_2, y_2)$ be the four known neighbouring pixels of the point $f(x, y)$; then
$$f(x,y) = \frac{f(Q_{11})(x_2-x)(y_2-y) + f(Q_{21})(x-x_1)(y_2-y) + f(Q_{12})(x_2-x)(y-y_1) + f(Q_{22})(x-x_1)(y-y_1)}{(x_2-x_1)(y_2-y_1)}. \tag{14}$$
Formulas (12) to (14) complete the correction of the navigation image at time $t_i$.
Step four: solving the detector yaw angle and displacement by template matching
There are translation, rotation and scaling relationships between the corrected navigation image $f_1(x, y)$ at time $t_i$ and the navigation image $f_2(x, y)$ acquired at time $t_{i+1}$, namely
$$f_2(x, y) = f_1[a(x\cos\psi + y\sin\psi) - \Delta x,\; a(-x\sin\psi + y\cos\psi) - \Delta y], \tag{15}$$
where $a$ is the scaling factor, $\psi$ the rotation, and $\Delta x$, $\Delta y$ the translations. In the frequency domain these appear as a phase change, a rotation of the spectrum and a scaling of the spectrum; the yaw angle and displacement of the detector between times $t_i$ and $t_{i+1}$ are therefore solved by template matching.
First, the Fourier transform of formula (15) is taken (formula (16), given as an image in the published document). Since the relative displacement affects only the phase and not the magnitude of the Fourier transform, let $M_1(u, v)$ and $M_2(u, v)$ be the magnitude spectra of $f_1(x, y)$ and $f_2(x, y)$; transforming them to polar coordinates gives
$$M_2(\rho, \alpha) = \frac{1}{a^2}\,M_1\!\left(\frac{\rho}{a},\, \alpha - \psi\right). \tag{17}$$
Let $w = \lg\rho$ and $\mu = \lg a$; then
$$M_2(w, \alpha) = \frac{1}{a^2}\,M_1(w - \mu,\, \alpha - \psi). \tag{18}$$
The coefficient $1/a^2$ affects only the magnitude, not the computation of the parameters, so $M_2(w, \alpha)$ can be regarded as a translated copy of $M_1(w, \alpha)$, with $(\mu, \psi)$ the translation between the two. Taking the Fourier transform of formula (18) and forming the normalised cross-power spectrum of the two gives
$$\frac{\hat{M}_2(\xi, \eta)\,\hat{M}_1^*(\xi, \eta)}{\bigl|\hat{M}_2(\xi, \eta)\,\hat{M}_1^*(\xi, \eta)\bigr|} = e^{-j2\pi(\xi\mu + \eta\psi)}, \tag{19}$$
where $\hat{M}_1^*(\xi, \eta)$ is the complex conjugate of $\hat{M}_1(\xi, \eta)$. The inverse Fourier transform of formula (19) yields an impulse response function $\delta(w - \mu, \alpha - \psi)$; by searching for the coordinates of its peak, the translation $(\mu, \psi)$ in the log-polar coordinate system is obtained, and hence the rotation angle $\psi$ and the scaling factor $a$. After $f_2(x, y)$ is corrected for rotation and scaling, only a translation remains between it and $f_1(x, y)$, and the translation $(\Delta x, \Delta y)$ is obtained by a further cross-power-spectrum calculation.
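The step-four registration can be sketched with NumPy as below: the rotation $\psi$ and scale $a$ are read off a phase correlation of the log-polar magnitude spectra (formulas (17)-(19)), and the translation comes from a second phase correlation. The log-polar sampling density, the nearest-neighbour resampling, the sign conventions and the function names are simplifying assumptions of this sketch rather than the patent's exact procedure.

```python
import numpy as np

def phase_correlation(f1, f2):
    """Shift of f2 relative to f1 from the normalised cross-power spectrum (cf. formula (19))."""
    F1, F2 = np.fft.fft2(f1), np.fft.fft2(f2)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12
    corr = np.abs(np.fft.ifft2(cross))                  # impulse at the sought shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return [p - n if p > n // 2 else p for p, n in zip(peak, corr.shape)]

def log_polar_magnitude(img, n_rho=180, n_theta=180):
    """Magnitude spectrum of img resampled on a log-polar grid (nearest neighbour)."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = mag.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    log_r_max = np.log(min(cx, cy))
    out = np.zeros((n_rho, n_theta))
    for i in range(n_rho):
        r = np.exp(log_r_max * (i + 1) / n_rho)          # logarithmic radius sampling
        for j in range(n_theta):
            ang = np.pi * j / n_theta                    # [0, pi) suffices (spectrum symmetry)
            x, y = int(round(cx + r * np.cos(ang))), int(round(cy + r * np.sin(ang)))
            if 0 <= x < w and 0 <= y < h:
                out[i, j] = mag[y, x]
    return out, log_r_max

def estimate_rotation_scale(f1, f2, n_rho=180, n_theta=180):
    """Rotation psi (radians) and scale a between template f1 and image f2."""
    M1, log_r_max = log_polar_magnitude(f1, n_rho, n_theta)
    M2, _ = log_polar_magnitude(f2, n_rho, n_theta)
    d_rho, d_theta = phase_correlation(M1, M2)
    a = np.exp(log_r_max * d_rho / n_rho)                # log-radius shift -> scale factor
    psi = np.pi * d_theta / n_theta                      # angular shift -> rotation angle
    return psi, a

# Illustrative use: two synthetic images related by a pure translation.
rng = np.random.default_rng(0)
img1 = rng.random((128, 128))
img2 = np.roll(img1, (3, -5), axis=(0, 1))               # pure translation for the demo
psi, a = estimate_rotation_scale(img1, img2)             # expected psi ~ 0, a ~ 1
dy, dx = phase_correlation(img1, img2)                   # expected (3, -5)
dz = (a - 1.0) * 150.0                                   # height change with an assumed h_i = 150 m
```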
The real-time pose parameters of the small celestial body detector are obtained by repeating steps two to four. The scheme makes full use of the projection of the detector's solar panel on the surface of the small celestial body imaged by the navigation camera, avoids the problems that surface features near touchdown are too few for navigation positioning and that traditional navigation landmarks are difficult and slow to extract and match, improves the real-time performance of the navigation algorithm, and ensures accurate estimation of the detector's position and attitude.
As an example analysis of the pose estimation algorithm based on solar panel projection and template matching, the algorithm was verified in a simulated small celestial body landing: the initial height of the detector is 200 m, the detector descends from a height of 200 m to a height of 10 m, and the pose of the detector is estimated once every 5 m, finally yielding the pose estimation error curve of the detector. The specific simulation parameters are listed in Table 1.
TABLE 1 Landing mission simulation parameters (given as an image in the published document)
FIG. 3 shows the template image and the image to be matched at time $t_i$, and the pose estimation error curve of the detector is shown in FIG. 4. As can be seen from FIG. 4, the pose estimation algorithm based on solar panel projection and template matching can estimate the position of the detector in real time and finally obtains high-precision pose estimation information.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention in any other form. Any person skilled in the art may, without departing from the technical spirit of the present invention, use the disclosed technical content to make equivalent embodiments with equivalent changes; any simple modification, equivalent change or variation made to the above embodiments according to the technical essence of the present invention still falls within the protection scope of the present invention.

Claims (5)

1. The small celestial body detector pose estimation method based on solar panel projection and template matching is characterized by comprising the following steps of:
step A, establishing a navigation and projection coordinate system;
selecting the landing-site coordinate system as the world coordinate system $O_S\text{-}X_SY_SZ_S$, which describes the position of the navigation camera and its carrier in space; the detector body coordinate system is assumed to coincide with the navigation camera coordinate system $O_C\text{-}X_CY_CZ_C$, and the transformation from the landing-site coordinate system to the navigation camera coordinate system is written as $[R\;\;t]$, where $R$ is the rotation matrix, $t$ is the translation vector, and the navigation camera imaging model is an ideal pinhole model;
after the navigation camera acquires an image, the homogeneous coordinates of an image point in the image coordinate system are written $u=[u\;\;v\;\;1]^T$; the homogeneous coordinates of the object point in the landing-site coordinate system are $X_S=[x\;\;y\;\;z\;\;1]^T$, and in the navigation camera coordinate system $X_C=[x_C\;\;y_C\;\;z_C\;\;1]^T$; the image coordinates of an object point expressed in the world coordinate system satisfy $z_C\,u = K_{3\times 3}\,[R\;\;t]\,X_S$, where the matrix $K_{3\times 3}$ is the known intrinsic parameter matrix of the navigation camera;
step B, solving the changes of the detector roll angle and pitch angle from the solar panel projection information of the detector extracted from the navigation image;
step C, correcting the navigation image of the previous moment into a template according to the angle change information obtained in step B, to obtain the corrected template image;
step D, obtaining the rotation angle and position change between the corrected template image and the image to be matched by template matching, and obtaining the real-time pose information of the detector.
2. The small celestial body detector pose estimation method based on solar panel projection and template matching according to claim 1, characterized in that in step B the roll angle and pitch angle of the detector at different moments are solved as follows:
(1) the edge of the solar panel of the small celestial body detector is assumed to be rectangular, with the side parallel to the navigation camera $O_CX_C$ axis of length n metres and the side parallel to the navigation camera $O_CY_C$ axis of length m metres; an ideal attitude $\Sigma_0$ of the detector is defined by roll angle $\gamma = 0$ and pitch angle $\varphi = 0$, i.e. the navigation camera plane $X_CO_CY_C$ is parallel to the landing-site coordinate plane $X_SO_SY_S$; the shape and size of the projection of the detector solar panel on the surface are then known, with $|L_{m0}| = m$ and $|L_{n0}| = n$;
(2) the attitude of the detector at time $t_i$ is denoted $\Sigma_i$; the projection length $|l_{ni}|$ of the long edge of the solar panel of the detector in the navigation image and the angle $\theta_i$ between the two projected edges are obtained, and $|l_{ni}|$ and $\theta_i$ are substituted into the constraint equations (formula (10) of the description, given as an image in the published document) to solve the roll angle $\gamma_{0,i}$ and pitch angle $\varphi_{0,i}$ of the detector attitude at $t_i$ relative to the ideal attitude $\Sigma_0$, where $h_i$ is the height of the detector above the surface and $f$ is the focal length of the navigation camera;
(3) similarly, $\gamma_{0,i+1}$ and $\varphi_{0,i+1}$ at time $t_{i+1}$ are calculated and substituted into formula (11) to obtain the changes of the detector roll angle and pitch angle from time $t_i$ to time $t_{i+1}$.
3. The small celestial body detector pose estimation method based on solar panel projection and template matching according to claim 2, characterized in that in step C the image correction is performed as follows:
(1) an image-correction coordinate system is introduced to rotationally correct the navigation image at time $t_i$; the conversion relation between the image-correction coordinate system and the original image coordinate system is $[x_1\;\;y_1\;\;1]^T = R'\,[x_0\;\;y_0\;\;1]^T$, where $R'$ is the coordinate-transformation rotation matrix;
(2) the angle changes $\varphi_{i,i+1}$ and $\gamma_{i,i+1}$ solved in step B are substituted into the above relation, and $\psi_{i,i+1} = 0$ is set, giving the concrete correction transform (formula (13) of the description, given as an image in the published document);
if the coordinates of a point on the navigation image at time $t_i$ are $(x_0, y_0)$, the coordinates of the corresponding point on the corrected image are $(x_1, y_1)$; computing this for every point on the corrected image yields the template image.
4. The small celestial body detector pose estimation method based on solar panel projection and template matching according to claim 3, characterized in that in step C, if the computed transform parameters are not integers, an interpolation operation is performed when the image at time $t_i$ is converted with the coordinate-conversion formula:
let $f(x, y)$ denote the pixel value of the corrected image at $(x, y)$, and let $Q_{11}(x_1, y_1)$, $Q_{12}(x_1, y_2)$, $Q_{21}(x_2, y_1)$, $Q_{22}(x_2, y_2)$ be the four known neighbouring pixels of the point $f(x, y)$; then
$$f(x,y) = \frac{f(Q_{11})(x_2-x)(y_2-y) + f(Q_{21})(x-x_1)(y_2-y) + f(Q_{12})(x_2-x)(y-y_1) + f(Q_{22})(x-x_1)(y-y_1)}{(x_2-x_1)(y_2-y_1)},$$
which completes the correction of the navigation image at time $t_i$.
5. The small celestial body detector pose estimation method based on solar panel projection and template matching according to claim 4, characterized in that in step D the yaw angle and displacement of the detector are solved by template matching as follows:
(1) there are translation, rotation and scaling relationships between the corrected navigation image $f_1(x, y)$ at time $t_i$ and the navigation image $f_2(x, y)$ acquired at time $t_{i+1}$, namely
$$f_2(x, y) = f_1[a(x\cos\psi + y\sin\psi) - \Delta x,\; a(-x\sin\psi + y\cos\psi) - \Delta y];$$
after Fourier transformation, extraction of the frequency-domain magnitude spectra and transformation to log-polar coordinates, the cross-power spectrum of the two is obtained (formula (19) of the description); its inverse Fourier transform gives an impulse response function $\delta(w-\mu, \alpha-\psi)$, and by searching for the coordinates of the peak of $\delta(w-\mu, \alpha-\psi)$ the translation $(\mu, \psi)$ in the log-polar coordinate system is obtained, which gives the rotation angle $\psi$, the scaling factor $a$ and $\Delta z = (a-1)h_i$;
(2) $f_2(x, y)$ is corrected for rotation and scaling so that only a translation remains between it and $f_1(x, y)$; the translation $(\Delta x, \Delta y)$ is then obtained from the cross-power spectrum; repeating steps B to D yields the real-time pose parameters of the small celestial body detector.
CN202010176131.0A 2020-03-13 2020-03-13 Small celestial body detector pose estimation method based on solar panel projection and template matching Active CN111366162B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010176131.0A CN111366162B (en) 2020-03-13 2020-03-13 Small celestial body detector pose estimation method based on solar panel projection and template matching


Publications (2)

Publication Number Publication Date
CN111366162A CN111366162A (en) 2020-07-03
CN111366162B (en) 2021-09-14

Family

ID=71204319

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010176131.0A Active CN111366162B (en) 2020-03-13 2020-03-13 Small celestial body detector pose estimation method based on solar panel projection and template matching

Country Status (1)

Country Link
CN (1) CN111366162B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112577463B (en) * 2020-12-07 2022-08-02 中国西安卫星测控中心 Attitude parameter corrected spacecraft monocular vision distance measuring method


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102607526A (en) * 2012-01-03 2012-07-25 西安电子科技大学 Target posture measuring method based on binocular vision under double mediums
CN104406594A (en) * 2014-12-09 2015-03-11 上海新跃仪表厂 Measuring algorithm for relative position and posture of rendezvous and docking spacecraft
CN106250898A (en) * 2016-07-28 2016-12-21 哈尔滨工业大学 A kind of image local area feature extracting method based on scale prediction
CN107631728A (en) * 2017-09-13 2018-01-26 哈尔滨工业大学 A kind of spaceborne visual aids air navigation aid
CN108750148A (en) * 2018-05-14 2018-11-06 上海微小卫星工程中心 Spacecraft windsurfing two dimension driving mechanism stagnates position in-orbit identification method
CN108920829A (en) * 2018-06-29 2018-11-30 中国空间技术研究院 A kind of solar light pressure torque calculation method of band large size net-shape antenna satellite

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
UPF在探测器自主导航中的应用 (Application of UPF in detector autonomous navigation); 隋树林 (Sui Shulin) et al.; 《微计算机信息》 (Microcomputer Information); 2008-01-31; 287-314 *

Also Published As

Publication number Publication date
CN111366162A (en) 2020-07-03

Similar Documents

Publication Publication Date Title
Sharma et al. Pose estimation for non-cooperative rendezvous using neural networks
CN105371870B (en) A kind of in-orbit accuracy measurement method of star sensor based on star chart data
CN104484648B (en) Robot variable visual angle obstacle detection method based on outline identification
Zhang et al. Vision-based pose estimation for textureless space objects by contour points matching
CN105913489A (en) Indoor three-dimensional scene reconstruction method employing plane characteristics
CN109612438B (en) Method for determining initial orbit of space target under constraint of virtual coplanar condition
CN110849331B (en) Monocular vision measurement and ground test method based on three-dimensional point cloud database model
CN108917753B (en) Aircraft position determination method based on motion recovery structure
CN108680165B (en) Target aircraft attitude determination method and device based on optical image
CN106909161B (en) A kind of optimum attitude motor-driven planing method of agility satellite zero drift angle imaging
CN109724586B (en) Spacecraft relative pose measurement method integrating depth map and point cloud
CN111220120B (en) Moving platform binocular ranging self-calibration method and device
CN111121787B (en) Autonomous initial orbit determination method based on remote sensing image
CN113175929A (en) UPF-based spatial non-cooperative target relative pose estimation method
CN113218577A (en) Outfield measurement method for star point centroid position precision of star sensor
CN111366162B (en) Small celestial body detector pose estimation method based on solar panel projection and template matching
CN106097277B (en) A kind of rope substance point-tracking method that view-based access control model measures
CN111462182A (en) Trajectory missile three-dimensional trajectory estimation method based on infrared early warning image
CN113295171B (en) Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN112857306B (en) Method for determining continuous solar altitude angle of video satellite at any view direction point
CN111337031B (en) Spacecraft landmark matching autonomous position determination method based on attitude information
CN112927294A (en) Satellite orbit and attitude determination method based on single sensor
CN111735447A (en) Satellite-sensitive-simulation type indoor relative pose measurement system and working method thereof
CN114485620B (en) Autonomous visual positioning system and method for asteroid detector fused with orbit dynamics
CN111680552B (en) Feature part intelligent recognition method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant