CN113295171A - Monocular vision-based attitude estimation method for rotating rigid body spacecraft - Google Patents

Monocular vision-based attitude estimation method for rotating rigid body spacecraft

Publication number: CN113295171A
Authority: CN (China)
Legal status: Granted; Active
Application number: CN202110545278.7A
Original language: Chinese (zh)
Other versions: CN113295171B (granted publication)
Inventors: 胡庆雷, 龙宸溶, 郑建英, 郭雷
Assignee (current and original): Beihang University
Application filed by Beihang University; priority to CN202110545278.7A

Classifications

    • G01C21/24 — Navigation; navigational instruments specially adapted for cosmonautical navigation
    • G01C21/20 — Instruments for performing navigational calculations
    • G06T5/70 — Denoising; smoothing
    • G06T7/0002 — Image analysis; inspection of images, e.g. flaw detection
    • G06T7/11 — Region-based segmentation
    • G06T7/136 — Segmentation; edge detection involving thresholding
    • G06T7/194 — Segmentation involving foreground-background segmentation
    • G06T7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10004 — Still image; photographic image
    • G06T2207/10012 — Stereo images
    • G06T2207/20032 — Median filtering

Abstract

The invention discloses a monocular vision-based attitude estimation method for a rotating rigid-body spacecraft. The method establishes a known 3D model of the target rotating rigid-body spacecraft and the coordinate systems describing its attitude; applies median filtering and background separation to the image obtained by a monocular camera; detects and labels all ellipses with a fast ellipse detection method and determines their general functions; matches the radii of the labeled ellipses against the known target model; solves, by a geometric method, for the normal direction and center coordinate of the plane of each ellipse in the camera coordinate system; fits all ellipse center coordinates to a straight line by the least-squares method and calculates the unit direction vector of that line; and obtains the complete relative attitude of the target spacecraft with respect to the tracking spacecraft from the uniquely determined normal and center coordinate of the plane of the reference ellipse. The method has a small computational load, short run time and high accuracy, and can effectively estimate the relative attitude of a non-cooperative target spacecraft with respect to the tracking spacecraft.

Description

Monocular vision-based attitude estimation method for rotating rigid body spacecraft
Technical Field
The invention belongs to the technical field of vision-based measurement of non-cooperative target spacecraft in space missions, and relates to a monocular vision-based attitude estimation method for a rotating rigid-body spacecraft.
Background
Pose (relative position and attitude) estimation of a target spacecraft is an important issue in a wide range of space mission scenarios, such as Formation Flight (FF), On-Orbit Servicing (OOS) and Active Debris Removal (ADR). Relative pose estimation and navigation are essential in safe close-range operations (from tens of meters down to several meters) and in close-range capture such as rendezvous and docking. For the most common case of a non-cooperative target spacecraft (i.e., one carrying no artificial marker and offering no communication link), determining its pose is an important basis for executing subsequent complex space tasks, so research on the pose estimation of non-cooperative target spacecraft has important theoretical value and engineering significance.
For the pose estimation of non-cooperative targets, patent CN108562274A adopts binary square markers: several markers are projected onto the target spacecraft during the approach stage of rendezvous and docking, and the relative pose between the target coordinate system and the camera coordinate system is solved by identifying the positions of the satellite-rocket docking ring and the markers on the surface of the non-cooperative target. However, this method requires launching several markers, which wastes resources; it is only suitable for the very close approach stage, and the identification of the markers increases the computational load and introduces estimation errors. In patent CN111536981A, feature information such as frame corner points and the docking ring of the non-cooperative target body is extracted by a binocular camera, the left and right image features are matched, the three-dimensional coordinates of the matched features are recovered, and finally the position and attitude of the non-cooperative target relative to the binocular camera are calculated, giving the relative pose between the coordinate system of the non-cooperative target and that of the binocular camera. This method is constrained by the baseline of the binocular camera: the observable distance is limited, the power consumption and payload mass are large, and the demands on the payload increase.
Disclosure of Invention
The invention solves the following problem: for the pose estimation of a rotating rigid-body non-cooperative target spacecraft, a monocular vision-based attitude estimation method is provided to obtain the position and attitude of the non-cooperative target spacecraft relative to the tracking spacecraft and lay the foundation for subsequent complex space tasks. The invention realizes a monocular vision system with low power and mass requirements; without relying on additional measurement information, it fits, by the least-squares method, the straight line on which the centers of the several circles of the rotating rigid-body spacecraft lie, so as to determine the normal direction and center position of the supporting plane of the spacecraft, thereby achieving accurate measurement of the relative pose of the non-cooperative target spacecraft. The computational load is small, the accuracy is high, and the method offers high value for practical engineering applications.
The invention provides a monocular vision-based attitude estimation method for a rotating rigid body spacecraft, which comprises the following steps:
s1: establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix;
s2: calculating pixel coordinates of an image obtained by a monocular camera on a tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image;
s3: processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain an image after background separation;
s4: processing the image after background separation, detecting all ellipses in the image after background separation by using a rapid ellipse detection method, marking the ellipses according to the sequence 1,2 and … n, and determining general functions of all the ellipses;
s5: matching the radii of the ellipses of all the marks with the known 3D model of the target rotating rigid body spacecraft, and solving general functions of all the ellipses by using a geometric method to obtain normal vectors and central coordinates of the plane where each ellipse is located under a camera coordinate system;
s6: fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, to a straight line and calculating the unit direction vector of that line, where i denotes the i-th labeled ellipse;
s7: according to the property that the centers of all ellipses lie on one straight line and that this line is the normal of the plane of the ellipses, uniquely determining the normal vector and center coordinate of the plane of the ellipses; taking the plane of the first ellipse as reference, calculating the rotation matrix of the plane of the reference ellipse relative to the tracking spacecraft, which serves as the estimate of the attitude of the rotating rigid-body spacecraft.
In step S1, a known target rotating rigid body spacecraft 3D model is established, and a coordinate system describing the posture of the spacecraft is established, where the coordinate system includes a pixel coordinate system, an image coordinate system, a camera coordinate system, and a target rotating rigid body spacecraft body coordinate system; and calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix, which specifically comprises the following steps:
Pixel coordinate system O_uv-uv: with the vertex O_uv of the image obtained by the monocular camera on the tracking spacecraft as origin; the abscissa u axis runs along the rows of the image and the ordinate v axis along its columns;
Image coordinate system O_xy-xy: with the center O_xy of the image as origin; the x-axis is parallel to the u-axis and the y-axis is parallel to the v-axis of the pixel coordinate system;
Camera coordinate system O_c-X_cY_cZ_c: with the camera optical center O_c as origin; the Z_c axis points along the optical axis, the X_c axis is parallel to the x-axis of the image coordinate system, and the Y_c axis is parallel to its y-axis;
Target spacecraft body coordinate system O_w-X_wY_wZ_w: established with the center of mass O_w of the target spacecraft body as origin; the outward normal of the plane of the ellipses is taken as the Z_w axis, a direction perpendicular to that normal as the X_w axis, and the Y_w axis is perpendicular to both the X_w and Z_w axes, forming a right-handed system;
A coordinate system O_D-X_DY_DZ_D is established on the plane of the reference ellipse, with the center O_D of that plane as origin; the X_D axis is parallel to the X_w axis of the target spacecraft body coordinate system, the Y_D axis to the Y_w axis, and the Z_D axis to the Z_w axis.
In step S2, calculating pixel coordinates of an image obtained by tracking a monocular camera on the spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image, specifically including:
s21: converting an image obtained by a monocular camera on a tracked spacecraft into a gray image;
s22: selecting a sliding template (generally 3×3) and placing it over the first three rows and first three columns of pixels of the grayscale image; all pixels inside the sliding template are sorted by pixel value to produce a monotonically increasing or decreasing data sequence, and the median of this sequence replaces the pixel value of the center pixel of the template;
s23: translating the whole sliding template by one pixel along the u-axis direction of the pixel coordinate system, and repeating step S22 until all pixels of the row have been scanned;
s24: moving the whole sliding template down by one row of pixels, repeating steps S22 and S23 to scan the next row, and finally obtaining the median-filtered image.
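As an illustration, steps S21–S24 amount to a standard sliding-window median filter. The following Python sketch (the function name and the choice to leave border pixels unchanged are illustrative, not from the patent) makes the scanning order explicit:

```python
import numpy as np

def median_filter(gray, k=3):
    """Median-filter a grayscale image with a k x k sliding template.

    Sketch of steps S21-S24: the template slides along each row (the u
    axis), the pixels under it are sorted, and the centre pixel is
    replaced by the median. Border pixels are left unchanged here.
    """
    h, w = gray.shape
    r = k // 2
    out = gray.copy()
    for v in range(r, h - r):          # scan rows (step S24)
        for u in range(r, w - r):      # slide along the u axis (step S23)
            window = gray[v - r:v + r + 1, u - r:u + r + 1]
            out[v, u] = np.median(window)   # step S22
    return out
```

A salt-noise pixel surrounded by uniform background is removed, since the median of the sorted window ignores the outlier.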
In step S3, the image after the median filtering is processed, and the spacecraft is separated from the background by using the weak gradient elimination method to obtain an image after the background separation, which specifically includes:
s31: calculating the image gradient using the Prewitt operator: the gradients of the image I in the horizontal and vertical directions are computed with two convolution kernels of window size 3×3;

$$G_x = \begin{bmatrix} -1 & 0 & 1 \\ -1 & 0 & 1 \\ -1 & 0 & 1 \end{bmatrix} * I, \qquad G_y = \begin{bmatrix} -1 & -1 & -1 \\ 0 & 0 & 0 \\ 1 & 1 & 1 \end{bmatrix} * I \tag{1}$$

where G_x and G_y denote the horizontal and vertical gradients of the image, respectively, and * denotes the two-dimensional convolution operator;
s32: taking the root of the sum of the squared horizontal and vertical gradient values as the overall gradient of the image;

$$G(u,v) = \sqrt{G_x(u,v)^2 + G_y(u,v)^2} \tag{2}$$

where G(u,v) denotes the image gradient value at image coordinate (u,v);
s33: sorting the gradient values G(u,v) of all pixels in the image in ascending order, dividing them uniformly into a number of groups (100 groups are used in the invention), counting the frequency of each group into a histogram, and approximating the histogram by the exponential probability density function

$$f(x;\lambda) = \lambda e^{-\lambda x}, \quad x \ge 0 \tag{3}$$
where x denotes an image gradient value, and λ is obtained by fitting the histogram data with formula (3);
s34: substituting the percentage p of weak-gradient pixels in the total pixel count, together with the λ obtained from formula (3), into the exponential distribution function (4), calculating the corresponding gradient segmentation threshold T (i.e. the value satisfying F(T) = p, so that T = −ln(1 − p)/λ), and setting all pixels whose gradient is below the threshold to 0

$$F(x) = 1 - e^{-\lambda x}, \quad x \ge 0 \tag{4}$$
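The weak-gradient elimination of S31–S34 can be sketched as follows. As an assumption for illustration, λ is estimated here by its maximum-likelihood value 1/mean rather than by the patent's histogram fit, and all names are hypothetical:

```python
import numpy as np

def weak_gradient_eliminate(img, p=0.8):
    """Suppress weak-gradient (background) pixels, a sketch of S31-S34."""
    px = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], float)  # Prewitt horizontal
    py = px.T                                                   # Prewitt vertical
    img = img.astype(float)
    h, w = img.shape
    gx, gy = np.zeros_like(img), np.zeros_like(img)
    for v in range(1, h - 1):                   # borders left at zero gradient
        for u in range(1, w - 1):
            win = img[v - 1:v + 2, u - 1:u + 2]
            gx[v, u] = np.sum(win * px)
            gy[v, u] = np.sum(win * py)
    g = np.sqrt(gx ** 2 + gy ** 2)              # overall gradient, eq. (2)
    lam = 1.0 / max(g.mean(), 1e-9)             # exponential rate (MLE stand-in)
    t = -np.log(1.0 - p) / lam                  # threshold with F(t) = p, eq. (4)
    out = img.copy()
    out[g < t] = 0                              # zero the weak-gradient pixels
    return out, t
```

On an image with a single strong vertical edge, only the pixels near the edge survive; flat regions (spacecraft interior or background) are zeroed.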
The step S4, processing the image after background separation, detecting all ellipses in it with a fast ellipse detection method, labeling the ellipses in order 1, 2, …, n, and determining the general functions of all ellipses, specifically includes:
processing the median-filtered image with the fast ellipse detection method to obtain five parameters (x_{0i}, y_{0i}, a_i, b_i, θ_i), i ∈ {1, 2, …, n}, for each of a series of ellipses, where (x_{0i}, y_{0i}) denotes the center position of the ellipse, a_i its semi-major axis, b_i its semi-minor axis, and θ_i the angle through which the major axis is rotated from the x-axis of the image coordinate system; from these five parameters the general function of the ellipse is determined as follows:
$$Au^2 + Bv^2 + Cuv + Du + Ev + F = 0 \tag{5}$$
wherein A, B, C, D, E and F are parameters of an elliptic general function;
$$\begin{aligned}
A &= a_i^2\sin^2\theta_i + b_i^2\cos^2\theta_i \\
B &= a_i^2\cos^2\theta_i + b_i^2\sin^2\theta_i \\
C &= 2\,(b_i^2 - a_i^2)\sin\theta_i\cos\theta_i \\
D &= -2A x_{0i} - C y_{0i} \\
E &= -2B y_{0i} - C x_{0i} \\
F &= A x_{0i}^2 + B y_{0i}^2 + C x_{0i} y_{0i} - a_i^2 b_i^2
\end{aligned} \tag{6}$$
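A minimal sketch of the standard parametric-to-implicit conversion assumed behind eq. (6) (the function name is hypothetical):

```python
import numpy as np

def ellipse_to_conic(x0, y0, a, b, theta):
    """Convert the five ellipse parameters (x0, y0, a, b, theta) returned
    by the ellipse detector into the coefficients of the general conic
    A*u^2 + B*v^2 + C*u*v + D*u + E*v + F = 0 of eq. (5)."""
    s, c = np.sin(theta), np.cos(theta)
    A = a ** 2 * s ** 2 + b ** 2 * c ** 2
    B = a ** 2 * c ** 2 + b ** 2 * s ** 2
    C = 2.0 * (b ** 2 - a ** 2) * s * c
    D = -2.0 * A * x0 - C * y0
    E = -2.0 * B * y0 - C * x0
    F = A * x0 ** 2 + B * y0 ** 2 + C * x0 * y0 - a ** 2 * b ** 2
    return A, B, C, D, E, F
```

A quick sanity check: the endpoint of the major axis lies on the ellipse, so it must satisfy the implicit equation to machine precision.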
step S5, performing radius matching on the ellipses of all the marks and the known target rotational rigid body spacecraft 3D model, and solving general functions of all the ellipses by using a geometric method to obtain normal vectors and center coordinates of a plane where each ellipse is located in the camera coordinate system, specifically including:
s51: matching the labeled ellipses with the known target spacecraft 3D model in the positive direction of the Z_w axis to obtain the radius r_i of each matched circle; formula (5) is rewritten in algebraic form:
$$\begin{bmatrix} u & v & 1 \end{bmatrix} Q \begin{bmatrix} u & v & 1 \end{bmatrix}^T = 0 \tag{7}$$
where

$$Q = \begin{bmatrix} A & C/2 & D/2 \\ C/2 & B & E/2 \\ D/2 & E/2 & F \end{bmatrix} \tag{8}$$
the point is represented as P in the camera coordinate systemc=(Xc,Yc,Zc)TExpressed as (u, v) in the pixel coordinate systemTCorresponding homogeneous coordinate is expressed as (u, v,1)TThe following relationship is satisfied:
Figure BDA0003073284690000052
Xc,Yc,Zcrespectively representing points along X in the camera coordinate systemcAxis, YcAxis and ZcDistance of the shaft; minsThe internal parameter matrix of the monocular camera is obtained; converting the formula (7) into a camera coordinate system to obtain an oblique elliptic cone gamma equation under the camera coordinate system as follows:
$$\begin{bmatrix} X_c & Y_c & Z_c \end{bmatrix} C_Q \begin{bmatrix} X_c & Y_c & Z_c \end{bmatrix}^T = 0 \tag{10}$$
where

$$C_Q = M_{ins}^T\, Q\, M_{ins} \tag{11}$$
A coordinate system O_c-X'Y'Z' is established at the origin of the camera coordinate system, with the Z' axis parallel to the normal of the plane of the ellipse, so that the oblique elliptic cone Γ becomes a right elliptic cone; the coordinate system O_c-X'Y'Z' and the camera coordinate system O_c-X_cY_cZ_c differ only by a rotation; via the rotation matrix P, C_Q is converted to a diagonal matrix:

$$P^T C_Q P = \mathrm{diag}(\lambda_1, \lambda_2, \lambda_3) \tag{12}$$

where λ_1, λ_2 and λ_3 are the eigenvalues of C_Q, with λ_1 ≥ λ_2 > 0 > λ_3; the center coordinate O'_i and normal vector n'_i of the plane of the i-th ellipse in the coordinate system O_c-X'Y'Z' are then:

$$O'_i = \frac{r_i}{\sqrt{-\lambda_1\lambda_3}} \begin{bmatrix} \pm\,\lambda_3\sqrt{\frac{\lambda_1-\lambda_2}{\lambda_1-\lambda_3}} \\ 0 \\ \lambda_1\sqrt{\frac{\lambda_2-\lambda_3}{\lambda_1-\lambda_3}} \end{bmatrix},\qquad
n'_i = \begin{bmatrix} \pm\sqrt{\frac{\lambda_1-\lambda_2}{\lambda_1-\lambda_3}} \\ 0 \\ \sqrt{\frac{\lambda_2-\lambda_3}{\lambda_1-\lambda_3}} \end{bmatrix} \tag{13}$$
where i ∈ {1, 2, …, n}, r_i denotes the radius of each circle matched with the known target spacecraft 3D model, and the ± sign reflects the two-fold ambiguity of the solution;
The center coordinate O'_i and normal vector n'_i of the plane of the ellipse are transformed into the camera coordinate system:

$$O_i^c = P\,O'_i,\qquad n_i^c = P\,n'_i \tag{14}$$
where O_i^c denotes the center coordinate and n_i^c the normal vector of the plane of the i-th ellipse in the camera coordinate system;
s52: matching the labeled ellipses with the known target spacecraft 3D model in the negative direction of the Z_w axis, repeating step S51 to obtain the radius r_i of each matched circle, and calculating the center coordinates and normal vectors, in the camera coordinate system, of the planes of the ellipses matched in the negative Z_w direction.
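S51 can be sketched as follows, under the eigendecomposition reading of eqs. (7)–(14) above. The function name, the sign normalisation of the conic matrix, and the in-front-of-camera test are illustrative implementation choices, not from the patent:

```python
import numpy as np

def circle_pose_from_conic(conic, K, r):
    """Recover the supporting-plane normal and circle centre of one image
    ellipse (S51).  conic = (A, B, C, D, E, F), K = 3x3 intrinsic matrix,
    r = matched circle radius.  Returns the two candidate (centre, normal)
    pairs in the camera frame; the ambiguity is resolved later in S7."""
    A, B, C, D, E, F = conic
    Q = np.array([[A, C / 2.0, D / 2.0],
                  [C / 2.0, B, E / 2.0],
                  [D / 2.0, E / 2.0, F]])
    CQ = K.T @ Q @ K                       # oblique elliptic cone, eq. (11)
    w, P = np.linalg.eigh(CQ)
    if (w > 0).sum() == 1:                 # conic matrix defined up to sign:
        w = -w                             # normalise to two positive eigenvalues
    order = np.argsort(w)[::-1]            # lam1 >= lam2 > 0 > lam3, eq. (12)
    w, P = w[order], P[:, order]
    l1, l2, l3 = w
    g = np.sqrt((l1 - l2) / (l1 - l3))
    h = np.sqrt((l2 - l3) / (l1 - l3))
    solutions = []
    for s in (1.0, -1.0):                  # two-fold ambiguity of eq. (13)
        n_prime = np.array([s * g, 0.0, h])
        o_prime = (r / np.sqrt(-l1 * l3)) * np.array([s * l3 * g, 0.0, l1 * h])
        centre, normal = P @ o_prime, P @ n_prime   # to camera frame, eq. (14)
        if centre[2] < 0:                  # keep the centre in front of the camera
            centre, normal = -centre, -normal
        solutions.append((centre, normal))
    return solutions
```

As a check: a circle of radius 1 on a fronto-parallel plane at depth 10, centred on the optical axis of a camera with focal length 500 and principal point (320, 240), projects to a 50-pixel image circle at the principal point, and the recovered centre is (0, 0, 10).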
The step S6, fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, to a straight line and calculating the unit direction vector of that line, where i denotes the i-th labeled ellipse, specifically includes:
the simplified form of the line l is:

$$l:\ \begin{cases} x = a z + b \\ y = c z + d \end{cases} \tag{15}$$
where a, b, c and d are the parameters of the line l; formula (15) is converted into the matrix form:

$$\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} z & 1 & 0 & 0 \\ 0 & 0 & z & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \\ d \end{bmatrix} \tag{16}$$
all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, satisfy:

$$\begin{bmatrix} x_1 \\ y_1 \\ \vdots \\ x_n \\ y_n \end{bmatrix} = \begin{bmatrix} z_1 & 1 & 0 & 0 \\ 0 & 0 & z_1 & 1 \\ \vdots & \vdots & \vdots & \vdots \\ z_n & 1 & 0 & 0 \\ 0 & 0 & z_n & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \\ d \end{bmatrix} \tag{17}$$

thus, the least-squares solution for the parameter vector N = (a, b, c, d)^T is:

$$N = (A_l^T A_l)^{-1} A_l^T b_l \tag{18}$$

where A_l denotes the coefficient matrix and b_l the stacked vector of center coordinates in (17);
this gives (a, c, 1)^T as a direction vector of the line l, which is normalised into the unit direction vector

$$n_l = \frac{(a,\, c,\, 1)^T}{\left\|(a,\, c,\, 1)^T\right\|} \tag{19}$$
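The least-squares line fit of S6 can be sketched as follows (function and variable names are illustrative):

```python
import numpy as np

def fit_axis_line(centers):
    """Fit the line through the ellipse centres (S6).  The line is
    parameterised as x = a*z + b, y = c*z + d (the assumed reading of
    eq. (15)), solved in the least-squares sense of eq. (18); the
    direction vector (a, c, 1) is returned normalised."""
    centers = np.asarray(centers, float)
    x, y, z = centers[:, 0], centers[:, 1], centers[:, 2]
    m = len(centers)
    Amat = np.zeros((2 * m, 4))            # stacked system of eq. (17)
    rhs = np.zeros(2 * m)
    Amat[0::2, 0], Amat[0::2, 1], rhs[0::2] = z, 1.0, x   # x_i = a z_i + b
    Amat[1::2, 2], Amat[1::2, 3], rhs[1::2] = z, 1.0, y   # y_i = c z_i + d
    a, b, c, d = np.linalg.lstsq(Amat, rhs, rcond=None)[0]
    direction = np.array([a, c, 1.0])
    return direction / np.linalg.norm(direction), (a, b, c, d)
```

For centres sampled exactly on the line through the origin with direction (1, 2, 1), the fit recovers that direction as a unit vector.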
Step S7, according to the property that the centers of all ellipses lie on one straight line and that this line is the normal of the plane of the ellipses, uniquely determines the normal vector and center coordinate of the plane of the ellipses; taking the plane of the first ellipse as reference, the rotation matrix of the plane of the reference ellipse relative to the tracking spacecraft is calculated and used as the estimate of the attitude of the rotating rigid-body spacecraft. This specifically includes:
s71: calculating the angles θ_± between the unit direction vector n_l of the fitted line l and the two candidate normal vectors n_1^{c±} of the plane of the reference ellipse obtained in S5:

$$\theta_\pm = \arccos\!\left(n_l \cdot n_1^{c\pm}\right)$$
S72: uniquely determining a normal vector of a plane where a reference ellipse is located and a corresponding central coordinate thereof according to the following principle;
Figure BDA0003073284690000071
wherein ε < 5 ° represents an angle threshold;
s73: calculating a displacement vector t of the target spacecraft relative to the tracking spacecraft according to the center coordinate of the plane where the uniquely determined reference ellipse is located:
$$t = (O_x,\, O_y,\, O_z)^T \tag{21}$$
where (O_x, O_y, O_z) denotes the center coordinate of the plane of the uniquely determined reference ellipse;
the normal vector of the plane of the uniquely determined reference ellipse
Figure BDA0003073284690000072
As ZDAxial direction vector
Figure BDA0003073284690000073
Calculating a vector perpendicular thereto as XDAxial direction vector
Figure BDA0003073284690000074
Thereby calculating a rotation matrix R of the target spacecraft relative to the tracking spacecraft:
Figure BDA0003073284690000075
wherein the content of the first and second substances,
Figure BDA0003073284690000076
Figure BDA0003073284690000077
representing the normal vector of the plane in which the uniquely determined reference ellipse lies.
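Constructing R from the uniquely determined normal, as in eq. (22), can be sketched as follows. The helper vector used to generate X_D is an arbitrary illustrative choice, since the patent leaves X_D free up to the spin about the symmetry axis:

```python
import numpy as np

def rotation_from_normal(n):
    """Build the rotation matrix of eq. (22) from the plane normal (S73).
    The normal gives the Z_D axis; a perpendicular vector is chosen as
    X_D, and Y_D = Z_D x X_D completes a right-handed frame.  Columns of
    R are the axis direction vectors."""
    z = np.asarray(n, float)
    z = z / np.linalg.norm(z)
    # pick a helper vector not parallel to z, then project out z
    helper = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = helper - np.dot(helper, z) * z
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return np.column_stack((x, y, z))
```

The result is always a proper rotation (orthonormal, determinant +1) whose third column is the normalised input normal.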
To facilitate analysis of the error of the determined relative attitude, the rotation matrix R is written in Euler-angle form. Let the target spacecraft rotate about the X_w axis by the pitch angle φ, about the Y_w axis by the yaw angle ψ, and about the Z_w axis by the roll angle γ. In Euler-angle form, with the rotation sequence X_w, Y_w, Z_w, the rotation matrix R is expressed as:

$$R = \begin{bmatrix}
\cos\psi\cos\gamma & \sin\varphi\sin\psi\cos\gamma - \cos\varphi\sin\gamma & \cos\varphi\sin\psi\cos\gamma + \sin\varphi\sin\gamma \\
\cos\psi\sin\gamma & \sin\varphi\sin\psi\sin\gamma + \cos\varphi\cos\gamma & \cos\varphi\sin\psi\sin\gamma - \sin\varphi\cos\gamma \\
-\sin\psi & \sin\varphi\cos\psi & \cos\varphi\cos\psi
\end{bmatrix} \tag{23}$$
from equation (23), the Euler angles can be calculated as:

$$\varphi = \arctan\frac{R_{32}}{R_{33}},\qquad \psi = \arcsin(-R_{31}),\qquad \gamma = \arctan\frac{R_{21}}{R_{11}} \tag{24}$$
where R_{ij} (i, j ∈ {1, 2, 3}) denotes the element in row i, column j of the rotation matrix R.
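The extraction of eq. (24) can be sketched as follows; the X_w–Y_w–Z_w sequence (R = Rz·Ry·Rx) is the assumed convention, and atan2 is used for quadrant safety:

```python
import numpy as np

def euler_from_rotation(R):
    """Extract (pitch phi, yaw psi, roll gamma) from R, assuming
    R = Rz(gamma) @ Ry(psi) @ Rx(phi) as in eq. (23); this is the
    standard inversion of eq. (24)."""
    phi = np.arctan2(R[2, 1], R[2, 2])    # pitch about X_w
    psi = np.arcsin(-R[2, 0])             # yaw about Y_w
    gamma = np.arctan2(R[1, 0], R[0, 0])  # roll about Z_w
    return phi, psi, gamma
```

Building R from known angles and extracting them again is an easy round-trip check of the convention.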
Compared with the prior art, the invention has the advantages that:
(1) Compared with existing binocular-vision spacecraft pose estimation methods, the monocular-vision method designed by the invention is not subject to baseline constraints; it has lower hardware complexity, cost, power consumption and payload mass, reduces the demands on the payload, allows the pose to be determined quickly under low power and mass budgets, and better meets practical engineering requirements.
(2) Compared with existing monocular-vision spacecraft pose estimation methods based on projected markers, the method designed by the invention requires no additional projection and identification of markers and wastes no resources; it is applicable to more distant, marker-free spacecraft, has a small computational load, short run time and high accuracy, can effectively estimate the relative attitude of a non-cooperative target spacecraft with respect to the tracking spacecraft, and has good engineering value.
Drawings
FIG. 1 is a flow chart of a method for estimating attitude of a rotating rigid body spacecraft based on monocular vision according to the present invention;
fig. 2 is a schematic diagram of a coordinate system for describing the attitude of a spacecraft, which is established in embodiment 1 of the invention;
FIG. 3 is a normal vector diagram estimated from the centers of all detected ellipses in the target spacecraft in embodiment 1 of the present invention;
FIG. 4 is a diagram of an absolute error of an ellipsometric vector in a target spacecraft in embodiment 1 of the present invention;
fig. 5 is a diagram of absolute error of a displacement vector of a target spacecraft with respect to a tracking spacecraft in embodiment 1 of the present invention;
fig. 6 is a diagram of relative errors of a target spacecraft with respect to a displacement vector of a tracking spacecraft in embodiment 1 of the present invention;
fig. 7 is a diagram of an estimation error of the euler angle of the target spacecraft with respect to the tracking spacecraft in embodiment 1 of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only illustrative and are not intended to limit the present invention.
As shown in fig. 1, the method for estimating the attitude of a rotating rigid body spacecraft based on monocular vision of the present invention comprises the following steps:
s1: establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix;
s2: calculating pixel coordinates of an image obtained by a monocular camera on a tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image;
s3: processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain an image after background separation;
s4: processing the image after background separation, detecting all ellipses in the image after background separation by using a rapid ellipse detection method, marking the ellipses according to the sequence 1,2 and … n, and determining general functions of all the ellipses;
s5: matching the radii of the ellipses of all the marks with the known 3D model of the target rotating rigid body spacecraft, and solving general functions of all the ellipses by using a geometric method to obtain normal vectors and central coordinates of the plane where each ellipse is located under a camera coordinate system;
s6: fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, to a straight line and calculating the unit direction vector of that line, where i denotes the i-th labeled ellipse;
s7: according to the property that the centers of all ellipses lie on one straight line and that this line is the normal of the plane of the ellipses, uniquely determining the normal vector and center coordinate of the plane of the ellipses; taking the plane of the first ellipse as reference, calculating the rotation matrix of the plane of the reference ellipse relative to the tracking spacecraft, which serves as the estimate of the attitude of the rotating rigid-body spacecraft.
The following describes in detail a specific implementation of the aforementioned monocular vision-based attitude estimation method for a rotating rigid body spacecraft with a specific embodiment.
Example 1:
firstly, establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; and calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix. The method is realized by the following specific steps:
As shown in fig. 2, each coordinate system is established as follows. Pixel coordinate system O_uv-uv: with the vertex O_uv of the image obtained by the monocular camera on the tracking spacecraft as origin; the abscissa u axis runs along the rows of the image and the ordinate v axis along its columns. Image coordinate system O_xy-xy: with the center O_xy of the image as origin; the x-axis is parallel to the u-axis and the y-axis to the v-axis of the pixel coordinate system. Camera coordinate system O_c-X_cY_cZ_c: with the camera optical center O_c as origin; the Z_c axis points along the optical axis, the X_c axis is parallel to the x-axis of the image coordinate system, and the Y_c axis to its y-axis. Target spacecraft body coordinate system O_w-X_wY_wZ_w: established with the center of mass O_w of the target spacecraft body as origin; the outward normal of the plane of the ellipses is taken as the Z_w axis, a direction perpendicular to that normal as the X_w axis, and the Y_w axis is perpendicular to both the X_w and Z_w axes, forming a right-handed system. A coordinate system O_D-X_DY_DZ_D is established on the plane of the reference ellipse, with the center O_D of that plane as origin; the X_D axis is parallel to the X_w axis of the target spacecraft body coordinate system, the Y_D axis to the Y_w axis, and the Z_D axis to the Z_w axis.
And secondly, calculating the pixel coordinates of an image obtained by a monocular camera on the tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image. The method is realized by the following specific steps:
(1) converting an image obtained by a monocular camera on a tracked spacecraft into a gray image;
(2) selecting a sliding template and aligning it with the first three rows and first three columns of pixels of the gray-scale image; all the pixels inside the sliding template are sorted by pixel value to generate a monotonically increasing or decreasing data sequence, and the median of this sequence is selected to replace the pixel value of the central pixel of the sliding template;
(3) the whole sliding template is translated by one column of pixels along the direction of the u axis of the pixel coordinate system, and step (2) is repeated until the scanning of the whole row is completed;
(4) the whole sliding template is moved down by one row of pixels, steps (2) and (3) are repeated to scan the next row, and finally the median-filtered image is obtained.
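The four sub-steps above amount to a 3×3 sliding-window median filter. A minimal sketch, in which border pixels are simply left unchanged (one common convention the patent does not specify):

```python
import numpy as np

def median_filter_3x3(img):
    """3x3 median filter: for each interior pixel, sort the 9 values in the
    sliding window and replace the centre pixel with the median."""
    img = np.asarray(img, dtype=float)
    out = img.copy()
    for r in range(1, img.shape[0] - 1):        # move down one row at a time
        for c in range(1, img.shape[1] - 1):    # slide along the u axis
            window = img[r - 1:r + 2, c - 1:c + 2].ravel()
            out[r, c] = np.sort(window)[4]      # median of 9 sorted values
    return out
```

A single impulse-noise pixel in an otherwise flat region is removed, which is why the filter is applied before gradient computation.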
And thirdly, processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain the image after background separation. The method is realized by the following specific steps:
(1) calculating the image gradient by using the Prewitt operator, the gradients of the image I in the horizontal and vertical directions being computed with two convolution kernels of window size 3×3 respectively;
G_x = [−1 0 1; −1 0 1; −1 0 1] ⊗ I,  G_y = [−1 −1 −1; 0 0 0; 1 1 1] ⊗ I   (1)
in the formula, G_x and G_y respectively represent the horizontal and vertical gradients of the image, and ⊗ represents the two-dimensional convolution operator.
(2) taking the square root of the sum of the squares of the horizontal and vertical gradient values as the overall gradient of the image;
G(u, v) = sqrt( G_x(u, v)^2 + G_y(u, v)^2 )   (2)
in the formula, G (u, v) represents an image gradient at image coordinates (u, v).
(3) Arranging gradient values G (u, v) of all pixels in the image in an ascending order, uniformly dividing the gradient values into 100 groups, counting the frequency of each group as a histogram, and approximating the histogram as an exponential probability density function
f(x) = λ·e^(−λx), x ≥ 0   (3)
Wherein x represents an image gradient value, and lambda is obtained by fitting histogram data by using a formula (3);
(4) the percentage of weak-gradient pixels in the total number of pixels and the λ obtained in (3) are substituted into the exponential distribution function (4), so that the corresponding gradient segmentation threshold can be calculated; all pixels whose gradient value is smaller than the gradient segmentation threshold are set to 0.
F(x) = 1 − e^(−λx)   (4)
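A sketch of the weak gradient elimination pipeline under stated assumptions: the Prewitt gradients are computed by explicit 3×3 correlation, and the histogram fit of step (3) is replaced by the closed-form maximum-likelihood estimate λ = 1/mean, a simplification of the patent's histogram-fitting step; all function names are illustrative:

```python
import numpy as np

def prewitt_gradients(I):
    """Prewitt gradients of step (1), computed over interior pixels
    (border pixels are left at zero)."""
    kx = np.array([[-1.0, 0.0, 1.0]] * 3)   # horizontal kernel
    ky = kx.T                               # vertical kernel
    Gx = np.zeros(I.shape)
    Gy = np.zeros(I.shape)
    for r in range(1, I.shape[0] - 1):
        for c in range(1, I.shape[1] - 1):
            patch = I[r - 1:r + 2, c - 1:c + 2]
            Gx[r, c] = np.sum(kx * patch)
            Gy[r, c] = np.sum(ky * patch)
    return Gx, Gy

def fit_lambda(G):
    """Maximum-likelihood stand-in for the histogram fit of step (3):
    for f(x) = lam*exp(-lam*x), the MLE is lam = 1/mean."""
    g = np.asarray(G, dtype=float).ravel()
    return 1.0 / g.mean()

def gradient_threshold(p, lam):
    """Invert the exponential CDF F(x) = 1 - exp(-lam*x) of step (4):
    the gradient value below which a fraction p of the pixels is 'weak'."""
    return -np.log(1.0 - p) / lam
```

Pixels with gradient below `gradient_threshold(p, lam)` would then be zeroed to suppress the low-contrast background.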
And fourthly, processing the image after the background separation, detecting all ellipses in the image after the background separation by using a rapid ellipse detection method, marking the ellipses in the order 1, 2, ..., n, and determining the general functions of all the ellipses. The method is realized by the following specific steps:
the background-separated image is processed with the rapid ellipse detection method to obtain five parameters (x_0i, y_0i, a_i, b_i, θ_i), i ∈ {1,2,...,n}, of a series of ellipses, where (x_0i, y_0i) represents the center position of the ellipse, a_i the semi-major axis of the ellipse, b_i the semi-minor axis of the ellipse, and θ_i the angle through which the major axis is rotated from the x axis of the image coordinate system; from the five parameters of the ellipse, the general function of the ellipse is determined as follows:
Au2+Bv2+Cuv+Du+Ev+F=0 (5)
wherein, A, B, C, D, E and F are parameters of an elliptic general function.
A = a_i^2·sin^2θ_i + b_i^2·cos^2θ_i,  B = a_i^2·cos^2θ_i + b_i^2·sin^2θ_i,  C = 2(b_i^2 − a_i^2)·sinθ_i·cosθ_i,  D = −2A·x_0i − C·y_0i,  E = −C·x_0i − 2B·y_0i,  F = A·x_0i^2 + B·y_0i^2 + C·x_0i·y_0i − a_i^2·b_i^2   (6)
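The conversion from the five detected parameters to the coefficients A…F of (5) can be written directly. The sketch below uses one standard expansion of the rotated-ellipse equation; the patent's formula (6) is only available as an image, so its exact scaling may differ by a constant factor:

```python
import numpy as np

def ellipse_to_conic(x0, y0, a, b, theta):
    """Expand the five ellipse parameters into general-conic coefficients for
    A*u^2 + B*v^2 + C*u*v + D*u + E*v + F = 0 (one standard parameterisation)."""
    A = a**2 * np.sin(theta)**2 + b**2 * np.cos(theta)**2
    B = a**2 * np.cos(theta)**2 + b**2 * np.sin(theta)**2
    C = 2.0 * (b**2 - a**2) * np.sin(theta) * np.cos(theta)
    D = -2.0 * A * x0 - C * y0
    E = -C * x0 - 2.0 * B * y0
    F = A * x0**2 + B * y0**2 + C * x0 * y0 - a**2 * b**2
    return A, B, C, D, E, F
```

Any point generated parametrically on the ellipse satisfies the resulting conic equation, which is a convenient self-check when implementing the detector interface.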
And fifthly, matching the marked ellipses with the known target spacecraft 3D model from two directions respectively, and solving the general functions of all the ellipses by a geometric method to obtain the normal vector and center coordinate of the plane of each ellipse in the camera coordinate system.
(1) matching the marked ellipses with the known target spacecraft 3D model in the positive direction of the Z_w axis to obtain the radius r_i of each matched circle, and rewriting equation (5) in algebraic form:
[u v 1]Q[u v 1]T=0 (7)
wherein,

Q = [A C/2 D/2; C/2 B E/2; D/2 E/2 F]   (8)
since the projection of a point onto the image satisfies

Z_c·(u, v, 1)^T = M_ins·(X_c, Y_c, Z_c)^T   (9)

wherein M_ins is the intrinsic parameter matrix of the monocular camera, the formula (7) is converted into the camera coordinate system to obtain the oblique elliptic cone Γ equation under the camera coordinate system as follows:
[Xc Yc Zc]CQ[Xc Yc Zc]T=0 (10)
wherein,

C_Q = M_ins^T·Q·M_ins   (11)
a coordinate system O_c-X'Y'Z' is established at the origin of the camera coordinate system, with the Z' axis parallel to the normal of the plane of the ellipse, so that the oblique elliptic cone Γ becomes a right elliptic cone; between the coordinate system O_c-X'Y'Z' and the camera coordinate system O_c-X_cY_cZ_c there is only a rotational transformation; through the rotation matrix P, C_Q is converted to a diagonal matrix:

P^T·C_Q·P = diag(λ_1, λ_2, λ_3)   (12)

wherein λ_1, λ_2 and λ_3 are the eigenvalues of C_Q and λ_1 ≥ λ_2 > 0 > λ_3; in the coordinate system O_c-X'Y'Z', the center coordinate O'_i and normal vector n'_i of the plane of the i-th ellipse are calculated as:
Figure BDA0003073284690000123
wherein i ∈ {1,2,...,n}, and r_i represents the radius of each circle after matching with the known 3D model of the target spacecraft;
the center coordinate O'_i and normal vector n'_i of the plane of the ellipse are transformed to the camera coordinate system through the rotation matrix P:

O_i = P·O'_i,  n_i = P·n'_i

wherein O_i = (x_i, y_i, z_i)^T represents the center coordinate of the plane of the i-th ellipse in the camera coordinate system, and n_i represents the normal vector of the plane of the i-th ellipse in the camera coordinate system.
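Equations (10)-(12) can be sketched as follows: C_Q is symmetric, so an eigendecomposition yields the rotation P and the eigenvalues, which are then ordered as λ_1 ≥ λ_2 > 0 > λ_3. The function names are illustrative:

```python
import numpy as np

def oblique_cone_matrix(Q, M_ins):
    """C_Q of equations (10)-(11): substituting the projection relation
    into (7) gives [Xc Yc Zc] (M_ins^T Q M_ins) [Xc Yc Zc]^T = 0."""
    return M_ins.T @ Q @ M_ins

def diagonalise_cone(CQ):
    """Rotation P with P^T C_Q P = diag(l1, l2, l3), ordered l1 >= l2 > 0 > l3,
    as required before recovering the circle's centre and normal."""
    w, V = np.linalg.eigh(CQ)        # ascending eigenvalues of the symmetric CQ
    order = np.argsort(w)[::-1]      # reorder to descending
    lam, P = w[order], V[:, order]
    if np.linalg.det(P) < 0:         # keep P a proper rotation (det = +1)
        P[:, 2] = -P[:, 2]
    return lam, P
```

For any conic matrix Q of a real ellipse, C_Q keeps the signature (+, +, −), so the required eigenvalue ordering always exists.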
(2) matching the marked ellipses with the known target spacecraft 3D model in the negative direction of the Z_w axis, repeating step (1) to obtain the radius r_i of each matched circle, and calculating the center coordinates and normal vectors, in the camera coordinate system, of the planes of the series of ellipses matched in the negative Z_w direction.
Sixthly, all the center coordinates (x_i, y_i, z_i), i ∈ {1,2,...,n}, are fitted into a straight line, and the unit direction vector of the straight line is obtained through calculation, wherein i represents the i-th marked ellipse. The method is realized by the following specific steps:
the simplified form of the straight line l is:

x = a·z + b,  y = c·z + d   (15)

wherein a, b, c and d are the parameters of the straight line l, and the matrix form converted from (15) is:

(x, y)^T = [z 1 0 0; 0 0 z 1]·(a, b, c, d)^T   (16)
all the center coordinates (x_i, y_i, z_i), i ∈ {1,2,...,n}, satisfy:

(x_1, y_1, x_2, y_2, ..., x_n, y_n)^T = H·(a, b, c, d)^T,  H = [z_1 1 0 0; 0 0 z_1 1; ...; z_n 1 0 0; 0 0 z_n 1]   (17)
thus the least-squares solution is:

(a, b, c, d)^T = (H^T·H)^(−1)·H^T·(x_1, y_1, x_2, y_2, ..., x_n, y_n)^T   (18)
this gives (a, c, 1) as the direction vector N of the straight line l, and N is converted into the unit direction vector n_l = N/‖N‖.
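The line fit of this step can be sketched with ordinary least squares; `fit_line_direction` is an illustrative name, and the returned unit vector is the normalized (a, c, 1) direction described above:

```python
import numpy as np

def fit_line_direction(centers):
    """Fit x = a*z + b, y = c*z + d through the ellipse centres by least
    squares and return the unit vector along (a, c, 1), i.e. the estimated
    direction of the line through all centres."""
    pts = np.asarray(centers, dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    H = np.column_stack([z, np.ones_like(z)])     # design matrix [z 1]
    (a, b), *_ = np.linalg.lstsq(H, x, rcond=None)
    (c, d), *_ = np.linalg.lstsq(H, y, rcond=None)
    N = np.array([a, c, 1.0])
    return N / np.linalg.norm(N)
```

Parameterising by z assumes the line is not perpendicular to the optical axis, which holds here since the ellipse centres recede in depth.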
And seventhly, according to the characteristic that the centers of all the ellipses lie on one straight line, and that this straight line is the normal of the plane of the ellipses, the normal vector and center coordinate of the plane of the ellipse are uniquely determined; taking the plane of the first ellipse as the reference, the rotation matrix of the plane of the reference ellipse relative to the tracking spacecraft is calculated, and the rotation matrix is taken as the estimate of the attitude of the rotating rigid body spacecraft. The method is realized by the following specific steps:
(1) the angles θ_1 and θ_2 between the unit direction vector n_l of the straight line l and the two candidate normal vectors n_1^(1) and n_1^(2) of the plane of the reference ellipse obtained in the fifth step are calculated:

θ_k = arccos( n_l·n_1^(k) / (‖n_l‖·‖n_1^(k)‖) ),  k = 1, 2   (19)
(2) the normal vector of the plane of the reference ellipse and its corresponding center coordinate are uniquely determined according to the following principle:

θ_k < ε ⇒ select n_1^(k) and its corresponding center coordinate   (20)

wherein ε < 5° represents the angle threshold;
(3) calculating a displacement vector t of the target spacecraft relative to the tracking spacecraft according to the center coordinate of the plane where the uniquely determined reference ellipse is located:
t=(Ox,Oy,Oz)T (21)
wherein (O_x, O_y, O_z) represents the center coordinate of the plane of the uniquely determined reference ellipse;
the uniquely determined normal vector n of the plane of the reference ellipse is taken as the Z_D-axis direction vector r_3; a vector perpendicular to it is calculated as the X_D-axis direction vector r_1, and r_2 = r_3 × r_1 completes the right-handed system; thereby the rotation matrix R of the target spacecraft relative to the tracking spacecraft is calculated:

R = [r_1  r_2  r_3]   (22)

wherein r_3 = n represents the normal vector of the plane of the uniquely determined reference ellipse.
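Constructing R from the selected normal requires completing an orthonormal right-handed frame. The choice of the X_D axis below (Gram-Schmidt against a fixed helper axis) is only one convenient perpendicular, since the text requires merely a vector perpendicular to the normal:

```python
import numpy as np

def frame_from_normal(n_z):
    """Build a right-handed orthonormal frame whose third axis is the plane
    normal n_z; the columns of the result play the roles of X_D, Y_D, Z_D."""
    z = np.asarray(n_z, dtype=float)
    z = z / np.linalg.norm(z)
    helper = np.array([1.0, 0.0, 0.0])
    if abs(z[0]) > 0.9:                  # avoid a helper nearly parallel to z
        helper = np.array([0.0, 1.0, 0.0])
    x = helper - np.dot(helper, z) * z   # Gram-Schmidt: remove the z component
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)                   # completes the right-handed system
    return np.column_stack([x, y, z])    # rotation matrix with columns x, y, z
```

Any rotation of the frame about the normal gives an equally valid R; the roll about Z_D is not observable from a single circular feature alone.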
In order to facilitate the analysis of the error of the determined relative attitude, the rotation matrix R is written in Euler angle form. Let the angle through which the target spacecraft rotates about the X_w axis be the pitch angle φ, about the Y_w axis the yaw angle ψ, and about the Z_w axis the roll angle γ; in Euler angle form, following the rotation sequence X_w, Y_w, Z_w, the rotation matrix R is expressed as:

R = [ cosγ·cosψ   cosγ·sinψ·sinφ − sinγ·cosφ   cosγ·sinψ·cosφ + sinγ·sinφ ;
      sinγ·cosψ   sinγ·sinψ·sinφ + cosγ·cosφ   sinγ·sinψ·cosφ − cosγ·sinφ ;
      −sinψ       cosψ·sinφ                    cosψ·cosφ ]   (23)
from equation (23), the Euler angles can be calculated, expressed as:

φ = arctan2(R_32, R_33),  ψ = arcsin(−R_31),  γ = arctan2(R_21, R_11)   (24)

wherein R_ij (i, j ∈ {1,2,3}) represents the element in the i-th row and j-th column of the rotation matrix R.
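A sketch of the Euler angle composition and extraction of equations (23)-(24), assuming cos ψ ≠ 0 (away from the gimbal-lock configuration):

```python
import numpy as np

def R_from_euler_xyz(phi, psi, gamma):
    """Compose R for the rotation sequence X_w (pitch phi), Y_w (yaw psi),
    Z_w (roll gamma): R = Rz(gamma) @ Ry(psi) @ Rx(phi), matching (23)."""
    cf, sf = np.cos(phi), np.sin(phi)
    cp, sp = np.cos(psi), np.sin(psi)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cf, -sf], [0.0, sf, cf]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rz = np.array([[cg, -sg, 0.0], [sg, cg, 0.0], [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx

def euler_xyz_from_R(R):
    """Invert (23): phi = atan2(R32, R33), psi = arcsin(-R31),
    gamma = atan2(R21, R11)."""
    phi = np.arctan2(R[2, 1], R[2, 2])
    psi = np.arcsin(-R[2, 0])
    gamma = np.arctan2(R[1, 0], R[0, 0])
    return phi, psi, gamma
```

The round trip compose-then-extract recovers the original angles whenever |ψ| < 90°, which is the stated assumption of equation (24).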
The following verifies the accuracy of the monocular-vision-based attitude estimation method for rotating rigid body spacecraft provided in embodiment 1 of the present invention. FIG. 1 is a flow chart of the steps of the present invention. Fig. 2 depicts the target rotating rigid body spacecraft model, the pixel coordinate system, the image coordinate system, the camera coordinate system, the coordinate system of the plane of the reference ellipse, and the target spacecraft body coordinate system. Fig. 3 depicts the direction vector, fitted by the least-squares method, of the straight line through the centers of all detected ellipses on the target spacecraft, together with the estimated and true normal vectors, when the target spacecraft is 4 meters from the tracking spacecraft; fig. 4 shows the absolute error curves between the estimated and true normal vectors, and the errors are all within 0.4°, so the invention realizes the effective calculation of the ellipse normal vectors of the target spacecraft. Figs. 5 and 6 respectively show the absolute and relative position error curves of the target spacecraft with respect to the tracking spacecraft: the absolute error is within 10 cm and the relative error is less than 1.2%, so the relative position of the target spacecraft with respect to the tracking spacecraft is recovered. FIG. 7 shows the Euler angle errors of the target spacecraft relative to the tracking spacecraft, with the upper plot showing the pitch angle error, the middle plot the roll angle γ error, and the lower plot the yaw angle ψ error; all Euler angle errors in the figure are very small. All angle and position errors in figs. 3-7 decrease as the monocular camera approaches the target spacecraft, because the resolution of objects in the image increases and the feature extraction error decreases as the distance between the monocular camera and the target spacecraft decreases. The simulation results fully show that the monocular-vision-based attitude estimation method for rotating rigid body spacecraft provided by the embodiment of the invention is feasible and effective, has a small computational load and high accuracy, and can realize effective estimation of the attitude of a non-cooperative target spacecraft.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (8)

1. A method for estimating the attitude of a rotating rigid body spacecraft based on monocular vision is characterized by comprising the following steps:
s1: establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix;
s2: calculating pixel coordinates of an image obtained by a monocular camera on a tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image;
s3: processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain an image after background separation;
s4: processing the image after background separation, detecting all ellipses in the image after background separation by using a rapid ellipse detection method, marking the ellipses according to the sequence 1,2 and … n, and determining general functions of all the ellipses;
s5: matching the radii of the ellipses of all the marks with the known 3D model of the target rotating rigid body spacecraft, and solving general functions of all the ellipses by using a geometric method to obtain normal vectors and central coordinates of the plane where each ellipse is located under a camera coordinate system;
s6: fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1,2,...,n}, into a straight line, and calculating the unit direction vector of the straight line, wherein i represents the i-th marked ellipse;
s7: according to the characteristic that the centers of all ellipses are on one straight line, and the straight line is the normal line of the plane where the ellipses are located, the normal vector and the central coordinate of the plane where the ellipses are located are uniquely determined, the plane where the first ellipse is located is used as a reference, the rotation matrix of the plane where the reference ellipse is located relative to the tracking spacecraft is calculated, and the rotation matrix is used as the estimation of the attitude of the rotating rigid-body spacecraft.
2. A monocular vision based rotating rigid body spacecraft attitude estimation method according to claim 1, wherein in step S1, a known target rotating rigid body spacecraft 3D model is established, along with a coordinate system describing spacecraft attitude, the coordinate system comprising a pixel coordinate system, an image coordinate system, a camera coordinate system, and a target rotating rigid body spacecraft body coordinate system; and calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix, which specifically comprises the following steps:
pixel coordinate system O_uv-uv: the vertex O_uv of the image obtained by the monocular camera on the tracking spacecraft is the origin, the abscissa u axis runs along the rows of the image and the ordinate v axis along its columns;

image coordinate system O_xy-xy: the image center O_xy is the origin, the x axis is parallel to the u axis and the y axis is parallel to the v axis of the pixel coordinate system;

camera coordinate system O_c-X_cY_cZ_c: the camera optical center O_c is the origin, the Z_c axis points along the optical axis, the X_c axis is parallel to the x axis and the Y_c axis is parallel to the y axis of the image coordinate system;

target spacecraft body coordinate system O_w-X_wY_wZ_w: the center of mass O_w of the target spacecraft body is the origin, the outward normal of the plane of the ellipse is the Z_w axis, the X_w axis lies in a direction perpendicular to that normal, and the Y_w axis is perpendicular to both the X_w axis and the Z_w axis so that the three axes form a right-handed system;

a coordinate system O_D-X_DY_DZ_D is established on the plane of the ellipse: the center O_D of the plane is the origin, the X_D axis is parallel to the X_w axis, the Y_D axis to the Y_w axis, and the Z_D axis to the Z_w axis of the target spacecraft body coordinate system.
3. The method for estimating pose of rotating rigid body spacecraft based on monocular vision according to claim 1, wherein in step S2, calculating pixel coordinates of an image obtained by tracking a monocular camera on the spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image, specifically comprising:
s21: converting an image obtained by a monocular camera on a tracked spacecraft into a gray image;
s22: selecting a sliding template and aligning it with the first three rows and first three columns of pixels of the gray-scale image; all the pixels inside the sliding template are sorted by pixel value to generate a monotonically increasing or decreasing data sequence, and the median of this sequence is selected to replace the pixel value of the central pixel of the sliding template;
s23: translating the whole sliding template by one column of pixels along the direction of the u axis of the pixel coordinate system, and repeating the step S22 until the scanning of the whole row is completed;
s24: moving the whole sliding template down by one row of pixels, repeating the step S22 and the step S23 to scan the next row, and finally obtaining the median-filtered image.
4. The method for estimating pose of rotating rigid body spacecraft based on monocular vision according to claim 1, wherein in step S3, the processing is performed on the image after median filtering, and the spacecraft is separated from the background by weak gradient elimination, so as to obtain the image after background separation, specifically comprising:
s31: calculating the image gradient by using the Prewitt operator, the gradients of the image I in the horizontal and vertical directions being computed with two convolution kernels of window size 3×3 respectively;

G_x = [−1 0 1; −1 0 1; −1 0 1] ⊗ I,  G_y = [−1 −1 −1; 0 0 0; 1 1 1] ⊗ I   (1)

in the formula, G_x and G_y respectively represent the horizontal and vertical gradients of the image, and ⊗ represents the two-dimensional convolution operator;
s32: taking the root mean square of the gradient values in the horizontal direction and the vertical direction as the integral gradient of the image;
G(u, v) = sqrt( G_x(u, v)^2 + G_y(u, v)^2 )   (2)
wherein G (u, v) represents an image gradient value at an image coordinate of (u, v);
s33: arranging gradient values G (u, v) of all pixels in the image in an ascending order, uniformly dividing the gradient values into a plurality of groups, counting the frequency of each group as a histogram form, and approximating the histogram form as an exponential probability density function
f(x) = λ·e^(−λx), x ≥ 0   (3)
Wherein x represents an image gradient value, and lambda is obtained by fitting histogram data by using a formula (3);
s34: substituting the percentage of weak-gradient pixels in the total number of pixels and the λ obtained by formula (3) into the exponential distribution function formula (4), calculating the corresponding gradient segmentation threshold, and setting the pixels whose gradient value is less than or equal to the gradient segmentation threshold to 0;

F(x) = 1 − e^(−λx)   (4)
5. the method for estimating pose of rotating rigid body spacecraft based on monocular vision as claimed in claim 1, wherein step S4, the image after background separation is processed, all ellipses in the image after background separation are detected by using fast ellipse detection method, labeled by 1,2, … n in sequence, and general functions of all ellipses are determined, specifically comprising:
the background-separated image is processed with the rapid ellipse detection method to obtain five parameters (x_0i, y_0i, a_i, b_i, θ_i), i ∈ {1,2,...,n}, of a series of ellipses, where (x_0i, y_0i) represents the center position of the ellipse, a_i the semi-major axis of the ellipse, b_i the semi-minor axis of the ellipse, and θ_i the angle through which the major axis is rotated from the x axis of the image coordinate system; from the five parameters of the ellipse, the general function of the ellipse is determined as follows:
Au2+Bv2+Cuv+Du+Ev+F=0 (5)
wherein A, B, C, D, E and F are parameters of an elliptic general function;
A = a_i^2·sin^2θ_i + b_i^2·cos^2θ_i,  B = a_i^2·cos^2θ_i + b_i^2·sin^2θ_i,  C = 2(b_i^2 − a_i^2)·sinθ_i·cosθ_i,  D = −2A·x_0i − C·y_0i,  E = −C·x_0i − 2B·y_0i,  F = A·x_0i^2 + B·y_0i^2 + C·x_0i·y_0i − a_i^2·b_i^2   (6)
6. the method for estimating pose of rotating rigid body spacecraft based on monocular vision according to claim 5, wherein the step S5 is to match radii of all marked ellipses with known 3D model of target rotating rigid body spacecraft, and solve general functions of all ellipses by using a geometric method to obtain normal vector and center coordinate of a plane where each ellipse is located under a camera coordinate system, and specifically comprises:
s51: matching the marked ellipses with the known target spacecraft 3D model in the positive direction of the Z_w axis to obtain the radius r_i of each matched circle, and rewriting the formula (5) into an algebraic form:
[u v 1]Q[u v 1]T=0 (7)
wherein,

Q = [A C/2 D/2; C/2 B E/2; D/2 E/2 F]   (8)
the point is represented as P_c = (X_c, Y_c, Z_c)^T in the camera coordinate system and as (u, v)^T in the pixel coordinate system, with corresponding homogeneous coordinate (u, v, 1)^T, and the following relationship is satisfied:

Z_c·(u, v, 1)^T = M_ins·(X_c, Y_c, Z_c)^T   (9)

X_c, Y_c and Z_c respectively represent the distances of the point along the X_c axis, Y_c axis and Z_c axis of the camera coordinate system; M_ins is the intrinsic parameter matrix of the monocular camera; converting the formula (7) into the camera coordinate system, the oblique elliptic cone Γ equation under the camera coordinate system is obtained as follows:
[Xc Yc Zc]CQ[Xc Yc Zc]T=0 (10)
wherein,

C_Q = M_ins^T·Q·M_ins   (11)
establishing a coordinate system O_c-X'Y'Z' at the origin of the camera coordinate system, the Z' axis being parallel to the normal of the plane of the ellipse, so that the oblique elliptic cone Γ becomes a right elliptic cone; between the coordinate system O_c-X'Y'Z' and the camera coordinate system O_c-X_cY_cZ_c there is only a rotational transformation; through the rotation matrix P, C_Q is converted to a diagonal matrix:

P^T·C_Q·P = diag(λ_1, λ_2, λ_3)   (12)

wherein λ_1, λ_2 and λ_3 are the eigenvalues of C_Q, and λ_1 ≥ λ_2 > 0 > λ_3; in the coordinate system O_c-X'Y'Z', the center coordinate O'_i and normal vector n'_i of the plane of the i-th ellipse are:
Figure FDA0003073284680000044
wherein i ∈ {1,2,...,n}, and r_i represents the radius of each circle after matching with the known 3D model of the target spacecraft;
the center coordinate O'_i and normal vector n'_i of the plane of the ellipse are transformed to the camera coordinate system through the rotation matrix P:

O_i = P·O'_i,  n_i = P·n'_i

wherein O_i = (x_i, y_i, z_i)^T represents the center coordinate of the plane of the i-th ellipse in the camera coordinate system, and n_i represents the normal vector of the plane of the i-th ellipse in the camera coordinate system;
s52: matching the marked ellipses with the known target spacecraft 3D model in the negative direction of the Z_w axis, repeating the step S51 to obtain the radius r_i of each matched circle, and calculating the center coordinates and normal vectors, in the camera coordinate system, of the planes of the series of ellipses matched in the negative Z_w direction.
7. A method for estimating the attitude of a rotating rigid body spacecraft based on monocular vision as recited in claim 1, wherein the step S6 of fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1,2,...,n}, into a straight line and calculating the unit direction vector of the straight line specifically comprises:
the simplified form of the straight line l is:

x = a·z + b,  y = c·z + d   (15)

wherein a, b, c and d are the parameters of the straight line l, and the matrix form converted from (15) is:

(x, y)^T = [z 1 0 0; 0 0 z 1]·(a, b, c, d)^T   (16)
all the center coordinates (x_i, y_i, z_i), i ∈ {1,2,...,n}, satisfy:

(x_1, y_1, x_2, y_2, ..., x_n, y_n)^T = H·(a, b, c, d)^T,  H = [z_1 1 0 0; 0 0 z_1 1; ...; z_n 1 0 0; 0 0 z_n 1]   (17)
thus the least-squares solution is:

(a, b, c, d)^T = (H^T·H)^(−1)·H^T·(x_1, y_1, x_2, y_2, ..., x_n, y_n)^T   (18)
this gives (a, c, 1) as the direction vector N of the straight line l, and N is converted into the unit direction vector n_l = N/‖N‖.
8. The method for estimating the attitude of a rotating rigid body spacecraft based on monocular vision according to claim 1, wherein the step S7 of uniquely determining the normal vector and center coordinate of the plane of the ellipse according to the characteristic that the centers of all the ellipses lie on one straight line and that this straight line is the normal of the plane of the ellipses, and of calculating, with the plane of the first ellipse as the reference, the rotation matrix of the plane of the reference ellipse relative to the tracking spacecraft as the estimate of the attitude of the rotating rigid body spacecraft, specifically comprises:
s71: calculating the angles θ_1 and θ_2 between the unit direction vector n_l of the straight line l and the two candidate normal vectors n_1^(1) and n_1^(2) of the plane of the reference ellipse obtained in S5:

θ_k = arccos( n_l·n_1^(k) / (‖n_l‖·‖n_1^(k)‖) ),  k = 1, 2   (19)
S72: uniquely determining the normal vector of the plane of the reference ellipse and its corresponding center coordinate according to the following principle:

θ_k < ε ⇒ select n_1^(k) and its corresponding center coordinate   (20)

wherein ε < 5° represents the angle threshold;
s73: calculating a displacement vector t of the target spacecraft relative to the tracking spacecraft according to the center coordinate of the plane where the uniquely determined reference ellipse is located:
t=(Ox,Oy,Oz)T (21)
wherein (O_x, O_y, O_z) represents the center coordinate of the plane of the uniquely determined reference ellipse;
the uniquely determined normal vector n of the plane of the reference ellipse is taken as the Z_D-axis direction vector r_3; a vector perpendicular to it is calculated as the X_D-axis direction vector r_1, and r_2 = r_3 × r_1 completes the right-handed system; thereby the rotation matrix R of the target spacecraft relative to the tracking spacecraft is calculated:

R = [r_1  r_2  r_3]   (22)

wherein r_3 = n represents the normal vector of the plane of the uniquely determined reference ellipse.
CN202110545278.7A 2021-05-19 2021-05-19 Monocular vision-based attitude estimation method for rotating rigid body spacecraft Active CN113295171B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110545278.7A CN113295171B (en) 2021-05-19 2021-05-19 Monocular vision-based attitude estimation method for rotating rigid body spacecraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110545278.7A CN113295171B (en) 2021-05-19 2021-05-19 Monocular vision-based attitude estimation method for rotating rigid body spacecraft

Publications (2)

Publication Number Publication Date
CN113295171A true CN113295171A (en) 2021-08-24
CN113295171B CN113295171B (en) 2022-08-16

Family

ID=77322796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110545278.7A Active CN113295171B (en) 2021-05-19 2021-05-19 Monocular vision-based attitude estimation method for rotating rigid body spacecraft

Country Status (1)

Country Link
CN (1) CN113295171B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114963981A (en) * 2022-05-16 2022-08-30 南京航空航天大学 Monocular vision-based cylindrical part butt joint non-contact measurement method
CN116310126A (en) * 2023-03-23 2023-06-23 南京航空航天大学 Aircraft air inlet three-dimensional reconstruction method and system based on cooperative targets

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108492333A (en) * 2018-03-30 2018-09-04 哈尔滨工业大学 Spacecraft attitude method of estimation based on satellite-rocket docking ring image information
CN109405835A (en) * 2017-08-31 2019-03-01 北京航空航天大学 Relative pose measurement method based on noncooperative target straight line and circle monocular image
CN110186465A (en) * 2019-07-03 2019-08-30 西北工业大学 A kind of space non-cooperative target relative status estimation method based on monocular vision

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114963981A (en) * 2022-05-16 2022-08-30 南京航空航天大学 Monocular vision-based cylindrical part butt joint non-contact measurement method
CN114963981B (en) * 2022-05-16 2023-08-15 南京航空航天大学 Cylindrical part butt joint non-contact measurement method based on monocular vision
CN116310126A (en) * 2023-03-23 2023-06-23 南京航空航天大学 Aircraft air inlet three-dimensional reconstruction method and system based on cooperative targets
CN116310126B (en) * 2023-03-23 2023-11-03 南京航空航天大学 Aircraft air inlet three-dimensional reconstruction method and system based on cooperative targets

Similar Documents

Publication Publication Date Title
CN108562274B (en) Marker-based non-cooperative target pose measurement method
CN107862719B (en) Method and device for calibrating external parameters of camera, computer equipment and storage medium
Kolomenkin et al. Geometric voting algorithm for star trackers
Zhang et al. Vision-based pose estimation for textureless space objects by contour points matching
CN111862201B (en) Deep learning-based spatial non-cooperative target relative pose estimation method
CN110047108B (en) Unmanned aerial vehicle pose determination method and device, computer equipment and storage medium
CN111507901B (en) Aerial image splicing and positioning method based on aerial GPS and scale invariant constraint
EP2887315B1 (en) Camera calibration device, method for implementing calibration, program and camera for movable body
CN113295171B (en) Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN109631911B (en) Satellite attitude rotation information determination method based on deep learning target recognition algorithm
CN107677274A (en) Unmanned plane independent landing navigation information real-time resolving method based on binocular vision
CN110516532B (en) Unmanned aerial vehicle railway track line identification method based on computer vision
CN109214254B (en) Method and device for determining displacement of robot
CN111598952A (en) Multi-scale cooperative target design and online detection and identification method and system
CN115187798A (en) Multi-unmanned aerial vehicle high-precision matching positioning method
CN108225273B (en) Real-time runway detection method based on sensor priori knowledge
CN115147723B (en) Inland ship identification and ranging method, inland ship identification and ranging system, medium, equipment and terminal
CN110427030B (en) Unmanned ship autonomous docking recovery method based on Tiny-YolOship target detection algorithm
Kaufmann et al. Shadow-based matching for precise and robust absolute self-localization during lunar landings
CN108921896B (en) Downward vision compass integrating dotted line characteristics
Betge-Brezetz et al. Object-based modelling and localization in natural environments
CN110634160B (en) Method for constructing target three-dimensional key point extraction model and recognizing posture in two-dimensional graph
CN110211148B (en) Underwater image pre-segmentation method based on target state estimation
Del Pizzo et al. Reliable vessel attitude estimation by wide angle camera
CN115760984A (en) Non-cooperative target pose measurement method based on monocular vision by cubic star

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant