CN110186465B - Monocular vision-based space non-cooperative target relative state estimation method
- Publication number: CN110186465B (application CN201910593684.3A)
- Authority: CN (China)
- Legal status: Active
Classifications
- G01C21/24: Navigation; navigational instruments specially adapted for cosmonautical navigation
Abstract
The invention provides a monocular vision-based relative state estimation method for a space non-cooperative target. A monocular camera provides relative navigation during the final approach to the target, and filter convergence is accelerated by adding constraints to the observation equation. The method establishes a kinematics and dynamics model of the relative motion between the target and the service spacecraft, establishes an observation equation augmented with constraint equations among the target feature points, and designs a filter to estimate the relative state of the target. Compared with a binocular camera (stereo vision) or a laser imaging radar, the monocular camera offers small volume and mass, low power consumption and low cost; adding the constraints satisfied by the relative features to the observation equation makes the filter converge faster.
Description
Technical Field
The invention relates to the technical field of spacecraft relative navigation, and in particular to a method for estimating the relative state of a target.
Background
To perform on-orbit servicing of non-cooperative targets, the relative position and attitude between the target and the service spacecraft must be estimated during close-proximity approach. Space non-cooperative targets typically share the following characteristics: no dedicated optical markers on the surface, no communication link, and an unknown target model. These properties make relative pose estimation during approach operations very difficult.
Among the measurement sensors available for relative navigation are monocular cameras, binocular cameras (stereo vision) and laser imaging radars. Compared with the latter two, a monocular camera is technically simple, small in volume and mass, low in power consumption and economical, but it cannot directly measure depth. Although the literature has demonstrated the observability of the position-related states when the camera is mounted with an offset, estimating the relative state of a target with a monocular camera remains difficult.
In fact, estimating the relative state of an unknown non-cooperative target is analogous to the SLAM problem: the relative pose of the target is estimated while its structure is simultaneously recovered, and for a tumbling target the principal axes of inertia generally must be estimated as well. One method in the literature takes the image coordinates of feature points on the target surface as observations and designs a filtering algorithm that estimates the relative state from a motion model of the feature points and a dynamics model of the target. Its main shortcoming is slow filter convergence.
Disclosure of Invention
To overcome the shortcomings of the prior art, the invention provides a monocular vision-based method for estimating the relative state of a space non-cooperative target. The technical problems solved are realizing relative navigation during the final approach to a space non-cooperative target with a monocular camera, and accelerating filter convergence by adding constraints to the observation equation.
The technical scheme adopted by the invention to solve the technical problem comprises the following steps:
Step one: establish a kinematics and dynamics model of the relative motion between the target and the service spacecraft.
The following coordinate systems are defined:
(1) Geocentric inertial coordinate system I: the origin is at the Earth's center of mass, the z-axis points to the Earth's north pole, the x-axis points to the vernal equinox, and the y-axis completes the right-handed triad;
(2) LVLH coordinate system H: the origin is at the service spacecraft centroid, the x-axis points from the Earth's center to the service spacecraft centroid, the z-axis is perpendicular to the orbital plane along the orbital angular momentum direction, and the y-axis completes the right-handed triad;
(3) Service spacecraft body coordinate system A: the origin is fixed at the service spacecraft centroid, and the coordinate axes coincide with the spacecraft's principal axes of inertia;
(4) Target body coordinate system B: the origin is fixed at the target centroid, the coordinate axes coincide with the target's principal axes of inertia, the principal axis of minimum inertia is taken as the x-axis, and the principal axis of maximum inertia as the y-axis;
(5) Camera coordinate system C: the origin is fixed at the camera optical center, its position vector in the service spacecraft body coordinate system is d, a known constant; the z-axis is along the camera optical axis, and the x- and y-axes are parallel to the imaging plane.
the relative attitude of the target specimen body coordinate system B relative to the service spacecraft body coordinate system A is expressed by a quaternion q which is expressed byq v =[q 1 q 2 q 3 ] T As part of a quaternion vector, q 4 A scalar part which is a quaternion, and a rotation matrix corresponding to the quaternion q is:
wherein [ q ] is]Is vector q ═ q 1 q 2 q 3 ] T The antisymmetric matrix of (a):
the multiplication between quaternions b and q is defined as follows:
the representation of the angular speed of the serving spacecraft body coordinate system A and the target specimen coordinate system B under the respective body coordinate systems is defined asThe angular velocity ω of the target relative to the serving spacecraft is expressed in the target object coordinate system B as:
the kinematic equation of the relative attitude is:
wherein the angular velocity of the serving spacecraftMeasured by inertial equipment of a service spacecraft, and used as a known quantity in a model equation;
The attitude dynamics equation of the target is:

$$J_t\,\dot\omega_t = -\,\omega_t \times (J_t\,\omega_t) + \tau_t \qquad (6)$$

where $\omega_t$ is the target angular velocity, $\tau_t$ is a noise term, and $J_t = \mathrm{diag}(J_{xx}, J_{yy}, J_{zz})$ is the inertia matrix of the target, $J_{xx}, J_{yy}, J_{zz}$ being the moments of inertia about the target principal x-, y- and z-axes. Let

$$k_1 = \frac{J_{yy} - J_{zz}}{J_{xx}}, \qquad k_2 = \frac{J_{zz} - J_{xx}}{J_{yy}}, \qquad k_3 = \frac{J_{xx} - J_{yy}}{J_{zz}} = -\,\frac{k_1 + k_2}{1 + k_1 k_2};$$

then equation (6) becomes:

$$\dot\omega_{tx} = k_1\,\omega_{ty}\,\omega_{tz} + \tau_x', \qquad \dot\omega_{ty} = k_2\,\omega_{tz}\,\omega_{tx} + \tau_y', \qquad \dot\omega_{tz} = k_3\,\omega_{tx}\,\omega_{ty} + \tau_z' \qquad (7)$$

The target moment-of-inertia ratios are constant and satisfy:

$$\dot k_1 = 0, \qquad \dot k_2 = 0 \qquad (8)$$

where $\dot k_1$ and $\dot k_2$ are the rates of change of $k_1$ and $k_2$ with respect to time;
Define $\rho_0$ as the position vector of the target centroid relative to the service spacecraft centroid, expressed in the LVLH coordinate system as $[x \; y \; z]^T$. Considering that the service spacecraft stays in close proximity to the target, $\rho_0$ is small and the relative position dynamics equation is:

$$\ddot x = 2\dot\nu\,\dot y + \ddot\nu\,y + \dot\nu^2\!\left(1 + \frac{2 r_c}{p_c}\right)x, \qquad \ddot y = -\,2\dot\nu\,\dot x - \ddot\nu\,x + \dot\nu^2\!\left(1 - \frac{r_c}{p_c}\right)y, \qquad \ddot z = -\,\dot\nu^2\,\frac{r_c}{p_c}\,z \qquad (9)$$

where $\dot\nu$ is the true anomaly rate of the service spacecraft orbit, $r_c$ is the service spacecraft orbit radius, $p_c$ is the semi-latus rectum, $\ddot x, \ddot y, \ddot z$ are the second time derivatives of x, y, z, $\dot r_c$ is the first time derivative of $r_c$, and $\ddot\nu = -2\dot\nu\,\dot r_c/r_c$. The service spacecraft orbit parameters $\dot\nu$, $r_c$, $p_c$, $\dot r_c$ are obtained from the service spacecraft orbit determination system and are treated as known quantities in the model equations;
Define $p_1, \ldots, p_N$ as the position vectors of the N feature points on the target relative to the target centroid, expressed in the target body coordinate system. Under the rigid-body assumption for the target:

$$\dot p_i = 0, \qquad i = 1, \ldots, N \qquad (10)$$

where $\dot p_i$ is the rate of change of $p_i$ with respect to time. Collecting the state to be estimated as $X = [\,q^T \; \omega_t^T \; k_1 \; k_2 \; \rho_0^T \; \dot\rho_0^T \; p_1^T \cdots p_N^T\,]^T$, the continuous-time state equation is:

$$\dot X = f(X) + w(t) \qquad (11)$$

where f(X) is assembled from equations (5), (7), (8), (9) and (10), and w(t) is the system noise term;
Step two: establish the observation equation, adding the constraint relation equations among the target feature points.
The position vector $\rho_i$ of feature point i on the target, expressed in the camera coordinate system, is defined as:

$$\rho_i = C_{CA}\left( C_{AH}\,\rho_0 + A(q)^T p_i - d \right) \qquad (12)$$

where $C_{CA}$ is the rotation matrix from the service spacecraft body coordinate system to the camera coordinate system, a known quantity; $C_{AH}$ is the rotation matrix from the LVLH coordinate system to the service spacecraft body coordinate system, obtained from the spacecraft's own measurement devices; and $A(q)$, which depends on the state q to be estimated, is given by equation (1).
The image projection coordinates $[u_i \; v_i]^T$ of feature point $p_i$ are related to $\rho_i = [x_i \; y_i \; z_i]^T$ by:

$$u_i = f_x\,\frac{x_i}{z_i} + c_x, \qquad v_i = f_y\,\frac{y_i}{z_i} + c_y \qquad (13)$$

where $f_x, f_y, c_x, c_y$ are the intrinsic parameters of the camera used;
Further, considering feature points i, j and k, the relative feature $\rho_{ij} = \rho_j - \rho_i$ is defined, and $\rho_{jk}$ and $\rho_{ki}$ are defined in the same way. The projection coordinate of $\rho_{ij}$ in the image is $y_{ij}$, and the two satisfy relation (14). Since the vector sum satisfies

$$\rho_{ij} + \rho_{jk} + \rho_{ki} = 0 \qquad (15)$$

the feature points i, j and k must satisfy the corresponding constraint among their projections, denoted compactly as

$$M(i,j,k) = y_{i,j,k} \qquad (16)$$
The N feature points on the target are numbered 1, 2, …, N; constraint equations are formed in groups of three points and added to the observation equation. Let $s_1, \ldots, s_m$ be the point-selection combinations, covering all feature points to be estimated, with $s_1 = (1, 2, 3),\ s_2 = (2, 3, 4),\ \ldots,\ s_{N-1} = (N-1, N, 1)$.
The final observation equation is:

$$z = h(X) + v \qquad (17)$$

where z is the vector of all observed quantities and v is the observation noise. Replacing all components involving $\rho_i$ and $\rho_{ij}$ using equations (12) and (14) yields the mapping h from state to observation;
Step three: design the filter to estimate the relative state of the target.
Equation (11) is the state equation of a continuous-time system; it is first converted into a discrete model, on which the estimation is then performed:

$$X(k+1) = F\big(X(k)\big) + W(k) \qquad (18)$$

Taking equation (18) as the system state equation and equation (17) as the observation equation, the state is estimated with an iterated extended Kalman filter. The estimation steps are as follows:
Step 3.1: initialization: set the initial filter estimate $\hat X(0)$ and the error covariance matrix P(0);
Step 3.2: the state prediction is:

$$\hat X(k+1/k) = F\big(\hat X(k/k)\big) \qquad (19)$$

and the prediction error covariance matrix is:

$$P(k+1/k) = \Phi(k)\,P(k/k)\,\Phi(k)^T + Q(k) \qquad (20)$$

where $\Phi(k)$ is the Jacobian of F evaluated at $\hat X(k/k)$, Q(k) is the process noise covariance, P(k+1/k) is the prediction error covariance matrix, and P(k/k) is the filtering error covariance matrix at time k;
Step 3.3: state update:
Step 3.3.1: take the predicted value as the iteration initial value:

$$\hat X(k+1/k+1)_0 = \hat X(k+1/k) \qquad (21)$$

$$P(k+1/k+1)_0 = P(k+1/k) \qquad (22)$$

Step 3.3.2: in the iteration process, the i-th iteration is:

$$K(k+1)_i = P(k+1/k)\,H(k+1)_{i-1}^T\left[ H(k+1)_{i-1}\,P(k+1/k)\,H(k+1)_{i-1}^T + R_{k+1} \right]^{-1} \qquad (23)$$

where $H(k+1)_{i-1}$ is the Jacobian of the observation mapping h evaluated at $\hat X(k+1/k+1)_{i-1}$, $R_{k+1}$ is the measurement noise covariance, and $K(k+1)_i$ is the gain at the i-th iteration;

$$r(k+1)_i = z(k+1) - h\big(\hat X(k+1/k+1)_{i-1}\big) \qquad (24)$$

where $r(k+1)_i$ is the observation residual at the i-th iteration, obtained by substituting the (i-1)-th iterate into observation equation (17), and z(k+1) is the vector of all observations at time k+1;

$$\hat X(k+1/k+1)_i = \hat X(k+1/k) + K(k+1)_i\left[ r(k+1)_i - H(k+1)_{i-1}\big(\hat X(k+1/k) - \hat X(k+1/k+1)_{i-1}\big) \right] \qquad (25)$$

$$P(k+1/k+1)_i = \left[ I - K(k+1)_i\,H(k+1)_{i-1} \right] P(k+1/k+1)_{i-1} \qquad (26)$$
Step 3.3.3: the iteration terminates when the number of iterations reaches the maximum allowed or the state iteration error falls below a set threshold, the state iteration error being defined as:

$$e_i = \big\| \hat X(k+1/k+1)_i - \hat X(k+1/k+1)_{i-1} \big\|_2 \qquad (27)$$

where $\|\cdot\|_2$ denotes the vector two-norm. Assuming the iteration terminates at the n-th iteration, the iterate at that point is output as the final estimate:

$$\hat X(k+1/k+1) = \hat X(k+1/k+1)_n \qquad (28)$$

$$P(k+1/k+1) = P(k+1/k+1)_n \qquad (29)$$

where $\hat X(k+1/k+1)$ is the final state estimate at time k+1.
The method has the advantage that a monocular camera realizes relative navigation during the final approach to a space non-cooperative target; compared with a binocular camera (stereo vision) or a laser imaging radar, the monocular camera is small in volume and mass, low in power consumption and low in cost. Furthermore, adding the constraints satisfied by the relative features to the observation equation makes the filter converge faster.
Drawings
Fig. 1 is a schematic diagram of the body coordinate systems and measurement geometry of the service spacecraft and the target.
In the figure: 1 is the service spacecraft, 2 the target satellite, and 3 the monocular camera.
Detailed Description
The invention is further illustrated with reference to the figures and an embodiment.
Building on existing research, the invention provides a method that uses a monocular camera to estimate the relative state of a target with fast filter convergence. The steps of the embodiment are as follows:
Step one: establish a kinematics and dynamics model of the relative motion between the target and the service spacecraft.
The following coordinate systems are defined:
(1) Geocentric inertial coordinate system I: the origin is at the Earth's center of mass, the z-axis points to the Earth's north pole, the x-axis points to the vernal equinox, and the y-axis completes the right-handed triad;
(2) LVLH coordinate system H: the origin is at the service spacecraft centroid, the x-axis points from the Earth's center to the service spacecraft centroid, the z-axis is perpendicular to the orbital plane along the orbital angular momentum direction, and the y-axis completes the right-handed triad;
(3) Service spacecraft body coordinate system A: the origin is fixed at the service spacecraft centroid, and the coordinate axes coincide with the spacecraft's principal axes of inertia;
(4) Target body coordinate system B: the origin is fixed at the target centroid, the coordinate axes coincide with the target's principal axes of inertia, the principal axis of minimum inertia is taken as the x-axis, and the principal axis of maximum inertia as the y-axis;
(5) Camera coordinate system C: the origin is fixed at the camera optical center, its position vector in the service spacecraft body coordinate system is d, a known constant; the z-axis is along the camera optical axis, and the x- and y-axes are parallel to the imaging plane.
The relative attitude of the target body coordinate system B with respect to the service spacecraft body coordinate system A is expressed by the quaternion $q = [q_v^T \; q_4]^T$, where $q_v = [q_1 \; q_2 \; q_3]^T$ is the vector part and $q_4$ is the scalar part of the quaternion. The rotation matrix corresponding to the quaternion q is:

$$A(q) = (q_4^2 - q_v^T q_v)\,I_3 + 2\,q_v q_v^T - 2\,q_4\,[q_v\times] \qquad (1)$$

where $[q_v\times]$ is the skew-symmetric matrix of the vector $q_v = [q_1 \; q_2 \; q_3]^T$:

$$[q_v\times] = \begin{bmatrix} 0 & -q_3 & q_2 \\ q_3 & 0 & -q_1 \\ -q_2 & q_1 & 0 \end{bmatrix} \qquad (2)$$

The multiplication of two quaternions b and q is defined as:

$$b \otimes q = \begin{bmatrix} b_4\,q_v + q_4\,b_v - b_v \times q_v \\ b_4\,q_4 - b_v^T q_v \end{bmatrix} \qquad (3)$$
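To make the attitude conventions concrete, the following minimal Python sketch implements equations (1) to (3); the function names and the scalar-last quaternion layout [q1, q2, q3, q4] are illustrative assumptions rather than part of the patented method.

```python
import numpy as np

def skew(v):
    """Skew-symmetric (cross-product) matrix [v x] of a 3-vector, as in equation (2)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rotation_matrix(q):
    """Rotation matrix A(q) of equation (1); q = [q1, q2, q3, q4], scalar part last."""
    qv, q4 = q[:3], q[3]
    return (q4**2 - qv @ qv) * np.eye(3) + 2.0 * np.outer(qv, qv) - 2.0 * q4 * skew(qv)

def quat_multiply(b, q):
    """Quaternion product b (x) q of equation (3); with this sign convention the
    product composes rotation matrices as A(b (x) q) = A(b) A(q)."""
    bv, b4 = b[:3], b[3]
    qv, q4 = q[:3], q[3]
    return np.concatenate([b4 * qv + q4 * bv - np.cross(bv, qv),
                           [b4 * q4 - bv @ qv]])
```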
The angular velocities of the service spacecraft body coordinate system A and of the target body coordinate system B, expressed in their respective body coordinate systems, are denoted $\omega_s$ and $\omega_t$. The angular velocity $\omega$ of the target relative to the service spacecraft, expressed in the target body coordinate system B, is:

$$\omega = \omega_t - A(q)\,\omega_s \qquad (4)$$

The kinematic equation of the relative attitude is:

$$\dot q = \frac{1}{2}\begin{bmatrix} \omega \\ 0 \end{bmatrix} \otimes q \qquad (5)$$

where the service spacecraft angular velocity $\omega_s$ is measured by the inertial devices of the service spacecraft and is treated as a known quantity in the model equations;
The attitude dynamics equation of the target is:

$$J_t\,\dot\omega_t = -\,\omega_t \times (J_t\,\omega_t) + \tau_t \qquad (6)$$

where $\omega_t$ is the target angular velocity, $\tau_t$ is a noise term, and $J_t = \mathrm{diag}(J_{xx}, J_{yy}, J_{zz})$ is the inertia matrix of the target, $J_{xx}, J_{yy}, J_{zz}$ being the moments of inertia about the target principal x-, y- and z-axes. Since the target moments of inertia are unknown, the state cannot be estimated directly from equation (6); the inertia is therefore parameterized. Let

$$k_1 = \frac{J_{yy} - J_{zz}}{J_{xx}}, \qquad k_2 = \frac{J_{zz} - J_{xx}}{J_{yy}}, \qquad k_3 = \frac{J_{xx} - J_{yy}}{J_{zz}} = -\,\frac{k_1 + k_2}{1 + k_1 k_2};$$

then equation (6) becomes:

$$\dot\omega_{tx} = k_1\,\omega_{ty}\,\omega_{tz} + \tau_x', \qquad \dot\omega_{ty} = k_2\,\omega_{tz}\,\omega_{tx} + \tau_y', \qquad \dot\omega_{tz} = k_3\,\omega_{tx}\,\omega_{ty} + \tau_z' \qquad (7)$$

Since the target is a rigid body whose mass and mass distribution do not change over a short period, the target moment-of-inertia ratios are constant and satisfy:

$$\dot k_1 = 0, \qquad \dot k_2 = 0 \qquad (8)$$

where $\dot k_1$ and $\dot k_2$ are the rates of change of $k_1$ and $k_2$ with respect to time;
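As an illustration of the parameterized dynamics of equations (7) and (8), the sketch below propagates the target angular velocity from the two inertia ratios alone; the forward-Euler integrator and the numerical values are assumptions for demonstration only.

```python
import numpy as np

def omega_dot(w, k1, k2, tau=np.zeros(3)):
    """Right-hand side of equation (7): Euler's equations in inertia-ratio form."""
    k3 = -(k1 + k2) / (1.0 + k1 * k2)  # third ratio follows from k1 and k2
    return np.array([k1 * w[1] * w[2],
                     k2 * w[2] * w[0],
                     k3 * w[0] * w[1]]) + tau

# Example: propagate a tumbling target's angular velocity for one second.
w = np.array([0.05, -0.02, 0.08])   # rad/s, expressed in the target body frame B
k1, k2 = 0.3, -0.5                  # constant inertia ratios, cf. equation (8)
dt = 0.01
for _ in range(100):                # simple forward-Euler step (illustrative only)
    w = w + dt * omega_dot(w, k1, k2)
```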
Define $\rho_0$ as the position vector of the target centroid relative to the service spacecraft centroid, expressed in the LVLH coordinate system as $[x \; y \; z]^T$. Considering that the service spacecraft stays in close proximity to the target, $\rho_0$ is small and the relative position dynamics equation is:

$$\ddot x = 2\dot\nu\,\dot y + \ddot\nu\,y + \dot\nu^2\!\left(1 + \frac{2 r_c}{p_c}\right)x, \qquad \ddot y = -\,2\dot\nu\,\dot x - \ddot\nu\,x + \dot\nu^2\!\left(1 - \frac{r_c}{p_c}\right)y, \qquad \ddot z = -\,\dot\nu^2\,\frac{r_c}{p_c}\,z \qquad (9)$$

where $\dot\nu$ is the true anomaly rate of the service spacecraft orbit, $r_c$ is the service spacecraft orbit radius, $p_c$ is the semi-latus rectum, $\ddot x, \ddot y, \ddot z$ are the second time derivatives of x, y, z, $\dot r_c$ is the first time derivative of $r_c$, and $\ddot\nu = -2\dot\nu\,\dot r_c/r_c$. The service spacecraft orbit parameters $\dot\nu$, $r_c$, $p_c$, $\dot r_c$ are obtained from the service spacecraft orbit determination system and are treated as known quantities in the model equations;
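A corresponding sketch of the relative position dynamics of equation (9), as reconstructed above; the state layout [x, y, z, xdot, ydot, zdot] and the parameter names are illustrative.

```python
import numpy as np

def relative_position_dot(s, nu_dot, r_c, p_c, r_c_dot):
    """Right-hand side of equation (9) for s = [x, y, z, xdot, ydot, zdot]
    in the LVLH frame, with nu_ddot = -2 * nu_dot * r_c_dot / r_c."""
    x, y, z, xd, yd, zd = s
    nu_ddot = -2.0 * nu_dot * r_c_dot / r_c
    xdd = 2*nu_dot*yd + nu_ddot*y + nu_dot**2 * (1 + 2*r_c/p_c) * x
    ydd = -2*nu_dot*xd - nu_ddot*x + nu_dot**2 * (1 - r_c/p_c) * y
    zdd = -nu_dot**2 * (r_c/p_c) * z
    return np.array([xd, yd, zd, xdd, ydd, zdd])
```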
Define $p_1, \ldots, p_N$ as the position vectors of the N feature points on the target relative to the target centroid, expressed in the target body coordinate system. Under the rigid-body assumption for the target:

$$\dot p_i = 0, \qquad i = 1, \ldots, N \qquad (10)$$

where $\dot p_i$ is the rate of change of $p_i$ with respect to time. Collecting the state to be estimated as $X = [\,q^T \; \omega_t^T \; k_1 \; k_2 \; \rho_0^T \; \dot\rho_0^T \; p_1^T \cdots p_N^T\,]^T$, the continuous-time state equation is:

$$\dot X = f(X) + w(t) \qquad (11)$$

where f(X) is assembled from equations (5), (7), (8), (9) and (10), and w(t) is the system noise term;
Step two: establish the observation equation, adding the constraint relation equations among the target feature points.
The observed quantities are the coordinates of the target feature points extracted from the image; N feature points are assumed to be selected and tracked, and the disappearance of feature points due to occlusion is not considered. The position vector $\rho_i$ of feature point i on the target, expressed in the camera coordinate system, is defined (see Fig. 1) as:

$$\rho_i = C_{CA}\left( C_{AH}\,\rho_0 + A(q)^T p_i - d \right) \qquad (12)$$

where $C_{CA}$ is the rotation matrix from the service spacecraft body coordinate system to the camera coordinate system, a known quantity; $C_{AH}$ is the rotation matrix from the LVLH coordinate system to the service spacecraft body coordinate system, which depends on the attitude motion and orbital position of the service spacecraft and is obtained from its own measurement devices; and $A(q)$, which depends on the state q to be estimated, is given by equation (1).
The image projection coordinates $[u_i \; v_i]^T$ of feature point $p_i$ are related to $\rho_i = [x_i \; y_i \; z_i]^T$ by:

$$u_i = f_x\,\frac{x_i}{z_i} + c_x, \qquad v_i = f_y\,\frac{y_i}{z_i} + c_y \qquad (13)$$

where $f_x, f_y, c_x, c_y$ are the intrinsic parameters of the camera used;
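The measurement model of equations (12) and (13) can be sketched as follows, reusing rotation_matrix from the earlier sketch; the argument names (C_AH, C_CA, d_A and the intrinsics fx, fy, cx, cy) mirror the definitions in the text and are otherwise illustrative.

```python
import numpy as np

def feature_in_camera(rho0_H, p_i_B, q, C_AH, C_CA, d_A):
    """Equation (12): position rho_i of feature point i in the camera frame C.
    rho0_H: target centroid relative to spacecraft centroid, LVLH frame H
    p_i_B:  feature point relative to the target centroid, target body frame B
    q:      relative attitude quaternion of B w.r.t. A; A(q) maps A- into B-coordinates
    """
    A_q = rotation_matrix(q)  # rotation_matrix from the earlier sketch, equation (1)
    return C_CA @ (C_AH @ rho0_H + A_q.T @ p_i_B - d_A)

def project(rho_i, fx, fy, cx, cy):
    """Pinhole projection of equation (13): camera-frame point to pixel coordinates."""
    x, y, z = rho_i
    return np.array([fx * x / z + cx, fy * y / z + cy])
```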
Further, considering feature points i, j and k, the relative feature $\rho_{ij} = \rho_j - \rho_i$ is defined, and $\rho_{jk}$ and $\rho_{ki}$ are defined in the same way. The projection coordinate of $\rho_{ij}$ in the image is $y_{ij}$, and the two satisfy relation (14). Since the vector sum satisfies

$$\rho_{ij} + \rho_{jk} + \rho_{ki} = 0 \qquad (15)$$

the feature points i, j and k must satisfy the corresponding constraint among their projections, denoted compactly as

$$M(i,j,k) = y_{i,j,k} \qquad (16)$$
The N feature points on the target are numbered 1, 2, …, N; constraint equations are formed in groups of three points and added to the observation equation, as shown in the sketch after this passage. Let $s_1, \ldots, s_m$ be the point-selection combinations, covering all feature points to be estimated, with $s_1 = (1, 2, 3),\ s_2 = (2, 3, 4),\ \ldots,\ s_{N-1} = (N-1, N, 1)$.
The final observation equation is:

$$z = h(X) + v \qquad (17)$$

where z is the vector of all observed quantities and v is the observation noise. Replacing all components involving $\rho_i$ and $\rho_{ij}$ using equations (12) and (14) yields the mapping h from state to observation;
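The point-selection scheme and the assembly of the augmented observation vector of equation (17) can be sketched as below; since the concrete form of the constraint M is fixed by equations (14) to (16) in the original formulas, it is passed in here as a generic callable, and all names are illustrative.

```python
import numpy as np

def triplet_combinations(N):
    """Point-selection combinations s_1 .. s_{N-1} covering all N feature points:
    (1, 2, 3), (2, 3, 4), ..., (N-1, N, 1), using the 1-based numbering of the text."""
    combos = [(i, i + 1, i + 2) for i in range(1, N - 1)]
    combos.append((N - 1, N, 1))
    return combos

def stacked_observation(feature_pixels, constraint_fn, combos):
    """Assemble the augmented observation vector z of equation (17): all feature
    projections, followed by one constraint term M(i, j, k) per triplet.
    `constraint_fn` implements M from equations (14)-(16)."""
    z = [np.asarray(feature_pixels[i], dtype=float) for i in sorted(feature_pixels)]
    z += [np.atleast_1d(constraint_fn(i, j, k)) for (i, j, k) in combos]
    return np.concatenate(z)
```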
Step three: design the filter to estimate the relative state of the target.
Equation (11) is the state equation of a continuous-time system; it is first converted into a discrete model, on which the estimation is then performed:

$$X(k+1) = F\big(X(k)\big) + W(k) \qquad (18)$$

Taking equation (18) as the system state equation and equation (17) as the observation equation, the state is estimated with an iterated extended Kalman filter. The estimation steps are as follows:
Step 3.1: initialization: set the initial filter estimate $\hat X(0)$ and the error covariance matrix P(0);
Step 3.2: the state prediction is:

$$\hat X(k+1/k) = F\big(\hat X(k/k)\big) \qquad (19)$$

and the prediction error covariance matrix is:

$$P(k+1/k) = \Phi(k)\,P(k/k)\,\Phi(k)^T + Q(k) \qquad (20)$$

where $\Phi(k)$ is the Jacobian of F evaluated at $\hat X(k/k)$, Q(k) is the process noise covariance, P(k+1/k) is the prediction error covariance matrix, and P(k/k) is the filtering error covariance matrix at time k;
Step 3.3: state update:
Step 3.3.1: take the predicted value as the iteration initial value:

$$\hat X(k+1/k+1)_0 = \hat X(k+1/k) \qquad (21)$$

$$P(k+1/k+1)_0 = P(k+1/k) \qquad (22)$$

Step 3.3.2: in the iteration process, the i-th iteration is:

$$K(k+1)_i = P(k+1/k)\,H(k+1)_{i-1}^T\left[ H(k+1)_{i-1}\,P(k+1/k)\,H(k+1)_{i-1}^T + R_{k+1} \right]^{-1} \qquad (23)$$

where $H(k+1)_{i-1}$ is the Jacobian of the observation mapping h evaluated at $\hat X(k+1/k+1)_{i-1}$, $R_{k+1}$ is the measurement noise covariance, and $K(k+1)_i$ is the gain at the i-th iteration;

$$r(k+1)_i = z(k+1) - h\big(\hat X(k+1/k+1)_{i-1}\big) \qquad (24)$$

where $r(k+1)_i$ is the observation residual at the i-th iteration, obtained by substituting the (i-1)-th iterate into observation equation (17), and z(k+1) is the vector of all observations at time k+1;

$$\hat X(k+1/k+1)_i = \hat X(k+1/k) + K(k+1)_i\left[ r(k+1)_i - H(k+1)_{i-1}\big(\hat X(k+1/k) - \hat X(k+1/k+1)_{i-1}\big) \right] \qquad (25)$$

$$P(k+1/k+1)_i = \left[ I - K(k+1)_i\,H(k+1)_{i-1} \right] P(k+1/k+1)_{i-1} \qquad (26)$$
Step 3.3.3: the iteration terminates when the number of iterations reaches the maximum allowed or the state iteration error falls below a set threshold, the state iteration error being defined as:

$$e_i = \big\| \hat X(k+1/k+1)_i - \hat X(k+1/k+1)_{i-1} \big\|_2 \qquad (27)$$

where $\|\cdot\|_2$ denotes the vector two-norm. Assuming the iteration terminates at the n-th iteration, the iterate at that point is output as the final estimate:

$$\hat X(k+1/k+1) = \hat X(k+1/k+1)_n \qquad (28)$$

$$P(k+1/k+1) = P(k+1/k+1)_n \qquad (29)$$
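One filtering cycle of step three (equations (19) to (29)) can be sketched as a generic iterated extended Kalman filter; f_discrete, h_obs and the Jacobian callables stand for equations (18) and (17), and the fixed tolerance and iteration cap are illustrative assumptions.

```python
import numpy as np

def iekf_step(x, P, z, f_discrete, h_obs, jac_f, jac_h, Q, R,
              max_iter=10, tol=1e-6):
    """One cycle of the iterated EKF of step three: prediction, equations
    (19)-(20); iterated update, equations (21)-(26); termination on iteration
    count or state change, equations (27)-(29)."""
    # Step 3.2: prediction
    x_pred = f_discrete(x)                           # (19)
    Phi = jac_f(x)
    P_pred = Phi @ P @ Phi.T + Q                     # (20)

    # Step 3.3.1: initialize the iteration with the prediction
    x_i = x_pred.copy()                              # (21)
    P_i = P_pred.copy()                              # (22)

    for _ in range(max_iter):                        # Step 3.3.2
        H = jac_h(x_i)
        K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # (23)
        r = z - h_obs(x_i)                                        # (24)
        x_next = x_pred + K @ (r - H @ (x_pred - x_i))            # (25)
        P_i = (np.eye(len(x)) - K @ H) @ P_i                      # (26)
        if np.linalg.norm(x_next - x_i) < tol:       # (27): state iteration error
            x_i = x_next
            break
        x_i = x_next

    return x_i, P_i                                  # (28)-(29)
```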
Claims (1)
1. A monocular vision-based space non-cooperative target relative state estimation method, characterized by comprising the following steps:
Step one: establish a kinematics and dynamics model of the relative motion between the target and the service spacecraft.
The following coordinate systems are defined:
(1) Geocentric inertial coordinate system I: the origin is at the Earth's center of mass, the z-axis points to the Earth's north pole, the x-axis points to the vernal equinox, and the y-axis completes the right-handed triad;
(2) LVLH coordinate system H: the origin is at the service spacecraft centroid, the x-axis points from the Earth's center to the service spacecraft centroid, the z-axis is perpendicular to the orbital plane along the orbital angular momentum direction, and the y-axis completes the right-handed triad;
(3) Service spacecraft body coordinate system A: the origin is fixed at the service spacecraft centroid, and the coordinate axes coincide with the spacecraft's principal axes of inertia;
(4) Target body coordinate system B: the origin is fixed at the target centroid, the coordinate axes coincide with the target's principal axes of inertia, the principal axis of minimum inertia is taken as the x-axis, and the principal axis of maximum inertia as the y-axis;
(5) Camera coordinate system C: the origin is fixed at the camera optical center, the z-axis is along the camera optical axis, and the x- and y-axes are parallel to the imaging plane.
The relative attitude of the target body coordinate system B with respect to the service spacecraft body coordinate system A is expressed by the quaternion $q = [q_v^T \; q_4]^T$, where $q_v = [q_1 \; q_2 \; q_3]^T$ is the vector part and $q_4$ is the scalar part of the quaternion. The rotation matrix corresponding to the quaternion q is:

$$A(q) = (q_4^2 - q_v^T q_v)\,I_3 + 2\,q_v q_v^T - 2\,q_4\,[q_v\times] \qquad (1)$$

where $[q_v\times]$ is the skew-symmetric matrix of the vector $q_v = [q_1 \; q_2 \; q_3]^T$:

$$[q_v\times] = \begin{bmatrix} 0 & -q_3 & q_2 \\ q_3 & 0 & -q_1 \\ -q_2 & q_1 & 0 \end{bmatrix} \qquad (2)$$

The multiplication of two quaternions b and q is defined as:

$$b \otimes q = \begin{bmatrix} b_4\,q_v + q_4\,b_v - b_v \times q_v \\ b_4\,q_4 - b_v^T q_v \end{bmatrix} \qquad (3)$$
The angular velocities of the service spacecraft body coordinate system A and of the target body coordinate system B, expressed in their respective body coordinate systems, are denoted $\omega_s$ and $\omega_t$. The angular velocity $\omega$ of the target relative to the service spacecraft, expressed in the target body coordinate system B, is:

$$\omega = \omega_t - A(q)\,\omega_s \qquad (4)$$

The kinematic equation of the relative attitude is:

$$\dot q = \frac{1}{2}\begin{bmatrix} \omega \\ 0 \end{bmatrix} \otimes q \qquad (5)$$

where the service spacecraft angular velocity $\omega_s$ is measured by the inertial devices of the service spacecraft and is treated as a known quantity in the model equations;
The attitude dynamics equation of the target is:

$$J_t\,\dot\omega_t = -\,\omega_t \times (J_t\,\omega_t) + \tau_t \qquad (6)$$

where $\omega_t$ is the target angular velocity, $\tau_t$ is a noise term, and $J_t = \mathrm{diag}(J_{xx}, J_{yy}, J_{zz})$ is the inertia matrix of the target, $J_{xx}, J_{yy}, J_{zz}$ being the moments of inertia about the target principal x-, y- and z-axes. Let

$$k_1 = \frac{J_{yy} - J_{zz}}{J_{xx}}, \qquad k_2 = \frac{J_{zz} - J_{xx}}{J_{yy}}, \qquad k_3 = \frac{J_{xx} - J_{yy}}{J_{zz}} = -\,\frac{k_1 + k_2}{1 + k_1 k_2};$$

then equation (6) becomes:

$$\dot\omega_{tx} = k_1\,\omega_{ty}\,\omega_{tz} + \tau_x', \qquad \dot\omega_{ty} = k_2\,\omega_{tz}\,\omega_{tx} + \tau_y', \qquad \dot\omega_{tz} = k_3\,\omega_{tx}\,\omega_{ty} + \tau_z' \qquad (7)$$

The target moment-of-inertia ratios are constant and satisfy:

$$\dot k_1 = 0, \qquad \dot k_2 = 0 \qquad (8)$$

where $\dot k_1$ and $\dot k_2$ are the rates of change of $k_1$ and $k_2$ with respect to time;
Define $\rho_0$ as the position vector of the target centroid relative to the service spacecraft centroid, expressed in the LVLH coordinate system as $[x \; y \; z]^T$. Considering that the service spacecraft stays in close proximity to the target, $\rho_0$ is small and the relative position dynamics equation is:

$$\ddot x = 2\dot\nu\,\dot y + \ddot\nu\,y + \dot\nu^2\!\left(1 + \frac{2 r_c}{p_c}\right)x, \qquad \ddot y = -\,2\dot\nu\,\dot x - \ddot\nu\,x + \dot\nu^2\!\left(1 - \frac{r_c}{p_c}\right)y, \qquad \ddot z = -\,\dot\nu^2\,\frac{r_c}{p_c}\,z \qquad (9)$$

where $\dot\nu$ is the true anomaly rate of the service spacecraft orbit, $r_c$ is the service spacecraft orbit radius, $p_c$ is the semi-latus rectum, $\ddot x, \ddot y, \ddot z$ are the second time derivatives of x, y, z, and $\dot r_c$ is the first time derivative of $r_c$. The service spacecraft orbit parameters $\dot\nu$, $r_c$, $p_c$, $\dot r_c$ are obtained from the service spacecraft orbit determination system and are treated as known quantities in the model equations.
Define $p_1, \ldots, p_N$ as the position vectors of the N feature points on the target relative to the target centroid, expressed in the target body coordinate system. Under the rigid-body assumption for the target:

$$\dot p = 0 \qquad (10)$$

where $\dot p$ is the rate of change of $[p_1^T, \ldots, p_N^T]^T$ with respect to time. The continuous-time state equation is:

$$\dot X = f(X) + w(t) \qquad (11)$$

where f(X) is assembled from equations (5), (7), (8), (9) and (10), and w(t) is the system noise term;
Step two: establish the observation equation, adding the constraint relation equations among the target feature points.
The position vector $\rho_i$ of feature point i on the target, expressed in the camera coordinate system, is defined as:

$$\rho_i = C_{CA}\left( C_{AH}\,\rho_0 + A(q)^T p_i - d \right) \qquad (12)$$

where $C_{CA}$ is the rotation matrix from the service spacecraft body coordinate system to the camera coordinate system, a known quantity; d is the position vector of the origin of the camera coordinate system in the service spacecraft body coordinate system, a known quantity; $C_{AH}$ is the rotation matrix from the LVLH coordinate system to the service spacecraft body coordinate system, obtained from the spacecraft's own measurement devices; and $A(q)$, which depends on the state q to be estimated, is given by equation (1).
The image projection coordinates $[u_i \; v_i]^T$ of feature point $p_i$ are related to $\rho_i = [x_i \; y_i \; z_i]^T$ by:

$$u_i = f_x\,\frac{x_i}{z_i} + c_x, \qquad v_i = f_y\,\frac{y_i}{z_i} + c_y \qquad (13)$$

where $f_x, f_y, c_x, c_y$ are the intrinsic parameters of the camera used;
Further, considering feature points i, j and k, the relative feature $\rho_{ij} = \rho_j - \rho_i$ is defined, and $\rho_{jk}$ and $\rho_{ki}$ are defined in the same way. The projection coordinate of $\rho_{ij}$ in the image is $y_{ij}$, and the two satisfy relation (14). Since the vector sum satisfies

$$\rho_{ij} + \rho_{jk} + \rho_{ki} = 0 \qquad (15)$$

the feature points i, j and k must satisfy the corresponding constraint among their projections, denoted compactly as

$$M(i,j,k) = y_{i,j,k} \qquad (16)$$

The N feature points on the target are numbered 1, 2, …, N; constraint equations are formed in groups of three points and added to the observation equation. Let $s_1, \ldots, s_m$ be the point-selection combinations, covering all feature points to be estimated, with $s_1 = (1, 2, 3),\ s_2 = (2, 3, 4),\ \ldots,\ s_{N-1} = (N-1, N, 1)$.
The final observation equation is:

$$z = h(X) + v \qquad (17)$$

where z is the vector of all observed quantities and v is the observation noise. Replacing all components involving $\rho_i$ and $\rho_{ij}$ using equations (12) and (14) yields the mapping h from state to observation;
Step three: design the filter to estimate the relative state of the target.
Equation (11) is the state equation of a continuous-time system; it is first converted into a discrete model, on which the estimation is then performed:

$$X(k+1) = F\big(X(k)\big) + W(k) \qquad (18)$$

Taking equation (18) as the system state equation and equation (17) as the observation equation, the state is estimated with an iterated extended Kalman filter. The estimation steps are as follows:
Step 3.1: initialization: set the initial filter estimate $\hat X(0)$ and the error covariance matrix P(0);
Step 3.2: the state prediction is:

$$\hat X(k+1/k) = F\big(\hat X(k/k)\big) \qquad (19)$$

and the prediction error covariance matrix is:

$$P(k+1/k) = \Phi(k)\,P(k/k)\,\Phi(k)^T + Q(k) \qquad (20)$$

where $\Phi(k)$ is the Jacobian of F evaluated at $\hat X(k/k)$, Q(k) is the process noise covariance, P(k+1/k) is the prediction error covariance matrix, and P(k/k) is the filtering error covariance matrix at time k;
Step 3.3: state update:
Step 3.3.1: take the predicted value as the iteration initial value:

$$\hat X(k+1/k+1)_0 = \hat X(k+1/k) \qquad (21)$$

$$P(k+1/k+1)_0 = P(k+1/k) \qquad (22)$$

Step 3.3.2: in the iteration process, the i-th iteration is:

$$K(k+1)_i = P(k+1/k)\,H(k+1)_{i-1}^T\left[ H(k+1)_{i-1}\,P(k+1/k)\,H(k+1)_{i-1}^T + R_{k+1} \right]^{-1} \qquad (23)$$

where $H(k+1)_{i-1}$ is the Jacobian of the observation mapping h evaluated at $\hat X(k+1/k+1)_{i-1}$, $R_{k+1}$ is the measurement noise covariance, and $K(k+1)_i$ is the gain at the i-th iteration;

$$r(k+1)_i = z(k+1) - h\big(\hat X(k+1/k+1)_{i-1}\big) \qquad (24)$$

where $r(k+1)_i$ is the observation residual at the i-th iteration, obtained by substituting the (i-1)-th iterate into observation equation (17), and z(k+1) is the vector of all observations at time k+1;

$$\hat X(k+1/k+1)_i = \hat X(k+1/k) + K(k+1)_i\left[ r(k+1)_i - H(k+1)_{i-1}\big(\hat X(k+1/k) - \hat X(k+1/k+1)_{i-1}\big) \right] \qquad (25)$$

$$P(k+1/k+1)_i = \left[ I - K(k+1)_i\,H(k+1)_{i-1} \right] P(k+1/k+1)_{i-1} \qquad (26)$$
Step 3.3.3: the iteration terminates when the number of iterations reaches the maximum allowed or the state iteration error falls below a set threshold, the state iteration error being defined as:

$$e_i = \big\| \hat X(k+1/k+1)_i - \hat X(k+1/k+1)_{i-1} \big\|_2 \qquad (27)$$

where $\|\cdot\|_2$ denotes the vector two-norm. Assuming the iteration terminates at the n-th iteration, the iterate at that point is output as the final estimate:

$$\hat X(k+1/k+1) = \hat X(k+1/k+1)_n \qquad (28)$$

$$P(k+1/k+1) = P(k+1/k+1)_n \qquad (29)$$
Priority Applications (1)
- CN201910593684.3A (priority date 2019-07-03, filing date 2019-07-03): Monocular vision-based space non-cooperative target relative state estimation method
Publications (2)
- CN110186465A, published 2019-08-30
- CN110186465B, granted 2022-08-05
Patent Citations (7)
- CN104406598A (2015-03-11): Non-cooperative spacecraft attitude estimation method based on virtual sliding mode control
- CN106548475A (2017-03-29): Method for predicting the trajectory of a spinning space non-cooperative target
- CN107167145A (2017-09-15): Morphological parameter measurement method for adaptive non-contact inspection of an inert satellite
- CN107449402A (2017-12-08): Method for measuring the relative pose of a non-cooperative target
- CN107702709A (2018-02-16): Time-frequency domain hybrid identification method for non-cooperative target motion and inertial parameters
- CN108917772A (2018-11-30): Non-cooperative target relative navigation motion estimation method based on image sequences
- CN108804846A (2018-11-13): Data-driven attitude controller design method for a non-cooperative target assembly spacecraft

Non-Patent Citations (2)
- Dong Han et al., "Trajectory prediction of space robot for capturing non-cooperative target", 2017 18th International Conference on Advanced Robotics (ICAR), 2017, pp. 328-333.
- Song Liang et al., "Relative pose estimation of space debris", Journal of Astronautics, Vol. 36, No. 8, August 2015, pp. 906-915.
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant