CN109269511B - Curve matching visual navigation method for planet landing in unknown environment - Google Patents


Info

Publication number: CN109269511B (granted from application CN201811310255.2A)
Authority
CN
China
Prior art keywords
lander
landing
coordinate system
unknown
navigation
Prior art date
Legal status
Active
Application number
CN201811310255.2A
Other languages
Chinese (zh)
Other versions
CN109269511A (en)
Inventor
崔平远 (Cui Pingyuan)
高锡珍 (Gao Xizhen)
朱圣英 (Zhu Shengying)
刘阳 (Liu Yang)
徐瑞 (Xu Rui)
Current Assignee
Beijing Institute of Technology
Original Assignee
Beijing Institute of Technology
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Technology
Priority application: CN201811310255.2A
Publication of CN109269511A (application); application granted; publication of CN109269511B (grant)
Legal status: Active


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/24 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation


Abstract

The invention discloses a curve matching visual navigation method for planet landing in an unknown environment, belonging to the technical field of deep space exploration. The method is realized as follows: first, a lander kinematic model is established by combining inertial measurement information; then, a measurement model based on inter-frame curve matching is established using the sequence of images obtained during the descent of the lander; finally, the absolute motion state of the lander is estimated in real time with a Kalman filtering algorithm. Curve matching visual navigation for planet landing in an unknown environment is thereby realized, improving the precision and stability of the navigation system and ensuring accurate and safe landing of the lander. The method can estimate the absolute motion state of the lander without prior map information. The invention is suitable not only for planet landing tasks but also for small celestial body landing tasks.

Description

Curve matching visual navigation method for planet landing in unknown environment
Technical Field
The invention relates to a planet landing curve matching visual navigation method in an unknown environment, and belongs to the technical field of deep space exploration.
Background
Landing exploration and sample return are the main development directions of future deep space exploration. Future small celestial body and Mars exploration missions require the probe to land accurately at fixed points in areas of high scientific value. Because the target celestial body is far from the Earth and communication delays are long, the probe must have autonomous navigation capability. Meanwhile, uncertainties such as insufficient prior information about the target celestial body environment and environmental disturbances place higher requirements on the autonomous navigation system.
At present, navigation during the landing process relies mainly on IMU (inertial measurement unit) position recursion. However, this method cannot correct initial deviations, the IMU suffers from random drift and errors, and the accumulated error gradually grows over time, making it difficult to meet the requirement of high-precision navigation. In view of these drawbacks, autonomous visual navigation based on image information of celestial surface features has become a research focus of scholars worldwide. Such methods fall mainly into two categories: the first assumes the positions of landmark features on the celestial surface are known; the second does not. When no prior map information exists, that is, when the positions of landmark features on the celestial surface cannot be obtained, the first category is no longer applicable. Navigation methods based on unknown landmark feature positions are further divided, according to the image features used, into methods based on feature point matching and methods based on curve matching, but feature-point line-of-sight measurement information is susceptible to noise. In view of this, for the problem of estimating the motion state of the lander in an unknown environment, it is necessary to design a fast and effective visual navigation method to ensure accurate and safe landing of the lander.
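To see why the accumulated error of IMU-only position recursion "gradually grows over time", consider a toy dead-reckoning example (our own sketch, not part of the patent): a constant accelerometer bias, double-integrated, yields a position error of roughly 0.5·b·t², i.e. quadratic growth that is never corrected without external (e.g. visual) measurements.

```python
# Minimal illustration (not from the patent): drift of pure IMU dead reckoning.
# A constant accelerometer bias b, integrated twice over time t, produces a
# position error of about 0.5 * b * t**2.

def dead_reckoning_position_error(bias, dt, steps):
    """Integrate a constant accelerometer bias twice (Euler integration)."""
    v_err, p_err = 0.0, 0.0
    for _ in range(steps):
        v_err += bias * dt
        p_err += v_err * dt
    return p_err

# A 1e-2 m/s^2 bias over a 120 s descent sampled at 50 Hz:
err = dead_reckoning_position_error(bias=1e-2, dt=0.02, steps=6000)
print(round(err, 1))  # ~0.5 * 1e-2 * 120**2 = 72.0 m
```

The bias value is illustrative only; the point is the quadratic growth, which motivates fusing the IMU with visual measurements.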
Disclosure of Invention
In order to solve the problem of interplanetary landing autonomous navigation in an unknown environment, the invention aims to provide a curve matching visual navigation method for planetary landing in the unknown environment, which is used for estimating the absolute motion state of a lander in real time by combining inertial measurement information and utilizing a Kalman filtering algorithm, so that interplanetary landing autonomous navigation in the unknown environment is realized, and accurate and safe landing of the lander is ensured.
The purpose of the invention is realized by the following technical scheme.
The invention discloses a curve matching visual navigation method for planet landing in an unknown environment, which comprises the steps of firstly establishing a lander kinematic model by combining inertia measurement information, then establishing a measurement model based on interframe curve matching by using a sequence image obtained in the descending process of the lander, and finally estimating the absolute motion state of the lander in real time by a Kalman filtering algorithm, thereby realizing the curve matching visual navigation for the planet landing in the unknown environment, improving the precision of a navigation system and ensuring the stability of the navigation system.
The invention discloses a planet landing curve matching visual navigation method in an unknown environment, which comprises the following steps:
step 1: and establishing a lander kinematic model.
In order to describe the position and attitude of the lander in the air and the relative geometrical relationship between the lander and the visual characteristics of the target celestial surface and define the motion equation of the lander in the relevant reference system, the following relevant coordinate system is firstly introduced: a landing site coordinate system { L }, a navigation camera body coordinate system { C } and a lander body coordinate system { B }. The position and attitude parameters of the lander are both described in the landing site coordinate system. The landing device body coordinate system and the navigation camera coordinate system are superposed, namely the installation matrix of the optical navigation camera and the landing device is a unit matrix. And (3) not considering the planetary rotation influence, establishing a lander landing kinematic equation by utilizing IMU measurement information as follows:
$$
\begin{aligned}
{}^{L}\dot{r} &= {}^{L}v\\
{}^{L}\dot{v} &= {}^{L}a + {}^{L}g + n_g\\
\dot{q} &= \tfrac{1}{2}\,\Omega({}^{B}\omega)\,q\\
\dot{b}_a &= n_{wa}\\
\dot{b}_{\omega} &= n_{w\omega}
\end{aligned}\tag{1}
$$

where the inertial measurements, accelerometer output $a_{imu}$ and gyroscope output $\omega_{imu}$, are modeled as

$$
a_{imu} = C(q)\,{}^{L}a + b_a + n_a,\qquad
\omega_{imu} = {}^{B}\omega + b_{\omega} + n_{\omega}\tag{2}
$$

Here ${}^{L}r$ and ${}^{L}v$ are the position and velocity of the lander in the landing-site coordinate system; $q$ is the attitude quaternion and $C(q)$ is the transformation matrix from the landing-site coordinate system to the lander body coordinate system; ${}^{L}g$ is the gravitational acceleration and $n_g$ its disturbance; $b_a$ and $b_{\omega}$ are the accelerometer and gyroscope biases; $n_a$ and $n_{\omega}$ are the accelerometer and gyroscope measurement noises; $n_{wa}$ and $n_{w\omega}$ are the accelerometer and gyroscope bias-drift noises; ${}^{L}a$ is the acceleration produced by all non-gravitational forces acting on the lander; ${}^{B}\omega$ is the angular velocity of the lander relative to the landing-site frame, expressed in the lander body frame. For any angular velocity $\omega = [\omega_x\ \omega_y\ \omega_z]^{T}$, and a quaternion with scalar part last, $\Omega(\cdot)$ is defined as

$$
\Omega(\omega) = \begin{bmatrix}
0 & \omega_z & -\omega_y & \omega_x\\
-\omega_z & 0 & \omega_x & \omega_y\\
\omega_y & -\omega_x & 0 & \omega_z\\
-\omega_x & -\omega_y & -\omega_z & 0
\end{bmatrix}
$$
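One Euler integration step of the kinematic model above can be sketched as follows. This is our own minimal illustration, not the patent's implementation; the function names and the scalar-last quaternion convention are assumptions.

```python
import numpy as np

def omega(w):
    """4x4 Omega(w) for quaternion kinematics q_dot = 0.5*Omega(w)*q,
    with q = [q1 q2 q3 q4]^T and q4 the scalar part."""
    wx, wy, wz = w
    return np.array([[0.0,  wz, -wy,  wx],
                     [-wz, 0.0,  wx,  wy],
                     [ wy, -wx, 0.0,  wz],
                     [-wx, -wy, -wz, 0.0]])

def propagate(r, v, q, a_L, g_L, w_B, dt):
    """One Euler step of the lander kinematic equations in frame {L}."""
    r = r + v * dt
    v = v + (a_L + g_L) * dt            # non-gravitational accel + gravity
    q = q + 0.5 * omega(w_B) @ q * dt   # quaternion kinematics
    return r, v, q / np.linalg.norm(q)  # re-normalize the quaternion

# One step with zero angular rate leaves the attitude unchanged:
r, v, q = propagate(np.zeros(3), np.ones(3), np.array([0., 0., 0., 1.]),
                    a_L=np.zeros(3), g_L=np.zeros(3), w_B=np.zeros(3), dt=0.1)
print(np.allclose(q, [0, 0, 0, 1]))  # True
```

In a real filter the biases $b_a$, $b_\omega$ would be subtracted from the raw IMU outputs before this step, per the measurement model (2).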
Step 2: and establishing a measurement model based on inter-frame curve matching for updating the motion state of the lander.
The landing area is approximately planar, and the meteorite crater is represented as the landing site coordinate system { L }
Wherein
Figure BDA0001854747160000033
Is any point on the edge of the meteorite crater under the coordinate system of the landing site.
Adopting pinhole imaging model to land any point on the planeLx=[Lx Ly Lz]TThe image point u in the ith descending image is [ u v ]]TIs composed of
Figure BDA0001854747160000034
Where a is a non-zero constant, where,
Figure BDA0001854747160000035
f is the focal length of the camera, Ri=C(qi)。
Since the landing area is approximately planar, thenLz is 0, and formula (4) is written as
Figure BDA0001854747160000036
Wherein
Figure BDA0001854747160000038
Wherein
Figure BDA0001854747160000039
Representing the lander position component in the landing site coordinate system.
Merle crater is represented in the ith descending image as
Figure BDA0001854747160000041
Then the meteorite crater curve E is obtained from the formula (3), the formula (5) and the formula (8)iIs composed of
Figure BDA0001854747160000042
Measurement of the jth Merle crater in the ith descending image
Figure BDA0001854747160000043
Is shown as
Figure BDA0001854747160000044
WhereinIs the parameter of the meteorite crater edge curve,
Figure BDA0001854747160000046
to measure noise, vech (·) represents a vectorized version of a symmetric matrix, vec (·) represents a vectorized version of an arbitrary matrix, matrix H is a transition matrix between vech (·) and vec (·),
Figure BDA0001854747160000047
since the meteorite absolute position information Q is unknown, equation (10) cannot be used directly for state estimation.
Meteorite crater Q is observed in two continuous descending images, and the lander is at t1And t2The meteorite crater image curves observed at the moment are respectively
Figure BDA0001854747160000049
Is obtained by formula (13) and formula (14)
Figure BDA00018547471600000411
Meteorite crater at t2Time measurement model
Figure BDA00018547471600000412
Is composed of
Wherein
Figure BDA00018547471600000414
In order to measure the noise, it is,
Figure BDA0001854747160000051
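The curve-transfer step that eliminates the unknown conic $Q$ can be checked numerically. The sketch below is our own illustration (not the patent's code), with random stand-in matrices playing the role of the homographies $H_1$, $H_2$: it builds two image conics from one plane conic via (9) and verifies that the second is recovered from the first alone via (15).

```python
import numpy as np

# Numerical check of conic transfer: if E_i = H_i^{-T} Q H_i^{-1}, then
# E2 = H2^{-T} H1^{T} E1 H1 H2^{-1}, with the plane conic Q eliminated.

rng = np.random.default_rng(0)

Q = np.diag([1.0, 1.0, -25.0])     # circle of radius 5 on the landing plane
H1 = rng.normal(size=(3, 3))       # stand-in for A*R_1*[I | -r_1] (invertible w.h.p.)
H2 = rng.normal(size=(3, 3))       # stand-in for A*R_2*[I | -r_2]

E1 = np.linalg.inv(H1).T @ Q @ np.linalg.inv(H1)
E2 = np.linalg.inv(H2).T @ Q @ np.linalg.inv(H2)

# Transfer E1 to the second view without ever touching Q:
E2_from_E1 = np.linalg.inv(H2).T @ H1.T @ E1 @ H1 @ np.linalg.inv(H2)

print(np.allclose(E2, E2_from_E1))  # True
```

With the unit scale used here the equality is exact; in general the relation holds only up to the non-zero scale factor $a$, which is why conic measurements are compared as projective (scale-free) quantities.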
and step 3: and (3) by combining the lander kinematic model established in the step (1) and the measurement model based on the unknown curve characteristics established in the step (2), estimating the absolute motion state of the lander in real time by using a Kalman filtering algorithm, realizing interplanetary landing autonomous navigation in an unknown environment, and ensuring accurate and safe landing of the lander.
In order to solve the problem of nonlinearity of the unknown curve characteristic measurement model established in the step 2, the unscented Kalman filtering algorithm is preferably selected as the Kalman filtering algorithm in the step 3.
When the unscented kalman filter algorithm is selected in step 3, the specific implementation method in step 3 is as follows:
Step 3.1: the lander state equation is obtained from the lander kinematic model established with the inertial measurement information in step 1:

$$ \dot{x} = f(x) + w \tag{18} $$

where $\dot{x}$ denotes the differential form of the state $x = [\,{}^{L}r^{T}\ \ {}^{L}v^{T}\ \ q^{T}\ \ b_a^{T}\ \ b_{\omega}^{T}\ \ {}^{L}r_c^{T}\ \ q_c^{T}\,]^{T}$, in which ${}^{L}r_c$ and $q_c$ are the lander position and attitude quaternion at the previous imaging instant; $w$ is the state noise and $Q_k = E[ww^{T}]$ is the state noise covariance matrix.
Step 3.2: based on the measurement model based on the unknown curve characteristics established in the step 2, the measurement model is obtained as
Figure BDA0001854747160000055
Is composed of
zk=h(vech(Ek))+vk (19)
Wherein k is 1,2,3, …,
Figure BDA0001854747160000056
vkto measure noise, Rk=E{vk(vk)TAnd is the measurement noise covariance matrix.
Step 3.3: and estimating the absolute motion state of the lander in real time by using an unscented Kalman filtering algorithm, realizing interstellar landing autonomous navigation in an unknown environment, and ensuring accurate and safe landing of the lander.
Step 3.3.1: initializing lander motion state
Figure BDA0001854747160000061
Wherein x0And
Figure BDA0001854747160000062
respectively representing the initial state of the lander and its mean value, P0Representing the lander initial state covariance matrix.
Step 3.3.2: calculating sigma sampling point of lander motion state
Figure BDA0001854747160000063
Step 3.3.3: and (4) establishing a lander motion state time propagation equation by using the formula (18).
Figure BDA0001854747160000065
Step 3.3.4: and establishing a lander motion state measurement updating equation by using the formula (19).
Figure BDA0001854747160000066
In the above formula n represents the state dimension,
Figure BDA0001854747160000071
therein 10-4≤α≤1,κ=3-n,β=2
Step 3.3.5: the real-time absolute motion state of the lander is obtained by using the formulas (22) and (23)Andnamely, the interplanetary landing autonomous navigation under the unknown environment is realized, and the accurate and safe landing of the lander is ensured.
Advantageous effects:
1. The curve matching visual navigation method for planet landing in an unknown environment disclosed by the invention provides the lander with measurement information through inter-frame curve matching, avoiding the susceptibility of feature-point line-of-sight measurements to noise.
2. The invention discloses a planet landing curve matching visual navigation method in an unknown environment, which utilizes inter-frame curve matching as a measurement model, so that the absolute motion state of a lander can be estimated without prior map information, and the stability of a navigation system is improved.
3. Because the surfaces of the planet and the small celestial body have curve characteristics, the curve matching visual navigation method for the planet landing in the unknown environment is not only suitable for the planet landing task, but also suitable for the small celestial body landing task.
Drawings
FIG. 1 is a flow chart of a planetary landing curve matching visual navigation method in an unknown environment;
FIG. 2 shows the lander position estimation error and its 3 σ filter standard deviation;
FIG. 3 is a lander velocity estimation error and its 3 σ filter standard deviation;
FIG. 4 shows the lander attitude estimation error and its 3 σ filter standard deviation.
Detailed Description
For a better understanding of the objects and advantages of the present invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings and examples.
As shown in fig. 1, the method for visual navigation by planet landing curve matching in unknown environment disclosed in this example includes the following specific steps:
step 1: and establishing a lander kinematic model.
In order to describe the position and attitude of the lander in the air and the relative geometrical relationship between the lander and the visual characteristics of the target celestial surface and define the motion equation of the lander in the relevant reference system, the following relevant coordinate system is firstly introduced: a landing site coordinate system { L }, a navigation camera body coordinate system { C } and a lander body coordinate system { B }. The position and attitude parameters of the lander are both described in the landing site coordinate system. The landing device body coordinate system and the navigation camera coordinate system are superposed, namely the installation matrix of the optical navigation camera and the landing device is a unit matrix. And (3) not considering the planetary rotation influence, establishing a lander landing kinematic equation by utilizing IMU measurement information as follows:
$$
\begin{aligned}
{}^{L}\dot{r} &= {}^{L}v\\
{}^{L}\dot{v} &= {}^{L}a + {}^{L}g + n_g\\
\dot{q} &= \tfrac{1}{2}\,\Omega({}^{B}\omega)\,q\\
\dot{b}_a &= n_{wa}\\
\dot{b}_{\omega} &= n_{w\omega}
\end{aligned}\tag{25}
$$

where the inertial measurements, accelerometer output $a_{imu}$ and gyroscope output $\omega_{imu}$, are modeled as

$$
a_{imu} = C(q)\,{}^{L}a + b_a + n_a,\qquad
\omega_{imu} = {}^{B}\omega + b_{\omega} + n_{\omega}\tag{26}
$$

Here ${}^{L}r$ and ${}^{L}v$ are the position and velocity of the lander in the landing-site coordinate system; $q$ is the attitude quaternion and $C(q)$ is the transformation matrix from the landing-site coordinate system to the lander body coordinate system; ${}^{L}g$ is the gravitational acceleration and $n_g$ its disturbance; $b_a$ and $b_{\omega}$ are the accelerometer and gyroscope biases; $n_a$ and $n_{\omega}$ are the accelerometer and gyroscope measurement noises; $n_{wa}$ and $n_{w\omega}$ are the accelerometer and gyroscope bias-drift noises; ${}^{L}a$ is the acceleration produced by all non-gravitational forces acting on the lander; ${}^{B}\omega$ is the angular velocity of the lander relative to the landing-site frame, expressed in the lander body frame. For any angular velocity $\omega = [\omega_x\ \omega_y\ \omega_z]^{T}$, and a quaternion with scalar part last, $\Omega(\cdot)$ is defined as

$$
\Omega(\omega) = \begin{bmatrix}
0 & \omega_z & -\omega_y & \omega_x\\
-\omega_z & 0 & \omega_x & \omega_y\\
\omega_y & -\omega_x & 0 & \omega_z\\
-\omega_x & -\omega_y & -\omega_z & 0
\end{bmatrix}
$$
Step 2: and establishing a measurement model based on inter-frame curve matching for updating the motion state of the lander.
The landing area is approximately planar, and the meteorite crater is represented as the landing site coordinate system { L }
Figure BDA0001854747160000092
Wherein
Figure BDA0001854747160000093
1]TIs any point on the edge of the meteorite crater under the coordinate system of the landing site.
Adopting pinhole imaging model to land any point on the planeLx=[Lx Ly Lz]TThe image point u in the ith descending image is [ u v ]]TIs composed of
Figure BDA0001854747160000094
Where a is a non-zero constant, where,f is the focal length of the camera, Ri=C(qi)。
Since the landing area is approximately planar, thenLz is 0, and formula (28) is written as
Figure BDA0001854747160000096
Wherein
Figure BDA0001854747160000097
Figure BDA0001854747160000098
Wherein
Figure BDA0001854747160000099
Indicating landing sitesA lander position component under the coordinate system.
Merle crater is represented in the ith descending image as
Figure BDA00018547471600000910
Then the meteorite crater curve E is obtained from the formula (27), the formula (29) and the formula (32)iIs composed of
Thus measurement of a meteorite crater in the ith descending image
Figure BDA0001854747160000101
Is shown as
Figure BDA0001854747160000102
Wherein
Figure BDA0001854747160000103
Is the parameter of the meteorite crater edge curve,
Figure BDA0001854747160000104
to measure noise, vech (·) represents a vectorized version of a symmetric matrix, vec (·) represents a vectorized version of an arbitrary matrix, matrix H is a transition matrix between vech (·) and vec (·),
Figure BDA0001854747160000105
Figure BDA0001854747160000106
since the meteorite absolute position information Q is unknown, equation (34) cannot be used directly for state estimation.
Meteorite crater Q is observed in two continuous descending images, and the lander is at t1And t2Time of dayObserved meteorite crater image curves are respectively
Figure BDA0001854747160000107
Figure BDA0001854747160000108
Is obtained by formula (37) and formula (38)
Figure BDA0001854747160000109
Meteorite crater at t2Time measurement model
Figure BDA00018547471600001010
Is composed of
Figure BDA00018547471600001011
Wherein
Figure BDA00018547471600001012
In order to measure the noise, it is,
Figure BDA00018547471600001013
and step 3: and (3) by combining the lander kinematic model established in the step (1) and the measurement model based on the unknown curve characteristics established in the step (2), estimating the absolute motion state of the lander in real time by using a Kalman filtering algorithm, realizing interplanetary landing autonomous navigation in an unknown environment, and ensuring accurate and safe landing of the lander.
In order to solve the problem of nonlinearity of the measurement model based on the unknown curve characteristics established in the step 2, the unscented kalman filter algorithm is preferred as the kalman filter algorithm in the step 3.
When the unscented kalman filter algorithm is selected in step 3, the specific implementation method in step 3 is as follows:
Step 3.1: the lander state equation is obtained from the lander kinematic model established with the inertial measurement information in step 1:

$$ \dot{x} = f(x) + w \tag{42} $$

where $\dot{x}$ denotes the differential form of the state, ${}^{L}r_c$ and $q_c$ are the lander position and attitude quaternion at the previous imaging instant, $w$ is the state noise, and $Q_k = E[ww^{T}]$ is the state noise covariance matrix.
Step 3.2: based on the measurement model based on the unknown curve characteristics established in the step 2, the measurement model is obtained as
Figure BDA0001854747160000114
Is composed of
zk=h(vech(Ek))+vk (43)
Wherein k is 1,2,3, …,
Figure BDA0001854747160000115
vkto measure noise, Rk=E{vk(vk)TAnd is the measurement noise covariance matrix.
Step 3.3: and estimating the absolute motion state of the lander in real time by using an unscented Kalman filtering algorithm, realizing interstellar landing autonomous navigation in an unknown environment, and ensuring accurate and safe landing of the lander.
Step 3.3.1: initializing lander motion state
Figure BDA0001854747160000116
Wherein x0Andrespectively representing the initial state of the lander and its mean value, P0Representing the lander initial state covariance matrix.
Step 3.3.2: calculating sigma sampling point of lander motion state
Figure BDA0001854747160000121
Figure BDA0001854747160000122
Step 3.3.3: and establishing a lander motion state time propagation equation by using the formula (42).
Figure BDA0001854747160000123
Step 3.3.4: and establishing a lander motion state measurement updating equation by using the formula (43).
Wherein n represents a state dimension, n is 23,
Figure BDA0001854747160000125
therein 10-4≤α≤1,κ=3-n,β=2
Step 3.3.5: the real-time absolute motion state of the lander is obtained by using the formulas (46) and (47)
Figure BDA0001854747160000126
Andthe planet landing autonomous navigation under the unknown environment is realized, and the accurate and safe landing of the lander is ensured.
In a Matlab environment, mathematical simulation verification was carried out with Mars landing exploration as the background. The simulation ends when the lander reaches a position 100 m above the landing site; the landing time is 120 s. The navigation camera has a 45° field of view and a 14.6 mm focal length, with measurement noise of 1 pixel. The IMU is an LN-200 with a sampling frequency of 50 Hz. The initial state of the lander is shown in Table 1; the initial position error in each direction is 500 m, the initial velocity error in each direction is 1 m/s, and the initial attitude error about each axis is 1°. The process noise covariance Q is
Q = diag([2.4×10⁻¹³ I, 2.4×10⁻¹³ I, 2.5×10⁻⁷ I, 1.2×10⁻⁷ I, 1.2×10⁻⁸ I])
TABLE 1 simulation parameters
The above detailed description is intended to illustrate the objects, aspects and advantages of the present invention, and it should be understood that the above detailed description is only exemplary of the present invention and is not intended to limit the scope of the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (4)

1. A curve matching visual navigation method for planet landing in an unknown environment, characterized by comprising the following steps:
step 1: establishing a lander kinematic model;
step 2: establishing a measurement model based on interframe curve matching for updating the motion state of the lander;
step 3: combining the lander kinematic model established in step 1 and the measurement model based on unknown curve features established in step 2, estimating the absolute motion state of the lander in real time with a Kalman filtering algorithm, realizing interplanetary landing autonomous navigation in an unknown environment, and ensuring accurate and safe landing of the lander;
the specific implementation method of the step 1 is that,
to describe the position and attitude of the lander in flight and the relative geometric relationship between the lander and the visual features on the target celestial surface, and to define the equations of motion of the lander in the relevant reference frames, the following coordinate systems are first introduced: the landing-site coordinate system { L }, the navigation camera body coordinate system { C } and the lander body coordinate system { B }; the position and attitude parameters of the lander are both described in the landing-site coordinate system; the lander body coordinate system and the navigation camera coordinate system are taken to coincide, i.e. the installation matrix between the optical navigation camera and the lander is the identity matrix; neglecting the influence of planetary rotation, the lander landing kinematic equations are established using the IMU measurement information as follows:
$$
\begin{aligned}
{}^{L}\dot{r} &= {}^{L}v\\
{}^{L}\dot{v} &= {}^{L}a + {}^{L}g + n_g\\
\dot{q} &= \tfrac{1}{2}\,\Omega({}^{B}\omega)\,q\\
\dot{b}_a &= n_{wa}\\
\dot{b}_{\omega} &= n_{w\omega}
\end{aligned}\tag{1}
$$

where the inertial measurements, accelerometer output $a_{imu}$ and gyroscope output $\omega_{imu}$, are modeled as

$$
a_{imu} = C(q)\,{}^{L}a + b_a + n_a,\qquad
\omega_{imu} = {}^{B}\omega + b_{\omega} + n_{\omega}\tag{2}
$$

wherein ${}^{L}r$ and ${}^{L}v$ are the position and velocity of the lander in the landing-site coordinate system; $q$ is the attitude quaternion and $C(q)$ is the transformation matrix from the landing-site coordinate system to the lander body coordinate system; ${}^{L}g$ is the gravitational acceleration and $n_g$ its disturbance; $b_a$ and $b_{\omega}$ are the accelerometer and gyroscope biases; $n_a$ and $n_{\omega}$ are the accelerometer and gyroscope measurement noises; $n_{wa}$ and $n_{w\omega}$ are the accelerometer and gyroscope bias-drift noises; ${}^{L}a$ is the acceleration produced by all non-gravitational forces acting on the lander; ${}^{B}\omega$ is the angular velocity of the lander relative to the landing-site frame, expressed in the lander body frame; for any angular velocity $\omega = [\omega_x\ \omega_y\ \omega_z]^{T}$, and a quaternion with scalar part last, $\Omega(\cdot)$ is defined as

$$
\Omega(\omega) = \begin{bmatrix}
0 & \omega_z & -\omega_y & \omega_x\\
-\omega_z & 0 & \omega_x & \omega_y\\
\omega_y & -\omega_x & 0 & \omega_z\\
-\omega_x & -\omega_y & -\omega_z & 0
\end{bmatrix}
$$
The specific implementation method of the step 2 is that,
the landing area is approximately planar, so a meteorite crater rim is represented in the landing-site coordinate system { L } as the conic

$$ {}^{L}\bar{x}^{T}\,Q\,{}^{L}\bar{x} = 0 \tag{3} $$

where ${}^{L}\bar{x} = [\,{}^{L}x\ \ {}^{L}y\ \ 1\,]^{T}$ is any point on the crater rim in the landing-site coordinate system and $Q$ is the symmetric conic coefficient matrix of the crater;

adopting a pinhole imaging model, an arbitrary point ${}^{L}x = [\,{}^{L}x\ \ {}^{L}y\ \ {}^{L}z\,]^{T}$ on the landing plane maps to the image point $u = [\,u\ \ v\,]^{T}$ in the $i$-th descent image through

$$ a\,\bar{u} = A\,R_i\left({}^{L}x - {}^{L}r_i\right) \tag{4} $$

where $a$ is a non-zero constant, $\bar{u} = [\,u\ \ v\ \ 1\,]^{T}$, $A = \mathrm{diag}(f,\,f,\,1)$, $f$ is the focal length of the camera, and $R_i = C(q_i)$;

since the landing area is approximately planar, ${}^{L}z = 0$, and formula (4) is written as

$$ a\,\bar{u} = H_i\,{}^{L}\bar{x} \tag{5} $$

where

$$ H_i = A\,R_i\begin{bmatrix} 1 & 0 & -{}^{L}r_i^{x}\\ 0 & 1 & -{}^{L}r_i^{y}\\ 0 & 0 & -{}^{L}r_i^{z} \end{bmatrix} $$

and ${}^{L}r_i^{x},\,{}^{L}r_i^{y},\,{}^{L}r_i^{z}$ are the lander position components in the landing-site coordinate system;

the crater is represented in the $i$-th descent image as

$$ \bar{u}^{T} E_i\,\bar{u} = 0 \tag{8} $$

then from formula (3), formula (5) and formula (8), the crater curve $E_i$ is

$$ E_i = a\,H_i^{-T}\,Q\,H_i^{-1} \tag{9} $$

the measurement of the $j$-th crater in the $i$-th descent image, $z_i^{j}$, is expressed as

$$ z_i^{j} = \mathrm{vech}\!\left(E_i^{j}\right) + v_i^{j} \tag{10} $$

where $\mathrm{vech}(E_i^{j})$ contains the parameters of the crater edge curve and $v_i^{j}$ is the measurement noise; $\mathrm{vech}(\cdot)$ denotes the vectorized form of a symmetric matrix, $\mathrm{vec}(\cdot)$ the vectorized form of an arbitrary matrix, and the matrix $H$ is the transition matrix between $\mathrm{vech}(\cdot)$ and $\mathrm{vec}(\cdot)$, i.e. $\mathrm{vec}(E) = H\,\mathrm{vech}(E)$;

since the absolute position information $Q$ of the crater is unknown, equation (10) cannot be used directly for state estimation; the same crater $Q$ is therefore observed in two consecutive descent images, and the crater image curves observed by the lander at times $t_1$ and $t_2$ are, respectively,

$$ \bar{u}^{T} E_1\,\bar{u} = 0 \tag{13} $$

$$ \bar{u}^{T} E_2\,\bar{u} = 0 \tag{14} $$

from formula (13) and formula (14), eliminating $Q$ through formula (9) gives

$$ E_2 = a\,H_2^{-T} H_1^{T}\,E_1\,H_1 H_2^{-1} \tag{15} $$

so the crater measurement model at time $t_2$, $z_2$, is

$$ z_2 = \mathrm{vech}\!\left(a\,H_2^{-T} H_1^{T}\,E_1\,H_1 H_2^{-1}\right) + v_2 \tag{16} $$

where $v_2$ is the measurement noise;
2. the method for curve matching visual navigation of planetary landing in unknown environments as claimed in claim 1, wherein: in order to solve the problem of nonlinearity of the unknown curve characteristic measurement model established in the step 2, the unscented Kalman filtering algorithm is selected as the Kalman filtering algorithm in the step 3.
3. The method for curve matching visual navigation of planetary landing in unknown environments as claimed in claim 2, wherein: when the unscented kalman filter algorithm is selected in step 3, the specific implementation method of step 3 is as follows,
step 3.1: the lander state equation is obtained from the lander kinematic model established with the inertial measurement information in step 1:

$$ \dot{x} = f(x) + w \tag{18} $$

where $\dot{x}$ denotes the differential form of the state, ${}^{L}r_c$ and $q_c$ are the lander position and attitude quaternion at the previous imaging instant, $w$ is the state noise, and $Q_k = E[ww^{T}]$ is the state noise covariance matrix;
step 3.2: from the measurement model based on unknown curve features established in step 2, the measurement equation for the $k$-th observation, $z_k$, is obtained as

$$ z_k = h\!\left(\mathrm{vech}(E_k)\right) + v_k \tag{19} $$

where $k = 1, 2, 3, \dots$; $v_k$ is the measurement noise and $R_k = E\{v_k (v_k)^{T}\}$ is the measurement noise covariance matrix;
step 3.3: estimate the absolute motion state of the lander in real time using the unscented Kalman filter algorithm, thereby realizing autonomous interplanetary landing navigation in an unknown environment and ensuring an accurate and safe landing of the lander.
4. The curve matching visual navigation method for planet landing in an unknown environment according to claim 3, wherein step 3.3 is implemented as follows,
step 3.3.1: initialize the lander motion state
[equation shown only as an image in the original]
where x0 and its mean [symbol shown as an image in the original] denote the initial state of the lander and its mean value, respectively, and P0 denotes the initial state covariance matrix of the lander;
step 3.3.2: calculate the sigma sampling points of the lander motion state
[equations shown only as images in the original]
step 3.3.3: establish the time propagation equation of the lander motion state using equation (18);
[equation shown only as an image in the original]
step 3.3.4: establish the measurement update equation of the lander motion state using equation (19);
[equation shown only as an image in the original]
where n denotes the state dimension, 10^-4 ≤ α ≤ 1, κ = 3 − n, and β = 2;
step 3.3.5: obtain the real-time absolute motion state of the lander [state estimate and covariance, shown as images in the original] using equations (22) and (23), thereby realizing autonomous interplanetary landing navigation in an unknown environment and ensuring an accurate and safe landing of the lander.
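The filtering loop of claims 3 and 4 — initialization, sigma sampling, time propagation, and measurement update — follows the standard unscented Kalman filter. Since the step equations appear only as images in the original, the sketch below is a generic textbook UKF under the stated parameter choices (10^-4 ≤ α ≤ 1, κ = 3 − n, β = 2); the dynamics f and measurement h are placeholders standing in for equations (18) and (19):

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=None):
    """2n+1 sigma points and their mean/covariance weights (kappa = 3 - n by default)."""
    n = x.size
    if kappa is None:
        kappa = 3.0 - n
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)    # columns scale the deviations
    pts = np.vstack([x, x + S.T, x - S.T])   # rows are sigma points
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def ukf_step(x, P, z, f, h, Q, R):
    """One predict/update cycle: propagate through f, then update with measurement z via h."""
    pts, wm, wc = sigma_points(x, P)
    # Time propagation (equation-(18)-style dynamics)
    Xp = np.array([f(p) for p in pts])
    x_pred = wm @ Xp
    P_pred = Q + sum(w * np.outer(d, d) for w, d in zip(wc, Xp - x_pred))
    # Measurement update (equation-(19)-style model)
    pts, wm, wc = sigma_points(x_pred, P_pred)
    Zp = np.array([h(p) for p in pts])
    z_pred = wm @ Zp
    Pzz = R + sum(w * np.outer(d, d) for w, d in zip(wc, Zp - z_pred))
    Pxz = sum(w * np.outer(dx, dz)
              for w, dx, dz in zip(wc, pts - x_pred, Zp - z_pred))
    K = Pxz @ np.linalg.inv(Pzz)
    return x_pred + K @ (z - z_pred), P_pred - K @ Pzz @ K.T
```

Calling `ukf_step` once per descent image, with x holding the lander position and attitude state, yields the real-time state estimate described in step 3.3.5.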
CN201811310255.2A 2018-11-06 2018-11-06 Curve matching visual navigation method for planet landing in unknown environment Active CN109269511B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811310255.2A CN109269511B (en) 2018-11-06 2018-11-06 Curve matching visual navigation method for planet landing in unknown environment


Publications (2)

Publication Number Publication Date
CN109269511A (en) 2019-01-25
CN109269511B true CN109269511B (en) 2020-01-07

Family

ID=65192856

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811310255.2A Active CN109269511B (en) 2018-11-06 2018-11-06 Curve matching visual navigation method for planet landing in unknown environment

Country Status (1)

Country Link
CN (1) CN109269511B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110702122B (en) * 2019-10-22 2021-03-30 北京理工大学 Comprehensive optimization method for autonomous optical navigation characteristics of extraterrestrial celestial body landing
CN111652896B (en) * 2020-05-29 2023-06-23 北京理工大学 Method for detecting coarse-fine meteorite crater by inertial navigation assistance
CN112066999B (en) * 2020-09-16 2022-08-12 北京控制工程研究所 Method and device for determining gravity direction in real time in planet landing process
CN113022898B (en) * 2021-02-18 2022-05-17 北京理工大学 State estimation method for flexible attachment system in weak gravity environment
CN114485678B (en) * 2021-12-31 2023-09-12 上海航天控制技术研究所 Navigation method for land, ground and lunar landing
CN114577205B (en) * 2022-02-10 2023-06-06 北京空间飞行器总体设计部 Satellite soft landing autonomous navigation landmark optimization method based on sequence images

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3018383B1 (en) * 2014-03-07 2017-09-08 Airbus Operations Sas METHOD AND DEVICE FOR DETERMINING NAVIGATION PARAMETERS OF AN AIRCRAFT DURING A LANDING PHASE
CN105371853A (en) * 2014-08-06 2016-03-02 北京理工大学 Mars power descending section navigation method based on TDS and orbiter
CN106096621B (en) * 2016-06-02 2019-05-21 西安科技大学 Based on vector constraint drop position detection random character point choosing method
CN107144278B (en) * 2017-04-24 2020-02-14 北京理工大学 Lander visual navigation method based on multi-source characteristics
CN108036785A (en) * 2017-11-24 2018-05-15 浙江大学 A kind of aircraft position and orientation estimation method based on direct method and inertial navigation fusion


Similar Documents

Publication Publication Date Title
CN109269511B (en) Curve matching visual navigation method for planet landing in unknown environment
CN108731670B (en) Inertial/visual odometer integrated navigation positioning method based on measurement model optimization
CN111947652B (en) Inertia/vision/astronomy/laser ranging combined navigation method suitable for lunar lander
CN106780699B (en) Visual SLAM method based on SINS/GPS and odometer assistance
CN100587641C (en) A kind of attitude determination system that is applicable to the arbitrary motion mini system
Mourikis et al. Vision-aided inertial navigation for spacecraft entry, descent, and landing
CN109931955B (en) Initial alignment method of strap-down inertial navigation system based on state-dependent lie group filtering
CN110095116A (en) A kind of localization method of vision positioning and inertial navigation combination based on LIFT
CN110702122B (en) Comprehensive optimization method for autonomous optical navigation characteristics of extraterrestrial celestial body landing
CN107144278B (en) Lander visual navigation method based on multi-source characteristics
CN104457705B (en) Deep space target celestial body based on the autonomous optical observation of space-based just orbit determination method
CN106840151B (en) Model-free deformation of hull measurement method based on delay compensation
Mostafa et al. A novel GPS/RAVO/MEMS-INS smartphone-sensor-integrated method to enhance USV navigation systems during GPS outages
CN103335654B (en) A kind of autonomous navigation method of planetary power descending branch
CN106672265B (en) A kind of small feature loss accuracy Guidance and control method based on Optic flow information
CN113551668B (en) Spacecraft inertia/star starlight vector/starlight refraction combined navigation method
CN103438890B (en) Based on the planetary power descending branch air navigation aid of TDS and image measurement
CN109612438B (en) Method for determining initial orbit of space target under constraint of virtual coplanar condition
Xu et al. Landmark-based autonomous navigation for pinpoint planetary landing
CN114690229A (en) GPS-fused mobile robot visual inertial navigation method
CN106352897A (en) Silicon MEMS (micro-electromechanical system) gyroscope error estimating and correcting method based on monocular visual sensor
Gu et al. Optical/radio/pulsars integrated navigation for Mars orbiter
CN111207773A (en) Attitude unconstrained optimization solving method for bionic polarized light navigation
Xiaolin et al. A tightly coupled rotational SINS/CNS integrated navigation method for aircraft
CN108871319B (en) Attitude calculation method based on earth gravity field and earth magnetic field sequential correction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant