CN107132542A - Autonomous navigation method for soft landing on a small celestial body based on optics and Doppler radar - Google Patents
Legal status: Granted
Classifications
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01C11/00 — Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
- G01C21/005 — Navigation; navigational instruments with correlation of navigation data from several sources, e.g. map or contour matching
Abstract
The present invention discloses an autonomous navigation method for soft landing on a small celestial body based on optics and Doppler radar, and belongs to the field of deep-space exploration. The method is implemented as follows: establish the dynamics model of the small-body soft-landing probe, establish the standard gravitational field model of the small body, and linearize the dynamics model; establish the autonomous navigation measurement model, which augments autonomous optical navigation with Doppler radar ranging and velocity measurements — the Doppler radar emits radar beams and measures the relative distance and relative velocity along each beam direction to the surface of the small body, yielding the probe's real-time position and velocity; finally, using the landing dynamics model and the measurement model, solve for the probe's real-time navigation state with a nonlinear filtering algorithm. The invention improves the estimation accuracy and filter convergence rate of autonomous navigation for small-body soft landing, achieves fast and accurate estimation of the probe state, and provides support for precise navigation in small-body soft-landing missions.
Description
Technical field
The present invention relates to an autonomous navigation method for soft landing on a small celestial body, and belongs to the field of deep-space exploration.
Background art
Landing on and exploring small celestial bodies is a principal way for mankind to understand the universe, to study the formation and evolution of the solar system, and to explore the origin of life; precision landing in complex regions of high scientific value on the surface of a small body is a focal problem of deep-space exploration research. Because small bodies are far from the Earth, the conventional navigation mode of telemetry and communication with ground stations suffers large communication delays and can hardly meet the requirements of a landing mission, so autonomous navigation has become the main navigation mode for small-body landing exploration. Since the gravitational field of a small body is weak and irregularly distributed and its surface environment is complex, a soft landing must achieve a "double-zero" attachment to the surface (distance equal to zero and velocity equal to zero), which makes soft landing on a small body very difficult. The position and velocity information provided by the autonomous navigation system is the basis of guidance and control; its accuracy directly affects the landing accuracy and bears on the success or failure of the whole mission. Research on autonomous navigation methods for small-body soft landing is therefore of great significance, and directly determines whether the lander can safely and accurately reach the preset target region of scientific value.
Optical navigation, with its strong autonomy and high accuracy, has been widely applied to autonomous navigation for spacecraft landing. In small-body landing missions, the usual autonomous navigation scheme tracks the target landing site with an optical navigation camera: the camera acquires gray-scale images of the planned landing region, and on-board image-processing software detects and tracks feature points. However, this method requires accurate position coordinates of the surface feature points to be known in advance, and obtaining accurate feature-point coordinates in a real mission is extremely difficult; feature-point mismatches therefore occur easily and degrade the estimation accuracy of the autonomous navigation system.
Summary of the invention
To address the low estimation accuracy and slow filter convergence of autonomous optical navigation for small-body soft landing in the prior art, the technical problem to be solved by the autonomous navigation method based on optics and Doppler radar disclosed by the present invention is to improve the estimation accuracy and filter convergence rate of autonomous navigation for small-body soft landing, to achieve fast and accurate estimation of the probe state, and to provide technical support for the navigation scheme design of precise small-body soft-landing missions.
The purpose of the present invention is achieved through the following technical solutions.
The autonomous navigation method for small-body soft landing based on optics and Doppler radar disclosed by the present invention establishes the dynamics model of the small-body soft-landing probe, establishes the standard gravitational field model of the small body, and linearizes the dynamics model. It then establishes the autonomous navigation measurement model: on the basis of autonomous optical navigation, Doppler radar ranging and velocity information is introduced — the Doppler radar emits radar beams and measures the relative distance and relative velocity along each beam direction to the surface of the small body, yielding the probe's real-time position and velocity. Finally, according to the landing dynamics model and the measurement model, the probe's real-time navigation state is solved with a nonlinear filtering algorithm.
Considering measurement-accuracy requirements and cost effectiveness, the Doppler radar is preferably a six-beam Doppler radar.
The autonomous navigation method for small-body soft landing based on optics and Doppler radar disclosed by the present invention comprises the following steps:
Step 1: establish the dynamics model of the small-body soft-landing probe.
The soft-landing dynamics model is established in the J2000 landing-site-fixed coordinate frame. The state vector comprises the position and velocity of the landing probe, and the dynamics model is given by formula (1):

$$\dot{r} = v,\qquad \ddot{r} = F + U - 2\,\omega\times\dot{r} - \omega\times(\omega\times r) \tag{1}$$

where r is the relative position vector, v the relative velocity vector, F the control acceleration, U the gravitational acceleration of the small body, and ω the spin angular velocity of the small body.
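As an illustration, the right-hand side of formula (1) can be sketched directly in Python; the gravity term U is passed in as a function, and all names and numbers here are illustrative, not taken from the patent:

```python
import numpy as np

def landing_dynamics(r, v, F, U_func, omega):
    """Right-hand side of the soft-landing dynamics, formula (1):
    r_dot = v
    v_dot = F + U - 2*omega x v - omega x (omega x r)
    expressed in the rotating landing-site-fixed frame."""
    U = U_func(r)  # small-body gravitational acceleration at r
    r_dot = v
    v_dot = F + U - 2.0 * np.cross(omega, v) - np.cross(omega, np.cross(omega, r))
    return r_dot, v_dot
```

This right-hand side would typically be fed to a numerical integrator during filter prediction.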
Choosing the probe position vector r and velocity vector v as the state variables, then

$$x_f = \begin{bmatrix} r & \dot{r} \end{bmatrix}^T = \begin{bmatrix} x_f & y_f & z_f & v_{xf} & v_{yf} & v_{zf} \end{bmatrix}^T,\qquad \dot{x} = A x_f + B(F+U) \tag{2}$$

$$A = \begin{bmatrix} 0_{3\times 3} & I_{3\times 3} \\ A_1 & A_2 \end{bmatrix} \tag{3}$$

$$A_1 = \begin{bmatrix} \omega^2 & 0 & 0 \\ 0 & \omega^2 & 0 \\ 0 & 0 & 0 \end{bmatrix},\qquad A_2 = \begin{bmatrix} 0 & 2\omega & 0 \\ -2\omega & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \tag{4}$$

$$B = \begin{bmatrix} 0_{3\times 3} \\ I_{3\times 3} \end{bmatrix} \tag{5}$$
The standard gravitational field of the small body is expressed as a spherical-harmonic expansion, formula (6):

$$U = \frac{GM}{R}\sum_{n=0}^{\infty}\sum_{m=0}^{n}\left(\frac{R_0}{R}\right)^{n}\bar{P}_{nm}(\sin\phi)\left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right) \tag{6}$$

where λ and φ are the longitude and latitude of the field point with respect to the small-body centre of mass; R is the distance of the field point from the centre of mass; $\bar{C}_{nm}$, $\bar{S}_{nm}$ are the spherical-harmonic coefficients; n and m are the degree and order; G is the universal gravitational constant; M is the mass of the small body; R0 is the Brillouin-sphere radius; and $\bar{P}_{nm}$ is the associated Legendre polynomial.
The matrix form of formula (6) is preferably truncated to a fourth-order spherical-harmonic model.
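A minimal numerical sketch of the truncated expansion in formula (6) follows. It is assumption-laden: `scipy.special.lpmv` returns unnormalized associated Legendre functions, whereas flight code would use fully normalized $\bar{P}_{nm}$ matched to the coefficient normalization, and all numbers are illustrative:

```python
import numpy as np
from scipy.special import lpmv

def gravity_potential(r_vec, GM, R0, C, S):
    """Truncated spherical-harmonic potential of formula (6).
    C, S: (N+1)x(N+1) coefficient arrays indexed as C[n, m], S[n, m]."""
    R = np.linalg.norm(r_vec)
    phi = np.arcsin(r_vec[2] / R)          # latitude of the field point
    lam = np.arctan2(r_vec[1], r_vec[0])   # longitude of the field point
    N = C.shape[0] - 1
    total = 0.0
    for n in range(N + 1):
        for m in range(n + 1):
            Pnm = lpmv(m, n, np.sin(phi))  # associated Legendre function
            total += (R0 / R) ** n * Pnm * (
                C[n, m] * np.cos(m * lam) + S[n, m] * np.sin(m * lam))
    return GM / R * total
```

With only the degree-zero coefficient set, the expression collapses to the point-mass potential GM/R, which gives a quick sanity check.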
Step 2: establish the autonomous navigation measurement model for small-body soft landing.
The autonomous navigation measurement model comprises the optical-camera line-of-sight measurement model and the Doppler radar ranging and velocity measurement model.
Camera imaging is modeled with the pinhole model. Let a feature point f1 on the surface of the small body have position coordinates rp = [xc yc zc]^T in the camera frame; its ideal pixel coordinates in the camera image plane are then given by formula (7):

$$\begin{bmatrix} u_p \\ v_p \end{bmatrix} = \frac{f}{z_c}\begin{bmatrix} x_c \\ y_c \end{bmatrix} \tag{7}$$

where f is the camera focal length and zc is the distance of the target point to the camera imaging plane along the camera boresight.
Denote the attitude pointing deviations about the xc and yc directions by θ1 and θ2 respectively. The random rotation during camera measurement affects the measured feature-point position, so under a small-deviation assumption the actual position coordinates are given by formula (8):

$$\begin{bmatrix} u_p \\ v_p \end{bmatrix} = \frac{f}{\theta_1 x_c + \theta_2 y_c + z_c}\begin{bmatrix} x_c - \theta_1 z_c \\ y_c - \theta_2 z_c \end{bmatrix} \tag{8}$$
The probe attitude error grows with flight distance. Neglecting the contribution of the pointing angles to the denominator, formula (8) simplifies to:

$$\begin{bmatrix} u_p \\ v_p \end{bmatrix} = \frac{f}{z_c}\begin{bmatrix} x_c \\ y_c \end{bmatrix} + \begin{bmatrix} -\theta_1 f \\ -\theta_2 f \end{bmatrix} \tag{9}$$
Meanwhile, the Doppler radar measures the relative distance ρj and the relative velocity ρ̇j along each radar-beam direction to the surface of the body, given by formula (10):

$$\rho_j = \frac{cT}{4B}f_R,\qquad \dot{\rho}_j = \frac{\lambda f_d}{2} \tag{10}$$

where ρj is the distance to the ground along the beam direction, ρ̇j is the line-of-sight velocity, B is the modulation bandwidth, c is the speed of light, T is the waveform period, λ is the wavelength, fR is the intermediate frequency, fd is the Doppler frequency shift, and n is the number of radar beams.
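Formula (10) converts each beam's intermediate (beat) frequency and Doppler shift into range and range rate; a direct transcription, with SI units assumed:

```python
def beam_range(f_R, B, T, c=2.998e8):
    """Range along one radar beam, formula (10): rho_j = c*T/(4B) * f_R."""
    return c * T / (4.0 * B) * f_R

def beam_range_rate(f_d, wavelength):
    """Range rate along one radar beam, formula (10): rho_dot_j = lambda * f_d / 2."""
    return wavelength * f_d / 2.0
```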
The measurement vector of the Doppler radar is therefore given by formula (11). The unit vector of each beam direction in the landing-site-fixed frame is defined as λj (j = 1, …, n), as shown in formula (12):
The transformation matrix from the probe body frame to the landing frame is given by formula (13), where φ, θ, ψ are the rotation angles about the x, y, z axes. In addition, the relations between the probe state and the Doppler radar measurement values are given by formulas (14) and (15):
ρj=z/ (λj·[001]T) (j=1 ..., n) (14)
where z is the spacecraft altitude, vx, vy, vz are the components of the spacecraft velocity in the rectangular coordinate frame, and the inverse of the matrix in formula (13) is the transition matrix from the landing-site-fixed frame to the body frame.
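Formulas (14) and (15) relate the probe state to the per-beam measurements. A sketch under the stated geometry — note that the form of formula (15) is taken here to be the projection of the lander velocity onto each beam direction, which is an assumption, since the patent text only names the relation:

```python
import numpy as np

def predicted_ranges(z, beam_dirs):
    """Formula (14): rho_j = z / (lambda_j . [0, 0, 1]^T), with z the altitude
    and beam_dirs the unit beam vectors in the landing-site-fixed frame."""
    e3 = np.array([0.0, 0.0, 1.0])
    return np.array([z / d.dot(e3) for d in beam_dirs])

def predicted_range_rates(v, beam_dirs):
    """Assumed form of formula (15): projection of the velocity v onto each beam."""
    return np.array([d.dot(v) for d in beam_dirs])
```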
Considering measurement-accuracy requirements and cost effectiveness, the Doppler radar is preferably a six-beam Doppler radar.
Step 3: according to the small-body landing dynamics model and the measurement model, solve for the probe's real-time navigation state with a nonlinear filtering algorithm.
Using the landing dynamics model obtained in step 1 and the measurement model obtained in step 2, the probe state is estimated through the navigation solution. Because both the state model and the measurement model are nonlinear, a nonlinear filter is selected; the extended Kalman filter (EKF) is preferred to improve the navigation accuracy and convergence rate. The probe state information is output as the final result.
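Step 3 does not spell out the filter equations; a generic extended-Kalman-filter predict/update cycle, as one possible realization (all names illustrative), looks like this:

```python
import numpy as np

def ekf_step(x, P, f, F_jac, h, H_jac, z, Q, R):
    """One EKF cycle: propagate the state with the (linearized) dynamics,
    then correct it with the camera/radar measurement vector z."""
    # Prediction with the dynamics model
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update with the measurement model
    H = H_jac(x_pred)
    innov = z - h(x_pred)                  # measurement residual
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In the patent's setting, f and F_jac come from the linearized dynamics of formulas (1)-(5), while h and H_jac stack the camera model of formula (9) and the radar relations of formulas (14)-(15).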
Beneficial effects:
1. Prior-art navigation methods that measure feature-point line-of-sight information with an optical camera alone suffer from matching errors in the feature-point position coordinates, which leads to low navigation accuracy and slow filter convergence. By introducing Doppler radar ranging and velocity information, the autonomous navigation method for small-body soft landing based on optics and Doppler radar disclosed by the present invention achieves fast estimation of the probe position and velocity, effectively reduces the adverse effect of optical-camera feature-point matching errors on autonomous navigation performance, improves the estimation accuracy and filter convergence rate of the navigation algorithm, and meets the accuracy requirements of future autonomous navigation for small-body soft landing.
2. The autonomous navigation method for small-body soft landing based on optics and Doppler radar disclosed by the present invention uses a nonlinear filter, which improves the accuracy and filter convergence rate of the autonomous navigation algorithm.
Brief description of the drawings
Fig. 1 is the flow chart of the autonomous navigation method for small-body soft landing based on optics and Doppler radar;
Fig. 2 shows, for the specific embodiment, the navigation error curves of the probe in the landing-site-fixed frame when only the optical-camera autonomous navigation method is used. (Fig. 2a: position error in the x direction; Fig. 2b: position error in the y direction; Fig. 2c: position error in the z direction; Fig. 2d: velocity error in the x direction; Fig. 2e: velocity error in the y direction; Fig. 2f: velocity error in the z direction.)
Fig. 3 shows, for the specific embodiment, the navigation error curves of the probe in the landing-site-fixed frame when the autonomous navigation method for small-body soft landing based on optics and Doppler radar is used. (Fig. 3a: position error in the x direction; Fig. 3b: position error in the y direction; Fig. 3c: position error in the z direction; Fig. 3d: velocity error in the x direction; Fig. 3e: velocity error in the y direction; Fig. 3f: velocity error in the z direction.)
Embodiment
To better illustrate the objects and advantages of the present invention, the content of the invention is further explained below with an example, in conjunction with the accompanying drawings.
Embodiment 1:
This example addresses small-body soft landing, with simulation verification performed for the target asteroid 433 Eros. The initial position of the probe in the landing-site-fixed frame is [500 m, 300 m, 2000 m]^T, the initial velocity is [-0.5 m/s, -0.3 m/s, -0.5 m/s]^T, and the target landing site on the surface is [0 m, 0 m, 0 m]^T. Combining the line-of-sight information measured by the optical camera with the relative ranging and velocity information of the Doppler radar, an extended Kalman filter (EKF) jointly estimates the position and velocity of the probe, achieving high-accuracy real-time autonomous navigation.
The autonomous navigation method for small-body soft landing based on optics and Doppler radar disclosed in this example comprises the following steps:
Step 1: establish the dynamics model of the small-body soft-landing probe.
The soft-landing dynamics model is established in the J2000 landing-site-fixed coordinate frame. The state vector comprises the position and velocity of the landing probe, and the dynamics model is given by formula (1), where r is the relative position vector, v the relative velocity vector, F the control acceleration, U the gravitational acceleration of the small body, and ω the spin angular velocity of the small body.
Choosing the probe position vector r and velocity vector v as the state variables gives the linearized state equation of formulas (2)-(5).
The standard gravitational field of the small body is expressed as the spherical-harmonic expansion of formula (6), where λ and φ are the longitude and latitude of the field point with respect to the small-body centre of mass, R is the distance of the field point from the centre of mass, C̄nm and S̄nm are the spherical-harmonic coefficients, n and m are the degree and order, G is the universal gravitational constant, M is the mass of the small body, R0 is the Brillouin-sphere radius, and P̄nm is the associated Legendre polynomial.
The matrix form of formula (6) is preferably truncated to a fourth-order spherical-harmonic model.
Step 2: establish the autonomous navigation measurement model for small-body soft landing.
The autonomous navigation measurement model comprises the optical-camera line-of-sight measurement model and the Doppler radar ranging and velocity measurement model.
Camera imaging is modeled with the pinhole model: a feature point f1 on the small-body surface with position coordinates rp = [xc yc zc]^T in the camera frame has pixel coordinates in the camera image plane given by formula (7), where f is the camera focal length and zc is the distance of the target point to the camera imaging plane along the camera boresight.
Denoting the attitude deviations about the xc and yc directions by θ1 and θ2, the random rotation during camera measurement affects the feature-point position measurement, and under the small-deviation assumption the actual position coordinates are given by formula (8).
The probe attitude error grows with flight distance; neglecting the contribution of the pointing angles to the denominator, formula (8) simplifies to formula (9).
Meanwhile, the Doppler radar measures the relative distance ρj and the relative velocity ρ̇j along each radar-beam direction to the surface, as given by formula (10), where ρj is the distance to the ground along the beam direction, ρ̇j is the line-of-sight velocity, B is the modulation bandwidth, c is the speed of light, T is the waveform period, λ is the wavelength, fR is the intermediate frequency, and fd is the Doppler frequency shift.
Considering measurement-accuracy requirements and cost effectiveness, the Doppler radar is preferably a six-beam Doppler radar, whose beam pointing is defined as follows: one laser beam points to the nadir along the spacecraft vertical axis; three diagonal beams make a uniformly distributed azimuth angle α with the vertical axis; and the remaining two beams each point downward at an angle β to their respective rotation axes and at an angle γ to the lander's direction of advance.
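The patent describes the six-beam layout qualitatively but gives no numerical angles, so the following generator is a simplified stand-in: one nadir beam plus five beams on a cone of half-angle α with evenly spaced azimuths, rather than the exact α/β/γ arrangement:

```python
import numpy as np

def six_beam_directions(alpha_deg=20.0):
    """Illustrative six-beam layout (an assumption, not the patent's exact
    geometry): one nadir beam along the vertical axis plus five beams on a
    cone of half-angle alpha around it, azimuths evenly spaced."""
    a = np.radians(alpha_deg)
    beams = [np.array([0.0, 0.0, 1.0])]  # nadir beam along the vertical axis
    for k in range(5):
        az = 2.0 * np.pi * k / 5.0
        beams.append(np.array([np.sin(a) * np.cos(az),
                               np.sin(a) * np.sin(az),
                               np.cos(a)]))
    return np.array(beams)
```

Any such set of unit vectors can serve as the λj of formula (12) when exercising the measurement model in simulation.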
The measurement vector of the Doppler radar is therefore given by formula (11). The unit vector of each beam direction in the landing-site-fixed frame is defined as λj (j = 1, …, 6), as shown in formula (12):
The transformation matrix from the probe body frame to the landing frame is given by formula (13), where φ, θ, ψ are the rotation angles about the x, y, z axes. In addition, the relations between the probe state and the Doppler radar measurement values are given by formulas (14) and (15):
ρj=z/ (dj·[001]T) (j=1 ..., n) (14)
where z is the spacecraft altitude, vx, vy, vz are the components of the spacecraft velocity in the rectangular coordinate frame, and the inverse of the matrix in formula (13) is the transition matrix from the landing-site-fixed frame to the body frame.
Step 3: according to the small-body landing dynamics model and the measurement model, solve for the probe's real-time navigation state with a nonlinear filtering algorithm.
Using the landing dynamics model obtained in step 1 and the measurement model obtained in step 2, the probe state is estimated through the navigation solution. Because both the state model and the measurement model are nonlinear, a nonlinear filter is selected; the extended Kalman filter (EKF) is preferred to improve the navigation accuracy and convergence rate. The probe state information is output as the final result.
Simulation verification is carried out for the navigation method of this embodiment; the simulation parameters of the landing probe are shown in Table 1.

Table 1. Simulation parameters of the landing probe
Figs. 2 and 3 respectively show the simulation results of the optical-camera-only autonomous navigation method and of this embodiment's autonomous navigation method for small-body soft landing based on optics and Doppler radar; the curves are the navigation estimation errors of the probe position and velocity. The simulation results show that, compared with the optical-camera-only method, the navigation accuracy and filter convergence rate of the method based on optics and Doppler radar are significantly improved: the position and velocity of the probe are estimated in real time, and high-accuracy state estimates are finally obtained.
The scope of the present invention is not limited to the embodiment; the embodiment serves to explain the invention, and all changes or modifications made under the same principle and design concept as the present invention fall within the protection scope disclosed by the present invention.
Claims (6)
1. An autonomous navigation method for small-body soft landing based on optics and Doppler radar, characterized by comprising the following steps:
Step 1: establish the dynamics model of the small-body soft-landing probe;
The soft-landing dynamics model is established in the J2000 landing-site-fixed coordinate frame; the state vector comprises the position and velocity of the landing probe, and the dynamics model is given by formula (1):
$$\begin{cases} \dot{r} = v \\ \ddot{r} = F + U - 2\,\omega\times\dot{r} - \omega\times(\omega\times r) \end{cases} \tag{1}$$
where r is the relative position vector, v the relative velocity vector, F the control acceleration, U the gravitational acceleration of the small body, and ω the spin angular velocity of the small body;
Choosing the probe position vector r and velocity vector v as the state variables, then
$$x_f = \begin{bmatrix} r & \dot{r} \end{bmatrix}^T = \begin{bmatrix} x_f & y_f & z_f & v_{xf} & v_{yf} & v_{zf} \end{bmatrix}^T,\qquad \dot{x} = A x_f + B(F+U) \tag{2}$$
$$A = \begin{bmatrix} 0_{3\times 3} & I_{3\times 3} \\ A_1 & A_2 \end{bmatrix} \tag{3}$$
$$A_1 = \begin{bmatrix} \omega^2 & 0 & 0 \\ 0 & \omega^2 & 0 \\ 0 & 0 & 0 \end{bmatrix},\qquad A_2 = \begin{bmatrix} 0 & 2\omega & 0 \\ -2\omega & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \tag{4}$$
$$B = \begin{bmatrix} 0_{3\times 3} \\ I_{3\times 3} \end{bmatrix} \tag{5}$$
The standard gravitational field of the small body is expressed as a spherical-harmonic expansion, formula (6):
$$U = \frac{GM}{R}\sum_{n=0}^{\infty}\sum_{m=0}^{n}\left(\frac{R_0}{R}\right)^{n}\bar{P}_{nm}(\sin\phi)\left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right) \tag{6}$$
where λ and φ are the longitude and latitude of the field point with respect to the small-body centre of mass; R is the distance of the field point from the centre of mass; $\bar{C}_{nm}$, $\bar{S}_{nm}$ are the spherical-harmonic coefficients; n and m are the degree and order; G is the universal gravitational constant; M is the mass of the small body; R0 is the Brillouin-sphere radius; and $\bar{P}_{nm}$ is the associated Legendre polynomial;
Step 2: establish the autonomous navigation measurement model for small-body soft landing;
The autonomous navigation measurement model comprises the optical-camera line-of-sight measurement model and the Doppler radar ranging and velocity measurement model;
Camera imaging is modeled with the pinhole model; a feature point f1 on the small-body surface with position coordinates rp = [xc yc zc]^T in the camera frame has ideal pixel coordinates in the camera image plane given by formula (7):
$$\begin{bmatrix} u_p \\ v_p \end{bmatrix} = \frac{f}{z_c}\begin{bmatrix} x_c \\ y_c \end{bmatrix} \tag{7}$$
where f is the camera focal length and zc is the distance of the target point to the camera imaging plane along the camera boresight;
Denote the attitude deviations about the xc and yc directions by θ1 and θ2 respectively; the random rotation during camera measurement affects the feature-point position measurement, so the actual position coordinates in the small-deviation case are given by formula (8):
$$\begin{bmatrix} u_p \\ v_p \end{bmatrix} = \frac{f}{\theta_1 x_c + \theta_2 y_c + z_c}\begin{bmatrix} x_c - \theta_1 z_c \\ y_c - \theta_2 z_c \end{bmatrix} \tag{8}$$
The probe attitude error grows with flight distance; neglecting the contribution of the pointing angles to the denominator, formula (8) reduces to:
$$\begin{bmatrix} u_p \\ v_p \end{bmatrix} = \frac{f}{z_c} \begin{bmatrix} x_c \\ y_c \end{bmatrix} + \begin{bmatrix} -\theta_1 f \\ -\theta_2 f \end{bmatrix} \qquad (9)$$
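As a numeric sanity check of the small-angle simplification from formula (8) to formula (9), a minimal sketch; the focal length, feature coordinates, and misalignment angles below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Illustrative (assumed) values: focal length f, feature point (x_c, y_c, z_c)
# in the camera frame, and small attitude misalignments theta1, theta2.
f = 0.05                              # focal length [m]
x_c, y_c, z_c = 12.0, -8.0, 300.0     # feature point in camera frame [m]
th1, th2 = 1e-3, -5e-4                # misalignment angles [rad]

# Exact projection with misalignment, formula (8)
denom = th1 * x_c + th2 * y_c + z_c
exact = (f / denom) * np.array([x_c - th1 * z_c, y_c - th2 * z_c])

# Small-angle simplification, formula (9): nominal projection plus bias -theta_i * f
approx = (f / z_c) * np.array([x_c, y_c]) + np.array([-th1 * f, -th2 * f])

print(exact, approx)   # for these values the two agree to about 1e-7
```

For misalignments of a milliradian at this altitude, the neglected denominator terms contribute far less than the $-\theta_i f$ bias retained in formula (9).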
Meanwhile, combining the Doppler radar's measurements along each radar beam direction to the body surface, the relative distance $\rho_j$ and relative velocity $\dot\rho_j$ are given by formula (10):
$$\rho_j = \frac{cT}{4B} f_R, \qquad \dot\rho_j = \frac{\lambda f_d}{2} \qquad (10)$$
where $\rho_j$ is the distance to the ground along the beam direction, $\dot\rho_j$ is the line-of-sight velocity, $B$ is the modulation bandwidth, $c$ is the speed of light, $T$ is the waveform period, $\lambda$ is the wavelength, $f_R$ is the intermediate frequency, $f_d$ is the Doppler frequency shift, and $n$ is the number of radar beams;
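Formula (10)'s conversions from the measured frequencies to range and range rate can be sketched directly; the bandwidth, period, wavelength, and frequency values below are illustrative assumptions, not parameters from the patent:

```python
C = 299_792_458.0    # speed of light [m/s]

def range_from_beat(f_R, T, B):
    # Formula (10): rho = c*T/(4*B) * f_R  (FMCW intermediate/beat frequency)
    return C * T / (4.0 * B) * f_R

def range_rate_from_doppler(f_d, lam):
    # Formula (10): rho_dot = lam * f_d / 2  (Doppler shift to line-of-sight velocity)
    return lam * f_d / 2.0

# Illustrative (assumed) radar parameters
B = 150e6        # modulation bandwidth [Hz]
T = 1e-3         # waveform period [s]
lam = 0.0125     # wavelength [m], roughly a 24 GHz radar

rho = range_from_beat(f_R=2.0e5, T=T, B=B)              # ~100 m for a 200 kHz beat
rho_dot = range_rate_from_doppler(f_d=-800.0, lam=lam)  # -5 m/s along the beam
print(rho, rho_dot)
```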
The measurement vector of the Doppler radar is therefore given by formula (11):
$$Z_R = \left[\rho_1, \rho_2, \dots, \rho_n, \dot\rho_1, \dot\rho_2, \dots, \dot\rho_n\right] = \left[\rho^T, \dot\rho^T\right] \qquad (11)$$
Define the unit vector along each beam direction in the landing-site-fixed coordinate system as $\lambda_j$ ($j = 1, \dots, n$), as given by formula (12):
$$\left[\lambda_1, \lambda_2, \dots, \lambda_n\right]_{3\times n} = T_B^L \cdot S_{3\times n} \qquad (12)$$
$T_B^L$ is the transformation matrix from the lander body frame to the landing coordinate system, with the matrix given by formula (13);
in the formula, $\varphi$, $\theta$, $\psi$ are the rotation angles about the $x$, $y$, $z$ axes, respectively. In addition, the relations between the lander state and the Doppler radar measurement values are given by formulas (14) and (15):
$$\rho_j = \frac{z}{\lambda_j \cdot [0\ 0\ 1]^T}, \quad j = 1, \dots, n \qquad (14)$$
$$\begin{bmatrix} \dot\rho_1 \\ \vdots \\ \dot\rho_n \end{bmatrix} = S^T \cdot T_L^B \begin{bmatrix} v_x \\ v_y \\ v_z \end{bmatrix}^L \qquad (15)$$
where $z$ is the spacecraft altitude, $v_x, v_y, v_z$ are the components of the spacecraft velocity in the rectangular coordinate system, and $T_L^B$ is the inverse of $T_B^L$, representing the transformation matrix from the landing-site-fixed coordinate frame to the body frame;
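The beam geometry of formulas (12), (14), and (15) can be sketched as follows; the Euler-angle sequence, the six-beam layout $S$, and the attitude, altitude, and velocity values are all illustrative assumptions (the patent's exact formula (13) convention is not reproduced here):

```python
import numpy as np

def rot_xyz(phi, theta, psi):
    # A rotation matrix built from angles about x, y, z; the exact Euler
    # sequence of the patent's formula (13) is an assumption here.
    cx, sx = np.cos(phi), np.sin(phi)
    cy, sy = np.cos(theta), np.sin(theta)
    cz, sz = np.cos(psi), np.sin(psi)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

T_B2L = rot_xyz(0.01, -0.02, 0.03)       # T_B^L: body -> landing frame (assumed attitude)

# Made-up 6-beam layout in the body frame, beams tilted about +z (toward the ground)
S = np.array([[ 0.2, -0.2,  0.2, -0.2, 0.0,  0.1],
              [ 0.2,  0.2, -0.2, -0.2, 0.0, -0.1],
              [ 1.0,  1.0,  1.0,  1.0, 1.0,  1.0]])
S = S / np.linalg.norm(S, axis=0)        # columns become unit beam vectors

lam_beams = T_B2L @ S                    # formula (12): beam unit vectors, landing frame

z = 500.0                                # altitude [m] (assumed)
v_L = np.array([3.0, -1.0, -8.0])        # velocity in the landing frame [m/s] (assumed)

rho = z / lam_beams[2, :]                # formula (14): slant range per beam
rho_dot = S.T @ T_B2L.T @ v_L            # formula (15), with T_L^B = (T_B^L)^T
print(rho)
print(rho_dot)
```

Because $T_B^L$ is orthonormal, $T_L^B = (T_B^L)^T$, and the per-beam range rate equals the velocity projected onto each landing-frame beam vector.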
Step 3: Based on the small-celestial-body landing dynamics model and the measurement model, solve for the lander's real-time navigation state information with a nonlinear filtering algorithm.
Using the dynamics model obtained in Step 1 and the measurement model obtained in Step 2, the navigation solver computes an estimate of the lander state and outputs the final lander state information.
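Step 3's navigation solver (an extended Kalman filter, per claim 5) can be sketched generically; the toy constant-velocity dynamics and range-like measurement below are illustrative stand-ins, not the patent's landing dynamics or optical/radar models:

```python
import numpy as np

def ekf_step(x, P, f, F_jac, h, H_jac, Q, R, z):
    # One predict/update cycle of an extended Kalman filter.
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    H = H_jac(x_pred)
    y = z - h(x_pred)                    # measurement innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy stand-in models: 1D constant-velocity motion, nonlinear range measurement
dt = 0.1
f_dyn = lambda x: np.array([x[0] + dt * x[1], x[1]])
F_jac = lambda x: np.array([[1.0, dt], [0.0, 1.0]])
h_meas = lambda x: np.array([np.hypot(x[0], 100.0)])
H_jac = lambda x: np.array([[x[0] / np.hypot(x[0], 100.0), 0.0]])

Q = 1e-4 * np.eye(2)
R = np.array([[0.25]])

x_true = np.array([48.0, -2.0])     # true state (assumed)
x_est = np.array([50.0, -2.0])      # biased initial estimate
P = np.eye(2)

for _ in range(20):                 # noiseless measurements for the sketch
    x_true = f_dyn(x_true)
    z = h_meas(x_true)
    x_est, P = ekf_step(x_est, P, f_dyn, F_jac, h_meas, H_jac, Q, R, z)

print(x_est, x_true)                # the initial 2 m position error shrinks
```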
2. The autonomous navigation method for small-celestial-body soft landing based on optics and Doppler radar as claimed in claim 1, characterized in that: said formula (6) adopts the matrix form of a fourth-order spherical harmonic coefficient model.
3. The autonomous navigation method for small-celestial-body soft landing based on optics and Doppler radar as claimed in claim 1, characterized in that: weighing measurement-accuracy requirements against cost-effectiveness, a six-beam Doppler radar is selected.
4. The autonomous navigation method for small-celestial-body soft landing based on optics and Doppler radar as claimed in claim 1, characterized in that: the navigation solution in Step 3 selects a nonlinear filter.
5. The autonomous navigation method for small-celestial-body soft landing based on optics and Doppler radar as claimed in claim 4, characterized in that: said nonlinear filter selects the extended Kalman filter (EKF), improving the accuracy and convergence rate of the navigation solution.
6. An autonomous navigation method for small-celestial-body soft landing based on optics and Doppler radar, characterized in that: a dynamics model of the small-body soft lander is established; a standard gravitational field model of the small body is established and the dynamics model is linearized; an autonomous navigation measurement model is established, in which, on the basis of an autonomous optical navigation method, the ranging and range-rate information of a Doppler radar is introduced: the Doppler radar emits radar beams toward the small-body surface and measures the relative distance and relative velocity along each beam direction, thereby obtaining the real-time position and velocity of the lander; and, based on the small-body landing dynamics model and the measurement model, the lander's real-time navigation state information is solved with a nonlinear filtering algorithm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710300242.6A CN107132542B (en) | 2017-05-02 | 2017-05-02 | A kind of small feature loss soft landing autonomic air navigation aid based on optics and Doppler radar |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107132542A true CN107132542A (en) | 2017-09-05 |
CN107132542B CN107132542B (en) | 2019-10-15 |
Family
ID=59715173
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710300242.6A Active CN107132542B (en) | 2017-05-02 | 2017-05-02 | A kind of small feature loss soft landing autonomic air navigation aid based on optics and Doppler radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107132542B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107655485A (en) * | 2017-09-25 | 2018-02-02 | 北京理工大学 | A kind of cruise section independent navigation position deviation modification method |
CN109212976A (en) * | 2018-11-20 | 2019-01-15 | 北京理工大学 | The small feature loss soft landing robust trajectory tracking control method of input-bound |
CN109269512A (en) * | 2018-12-06 | 2019-01-25 | 北京理工大学 | The Relative Navigation that planetary landing image is merged with ranging |
CN110307840A (en) * | 2019-05-21 | 2019-10-08 | 北京控制工程研究所 | A kind of landing phase robust fusion method based on multi-beam ranging and range rate and inertia |
WO2020244467A1 (en) * | 2019-06-06 | 2020-12-10 | 华为技术有限公司 | Method and device for motion state estimation |
CN113408623A (en) * | 2021-06-21 | 2021-09-17 | 北京理工大学 | Non-cooperative target flexible attachment multi-node fusion estimation method |
CN113432609A (en) * | 2021-06-16 | 2021-09-24 | 北京理工大学 | Flexible attachment state collaborative estimation method |
CN114296069A (en) * | 2021-12-23 | 2022-04-08 | 青岛科技大学 | Small celestial body detector multi-model navigation method based on visual radar |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110889219A (en) * | 2019-11-22 | 2020-03-17 | 北京理工大学 | Small celestial body gravitational field inversion correction method based on inter-device ranging |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2083998C1 (en) * | 1995-09-11 | 1997-07-10 | Выдревич Моисей Бецалелович | Doppler sensor of components of velocity vector, altitude and local vertical for helicopters and vertical landing space vehicles |
CN1847791A (en) * | 2006-05-12 | 2006-10-18 | 哈尔滨工业大学 | Verification system for fast autonomous deep-space optical navigation control prototype |
CN101762273A (en) * | 2010-02-01 | 2010-06-30 | 北京理工大学 | Autonomous optical navigation method for soft landing for deep space probe |
CN103438890A (en) * | 2013-09-05 | 2013-12-11 | 北京理工大学 | Planetary power descending branch navigation method based on TDS (total descending sensor) and image measurement |
CN103528587A (en) * | 2013-10-15 | 2014-01-22 | 西北工业大学 | Autonomous integrated navigation system |
CN104567880A (en) * | 2014-12-23 | 2015-04-29 | 北京理工大学 | Mars ultimate approach segment autonomous navigation method based on multi-source information fusion |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107655485B (en) * | 2017-09-25 | 2020-06-16 | 北京理工大学 | Cruise section autonomous navigation position deviation correction method |
CN107655485A (en) * | 2017-09-25 | 2018-02-02 | 北京理工大学 | A kind of cruise section independent navigation position deviation modification method |
CN109212976A (en) * | 2018-11-20 | 2019-01-15 | 北京理工大学 | The small feature loss soft landing robust trajectory tracking control method of input-bound |
CN109212976B (en) * | 2018-11-20 | 2020-07-07 | 北京理工大学 | Input-limited small celestial body soft landing robust trajectory tracking control method |
CN109269512A (en) * | 2018-12-06 | 2019-01-25 | 北京理工大学 | The Relative Navigation that planetary landing image is merged with ranging |
CN110307840B (en) * | 2019-05-21 | 2021-09-07 | 北京控制工程研究所 | Landing stage robust fusion method based on multi-beam ranging, velocity measurement and inertia |
CN110307840A (en) * | 2019-05-21 | 2019-10-08 | 北京控制工程研究所 | A kind of landing phase robust fusion method based on multi-beam ranging and range rate and inertia |
WO2020244467A1 (en) * | 2019-06-06 | 2020-12-10 | 华为技术有限公司 | Method and device for motion state estimation |
CN113432609A (en) * | 2021-06-16 | 2021-09-24 | 北京理工大学 | Flexible attachment state collaborative estimation method |
CN113432609B (en) * | 2021-06-16 | 2022-11-29 | 北京理工大学 | Flexible attachment state collaborative estimation method |
CN113408623A (en) * | 2021-06-21 | 2021-09-17 | 北京理工大学 | Non-cooperative target flexible attachment multi-node fusion estimation method |
CN113408623B (en) * | 2021-06-21 | 2022-10-04 | 北京理工大学 | Non-cooperative target flexible attachment multi-node fusion estimation method |
CN114296069A (en) * | 2021-12-23 | 2022-04-08 | 青岛科技大学 | Small celestial body detector multi-model navigation method based on visual radar |
CN114296069B (en) * | 2021-12-23 | 2024-05-28 | 青岛科技大学 | Small celestial body detector multi-model navigation method based on visual radar |
Also Published As
Publication number | Publication date |
---|---|
CN107132542B (en) | 2019-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107132542B (en) | A kind of small feature loss soft landing autonomic air navigation aid based on optics and Doppler radar | |
CN109613583B (en) | Passive target positioning method based on single star and ground station direction finding and combined time difference | |
CN104848860B (en) | A kind of agile satellite imagery process attitude maneuver planing method | |
CN110487301A (en) | A kind of airborne strapdown inertial navigation system Initial Alignment Method of radar auxiliary | |
CN102928861B (en) | Target positioning method and device for airborne equipment | |
CN104792340B (en) | A kind of star sensor installation error matrix and navigation system star ground combined calibrating and the method for correction | |
CN101339244B (en) | On-board SAR image automatic target positioning method | |
CN102878995B (en) | Method for autonomously navigating geo-stationary orbit satellite | |
CN103852082B (en) | Inter-satellite measurement and gyro attitude orbit integrated smoothing estimation method | |
CN105184002B (en) | A kind of several simulating analysis for passing antenna pointing angle | |
CN109633724B (en) | Passive target positioning method based on single-satellite and multi-ground-station combined measurement | |
CN106468552A (en) | A kind of two-shipper crossing location method based on airborne photoelectric platform | |
CN105180728B (en) | Front data based rapid air alignment method of rotary guided projectiles | |
CN105698762A (en) | Rapid target positioning method based on observation points at different time on single airplane flight path | |
CN106595674A (en) | HEO satellite-formation-flying automatic navigation method based on star sensor and inter-satellite link | |
CN107655485A (en) | A kind of cruise section independent navigation position deviation modification method | |
CN107300697A (en) | Moving target UKF filtering methods based on unmanned plane | |
CN104049269B (en) | A kind of target navigation mapping method based on laser ranging and MEMS/GPS integrated navigation system | |
CN105737858A (en) | Attitude parameter calibration method and attitude parameter calibration device of airborne inertial navigation system | |
CN104374388A (en) | Flight attitude determining method based on polarized light sensor | |
CN105115508A (en) | Post data-based rotary guided projectile quick air alignment method | |
CN101692001A (en) | Autonomous celestial navigation method for deep space explorer on swing-by trajectory | |
CN103197291A (en) | Satellite-borne synthetic aperture radar (SAR) echo signal simulation method based on non-stop walking model | |
US12078716B2 (en) | System and method of hypersonic object tracking | |
CN103344958B (en) | Based on the satellite-borne SAR high-order Doppler parameter evaluation method of almanac data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||