CN106672265A - Small celestial body fixed-point landing guidance control method based on optical flow information - Google Patents

Small celestial body fixed-point landing guidance control method based on optical flow information

Info

Publication number
CN106672265A
Authority
CN
China
Prior art keywords
detector
camera
point
information
sight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201611250994.8A
Other languages
Chinese (zh)
Other versions
CN106672265B (en)
Inventor
崔平远
付文涛
朱圣英
高艾
徐瑞
于正湜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology (BIT)
Priority to CN201611250994.8A priority Critical patent/CN106672265B/en
Publication of CN106672265A publication Critical patent/CN106672265A/en
Application granted granted Critical
Publication of CN106672265B publication Critical patent/CN106672265B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G: COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00: Cosmonautic vehicles
    • B64G1/22: Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/24: Guiding or controlling apparatus, e.g. for attitude control

Abstract

The invention relates to a guidance and control method for fixed-point landing on small celestial bodies, in particular to a small-celestial-body fixed-point landing guidance and control method based on optical flow information, and belongs to the field of deep space exploration. The method obtains the optical flow information of the lander by the optical flow method and obtains the line-of-sight information of the landing zone with a camera; the optical flow information of step 2 and the landing-zone line-of-sight information of step 3 are used to design the control law required during the lander's descent, the control acceleration a of the lander is computed and substituted into equation (1), the horizontal position deviation is corrected, and the horizontal position of the lander is thereby constrained to achieve a fixed-point landing. The method obtains the landing-zone line-of-sight information with the on-board navigation camera without introducing any new sensor equipment, and has potential as a backup navigation and guidance scheme.

Description

A small-celestial-body fixed-point landing guidance and control method based on optical flow information
Technical field
The present invention relates to a guidance and control method for landing on small celestial bodies, and in particular to a small-celestial-body fixed-point landing guidance and control method based on optical flow information, belonging to the field of deep space exploration.
Background technology
Landing detection on small celestial bodies is a key element of deep space exploration missions. Compared with planetary bodies, small celestial bodies are low in mass and irregular in shape, so an irregular, weak gravitational field exists in their vicinity. In such a gravitational environment, the mission imposes strict requirements on the lander's velocity at the end of the descent. Meanwhile, because of long communication delays, a lander cannot achieve a high-precision soft landing under the traditional Deep Space Network based navigation and control mode. In order to complete scientific investigation tasks and to meet the autonomy and real-time requirements of landing missions, researchers in many countries have in recent years carried out a great deal of work on autonomous navigation, guidance and control methods.
Prior art [1] (see Johnson A E, Cheng Y, Matthies L H. Machine vision for autonomous small body navigation[C]. IEEE Aerospace Conference Proceedings, 2000, 7: 661-671): the JPL laboratory proposed a computer-vision-based navigation, guidance and control scheme for soft landing on small celestial bodies, which tracks feature points across multiple images and combines them with laser altimetry to estimate the lander's motion while reconstructing the terrain of the landing zone.
Traditional computer-vision-based navigation, guidance and control methods have the advantages of strong autonomy, high precision and good real-time performance. However, the overall process is computationally intensive, the image processing is of limited reliability, and terrain feature information is required, so these methods are only applicable to missions in known environments.
Prior art [2] (see Herissé B, Hamel T, Mahony R, et al. Landing a VTOL unmanned aerial vehicle on a moving platform using optical flow[J]. IEEE Transactions on Robotics, 2012, 28(1): 77-89): researchers proposed a method for guiding and controlling an unmanned aerial vehicle using optical flow information, successfully achieved autonomous landing of the vehicle, and verified the method experimentally.
Traditional guidance and control methods based on optical flow information have the advantages of a small computational load and fast computation, require no prior terrain feature information, and can be applied to exploration in unknown environments. However, since the horizontal position is left unconstrained, the lander cannot achieve a fixed-point landing and cannot meet the requirement of precise landing on the surface of a small celestial body.
Summary of the invention
The purpose of the present invention is to solve the problem that traditional guidance and control methods based on optical flow information cannot achieve a fixed-point landing, by proposing a small-celestial-body fixed-point landing guidance and control method based on optical flow information. By introducing line-of-sight information, the method constrains the horizontal position of the lander and achieves a fixed-point landing.
The purpose of the present invention is achieved by the following method.
A small-celestial-body fixed-point landing guidance and control method based on optical flow information comprises the following specific steps:
Step 1: establish the system dynamics equation.
The dynamics equation of the lander is established in the landing-point-fixed coordinate frame:
$$\dot{r} = v, \qquad \dot{v} = g - 2\,\omega \times v - \omega \times (\omega \times r) + a \tag{1}$$
where r is the position vector of the lander in the landing-point coordinate frame, v is the velocity vector of the lander, ω is the spin angular velocity vector of the target celestial body, g is the gravitational acceleration exerted on the lander by the target celestial body, and a is the control acceleration of the lander.
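For illustration, a minimal numerical sketch of equation (1) in Python follows; the explicit-Euler integrator and the function names are illustrative choices, not part of the patent:

```python
import numpy as np

def dynamics(r, v, omega, g, a):
    """Right-hand side of equation (1): lander translational dynamics in the
    rotating, landing-point-fixed coordinate frame."""
    r_dot = v
    v_dot = g - 2.0 * np.cross(omega, v) - np.cross(omega, np.cross(omega, r)) + a
    return r_dot, v_dot

def euler_step(r, v, omega, g, a, dt):
    """Advance the state by one step with explicit Euler (placeholder integrator)."""
    r_dot, v_dot = dynamics(r, v, omega, g, a)
    return r + dt * r_dot, v + dt * v_dot
```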
Step 2: obtain the optical flow information of the lander by the optical flow method.
Based on the pinhole imaging principle, the optical flow information of the lander at the current time can be obtained from the camera velocity and the depth information:
$$u = \frac{-f V_x + x V_z}{Z_M} + \frac{xy}{f}\,\omega_x - \left(f + \frac{x^2}{f}\right)\omega_y + y\,\omega_z, \qquad v = \frac{-f V_y + y V_z}{Z_M} + \left(f + \frac{y^2}{f}\right)\omega_x - \frac{xy}{f}\,\omega_y - x\,\omega_z \tag{2}$$
where w = (u, v) is the optical flow information of the lander, (V_x, V_y, V_z) is the translational velocity of the camera in the inertial frame, (ω_x, ω_y, ω_z) is the angular velocity of the camera in the inertial frame, Z_M is the Z-axis coordinate in the camera frame of an arbitrary point M in three-dimensional space, (x, y) are the coordinates of the projection of point M in the camera image plane, and f is the known focal length of the navigation camera.
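As a sketch, formula (2) can be evaluated directly; the sign conventions below follow the usual pinhole-camera derivation and may need adjustment to match the patent's frame definitions:

```python
import numpy as np

def predicted_flow(x, y, f, V, omega_c, Z_M):
    """Optical flow (u, v) at image point (x, y) predicted by formula (2) for a
    pinhole camera: focal length f, translational velocity V = (Vx, Vy, Vz),
    angular velocity omega_c = (wx, wy, wz), and depth Z_M of the observed
    point along the camera Z axis."""
    Vx, Vy, Vz = V
    wx, wy, wz = omega_c
    u = (-f * Vx + x * Vz) / Z_M + (x * y / f) * wx - (f + x**2 / f) * wy + y * wz
    v = (-f * Vy + y * Vz) / Z_M + (f + y**2 / f) * wx - (x * y / f) * wy - x * wz
    return np.array([u, v])
```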
Step 3: obtain the line-of-sight information of the landing zone with the camera.
The relative line of sight between the lander and the landing zone is observed with the navigation camera, yielding the angle measurements of the relative line of sight. The relation between the lander position, the attitude and the observations is given by formulas (3) and (4):
$$\theta = \tan^{-1}\left(\frac{A_{11}(X - x') + A_{12}(Y - y') + A_{13}(Z - z')}{A_{31}(X - x') + A_{32}(Y - y') + A_{33}(Z - z')}\right) \tag{3}$$
$$\psi = \tan^{-1}\left(\frac{A_{21}(X - x') + A_{22}(Y - y') + A_{23}(Z - z')}{A_{31}(X - x') + A_{32}(Y - y') + A_{33}(Z - z')}\right) \tag{4}$$
where (θ, ψ) is the angle information of the relative line of sight of the landing zone; (X, Y, Z) are the coordinates of the landing zone in the inertial frame; (x', y', z') are the coordinates of the lander in the inertial frame; and A_ij (i, j = 1, 2, 3) are the components of the direction cosine matrix A of the camera frame relative to the inertial frame.
The angle information (θ, ψ) of the relative line of sight of the landing zone is used to correct the horizontal position during the lander's descent.
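A sketch of formulas (3) and (4) under the stated definitions; the arctan2 form resolves the quadrant that the tan⁻¹ notation leaves implicit, and the ψ expression mirrors θ with the second row of the direction cosine matrix, which is an assumption where the source formula is not legible:

```python
import numpy as np

def los_angles(landing_site, lander_pos, A):
    """Line-of-sight angles (theta, psi) of the landing zone per formulas (3)-(4).
    landing_site = (X, Y, Z) and lander_pos = (x', y', z') are inertial
    coordinates; A is the direction cosine matrix of the camera frame relative
    to the inertial frame."""
    d = A @ (np.asarray(landing_site) - np.asarray(lander_pos))  # LOS vector in camera frame
    theta = np.arctan2(d[0], d[2])  # formula (3): row 1 over row 3
    psi = np.arctan2(d[1], d[2])    # assumed formula (4): row 2 over row 3
    return theta, psi
```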
Step 4: compute the control acceleration a of the lander.
Using the optical flow information obtained in step 2 and the landing-zone line-of-sight information obtained in step 3, design the control law required during the lander's descent, compute the control acceleration a of the lander, and substitute it into equation (1), so as to correct the horizontal position deviation.
Repeat steps 2 to 4 to achieve a fixed-point landing on the small celestial body.
The control law can take a proportional form or a sliding-mode form.
When the control law takes the proportional form:
$$a = k_w(w^* - w) + k_\zeta(\zeta^* - \zeta) \tag{5}$$
where w*, ζ* are the desired optical flow value and line-of-sight angle, w, ζ are the optical flow value and line-of-sight angle of the lander at the current time, and k_w, k_ζ are the control gains on the optical flow deviation and on the line-of-sight angle deviation, respectively.
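A sketch of the proportional law (5); how the two-component feedback maps onto the three components of the control acceleration is left open by the patent, so the horizontal-only mapping here is an assumption:

```python
import numpy as np

def proportional_law(w_des, w, zeta_des, zeta, k_w, k_zeta):
    """Formula (5): feedback on the optical-flow error and the line-of-sight
    angle error. Returns the horizontal control acceleration components; the
    vertical channel must be supplied separately (assumption)."""
    w_err = np.asarray(w_des) - np.asarray(w)
    zeta_err = np.asarray(zeta_des) - np.asarray(zeta)
    return k_w * w_err + k_zeta * zeta_err
```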
Beneficial effects
1. The small-celestial-body fixed-point landing guidance and control method based on optical flow information given by the present invention introduces a correction of the horizontal position, so that the lander can achieve a fixed-point landing on the surface of a small celestial body, with the advantages of a simple algorithm and a small computational load.
2. The method obtains the line-of-sight information of the landing zone with the on-board navigation camera, requires no new sensor equipment, and has potential as a backup navigation and guidance scheme.
Brief description of the drawings
Fig. 1 is a flow chart of the small-celestial-body fixed-point landing guidance and control method based on optical flow information of the present invention;
Fig. 2 is a schematic diagram of pinhole imaging;
Fig. 3 is a schematic diagram of the landing-zone line-of-sight information;
Fig. 4 shows the lander trajectory along the x-axis in the small-celestial-body landing-point coordinate frame;
Fig. 5 shows the lander trajectory along the y-axis in the small-celestial-body landing-point coordinate frame;
Fig. 6 shows the lander trajectory along the z-axis in the small-celestial-body landing-point coordinate frame.
Specific embodiment
The invention is further described below with reference to the accompanying drawings and an embodiment.
Embodiment 1
A small-celestial-body fixed-point landing guidance and control method based on optical flow information comprises the following specific steps:
Step 1: establish the system dynamics equation.
The dynamics equation of the lander is established in the landing-point-fixed coordinate frame:
$$\dot{r} = v, \qquad \dot{v} = g - 2\,\omega \times v - \omega \times (\omega \times r) + a \tag{6}$$
where r is the position vector of the lander in the landing-point coordinate frame, v is the velocity vector of the lander, ω is the spin angular velocity vector of the target celestial body, g is the gravitational acceleration exerted on the lander by the target celestial body, and a is the control acceleration of the lander.
Step 2: obtain the optical flow information of the lander by the optical flow method.
In Fig. 2, point M is an arbitrary point in three-dimensional space, point m is its projection in the camera image plane with coordinates (x, y), and f is the known focal length of the navigation camera; the coordinates of point M and point m then satisfy:
$$x = f\,\frac{X_M}{Z_M}, \qquad y = f\,\frac{Y_M}{Z_M} \tag{7}$$
If the target point M is stationary in space, then during the lander's flight the camera undergoes both translational and rotational motion relative to it, and the velocity of point M in the camera frame can be expressed as:
$$V_M = -V_t - \omega' \times M \tag{8}$$
where × denotes the vector cross product, V_t = (V_x, V_y, V_z)^T is the translational velocity of the camera in the inertial frame, ω' = (ω_x, ω_y, ω_z)^T is the angular velocity of the camera in the inertial frame, and M = (X_M, Y_M, Z_M)^T are the coordinates of point M in the camera frame. Expanding formula (8) gives:
$$\dot{X}_M = -V_x - (\omega_y Z_M - \omega_z Y_M), \qquad \dot{Y}_M = -V_y - (\omega_z X_M - \omega_x Z_M), \qquad \dot{Z}_M = -V_z - (\omega_x Y_M - \omega_y X_M) \tag{9}$$
Optical flow is the instantaneous velocity produced by the motion of pixels in the image plane. Differentiating the coordinates (x, y) of point m in formula (7) with respect to time therefore yields the optical flow (u, v) at the projected point m, in units of pixels per second, as given by formulas (10) and (11):
$$u = \dot{x} = f\,\frac{\dot{X}_M Z_M - X_M \dot{Z}_M}{Z_M^2} \tag{10}$$
$$v = \dot{y} = f\,\frac{\dot{Y}_M Z_M - Y_M \dot{Z}_M}{Z_M^2} \tag{11}$$
The units of the flow (u, v) computed from formulas (10) and (11) and of the flow (Δi, Δj) computed by block matching are inconsistent, so a unit conversion is required; the conversion relation is:
$$(u, v) = \left(\frac{\Delta i}{h}, \frac{\Delta j}{h}\right) \tag{12}$$
where h is the simulation step length in seconds.
Substituting formula (9) into formulas (10) and (11) gives the correspondence between the optical flow and the camera velocity and depth information:
$$u = \frac{-f V_x + x V_z}{Z_M} + \frac{xy}{f}\,\omega_x - \left(f + \frac{x^2}{f}\right)\omega_y + y\,\omega_z, \qquad v = \frac{-f V_y + y V_z}{Z_M} + \left(f + \frac{y^2}{f}\right)\omega_x - \frac{xy}{f}\,\omega_y - x\,\omega_z \tag{13}$$
Formula (13) shows that the optical flow depends on the camera's translational and rotational velocities in the inertial frame and on the depth Z_M, and that it consists of two parts: the flow w_t caused by camera translation (the first term of each component) and the flow w_ω caused by camera rotation (the remaining terms). The translational flow depends on the camera's translational velocity and the relative depth Z_M; the rotational flow depends on the camera's rotational velocity and is independent of the relative depth. If an IMU (inertial measurement unit) is installed on the lander, the rotational flow can be computed directly, and the translational flow can then be obtained by subtraction.
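A sketch of this de-rotation step under the definitions above; the measured flow is assumed to be already converted to pixels per second via formula (12):

```python
import numpy as np

def rotational_flow(x, y, f, omega_c):
    """Rotational flow component of formula (13) at image point (x, y);
    it depends only on the camera angular rate, not on depth."""
    wx, wy, wz = omega_c
    u_rot = (x * y / f) * wx - (f + x**2 / f) * wy + y * wz
    v_rot = (f + y**2 / f) * wx - (x * y / f) * wy - x * wz
    return np.array([u_rot, v_rot])

def translational_flow(flow_measured, x, y, f, omega_imu):
    """Translational flow w_t = w - w_omega, using IMU-measured angular
    rates omega_imu to remove the rotation-induced component."""
    return np.asarray(flow_measured) - rotational_flow(x, y, f, omega_imu)
```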
Step 3: obtain the line-of-sight information of the landing zone with the camera.
The relative line of sight between the lander and the landing zone is observed with the navigation camera, yielding the angle measurements of the relative line of sight. Fig. 3 shows the measurement principle. The relation between the lander position, the attitude and the observations is given by formulas (14) and (15):
$$\theta = \tan^{-1}\left(\frac{A_{11}(X - x') + A_{12}(Y - y') + A_{13}(Z - z')}{A_{31}(X - x') + A_{32}(Y - y') + A_{33}(Z - z')}\right) \tag{14}$$
$$\psi = \tan^{-1}\left(\frac{A_{21}(X - x') + A_{22}(Y - y') + A_{23}(Z - z')}{A_{31}(X - x') + A_{32}(Y - y') + A_{33}(Z - z')}\right) \tag{15}$$
where (θ, ψ) is the angle information of the relative line of sight of the landing zone; (X, Y, Z) are the coordinates of the landing zone in the inertial frame; (x', y', z') are the coordinates of the lander in the inertial frame; and A_ij (i, j = 1, 2, 3) are the components of the direction cosine matrix A of the camera frame relative to the inertial frame.
The angle information (θ, ψ) of the relative line of sight of the landing zone is used to correct the horizontal position during the lander's descent.
Step 4: compute the control acceleration a of the lander.
Using the optical flow information obtained in step 2 and the landing-zone line-of-sight information obtained in step 3, design the control law required during the lander's descent, compute the control acceleration a of the lander, and substitute it into equation (6), so as to correct the horizontal position deviation.
The control law takes the proportional form:
$$a = k_w(w^* - w) + k_\zeta(\zeta^* - \zeta) \tag{16}$$
where w*, ζ* are the desired optical flow value and line-of-sight angle, w, ζ are the optical flow value and line-of-sight angle of the lander at the current time, and k_w, k_ζ are the control gains on the optical flow deviation and on the line-of-sight angle deviation, respectively.
Repeat steps 2 to 4 to achieve a fixed-point landing on the small celestial body.
In this embodiment a simulation verification is carried out with the asteroid 433 Eros as the target body. The simulation conditions are: in the small-celestial-body landing-point coordinate frame, the initial position of the lander is [-1000, -1000, 1500]^T, the target state is the origin, and the landing-site position on the surface of the small celestial body is [0, 0, 0]^T.
The simulation results in Figs. 4-6 show that, over four simulation runs with different initial velocities, the landing control method proposed in this embodiment, which introduces the landing-zone line-of-sight information on top of the optical flow information and corrects the horizontal position of the lander, successfully achieves a fixed-point landing of the lander on the surface of the small celestial body.
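For readers without the figures, a self-contained closed-loop sketch of steps 2 to 4 is given below, started from the embodiment's initial position. The spin rate, gravity model, focal length, gains, descent law, units and sign conventions are all illustrative assumptions, not values from the patent:

```python
import numpy as np

omega = np.array([0.0, 0.0, 3.31e-4])     # assumed spin vector, rad/s
g = np.array([0.0, 0.0, -5.0e-3])         # placeholder uniform gravity, m/s^2
r = np.array([-1000.0, -1000.0, 1500.0])  # embodiment initial position (units assumed meters)
v = np.zeros(3)
f, dt = 0.05, 0.5                         # assumed focal length (m) and step (s)
k_w, k_zeta, k_v = 300.0, 0.2, 0.05       # placeholder gains

for _ in range(20000):
    alt = r[2]
    if alt <= 1.0:
        break
    # Synthetic measurements; signs follow the assumed camera orientation:
    w = f * v[:2] / alt                   # optical flow of the landing site
    zeta = np.arctan2(r[:2], alt)         # line-of-sight angles, formulas (14)-(15)
    # Formula (16) with zero references nulls horizontal velocity and steers
    # the lander over the site; the vertical channel is a placeholder.
    a_xy = k_w * (0.0 - w) + k_zeta * (0.0 - zeta)
    a_z = k_v * (-1.0 - v[2]) - g[2]      # track a -1 m/s descent rate
    a = np.array([a_xy[0], a_xy[1], a_z])
    # Equation (6) dynamics, explicit Euler:
    v_dot = g - 2.0 * np.cross(omega, v) - np.cross(omega, np.cross(omega, r)) + a
    r, v = r + dt * v, v + dt * v_dot

print("touchdown state r, v:", r, v)      # horizontal error settles toward zero
```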
The above detailed description further explains the purpose, technical scheme and beneficial effects of the invention. It should be understood that the above is only a specific embodiment of the invention and is used to explain, not to limit, the scope of protection of the invention; any modification, equivalent substitution or improvement made within the spirit and principles of the invention shall be included within the scope of protection of the invention.

Claims (3)

1. A small-celestial-body fixed-point landing guidance and control method based on optical flow information, characterized by comprising the following specific steps:
Step 1: establish the system dynamics equation;
the dynamics equation of the lander is established in the landing-point-fixed coordinate frame:
$$\dot{r} = v, \qquad \dot{v} = g - 2\,\omega \times v - \omega \times (\omega \times r) + a \tag{1}$$
where r is the position vector of the lander in the landing-point coordinate frame, v is the velocity vector of the lander, ω is the spin angular velocity vector of the target celestial body, g is the gravitational acceleration exerted on the lander by the target celestial body, and a is the control acceleration of the lander;
Step 2: obtain the optical flow information of the lander by the optical flow method;
based on the pinhole imaging principle, the optical flow information of the lander at the current time can be obtained from the camera velocity and the depth information:
$$u = \frac{-f V_x + x V_z}{Z_M} + \frac{xy}{f}\,\omega_x - \left(f + \frac{x^2}{f}\right)\omega_y + y\,\omega_z, \qquad v = \frac{-f V_y + y V_z}{Z_M} + \left(f + \frac{y^2}{f}\right)\omega_x - \frac{xy}{f}\,\omega_y - x\,\omega_z \tag{2}$$
where w = (u, v) is the optical flow information of the lander, (V_x, V_y, V_z) is the translational velocity of the camera in the inertial frame, (ω_x, ω_y, ω_z) is the angular velocity of the camera in the inertial frame, Z_M is the Z-axis coordinate in the camera frame of an arbitrary point M in three-dimensional space, (x, y) are the coordinates of the projection of point M in the camera image plane, and f is the known focal length of the navigation camera;
Step 3: obtain the line-of-sight information of the landing zone with the camera;
the relative line of sight between the lander and the landing zone is observed with the navigation camera, yielding the angle measurements of the relative line of sight; the relation between the lander position, the attitude and the observations is given by formulas (3) and (4):
$$\theta = \tan^{-1}\left(\frac{A_{11}(X - x') + A_{12}(Y - y') + A_{13}(Z - z')}{A_{31}(X - x') + A_{32}(Y - y') + A_{33}(Z - z')}\right) \tag{3}$$
$$\psi = \tan^{-1}\left(\frac{A_{21}(X - x') + A_{22}(Y - y') + A_{23}(Z - z')}{A_{31}(X - x') + A_{32}(Y - y') + A_{33}(Z - z')}\right) \tag{4}$$
where (θ, ψ) is the angle information of the relative line of sight of the landing zone; (X, Y, Z) are the coordinates of the landing zone in the inertial frame; (x', y', z') are the coordinates of the lander in the inertial frame; and A_ij (i, j = 1, 2, 3) are the components of the direction cosine matrix A of the camera frame relative to the inertial frame;
the angle information (θ, ψ) of the relative line of sight of the landing zone is used to correct the horizontal position during the lander's descent;
Step 4: compute the control acceleration a of the lander;
using the optical flow information obtained in step 2 and the landing-zone line-of-sight information obtained in step 3, design the control law required during the lander's descent, compute the control acceleration a of the lander, and substitute it into equation (1), so as to correct the horizontal position deviation;
repeat steps 2 to 4 to achieve a fixed-point landing on the small celestial body.
2. The small-celestial-body fixed-point landing guidance and control method based on optical flow information according to claim 1, characterized in that the control law can take a proportional form or a sliding-mode form.
3. The small-celestial-body fixed-point landing guidance and control method based on optical flow information according to claim 1 or 2, characterized in that, when the control law takes the proportional form:
$$a = k_w(w^* - w) + k_\zeta(\zeta^* - \zeta) \tag{5}$$
where w*, ζ* are the desired optical flow value and line-of-sight angle, w, ζ are the optical flow value and line-of-sight angle of the lander at the current time, and k_w, k_ζ are the control gains on the optical flow deviation and on the line-of-sight angle deviation, respectively.
CN201611250994.8A 2016-12-29 2016-12-29 Small-celestial-body fixed-point landing guidance and control method based on optical flow information Active CN106672265B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611250994.8A CN106672265B (en) 2016-12-29 2016-12-29 Small-celestial-body fixed-point landing guidance and control method based on optical flow information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611250994.8A CN106672265B (en) 2016-12-29 2016-12-29 Small-celestial-body fixed-point landing guidance and control method based on optical flow information

Publications (2)

Publication Number Publication Date
CN106672265A true CN106672265A (en) 2017-05-17
CN106672265B CN106672265B (en) 2019-02-15

Family

ID=58872701

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611250994.8A Active CN106672265B (en) 2016-12-29 2016-12-29 Small-celestial-body fixed-point landing guidance and control method based on optical flow information

Country Status (1)

Country Link
CN (1) CN106672265B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110239744A * 2019-06-28 2019-09-17 北京理工大学 Small celestial body landing fixed thrust trajectory tracking control method
CN110466805A * 2019-09-18 2019-11-19 北京理工大学 Asteroid landing guidance method based on optimized guidance parameters
CN112249369A (en) * 2020-09-28 2021-01-22 上海航天控制技术研究所 Rocket power fixed-point landing guidance method
CN113030517A (en) * 2021-02-18 2021-06-25 北京控制工程研究所 Attitude correction method by using speed measuring sensor in Mars landing process
CN115309057A (en) * 2022-09-05 2022-11-08 北京理工大学 Safe landing guidance method for planet surface complex terrain area

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103158891A (en) * 2013-03-04 2013-06-19 北京理工大学 Target selection method for flying over small celestial body from driven balance point track
CN104457705A (en) * 2014-12-12 2015-03-25 北京理工大学 Initial orbit determination method for deep space target celestial body based on space-based autonomous optical observation
US9067694B2 (en) * 2009-02-03 2015-06-30 The Boeing Company Position-based gyroless control of spacecraft attitude
CN105644785A (en) * 2015-12-31 2016-06-08 哈尔滨工业大学 Unmanned aerial vehicle landing method based on optical flow method and horizon line detection
CN105739537A (en) * 2016-03-29 2016-07-06 北京理工大学 Active control method for adhesion motion on small celestial body surface

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9067694B2 (en) * 2009-02-03 2015-06-30 The Boeing Company Position-based gyroless control of spacecraft attitude
CN103158891A (en) * 2013-03-04 2013-06-19 北京理工大学 Target selection method for flying over small celestial body from driven balance point track
CN104457705A (en) * 2014-12-12 2015-03-25 北京理工大学 Initial orbit determination method for deep space target celestial body based on space-based autonomous optical observation
CN105644785A (en) * 2015-12-31 2016-06-08 哈尔滨工业大学 Unmanned aerial vehicle landing method based on optical flow method and horizon line detection
CN105739537A (en) * 2016-03-29 2016-07-06 北京理工大学 Active control method for adhesion motion on small celestial body surface

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110239744A * 2019-06-28 2019-09-17 北京理工大学 Small celestial body landing fixed thrust trajectory tracking control method
CN110239744B (en) * 2019-06-28 2020-12-22 北京理工大学 Small celestial body landing fixed thrust trajectory tracking control method
CN110466805A * 2019-09-18 2019-11-19 北京理工大学 Asteroid landing guidance method based on optimized guidance parameters
CN112249369A (en) * 2020-09-28 2021-01-22 上海航天控制技术研究所 Rocket power fixed-point landing guidance method
CN112249369B (en) * 2020-09-28 2022-01-04 上海航天控制技术研究所 Rocket power fixed-point landing guidance method
CN113030517A (en) * 2021-02-18 2021-06-25 北京控制工程研究所 Attitude correction method by using speed measuring sensor in Mars landing process
CN113030517B (en) * 2021-02-18 2022-10-28 北京控制工程研究所 Attitude correction method by using speed measuring sensor in Mars landing process
CN115309057A (en) * 2022-09-05 2022-11-08 北京理工大学 Safe landing guidance method for planet surface complex terrain area
CN115309057B (en) * 2022-09-05 2023-08-11 北京理工大学 Planet surface complex terrain area safe landing guidance method

Also Published As

Publication number Publication date
CN106672265B (en) 2019-02-15

Similar Documents

Publication Publication Date Title
CN106708066B (en) View-based access control model/inertial navigation unmanned plane independent landing method
CN106672265B (en) A kind of small feature loss accuracy Guidance and control method based on Optic flow information
CN100587641C (en) A kind of attitude determination system that is applicable to the arbitrary motion mini system
CN102829779B (en) Aircraft multi-optical flow sensor and inertia navigation combination method
CN105021092B (en) A kind of guidance information extracting method of strapdown homing target seeker
CN105953796A (en) Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN103852082B (en) Inter-satellite measurement and gyro attitude orbit integrated smoothing estimation method
CN106017463A (en) Aircraft positioning method based on positioning and sensing device
CN109269511B (en) Curve matching visual navigation method for planet landing in unknown environment
CN104457705B (en) Deep space target celestial body based on the autonomous optical observation of space-based just orbit determination method
CN103913181A (en) Airborne distribution type POS (position and orientation system) transfer alignment method based on parameter identification
CN105953795B (en) A kind of navigation device and method for the tour of spacecraft surface
CN104833352A (en) Multi-medium complex-environment high-precision vision/inertia combination navigation method
CN105973238A (en) Spacecraft attitude estimation method based on norm-constrained cubature Kalman filter
Steiner et al. A vision-aided inertial navigation system for agile high-speed flight in unmapped environments: Distribution statement a: Approved for public release, distribution unlimited
Yun et al. IMU/Vision/Lidar integrated navigation system in GNSS denied environments
CN109813306A (en) A kind of unmanned vehicle planned trajectory satellite location data confidence level calculation method
CN103438890B (en) Based on the planetary power descending branch air navigation aid of TDS and image measurement
CN103968844B (en) Big oval motor-driven Spacecraft Autonomous Navigation method based on low rail platform tracking measurement
CN107144278A (en) A kind of lander vision navigation method based on multi-source feature
CN102426025A (en) Simulation analysis method for drift correction angle during remote sensing satellite attitude maneuver
CN103017772A (en) Optical and pulsar fusion type self-navigating method based on observability analysis
Delaune et al. Extended navigation capabilities for a future mars science helicopter concept
CN103335654A (en) Self-navigation method for planetary power descending branch
Xu et al. Landmark-based autonomous navigation for pinpoint planetary landing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant