CN1847791A - Verification system for fast autonomous deep-space optical navigation control prototype - Google Patents

Verification system for fast autonomous deep-space optical navigation control prototype

Info

Publication number
CN1847791A
Authority
CN
China
Prior art keywords
deep
deep space
space probe
simulation
autonomous
Prior art date
Legal status
Granted
Application number
CNA200610010033XA
Other languages
Chinese (zh)
Other versions
CN100451548C (en)
Inventor
崔平远
张泽旭
崔祜涛
徐瑞
朱圣英
史雪岩
Current Assignee
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date
Filing date
Publication date
Application filed by Harbin Institute of Technology
Priority to CNB200610010033XA
Publication of CN1847791A
Application granted
Publication of CN100451548C
Status: Expired - Fee Related

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The verification system for a fast autonomous deep-space optical navigation control prototype belongs to the field of virtual-reality-based semi-physical (hardware-in-the-loop) simulation systems. It comprises: an autonomous optical navigation control and dynamics simulation system under test, which, for a selected landing body and landing area, generates the deep-space probe's navigation information and control commands and performs the probe's dynamics simulation; a target simulation system, which uses the probe state produced by the dynamics simulation to generate the three-dimensional virtual environment the probe faces and presents it on a display; and an optical navigation camera system, which photographs the displayed image and uses it to drive generation of the next step of probe navigation information and control commands and to continue the probe dynamics simulation.

Description

Verification system for fast autonomous deep-space optical navigation control prototype
Technical field
The present invention relates to a simulation system for autonomous optical navigation control of a deep-space probe, and belongs to semi-physical (hardware-in-the-loop) simulation systems based on virtual reality technology.
Background art
With the growing number of interplanetary exploration missions, autonomous positioning of deep-space probes and safe soft landing on celestial-body surfaces have become key tasks and key problems for future deep-space scientific exploration. Autonomous soft-landing navigation and control means that, without direct human involvement, the probe uses information from its on-board sensors to autonomously determine its position, attitude and other pose information, evaluates the safety of candidate landing zones, selects the final landing point during descent, and autonomously generates the control laws that bring it safely to that point. Navigation and control schemes based on the optical characteristics of the target body and on surface image information are currently considered among the best navigation and control schemes for landing on a celestial body.
Before flight testing, a deep-space probe autonomous optical navigation control system must pass a phase of parameter testing and verification on the ground under a simulated deep-space environment. This mainly requires simulating the probe dynamics, producing its six-degree-of-freedom descent trajectory and attitude, and using that information to generate the images the navigation camera would capture, thereby providing the conditions for verifying different autonomous optical navigation control schemes. By comparing the six-degree-of-freedom descent trajectory and attitude produced by the dynamics simulation with the position and attitude estimates provided by the autonomous navigation system, the orbit-determination and attitude-determination performance of the optical navigation algorithm can be measured; the simulated optical characteristics of the target-body surface can be used to test the obstacle-detection function of the autonomous optical navigation algorithm; and by analysing the trajectory and attitude over the whole landing process and examining the landing-point location and the probe's velocity and attitude at touchdown, the feasibility of the soft-landing control scheme can be verified.
Among existing simulation and verification platforms for deep-space probe autonomous optical navigation control, prior art [1] (see Eli David Skulsky, Andrew Edie Johnson et al., Rocket Sled Testing of a Prototype Terrain-Relative Navigation System, AAS 01-026) is a platform built by NASA's Jet Propulsion Laboratory (JPL) in which a motor directly drags the probe camera system along a sled rail: the probe follows a pre-designed spatial trajectory to simulate actual space motion, and the platform is used to test and verify the performance parameters of the autonomous obstacle-detection algorithm proposed by JPL. This platform can simulate a one-dimensional translational trajectory and two-dimensional attitude motion of the probe, and can test the relevant performance parameters of the obstacle-detection algorithm. However, because the camera system is dragged along the rail directly by a motor, the target-body dynamics and the actuators cannot be simulated; the platform can only verify the performance of the autonomous obstacle-detection algorithm during soft landing, not a complete soft-landing system that also includes the navigation and control systems. Moreover, since it can only simulate three degrees of freedom of motion, it is difficult to reproduce complex probe trajectories, and the platform occupies a large floor area.
Prior art [2] (see Srikanth Saripalli, Gaurav S. Sukhatme et al., A Testbed for Mars Precision Landing Experiments by Emulating Spacecraft Dynamics on a Model Helicopter, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2097-2102, EPFL, Switzerland, Oct. 2002) is a platform built by the University of Southern California that uses a model helicopter to emulate a deep-space probe: the probe's dynamics model drives the helicopter to the probe's position and attitude states, while a navigation camera carried by the helicopter photographs a designated area on the ground and, together with the helicopter's other sensors, performs the navigation function. The platform can verify the accuracy and real-time performance of the autonomous soft-landing navigation algorithm, and also the feasibility of the soft-landing scheme. However, because it relies on a helicopter system, the platform is expensive and complex, and each verification test consumes substantial manpower, material resources and test time.
Virtual reality technology uses three-dimensional graphics generation, multi-sensory interaction and high-resolution display techniques to generate lifelike three-dimensional virtual environments. A virtual environment is an artificial three-dimensional information environment built from computer hardware, software and various sensors; in it, both physically realizable and physically unrealizable objects and environments can be represented in terms of function, and a user immersed in the environment can interact with it. The technology itself is relatively mature.
Summary of the invention
The purpose of the present invention is to provide a verification system for a fast autonomous deep-space optical navigation control prototype that overcomes the defects of the prior art when simulating the probe landing process, namely expensive equipment, complex systems, and verification tests that consume substantial manpower, material resources and test time. The system comprises an autonomous optical navigation control and dynamics simulation system 1 under test, which, for the selected landing body and landing area, generates the probe's navigation information and control commands and performs the probe dynamics simulation;
It further comprises a target simulation system 3, which uses the probe state produced by the dynamics simulation to generate the three-dimensional virtual environment the probe faces and presents it on a display 3-1; and an optical navigation camera system 4, in which an optical navigation camera 4-1 photographs the image shown on the display 3-1, and this image drives the autonomous optical navigation control and dynamics simulation system 1 under test to generate the next step of probe navigation information and control commands and to continue the probe dynamics simulation.
The basic principle of the present invention is to use virtual reality technology to drive target models and generate the three-dimensional virtual-environment images, as seen by the probe, that the optical navigation camera 4-1 captures. The invention also adopts a modular approach, using different target models for different mission phases. Using 3ds Max, a landing-zone terrain model library is built on a statistical basis for different landing-zone terrain characteristics; for different target bodies, a dynamics model library is built, in both inertial and surface-fixed coordinate frames, from each body's spin period, gravitational field coefficients and radius; for the variety of actuators a probe may carry, an actuator model library is built from the characteristics of each component; for the different navigation cameras a probe may carry, a camera library is built, and an image-algorithm library is built for the different cameras and image-processing forms; and for the different autonomous optical navigation control schemes used during probe flight, an autonomous optical navigation control scheme library is built.
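As an illustration of this modular organization (not part of the original disclosure), the following Python sketch shows one possible way to register per-body dynamics models and the other libraries for simple keyed selection; the class names, fields and library categories are assumptions made for the example, and the Moon entry only uses well-known physical constants as placeholder data.

```python
from dataclasses import dataclass, field

@dataclass
class BodyDynamicsModel:
    """Entry of the target-body dynamics library: spin period, gravitational field
    coefficients and body radius, usable in inertial or surface-fixed frames."""
    name: str
    spin_period_s: float
    radius_m: float
    grav_coefficients: dict
    frames: tuple = ("inertial", "surface-fixed")

@dataclass
class ModelLibraries:
    """The modular libraries described above, keyed by model name for simple selection."""
    terrain: dict = field(default_factory=dict)       # landing-zone terrain models (e.g. built in 3ds Max)
    dynamics: dict = field(default_factory=dict)      # target-body dynamics models
    actuators: dict = field(default_factory=dict)     # actuator component models
    cameras: dict = field(default_factory=dict)       # navigation-camera models
    image_algorithms: dict = field(default_factory=dict)
    nav_schemes: dict = field(default_factory=dict)   # autonomous optical navigation control schemes

    def register(self, category: str, key: str, model) -> None:
        getattr(self, category)[key] = model

    def select(self, category: str, key: str):
        return getattr(self, category)[key]

libs = ModelLibraries()
libs.register("dynamics", "moon",
              BodyDynamicsModel("Moon", spin_period_s=27.32 * 86400,
                                radius_m=1.7374e6, grav_coefficients={"mu": 4.9028e12}))
moon_model = libs.select("dynamics", "moon")
```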
When the present invention simulates the probe landing process, no large equipment such as a helicopter is used; the whole process relies on computer-based virtual reality technology, so the equipment is inexpensive, the system is simple, and each verification test can be completed with little manpower, material and test time. In addition, the hardware-in-the-loop technique of having the optical navigation camera 4-1 photograph the image shown on the display 3-1 gives high reliability.
The present invention can generate images captured by the probe over a six-degree-of-freedom range covering pitch and yaw angles from -90 to 90 degrees, roll angles from -180 to 180 degrees, and arbitrary three-dimensional positions. Because virtual reality technology is used, different types of navigation camera systems can be adopted. Because a dSPACE real-time simulator with powerful parameter-adjustment capability is used, dynamics models of different target bodies and different navigation and control schemes can be verified. The invention also provides dynamics model libraries for the nine major planets, the Moon and typical asteroids, a target-information library, a navigation-control-scheme library and an on-board actuator model library, all of which are extensible; the probe flight environment can be changed by simple drag-and-drop selection. Based on these characteristics, and compared with prior art [1] and [2], the present invention can freely simulate the probe in six degrees of freedom and, while verifying the feasibility of a complete autonomous optical navigation control system, offers easy operation, simple structure, high reliability, practicality, low cost and a small footprint.
The present invention uses the image information obtained by the optical navigation camera 4-1 to estimate the probe's pose, compares the estimates from the autonomous optical navigation control and dynamics simulation system 1 with the probe's true trajectory and attitude, and thereby verifies the performance of the optical navigation algorithm. By analysing the trajectory and attitude over the whole mission and examining the probe's position, velocity and attitude, the feasibility of the complete autonomous optical navigation control scheme is verified. By switching to landing-zone terrain with different characteristics, the robustness of the landing scheme can be verified.
Description of drawings
Fig. 1 is a structural schematic diagram of the present invention.
Embodiment
The present embodiment is described below with reference to Fig. 1. It consists of an autonomous optical navigation control and dynamics simulation system 1 under test, which, for the selected landing body and landing area, generates the probe's navigation information and control commands and performs the probe dynamics simulation;
a target simulation system 3, which uses the probe state produced by the dynamics simulation to generate the three-dimensional virtual environment the probe faces and presents it on a display 3-1; and an optical navigation camera system 4, in which an optical navigation camera 4-1 photographs the image shown on the display 3-1 and this image drives the autonomous optical navigation control and dynamics simulation system 1 under test to generate the next step of probe navigation information and control commands and to continue the probe dynamics simulation. Each part is described in detail below:
When simulating the probe's flight motion, the probe's overall motion state can be decomposed into three translational degrees of freedom and three attitude degrees of freedom. The three translational degrees of freedom are motion along the three axes (X, Y, Z); the three attitude degrees of freedom are decomposed into successive Euler rotations about the three coordinate axes. This system first rotates by the angle φ about the Z axis, then by the angle γ about the X axis, and finally by the angle θ about the Y axis. The angle φ is called the yaw angle, γ the roll angle and θ the pitch angle, and
φ = arctan(-A21/A22),  γ = arcsin(A23),  θ = arctan(-A13/A33)
where Aij is the element in row i, column j of the attitude transformation matrix A. At each time step, the probe position and attitude information (X, Y, Z, φ, γ, θ) provided by the dynamics part of the autonomous optical navigation control and dynamics simulation system 1 under test is passed to the target simulation system 3, which uses virtual reality technology to drive the target models and generate, in real time, the target image captured by the optical navigation camera 4-1.
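A minimal numerical check of this Z-X-Y decomposition (added for illustration, not part of the original text); the exact matrix convention and the symbol γ for the roll angle are assumptions chosen to be consistent with the formulas above.

```python
import numpy as np

def attitude_matrix(yaw, roll, pitch):
    """Z-X-Y attitude matrix: rotate by yaw about Z, then roll about X, then pitch about Y."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cx, sx = np.cos(roll), np.sin(roll)
    cy, sy = np.cos(pitch), np.sin(pitch)
    Rz = np.array([[cz, sz, 0], [-sz, cz, 0], [0, 0, 1]])
    Rx = np.array([[1, 0, 0], [0, cx, sx], [0, -sx, cx]])
    Ry = np.array([[cy, 0, -sy], [0, 1, 0], [sy, 0, cy]])
    return Ry @ Rx @ Rz

def euler_from_attitude(A):
    """Recover (yaw, roll, pitch) from the Z-X-Y attitude matrix, per the formulas above."""
    yaw = np.arctan2(-A[1, 0], A[1, 1])    # phi   = arctan(-A21 / A22)
    roll = np.arcsin(A[1, 2])              # gamma = arcsin(A23)
    pitch = np.arctan2(-A[0, 2], A[2, 2])  # theta = arctan(-A13 / A33)
    return yaw, roll, pitch

# round-trip check
A = attitude_matrix(0.3, -0.1, 0.7)
print(euler_from_attitude(A))  # ~ (0.3, -0.1, 0.7)
```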
The deep-space probe's dynamics model differs between mission phases and reference frames. The platform takes this into account and builds, for each target body, a probe dynamics model library covering the different mission phases and coordinate frames. For example, the translational (centre-of-mass) dynamics of the probe in an inertial frame can be expressed as

m d²r/dt² = P + mU
where m is the probe mass, r is the position vector of the lander's centre of mass in the inertial frame, P is the control thrust vector acting on the lander, and U is the gravitational acceleration of the celestial body. In a rotating (moving) coordinate frame, the centre-of-mass dynamics are expressed as
m δ²r/δt² = P + mU - 2m ωe × (δr/δt) - m ωe × (ωe × r)
where m is the probe mass, r is the position vector of the lander's centre of mass, P is the control thrust vector acting on the lander, U is the gravitational acceleration of the celestial body, and ωe is the angular velocity of the rotating frame.
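The two translational dynamics models above can be propagated numerically. The sketch below (illustrative only; it assumes a simple point-mass gravity field, a constant frame rotation rate and placeholder lunar-like numbers) shows both forms side by side:

```python
import numpy as np

def gravity(r, mu):
    """Simplified celestial gravity U = -mu * r / |r|^3 (a stand-in for the full field model)."""
    return -mu * r / np.linalg.norm(r) ** 3

def accel_inertial(r, v, thrust, m, mu):
    """Inertial frame: m d2r/dt2 = P + mU  ->  a = P/m + U."""
    return thrust / m + gravity(r, mu)

def accel_rotating(r, v, thrust, m, mu, omega_e):
    """Rotating frame: a = P/m + U - 2 w x v - w x (w x r), per the equation above."""
    return (thrust / m + gravity(r, mu)
            - 2.0 * np.cross(omega_e, v)
            - np.cross(omega_e, np.cross(omega_e, r)))

def integrate(r, v, thrust, dt, steps, accel_fn, **kw):
    """Semi-implicit Euler propagation of position and velocity."""
    for _ in range(steps):
        a = accel_fn(r, v, thrust, **kw)
        v = v + a * dt
        r = r + v * dt
    return r, v

# example: 60 s of powered descent above a Moon-like body (placeholder numbers)
r0 = np.array([0.0, 0.0, 1.7374e6 + 10e3])
v0 = np.array([0.0, 0.0, -80.0])
thrust = np.array([0.0, 0.0, 1500.0])
r, v = integrate(r0, v0, thrust, dt=0.1, steps=600,
                 accel_fn=accel_inertial, m=600.0, mu=4.9028e12)
```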
(1) The dynamics simulation part of the autonomous optical navigation control and dynamics simulation system 1 under test: this part is implemented on a dSPACE real-time simulator together with the simulated on-board optical autonomous navigation control system, and performs the simulation of the actuators and the probe dynamics. Because the probe's dynamics model differs between mission phases, target bodies and reference frames, this part builds, for each target body, a probe dynamics model library covering the different mission phases and coordinate frames; using this library, the operator can select a dynamics model by simple drag-and-drop, or extend the library as verification needs require. The control signal that the simulated on-board autonomous optical navigation control system provides to the actuators, i.e. the control law, drives the actuator models to produce control forces and torques; the orbital dynamics model and the attitude dynamics model then provide the probe's state information such as position, velocity and attitude. Because probes carry different actuators and explore different target bodies, their dynamics models differ, and they also differ between mission phases and coordinate frames. The present invention therefore builds model libraries for the common actuators and target bodies in the different mission phases and coordinate frames, as listed below (a simulation-loop sketch is given after the library lists):
Mission phases include: the interplanetary cruise phase, the target-body approach phase and the target-body landing phase.
Coordinate frames include: the target-body-centred inertial frame, the translated inertial frame, the landing-point frame, the velocity frame, the half-velocity frame and the probe body frame.
Actuators include: attitude thrusters, engines, flywheels and servos.
Target bodies include: the nine major planets, the Moon and typical asteroids.
Each of the above libraries is extensible. For example, once the actuators for a mission have been determined before flight, the operator can test each actuator, obtain its parameters, input those parameters into the present invention, and use the invention to verify the scheme.
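The simulation-loop sketch referred to above: a hypothetical actuator model turns the control command into a thrust force, which then drives the translational dynamics and feeds the next control decision. All class names, the toy control law and the numerical values are assumptions made for illustration; this is not the patented implementation.

```python
import numpy as np

class ThrusterModel:
    """Simplified actuator library entry: commanded throttle in [0, 1] -> thrust force vector.
    A stand-in for the attitude-thruster / engine / flywheel / servo models listed above."""
    def __init__(self, max_thrust_n, direction):
        self.max_thrust = max_thrust_n
        self.direction = np.asarray(direction, dtype=float)

    def force(self, throttle):
        return np.clip(throttle, 0.0, 1.0) * self.max_thrust * self.direction

def control_law(v, target_descent_rate=-5.0, gain=0.2):
    """Toy control law standing in for the navigation-control output: throttle up
    when descending faster than the target rate (purely illustrative)."""
    return float(np.clip(gain * (target_descent_rate - v[2]) / abs(target_descent_rate), 0.0, 1.0))

def descent_step(r, v, throttle, thruster, dt, m=600.0, mu=4.9028e12):
    """One loop pass: control command -> actuator force -> translational dynamics -> next state.
    mu defaults to the Moon's gravitational parameter; other numbers are placeholders."""
    grav = -mu * r / np.linalg.norm(r) ** 3
    a = thruster.force(throttle) / m + grav
    return r + v * dt, v + a * dt

# illustrative closed loop: brake a vertical descent with an upward-pointing engine
engine = ThrusterModel(max_thrust_n=2000.0, direction=[0.0, 0.0, 1.0])
r, v = np.array([0.0, 0.0, 1.7374e6 + 5e3]), np.array([0.0, 0.0, -50.0])
for _ in range(200):
    throttle = control_law(v)
    r, v = descent_step(r, v, throttle, engine, dt=0.5)
```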
(2) The autonomous optical navigation control part of the autonomous optical navigation control and dynamics simulation system 1 under test: this part simulates the on-board autonomous optical navigation control system and provides a library of common autonomous optical navigation control schemes and their algorithms. The prototype verification platform can handle different mission phases, surface characteristics, camera systems, on-board actuators, target bodies and navigation-control schemes. By comparing the probe trajectory and attitude estimated by the navigation control system with the true values, the platform verifies the accuracy and real-time performance of the probe's autonomous navigation algorithm; by analysing the trajectory and attitude over the whole mission and examining the probe's position, velocity and attitude, it verifies the feasibility of the complete autonomous optical navigation control scheme.
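One way to quantify the comparison between the estimated and true trajectory and attitude histories is sketched below; the specific metrics (per-axis RMS and maximum error) are an illustration, not criteria specified in the patent.

```python
import numpy as np

def navigation_errors(true_states, estimated_states):
    """Compare the dynamics-simulation 'truth' trajectory with the navigation estimates.
    Each input is an N x 6 array: position (x, y, z) and attitude (yaw, roll, pitch) in radians.
    Returns per-axis RMS and maximum absolute errors."""
    err = np.asarray(estimated_states, dtype=float) - np.asarray(true_states, dtype=float)
    # wrap attitude-angle differences into [-pi, pi)
    err[:, 3:] = (err[:, 3:] + np.pi) % (2 * np.pi) - np.pi
    return {
        "rms": np.sqrt(np.mean(err ** 2, axis=0)),
        "max_abs": np.max(np.abs(err), axis=0),
    }
```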
(3) The target simulation system 3: this system uses virtual reality technology to drive the target models and generate the target images captured by the optical navigation camera 4-1. The target models comprise a target-body motion model, a target-body brightness model and a target-body surface model. The motion and brightness models are the target models used during the cruise and approach phases; they include the positions and brightness of celestial bodies in the sky at the current time, and the size, colour, reflectivity and similar properties of the target body to be explored, and are built from data provided by NASA's JPL. The target-body surface model is the target model used during the landing phase: with reference to surface images of target bodies taken by previously launched deep-space probes from various countries, a statistical analysis of target-body surfaces yields the occurrence probabilities of various terrain features, and 3ds Max is used to build a model library of target-body surfaces with terrain dominated by different surface features such as impact craters, valleys, rocks and slopes. OpenGVS drives the selected three-dimensional models and generates on the display screen, according to the probe's current position, attitude, illumination and other information, the target-body surface image captured by the navigation camera; images can be generated over a six-degree-of-freedom range covering pitch and yaw angles from -90 to 90 degrees, roll angles from -180 to 180 degrees, and arbitrary three-dimensional positions.
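For illustration, the sketch below builds the per-frame camera (view) transform that a scene renderer of this kind needs from the probe's position and Euler angles. OpenGVS has its own API for driving the scene, so the matrix construction here is only a generic, assumed form consistent with the Z-X-Y angle convention used earlier.

```python
import numpy as np

def view_matrix(position, yaw, roll, pitch):
    """4x4 camera (view) matrix from the probe position and Z-X-Y Euler angles:
    rotates world coordinates into the camera/body frame and translates the origin."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cx, sx = np.cos(roll), np.sin(roll)
    cy, sy = np.cos(pitch), np.sin(pitch)
    Rz = np.array([[cz, sz, 0], [-sz, cz, 0], [0, 0, 1]])
    Rx = np.array([[1, 0, 0], [0, cx, sx], [0, -sx, cx]])
    Ry = np.array([[cy, 0, -sy], [0, 1, 0], [sy, 0, cy]])
    R = Ry @ Rx @ Rz                       # world -> camera rotation
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = -R @ np.asarray(position)   # bring the world origin into the camera frame
    return M

# example: probe 2 km above the surface, pitched down 0.3 rad
V = view_matrix([0.0, 0.0, 2000.0], yaw=0.0, roll=0.0, pitch=0.3)
```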
(4) The optical navigation camera system 4: because the target simulation system 3 uses virtual reality technology, the optical navigation camera system 4 can use different types of navigation cameras, so the test platform can test the candidate cameras to be carried by the probe and verify whether each camera meets the performance requirements of the overall system. At the same time, different navigation-control schemes require different information from the camera system, such as low-level information (feature-point positions, feature-point correspondences, obstacle locations and sizes) or mid-level information (probe motion estimates, surface depth estimates), which in turn requires different image-processing algorithms. Because this system is relatively independent of the other systems, the image-processing algorithm can easily be replaced; the algorithms here are written separately for the different navigation systems, each extracting the image information it needs.
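A sketch of how the image-processing stage could be kept swappable behind a common interface; the class and function names and the trivial default extractor are assumptions made for illustration, not algorithms disclosed in the patent.

```python
import numpy as np

def detect_feature_points(image, max_points=100):
    """Placeholder low-level extractor: pick the brightest pixels as candidate feature points.
    In the platform, this slot would hold whichever algorithm the navigation scheme under test
    requires (feature tracking, obstacle detection, surface-depth estimation, ...)."""
    flat = np.argsort(image, axis=None)[-max_points:]
    rows, cols = np.unravel_index(flat, image.shape)
    return np.stack([rows, cols], axis=1)

class ImageProcessor:
    """Swappable image-processing stage: different navigation schemes plug in different algorithms."""
    def __init__(self, algorithm=detect_feature_points):
        self.algorithm = algorithm

    def process(self, image):
        return self.algorithm(image)

# example: replace the default with an obstacle detector without touching the rest of the loop
# processor = ImageProcessor(algorithm=my_obstacle_detector)
processor = ImageProcessor()
points = processor.process(np.random.rand(480, 640))
```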

Claims (1)

1. A verification system for a fast autonomous deep-space optical navigation control prototype, comprising an autonomous optical navigation control and dynamics simulation system (1) under test, which, for a selected landing body and landing area, generates the deep-space probe's navigation information and control commands and performs the probe dynamics simulation; characterized in that it further comprises a target simulation system (3), which uses the probe state produced by the dynamics simulation to generate the three-dimensional virtual environment the probe faces and presents it on a display (3-1); and an optical navigation camera system (4), in which an optical navigation camera (4-1) photographs the image shown on the display (3-1), and this image drives the autonomous optical navigation control and dynamics simulation system (1) under test to generate the next step of probe navigation information and control commands and to continue the probe dynamics simulation.
CNB200610010033XA 2006-05-12 2006-05-12 Verification system for fast autonomous deep-space optical navigation control prototype Expired - Fee Related CN100451548C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB200610010033XA CN100451548C (en) 2006-05-12 2006-05-12 Verification system for fast autonomous deep-space optical navigation control prototype

Publications (2)

Publication Number Publication Date
CN1847791A (en) 2006-10-18
CN100451548C CN100451548C (en) 2009-01-14

Family

ID=37077439

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB200610010033XA Expired - Fee Related CN100451548C (en) 2006-05-12 2006-05-12 Verification system for fast autonomous deep-space optical navigation control prototype

Country Status (1)

Country Link
CN (1) CN100451548C (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101726296B (en) * 2009-12-22 2013-10-09 哈尔滨工业大学 Vision measurement, path planning and GNC integrated simulation system for space robot
CN102116641B (en) * 2009-12-31 2012-08-08 北京控制工程研究所 Semi-physical simulation testing system and method for deep space autonomous navigation star sensor
CN101830290B (en) * 2010-02-12 2012-12-19 哈尔滨工业大学 Autonomous navigation and guidance control programming dispatching method for small celestial body impact probing

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4688092A (en) * 1986-05-06 1987-08-18 Ford Aerospace & Communications Corporation Satellite camera image navigation
CN1276062A (en) * 1997-10-24 2000-12-06 韦尔斯和弗恩投资有限公司 Camera guide system
JPH11183192A (en) * 1997-12-22 1999-07-09 Mitsubishi Electric Corp Altitude-calculating apparatus
JP2005291885A (en) * 2004-03-31 2005-10-20 Nec Corp Portable communication terminal with navigation function
JP4470105B2 (en) * 2004-05-20 2010-06-02 ノーリツ鋼機株式会社 Navigation system and program
CN1693851A (en) * 2005-06-08 2005-11-09 中国科学院上海技术物理研究所 Aviation linear array CCD image geometric rough correct algorithm

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101074880B (en) * 2007-07-24 2010-06-16 北京控制工程研究所 Method for scanning entry safety landing area in moon detector suspension stage
CN101762274B (en) * 2010-02-01 2011-11-09 北京理工大学 Observation condition number-based method for selecting autonomously located road sign of deep space probe
CN105302156A (en) * 2015-12-03 2016-02-03 上海新跃仪表厂 Ground verification system and track planning method for tracking spacecraft
CN105302156B (en) * 2015-12-03 2018-01-30 上海新跃仪表厂 A kind of method for planning track of ground validation system and pursuit spacecraft
CN105628055A (en) * 2016-01-06 2016-06-01 北京工业大学 Autonomous optical navigation target imaging analog system for landing of deep space probe
CN105628055B (en) * 2016-01-06 2018-07-31 北京工业大学 A kind of deep space probe landing autonomous optical navigation target imaging simulation system
CN107132542B (en) * 2017-05-02 2019-10-15 北京理工大学 A kind of small feature loss soft landing autonomic air navigation aid based on optics and Doppler radar
CN107132542A (en) * 2017-05-02 2017-09-05 北京理工大学 A kind of small feature loss soft landing autonomic air navigation aid based on optics and Doppler radar
CN109541714A (en) * 2018-12-14 2019-03-29 上海航天控制技术研究所 A kind of optical target analogy method under distribution dynamic scene
CN110174850A (en) * 2019-04-30 2019-08-27 上海卫星工程研究所 Angle measurement test the speed Integrated Navigation Semi-physical Simulation verifying system and method
CN111537000A (en) * 2020-06-08 2020-08-14 中国科学院微小卫星创新研究院 Ground verification system and method for deep space small celestial body landing segment optical navigation algorithm
CN112550778A (en) * 2020-11-10 2021-03-26 中国科学院空间应用工程与技术中心 Deep space exploration visual imaging environment simulation device and method
CN114291263A (en) * 2022-01-13 2022-04-08 沈阳智翔通飞通用航空技术有限公司 Remote control system for coaxial dual-rotor unmanned helicopter
CN114396948A (en) * 2022-01-13 2022-04-26 中国科学院微小卫星创新研究院 High-precision ground test system suitable for verifying autonomous navigation of multi-vision system
CN117826619A (en) * 2024-03-05 2024-04-05 哈尔滨工业大学 Extraterrestrial celestial body detection soft landing multimachine reference correction and ground simulation system and method
CN117826619B (en) * 2024-03-05 2024-05-14 哈尔滨工业大学 Extraterrestrial celestial body detection soft landing multimachine reference correction and ground simulation system and method

Also Published As

Publication number Publication date
CN100451548C (en) 2009-01-14

Similar Documents

Publication Publication Date Title
CN100451548C (en) Verification system for fast autonomous deep-space optical navigation control prototype
Rybus et al. Planar air-bearing microgravity simulators: Review of applications, existing solutions and design parameters
CN101726296B (en) Vision measurement, path planning and GNC integrated simulation system for space robot
Black A passive system for determining the attitude of a satellite
CN100495261C (en) Half-physical emulation test system for controlling and guiding, navigating and controlling soft landing for moon
CN104898642B (en) A kind of integration testing analogue system for Spacecraft Attitude Control algorithm
CN100451549C (en) Simulation test device for visual navigation algorithm of soft landing of deep-space detector
Tsiotras ASTROS: A 5DOF experimental facility for research in space proximity operations
CN105628055B (en) A kind of deep space probe landing autonomous optical navigation target imaging simulation system
CN105466477A (en) A space-based observation simulation system and method targeted at satellite targets and fixed star targets
Yoshimitsu et al. Hayabusa-final autonomous descent and landing based on target marker tracking
Schulte et al. Development of an integrated spacecraft guidance, navigation, & control subsystem for automated proximity operations
CN112650076B (en) Constellation cooperative control ground simulation system
Sabatini et al. Design and tests of a frictionless 2D platform for studying space navigation and control subsystems
Cameron et al. DSENDS: multi-mission flight dynamics simulator for NASA missions
Benninghoff et al. Autonomous navigation for on-orbit servicing
Rew et al. Control system design of the Korean lunar lander demonstrator
Amiri Atashgah et al. An integrated virtual environment for feasibility studies and implementation of aerial MonoSLAM
Schwartz et al. The distributed spacecraft attitude control system simulator: development, progress, plans
Helmick et al. Small body surface mobility with a limbed robot
Jung et al. Dynamic 3-D Visualization of the Korea Pathfinder Lunar Orbiter Attitude Control Simulator
CN114721296A (en) Spatial non-cooperative target capturing, controlling and assembly control test device and method
Felicetti et al. Image-based visual servoing control for spacecraft formation flying
Bajracharya et al. Target tracking, approach, and camera handoff for automated instrument placement
Steiner III A unified vision and inertial navigation system for planetary hoppers

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090114

Termination date: 20110512