CN105628055A - Autonomous optical navigation target imaging analog system for landing of deep space probe - Google Patents
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
Abstract
The invention discloses a target imaging simulation system for autonomous optical navigation during the landing of a deep space probe. A target celestial body topography generation unit reads coarse terrain DEM (Digital Elevation Model) data, a surface physical parameter configuration file, and a surface feature configuration file, and generates, by algorithm, polygonal three-dimensional terrain data consistent with the topographical features of the target celestial body. A shadow projection generation unit computes, from an ephemeris, the position of the target celestial body relative to the sun and the solar incidence angle at the current simulation time, and renders a shadow projection map. A navigation camera imaging simulation unit renders an image meeting the camera's requirements from the current camera parameters, position, illumination parameters, and terrain texture information. A data input/output unit receives and parses, according to a protocol, the camera position, attitude parameters, and current simulation time sent by the client, and packs and returns the rendered image data to the client for use by the autonomous optical navigation solver, thereby realizing imaging simulation of the navigation target.
Description
Technical field
The present invention relates to a target imaging simulation system for autonomous optical navigation during the landing of a deep space probe, and belongs to the field of digital simulation systems based on virtual reality technology.
Background technology
In deep space exploration, because the target celestial body is far from the earth, there is a long communication delay between the probe and ground stations, and navigation and guidance based on the Deep Space Network cannot meet requirements. A landing probe must therefore have autonomous guidance, navigation, and control (GNC) capability. An optical navigation system is small, consumes little power, and carries rich information; it can provide probe position, velocity, and attitude throughout the landing mission, and can perform obstacle detection in the final landing phase to ensure a safe soft landing. A navigation system based on an optical navigation camera and an IMU is considered one of the best schemes for achieving an accurate soft landing on a celestial body. NASA's Jet Propulsion Laboratory (JPL), ESA, and Japan's Institute of Space and Astronautical Science (ISAS) have all conducted substantial research on camera-based navigation systems and made notable progress.
Digital simulation and verification platforms for optical navigation systems, however, have not received enough development. Research on optics-based navigation systems faces the following difficulties:
1. Real images of celestial surfaces are difficult to acquire and of limited resolution;
2. For the celestial surface images currently available, the altitude, camera focal length, and other parameters at the time of shooting cannot be determined, so the spatial position of a target point in the image cannot be computed in the camera coordinate system;
3. Continuous surface imagery of the probe's descent cannot be obtained, so the full descent process cannot be digitally simulated;
4. The available surface images have no matching radar ranging imagery, so obstacle detection algorithms based on elevation maps and images cannot be verified in simulation;
5. Closed-loop simulation of an optically navigated soft-landing control system is not possible.
To address these problems, NASA's JPL has investigated several schemes, including: 1) a rocket sled, 2) a manned helicopter, 3) a parachute, 4) an unmanned aerial vehicle, 5) a crane boom, and 6) an autonomous helicopter. In paper [1] (Eli David Skulsky, Andrew Edie Johnson et al., "Rocket Sled Testing of a Prototype Terrain-Relative Navigation System", 24th Annual AAS Guidance and Control Conference, Jan 31, 2001), JPL uses a rocket sled to drag a platform carrying the probe, simulating the landing process; the probe carries a lidar measurement system, an IMU, and a camera, and a sand table simulating the target landing area is placed vertically in front of the rail. In paper [2] (Srikanth Saripalli and Gaurav S. Sukhatme, "A Testbed for Mars Precision Landing Experiments by Emulating Spacecraft Dynamics on a Model Helicopter", IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2097-2102, EPFL, Switzerland, Oct 2002), Saripalli et al. propose using a model helicopter to emulate the probe's descent: a computer integrates the probe's dynamic model to obtain its current position and attitude, the helicopter's pose is adjusted to match the probe's, and the helicopter's autonomous navigation algorithm is verified. Paper [3] (Montgomery J. F., Johnson A. E., Roumeliotis S. I., et al., "The Jet Propulsion Laboratory autonomous helicopter testbed: A platform for planetary exploration technology research and development", Journal of Field Robotics, 2006, 23(3-4): 245-267) summarizes the pros and cons of these schemes and describes the progress and hardware configuration of JPL's crane boom platform and unmanned helicopter verification platform; the two platforms can cooperate and are used to verify algorithms at different mission phases.
All of these schemes are physical simulation and verification platforms: they are bulky, expensive, and complex, testing is costly and slow, and, limited by the maneuverability of physical systems such as helicopters, they cannot fully reproduce the three-dimensional trajectory and attitude of a descending probe.
Paper [4] (Kubota T., Hashimoto T., Sawai S., et al., "An autonomous navigation and guidance system for MUSES-C asteroid landing", Acta Astronautica, 2003, 52(2): 125-131) mentions that, to verify the reliability of the Hayabusa (MUSES-C) navigation camera, Japan's ISAS built an asteroid image simulation system on an SGI workstation. The system can render a three-dimensional model of Phobos as the target landing body and simulate the images taken by the probe's navigation camera at a precise position and attitude, for verifying GNC system performance.
Patent [5] (Autonomous deep-space optical navigation control prototype verification system, CN100451548C) proposes a rapid prototyping verification system for optical navigation, comprising: 1) the autonomous optical navigation control system to be verified and a dynamics simulation system, implemented on a dSPACE real-time simulator, which simulates the actuators and spacecraft dynamics; 2) a target simulation system that uses 3D rendering to display on a monitor the three-dimensional virtual environment encountered by the probe; and 3) an optical navigation camera system. The target simulator drives a selected three-dimensional model with OpenGVS and shows the generated target surface image on screen for the navigation camera to capture. Its shortcomings are: 1) OpenGVS is a dated 3D engine designed to run on computers of widely varying performance; it supports only OpenGL 1.2, has limited rendering capability, does not support shading languages, cannot exploit current graphics hardware, and can hardly render scenes realistic enough for optical navigation; 2) the rendered image is shown on a monitor and re-photographed by a camera, so it is limited by the monitor's size, resolution, and refresh rate; image quality necessarily degrades, and a correct picture may not be captured at all; 3) the system is complex, involves much expensive professional simulation hardware, and is inflexible, making rapid simulation-based verification of optical navigation algorithms difficult.
Summary of the invention
The object of the present invention is to provide a target imaging simulation system for autonomous optical navigation during deep space probe landing, overcoming the complexity, high cost, and inflexibility of existing systems, so that researchers can rapidly verify autonomous optical navigation algorithms on this system.
The system comprises a target celestial body topography generation unit, a shadow projection generation unit, a navigation camera imaging simulation unit, and a data input/output unit. The topography generation unit interacts with the navigation camera imaging simulation unit and the shadow projection generation unit through the polygonal three-dimensional terrain data; the navigation camera imaging simulation unit interacts with the data input/output unit; the shadow projection generation unit is connected to the navigation camera imaging simulation unit through the shadow projection map; and the data input/output unit is connected to the shadow projection generation unit.
The target celestial body topography generation unit reads the coarse terrain DEM data, the surface parameter configuration file, and the surface feature configuration file, and generates, by algorithm, polygonal three-dimensional terrain data consistent with the topographical features of the target celestial body. The shadow projection generation unit computes, from an ephemeris, the position of the target celestial body relative to the sun and the solar incidence angle at the current simulation time, and renders the shadow projection map. The navigation camera imaging simulation unit renders an image meeting the camera's requirements from the current camera parameters, position, illumination parameters, and terrain material information. The data input/output unit receives and parses, according to a protocol, the camera position, attitude parameters, and current simulation time sent by the client, and packs and returns the rendered image data to the client for use by the autonomous optical navigation solver, realizing imaging simulation of the navigation camera's target.
The basic principle of the invention is to use DirectX 11, a recent result of computer graphics development, to render the three-dimensional image of the target celestial body, realizing imaging simulation of the navigation camera's target during the landing of a deep space probe.
The invention simulates the image taken by the navigation camera at any position and attitude in three-dimensional space.
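As a sketch of the geometry such a simulation reproduces, the mapping from a world point to a pixel can be written as a pinhole projection. The function below is illustrative only: the name `project_point`, the row-vector rotation convention, and the principal-point parameters `cx`, `cy` are assumptions, not part of the patent.

```python
def project_point(p_world, cam_pos, cam_R, focal_px, cx, cy):
    """Project a world-frame point into pixel coordinates for a pinhole camera.

    cam_R is the 3x3 rotation taking world coordinates into the camera frame
    (its rows are the camera axes expressed in the world frame)."""
    # Translate into camera-centred coordinates, then rotate into the camera frame.
    d = [p_world[i] - cam_pos[i] for i in range(3)]
    pc = [sum(cam_R[r][i] * d[i] for i in range(3)) for r in range(3)]
    if pc[2] <= 0:
        return None  # point is behind the camera
    # Perspective divide, scaled by focal length in pixels.
    u = cx + focal_px * pc[0] / pc[2]
    v = cy + focal_px * pc[1] / pc[2]
    return (u, v)
```

The same relationship, inverted, is what an optical navigation algorithm exploits when it recovers camera pose from identified landmarks.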
Compared with the prior art, the advantages of the present invention are:
(1) The system makes full use of advances in computer technology, particularly graphics hardware and 3D rendering. It is an all-digital simulation system: no large equipment such as helicopters or rocket sleds is used and no sand table model needs to be built, so algorithm designers can concentrate on testing the reliability of their algorithms on the platform without worrying about underlying hardware design;
(2) Compared with patent [5], the target simulator of the present invention renders the scene in front of the camera directly from the navigation camera parameters to simulate the captured image. The rendering result is read directly from the frame buffer rather than being displayed on a monitor and re-photographed, so it is not limited by monitor size or resolution, nor affected by refresh rate or color distortion, ensuring the closest possible simulation of the image the camera would take.
Brief description of the drawings
Fig. 1 is a block diagram of the system.
Detailed description of the invention
The present invention is further described below with reference to Fig. 1 and specific embodiments.
The target celestial body topography generation unit comprises its corresponding terrain DEM database and two configuration files.
The topography generation unit is an independent program. Before running a simulation with this system, researchers must first use it to pre-generate terrain data of the target landing body at the required precision.
The terrain DEM (Digital Elevation Model) database contains coarse DEM data of some celestial bodies already surveyed, including the moon, Mars, 433 Eros, and 25143 Itokawa. When the database does not contain data for the target body on which a landing is to be simulated, researchers can manually import target body DEM data that meets the format requirements.
The terrain parameter configuration file is a text file that researchers can edit with any text editor. Its parameters include: target body name, approximate landing trajectory, final landing area extent, and DEM precision requirement. Specifying the approximate landing trajectory and final landing area allows the generated three-dimensional terrain model to be optimized: a high-resolution terrain model is generated only for the region the probe passes over, improving runtime efficiency.
The terrain feature configuration file is also a text file editable with any text editor. Its parameters include the terrain generation mode (random, or generated from parameters specified by the researcher). If set to random generation, the researcher specifies crater distribution parameters (age, diameter, depth, rim height, and position probability distribution) and rock distribution parameters (size and position probability distribution). If set to specified generation, the researcher sets the number of craters and, for each crater, its age, diameter, depth, and position, as well as the number of rocks and each rock's size and position. The two generation modes can be mixed.
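As an illustration of what such a feature file might contain, the sketch below uses a hypothetical key=value syntax and a minimal parser. The patent does not specify the file format, so every key name here is an assumption.

```python
# Hypothetical feature file in "specified" generation mode.
SAMPLE_FEATURE_CONFIG = """\
generation_mode = specified
crater_count = 2
crater_1 = diameter:120, depth:15, rim_height:4, x:300, y:450
crater_2 = diameter:40, depth:6, rim_height:1.5, x:780, y:120
rock_count = 1
rock_1 = size:2.5, x:505, y:610
"""

def parse_config(text):
    """Parse key=value lines into a dict; '#' lines and blanks are skipped."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition('=')
        params[key.strip()] = value.strip()
    return params
```

A plain-text format like this matches the patent's requirement that the file be editable in any text editor.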
The topography generation unit produces the polygonal three-dimensional terrain data of the target celestial body as follows:
(1) Start; read the terrain parameter configuration file, the terrain feature configuration file, and the target body DEM data configured by the researcher;
(2) According to the terrain parameters, process the target body DEM data with a fractal algorithm, then apply smoothing (blurring) to make the terrain smoother, finally obtaining the required high-resolution DEM data;
(3) According to the terrain feature parameters and the crater and rock mathematical models, generate three-dimensional crater and rock models meeting the requirements, superimpose them on the DEM data generated in (2), and produce polygonal three-dimensional terrain data, stored in blocks at different precisions for the shadow projection map generation unit and the navigation camera imaging unit to read.
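Step (2) can be sketched as follows, under the assumption that the fractal processing is a midpoint-style 2x upsampling with random perturbation at the new samples and that the smoothing step is a simple box blur; the patent does not specify the exact algorithms, so both choices are illustrative.

```python
import random

def refine_dem(dem, roughness=0.5, seed=0):
    """One refinement step: bilinearly upsample an n x n DEM to (2n-1) x (2n-1),
    then perturb the newly created samples (a simple fractal-detail model).
    Original samples are preserved exactly."""
    rng = random.Random(seed)
    n = len(dem)
    m = 2 * n - 1
    out = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(m):
            i0, j0 = i // 2, j // 2
            i1 = min(i0 + (i % 2), n - 1)
            j1 = min(j0 + (j % 2), n - 1)
            base = (dem[i0][j0] + dem[i1][j0] + dem[i0][j1] + dem[i1][j1]) / 4.0
            # Noise only at new (odd-index) samples; scale set by `roughness`.
            noise = rng.uniform(-roughness, roughness) if (i % 2 or j % 2) else 0.0
            out[i][j] = base + noise
    return out

def smooth(dem):
    """3x3 box blur (the 'fuzzy processing' step) to soften upsampling artifacts."""
    n = len(dem)
    out = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            vals = [dem[a][b]
                    for a in range(max(0, i - 1), min(n, i + 2))
                    for b in range(max(0, j - 1), min(n, j + 2))]
            out[i][j] = sum(vals) / len(vals)
    return out
```

Repeated calls to `refine_dem` followed by `smooth` would take a coarse DEM toward the required high-resolution grid.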
The shadow projection map generation unit comprises its corresponding ephemeris database and celestial body database.
The shadow projection map generation unit is a thread, started together with the navigation camera imaging simulation unit. It renders the map as follows:
(1) Initialize; read the polygonal three-dimensional terrain data produced by the topography generation unit, the ephemeris database, and the celestial body database; determine the target body's orbital parameters and its rotation and revolution rates, and from these compute the interval at which the map must be re-rendered;
(2) From the target body information, the ephemeris, and the current simulation time received from the data input/output unit, compute the position of the sun relative to the target body and the solar incidence angle at the current simulation time;
(3) From the polygonal terrain data generated by the topography generation unit and the solar incidence angle computed in (2), render the shadow projection map.
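The incidence-angle computation of step (2) reduces to the angle between the direction to the sun and the local surface normal. The sketch below assumes, purely for illustration, a circular orbit in the body's equatorial plane; a real implementation would query the ephemeris database, and the function names are not from the patent.

```python
import math

def sun_direction(sim_time_s, orbital_period_s):
    """Unit vector toward the sun in a body-fixed frame, assuming (for this
    sketch only) a circular orbit in the body's equatorial plane."""
    theta = 2.0 * math.pi * (sim_time_s % orbital_period_s) / orbital_period_s
    return (math.cos(theta), math.sin(theta), 0.0)

def sun_incidence_deg(sun_dir, surface_normal):
    """Solar incidence angle in degrees: 0 = sun directly overhead;
    >= 90 = facet self-shadowed (cast shadows need the projection map)."""
    dot = sum(s * n for s, n in zip(sun_dir, surface_normal))
    norm = (math.sqrt(sum(s * s for s in sun_dir)) *
            math.sqrt(sum(n * n for n in surface_normal)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

Re-rendering the map only when this angle has drifted by a threshold is one way to realize the update interval computed in step (1).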
The navigation camera imaging simulation unit comprises its corresponding navigation camera parameter configuration file, illumination parameter configuration file, and terrain material database.
The navigation camera parameter configuration file is a text file; its parameters include imaging resolution, field of view, focal length, shooting interval, and exposure time.
The illumination parameter configuration file is a text file; its parameters include ambient light intensity, diffuse light intensity, and specular light intensity.
Both files should be configured by the researcher before system start-up.
The navigation camera imaging simulation unit is a program that simulates the images captured by the navigation camera as follows:
(1) Initialize; read the navigation camera configuration and the illumination parameter configuration, and, according to the target body information, read the material textures matching the target body's topography from the terrain material database;
(2) Obtain the latest computed camera position and attitude from the data input/output unit;
(3) Determine the region to render from the current camera position and field of view, and read the terrain data for that region from the polygonal three-dimensional terrain data produced by the topography generation unit;
(4) Read the latest shadow projection map rendered by the shadow projection map unit;
(5) From the data of steps (1), (2), and (3), render the image through the DirectX 11 interface;
(6) According to the camera exposure time and current frame rate, render multiple frames with step (5) and superimpose them to simulate the motion blur caused by camera movement, obtaining the final simulated navigation camera image;
(7) Send the image data rendered in (6) to the data input/output unit.
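Step (6), superimposing sub-frames to approximate motion blur, can be sketched as a per-pixel average, with the sub-frame count derived from exposure time and render rate. The function names and the grey-level list-of-lists image representation are illustrative assumptions, not the patent's implementation.

```python
def subframe_count(exposure_s, frame_rate_hz, minimum=1):
    """Number of sub-frames to accumulate across one exposure window."""
    return max(minimum, round(exposure_s * frame_rate_hz))

def motion_blur(frames):
    """Average several sub-frames rendered across the exposure window.
    Each frame is a 2D grid of grey levels; the result is the per-pixel mean,
    which approximates the blur caused by camera motion during exposure."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[i][j] for f in frames) / n for j in range(w)]
            for i in range(h)]
```

In the real unit the accumulation would happen on the GPU, but the arithmetic is the same.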
The data input/output unit is a thread, started together with the navigation camera imaging simulation unit. It communicates with the optical navigation algorithm program written by the researcher using this system, implemented over TCP/IP, as follows:
(1) The thread starts, initializes a socket, and listens;
(2) It receives from the optical navigation algorithm program the camera position in the body-fixed coordinate frame of the celestial body and the rotation matrix of the camera coordinate frame relative to the body-fixed frame;
(3) It solves the received position and rotation matrix into the parameter format DirectX requires for setting up the camera;
(4) It reads the latest rendered navigation camera image data, prepends the frame header information, and sends the data to the optical navigation algorithm program through the socket.
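The patent specifies TCP/IP and a frame header but not the byte layout, so the sketch below assumes a hypothetical little-endian wire format (13 doubles for time, position, and a row-major rotation matrix; a 3-integer image header) using Python's `struct` module.

```python
import struct

# Assumed wire format, client -> server: sim time (1 double), camera position
# (3 doubles), row-major 3x3 rotation matrix (9 doubles).
POSE_FMT = '<13d'
# Assumed wire format, server -> client: frame id, width, height, then pixels.
HEADER_FMT = '<III'

def pack_pose(sim_time, position, rotation_3x3):
    """Serialize the pose message the navigation program would send."""
    flat = [x for row in rotation_3x3 for x in row]
    return struct.pack(POSE_FMT, sim_time, *position, *flat)

def unpack_pose(payload):
    """Parse a pose message back into (time, position, rotation matrix)."""
    vals = struct.unpack(POSE_FMT, payload)
    sim_time, pos = vals[0], vals[1:4]
    rot = [list(vals[4 + 3 * r:7 + 3 * r]) for r in range(3)]
    return sim_time, pos, rot

def pack_image(frame_id, width, height, pixels):
    """Prepend the frame header to raw pixel bytes (step 4)."""
    return struct.pack(HEADER_FMT, frame_id, width, height) + bytes(pixels)
```

Fixed-size binary messages like these keep the socket protocol trivial to parse on both ends; any real deployment would also need to handle partial reads.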
Claims (6)
1. A target imaging simulation system for autonomous optical navigation during deep space probe landing, characterized in that: the system comprises a target celestial body topography generation unit, a shadow projection generation unit, a navigation camera imaging simulation unit, and a data input/output unit; the topography generation unit interacts with the navigation camera imaging simulation unit and the shadow projection generation unit through the polygonal three-dimensional terrain data; the navigation camera imaging simulation unit interacts with the data input/output unit; the shadow projection generation unit is connected to the navigation camera imaging simulation unit through the shadow projection map; and the data input/output unit is connected to the shadow projection generation unit;
The target celestial body topography generation unit reads the coarse terrain DEM data, the surface parameter configuration file, and the surface feature configuration file, and generates, by algorithm, polygonal three-dimensional terrain data consistent with the topographical features of the target celestial body; the shadow projection generation unit computes, from an ephemeris, the position of the target celestial body relative to the sun and the solar incidence angle at the current simulation time, and renders the shadow projection map; the navigation camera imaging simulation unit renders an image meeting the camera's requirements from the current camera parameters, position, illumination parameters, and terrain material information; the data input/output unit receives and parses, according to a protocol, the camera position, attitude parameters, and current simulation time sent by the client, and packs and returns the rendered image data to the client for use by the autonomous optical navigation solver, realizing imaging simulation of the navigation camera's target.
2. The system according to claim 1, characterized in that: the target celestial body topography generation unit comprises its corresponding terrain DEM database and two configuration files;
The topography generation unit is an independent program; before running a simulation with this system, researchers must first use it to pre-generate terrain data of the target landing body at the required precision;
The terrain DEM database contains coarse DEM data of some celestial bodies already surveyed, including the moon, Mars, 433 Eros, and 25143 Itokawa; when the database does not contain data for the target body on which a landing is to be simulated, researchers can manually import target body DEM data that meets the format requirements;
The terrain parameter configuration file is a text file editable with any text editor; its parameters include target body name, approximate landing trajectory, final landing area extent, and DEM precision requirement; specifying the approximate landing trajectory and final landing area allows the generated terrain model to be optimized, generating a high-resolution model only for the region the probe passes over and improving runtime efficiency;
The terrain feature configuration file is a text file editable with any text editor; its parameters include the terrain generation mode; if set to random generation, the researcher specifies crater distribution parameters and rock distribution parameters; if set to specified generation, the researcher sets the number of craters and, for each crater, its age, diameter, depth, and position, as well as the number of rocks and each rock's size and position; the two generation modes can be mixed.
3. The system according to claim 1, characterized in that: the topography generation unit produces the polygonal three-dimensional terrain data of the target celestial body as follows:
(1) Start; read the terrain parameter configuration file, the terrain feature configuration file, and the target body DEM data configured by the researcher;
(2) According to the terrain parameters, process the target body DEM data with a fractal algorithm, then apply smoothing to make the terrain smoother, finally obtaining the required high-resolution DEM data;
(3) According to the terrain feature parameters and the crater and rock mathematical models, generate three-dimensional crater and rock models meeting the requirements, superimpose them on the DEM data generated in (2), and produce polygonal three-dimensional terrain data, stored in blocks at different precisions for the shadow projection map generation unit and the navigation camera imaging unit to read.
4. The system according to claim 1, characterized in that: the shadow projection map generation unit comprises its corresponding ephemeris database and celestial body database;
The shadow projection map generation unit is a thread, started together with the navigation camera imaging simulation unit; it renders the map as follows:
(1) Initialize; read the polygonal three-dimensional terrain data produced by the topography generation unit, the ephemeris database, and the celestial body database; determine the target body's orbital parameters and its rotation and revolution rates, and from these compute the interval at which the map must be re-rendered;
(2) From the target body information, the ephemeris, and the current simulation time received from the data input/output unit, compute the position of the sun relative to the target body and the solar incidence angle at the current simulation time;
(3) From the polygonal terrain data generated by the topography generation unit and the solar incidence angle computed in (2), render the shadow projection map.
5. a kind of deep space probe landing autonomous optical navigation target imaging analog systems according to claim 1, it is characterized in that: navigation camera imaging analogue unit, navigate accordingly including it camera parameter configuration file, illumination parameter configuration file, landform material quality data storehouse
Navigation camera parameter configuration file is a text, and its configuration parameter includes: imaging resolution, visual angle, focal length, shooting interval, time of exposure;
Illumination parameter configuration file is a text, and its configuration parameter includes: ambient light intensity, scattered light intensity, specular light intensity;
Two above file should be configured by research worker before system start-up;
The navigation camera imaging simulation unit is a program; its process for simulating the images captured by the navigation camera is as follows:
(1) Initialize: read the navigation camera configuration and the illumination parameter configuration, and, according to the target celestial body information, read from the terrain material database the material texture matching the target celestial body's topography;
(2) Obtain the latest computed camera position and attitude from the data input/output unit;
(3) Determine the rendering region according to the current camera position and field of view, and read the terrain data of that region from the polygonal three-dimensional terrain data generated by the topography generation unit;
(4) Read the latest shadow projection map rendered by the shadow projection map generation unit;
(5) Using the data from steps (1), (2) and (3) of this process, render and generate an image through the DirectX 11 interface;
(6) According to the camera exposure time and the current frame rate, use step (5) to render multiple frames and superimpose them, simulating the motion blur caused by camera movement, to finally obtain the simulated navigation camera image;
(7) Send the image data generated in (6) to the data input/output unit.
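A minimal sketch of the superposition in step (6), assuming grayscale frames stored as nested lists; `subframe_count` and `motion_blur` are hypothetical helper names, and a real implementation would blend the frames on the GPU rather than in Python:

```python
def subframe_count(exposure_time_s, frame_rate_hz):
    """How many rendered sub-frames one exposure spans (at least one)."""
    return max(1, round(exposure_time_s * frame_rate_hz))

def motion_blur(frames):
    """Superimpose equally weighted sub-frames by per-pixel averaging,
    approximating the blur accumulated while the camera moves."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

# A 50 ms exposure rendered at 60 frames per second spans three sub-frames.
print(subframe_count(0.05, 60))                 # 3
# A pixel that is dark in one sub-frame and bright in the next blurs to gray.
print(motion_blur([[[0, 255]], [[255, 255]]]))  # [[127.5, 255.0]]
```

Equal weighting assumes the camera moves uniformly during the exposure; a non-uniform weighting could model shutter effects instead.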
6. The deep space probe landing autonomous optical navigation target imaging simulation system according to claim 1, characterized in that: the shadow projection map generation unit includes its corresponding ephemeris database and celestial body database;
The data input/output unit is a thread that runs when the navigation camera imaging simulation unit starts; it is responsible for communicating with the optical navigation algorithm program written by the researcher using this system, implemented over TCP/IP as follows:
(1) The thread starts, initializes a socket and listens for connections;
(2) Receive from the optical navigation algorithm program the camera position in the celestial-body-fixed coordinate system and the rotation matrix of the camera coordinate system relative to the celestial body coordinate system;
(3) Convert the received position and rotation matrix into the camera parameter format required by DirectX;
(4) Read the latest rendered navigation camera image data, prepend the frame header information, and send it to the optical navigation algorithm program through the socket.
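The frame header in step (4) is not specified in the claim. As an assumed layout for illustration only, the sketch below packs a frame number and payload length ahead of the image bytes before they go out over the socket; `pack_image`/`unpack_image` are hypothetical names and the little-endian byte order is an arbitrary choice:

```python
import struct

# Assumed header layout: (frame number, payload length), little-endian uint32s.
HEADER = struct.Struct("<II")

def pack_image(frame_number, image_bytes):
    """Prepend a minimal frame header to the rendered image bytes."""
    return HEADER.pack(frame_number, len(image_bytes)) + image_bytes

def unpack_image(packet):
    """Inverse of pack_image: recover (frame_number, image_bytes)."""
    frame_number, length = HEADER.unpack_from(packet)
    return frame_number, packet[HEADER.size:HEADER.size + length]

frame, data = unpack_image(pack_image(7, b"abc"))
print(frame, data)  # 7 b'abc'
```

Carrying the payload length in the header lets the receiving optical navigation algorithm program read exactly one image per message from the TCP stream, which has no message boundaries of its own.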
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610007370.7A CN105628055B (en) | 2016-01-06 | 2016-01-06 | A kind of deep space probe landing autonomous optical navigation target imaging simulation system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105628055A true CN105628055A (en) | 2016-06-01 |
CN105628055B CN105628055B (en) | 2018-07-31 |
Family
ID=56043203
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610007370.7A Expired - Fee Related CN105628055B (en) | 2016-01-06 | 2016-01-06 | A kind of deep space probe landing autonomous optical navigation target imaging simulation system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105628055B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107966693A (en) * | 2017-12-05 | 2018-04-27 | 成都合纵连横数字科技有限公司 | A kind of mobile lidar emulation mode rendered based on depth |
CN110930064A (en) * | 2019-12-09 | 2020-03-27 | 山东大学 | Method for extracting space-time probability of Mars dust storm and evaluating landing safety |
CN111102976A (en) * | 2018-10-25 | 2020-05-05 | 哈尔滨工业大学 | Simulation experiment table for landing buffering process of asteroid probe |
CN111453005A (en) * | 2020-03-31 | 2020-07-28 | 上海卫星工程研究所 | Reconfigurable small celestial body impact detection target characteristic ground simulation system |
CN111537000A (en) * | 2020-06-08 | 2020-08-14 | 中国科学院微小卫星创新研究院 | Ground verification system and method for deep space small celestial body landing segment optical navigation algorithm |
CN112550778A (en) * | 2020-11-10 | 2021-03-26 | 中国科学院空间应用工程与技术中心 | Deep space exploration visual imaging environment simulation device and method |
CN113753265A (en) * | 2021-09-17 | 2021-12-07 | 北京控制工程研究所 | Extraterrestrial star fixed-point landing method under low-illumination environment |
CN114565742A (en) * | 2022-02-16 | 2022-05-31 | 青岛科技大学 | Dynamic simulation and landing visual simulation system and method for surface of small celestial body |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1847791A (en) * | 2006-05-12 | 2006-10-18 | 哈尔滨工业大学 | Verification system for fast autonomous deep-space optical navigation control prototype |
CN102114919A (en) * | 2009-12-31 | 2011-07-06 | 北京控制工程研究所 | Asteroid imaging simulator at deep space exploration transition stage |
CN102129713A (en) * | 2011-03-11 | 2011-07-20 | 天津大学 | Test system of asynchronous broom type remote sensing solid imaging simulation and test method thereof |
CN103279974A (en) * | 2013-05-15 | 2013-09-04 | 中国科学院软件研究所 | High-accuracy high-resolution satellite imaging simulation engine and implementation method |
Non-Patent Citations (2)
Title |
---|
T. KUBOTA ET AL: "An autonomous navigation and guidance system for MUSES-C asteroid landing", 《ACTA ASTRONAUTICA》 * |
WEI RUOYAN: "Method for selecting landing areas on a planetary surface based on a single image while avoiding enclosed environments", 《系统工程与电子技术》 (Systems Engineering and Electronics) * |
Also Published As
Publication number | Publication date |
---|---|
CN105628055B (en) | 2018-07-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105628055A (en) | Autonomous optical navigation target imaging analog system for landing of deep space probe | |
CN102346922B (en) | Space remote sensing load imaging geometric distortion three-dimensional visualization simulation method | |
Jain et al. | Recent developments in the ROAMS planetary rover simulation environment | |
CN113137955B (en) | Unmanned aerial vehicle aerial survey virtual simulation method based on scene modeling and virtual photography | |
Brochard et al. | Scientific image rendering for space scenes with the SurRender software | |
Lemmens et al. | Radar mappings for attitude analysis of objects in orbit | |
CN108090957A (en) | The method of mapping landform based on BIM | |
CN105444781A (en) | Ground verification method for satellite-borne autonomously guided imaging | |
CN113051776A (en) | Satellite attitude and orbit simulation system and method based on Unity3D | |
Setterfield et al. | Lidar-inertial based navigation and mapping for precision landing | |
Hill et al. | Ground-to-air flow visualization using Solar Calcium-K line Background-Oriented Schlieren | |
CN115688440A (en) | Lunar digital environment construction simulation system | |
Ruel et al. | 3DLASSO: Real-time pose estimation from 3D data for autonomous satellite servicing | |
Remetean et al. | Philae locating and science support by robotic vision techniques | |
CN117635816A (en) | Method and system for constructing spacecraft simulation data set in space environment | |
Johnson et al. | Motion estimation from laser ranging for autonomous comet landing | |
Bremer et al. | Simulating unmanned-aerial-vehicle based laser scanning data for efficient mission planning in complex terrain | |
Getchius et al. | Hazard detection and avoidance for the nova-c lander | |
Crues et al. | Digital Lunar Exploration Sites (DLES) | |
Smith et al. | Building Maps for Terrain Relative Navigation Using Blender: an Open Source Approach | |
Palmer et al. | Mercator—Independent rover localization using stereophotoclinometry and panoramic images | |
Thompson et al. | Stereo Camera Simulation for Lunar Surface Photogrammetry | |
Dubois-Matra et al. | Testing and Validation of Planetary Vision-based navigation systems with PANGU | |
Amzajerdian et al. | Performance of Flash Lidar with real-time image enhancement algorithm for Landing Hazard Avoidance | |
Lavigne et al. | Step-stare technique for airborne high-resolution infrared imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant ||
CF01 | Termination of patent right due to non-payment of annual fee ||
Granted publication date: 20180731 |