CN113742863B - Ground verification system for global and local imaging capability in circum-Mars orbit - Google Patents


Info

Publication number
CN113742863B
CN113742863B (application number CN202111016237.5A)
Authority
CN
China
Prior art keywords
camera
image
simulation
mars
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111016237.5A
Other languages
Chinese (zh)
Other versions
CN113742863A (en)
Inventor
谢攀
杜洋
陆希
徐亮
郑惠欣
张红英
李青
秦冉冉
印兴峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Institute of Satellite Engineering
Original Assignee
Shanghai Institute of Satellite Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Institute of Satellite Engineering filed Critical Shanghai Institute of Satellite Engineering
Priority to CN202111016237.5A priority Critical patent/CN113742863B/en
Publication of CN113742863A publication Critical patent/CN113742863A/en
Application granted granted Critical
Publication of CN113742863B publication Critical patent/CN113742863B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 Computer-aided design [CAD]
    • G06F 30/10 Geometric CAD
    • G06F 30/17 Mechanical parametric or variational design
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4007 Scaling of whole images or parts thereof based on interpolation, e.g. bilinear interpolation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2111/00 Details relating to CAD techniques
    • G06F 2111/10 Numerical modelling

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Mathematics (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The application provides a ground verification method for global and local imaging capability in circum-Mars orbit. It comprehensively accounts for the influence of factors such as orbit, velocity and illumination on data application and processing, can simulate and generate the dynamic real scene photographed by the camera while the Mars detector runs in orbit, and solves the verification problem of the full medium-resolution Mars imaging detection flow, from information interaction through imaging to image processing. The application benefits the integrated design and test of the detector platform and the imaging payload, and the analysis of medium-resolution imaging capability. Its beneficial effects include reduced test cost, improved test efficiency, and intuitive evaluation of the dynamic imaging quality of the medium-resolution camera.

Description

Ground verification system for global and local imaging capability in circum-Mars orbit
Technical Field
The application relates to the technical field of spacecraft, and in particular to a ground verification method for global and local imaging capability in circum-Mars orbit.
Background
By imaging the Mars surface with a medium-resolution camera, a global image of Mars can be obtained. Obtaining accurate image information requires coordination of orbit, attitude and camera imaging parameters. Simulation verification of the optical part only checks, by theoretical computer simulation, compliance with optical indices such as the point spread function and distortion; it cannot verify how strongly the optical system affects subsequent electronic imaging, ground data processing and the like. Simulation of the electronic part only verifies the imaging and communication functions in isolation, without involving subsequent ground image stitching. The ground data application part only stitches simulated images, without comprehensively accounting for the influence of factors such as orbit, velocity and illumination on data application and processing.
Patent document CN107504982A discloses an aerial camera ground imaging simulation system comprising a target drum, a collimator, a mirror, a heat-insulating window, a temperature-altitude chamber and an aerial camera. The target is mounted on the target drum and aligned with the focal plane of the collimator; the heat-insulating window is mounted at the bottom of the temperature-altitude chamber. The axial distance between the target and the collimator focal plane is changed according to set parameters to produce a given target distance, the target drum rotates the target to produce a given target speed, and matching target speed with target distance yields the corresponding velocity-height ratio. Light from the target passes through the collimator and the mirror, enters the heat-insulating window that isolates air pressure and temperature, and is imaged by the aerial camera placed inside the temperature-altitude chamber, which is held at a set temperature and pressure.
Patent document CN107330544A discloses a method for the satellite Earth-imaging mission planning problem, comprising: S1, when an improved simulated annealing algorithm is used for satellite Earth-imaging mission planning, acquiring the initial control temperature and the initial solution i of the simulated annealing algorithm; S2, executing the iterative process of the simulated annealing algorithm from the initial control temperature, the initial solution i and the Markov chain length L0, and storing the better solution of each iteration into a memory matrix I; and S3, after the iterative process meets the stopping criterion, processing each preferred solution in the memory matrix I with a local search algorithm and outputting the best one as the optimal Earth-imaging mission planning scheme.
Patent document CN110363758B discloses a method and system for determining optical remote-sensing satellite imaging quality. The method comprises: acquiring the count values of a point-light-source remote-sensing image, where the point light source is an automatically reflecting point source placed on the ground and the count values are acquired by the optical remote-sensing satellite imaging system; constructing a response-value objective function from the count values and the point spread function of the imaging system, the point spread function being expressed by a Gaussian model; solving the objective function by least squares to obtain the image-point coordinates of the point source; and determining the imaging quality of the satellite from those coordinates.
Patent document CN111222544A discloses a ground simulation test system for the influence of satellite flutter on camera imaging, comprising: a satellite flutter simulation assembly, consisting of an excitation platform, a camera slide rail mounted at the platform output, and an imaging camera sliding left and right on the rail, used to simulate the platform jitter produced during on-orbit operation; a linear terrain simulation assembly, placed opposite the imaging camera and consisting of a target-plate slide rail and a target plate sliding back and forth on it, used to simulate different satellite orbit heights; and a monitoring assembly, namely a computer connected to the imaging camera to obtain the measured flutter frequency and amplitude parameters.
Regarding the related art, the inventors consider that the existing schemes verify camera imaging on the ground only piecewise: the optical part, the electrical part and the ground data application are each verified separately, the information-flow interaction process cannot be verified, and the verification is therefore insufficient. No verification system has been established for the medium-resolution camera that covers optics, mechanics, electronics, software and control across imaging, information interaction and image data processing. A technical solution is therefore needed to remedy these problems.
Disclosure of Invention
In view of the defects in the prior art, the application aims to provide a ground verification method for global and local imaging capability in circum-Mars orbit.
The application provides a ground verification system for global and local imaging capability in circum-Mars orbit, comprising a data interface module, a camera simulation module, an attitude control module and an image rendering and processing module;
the data interface module is connected to the camera simulation module and to the attitude control module; the image rendering and processing module is connected to the camera simulation module and to the attitude control module;
the data interface module performs data communication and resource sharing while maintaining a high transmission rate.
Preferably, the data interface module manages the RS422 and LAN interfaces in the camera electrical-system simulation; the medium-resolution payload imaging verification system is connected to the detector platform test equipment through a gigabit Ethernet switch, and the real RS422 interface receiving function is retained.
Preferably, the camera simulation module performs optical-system simulation according to the instructions and modes transmitted by the data interface module and the camera's intrinsic and extrinsic parameters, and transmits the relevant parameters and attributes to the image rendering and processing module.
Preferably, the camera simulation module generates a simulation image of the camera observation, including the steps of:
step S1: calculating the relative position and the posture between a camera coordinate system and a Mars coordinate system to obtain a parameter matrix of the camera, and calculating a ray equation corresponding to each pixel point on the camera image;
step S2: calculating the intersection point of the ray and the Mars surface, finding 4 pixel points with known brightness adjacent to the intersection point on the obtained Mars image, and calculating the brightness of the pixel points on the simulation image;
step S3: correcting the brightness of pixels on the simulation image according to the sun incidence angle and the camera observation direction, and sequentially executing the steps on 4096×3072 pixel points to obtain the simulation image;
step S4: outputting the computed full-frame simulation image as a full image, a windowed region, or a subsampled image, according to the image output mode setting.
Preferably, the camera is rigidly mounted on the detector, and the camera's installation position, orientation and other parameters are given relative to the detector body coordinate system.
Preferably, fine adjustment of the camera position and adjustment of the camera attitude during simulation use the detector body coordinate system as the reference frame.
Preferably, the reference frame for image rendering is the detector orbit coordinate system, and the camera's position and attitude parameters are converted from the detector body coordinate system into the detector orbit coordinate system.
Preferably, the image rendering and processing module simulates the relationship between observed image brightness and integration time, and simulates the camera observing the terminator (the morning/evening line).
Preferably, obtaining the camera-observed Mars surface image from the detector's pose relative to Mars comprises the following steps:
step 1: converting 6 degrees of freedom of the detector coordinate system relative to the Mars coordinate system into 6 degrees of freedom of the camera relative to the Mars coordinate system, and obtaining an external parameter matrix of the camera;
step 2: calculating the intersection point of the camera optical center ray passing through each pixel point of the camera image and the Mars sphere;
step 3: converting longitude and latitude coordinates of the intersection point into Mars image coordinates;
step 4: estimating the brightness of a corresponding pixel point on the camera observation image by using the brightness of the Mars image pixel;
step 5: and correcting the brightness of the camera observation image according to the normal direction of the observation point on the Mars surface, the incident direction of the sun and the observation direction of the camera.
Preferably, the longitude and latitude of each pixel in the Mars global map are defined in a Mars-fixed coordinate system; longitude increases in the same direction as Mars's rotation, the latitude of the equatorial plane is 0 degrees, the north pole is 90 degrees, and the south pole is -90 degrees.
Compared with the prior art, the application has the following beneficial effects:
1. the application can simulate and generate the dynamic real scene photographed by the camera while the Mars detector runs in orbit, and can receive the simulation instructions, orbit, attitude parameters and other information of the detector platform test equipment;
2. after optical simulation computation based on the optical-system parameters of the real camera, the application generates the corresponding three-dimensional scene imaging information and feeds it back to the test equipment;
3. the application verifies the full medium-resolution Mars imaging detection flow, from information interaction through imaging to image processing.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a diagram of the relationships among the modules of the medium-resolution payload imaging verification system of the present application;
FIG. 2 is a flow chart of the attitude adjustment of the Mars imaging model according to the present application;
FIG. 3 is a view of the Mars image and longitude and latitude of the present application;
FIG. 4 is a view of local surface detail of Mars according to the present application;
fig. 5 is a diagram of the relationship between the camera coordinate system and the Mars image coordinate system according to the present application.
Detailed Description
The present application will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present application, but are not intended to limit the application in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present application.
The application provides a ground verification method for global and local imaging capability in circum-Mars orbit. It can simulate and generate the dynamic real scene photographed by the camera while the Mars detector runs in orbit; it receives the simulation instructions, orbit, attitude parameters and other information of the detector platform test equipment, performs optical simulation computation based on the optical-system parameters of the real camera, generates the corresponding three-dimensional scene imaging information, and feeds it back to the test equipment, thereby verifying the full medium-resolution Mars imaging detection flow, from information interaction through imaging to image processing.
The medium-resolution imaging verification method in circum-Mars orbit chiefly combines the position, attitude and other data received from the detector platform test equipment with the real optical-system parameters of the camera, and computes from them the image within the field of view of the medium-resolution camera. By functional requirement, the medium-resolution payload imaging verification system is divided into a data interface module, a camera simulation module, an attitude control module, and an image rendering and processing module.
The data interface module mainly performs RS422 and LAN interface management in the camera electrical-system simulation. The medium-resolution payload imaging verification system is connected to the detector platform test equipment through a gigabit Ethernet switch; the real RS422 interface receiving function is retained, data communication and resource sharing are achieved, and a high transmission rate is maintained.
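The patent does not specify the message format carried over the LAN link. As a hedged illustration only, the orbit and attitude data exchanged with the detector platform test equipment might be framed as fixed-layout binary packets; the sync word and field layout below are hypothetical, not taken from the patent:

```python
import struct

# Hypothetical packet layout for detector orbit/attitude telemetry over the
# LAN link: a 2-byte sync word, simulation time, position (m) and velocity
# (m/s) in the Mars-centred frame, and an attitude quaternion. The real
# interface control document is not given in the patent text.
TELEMETRY_FMT = ">H d 3d 3d 4d"   # big-endian: sync, time, pos, vel, quat
SYNC_WORD = 0xEB90                 # illustrative sync marker

def pack_telemetry(t, pos, vel, quat):
    """Serialize one telemetry frame."""
    return struct.pack(TELEMETRY_FMT, SYNC_WORD, t, *pos, *vel, *quat)

def unpack_telemetry(frame):
    """Deserialize one telemetry frame; returns (t, pos, vel, quat)."""
    fields = struct.unpack(TELEMETRY_FMT, frame)
    assert fields[0] == SYNC_WORD, "bad sync word"
    return fields[1], fields[2:5], fields[5:8], fields[8:12]
```

A fixed binary layout keeps parsing deterministic on both the verification system and the test equipment, which suits the high-rate link the module is required to maintain.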
The camera simulation module performs optical-system simulation according to the instructions and modes transmitted by the data interface module and the camera's intrinsic and extrinsic parameters, and transmits the relevant parameters and attributes to the image rendering and processing module. The simulated image observed by the camera is generated as follows:
step S1: and calculating the relative position and the relative posture between the camera coordinate system and the Mars coordinate system to obtain a parameter matrix of the camera, and calculating a ray equation corresponding to each pixel point on the camera image.
Step S2: and calculating the intersection point of the ray and the Mars surface, finding 4 pixel points with known brightness adjacent to the intersection point on the obtained Mars image, and calculating the brightness of the pixel points on the simulation image.
Step S3: correct the brightness of the pixels on the simulated image according to the solar incidence angle and the camera observation direction, and execute the above steps in sequence for all 4096×3072 pixels to obtain the simulated image.
Step S4: output the computed full-frame simulated image as a full image, a windowed region, or a subsampled image, according to the image output mode setting.
Attitude control module: the camera is rigidly fixed on the detector, and the camera's installation position, orientation and other parameters are given relative to the detector body coordinate system. Fine adjustment of the camera position and adjustment of the camera attitude during simulation also use the detector body coordinate system as the reference frame. The reference frame for image rendering is the detector orbit coordinate system, so the camera's position and attitude parameters must be converted from the detector body coordinate system into the detector orbit coordinate system to correctly determine the camera's visible area and render a correct image. The attitude adjustment flow of the Mars imaging model is shown in FIG. 2.
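The body-frame to orbit-frame conversion described above amounts to rotating the camera's mounting vectors by the detector's attitude. A minimal sketch follows; the Z-Y-X Euler convention is an assumption, since the patent only states that the camera pose is converted between the two detector frames without fixing a parameterization:

```python
import math

def rot_zyx(yaw, pitch, roll):
    """3x3 rotation matrix (row lists) for a Z-Y-X Euler sequence.
    The Euler convention of the detector attitude interface is an
    assumption made for illustration."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def apply(R, v):
    """Rotate vector v by matrix R."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Camera boresight along +Z of the body frame; with a pure 90-degree yaw
# (rotation about Z) the boresight direction in the orbit frame is unchanged.
boresight_orbit = apply(rot_zyx(math.pi / 2, 0.0, 0.0), [0.0, 0.0, 1.0])
```

In practice the same rotation is applied to the camera's mounting position offset and orientation before rendering, so that the visible area is evaluated in the orbit frame.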
Image rendering and processing module: the highest peak on Mars is Olympus Mons, 21.2 km above the reference surface and 648 km in diameter; the lowest basin is the Hellas basin, 2 km below the reference surface and 1600 km in diameter. The minimum detector orbit height is 260 km, at which imaging need not account for simulated shadows. The available Mars map has a resolution of 231 m/pixel; the camera's pixel resolution is 100 m/pixel at a detector height of 400 km, so one simulated image pixel is interpolated from roughly every 2 Mars map pixels, and at a detector height of 260 km from roughly 4 Mars map pixels. Rendering the simulated image covers two aspects: simulating the relationship between observed image brightness and integration time, and simulating the camera observing the terminator.
The longitude and latitude of each pixel in the available Mars global map are defined in a Mars-fixed coordinate system; longitude increases in the same direction as Mars's rotation, the latitude of the equatorial plane is 0 degrees, the north pole is 90 degrees, and the south pole is -90 degrees. The relationship between the Mars surface image and longitude/latitude is shown in FIG. 3. The basic information of the currently available Mars image is as follows:
Latitude range: -90° to +90°
Longitude range: -180° to +180°
Resolution: 231.542 m/pixel
Angular resolution: 256 pixel/°
Semi-major (equatorial) radius: 3,396,190 m
Semi-minor (polar) radius: 3,376,200 m
The Mars color image data is 12 GB and the gray image data is 4 GB; FIG. 4 shows local detail on Mars.
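Given the map metadata above (256 pixel/° on an equirectangular grid), latitude and longitude map directly to Mars-map pixel coordinates. The corner convention in this sketch (top-left pixel at lat +90°, lon -180°) is an assumption, since the patent does not state where the map origin lies:

```python
PIX_PER_DEG = 256  # angular resolution of the Mars global map (pixel/degree)

def latlon_to_map_pixel(lat_deg, lon_deg):
    """Map (lat, lon) in degrees to fractional (row, col) in the Mars
    global map, assuming an equirectangular grid with its top-left
    pixel at (lat=+90, lon=-180); the origin choice is illustrative."""
    col = (lon_deg + 180.0) * PIX_PER_DEG
    row = (90.0 - lat_deg) * PIX_PER_DEG
    return row, col
```

With this convention the full map is 360×256 = 92160 columns by 180×256 = 46080 rows, and the point (0°, 0°) lands at the map centre.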
Once the relationship between the detector coordinate system and the Mars coordinate system is known, the relationship between the camera and the Mars coordinate system is obtained by coordinate conversion; this 6-degree-of-freedom relationship between the camera coordinate system and the Mars coordinate system constitutes the camera's extrinsic parameters. The projections of the four corner points A, B, C and D of the camera image onto the Mars surface are computed, the longitude and latitude of the 4 intersection points in the Mars coordinate system are computed and mapped onto the Mars global map, and the quadrilateral formed by the four points is the area of the Mars surface observed by the camera under the current extrinsic parameters, as shown in FIG. 5.
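The corner-point projection described above reduces to intersecting each viewing ray with the Mars surface. A minimal sketch, assuming a spherical Mars (the model could equally use the biaxial ellipsoid given by the map metadata):

```python
import math

def ray_sphere_intersection(origin, direction, radius):
    """First intersection (t >= 0) of the ray origin + t*direction with a
    sphere of the given radius centred at the Mars-coordinate-system
    origin. Returns the intersection point, or None if the ray misses."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)   # nearer root: visible surface
    if t < 0.0:
        return None                      # sphere is behind the ray origin
    return (ox + t * dx, oy + t * dy, oz + t * dz)

R_MARS = 3_396_190.0  # equatorial radius in metres, from the map metadata
# A nadir ray from 400 km altitude hits the surface directly below.
hit = ray_sphere_intersection((0.0, 0.0, R_MARS + 400e3), (0.0, 0.0, -1.0), R_MARS)
```

The longitude and latitude of each intersection point then follow from its Cartesian coordinates in the Mars-fixed frame, after which the point is looked up in the global map.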
The camera-observed Mars surface image is obtained from the detector's 6 degrees of freedom relative to Mars as follows:
Step 1: convert the 6 degrees of freedom of the detector coordinate system relative to the Mars coordinate system into the 6 degrees of freedom of the camera relative to the Mars coordinate system, obtaining the camera's extrinsic parameter matrix.
Step 2: compute the intersection of the ray from the camera optical center through each pixel of the camera image with the Mars sphere.
Step 3: convert the longitude and latitude coordinates of each intersection point into Mars image coordinates.
Step 4: estimate, by bilinear interpolation, the brightness of the corresponding pixel on the camera observation image from the brightness of the Mars image pixels.
Step 5: correct the brightness of the camera observation image according to the surface normal at the observed point, the solar incidence direction, and the camera observation direction.
The input is the 6 degrees of freedom of the detector coordinate system relative to the Mars coordinate system; the output is a 4096×3072 camera observation image.
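Steps 4 and 5 above can be sketched as bilinear interpolation over the 4 neighbouring map pixels followed by a photometric correction. The Lambertian cosine factor below is an illustrative assumption: the patent states that brightness is corrected using the surface normal, solar direction and viewing direction, but does not give the exact photometric model.

```python
def bilinear(img, r, c):
    """Bilinearly interpolate image brightness at fractional (row, col)
    from the 4 neighbouring pixels of known brightness (step 4)."""
    r0, c0 = int(r), int(c)
    fr, fc = r - r0, c - c0
    return ((1 - fr) * (1 - fc) * img[r0][c0] +
            (1 - fr) * fc       * img[r0][c0 + 1] +
            fr       * (1 - fc) * img[r0 + 1][c0] +
            fr       * fc       * img[r0 + 1][c0 + 1])

def lambert_correct(brightness, cos_sun_incidence):
    """Step-5 brightness correction sketch: a simple Lambertian factor
    from the cosine of the solar incidence angle at the surface point.
    Points past the terminator (negative cosine) render as dark."""
    return brightness * max(0.0, cos_sun_incidence)

img = [[0.0, 1.0], [1.0, 2.0]]
b = bilinear(img, 0.5, 0.5)   # centre of the 4 neighbouring pixels
```

Clamping the cosine to zero is what lets the same pipeline render the terminator: pixels on the night side receive no direct illumination and go dark.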
The application can simulate and generate the dynamic real scene photographed by the camera while the Mars detector runs in orbit, and can receive the simulation instructions, orbit, attitude parameters and other information of the detector platform test equipment; it performs optical simulation computation based on the optical-system parameters of the real camera, generates the corresponding three-dimensional scene imaging information, and feeds it back to the test equipment; it thereby verifies the full medium-resolution Mars imaging detection flow, from information interaction through imaging to image processing.
Those skilled in the art will appreciate that, besides implementing the system and its devices, modules and units purely as computer-readable program code, the same functions can be realized entirely through logic programming of the method steps, in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. The system and its devices, modules and units may therefore be regarded as hardware components; the devices, modules and units realizing the various functions may be regarded as structures within the hardware component, or alternatively as software modules implementing the method as well as structures within the hardware component.
The foregoing describes specific embodiments of the present application. It is to be understood that the application is not limited to the particular embodiments described above, and that various changes or modifications may be made by those skilled in the art within the scope of the appended claims without affecting the spirit of the application. The embodiments of the application and the features of the embodiments may be combined with each other arbitrarily without conflict.

Claims (5)

1. A ground verification system for global and local imaging capability in circum-Mars orbit, characterized by comprising a data interface module, a camera simulation module, an attitude control module and an image rendering and processing module;
the data interface module is connected to the camera simulation module and to the attitude control module; the image rendering and processing module is connected to the camera simulation module and to the attitude control module;
the data interface module performs data communication and resource sharing while maintaining a high transmission rate;
the camera simulation module is used for performing optical-system simulation according to the instructions and modes transmitted by the data interface module and the camera's intrinsic and extrinsic parameters, and for transmitting the relevant parameters and attributes to the image rendering and processing module;
the camera simulation module generating a simulation image of camera observation includes the steps of:
step S1: calculating the relative position and the posture between a camera coordinate system and a Mars coordinate system to obtain a parameter matrix of the camera, and calculating a ray equation corresponding to each pixel point on the camera image;
step S2: calculating the intersection point of the ray and the Mars surface, finding 4 pixel points with known brightness adjacent to the intersection point on the obtained Mars image, and calculating the brightness of the pixel points on the simulation image;
step S3: correcting the brightness of pixels on the simulation image according to the sun incidence angle and the camera observation direction, and sequentially executing the steps on 4096×3072 pixel points to obtain the simulation image;
step S4: outputting the computed full-frame simulation image as a full image, a windowed region, or a subsampled image, according to the image output mode setting;
the attitude control module includes:
the camera is rigidly mounted on the detector, and the camera's installation position and orientation parameters are given relative to the detector body coordinate system;
fine adjustment of the camera position and adjustment of the camera attitude during simulation use the detector body coordinate system as the reference frame;
the reference frame for image rendering is the detector orbit coordinate system, and the camera's position and attitude parameters are converted from the detector body coordinate system into the detector orbit coordinate system;
the system is used to simulate and generate the dynamic real scene photographed by the camera while the Mars detector runs in orbit; it receives the simulation instruction, orbit and attitude parameter information of the detector platform test equipment, performs optical simulation computation according to the optical-system parameters of the real camera, generates the corresponding three-dimensional scene imaging information, and feeds it back to the test equipment, thereby verifying the full medium-resolution Mars imaging detection flow, from information interaction through imaging to image processing.
2. The ground verification system for global and local imaging capability in circum-Mars orbit according to claim 1, wherein the data interface module manages the RS422 and LAN interfaces in the camera electrical-system simulation, and the medium-resolution payload imaging verification system is connected to the detector platform test equipment through a gigabit Ethernet switch, retaining the real RS422 interface receiving function.
3. The ground verification system for global and local imaging capability in circum-Mars orbit according to claim 1, wherein the image rendering and processing module simulates the relationship between observed image brightness and integration time and simulates the camera observing the terminator.
4. The ground verification system for global and local imaging capability in circum-Mars orbit according to claim 1, wherein obtaining the camera-observed Mars surface image from the detector's pose relative to Mars comprises the following steps:
step 1: converting 6 degrees of freedom of the detector coordinate system relative to the Mars coordinate system into 6 degrees of freedom of the camera relative to the Mars coordinate system, and obtaining an external parameter matrix of the camera;
step 2: calculating the intersection point of the camera optical center ray passing through each pixel point of the camera image and the Mars sphere;
step 3: converting longitude and latitude coordinates of the intersection point into Mars image coordinates;
step 4: estimating the brightness of a corresponding pixel point on the camera observation image by using the brightness of the Mars image pixel;
step 5: and correcting the brightness of the camera observation image according to the normal direction of the observation point on the Mars surface, the incident direction of the sun and the observation direction of the camera.
5. The global and local imaging capability ground verification system according to claim 1, wherein the longitude and latitude of the pixels in the obtained Mars global image are defined in the Mars-fixed coordinate system: longitude increases in the same direction as Mars's rotation, latitude is 0 on the equatorial plane, 90 degrees at the north pole, and -90 degrees at the south pole.
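The coordinate convention of claim 5 can be sketched as a mapping from latitude/longitude into an equirectangular global Mars image; placing longitude 0 at column 0 and the particular image dimensions are assumptions for illustration.

```python
def latlon_to_image_xy(lat, lon, width, height):
    """Map latitude/longitude in degrees (+90 at the north pole, -90 at the
    south pole; longitude increasing with the planet's rotation) to fractional
    pixel coordinates in an equirectangular global image. The south pole maps
    to y == height; callers sampling the image would clamp or wrap as needed."""
    x = (lon % 360.0) / 360.0 * width
    y = (90.0 - lat) / 180.0 * height
    return x, y
```

For a 360x180 image this gives one pixel per degree, which makes the convention easy to check by eye.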
CN202111016237.5A 2021-08-31 2021-08-31 Global and local imaging capability ground verification system in ring fire track Active CN113742863B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111016237.5A CN113742863B (en) 2021-08-31 2021-08-31 Global and local imaging capability ground verification system in ring fire track

Publications (2)

Publication Number Publication Date
CN113742863A CN113742863A (en) 2021-12-03
CN113742863B (en) 2023-10-27

Family

ID=78734464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111016237.5A Active CN113742863B (en) 2021-08-31 2021-08-31 Global and local imaging capability ground verification system in ring fire track

Country Status (1)

Country Link
CN (1) CN113742863B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102737357A (en) * 2011-04-08 2012-10-17 中国科学院国家天文台 Method for generating simulation data of lunar three-linear array camera images
CN103017788A (en) * 2012-11-30 2013-04-03 北京控制工程研究所 Interplanetary autonomous navigation ground test verification system based on information fusion
CN106055107A (en) * 2016-06-07 2016-10-26 中国人民解放军国防科学技术大学 Space remote operation technology ground verification system based on man-in-loop
CN110849331A (en) * 2019-11-04 2020-02-28 上海航天控制技术研究所 Monocular vision measurement and ground test method based on three-dimensional point cloud database model
CN111735447A (en) * 2020-05-31 2020-10-02 南京航空航天大学 Satellite-sensitive-simulation type indoor relative pose measurement system and working method thereof
WO2021116078A1 (en) * 2019-12-13 2021-06-17 Connaught Electronics Ltd. A method for measuring the topography of an environment
CN113086255A (en) * 2021-03-16 2021-07-09 上海卫星工程研究所 Ground verification method and system for satellite to evaluate on-orbit stability by observing fixed star

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A camera-calibration-assisted monocular vision indoor positioning method; Wang Yong et al.; Bulletin of Surveying and Mapping (No. 2); pp. 38-43 *
Research on attitude pointing control and ground simulation verification for rendezvous spacecraft; Li Chao; China Master's Theses Full-text Database, Engineering Science and Technology II (No. 1); C031-815 *
Modeling and simulation of an all-day UAV aerial imaging reconnaissance system; Zhou Jinchun et al.; Computer Engineering and Applications (No. 24); pp. 259-265 *
Visual simulation of multi-system interactive verification for remote sensing observation satellites; Guo Chongbin et al.; Computer Simulation; Vol. 33 (No. 4); pp. 144-149 *

Also Published As

Publication number Publication date
CN113742863A (en) 2021-12-03

Similar Documents

Publication Publication Date Title
JP7413321B2 (en) Daily scene restoration engine
CN103245364B (en) Method for testing dynamic performance of star sensor
CN110849331B (en) Monocular vision measurement and ground test method based on three-dimensional point cloud database model
Beierle et al. Variable-magnification optical stimulator for training and validation of spaceborne vision-based navigation
Elmquist et al. Methods and models for simulating autonomous vehicle sensors
Vergauwen et al. A stereo-vision system for support of planetary surface exploration
CN107451957A (en) A kind of spaceborne TDI CMOS camera imagings emulation mode and equipment
CN109269495B (en) Dynamic star map generation method and device
CN115343744A (en) Optical single-double-star combined on-satellite positioning method and system for aerial moving target
CN109975836B (en) Method and device for calculating ground position of CCD image, electronic equipment and medium
CN115439528A (en) Method and equipment for acquiring image position information of target object
CN105547286B (en) A kind of compound three visual fields star sensor star map simulation method
CN113742863B (en) Global and local imaging capability ground verification system in ring fire track
Piccinin et al. ARGOS: Calibrated facility for Image based Relative Navigation technologies on ground verification and testing
Beierle High fidelity validation of vision-based sensors and algorithms for spaceborne navigation
CN113589318B (en) Simulation method for entrance pupil radiation image of satellite-borne infrared staring camera
Glamazda SBG camera of Kourovka Astronomical observatory
CN114659523A (en) Large-range high-precision attitude measurement method and device
Vergauwen et al. A stereo vision system for support of planetary surface exploration
Crockett et al. Visualization tool for advanced laser system development
CN101915581B (en) Comet optical surface signal simulation method for deep space exploration
Wen et al. A simulation system of space object images for space situation awareness
CN110127079B (en) Target flight characteristic simulation system under remote sensing visual field based on six-degree-of-freedom platform
CN116933567B (en) Space-based complex multi-scene space target simulation data set construction method
CN113791630B (en) Mars elliptic orbit image motion compensation control ground verification system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant