CN113742863A - Ground verification method for the global and local imaging capability of a medium-resolution camera in Mars orbit - Google Patents
Ground verification method for the global and local imaging capability of a medium-resolution camera in Mars orbit
- Publication number
- CN113742863A (application number CN202111016237.5A)
- Authority
- CN
- China
- Prior art keywords
- camera
- image
- mars
- coordinate system
- fire
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/17—Mechanical parametric or variational design
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4007—Scaling based on interpolation, e.g. bilinear interpolation
- G06F2111/00—Details relating to CAD techniques
- G06F2111/10—Numerical modelling
Abstract
The invention provides a ground verification method for the global and local imaging capability of a medium-resolution camera in Mars orbit. By fully integrating the influence of factors such as orbit, velocity and illumination on data application and processing, the method can simulate the dynamic real scene captured by the camera during the in-orbit operation of a Mars detector, and solves the problem of verifying the full flow of information interaction, imaging and image processing in medium-resolution Mars imaging detection. The invention supports the integrated design and test of the detector platform and the imaging payload, as well as analysis of the medium-resolution imaging capability. Its application reduces test cost, improves test efficiency, and allows intuitive evaluation of the dynamic imaging quality of the medium-resolution camera.
Description
Technical Field
The invention relates to the technical field of spacecraft, and in particular to a ground verification method for the global and local imaging capability of a medium-resolution camera in Mars orbit.
Background
A Mars global image map can be acquired by imaging the Martian surface with a medium-resolution camera. To acquire accurate image information, the orbit, the attitude and the camera imaging parameters must be coordinated and consistent. Existing verification is fragmented: optical simulation only verifies the compliance of optical indexes such as the point spread function and distortion through theoretical computer simulation, and cannot verify how strongly the optical system affects subsequent electronic imaging and ground data processing; electronic simulation only verifies the imaging and communication functions in isolation, without involving subsequent ground image mosaicking; and ground data processing only applies simulated images for mosaicking and similar operations, without fully integrating the influence of factors such as orbit, velocity and illumination on data application.
Patent document CN107504982A discloses a ground imaging simulation system for an aerial camera, comprising a target drum, a collimator, a reflector, a heat-insulating window, a temperature-altitude chamber and an aerial camera. A target mounted on the drum is aligned with the focal plane of the collimator; changing the axial distance between them produces a given target distance, while rotating the drum produces a given target speed, and the two together produce the corresponding velocity-to-height ratio. Target light passes through the collimator and reflector, enters an isolation window that seals off air pressure and temperature at the bottom of the chamber, and is imaged by the aerial camera placed inside the temperature-altitude chamber, which is set to the required temperature and air pressure.
Patent document CN107330544A discloses a method for satellite-to-ground imaging task planning based on an improved simulated annealing algorithm: S1, acquire the initial control temperature and the initial solution i; S2, run the iteration of the simulated annealing algorithm using the initial control temperature, the initial solution i and the Markov chain length L0, storing the better solution of each iteration in a memory matrix I; S3, once the stop criterion is met, refine each solution in the memory matrix I with a local search algorithm and output the best result as the optimal plan for the satellite-to-ground imaging task.
Patent document CN110363758B discloses a method and system for determining the imaging quality of an optical remote sensing satellite. The method acquires the digital counts of a remote sensing image of a point light source, where the source is an automatic reflective point light source placed on the ground and the counts are obtained by the satellite's imaging system; constructs a response-value objective function from those counts and the point spread function of the imaging system, with the point spread function expressed as a Gaussian model; solves the objective function by least squares to obtain the image-point coordinates of the point source; and determines the imaging quality of the satellite from those coordinates.
Patent document CN111222544A discloses a ground simulation test system for the influence of satellite flutter on camera imaging, comprising: a satellite flutter simulation assembly, consisting of an excitation platform, a camera slide rail mounted at its output end, and an imaging camera sliding on the rail, which simulates the platform jitter generated by in-orbit operation; a linear terrain simulation assembly facing the imaging camera, consisting of a target-board slide rail and a target board sliding on it, which simulates different satellite orbit heights; and a monitoring assembly, a computer connected to the imaging camera that acquires the measured flutter frequency and amplitude.
In the inventor's view, the above schemes verify medium-resolution camera imaging on the ground only piecewise, from the optical part, the electrical part and the ground data application separately, and cannot verify the information-flow interaction process, so the verification is insufficient. No verification system has been established that integrates optics, mechanics, electronics, software and control across the imaging, information interaction and image data processing of a medium-resolution camera. A technical solution is therefore needed to address these problems.
Disclosure of Invention
In view of the defects in the prior art, the invention aims to provide a ground verification method for the global and local imaging capability of a medium-resolution camera in Mars orbit.
The ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit comprises a data interface module, a camera simulation module, an attitude control module, and an image rendering and processing module;
the data interface module is respectively connected with the camera simulation module and the attitude control module; the image rendering and processing module is respectively connected with the camera simulation module and the attitude control module;
the data interface module carries out data communication and resource sharing while maintaining a high transmission rate.
Preferably, the data interface module manages the RS422 and LAN interfaces in the camera electrical-system simulation; the medium-resolution payload imaging verification system is connected with the detector platform test equipment through a gigabit Ethernet switch, while the receiving function of the real RS422 interface is retained.
Preferably, the camera simulation module performs optical-system simulation using the camera's internal and external parameters according to the instructions and modes transmitted by the data interface module, and passes the relevant parameters and attributes to the image rendering and processing module.
Preferably, generation of the simulated image observed by the camera by the camera simulation module comprises the following steps:
step S1: calculating the relative position and posture between a camera coordinate system and a Mars coordinate system to obtain a parameter matrix of the camera, and calculating a ray equation corresponding to each pixel point on the camera image;
step S2: calculating the intersection point of the ray and the surface of the Mars, finding 4 pixel points with known brightness adjacent to the intersection point on the obtained Mars image, and calculating the brightness of the pixel points on the simulation image;
step S3: correcting the brightness of pixels on the simulated image according to the sun incident angle and the camera observation direction, and sequentially executing the steps on 4096 x 3072 pixel points to obtain the simulated image;
step S4: and outputting the calculated full-width simulation image according to a full picture or windowing and sampling according to the setting of the image output mode.
Preferably, the camera is rigidly mounted on the detector, and parameters such as the camera's mounting position and orientation are expressed relative to the detector body coordinate system.
Preferably, fine adjustment of the camera position and adjustment of the camera attitude during simulation also use the detector body coordinate system as the reference frame.
Preferably, the reference coordinate system of the image rendering is a detector orbit coordinate system, and the position and attitude parameters of the camera are converted from a detector body coordinate system to the detector orbit coordinate system.
Preferably, the image rendering and processing module simulates the relationship between observed image brightness and integration time, and simulates the camera observing the terminator (the day/night line).
Preferably, obtaining the image of the Martian surface observed by the camera from the detector's pose relative to Mars comprises the following steps:
step 1: converting the 6 degrees of freedom of the detector coordinate system relative to the Mars coordinate system into the 6 degrees of freedom of the camera relative to the Mars coordinate system to obtain an external parameter matrix of the camera;
step 2: calculating the intersection point of each pixel point of the camera image passing through the camera optical center ray and the Mars spherical surface;
and step 3: converting longitude and latitude coordinates of the intersection points into Mars image coordinates;
and 4, step 4: estimating the brightness of a corresponding pixel point on an observation image by using the brightness of the pixels of the Mars image;
and 5: and correcting the brightness of the camera observation image according to the normal direction of the observation point on the surface of the mars, the solar incidence direction and the camera observation direction.
Preferably, the longitude and latitude of the pixels in the acquired Mars full map are defined in the Mars-fixed coordinate system: longitude increases in the direction of Mars's rotation, latitude is 0 at the equatorial plane, 90 degrees at the north pole, and -90 degrees at the south pole.
Compared with the prior art, the invention has the following beneficial effects:
1. the method can simulate the dynamic real scene captured by the camera during the in-orbit operation of the Mars detector, and can receive the simulation instructions, orbit, attitude parameters and other information from the detector platform test equipment;
2. after optical simulation calculation is carried out according to optical system parameters of a real camera, corresponding three-dimensional scene imaging information is generated and fed back to the testing equipment;
3. the invention realizes verification of the full flow of information interaction, imaging and image processing in medium-resolution Mars imaging detection.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a diagram of the relationships among the modules of the medium-resolution payload imaging verification system of the present invention;
FIG. 2 is a flow chart of Mars imaging model attitude adjustment according to the present invention;
FIG. 3 is a Mars image and longitude and latitude map of the present invention;
FIG. 4 is a detail view of the Mars of the present invention;
FIG. 5 is a diagram of the relationship between the camera coordinate system and the Mars image coordinate system according to the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit it in any way. It should be noted that those skilled in the art can make various changes and modifications without departing from the spirit of the invention, all of which fall within the scope of the present invention.
The invention provides a ground verification method for the global and local imaging capability of a medium-resolution camera in Mars orbit. The method simulates the dynamic real scene captured by the camera during the in-orbit operation of the Mars detector: it receives the simulation instructions, orbit and attitude parameters from the detector platform test equipment, performs optical simulation calculation according to the optical-system parameters of the real camera, and generates the corresponding three-dimensional scene imaging information to feed back to the test equipment, thereby realizing full-flow verification of information interaction, imaging and image processing in medium-resolution Mars imaging detection.
This in-orbit, in-the-loop imaging verification method computes the image within the field of view of the medium-resolution camera from the detector position, attitude and other data given by the detector platform test equipment, combined with the real optical-system parameters of the camera. By functional requirement, the medium-resolution payload imaging verification system is divided into a data interface module, a camera simulation module, an attitude control module, and an image rendering and processing module.
The data interface module manages the RS422 and LAN interfaces in the camera electrical-system simulation. The medium-resolution payload imaging verification system is connected to the detector platform test equipment through a gigabit Ethernet switch, while the receiving function of the real RS422 interface is retained, realizing data communication and resource sharing at a high transmission rate.
The camera simulation module performs optical-system simulation using the camera's internal and external parameters according to the instructions and modes transmitted by the data interface module, and passes the relevant parameters and attributes to the image rendering and processing module. Generating the simulated image observed by the camera comprises the following steps:
step S1: and calculating the relative position and posture between the camera coordinate system and the Mars coordinate system to obtain a parameter matrix of the camera, and calculating a ray equation corresponding to each pixel point on the camera image.
Step S2: and calculating the intersection point of the ray and the surface of the Mars, finding 4 pixel points with known brightness adjacent to the intersection point on the obtained Mars image, and calculating the brightness of the pixel points on the simulation image.
Step S3: and correcting the brightness of pixels on the simulated image according to the solar incident angle and the observation direction of the camera, and sequentially executing the steps on 4096 x 3072 pixel points to obtain the simulated image.
Step S4: and outputting the calculated full-width simulation image according to a full picture or windowing and sampling according to the setting of the image output mode.
Attitude control module: the camera is rigidly fixed on the detector, and parameters such as its mounting position and orientation are expressed relative to the detector body coordinate system. Fine adjustment of the camera position and adjustment of the camera attitude during simulation likewise use the detector body coordinate system as the reference frame. The reference frame for image rendering, however, is the detector orbit coordinate system, so the camera's position and attitude parameters must be converted from the detector body coordinate system to the detector orbit coordinate system before the camera's visible area can be correctly determined and a correct image rendered. The attitude adjustment flow of the Mars imaging model is shown in FIG. 2.
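The body-to-orbit conversion described above can be sketched as a simple pose composition. Function and variable names are hypothetical; the actual attitude representation used by the system is not specified:

```python
import numpy as np

def camera_pose_in_orbit_frame(R_orbit_body, t_body_in_orbit,
                               R_body_cam, t_cam_in_body):
    """Compose the camera mounting pose (body frame) with the detector pose
    to express the camera position and orientation in the orbit frame."""
    R_orbit_cam = R_orbit_body @ R_body_cam                          # orientation
    t_cam_in_orbit = t_body_in_orbit + R_orbit_body @ t_cam_in_body  # position
    return R_orbit_cam, t_cam_in_orbit

# Illustrative check: body frame rotated 90 degrees about z relative to the
# orbit frame, camera mounted 1 m along body +y, detector 1 km along orbit +x
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
R_oc, t_oc = camera_pose_in_orbit_frame(Rz90, np.array([1000.0, 0.0, 0.0]),
                                        np.eye(3), np.array([0.0, 1.0, 0.0]))
```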
Image rendering and processing module: the highest mountain on Mars is Olympus Mons, 21.2 km above the datum with a diameter of 648 km; the lowest basin is the Hellas Basin, 2 km below the datum with a diameter of 1600 km. The minimum orbit height of the detector is 260 km, at which imaging does not require simulated shadows to be considered. The resolution of the available Mars map is 231 m/pixel. When the detector altitude is 400 km, the camera's ground-sample resolution is 100 m/pixel, so each Mars map pixel is interpolated into roughly 2 simulated-image pixels; when the altitude is 260 km, the ground-sample resolution is 65 m/pixel, and each Mars map pixel is interpolated into roughly 4 simulated-image pixels. Rendering the simulated image involves two aspects: simulating the relationship between observed image brightness and integration time, and simulating the camera observing the terminator.
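The interpolation ratios quoted above follow directly from the map resolution and the camera ground-sample distance; a small arithmetic sketch:

```python
MAP_RES_M = 231.0  # Mars base-map resolution stated in the description, m/pixel

def sim_pixels_per_map_pixel(camera_gsd_m):
    """Linear ratio: how many simulated-image pixels one map pixel spans."""
    return MAP_RES_M / camera_gsd_m

ratio_400km = sim_pixels_per_map_pixel(100.0)  # camera GSD at 400 km altitude
ratio_260km = sim_pixels_per_map_pixel(65.0)   # camera GSD at 260 km altitude
# ratio_400km ~ 2.3 (roughly 2), ratio_260km ~ 3.6 (roughly 4)
```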
The longitude and latitude of the pixels in the acquired Mars full map are defined in the Mars-fixed coordinate system: longitude increases in the direction of Mars's rotation, latitude is 0 at the equatorial plane, 90° at the north pole, and -90° at the south pole. The relationship between the Mars surface image and longitude/latitude is shown in FIG. 3. The basic information of the Mars images obtained so far is as follows:
Latitude range: -90° to 90°
Longitude range: -180° to 180°
Resolution: 231.542 m/pixel
Angular resolution: 256 pixel/°
Semi-major axis: 3,396,190 m
Semi-minor axis: 3,376,200 m
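With the stated 256 pixel/° angular resolution, longitude and latitude map linearly to base-map pixel coordinates. A sketch, assuming an equirectangular layout with column 0 at longitude -180° and row 0 at latitude +90° (the exact origin convention is not stated in the patent):

```python
PIX_PER_DEG = 256.0  # angular resolution of the Mars base map, pixel/degree

def lonlat_to_map_pixel(lon_deg, lat_deg):
    """Longitude in [-180, 180] and latitude in [-90, 90] to base-map
    (column, row) coordinates, under the assumed origin convention."""
    col = (lon_deg + 180.0) * PIX_PER_DEG
    row = (90.0 - lat_deg) * PIX_PER_DEG
    return col, row

# Implied full-map size: 360 * 256 = 92160 columns by 180 * 256 = 46080 rows
```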
The Mars color image data volume is 12GB, the gray scale image data volume is 4GB, and FIG. 4 shows the local details on Mars.
Once the relationship between the detector coordinate system and the Mars-fixed system is known, the 6-degree-of-freedom relationship between the camera coordinate system and the Mars coordinate system, which constitutes the camera's external parameters, is obtained by coordinate conversion. The projections of the four corner points A, B, C and D of the camera image onto the Martian surface are computed; the longitude and latitude of the 4 intersection points in the Mars coordinate system are calculated and mapped onto the Mars full map, and the quadrilateral formed by the four points is the area of the Martian surface observed by the camera under the current external parameters, as shown in FIG. 5.
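Mapping an intersection point onto the full map requires converting its Mars-fixed Cartesian coordinates to longitude and latitude. A sketch under the same spherical approximation used for the ray intersection (latitude here is geocentric):

```python
import math

def xyz_to_lonlat(x, y, z):
    """Mars-fixed Cartesian point -> (longitude, latitude) in degrees,
    spherical approximation (geocentric latitude)."""
    lon = math.degrees(math.atan2(y, x))
    lat = math.degrees(math.asin(z / math.sqrt(x * x + y * y + z * z)))
    return lon, lat

# A surface point at zero longitude on the equator, and the north pole:
eq = xyz_to_lonlat(3396190.0, 0.0, 0.0)
pole = xyz_to_lonlat(0.0, 0.0, 3376200.0)
```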
Given the detector's 6 degrees of freedom relative to Mars, the image of the Martian surface observed by the camera is obtained as follows:
step 1: and converting the 6 degrees of freedom of the detector coordinate system relative to the Mars coordinate system into the 6 degrees of freedom of the camera relative to the Mars coordinate system to obtain an external parameter matrix of the camera.
Step 2: and calculating the intersection point of each pixel point of the camera image passing through the camera optical center ray and the Mars spherical surface.
And step 3: and converting the longitude and latitude coordinates of the intersection points into Mars image coordinates.
And 4, step 4: and (4) bilinear interpolation, namely estimating the brightness of a corresponding pixel point on the image observed by the camera by using the brightness of the pixels of the Mars image.
And 5: and correcting the brightness of the camera observation image according to the normal direction of the observation point on the surface of the mars, the solar incidence direction and the camera observation direction.
The input is the degrees of freedom of the detector coordinate system relative to the Mars coordinate system; the output is a 4096 × 3072 camera observation image.
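Steps 4 and 5 above can be sketched as follows. The patent specifies bilinear interpolation but not the exact brightness-correction model; a Lambertian cosine factor on the solar incidence angle, which also darkens pixels past the terminator, is assumed here for illustration:

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of map brightness at fractional coordinates (x, y).
    Assumes 0 <= x < width - 1 and 0 <= y < height - 1."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    p00, p10 = img[y0, x0],     img[y0, x0 + 1]
    p01, p11 = img[y0 + 1, x0], img[y0 + 1, x0 + 1]
    return (p00 * (1 - dx) * (1 - dy) + p10 * dx * (1 - dy)
            + p01 * (1 - dx) * dy + p11 * dx * dy)

def corrected_brightness(b, surface_normal, sun_direction):
    """Scale interpolated brightness by the cosine of the solar incidence angle
    (Lambertian assumption; the view-direction term is omitted in this sketch)."""
    n = surface_normal / np.linalg.norm(surface_normal)
    s = sun_direction / np.linalg.norm(sun_direction)
    return b * max(float(np.dot(n, s)), 0.0)  # night side (cos <= 0) renders dark
```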
The method can simulate the dynamic real scene captured by the camera during the in-orbit operation of the Mars detector, and can receive the simulation instructions, orbit, attitude parameters and other information from the detector platform test equipment. After optical simulation calculation according to the optical-system parameters of the real camera, the corresponding three-dimensional scene imaging information is generated and fed back to the test equipment, realizing verification of the full flow of information interaction, imaging and image processing in medium-resolution Mars imaging detection.
Those skilled in the art will appreciate that, in addition to implementing the system and its various devices, modules, units provided by the present invention as pure computer readable program code, the system and its various devices, modules, units provided by the present invention can be fully implemented by logically programming method steps in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and various devices, modules and units thereof provided by the invention can be regarded as a hardware component, and the devices, modules and units included in the system for realizing various functions can also be regarded as structures in the hardware component; means, modules, units for performing the various functions may also be regarded as structures within both software modules and hardware components for performing the method.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.
Claims (10)
1. A ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit, characterized by comprising a data interface module, a camera simulation module, an attitude control module, and an image rendering and processing module;
the data interface module is respectively connected with the camera simulation module and the attitude control module; the image rendering and processing module is respectively connected with the camera simulation module and the attitude control module;
the data interface module carries out data communication and resource sharing while maintaining a high transmission rate.
2. The ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit according to claim 1, wherein the data interface module manages the RS422 and LAN interfaces in the camera electrical-system simulation, and the medium-resolution payload imaging verification system is connected with the detector platform test equipment through a gigabit Ethernet switch, while the receiving function of the real RS422 interface is retained.
3. The ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit according to claim 1, wherein the camera simulation module performs optical-system simulation using the camera's internal and external parameters according to the instructions and modes transmitted by the data interface module, and passes the relevant parameters and attributes to the image rendering and processing module.
4. The ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit according to claim 3, wherein generation of the simulated image observed by the camera by the camera simulation module comprises the following steps:
step S1: calculating the relative position and posture between a camera coordinate system and a Mars coordinate system to obtain a parameter matrix of the camera, and calculating a ray equation corresponding to each pixel point on the camera image;
step S2: calculating the intersection point of the ray and the surface of the Mars, finding 4 pixel points with known brightness adjacent to the intersection point on the obtained Mars image, and calculating the brightness of the pixel points on the simulation image;
step S3: correcting the brightness of pixels on the simulated image according to the sun incident angle and the camera observation direction, and sequentially executing the steps on 4096 x 3072 pixel points to obtain the simulated image;
step S4: and outputting the calculated full-width simulation image according to a full picture or windowing and sampling according to the setting of the image output mode.
5. The ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit according to claim 1, wherein the camera is rigidly mounted on the detector, and parameters such as the camera's mounting position and orientation are expressed relative to the detector body coordinate system.
6. The ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit according to claim 5, wherein fine adjustment of the camera position and adjustment of the camera attitude during simulation use the detector body coordinate system as the reference frame.
7. The ground verification system for the global and local imaging capability of a medium-resolution camera in Mars orbit according to claim 6, wherein the reference coordinate system for image rendering is the detector orbit coordinate system, and the camera's position and attitude parameters are converted from the detector body coordinate system to the detector orbit coordinate system.
8. The system of claim 1, wherein the image rendering and processing module simulates the relationship between observed image brightness and integration time, and simulates the camera observing the terminator (the day-night boundary on the Mars surface).
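The brightness/integration-time relationship and the terminator rendering of claim 8 can be sketched with a simple linear-response model. This is an assumed illustrative model, not the patented implementation: the linear gain and the 12-bit saturation level (4095) are placeholders.

```python
import numpy as np

def observed_brightness(radiance, t_int, gain=1.0, full_well=4095.0):
    """Observed digital number grows linearly with integration time t_int
    until the pixel saturates (12-bit full well assumed)."""
    return np.minimum(gain * radiance * t_int, full_well)

def terminator_brightness(radiance, cos_incidence, t_int, gain=1.0, full_well=4095.0):
    """Night-side pixels (solar incidence cosine <= 0) render dark, which
    reproduces a terminator crossing in the simulated image."""
    lit = np.maximum(cos_incidence, 0.0)
    return observed_brightness(radiance * lit, t_int, gain, full_well)
```

Doubling the integration time doubles the unsaturated output, which is the behavior the module would need to reproduce when validating exposure settings on orbit.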
9. The ground verification system for global and local imaging capabilities in a Mars-circling orbit of claim 1, wherein the probe acquires camera observation images of the Mars surface through the following steps:
step 1: converting the 6 degrees of freedom of the probe coordinate system relative to the Mars coordinate system into the 6 degrees of freedom of the camera relative to the Mars coordinate system, obtaining the extrinsic parameter matrix of the camera;
step 2: for each pixel point of the camera image, calculating the intersection of the ray through the camera optical center with the Mars sphere;
step 3: converting the longitude and latitude coordinates of the intersection points into Mars global map coordinates;
step 4: estimating the brightness of the corresponding pixel point on the observation image from the brightness of the Mars global map pixels;
step 5: correcting the brightness of the camera observation image according to the surface normal direction at the observation point on Mars, the solar incidence direction, and the camera viewing direction.
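The geometric core of steps 2–3 above is a ray-sphere intersection followed by a Cartesian-to-longitude/latitude conversion. A minimal sketch, assuming a spherical Mars of mean radius 3396.2 km and coordinates expressed in a Mars-centered, Mars-fixed frame (all names and the radius value are illustrative assumptions):

```python
import numpy as np

R_MARS = 3396.2  # assumed mean radius of Mars, km

def ray_sphere_intersection(origin, direction, radius=R_MARS):
    """Step 2 (sketch): first intersection of a pixel ray (from the camera
    optical center at `origin`) with a sphere centered at the Mars center.
    Returns None when the ray misses the planet or points away from it."""
    d = direction / np.linalg.norm(direction)
    b = np.dot(origin, d)
    c = np.dot(origin, origin) - radius ** 2
    disc = b * b - c                 # quadratic discriminant (halved form)
    if disc < 0:
        return None                  # ray misses the sphere
    t = -b - np.sqrt(disc)           # nearer root: the visible surface point
    if t < 0:
        return None                  # intersection lies behind the camera
    return origin + t * d

def point_to_lonlat(p):
    """Step 3 (sketch): Cartesian surface point in the Mars-fixed frame to
    longitude/latitude in degrees."""
    lon = np.degrees(np.arctan2(p[1], p[0]))
    lat = np.degrees(np.arcsin(p[2] / np.linalg.norm(p)))
    return lon, lat
```

For example, a camera on the +X axis at two Mars radii looking along −X intersects the surface at (R_MARS, 0, 0), i.e. longitude 0°, latitude 0°.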
10. The ground verification system for global and local imaging capabilities in a Mars-circling orbit of claim 1, wherein the longitude and latitude of the pixel points in the obtained Mars global map are defined in the Mars-fixed coordinate system: the direction of Mars rotation is the direction of increasing longitude, the latitude at the equatorial plane is 0 degrees, the latitude at the north pole is 90 degrees, and the latitude at the south pole is -90 degrees.
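Under the conventions of claim 10, mapping a Mars-fixed longitude/latitude to a pixel of the global map is a direct linear scaling, assuming an equirectangular (plate carrée) layout for the map image; the projection choice and function name are assumptions for illustration:

```python
def lonlat_to_map_pixel(lon_deg, lat_deg, width, height):
    """Map Mars-fixed longitude/latitude (degrees) to pixel coordinates of an
    equirectangular Mars global map: longitude increases with the rotation
    direction across [0, 360); latitude runs from +90 (north pole, row 0)
    down to -90 (south pole, last row), with the equator at 0 degrees."""
    col = (lon_deg % 360.0) / 360.0 * (width - 1)
    row = (90.0 - lat_deg) / 180.0 * (height - 1)
    return col, row
```

The modulo keeps longitudes outside [0, 360) consistent with the wrap-around of the rotating planet, and the sign flip on latitude places the north pole at the top row of the map image.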
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111016237.5A CN113742863B (en) | 2021-08-31 | 2021-08-31 | Global and local imaging capability ground verification system in ring fire track |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113742863A true CN113742863A (en) | 2021-12-03 |
CN113742863B CN113742863B (en) | 2023-10-27 |
Family
ID=78734464
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111016237.5A Active CN113742863B (en) | 2021-08-31 | 2021-08-31 | Global and local imaging capability ground verification system in ring fire track |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113742863B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102737357A (en) * | 2011-04-08 | 2012-10-17 | 中国科学院国家天文台 | Method for generating simulation data of lunar three-linear array camera images |
CN103017788A (en) * | 2012-11-30 | 2013-04-03 | 北京控制工程研究所 | Interplanetary autonomous navigation ground test verification system based on information fusion |
CN106055107A (en) * | 2016-06-07 | 2016-10-26 | 中国人民解放军国防科学技术大学 | Space remote operation technology ground verification system based on man-in-loop |
CN110849331A (en) * | 2019-11-04 | 2020-02-28 | 上海航天控制技术研究所 | Monocular vision measurement and ground test method based on three-dimensional point cloud database model |
CN111735447A (en) * | 2020-05-31 | 2020-10-02 | 南京航空航天大学 | Satellite-sensitive-simulation type indoor relative pose measurement system and working method thereof |
WO2021116078A1 (en) * | 2019-12-13 | 2021-06-17 | Connaught Electronics Ltd. | A method for measuring the topography of an environment |
CN113086255A (en) * | 2021-03-16 | 2021-07-09 | 上海卫星工程研究所 | Ground verification method and system for satellite to evaluate on-orbit stability by observing fixed star |
Non-Patent Citations (4)
Title |
---|
ZHOU Jinchun et al., "Modeling and Simulation of an All-day Aerial Imaging Reconnaissance System for UAVs", Computer Engineering and Applications, no. 24, pages 259 - 265 *
LI Chao, "Research on Attitude Pointing Control and Ground Simulation Verification of Rendezvous Spacecraft", China Master's Theses Full-text Database, Engineering Science and Technology II, no. 1, pages 031 - 815 *
WANG Yong et al., "A Monocular Vision Indoor Positioning Method Assisted by Camera Calibration", Bulletin of Surveying and Mapping, no. 2, pages 38 - 43 *
GUO Chongbin et al., "Visualization Simulation of Multi-system Interactive Verification for Remote Sensing Observation Satellites", Computer Simulation, vol. 33, no. 4, pages 144 - 149 *
Also Published As
Publication number | Publication date |
---|---|
CN113742863B (en) | 2023-10-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7413321B2 (en) | Daily scene restoration engine | |
CN110849331B (en) | Monocular vision measurement and ground test method based on three-dimensional point cloud database model | |
CN103245364B (en) | Method for testing dynamic performance of star sensor | |
MX2013003853A (en) | Rapid 3d modeling. | |
CN106679676B (en) | A kind of monoscopic multifunctional optical sensor and implementation method | |
CN104573251A (en) | Method for determining full-field-of-view apparent spectral radiance of satellite-borne optical remote sensor | |
CN112284687B (en) | Imaging simulation system and method suitable for deep space exploration imaging spectrometer | |
CN105444780A (en) | System and processing method for verifying image location of satellite-borne whisk broom optical camera | |
CN107451957A (en) | A kind of spaceborne TDI CMOS camera imagings emulation mode and equipment | |
CN115343744A (en) | Optical single-double-star combined on-satellite positioning method and system for aerial moving target | |
CN109269495B (en) | Dynamic star map generation method and device | |
CN105547286B (en) | A kind of compound three visual fields star sensor star map simulation method | |
CN113742863B (en) | Global and local imaging capability ground verification system in ring fire track | |
CN115270522B (en) | Method and device for simulating and tracking target equipment based on WGS84 coordinates | |
CN113686361B (en) | Ground verification system and method for satellite detection and space-earth collaborative navigation | |
CN114485620B (en) | Autonomous visual positioning system and method for asteroid detector fused with orbit dynamics | |
CN115688440A (en) | Lunar digital environment construction simulation system | |
Glamazda | SBG camera of Kourovka Astronomical observatory | |
CN115292287A (en) | Automatic labeling and database construction method for satellite feature component image | |
Loubser | The development of Sun and Nadir sensors for a solar sail CubeSat | |
CN113791630B (en) | Mars elliptic orbit image motion compensation control ground verification system | |
Vergauwen et al. | A stereo vision system for support of planetary surface exploration | |
Avanesov et al. | Luna-25 service television system | |
Ludivig et al. | Testing Environments for Lunar Surface Perception Systems; Combining Indoor Facilities, Virtual Environments and Analogue Field Tests | |
Crockett et al. | Visualization tool for advanced laser system development |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||