CN104182632B - Disturbance image based method for synthesizing long-exposed deep space visual simulation images - Google Patents


Info

Publication number
CN104182632B
CN104182632B (application CN201410415421.0A)
Authority
CN
China
Prior art keywords
disturbance
transformation matrix
camera
deep space
star
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410415421.0A
Other languages
Chinese (zh)
Other versions
CN104182632A (en)
Inventor
周付根 (Zhou Fugen)
资粤 (Zi Yue)
吴福祥 (Wu Fuxiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University
Priority to CN201410415421.0A
Publication of CN104182632A
Application granted
Publication of CN104182632B

Landscapes

  • Studio Devices (AREA)

Abstract

A disturbance image based method for synthesizing long-exposed deep space visual simulation images includes the steps of: 1, establishing a high-fidelity deep space three-dimensional scene from a known star catalogue; 2, generating a camera orientation disturbance set from a 2D (two-dimensional) disturbance path image, and generating a camera position disturbance set from 3D (three-dimensional) lines; 3, randomly selecting a disturbance transformation matrix from the disturbance sets obtained in step 2 as the current disturbance transformation matrix; 4, setting the current camera transformation matrix using the disturbance transformation matrix obtained in step 3; 5, acquiring star maps using the deep space three-dimensional scene obtained in step 1 and the current camera transformation matrix obtained in step 4; 6, repeating steps 3 to 5 to obtain the series of star maps acquired within one exposure time, and synthesizing them to obtain the long-exposure star map. The method has promising application prospects in the field of image processing for deep space optical autonomous navigation.

Description

Disturbance image based method for synthesizing long-exposure deep space visual simulation images
Technical field
The invention belongs to the field of image processing for deep space optical autonomous navigation, and in particular relates to a disturbance image based method for synthesizing long-exposure deep space visual simulation images.
Background technology
During the deep-space cruise phase, a deep-space probe differs from a near-Earth-orbit spacecraft. Besides the fixed stars in the field of view, whose detectable magnitude is very low, optical navigation must also occasionally use the asteroids that appear near the cruise segment, whose magnitudes are about 9 to 12, so the sensor needs a long exposure. During the exposure time the position and attitude of the probe and the sensor are disturbed; a navigation star is therefore no longer imaged as a single point, but changes position in the image as the disturbance evolves, forming a star map with disturbance trails. This poses new challenges for image processing and simulation methods and calls for new research.
At present there are many methods for simulating point-source star images, but there has been almost no systematic study of star-map simulation under long-exposure conditions with disturbance.
Summary of the invention
The object of the invention is to provide a disturbance image based method for synthesizing long-exposure deep space visual simulation images, which uses simulation to reproduce the process by which the star sensor acquires star maps during the deep-space cruise phase of a deep-space exploration mission.
The technical scheme that realizes the object of the invention is a disturbance image based method for synthesizing long-exposure deep space visual simulation images, characterized by the following steps:
Step 1: build a high-fidelity deep-space three-dimensional scene from known star catalogue data;
Step 2: generate the camera orientation disturbance set from a 2D disturbance path image, and generate the camera position disturbance set from a 3D line;
Step 3: randomly select one disturbance transformation matrix from the disturbance sets obtained in step 2 as the current disturbance transformation matrix;
Step 4: set the current camera transformation matrix using the disturbance transformation matrix obtained in step 3;
Step 5: acquire a star map using the deep-space three-dimensional scene from step 1 and the current camera transformation matrix from step 4;
Step 6: repeat steps 3 to 5 to obtain the series of star maps acquired within the exposure time, and synthesize them to obtain the long-exposure star map.
The step 2 operation, generating the camera orientation disturbance set from a 2D disturbance path image and the camera position disturbance set from a 3D line, is implemented as follows:
(1) Generating the orientation disturbance set Θ
First generate a 2D disturbance path image and set the image position through which the camera optical axis passes; then, for each non-zero pixel, compute the current optical axis by the following formula:
Formula (1)
λ and α in formula (1) are computed as follows:
Formula (2)
where (x_p, y_p) is a non-zero pixel of the disturbance path image, (x_c, y_c) are the image coordinates through which the camera optical axis passes, (s_x, s_y) are the zoom factors in the x and y directions, and z_c is the depth at which the 2D disturbance path image is placed. In addition, the value v_p of the corresponding pixel represents the probability that the disturbance occurs: the larger the pixel value, the larger the probability.
This yields the orientation disturbance set (R_tur,p, v_p) ∈ Θ.
(2) Generating the position disturbance set Ξ
First generate the camera position disturbance directly from a 3D line, then build the position disturbance matrix T_tur for each point on the line:
Formula (3)
where (x, y, z) are the three-dimensional coordinates of the corresponding point.
This yields the position disturbance set T_tur,p ∈ Ξ.
The beneficial effects of the present invention are:
(1) Since no mature sensor can currently provide such images, the invention builds a high-fidelity deep-space three-dimensional scene and uses simulation to reproduce how a deep-space probe's star sensor, subject to disturbance during the deep-space cruise phase, obtains a long-exposure star map. This provides simulated star maps for restoration algorithms that target disturbed long-exposure star maps.
(2) The disturbance pattern in the invention is not generated at random: the disturbance of the camera can be pre-estimated from the structure of the probe and its flight state. Simulating with the pre-estimated disturbance yields star maps closer to those obtained in practice.
(3) The invention simulates long exposure of the sensor; the synthesized long-exposure star map makes it possible to distinguish fast-moving objects in the image from the deep-space background.
Description of the drawings
Fig. 1: flow chart of the disturbance image based method for synthesizing long-exposure deep space visual simulation images;
Fig. 2: schematic diagram of camera orientation disturbance generation: a 2D disturbance path is set up and projected onto a sphere, and the camera orientation disturbance set is then generated;
Fig. 3(a): an originally acquired star map;
Fig. 3(b): a 2D disturbance path image;
Fig. 3(c): the long-exposure star map with disturbance.
Specific embodiments
See Fig. 1 to Fig. 3(c). For a better understanding of the technical scheme of the present invention, the invention is described in detail below with reference to the accompanying drawings and specific embodiments.
The present invention is a disturbance image based method for synthesizing long-exposure deep space visual simulation images; the method mainly includes the following steps:
1. Build a high-fidelity deep-space three-dimensional scene from known star catalogue data.
2. Generate the camera orientation disturbance set from a 2D disturbance path image, and generate the camera position disturbance set from a 3D line.
3. Randomly select one disturbance transformation matrix from the disturbance sets obtained in step 2 as the current disturbance transformation matrix.
4. Set the current camera transformation matrix using the disturbance transformation matrix obtained in step 3.
5. Acquire a star map using the deep-space three-dimensional scene from step 1 and the current camera transformation matrix from step 4.
6. Repeat steps 3 to 5 to obtain the series of star maps acquired within the exposure time, and synthesize them to obtain the long-exposure star map.
The implementation flow of the present invention is shown in Fig. 1; the implementation details of each part are as follows:
1. Building the high-fidelity deep-space three-dimensional scene from known star catalogue data
First the deep-space optical environment is simulated: the optical characteristics of the light sources in deep space, such as fixed stars, major planets and asteroids, are computed. The simulation camera is set up from the current position and time of the spacecraft and the mounting attitude of the star sensor, and the deep-space three-dimensional scene is then drawn with OpenGL. Drawing the scene image requires three-dimensional data for the major planets, asteroids, comets and so on. This method uses the internationally adopted Hipparcos catalogue (ESA, 1997) and computes the rectangular coordinates of the fixed stars from their right ascension, declination and distance; the positions of the major planets are determined by interpolating the VSOP87B heliocentric ephemeris.
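The catalogue conversion mentioned above, from right ascension, declination and distance to rectangular coordinates, is the standard spherical-to-Cartesian transform. A minimal sketch (function and variable names are illustrative, not from the patent):

```python
import math

def star_to_rect(ra_deg, dec_deg, dist):
    """Convert a star's right ascension/declination (degrees) and
    distance to rectangular equatorial coordinates: the standard
    spherical-to-Cartesian conversion."""
    ra = math.radians(ra_deg)
    dec = math.radians(dec_deg)
    x = dist * math.cos(dec) * math.cos(ra)
    y = dist * math.cos(dec) * math.sin(ra)
    z = dist * math.sin(dec)
    return (x, y, z)
```

The resulting points can be handed to the scene renderer as the fixed-star positions.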
2. Generating the camera orientation disturbance set from the 2D disturbance path image and the camera position disturbance set from the 3D line
(1) Generating the orientation disturbance set Θ
First generate a 2D disturbance path image and set the image position through which the camera optical axis passes; then, for each non-zero pixel, compute the current optical axis by the following formula:
Formula (1)
λ and α in formula (1) are computed as follows:
Formula (2)
where (x_p, y_p) is a non-zero pixel of the disturbance path image, (x_c, y_c) are the image coordinates through which the camera optical axis passes, (s_x, s_y) are the zoom factors in the x and y directions, and z_c is the depth at which the 2D disturbance path image is placed. In addition, the value v_p of the corresponding pixel represents the probability that the disturbance occurs: the larger the pixel value, the larger the probability.
This yields the orientation disturbance set (R_tur,p, v_p) ∈ Θ.
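Formulas (1) and (2) are not reproduced in the text, so the exact optical-axis computation cannot be restated. A plausible sketch of building the orientation disturbance set, assuming each non-zero path pixel is back-projected through the optical-axis pixel at depth z_c (the patent stores a rotation matrix R_tur,p; deriving the rotation from the perturbed axis direction is omitted here):

```python
import math

def axis_direction(xp, yp, xc, yc, sx, sy, zc):
    # Back-project a non-zero path pixel (xp, yp) relative to the
    # optical-axis pixel (xc, yc) placed at depth zc; sx, sy scale
    # pixel offsets.  One plausible reading of the unreproduced
    # formulas (1)-(2), not the patent's exact computation.
    vx = (xp - xc) * sx
    vy = (yp - yc) * sy
    n = math.sqrt(vx * vx + vy * vy + zc * zc)
    return (vx / n, vy / n, zc / n)

def orientation_disturbance_set(path_image, xc, yc, sx, sy, zc):
    # path_image: 2D list of pixel values; non-zero pixels lie on the
    # drawn disturbance path.  Each element pairs a perturbed optical
    # axis with the pixel value v_p (the disturbance probability weight).
    theta = []
    for yp, row in enumerate(path_image):
        for xp, vp in enumerate(row):
            if vp != 0:
                theta.append((axis_direction(xp, yp, xc, yc, sx, sy, zc), vp))
    return theta
```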
(2) Generating the position disturbance set Ξ
First generate the camera position disturbance directly from a 3D line, then build the position disturbance matrix T_tur for each point on the line:
Formula (3)
where (x, y, z) are the three-dimensional coordinates of the corresponding point.
This yields the position disturbance set T_tur,p ∈ Ξ.
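Formula (3) is likewise not reproduced. Assuming T_tur is an ordinary 4x4 homogeneous translation matrix built from each sampled line point, which is a natural reading of "position disturbance matrix", the set Ξ can be sketched as:

```python
def translation_matrix(x, y, z):
    # 4x4 homogeneous translation: a plausible form of the patent's
    # position-disturbance matrix T_tur (formula (3) is not
    # reproduced in the text).
    return [[1.0, 0.0, 0.0, x],
            [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

def position_disturbance_set(line_points):
    # line_points: sampled 3D points along the drawn disturbance line
    return [translation_matrix(x, y, z) for (x, y, z) in line_points]
```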
3. Randomly selecting a disturbance transformation matrix from the disturbance sets obtained in step 2 as the current disturbance transformation matrix
Randomly generate i ∈ [0, N_R) and υ ∈ [0, 255], where N_R is the number of elements in the orientation disturbance set Θ. If υ < v_i, choose R_tur,i as the current orientation disturbance transformation matrix R_tur; otherwise set R_tur to the identity matrix.
Randomly generate i ∈ [0, N_T), where N_T is the number of elements in the position disturbance set Ξ, and choose T_tur,i as the current position disturbance transformation matrix T_tur.
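The selection rule above can be sketched directly; the identity matrix stands in for "the unit matrix", and the container shapes are illustrative:

```python
import random

def pick_orientation(theta, identity):
    # theta: list of (R_tur, v_p) pairs; v_p in [0, 255] weights the
    # probability that the disturbance actually fires.  Draw an index
    # and a threshold; keep the disturbance only if the threshold
    # falls below v_p, otherwise return the identity rotation.
    i = random.randrange(len(theta))
    upsilon = random.randrange(256)
    r, vp = theta[i]
    return r if upsilon < vp else identity

def pick_position(xi):
    # every position disturbance matrix is equally likely
    return random.choice(xi)
```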
4. Setting the current camera transformation matrix from the disturbance transformation matrices obtained in step 3
Set T_tur · M_c · R_tur as the current camera transformation matrix, where T_tur is the translation disturbance matrix chosen in step 3, M_c is the standard camera transformation matrix, and R_tur is the orientation disturbance transformation matrix chosen in step 3.
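Composing the current camera transformation matrix is a plain matrix product in the order the patent gives; a sketch with hand-rolled 4x4 multiplication (helper names are illustrative):

```python
def matmul(a, b):
    # product of two 4x4 matrices stored as nested lists
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def current_camera_matrix(t_tur, m_c, r_tur):
    # compose translation disturbance, standard camera matrix and
    # orientation disturbance in the patent's order: T_tur . M_c . R_tur
    return matmul(matmul(t_tur, m_c), r_tur)
```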
5. Acquiring a star map using the deep-space three-dimensional scene from step 1 and the current camera transformation matrix from step 4
According to the high-fidelity deep-space three-dimensional scene built in step 1 and the camera transformation matrix set in step 4, the renderer draws the current deep-space view, yielding the simulated star map under the current camera attitude.
When drawing the star map, the brightness of pixel (m, n) of the image (0 ≤ m < M, 0 ≤ n < N; the image size is M × N) can be computed by the following formula:
Formula (4)
where M_i is the apparent magnitude, μ_i(x, y) is the point spread function, and C and B are constants. For computational efficiency some further simplifications are made: the point spread function μ_i(x, y) is separated into a diffusion texture, the magnitude-dependent term is written into the alpha channel, and the accumulated summation is carried out by the rendering pipeline.
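Formula (4) is not reproduced in the text. A sketch consistent with the description, using the standard magnitude-to-flux relation 2.512^(-M_i) and a Gaussian point spread function as stand-ins (C, B are the constants mentioned above; sigma is an assumed PSF width):

```python
import math

def star_brightness(m, n, stars, c=255.0, b=0.0, sigma=1.0):
    # Brightness of pixel (m, n).  stars is a list of
    # (x, y, M_i) star-centre positions and apparent magnitudes.
    # This is a plausible reconstruction, not the patent's exact
    # formula (4): flux ~ 2.512**(-M_i), Gaussian mu_i as the PSF,
    # B as an additive background constant.
    total = b
    for (x, y, mag) in stars:
        mu = math.exp(-((m - x) ** 2 + (n - y) ** 2) / (2 * sigma ** 2))
        total += c * (2.512 ** -mag) * mu
    return total
```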
6. Synthesizing all star maps collected within the exposure time
Repeat steps 3 to 5 to acquire N = T_exp / dt_exp images continuously within the exposure time, where T_exp is the exposure time and dt_exp is the sampling interval, and synthesize them by the following formula to obtain the long-exposure result image:
Formula (5)
where α is the sensitivity factor of the camera, w_i is the synthesis weight of each acquired image, and I_i(x, y) is the simulated star map acquired at the i-th sampling.
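Formula (5) is not reproduced either; a weighted accumulation I(x, y) = α Σ_i w_i I_i(x, y) matches the surrounding description and can be sketched as:

```python
def synthesize(frames, weights, alpha=1.0):
    # frames: the N simulated star maps (2D lists) acquired at the
    # sampling interval dt_exp inside the exposure time T_exp;
    # weights: per-frame synthesis weights w_i; alpha: the camera
    # sensitivity factor.  Weighted pixel-wise accumulation, assumed
    # from the description since formula (5) is not in the text.
    rows, cols = len(frames[0]), len(frames[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for w, frame in zip(weights, frames):
        for r in range(rows):
            for c in range(cols):
                out[r][c] += alpha * w * frame[r][c]
    return out
```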

Claims (1)

1. A disturbance image based method for synthesizing long-exposure deep space visual simulation images, characterized in that it includes the following steps:
Step 1: build a high-fidelity deep-space three-dimensional scene from known star catalogue data;
Step 2: generate the camera orientation disturbance set from a 2D disturbance path image, and generate the camera position disturbance set from a 3D line;
Step 3: randomly select one disturbance transformation matrix from the disturbance sets obtained in step 2 as the current disturbance transformation matrix;
Step 4: set the current camera transformation matrix using the disturbance transformation matrix obtained in step 3;
Step 5: acquire a star map using the deep-space three-dimensional scene from step 1 and the current camera transformation matrix from step 4;
Step 6: repeat steps 3 to 5 to obtain the series of star maps acquired within the exposure time, and synthesize them to obtain the long-exposure star map;
wherein the step 2 operation, generating the camera orientation disturbance set from a 2D disturbance path image and the camera position disturbance set from a 3D line, is implemented as follows:
(1) Generating the orientation disturbance set Θ
First generate a 2D disturbance path image and set the image position through which the camera optical axis passes; for each non-zero pixel, compute the current optical axis by formula (1), in which λ and α are computed by formula (2),
where (x_p, y_p) is a non-zero pixel of the disturbance path image, (x_c, y_c) are the image coordinates through which the camera optical axis passes, (s_x, s_y) are the zoom factors in the x and y directions, and z_c is the depth at which the 2D disturbance path image is placed; in addition, the value v_p of the corresponding pixel represents the probability that the disturbance occurs: the larger the pixel value, the larger the probability;
this yields the orientation disturbance set (R_tur,p, v_p) ∈ Θ;
(2) Generating the position disturbance set Ξ
First generate the camera position disturbance directly from a 3D line, then build the position disturbance matrix T_tur for each point on the line,
where (x, y, z) are the three-dimensional coordinates of the corresponding point; this yields the position disturbance set T_tur,p ∈ Ξ.
CN201410415421.0A 2014-08-21 2014-08-21 Disturbance image based method for synthesizing long-exposed deep space visual simulation images Expired - Fee Related CN104182632B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410415421.0A CN104182632B (en) 2014-08-21 2014-08-21 Disturbance image based method for synthesizing long-exposed deep space visual simulation images


Publications (2)

Publication Number Publication Date
CN104182632A CN104182632A (en) 2014-12-03
CN104182632B true CN104182632B (en) 2017-04-26

Family

ID=51963667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410415421.0A Expired - Fee Related CN104182632B (en) 2014-08-21 2014-08-21 Disturbance image based method for synthesizing long-exposed deep space visual simulation images

Country Status (1)

Country Link
CN (1) CN104182632B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0969415A2 * 1998-06-30 2000-01-05 Lucent Technologies Inc. Display techniques for three-dimensional virtual reality
CN102116633A (en) * 2009-12-31 2011-07-06 北京控制工程研究所 Simulation checking method for deep-space optical navigation image processing algorithm
CN102116626A (en) * 2009-12-31 2011-07-06 北京控制工程研究所 Prediction and correction method of node of star point track image
CN102114919A (en) * 2009-12-31 2011-07-06 北京控制工程研究所 Asteroid imaging simulator at deep space exploration transition stage

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A new method for autonomous satellite celestial navigation based on information fusion; Ning Xiaolin et al.; Journal of Astronautics; 2003-11-30; Vol. 24, No. 6; pp. 579-583, 633 *
Ground functional test method for star sensors using simulated star maps; Wei Xinguo et al.; Infrared and Laser Engineering; 2008-12-31; Vol. 37, No. 6; pp. 1087-1091 *
Star map simulation method for star sensors considering satellite orbital motion and image motion; Liu Haibo et al.; Journal of Astronautics; 2011-05-31; Vol. 32, No. 5; pp. 1190-1194 *

Also Published As

Publication number Publication date
CN104182632A (en) 2014-12-03


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 2017-04-26; termination date: 2018-08-21)