CN112927177A - Multi-frame image rapid reconstruction method on microsatellite platform - Google Patents
Multi-frame image rapid reconstruction method on microsatellite platform Download PDF Info
- Publication number
- CN112927177A CN112927177A CN202110260840.1A CN202110260840A CN112927177A CN 112927177 A CN112927177 A CN 112927177A CN 202110260840 A CN202110260840 A CN 202110260840A CN 112927177 A CN112927177 A CN 112927177A
- Authority
- CN
- China
- Prior art keywords
- image
- frame
- observation
- images
- satellite
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 62
- 108091092878 Microsatellite Proteins 0.000 title claims abstract description 20
- 238000001228 spectrum Methods 0.000 claims abstract description 32
- 238000001914 filtration Methods 0.000 claims abstract description 12
- 230000008569 process Effects 0.000 claims abstract description 10
- 230000003287 optical effect Effects 0.000 claims description 16
- 238000006073 displacement reaction Methods 0.000 claims description 13
- 238000005259 measurement Methods 0.000 claims description 8
- 238000007781 pre-processing Methods 0.000 claims description 7
- 238000012545 processing Methods 0.000 claims description 7
- 230000009467 reduction Effects 0.000 claims description 7
- 238000004164 analytical calibration Methods 0.000 claims description 6
- 230000005855 radiation Effects 0.000 claims description 6
- 238000004422 calculation algorithm Methods 0.000 claims description 5
- 230000007423 decrease Effects 0.000 claims description 5
- 210000001747 pupil Anatomy 0.000 claims description 4
- 230000003595 spectral effect Effects 0.000 claims description 4
- 239000000654 additive Substances 0.000 claims description 3
- 230000000996 additive effect Effects 0.000 claims description 3
- 238000012937 correction Methods 0.000 claims description 3
- 230000002708 enhancing effect Effects 0.000 claims description 3
- 230000001965 increasing effect Effects 0.000 claims description 3
- 238000003672 processing method Methods 0.000 claims description 3
- 230000004044 response Effects 0.000 claims description 3
- 238000013441 quality evaluation Methods 0.000 claims description 2
- 238000003384 imaging method Methods 0.000 abstract description 13
- 230000010354 integration Effects 0.000 description 12
- 230000000875 corresponding effect Effects 0.000 description 10
- 230000000694 effects Effects 0.000 description 9
- 238000004364 calculation method Methods 0.000 description 6
- 238000013459 approach Methods 0.000 description 3
- 238000011161 development Methods 0.000 description 3
- 238000009826 distribution Methods 0.000 description 3
- 230000006872 improvement Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000012544 monitoring process Methods 0.000 description 2
- 238000005375 photometry Methods 0.000 description 2
- 238000004088 simulation Methods 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 238000013016 damping Methods 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000004069 differentiation Effects 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 239000002360 explosive Substances 0.000 description 1
- 239000010419 fine particle Substances 0.000 description 1
- 238000009472 formulation Methods 0.000 description 1
- 230000004927 fusion Effects 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 238000011946 reduction process Methods 0.000 description 1
- 238000000638 solvent extraction Methods 0.000 description 1
- 230000006641 stabilisation Effects 0.000 description 1
- 238000011105 stabilization Methods 0.000 description 1
- 230000000087 stabilizing effect Effects 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000009827 uniform distribution Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/344—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
Abstract
The invention relates to a method for rapidly reconstructing multi-frame images on a microsatellite platform, which comprises the following steps: a. establishing a load observation jitter model according to the power spectrum of the satellite platform; b. acquiring short-exposure images in combination with the jitter model, and registering the images; c. performing multi-frame superposition reconstruction on the registered images; d. denoising and filtering the reconstructed image. The method can accurately and quickly solve the problem of restoring multiple frames affected by relative motion, as arises in real observation scenarios on a microsatellite platform; it effectively removes the focal-plane image blurring caused by load pointing jitter during imaging and improves the imaging resolution of the system. The computational complexity is low, giving the method the capability of real-time on-board operation, and it is suitable for observations in different scenarios, such as astronomical imaging and Earth observation, based on a microsatellite or similar spacecraft.
Description
Technical Field
The invention relates to a multi-frame image rapid reconstruction method on a microsatellite platform.
Background
At present, the trend toward miniaturization in the global satellite application field continues to accelerate, and microsatellites are developing explosively. Progress in engineering subsystems such as attitude control, thermal control, propulsion, communication, on-board processing, and payloads has pushed microsatellites further toward high performance. For example, the "Purple Pupil" scientific satellite is a micro astronomical satellite developed in recent years; its mission plan is to carry out two-color photometry of exoplanet occultations, stellar activity monitoring in the ultraviolet band, and near-Earth asteroid monitoring from Earth orbit.
The pointing direction of a space telescope drifts or jitters randomly during observation, which reduces the accuracy with which a target can be observed. Compared with other spacecraft, microsatellites are more sensitive to internal and external disturbances and are more prone to jitter. The jitter of the satellite platform causes the load to produce blurred images during long-exposure observations, resulting in attenuation and even loss of certain frequency components. These factors cause the observed pointing of the load to deviate from the expected pointing, specifically through a drift component in a particular direction plus a random jitter component. From the perspective of improving observation precision, scientific objectives therefore place higher requirements on the attitude stability of a microsatellite. It is thus necessary to design observation plans in a targeted manner and to study a corresponding image reconstruction scheme.
The existing technique for correcting such blur is generally referred to as image deconvolution and is a key step in improving image resolution and contrast to obtain high-quality images. A common approach is to infer parameters from noisy and incomplete observations, which are often insufficient. Single-frame deconvolution is usually mathematically underdetermined and suffers from stability and non-uniqueness problems. To date, accurate reconstruction still depends on prior information, and the computational complexity of current methods may not meet real-time requirements on a satellite platform.
Under the classic multi-frame processing framework, such problems are mainly solved by iteratively estimating the convolution kernel and the original image. On one hand, the convolution kernel cannot always be provided exactly as a prior; on the other hand, even with appropriate prior information and an additional regularization term, the inverse problem remains sensitive to measurement noise, which may lead to strong artifacts. The process requires powerful computers to extract information from a large number of observations, which places higher demands on on-orbit computing power or data-transmission bandwidth in space observations.
Disclosure of Invention
The invention aims to provide a method for quickly reconstructing multi-frame images on a microsatellite platform.
In order to achieve the above object, the present invention provides a method for rapidly reconstructing a plurality of frames of images on a microsatellite platform, comprising the following steps:
a. establishing a load observation jitter model according to the power spectrum of the satellite platform;
b. acquiring a short-exposure image by combining a jitter model, and registering the image;
c. performing multi-frame superposition reconstruction on the registered image;
d. and denoising and filtering the reconstructed image.
According to an aspect of the present invention, in the step (a), the jitter characteristic of the telescope pointing direction is analyzed according to a power spectrum function of the satellite platform, wherein the power spectrum function of the satellite platform can be expressed as:
wherein S represents the power spectrum function, f_band represents the frequency range of the power spectrum function, and c is a constant;
the displacements Δ_x and Δ_y in the focal plane are related to the telescope pointing jitter characteristics as follows:
wherein F is the focal length, D_scene is the distance between the pupil and the observed scene, D_x, D_y, D_z represent the displacements along the x, y, z axes, respectively, θ_x, θ_y, θ_z represent the rotations of the pointing about the x, y, z axes, respectively, and the relation depends on the distance d_(x,y) between the position (x, y) and the focal-plane center and on the similarity transformation function f.
According to an aspect of the invention, since D_scene >> F >> d_(x,y), the displacements Δ_x and Δ_y are simplified as follows:
according to one aspect of the present invention, based on the dithering model established in step (a), the dithered image is:
where Y is the dithered image, K, P are the dither and convolution kernels of the optical system, respectively, N is the additional additive noise, X is the non-dithered target image,is a convolution operation.
According to one aspect of the invention, in said step (b), the exposure time for acquiring the short-exposure image is determined based on the observed target source, optical system and detector parameters in combination with K, P, N, X in the dither model.
According to one aspect of the invention, the exposure time at which the short-exposure image is acquired decreases as the observed signal-to-noise ratio of a single frame increases.
According to one aspect of the invention, in the step (b), preprocessing is performed on each frame image before image registration, wherein the preprocessing includes dark current removal, flat field removal and radiation correction;
for undersampled images, the image resolution is increased using a Drizzle or MultiDrizzle algorithm prior to registration;
and instrument calibration data are combined when performing the preprocessing step, wherein the instrument calibration data comprise detector dark current, optical system flat field, and radiation response data obtained by measurements on the ground and in space.
According to an aspect of the present invention, in the step (b), in the registration of images for astronomical observation, a fixed reference target in the reference field of view is aligned, or high-precision images are acquired by a fine guide star sensor and the attitude of each frame is confirmed by star-map matching for registration, so that the target point in each frame of image is at the same position;
in Earth observation, a frame with the highest signal-to-noise ratio is selected from a series of short-exposure images, the relative displacements of the other frames with respect to the selected frame are estimated by an optical flow method or a feature recognition method, and registration is then performed, with sharpness and signal-to-noise ratio serving as the quality evaluation parameters for frame selection;
and combining satellite observation data during image registration, wherein the satellite observation data comprises observation data obtained by a satellite star sensor and a high-precision guide star measurement system.
According to one aspect of the present invention, in the step (c), the registered continuous observation images are converted into a fourier frequency domain, and corresponding frequency components are recombined into a new spectral image based on the weighting factors, and the reconstruction utilizes the following formula:
and
wherein F^(-1) denotes the inverse Fourier transform operation, the frequency-domain information of the i-th of the total M frames of images is given by its Fourier transform, w_i(ξ) is the weighting factor of the corresponding frame spectrum, y_p(x) is the reconstructed image, and p is a factor that controls the image sharpness and noise;
an inverse Fourier transform is performed on the recombined spectrum to obtain the reconstructed image;
and for a polychromatic (multi-band) detector system, the reconstruction is performed on each band separately, and the reconstructed images are then fused.
According to an aspect of the present invention, in the step (d), the image is denoised using a block-based processing method such as BM3D, or the NL-Bayes method under a Bayesian framework;
and the image is sharpened by differential or Gaussian filtering, recovering the image details weakened in the denoising process and enhancing the edge components of the image.
According to one scheme of the invention, a jitter model is established from the characteristics of the satellite platform (including its power spectrum), an observation scheme is determined, and frequency-domain superposition reconstruction is performed on an image sequence acquired with short exposures. In this way, the problem of restoring multiple frames with relative motion, as arises in real observation scenarios on a microsatellite platform, can be solved accurately and quickly; the focal-plane image blurring caused by load pointing jitter during imaging is effectively removed, and the imaging resolution of the system is improved.
According to an aspect of the present invention, when the directions of the blur kernels of the individual frames are not consistent, the other blurred images can provide, for any given blurred image, high-frequency information along its own blur direction. The less blurred information in each frame is selected and fused after registration to obtain a sharper image, thereby achieving an image-stabilization effect. The method therefore requires no deconvolution operation and no image prior, has low computational complexity and the capability of real-time on-board processing, and is suitable for observations in different scenarios, such as astronomical imaging and Earth observation, from a microsatellite or similar spacecraft.
According to one scheme of the invention, during reconstruction the signal components in each frame are linearly superposed and can be computed using only two-dimensional Fourier transforms, so the computational complexity is low, which further facilitates real-time on-board computation. Noise reduction and filtering are then applied to the reconstructed image: noise reduction improves the signal-to-noise ratio of the image, while a sharpening filter recovers weakened image details and enhances the edge components of the image.
Drawings
FIG. 1 is a flow chart of a method for rapidly reconstructing a plurality of frames of images on a microsatellite platform according to an embodiment of the invention;
FIG. 2 is a schematic diagram showing the effect of dithering on a focal plane image and its spatial frequency in a method according to an embodiment of the present invention (upper row: focal plane images obtained with ideal point spread function and different integration time; lower row: two-dimensional Fourier transform of the corresponding image);
FIG. 3 schematically shows the reconstruction effect of a point light source target in a method according to an embodiment of the present invention (bright star, double star, and dark star in order from top to bottom; the left column is an ideal image, the middle column is a long exposure image, and the right column is a reconstructed image according to the method of the present invention);
FIG. 4 schematically shows the reconstruction effect on an extended light source target in the method according to an embodiment of the invention [from left to right and from top to bottom: the original image, the reconstructed images under different short-exposure strategies (the last two of the upper row and the first two of the lower row), and the long-integration image].
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
The present invention is described in detail below with reference to the drawings and specific embodiments; repeated details are omitted, and the embodiments of the present invention are not limited to the following embodiments.
Referring to fig. 1, according to the method for rapidly reconstructing multi-frame images on a microsatellite platform, a load observation jitter model is first established according to the power spectrum of the satellite platform; of course, the parameters of the load optical system must also be considered when building the model, so that the imaging process of the target on the focal plane within the exposure time is described. Short-exposure images are then acquired and registered. After registration is completed, multi-frame superposition reconstruction is performed on the images. Finally, the reconstructed image is denoised and filtered. Because the jitter of the satellite platform is random, each short-exposure image typically contains different blur information. When the directions of the blur kernels of the individual frames are inconsistent, the other blurred images can provide, for any given blurred image, high-frequency information along its blur direction. The invention therefore selects the less blurred information in each frame and fuses it after registration into a sharper image, thereby achieving an image-stabilization effect. The method requires no deconvolution operation and no image prior, and has the capability of real-time on-board processing.
In establishing the jitter model, for a linear shift-invariant system the jitter blur is modeled as the convolution of the original image with a motion kernel, and the jittered image is generated as:
Y = K ⊗ P ⊗ X + N
where Y is the jittered image, K and P are the jitter kernel and the convolution kernel (point spread function) of the optical system, respectively, N is additive noise, X is the jitter-free target image, and ⊗ denotes the convolution operation.
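As an illustration only, the forward model above can be sketched numerically as follows. The Gaussian optical PSF, the sampled jitter trajectory input, and the noise level are illustrative assumptions introduced here, not parameters of the invention.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def jittered_frame(x, jitter_xy, psf_sigma=1.5, noise_sigma=0.01, rng=None):
    """Simulate Y = K (*) P (*) X + N for one exposure.

    x           : 2-D array, jitter-free target image X
    jitter_xy   : (T, 2) array of focal-plane displacements (pixels) sampled
                  within the exposure; their histogram forms the jitter kernel K
    psf_sigma   : width of an assumed Gaussian optical PSF P (illustrative)
    noise_sigma : std of additive Gaussian noise N (illustrative)
    """
    rng = rng or np.random.default_rng()
    # Build the jitter kernel K by accumulating the pointing trajectory on a grid.
    half = int(np.ceil(np.abs(jitter_xy).max())) + 1
    k = np.zeros((2 * half + 1, 2 * half + 1))
    for dx, dy in jitter_xy:
        k[half + int(round(dy)), half + int(round(dx))] += 1.0
    k /= k.sum()
    # Optical PSF P applied to X, then blurred by the jitter kernel K.
    blurred = gaussian_filter(x, psf_sigma)
    y = fftconvolve(blurred, k, mode="same")
    # Additive noise N.
    return y + rng.normal(0.0, noise_sigma, size=y.shape)
```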
The instrument structure of the satellite can be regarded as a rigid body, and vibration of the telescope is transferred to the detector focal plane, causing a two-dimensional vector motion of the ideal point spread function in the image. In the space environment, the internal and external disturbances acting on a satellite platform often come from a variety of sources, which are sometimes correlated. The jitter of a satellite platform is typically described by a power spectrum, whose frequency range determines the frequency-domain distribution of the jitter components and whose amplitude level describes the energy contained in each frequency component.
The method of the present invention is described in detail below for an example scenario in which an optical load on a microsatellite performs astronomical and Earth observations. The power spectrum of satellite platform jitter typically contains a wide range of frequency components: the low-frequency components are often associated with multi-body dynamics and satellite mechanics; mechanical motion may cause mid-frequency vibration; and the structural vibration modes of the system typically excite high-frequency jitter with lower vibration amplitude and lower intrinsic damping. The invention therefore estimates the jitter characteristics of the telescope pointing from an analysis of the power spectrum function of the satellite platform. In the present embodiment, jitter components from 0.1 Hz to 100 Hz are generated and the simulation is based on the power spectrum model of the ESA SILEX project; the power spectrum function of the satellite platform is:
wherein S is the power spectrum function, f_band represents the frequency range of the power spectrum function, c is a constant, μrad denotes microradians, and Hz denotes hertz. It follows that the jitter amplitude level decreases with increasing frequency.
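For illustration, a pointing-jitter time series consistent with such a decreasing spectrum could be synthesized as sketched below, by assigning an assumed power-law amplitude to each frequency bin inside the band and randomizing the phases. The spectral slope, band limits, scaling constant, and normalization are placeholders, not the values of the SILEX-based model used in this embodiment.

```python
import numpy as np

def jitter_series_from_psd(n, fs, f_lo=0.1, f_hi=100.0, slope=-1.0, c=1.0, rng=None):
    """Draw one pointing-jitter realization (e.g., theta_x in microradians)
    whose amplitude spectrum follows an assumed power law c * f**slope
    inside [f_lo, f_hi] and is zero outside; n samples at sampling rate fs (Hz)."""
    rng = rng or np.random.default_rng()
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    amp = np.zeros_like(freqs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    amp[band] = c * freqs[band] ** slope           # amplitude decreases with frequency
    phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
    spectrum = amp * np.exp(1j * phases)
    series = np.fft.irfft(spectrum, n=n)
    return series * np.sqrt(n)                     # rough, illustrative normalization

# Example: 10 s of jitter sampled at 1 kHz.
theta_x = jitter_series_from_psd(n=10_000, fs=1_000.0)
```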
The image reconstruction method does not need a detailed description of the platform jitter; instead it uses the statistical information of the jitter power spectrum as a condition for optimizing the observation strategy, so the method is not limited to a specific satellite platform.
Then, a model is constructed to generate the image on the focal plane. To illustrate the effect of the jitter more intuitively, the vibration amplitude is propagated from the instrument to the focal plane, and the root-mean-square jitter is expressed on the focal-plane pixel scale. The main parameters of the optical system simulation model are shown in Table 1:
TABLE 1
In the present invention, the displacements Δ_x and Δ_y in the focal plane are related to the telescope pointing jitter characteristics as follows:
wherein F is the focal length, D_scene is the distance between the pupil and the observed scene, D_x, D_y, D_z represent the displacements along the x, y, z axes, respectively, θ_x, θ_y, θ_z represent the rotations of the pointing about the x, y, z axes, respectively, and the relation depends on the distance d_(x,y) between the position (x, y) and the focal-plane center and on the similarity transformation function f.
In space observation, D_scene >> F >> d_(x,y) is usually satisfied, so the above expressions for Δ_x and Δ_y can be simplified as:
thus, ΔxAnd ΔyIs the differential of the displacement, and the dither convolution kernel in the dither model is ΔxAnd ΔvIs integrated. It can be seen that the dither convolution kernel K is a representation of the frequency range of the power spectrum in the spatial dimension. Then, a plurality of frames of images can be acquired by selecting proper short exposure time. And selecting proper exposure time (integration time) to acquire a plurality of frames of images according to the distribution condition of the telescope pointing power spectrum. The basic principle of single-frame integration time selection is to effectively reduce the high-amplitude low-frequency part of dominant jitter in a power spectrum, and the residual high-frequency low-amplitude jitter does not cause obvious influence on reconstruction errors. In the invention, the exposure time when the short-exposure image is acquired is determined by comprehensively considering parameters such as the observation target source, the optical system, the detector parameter and the like. In particular, the determination of the lower limit of the integration time of a single frame is made, for example, according to the light and dark ranges of the target sourceThe degree, instrument parameters, noise of the detector, and other hardware conditions, etc. determine the lower integration limit. As the exposure time decreases, the influence of detector readout noise or the like becomes more significant, which also degrades the quality of the image. Therefore, the exposure time of the camera in practical applications cannot be reduced without limit. Under the condition that the signal-to-noise ratio of single-frame observation is low, the selected exposure time is relatively long so as to improve the image quality, but under the condition that the signal-to-noise ratio of single-frame observation is high, the integration time of single exposure can be properly reduced, so that higher-frequency jitter can be compensated. In summary, when the signal-to-noise ratio of single-frame observation is low, a relatively long exposure time is selected; when the observed signal-to-noise ratio is high, a relatively short integration time for a single exposure is selected. That is, the exposure time when the short-exposure image is acquired decreases as the signal-to-noise ratio of the single-frame observation increases. Therefore, the method is mainly based on target reconstruction after short-time multi-frame exposure, and compared with a traditional control scheme based on hardware, such as swing mirror or piezoelectric control, the method can correct the jitter influence with higher frequency. To summarize, the short exposure time parameter in the formulation of the observation strategy depends on the low and high frequency components of the power spectral model, i.e. through the frequency range f in the power spectral functionbandAnd (4) determining. From the above, K in the dither model is just the representation of the frequency range in the spatial dimension. Therefore, according to the parameters K, P, N and X in the jitter model, when low-frequency components are dominant in the jitter power spectrum, the exposure time can be properly delayed, and the signal-to-noise ratio can be enhanced without affecting the image sharpness. When the high frequency component in the jitter power spectrum is significant, a shorter exposure time needs to be set to overcome the blurring effect of the high frequency jitter. 
In addition, the specific exposure time needs to be determined by combining the jitter characteristics with the observation target, the optical system and the detector model.
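As a hedged illustration of how the lower limit of the single-frame integration time might be estimated from the target source and detector parameters, the sketch below uses the standard photon-counting signal-to-noise budget; the rates, read noise, pixel count, and SNR threshold are hypothetical values, not parameters of the invention.

```python
import numpy as np

def min_exposure_time(signal_rate, background_rate, dark_rate, read_noise,
                      n_pix, snr_target, t_grid=None):
    """Smallest exposure time t (s) for which the classic CCD signal-to-noise ratio
        SNR(t) = S*t / sqrt(S*t + (B + D)*t*n_pix + n_pix*R**2)
    reaches snr_target.  Rates are in electrons per second."""
    if t_grid is None:
        t_grid = np.logspace(-3, 2, 2000)          # search 1 ms .. 100 s
    s = signal_rate * t_grid
    noise = np.sqrt(s + (background_rate + dark_rate) * t_grid * n_pix
                    + n_pix * read_noise ** 2)
    snr = s / noise
    ok = np.where(snr >= snr_target)[0]
    return t_grid[ok[0]] if ok.size else None

# Hypothetical example: a faint point source on a small-aperture telescope.
t_min = min_exposure_time(signal_rate=500.0, background_rate=20.0,
                          dark_rate=1.0, read_noise=5.0,
                          n_pix=9, snr_target=10.0)
```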
After the measured digital images are obtained through the detector and the readout circuit, the continuous observation images obtained by short exposure can be registered so that the target points in each frame are at the same position. Of course, each frame is preprocessed before image registration, including dark-current removal, flat-field removal, and radiation correction. For undersampled images, the Drizzle (or MultiDrizzle) algorithm may first be applied to improve the image resolution for subsequent registration. The preprocessing step should be performed in combination with instrument calibration data; specifically, the instrument calibration data include the detector dark current, the optical-system flat field, and the radiation response measured on the ground and in space.
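A minimal sketch of the dark/flat preprocessing step is given below, assuming master dark and flat-field frames from the instrument calibration data are already available; the flat normalization and the small epsilon guarding against division by zero are implementation details added here for illustration.

```python
import numpy as np

def preprocess_frame(raw, master_dark, master_flat, gain=1.0, eps=1e-6):
    """Remove dark current, divide out the flat field, and apply a simple
    linear radiometric (gain) correction to one short-exposure frame."""
    dark_sub = raw.astype(np.float64) - master_dark
    flat = master_flat / np.median(master_flat)      # normalize flat to ~1
    calibrated = dark_sub / np.maximum(flat, eps)    # avoid division by zero
    return gain * calibrated
```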
Specifically, in image registration for astronomical observation, a fixed reference target (e.g., a star) in the reference field of view is aligned, or high-precision images are acquired by a fine guide star sensor (FGS) and the attitude of each frame is confirmed by star-map matching for registration, so that the target point in each frame is at the same position. In Earth observation, a frame of higher quality is selected from the series of short-exposure images, the relative displacements of the other frames with respect to the selected frame are estimated by an optical-flow method or a feature-recognition method, and registration is then performed; the sharpness and signal-to-noise ratio of the frames can be used as the quality evaluation parameters. In the invention, the selected frame is the frame with the highest signal-to-noise ratio. Of course, although the present invention uses the above approach for image registration, registration may also be performed using satellite attitude information obtained from the attitude measurement and control unit as a prior. Satellite observation data are combined during image registration; specifically, the satellite observation data include observations obtained by the satellite star sensor and the high-precision guide-star measurement system.
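The sketch below shows one generic way the relative displacement between each short-exposure frame and the selected reference frame could be estimated and removed, using FFT-based phase correlation followed by an interpolated shift. It is a stand-in for the optical-flow or feature-recognition step described above, not the specific registration pipeline of any particular satellite.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def phase_correlation_shift(ref, frame, eps=1e-12):
    """Estimate the integer-pixel translation (dy, dx) that best aligns
    `frame` onto `ref`, from the phase-correlation peak."""
    F_ref = np.fft.fft2(ref)
    F_frm = np.fft.fft2(frame)
    cross = F_ref * np.conj(F_frm)
    corr = np.fft.ifft2(cross / (np.abs(cross) + eps)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks in the upper half of each axis to negative shifts.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)              # (dy, dx) to apply to `frame`

def register_to_reference(frames, ref_index):
    """Shift every frame onto the reference frame (e.g., the highest-SNR one)."""
    ref = frames[ref_index]
    registered = []
    for f in frames:
        dy, dx = phase_correlation_shift(ref, f)
        registered.append(nd_shift(f, (dy, dx), order=1, mode="nearest"))
    return registered
```

Sub-pixel refinement or feature-based matching could replace the integer-pixel estimate where higher registration accuracy is required.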
Referring to fig. 2 (the normalized intensity distribution at the detector focal plane and the corresponding two-dimensional Fourier-transform spectra): at a very short integration time, the image acquired in a single frame is close to the ideal jitter-free image, with only some relative position (phase) shift. At a relatively long integration time, the jitter is already obvious in a single frame; although the blurring effect of the longer integration time attenuates the intensity of certain frequencies in the spectrum, some significant frequency components remain. A still longer integration time eventually produces a more uniform distribution of the jitter in all directions, broadening the overall point spread function, so that the whole image behaves as if the ideal image had passed through a low-pass filter, with the high-frequency components attenuated. The invention therefore converts the registered continuous observation images obtained by multi-frame short exposure into the Fourier frequency domain and recombines the corresponding frequency components into a new spectral image based on weighting factors, thereby obtaining the best representation at each frequency. In the invention, the image is reconstructed using the Fourier burst accumulation (FBA) principle, with the following formulas used during reconstruction:
and
wherein F^(-1) denotes the inverse Fourier transform operation, the frequency-domain information of the i-th of the total M frames is given by its Fourier transform, w_i(ξ) is the weighting factor of the corresponding frame spectrum, y_p(x) is the reconstructed image, and p serves as a factor controlling the image sharpness and noise. Considering the presence of noise, the frequency-domain phase distortion caused by jitter, and other factors, a larger p is not necessarily better, and the choice of this parameter still leaves room for optimization.
Specifically, when p is small, the information of each frequency component in the reconstructed image comes mainly from multi-frame superposition; when p is large, the image is restored mainly from the information in a few specific frames. In the extreme cases, p = 0 is equivalent to simple registered averaging of the frames, while as p → ∞ (i.e., p approaches infinity) the information for each frequency component is provided only by the frame in which that frequency component is best preserved.
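A compact numerical sketch of this weighted frequency-domain recombination is shown below, following the Fourier burst accumulation idea described above. The light smoothing of the Fourier magnitudes and the default value of p are illustrative choices, not prescriptions of the invention.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fourier_burst_accumulation(frames, p=11.0, smooth_sigma=2.0):
    """Fuse registered short-exposure frames in the Fourier domain.

    Each frequency is recombined with weights proportional to the p-th power
    of the (lightly smoothed) Fourier magnitude of each frame, so frames that
    preserved a given frequency best contribute most to it.
    p = 0 reduces to plain averaging; large p approaches a per-frequency max.
    """
    spectra = np.array([np.fft.fft2(f) for f in frames])           # (M, H, W)
    mags = np.array([gaussian_filter(np.abs(s), smooth_sigma) for s in spectra])
    mags /= mags.max() + 1e-12                 # normalize to avoid overflow in **p
    weights = mags ** p
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12          # sum to 1 over frames
    fused_spectrum = (weights * spectra).sum(axis=0)
    return np.real(np.fft.ifft2(fused_spectrum))

# Example: reconstruct from a registered burst (list of 2-D numpy arrays).
# restored = fourier_burst_accumulation(registered_frames, p=11.0)
```

In this sketch, p = 0 indeed reduces to plain frame averaging, while a large p lets the frame that best preserved a given frequency dominate that frequency, matching the behaviour described above.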
The frequency components are synthesized from frames with different jitter characteristics, so when the jitter convolution kernel is large the reconstructed image may contain a corresponding phase error; for small-scale jitter on the detector the phase error is negligible. This phenomenon is more obvious when observing point light sources, while the reconstruction accuracy for extended sources is relatively less affected, so the parameter p needs to be properly adjusted in actual observation. For example, in practice a corresponding observation strategy should be specified for the observation target in combination with the current satellite state information, and the choice of the corresponding parameters should be adjusted accordingly.
Referring to fig. 3, the reconstructed image of a brighter point source is very close to the original image and sharper than the long-exposure view. For adjacent point sources such as double stars, the method can distinguish and identify them more effectively. In the reconstruction of dark and faint objects, detector readout noise has a certain effect on the reconstructed image, so a relatively longer integration time should be selected for such objects.
Referring to fig. 4, the method can also be applied to observations of typical extended sources from a satellite platform, including surface targets, space objects, the atmosphere, and so on. The invention can effectively mitigate the reduction of image resolution and contrast caused by platform jitter and acquire clearer and more accurate images.
The recombined spectrum is then inverse Fourier transformed to obtain the reconstructed image. For a polychromatic (multi-band) detector system, the reconstruction is performed on each band separately, and the reconstructed images are then fused.
In this way, the signal components in each frame are linearly superposed, so the computational complexity is concentrated mainly in two-dimensional Fourier transforms, which makes real-time on-board computation possible.
After reconstruction, the image still needs noise-reduction filtering and related operations; it is denoised using a block-based processing method such as BM3D, or the NL-Bayes method under a Bayesian framework. In addition, depending on the actual situation, the image can be sharpened by differential or Gaussian filtering to recover the image details weakened in the noise-reduction process and to enhance the edge components of the image.
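As a lightweight stand-in for BM3D or NL-Bayes plus the sharpening step, the sketch below applies a simple Gaussian denoiser followed by unsharp masking; the filter widths and the sharpening gain are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def denoise_and_sharpen(image, denoise_sigma=1.0, sharpen_sigma=1.5, amount=0.8):
    """Gaussian denoising followed by unsharp masking to restore edges.

    In practice a stronger denoiser (e.g., BM3D or NL-Bayes) would replace the
    first Gaussian filter; unsharp masking then re-enhances the details and
    edge components weakened by the denoising step.
    """
    denoised = gaussian_filter(image, denoise_sigma)
    low_pass = gaussian_filter(denoised, sharpen_sigma)
    detail = denoised - low_pass            # high-frequency (edge) residual
    return denoised + amount * detail
```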
In conclusion, the image reconstruction method of the invention can effectively remove the focal-plane image blurring caused by load pointing jitter during imaging, has low computational complexity, and is capable of real-time on-board operation. Compared with a simple multi-frame superposition algorithm, the method noticeably improves both the subjective and objective quality of the image and can effectively improve the imaging resolution of the system. The method therefore solves, more accurately and quickly, the problem of restoring multiple frames with relative motion in real observation scenarios on a microsatellite platform, so that the algorithm has practical application value while remaining effective. It is suitable for observations of typical point sources and extended sources from a microsatellite platform or similar spacecraft, including surface targets, space targets, atmospheric surveys, and so on, and can also be applied to different observation scenarios such as astronomical imaging, Earth observation, and photometry.
The above description is only one embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (10)
1. A multi-frame image rapid reconstruction method on a microsatellite platform comprises the following steps:
a. establishing a load observation jitter model according to the power spectrum of the satellite platform;
b. acquiring a short-exposure image by combining a jitter model, and registering the image;
c. performing multi-frame superposition reconstruction on the registered image;
d. and denoising and filtering the reconstructed image.
2. The method of claim 1, wherein in step (a), the jitter characteristics of the telescope pointing direction are analyzed according to a power spectrum function of the satellite platform, wherein the power spectrum function of the satellite platform can be expressed as:
wherein S represents the power spectrum function, f_band represents the frequency range of the power spectrum function, and c is a constant;
the displacements Δ_x and Δ_y in the focal plane are related to the telescope pointing jitter characteristics as follows:
wherein F is the focal length, D_scene is the distance between the pupil and the observed scene, D_x, D_y, D_z represent the displacements along the x, y, z axes, respectively, θ_x, θ_y, θ_z represent the rotations of the pointing about the x, y, z axes, respectively, and the relation depends on the distance d_(x,y) between the position (x, y) and the focal-plane center and on the similarity transformation function f.
4. The method of claim 1, wherein based on the dithering model established in step (a), the dithered image is:
Y = K ⊗ P ⊗ X + N
where Y is the dithered image, K and P are the dither kernel and the convolution kernel of the optical system, respectively, N is additive noise, X is the non-dithered target image, and ⊗ denotes the convolution operation.
5. The method of claim 4, wherein in step (b), the exposure time for acquiring the short-exposure image is determined based on the observed target source, optical system, and detector parameters in combination with K, P, N, X in the dither model.
6. The method of claim 5, wherein the exposure time at which the short-exposure image is acquired decreases as the observed signal-to-noise ratio of a single frame increases.
7. The method according to claim 1, wherein in step (b), each frame image is preprocessed before image registration, the preprocessing including dark current removal, flat field removal, and radiation correction;
for undersampled images, the image resolution is increased using a Drizzle or MultiDrizzle algorithm prior to registration;
and instrument calibration data are combined when performing the preprocessing step, wherein the instrument calibration data comprise detector dark current, optical system flat field, and radiation response data obtained by measurements on the ground and in space.
8. The method according to claim 1, wherein in the step (b), in image registration of astronomical observations, fixed reference targets within a reference field of view are aligned, or high precision images are acquired by using a fine guide star sensor and registered by confirming the pose of each frame by using star map matching so that the target point in each frame of image is at the same position;
in Earth observation, a frame with the highest signal-to-noise ratio is selected from a series of short-exposure images, the relative displacements of the other frames with respect to the selected frame are estimated by an optical flow method or a feature recognition method, and registration is then performed, with sharpness and signal-to-noise ratio serving as the quality evaluation parameters for frame selection;
and combining satellite observation data during image registration, wherein the satellite observation data comprises observation data obtained by a satellite star sensor and a high-precision guide star measurement system.
9. The method according to claim 1, wherein in step (c), the registered successive observation images are converted into the fourier frequency domain and the corresponding frequency components are recombined into a new spectral image based on the weighting factors, using the following formula:
and
wherein F^(-1) denotes the inverse Fourier transform operation, the frequency-domain information of the i-th of the total M frames of images is given by its Fourier transform, w_i(ξ) is the weighting factor of the corresponding frame spectrum, y_p(x) is the reconstructed image, and p is a factor that controls the image sharpness and noise;
an inverse Fourier transform is performed on the recombined spectrum to obtain the reconstructed image;
and for a polychromatic (multi-band) detector system, the reconstruction is performed on each band separately, and the reconstructed images are then fused.
10. The method according to claim 1, wherein in the step (d), the image is subjected to noise reduction processing using a block-based processing method such as BM3D, or the NL-Bayes method under a Bayesian framework;
and the image is filtered using differential or Gaussian filtering in a sharpening operation, recovering the image details weakened in the denoising process and enhancing the edge components of the image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110260840.1A CN112927177B (en) | 2021-03-10 | 2021-03-10 | Multi-frame image quick reconstruction method on microsatellite platform |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110260840.1A CN112927177B (en) | 2021-03-10 | 2021-03-10 | Multi-frame image quick reconstruction method on microsatellite platform |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112927177A true CN112927177A (en) | 2021-06-08 |
CN112927177B CN112927177B (en) | 2024-09-06 |
Family
ID=76172372
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110260840.1A Active CN112927177B (en) | 2021-03-10 | 2021-03-10 | Multi-frame image quick reconstruction method on microsatellite platform |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112927177B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113945952A (en) * | 2021-09-30 | 2022-01-18 | 中国空间技术研究院 | Space distributed synthetic aperture optical detection method |
CN118314060A (en) * | 2024-06-05 | 2024-07-09 | 中国人民解放军国防科技大学 | Image preprocessing method for space target observation |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080068263A1 (en) * | 2006-09-19 | 2008-03-20 | Tekawy Jonathan A | Method and system for attitude determination of a platform using global navigation satellite system and a steered antenna |
CN105069748A (en) * | 2015-07-16 | 2015-11-18 | 哈尔滨工业大学 | Method for obtaining high-resolution image based on micro-satellite object scanning technique |
US20150331351A1 (en) * | 2014-05-13 | 2015-11-19 | Hiroyuki Suhara | Image forming method, image forming apparatus, print material production method |
US20160334729A1 (en) * | 2015-05-12 | 2016-11-17 | Hiroyuki Suhara | Image forming method and image forming apparatus |
-
2021
- 2021-03-10 CN CN202110260840.1A patent/CN112927177B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080068263A1 (en) * | 2006-09-19 | 2008-03-20 | Tekawy Jonathan A | Method and system for attitude determination of a platform using global navigation satellite system and a steered antenna |
US20150331351A1 (en) * | 2014-05-13 | 2015-11-19 | Hiroyuki Suhara | Image forming method, image forming apparatus, print material production method |
US20160334729A1 (en) * | 2015-05-12 | 2016-11-17 | Hiroyuki Suhara | Image forming method and image forming apparatus |
CN105069748A (en) * | 2015-07-16 | 2015-11-18 | 哈尔滨工业大学 | Method for obtaining high-resolution image based on micro-satellite object scanning technique |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113945952A (en) * | 2021-09-30 | 2022-01-18 | 中国空间技术研究院 | Space distributed synthetic aperture optical detection method |
CN118314060A (en) * | 2024-06-05 | 2024-07-09 | 中国人民解放军国防科技大学 | Image preprocessing method for space target observation |
CN118314060B (en) * | 2024-06-05 | 2024-09-10 | 中国人民解放军国防科技大学 | Image preprocessing method for space target observation |
Also Published As
Publication number | Publication date |
---|---|
CN112927177B (en) | 2024-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5243351A (en) | Full aperture image synthesis using rotating strip aperture image measurements | |
Lagendijk et al. | Basic methods for image restoration and identification | |
US8068696B2 (en) | Method of restoring movements of the line of sight of an optical instrument | |
US20100067822A1 (en) | System and method of super-resolution imaging from a sequence of translated and rotated low-resolution images | |
US6192322B1 (en) | Moving object and transient event detection using rotation strip aperture image measurements | |
CN112927177B (en) | Multi-frame image quick reconstruction method on microsatellite platform | |
US20210110514A1 (en) | Method, device and non-transitory computer-readable storage medium for increasing the resolution and dynamic range of a sequence of respective top view images of a same terrestrial location | |
Karaim et al. | Low-cost IMU data denoising using Savitzky-Golay filters | |
Sieberth et al. | UAV image blur–its influence and ways to correct it | |
Zhaoxiang et al. | Attitude jitter compensation for remote sensing images using convolutional neural network | |
He et al. | Motion-blurred star image restoration based on multi-frame superposition under high dynamic and long exposure conditions | |
US20120148113A1 (en) | Method for detecting shifts in line images obtained by a sensor that is airborne or moving in space | |
Llaveria et al. | Correcting the ADCS jitter induced blurring in small satellite imagery | |
Li et al. | Modulation transfer function measurements using a learning approach from multiple diffractive grids for optical cameras | |
Alici | Extraction of modulation transfer function by using simulated satellite images | |
King et al. | The current ability of HST to reveal morphological structure in medium-redshift galaxies | |
Hardie | Super-resolution using adaptive Wiener filters | |
US8351738B2 (en) | Method of estimating at least one deformation of the wave front of an optical system or of an object observed by the optical system and associated device | |
Wang et al. | Optical satellite image MTF compensation for remote-sensing data production | |
Khetkeeree et al. | Satellite image restoration using adaptive high boost filter based on in-flight point spread function | |
Gota et al. | Analysis and Comparison on Image Restoration Algorithms Using MATLAB | |
Meraoumia et al. | Fast strategies for multi-temporal speckle reduction of Sentinel-1 GRD images | |
Jayapriya et al. | A study on image restoration and its various blind image deconvolution algorithms | |
Lei et al. | Sub-pixel location of motion blurred weak celestial objects in optical sensor image based on elliptical 2d gaussian surface fitting | |
Mugnier et al. | Inversion in optical imaging through atmospheric turbulence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |