Disclosure of Invention
The invention aims to provide a method for quickly reconstructing multi-frame images on a microsatellite platform.
In order to achieve the above object, the present invention provides a method for rapidly reconstructing a plurality of frames of images on a microsatellite platform, comprising the following steps:
a. establishing a load observation jitter model according to the power spectrum of the satellite platform;
b. acquiring short-exposure images in combination with the jitter model, and registering the images;
c. performing multi-frame superposition reconstruction on the registered images;
d. denoising and filtering the reconstructed image.
According to an aspect of the present invention, in the step (a), the jitter characteristic of the telescope pointing direction is analyzed according to a power spectrum function of the satellite platform, wherein the power spectrum function of the satellite platform can be expressed as:
wherein S represents the power spectrum function, f_band represents the frequency range of the power spectrum function, and c is a constant;
the displacements Δx and Δy in the focal plane are related to the telescope pointing jitter characteristics as follows:
wherein F is the focal length, D_scene is the distance between the pupil and the observed scene, Dx, Dy, Dz represent the displacements along the x, y, z axes, respectively, θx, θy, θz represent the rotations of the pointing about the x, y, z axes, respectively, and the displacements are also related to the distance d(x,y) between the position (x, y) and the focal-plane center and to the similarity transformation function f.
According to an aspect of the invention, since D_scene ≫ F ≫ d(x,y), the displacements Δx and Δy are simplified as follows:
according to one aspect of the present invention, based on the dithering model established in step (a), the dithered image is:
where Y is the dithered image, K, P are the dither and convolution kernels of the optical system, respectively, N is the additional additive noise, X is the non-dithered target image,
is a convolution operation.
According to one aspect of the invention, in said step (b), the exposure time for acquiring the short-exposure images is determined from the observed target source, the optical system and the detector parameters, in combination with K, P, N and X in the jitter model.
According to one aspect of the invention, the exposure time at which the short-exposure image is acquired decreases as the observed signal-to-noise ratio of a single frame increases.
According to one aspect of the invention, in the step (b), preprocessing is performed on each frame image before image registration, wherein the preprocessing includes dark current removal, flat field removal and radiation correction;
for undersampled images, the image resolution is increased using the Drizzle or MultiDrizzle algorithm prior to registration;
and the preprocessing step is carried out in combination with instrument calibration data, wherein the instrument calibration data include detector dark current, optical-system flat field and radiation response data obtained by measurement on the ground and in space.
According to an aspect of the present invention, in the step (b), in image registration for astronomical observation, a fixed reference target in the field of view is aligned, or a fine guidance sensor is used to acquire high-precision images and the attitude of each frame is confirmed by star-map matching for registration, so that the target point in each frame of image is at the same position;
in Earth observation, the frame with the highest signal-to-noise ratio is selected from the series of short-exposure images, the relative displacement of the other frames with respect to the selected frame is estimated by an optical flow method or a feature recognition method and the frames are then registered, sharpness and signal-to-noise ratio being used as the quality evaluation parameters for selecting the frame;
and combining satellite observation data during image registration, wherein the satellite observation data comprises observation data obtained by a satellite star sensor and a high-precision guide star measurement system.
According to one aspect of the present invention, in the step (c), the registered continuous observation images are converted into the Fourier frequency domain, and the corresponding frequency components are recombined into a new spectral image based on weighting factors, the reconstruction using the following formulas:

$$y_p(x) = \mathcal{F}^{-1}\left(\sum_{i=1}^{M} w_i(\xi)\,\hat{v}_i(\xi)\right)(x)$$

and

$$w_i(\xi) = \frac{|\hat{v}_i(\xi)|^{p}}{\sum_{j=1}^{M} |\hat{v}_j(\xi)|^{p}}$$

wherein $\mathcal{F}^{-1}$ is the inverse Fourier transform operation, $\hat{v}_i(\xi)$ is the frequency-domain information of the i-th of the M frames of images after Fourier transform, $w_i(\xi)$ is the weighting factor of the corresponding frame spectrum, $y_p(x)$ is the reconstructed image, and p is a factor that controls image sharpness and noise;
performing inverse Fourier transform on the recombined spectral image to obtain the reconstructed image;
and for a multi-band (polychromatic) detector system, performing the reconstruction on each band separately and fusing the reconstructed images.
According to an aspect of the present invention, in the step (d), the image is subjected to noise-reduction processing using a block-matching method such as BM3D or the NL-Bayes method under a Bayesian framework;
and the image is sharpened using differential or Gaussian filtering to recover the image details weakened during denoising and to enhance the edge components of the image.
According to one scheme of the invention, a jitter model is established in combination with the characteristics of the satellite platform (including its power spectrum), an observation scheme is determined, and frequency-domain superposition reconstruction is performed on the image sequence acquired by short exposures. The problem of restoring multiple frames with relative motion, as encountered in actual observation scenarios on a microsatellite platform, can therefore be solved accurately and quickly, the focal-plane imaging blur caused by payload pointing jitter during imaging is effectively removed, and the imaging resolution of the system is improved.
According to an aspect of the present invention, when the directions of the blur kernels of the multiple frames are not consistent, the other blurred images can provide, for any given blurred image, high-frequency information along its own blur direction. The less-blurred information in each frame is selected and fused after registration to obtain a clearer image, thereby achieving an image-stabilization effect. The method therefore requires no deconvolution operation and no image priors, has low computational complexity and on-board real-time processing capability, and is suitable for observations in different scenarios, such as astronomical imaging and Earth observation, on a microsatellite or similar space vehicle.
According to one scheme of the invention, in the reconstruction process the signal components of each frame are linearly superposed and the reconstruction can be realized using only two-dimensional Fourier transforms, so the computational complexity is low, which further facilitates on-board real-time computation. Noise reduction and filtering are then applied to the reconstructed image: noise reduction improves the signal-to-noise ratio, while the sharpening filter recovers image details weakened during denoising and enhances the edge components of the image.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
The present invention is described in detail below with reference to the drawings and specific embodiments; the embodiments of the present invention are, however, not limited to the embodiments described below.
Referring to fig. 1, in the method for rapidly reconstructing multi-frame images on a microsatellite platform, a payload observation jitter model is first established according to the power spectrum of the satellite platform; the parameters of the payload optical system must of course be considered when building the model, so that the imaging process of the target on the focal plane during the exposure time can be described. Short-exposure images are then acquired and registered. After registration, multi-frame superposition reconstruction is performed on the images. Finally, the reconstructed image is denoised and filtered. Because the jitter of the satellite platform is random, each short-exposure image typically contains different blur information. When the directions of the blur kernels of the multiple frames are inconsistent, for any one of the blurred images, the other blurred images can provide high-frequency information along its blur direction. The invention therefore selects the less-blurred information in each frame and, through fusion and registration, obtains a clearer image, thereby achieving an image-stabilization effect. The method requires no deconvolution operation and no image priors, and has on-board real-time processing capability.
In establishing the jitter model, for a linear shift-invariant system, the jitter blur is modeled as the convolution of the original image with a motion kernel, and the jittered image is generated as:

$$Y = K \ast P \ast X + N$$

wherein Y is the jittered image, K and P are the jitter kernel and the convolution kernel of the optical system, respectively, N is additive noise, X is the jitter-free target image, and $\ast$ denotes the convolution operation.
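For illustration only, a minimal NumPy/SciPy sketch of this degradation model is given below. The Gaussian optical kernel, the random-walk jitter trajectory and the noise level are placeholder assumptions and do not represent the parameters of an actual payload.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)

def gaussian_kernel(size=15, sigma=1.5):
    """Placeholder optical-system convolution kernel P (a Gaussian PSF)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def jitter_kernel(size=15, steps=200, step_px=0.5):
    """Placeholder jitter kernel K: a short random-walk trajectory
    accumulated (integrated) on the kernel grid over one exposure."""
    pos = np.cumsum(rng.normal(0.0, step_px, (steps, 2)), axis=0)
    idx = np.clip(np.round(pos + size // 2).astype(int), 0, size - 1)
    k = np.zeros((size, size))
    np.add.at(k, (idx[:, 1], idx[:, 0]), 1.0)
    return k / k.sum()

def jittered_image(X, K, P, noise_sigma=0.01):
    """Y = K * P * X + N : the linear shift-invariant degradation model."""
    Y = fftconvolve(fftconvolve(X, P, mode="same"), K, mode="same")
    return Y + rng.normal(0.0, noise_sigma, X.shape)
```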
The instrument structure of the satellite can be considered as a rigid body, and the vibration on the telescope will be transferred to the detector focal plane, thereby causing a two-dimensional vector motion of the ideal point spread function in the image. In a spatial environment, internal and external disturbances on a satellite platform are often caused by a variety of sources, which are sometimes correlated. The jitter of a satellite platform is typically described using a power spectrum whose frequency range determines the frequency domain distribution characteristics of the jitter components and whose amplitude level describes the energy contained in each frequency component.
The method of the present invention is described in detail below for a scenario in which an optical payload on a microsatellite performs astronomical and Earth observation. The power spectrum of satellite platform jitter typically contains a wide range of frequency components: the low-frequency components are often associated with multi-body dynamics and satellite mechanics; mechanical motion may cause mid-frequency vibrations; and structural vibration modes of the system typically excite high-frequency jitter with lower vibration amplitude and lower intrinsic damping. The invention therefore estimates the jitter characteristics of the telescope pointing direction from an analysis of the power spectrum function of the satellite platform. In the present embodiment, jitter components of 0.1 Hz to 100 Hz are generated and simulated based on the power spectrum model of the ESA SILEX project, and the power spectrum function of the satellite platform is:
wherein S is the power spectrum function, f_band represents the frequency range of the power spectrum function, c is a constant, μrad denotes microradians, and Hz denotes hertz. It follows that the jitter amplitude level decreases with increasing frequency.
The image reconstruction method does not need to describe the platform jitter in detail, but takes the statistical information of the jitter power spectrum as the optimization condition of the observation strategy, so the method is not limited to a specific satellite platform.
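As a purely illustrative sketch of how a jitter time series can be drawn from such statistical information, the code below synthesizes a band-limited series from a one-sided power spectral density. Because the exact SILEX power-spectrum expression is not reproduced here, a simple power law S(f) = c/f² over 0.1–100 Hz is assumed only as a stand-in, and all numerical values are placeholders.

```python
import numpy as np

def jitter_from_psd(psd, f_lo=0.1, f_hi=100.0, fs=1000.0, duration=10.0, seed=0):
    """Synthesize a zero-mean pointing-jitter time series whose one-sided PSD
    follows `psd` (e.g. in microrad^2/Hz) inside the band [f_lo, f_hi]."""
    rng = np.random.default_rng(seed)
    n = int(duration * fs)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    df = freqs[1] - freqs[0]
    band = (freqs >= f_lo) & (freqs <= f_hi)
    # a sinusoid of amplitude A carries A^2/2 of variance, and each frequency
    # bin carries psd(f)*df of variance, hence A = sqrt(2 * psd(f) * df)
    amp = np.zeros_like(freqs)
    amp[band] = np.sqrt(2.0 * psd(freqs[band]) * df)
    phase = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    spectrum = 0.5 * n * amp * np.exp(1j * phase)   # numpy irfft normalization
    return np.fft.irfft(spectrum, n=n)

# placeholder power-law PSD standing in for the (unreproduced) SILEX model
theta_x = jitter_from_psd(lambda f: 1.0 / f**2)
theta_y = jitter_from_psd(lambda f: 1.0 / f**2, seed=1)
```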
A model is then constructed to generate the image on the focal plane. In order to explain the effect of the jitter more intuitively, the vibration amplitude is calculated according to the propagation principle from the instrument to the focal plane, and the mean-square jitter is expressed on the focal-plane pixel scale. The main parameters of the optical-system simulation model are shown in Table 1:
TABLE 1
In the present invention, the displacements Δx and Δy in the focal plane are related to the telescope pointing jitter characteristics as follows:
wherein F is the focal length, D_scene is the distance between the pupil and the observed scene, Dx, Dy, Dz represent the displacements along the x, y, z axes, respectively, θx, θy, θz represent the rotations of the pointing about the x, y, z axes, respectively, and the displacements are also related to the distance d(x,y) between the position (x, y) and the center of the focal plane and to the similarity transformation function f.
In space observation, D_scene ≫ F ≫ d(x,y) is usually satisfied, so the above expressions for the displacements Δx and Δy can be simplified as:
thus, ΔxAnd ΔyIs the differential of the displacement, and the dither convolution kernel in the dither model is ΔxAnd ΔvIs integrated. It can be seen that the dither convolution kernel K is a representation of the frequency range of the power spectrum in the spatial dimension. Then, a plurality of frames of images can be acquired by selecting proper short exposure time. And selecting proper exposure time (integration time) to acquire a plurality of frames of images according to the distribution condition of the telescope pointing power spectrum. The basic principle of single-frame integration time selection is to effectively reduce the high-amplitude low-frequency part of dominant jitter in a power spectrum, and the residual high-frequency low-amplitude jitter does not cause obvious influence on reconstruction errors. In the invention, the exposure time when the short-exposure image is acquired is determined by comprehensively considering parameters such as the observation target source, the optical system, the detector parameter and the like. In particular, the determination of the lower limit of the integration time of a single frame is made, for example, according to the light and dark ranges of the target sourceThe degree, instrument parameters, noise of the detector, and other hardware conditions, etc. determine the lower integration limit. As the exposure time decreases, the influence of detector readout noise or the like becomes more significant, which also degrades the quality of the image. Therefore, the exposure time of the camera in practical applications cannot be reduced without limit. Under the condition that the signal-to-noise ratio of single-frame observation is low, the selected exposure time is relatively long so as to improve the image quality, but under the condition that the signal-to-noise ratio of single-frame observation is high, the integration time of single exposure can be properly reduced, so that higher-frequency jitter can be compensated. In summary, when the signal-to-noise ratio of single-frame observation is low, a relatively long exposure time is selected; when the observed signal-to-noise ratio is high, a relatively short integration time for a single exposure is selected. That is, the exposure time when the short-exposure image is acquired decreases as the signal-to-noise ratio of the single-frame observation increases. Therefore, the method is mainly based on target reconstruction after short-time multi-frame exposure, and compared with a traditional control scheme based on hardware, such as swing mirror or piezoelectric control, the method can correct the jitter influence with higher frequency. To summarize, the short exposure time parameter in the formulation of the observation strategy depends on the low and high frequency components of the power spectral model, i.e. through the frequency range f in the power spectral functionbandAnd (4) determining. From the above, K in the dither model is just the representation of the frequency range in the spatial dimension. Therefore, according to the parameters K, P, N and X in the jitter model, when low-frequency components are dominant in the jitter power spectrum, the exposure time can be properly delayed, and the signal-to-noise ratio can be enhanced without affecting the image sharpness. When the high frequency component in the jitter power spectrum is significant, a shorter exposure time needs to be set to overcome the blurring effect of the high frequency jitter. 
In addition, the specific exposure time needs to be determined by combining the jitter characteristics with the observation target, the optical system and the detector model.
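The sketch below illustrates the statement above that the jitter kernel is the integral of the focal-plane displacement trajectory over one exposure, so that shorter exposures produce more compact kernels. The small-angle mapping from pointing angle to focal-plane shift used here is only an assumed stand-in for the simplified displacement expressions (which are not reproduced above), and the focal length, pixel pitch and sampling rate are illustrative values.

```python
import numpy as np

def blur_kernel_from_trajectory(theta_x, theta_y, focal_len, pixel_pitch,
                                fs, exposure_s, kernel_half=10):
    """Accumulate the focal-plane displacement trajectory of one exposure
    into a normalized 2-D jitter kernel K (kernel grid in pixels)."""
    n = int(exposure_s * fs)                  # jitter samples within one exposure
    # assumed small-angle mapping: tilt (rad) -> focal-plane shift (pixels)
    dx = focal_len * theta_y[:n] / pixel_pitch
    dy = focal_len * theta_x[:n] / pixel_pitch
    size = 2 * kernel_half + 1
    ix = np.clip(np.round(dx - dx.mean()).astype(int) + kernel_half, 0, size - 1)
    iy = np.clip(np.round(dy - dy.mean()).astype(int) + kernel_half, 0, size - 1)
    k = np.zeros((size, size))
    np.add.at(k, (iy, ix), 1.0)               # "integrate" the trajectory
    return k / k.sum()

# e.g. compare kernels of a 0.05 s and a 1 s exposure built from the same series:
# K_short = blur_kernel_from_trajectory(theta_x, theta_y, 1.0, 5e-6, 1000.0, 0.05)
# K_long  = blur_kernel_from_trajectory(theta_x, theta_y, 1.0, 5e-6, 1000.0, 1.0)
```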
After the measured digital images are obtained through the detector and the readout circuit, the continuously observed images obtained by short exposure can be registered so that the target point in each frame of image is at the same position. Of course, each frame is preprocessed before image registration; the preprocessing includes dark-current removal, flat-field removal, radiometric correction and the like. For undersampled images, a Drizzle (or MultiDrizzle) algorithm may first be applied to improve the image resolution for subsequent registration. The preprocessing step should be performed in combination with instrument calibration data; specifically, the instrument calibration data include detector dark current, optical-system flat field and radiation response data obtained by measurement on the ground and in space.
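A minimal preprocessing sketch (dark-current subtraction, flat-field division and a placeholder radiometric gain) is given below; it assumes calibration frames of the same shape as the raw frame, whereas a real pipeline would use the instrument calibration data described above.

```python
import numpy as np

def preprocess(raw, dark, flat, rad_gain=1.0, eps=1e-6):
    """Remove dark current, divide out the flat field and apply a
    (placeholder) radiometric correction gain."""
    flat_norm = flat / np.median(flat)                 # normalize flat field to ~1
    corrected = (raw - dark) / np.maximum(flat_norm, eps)
    return rad_gain * corrected
```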
Specifically, in image registration for astronomical observation, a fixed reference target (e.g., a star) in the field of view is aligned, or high-precision images are acquired by a fine guidance sensor (FGS), and the attitude of each frame is confirmed by star-map matching for registration, so that the target point in each frame of image is at the same position. In Earth observation, a frame of higher quality is selected from the series of short-exposure images, the relative displacement of the other frames with respect to the selected frame is estimated by an optical flow method or a feature recognition method, and registration is then performed; the sharpness and signal-to-noise ratio of a frame can be used as its quality evaluation parameters. In the invention, the selected frame is the frame with the highest signal-to-noise ratio. Of course, although the present invention uses the above approaches for image registration, registration may also be performed using satellite attitude information obtained from the attitude measurement and control unit as a prior condition. Satellite observation data are combined during image registration; specifically, the satellite observation data include observation data obtained by the satellite star sensor and the high-precision guide-star measurement system.
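For the Earth-observation case, a minimal FFT-based phase-correlation sketch is given below; it estimates only integer-pixel translations, whereas in practice sub-pixel refinement, optical flow or star-map matching (as described above) would be used, and the reference index would come from the quality evaluation.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def estimate_shift(reference, frame):
    """Estimate the (dy, dx) shift that maps `frame` onto `reference`
    from the peak of the phase cross-correlation surface."""
    cross_power = np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    dims = np.array(corr.shape, dtype=float)
    peak[peak > dims / 2] -= dims[peak > dims / 2]     # wrap to signed shifts
    return peak

def register(frames, ref_index):
    """Shift every frame onto the reference frame chosen by quality evaluation."""
    ref = frames[ref_index]
    return [nd_shift(f, estimate_shift(ref, f)) for f in frames]
```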
Referring to fig. 2 (normalized intensity distribution at the detector focal plane and the corresponding two-dimensional Fourier transform spectrum), at a very short integration time the single-frame image is closer to an ideal jitter-free image, with only some relative position (phase) shift. At a relatively long integration time the jitter is already obvious in a single-frame image. Although the blurring effect of the longer integration time attenuates the intensity of certain frequencies in the spectrum, some significant frequency components remain. A still longer integration time eventually causes the jitter to be distributed more uniformly in all directions, broadening the overall point spread function, so that the whole image behaves as if an ideal image had been passed through a low-pass filter, with its high-frequency components attenuated. Therefore, the invention converts the registered, continuously observed images obtained by multi-frame short exposure into the Fourier frequency domain and recombines the corresponding frequency components into a new spectral image based on weighting factors, thereby obtaining the optimal representation at each frequency. In the invention, the image is reconstructed using the Fourier burst accumulation (FBA) principle, and the following formulas are used during reconstruction:
$$y_p(x) = \mathcal{F}^{-1}\left(\sum_{i=1}^{M} w_i(\xi)\,\hat{v}_i(\xi)\right)(x)$$

and

$$w_i(\xi) = \frac{|\hat{v}_i(\xi)|^{p}}{\sum_{j=1}^{M} |\hat{v}_j(\xi)|^{p}}$$

wherein $\mathcal{F}^{-1}$ is the inverse Fourier transform operation, $\hat{v}_i(\xi)$ is the frequency-domain information of the i-th of the M frames of images after Fourier transform, $w_i(\xi)$ is the weighting factor of the corresponding frame spectrum, $y_p(x)$ is the reconstructed image, and p is used as a factor to control image sharpness and noise. Considering noise, the frequency-domain phase distortion caused by jitter and other factors, a larger p is not necessarily better, and there is still room to optimize the choice of this parameter.
Specifically, when p is small, the information of each frequency component in the reconstructed image is based mainly on multi-frame superposition; when p is large, the image is restored mainly from the information in a few specific frames. In the extreme cases, when p = 0 the method is equivalent to a simple registered average superposition of the frames; when p → ∞, the information of each frequency component is provided only by the frame, among the multiple frames, in which that frequency component is optimal.
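A compact NumPy sketch of the Fourier-domain recombination defined by the formulas above is given below; p is the tunable factor discussed here, its default value is only a placeholder, and any weight smoothing used in some FBA implementations is omitted for brevity.

```python
import numpy as np

def fba_reconstruct(frames, p=11.0, eps=1e-12):
    """Fourier burst accumulation: recombine registered short-exposure
    frames in the frequency domain with |spectrum|^p weights."""
    spectra = np.stack([np.fft.fft2(f) for f in frames])   # v_i(xi)
    mags = np.abs(spectra) ** p
    weights = mags / (mags.sum(axis=0) + eps)               # w_i(xi)
    recombined = (weights * spectra).sum(axis=0)            # weighted spectrum
    return np.real(np.fft.ifft2(recombined))                # y_p(x)

# p = 0 reduces to a plain registered average; a large p keeps, at each
# frequency, mostly the frame(s) in which that frequency is least attenuated.
```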
Since the frequency components are synthesized from frames with different jitter characteristics, the reconstructed image may contain a corresponding phase error when the jitter convolution kernel is large, whereas for small-scale jitter on the detector the phase error is negligible. This phenomenon is more obvious when observing point sources, while the reconstruction accuracy for extended sources is affected relatively little, so the parameter p needs to be adjusted appropriately in actual observation. For example, in practice a corresponding observation strategy should be formulated for the observation target in combination with the current satellite state information, and the selection of the corresponding parameters should be adjusted accordingly.
Referring to fig. 3, the reconstructed image of a brighter point source is very close to the original image and is sharper than the long-exposure view. For adjacent point sources resembling a double star, the method can distinguish and identify them more effectively. In the reconstruction of faint targets, detector readout noise has a certain effect on the reconstructed image, so a relatively longer integration time should be selected for such targets.
Referring to fig. 4, the method can also be applied to observations of typical extended sources carried out on a satellite platform, including surface targets, space objects, the atmosphere and the like. The invention can effectively mitigate the loss of image resolution and contrast caused by platform jitter and acquire clearer and more accurate images.
The recombined spectrum is then inverse Fourier transformed to obtain the reconstructed image. For a multi-band (polychromatic) detector system, the reconstruction is performed on each band separately, and the reconstructed images are then fused.
In this way, the signal components of each frame are linearly superposed by the method, so that the computational complexity is concentrated mainly in two-dimensional Fourier transforms, making on-board real-time computation possible.
After reconstruction, noise-reduction filtering and related operations are applied to the image: the image is denoised using a block-matching method such as BM3D or the NL-Bayes method under a Bayesian framework. In addition, depending on actual conditions, the image can be sharpened using differential or Gaussian filtering to recover the image details weakened during noise reduction and to enhance the edge components of the image.
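A simple post-processing sketch is given below, using a Gaussian pass as a stand-in denoiser and an unsharp mask for sharpening; in an actual implementation the BM3D or NL-Bayes methods named above would replace the Gaussian denoiser.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def postprocess(image, denoise_sigma=1.0, sharpen_sigma=1.5, amount=0.8):
    """Denoise, then restore edge detail with an unsharp mask."""
    denoised = gaussian_filter(image, denoise_sigma)   # stand-in for BM3D / NL-Bayes
    blurred = gaussian_filter(denoised, sharpen_sigma)
    return denoised + amount * (denoised - blurred)    # enhance edge components
```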
In conclusion, the image reconstruction method can effectively remove the focal-plane imaging blur caused by payload pointing jitter during imaging, has low computational complexity and is capable of on-board real-time operation. Compared with a simple multi-frame superposition algorithm, the method clearly improves both the subjective and objective quality of the image and can effectively improve the imaging resolution of the system. The method can therefore solve, more accurately and quickly, the problem of restoring multiple frames with relative motion in actual observation scenarios on a microsatellite platform, so that the algorithm is both effective and of practical application value. It is suitable for observing typical point sources and extended sources, including surface targets, space targets and atmospheric surveys, on a microsatellite platform or a similar space vehicle, and can also be applied to different observation scenarios such as astronomical imaging, Earth observation and photometry.
The above description is only one embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.