CN108090869B - On-satellite super-resolution reconstruction method based on area array CMOS optical camera - Google Patents
On-satellite super-resolution reconstruction method based on area array CMOS optical camera
- Publication number
- CN108090869B CN108090869B CN201711207230.5A CN201711207230A CN108090869B CN 108090869 B CN108090869 B CN 108090869B CN 201711207230 A CN201711207230 A CN 201711207230A CN 108090869 B CN108090869 B CN 108090869B
- Authority
- CN
- China
- Prior art keywords
- image
- resolution
- satellite
- super
- reconstruction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 52
- 230000003287 optical effect Effects 0.000 title claims abstract description 24
- 239000011159 matrix material Substances 0.000 claims abstract description 18
- 230000015556 catabolic process Effects 0.000 claims abstract description 17
- 238000006731 degradation reaction Methods 0.000 claims abstract description 17
- 238000007781 pre-processing Methods 0.000 claims abstract description 7
- 238000012545 processing Methods 0.000 claims description 18
- 230000033001 locomotion Effects 0.000 claims description 9
- 238000004364 calculation method Methods 0.000 claims description 8
- 230000009466 transformation Effects 0.000 claims description 8
- 230000008569 process Effects 0.000 claims description 6
- 238000005070 sampling Methods 0.000 claims description 5
- 239000000654 additive Substances 0.000 claims description 3
- 230000000996 additive effect Effects 0.000 claims description 3
- 238000012546 transfer Methods 0.000 claims description 3
- 230000000295 complement effect Effects 0.000 claims description 2
- 229910044991 metal oxide Inorganic materials 0.000 claims description 2
- 150000004706 metal oxides Chemical class 0.000 claims description 2
- 239000004065 semiconductor Substances 0.000 claims description 2
- 238000005516 engineering process Methods 0.000 description 8
- 238000003384 imaging method Methods 0.000 description 5
- 238000013461 design Methods 0.000 description 4
- 230000000694 effects Effects 0.000 description 4
- 230000008901 benefit Effects 0.000 description 3
- 238000011161 development Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 2
- 238000007667 floating Methods 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 230000006872 improvement Effects 0.000 description 2
- 230000010354 integration Effects 0.000 description 2
- 238000011160 research Methods 0.000 description 2
- 238000012549 training Methods 0.000 description 2
- 230000003321 amplification Effects 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 230000010365 information processing Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 230000001575 pathological effect Effects 0.000 description 1
- 238000012805 post-processing Methods 0.000 description 1
- 238000013139 quantization Methods 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000010408 sweeping Methods 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4053—Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10052—Images from lightfield camera
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
Abstract
An on-satellite super-resolution reconstruction method based on an area array CMOS optical camera comprises the following steps. Step 1: acquiring K multi-temporal low-resolution image sequences of the same scene to be processed through an area array CMOS detector. Step 2: for an optical remote sensing satellite in a non-geosynchronous orbit, calculating the number P of offset pixels from the satellite orbit parameters. Step 3: constructing a degradation model. Step 4: calculating the parameters of the degradation model. Step 5: preprocessing the region-of-interest image, including intercepting the registration regions of the sequence images and deblurring the images. Step 6: obtaining the position-misalignment and geometric-deformation matrices through image registration. Step 7: reconstructing with a super-resolution reconstruction algorithm. Step 8: repeating steps 6 to 7 on the remaining ROI images until reconstruction of the whole ROI area is completed, and obtaining a complete super-resolution image through image splicing.
Description
Technical Field
The invention belongs to the field of satellite remote sensing, and relates to an on-satellite super-resolution reconstruction method based on an area array CMOS optical camera.
Background
Images are the most direct way to obtain information, and how to increase the amount of information an image carries has long been a key research direction; the continual evolution of high-resolution imagery in recent years reflects this. However, in many fields such as remote sensing, medicine and security, improvement of image resolution is often limited by the hardware cost, manufacturing process and data-transmission conditions of the imaging sensor. The resolution of an optical remote sensing camera has two aspects: the resolution of the optical system and the resolution of the detector; that is, the spatial resolution of an optical remote sensing camera is subject to the dual constraints of the optical system and the detector. An optimal camera design should satisfy the condition that the sampling of the optical system's Airy disk by the detector array meets the Nyquist sampling theorem. However, for a low-orbit space-borne remote sensing camera constrained by objective factors such as the high-speed motion of the satellite platform and the image signal-to-noise ratio, the process level of existing detectors cannot match the resolution of the optical system, so the space-borne remote sensing optical camera is a detector-resolution-limited system. In space remote sensing, the most direct ways to improve resolution are to enlarge the aperture of the optical system and to increase the density of the CCD/CMOS array. On the one hand, enlarging the aperture of the optical system inevitably increases the volume and weight of the imaging equipment, which is especially difficult for remote sensing satellites with strict requirements on volume, power consumption and weight. On the other hand, increasing the density of the CCD or CMOS means reducing the size of each photosensitive cell, but once the photosensitive cell is made sufficiently small, image quality begins to degrade, because as the cell shrinks the photons it collects during exposure are increasingly masked by thermal noise. Moreover, once a satellite has been launched, its imaging devices are difficult to upgrade. How to extract more high-frequency information from existing low-resolution images, namely the problem of remote sensing image super-resolution reconstruction, has therefore become a research focus.
Super-resolution reconstruction methods fall into two categories: reconstruction from a single frame and reconstruction from multiple frames. Single-frame super-resolution uses only one image of the target region and reconstructs a high-resolution image by interpolation, reconstruction-based or learning-based methods. Multi-frame super-resolution acquires several images with known or solvable relative motion between them and uses the sampling information they contain to build up high-resolution detail in the overlapping area; that is, it reconstructs a high-resolution image, as a post-processing step, from multi-temporal low-resolution remote sensing images of the same scene. At present, super-resolution reconstruction has not reached a practical stage in the remote sensing field, and single-frame reconstruction is mostly used, because the remote sensing payloads launched earlier were mostly linear-array CCD detectors at low orbit altitudes: a satellite could hardly acquire multi-temporal images of the same scene within one transit, the interval between the multi-temporal images it could acquire was long, and the ground scene changed in between, so super-resolution reconstruction could not be performed accurately. However, with the launch of the Gaofen-4 (GF-4) satellite, area array CMOS detectors have gradually been adopted as remote sensing payloads, and acquiring multi-temporal data of the same scene within a short time has become possible, which makes the advantage of multi-frame super-resolution reconstruction prominent. In addition, because interpolation-based single-frame super-resolution performs poorly, single-frame reconstruction is mostly carried out by dictionary training, which requires pairs of high- and low-resolution training images that are difficult to obtain for remote sensing data; mathematically, reconstruction from a single frame is an ill-posed inversion of the image degradation process and cannot yield an ideal result in practical applications. By contrast, using an area array CMOS camera to acquire multiple frames for super-resolution reconstruction provides richer image information and yields a good reconstruction effect. However, no solution in this respect has so far been available that can be applied on board the satellite.
On the other hand, the signal-to-noise ratio of an image is an important index of image quality. The signal-to-noise ratio is closely related to the exposure time of the camera: for the same orbit altitude and the same solar elevation angle, the shorter the exposure time, the lower the signal-to-noise ratio, which is why linear-array CCD cameras generally adopt the TDI (time delay integration) working mode. TDI in a CCD camera is a process in which charge is accumulated inside the sensor, usually implemented with analog circuitry, so it is referred to as analog TDI. When an analog TDI sensor performs time delay integration, the charges generated as the same scene is scanned line by line are accumulated in the sensor, read out through the readout circuit, and then differentially amplified and digitally quantized in the back-end information processing circuit. For an area array CMOS camera, the exposure time is strictly constrained by the camera's speed relative to the ground (it must be short enough that the scene does not smear across pixels), and the TDI working mode of the original physical structure cannot be adopted, so the signal-to-noise ratio of the acquired images is low.
Disclosure of Invention
The technical problem solved by the invention is as follows: overcoming the defects of the prior art and providing an on-satellite super-resolution reconstruction method based on an area array CMOS optical camera.
The technical scheme of the invention is as follows: an on-satellite super-resolution reconstruction method based on an area array CMOS optical camera comprises the following steps:
step one: acquiring K multi-temporal low-resolution image sequences of the same scene to be processed through an area array CMOS detector;
step two: calculating the number P of offset pixels according to satellite orbit parameters aiming at an optical remote sensing satellite with a non-geosynchronous orbit;
step three: constructing a degradation model;
step four: calculating to obtain parameters of the degradation model;
step five: preprocessing the image of the region of interest, including intercepting the registration region of the sequence image and deblurring the image;
step six: obtaining a matrix of position dislocation and geometric deformation through image registration;
step seven: reconstructing by using a super-resolution reconstruction algorithm;
step eight: and repeating the sixth step to the seventh step on the residual ROI images until the reconstruction of the whole ROI area is completed, and obtaining a complete super-resolution image through image splicing.
The specific process of step two is as follows: calculating the number P of offset pixels between two adjacent frames of the image sequence from the satellite motion speed v, the frame frequency f and the spatial resolution R; P serves as the reference value for intercepting same-area image blocks of equal size from the different time-phase images.
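For illustration, a minimal Python sketch of the offset calculation is given below. The patent presents its exact formula only as a figure, so the sketch assumes the standard relation P = v / (f · R); the patent's own expression may include additional factors (for example Earth-rotation compensation), so treat this as an approximation.

```python
def offset_pixels(v: float, f: float, r: float) -> float:
    """Pixels by which the ground scene shifts between consecutive frames.

    v : satellite ground speed in m/s
    f : frame frequency in frames/s
    r : spatial resolution (ground sample distance) in m/pixel
    Assumed relation: P = v / (f * r); the patent's own formula is not
    reproduced in the text and may differ.
    """
    return v / (f * r)


# Nominal values of the embodiment (7 km/s, 20 frames/s, 2 m) give P = 175.0
print(offset_pixels(7000.0, 20.0, 2.0))
```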
The degradation model constructed in the third step is as follows:
y_i = D·H_i·M_i·x + n_i,   i = 1, 2, …, K;
wherein x is the original, undegraded high-resolution image; y_i is the observed i-th low-resolution image; M_i is the matrix representing the positional misalignment and geometric deformation of the i-th image; H_i is the blur degradation matrix; D is the downsampling matrix; and n_i is additive noise.
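The degradation model can be read as a forward imaging pipeline: warp, blur, downsample, add noise. A minimal NumPy/SciPy sketch of that pipeline is shown below, for example to simulate low-resolution frames when testing a reconstruction routine; the Gaussian blur, decimation and affine warp are illustrative stand-ins, not the patent's specific operators.

```python
import numpy as np
from scipy.ndimage import affine_transform, gaussian_filter

def degrade(x, A, t, blur_sigma, factor, noise_sigma, rng=None):
    """Simulate y_i = D H_i M_i x + n_i for one frame.

    A, t        : 2x2 matrix and 2-vector of the affine motion (M_i)
    blur_sigma  : std-dev of the Gaussian kernel standing in for H_i
    factor      : integer downsampling factor of D
    noise_sigma : std-dev of the additive noise n_i
    """
    rng = rng or np.random.default_rng()
    warped = affine_transform(x, A, offset=t, order=1)    # M_i : motion / deformation
    blurred = gaussian_filter(warped, blur_sigma)         # H_i : blur degradation
    low = blurred[::factor, ::factor]                     # D   : downsampling
    return low + rng.normal(0.0, noise_sigma, low.shape)  # n_i : additive noise
```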
The processing procedure of the step five is as follows:
Let the size of the obtained ROI (region of interest) image be L × F, where L is the length of the image taken by the CMOS (complementary metal oxide semiconductor) camera along the satellite motion direction and F is the width perpendicular to the satellite motion direction. Let L ≡ Z (mod P), where mod denotes the remainder operation and Z is the remainder. If Z ≠ 0, first intercept an image block of length Z, i.e. of size Z × F, from each of the N frames of remote sensing images that contain the block to be intercepted, obtaining N regions to be registered; if Z = 0, intercept image blocks of size S × F as the regions to be registered, where S is an integer multiple of P and S < L.
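As an illustration of this interception rule, the sketch below lists the along-track lengths of the blocks cut from an L-row image given the per-frame offset P. The function name and the `multiple` parameter are illustrative; the patent only requires S to be an integer multiple of P with S < L, and the offset value 135 used in the example is inferred from the embodiment's 810-row strips and 125-row remainder, not stated explicitly.

```python
def strip_lengths(L: int, P: int, multiple: int = 6) -> list[int]:
    """Along-track lengths of the regions to be registered.

    A first strip of length Z = L mod P is cut when Z != 0; the remaining
    rows are cut in strips of S = multiple * P (with S < L), and whatever
    is left over forms a final shorter strip.
    """
    Z = L % P
    S = multiple * P
    assert 0 < S < L, "S must be an integer multiple of P smaller than L"
    lengths = [Z] if Z else []
    remaining = L - Z
    lengths += [S] * (remaining // S)
    if remaining % S:
        lengths.append(remaining % S)
    return lengths


# Embodiment-like example: a 5120-row image with an assumed offset P = 135
# gives Z = 125 and strips of 810 rows: [125, 810, 810, 810, 810, 810, 810, 135]
print(strip_lengths(5120, 135))
```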
The specific calculation method for the parameters in step four is as follows: estimate the blur matrix H from the low-resolution images, compute the downsampling matrix D, and estimate the noise variance n of the low-resolution images.
The specific method for obtaining M_i is as follows:
selecting one of the N regions to be registered as the reference frame; selecting the transformation model, namely the global affine transformation model
u' = a_11·u + a_12·v + b_1,  v' = a_21·u + a_22·v + b_2;
wherein (u, v) are the pixel coordinates in the reference frame, (u', v') are the corresponding pixel coordinates in the region to be registered with the reference frame, and a_11, a_12, a_21, a_22, b_1, b_2 are the transformation parameters, all real numbers; the parameter M_i of the degradation model is thereby obtained.
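One practical way to obtain the global affine parameters just defined is feature-based matching followed by a robust fit. The OpenCV-based sketch below is an assumed implementation (the patent does not prescribe the estimator); it expects 8-bit grayscale regions and returns the 2×3 matrix of the affine model.

```python
import cv2
import numpy as np

def estimate_global_affine(ref: np.ndarray, mov: np.ndarray) -> np.ndarray:
    """Return the 2x3 matrix [[a11, a12, b1], [a21, a22, b2]] mapping pixel
    coordinates (u, v) of the reference region to (u', v') of `mov`."""
    orb = cv2.ORB_create(nfeatures=2000)
    k_ref, d_ref = orb.detectAndCompute(ref, None)
    k_mov, d_mov = orb.detectAndCompute(mov, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(d_ref, d_mov)
    src = np.float32([k_ref[m.queryIdx].pt for m in matches])
    dst = np.float32([k_mov[m.trainIdx].pt for m in matches])
    M, _inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)
    return M
```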
The image deblurring adopts an image restoration method based on modulation transfer function compensation.
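A common realisation of MTF-compensation restoration is a regularised inverse (Wiener-style) filter in the frequency domain. The sketch below is one such realisation under assumed inputs, an MTF sampled on the image's fftshifted frequency grid and a scalar noise-to-signal ratio; it illustrates the idea rather than the patent's specific algorithm.

```python
import numpy as np

def mtf_compensate(img: np.ndarray, mtf: np.ndarray, nsr: float = 1e-2) -> np.ndarray:
    """Boost the spatial frequencies attenuated by the system MTF.

    img : blurred image (2-D float array)
    mtf : system MTF sampled on the same fftshifted grid as np.fft.fft2(img)
    nsr : assumed noise-to-signal ratio regularising the inverse filter
    """
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    h = np.clip(mtf, 1e-6, None)               # avoid blow-up where the MTF is ~0
    restored = spectrum * h / (h * h + nsr)    # regularised inverse of the real MTF
    return np.real(np.fft.ifft2(np.fft.ifftshift(restored)))
```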
The reconstruction with a super-resolution reconstruction algorithm in step seven includes frequency-domain reconstruction and spatial-domain reconstruction. Frequency-domain reconstruction performs super-resolution on the multiple images using the aliasing relation between the continuous Fourier transform of the original high-resolution image and the discrete Fourier transforms of the low-resolution observed images. Spatial-domain reconstruction methods include non-uniform sample interpolation, iterative back projection, projection onto convex sets (POCS), maximum a posteriori (MAP) estimation, and hybrid MAP/POCS methods.
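Of the spatial-domain methods listed, iterative back projection (IBP) is the simplest to sketch. The snippet below assumes purely translational, integer shifts expressed in high-resolution pixels and a Gaussian blur; it is a schematic of the technique in general, not the patent's reconstruction step.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def ibp(lr_frames, hr_shifts, factor, blur_sigma, iters=20, step=0.5):
    """Iterative back projection from several shifted low-resolution frames.

    lr_frames : list of 2-D arrays of equal size
    hr_shifts : list of integer (dy, dx) offsets of each frame, in HR pixels
    factor    : integer upsampling factor
    """
    hr = zoom(lr_frames[0], factor, order=1)          # initial HR estimate
    for _ in range(iters):
        for y, (dy, dx) in zip(lr_frames, hr_shifts):
            # simulate this observation from the current estimate
            sim = gaussian_filter(np.roll(hr, (dy, dx), axis=(0, 1)), blur_sigma)
            sim = sim[::factor, ::factor]
            # back-project the residual onto the HR grid
            up = zoom(y - sim, factor, order=1)
            hr += step * np.roll(up, (-dy, -dx), axis=(0, 1))
    return hr
```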
Compared with the prior art, the invention has the following advantages:
(1) the super-resolution technique based on an area array CMOS optical camera provided by the invention addresses the practical situation in which an area array CMOS camera is the main payload; it improves image resolution and reduces satellite design cost without changing the camera hardware design and without adding satellite volume or weight, thereby supporting the development of satellites towards miniaturization and low cost;
(2) the invention enables routine on-board deployment of super-resolution technology, improves the resolution and signal-to-noise ratio of the area array CMOS camera, and overcomes the low image signal-to-noise ratio of the area array CMOS camera caused by the lack of a TDI working mode;
(3) the method fully considers the on-board computing resource limitations of a hardware implementation, features high computing speed and low computational complexity, and can achieve real-time or near-real-time super-resolution reconstruction on an embedded GPU.
Drawings
FIG. 1 is a general flow chart of the super-resolution imaging technology on the satellite based on an area array CMOS camera.
Fig. 2 is a schematic diagram of registration-region interception from the sequence images.
FIG. 3 is a schematic diagram of a super-resolution mode adopted on a satellite.
Detailed Description
For a better understanding of the technical solution of the present invention, reference will now be made to the embodiments illustrated in the accompanying drawings. As shown in Fig. 1, the method performs on-satellite super-resolution imaging based on an area array CMOS detector and comprises eight steps: image acquisition, pixel offset calculation, degradation model construction, degradation model parameter calculation, image preprocessing (registration-region interception from the sequence images and image deblurring), image registration, super-resolution reconstruction model establishment and reconstruction, and image splicing. Take as an example a visible-light CMOS camera on a sun-synchronous-orbit remote sensing satellite with an orbit altitude of 500 km and a spatial resolution of 2 m; assume the satellite attitude stability is 0.005°/s, the CMOS camera works in a frame-push mode at 20 frames/s, the satellite flight speed is 7 km/s, and the swath is 5K × 5K pixels.
Step one: acquiring a sequence of 11 multi-temporal low-resolution images of the same scene through the satellite's area array CMOS detector;
step two: calculating the number of offset pixels from the satellite flight speed, spatial resolution and frame frequency.
step three: constructing a degradation model:
y_i = D·H_i·M_i·x + n_i,   i = 1, 2, …, K
wherein x is the undegraded high-resolution image to be solved; y_i is the acquired i-th low-resolution image, 1 ≤ i ≤ 11; M_i is the matrix representing the positional misalignment and geometric deformation of the i-th image; H_i is the blur degradation matrix; D is the downsampling matrix; n_i is additive noise;
step four: calculating the degradation model parameters. Estimate the blur matrix H from the low-resolution images; the blur type is assumed to be Gaussian, and the Gaussian convolution kernel can be estimated from the number of pixels spanned by a blurred point or line target in the image. Compute the downsampling matrix D. Estimate the noise variance n of the low-resolution images using the first-level wavelet transform coefficients. For the specific calculation method, refer to the patent "Super-resolution reconstruction method based on multiple transform domains", ZL201610032463.5.
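The wavelet-based noise estimate can be realised with the classical first-level detail-coefficient (MAD) estimator. The snippet below is that generic estimator, offered as an illustration rather than the exact procedure of patent ZL201610032463.5.

```python
import numpy as np
import pywt

def estimate_noise_sigma(lr_image: np.ndarray) -> float:
    """Robust noise standard deviation from the first-level diagonal detail
    coefficients of a single-level 2-D wavelet transform (Donoho's MAD rule)."""
    _approx, (_horiz, _vert, diag) = pywt.dwt2(lr_image.astype(np.float64), 'db1')
    return float(np.median(np.abs(diag)) / 0.6745)
```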
Step five: preprocessing an image;
Preprocess the acquired sequence of low-resolution images to be processed, including registration-region interception from the sequence images and image deblurring. The size of each acquired image is 5120 × 5120. With L ≡ Z (mod P), where mod is the remainder operation and Z is the remainder, Z equals 125 here, so the interception length along the motion direction is 125 pixels; that is, the image block to be intercepted is 125 × 5120 pixels. This block is intercepted from each remote sensing image that contains it, yielding 6 regions to be registered, as shown in Fig. 2: the unshaded area in Fig. 2 is the block to be intercepted, each row corresponds to one low-resolution image containing the region to be intercepted, and it can be seen that 6 images in total contain the first region to be intercepted. An image restoration method based on modulation transfer function compensation is then applied to improve the image quality, yielding better-quality images to be registered.
Step six: image registration. Construct the global affine transformation model, select the first of the 6 images as the reference frame, and calculate the affine transformation parameters M_i between each of the remaining 5 frames and the reference frame. Ordinarily, image registration is a step inside the image reconstruction loop and takes part in the algorithm's iterations, but owing to limited hardware resources and for operational efficiency it is performed here as a preprocessing step;
step seven: and establishing a super-resolution reconstruction model for reconstruction. And reconstructing the super-resolution image by using the MAP method, and waiting for splicing.
Step eight: intercept image strips of size 810 × 5120 in sequence and repeat steps six to seven until reconstruction of the whole image is completed. The reconstruction effect is positively correlated with the number of frames taking part in reconstruction: more frames give a better result, but considering the limited on-board computing resources and the reconstruction efficiency, 6 frames are chosen as a compromise to complete the super-resolution reconstruction. The strip interception size is a multiple of the pixel offset number; the multiple can be chosen according to satellite resources and real-time requirements, and is chosen as 6 in this embodiment. After the full frame is reconstructed, the super-resolved strips are spliced on board into a complete high-resolution image.
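The strip-by-strip loop and the final splicing can be sketched as below. The names are illustrative: `reconstruct_strip` stands for any multi-frame SR routine such as the MAP sketch above, and the cut is simplified by taking the same row range from each frame, ignoring the inter-frame shift that registration handles in the real method.

```python
import numpy as np

def bounds_from_lengths(lengths):
    """Turn strip lengths such as [125, 810, ...] into (row_start, row_end) pairs."""
    edges = np.cumsum([0] + list(lengths))
    return list(zip(edges[:-1], edges[1:]))

def reconstruct_full_image(frames, strip_bounds, reconstruct_strip):
    """Super-resolve an image strip by strip and splice the results.

    frames           : list of co-covering low-resolution frames (2-D arrays)
    strip_bounds     : list of (row_start, row_end) pairs, e.g. from
                       bounds_from_lengths(strip_lengths(L, P))
    reconstruct_strip: callable taking a list of strip images and returning
                       one reconstructed high-resolution strip
    """
    strips = [reconstruct_strip([f[a:b, :] for f in frames]) for a, b in strip_bounds]
    return np.concatenate(strips, axis=0)
```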
In this embodiment of the invention, the NVIDIA embedded GPU Jetson TX2 is used as the core computing and processing unit, with an FPGA (field programmable gate array) handling data input/output control; a brief flow chart is shown in Fig. 3. At present, on-board data processing on remote sensing satellites at home and abroad uniformly adopts reliable space-grade embedded hardware processing platforms, which is dictated by factors such as the small available space on the satellite, limited power supply, and the harsh space radiation environment. Mainstream space-borne image processing mainly relies on a central processing unit (CPU) integrated with a space-grade field programmable gate array (FPGA); owing to its limited processing capability, the on-board equipment can only complete simple data processing tasks and must transmit the results to a ground processing system to complete complex tasks (image matching, super-resolution and the like). The main reason for this staged processing mode is the low computing power of the on-board equipment. Computing power generally refers to the data processing capability of the CPU, but in recent years GPUs have offered far stronger capability for processing large amounts of parallel data. As a powerful general-purpose parallel processor, the GPU has intensive data-parallel computing capability and has brought new breakthroughs to general-purpose parallel computing beyond graphics display. Because the architectures of the GPU and the CPU differ, the GPU has clear advantages in memory bandwidth and floating-point throughput. Therefore, one feasible solution for a space-borne high-performance real-time image processing system is a stand-alone parallel computing platform based on a graphics processing unit (GPU), and the NVIDIA embedded computing platform Jetson TX2 used in this embodiment meets this requirement. The development board measures 50 × 87 mm, weighs 85 grams and has a standard power consumption of 7.5 watts; it runs a Linux system and integrates a 256-core NVIDIA Pascal GPU (16 nm process), a 6-core 64-bit ARMv8 CPU cluster, up to 8 GB of memory and 32 GB of solid-state storage. Its floating-point computing power is 1.5 TFLOPS, about 10 times that of a mainstream CPU and far stronger than a traditional space-grade FPGA.
The above description of the invention and its embodiments is not limiting, and the drawings illustrate only one embodiment of the invention. Without departing from the spirit of the invention, structures or embodiments designed in a manner similar to this technical solution without inventive effort fall within the protection scope of the invention.
Claims (5)
1. An on-satellite super-resolution reconstruction method based on an area array CMOS optical camera is characterized by comprising the following steps:
the method comprises the following steps: acquiring K low-resolution image sequences to be processed in the same scene in multiple time phases through an area array CMOS detector;
step two: calculating the number P of offset pixels according to satellite orbit parameters aiming at an optical remote sensing satellite with a non-geosynchronous orbit;
step three: constructing a degradation model;
step four: calculating to obtain parameters of the degradation model;
step five: preprocessing the image of the region of interest, including intercepting the registration region of the sequence image and deblurring the image;
step six: obtaining a matrix of position dislocation and geometric deformation through image registration;
step seven: reconstructing by using a super-resolution reconstruction algorithm;
step eight: repeating the sixth step to the seventh step on the residual ROI images until the reconstruction of the whole ROI area is completed, and obtaining a complete super-resolution image through image splicing;
the specific process of the second step is as follows: calculating the number P of offset pixels between two frames of the sequence images according to the satellite motion speed v, the frame frequency f and the spatial resolution R, wherein the number P is used as a reference value for intercepting the size of the same area image in different time phase images, wherein
The degradation model constructed in the third step is as follows:
y_i = D·H_i·M_i·x + n_i,   i = 1, 2, …, K;
where x is the original undegraded high-resolution image; y_i is the observed i-th low-resolution image; M_i is the matrix representing the positional misalignment and geometric deformation of the i-th image; H_i is the blur degradation matrix; D is the downsampling matrix; n_i is additive noise;
the processing procedure of the step five is as follows:
Let the size of the obtained ROI (region of interest) image be L × F, where L is the length of the image taken by the CMOS (complementary metal oxide semiconductor) camera along the satellite motion direction and F is the width perpendicular to the satellite motion direction. Let L ≡ Z (mod P), where mod denotes the remainder operation and Z is the remainder. If Z ≠ 0, first intercept an image block of length Z, i.e. of size Z × F, from each of the N frames of remote sensing images that contain the block to be intercepted, obtaining N regions to be registered; if Z = 0, intercept image blocks of size S × F as the regions to be registered, where S is an integer multiple of P and S < L.
2. The on-satellite super-resolution reconstruction method based on the area-array CMOS optical camera according to claim 1, characterized in that the specific calculation method for the parameters in step four is as follows: estimating the blur matrix H from the low-resolution images, computing the downsampling matrix D, and estimating the noise variance n of the low-resolution images.
3. The on-satellite super-resolution reconstruction method based on the area-array CMOS optical camera according to claim 1, characterized in that the specific method for obtaining M_i is as follows:
selecting one of the N regions to be registered as the reference frame; selecting the transformation model, namely the global affine transformation model u' = a_11·u + a_12·v + b_1, v' = a_21·u + a_22·v + b_2, where (u, v) are the pixel coordinates in the reference frame, (u', v') are the corresponding pixel coordinates in the region to be registered, and a_11, a_12, a_21, a_22, b_1, b_2 are real-valued transformation parameters.
4. The on-satellite super-resolution reconstruction method based on the area-array CMOS optical camera according to claim 3, characterized in that: the image deblurring adopts an image restoration method based on modulation transfer function compensation.
5. The on-satellite super-resolution reconstruction method based on the area-array CMOS optical camera according to any one of claims 1 to 4, characterized in that the reconstruction with a super-resolution reconstruction algorithm in step seven includes frequency-domain reconstruction and spatial-domain reconstruction; frequency-domain reconstruction performs super-resolution on the multiple images using the aliasing relation between the continuous Fourier transform of the original high-resolution image and the discrete Fourier transforms of the low-resolution observed images; spatial-domain reconstruction methods include non-uniform sample interpolation, iterative back projection, projection onto convex sets (POCS), maximum a posteriori (MAP) estimation, and a hybrid MAP/POCS method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711207230.5A CN108090869B (en) | 2017-11-27 | 2017-11-27 | On-satellite super-resolution reconstruction method based on area array CMOS optical camera |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711207230.5A CN108090869B (en) | 2017-11-27 | 2017-11-27 | On-satellite super-resolution reconstruction method based on area array CMOS optical camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108090869A CN108090869A (en) | 2018-05-29 |
CN108090869B true CN108090869B (en) | 2021-07-09 |
Family
ID=62172267
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711207230.5A Active CN108090869B (en) | 2017-11-27 | 2017-11-27 | On-satellite super-resolution reconstruction method based on area array CMOS optical camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108090869B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110532853B (en) * | 2019-07-09 | 2021-10-15 | 中国空间技术研究院 | Remote sensing time-exceeding phase data classification method and device |
CN111986134B (en) * | 2020-08-26 | 2023-11-24 | 中国空间技术研究院 | Remote sensing imaging method and device for area-array camera |
CN112261315B (en) * | 2020-09-07 | 2022-06-17 | 清华大学 | High-resolution calculation imaging system and method based on camera array aperture synthesis |
CN112949549B (en) * | 2021-03-19 | 2023-04-18 | 中山大学 | Super-resolution-based change detection method for multi-resolution remote sensing image |
CN113284161B (en) * | 2021-05-10 | 2023-05-16 | 深圳市魔方卫星科技有限公司 | Area array remote sensing imaging method, device, computer equipment and storage medium |
CN113962897B (en) * | 2021-11-02 | 2022-09-02 | 中国空间技术研究院 | Modulation transfer function compensation method and device based on sequence remote sensing image |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107194874A (en) * | 2017-05-26 | 2017-09-22 | 上海微小卫星工程中心 | Super-resolution imaging system and method based on bias image stabilization |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8811769B1 (en) * | 2012-02-28 | 2014-08-19 | Lytro, Inc. | Extended depth of field and variable center of perspective in light-field processing |
US9734599B2 (en) * | 2014-10-08 | 2017-08-15 | Microsoft Technology Licensing, Llc | Cross-level image blending |
CN104732532B (en) * | 2015-03-11 | 2017-05-31 | 中国空间技术研究院 | A kind of remote sensing satellite multi-spectrum image registration method |
CN105405114A (en) * | 2015-11-26 | 2016-03-16 | 北京空间飞行器总体设计部 | Method for achieving super resolution of GEO optical satellite by using beam splitting and staggered sampling |
CN105657263B (en) * | 2015-12-31 | 2018-11-02 | 杭州卓腾信息技术有限公司 | A kind of super-resolution digital slices scan method based on area array cameras |
CN105550993B (en) * | 2016-01-18 | 2018-11-20 | 中国空间技术研究院 | Super resolution ratio reconstruction method based on multiple transform domain |
CN106525238B (en) * | 2016-10-27 | 2018-08-03 | 中国科学院光电研究院 | A kind of satellite-borne multispectral imaging system design method based on super-resolution rebuilding |
CN106558036B (en) * | 2016-10-27 | 2019-08-02 | 中国科学院光电研究院 | A kind of spaceborne super-resolution imaging design method |
CN107071281A (en) * | 2017-04-19 | 2017-08-18 | 珠海市魅族科技有限公司 | Panorama shooting method and device |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107194874A (en) * | 2017-05-26 | 2017-09-22 | 上海微小卫星工程中心 | Super-resolution imaging system and method based on bias image stabilization |
Also Published As
Publication number | Publication date |
---|---|
CN108090869A (en) | 2018-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108090869B (en) | On-satellite super-resolution reconstruction method based on area array CMOS optical camera | |
US7602997B2 (en) | Method of super-resolving images | |
Ma et al. | Quanta burst photography | |
CN111598778B (en) | Super-resolution reconstruction method for insulator image | |
CN105550993B (en) | Super resolution ratio reconstruction method based on multiple transform domain | |
CN109146787B (en) | Real-time reconstruction method of dual-camera spectral imaging system based on interpolation | |
CN106408524A (en) | Two-dimensional image-assisted depth image enhancement method | |
CN108961163A (en) | A kind of high-resolution satellite image super-resolution reconstruction method | |
CN111861884A (en) | Satellite cloud image super-resolution reconstruction method based on deep learning | |
Anger et al. | Fast and accurate multi-frame super-resolution of satellite images | |
CN104217412B (en) | Airborne super-resolution image reconstruction device and reconstruction method | |
CN104376547A (en) | Motion blurred image restoration method | |
CN111986134A (en) | Remote sensing imaging method and device for area-array camera | |
Ma et al. | Extensions of compressed imaging: flying sensor, coded mask, and fast decoding | |
CN104574338B (en) | Remote sensing image super-resolution reconstruction method based on multi-angle linear array CCD sensors | |
CN114363517A (en) | Embedded micro-scanning super-resolution real-time processing system and method | |
CN112017122B (en) | Super-resolution imaging method | |
Hefnawy | An efficient super-resolution approach for obtaining isotropic 3-D imaging using 2-D multi-slice MRI | |
Shin et al. | LoGSRN: Deep super resolution network for digital elevation model | |
CN113608237A (en) | Laser radar three-dimensional range profile super-resolution reconstruction method | |
CN105184762B (en) | The method that geosynchronous satellite posture lack sampling measures lower super-resolution image reconstruction | |
Wang et al. | A research and strategy of space super-resolution imaging system based on Sandroid CubeSat | |
Yang et al. | Hyper-Temporal Data Based Modulation Transfer Functions Compensation for Geostationary Remote Sensing Satellites | |
Chen et al. | Fast image super-resolution for a dual-resolution camera | |
Pattanaik | Deep Learning based Super-Resolution for Medical Volume Visualization with Direct Volume Rendering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||