CN114813049A - Stray light correction method of optical remote sensing camera

Info

Publication number
CN114813049A
CN114813049A
Authority
CN
China
Prior art keywords
stray light
image
illumination area
remote sensing
optical remote
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210394149.7A
Other languages
Chinese (zh)
Inventor
韩琳
向光峰
孟炳寰
孙亮
洪津
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Institutes of Physical Science of CAS
Original Assignee
Hefei Institutes of Physical Science of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Institutes of Physical Science of CAS filed Critical Hefei Institutes of Physical Science of CAS
Priority to CN202210394149.7A
Publication of CN114813049A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01M - TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 - Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 - Testing optical properties
    • G01M11/04 - Optical benches therefor
    • G06T5/90

Landscapes

  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)

Abstract

The invention discloses a stray light correction method for an optical remote sensing camera, which comprises the following steps: 1. build a test system, illuminate the image plane of the camera region by region, and acquire an image of each region with the focal-plane detector array; 2. preprocess the images; 3. calculate the stray light influence factor of each illuminated region on every pixel of the other, non-illuminated regions, set the influence factors inside the illuminated region itself to zero, and assemble the influence factors of all pixels into a stray light distribution matrix for each region; 4. capture an image of a real scene with the camera, keep the image partition consistent with step 1, calculate the stray light contribution of each region, and subtract the sum of these contributions from the measured image using an iterative optimization algorithm to complete the correction. The method achieves simple and reliable stray light correction for optical remote sensing cameras, offers strong universality and portability, and supports accurate quantitative application of remote sensing data.

Description

Stray light correction method of optical remote sensing camera
Technical Field
The invention belongs to the technical field of space remote sensing, and particularly relates to a method for correcting stray light of an optical remote sensing camera.
Background
Stray light is light that reaches the detector along non-imaging paths, depositing radiant energy that reduces image contrast and sharpness. In severe cases the target image can be drowned out entirely by stray light radiation, which seriously degrades the quantitative application of remote sensing data. Most optical remote sensing cameras, both in China and abroad, are susceptible to stray light, in particular interference from direct sunlight, and their level of quantitative application is limited as a result. To enable high-precision remote sensing data applications, it is therefore necessary to study how to suppress and eliminate stray light.
The usual means of stray light suppression include applying extinction (matte) coatings to opto-mechanical surfaces and designing inner and outer baffles, light-blocking rings, and anti-reflection-coated optical surfaces. After such suppression measures are applied, stray light is measured; if it remains at a high level, the residual stray light radiation must still be corrected. Two correction methods dominate: image restoration and the matrix method. The image restoration method is based on a deconvolution algorithm: the corrected image is obtained by deconvolving the measured image with the point spread function (PSF) of the optical system. However, the PSF is difficult to measure accurately, and the PSFs of different fields of view are not necessarily identical, so the image restoration method carries a heavy computational burden. The matrix method was first proposed by Zong and applied to spectral stray light correction of spectrometers. Spectral stray light is defined as the response at wavelength positions outside the bandwidth of the central pixel, which is analogous to the response that a target pixel of an optical remote sensing camera induces at the other pixel positions, so the method can also be applied to stray light correction of remote sensing cameras. Its core is to estimate the stray light response: a stray light distribution matrix is first constructed, the product of the image response measured in the real scene and the corresponding distribution matrix is taken as the stray light response, and this response is subtracted from the real scene image to obtain the corrected image. However, the stray light distribution matrix is measured in a laboratory darkroom; under real scene imaging conditions, multiplying the laboratory matrix by the measured image response overestimates the true signal of the imaging region, which leads to overcorrection of the stray light.
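For orientation, a minimal sketch of the matrix-method relation described above, in notation introduced here for illustration rather than taken from the patent:

\[
I_{\mathrm{meas}} \;=\; I_{\mathrm{true}} \;+\; I_{\mathrm{stray}},
\qquad
I_{\mathrm{stray}} \;\approx\; \sum_{q} D^{(q)}\, S^{(q)},
\]

where S^(q) is the in-band signal of illuminated region q and D^(q) is the stray light distribution matrix measured for that region in the laboratory. Because the true in-band signal S^(q) is not directly observable in a real scene, the plain matrix method substitutes the measured, stray-light-contaminated region response, which overestimates I_stray and produces the overcorrection noted above; the iterative scheme of the present method refines this estimate instead of applying it only once.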
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a stray light correction method for an optical remote sensing camera, with the aims of achieving simple and reliable stray light correction, improving the universality and portability of the correction method, and thereby providing a guarantee for accurate remote sensing data applications.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention discloses a stray light correction method of an optical remote sensing camera, which is characterized by comprising the following steps of:
step 1, building a stray light testing system;
the stray light testing system consists of an integrating sphere light source, a field diaphragm, an optical remote sensing camera, a two-dimensional turntable and a computer, where the optical remote sensing camera comprises a lens and a photoelectric detector and is placed in a darkroom environment; the optical remote sensing camera is mounted on the two-dimensional turntable;
the integrating sphere light source, the field diaphragm and the optical remote sensing camera are arranged in sequence in the darkroom environment, and the centers of the integrating sphere light source and the field diaphragm are coaxial with the optical axis of the optical remote sensing camera;
step 2, the light emitted by the integrating sphere light source passes through the field diaphragm and the lens in turn and reaches the photoelectric detector; the imaging target surface of the photoelectric detector is divided into M × N illumination areas; the field diaphragm size corresponding to each illumination area is determined by ray tracing; a short integration time t1 is set for each illumination area so that its signal value is not saturated, and several unsaturated images of each illumination area are captured by the optical remote sensing camera; a long integration time t2 is then set for each illumination area so that its signal value is saturated, and several saturated images of each illumination area are captured by the optical remote sensing camera;
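As a purely illustrative aid to the diaphragm-sizing step, the sketch below uses a simple pinhole (paraxial) approximation to relate the size of one illumination area on the detector to an approximate field diaphragm opening; the function name, parameters, and geometry are assumptions introduced here, not the patent's ray tracing procedure, which would normally be carried out in optical design software:

    import math

    def diaphragm_size_mm(region_pixels, pixel_pitch_mm, focal_length_mm,
                          diaphragm_distance_mm, margin=1.1):
        """Rough pinhole-model estimate (an assumption, not the patent's method):
        a detector sub-region of `region_pixels` pixels subtends a half-angle theta
        at the lens; the diaphragm opening at `diaphragm_distance_mm` in front of
        the lens should pass rays within that angle, plus a small margin."""
        half_extent_mm = 0.5 * region_pixels * pixel_pitch_mm    # half size of the region on the detector
        theta = math.atan2(half_extent_mm, focal_length_mm)      # half field angle of the region
        return 2.0 * diaphragm_distance_mm * math.tan(theta) * margin

    # Example with hypothetical numbers: a 94-pixel-wide region (1030 px / 11 areas),
    # an assumed 0.01 mm pixel pitch, the 5.54 mm focal length of Embodiment 1,
    # and a diaphragm placed 100 mm in front of the lens.
    print(diaphragm_size_mm(94, 0.01, 5.54, 100.0))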
step 3, collecting and preprocessing the image acquired by the optical remote sensing camera by the computer;
step 3.1, average the multiple unsaturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed unsaturated image of each illumination area, giving M × N preprocessed unsaturated images;
step 3.2, average the multiple saturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed saturated image of each illumination area, giving M × N preprocessed saturated images;
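A minimal sketch of the preprocessing in steps 3.1 and 3.2, assuming the repeated frames of one illumination area are available as a NumPy array and that the background frame, the nonlinearity correction, and the non-uniformity (flat-field) map are supplied externally; all names and the exact forms of the corrections are illustrative assumptions, not taken from the patent:

    import numpy as np

    def preprocess(images, background, nonlin, flat_field):
        """Average repeated frames, subtract the dark background, then apply
        nonlinearity and non-uniformity (flat-field) corrections.
        images     : (K, m, n) stack of repeated shots of one illumination area
        background : (m, n) dark/background frame
        nonlin     : callable mapping raw DN to linearized DN (assumed form)
        flat_field : (m, n) per-pixel relative gain map (assumed form)"""
        mean_img = images.mean(axis=0)      # average the repeated frames
        corrected = mean_img - background   # background subtraction
        corrected = nonlin(corrected)       # nonlinearity correction
        corrected = corrected / flat_field  # non-uniformity correction
        return corrected

    # One preprocessed image is produced per illumination area, giving M*N images in total.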
step 4, let the imaging target surface of the photoelectric detector have m rows and n columns of pixels, and define the response value of the pixel in row i, column j of the preprocessed unsaturated image of the q-th illumination area (the symbol is an equation image in the original publication and is not reproduced here); the true response value of the pixel in row i, column j of the preprocessed saturated image of the q-th illumination area is then obtained from formula (1), which is likewise an equation image in the original;
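Since formula (1) survives only as an equation image, the following is a plausible, hedged reading based on the short and long integration times t1 and t2 defined in step 2; it is not confirmed by the original figures, and the symbols D and S are notation introduced here for readability:

\[
S^{q}_{i,j} \;=\; D^{q}_{i,j}\,\frac{t_2}{t_1},
\]

where D^q_{i,j} is the preprocessed unsaturated (short-integration) response of the pixel in row i, column j for the q-th illumination area, and S^q_{i,j} is the true response that pixel would have at the long integration time t2 if it did not saturate. Such a rescaling would be needed chiefly for the pixels inside the illuminated area, whose long-integration values are clipped, while pixels outside the area are presumably read directly from the preprocessed saturated image.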
Step 5, determining a stray light influence factor of each pixel, and constructing a stray light distribution matrix;
step 5.1, let the response value of the q-th illumination area be defined (the symbol is an equation image in the original publication); the stray light influence factor of the q-th illumination area on the pixel in row i, column j of the other, non-illuminated areas is then obtained from formula (2), which is likewise an equation image in the original;
step 5.2, arrange the stray light influence factors of all pixels of the imaging target surface of the photoelectric detector (4) to form M × N stray light distribution matrices; the stray light distribution matrix of the q-th illumination area is recorded in the form shown by the matrix image in the original publication;
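Formula (2) and the matrix layout likewise survive only as equation images. A plausible, hedged reconstruction, consistent with the surrounding text and with the abstract (which states that the influence factor inside the illuminated area is set to zero) and written in notation introduced here, is:

\[
d^{q}_{i,j} \;=\; \frac{S^{q}_{i,j}}{S^{q}} \quad \text{for pixels } (i,j) \text{ outside the } q\text{-th illumination area},
\qquad
d^{q}_{i,j} \;=\; 0 \quad \text{inside it},
\]

where S^q_{i,j} is the stray light response measured at pixel (i, j) in the long-integration image and S^q is the (for example, summed) true response of the q-th illuminated area from step 4. The stray light distribution matrix of the q-th area would then be the m × n array of these factors, with one such matrix formed for each of the M × N areas.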
step 6, stray light correction;
step 6.1, capture an image of a real scene with the optical remote sensing camera and divide the real scene image into M × N areas; let the measured response of the real scene image be I_meas, which is composed of the true response of the real scene and the accumulated stray light contribution I_est;
step 6.2, define the iteration counter k and initialize k = 1;
step 6.3, take the measured response I_meas of the real scene image as the true response and obtain the k-th stray light estimate I_est,k from formula (3) (an equation image in the original, not reproduced here);
step 6.4, obtain the image response after the k-th stray light correction from formula (4):
I_cor,k = I_meas - I_est,k    (4)
step 6.5, obtain the (k+1)-th stray light estimate I_est,k+1 from formula (5) (an equation image in the original, not reproduced here);
step 6.6, judge whether I_est,k+1 - I_est,k is less than the threshold Δ; if so, output the stray-light-corrected image; otherwise assign k+1 to k and return to step 6.4; the defining expression for Δ is an equation image in the original and is not reproduced here.
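The iterative correction of step 6 can be summarized by the following Python sketch. Because formulas (3) and (5) are equation images in the original, the stray light estimate is assumed here to be the sum, over all areas, of each area's summed response multiplied by its distribution matrix; that assumption, the convergence norm, and all names are illustrative rather than the patent's exact formulation:

    import numpy as np

    def correct_stray_light(I_meas, D_list, region_masks, delta, max_iter=200):
        """Iteratively subtract the estimated stray light from a measured image.
        I_meas       : (m, n) preprocessed real-scene image
        D_list       : list of (m, n) stray light distribution matrices, one per area
        region_masks : list of (m, n) boolean masks, one per illumination area
        delta        : stopping threshold on the change of the stray light estimate"""
        def estimate(I):
            # Assumed form of formulas (3)/(5): each area's summed response
            # times its distribution matrix, accumulated over all areas.
            I_est = np.zeros_like(I, dtype=float)
            for D, mask in zip(D_list, region_masks):
                I_est += I[mask].sum() * D
            return I_est

        I_est = estimate(I_meas)                    # step 6.3: first estimate, from I_meas
        for _ in range(max_iter):
            I_cor = I_meas - I_est                  # step 6.4: formula (4)
            I_est_next = estimate(I_cor)            # step 6.5: re-estimate from the corrected image
            if np.abs(I_est_next - I_est).max() < delta:   # step 6.6: max-abs used as an illustrative norm
                return I_meas - I_est_next
            I_est = I_est_next
        return I_meas - I_est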
compared with the prior art, the invention has the beneficial effects that:
the method comprehensively utilizes technical means such as image processing, iterative optimization algorithm and the like, solves the problems of complex image data processing process of image deconvolution algorithm restoration, overcorrection of stray light by matrix method correction and the like, has strong universality and practicability, can reduce the technical requirements on test instruments and equipment in the stray light correction process, simultaneously ensures the test precision and saves the cost. The method is simple and reliable, and can meet the requirements of various remote sensing cameras on stray light correction.
Drawings
FIG. 1 is a schematic diagram of a stray light testing system according to the present invention;
FIG. 2 is a flow chart of a stray light correction method according to the present invention;
FIG. 3a is an image of the integrating sphere with a diameter of 1 m collected in the first embodiment;
FIG. 3b shows the stray-light-corrected image in the first embodiment;
FIG. 4a is a curve of the DN values of the imaging region in row 500 before and after stray light correction in the first embodiment;
FIG. 4b is a curve of the DN values of the non-imaging region in row 500 before and after stray light correction in the first embodiment;
FIG. 5a is an image of the 1.5 m aperture integrating sphere collected in the second embodiment;
FIG. 5b shows the stray-light-corrected image in the second embodiment;
FIG. 6a is a curve of the DN values of the imaging region in column 250 before and after stray light correction in the second embodiment;
fig. 6b is a curve of the DN values of the non-imaging region in column 250 before and after stray light correction in the second embodiment.
Detailed Description
Example 1
Stray light correction is carried out on an optical remote sensing camera carried by a certain satellite; the camera has a focal length of 5.54 mm, a field of view of 119°, and an operating wavelength range of 443-910 nm.
The specific steps of this embodiment are as follows, as shown in fig. 2:
step 1, building a stray light testing system, as shown in fig. 1:
the stray light testing system consists of an integrating sphere light source 1, a field diaphragm 2, an optical remote sensing camera, a two-dimensional turntable 5 and a computer 6, wherein the optical remote sensing camera comprises a lens 3 and a photoelectric detector 4 and is positioned in a darkroom environment; an optical remote sensing camera is arranged on the two-dimensional rotary table 5;
an integrating sphere light source 1, a field diaphragm 2 and an optical remote sensing camera are sequentially arranged in a darkroom environment, and the centers of the integrating sphere light source 1 and the field diaphragm 2 are coaxial with the optical axis of the optical remote sensing camera.
Step 2, the light emitted by the integrating sphere light source 1 passes through the field diaphragm 2 and the lens 3 in turn and reaches the photoelectric detector 4. Taking the stray light and noise levels into comprehensive account, the imaging target surface is divided into 11 × 11 illumination areas in this embodiment. The field diaphragm size corresponding to each illumination area is determined by ray tracing. A short integration time of 30 ms is set for each illumination area so that its signal value is not saturated, and 50 unsaturated images of each illumination area are captured by the optical remote sensing camera; a long integration time of 1500 ms is set for each illumination area so that its signal value saturates, and 50 saturated images of each illumination area are captured by the optical remote sensing camera;
step 3, the images acquired by the optical remote sensing camera are collected and preprocessed by the computer 6;
step 3.1, average the 50 unsaturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed unsaturated image of each illumination area, giving 11 × 11 preprocessed unsaturated images;
step 3.2, average the 50 saturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed saturated image of each illumination area, giving 11 × 11 preprocessed saturated images;
step 4, let the imaging target surface of the photoelectric detector 4 have 1030 rows and 1024 columns of pixels, and define the response value of the pixel in row i, column j of the preprocessed unsaturated image of the q-th illumination area (the symbol is an equation image in the original publication); the true response value of the pixel in row i, column j of the preprocessed saturated image of the q-th illumination area is then obtained from formula (1), which is likewise an equation image in the original;
Step 5, determining a stray light influence factor of each pixel, and constructing a stray light distribution matrix;
step 5.1, let the response value of the q-th illumination area be defined (the symbol is an equation image in the original publication); the stray light influence factor of the q-th illumination area on the pixel in row i, column j of the other, non-illuminated areas is then obtained from formula (2), which is likewise an equation image in the original;
step 5.2, arrange the stray light influence factors of all pixels of the imaging target surface of the photoelectric detector (4) to form 11 × 11 stray light distribution matrices; the stray light distribution matrix of the q-th illumination area is recorded in the form shown by the matrix image in the original publication;
step 6, stray light correction:
and 6.1, shooting an image of the integrating sphere with the aperture of 1m by using an optical remote sensing camera, and dividing the image of the integrating sphere into 11 multiplied by 11 areas as shown in figure 3 a. Let the measured response of the integrating sphere image be I meas And from the true response I of the integrating sphere image meas And stray light influence accumulation I est Composition is carried out;
step 6.2, defining and initializing the iteration number k as 1;
step 6.3, measuring response I of the integrating sphere image meas As a real response, obtaining a kth stray light estimation response I by using the formula (3) est,k
Figure BDA0003596682870000061
Step 6.4, obtaining image response I after the kth stray light correction by using the formula (4) cor,k
I cor,k =I meas -I est,k (4)
Step 6.5, obtaining the (k + 1) th stray light estimation response I by using the formula (5) est,k+1
Figure BDA0003596682870000062
Step 6.6, judgment I est,k+1 -I est,k If the value is less than delta, outputting an image after stray light correction if the value is less than delta; otherwise, after k +1 is assigned to k, returning to the step 6.4 for sequential execution; wherein Δ represents a threshold value, and
Figure BDA0003596682870000063
the embodiment runs a total of 100 iteration cycles, and outputs the flare corrected image, as shown in fig. 3 b. In order to compare the change of the image before and after correction, a DN value change curve before and after stray light correction of an imaging area and a DN value change curve before and after stray light correction of a non-imaging area in a 500 th row of the image are respectively shown in fig. 4a and 4 b. As can be seen from the figure, this embodiment corrects for the stray light effect of 96% on average.
Example 2
Stray light correction is carried out on an optical remote sensing camera carried by a certain satellite; the camera has a focal length of 4.8 mm, a field of view of 107°, and an operating waveband of 490-910 nm.
The specific steps of this example are as follows:
step 1, a stray light test system is set up, which is consistent with step 1 in embodiment 1.
Step 2, in this embodiment the imaging target surface is divided into 11 × 7 illumination areas, and the field diaphragm size corresponding to each illumination area is determined by ray tracing. A short integration time of 20 ms is set for each illumination area so that its signal value is not saturated, and 100 unsaturated images of each illumination area are captured by the optical remote sensing camera; a long integration time of 1500 ms is set for each illumination area so that its signal value saturates, and 100 saturated images of each illumination area are captured by the optical remote sensing camera;
step 3, collecting and preprocessing an image acquired by an optical remote sensing camera by a computer;
step 3.1, average the 100 unsaturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed unsaturated image of each illumination area, giving 11 × 7 preprocessed unsaturated images;
step 3.2, average the 100 saturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed saturated image of each illumination area, giving 11 × 7 preprocessed saturated images;
step 4, let the imaging target surface of the photoelectric detector have 1024 rows and 500 columns of pixels, and define the response value of the pixel in row i, column j of the preprocessed unsaturated image of the q-th illumination area (the symbol is an equation image in the original publication); the true response value of the pixel in row i, column j of the preprocessed saturated image of the q-th illumination area is then obtained from formula (1), which is likewise an equation image in the original;
Step 5, determining a stray light influence factor of each pixel, and constructing a stray light distribution matrix;
step 5.1, let the response value of the q-th illumination area be defined (the symbol is an equation image in the original publication); the stray light influence factor of the q-th illumination area on the pixel in row i, column j of the other, non-illuminated areas is then obtained from formula (2), which is likewise an equation image in the original;
step 5.2, arrange the stray light influence factors of all pixels of the imaging target surface of the photoelectric detector to form 11 × 7 stray light distribution matrices; the stray light distribution matrix of the q-th illumination area is recorded in the form shown by the matrix image in the original publication;
step 6, stray light correction:
and 6.1, shooting an image of the integrating sphere with the aperture of 1.5m by using an optical remote sensing camera, as shown in fig. 5a, and dividing the image of the integrating sphere into 11 multiplied by 7 areas. Let the measured response of the integrating sphere image be I meas And from the true response I of the integrating sphere image meas And stray light influence accumulation I est Composition is carried out;
step 6.2, defining and initializing the iteration number k as 1;
step 6.3, measuring response I of the integrating sphere image meas As a real response, obtaining a kth stray light estimation response I by using the formula (3) est,k
Figure BDA0003596682870000081
Step 6.4, obtaining image response I after the kth stray light correction by using the formula (4) cor,k
I cor,k =I meas -I est,k (4)
Step 6.5, obtaining the (k + 1) th stray light estimation response I by using the formula (5) est,k+1
Figure BDA0003596682870000082
Step 6.6, judgment I est,k+1 -I est,k If the value is less than delta, outputting an image after stray light correction if the value is less than delta; otherwise, after k +1 is assigned to k, returning to the step 6.4 for sequential execution; wherein Δ represents a threshold value, and
Figure BDA0003596682870000083
this embodiment runs a total of 150 iteration cycles and outputs a flare corrected image, as shown in fig. 5 b. In order to compare the change of the images before and after correction, a DN value change curve before and after stray light correction of an imaging area and a DN value change curve before and after stray light correction of a non-imaging area in a 250 th column of the images are respectively shown in fig. 6a and 6 b. As can be seen from the figure, this embodiment corrects for the stray light effect of 92% on average.

Claims (1)

1. A stray light correction method of an optical remote sensing camera is characterized by comprising the following steps:
step 1, building a stray light testing system;
the stray light testing system is composed of an integrating sphere light source (1), a field diaphragm (2), an optical remote sensing camera, a two-dimensional turntable (5) and a computer (6), where the optical remote sensing camera comprises a lens (3) and a photoelectric detector (4) and is placed in a darkroom environment; the optical remote sensing camera is mounted on the two-dimensional turntable (5);
the integrating sphere light source (1), the field diaphragm (2) and the optical remote sensing camera are arranged in sequence in the darkroom environment, and the centers of the integrating sphere light source (1) and the field diaphragm (2) are coaxial with the optical axis of the optical remote sensing camera;
step 2, the light emitted by the integrating sphere light source (1) passes through the field diaphragm (2) and the lens (3) in turn and reaches the photoelectric detector (4); the imaging target surface of the photoelectric detector (4) is divided into M × N illumination areas; the field diaphragm size corresponding to each illumination area is determined by ray tracing; a short integration time t1 is set for each illumination area so that its signal value is not saturated, and several unsaturated images of each illumination area are captured by the optical remote sensing camera; a long integration time t2 is then set for each illumination area so that its signal value is saturated, and several saturated images of each illumination area are captured by the optical remote sensing camera;
step 3, collecting and preprocessing the image acquired by the optical remote sensing camera by the computer (6);
step 3.1, average the multiple unsaturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed unsaturated image of each illumination area, giving M × N preprocessed unsaturated images;
step 3.2, average the multiple saturated images of each illumination area, then perform background subtraction, and then perform nonlinearity and non-uniformity correction to obtain a preprocessed saturated image of each illumination area, giving M × N preprocessed saturated images;
step 4, let the imaging target surface of the photoelectric detector (4) have m rows and n columns of pixels, and define the response value of the pixel in row i, column j of the preprocessed unsaturated image of the q-th illumination area (the symbol is an equation image in the original publication); the true response value of the pixel in row i, column j of the preprocessed saturated image of the q-th illumination area is then obtained from formula (1), which is likewise an equation image in the original;
Step 5, determining a stray light influence factor of each pixel, and constructing a stray light distribution matrix;
step 5.1, let the response value of the q-th illumination area be defined (the symbol is an equation image in the original publication); the stray light influence factor of the q-th illumination area on the pixel in row i, column j of the other, non-illuminated areas is then obtained from formula (2), which is likewise an equation image in the original;
step 5.2, arrange the stray light influence factors of all pixels of the imaging target surface of the photoelectric detector (4) to form M × N stray light distribution matrices; the stray light distribution matrix of the q-th illumination area is recorded in the form shown by the matrix image in the original publication;
step 6, stray light correction;
step 6.1, capture an image of a real scene with the optical remote sensing camera and divide the real scene image into M × N areas; let the measured response of the real scene image be I_meas, which is composed of the true response of the real scene and the accumulated stray light contribution I_est;
step 6.2, define the iteration counter k and initialize k = 1;
step 6.3, take the measured response I_meas of the real scene image as the true response and obtain the k-th stray light estimate I_est,k from formula (3) (an equation image in the original, not reproduced here);
step 6.4, obtain the image response after the k-th stray light correction from formula (4):
I_cor,k = I_meas - I_est,k    (4)
step 6.5, obtain the (k+1)-th stray light estimate I_est,k+1 from formula (5) (an equation image in the original, not reproduced here);
step 6.6, judge whether I_est,k+1 - I_est,k is less than the threshold Δ; if so, output the stray-light-corrected image; otherwise assign k+1 to k and return to step 6.4; the defining expression for Δ is an equation image in the original and is not reproduced here.
CN202210394149.7A (priority date 2022-04-14; filing date 2022-04-14) Stray light correction method of optical remote sensing camera; status: Pending; publication: CN114813049A (en)

Priority Applications (1)

Application Number: CN202210394149.7A; Publication: CN114813049A (en); Title: Stray light correction method of optical remote sensing camera

Applications Claiming Priority (1)

Application Number: CN202210394149.7A; Publication: CN114813049A (en); Title: Stray light correction method of optical remote sensing camera

Publications (1)

Publication Number: CN114813049A; Publication Date: 2022-07-29

Family

ID=82536673

Family Applications (1)

Application Number: CN202210394149.7A; Status: Pending; Publication: CN114813049A (en); Title: Stray light correction method of optical remote sensing camera

Country Status (1)

Country Link
CN (1) CN114813049A (en)


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination