CN109215005B - Image fusion error correction method for scanning camera of delta-shaped detector - Google Patents

Image fusion error correction method for scanning camera of delta-shaped detector

Info

Publication number
CN109215005B
CN109215005B CN201810995316.7A CN201810995316A CN109215005B CN 109215005 B CN109215005 B CN 109215005B CN 201810995316 A CN201810995316 A CN 201810995316A CN 109215005 B CN109215005 B CN 109215005B
Authority
CN
China
Prior art keywords
row
detectors
interval
integration time
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810995316.7A
Other languages
Chinese (zh)
Other versions
CN109215005A (en)
Inventor
杨天远
周峰
行麦玲
杨小乐
刘义良
裴景洋
凌龙
童锡良
余恭敏
高有道
马思宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Space Research Mechanical and Electricity
Original Assignee
Beijing Institute of Space Research Mechanical and Electricity
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Space Research Mechanical and Electricity filed Critical Beijing Institute of Space Research Mechanical and Electricity
Priority to CN201810995316.7A priority Critical patent/CN109215005B/en
Publication of CN109215005A publication Critical patent/CN109215005A/en
Application granted granted Critical
Publication of CN109215005B publication Critical patent/CN109215005B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20221 - Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A method for correcting image fusion errors of a scanning camera with a delta-shaped detector comprises the following steps: 1) obtaining, from the camera's angle measuring equipment, the angle measurement values of the front and rear detector rows of the delta-shaped detector array after each integration time; 2) obtaining the relative angle measurement values of the second row of detectors; 3) positioning the relative angle measurement values of the second row on the first row's angle measurement sequence in time order; 4) obtaining the rotation-angle intervals and the starting and ending angles of each interval; 5) matching the rotation-angle intervals in time order to obtain, for each integration time, the match between the first-row interval and the second-row interval; 6) calculating an image-correction scale factor used to correct the image gray values; 7) correcting the gray values of the second-row image; 8) carrying out image fusion again on the processed second-row images according to the positioning result to obtain the final fused image.

Description

Image fusion error correction method for scanning camera of delta-shaped detector
Technical Field
The invention belongs to the technical field of aerospace optical remote sensing, and relates to a method for correcting image fusion errors of a scanning camera of a delta-shaped detector.
Background
The delta-shaped detector array is widely applied in space-based push-broom and scanning optical systems. The output image of a delta-shaped detector array is the fusion of the output images of its front and rear detector rows. For a space scanning camera using a delta-shaped detector array, the relation between scan-mirror angle and time is not perfectly linear, so fusion errors arise when the output images are fused according to the design method. The error mainly manifests as 'misalignment': the object-space coordinates of the odd and even image columns disagree, which seriously degrades the quality of the fused image.
Traditional image fusion correction methods have two problems: first, the fusion error can only be reduced to the pixel level, and sub-pixel errors cannot be eliminated; second, the fusion error may differ between image lines, so if a uniform offset is applied, some pixels are corrected well while others are corrected poorly.
Disclosure of Invention
The technical problem solved by the invention is as follows: overcoming the defects of the prior art, the method corrects image fusion errors of a delta-shaped-detector scanning camera down to the sub-pixel level, and, because different correction coefficients are used at pixels in different positions, it has a certain adaptive capacity.
The technical solution of the invention is as follows: a method for correcting image fusion errors of a scanning camera with a delta-shaped detector comprises the following steps:
1) obtaining, from the camera's angle measuring equipment, the angle measurement values of the front and rear detector rows of the delta-shaped detector array after each integration time;
2) obtaining the relative angle measurement values of the second row of detectors;
3) positioning the relative angle measurement values of the second row on the first row's angle measurement sequence in time order;
4) obtaining the rotation-angle intervals and the starting and ending angles corresponding to each interval;
5) matching the rotation-angle intervals in time order to obtain a matching result: according to the positioning result of step 3), determining the match between the rotation-angle interval of each integration time of the first row and that of the second row;
6) calculating, from the matching result of step 5), an image-correction scale factor used to correct the image gray values;
7) correcting the gray values of the second-row image;
8) carrying out image fusion again on the second-row images processed in step 7), according to the positioning result of step 3), to obtain the final fused image.
The specific method for obtaining the relative angle measurement values of the second row of detectors in step 2) is as follows: from the focal length of the scanning camera and the physical spacing between the first and second rows of detectors, the theoretical object-space rotation angle between the two row positions is obtained; half of this angle is subtracted from each angle measurement value of the second row, and the resulting difference is called the relative angle measurement value of the second row of detectors.
The specific process of step 4) is as follows: differences are taken over the angle measurement values of the first row of detectors and over the relative angle measurement values of the second row of detectors, yielding the rotation-angle interval corresponding to each integration time of each row; the two angle measurement values bounding each interval are defined as its starting and ending rotation angles.
In step 5), for each integration time, the matching result falls into one of four cases:
A: the starting angle of the second row's interval is greater than the starting angle of the first row's interval, and the ending angle of the second row's interval is less than the ending angle of the first row's interval;
B: the starting angle of the second row's interval is greater than the starting angle of the first row's interval, and the ending angle of the second row's interval is greater than the ending angle of the first row's interval;
C: the starting angle of the second row's interval is less than the starting angle of the first row's interval, and the ending angle of the second row's interval is less than the ending angle of the first row's interval;
D: the starting angle of the second row's interval is less than the starting angle of the first row's interval, and the ending angle of the second row's interval is greater than the ending angle of the first row's interval.
The principle of correcting the gray values of the second-row image in step 7) is as follows: rows of the image run along the detector direction and columns run perpendicular to it; the image is corrected column by column; according to the imaging information, each pixel in an image column maps to one rotation-angle interval of the detector.
The specific process of calculating the image-correction scale factor from the rotation-angle intervals in step 6) is as follows: for cases A and D, the scale factor is 1; for cases B and C, the scale factor is the absolute value of the difference between the starting angles of the second-row and first-row intervals, divided by the width of the second-row interval.
The specific process of correcting the gray values of the second-row image in step 7) is as follows: for cases A and D, the gray value of the image pixel is kept unchanged;
for case B, the pixel gray value is:
(1 - the scale factor of this pixel) x the gray value of this pixel + the scale factor of the next pixel x the gray value of the next pixel;
for case C, the pixel gray value is:
the scale factor of this pixel x the gray value of this pixel + (1 - the scale factor of the previous pixel) x the gray value of the previous pixel.
Compared with the prior art, the invention has the following advantages:
Aiming at the misalignment of odd and even columns in the fused image of a delta-shaped-detector scanning camera, an image fusion error correction method based on the scan-mirror rotation-angle variation is provided. Compared with traditional methods:
1. The invention corrects image fusion errors produced by the nonlinear motion of the scanning mirror. Besides pixel-level errors, sub-pixel errors can also be corrected.
2. The invention applies different correction coefficients to pixels at different positions, so it is adaptive to some degree; the correction accuracy is higher, and the image modulation transfer function and image quality are markedly improved.
Drawings
FIG. 1 is a schematic diagram of pixel arrangement of a delta-shaped array detector.
Fig. 2 is a schematic diagram of corner matching of two rows of detectors of a delta-shaped array detector.
Fig. 3 is a schematic diagram of a fused image of a delta-shaped array detector.
Detailed Description
The invention will be further explained with reference to the drawings.
A method for correcting image fusion errors of a scanning camera with a delta-shaped detector comprises the following steps:
1) obtaining, from the camera's angle measuring equipment, the angle measurement values of the front and rear detector rows of the delta-shaped detector array after each integration time;
2) obtaining the relative angle measurement values of the second row of detectors;
3) positioning the relative angle measurement values of the second row on the first row's angle measurement sequence in time order;
4) obtaining the rotation-angle intervals and the starting and ending angles corresponding to each interval;
5) matching the rotation-angle intervals in time order to obtain a matching result: according to the positioning result of step 3), determining the match between the rotation-angle interval of each integration time of the first row and that of the second row;
6) calculating, from the matching result of step 5), an image-correction scale factor used to correct the image gray values;
7) correcting the gray values of the second-row image;
8) carrying out image fusion again on the second-row images processed in step 7), according to the positioning result of step 3), to obtain the final fused image.
The specific method for obtaining the relative angle measurement values of the second row of detectors in step 2) is as follows: from the focal length of the scanning camera and the physical spacing between the first and second rows of detectors, the theoretical object-space rotation angle between the two row positions is obtained; half of this angle is subtracted from each angle measurement value of the second row, and the resulting difference is called the relative angle measurement value of the second row of detectors.
The specific process of step 4) is as follows: differences are taken over the angle measurement values of the first row of detectors and over the relative angle measurement values of the second row of detectors, yielding the rotation-angle interval corresponding to each integration time of each row; the two angle measurement values bounding each interval are defined as its starting and ending rotation angles.
In step 5), for each integration time, the matching result falls into one of four cases:
A: the starting angle of the second row's interval is greater than the starting angle of the first row's interval, and the ending angle of the second row's interval is less than the ending angle of the first row's interval;
B: the starting angle of the second row's interval is greater than the starting angle of the first row's interval, and the ending angle of the second row's interval is greater than the ending angle of the first row's interval;
C: the starting angle of the second row's interval is less than the starting angle of the first row's interval, and the ending angle of the second row's interval is less than the ending angle of the first row's interval;
D: the starting angle of the second row's interval is less than the starting angle of the first row's interval, and the ending angle of the second row's interval is greater than the ending angle of the first row's interval.
The principle of correcting the gray values of the second-row image in step 7) is as follows: rows of the image run along the detector direction and columns run perpendicular to it; the image is corrected column by column; according to the imaging information, each pixel in an image column maps to one rotation-angle interval of the detector.
The specific process of calculating the image-correction scale factor from the rotation-angle intervals in step 6) is as follows: for cases A and D, the scale factor is 1; for cases B and C, the scale factor is the absolute value of the difference between the starting angles of the second-row and first-row intervals, divided by the width of the second-row interval.
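The scale-factor rule above can be sketched as a small helper. This is only an illustration: the function name, argument layout, and case labels as string inputs are assumptions, not part of the patent.

```python
def scale_factor(case, a_start, b_start, delta_b):
    """Image-correction scale factor for one integration time.

    case    : matching case "A".."D" from step 5)
    a_start : starting angle of the first-row interval
    b_start : starting angle of the second-row interval
    delta_b : width of the second-row rotation-angle interval
    """
    if case in ("A", "D"):
        # Nested/containing intervals: no rescaling needed.
        return 1.0
    # Cases B and C: |difference of starting angles| / interval width.
    return abs(b_start - a_start) / delta_b
```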
The specific process of correcting the gray values of the second-row image in step 7) is as follows: for cases A and D, the gray value of the image pixel is kept unchanged;
for case B, the pixel gray value is:
(1 - the scale factor of this pixel) x the gray value of this pixel + the scale factor of the next pixel x the gray value of the next pixel;
for case C, the pixel gray value is:
the scale factor of this pixel x the gray value of this pixel + (1 - the scale factor of the previous pixel) x the gray value of the previous pixel.
The delta-shaped array detector is widely used in scanning space-based optical payloads. It uses two rows of devices arranged with staggered pixels; when the images output by the two rows are fused, fusion errors are introduced. The detector array is shown in figure 1: A_1, ..., A_m are the detector elements of the first row, B_1, ..., B_m are the detector elements of the second row, and the mounting distance between the two rows is D. The invention corrects the fusion error of the delta-shaped-detector scanning camera; the specific implementation is as follows:
and according to angle measuring equipment of the camera, angle measuring values of front and rear rows of detectors of the delta-shaped detector array after each period of integration time are obtained. And according to the focal length of the scanning camera and the physical interval of the first and second rows of detectors, the object space rotation angle corresponding to the theoretical first and second rows of positions is obtained. Subtracting half of the object space rotation angle corresponding to the theoretical first and second rows of positions from the angle measurement value of the second row of detectors to obtain a 'relative angle measurement value' of the second row of detectors. The obtained angle measurement value of the first row of detectors is a1,a2…anThe relative angle value of the second row of detectors is b1,b2…bn
The relative angle measurement values of the second row are positioned on the first row's angle measurement sequence in time order. Differencing the angle measurement values of the first row and the 'relative angle measurement values' of the second row yields the rotation-angle interval corresponding to each integration time of both rows. The intervals of the first row are denoted Δa_1, Δa_2, ..., Δa_n, computed as Δa_i = a_{i+1} - a_i, where a_i and a_{i+1} are the starting and ending rotation angles of Δa_i. The intervals of the second row are Δb_1, Δb_2, ..., Δb_n, computed as Δb_i = b_{i+1} - b_i, where b_i and b_{i+1} are the starting and ending rotation angles of Δb_i.
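The interval construction above is a simple difference of consecutive angle measurements; a minimal sketch (function name assumed):

```python
import numpy as np

def corner_intervals(angles):
    """Rotation-angle intervals per integration time.

    angles : a_1..a_{n+1} (or relative values b_1..b_{n+1})
    Returns (widths, starts, ends), where widths[i] = angles[i+1] - angles[i]
    and starts[i], ends[i] are the bounding angle measurements.
    """
    a = np.asarray(angles, dtype=float)
    return a[1:] - a[:-1], a[:-1], a[1:]
```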
The rotation-angle interval corresponding to each integration time of the first row is then matched to the corresponding interval of the second row. Taking the angle measurement value a_i as an example, the matching result has four cases, as shown in figure 2:
1) b_i > a_i and b_{i+1} < a_{i+1};
2) b_i > a_i and b_{i+1} > a_{i+1};
3) b_i < a_i and b_{i+1} < a_{i+1};
4) b_i < a_i and b_{i+1} > a_{i+1}.
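The four cases can be classified mechanically; in this sketch exact ties (b_i = a_i or b_{i+1} = a_{i+1}) are folded into cases 1/3, which is a simplifying assumption since the patent uses strict inequalities.

```python
def match_case(a_i, a_i1, b_i, b_i1):
    """Classify the interval match for one integration time.

    a_i, a_i1 : start/end angles of the first-row interval
    b_i, b_i1 : start/end angles of the second-row interval
    Returns 1..4 following the four cases above.
    """
    if b_i > a_i:
        return 1 if b_i1 <= a_i1 else 2
    return 3 if b_i1 <= a_i1 else 4
```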
Taking detector element B_1 as an example, Δb_i corresponds to the gray value B_1(i) in the image. Correcting the image fusion error is the process of correcting these gray values. Let the corrected gray value be B'_1(i). For the four cases above (compare figure 2), B'_1(i) is calculated as follows:
1) when b_i > a_i and b_{i+1} < a_{i+1}: B'_1(i) = B_1(i);
2) when b_i > a_i and b_{i+1} > a_{i+1}: B'_1(i) = ((b_i - a_i)/Δb_i) x B_1(i) + (1 - (b_{i+1} - a_{i+1})/Δb_{i+1}) x B_1(i+1);
3) when b_i < a_i and b_{i+1} < a_{i+1}: B'_1(i) = (1 - (a_i - b_i)/Δb_{i+1}) x B_1(i+1) + ((a_{i+1} - b_{i+1})/Δb_{i+2}) x B_1(i+2);
4) when b_i < a_i and b_{i+1} > a_{i+1}: B'_1(i) = B_1(i).
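Putting the four formulas together, a literal, unoptimized sketch of the per-column correction could look like this. The function name, the array-length convention (n+1 angle values for n pixels), and the boundary handling (pixels whose formula would index past the column are left unchanged) are assumptions, as the patent does not specify them.

```python
import numpy as np

def correct_second_row(B, a, b):
    """Correct one image column of the second row per the four cases.

    B : gray values B_1(1..n) of the column
    a : first-row angle measurements a_1..a_{n+1}
    b : second-row relative angle measurements b_1..b_{n+1}
    """
    B = np.asarray(B, dtype=float)
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    db = b[1:] - b[:-1]                  # interval widths Δb_i
    out = B.copy()
    for i in range(min(len(B), len(a) - 1)):
        bi, bi1, ai, ai1 = b[i], b[i + 1], a[i], a[i + 1]
        if bi > ai and bi1 > ai1 and i + 1 < len(B):      # case 2
            out[i] = ((bi - ai) / db[i]) * B[i] \
                     + (1 - (bi1 - ai1) / db[i + 1]) * B[i + 1]
        elif bi < ai and bi1 < ai1 and i + 2 < len(B):    # case 3
            out[i] = (1 - (ai - bi) / db[i + 1]) * B[i + 1] \
                     + ((ai1 - bi1) / db[i + 2]) * B[i + 2]
        # cases 1 and 4: gray value kept unchanged
    return out
```

For example, with a constant forward shift of half an interval (case 2 everywhere), each corrected pixel becomes the midpoint of two neighbors.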
After the image gray values are corrected, image fusion is performed; the fusion result is shown in figure 3.
Verification shows that the accuracy of the fusion error correction of the invention is about 1/5 pixel, far better than the 1-pixel accuracy of common methods.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (6)

1. A method for correcting image fusion errors of a scanning camera of a delta-shaped detector is characterized by comprising the following steps:
1) according to angle measuring equipment of a camera, angle measuring values of front and rear rows of detectors of the delta-shaped detector array after each period of integration time are obtained;
2) obtaining a relative angle measurement value of a second row of detectors;
3) positioning the relative angle measurement values of the second row of detectors on the first row of angle measurement value sequence is completed according to the time sequence;
4) obtaining the corner interval and the initial corner and the final corner corresponding to the corner interval;
the specific process of the step 4) is as follows: the angle measurement value of the first row of detectors and the relative angle measurement value of the second row of detectors are subjected to difference, a corner interval corresponding to each integral time of the two rows of detectors is obtained, and two angle measurement values corresponding to each corner interval are defined as an initial corner and a final corner of the corner interval;
5) matching the corner intervals according to the time sequence to obtain a matching result;
the specific process is as follows: determining a matching result of the rotation angle interval corresponding to each integration time of the first row of detectors and the rotation angle interval corresponding to each integration time of the second row of detectors according to the positioning result of the step 3);
6) calculating to obtain an image correction scale factor for correcting the gray value of the image according to the corner interval matching result obtained in the step 5);
7) correcting the gray value of the image of the second row of detectors;
8) carrying out image fusion again on the images of the second row of detectors processed in step 7), according to the positioning result of step 3), to obtain the final fused image.
2. The method for correcting the image fusion error of the delta-shaped detector scanning camera according to claim 1, characterized in that: the specific method for obtaining the relative angle measurement value of the second row of detectors in the step 2) is as follows: according to the focal length of the scanning camera and the physical interval of the first and second rows of detectors, the object space rotation angle corresponding to the theoretical first and second rows of positions is obtained; and subtracting half of the object space rotation angle corresponding to the theoretical first and second rows of positions from the angle measurement value of the second row of detectors, and calling the obtained difference value as the relative angle measurement value of the second row of detectors.
3. The method for correcting the image fusion error of the delta-shaped detector scanning camera according to claim 1, characterized in that: in the step 5), for each integration time, the matching result has four conditions:
a: the starting rotation angle corresponding to the interval of the internal rotation angles of the second row of detectors in the integration time is larger than the starting rotation angle corresponding to the interval of the internal rotation angles of the first row of detectors in the integration time, and the ending rotation angle corresponding to the interval of the internal rotation angles of the second row of detectors in the integration time is smaller than the ending rotation angle corresponding to the interval of the internal rotation angles of the first row of detectors in the integration time;
b: the starting rotation angle corresponding to the interval of the internal rotation angles of the second row of detectors in the integration time is larger than the starting rotation angle corresponding to the interval of the internal rotation angles of the first row of detectors in the integration time, and the ending rotation angle corresponding to the interval of the internal rotation angles of the second row of detectors in the integration time is larger than the ending rotation angle corresponding to the interval of the internal rotation angles of the first row of detectors in the integration time;
c: the starting rotation angle corresponding to the interval of the internal rotation angles of the second row of detectors in the integration time is smaller than the starting rotation angle corresponding to the interval of the internal rotation angles of the first row of detectors in the integration time, and the ending rotation angle corresponding to the interval of the internal rotation angles of the second row of detectors in the integration time is smaller than the ending rotation angle corresponding to the interval of the internal rotation angles of the first row of detectors in the integration time;
d: the start angle corresponding to the integration time internal angle interval of the second row of detectors is smaller than the start angle corresponding to the integration time internal angle interval of the first row of detectors, and the end angle corresponding to the integration time internal angle interval of the second row of detectors is larger than the end angle corresponding to the integration time internal angle interval of the first row of detectors.
4. The method for correcting the image fusion error of the delta-shaped detector scanning camera according to claim 1, characterized in that: the principle of correcting the gray value of the second row of detector images in the step 7) is as follows: defining a row which is an image along the direction of a detector in the image and a column which is an image vertical to the direction of the detector; correcting the image according to the image column; each pixel in the image column is mapped to a corner interval of the detector in accordance with the imaging information.
5. The method for correcting the image fusion error of the delta-shaped detector scanning camera according to claim 3, characterized in that: the specific process of calculating and obtaining the image correction scale factor according to the corner interval in the step 6) is as follows: for both cases A, D, the scale factor has a value of 1; for the two cases of B, C, the scale factor calculation method is: the absolute value of the difference between the starting angle corresponding to the second row of detector integration time internal angle intervals and the starting angle corresponding to the first row of detector integration time internal angle intervals is divided by the second row of detector integration time internal angle intervals.
6. The method for correcting the image fusion error of the delta-shaped detector scanning camera according to claim 5, characterized in that: the specific process of correcting the gray values of the second-row detector image in step 7) is as follows: for the two cases A and D, the gray value of the image pixel is kept unchanged;
for case B, the corrected pixel gray value is:
(1 - the scale factor of this pixel) x the gray value of this pixel + the scale factor of the next pixel x the gray value of the next pixel;
for case C, the corrected pixel gray value is:
the scale factor of this pixel x the gray value of this pixel + (1 - the scale factor of the previous pixel) x the gray value of the previous pixel.
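The per-column correction of claim 6 can be sketched as below. This is a hypothetical illustration, not the patent's implementation; the function and variable names are invented here, and boundary pixels without a next/previous neighbor are simply left unchanged (the claim does not specify this edge case):

```python
def correct_column(gray, cases, factors):
    """Illustrative gray-value correction of one image column (claim 6).

    gray    -- pixel gray values along the column (perpendicular to the detectors)
    cases   -- per-pixel interval relation, each one of 'A', 'B', 'C', 'D'
    factors -- per-pixel scale factors computed per claim 5
    Returns the corrected gray values as a new list.
    """
    out = list(gray)
    for i, case in enumerate(cases):
        if case in ('A', 'D'):
            continue  # cases A and D: gray value kept unchanged
        if case == 'B' and i + 1 < len(gray):
            # blend with the next pixel
            out[i] = (1 - factors[i]) * gray[i] + factors[i + 1] * gray[i + 1]
        elif case == 'C' and i - 1 >= 0:
            # blend with the previous pixel
            out[i] = factors[i] * gray[i] + (1 - factors[i - 1]) * gray[i - 1]
    return out
```

Note the asymmetry in the claim: case B weights this pixel by (1 - its own factor) but the next pixel by the next pixel's factor, while case C weights this pixel by its own factor and the previous pixel by (1 - the previous pixel's factor).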
CN201810995316.7A 2018-08-29 2018-08-29 Image fusion error correction method for scanning camera of delta-shaped detector Active CN109215005B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810995316.7A CN109215005B (en) 2018-08-29 2018-08-29 Image fusion error correction method for scanning camera of delta-shaped detector

Publications (2)

Publication Number Publication Date
CN109215005A CN109215005A (en) 2019-01-15
CN109215005B true CN109215005B (en) 2021-10-01

Family

ID=64985211

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810995316.7A Active CN109215005B (en) 2018-08-29 2018-08-29 Image fusion error correction method for scanning camera of delta-shaped detector

Country Status (1)

Country Link
CN (1) CN109215005B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102068277A (en) * 2010-12-14 2011-05-25 哈尔滨工业大学 Method and device for observing photoacoustic imaging in single-array element and multi-angle mode based on compressive sensing
CN103247029A (en) * 2013-03-26 2013-08-14 中国科学院上海技术物理研究所 Geometric registration method for hyperspectral image generated by spliced detectors
JP2015224928A (en) * 2014-05-27 2015-12-14 株式会社デンソー Target detector
CN107209944A (en) * 2014-08-16 2017-09-26 Fei公司 The correction of beam hardening pseudomorphism in the sample microtomography being imaged in a reservoir

Also Published As

Publication number Publication date
CN109215005A (en) 2019-01-15

Similar Documents

Publication Publication Date Title
CN109767476B (en) Automatic focusing binocular camera calibration and depth calculation method
CN107255521B (en) A kind of Infrared Image Non-uniformity Correction method and system
TWI511086B (en) Lens distortion calibration method
CN109903352A (en) A kind of seamless orthography production method in the big region of satellite remote-sensing image
CN108830889B (en) Global geometric constraint-based remote sensing image and reference image matching method
CN109685744B (en) Scanning galvanometer precision correction method
CN109345467A (en) Image deformation bearing calibration, device, computer equipment and storage medium
Dufour et al. Integrated digital image correlation for the evaluation and correction of optical distortions
CN110986998B (en) Satellite video camera on-orbit geometric calibration method based on rational function model
CN111340888B (en) Light field camera calibration method and system without white image
CN111047586B (en) Pixel equivalent measuring method based on machine vision
CN108921797B (en) Method for calibrating distorted image
CN115311365B (en) High-precision on-orbit geometric positioning method for long-line-column swing scanning camera
CN105869129B (en) For the thermal infrared images residue non-uniform noise minimizing technology after nonuniformity correction
CN109215005B (en) Image fusion error correction method for scanning camera of delta-shaped detector
CN107240077B (en) Visual measurement method based on elliptic conformation deviation iterative correction
CN110751601A (en) Distortion correction method based on RC optical system
CN113034591B (en) Tooth-shaped structure assembly-oriented addendum circle extraction algorithm
CN110298890B (en) Light field camera calibration method based on Planck parameterization
CN110490941B (en) Telecentric lens external parameter calibration method based on normal vector
CN110020997B (en) Image distortion correction method, image restoration method and alignment method
CN116777769A (en) Method and device for correcting distorted image, electronic equipment and storage medium
CN115880369A (en) Device, system and method for jointly calibrating line structured light 3D camera and line array camera
CN112927299B (en) Calibration method and device and electronic equipment
CN113762098A (en) Satellite remote sensing image matching method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant