CN114820376A - Fusion correction method and device for stripe noise, electronic equipment and storage medium - Google Patents

Fusion correction method and device for stripe noise, electronic equipment and storage medium

Info

Publication number
CN114820376A
Authority
CN
China
Prior art keywords
pixel
correction
image
value
remote sensing
Prior art date
Legal status
Pending
Application number
CN202210465517.2A
Other languages
Chinese (zh)
Inventor
张延坤
孙景旭
谢虹波
李淑贤
杨兴林
任建岳
Current Assignee
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date
Filing date
Publication date
Application filed by Ji Hua Laboratory
Priority to CN202210465517.2A
Publication of CN114820376A

Classifications

    • G06T5/70 Image enhancement or restoration: Denoising; Smoothing
    • G06T5/40 Image enhancement or restoration using histogram techniques
    • G06T2207/10032 Image acquisition modality: Satellite or aerial image; Remote sensing
    • G06T2207/30168 Subject of image: Image quality inspection


Abstract

The application relates to the technical field of remote sensing image processing and provides a fusion correction method and device for stripe noise, an electronic device and a storage medium. The method acquires a remote sensing image shot by a camera; performs a primary correction on each pixel of the remote sensing image by a radiation response uniformity correction method to obtain a first corrected image; and performs a secondary correction on each pixel of the first corrected image based on an image processing method to obtain a second corrected image in which the stripe noise is completely eliminated. By combining the radiation response uniformity correction with the image-processing-based noise correction, the stripe noise is effectively eliminated while the image blurring, ringing effect, blocking effect and other problems caused by complex algorithms are avoided, so that the quality of the remote sensing image is improved and its performance is enhanced.

Description

Stripe noise fusion correction method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of remote sensing image processing, in particular to a method and a device for fusion correction of stripe noise, electronic equipment and a storage medium.
Background
With the rapid development of remote sensing technology, spatial remote sensing images are widely applied to multiple fields of military reconnaissance, disaster monitoring, weather prediction and the like. The remote sensing image is easily interfered by stripe noise due to the influence of factors such as the nonuniformity of radiation response of the sensor, the distortion of an optical mechanical system, the change of an external environment and the like. The stripe noise covers the spatial distribution characteristics of the target object and generates a corresponding pseudo structure, so that the quality of the image is influenced, the subsequent inversion analysis and information extraction are also influenced, and the performance of the remote sensing image is finally reduced.
Current methods for removing stripe noise can be broadly divided into three categories: filtering-based methods, statistics-based methods and optimization-based methods. Filtering-based methods, which mainly include wavelet analysis, Fourier-domain filters and combined filters, perform well in removing periodic stripes but are prone to blurring or ringing effects. Statistics-based methods mainly include moment matching, histogram matching and their improved variants; their results largely depend on a pre-established reference moment or histogram, which is difficult to find in an actually acquired image, so their practicality is poor. Optimization-based methods usually introduce prior knowledge of the stripes and of the clean image into an energy function and generally achieve better results, but blocking effects are easily produced when the stripe density is high, especially for aperiodic stripes.
Based on the above problems, no effective solution exists at present.
Disclosure of Invention
The application aims to provide a method and a device for fusion correction of stripe noise, electronic equipment and a storage medium, which can effectively eliminate the stripe noise of a remote sensing image and simultaneously avoid the problems of blurring, ringing effect and blocking effect of the image.
In a first aspect, the present application provides a method for fusion correction of stripe noise, including the following steps:
s1, acquiring a remote sensing image shot by a camera;
s2, performing primary correction on each pixel of the remote sensing image by using a radiation response uniformity correction method to obtain a first corrected image;
and S3, performing secondary correction on each pixel of the first correction image based on an image processing method to obtain a second correction image completely eliminating stripe noise.
According to the fusion correction method for the stripe noise, a remote sensing image shot by a camera is collected; correcting each pixel of the remote sensing image once by using a radiation response uniformity correction method to obtain a first corrected image; and performing secondary correction on each pixel of the first correction image based on the image processing method to obtain a second correction image completely eliminating the stripe noise. By combining the radiation response uniformity correction with the noise correction based on image processing, the stripe noise is effectively eliminated, and the problems of image blurring, ringing response, blocking effect and the like caused by a complex algorithm are avoided, so that the quality of the remote sensing image is improved, and the performance of the remote sensing image is improved.
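Purely as an illustrative sketch, and not as part of the claimed subject matter, the S1-S3 flow described above could be expressed in Python/NumPy as follows; the function names, the assumed flat-field input and the exact algebraic form of the two corrections are assumptions made for this sketch.

```python
import numpy as np

def first_correction(dn0, flat, dark, ave=None):
    # S2: radiation response uniformity correction (assumed form:
    # k1 = AVE / (flat - dark), dn1 = k1 * (dn0 - dark)).
    ave = float(np.mean(flat - dark)) if ave is None else ave
    k1 = ave / (flat - dark)
    return k1 * (dn0 - dark)

def second_correction(dn1, k2, dark):
    # S3: image-processing-based correction; k2 is an assumed per-pixel map of
    # second correction coefficients (1.0 outside the strip noise area).
    return k2 * (dn1 - dark) + dark

# toy usage on synthetic 4x4 data standing in for a camera image (S1)
rng = np.random.default_rng(0)
dn0 = rng.uniform(100.0, 200.0, (4, 4))    # remote sensing image
flat = rng.uniform(140.0, 160.0, (4, 4))   # per-pixel gray response (assumed calibration input)
dark = np.full((4, 4), 5.0)                # multi-frame averaged dark signal
dn2 = second_correction(first_correction(dn0, flat, dark), np.ones((4, 4)), dark)
```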
Optionally, step S2 includes the steps of:
s201, acquiring a statistical mean value of gray level response of each pixel, a pixel value of each pixel and a multi-time average dark signal value of each pixel;
s202, calculating a first correction coefficient of the radiation response uniformity of each pixel according to the statistical mean value of the gray level response of each pixel, the pixel value of each pixel and the multiple average dark signal value of each pixel;
s203, correcting the gray value of each pixel according to the first correction coefficient.
Alternatively,
step S202 includes: calculating a first correction coefficient of the radiation response uniformity of each pixel according to the following formula:
Figure 895705DEST_PATH_IMAGE002
step S203 includes: correcting the gray value of each pixel according to the following formula:
Figure 440825DEST_PATH_IMAGE004
wherein,
Figure DEST_PATH_IMAGE005
representative image coordinates of
Figure 599274DEST_PATH_IMAGE006
The first correction coefficient of the pixel of (a); AVE represents the statistical mean value of the gray level response of each pixel;
Figure DEST_PATH_IMAGE007
representative image coordinates are
Figure 858348DEST_PATH_IMAGE006
The pixel value of the pixel of (a);
Figure 723536DEST_PATH_IMAGE008
representative image coordinates of
Figure 455869DEST_PATH_IMAGE006
The multiple average dark signal values of the pixel of (a);
Figure DEST_PATH_IMAGE009
representing the coordinates of the image after one correction
Figure 242559DEST_PATH_IMAGE006
The gray value of the pixel of (1);
Figure 335018DEST_PATH_IMAGE010
representative image coordinates are
Figure 789133DEST_PATH_IMAGE006
The initial gray value of the picture element.
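As a single-pixel numerical illustration of the formulas reconstructed above (the values are arbitrary and only serve to show the arithmetic):

```python
AVE, DN, D, DN0 = 150.0, 140.0, 5.0, 120.0   # assumed example values
K1 = AVE / (DN - D)                          # 150 / 135 ≈ 1.111
DN1 = K1 * (DN0 - D)                         # ≈ 1.111 * 115 ≈ 127.8
print(K1, DN1)
```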
By correcting the remote sensing image once in this way, the noise pattern of the image becomes stable: the stripe noise in the first corrected image is no longer randomly distributed, but occupies more regular positions, which makes it easier to locate and lays the foundation for removing the stripe noise with an image processing method in the next step.
Optionally, step S3 includes:
s301, acquiring an upper arc and a lower arc of a strip noise area; the upper circular arc and the lower circular arc are respectively an upper boundary and a lower boundary of the strip noise area, and the upper circular arc and the lower circular arc have the same fitting circle center;
s302, calculating a second correction coefficient sequence of each row of pixels between the upper arc and the lower arc based on a noise distribution rule;
and S303, carrying out secondary correction on the gray value of the pixel between the upper arc and the lower arc according to the second correction coefficient sequence to obtain a second correction image.
Optionally, step S302 includes sequentially taking each column of pixels between the upper arc and the lower arc as a target column of pixels, and performing the following steps:
s3021, acquiring the x-axis coordinates, in the image coordinate system, of a first pixel and a second pixel in the target column of pixels; the first pixel is the first pixel of the target column of pixels, and the second pixel is the last pixel of the target column of pixels;
s3022, calculating the y-axis coordinates, in the image coordinate system, of the first pixel, the second pixel and the midpoint of the target column of pixels according to the coordinates of the fitted circle center;
and s3023, calculating a second correction coefficient of each pixel of the target column of pixels according to the y-axis coordinates, in the image coordinate system, of the first pixel, the second pixel and the midpoint of the target column of pixels, so as to obtain the second correction coefficient sequence.
Optionally, step S3023 includes: calculating a second correction coefficient of each pixel of the target column of pixels according to the following formula:
K2(i, j) = (1 − K_E) / (x_A − d)² · (x − d)² + K_E

wherein, K2(i, j) represents the second correction coefficient of the pixel with image coordinates (i, j); x_A is the x-axis coordinate value of the first pixel on the correction plane; x_B is the x-axis coordinate value of the second pixel on the correction plane; K_E is the preset second correction coefficient of the midpoint of the target column of pixels; d is the coordinate value of the midpoint of the target column of pixels on the x-axis of the correction plane; and x is the x-axis coordinate value of the pixel in the image coordinate system.
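A minimal sketch of step S3023 under the quadratic (vertex-form) model described above; x_A, x_B and K_E are assumed inputs, the midpoint d is taken here as their average, and the coordinate running along the target column is used as the parabola argument, which is an interpretation rather than a definitive reading of the claim:

```python
import numpy as np

def second_coefficients(x_a: float, x_b: float, k_e: float) -> np.ndarray:
    # Parabola through (x_a, 1) and (x_b, 1) with vertex (d, k_e), assuming x_a < x_b:
    # k2(x) = (1 - k_e) / (x_a - d)**2 * (x - d)**2 + k_e
    d = 0.5 * (x_a + x_b)            # midpoint of the target column of pixels
    xs = np.arange(x_a, x_b + 1)     # coordinates of the pixels along the column
    a = (1.0 - k_e) / (x_a - d) ** 2
    return a * (xs - d) ** 2 + k_e

k2_seq = second_coefficients(x_a=10, x_b=20, k_e=0.8)   # 11 coefficients, equal to 1.0 at both boundaries
```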
According to the fusion correction method for the stripe noise, a remote sensing image shot by a camera is collected; correcting each pixel of the remote sensing image once by using a radiation response uniformity correction method to obtain a first corrected image; and performing secondary correction on each pixel of the first correction image based on the image processing method to obtain a second correction image completely eliminating the stripe noise. By combining the radiation response uniformity correction with the noise correction based on image processing, the stripe noise is effectively eliminated, and the problems of image blurring, ringing response, blocking effect and the like caused by a complex algorithm are avoided, so that the quality of the remote sensing image is improved, and the performance of the remote sensing image is improved.
In a second aspect, the present application provides a device for fusion and correction of stripe noise, which is used for eliminating stripe noise of a remote sensing image, and includes the following modules:
an acquisition module: used for acquiring a remote sensing image shot by a camera;
a first correction module: used for performing a primary correction on each pixel of the remote sensing image by a radiation response uniformity correction method to obtain a first corrected image;
a second correction module: used for performing a secondary correction on each pixel of the first corrected image based on an image processing method to obtain a second corrected image in which the stripe noise is completely eliminated.
Optionally, when the first correction module performs primary correction on each pixel of the remote sensing image to obtain a first corrected image, the following steps are performed:
s201, acquiring a statistical mean value of gray level response of each pixel, a pixel value of each pixel and a multi-time average dark signal value of each pixel;
s202, calculating a first correction coefficient of the radiation response uniformity of each pixel according to the statistical mean value of the gray level response of each pixel, the pixel value of each pixel and the multiple average dark signal value of each pixel;
s203, correcting the gray value of each pixel according to the first correction coefficient.
According to the fusion correction device based on the stripe noise, a remote sensing image shot by a camera is collected through a collection module; the first correction module performs primary correction on each pixel of the remote sensing image to obtain a first correction image; the second correction module performs a secondary correction on each pixel of the first corrected image to obtain a second corrected image in which the stripe noise is completely removed. By combining the radiation response uniformity correction with the noise correction based on image processing, the stripe noise is effectively eliminated, and the problems of image blurring, ringing response, blocking effect and the like caused by a complex algorithm are avoided, so that the quality of the remote sensing image is improved, and the performance of the remote sensing image is improved.
In a third aspect, the present application provides an electronic device comprising a processor and a memory, wherein the memory stores computer readable instructions, and when the computer readable instructions are executed by the processor, the steps of the method as provided in the first aspect are executed.
In a fourth aspect, the present application provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method as provided in the first aspect above.
The beneficial effect of this application: effectively eliminates the stripe noise, improves the quality of the remote sensing image and improves the performance of the remote sensing image.
Drawings
Fig. 1 is a flowchart of a method for fusion correction of stripe noise according to the present application.
Fig. 2 is a schematic structural diagram of a fusion correction apparatus for stripe noise according to the present application.
Fig. 3 is a schematic structural diagram of an electronic device provided in the present application.
Fig. 4 is a schematic diagram of noise correction based on image processing provided in the present application.
Description of reference numerals:
100. fitting a circle center; 200. an upper arc; 300. a lower arc; 201. an acquisition module; 202. a first correction module; 203. a second correction module; 301. a processor; 302. a memory; 303. a communication bus.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without making any creative effort fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not construed as indicating or implying relative importance.
In practical applications, a remote sensing image is often disturbed by various noise sources during generation and transmission, which degrades its quality. Stripe noise is a common phenomenon in many spaceborne and airborne multi-sensor and single-sensor spectrometer imaging systems. During imaging, in which the sensor and photoelectric devices repeatedly scan ground objects, it is difficult for a satellite to achieve a completely consistent response level among the detection units because of the response difference between the forward and reverse scans of the scanning detection units; combined with errors from the complex spatial electromagnetic environment and from the device itself, and with disturbances such as the mechanical motion of the sensor scan, this produces a special noise that has a certain periodicity and directivity and is distributed in strips.
Referring to fig. 1, fig. 1 is a flowchart of a method for fusion correction of stripe noise in some embodiments of the present application, for eliminating stripe noise of a remote sensing image, including the following steps:
s1, acquiring a remote sensing image shot by a camera;
s2, performing primary correction on each pixel of the remote sensing image by using a radiation response uniformity correction method to obtain a first corrected image;
and S3, performing secondary correction on each pixel of the first correction image based on an image processing method to obtain a second correction image for completely eliminating stripe noise.
The acquisition of remote sensing images belongs to the prior art.
The radiation response uniformity correction method can determine a correction value by analyzing and calculating a radiation value without atmospheric influence obtained by field spectrum test and a satellite sensor synchronous observation result; the radiation response uniformity correction of each pixel can also be performed by the existing regression analysis method and histogram method.
The image processing method can adopt existing transforms such as the Fourier transform, the Walsh transform and the discrete cosine transform; existing image enhancement and restoration techniques can also be adopted to improve the image quality, for example by removing noise or improving the image definition; and common image segmentation algorithms can be used to extract meaningful feature regions of the image so as to correct and eliminate the stripe noise.
According to the fusion correction method for the stripe noise, a remote sensing image shot by a camera is collected; correcting each pixel of the remote sensing image once by using a radiation response uniformity correction method to obtain a first corrected image; based on the method of image processing, each image element of the first corrected image is corrected twice to obtain a second corrected image in which the stripe noise is completely removed. By combining the radiation response uniformity correction with the noise correction based on image processing, the stripe noise is effectively eliminated, and the problems of image blurring, ringing response, blocking effect and the like caused by a complex algorithm are avoided, so that the quality of the remote sensing image is improved, and the performance of the remote sensing image is improved.
In some embodiments, step S2 includes the steps of:
s201, acquiring a statistical mean value of gray level response of each pixel, a pixel value of each pixel and a multi-time average dark signal value of each pixel;
s202, calculating a first correction coefficient of the radiation response uniformity of each pixel according to the statistical mean value of the gray level response of each pixel, the pixel value of each pixel and the multiple average dark signal value of each pixel;
s203, correcting the gray value of each pixel according to the first correction coefficient.
In step S201, the statistical mean of the gray scale response of each pixel, the pixel value of each pixel, and the multiple average dark signal value of each pixel may be obtained by the prior art, such as a sensor or a photoelectric device.
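For example, the quantities of step S201 might be estimated from a stack of uniformly illuminated calibration frames and a stack of dark frames, as in the sketch below; this acquisition scheme is only an assumption for illustration and is not the only one compatible with the prior art mentioned above.

```python
import numpy as np

def calibration_statistics(flat_frames: np.ndarray, dark_frames: np.ndarray):
    # flat_frames, dark_frames: stacks of shape (n_frames, H, W).
    dark = dark_frames.mean(axis=0)        # multi-frame averaged dark signal D(i, j)
    response = flat_frames.mean(axis=0)    # per-pixel gray response DN(i, j)
    ave = float((response - dark).mean())  # statistical mean AVE of the gray response (assumed definition)
    return ave, response, dark
```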
In a further embodiment, step S202 comprises: calculating a first correction coefficient of the radiation response uniformity of each pixel according to the following formula:
K1(i, j) = AVE / (DN(i, j) − D(i, j))

step S203 includes: correcting the gray value of each pixel according to the following formula:

DN1(i, j) = K1(i, j) · (DN0(i, j) − D(i, j))

wherein, K1(i, j) represents the first correction coefficient of the pixel with image coordinates (i, j); AVE represents the statistical mean value of the gray level response of each pixel; DN(i, j) represents the pixel value of the pixel with image coordinates (i, j); D(i, j) represents the multiple average dark signal value of the pixel with image coordinates (i, j); DN1(i, j) represents the gray value, after the primary correction, of the pixel with image coordinates (i, j); DN0(i, j) represents the initial gray value of the pixel with image coordinates (i, j).
By correcting the remote sensing image once in this way, the noise pattern of the image becomes stable: the stripe noise in the first corrected image is no longer randomly distributed, but occupies more regular positions, which makes it easier to locate and lays the foundation for removing the stripe noise with an image processing method in the next step.
In some embodiments, step S3 includes:
s301, acquiring an upper arc 200 and a lower arc 300 of a strip noise area; the upper arc 200 and the lower arc 300 are respectively an upper boundary and a lower boundary of the strip noise area, and the upper arc 200 and the lower arc 300 have the same fitting circle center 100;
s302, calculating a second correction coefficient sequence of each row of pixels between the upper arc 200 and the lower arc 300 based on a noise distribution rule;
and S303, carrying out secondary correction on the gray value of the pixel between the upper arc 200 and the lower arc 300 according to the second correction coefficient sequence to obtain a second correction image.
In practical application, only part of the stripe noise remains in the remote sensing image after the radiation response uniformity correction, so one stripe can be selected arbitrarily. Referring specifically to fig. 4, the stripe noise is shaped like an arc-shaped strip whose position is fixed, so the upper arc 200 and the lower arc 300, which are concentric arcs bounding the strip, can be obtained by data fitting, and their analytical expressions in the image coordinate system are:
Cx = Xc + R·cos(θ1)    (1)
Cy = Yc + R·sin(θ1)    (2)
C'x = Xc + r·cos(θ2)    (3)
C'y = Yc + r·sin(θ2)    (4)
where Cx is the x-axis coordinate of a point on the upper arc 200 in the image coordinate system; Cy is the y-axis coordinate of a point on the upper arc 200 in the image coordinate system; C'x is the x-axis coordinate of a point on the lower arc 300 in the image coordinate system; C'y is the y-axis coordinate of a point on the lower arc 300 in the image coordinate system; R is the radius of the upper arc 200; r is the radius of the lower arc 300; θ1 is the included angle between the x-axis of the image coordinate system and the straight line along the radius between any point on the upper arc 200 and the fitted circle center; θ2 is the included angle between the x-axis of the image coordinate system and the straight line along the radius between any point on the lower arc 300 and the fitted circle center; Xc is the x-axis coordinate of the fitted circle center 100 in the image coordinate system; Yc is the y-axis coordinate of the fitted circle center 100 in the image coordinate system.
The coordinates of the fitting circle center 100 can be obtained through the following steps:
acquiring coordinate data of an upper boundary point and a lower boundary point of a strip noise area;
fitting according to the coordinate data of the upper boundary point of the strip noise area to obtain an initial upper circular arc, and fitting according to the coordinate data of the lower boundary point of the strip noise area to obtain an initial lower circular arc;
extracting the circle center coordinate of the initial upper arc as a first circle center coordinate, and extracting the circle center coordinate of the initial lower arc as a second circle center coordinate;
and calculating the coordinate of the fitted circle center according to the first circle center coordinate and the second circle center coordinate.
Specifically, a midpoint between the center of the initial upper circular arc and the center of the initial lower circular arc may be used as a fitting center, and assuming that the obtained first center coordinate is (7, 10) and the obtained second center coordinate is (7, 10.2), the x-axis coordinate value of the fitting center is (7 + 7)/2 =7, and the y-axis coordinate value is (10 + 10.2)/2 =10.1, so that the coordinate of the fitting center is (7, 10.1). By determining the coordinates of the fitting circle center in this way, the accuracy of obtaining the fitting circle center can be improved, and the obtained strip noise area formed by the upper arc 200 and the lower arc 300 is more accurate.
After the coordinates of the fitting circle center 100 are obtained, the coordinates of the fitting circle center 100 can be used as constraint conditions, the final upper arc 200 is obtained by fitting according to the coordinate data of the upper boundary point of the strip noise area, and the final lower arc 300 is obtained by fitting according to the coordinate data of the lower boundary point of the strip noise area.
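The initial arcs and the fitted circle center 100 can be obtained, for example, with an algebraic least-squares circle fit of the boundary points followed by averaging of the two centers; the specific fitting method in this sketch is an assumption, since the application only requires some data fitting mode.

```python
import numpy as np

def fit_circle(points: np.ndarray):
    # Algebraic least-squares circle fit: solve x^2 + y^2 + D*x + E*y + F = 0 for D, E, F.
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    xc, yc = -D / 2.0, -E / 2.0
    radius = np.sqrt(xc ** 2 + yc ** 2 - F)
    return (xc, yc), radius

# fit an arc sampled from a known circle of center (7, 10) and radius 50
theta = np.linspace(0.3, 1.2, 30)
pts = np.column_stack([7 + 50 * np.cos(theta), 10 + 50 * np.sin(theta)])
center_upper, _ = fit_circle(pts)

# fitted circle center taken as the midpoint of the two initial centers,
# e.g. (7, 10) and (7, 10.2) give (7, 10.1) as in the example above
center_lower = (7.0, 10.2)
fit_center = ((center_upper[0] + center_lower[0]) / 2.0,
              (center_upper[1] + center_lower[1]) / 2.0)
```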
In some embodiments, step S302 includes sequentially taking each column of pixels between upper arc 200 and lower arc 300 as a target column of pixels, and performing the following steps:
s3021, acquiring the x-axis coordinates, in the image coordinate system, of a first pixel and a second pixel in the target column of pixels; the first pixel is the first pixel of the target column of pixels, and the second pixel is the last pixel of the target column of pixels;
s3022, calculating the y-axis coordinates, in the image coordinate system, of the first pixel, the second pixel and the midpoint of the target column of pixels according to the coordinates of the fitted circle center 100;
and s3023, calculating a second correction coefficient of each pixel of the target column of pixels according to the y-axis coordinates of the first pixel, the second pixel and the midpoint of the target column of pixels, so as to obtain the second correction coefficient sequence.
With continued reference to fig. 4, the intersection points of the target column of pixels with the upper arc 200 and the lower arc 300 are the first pixel and the second pixel, recorded as A and B respectively, and the midpoint of the target column of pixels is recorded as E. In practical application, the noise intensity decreases gradually from the central arc line of the strip towards both sides, so that it conforms to a quadratic curve model. The law that the noise intensity decreases from the midpoint E of the target column of pixels towards the first pixel and the second pixel on both sides is represented graphically by a parabola passing through A, B and E′, where the distance from a point on the parabola to the line AB represents the noise intensity of the corresponding point on the line AB (i.e. the projection of the parabola point onto the line AB). Thus, the preset second correction coefficient of the midpoint E of the target column of pixels, which corresponds to the vertex of the parabola, can be taken as K_E, and the correction plane coordinates of the midpoint E are set to (d, K_E); the correction plane is a coordinate plane which takes the y-axis coordinate of the pixel in the image coordinate system as its horizontal axis and the second correction coefficient of the pixel as its vertical axis. K_E is a known quantity obtained from statistical analysis of the image. The upper arc 200 and the lower arc 300 are the boundaries of the strip noise area, and the pixels on the boundaries, i.e. the first pixel A and the second pixel B, need no correction, so their correction coefficient can be set to 1; the correction plane coordinates of the first pixel A and the second pixel B are then A(x_A, 1) and B(x_B, 1), respectively.
From formula (1): θ1 = arccos((k1 − Xc) / R); from formula (3): θ2 = arccos((k2 − Xc) / r); from formula (2): x_A = Yc + R·sin(θ1); from formula (4): x_B = Yc + r·sin(θ2); and the midpoint coordinate d = (x_A + x_B) / 2.

wherein Xc is the x-axis coordinate of the fitted circle center 100 in the image coordinate system; Yc is the y-axis coordinate of the fitted circle center 100 in the image coordinate system; k1 is the x-axis coordinate value, in the image coordinate system, of any point on the upper arc 200; θ1 is the included angle between the x-axis of the image coordinate system and the straight line along the radius between any point on the upper arc 200 and the fitted circle center; θ2 is the included angle between the x-axis of the image coordinate system and the straight line along the radius between any point on the lower arc 300 and the fitted circle center; k2 is the x-axis coordinate value, in the image coordinate system, of any point on the lower arc 300; d is the coordinate value of the midpoint E of the target column of pixels on the x-axis of the correction plane; x_A is the x-axis coordinate value of the first pixel on the correction plane; x_B is the x-axis coordinate value of the second pixel on the correction plane; R is the radius of the upper arc 200; r is the radius of the lower arc 300.
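Under the parametric arc equations (1)-(4) as reconstructed above, the correction-plane coordinates x_A and x_B of the boundary pixels of a given column k, and the midpoint coordinate d, could be computed as in the following sketch; treating the image y-coordinate of the boundary pixels as the horizontal axis of the correction plane is an assumption consistent with the derivation, not a definitive reading.

```python
import numpy as np

def column_boundaries(k: float, xc: float, yc: float, R: float, r: float):
    theta1 = np.arccos((k - xc) / R)   # from formula (1)
    theta2 = np.arccos((k - xc) / r)   # from formula (3)
    x_a = yc + R * np.sin(theta1)      # from formula (2): boundary pixel on the upper arc 200
    x_b = yc + r * np.sin(theta2)      # from formula (4): boundary pixel on the lower arc 300
    d = 0.5 * (x_a + x_b)              # midpoint E of the target column of pixels
    return x_a, x_b, d

x_a, x_b, d = column_boundaries(k=30.0, xc=7.0, yc=10.1, R=55.0, r=50.0)
```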
Substituting the coordinates of the midpoint E(d, K_E) and of one of the intersection points, for example the first pixel A(x_A, 1), into the vertex form of the parabola y = a·(x − d)² + K_E gives:

a = (1 − K_E) / (x_A − d)²

Thus, the second correction coefficient of each pixel of the target column of pixels can be calculated according to the following formula:

K2(i, j) = (1 − K_E) / (x_A − d)² · (x − d)² + K_E
wherein, K2(i, j) represents the second correction coefficient of the pixel with image coordinates (i, j) (i.e. the second correction coefficient of the x-th pixel of the target column of pixels); x_A is the x-axis coordinate value of the first pixel on the correction plane; x_B is the x-axis coordinate value of the second pixel on the correction plane; K_E is the coordinate value of the midpoint of the target column of pixels on the y-axis of the correction plane (i.e. its preset second correction coefficient); d is the coordinate value of the midpoint of the target column of pixels on the x-axis of the correction plane; and x is the x-axis coordinate value of the pixel in the image coordinate system.
Since the second correction coefficient sequence is calculated, the pixels in the strip noise area can be corrected, specifically referring to the following formula:
DN2(i, j) = K2(i, j) · (DN1(i, j) − D(i, j)) + D(i, j)

wherein, K2(i, j) represents the second correction coefficient of the pixel with image coordinates (i, j); D(i, j) represents the multiple average dark signal value of the pixel with image coordinates (i, j); DN1(i, j) represents the gray value, after the primary correction, of the pixel with image coordinates (i, j); DN2(i, j) represents the gray value, after the secondary correction, of the pixel with image coordinates (i, j).
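Putting the pieces together, a column-by-column application of the secondary correction might look like the sketch below; the correction form DN2 = K2 · (DN1 − D) + D and the assumption that the boundary rows of every column are supplied by the fitted arcs are both assumptions of this sketch rather than the claimed implementation.

```python
import numpy as np

def apply_second_correction(dn1: np.ndarray, dark: np.ndarray,
                            upper: np.ndarray, lower: np.ndarray,
                            k_e: float) -> np.ndarray:
    # dn1, dark: H x W arrays; upper[j], lower[j]: boundary row indices of column j
    # inside the strip noise area; k_e: preset coefficient at the strip center.
    dn2 = dn1.copy()
    for j in range(dn1.shape[1]):
        x_a, x_b = int(upper[j]), int(lower[j])
        if x_b <= x_a:
            continue                                   # column not crossed by the strip
        d = 0.5 * (x_a + x_b)
        rows = np.arange(x_a, x_b + 1)
        k2 = (1.0 - k_e) / (x_a - d) ** 2 * (rows - d) ** 2 + k_e
        dn2[rows, j] = k2 * (dn1[rows, j] - dark[rows, j]) + dark[rows, j]
    return dn2
```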
According to the method for the fusion correction of the stripe noise, the remote sensing image shot by the camera is collected; correcting each pixel of the remote sensing image once by using a radiation response uniformity correction method to obtain a first corrected image; and performing secondary correction on each pixel of the first correction image based on the image processing method to obtain a second correction image completely eliminating the stripe noise. By combining the radiation response uniformity correction with the noise correction based on image processing, the stripe noise is effectively eliminated, and the problems of image blurring, ringing response, blocking effect and the like caused by a complex algorithm are avoided, so that the quality of the remote sensing image is improved, and the performance of the remote sensing image is improved.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a stripe noise fusion correction apparatus provided in some embodiments of the present application, which is used for eliminating stripe noise of a remote sensing image and includes the following modules:
the acquisition module 201: used for acquiring a remote sensing image shot by a camera;
the first correction module 202: used for performing a primary correction on each pixel of the remote sensing image by a radiation response uniformity correction method to obtain a first corrected image;
the second correction module 203: used for performing a secondary correction on each pixel of the first corrected image based on an image processing method to obtain a second corrected image in which the stripe noise is completely eliminated.
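As an illustration only, the module structure listed above can be mirrored by a small class whose three methods correspond to the acquisition module 201, the first correction module 202 and the second correction module 203; the camera interface and the simplified method bodies are assumptions of this sketch, not the claimed implementation.

```python
import numpy as np

class StripeNoiseFusionCorrector:
    def __init__(self, flat: np.ndarray, dark: np.ndarray, k2_map: np.ndarray):
        self.flat, self.dark, self.k2_map = flat, dark, k2_map

    def acquire(self, camera) -> np.ndarray:
        # acquisition module 201; 'camera' is any object exposing a capture() method (hypothetical interface)
        return np.asarray(camera.capture(), dtype=float)

    def first_correction(self, dn0: np.ndarray) -> np.ndarray:
        # first correction module 202 (assumed form of the uniformity correction)
        ave = float((self.flat - self.dark).mean())
        return ave / (self.flat - self.dark) * (dn0 - self.dark)

    def second_correction(self, dn1: np.ndarray) -> np.ndarray:
        # second correction module 203; k2_map holds the second correction coefficients
        return self.k2_map * (dn1 - self.dark) + self.dark
```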
The acquisition of remote sensing images belongs to the prior art.
The radiation response uniformity correction method can determine a correction value by analyzing and calculating a radiation value without atmospheric influence obtained by a field spectrum test and a satellite sensor synchronous observation result; the radiation response uniformity correction of each pixel can also be performed by the existing regression analysis method and histogram method.
The image processing method can adopt existing transforms such as the Fourier transform, the Walsh transform and the discrete cosine transform; existing image enhancement and restoration techniques can also be adopted to improve the image quality, for example by removing noise or improving the image definition; and common image segmentation algorithms can be used to extract meaningful feature regions of the image so as to correct and eliminate the stripe noise.
The fusion correction device for the stripe noise acquires the remote sensing image shot by the camera through the acquisition module 201; the first correction module 202 corrects each pixel of the remote sensing image once to obtain a first corrected image; the second correction module 203 secondarily corrects each of the picture elements of the first corrected image to acquire a second corrected image in which the banding noise is completely removed. By combining the radiation response uniformity correction with the noise correction based on image processing, the stripe noise is effectively eliminated, and the problems of image blurring, ringing response, blocking effect and the like caused by a complex algorithm are avoided, so that the quality of the remote sensing image is improved, and the performance of the remote sensing image is improved.
In some embodiments, the first correction module 202 performs the following steps when obtaining the first corrected image by performing a primary correction on each pixel of the remote sensing image by using a radiation response uniformity correction method:
s201, acquiring a statistical mean value of gray level response of each pixel, a pixel value of each pixel and a multi-time average dark signal value of each pixel;
s202, calculating a first correction coefficient of the radiation response uniformity of each pixel according to the statistical mean value of the gray level response of each pixel, the pixel value of each pixel and the multiple average dark signal value of each pixel;
s203, correcting the gray value of each pixel according to the first correction coefficient.
In step S201, the statistical mean of the gray scale response of each pixel, the pixel value of each pixel, and the multiple average dark signal value of each pixel may be obtained by the prior art, such as a sensor or a photoelectric device.
In a further embodiment, the grey value of each picture element is corrected according to the following formula:
K1(i, j) = AVE / (DN(i, j) − D(i, j))

DN1(i, j) = K1(i, j) · (DN0(i, j) − D(i, j))

wherein, K1(i, j) represents the first correction coefficient of the pixel with image coordinates (i, j); AVE represents the statistical mean value of the gray level response of each pixel; DN(i, j) represents the pixel value of the pixel with image coordinates (i, j); D(i, j) represents the multiple average dark signal value of the pixel with image coordinates (i, j); DN1(i, j) represents the gray value, after the primary correction, of the pixel with image coordinates (i, j); DN0(i, j) represents the initial gray value of the pixel with image coordinates (i, j).
By correcting the remote sensing image once in this way, the noise pattern of the image becomes stable: the stripe noise in the first corrected image is no longer randomly distributed, but occupies more regular positions, which makes it easier to locate and lays the foundation for removing the stripe noise with an image processing method in the next step.
In some embodiments, the second correction module 203 performs the following steps when performing secondary correction on each pel of the first corrected image based on the method of image processing to obtain a second corrected image in which the stripe noise is completely removed:
s301, acquiring an upper arc 200 and a lower arc 300 of a strip noise area; the upper arc 200 and the lower arc 300 have the same fitting circle center 100;
s302, calculating a second correction coefficient sequence of each row of pixels between the upper arc 200 and the lower arc 300 based on a noise distribution rule;
and S303, carrying out secondary correction on the gray value of the pixel between the upper arc 200 and the lower arc 300 according to the second correction coefficient sequence to obtain a second correction image.
In practical application, only part of the stripe noise remains in the remote sensing image after the radiation response uniformity correction, so one stripe can be selected arbitrarily. Referring specifically to fig. 4, the stripe noise is shaped like an arc-shaped strip whose position is fixed, so the upper arc 200 and the lower arc 300, which are concentric arcs bounding the strip, can be obtained by data fitting, and their analytical expressions in the image coordinate system are:
Cx = Xc + R·cos(θ1)    (1)
Cy = Yc + R·sin(θ1)    (2)
C'x = Xc + r·cos(θ2)    (3)
C'y = Yc + r·sin(θ2)    (4)
where Cx is the x-axis coordinate of a point on the upper arc 200 in the image coordinate system; cy is the y-axis coordinate of the point on the upper arc 200 in the image coordinate system; c' x is the x-axis coordinate of the point on the lower arc 300 in the image coordinate system; c' y is the y-axis coordinate of the point on the lower arc 300 in the image coordinate system; r is the radius of the upper arc 200; r is the radius of the lower arc 300; theta 1 is an included angle between a straight line where the radius between any point on the upper arc 200 and the fitting circle center is located and the x axis of the image coordinate system; theta 2 is an included angle between a straight line where the radius between any point on the lower arc 300 and the fitting circle center is located and the x axis of the image coordinate system; xc is the x-axis coordinate of the fitted circle center 100 in the image coordinate system; yc is the y-axis coordinate of the fitted circle center 100 in the image coordinate system.
Wherein, the coordinates of the fitting circle center 100 can be obtained through the following steps:
acquiring coordinate data of an upper boundary point and a lower boundary point of a strip noise area;
fitting according to the coordinate data of the upper boundary point of the strip noise area to obtain an initial upper circular arc, and fitting according to the coordinate data of the lower boundary point of the strip noise area to obtain an initial lower circular arc;
extracting the circle center coordinate of the initial upper arc as a first circle center coordinate, and extracting the circle center coordinate of the initial lower arc as a second circle center coordinate;
and calculating the coordinate of the fitting circle center according to the first circle center coordinate and the second circle center coordinate.
Specifically, a midpoint between the center of the initial upper circular arc and the center of the initial lower circular arc may be used as a fitting center, and assuming that the obtained first center coordinate is (7, 10) and the obtained second center coordinate is (7, 10.2), the x-axis coordinate value of the fitting center is (7 + 7)/2 =7, and the y-axis coordinate value is (10 + 10.2)/2 =10.1, so that the coordinate of the fitting center is (7, 10.1). By determining the coordinates of the fitting circle center in this way, the accuracy of obtaining the fitting circle center can be improved, and the obtained strip noise area formed by the upper arc 200 and the lower arc 300 is more accurate.
After the coordinates of the fitting circle center 100 are obtained, the coordinates of the fitting circle center 100 can be used as constraint conditions, the final upper arc 200 is obtained by fitting according to the coordinate data of the upper boundary point of the strip noise area, and the final lower arc 300 is obtained by fitting according to the coordinate data of the lower boundary point of the strip noise area.
In some embodiments, step S302 includes sequentially taking each column of pixels between upper arc 200 and lower arc 300 as a target column of pixels, and performing the following steps:
s3021, acquiring the x-axis coordinates, in the image coordinate system, of a first pixel and a second pixel in the target column of pixels; the first pixel is the first pixel of the target column of pixels, and the second pixel is the last pixel of the target column of pixels;
s3022, calculating the y-axis coordinates, in the image coordinate system, of the first pixel, the second pixel and the midpoint of the target column of pixels according to the coordinates of the fitted circle center 100;
and s3023, calculating a second correction coefficient of each pixel of the target column of pixels according to the y-axis coordinates of the first pixel, the second pixel and the midpoint of the target column of pixels, so as to obtain the second correction coefficient sequence.
With continued reference to fig. 4, the intersection points of the target column of pixels with the upper arc 200 and the lower arc 300 are the first pixel and the second pixel, recorded as A and B respectively, and the midpoint of the target column of pixels is recorded as E. In practical application, the noise intensity decreases gradually from the central arc line of the strip towards both sides, so that it conforms to a quadratic curve model. The law that the noise intensity decreases from the midpoint E of the target column of pixels towards the first pixel and the second pixel on both sides is represented graphically by a parabola passing through A, B and E′, where the distance from a point on the parabola to the line AB represents the noise intensity of the corresponding point on the line AB (i.e. the projection of the parabola point onto the line AB). Thus, the preset second correction coefficient of the midpoint E of the target column of pixels, which corresponds to the vertex of the parabola, can be taken as K_E, and the correction plane coordinates of the midpoint E are set to (d, K_E); the correction plane is a coordinate plane which takes the y-axis coordinate of the pixel in the image coordinate system as its horizontal axis and the second correction coefficient of the pixel as its vertical axis. K_E is a known quantity obtained from statistical analysis of the image. The upper arc 200 and the lower arc 300 are the boundaries of the strip noise area, and the pixels on the boundaries, i.e. the first pixel A and the second pixel B, need no correction, so their correction coefficient can be set to 1; the correction plane coordinates of the first pixel A and the second pixel B are then A(x_A, 1) and B(x_B, 1), respectively.
From formula (1): θ1 = arccos((k1 − Xc) / R); from formula (3): θ2 = arccos((k2 − Xc) / r); from formula (2): x_A = Yc + R·sin(θ1); from formula (4): x_B = Yc + r·sin(θ2); and the midpoint coordinate d = (x_A + x_B) / 2.

wherein Xc is the x-axis coordinate of the fitted circle center 100 in the image coordinate system; Yc is the y-axis coordinate of the fitted circle center 100 in the image coordinate system; k1 is the x-axis coordinate value, in the image coordinate system, of any point on the upper arc 200; θ1 is the included angle between the x-axis of the image coordinate system and the straight line along the radius between any point on the upper arc 200 and the fitted circle center; θ2 is the included angle between the x-axis of the image coordinate system and the straight line along the radius between any point on the lower arc 300 and the fitted circle center; k2 is the x-axis coordinate value, in the image coordinate system, of any point on the lower arc 300; d is the coordinate value of the midpoint E of the target column of pixels on the x-axis of the correction plane; x_A is the x-axis coordinate value of the first pixel on the correction plane; x_B is the x-axis coordinate value of the second pixel on the correction plane; R is the radius of the upper arc 200; r is the radius of the lower arc 300.
Substituting the coordinates of the midpoint E(d, K_E) and of one of the intersection points, for example the first pixel A(x_A, 1), into the vertex form of the parabola y = a·(x − d)² + K_E gives:

a = (1 − K_E) / (x_A − d)²

Thus, the second correction coefficient of each pixel of the target column of pixels can be calculated according to the following formula:

K2(i, j) = (1 − K_E) / (x_A − d)² · (x − d)² + K_E
wherein, K2(i, j) represents the second correction coefficient of the pixel with image coordinates (i, j) (i.e. the second correction coefficient of the x-th pixel of the target column of pixels); x_A is the x-axis coordinate value of the first pixel on the correction plane; x_B is the x-axis coordinate value of the second pixel on the correction plane; K_E is the coordinate value of the midpoint of the target column of pixels on the y-axis of the correction plane (i.e. its preset second correction coefficient); d is the coordinate value of the midpoint of the target column of pixels on the x-axis of the correction plane; and x is the x-axis coordinate value of the pixel in the image coordinate system.
Since the second correction coefficient sequence is calculated, the pixels in the strip noise area can be corrected, specifically referring to the following formula:
DN2(i, j) = K2(i, j) · (DN1(i, j) − D(i, j)) + D(i, j)

wherein, K2(i, j) represents the second correction coefficient of the pixel with image coordinates (i, j); D(i, j) represents the multiple average dark signal value of the pixel with image coordinates (i, j); DN1(i, j) represents the gray value, after the primary correction, of the pixel with image coordinates (i, j); DN2(i, j) represents the gray value, after the secondary correction, of the pixel with image coordinates (i, j).
As can be seen from the above, the fusion correction device for stripe noise of the present application acquires the remote sensing image shot by the camera through the acquisition module 201; the first correction module 202 corrects each pixel of the remote sensing image once to obtain a first corrected image; the second correction module 203 secondarily corrects each of the picture elements of the first corrected image to acquire a second corrected image in which the banding noise is completely removed. By combining the radiation response uniformity correction with the noise correction based on image processing, the stripe noise is effectively eliminated, and the problems of image blurring, ringing response, blocking effect and the like caused by a complex algorithm are avoided, so that the quality of the remote sensing image is improved, and the performance of the remote sensing image is improved.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. The electronic device comprises a processor 301 and a memory 302, which are interconnected and communicate with each other via a communication bus 303 and/or another form of connection mechanism (not shown). The memory 302 stores a computer program executable by the processor 301; when the computing device runs, the processor 301 executes the computer program to perform the method in any optional implementation of the above embodiments, so as to implement the following functions: acquiring a remote sensing image shot by a camera; performing a primary correction on each pixel of the remote sensing image by a radiation response uniformity correction method to obtain a first corrected image; and performing a secondary correction on each pixel of the first corrected image based on an image processing method to obtain a second corrected image in which the stripe noise is completely eliminated.
The present application provides a storage medium on which a computer program is stored; when the computer program is executed by a processor, the method in any optional implementation of the above embodiments is performed, so as to implement the following functions: acquiring a remote sensing image shot by a camera; performing a primary correction on each pixel of the remote sensing image by a radiation response uniformity correction method to obtain a first corrected image; and performing a secondary correction on each pixel of the first corrected image based on an image processing method to obtain a second corrected image in which the stripe noise is completely eliminated. The storage medium may be implemented by any type of volatile or non-volatile storage device or combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
In the embodiments provided in the present application, it should be understood that the disclosed system and method may be implemented in other ways. The above-described system embodiments are merely illustrative, and for example, the division of the units is merely a logical division, and there may be other divisions in actual implementation, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of systems or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions.
The above description is only of embodiments of the present application and is not intended to limit the protection scope of the present application; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (10)

1. A fusion correction method for stripe noise, used for eliminating stripe noise of a remote sensing image, characterized by comprising the following steps:
s1, acquiring a remote sensing image shot by a camera;
s2, performing primary correction on each pixel of the remote sensing image by using a radiation response uniformity correction method to obtain a first corrected image;
and S3, performing secondary correction on each pixel of the first correction image based on an image processing method to obtain a second correction image completely eliminating stripe noise.
2. The fusion correction method of stripe noise according to claim 1, wherein step S2 comprises the steps of:
s201, acquiring a statistical mean value of gray level response of each pixel, a pixel value of each pixel and a multi-time average dark signal value of each pixel;
s202, calculating a first correction coefficient of the radiation response uniformity of each pixel according to the statistical mean value of the gray response of each pixel, the pixel value of each pixel and the multiple average dark signal value of each pixel;
s203, correcting the gray value of each pixel according to the first correction coefficient.
3. The method for fusion correction of stripe noise according to claim 2, wherein step S202 comprises: calculating a first correction coefficient of the radiation response uniformity of each pixel according to the following formula:
K(i, j) = AVE / (DN(i, j) - DS(i, j))
and step S203 comprises: correcting the gray value of each pixel according to the following formula:
P(i, j) = K(i, j) × P0(i, j)
wherein K(i, j) represents the first correction coefficient of the pixel at image coordinates (i, j); AVE represents the statistical mean value of the gray level response of each pixel; DN(i, j) represents the pixel value of the pixel at image coordinates (i, j); DS(i, j) represents the multi-time average dark signal value of the pixel at image coordinates (i, j); P(i, j) represents the gray value of the pixel at image coordinates (i, j) after the primary correction; and P0(i, j) represents the initial gray value of the pixel at image coordinates (i, j).
4. The fusion correction method of stripe noise according to claim 1, wherein step S3 comprises:
s301, acquiring an upper arc and a lower arc of a strip noise area; the upper arc and the lower arc are respectively an upper boundary and a lower boundary of the strip noise area, and the upper arc and the lower arc have the same fitting circle center;
s302, calculating a second correction coefficient sequence of each row of pixels between the upper arc and the lower arc based on a noise distribution rule;
s303, carrying out secondary correction on the gray value of the pixel between the upper circular arc and the lower circular arc according to the second correction coefficient sequence to obtain a second correction image.
5. The method for fusion correction of stripe noise according to claim 4, wherein step S302 comprises sequentially taking each column of pixels between said upper arc and said lower arc as a target column of pixels, and performing the following steps:
s3021, acquiring x-axis coordinates of a first pixel and a second pixel in an image coordinate system in the target row pixels; the first pixel is the first pixel of the target row of pixels, and the second pixel is the last pixel of the target row of pixels;
s3022, calculating the y-axis coordinates of the midpoints of the first pixel, the second pixel and the target row pixels in an image coordinate system according to the coordinates of the fitted circle centers;
and S3023, calculating a second correction coefficient of each pixel of the target row of pixels according to the y-axis coordinates of the first pixel, the second pixel and the midpoint of the target row of pixels in an image coordinate system, and obtaining the second correction coefficient sequence.
6. The fusion correction method of stripe noise according to claim 5, wherein step S3023 comprises: calculating a second correction coefficient of each pixel of the target column of pixels according to the following formula (published as an equation image and not reproduced here), wherein the formula is expressed in terms of: the second correction coefficient of the pixel at image coordinates (i, j); the x-axis coordinate value of the first pixel on the correction plane; the x-axis coordinate value of the second pixel on the correction plane; a preset second correction coefficient of the midpoint of the target column of pixels; d, the x-axis coordinate value of the midpoint of the target column of pixels on the correction plane; and the x-axis coordinate value of the pixel in the image coordinate system.
7. A fusion correction device for stripe noise, used for eliminating stripe noise of a remote sensing image, characterized by comprising the following modules:
an acquisition module: used for acquiring a remote sensing image shot by a camera;
a first correction module: used for performing a primary correction on each pixel of the remote sensing image by using a radiation response uniformity correction method to obtain a first corrected image;
a second correction module: used for performing a secondary correction on each pixel of the first corrected image based on an image processing method to obtain a second corrected image in which the stripe noise is completely eliminated.
8. The fusion correction device of stripe noise according to claim 7, wherein said first correction module performs the following steps when obtaining the first corrected image by performing a primary correction on each pixel of said remote sensing image by using a radiation response uniformity correction method:
s201, acquiring a statistical mean value of gray level response of each pixel, a pixel value of each pixel and a multi-time average dark signal value of each pixel;
s202, calculating a first correction coefficient of the radiation response uniformity of each pixel according to the statistical mean value of the gray level response of each pixel, the pixel value of each pixel and the multiple average dark signal value of each pixel;
s203, correcting the gray value of each pixel according to the first correction coefficient.
9. An electronic device comprising a processor and a memory, wherein the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, perform the steps of the method for fusion correction of stripe noise according to any one of claims 1-6.
10. A storage medium having stored thereon a computer program, wherein the computer program, when being executed by a processor, executes the steps of the method for fusion correction of stripe noise according to any of claims 1-6.
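To make the column-wise secondary correction of claims 4 to 6 more concrete, the following Python/NumPy sketch (a companion to the sketch given after the description of the device modules) walks each image column lying between the fitted upper and lower arcs and scales its pixels by a second correction coefficient sequence. Because the coefficient formula of claim 6 is published only as an equation image, the profile used here is an assumption made for illustration: the coefficient equals 1.0 at the pixels on the two arcs and a preset value at the column midpoint, varying linearly in between. The names secondary_correction, column_coefficients, upper_arc_y, lower_arc_y and midpoint_gain are likewise illustrative and not taken from the patent.

import numpy as np

def column_coefficients(y_top, y_bottom, midpoint_gain):
    """Second correction coefficient sequence for one column between the arcs.

    Assumed profile: 1.0 at the boundary pixels on the upper and lower arcs,
    midpoint_gain at the column midpoint, varying linearly in between.
    """
    y = np.arange(y_top, y_bottom + 1, dtype=np.float64)
    y_mid = (y_top + y_bottom) / 2.0
    half = max((y_bottom - y_top) / 2.0, 1e-9)
    # Normalised distance from the midpoint: 0 at the midpoint, 1 at the arcs
    t = np.abs(y - y_mid) / half
    return midpoint_gain + (1.0 - midpoint_gain) * t

def secondary_correction(first_corrected, upper_arc_y, lower_arc_y, midpoint_gain=0.9):
    """Apply the per-column correction inside the stripe noise region.

    first_corrected : 2-D array, output of the primary correction
    upper_arc_y     : 1-D array, row index of the upper arc for each column x
    lower_arc_y     : 1-D array, row index of the lower arc for each column x
    midpoint_gain   : preset second correction coefficient at the column midpoint
    """
    out = first_corrected.astype(np.float64).copy()
    for x in range(out.shape[1]):
        y0, y1 = int(upper_arc_y[x]), int(lower_arc_y[x])
        if y1 <= y0:
            # Column does not intersect the stripe noise region
            continue
        coeff = column_coefficients(y0, y1, midpoint_gain)
        out[y0:y1 + 1, x] *= coeff
    return out

In this sketch the per-column boundary rows correspond to steps S3021 and S3022 of claim 5, in which the boundary pixels and the column midpoint are obtained from the coordinates of the fitted circle center.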
CN202210465517.2A 2022-04-29 2022-04-29 Fusion correction method and device for stripe noise, electronic equipment and storage medium Pending CN114820376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210465517.2A CN114820376A (en) 2022-04-29 2022-04-29 Fusion correction method and device for stripe noise, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210465517.2A CN114820376A (en) 2022-04-29 2022-04-29 Fusion correction method and device for stripe noise, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114820376A true CN114820376A (en) 2022-07-29

Family

ID=82509600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210465517.2A Pending CN114820376A (en) 2022-04-29 2022-04-29 Fusion correction method and device for stripe noise, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114820376A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116074484A (en) * 2023-01-15 2023-05-05 山东产研卫星信息技术产业研究院有限公司 Bayer color reconstruction method of CMOS satellite image

Similar Documents

Publication Publication Date Title
US11783457B2 (en) Multispectral camera dynamic stereo calibration algorithm based on saliency features
WO2020010945A1 (en) Image processing method and apparatus, electronic device and computer-readable storage medium
US20220036589A1 (en) Multispectral camera external parameter self-calibration algorithm based on edge features
CN110766679A (en) Lens contamination detection method and device and terminal equipment
CN111144213B (en) Object detection method and related equipment
US20240303772A1 (en) Device and method for correspondence analysis in images
CN109118544B (en) Synthetic aperture imaging method based on perspective transformation
CN107403410B (en) Splicing method of thermal infrared images
CN110345875B (en) Calibration and ranging method, device, electronic equipment and computer readable storage medium
JP6830712B1 (en) Random sampling Consistency-based effective area extraction method for fisheye images
CN112513936A (en) Image processing method, device and storage medium
CN116228780A (en) Silicon wafer defect detection method and system based on computer vision
CN113379636A (en) Infrared image non-uniformity correction method, device, equipment and storage medium
CN111931744A (en) Method and device for detecting change of remote sensing image
CN118229749B (en) Full-moon digital orthographic image registration method, full-moon digital orthographic image registration device, electronic equipment and storage medium
CN113962877B (en) Pixel distortion correction method, correction device and terminal
CN114820376A (en) Fusion correction method and device for stripe noise, electronic equipment and storage medium
CN108596981B (en) Aerial view angle re-projection method and device of image and portable terminal
CN114674826A (en) Visual detection method and detection system based on cloth
CN111723753A (en) Satellite remote sensing image strip removing method and device and electronic equipment
CN116385370A (en) Fisheye image processing method, device, electronic equipment and storage medium
CN115760653B (en) Image correction method, device, equipment and readable storage medium
CN116385898A (en) Satellite image processing method and system
CN110334606A (en) Picture-in-picture localization method and device
KR20140038749A (en) Apparatus and method for correcting image distortion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination