CN116862815B - Image sensor seam correction method, system, electronic device and storage medium - Google Patents


Publication number: CN116862815B
Application number: CN202311133832.6A
Authority: CN (China)
Prior art keywords: image, fusion, correction, correction coefficient, sensor
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN116862815A
Inventors: 殷亚祥, 邵云峰, 曹桂平, 董宁
Current Assignee: Hefei Eko Photoelectric Technology Co ltd
Original Assignee: Hefei Eko Photoelectric Technology Co ltd
Application filed by Hefei Eko Photoelectric Technology Co ltd
Priority to CN202311133832.6A
Publication of CN116862815A
Application granted
Publication of CN116862815B


Classifications

    • G06T7/11 Region-based segmentation (G PHYSICS; G06 COMPUTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; G06T7/00 Image analysis; G06T7/10 Segmentation, edge detection)
    • G06T7/90 Determination of colour characteristics (G06T7/00 Image analysis)

Abstract

The invention provides an image sensor seam correction method, system, electronic device and storage medium. The method comprises the following steps: under a uniform light source, collecting a plurality of original images of different brightness and recording their pixel values; calculating correction coefficients of the original image based on the pixel values; segmenting the original image so that each effective sub-region contains a stitching line and the sensor images on both sides of it; calculating, for any effective sub-region, the corresponding region correction coefficients from that sub-region's pixel values; constructing a fusion coefficient function and calculating fusion correction coefficients from the correction coefficients and the region correction coefficients; and correcting the pixel values with the fusion correction coefficients and outputting the corrected image. The invention provides a segmented fusion correction method that applies different correction coefficients in different regions of physical space to reduce the imaging seam effect; the coefficients can be recalibrated as the environment changes, giving the method self-adaptive capability.

Description

Image sensor seam correction method, system, electronic device and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to a method, a system, an electronic device, and a storage medium for correcting a seam of an image sensor.
Background
Existing image sensor non-uniformity correction methods mainly include dark-field non-uniformity correction and light-response non-uniformity correction, and the corrected sensor is typically a monolithic (single-piece) image sensor rather than a stitched one. As shown in fig. 1, to increase resolution, a large-area sensor is built by stitching several sensors together, and the stitching forms obvious vertical or horizontal stitching lines on the captured image, degrading the imaging effect. The imaging of a physical stitching line depends on the light source and the lens angle; a fixed correction coefficient cannot adapt to changes in the acquisition environment and lens, and traditional horizontal- and vertical-stripe correction algorithms cannot completely eliminate the seam effect.
Chinese patent CN112291446B discloses a non-uniformity correction method for a large-area-array CMOS image sensor. First, under the uniform illumination of an integrating sphere, original images are acquired from the CMOS image sensor and their non-uniformity is calculated to obtain the image with the maximum non-uniformity value; then, according to the structural characteristics of the CMOS image sensor, the large-area-array image is processed in groups and parameters are calculated per group; finally, a correction model for non-uniformity correction of the CMOS image sensor is established. That patent differs from the solution of the present invention.
Another Chinese patent proposes a method and device for correcting light-response non-uniformity on a CMOS image sensor. The image offsets at different saturation levels are computed offline, so the online calculation only needs the standard correction matrix; the correction-matrix parameters at a given saturation level are obtained from the offsets at the different saturation levels, and linear interpolation, nonlinear interpolation and the nearest-neighbor principle are introduced into the correction calculation to calibrate the standard parameter matrix and increase correction precision. That patent also differs from the solution of the present invention.
Chinese patent CN102176742B proposes an image correction coefficient acquisition method and a non-uniform image correction method and system. At least two groups of sensor units in a sensor array are used; several background images are acquired under different operating-temperature conditions of the array; the pixels of a pre-corrected image output by the array are grouped to match the grouping of the sensor units; pixel values of the pre-corrected image are collected per group and the mean pixel value of each group is computed; and the correction coefficient for the current background is calculated under the condition that the sum of squared differences between the mean pixel values of every two groups of the corrected image is minimal. The technical problem solved by that patent differs from that of the present invention.
Chinese patent CN113450270A discloses a correction parameter generation method, an electronic device and a storage medium. A digital image of a single-color object is acquired under uniform illumination by the image sensor to be corrected; a photosensitive-response non-uniformity baseline is determined in the digital image, together with a first image area and a second image area symmetric to each other on the two sides of that baseline; a first correction parameter is initialized for each first pixel in the first image area; taking the baseline as the symmetry axis and each first pixel as the reference, a second correction parameter is determined for the second pixel symmetric to it in the second image area; and the correction parameters of the sensor to be corrected are determined from the first and second correction parameters, solving the photosensitive-response non-uniformity problem of related-art image sensors and improving their response uniformity. The technical problem solved by that patent differs from that of the present invention.
Disclosure of Invention
The invention provides a method, a system, an electronic device and a storage medium for correcting an image sensor seam, which can at least solve one of the technical problems.
In order to achieve the above purpose, the present invention proposes the following technical solutions:
an image sensor seam correction method, comprising:
under a uniform light source, collecting a plurality of original images with different brightness, and recording pixel values of the original images;
calculating a correction coefficient of the original image based on the pixel value;
dividing the original image to enable the effective region to contain a stitching line and sensor images on two sides of the stitching line;
calculating a region correction coefficient corresponding to an effective sub-region based on a pixel value of any effective sub-region image;
constructing a fusion coefficient function, and calculating a fusion correction coefficient based on the correction coefficient and the regional correction coefficient;
and correcting the pixel value based on the fusion correction coefficient, and outputting a corrected image.
Further, the constructing a fusion coefficient function includes: taking the stitching line position S as an axis, setting a left fusion width L1 and a right fusion width L2 in the sensor images on the left and right sides of the stitching line; and constructing a fusion coefficient function f(i) around the stitching line position S within the set left and right fusion widths, where i represents the column (or row) index of the image pixels.
Further, the fusion coefficient function f(i) satisfies f(S - L1) = 0, f(S + L2) = 0 and f(S) = 1; f(i) is monotonically increasing on [S - L1, S] and monotonically decreasing on [S, S + L2].
Further, the sum of fusion widths set in any one of the sensor images is smaller than the width of the sensor image.
Further, before the correcting the pixel value based on the fusion correction coefficient, the method further includes: storing the fusion correction coefficient corresponding to each effective sub-region.
Further, the correcting the pixel value based on the fusion correction coefficient includes: and finding out the effective subarea based on the pixel value and the pixel position of the original image, determining a fusion correction coefficient corresponding to the effective subarea, correcting the pixel value, and outputting a corrected image.
The invention also provides an image sensor seam correction system, which comprises:
the image acquisition unit acquires a plurality of original images with different brightness under a uniform light source and records pixel values of the original images;
an image segmentation unit for segmenting the original image so that the effective region contains a stitching line and sensor images on two sides of the stitching line;
a parameter calculation unit that calculates a correction coefficient of an original image based on pixel values of the original image; calculating a region correction coefficient corresponding to an effective sub-region based on a pixel value of any effective sub-region image; constructing a fusion coefficient function, and calculating a fusion correction coefficient based on the correction coefficient and the regional correction coefficient;
and the correction unit is used for finding out the effective subarea based on the pixel value and the pixel position of the original image, determining a fusion correction coefficient corresponding to the effective subarea, correcting the pixel value and outputting a corrected image.
Further, the method further comprises the following steps:
and the storage unit is used for storing the fusion correction coefficient corresponding to any effective subarea.
The invention also proposes an electronic device comprising a memory in which a computer program is stored and a processor arranged to run the computer program to perform the image sensor seam correction method as described above.
The present invention also proposes a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the image sensor seam correction method as described above.
The beneficial effects of the invention are as follows: the invention provides a segmented fusion correction method that applies different correction coefficients in different regions of physical space to reduce the imaging seam effect; the coefficients can be recalibrated as the environment changes, giving the method self-adaptive capability.
Drawings
FIG. 1 is a schematic diagram of the image sensor stitching effect;
FIG. 2 is a flowchart of the image sensor seam correction method in embodiment 1 of the present invention;
FIG. 3 is a schematic diagram of the segmented correction in embodiment 1 of the present invention;
FIG. 4 is a schematic diagram of setting fusion widths on a multi-sensor image in embodiment 1 of the present invention;
FIG. 5 is a plot of the fusion coefficient function in embodiment 1 of the present invention;
FIG. 6 is a flowchart of the image correction in embodiment 1 of the present invention;
FIG. 7 is an image before seam correction in embodiment 1 of the present invention;
FIG. 8 is an image after seam correction in embodiment 1 of the present invention;
FIG. 9 is a schematic diagram of the segmented correction in embodiment 2 of the present invention;
FIG. 10 is a plot of the fusion coefficient function in embodiment 2 of the present invention;
FIG. 11 is a plot of the fusion coefficient function in embodiment 3 of the present invention;
FIG. 12 is a plot of the fusion coefficient function in embodiment 4 of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention.
A horizontal physical stitching line forms a horizontal demarcation line on the image, and a vertical physical stitching line forms a vertical demarcation line; the corresponding correction algorithms are called horizontal-stripe correction and vertical-stripe correction. The correction principle for a horizontal stitching line is identical to that for a vertical one, so the embodiments of the invention are illustrated only with a vertical demarcation line (vertical-stripe correction). The proposed correction method is equally applicable to seams in the horizontal direction: it suffices to swap the row and column coordinates in the scheme.
Example 1
As shown in fig. 2, the embodiment takes a method of correcting a seam in a vertical direction as an example, and specifically includes:
under the uniform light source, a plurality of original images with different brightness are collected, and pixel values of the original images are recorded, wherein the pixel values comprise row pixel values and column pixel values.
The invention is described by taking the vertical seam correction method as an example, so the recorded pixel values are column pixel values indexed by the column coordinate of the image. The original image comprises several sensor images, with the stitching lines as the boundaries between them.
In addition, if a horizontal stitching line is to be corrected, the row coordinate is used as the index and the pixel values of the original image are recorded as row pixel values; the correction method is otherwise the same as for a vertical stitching line, with the column coordinates in the vertical-seam scheme replaced by row coordinates.
In this embodiment, several images of different brightness are collected under a uniform light source, and for any image the column pixel values x_{m,i} are recorded, where the subscript m denotes the brightness level of the current light source and i denotes the i-th column of the image.
Under the first-level light source brightness, the column pixel values x_{1,i} of the original image are recorded, denoted (x_{1,1}, x_{1,2}, ..., x_{1,i}); the light source brightness is then switched, and under the second-level brightness the column pixel values x_{2,i} are recorded, denoted (x_{2,1}, x_{2,2}, ..., x_{2,i}).
Based on the pixel values, correction coefficients of the original image are calculated.
In this embodiment, taking the column pixel values recorded under two light source brightness levels as an example, a traditional vertical-stripe correction algorithm is used to calculate the correction coefficients of the original image. The correction coefficients of the original image comprise the multiplication coefficient k_i and the addition coefficient b_i.
From the recorded column pixel values of the original images at the different brightness levels, the column pixel means y_{m,i} are calculated, and from each column pixel mean y_{m,i} the corresponding correction coefficients, i.e. the multiplication coefficient k_i and the addition coefficient b_i, are computed.
In the traditional vertical-stripe correction algorithm, the calculated multiplication coefficient k_i and addition coefficient b_i correct any pixel p_{r,i} via the formula p_0 = (p_{r,i} + b_i) × k_i, where the subscript r denotes image row r and i denotes image column i; p_{r,i} is the pixel value at row r, column i of any sensor image (the pixel input value), and p_0 is the pixel output value.
Preferably, in practical applications, several (at least two) light source brightness levels can be set and the corresponding column pixel values of the sensor images acquired. If column pixel values are recorded at more than two brightness levels, other fitting methods, such as least squares, can be applied to the column pixel means to obtain the coefficients.
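The two-point calculation above can be sketched as follows. The coefficient formulas themselves are not reproduced in this text, so the sketch uses one standard two-point solution consistent with the correction formula p_0 = (p_{r,i} + b_i) × k_i: k_i and b_i are chosen so that each column's mean maps to the global frame mean at both brightness levels (an assumption, not necessarily the patent's exact formula).

```python
import numpy as np

def column_correction_coeffs(img1, img2):
    """Two-point per-column coefficients: solve (y_{m,i} + b_i) * k_i = Y_m
    for both brightness levels m = 1, 2, where y_{m,i} is the column mean
    and Y_m the global frame mean (assumed form)."""
    y1 = img1.mean(axis=0)             # column means at brightness level 1
    y2 = img2.mean(axis=0)             # column means at brightness level 2
    Y1, Y2 = img1.mean(), img2.mean()  # global targets
    k = (Y2 - Y1) / (y2 - y1)          # multiplication coefficient k_i
    b = Y1 / k - y1                    # addition coefficient b_i
    return k, b

def correct(img, k, b):
    """Apply p_0 = (p_{r,i} + b_i) * k_i column-wise."""
    return (img + b) * k

# Synthetic uniform frames with a per-column gain/offset pattern.
rng = np.random.default_rng(0)
gain = 1.0 + 0.1 * rng.standard_normal(8)
offset = 2.0 * rng.standard_normal(8)
img1 = np.tile(gain * 50.0 + offset, (4, 1))   # level-1 uniform scene
img2 = np.tile(gain * 150.0 + offset, (4, 1))  # level-2 uniform scene
k, b = column_correction_coeffs(img1, img2)
out = correct(img1, k, b)
# After correction, every column of the level-1 frame matches its global mean.
assert np.allclose(out, img1.mean())
```

By construction the same coefficients also flatten the level-2 frame, which is the point of the two-point fit.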
The original image is segmented so that the effective region contains stitching lines and sensor images on both sides of the stitching lines.
The number and direction of the dividing lines can be set freely. The segmentation must yield several effective sub-regions; any effective sub-region must contain the stitching line and the sensor images on both its sides, and together all effective sub-regions must reconstruct the stitching line and the sensor images on its two sides. In the present invention, at least one dividing line is required, dividing the whole image into at least 2 regions.
As shown in fig. 3, in this embodiment a dividing line is set perpendicular to the stitching line, dividing the original image into upper and lower horizontal regions. When the seam runs vertically, the number of dividing lines must be less than half the number of rows of the sensor image; when the seam runs horizontally, less than half the number of columns.
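With horizontal dividing lines, locating the effective sub-region of a pixel reduces to mapping its row index to a region number. A minimal sketch (the helper name and boundary convention are illustrative, not from the patent):

```python
def region_index(r, dividing_rows):
    """Return the index j of the horizontal region that contains image
    row r, given the sorted row positions of the dividing lines."""
    j = 0
    for boundary in dividing_rows:
        if r >= boundary:
            j += 1
        else:
            break
    return j

# One dividing line at row 512 splits the image into regions 0 and 1.
assert region_index(100, [512]) == 0
assert region_index(512, [512]) == 1
assert region_index(5, [2, 4]) == 2  # two dividing lines, three regions
```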
And calculating a region correction coefficient corresponding to the effective sub-region based on the pixel value of any effective sub-region image.
In this embodiment, referring to fig. 3, the original image is divided into two upper and lower horizontal areas, and any horizontal area contains a stitching line and two side sensor images, so that the horizontal areas all belong to the effective sub-areas.
The region correction coefficients are calculated separately for each horizontal region j as follows: from the recorded column pixel values of the original image, find the column pixel values x_{m,i,j} corresponding to horizontal region j, where the subscript m denotes the brightness level of the current light source, i denotes the i-th column of the image, and j denotes the region number.
Under the first-level light source brightness, the column pixel values corresponding to horizontal region j are denoted (x_{1,1,j}, x_{1,2,j}, ..., x_{1,i,j}); under the second-level light source brightness, they are denoted (x_{2,1,j}, x_{2,2,j}, ..., x_{2,i,j}).
The regional column pixel means y_{j,i,m} are calculated, and from them the corresponding region multiplication coefficients k_{j,i} and region addition coefficients b_{j,i} are computed.
and constructing a fusion coefficient function, and calculating a fusion correction coefficient based on the correction coefficient and the regional correction coefficient.
The fusion coefficient function is constructed as follows: taking the stitching line position S as an axis, a left fusion width L1 and a right fusion width L2 are set in the sensor images on the left and right sides, so that the total fusion width is 2L = L1 + L2; within the total fusion width, a fusion coefficient function f(i) is constructed around the stitching line position S, where i denotes the column (or row) index of the image pixels.
The fusion coefficient function satisfies f(S - L1) = 0, f(S + L2) = 0 and f(S) = 1; f(i) is monotonically increasing on [S - L1, S] and monotonically decreasing on [S, S + L2].
The range of the total fusion width is related to the number of rows and columns of the sensor and the positions of the seams, and the sum of the fusion widths set in any sensor image is smaller than the width of the sensor image.
As shown in fig. 4, the original image is stitched from 4 sensor images and contains 3 physical stitching lines. The width of sensor image 1 is d1, i.e. the distance between physical stitching line 1 and the left boundary; the width of sensor image 2 is d2, i.e. the distance between physical stitching lines 1 and 2; the width of sensor image 3 is d3, i.e. the distance between physical stitching lines 2 and 3; the width of sensor image 4 is d4, i.e. the distance between physical stitching line 3 and the right boundary.
The fusion width ranges are set per stitching line. For physical stitching line 1: a left fusion width L1 in sensor image 1 and a right fusion width L2 in sensor image 2. For physical stitching line 2: a left fusion width L3 in sensor image 2 and a right fusion width L4 in sensor image 3. For physical stitching line 3: a left fusion width L5 in sensor image 3 and a right fusion width L6 in sensor image 4.
The widths satisfy L1 < d1 in sensor image 1; L2 + L3 < d2 in sensor image 2; L4 + L5 < d3 in sensor image 3; and L6 < d4 in sensor image 4.
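These width constraints can be checked mechanically. A small sketch, assuming the fusion widths are given per stitching line as (left, right) pairs:

```python
def widths_valid(sensor_widths, fusion_pairs):
    """sensor_widths: e.g. [d1, d2, d3, d4]; fusion_pairs: e.g.
    [(L1, L2), (L3, L4), (L5, L6)], one (left, right) pair per
    stitching line.  The fusion widths assigned to each sensor image
    must sum to less than that image's width."""
    used = [0] * len(sensor_widths)
    for s, (left, right) in enumerate(fusion_pairs):
        used[s] += left       # left width lies in the sensor left of the line
        used[s + 1] += right  # right width lies in the sensor right of it
    return all(u < d for u, d in zip(used, sensor_widths))

assert widths_valid([100, 100, 100, 100], [(30, 30), (30, 30), (30, 30)])
# L2 + L3 = 110 >= d2 = 100, so this configuration is rejected.
assert not widths_valid([100, 100, 100, 100], [(30, 80), (30, 30), (30, 30)])
```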
As shown in fig. 5, this embodiment proposes the following function f(i) satisfying the above requirements: f(i) = (i - S + L1)/L1 for S - L1 <= i <= S; f(i) = (S + L2 - i)/L2 for S < i <= S + L2; and f(i) = 0 elsewhere.
the function f (i) proposed in the present embodiment is a linear function, and is only used as an example of a fusion coefficient function, and the specific definition of the fusion coefficient function may be set according to the actual situation and the requirement.
Based on the determined fusion coefficient function, the correction coefficients of the original image, and the region correction coefficients, the fusion correction coefficients are calculated for each horizontal region j.
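The fusion formula itself is not reproduced in this text. A plausible sketch, assuming the fusion coefficient function weights the region coefficients near the seam and the whole-image coefficients away from it (this blend form is an assumption, not the patent's published formula):

```python
def fused_coeffs(i, j, f, k_global, b_global, k_region, b_region):
    """Blend region coefficients (dominant near the seam, where f(i) -> 1)
    with whole-image coefficients (dominant away from it, where f(i) -> 0).
    ASSUMED form: k'_{j,i} = f(i)*k_{j,i} + (1 - f(i))*k_i, likewise for b."""
    w = f(i)
    k_hat = w * k_region[j][i] + (1.0 - w) * k_global[i]
    b_hat = w * b_region[j][i] + (1.0 - w) * b_global[i]
    return k_hat, b_hat

# Tiny illustration: column 1 sits on the seam (weight 1), column 0 does not.
k_global = [1.0, 1.1]; b_global = [0.0, -1.0]
k_region = [[1.2, 0.9]]; b_region = [[0.5, 0.2]]
f = lambda i: 1.0 if i == 1 else 0.0
assert fused_coeffs(1, 0, f, k_global, b_global, k_region, b_region) == (0.9, 0.2)
assert fused_coeffs(0, 0, f, k_global, b_global, k_region, b_region) == (1.0, 0.0)
```

Any blend of this shape varies smoothly from the region coefficients at the seam to the global coefficients at the fusion-width edges, which is what suppresses a visible coefficient jump.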
in this embodiment, a nonvolatile memory is further provided, and is configured to store the calculated fusion correction coefficient, so that the fusion correction coefficient is conveniently found in a subsequent correction process, and correction efficiency is improved.
And correcting the pixel value based on the fusion correction coefficient, and outputting a corrected image. The method specifically comprises the following steps: and finding out the effective subarea according to the pixel position (r, i), determining a fusion correction coefficient corresponding to the effective subarea, correcting the pixel value, and outputting a corrected image.
As shown in fig. 6, the region j containing the current pixel is determined from the pixel position (r, i), and the corresponding fusion correction coefficients k'_{j,i} and b'_{j,i} are read from the nonvolatile memory. The pixel value p_{r,i} is then linearly corrected with the formula p_out = (p_{r,i} + b'_{j,i}) × k'_{j,i}; the correction result p_out replaces the pixel value p_{r,i}, completing the linear correction.
Here the subscript r denotes image row r and i denotes image column i; p_{r,i} is the pixel value at row r, column i of the sensor (the pixel input value), and p_out is the pixel output value.
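The per-pixel correction loop can be sketched as follows, with the fusion correction coefficients precomputed per region as arrays (the array layout is an assumption for illustration):

```python
import numpy as np

def correct_image(img, dividing_rows, k_hat, b_hat):
    """Per-pixel linear correction p_out = (p_{r,i} + b'_{j,i}) * k'_{j,i},
    where j is the horizontal region of row r.  k_hat and b_hat hold the
    precomputed fusion coefficients, shape (num_regions, num_cols)."""
    out = np.empty_like(img, dtype=float)
    for r in range(img.shape[0]):
        j = np.searchsorted(dividing_rows, r, side='right')  # region of row r
        out[r] = (img[r] + b_hat[j]) * k_hat[j]
    return out

# Two regions split at row 2; region 1 doubles the pixel values.
img = np.full((4, 3), 10.0)
k_hat = np.array([[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]])
b_hat = np.zeros((2, 3))
out = correct_image(img, [2], k_hat, b_hat)
assert np.all(out[:2] == 10.0) and np.all(out[2:] == 20.0)
```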
The image before processing in this embodiment is shown in fig. 7; the image obtained after processing with the above correction method is shown in fig. 8.
Alternatively, no memory is provided and the fusion correction coefficients of the effective sub-regions are not stored. In that case the correction proceeds as follows: the region j containing the current pixel is determined from the pixel position (r, i); the parameter calculation above is repeated to compute the corresponding fusion correction coefficients k'_{j,i} and b'_{j,i}; the pixel value p_{r,i} is linearly corrected with p_out = (p_{r,i} + b'_{j,i}) × k'_{j,i}; and the correction result p_out replaces p_{r,i}, completing the linear correction.
Example 2
On the basis of embodiment 1, a new fusion coefficient function is proposed in this embodiment. The concrete construction process is as follows:
setting the symmetrical ranges of the left side and the right side as fusion width L by taking the stitching line position S as an axis, and setting the total fusion width as 2L; as a fusion coefficient function f (i), an axisymmetric function around the stitching line position S is used within the fusion width range, where i represents the number of columns or rows of image pixels.
The specific setting of the fusion width in this embodiment is shown in fig. 9.
The fusion coefficient function satisfies f(S - L) = 0, f(S + L) = 0 and f(S) = 1, and f(i) is monotonically increasing on [S - L, S] (hence, by symmetry, monotonically decreasing on [S, S + L]).
This embodiment proposes a family of functions f(i) satisfying the above requirements, parameterized by a variable n that controls how the function's monotonic increase develops over the range [S - L, S], where L denotes half the total fusion width. The family proposed in this embodiment is only an example of a fusion coefficient function; its specific definition can be set according to the actual situation and requirements.
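The family's expression is not reproduced in this text. One family consistent with every stated property (f(S - L) = f(S + L) = 0, f(S) = 1, axisymmetry about S, linearity at n = 1, faster-and-faster growth for 0 < n < 1, slower-and-slower growth for n > 1) is f_n(i) = 1 - (|i - S|/L)^n, sketched below as an assumption rather than the patent's exact definition:

```python
def fusion_power(i, S, L, n):
    """Candidate symmetric family f_n(i) = 1 - (|i - S| / L)**n on
    [S - L, S + L], 0 outside.  ASSUMED form, consistent with the stated
    properties: linear at n = 1, accelerating increase on [S - L, S] for
    0 < n < 1, decelerating increase for n > 1."""
    t = abs(i - S) / L
    return 1.0 - t ** n if t <= 1.0 else 0.0

assert fusion_power(200, S=200, L=64, n=2) == 1.0  # at the seam
assert fusion_power(136, S=200, L=64, n=2) == 0.0  # edge of the fusion range
assert fusion_power(168, S=200, L=64, n=1) == 0.5  # linear case, halfway
```

For n > 1 the weight rises quickly at the edge of the fusion range and flattens near the seam; for 0 < n < 1 it does the opposite.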
As shown in fig. 9, the stitching line is at a distance d1 from the left boundary of the left sensor 1 and at a distance d2 from the right boundary of the right sensor 2, and the fusion width must satisfy L < min(d1, d2). To better exhibit the effect of region-division fusion, this embodiment sets L < min(d1/2, d2/2).
In practical applications, the fusion coefficient function f(i) can be computed for several values of n, and the final n chosen according to the fusion effect of f(i), thereby determining the fusion correction coefficients.
In this embodiment, n = 1 is taken, constructing the fusion coefficient function f(i) shown in fig. 10, which is linear on each side of the stitching line.
example 3
In this embodiment, a different fusion coefficient function f(i) is set on the basis of embodiment 2. When 0 < n < 1, the function increases faster and faster over the range [S - L, S]. For n = 1/2, the plot of f(i) is shown in fig. 11.
example 4
In this embodiment, a different fusion coefficient function f(i) is set on the basis of embodiment 2. When n > 1, the function increases more and more slowly over the range [S - L, S]. For n = 2, the plot of f(i) is shown in fig. 12.
the invention provides a segmentation fusion correction method, which adopts different correction coefficients in different areas of a physical space to reduce an imaging seam effect, so that the method can recalibrate the coefficients according to environmental changes and has self-adaption capability.
The invention also provides an image sensor seam correction system, which comprises:
the image acquisition unit acquires a plurality of original images with different brightness under a uniform light source and records pixel values of the original images;
an image segmentation unit for segmenting the original image so that the effective region contains a stitching line and sensor images on two sides of the stitching line;
a parameter calculation unit that calculates a correction coefficient of an original image based on pixel values of the original image; calculating a region correction coefficient corresponding to an effective sub-region based on a pixel value of any effective sub-region image; constructing a fusion coefficient function, and calculating a fusion correction coefficient based on the correction coefficient and the regional correction coefficient;
and the correction unit is used for finding out the effective subarea based on the pixel value and the pixel position of the original image, determining a fusion correction coefficient corresponding to the effective subarea, correcting the pixel value and outputting a corrected image.
Further comprises: and the storage unit is used for storing the fusion correction coefficient corresponding to any effective subarea.
The invention also proposes an electronic device comprising a memory and a processor, the memory storing a computer program, the processor being arranged to run the computer program to perform the above-mentioned image sensor seam correction method.
The present invention also proposes a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the image sensor seam correction method as described above.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations are enumerated; nevertheless, any combination of these technical features that contains no contradiction should be considered within the scope of this description.
The above embodiments merely illustrate the technical solution of the present invention and are not limiting. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described therein may still be modified, or some of their technical features replaced by equivalents, and that such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. An image sensor seam correction method, comprising:
under a uniform light source, acquiring a plurality of original images of different brightness and recording the pixel values of the original images;
calculating a correction coefficient of the original image based on the pixel values;
segmenting the original image so that the effective region contains the stitching line and the sensor images on both sides of the stitching line;
calculating a region correction coefficient corresponding to any effective sub-region based on the pixel values of that sub-region image;
constructing a fusion coefficient function, and calculating a fusion correction coefficient based on the correction coefficient and the region correction coefficient;
and locating the effective sub-region based on the pixel values and pixel positions of the original image, determining the fusion correction coefficient corresponding to that sub-region, correcting the pixel values, and outputting the corrected image.
2. The image sensor seam correction method of claim 1, wherein constructing the fusion coefficient function comprises: taking the stitching line position S as an axis, setting a left fusion width L1 and a right fusion width L2 in the sensor images on the left and right sides, so that the total fusion width is 2L; and, within the total fusion width, constructing a fusion coefficient function f(i) around the stitching line position S, where i represents the column or row index of an image pixel.
3. The image sensor seam correction method of claim 2, wherein the fusion coefficient function f(i) satisfies f(S-L1) = 0, f(S+L2) = 0 and f(S) = 1, f(i) being monotonically increasing over the range [S-L1, S] and monotonically decreasing over the range [S, S+L2].
4. The image sensor seam correction method according to claim 2, wherein the sum of the fusion widths set in any sensor image is smaller than the width of that sensor image.
5. The image sensor seam correction method according to claim 1, further comprising, before correcting the pixel values based on the fusion correction coefficient: storing the fusion correction coefficient corresponding to any effective sub-region.
6. An image sensor seam correction system, comprising:
an image acquisition unit that acquires a plurality of original images of different brightness under a uniform light source and records the pixel values of the original images;
an image segmentation unit that segments the original image so that the effective region contains the stitching line and the sensor images on both sides of the stitching line;
a parameter calculation unit that calculates a correction coefficient of the original image based on its pixel values, calculates a region correction coefficient for any effective sub-region based on the pixel values of that sub-region image, constructs a fusion coefficient function, and calculates a fusion correction coefficient based on the correction coefficient and the region correction coefficient;
and a correction unit that locates the effective sub-region based on the pixel values and pixel positions of the original image, determines the fusion correction coefficient corresponding to that sub-region, corrects the pixel values, and outputs the corrected image.
7. The image sensor seam correction system of claim 6, further comprising:
and the storage unit is used for storing the fusion correction coefficient corresponding to any effective subarea.
8. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the image sensor seam correction method of any of claims 1-5.
9. A computer readable storage medium having stored thereon a computer program, wherein the computer program when executed by a processor implements the image sensor seam correction method of any of claims 1-5.
CN202311133832.6A 2023-09-05 2023-09-05 Image sensor seam correction method, system, electronic device and storage medium Active CN116862815B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311133832.6A CN116862815B (en) 2023-09-05 2023-09-05 Image sensor seam correction method, system, electronic device and storage medium


Publications (2)

Publication Number Publication Date
CN116862815A CN116862815A (en) 2023-10-10
CN116862815B true CN116862815B (en) 2023-11-14

Family

ID=88232683

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311133832.6A Active CN116862815B (en) 2023-09-05 2023-09-05 Image sensor seam correction method, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN116862815B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103856727A (en) * 2014-03-24 2014-06-11 北京工业大学 Multichannel real-time video splicing processing system
CN109194872A (en) * 2018-10-24 2019-01-11 深圳六滴科技有限公司 Panoramic image pixel brightness correcting method, device, panorama camera and storage medium
CN112085659A (en) * 2020-09-11 2020-12-15 中德(珠海)人工智能研究院有限公司 Panorama splicing and fusing method and system based on dome camera and storage medium
CN113450270A (en) * 2021-05-26 2021-09-28 浙江大华技术股份有限公司 Correction parameter generation method, electronic device, and storage medium
KR20220017697A (en) * 2020-08-05 2022-02-14 한국기술교육대학교 산학협력단 calibration method and apparatus among mutiple sensors
CN115423821A (en) * 2022-08-10 2022-12-02 中国科学院深圳先进技术研究院 LED screen splicing area image segmentation method and LED screen bright and dark line correction method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9743057B2 (en) * 2012-05-31 2017-08-22 Apple Inc. Systems and methods for lens shading correction
US10802085B2 (en) * 2018-12-11 2020-10-13 Vulcan Inc. Magneto optic disk imager


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Real-time video stitching technology based on small-region fusion; Li Yong; Du Bingxin; Journal of Jilin University (Science Edition) (06); 181-186 *
Correction of non-uniformity in area-array CCD image sensors; Li Hua, Xue Jianguo; Sensor Technology (04); 16-18 *
High-speed, high-resolution line-scan industrial camera; Dong Ning et al.; Science and Technology Achievements; 1 *


Similar Documents

Publication Publication Date Title
CA2659847C (en) System and method for adaptive non-uniformity compensation for a focal plane array
CN104252700B (en) A kind of histogram equalization method of infrared image
JP2010252325A (en) System and method for image correction
CN108550113A (en) Image scanning output method, device, computer equipment and storage medium
CN106506899A (en) A kind of image acquisition based on machine vision and method for reconstructing and device
TWI777536B (en) Enhanced training method and device for image recognition model
CN101510962B (en) Method and apparatus for correcting lens shadow
WO2022127225A1 (en) Image stitching method and apparatus, and device and storage medium
JP2006279920A (en) Image display device, method of generating correction value of image display device, program for generating correction value of image display device, and recording medium recording program thereon
TWI479454B (en) Method and apparatus for correcting for vignetting in an imaging system
CN116862815B (en) Image sensor seam correction method, system, electronic device and storage medium
CN104748865B (en) A kind of multi-point combination bearing calibration for infrared image
CN115100078B (en) Method and related device for correcting and filling dot matrix coordinates in curved screen image
KR101854355B1 (en) Image correction apparatus selectively using multi-sensor
CN112819738B (en) Infrared image fusion method, device, computer equipment and storage medium
CN108510547A (en) A kind of telecentricity moves camera shaft scaling method and system
CN115720299A (en) Black level correction method and device, computer readable storage medium and terminal
CN110276738B (en) ROI replacement processing method of BMP image
CN103929584B (en) Method for correcting image and image calibrating circuit
CN111369552A (en) Infrared blind pixel detection method and device and computer readable storage medium
CN105959598B (en) Camera multichannel balances look-up table scaling method, multichannel balance method and system
CN117671036B (en) Correction parameter calibration method, device, computer equipment and storage medium
CN103229497B (en) For the method and apparatus for the screen window effect for estimating image detection device
CN114007055B (en) Image sensor lens shading correction method and device
CN108510548A (en) A kind of telecentricity moves camera shaft scaling method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant