CN115499629B - Lateral chromatic aberration correction method, device, equipment and storage medium - Google Patents

Publication number
CN115499629B
CN115499629B (application CN202211066007.4A)
Authority
CN
China
Prior art keywords
image
image point
correction
channel
point
Prior art date
Legal status
Active
Application number
CN202211066007.4A
Other languages
Chinese (zh)
Other versions
CN115499629A (en)
Inventor
郭金阳
胥立丰
崔炜
Current Assignee
Beijing Eswin Computing Technology Co Ltd
Original Assignee
Beijing Eswin Computing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Eswin Computing Technology Co Ltd filed Critical Beijing Eswin Computing Technology Co Ltd
Priority claimed from application CN202211066007.4A
Publication of CN115499629A
Application granted
Publication of CN115499629B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the present application disclose a lateral chromatic aberration correction method, device, equipment and storage medium, wherein the method includes the following steps: determining a chromatic aberration offset position of each first image point based on the position information of each first image point in an image to be corrected, and correcting the channel value of each first image point based on its chromatic aberration offset position to obtain an initial corrected image; determining a correction boundary corresponding to at least one correction direction based on the channel values of the image points in the initial corrected image; and determining at least one target image point combination corresponding to each correction boundary, and correcting the channel values of the image points in each target image point combination based on the channel values of the image points in the initial corrected image to obtain a target corrected image. By adopting the embodiments of the present application, the lateral chromatic aberration correction effect can be improved, and the applicability is high.

Description

Lateral chromatic aberration correction method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing, and in particular, to a method, apparatus, device, and storage medium for correcting lateral chromatic aberration.
Background
Since light rays are refracted when passing through the lens of an image capturing device, the different color channels of each pixel point are displaced by different distances in the direction perpendicular to the optical axis during imaging, so that the captured image exhibits lateral chromatic aberration.
The correction effect of the prior art on lateral chromatic aberration is limited and its accuracy is not high, so how to further improve the lateral chromatic aberration correction effect has become a problem to be solved.
Disclosure of Invention
The embodiments of the present application provide a lateral chromatic aberration correction method, device, equipment and storage medium, which can improve the lateral chromatic aberration correction effect and have high applicability.
In one aspect, an embodiment of the present application provides a lateral chromatic aberration correction method, including:
determining a chromatic aberration offset position of each first image point based on position information of each first image point in an image to be corrected, and correcting a channel value of the first image point based on the chromatic aberration offset position of each first image point to obtain an initial corrected image, wherein the image to be corrected is a RAW image, and each first image point is one image point except a G channel image point in the image to be corrected;
determining a correction boundary corresponding to at least one correction direction based on channel values of each image point in the initial correction image;
determining at least one target image point combination corresponding to each correction boundary, and correcting the channel value of each image point in each target image point combination based on the channel value of each image point in the initial correction image to obtain a target correction image;
Wherein, each target image point combination corresponding to the correction boundary comprises an image point adjacent to the correction boundary in the initial correction image and an image point adjacent to the image point in the correction direction corresponding to the correction boundary, and each image point in each target image point combination is positioned on the same side of the corresponding correction boundary.
In another aspect, an embodiment of the present application provides a lateral chromatic aberration correction device, including:
the correction module is used for determining a chromatic aberration offset position of each first image point based on the position information of each first image point in the image to be corrected, correcting the channel value of the first image point based on the chromatic aberration offset position of each first image point to obtain an initial correction image, wherein the image to be corrected is a RAW image, and each first image point is one image point except a G channel image point in the image to be corrected;
the determining module is used for determining a correction boundary corresponding to at least one correction direction based on the channel value of each image point in the initial correction image;
the correction module is used for determining at least one target image point combination corresponding to each correction boundary, and correcting the channel value of each image point in each target image point combination based on the channel value of each image point in the initial correction image to obtain a target correction image;
Wherein, each target image point combination corresponding to the correction boundary comprises an image point adjacent to the correction boundary in the initial correction image and an image point adjacent to the image point in the correction direction corresponding to the correction boundary, and each image point in each target image point combination is positioned on the same side of the corresponding correction boundary.
In another aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the processor and the memory are connected to each other;
the memory is used for storing a computer program;
the processor is configured to execute the lateral chromatic aberration correction method provided by the embodiment of the application when the computer program is called.
In another aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program that, when executed by a processor, implements the lateral chromatic aberration correction method provided by the embodiments of the present application.
In the embodiments of the present application, according to the chromatic aberration offset position of each B-channel image point and each R-channel image point in the image to be corrected, the chromatic aberration of each B-channel image point and each R-channel image point may be initially corrected to obtain an initial corrected image. The correction boundary corresponding to each correction direction may then be further determined according to the channel values of the image points in the initial corrected image, for example, the transverse correction boundaries and the longitudinal correction boundaries in the initial corrected image, and the channel values of the image points in the target image point combinations corresponding to each correction boundary may be further corrected to obtain the target corrected image, thereby further improving the lateral chromatic aberration correction effect on the basis of the initial correction.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a lateral chromatic aberration correction method provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a RAW image provided in an embodiment of the present application;
FIG. 3a is a schematic diagram of a scene for determining a correction image point according to an embodiment of the present application;
FIG. 3b is a schematic view of another scenario in which a correction image point is determined according to an embodiment of the present application;
FIG. 3c is a schematic view of yet another scenario in which a correction image point is determined according to an embodiment of the present application;
fig. 4 is a schematic view of a scene of a patch image point according to an embodiment of the present disclosure;
FIG. 5a is a schematic diagram of a scenario for determining correction boundaries according to an embodiment of the present application;
FIG. 5b is another schematic view of a scenario for determining correction boundaries provided by embodiments of the present application;
FIG. 5c is a schematic view of a scenario for determining a combination of target image points according to an embodiment of the present disclosure;
FIG. 5d is a schematic diagram of another scenario for determining a combination of target image points according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a lateral chromatic aberration correction device provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
Referring to fig. 1, fig. 1 is a schematic flow chart of a lateral chromatic aberration correction method provided in an embodiment of the present application. As shown in fig. 1, the lateral chromatic aberration correction method provided in the embodiment of the present application may specifically include the following steps:
step S11, determining a chromatic aberration offset position of each first image point based on the position information of each first image point in the image to be corrected, and correcting the channel value of the first image point based on the chromatic aberration offset position of each first image point to obtain an initial corrected image.
The image to be corrected is a RAW image, and the RAW image contains physical information about illumination intensity and color of a scene.
Each image point in the RAW image corresponds to one color channel: when the color channel corresponding to an image point is green, the image point may be called a G-channel image point; when it is red, an R-channel image point; and when it is blue, a B-channel image point.
The image to be corrected can be any one of GRBG format, RGGB format, BGGR format and GBRG format. As shown in fig. 2, fig. 2 is a schematic diagram of a RAW image provided in an embodiment of the present application. Fig. 2 shows the arrangement of G-channel pixels, B-channel pixels and R-channel pixels in RAW images of different formats.
Each first image point in the image to be corrected is an image point except for the G-channel image point in the image to be corrected, that is, each R-channel image point and each B-channel image point in the image to be corrected can be regarded as one first image point.
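The Bayer-pattern channel assignment described above can be illustrated with a short helper (a hypothetical function, not part of the patent): given a pixel's row and column and the 2x2 tile pattern of the RAW format, it returns the pixel's color channel.

```python
# Hypothetical helper illustrating the Bayer channel layout; not from the patent.
# `pattern` names the 2x2 tile row-major, e.g. "GRBG" means
# (row 0, col 0) = G, (row 0, col 1) = R, (row 1, col 0) = B, (row 1, col 1) = G.
def bayer_channel(row, col, pattern="GRBG"):
    return pattern[2 * (row % 2) + (col % 2)]
```

Under this convention, every R-channel or B-channel image point (i.e. every first image point) lies two rows or two columns away from its nearest same-channel neighbors.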
In some possible embodiments, for each first image point in the image to be corrected, when determining the color difference offset position of the first image point based on the position information of each first image point in the image to be corrected, the color difference offset parameter corresponding to the R channel and the color difference offset parameter corresponding to the B channel may be determined first.
Further, for each first image point in the image to be corrected, a first lateral distance, a first longitudinal distance, and a first linear distance of the first image point relative to an image center of the image to be corrected may be determined based on the position information of the first image point in the image to be corrected.
Wherein the first lateral distance of the first image point i compared with the image center of the image to be corrected can be expressed as:

d_x^i = x_i − x_0

the first longitudinal distance can be expressed as:

d_y^i = y_i − y_0

and the first straight-line distance can be expressed as:

d_r^i = √((x_i − x_0)² + (y_i − y_0)²)

wherein d_x^i represents the first lateral distance corresponding to the image point i, d_y^i represents the first longitudinal distance corresponding to the image point i, and d_r^i represents the first straight-line distance corresponding to the image point i; the coordinates of the first image point i are (x_i, y_i), and the coordinates of the image center of the image to be corrected are (x_0, y_0).

Alternatively, the abscissa of the image center of the image to be corrected may be determined as x_0 = width/2 and the ordinate as y_0 = height/2, where width and height are respectively the width and height of the image to be corrected.
Wherein the image center of the image to be corrected coincides with the optical axis center.
Further, for each first image point, a first lateral color difference offset of the first image point may be determined based on a first lateral distance, a first linear distance, and a corresponding color difference offset parameter of the first image point compared to an image center of the image to be corrected, and a first longitudinal color difference offset of the first image point may be determined based on a first longitudinal distance, a first linear distance, and a corresponding color difference offset parameter of the first image point compared to an image center of the image to be corrected.
The first lateral chromatic aberration offset corresponding to the first image point i may be expressed as a polynomial function of the first straight-line distance d_r^i, scaled along the lateral direction by d_x^i, for example:

Δx_i = (d_x^i / d_r^i) · (a·d_r^i + b·(d_r^i)² + c·(d_r^i)³)

and the first longitudinal chromatic aberration offset corresponding to the first image point i may be expressed analogously with d_y^i, for example:

Δy_i = (d_y^i / d_r^i) · (a·d_r^i + b·(d_r^i)² + c·(d_r^i)³)

wherein a, b and c represent the chromatic aberration offset parameters; when the first image point i is an R-channel image point, a, b and c are the chromatic aberration offset parameters corresponding to the R channel, and when the first image point i is a B-channel image point, a, b and c are the chromatic aberration offset parameters corresponding to the B channel.
Further, after determining the lateral chromatic aberration offset and the longitudinal chromatic aberration offset of each first image point, the chromatic aberration offset position of the first image point may be determined based on the position information of the first image point in the image to be corrected and the lateral chromatic aberration offset and the longitudinal chromatic aberration offset.
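As a rough sketch of the steps above, the following Python function computes a chromatic aberration offset position from a first image point's distances to the image center. The radial polynomial a·r + b·r² + c·r³ is an assumption for illustration (the patent only states that the offsets depend on the lateral/longitudinal distance, the straight-line distance and the parameters a, b and c), and the function name and the width/2 center convention are likewise hypothetical.

```python
import math

def chromatic_offset_position(x, y, width, height, a, b, c):
    """Sketch of computing the chromatic-aberration offset position of a
    first image point at (x, y). The polynomial form of the radial shift is
    an assumption, not the patent's exact formula."""
    x0, y0 = width / 2.0, height / 2.0      # image center (assumed convention)
    dx, dy = x - x0, y - y0                 # first lateral / longitudinal distance
    dr = math.hypot(dx, dy)                 # first straight-line distance
    if dr == 0.0:
        return x, y                         # center pixel: no shift
    shift = a * dr + b * dr**2 + c * dr**3  # assumed radial polynomial
    return x + dx / dr * shift, y + dy / dr * shift
```

The returned coordinates are generally fractional, which is why the following steps project them onto the pixel grid and interpolate.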
In some possible embodiments, when correcting the channel value of each first image point based on the color difference offset position of the first image point to obtain an initial corrected image, the first direction corrected image point corresponding to the first image point may be determined from the image to be corrected based on the color difference offset position of each first image point in the image to be corrected compared with the first projection position of the first image point in the first direction.
Wherein the first direction is any one of a transverse direction and a longitudinal direction.
If the color difference offset position of each first image point in the image to be corrected is overlapped with any second image point compared with the first projection position of the first image point in the first direction, the second image point is determined to be the first direction correction image point corresponding to the first image point.
Each second image point is one other image point corresponding to the same channel as the first image point in the image to be corrected. For example, if the first image point is an R-channel image point, when the first projection position corresponding to the first image point coincides with an R-channel image point, the R-channel image point is determined as a first direction correction image point corresponding to the first image point.
As shown in Fig. 3a, Fig. 3a is a schematic view of a scene of determining a correction image point according to an embodiment of the present application. In Fig. 3a, for the first image point of the 2nd row and 2nd column, the chromatic aberration offset position is as shown in Fig. 3a, and the first projection position of that chromatic aberration offset position in the first direction (the transverse direction) in which the first image point is located can be determined. When the first projection position coincides with the R-channel image point of the 2nd row and 4th column, that R-channel image point is determined as the first direction correction image point (transverse correction image point) of the first image point.
Or if the color difference offset position of each first image point is located in any second image point range compared with the first projection position of the first image point in the first direction, determining the second image point as the first direction correction image point corresponding to the first image point.
And if the first projection position corresponding to the chromatic aberration offset position of the first image point is not overlapped with each second image point, determining two second image points which are positioned in the first direction of the first image point and closest to the first projection position as first direction correction image points corresponding to the first image point.
As shown in fig. 3b, fig. 3b is another schematic view of a scene for determining a correction image point according to an embodiment of the present application. In fig. 3b, for the first image point of the 2 nd row and the 2 nd column, the color difference offset position is shown in fig. 3b, where the color difference offset position can be determined as compared with the first projection position of the first image point in the first direction (transverse direction). When the first projection position is not coincident with any R-channel image point, two second image points (R-channel image points of the 2 nd row, 4 th column and R-channel image points of the 2 nd row, 6 th column) closest to the first projection position in the first direction where the first image point is located are determined as first direction correction image points (lateral correction image points) corresponding to the first image point.
After determining the first direction correction image point corresponding to the first image point from the image to be corrected, the channel value of the first image point can be corrected based on the channel value of the first direction correction image point.
When the number of the first direction correction pixels corresponding to the first pixel is 1, the channel value of the first pixel can be directly changed into the channel value of the first direction correction pixel. When the number of the first direction correction image points corresponding to the first image point is 2, the channel value of each first direction correction image point can be linearly interpolated, and the channel value of the first image point is changed into a linear interpolation result.
It should be noted that, when the channel value of each first image point is corrected based on the channel value of the first direction correction image point corresponding to each first image point, the channel value of each first direction correction image point is the original channel value in the image to be corrected. For example, for the first image point a, the original channel value of the first image point a in the image to be corrected is f1, and the channel value f1 of the first image point a is corrected to f2 according to the channel value of the first direction correction image point corresponding to the original channel value f 1. If the first image point a is determined as the first direction correction image point corresponding to the first image point B in the process of correcting the channel value of the first image point B, the channel value of the first image point B is still corrected based on the original channel value f1 of the first image point a.
Based on the above manner, the channel values of all the first image points in the image to be corrected can be corrected in the first direction, so as to obtain an intermediate corrected image.
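The first-direction correction described above might be sketched as follows for a single image row (the transverse case): same-channel pixels sit every two columns in a Bayer RAW, so the projection position either coincides with one such pixel or is linearly interpolated between the two nearest ones. The function name and the clamping behavior at the image border are assumptions.

```python
import math

def correct_first_direction(row_values, col, target_x):
    """Sketch of the first-direction (transverse) correction for the pixel in
    column `col` of one row of the image to be corrected. `row_values` holds
    the row's original channel values; `target_x` is the (possibly
    fractional) column of the first projection position."""
    n = len(row_values)
    base = col % 2                                   # same-channel column parity
    last = base + ((n - 1 - base) // 2) * 2          # last same-channel column
    # nearest same-channel columns bracketing target_x
    lo = int(math.floor((target_x - base) / 2.0)) * 2 + base
    lo = max(base, min(lo, last))
    hi = max(base, min(lo + 2, last))
    if lo == hi or abs(target_x - lo) < 1e-9:
        return row_values[lo]                        # coincides with one image point
    t = (target_x - lo) / (hi - lo)                  # linear interpolation weight
    return (1 - t) * row_values[lo] + t * row_values[hi]
```

Running the same procedure down each column with the original (not yet corrected) channel values then yields the second-direction pass described below.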
After the intermediate correction image is obtained, the second direction correction image point corresponding to the first image point can be determined from the intermediate correction image based on the color difference offset position of each first image point in the intermediate correction image compared with the second projection position of the second direction in which the first image point is located, and the channel value of the first image point in the intermediate correction image is corrected based on the channel value of the second direction correction image point corresponding to each first image point in the image to be corrected, so that the initial correction image is finally obtained.
The second projection position of the chromatic aberration offset position of each first image point in the second direction in which the first image point is located is determined in a manner similar to the first projection position in the first direction, and will not be described herein again.
The determination manner of the second direction correction image point corresponding to each first image point is similar to the determination manner of the first direction correction image point corresponding to each first image point, and will not be described herein.
Wherein the first direction and the second direction are different directions in the transverse direction and the longitudinal direction, respectively. That is, the channel value of each first image point in the image to be corrected may be corrected in the transverse direction or the longitudinal direction to obtain an intermediate corrected image, and then the channel value of each first image point may be corrected in the other direction to finally obtain an initial corrected image.
In some possible embodiments, when correcting the channel value of each first image point based on the color difference offset position of the first image point to obtain an initial corrected image, the target corrected image point corresponding to each first image point may also be determined from the image to be corrected based on the color difference offset position of each first image point.
Specifically, a third image point closest to the chromatic aberration offset position of each first image point may be determined from the image to be corrected, wherein each first image point and its corresponding third image point correspond to different channels among the R channel and the B channel. For example, if a first image point is a B-channel image point, the R-channel image point closest to the chromatic aberration offset position of the first image point is determined from all R-channel image points in the image to be corrected.
For each first image point, four fourth image points nearest to the third image point corresponding to the first image point may be determined as target correction image points corresponding to the first image point, each fourth image point corresponding to the same channel as the first image point.
As shown in Fig. 3c, Fig. 3c is a schematic view of still another scenario in which a correction image point is determined according to an embodiment of the present application. In Fig. 3c, if a certain first image point is a B-channel image point and its chromatic aberration offset position is as shown in Fig. 3c, it may be determined that the third image point closest to that position is the R-channel image point of the 4th row and 4th column, and the B-channel image points closest to that R-channel image point (the B-channel image points of the 3rd row and 3rd column, the 3rd row and 5th column, the 5th row and 3rd column, and the 5th row and 5th column) may then be determined as the target correction image points corresponding to the first image point.
That is, for each first image point, a target correction region corresponding to the first image point may be determined. All correction areas corresponding to the first image point can be determined, each correction area is a minimum rectangular area formed by 4 image points belonging to the same channel with the first image point, the correction area where the chromatic aberration offset position of the first image point is located is the target correction area corresponding to the first image point, and based on the minimum rectangular area, 4 image points corresponding to the same channel with the first image point and corresponding to the target correction area corresponding to each first image point can be determined as the target correction image point corresponding to the first image point.
Alternatively, for each first image point, the 2 fourth image points laterally closest to the chromatic aberration offset position of the first image point and the 2 fourth image points longitudinally closest to that chromatic aberration offset position may be determined from the image to be corrected, and the 4 fourth image points thus determined are taken as the target correction image points corresponding to the first image point. Wherein each fourth image point corresponds to the same channel as the first image point.
Optionally, for each first image point, if the color difference offset position of the first image point is located at a fourth image point corresponding to the same channel as the first image point, the fourth image point may be determined as the target correction image point corresponding to the first image point.
Further, for each first image point, when the number of target correction image points corresponding to the first image point is 4, a channel value of a chromatic aberration offset position of the first image point can be determined based on a bilinear interpolation method, and then the channel value of the first image point is corrected to be the channel value of the chromatic aberration offset position. When the number of the target correction pixels corresponding to the first pixel is 1, the channel value of the first pixel can be corrected to the channel value of the target correction pixel.
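The bilinear interpolation for the four-target-image-point case might be sketched as follows; the argument layout (the four same-channel neighbor values in row-major order, plus the fractional offsets of the chromatic aberration position inside the rectangle they form) is an assumption.

```python
def bilinear_channel_value(p11, p12, p21, p22, tx, ty):
    """Sketch of bilinear interpolation over four target correction image
    points. p11..p22 are their channel values (row-major) and (tx, ty) in
    [0, 1] locate the chromatic-aberration offset position inside the
    rectangle the four points form."""
    top = (1 - tx) * p11 + tx * p12   # interpolate along the top edge
    bot = (1 - tx) * p21 + tx * p22   # interpolate along the bottom edge
    return (1 - ty) * top + ty * bot  # interpolate between the two edges
```

With tx = ty = 0 this degenerates to the single-target-image-point case, where the first image point simply takes the channel value of that one neighbor.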
It should be noted that, when the chromatic aberration offset position of each first image point is located outside the range of the image to be corrected, the image point of the area where the chromatic aberration offset position of each first image point is located may be complemented to determine a first direction correction image point or a target correction image point for correcting each first image point.
As shown in fig. 4, fig. 4 is a schematic view of a scene of a patch image point according to an embodiment of the present application. The middle white area in fig. 4 is an image to be corrected, and when determining an image point to the right of the image to be corrected, the rightmost column of the image to be corrected may be used as a symmetry axis to mirror-fill the image point to the right of the image to be corrected based on other image points except the rightmost column.
Similarly, when determining the image points to the left of the image to be corrected, the leftmost column of the image to be corrected may be used as the symmetry axis to mirror-fill the image points to the left based on the other image points except the leftmost column. When determining the image points above the image to be corrected, the topmost row may be used as the symmetry axis to mirror-fill the image points above based on the other image points except the topmost row. When determining the image points below the image to be corrected, the bottommost row may be used as the symmetry axis to mirror-fill the image points below based on the other image points except the bottommost row.
For the upper-right, upper-left, lower-left and lower-right regions of the image to be corrected (the regions marked with dotted boxes in Fig. 4), the filling can likewise be performed in the above manner. For example, when an image point is at the upper right of the image to be corrected, the topmost row of the image to be corrected is taken as the symmetry axis and mirror filling is performed based on the image points below it, or the rightmost column of the image to be corrected is taken as the symmetry axis and mirror filling is performed based on the image points to its left. The image points in the other corner regions are handled similarly and are not described in detail herein.
When determining the second direction correction image point corresponding to each first image point from the intermediate correction image, the intermediate correction image may be supplemented based on the above manner, which is not described herein again.
When the image points on any side of the image to be corrected are filled in, an even number of columns or rows is filled, so that the filled image has the same format (the same Bayer channel arrangement) as the image to be corrected.
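As a minimal sketch (assuming NumPy and a single-plane RAW array), the mirror filling described above corresponds to reflect-mode padding, which mirrors about the edge row or column without repeating it, i.e. the filled values come from the image points other than the outermost row or column:

```python
import numpy as np

# Mirror-fill a RAW image so that out-of-bounds sample positions can be
# resolved. "reflect" mode mirrors about the edge row/column without
# repeating it, matching the filling manner described above.
raw = np.arange(16).reshape(4, 4)
padded = np.pad(raw, pad_width=2, mode="reflect")

assert padded.shape == (8, 8)
# The column just left of the original image mirrors column 1, not column 0.
assert padded[2, 1] == raw[0, 1]
```

Because the mirrored values skip the axis row/column itself, this padding also keeps the two-column/two-row periodicity of a Bayer layout intact.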
In some possible embodiments, the color difference offset parameter corresponding to the R channel and the color difference offset parameter corresponding to the B channel in the embodiments of the present application may be determined based on the calibration image.
The calibration image may be a RAW image corresponding to a scene with obvious color distinction, such as a checkerboard, a straight line, a vertical line, and the like, and may be specifically determined based on actual application scene requirements, which is not limited herein.
Specifically, a second lateral chromatic aberration offset and a second longitudinal chromatic aberration offset of the B-channel image point and the R-channel image point corresponding to each preset G-channel image point in the calibration image may be determined as compared to the preset G-channel image point.
Each preset G-channel image point in the calibration image is a G-channel image point corresponding to a color boundary position in the corresponding scene, for example, a G-channel image point at a black-white boundary position when the scene is a checkerboard, or a G-channel image point at an arbitrary position on a line when the scene is a straight line. This can be determined based on the actual application scene requirements and is not limited herein.
The offset positions of the B-channel image point and the R-channel image point corresponding to each preset G-channel image point can be determined by performing lateral and longitudinal edge detection on the preset G-channel image points and on the corresponding B-channel and R-channel image points in the calibration image. The second lateral chromatic aberration offset and the second longitudinal chromatic aberration offset of the B-channel image point corresponding to each preset G-channel image point, compared to that preset G-channel image point, are then determined based on the offset position of the B-channel image point and the image point position of the preset G-channel image point.
Similarly, the second lateral chromatic aberration offset and the second longitudinal chromatic aberration offset of the R channel image point corresponding to the preset G channel image point compared to the preset G channel image point may be determined based on the offset position of the R channel image point corresponding to each preset G channel image point and the image point position of the preset G channel image point.
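A minimal sketch of how one such offset might be measured, assuming 1-D intensity profiles extracted from the G and B planes of the calibration image and a simple maximum-difference edge detector (the function name and data are hypothetical, not from the patent):

```python
import numpy as np

def edge_position(profile):
    # Locate the edge at the largest absolute difference between
    # neighbouring samples of a 1-D intensity profile.
    g = np.abs(np.diff(profile.astype(float)))
    return int(np.argmax(g))

# Synthetic row: the B-channel edge is shifted 2 samples right of the
# G-channel edge, so the lateral chromatic aberration offset is 2.
g_row = np.array([0, 0, 0, 0, 255, 255, 255, 255, 255, 255])
b_row = np.array([0, 0, 0, 0, 0, 0, 255, 255, 255, 255])
lateral_offset = edge_position(b_row) - edge_position(g_row)
assert lateral_offset == 2
```

The longitudinal offset would be measured the same way on a column profile, and the R channel analogously.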
Further, a second lateral distance, a second longitudinal distance, and a second linear distance of each preset G-channel image point relative to the image center of the calibration image may be determined. For each of the R and B channels, the second lateral distance, the second linear distance, and the second lateral chromatic aberration offset corresponding to the channel for each preset G-channel image point have the following relationship:

Δx_j = x_j · (a_e + b_e · r_j² + c_e · r_j⁴)

wherein Δx_j represents the second lateral chromatic aberration offset corresponding to the image point j, x_j represents the second lateral distance corresponding to the image point j, r_j represents the second linear distance corresponding to the image point j, and the image point j is a preset G-channel image point.
For each of the R and B channels, the second longitudinal distance, the second linear distance, and the second longitudinal chromatic aberration offset corresponding to the channel for each preset G-channel image point have the following relationship:

Δy_j = y_j · (a_e + b_e · r_j² + c_e · r_j⁴)

wherein Δy_j represents the second longitudinal chromatic aberration offset corresponding to the image point j, y_j represents the second longitudinal distance corresponding to the image point j, r_j represents the second linear distance corresponding to the image point j, and the image point j is a preset G-channel image point.
Wherein a_e, b_e and c_e are the initial chromatic aberration offset parameters.
For each channel, the above two relations can be established for every preset G-channel image point, and a least-squares fit is then performed over all the relations corresponding to the preset G-channel image points to obtain the chromatic aberration offset parameters for that channel. For example, the chromatic aberration offset parameters a_b, b_b and c_b corresponding to the B channel are determined based on the second lateral distance and second linear distance corresponding to each preset G-channel image point, together with the second longitudinal and second lateral chromatic aberration offsets corresponding to the B channel; the chromatic aberration offset parameters a_r, b_r and c_r corresponding to the R channel are determined based on the second lateral distance and second linear distance corresponding to each preset G-channel image point, together with the second longitudinal and second lateral chromatic aberration offsets corresponding to the R channel.
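The least-squares fit can be sketched as follows, assuming the radial polynomial model Δx_j = x_j(a + b·r_j² + c·r_j⁴) and Δy_j = y_j(a + b·r_j² + c·r_j⁴) (NumPy assumed; the exact polynomial form of the original equations is not recoverable from this text, so this model is an assumption):

```python
import numpy as np

def fit_offset_params(x, y, r, dx, dy):
    # Stack the lateral and longitudinal relations of every preset
    # G-channel image point into one linear system A @ [a, b, c] = d
    # and solve it by least squares.
    A = np.concatenate([
        np.stack([x, x * r**2, x * r**4], axis=1),
        np.stack([y, y * r**2, y * r**4], axis=1),
    ])
    d = np.concatenate([dx, dy])
    params, *_ = np.linalg.lstsq(A, d, rcond=None)
    return params  # a, b, c for one channel

# Synthetic check: offsets generated from known parameters are recovered.
rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
r = np.hypot(x, y)
a, b, c = 0.01, -0.002, 0.0005
dx = x * (a + b * r**2 + c * r**4)
dy = y * (a + b * r**2 + c * r**4)
est = fit_offset_params(x, y, r, dx, dy)
assert np.allclose(est, [a, b, c])
```

Running the same fit once with the B-channel offsets and once with the R-channel offsets yields (a_b, b_b, c_b) and (a_r, b_r, c_r) respectively.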
The color difference offset parameters corresponding to the R channel and the offset parameters corresponding to the B channel are color difference offset parameters corresponding to lenses of the current image shooting equipment, and the color difference offset parameters corresponding to different lenses are different. In other words, in the embodiment of the present application, the image to be corrected and the calibration image for determining the chromatic aberration offset parameters corresponding to the R channel and the B channel are images generated by the same lens of the same photographing device.
Step S12, determining a correction boundary corresponding to at least one correction direction based on the channel value of each image point in the initial correction image.
Wherein the at least one correction direction comprises at least one of a lateral direction or a longitudinal direction.
In some possible embodiments, for each correction direction, a G-channel gradient of each image point in the initial correction image in the corresponding correction direction may be determined based on the channel values of each image point in the initial correction image.
The gradient of the G channel change of each image point in the initial correction image in the corresponding correction direction is used for representing the green change degree of the image point in the corresponding correction direction.
Specifically, when determining the lateral G-channel change gradient of each R-channel image point and each B-channel image point in the initial correction image, the lateral G-channel change gradient of each such image point may be determined based on the channel values of the G-channel image points adjacent to its left and right in the initial correction image.
If the pixel Q is a B-channel pixel, the gradient of the G-channel variation of the pixel Q in the lateral direction may be determined according to the channel values of the G-channel pixels adjacent to the left and right of the pixel Q.
As an example, the lateral G-channel change gradient ∇_i^h of the image point i in the initial correction image (the image point i is a B-channel or R-channel image point) can be expressed as:

∇_i^h = |G_i^l − G_i^r|

wherein h represents the lateral direction, G_i^l is the channel value of the G-channel image point adjacent to the left of the image point i, and G_i^r is the channel value of the G-channel image point adjacent to the right of the image point i.
If the image point T is a B-channel image point, the longitudinal G-channel change gradient of the image point T can be determined from the channel values of the G-channel image points adjacent above and below the image point T.
As an example, the longitudinal G-channel change gradient ∇_i^v of the image point i in the initial correction image (the image point i is a B-channel or R-channel image point) can be expressed as:

∇_i^v = |G_i^u − G_i^d|

wherein v represents the longitudinal direction, G_i^u is the channel value of the G-channel image point adjacent above the image point i, and G_i^d is the channel value of the G-channel image point adjacent below the image point i.
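Assuming the gradients are absolute differences of the opposing G-channel neighbours (a reconstruction; the original formulas are image placeholders), the two gradients for an R- or B-channel image point can be sketched as:

```python
import numpy as np

def g_gradient_rb(img, i, j):
    # Lateral (h) and longitudinal (v) G-channel change gradients of an
    # R- or B-channel image point at (i, j); in a Bayer layout its four
    # side neighbours are all G-channel image points.
    h = abs(int(img[i, j - 1]) - int(img[i, j + 1]))
    v = abs(int(img[i - 1, j]) - int(img[i + 1, j]))
    return h, v

# 3x3 Bayer patch with a B-channel image point at the centre; its
# left/right G neighbours are 20 and 30, its up/down G neighbours 0 and 0.
img = np.array([[10,  0, 10],
                [20,  7, 30],
                [10,  0, 10]])
h, v = g_gradient_rb(img, 1, 1)
assert (h, v) == (10, 0)
```

Edge image points would first be mirror-filled as described earlier so that all four neighbours exist.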
For each G-channel image point in the initial correction image, the lateral G-channel change gradient of the image point may be determined based on the channel values of the G-channel image points adjacent to the four sides of each fifth image point. The fifth image points corresponding to the image point are the image points adjacent to the left and right of the image point.
If the pixel P1 is a G-channel pixel, and P2 and P3 are pixels adjacent to the pixel P1 in the left-right direction, the gradient of the G-channel change in the lateral direction of the pixel P1 can be determined according to the channel values of the G-channel pixels adjacent to the four sides of the pixel P2 and the channel values of the G-channel pixels adjacent to the four sides of the pixel P3.
As an example, the lateral G-channel change gradient of the image point i in the initial correction image (the image point i is a G-channel image point) may be expressed as:

∇_i^h = |(G_k1^l + G_k1^u + G_k1^d + G_k1^r)/4 − (G_k2^l + G_k2^u + G_k2^d + G_k2^r)/4|

wherein k1 denotes the image point adjacent to the left of the image point i, and G_k1^l, G_k1^u, G_k1^d and G_k1^r are respectively the channel value of the image point adjacent to the left of the image point k1, the channel value of the image point adjacent above, the channel value of the image point adjacent below, and the channel value of the image point adjacent to the right (i.e., the channel value of the image point i); k2 denotes the image point adjacent to the right of the image point i, and G_k2^l, G_k2^u, G_k2^d and G_k2^r are respectively the channel value of the image point adjacent to the left of the image point k2 (i.e., the channel value of the image point i), the channel value of the image point adjacent above, the channel value of the image point adjacent below, and the channel value of the image point adjacent to the right.
For each G-channel image point in the initial correction image, the longitudinal G-channel change gradient of the image point may be determined based on the channel values of the G-channel image points adjacent to the four sides of each sixth image point. The sixth image points corresponding to the image point are the image points adjacent above and below the image point.
If the image point P1 is a G-channel image point, and the image points P4 and P5 are the image points adjacent above and below the image point P1, respectively, the longitudinal G-channel change gradient of the image point P1 may be determined according to the channel values of the G-channel image points adjacent to the four sides of the image point P4 and the channel values of the G-channel image points adjacent to the four sides of the image point P5.
As an example, the longitudinal G-channel change gradient of the image point i in the initial correction image (the image point i is a G-channel image point) may be expressed as:

∇_i^v = |(G_w1^l + G_w1^u + G_w1^d + G_w1^r)/4 − (G_w2^l + G_w2^u + G_w2^d + G_w2^r)/4|

wherein w1 denotes the image point adjacent above the image point i, and G_w1^l, G_w1^u, G_w1^d and G_w1^r are respectively the channel value of the image point adjacent to the left of the image point w1, the channel value of the image point adjacent above, the channel value of the image point adjacent below (i.e., the channel value of the image point i), and the channel value of the image point adjacent to the right; w2 denotes the image point adjacent below the image point i, and G_w2^l, G_w2^u, G_w2^d and G_w2^r are respectively the channel value of the image point adjacent to the left of the image point w2, the channel value of the image point adjacent above (i.e., the channel value of the image point i), the channel value of the image point adjacent below, and the channel value of the image point adjacent to the right.
When determining the gradient of the G channel change of each image point in the initial correction image in any correction direction, the method of filling the image to be corrected or the intermediate correction image can be used for filling the initial correction image, so as to determine the gradient of the G channel change of the image point at the edge position of the initial correction image in any correction direction.
In some possible embodiments, for each correction direction, after determining the G-channel gradient of each first image point in the correction direction based on the channel value of each image point in the initial correction image, the correction boundary corresponding to the correction direction may be determined based on the G-channel gradient of each first image point in the initial correction image in the correction direction.
Specifically, for each correction direction, an R-channel image point and a B-channel image point whose G-channel gradient exceeds a corresponding first gradient threshold value in the correction direction may be determined from a preset region in the initial correction image, and the determined image point is referred to as a seventh image point.
The first gradient thresholds corresponding to different correction directions may be the same or different, and may be specifically determined based on actual application scene requirements, which is not limited herein.
Wherein, for each correction direction, the gradient of the G-channel variation of all image points of the preset area in the initial correction image in the correction direction is smaller than the second gradient threshold. That is, for each correction direction, the preset region in the initial correction image is a region in the initial correction image where the gradient of the G-channel variation corresponding to the correction direction is relatively smooth.
Further, for each seventh image point, a respective first image point combination corresponding to the seventh image point may be determined. That is, for each seventh image point, the seventh image point and an image point adjacent to the seventh image point in the correction direction may be determined as a first image point combination. Each seventh image point corresponds to two first image point combinations in the correction direction.
Further, for each seventh image point, a gradient difference of the gradient of the G channel change of the two image points in each first image point combination corresponding to the seventh image point in the correction direction may be determined, and then a boundary of the two image points in the first image point combination with the largest gradient difference is determined as a correction boundary corresponding to the correction direction.
That is, for each correction direction, after determining any number of seventh image points, a correction boundary corresponding to the correction direction may be determined based on each seventh image point.
For example, if the correction direction is lateral, two first pixel combinations may be determined from each row of seventh pixels, one first pixel combination including the seventh pixel and a pixel adjacent to the left of the seventh pixel, and the other first pixel combination including the seventh pixel and a pixel adjacent to the right of the seventh pixel. For each seventh image point, determining the gradient difference of the gradient of the G channel variation of the two image points in the transverse direction in each first image point combination corresponding to the seventh image point, and further determining the boundary of the two image points in the first image point combination with the largest gradient difference as a correction boundary of the row of the seventh image point in the transverse direction.
Fig. 5a is a schematic diagram of a scenario for determining correction boundaries according to an embodiment of the present application. If the correction direction is the lateral direction, when the image point B2 is a seventh image point determined, a first gradient difference (136) between the G-channel gradient (255) of the image point G3 in the lateral direction and the G-channel gradient (119) of the image point B2 in the lateral direction, and a second gradient difference (119) between the G-channel gradient (119) of the image point B2 in the lateral direction and the G-channel gradient (0) of the image point G2 in the lateral direction are determined. Since the first gradient difference is larger than the second gradient difference, the boundaries of the image points G3 and B2 can be determined as a correction boundary of the row of image points in the lateral direction.
For example, if the correction direction is vertical, two first pixel combinations may be determined from each row of the seventh pixel, one first pixel combination including the seventh pixel and a pixel adjacent to the upper side of the seventh pixel, and the other first pixel combination including the seventh pixel and a pixel adjacent to the lower side of the seventh pixel. For each seventh image point, determining the gradient difference of the gradient of the G channel variation of the two image points in the longitudinal direction in each first image point combination corresponding to the seventh image point, and further determining the boundary of the two image points in the first image point combination with the largest gradient difference as a correction boundary of the column of the seventh image point in the longitudinal direction.
As shown in fig. 5b, fig. 5b is another schematic view of determining a correction boundary according to an embodiment of the present application. If the correction direction is longitudinal, when the image point R3 is a determined seventh image point, a third gradient difference (117) between the longitudinal G-channel change gradient (255) of the image point G3 and the longitudinal G-channel change gradient (138) of the image point R3, and a fourth gradient difference (138) between the longitudinal G-channel change gradient (138) of the image point R3 and the longitudinal G-channel change gradient (0) of the image point G2, are determined. Since the fourth gradient difference is larger than the third gradient difference, the boundary between the image point R3 and the image point G2 can be determined as a correction boundary of that column of image points in the longitudinal direction.
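The boundary selection for one seventh image point can be sketched as follows (hypothetical helper; the gradient values are those of the fig. 5a example):

```python
def correction_boundary(grads, k):
    # grads: G-channel change gradients of consecutive image points along
    # the correction direction; k: index of a seventh image point.
    # The correction boundary is the shared boundary of the neighbour
    # pair with the larger gradient difference; the pair's indices
    # are returned.
    left = abs(grads[k] - grads[k - 1])
    right = abs(grads[k + 1] - grads[k])
    return (k - 1, k) if left > right else (k, k + 1)

# Fig. 5a values: G2=0, B2=119, G3=255 -> the gradient difference 136 on
# the right exceeds 119 on the left, so the boundary between B2 (index 1)
# and G3 (index 2) is selected.
assert correction_boundary([0, 119, 255], 1) == (1, 2)
```

The same helper applied to a column of longitudinal gradients reproduces the fig. 5b selection.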
Step S13, determining at least one target image point combination corresponding to each correction boundary, and correcting the channel value of the image point in each target image point combination based on the channel value of each image point in the initial correction image to obtain a target correction image.
The target image point combinations corresponding to each correction boundary comprise an image point adjacent to the correction boundary in the initial correction image and the image point adjacent to that image point in the correction direction corresponding to the correction boundary, and the image points in each target image point combination are located on the same side of the corresponding correction boundary.
Fig. 5c is a schematic view of a scene of determining a combination of target image points according to an embodiment of the present application. Fig. 5c shows a correction boundary in a certain line of image points in the initial correction image, and the correction direction corresponding to the correction boundary is a lateral direction. Based on this, the image point G3 adjacent to the correction boundary and the image point B3 located on the same side of the correction boundary as the image point G3 and adjacent to the image point G3 can be determined as one target image point combination. Similarly, the image point B2 adjacent to the correction boundary and the image point G2 on the same side of the correction boundary as the image point B2 and adjacent to the image point B2 may be determined as one target image point combination.
As shown in fig. 5d, fig. 5d is another schematic view of determining a combination of target image points according to an embodiment of the present application. Fig. 5d shows a correction boundary in a column of pixels in the initial correction image, and the correction direction corresponding to the correction boundary is vertical. Based on this, the image point G2 adjacent to the correction boundary and the image point R2 located on the same side of the correction boundary as the image point G2 and adjacent to the image point G2 can be determined as one target image point combination. Similarly, the image point R3 adjacent to the correction boundary and the image point G3 on the same side of the correction boundary as the image point R3 and adjacent to the image point R3 may be determined as one target image point combination.
Further, since there is a constant color difference relationship between adjacent pixel points in a small smooth area of an RGB picture, for a pixel point P(i, j) and an adjacent pixel point P(m, n) there is:

R_ij − G_ij = R_mn − G_mn
B_ij − G_ij = B_mn − G_mn

which can be rearranged as:

R_ij − R_mn = G_ij − G_mn
B_ij − B_mn = G_ij − G_mn

wherein R_ij, B_ij and G_ij are the R, B and G channel values of the pixel point P(i, j), and R_mn, B_mn and G_mn are the R, B and G channel values of the pixel point P(m, n).
Thus, for each target image point combination, if the combination comprises an R-channel image point and a G-channel image point, the channel value difference Δg of the G-channel image points in the target image point combination and the corresponding second image point combination is equal to the channel value difference Δr of the R-channel image points in the two combinations. If the target image point combination comprises a B-channel image point and a G-channel image point, the channel value difference Δg of the G-channel image points in the target image point combination and the corresponding second image point combination is equal to the channel value difference Δb of the B-channel image points in the two combinations.
The second image point combination corresponding to each target image point combination is adjacent to the target image point combination in the correction direction corresponding to the target image point combination, and comprises two image points with the same channel arrangement mode with the target image point combination.
When the channel values of the image points in the target image point combinations are corrected based on the channel values of the image points in the initial correction image to obtain the target correction image, a second image point combination corresponding to each target image point combination may be determined from the initial correction image, and the second image point combination corresponding to each target image point combination is used as the correction image point combination for correcting the channel values of the image points in that target image point combination.
For example, for the target image point combination G2B2 in fig. 5c, G1B1 may be determined as the correction image point combination corresponding to the target image point combination. For the target image point combination G3B3 in fig. 5c, G4B4 may be determined as the correction image point combination corresponding to the target image point combination.
For another example, for the target image point combination R2G2 in fig. 5d, R1G1 may be determined as the correction image point combination corresponding to the target image point combination. For the target image point combination R3G3 in fig. 5d, R4G4 may be determined as the correction image point combination corresponding to the target image point combination.
Further, when correcting the channel values of the pixels in each target pixel combination, the channel values of the pixels in each target pixel combination can be corrected based on the color difference constant relationship by the channel values of the pixels in the corresponding correction pixel combination of each target pixel combination, so as to obtain the target correction image.
From the arrangement characteristics of the image points in the RAW image, it can be seen that each target image point combination and its corresponding correction image point combination both comprise a G-channel image point.
In this case, a first difference between the channel value of the G-channel image point in the target image point combination and the channel value of the G-channel image point in the corresponding correction image point combination may be determined, and the channel value of the other image point in the target image point combination may then be adjusted so that the second difference between it and the channel value of the other image point in the corresponding correction image point combination is equal to the first difference, thereby completing the channel value correction for the target image point combination and obtaining the target correction image.
For example, for the target image point combination G2B2 in fig. 5c, the channel value of image point B2 may be adjusted such that the first difference between the channel value of image point B2 and the channel value of image point B1 is equal to the second difference between the channel values of image point G2 and image point G1.
For another example, for the target image point combination R2G2 in fig. 5d, the channel value of image point R2 may be adjusted such that the first difference between the channel value of image point R2 and the channel value of image point R1 is equal to the second difference between the channel value of image point G2 and the channel value of image point G1.
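The adjustment above can be sketched in one line (hypothetical helper and channel values, in the spirit of fig. 5c):

```python
def correct_target_pair(g_target, x_target, g_corr, x_corr):
    # Adjust the R/B channel value of the target image point combination
    # so that its difference from the correction image point combination
    # equals the G-channel difference (constant colour-difference rule):
    #   x_new - x_corr == g_target - g_corr
    return x_corr + (g_target - g_corr)

# Hypothetical values: G1=10, B1=40, G2=14 -> B2 is corrected to 44,
# regardless of its previous (aberrated) value, here 99.
assert correct_target_pair(14, 99, 10, 40) == 44
```

The R-channel case of fig. 5d is handled identically with R values in place of B values.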
It should be noted that, for each target image point combination, if the combination is located at the edge of the initial correction image, the channel values of the image points in the target image point combination may be left unadjusted. Alternatively, the initial correction image can be filled in using the filling manner applied to the image to be corrected, so that the correction image point combination corresponding to the target image point combination is determined from the filled image, and the channel values of the image points in the target image point combination are adjusted based on that correction image point combination.
It should be noted that, when only one correction direction is required to be performed on the initial correction image, all target image point combinations corresponding to all correction boundaries in the correction direction may be determined first, and then the channel values of the image points in the corresponding target image point combinations are corrected based on the correction image point combinations corresponding to each target image point combination, so as to obtain the target correction image finally.
When the initial correction image needs to be corrected both laterally and longitudinally, all target image point combinations corresponding to all correction boundaries in the third direction may first be determined, and the channel values of the image points in those target image point combinations corrected based on the correction image point combination corresponding to each of them to obtain a candidate correction image. All target image point combinations corresponding to all correction boundaries in the fourth direction, and their corresponding correction image point combinations, are then determined based on the candidate correction image, and the channel values of the image points in these target image point combinations are corrected based on their correction image point combinations to finally obtain the target correction image.
Wherein the third direction and the fourth direction are different directions in the longitudinal direction and the transverse direction, respectively.
In this embodiment of the present application, according to the color difference offset position of each B-channel image point and each R-channel image point in the image to be corrected, the color difference of each B-channel image point and each R-channel image point may be initially corrected to obtain an initial corrected image. The correction boundary corresponding to each correction direction can be further determined according to the channel value of each image point in the initial correction image, for example, the transverse correction boundary and the longitudinal correction boundary in the initial correction image are determined, and then the channel value of the image point in the target image point combination near each correction boundary can be corrected to obtain the target correction image, so that the transverse chromatic aberration correction effect is further improved on the basis of the initial correction image.
Meanwhile, the image to be corrected in the transverse chromatic aberration correction method provided by the embodiment of the application is a RAW image, so that the complexity of information related to gradient, position, distance and the like in the chromatic aberration correction process can be reduced, and the correction effect of the transverse chromatic aberration can be improved.
The lateral chromatic aberration correction method provided by the embodiment of the application can also correct the lateral chromatic aberration of the RGB format image.
In some possible embodiments, for each pixel point in the first image in RGB format to be corrected, the color difference offset positions of the B channel and the R channel in the pixel point may be determined, and then the channel value of the B channel of the pixel point is corrected based on the color difference offset position of the B channel of the pixel point, and the channel value of the R channel of the pixel point is corrected based on the color difference offset position of the R channel of the pixel point, so as to finally obtain the initial corrected image corresponding to the first image.
For each pixel point in the first image, when determining the color difference offset position of the B channel of the pixel point, the color difference offset parameters corresponding to the B channel may be determined first, and then the third lateral color difference offset and the third longitudinal color difference offset corresponding to the B channel of the pixel point are determined based on the color difference offset parameters corresponding to the B channel, the position information of the pixel point in the first image, and the position information of the image center of the first image, so that the color difference offset position of the B channel of the pixel point is determined based on the third lateral color difference offset and the third longitudinal color difference offset. Similarly, the color difference offset position of the R channel of the pixel point may be determined in the above manner based on the color difference offset parameters corresponding to the R channel, which is not described herein again.
The determination manner of the color difference offset parameters corresponding to the B channel and the R channel is similar to the determination manner of the color difference offset parameters adopted when the transverse color difference correction is performed on the image in the RAW format, and is not described herein again.
Further, for the B channel and the R channel of each pixel point in the first image, taking the B channel as an example, after the color difference offset position corresponding to the B channel of the pixel point is determined, the two pixel points closest, in the first direction, to the third projection position of that color difference offset position may be determined as the first direction correction pixel points corresponding to the pixel point, and the B-channel value of the pixel point may be determined based on the B-channel value of each first direction correction pixel point, so that the B channel of each pixel point in the first image is corrected in the first direction to obtain a second image. Then, based on the fourth projection position, in the second direction, of the color difference offset position corresponding to the B channel of the pixel point, the two pixel points closest to the fourth projection position in the second direction are determined as the second direction correction pixel points corresponding to the pixel point, and the B-channel value of the pixel point in the second image is corrected based on the B-channel value of each second direction correction pixel point to obtain a third image. The first direction and the second direction are respectively different directions in the transverse direction and the longitudinal direction.
Alternatively, the B-channel value of the pixel point at which the color difference offset position corresponding to the B channel is located may be determined directly as the B-channel value of the pixel point, or the B-channel value of the pixel point may be determined based on the B-channel values of the pixel points around the color difference offset position, so as to obtain the third image.
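A minimal sketch of the second variant (sampling the B plane around the offset position) using plain bilinear interpolation; this is an illustrative simplification, not the patent's exact separable two-pass procedure, and the function name is hypothetical:

```python
import numpy as np

def resample_channel(plane, offset_pos):
    """Bilinearly sample a single-channel plane at fractional offset positions.

    plane: 2-D array (e.g. the B plane of the first image).
    offset_pos: (H, W, 2) array of (row, col) color difference offset positions.
    Returns the corrected channel values, one per pixel.
    """
    h, w = plane.shape
    r = np.clip(offset_pos[..., 0], 0, h - 1)
    c = np.clip(offset_pos[..., 1], 0, w - 1)
    r0 = np.floor(r).astype(int); c0 = np.floor(c).astype(int)
    r1 = np.minimum(r0 + 1, h - 1); c1 = np.minimum(c0 + 1, w - 1)
    fr, fc = r - r0, c - c0
    # Weighted average of the four surrounding pixels of the offset position.
    top = plane[r0, c0] * (1 - fc) + plane[r0, c1] * fc
    bot = plane[r1, c0] * (1 - fc) + plane[r1, c1] * fc
    return top * (1 - fr) + bot * fr
```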
Based on the above implementation, the correction of the B-channel value of each pixel point in the first image can be completed; similarly, the R-channel value of each pixel point can be corrected in the third image on the same principle, so as to obtain the initial corrected image corresponding to the first image.
If the color difference offset position of any channel of a pixel point in the first image falls outside the range of the first image, pixels around the first image may be supplemented in a mirror-image padding manner; the specific padding manner is not described herein.
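For reference, mirror-image padding of an image border is directly available in NumPy; the pad width of 1 here is only illustrative:

```python
import numpy as np

# Pad a small single-channel plane with one mirrored pixel on every side,
# so offset positions just outside the image still have valid samples.
plane = np.array([[1, 2],
                  [3, 4]])
padded = np.pad(plane, pad_width=1, mode="reflect")
```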
In some possible embodiments, after determining the initial correction image corresponding to the first image, the initial correction image may be processed to obtain a fourth image and a fifth image with a single channel, where each pixel in the fourth image corresponds to a B-channel pixel in the initial correction image corresponding to the first image, each pixel in the fifth image corresponds to an R-channel pixel in the initial correction image corresponding to the first image, and a G-channel gradient of each pixel in the fourth image and the fifth image in each correction direction is determined. Wherein the correction direction includes a lateral direction and a longitudinal direction.
Taking the fourth image as an example, the G-channel variation gradient of each pixel point in the fourth image in the transverse direction may be determined from the G-channel values of the pixel points adjacent to the left and right of the pixel point in the corresponding initial corrected image, and the G-channel variation gradient in the longitudinal direction may be determined from the G-channel values of the pixel points adjacent above and below the pixel point in the corresponding initial corrected image.
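The neighbour-based gradient computation just described can be sketched as follows for an interior pixel; boundary handling and the absolute-difference form of the gradient are assumptions:

```python
import numpy as np

def g_gradients(g_plane, i, j):
    """Gradients of the G channel at interior pixel (i, j).

    Transverse gradient from the left/right neighbours,
    longitudinal gradient from the top/bottom neighbours.
    """
    gx = abs(float(g_plane[i, j + 1]) - float(g_plane[i, j - 1]))  # transverse
    gy = abs(float(g_plane[i + 1, j]) - float(g_plane[i - 1, j]))  # longitudinal
    return gx, gy
```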
For each correction direction, first pixel points whose G-channel variation gradient in the correction direction exceeds the corresponding third gradient threshold may be determined from a preset region in the fourth image, where the preset region is a region of the fourth image in which the G-channel variation gradient corresponding to the correction direction is relatively smooth. For each first pixel point, the first pixel point and one pixel point adjacent to it in the correction direction may be determined as a first pixel point combination, so that each first pixel point corresponds to two first pixel point combinations in the correction direction.
Further, for each first pixel point, the difference between the G-channel variation gradients, in the correction direction, of the two pixel points in each first pixel point combination corresponding to the first pixel point may be determined, and the boundary between the two pixel points in the combination with the largest gradient difference is determined as a correction boundary corresponding to the correction direction.
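A one-dimensional sketch of this boundary selection along a row of gradients; the thresholding and the tie-breaking rule are assumptions for illustration:

```python
def correction_boundaries(grad_row, threshold):
    """For a 1-D row of G-channel gradients, return (left, right) index pairs
    marking correction boundaries: for each pixel whose gradient exceeds
    `threshold`, pick the neighbour pair with the larger gradient difference.
    """
    boundaries = []
    for j in range(1, len(grad_row) - 1):
        if grad_row[j] <= threshold:
            continue  # not a first pixel point
        left_diff = abs(grad_row[j] - grad_row[j - 1])
        right_diff = abs(grad_row[j] - grad_row[j + 1])
        if left_diff >= right_diff:
            boundaries.append((j - 1, j))   # boundary between pixels j-1 and j
        else:
            boundaries.append((j, j + 1))   # boundary between pixels j and j+1
    return boundaries
```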
Further, for each correction boundary, at least one second pixel point combination corresponding to the correction boundary may be determined, where each second pixel point combination includes one pixel point adjacent to the correction boundary in the fourth image and one pixel point adjacent to that pixel point in the correction direction corresponding to the correction boundary, and the pixel points in each second pixel point combination are located on the same side of the correction boundary. Since adjacent pixel points in a small smooth region of an RGB picture satisfy a constant color difference relationship, for a pixel point P(i, j) and an adjacent pixel point P(m, n) there is
R_ij - R_mn = G_ij - G_mn

B_ij - B_mn = G_ij - G_mn

where R_ij, B_ij and G_ij are respectively the R, B and G channel values of the pixel point P(i, j), and R_mn, B_mn and G_mn are respectively the R, B and G channel values of the pixel point P(m, n).
Therefore, for each second pixel point combination, the B-channel values of the pixel points in the combination may be corrected so that their difference coincides with the difference between the first channel values, where the difference between the first channel values is the difference between the G-channel values, in the first image, of the pixel points in the second pixel point combination.
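Under the constant color difference relationship above, correcting a pixel's B value from a reference neighbour reduces to one line; which pixel of the combination serves as the reference is an assumption here:

```python
def correct_pair(b_ref, g_ref, g_cur):
    """Correct the B value of the current pixel so that the B difference to its
    reference neighbour equals the G difference: B_cur - b_ref = g_cur - g_ref.
    """
    return b_ref + (g_cur - g_ref)
```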
It should be noted that, when correcting the B-channel value of each pixel point in the fourth image, all second pixel point combinations corresponding to all correction boundaries in the third direction may be determined first, and the B-channel value of each pixel point in these combinations corrected to obtain a candidate corrected image. Then, based on the candidate corrected image, the B-channel values of the pixel points in all second pixel point combinations corresponding to all correction boundaries in the fourth direction are updated and corrected, finally obtaining a sixth image.
Wherein the third direction and the fourth direction are different directions in the longitudinal direction and the transverse direction, respectively.
The B-channel value of each pixel point in the fourth image can thus be corrected in both the transverse and longitudinal directions to obtain a sixth image, and, by a similar implementation, the R-channel value of each pixel point in the fifth image can be corrected to obtain a seventh image, so that a target corrected image corresponding to the first image can be obtained based on the sixth image and the seventh image. That is, for each pixel point in the first image, its three RGB channel values can be redetermined based on the B-channel value of the corresponding pixel point in the sixth image, the R-channel value of the corresponding pixel point in the seventh image, and the original G-channel value, so as to obtain the target corrected image.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a lateral chromatic aberration correction device provided in an embodiment of the present application. The lateral chromatic aberration correcting device provided by the embodiment of the application comprises:
a correction module 61, configured to determine a color difference offset position of each first image point based on position information of each first image point in an image to be corrected, and correct a channel value of the first image point based on the color difference offset position of each first image point to obtain an initial corrected image, where the image to be corrected is a RAW image, and each first image point is an image point in the image to be corrected except a G channel image point;
A determining module 62, configured to determine a correction boundary corresponding to at least one correction direction based on the channel value of each image point in the initial correction image;
the correction module 61 is configured to determine at least one target image point combination corresponding to each correction boundary, and correct the channel value of each image point in each target image point combination based on the channel value of each image point in the initial correction image to obtain a target correction image;
wherein, each target image point combination corresponding to the correction boundary comprises an image point adjacent to the correction boundary in the initial correction image and an image point adjacent to the image point in the correction direction corresponding to the correction boundary, and each image point in each target image point combination is positioned on the same side of the corresponding correction boundary.
In some possible embodiments, the correction module 61 is configured to:
determining a chromatic aberration offset parameter corresponding to the R channel and a chromatic aberration offset parameter corresponding to the B channel;
determining a first transverse distance, a first longitudinal distance and a first linear distance of each first image point in the image to be corrected compared with the center of the image to be corrected based on the position information of the first image point; determining a first lateral chromatic aberration offset of the first image point based on the first lateral distance, the first linear distance, and a chromatic aberration offset parameter corresponding to the first image point, and determining a first longitudinal chromatic aberration offset of the first image point based on the first longitudinal distance, the first linear distance, and the chromatic aberration offset parameter corresponding to the first image point; a color difference offset location of the first image point is determined based on the first lateral color difference offset and the first longitudinal color difference offset of the first image point.
In some possible embodiments, the correction module 61 is configured to:
determining a first direction correction image point corresponding to the first image point from the image to be corrected based on a first projection position, in a first direction, of the color difference offset position of each first image point, and correcting the channel value of the first image point based on the channel value of the first direction correction image point to obtain an intermediate correction image; determining a second direction correction image point corresponding to the first image point from the intermediate correction image based on a second projection position, in a second direction, of the color difference offset position of each first image point, and correcting the channel value of the first image point in the intermediate correction image based on the channel value of the second direction correction image point in the image to be corrected to obtain an initial correction image, wherein the first direction and the second direction are respectively different directions in the transverse direction and the longitudinal direction; or,
and determining a plurality of target correction image points corresponding to the first image point from the image to be corrected based on the chromatic aberration offset position of each first image point in the image to be corrected, and correcting the channel value of the first image point based on the channel value of each target correction image point to obtain an initial correction image.
In some possible embodiments, the correction module 61 is configured to:
if the first projection position, in the first direction, of the color difference offset position of each first image point in the image to be corrected overlaps any second image point, determining the second image point as the first direction correction image point corresponding to the first image point, wherein each second image point is another image point in the image to be corrected corresponding to the same channel as the first image point;
and if the first projection position is not overlapped with each second image point, determining two second image points which are closest to the first projection position in the first direction of the first image point as first direction correction image points corresponding to the first image point.
In some possible embodiments, the correction module 61 is configured to:
determining a third image point closest to the chromatic aberration offset position of each first image point from the image to be corrected, wherein each first image point and the corresponding third image point respectively correspond to different channels in an R channel and a B channel;
for each first image point, determining four fourth image points nearest to a third image point corresponding to the first image point as target correction image points corresponding to the first image point, wherein each fourth image point corresponds to the same channel with the first image point.
In some possible embodiments, the determining module 62 is configured to:
determining a gradient of a G channel change of each image point in the initial correction image in at least one correction direction based on the channel value of each image point in the initial correction image;
and determining a correction boundary corresponding to each correction direction based on the gradient of the G-channel change of each image point in the initial correction image in each correction direction, wherein at least one correction direction comprises at least one of a transverse direction and a longitudinal direction.
In some possible embodiments, the determining module 62 is configured to:
for each image point in the initial correction image, if the image point is a B-channel image point or an R-channel image point, determining a G-channel variation gradient of the image point in the transverse direction based on the channel values of the G-channel image points adjacent to the left and right of the image point, and determining a G-channel variation gradient of the image point in the longitudinal direction based on the channel values of the G-channel image points adjacent to the upper and lower of the image point;
for each image point in the initial correction image, if the image point is a G-channel image point, determining a G-channel gradient of the image point in a transverse direction based on channel values of G-channel image points adjacent to four sides of each fifth image point, determining a G-channel gradient of the image point in a longitudinal direction based on channel values of G-channel image points adjacent to four sides of each sixth image point, wherein each fifth image point is adjacent to the G-channel image point in a left-right direction, and each sixth image point is adjacent to the G-channel image point in a top-bottom direction.
In some possible embodiments, for each of the correction directions, the determining module 62 is configured to:
determining a seventh image point in the initial correction image, wherein the seventh image point is a first image point in a preset area of the initial correction image, the gradient of the G channel change in the correction direction of the first image point exceeds a first gradient threshold value corresponding to the correction direction, and the gradient of the G channel change in the correction direction of each image point in the preset area is smaller than a second gradient threshold value;
for each seventh image point, determining each first image point combination corresponding to the seventh image point, wherein each first image point combination comprises the seventh image point and one image point adjacent to the seventh image point in the correction direction, determining gradient differences of gradient of G channel variation of two image points in each first image point combination in the correction direction, and determining boundaries of two image points in the first image point combination with the largest gradient differences as a correction boundary corresponding to the correction direction.
In some possible embodiments, the correction module 61 is configured to:
determining a correction image point combination corresponding to each target image point combination from the initial correction image, wherein the correction image point combination corresponding to each target image point combination and the target image point combination are positioned on the same side of a corresponding correction boundary, and each correction image point combination corresponding to each target image point combination is adjacent to the target image point combination in the correction direction corresponding to the target image point combination and comprises two image points with the same channel arrangement mode with the target image point combination;
And correcting the channel values of the image points in each target image point combination according to the channel values of the image points in the correction image point combination corresponding to each target image point combination based on the chromatic aberration constant relation to obtain a target correction image.
In some possible embodiments, the determining module 62 is configured to:
determining a second transverse chromatic aberration offset and a second longitudinal chromatic aberration offset of the B channel image point and the R channel image point corresponding to each preset G channel image point in the calibration image compared with the preset G channel image point;
determining a second transverse distance, a second longitudinal distance and a second linear distance of each preset G channel image point compared with the image center of the calibration image;
and for each of the B channel and the R channel, determining a color difference offset parameter corresponding to the channel based on the second transverse distance, the second longitudinal distance and the second linear distance corresponding to each preset G channel image point, and the second transverse color difference offset and the second longitudinal color difference offset corresponding to the channel.
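Determining a per-channel color difference offset parameter from calibration measurements can be posed as a least-squares fit; this sketch assumes the single-coefficient radial model used for illustration earlier, which is not necessarily the patent's exact parameterization:

```python
import numpy as np

def fit_offset_coefficient(dx, dy, off_x, off_y):
    """Least-squares fit of k in off_x ≈ k*r*dx, off_y ≈ k*r*dy.

    (dx, dy): positions of preset G channel image points relative to the
    image center of the calibration image (second transverse/longitudinal
    distances); (off_x, off_y): measured second transverse/longitudinal
    color difference offsets of the B (or R) image points.
    """
    dx, dy = np.asarray(dx, float), np.asarray(dy, float)
    r = np.hypot(dx, dy)                       # second linear distances
    a = np.concatenate([r * dx, r * dy])       # design vector (one unknown k)
    b = np.concatenate([np.asarray(off_x, float), np.asarray(off_y, float)])
    return float(np.dot(a, b) / np.dot(a, a))  # closed-form least squares
```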
In a specific implementation, the lateral chromatic aberration correction device may execute, through its built-in functional modules, the implementation provided by each step in fig. 1; for details, reference may be made to the implementation provided by each step, which is not described herein again.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic device 700 in the present embodiment may include: processor 701, network interface 704 and memory 705, in addition, the electronic device 700 described above may further include: a user interface 703, and at least one communication bus 702. Wherein the communication bus 702 is used to enable connected communications between these components. The user interface 703 may include a Display screen (Display), a Keyboard (Keyboard), and the optional user interface 703 may further include a standard wired interface, a wireless interface, among others. The network interface 704 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 705 may be a high-speed RAM memory or a non-volatile memory (NVM), such as at least one disk memory. The memory 705 may also optionally be at least one storage device located remotely from the processor 701. As shown in fig. 7, an operating system, a network communication module, a user interface module, and a device control application program may be included in the memory 705, which is one type of computer-readable storage medium.
In the electronic device 700 shown in fig. 7, the network interface 704 may provide network communication functions; while the user interface 703 is primarily used as an interface for providing input to an object; and processor 701 may be configured to invoke a device control application stored in memory 705 to implement:
Determining a chromatic aberration offset position of each first image point based on position information of each first image point in an image to be corrected, and correcting a channel value of the first image point based on the chromatic aberration offset position of each first image point to obtain an initial corrected image, wherein the image to be corrected is a RAW image, and each first image point is one image point except a G channel image point in the image to be corrected;
determining a correction boundary corresponding to at least one correction direction based on the channel value of each image point in the initial correction image;
determining at least one target image point combination corresponding to each correction boundary, and correcting the channel value of each image point in each target image point combination based on the channel value of each image point in the initial correction image to obtain a target correction image;
wherein, each target image point combination corresponding to the correction boundary comprises an image point adjacent to the correction boundary in the initial correction image and an image point adjacent to the image point in the correction direction corresponding to the correction boundary, and each image point in each target image point combination is positioned on the same side of the corresponding correction boundary.
In some possible embodiments, the processor 701 is configured to:
Determining a chromatic aberration offset parameter corresponding to the R channel and a chromatic aberration offset parameter corresponding to the B channel;
determining a first transverse distance, a first longitudinal distance and a first linear distance of each first image point in the image to be corrected compared with the center of the image to be corrected based on the position information of the first image point; determining a first lateral chromatic aberration offset of the first image point based on the first lateral distance, the first linear distance, and a chromatic aberration offset parameter corresponding to the first image point, and determining a first longitudinal chromatic aberration offset of the first image point based on the first longitudinal distance, the first linear distance, and the chromatic aberration offset parameter corresponding to the first image point; a color difference offset location of the first image point is determined based on the first lateral color difference offset and the first longitudinal color difference offset of the first image point.
In some possible embodiments, the processor 701 is configured to:
determining a first direction correction image point corresponding to the first image point from the image to be corrected based on a first projection position, in a first direction, of the color difference offset position of each first image point, and correcting the channel value of the first image point based on the channel value of the first direction correction image point to obtain an intermediate correction image; determining a second direction correction image point corresponding to the first image point from the intermediate correction image based on a second projection position, in a second direction, of the color difference offset position of each first image point, and correcting the channel value of the first image point in the intermediate correction image based on the channel value of the second direction correction image point in the image to be corrected to obtain an initial correction image, wherein the first direction and the second direction are respectively different directions in the transverse direction and the longitudinal direction; or,
And determining a plurality of target correction image points corresponding to the first image point from the image to be corrected based on the chromatic aberration offset position of each first image point in the image to be corrected, and correcting the channel value of the first image point based on the channel value of each target correction image point to obtain an initial correction image.
In some possible embodiments, the processor 701 is configured to:
if the first projection position, in the first direction, of the color difference offset position of each first image point in the image to be corrected overlaps any second image point, determining the second image point as the first direction correction image point corresponding to the first image point, wherein each second image point is another image point in the image to be corrected corresponding to the same channel as the first image point;
and if the first projection position is not overlapped with each second image point, determining two second image points which are closest to the first projection position in the first direction of the first image point as first direction correction image points corresponding to the first image point.
In some possible embodiments, the processor 701 is configured to:
determining a third image point closest to the chromatic aberration offset position of each first image point from the image to be corrected, wherein each first image point and the corresponding third image point respectively correspond to different channels in an R channel and a B channel;
For each first image point, determining four fourth image points nearest to a third image point corresponding to the first image point as target correction image points corresponding to the first image point, wherein each fourth image point corresponds to the same channel with the first image point.
In some possible embodiments, the processor 701 is configured to:
determining a gradient of a G channel change of each image point in the initial correction image in at least one correction direction based on the channel value of each image point in the initial correction image;
and determining a correction boundary corresponding to each correction direction based on the gradient of the G-channel change of each image point in the initial correction image in each correction direction, wherein at least one correction direction comprises at least one of a transverse direction and a longitudinal direction.
In some possible embodiments, the processor 701 is configured to:
for each image point in the initial correction image, if the image point is a B-channel image point or an R-channel image point, determining a G-channel variation gradient of the image point in the transverse direction based on the channel values of the G-channel image points adjacent to the left and right of the image point, and determining a G-channel variation gradient of the image point in the longitudinal direction based on the channel values of the G-channel image points adjacent to the upper and lower of the image point;
For each image point in the initial correction image, if the image point is a G-channel image point, determining a G-channel gradient of the image point in a transverse direction based on channel values of G-channel image points adjacent to four sides of each fifth image point, determining a G-channel gradient of the image point in a longitudinal direction based on channel values of G-channel image points adjacent to four sides of each sixth image point, wherein each fifth image point is adjacent to the G-channel image point in a left-right direction, and each sixth image point is adjacent to the G-channel image point in a top-bottom direction.
In some possible embodiments, for each of the correction directions, the processor 701 is configured to:
determining a seventh image point in the initial correction image, wherein the seventh image point is a first image point in a preset area of the initial correction image, the gradient of the G channel change in the correction direction of the first image point exceeds a first gradient threshold value corresponding to the correction direction, and the gradient of the G channel change in the correction direction of each image point in the preset area is smaller than a second gradient threshold value;
for each seventh image point, determining each first image point combination corresponding to the seventh image point, wherein each first image point combination comprises the seventh image point and one image point adjacent to the seventh image point in the correction direction, determining gradient differences of gradient of G channel variation of two image points in each first image point combination in the correction direction, and determining boundaries of two image points in the first image point combination with the largest gradient differences as a correction boundary corresponding to the correction direction.
In some possible embodiments, the processor 701 is configured to:
determining a correction image point combination corresponding to each target image point combination from the initial correction image, wherein the correction image point combination corresponding to each target image point combination and the target image point combination are positioned on the same side of a corresponding correction boundary, and each correction image point combination corresponding to each target image point combination is adjacent to the target image point combination in the correction direction corresponding to the target image point combination and comprises two image points with the same channel arrangement mode with the target image point combination;
and correcting the channel values of the image points in each target image point combination according to the channel values of the image points in the correction image point combination corresponding to each target image point combination based on the chromatic aberration constant relation to obtain a target correction image.
In some possible embodiments, the processor 701 is configured to:
determining a second transverse chromatic aberration offset and a second longitudinal chromatic aberration offset of the B channel image point and the R channel image point corresponding to each preset G channel image point in the calibration image compared with the preset G channel image point;
determining a second transverse distance, a second longitudinal distance and a second linear distance of each preset G channel image point compared with the image center of the calibration image;
and, for each of the B channel and the R channel, determining the chromatic aberration offset parameter corresponding to that channel based on the second transverse distance, the second longitudinal distance and the second linear distance corresponding to each preset G channel image point, together with the second transverse chromatic aberration offset and the second longitudinal chromatic aberration offset corresponding to that channel.
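Lateral chromatic aberration is commonly modelled as a radial polynomial in the distance to the image centre, and the offset parameters for a channel can then be recovered by a least-squares fit over the calibration points. The sketch below follows that common model with the inputs listed above; the polynomial form and the number of coefficients are assumptions, not taken from the patent:

```python
import numpy as np

def fit_offset_params(dx, dy, dist_x, dist_y, dist_r):
    """Least-squares fit of a radial-polynomial chromatic-aberration model
    for one channel (B or R).

    Assumed (illustrative) model:
        dx ~= dist_x * (k0 + k1*r**2 + k2*r**4)
        dy ~= dist_y * (k0 + k1*r**2 + k2*r**4)
    dx, dy: second transverse/longitudinal chromatic aberration offsets.
    dist_x, dist_y, dist_r: second transverse, longitudinal and linear
    distances of the preset G channel image points to the image centre.
    Returns the fitted coefficients (k0, k1, k2).
    """
    r2 = dist_r ** 2
    # Stack the transverse and longitudinal observations into one
    # overdetermined linear system A @ k = b and solve it in one shot.
    ax = np.stack([dist_x, dist_x * r2, dist_x * r2 ** 2], axis=1)
    ay = np.stack([dist_y, dist_y * r2, dist_y * r2 ** 2], axis=1)
    a = np.concatenate([ax, ay])
    b = np.concatenate([dx, dy])
    k, *_ = np.linalg.lstsq(a, b, rcond=None)
    return k
```

With the parameters in hand, the first-pass correction for an arbitrary image point reduces to evaluating the same polynomial at that point's distances, which matches the per-point offset computation described for the image to be corrected.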
It should be appreciated that, in some possible embodiments, the above-described processor 701 may be a central processing unit (CPU), and may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The memory may include read-only memory and random access memory, and provides instructions and data to the processor. A portion of the memory may also include non-volatile random access memory; for example, the memory may also store information on the device type.
In a specific implementation, the electronic device 700 may execute, through its built-in functional modules, the implementations provided by the steps in fig. 1; for details, reference may be made to the implementations provided by those steps, which are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium storing a computer program that, when executed by a processor, implements the method provided by the steps in fig. 1; for details, reference may be made to the implementations provided by those steps, which are not described herein again.
The computer-readable storage medium may be the apparatus provided in any one of the foregoing embodiments, or an internal storage unit of the electronic device, such as a hard disk or memory of the electronic device. The computer-readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the electronic device. The computer-readable storage medium may also include a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the electronic device. The computer-readable storage medium is used to store the computer program and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
Embodiments of the present application further provide a computer program product comprising a computer program which, when executed by a processor, implements the method provided by the steps in fig. 1.
The terms "first", "second", and the like in the claims, specification, and drawings of this application are used to distinguish different objects rather than to describe a particular order. Furthermore, the terms "comprise" and "have", and any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or electronic device that comprises a list of steps or elements is not limited to the listed steps or elements, but may optionally include other steps or elements not listed or inherent to such process, method, article, or electronic device. Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments. The term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein may be implemented in electronic hardware, in computer software, or in a combination of the two; to clearly illustrate the interchangeability of hardware and software, the elements and steps of the examples have been described above generally in terms of their function. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The foregoing disclosure describes only preferred embodiments of the present application and is not intended to limit the scope of the claims; equivalent variations made in accordance with the claims of the present application therefore remain within the scope of the present application.

Claims (13)

1. A method of correcting lateral chromatic aberration, the method comprising:
determining a chromatic aberration offset position of each first image point based on position information of the first image point in an image to be corrected, and correcting a channel value of each first image point based on its chromatic aberration offset position to obtain an initial corrected image, wherein the image to be corrected is a RAW image, and each first image point is an image point other than a G channel image point in the image to be corrected;

determining a correction boundary corresponding to at least one correction direction based on channel values of the image points in the initial corrected image;

determining at least one target image point combination corresponding to each correction boundary, and correcting the channel value of each image point in each target image point combination based on the channel values of the image points in the initial corrected image to obtain a target corrected image;

wherein each target image point combination corresponding to a correction boundary comprises an image point adjacent to the correction boundary in the initial corrected image and an image point adjacent to that image point in the correction direction corresponding to the correction boundary, and the image points in each target image point combination are located on the same side of the corresponding correction boundary.
2. The method of claim 1, wherein determining the chromatic aberration offset position of each first image point based on the position information of the first image point in the image to be corrected comprises:

determining a chromatic aberration offset parameter corresponding to the R channel and a chromatic aberration offset parameter corresponding to the B channel;

determining a first transverse distance, a first longitudinal distance and a first linear distance of each first image point in the image to be corrected relative to the image center of the image to be corrected based on the position information of the first image point; determining a first transverse chromatic aberration offset of the first image point based on the first transverse distance, the first linear distance and the chromatic aberration offset parameter corresponding to the first image point, and determining a first longitudinal chromatic aberration offset of the first image point based on the first longitudinal distance, the first linear distance and the chromatic aberration offset parameter corresponding to the first image point; and determining the chromatic aberration offset position of the first image point based on the first transverse chromatic aberration offset and the first longitudinal chromatic aberration offset of the first image point.
3. The method of claim 1, wherein correcting the channel value of each first image point based on the chromatic aberration offset position of the first image point to obtain the initial corrected image comprises:

determining a first direction correction image point corresponding to each first image point from the image to be corrected based on a first projection position, in a first direction, of the chromatic aberration offset position of the first image point, and correcting the channel value of the first image point based on the channel value of the first direction correction image point to obtain an intermediate correction image; determining a second direction correction image point corresponding to the first image point from the intermediate correction image based on a second projection position, in a second direction, of the chromatic aberration offset position of the first image point, and correcting the channel value of the first image point in the intermediate correction image based on the channel value, in the image to be corrected, of the second direction correction image point to obtain the initial corrected image, wherein the first direction and the second direction are respectively different ones of the transverse direction and the longitudinal direction; or

determining a plurality of target correction image points corresponding to each first image point from the image to be corrected based on the chromatic aberration offset position of the first image point, and correcting the channel value of the first image point based on the channel values of the target correction image points to obtain the initial corrected image.
4. The method of claim 3, wherein determining the first direction correction image point corresponding to each first image point from the image to be corrected based on the first projection position, in the first direction, of the chromatic aberration offset position of the first image point comprises:

if the first projection position, in the first direction, of the chromatic aberration offset position of the first image point coincides with any second image point, determining that second image point as the first direction correction image point corresponding to the first image point, wherein each second image point is another image point in the image to be corrected corresponding to the same channel as the first image point;

and if the first projection position does not coincide with any second image point, determining the two second image points closest to the first projection position in the first direction of the first image point as the first direction correction image points corresponding to the first image point.
5. The method of claim 3, wherein determining the plurality of target correction image points corresponding to each first image point from the image to be corrected based on the chromatic aberration offset position of the first image point comprises:

determining, from the image to be corrected, a third image point closest to the chromatic aberration offset position of each first image point, wherein each first image point and its corresponding third image point respectively correspond to different ones of the R channel and the B channel;

and for each first image point, determining the four fourth image points nearest to the third image point corresponding to the first image point as the target correction image points corresponding to the first image point, wherein each fourth image point corresponds to the same channel as the first image point.
6. The method of claim 1, wherein determining the correction boundary corresponding to the at least one correction direction based on the channel values of the image points in the initial corrected image comprises:

determining a G channel variation gradient, in the at least one correction direction, of each image point in the initial corrected image based on the channel values of the image points in the initial corrected image;

and determining the correction boundary corresponding to each correction direction based on the G channel variation gradient of each image point in the initial corrected image in that correction direction, wherein the at least one correction direction comprises at least one of the transverse direction and the longitudinal direction.
7. The method of claim 6, wherein determining the G channel variation gradient of each image point in the initial corrected image in the transverse direction based on the channel values of the image points in the initial corrected image comprises:

for each image point in the initial corrected image, if the image point is a B channel image point or an R channel image point, determining the G channel variation gradient of the image point in the transverse direction based on the channel values of the G channel image points adjacent to its left and right, and determining the G channel variation gradient of the image point in the longitudinal direction based on the channel values of the G channel image points adjacent above and below it;

and for each image point in the initial corrected image, if the image point is a G channel image point, determining the G channel variation gradient of the image point in the transverse direction based on the channel values of the G channel image points adjacent to the four sides of each fifth image point, and determining the G channel variation gradient of the image point in the longitudinal direction based on the channel values of the G channel image points adjacent to the four sides of each sixth image point, wherein each fifth image point is adjacent to the G channel image point in the left-right direction and each sixth image point is adjacent to the G channel image point in the up-down direction.
8. The method of claim 6, wherein, for each correction direction, determining the correction boundary corresponding to the correction direction based on the G channel variation gradient of each image point in the initial corrected image in that correction direction comprises:

determining a seventh image point in the initial corrected image, wherein the seventh image point is the first image point, within a preset area of the initial corrected image, whose G channel variation gradient in the correction direction exceeds a first gradient threshold corresponding to the correction direction, and the G channel variation gradient in the correction direction of each image point in the preset area is smaller than a second gradient threshold;

and for each seventh image point, determining each first image point combination corresponding to the seventh image point, wherein each first image point combination comprises the seventh image point and one image point adjacent to the seventh image point in the correction direction; determining the difference between the G channel variation gradients, in the correction direction, of the two image points in each first image point combination; and determining the boundary between the two image points of the first image point combination with the largest gradient difference as a correction boundary corresponding to the correction direction.
9. The method of claim 1, wherein correcting the channel value of each image point in each target image point combination based on the channel values of the image points in the initial corrected image comprises:

determining a correction image point combination corresponding to each target image point combination from the initial corrected image, wherein the correction image point combination corresponding to each target image point combination is located on the same side of the corresponding correction boundary as the target image point combination, is adjacent to the target image point combination in the correction direction corresponding to the target image point combination, and comprises two image points whose channel arrangement is the same as that of the target image point combination;

and correcting the channel values of the image points in each target image point combination according to the channel values of the image points in the corresponding correction image point combination, based on the constant chromatic aberration relation, to obtain the target corrected image.
10. The method of claim 2, wherein determining the chromatic aberration offset parameter corresponding to the R channel and the chromatic aberration offset parameter corresponding to the B channel comprises:

determining, for each preset G channel image point in the calibration image, a second transverse chromatic aberration offset and a second longitudinal chromatic aberration offset of the corresponding B channel image point and R channel image point relative to that preset G channel image point;

determining a second transverse distance, a second longitudinal distance and a second linear distance of each preset G channel image point relative to the image center of the calibration image;

and, for each of the B channel and the R channel, determining the chromatic aberration offset parameter corresponding to that channel based on the second transverse distance, the second longitudinal distance and the second linear distance corresponding to each preset G channel image point, together with the second transverse chromatic aberration offset and the second longitudinal chromatic aberration offset corresponding to that channel.
11. A lateral chromatic aberration correction device, the device comprising:
the correction module is used for determining a chromatic aberration offset position of each first image point based on position information of the first image point in an image to be corrected, and correcting a channel value of each first image point based on its chromatic aberration offset position to obtain an initial corrected image, wherein the image to be corrected is a RAW image, and each first image point is an image point other than a G channel image point in the image to be corrected;

the determining module is used for determining a correction boundary corresponding to at least one correction direction based on the channel values of the image points in the initial corrected image;

the correction module is further used for determining at least one target image point combination corresponding to each correction boundary, and correcting the channel value of each image point in each target image point combination based on the channel values of the image points in the initial corrected image to obtain a target corrected image;

wherein each target image point combination corresponding to a correction boundary comprises an image point adjacent to the correction boundary in the initial corrected image and an image point adjacent to that image point in the correction direction corresponding to the correction boundary, and the image points in each target image point combination are located on the same side of the corresponding correction boundary.
12. An electronic device comprising a processor and a memory, the processor and the memory being interconnected;
the memory is used for storing a computer program;
the processor is configured to perform the method of any of claims 1 to 10 when the computer program is invoked.
13. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which is executed by a processor to implement the method of any one of claims 1 to 10.
CN202211066007.4A 2022-08-31 2022-08-31 Lateral chromatic aberration correction method, device, equipment and storage medium Active CN115499629B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211066007.4A CN115499629B (en) 2022-08-31 2022-08-31 Lateral chromatic aberration correction method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN115499629A (en) 2022-12-20
CN115499629B (en) 2023-05-26

Family

ID=84468242

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211066007.4A Active CN115499629B (en) 2022-08-31 2022-08-31 Lateral chromatic aberration correction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115499629B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111242863A (en) * 2020-01-09 2020-06-05 上海酷芯微电子有限公司 Method and medium for eliminating lens lateral chromatic aberration based on image processor
CN113905183A (en) * 2021-08-25 2022-01-07 珠海全志科技股份有限公司 Chromatic aberration correction method and device for wide dynamic range image
CN114359050A (en) * 2021-12-28 2022-04-15 北京奕斯伟计算技术有限公司 Image processing method, image processing apparatus, computer device, storage medium, and program product
CN114943658A (en) * 2022-06-09 2022-08-26 豪威科技(武汉)有限公司 Color edge removing method based on transverse chromatic aberration calibration

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013240022A (en) * 2012-05-17 2013-11-28 Canon Inc Image processing apparatus, imaging apparatus, and image processing method


Also Published As

Publication number Publication date
CN115499629A (en) 2022-12-20

Similar Documents

Publication Publication Date Title
US11875475B2 (en) Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US9532018B2 (en) Projection system, device and method for the output of calibration projection scenes
US9661257B2 (en) Projection system, image processing device, and projection method
US20220076391A1 (en) Image Distortion Correction Method and Apparatus
US8581995B2 (en) Method and apparatus for parallax correction in fused array imaging systems
JP4815807B2 (en) Image processing apparatus, image processing program, and electronic camera for detecting chromatic aberration of magnification from RAW data
US8699820B2 (en) Image processing apparatus, camera apparatus, image processing method, and program
US20090160997A1 (en) Imaging device
US9996903B2 (en) Super-resolution in processing images such as from multi-layer sensors
US20100033584A1 (en) Image processing device, storage medium storing image processing program, and image pickup apparatus
US9143756B2 (en) Camera module, photographing method, and electronic apparatus
US8818128B2 (en) Image processing apparatus, image processing method, and program
US8385686B2 (en) Image processing method based on partitioning of image data, image processing device based on partitioning image data and program
US8908988B2 (en) Method and system for recovering a code image including blurring
JP6838918B2 (en) Image data processing device and method
CN115499629B (en) Lateral chromatic aberration correction method, device, equipment and storage medium
JP5446285B2 (en) Image processing apparatus and image processing method
JP6415094B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
US20080316343A1 (en) Method and Apparatus For Allowing Access to Individual Memory
JP2002320237A (en) Method for detecting chromatic aberration in magnification
JP2019121972A (en) Image processing method, image processing apparatus, imaging apparatus, image processing program, and storage medium
US11893706B2 (en) Image correction device
JP5423221B2 (en) Image determination apparatus, image determination program, and image determination method
CN110268439B (en) Motion image corner point sequencer
KR102172634B1 (en) Camera module, electronic device, and method for operating the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant