CN113132562A - Lens shadow correction method and device and electronic equipment - Google Patents


Publication number
CN113132562A
Authority
CN
China
Prior art keywords
channel
raw image
compensation
pixel point
color
Prior art date
Legal status
Granted
Application number
CN202110432799.1A
Other languages
Chinese (zh)
Other versions
CN113132562B (en)
Inventor
黄志明
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202110432799.1A
Publication of CN113132562A
Application granted
Publication of CN113132562B
Legal status: Active

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • H04N23/81 - Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 - Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 - Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 - Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 - Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements


Abstract

The application discloses a lens shading correction method and apparatus and an electronic device, belonging to the technical field of image processing. The method comprises the following steps: acquiring a first RAW image, and downsampling the first RAW image to obtain a second RAW image, wherein the first RAW image is an original RAW image to be corrected; pre-compensating the second RAW image based on preset first compensation information to obtain a third RAW image; acquiring a color cast type of the third RAW image according to color values of pixel points in the third RAW image; acquiring an adjustment coefficient corresponding to the color cast type of the third RAW image according to a preset mapping relation between color cast types and compensation-information adjustment coefficients; adjusting the first compensation information according to the adjustment coefficient to obtain second compensation information; and compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction.

Description

Lens shadow correction method and device and electronic equipment
Technical Field
The application belongs to the technical field of image processing, and particularly relates to a lens shadow correction method and device and electronic equipment.
Background
In recent years, with the rapid development of internet technology and upgrades in device hardware, the functions of electronic devices have become increasingly rich, and more and more users use them for entertainment, for example shooting images or videos. When an image or a video is shot, imaging quality is determined by the imaging hardware and the software algorithms in the electronic device, where the core components of the imaging hardware include the lens, an infrared cut-off filter and a CMOS/CCD sensor.
Lens shading is caused by the optical characteristics of the lens, mechanical structure deviations of the lens module, and inconsistency between the micro lenses (micro lens) and the Chief Ray Angle (CRA) of the lens. Lens shading seriously affects imaging quality: it not only attenuates image brightness outward from the center of the image (luma shading), but also causes color shading between the periphery and the center of the image. The effect is especially pronounced in the wide-angle lenses of small camera modules, and therefore lens shading needs to be corrected.
In the prior art, lens shading correction methods mainly include the cos⁴θ function-fitting method and the grid correction method. Although both methods can mitigate lens shading to some extent, they are mainly suitable for scenes with low imaging-quality requirements. When the demands on imaging quality are high, for example when color cast must be eliminated across a variety of scenes, neither method meets the application requirements.
Disclosure of Invention
The embodiments of the application aim to provide a lens shading correction method, a lens shading correction apparatus and an electronic device, which can solve the technical problem that prior-art methods cannot meet high imaging-quality requirements.
In a first aspect, an embodiment of the present application provides a lens shading correction method, where the method includes:
acquiring a first RAW image, and performing downsampling on the first RAW image to obtain a second RAW image, wherein the first RAW image is an original RAW image to be corrected;
pre-compensating the second RAW image based on preset first compensation information to obtain a third RAW image;
acquiring a color cast type of the third RAW image according to color values of pixel points in the third RAW image;
acquiring an adjustment coefficient corresponding to the color cast type of the third RAW image according to a preset mapping relation between the color cast type and the compensation information adjustment coefficient;
adjusting the first compensation information according to the adjustment coefficient to obtain second compensation information;
and compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction.
In a second aspect, an embodiment of the present application provides a lens shading correction apparatus, including:
the sampling module is used for acquiring a first RAW image and performing downsampling on the first RAW image to obtain a second RAW image, wherein the first RAW image is an original RAW image to be corrected;
the pre-compensation module is used for pre-compensating the second RAW image based on preset first compensation information to obtain a third RAW image;
the first obtaining module is used for obtaining the color cast type of the third RAW image according to the color value of the pixel point in the third RAW image;
the second obtaining module is used for obtaining an adjusting coefficient corresponding to the color cast type of the third RAW image according to a mapping relation between a preset color cast type and a compensation information adjusting coefficient;
the adjusting module is used for adjusting the first compensation information according to the adjusting coefficient to obtain second compensation information;
and the correction module is used for compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the lens shading correction method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the lens shading correction method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the lens shading correction method according to the first aspect.
In the embodiment of the application, the RAW image can be precompensated through the compensation information, the color cast type of the RAW image is obtained according to the color values of the pixel points in the precompensated RAW image, the dynamic adjustment coefficient of the compensation information is obtained according to the color cast type, and the lens shade of the RAW image is corrected according to the adjustment coefficient and the compensation information, so that the RAW image with higher imaging quality is obtained. Compared with the prior art, in the embodiment of the application, only a group of pre-calibrated compensation information needs to be provided, and the compensation information is dynamically adjusted by analyzing and estimating the lens shading of the current scene, so that the compensation purpose is accurately achieved, and the scene with higher requirements on imaging quality can be met.
Drawings
Fig. 1 is a flowchart of a lens shading correction method according to an embodiment of the present application;
FIG. 2 is a diagram illustrating a process of calculating a hue component according to an embodiment of the present disclosure;
FIG. 3 is an exemplary diagram of red hue components of different color cast types provided by an embodiment of the present application;
fig. 4 is a block diagram illustrating a structure of a lens shading correction apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 6 is a hardware structure diagram of an electronic device implementing various embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It should be appreciated that data so used may be interchanged under appropriate circumstances, so that embodiments of the application may be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second" and the like do not limit the number of objects; for example, the first object may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates that the preceding and succeeding objects are in an "or" relationship.
The embodiment of the application provides a lens shadow correction method and device and electronic equipment.
The lens shading correction method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
It should be noted that the lens shading correction method provided in the embodiments of the present application is applicable to an electronic device. In practical applications, the electronic device may be a mobile terminal such as a smartphone, a tablet computer or a personal digital assistant, which is not limited in the embodiments of the present application.
Fig. 1 is a flowchart of a lens shading correction method provided in an embodiment of the present application. As shown in fig. 1, the method may include the following steps: step 101, step 102, step 103, step 104, step 105 and step 106.
in step 101, a first RAW image is obtained and downsampled to obtain a second RAW image, where the first RAW image is an original RAW image to be corrected.
The image sensor is a device for converting optical signals into electric signals, each photosensitive unit on the image sensor is called a pixel, and the pixel value of each pixel is used for representing the intensity of illumination sensed by the pixel, but the color cannot be represented.
In order to characterize the Color, the surface of the image sensor is covered with a layer of CFA (Color filter array), the CFA includes a plurality of Color filter elements, each Color filter element corresponds to a pixel point on the image sensor, and each Color filter element only allows light of a single Color to pass through and be captured by the image sensor. Therefore, the pixel value of the pixel point on the image sensor can be used for representing the illumination intensity of the light with the specific color.
An image in RAW format (i.e., a RAW image) is output directly by the image sensor without processing by software algorithms, and is generally composed of four channels, R, GR, B and GB, arranged in a specific Bayer pattern. For example, the uppermost image in fig. 2 is a RAW data map arranged in the R-G(R)-G(B)-B pattern.
In the embodiment of the application, it is considered that lens shading changes slowly from the edge of the image to the center, while a RAW image at the original size is large and consumes considerable computing resources during processing. Therefore, in practical applications, the final compensation table need not be determined from the original-size RAW image; the original-size RAW image can first be downsampled, and the final compensation table determined from the downsampled RAW image. Preferably, the size of the downsampled RAW image does not exceed 100 × 100.
In the embodiment of the present application, the first RAW image is an image of an original size, and the second RAW image is a sampled image.
In the embodiment of the present application, any downsampling method in the related art may be adopted to downsample the first RAW image, which is not limited in the embodiment of the present application.
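Since the patent leaves the downsampling method open, one illustrative choice is block-averaging each Bayer color plane, which shrinks the image while preserving the R-G(R)-G(B)-B mosaic. A minimal NumPy sketch (function name and block size are assumptions, not taken from the patent):

```python
import numpy as np

def downsample_raw(raw, block):
    """Average `block` x `block` groups of Bayer quads per output quad,
    preserving the R-G(R)-G(B)-B mosaic. Illustrative helper; the patent
    does not prescribe a particular downsampling method."""
    h, w = raw.shape
    out = np.empty((h // (2 * block) * 2, w // (2 * block) * 2), dtype=np.float64)
    for dy in range(2):                      # position inside the 2x2 mosaic
        for dx in range(2):
            plane = raw[dy::2, dx::2].astype(np.float64)  # one color plane
            ph = plane.shape[0] // block * block
            pw = plane.shape[1] // block * block
            plane = plane[:ph, :pw]
            out[dy::2, dx::2] = plane.reshape(ph // block, block,
                                              pw // block, block).mean(axis=(1, 3))
    return out

# an 8x8 RAW image shrinks to 4x4 when 2x2 groups of quads are averaged
raw = np.arange(64, dtype=np.float64).reshape(8, 8)
small = downsample_raw(raw, 2)   # small[0, 0] averages R samples 0, 2, 16, 18
```

Any method that downsamples each color plane independently keeps pixels of like color aligned, which is what the later per-channel processing relies on.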
In step 102, the second RAW image is pre-compensated based on the preset first compensation information, so as to obtain a third RAW image.
In the embodiment of the present application, the organization form of the compensation information may be a compensation table or other forms. When the organization form of the compensation information is the compensation table, the first compensation information is the first compensation table. For ease of understanding, the first compensation table will be described as an example.
In the embodiment of the application, the second RAW image is pre-compensated through the first compensation table so as to complement the brightness of the image, eliminate the color influence and facilitate the subsequent determination of the color cast type.
In the embodiment of the present application, the first compensation table generally needs to be calibrated in a laboratory environment and stored in advance; that is, the first compensation table is generated in advance. The specific operations are as follows:
shooting, under a specific light source, a flat image in which the differences in color and brightness are smaller than a preset threshold;
downsampling the flat image, and obtaining the color values of the pixel points corresponding to the R channel, the GR channel, the GB channel and the B channel in the downsampled flat image;
and, for each color channel P, obtaining the maximum value Max among the color values of the pixel points corresponding to channel P, and dividing the color value of each pixel point corresponding to channel P by Max to obtain the compensation table of channel P.
In the embodiment of the present application, the color and brightness of the flat image may be substantially the same.
Compared with the method for generating the compensation table directly based on the flat image with the original size, the compensation table generated based on the sampled flat image only needs to occupy less computing resources and storage resources in the embodiment of the application.
In one example, the specific light source is D65. A flat image with substantially the same color and brightness is captured under D65, the flat image is downsampled, the downsampled RAW image is split into four channels, i.e., R, GR, B and GB, and the compensation tables gain_R, gain_GR, gain_B and gain_GB of the corresponding channels are solved respectively. The method for solving a channel's compensation table is as follows: take the maximum value of the channel as Max, then divide each point of the channel by Max to obtain the channel's compensation table; the size of the compensation table is the same as that of the channel.
In this embodiment of the application, when the second RAW image is pre-compensated based on the first compensation table, the color value of each pixel point in the second RAW image is multiplied by the gain value at the same position on the first compensation table, so that a pre-compensated third RAW image is obtained.
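The calibration and pre-compensation steps can be sketched together as follows. One assumption to flag: the translated text stores "color value / Max", which yields gains ≤ 1, yet step 102 multiplies pixels by the table; the sketch therefore stores the reciprocal, Max / value, so that multiplication actually restores the dimmed periphery. This is a reading of the intent, not a literal transcription:

```python
import numpy as np

BAYER = {"R": (0, 0), "GR": (0, 1), "GB": (1, 0), "B": (1, 1)}  # R-G(R)/G(B)-B

def calibrate_gain_tables(flat):
    """Per-channel gain tables from a downsampled flat-field RAW image.
    Assumption: store Max / value (reciprocal of the translated
    'value / Max') so the multiplicative step 102 brightens corners."""
    planes = {n: flat[dy::2, dx::2].astype(np.float64)
              for n, (dy, dx) in BAYER.items()}
    return {n: p.max() / p for n, p in planes.items()}

def precompensate(raw, gains):
    """Step 102: multiply each pixel by the gain at the same position."""
    out = raw.astype(np.float64).copy()
    for n, (dy, dx) in BAYER.items():
        out[dy::2, dx::2] *= gains[n]
    return out

flat = np.full((4, 4), 100.0)
flat[0, 0] = 50.0                 # one vignetted sample in the R plane
gains = calibrate_gain_tables(flat)
corrected = precompensate(flat, gains)   # flat field becomes uniform again
```

On a flat-field input the round trip is exact by construction, which is precisely the calibration property the patent relies on.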
In step 103, a color cast type of the third RAW image is obtained according to a color value of a pixel point in the third RAW image.
In the embodiment of the present application, the color cast type may include: a reddish center with a greenish periphery, a greenish center with a reddish periphery, and substantially no significant change from periphery to center, etc.
In this embodiment, when the first RAW image is arranged in the R-G(R)-G(B)-B pattern, the second RAW image is also arranged in that pattern, and correspondingly so is the third RAW image. Step 103 may specifically include the following steps (not shown in the figure): step 1031, step 1032, step 1033, step 1034 and step 1035.
in step 1031, the color values of the R channel corresponding pixels, the GR channel corresponding pixels, the GB channel corresponding pixels, and the B channel corresponding pixels in the third RAW image are read.
In the embodiment of the present application, the third RAW image can be split into four separate color channels, namely the R channel, the GR channel, the GB channel and the B channel. After the four channels are split, the color values of the pixel points corresponding to each channel can be obtained, respectively: the color values of the pixel points corresponding to the R channel, the GR channel, the GB channel and the B channel.
In an example, as shown in fig. 2, the image of the first layer from top to bottom is a second RAW image arranged in the R-G(R)-G(B)-B pattern. Splitting the second RAW image into four separate color channels gives the split result in the second layer, which comprises, from left to right: the color-value set {R11, R12, …, R44} of the pixel points corresponding to the R channel, the set {GR11, GR12, …, GR44} for the GR channel, the set {B11, B12, …, B44} for the B channel, and the set {GB11, GB12, …, GB44} for the GB channel.
In step 1032, the hue component of the R channel of the third RAW image is calculated from the color values of the pixel points corresponding to the R channel and to the GR channel, where the hue component of a pixel point of the R channel is equal to the color value of that pixel point in the R channel divided by the color value of the pixel point at the corresponding position in the GR channel.
In step 1033, the hue component of the B channel of the third RAW image is calculated from the color values of the pixel points corresponding to the B channel and to the GB channel, where the hue component of a pixel point of the B channel is equal to the color value of that pixel point in the B channel divided by the color value of the pixel point at the corresponding position in the GB channel.
In the embodiment of the application, after the four channels are split, the hue component Hr of the pixel point corresponding to the R channel and the hue component Hb of the pixel point corresponding to the B channel can be respectively calculated by the following formulas:
Hr(i, j) = R(i, j) / GR(i, j),    Hb(i, j) = B(i, j) / GB(i, j)
wherein the division in the formulas is performed element-wise, point by point.
In one example, following the example in step 1031, each color value in the R-channel set {R11, R12, …, R44} is divided, point by point, by the corresponding color value in the GR-channel set {GR11, GR12, …, GR44}, i.e., R11/GR11, R12/GR12, …, R44/GR44, obtaining the set {Hr11, Hr12, …, Hr44} of hue components of the R channel of the second RAW image, as shown on the left of the lowermost layer in fig. 2;
and each color value in the B-channel set {B11, B12, …, B44} is divided, point by point, by the corresponding color value in the GB-channel set {GB11, GB12, …, GB44}, i.e., B11/GB11, B12/GB12, …, B44/GB44, obtaining the set {Hb11, Hb12, …, Hb44} of hue components of the B channel of the second RAW image, as shown on the right of the lowermost layer in fig. 2.
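The channel split and point-by-point hue ratios of steps 1031 to 1033 can be sketched directly; the tiny 4 × 4 RAW below is an illustrative input, with values chosen so both ratios come out to 2:

```python
import numpy as np

# Tiny 4x4 RAW in R-G(R)/G(B)-B order; values chosen so both ratios are 2.
raw = np.array([[100.,  50.,  80.,  40.],
                [ 60., 120.,  60., 120.],
                [ 90.,  45.,  70.,  35.],
                [ 60., 120.,  60., 120.]])

R  = raw[0::2, 0::2]   # step 1031: split the four color channels
GR = raw[0::2, 1::2]
GB = raw[1::2, 0::2]
B  = raw[1::2, 1::2]

Hr = R / GR            # step 1032: point-by-point ratio R / GR
Hb = B / GB            # step 1033: point-by-point ratio B / GB
```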
In step 1034, the hue component of the R channel and the hue component of the B channel of the third RAW image are low-pass filtered to obtain a low-frequency portion of the hue component of the R channel and a low-frequency portion of the hue component of the B channel.
Considering that the color cast type in the lens shading is slowly changed from the four corners to the center of the image, which can be reflected by the low frequency part of the hue component, in the embodiment of the present application, the low frequency part of the hue component of the third RAW image can be separated, and the color cast type of the third RAW image can be analyzed according to the low frequency part of the hue component.
In the embodiment of the present application, the low-frequency portion of the hue component may be obtained by designing a low-pass filter, and a specific solving method may be filtering in a frequency domain through fourier transform, may also be processing in a spatial domain through a gradient map, and may also adopt other similar methods, which is not limited in the embodiment of the present application.
In the embodiment of the present application, when the spatial-domain low-pass filtering method is adopted, a gradient map Gr may be generated from the hue component of the R channel of the third RAW image, and a gradient map Gb from the hue component of the B channel; filtering is then performed on Gr and Gb according to a preset low-pass filtering formula to obtain the filtered Gr(i, j) and Gb(i, j) data distributions. The preset low-pass filtering formula may take the piecewise form:
Gr(i, j) = Gr(i, j) if |Gr(i, j)| ≤ θr, otherwise 0;    Gb(i, j) = Gb(i, j) if |Gb(i, j)| ≤ θb, otherwise 0
The Gr(i, j) data distribution is used to characterize the low-frequency part of the hue component of the R channel, and the Gb(i, j) data distribution the low-frequency part of the hue component of the B channel; i and j represent the abscissa and ordinate of the corresponding point, and θr and θb are preset thresholds.
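A spatial-domain reading of step 1034 can be sketched as follows, under the assumption that the low-pass formula zeroes gradient magnitudes above the threshold (the patent gives the exact formula only as an image, so this piecewise clipping is one plausible interpretation):

```python
import numpy as np

def lowpass_gradient(H, theta):
    """Sketch of step 1034: gradient of a hue-component map with entries
    whose magnitude exceeds theta zeroed, keeping only the slowly varying
    (low-frequency) part. Assumed reading of the patent's image-only
    formula, not a literal transcription."""
    gy, gx = np.gradient(H)
    G = np.hypot(gx, gy)        # gradient magnitude per point
    G[G > theta] = 0.0          # suppress high-frequency detail
    return G

H = np.outer(np.linspace(1.0, 1.1, 5), np.ones(5))  # slow vertical ramp
H[2, 2] += 5.0                                      # sharp outlier (noise)
G = lowpass_gradient(H, theta=0.1)                  # ramp kept, outlier edges zeroed
```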
In step 1035, a color cast type of the third RAW image is acquired from the low frequency part of the hue component of the R channel and the low frequency part of the hue component of the B channel.
In the embodiment of the present application, the color cast type Cr of the third RAW image in the R channel may be obtained according to the low-frequency part of the hue component of the R channel; specifically, Cr is acquired according to how the Gr(i, j) data distribution changes from the periphery to the center.
Similarly, the color cast type Cb of the third RAW image in the B channel may be obtained according to the low-frequency part of the hue component of the B channel; specifically, Cb is acquired according to how the Gb(i, j) data distribution changes from the periphery to the center.
For ease of understanding, taking the hue component of the R channel as an example, the estimation method for the color cast type is described as follows. Since the color cast component changes slowly from the four corners of the image to the center, the type can be identified from the differences between hue components at different positions. Generally, if the color cast appears as a redder center with a greener periphery, as shown in the left image of fig. 3, the corresponding gradient map changes from the periphery to the center first positively and then negatively; conversely, as shown in the right image of fig. 3, if the color cast appears as a greener center with a redder periphery, the corresponding gradient map changes from the periphery to the center first negatively and then positively; if no significant color cast is present, the corresponding gradient map shows essentially no significant change from the periphery to the center.
In the embodiment of the present application, the color cast type of the pre-compensated RAW image can be estimated according to the above determination method and the obtained gradient map information.
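One hedged sketch of this estimation: scan the low-frequency hue map from the border toward the center and classify by the sign of the radial change. The return codes, the single scan line, and the "no significant change" threshold of 1e-3 are all illustrative assumptions, not values from the patent:

```python
import numpy as np

def classify_color_cast(hue_low):
    """Classify the color cast from a low-frequency hue-component map by
    the sign of its change from the border toward the center:
      +1  center reddish, periphery greenish
      -1  center greenish, periphery reddish
       0  no significant color cast
    Scan line and 1e-3 threshold are illustrative assumptions."""
    n = hue_low.shape[0]
    profile = hue_low[n // 2, : n // 2 + 1]   # border -> center along one row
    diffs = np.diff(profile)                  # signed radial change
    if diffs.size == 0 or np.all(np.abs(diffs) < 1e-3):
        return 0
    return 1 if diffs[0] > 0 else -1

x = np.linspace(-1.0, 1.0, 7)
bump = 1.0 + 0.2 * np.exp(-(x[None, :] ** 2 + x[:, None] ** 2))  # reddish center
kind = classify_color_cast(bump)
```

A production version would average several radial profiles and look for the sign flip the patent describes, rather than trusting one row.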
In step 104, an adjustment coefficient corresponding to the color cast type of the third RAW image is obtained according to a mapping relationship between a preset color cast type and the compensation information adjustment coefficient.
In this embodiment of the application, after the color cast type of the third RAW image is obtained, a corresponding dynamic adjustment coefficient needs to be obtained.
In the embodiment of the application, the coefficient Sr corresponding to Cr and the coefficient Sb corresponding to Cb can be obtained according to the mapping relationship between the preset color cast type and the compensation information adjustment coefficient; that is, the adjustment coefficient corresponding to the color cast type of the third RAW image in the R channel and the adjustment coefficient corresponding to the color cast type of the third RAW image in the B channel.
In the embodiment of the present application, the sign of the adjustment coefficient indicates the direction in which the first compensation table is adjusted, distinguishing the phenomenon of a reddish center with a greenish periphery from that of a greenish center with a reddish periphery; the absolute value of the adjustment coefficient indicates the degree of color cast, i.e., the speed of the gradient change, which translates into the adjustment amplitude of the first compensation table. When the adjustment coefficient is 0, there is no obvious color cast.
In step 105, the first compensation information is adjusted according to the adjustment coefficient corresponding to the color cast type of the third RAW image to obtain second compensation information.
In the embodiment of the present application, the relationship among the adjustment coefficient S, the first compensation information T and the second compensation information T′ may be: T′ = S × (T − 1) + T.
In this embodiment, the compensation value of the R channel in the first compensation information may be adjusted according to Sr, and the compensation value of the B channel in the first compensation information may be adjusted according to Sb, so as to obtain the second compensation information. That is, only the compensation values of the R channel and the B channel in the first compensation information are adjusted according to the dynamically adjusted coefficient, and the compensation values of the GR channel and the GB channel remain unchanged.
In the embodiment of the application, when the organization form of the compensation information is the compensation table, the first compensation table is adjusted according to the adjustment coefficient to obtain the second compensation table. That is, the second compensation information is a second compensation table.
In step 106, the first RAW image is compensated based on the first compensation information and the second compensation information, so as to obtain a fourth RAW image after lens shading correction.
In the embodiment of the application, the first RAW image can be compensated based on the first compensation information to obtain an intermediate RAW image; the intermediate RAW image is then compensated based on the second compensation information to obtain the fourth RAW image after lens shading correction. That is, the first RAW image is first pre-compensated based on the first compensation information and then dynamically compensated using the second compensation information.
Considering that the sizes of the first and second compensation tables are much smaller than that of the first RAW image, in one implementation of the embodiment of the present application the first and second compensation tables may first be upsampled to obtain a third and a fourth compensation table of the same size as the first RAW image; the color value of each pixel point in the first RAW image is then multiplied by the compensation value at the corresponding position in the third compensation table to obtain an intermediate RAW image; finally, the color value of each pixel point corresponding to the R channel in the intermediate RAW image is multiplied by the compensation value at the corresponding position in the fourth compensation table, and likewise for each pixel point corresponding to the B channel, to obtain the fourth RAW image after lens shading correction.
Alternatively, considering that the first compensation table and the second compensation table have the same size as the third RAW image, and that the third RAW image has already been pre-compensated by the first compensation table, in another implementation manner of the embodiment of the present application, the color value of each pixel point corresponding to the R channel in the third RAW image, and the color value of each pixel point corresponding to the B channel in the third RAW image, may each be multiplied by the compensation value at the corresponding position in the second compensation table to obtain an intermediate RAW image after lens shading correction; the intermediate RAW image is then up-sampled to obtain a RAW image having the same size as the first RAW image, that is, the fourth RAW image of the first RAW image after lens shading correction.
As can be seen from the above, in this embodiment the RAW image may be pre-compensated using the compensation information, the color cast type of the RAW image is obtained from the color values of the pixel points in the pre-compensated RAW image, a dynamic adjustment coefficient for the compensation information is obtained from the color cast type, and the lens shading of the RAW image is corrected according to the adjustment coefficient and the compensation information, so that a RAW image of higher imaging quality is obtained. Compared with the prior art, the embodiment of the application only needs to provide one group of pre-calibrated compensation information and dynamically adjusts it by analyzing and estimating the lens shading of the current scene, so that the compensation purpose is accurately achieved and scenes with higher imaging-quality requirements can be satisfied.
In the embodiment of the application, the dynamically adjusted Lens Shading Correction algorithm can not only solve the problem that the peripheral brightness of an image is obviously lower than the central brightness during imaging, but also solve the central and peripheral color cast that is otherwise difficult to eliminate. The algorithm only needs one group of pre-calibrated compensation tables and dynamically adjusts them by analyzing and estimating the Lens Shading of the current scene, thereby accurately achieving the compensation purpose. The algorithm can effectively handle the Lens Shading problem of various scenes, converges quickly, and has a good compensation effect; it can be integrated into embedded devices, improving the imaging quality of their cameras and the photographing experience of consumers.
It should be noted that, in the lens shading correction method provided in the embodiment of the present application, the execution subject may be a lens shading correction apparatus, or a control module in the lens shading correction apparatus for executing the lens shading correction method. In the embodiment of the present application, a lens shading correction method executed by a lens shading correction apparatus is taken as an example to describe the lens shading correction apparatus provided in the embodiment of the present application.
Fig. 4 is a block diagram of a lens shading correction apparatus according to an embodiment of the present application, and as shown in fig. 4, the lens shading correction apparatus 400 may include: a sampling module 401, a pre-compensation module 402, a first acquisition module 403, a second acquisition module 404, an adjustment module 405, and a correction module 406, wherein,
the sampling module 401 is configured to acquire a first RAW image, and perform downsampling on the first RAW image to obtain a second RAW image, where the first RAW image is an original RAW image to be corrected;
a pre-compensation module 402, configured to pre-compensate the second RAW image based on preset first compensation information to obtain a third RAW image;
a first obtaining module 403, configured to obtain a color cast type of the third RAW image according to a color value of a pixel point in the third RAW image;
a second obtaining module 404, configured to obtain an adjustment coefficient corresponding to the color cast type of the third RAW image according to a mapping relationship between a preset color cast type and a compensation information adjustment coefficient;
an adjusting module 405, configured to adjust the first compensation information according to the adjustment coefficient to obtain second compensation information;
a correcting module 406, configured to compensate the first RAW image based on the first compensation information and the second compensation information, so as to obtain a fourth RAW image after lens shading correction.
It can be seen from the above embodiment that, in this embodiment, the RAW image may be pre-compensated by the compensation information, the color cast type of the RAW image is obtained according to the color value of the pixel point in the pre-compensated RAW image, the dynamic adjustment coefficient of the compensation information is obtained according to the color cast type, and the lens shading of the RAW image is corrected according to the adjustment coefficient and the compensation information, so as to obtain the RAW image with higher imaging quality. Compared with the prior art, in the embodiment of the application, only a group of pre-calibrated compensation information needs to be provided, and the compensation information is dynamically adjusted by analyzing and estimating the lens shading of the current scene, so that the compensation purpose is accurately achieved, and the scene with higher requirements on imaging quality can be met.
Optionally, as an embodiment, the correcting module 406 may include:
the first correction submodule is used for compensating the first RAW image based on the first compensation information to obtain an intermediate RAW image;
and the second correction submodule is used for compensating the middle RAW image based on the second compensation information to obtain a fourth RAW image after lens shading correction.
Optionally, as an embodiment, the third RAW image is an image arranged in an RGrGbB (Bayer) arrangement;
the first obtaining module 403 may include:
the reading submodule is used for reading the color value of the pixel point corresponding to the R channel, the color value of the pixel point corresponding to the GR channel, the color value of the pixel point corresponding to the GB channel and the color value of the pixel point corresponding to the B channel in the third RAW image;
the first calculation submodule is used for calculating the hue component of the R channel of the third RAW image according to the color value of the pixel point corresponding to the R channel and the color value of the pixel point corresponding to the GR channel, wherein the hue component of a pixel point of the R channel is equal to the color value of that pixel point in the R channel divided by the color value of the pixel point at the corresponding position in the GR channel;
the second calculation submodule is used for calculating the hue component of the B channel of the third RAW image according to the color value of the pixel point corresponding to the B channel and the color value of the pixel point corresponding to the GB channel, wherein the hue component of a pixel point of the B channel is equal to the color value of that pixel point in the B channel divided by the color value of the pixel point at the corresponding position in the GB channel;
the filtering submodule is used for carrying out low-pass filtering processing on the hue component of the R channel and the hue component of the B channel of the third RAW image to obtain a low-frequency part of the hue component of the R channel and a low-frequency part of the hue component of the B channel;
a first obtaining sub-module, configured to obtain a color cast type of the third RAW image according to the low-frequency part of the hue component of the R channel and the low-frequency part of the hue component of the B channel.
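The hue-component definitions above (each R pixel divided by the co-located Gr pixel, each B pixel divided by the co-located Gb pixel) can be sketched as follows, assuming a Bayer cell with R at (0,0), Gr at (0,1), Gb at (1,0) and B at (1,1); the exact cell layout is an assumption, not stated in this passage.

```python
import numpy as np

def hue_components(raw):
    """Sketch of the per-channel hue-component computation for an
    assumed RGrGbB Bayer mosaic."""
    r  = raw[0::2, 0::2].astype(float)
    gr = raw[0::2, 1::2].astype(float)
    gb = raw[1::2, 0::2].astype(float)
    b  = raw[1::2, 1::2].astype(float)
    eps = 1e-6                          # guard against division by zero
    hue_r = r / np.maximum(gr, eps)     # R hue component: R / co-located Gr
    hue_b = b / np.maximum(gb, eps)     # B hue component: B / co-located Gb
    return hue_r, hue_b
```

For a lens-shading-free flat scene both components are spatially constant; color cast shows up as a systematic drift of these ratios from center to periphery.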
Optionally, as an embodiment, the filtering sub-module may include:
a first generation unit, configured to generate a gradient map Gr of the hue component of the R channel based on the hue component of the R channel;
a second generation unit, configured to generate a gradient map Gb of the hue component of the B channel based on the hue component of the B channel;
a filtering processing unit, configured to filter the Gr and Gb according to a preset low-pass filtering formula to obtain filtered Gr(i, j) and Gb(i, j) data distributions, where the Gr(i, j) data distribution is used to characterize the low-frequency part of the hue component of the R channel, and the Gb(i, j) data distribution is used to characterize the low-frequency part of the hue component of the B channel;
the preset low-pass filtering formula is as follows:
[low-pass filtering formula, given in the original as equation image BDA0003032022770000141 and not reproduced here]
where i and j represent the horizontal and vertical coordinates of the corresponding point, and θr and θb are preset thresholds.
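Since the filtering formula itself survives only as an equation image, the following is no more than a plausible stand-in consistent with the surrounding description: build a gradient-magnitude map of the hue component and suppress values above the threshold, keeping the slowly varying (low-frequency) part of the tonal variation. The exact formula in the patent may differ.

```python
import numpy as np

def filtered_gradient(hue, theta):
    """Hedged sketch of the gradient-map low-pass filter: gradient
    magnitudes above the preset threshold theta are zeroed out, so only
    the low-frequency tonal variation remains."""
    gy, gx = np.gradient(hue)           # per-axis finite differences
    g = np.hypot(gx, gy)                # gradient magnitude (the Gr or Gb map)
    return np.where(g <= theta, g, 0.0)
```

On a linear ramp the gradient magnitude is 1 everywhere, so the result is either kept intact or fully suppressed depending on theta.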
Optionally, as an embodiment, the first obtaining sub-module may include:
a first acquisition unit, configured to acquire the color cast type Cr of the third RAW image in the R channel according to the change of the Gr(i, j) data distribution from the periphery to the center;
a second acquisition unit, configured to acquire the color cast type Cb of the third RAW image in the B channel according to the change of the Gb(i, j) data distribution from the periphery to the center.
Optionally, as an embodiment, the second obtaining module 404 may include:
the second obtaining submodule is used for obtaining a coefficient Sr corresponding to the Cr and a coefficient Sb corresponding to the Cb according to a mapping relation between a preset color cast type and a compensation information adjusting coefficient;
the adjusting module 405 may include:
a first adjusting submodule, configured to adjust a compensation value of an R channel in the first compensation information according to the Sr;
and the second adjusting submodule is used for adjusting the compensation value of the B channel in the first compensation information according to the Sb to obtain second compensation information.
Optionally, as an embodiment, the first compensation information is a first compensation table; the lens shading correction apparatus 400 may further include: the generating module is used for generating a first compensation table;
the generation module may include:
the shooting submodule is used for shooting, under a specific light source, a flat image in which the variation in color and brightness is smaller than a preset threshold value;
a first sampling sub-module for down-sampling the flat image;
the third obtaining submodule is used for obtaining the color value of a pixel point corresponding to an R channel, the color value of a pixel point corresponding to a GR channel, the color value of a pixel point corresponding to a GB channel and the color value of a pixel point corresponding to a B channel in the flat image obtained by down sampling;
and the fourth obtaining submodule is used for obtaining, for each color channel P, the maximum value Max among the color values of the pixel points corresponding to the channel P, and obtaining the color value/Max of each pixel point corresponding to the channel P, so as to obtain the compensation table of the channel P.
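The per-channel table generation can be sketched as follows. One hedge is needed: the translated text reads "color value/Max", but a multiplicative compensation that brightens the darkened periphery is conventionally Max divided by each pixel's value (gain 1.0 at the brightest, usually central, point and larger toward the edges), so this sketch assumes that reading of the original.

```python
import numpy as np

def channel_compensation_table(channel):
    """Sketch: compensation table for one color channel of a down-sampled
    flat-field image, assuming gain = Max / pixel value (see lead-in)."""
    channel = channel.astype(float)
    return channel.max() / np.maximum(channel, 1e-6)
```

Multiplying the flat-field channel by its own table then yields the channel maximum everywhere, i.e. a perfectly flat compensated image.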
Optionally, as an embodiment, the adjusting module 405 may include:
the third adjusting submodule is used for adjusting the first compensation table to obtain a second compensation table according to the adjusting coefficient;
the correction module 406 may include:
the second sampling submodule is used for up-sampling the first compensation table and the second compensation table to obtain a third compensation table and a fourth compensation table having the same size as the first RAW image;
the third correction submodule is used for multiplying the color value of each pixel point in the first RAW image by the compensation value at the corresponding position in the third compensation table to obtain an intermediate RAW image; and multiplying the color value of each pixel point corresponding to the R channel in the intermediate RAW image, and the color value of each pixel point corresponding to the B channel in the intermediate RAW image, by the compensation value at the corresponding position in the fourth compensation table, to obtain a fourth RAW image after lens shading correction.
The lens shading correction device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The lens shading correction apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The lens shading correction device provided by the embodiment of the application can realize each process realized by the method embodiment, and is not repeated here to avoid repetition.
Optionally, as shown in fig. 5, an electronic device 500 is further provided in this embodiment of the present application, and includes a processor 501, a memory 502, and a program or an instruction stored in the memory 502 and executable on the processor 501, where the program or the instruction is executed by the processor 501 to implement each process of the lens shading correction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application. The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and the like.
Those skilled in the art will appreciate that the electronic device 600 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, or combine some components, or arrange different components, which is not described in detail here.
The processor 610 is configured to acquire a first RAW image, and perform downsampling on the first RAW image to obtain a second RAW image, where the first RAW image is an original RAW image to be corrected; pre-compensating the second RAW image based on preset first compensation information to obtain a third RAW image; acquiring a color cast type of the third RAW image according to color values of pixel points in the third RAW image; acquiring an adjustment coefficient corresponding to the color cast type of the third RAW image according to a preset mapping relation between the color cast type and the compensation information adjustment coefficient; adjusting the first compensation information according to the adjustment coefficient to obtain second compensation information; and compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction.
Therefore, in the embodiment of the application, the RAW image can be pre-compensated through the compensation information, the color cast type of the RAW image is obtained according to the color value of the pixel point in the pre-compensated RAW image, the dynamic adjustment coefficient of the compensation information is obtained according to the color cast type, and the lens shading of the RAW image is corrected according to the adjustment coefficient and the compensation information, so that the RAW image with higher imaging quality is obtained. Compared with the prior art, in the embodiment of the application, only a group of pre-calibrated compensation information needs to be provided, and the compensation information is dynamically adjusted by analyzing and estimating the lens shading of the current scene, so that the compensation purpose is accurately achieved, and the scene with higher requirements on imaging quality can be met.
Optionally, as an embodiment, the processor 610 is further configured to compensate the first RAW image based on the first compensation information, so as to obtain an intermediate RAW image;
and compensating the intermediate RAW image based on the second compensation information to obtain a fourth RAW image after lens shading correction.
Optionally, as an embodiment, the third RAW image is an image arranged in an RGrGbB (Bayer) arrangement;
the processor 610 is further configured to read a color value of a pixel corresponding to an R channel, a color value of a pixel corresponding to a GR channel, a color value of a pixel corresponding to a GB channel, and a color value of a pixel corresponding to a B channel in the third RAW image;
calculating a hue component of the R channel of the third RAW image according to the color value of the pixel point corresponding to the R channel and the color value of the pixel point corresponding to the GR channel, wherein the hue component of a pixel point of the R channel is equal to the color value of that pixel point in the R channel divided by the color value of the pixel point at the corresponding position in the GR channel;
calculating a hue component of the B channel of the third RAW image according to the color value of the pixel point corresponding to the B channel and the color value of the pixel point corresponding to the GB channel, wherein the hue component of a pixel point of the B channel is equal to the color value of that pixel point in the B channel divided by the color value of the pixel point at the corresponding position in the GB channel;
performing low-pass filtering processing on the hue component of the R channel and the hue component of the B channel of the third RAW image to obtain a low-frequency part of the hue component of the R channel and a low-frequency part of the hue component of the B channel;
and acquiring the color cast type of the third RAW image according to the low-frequency part of the hue component of the R channel and the low-frequency part of the hue component of the B channel.
Optionally, as an embodiment, the processor 610 is further configured to generate a gradient map Gr of the hue component of the R channel based on the hue component of the R channel, and generate a gradient map Gb of the hue component of the B channel based on the hue component of the B channel;
filter the Gr and Gb according to a preset low-pass filtering formula to obtain filtered Gr(i, j) and Gb(i, j) data distributions, where the Gr(i, j) data distribution is used to characterize the low-frequency part of the hue component of the R channel, and the Gb(i, j) data distribution is used to characterize the low-frequency part of the hue component of the B channel;
the preset low-pass filtering formula is as follows:
[low-pass filtering formula, given in the original as equation image BDA0003032022770000191 and not reproduced here]
where i and j represent the horizontal and vertical coordinates of the corresponding point, and θr and θb are preset thresholds.
Optionally, as an embodiment, the processor 610 is further configured to acquire, according to the change of the Gr(i, j) data distribution from the periphery to the center, the color cast type Cr of the third RAW image in the R channel;
and acquire, according to the change of the Gb(i, j) data distribution from the periphery to the center, the color cast type Cb of the third RAW image in the B channel.
Optionally, as an embodiment, the processor 610 is further configured to obtain a coefficient Sr corresponding to the Cr and a coefficient Sb corresponding to the Cb according to a mapping relationship between a preset color cast type and a compensation information adjustment coefficient;
and adjusting the compensation value of the R channel in the first compensation information according to the Sr, and adjusting the compensation value of the B channel in the first compensation information according to the Sb to obtain second compensation information.
Optionally, as an embodiment, the relationship among the adjustment coefficient S, the first compensation information T, and the second compensation information T' is: T' = S*(T-1) + T.
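The relationship above can be illustrated per compensation value: with S = 0 the table is unchanged, S > 0 pushes each gain further from 1.0, and S < 0 pulls it back toward 1.0.

```python
def adjust_gain(t, s):
    """T' = S*(T-1) + T, applied element-wise to one compensation value t
    with adjustment coefficient s."""
    return s * (t - 1.0) + t
```

A gain of exactly 1.0 is a fixed point of the formula, so pixels that need no compensation stay untouched regardless of the adjustment coefficient.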
Optionally, as an embodiment, the first compensation information is a first compensation table;
a processor 610, further configured to generate a first compensation table;
the generating a first compensation table includes:
shooting a flat image with the difference between the color and the brightness smaller than a preset threshold value under a specific light source;
down-sampling the flat image;
acquiring a color value of a pixel point corresponding to an R channel, a color value of a pixel point corresponding to a GR channel, a color value of a pixel point corresponding to a GB channel and a color value of a pixel point corresponding to a B channel in a flat image obtained by down-sampling;
and aiming at each color channel P, obtaining the maximum value Max in the color values of the pixels corresponding to the channel P, and obtaining the color value/Max of each pixel corresponding to the channel P to obtain a compensation table of the channel P.
Optionally, as an embodiment, the processor 610 is further configured to adjust the first compensation table to obtain a second compensation table according to the adjustment coefficient;
the first compensation table and the second compensation table are up-sampled to obtain a third compensation table and a fourth compensation table which have the same size as the first RAW image;
multiplying the color value of each pixel point in the first RAW image by the compensation value at the corresponding position in the third compensation table to obtain an intermediate RAW image; and multiplying the color value of each pixel point corresponding to the R channel in the intermediate RAW image, and the color value of each pixel point corresponding to the B channel in the intermediate RAW image, by the compensation value at the corresponding position in the fourth compensation table, to obtain a fourth RAW image after lens shading correction.
It is to be understood that, in the embodiment of the present application, the input Unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042, and the Graphics Processing Unit 6041 processes image data of a still picture or a video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 607 includes a touch panel 6071 and other input devices 6072. A touch panel 6071, also referred to as a touch screen. The touch panel 6071 may include two parts of a touch detection device and a touch controller. Other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 609 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 610 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the lens shading correction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above-mentioned lens shading correction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (18)

1. A lens shading correction method, the method comprising:
acquiring a first RAW image, and performing downsampling on the first RAW image to obtain a second RAW image, wherein the first RAW image is an original RAW image to be corrected;
pre-compensating the second RAW image based on preset first compensation information to obtain a third RAW image;
acquiring a color cast type of the third RAW image according to color values of pixel points in the third RAW image;
acquiring an adjustment coefficient corresponding to the color cast type of the third RAW image according to a preset mapping relation between the color cast type and the compensation information adjustment coefficient;
adjusting the first compensation information according to the adjustment coefficient to obtain second compensation information;
and compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction.
2. The method according to claim 1, wherein the compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction comprises:
compensating the first RAW image based on the first compensation information to obtain an intermediate RAW image;
and compensating the middle RAW image based on the second compensation information to obtain a fourth RAW image after lens shading correction.
3. The method according to claim 1 or 2, wherein the third RAW image is an image arranged in an RGrGbB (Bayer) arrangement;
the obtaining of the color cast type of the third RAW image according to the color value of the pixel point in the third RAW image includes:
reading a color value of a pixel point corresponding to an R channel, a color value of a pixel point corresponding to a GR channel, a color value of a pixel point corresponding to a GB channel and a color value of a pixel point corresponding to a B channel in the third RAW image;
calculating a hue component of the R channel of the third RAW image according to the color value of the pixel point corresponding to the R channel and the color value of the pixel point corresponding to the GR channel, wherein the hue component of a pixel point of the R channel is equal to the color value of that pixel point in the R channel divided by the color value of the pixel point at the corresponding position in the GR channel;
calculating a hue component of the B channel of the third RAW image according to the color value of the pixel point corresponding to the B channel and the color value of the pixel point corresponding to the GB channel, wherein the hue component of a pixel point of the B channel is equal to the color value of that pixel point in the B channel divided by the color value of the pixel point at the corresponding position in the GB channel;
performing low-pass filtering processing on the hue component of the R channel and the hue component of the B channel of the third RAW image to obtain a low-frequency part of the hue component of the R channel and a low-frequency part of the hue component of the B channel;
and acquiring the color cast type of the third RAW image according to the low-frequency part of the tone component of the R channel and the low-frequency part of the tone component of the B channel.
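The hue-component computation of claim 3 can be sketched as follows. This is a minimal illustration, assuming an RG(r)G(b)B mosaic with R at even rows and even columns; the array layout, the epsilon guard, and all function names are assumptions of this sketch, not part of the claim:

```python
import numpy as np

def hue_components(raw):
    """Split an RG(r)G(b)B Bayer mosaic into per-channel planes and return
    the R-channel and B-channel hue components (R/Gr and B/Gb), per claim 3."""
    r  = raw[0::2, 0::2].astype(np.float64)  # R  pixels
    gr = raw[0::2, 1::2].astype(np.float64)  # Gr pixels (same rows as R)
    gb = raw[1::2, 0::2].astype(np.float64)  # Gb pixels (same rows as B)
    b  = raw[1::2, 1::2].astype(np.float64)  # B  pixels
    eps = 1e-6                               # guard against division by zero
    hue_r = r / (gr + eps)                   # hue component of the R channel
    hue_b = b / (gb + eps)                   # hue component of the B channel
    return hue_r, hue_b

# A flat neutral mosaic yields hue components close to 1 everywhere.
raw = np.full((4, 4), 100, dtype=np.uint16)
hr, hb = hue_components(raw)
print(np.allclose(hr, 1.0, atol=1e-4), np.allclose(hb, 1.0, atol=1e-4))
```

For a neutral flat field both hue planes come out near 1, which is the no-color-cast baseline that the later claims adjust against.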
4. The method according to claim 3, wherein the low-pass filtering of the hue component of the R channel and the hue component of the B channel of the third RAW image to obtain a low-frequency part of the hue component of the R channel and a low-frequency part of the hue component of the B channel includes:
generating a gradient map Gr of the hue component of the R channel based on the hue component of the R channel; and generating a gradient map Gb of the hue component of the B channel based on the hue component of the B channel;
filtering Gr and Gb with a preset low-pass filtering formula to obtain a filtered Gr(i, j) data distribution and a filtered Gb(i, j) data distribution, wherein the Gr(i, j) data distribution characterizes the low-frequency part of the hue component of the R channel, and the Gb(i, j) data distribution characterizes the low-frequency part of the hue component of the B channel;
the preset low-pass filtering formula is given as an equation image (FDA0003032022760000021, not reproduced here), in which i and j represent the horizontal and vertical coordinates of the corresponding points, and θr and θb are preset thresholds.
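The low-pass filtering formula itself is only available as an equation image in the source, so the sketch below is one plausible reading under stated assumptions: gradient magnitudes above the threshold θ are treated as scene detail and suppressed, while below-threshold (slowly varying) hue is kept as the low-frequency shading part. The function names and the suppression-by-mean choice are assumptions of this illustration, not the claimed formula:

```python
import numpy as np

def gradient_map(hue):
    """Gradient magnitude of a hue-component plane (the Gr / Gb of claim 4)."""
    gy, gx = np.gradient(hue.astype(np.float64))
    return np.hypot(gx, gy)

def lowpass_by_threshold(hue, theta):
    """One plausible thresholded low-pass step: keep a pixel's hue where its
    gradient magnitude is below theta (smooth shading), and replace
    strong-gradient pixels (likely scene edges) by the global mean."""
    g = gradient_map(hue)
    return np.where(g < theta, hue, hue.mean())

y, x = np.mgrid[0:16, 0:16]
hue = 1.0 + 0.01 * (x + y)        # slowly varying, shading-like ramp
hue_edge = hue.copy()
hue_edge[8, 8] += 5.0             # one sharp scene edge
filtered = lowpass_by_threshold(hue_edge, theta=0.5)
print((filtered != hue_edge).any())  # some edge-adjacent pixels were replaced
```

The smooth ramp survives untouched while pixels around the sharp discontinuity are suppressed, which matches the intent of isolating the low-frequency shading component.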
5. The method according to claim 4, wherein the obtaining of the color cast type of the third RAW image from the low-frequency part of the hue component of the R channel and the low-frequency part of the hue component of the B channel includes:
acquiring the color cast type Cr of the third RAW image in the R channel according to how the Gr(i, j) data distribution changes from the periphery to the center;
and acquiring the color cast type Cb of the third RAW image in the B channel according to how the Gb(i, j) data distribution changes from the periphery to the center.
6. The method according to claim 5, wherein the obtaining an adjustment coefficient corresponding to the color cast type of the third RAW image according to a preset mapping relationship between the color cast type and a compensation information adjustment coefficient includes:
obtaining a coefficient Sr corresponding to the Cr and a coefficient Sb corresponding to the Cb according to a mapping relation between a preset color cast type and a compensation information adjustment coefficient;
the adjusting the first compensation information according to the adjustment coefficient to obtain second compensation information includes:
and adjusting the compensation value of the R channel in the first compensation information according to the Sr, and adjusting the compensation value of the B channel in the first compensation information according to the Sb to obtain second compensation information.
7. The method of claim 1, wherein the adjustment coefficient S, the first compensation information T, and the second compensation information T' satisfy: T' = S*(T-1) + T.
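Claim 7's relation T' = S*(T-1) + T scales only the excess of the compensation gain over unity, so a unity gain is left untouched; a quick numeric check with illustrative values:

```python
def adjust_compensation(T, S):
    """Claim 7: second compensation T' = S*(T - 1) + T.
    S scales only the excess over unity gain, so T == 1 stays fixed."""
    return S * (T - 1) + T

print(adjust_compensation(1.0, 0.5))   # unity gain is unchanged
print(adjust_compensation(1.2, 0.5))   # 0.5*(1.2 - 1) + 1.2, i.e. about 1.3
```

A positive S strengthens the stored correction and a negative S weakens it, always leaving gain 1 fixed, which is how the color-cast-dependent coefficient steers the compensation.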
8. The method of claim 3, wherein the first compensation information is a first compensation table;
before the step of performing pre-compensation on the second RAW image based on the preset first compensation information to obtain a third RAW image, the method further includes: generating a first compensation table;
the generating a first compensation table includes:
shooting, under a specific light source, a flat-field image whose variations in color and brightness are smaller than a preset threshold;
down-sampling the flat image;
acquiring a color value of a pixel point corresponding to an R channel, a color value of a pixel point corresponding to a GR channel, a color value of a pixel point corresponding to a GB channel and a color value of a pixel point corresponding to a B channel in a flat image obtained by down-sampling;
and for each color channel P, obtaining the maximum value Max among the color values of the pixel points corresponding to the channel P, and dividing Max by the color value of each pixel point corresponding to the channel P to obtain the compensation table of the channel P.
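The table-generation step can be sketched for a single channel plane. A minimal sketch, assuming the stored compensation value is Max divided by the pixel's color value (so gains are at least 1 and multiplying the flat field by the table flattens it); the toy 2x2 plane and function name are illustrative:

```python
import numpy as np

def channel_compensation_table(plane):
    """Claim 8 table for one color channel P: find the brightest value Max
    in the (downsampled) flat-field plane and store a per-position gain
    Max / value that lifts each pixel back up to Max."""
    plane = plane.astype(np.float64)
    max_val = plane.max()
    return max_val / np.maximum(plane, 1e-6)  # guard against zeros

# Illustrative flat-field plane: bright corner, darker corners (shading).
plane = np.array([[50., 80.],
                  [80., 100.]])
table = channel_compensation_table(plane)
print(table)                          # darkest position gets the largest gain
print(np.allclose(plane * table, 100.0))
```

Multiplying the flat-field plane by its own table recovers a uniform image, which is exactly the property the correction relies on.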
9. The method of claim 8, wherein adjusting the first compensation information according to the adjustment factor to obtain second compensation information comprises:
adjusting the first compensation table to obtain a second compensation table according to the adjustment coefficient;
the compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction, including:
up-sampling the first compensation table and the second compensation table to obtain a third compensation table and a fourth compensation table having the same size as the first RAW image;
multiplying the color value of each pixel point in the first RAW image by the compensation value at the corresponding position in the third compensation table to obtain an intermediate RAW image; and multiplying the color value of each pixel point corresponding to the R channel in the intermediate RAW image, and the color value of each pixel point corresponding to the B channel in the intermediate RAW image, by the compensation value at the corresponding position in the fourth compensation table, to obtain a fourth RAW image after lens shading correction.
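The two-stage application in claim 9 can be sketched end to end. Nearest-neighbour upsampling, the R/B mask construction, and the 1x1 toy tables are assumptions of this sketch; the claim only requires that the upsampled tables match the first RAW image's size:

```python
import numpy as np

def upsample_nearest(table, shape):
    """Nearest-neighbour upsampling of a small compensation table to the
    full RAW resolution (the interpolation scheme is an illustrative choice)."""
    h, w = shape
    rows = np.arange(h) * table.shape[0] // h
    cols = np.arange(w) * table.shape[1] // w
    return table[np.ix_(rows, cols)]

def apply_tables(raw, table1, table2, rb_mask):
    """Claim 9: multiply every pixel by the upsampled first table, then
    multiply only the R and B pixel sites by the upsampled second table."""
    t3 = upsample_nearest(table1, raw.shape)   # third compensation table
    t4 = upsample_nearest(table2, raw.shape)   # fourth compensation table
    intermediate = raw * t3                    # intermediate RAW image
    return np.where(rb_mask, intermediate * t4, intermediate)

raw = np.full((4, 4), 10.0)
table1 = np.array([[2.0]])            # 1x1 table: uniform gain 2 for all pixels
table2 = np.array([[1.5]])            # extra gain for R/B sites only
rb_mask = np.zeros((4, 4), dtype=bool)
rb_mask[0::2, 0::2] = True            # R sites of an RG(r)G(b)B mosaic
rb_mask[1::2, 1::2] = True            # B sites
out = apply_tables(raw, table1, table2, rb_mask)
print(out[0, 0], out[0, 1])           # R site 10*2*1.5, green site 10*2
```

Green sites receive only the first-stage gain while R/B sites receive both, mirroring the claim's split between brightness compensation and color-cast compensation.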
10. A lens shading correction apparatus, characterized in that the apparatus comprises:
the sampling module is used for acquiring a first RAW image and performing downsampling on the first RAW image to obtain a second RAW image, wherein the first RAW image is an original RAW image to be corrected;
the pre-compensation module is used for pre-compensating the second RAW image based on preset first compensation information to obtain a third RAW image;
the first obtaining module is used for obtaining the color cast type of the third RAW image according to the color value of the pixel point in the third RAW image;
the second obtaining module is used for obtaining an adjusting coefficient corresponding to the color cast type of the third RAW image according to a mapping relation between a preset color cast type and a compensation information adjusting coefficient;
the adjusting module is used for adjusting the first compensation information according to the adjusting coefficient to obtain second compensation information;
and the correction module is used for compensating the first RAW image based on the first compensation information and the second compensation information to obtain a fourth RAW image after lens shading correction.
11. The apparatus of claim 10, wherein the correction module comprises:
the first correction submodule is used for compensating the first RAW image based on the first compensation information to obtain an intermediate RAW image;
and the second correction submodule is used for compensating the intermediate RAW image based on the second compensation information to obtain a fourth RAW image after lens shading correction.
12. The apparatus according to claim 10 or 11, wherein the third RAW image is an image in an RG(r)G(b)B arrangement;
the first obtaining module comprises:
the reading submodule is used for reading the color value of the pixel point corresponding to the R channel, the color value of the pixel point corresponding to the GR channel, the color value of the pixel point corresponding to the GB channel and the color value of the pixel point corresponding to the B channel in the third RAW image;
the first calculation submodule is used for calculating the hue component of the R channel of the third RAW image according to the color value of the pixel point corresponding to the R channel and the color value of the pixel point corresponding to the GR channel, wherein the hue component of a pixel point of the R channel equals the color value of that pixel point in the R channel divided by the color value of the pixel point at the corresponding position in the GR channel;
the second calculation submodule is used for calculating the hue component of the B channel of the third RAW image according to the color value of the pixel point corresponding to the B channel and the color value of the pixel point corresponding to the GB channel, wherein the hue component of a pixel point of the B channel equals the color value of that pixel point in the B channel divided by the color value of the pixel point at the corresponding position in the GB channel;
the filtering submodule is used for performing low-pass filtering processing on the hue component of the R channel and the hue component of the B channel of the third RAW image to obtain a low-frequency part of the hue component of the R channel and a low-frequency part of the hue component of the B channel;
a first obtaining sub-module, configured to obtain a color cast type of the third RAW image according to a low-frequency portion of the hue component of the R channel and a low-frequency portion of the hue component of the B channel.
13. The apparatus of claim 12, wherein the filtering sub-module comprises:
a first generation unit configured to generate a gradient map Gr of the hue component of the R channel based on the hue component of the R channel;
a second generation unit configured to generate a gradient map Gb of the hue component of the B channel based on the hue component of the B channel;
a filtering processing unit configured to filter Gr and Gb with a preset low-pass filtering formula to obtain a filtered Gr(i, j) data distribution and a filtered Gb(i, j) data distribution, wherein the Gr(i, j) data distribution characterizes the low-frequency part of the hue component of the R channel, and the Gb(i, j) data distribution characterizes the low-frequency part of the hue component of the B channel;
the preset low-pass filtering formula is given as an equation image (FDA0003032022760000061, not reproduced here), in which i and j represent the horizontal and vertical coordinates of the corresponding points, and θr and θb are preset thresholds.
14. The apparatus of claim 13, wherein the first acquisition submodule comprises:
a first acquisition unit configured to acquire the color cast type Cr of the third RAW image in the R channel according to how the Gr(i, j) data distribution changes from the periphery to the center;
a second acquisition unit configured to acquire the color cast type Cb of the third RAW image in the B channel according to how the Gb(i, j) data distribution changes from the periphery to the center.
15. The apparatus of claim 14, wherein the second obtaining module comprises:
the second obtaining submodule is used for obtaining a coefficient Sr corresponding to the Cr and a coefficient Sb corresponding to the Cb according to a mapping relation between a preset color cast type and a compensation information adjusting coefficient;
the adjustment module includes:
a first adjusting submodule, configured to adjust a compensation value of an R channel in the first compensation information according to the Sr;
and the second adjusting submodule is used for adjusting the compensation value of the B channel in the first compensation information according to the Sb to obtain second compensation information.
16. The apparatus of claim 12, wherein the first compensation information is a first compensation table; the device further comprises: the generating module is used for generating a first compensation table;
the generation module comprises:
the shooting submodule is used for shooting, under a specific light source, a flat-field image whose variations in color and brightness are smaller than a preset threshold;
a first sampling sub-module for down-sampling the flat image;
the third obtaining submodule is used for obtaining the color value of a pixel point corresponding to an R channel, the color value of a pixel point corresponding to a GR channel, the color value of a pixel point corresponding to a GB channel and the color value of a pixel point corresponding to a B channel in the flat image obtained by down sampling;
and the fourth obtaining submodule is used for obtaining, for each color channel P, the maximum value Max among the color values of the pixel points corresponding to the channel P, and dividing Max by the color value of each pixel point corresponding to the channel P to obtain the compensation table of the channel P.
17. The apparatus of claim 16, wherein the adjustment module comprises:
the third adjusting submodule is used for adjusting the first compensation table to obtain a second compensation table according to the adjusting coefficient;
the correction module includes:
the second sampling submodule is used for up-sampling the first compensation table and the second compensation table to obtain a third compensation table and a fourth compensation table having the same size as the first RAW image;
the third correction submodule is used for multiplying the color value of each pixel point in the first RAW image by the compensation value at the corresponding position in the third compensation table to obtain an intermediate RAW image; and multiplying the color value of each pixel point corresponding to the R channel in the intermediate RAW image, and the color value of each pixel point corresponding to the B channel in the intermediate RAW image, by the compensation value at the corresponding position in the fourth compensation table, to obtain a fourth RAW image after lens shading correction.
18. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the lens shading correction method according to any one of claims 1 to 9.
CN202110432799.1A 2021-04-21 2021-04-21 Lens shading correction method and device and electronic equipment Active CN113132562B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110432799.1A CN113132562B (en) 2021-04-21 2021-04-21 Lens shading correction method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN113132562A true CN113132562A (en) 2021-07-16
CN113132562B CN113132562B (en) 2023-09-29

Family

ID=76778869

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110432799.1A Active CN113132562B (en) 2021-04-21 2021-04-21 Lens shading correction method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113132562B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090128670A1 (en) * 2006-05-24 2009-05-21 Yo-Hwan Noh Apparatus and method for compensating color, and image processor, digital processing apparatus, recording medium using it
JP2010177917A (en) * 2009-01-28 2010-08-12 Acutelogic Corp White balance adjusting device, white balance adjusting method, and white balance adjusting program
US20100232705A1 (en) * 2009-03-12 2010-09-16 Ricoh Company, Ltd. Device and method for detecting shadow in image
CN105100550A (en) * 2014-04-21 2015-11-25 展讯通信(上海)有限公司 Shadow correction method and device and imaging system
CN108234824A (en) * 2018-03-26 2018-06-29 上海小蚁科技有限公司 Shadow correction detection parameters determine, correct detection method and device, storage medium, fisheye camera
CN109068025A (en) * 2018-08-27 2018-12-21 建荣半导体(深圳)有限公司 A kind of camera lens shadow correction method, system and electronic equipment
CN110365949A (en) * 2018-03-26 2019-10-22 展讯通信(天津)有限公司 A kind of bearing calibration of image color cast, device and electronic equipment
CN111385438A (en) * 2018-12-28 2020-07-07 展讯通信(上海)有限公司 Compensating method and device for lens shading correction and computer readable storage medium
CN112150357A (en) * 2019-06-28 2020-12-29 维沃移动通信有限公司 Image processing method and mobile terminal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
S. NADIMI: "Moving shadow detection using a physics-based approach", 2002 International Conference on Pattern Recognition *
MA Huangte: "Shadow segmentation and elimination based on UAV aerial images", China Master's Theses Full-text Database *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115767290A (en) * 2022-09-28 2023-03-07 荣耀终端有限公司 Image processing method and electronic device
CN115767290B (en) * 2022-09-28 2023-09-29 荣耀终端有限公司 Image processing method and electronic device

Also Published As

Publication number Publication date
CN113132562B (en) 2023-09-29

Similar Documents

Publication Publication Date Title
CN109272459B (en) Image processing method, image processing device, storage medium and electronic equipment
US11158033B2 (en) Method for image processing, electronic device, and non-transitory storage medium for improving contrast of image
US9767544B2 (en) Scene adaptive brightness/contrast enhancement
US8547450B2 (en) Methods and systems for automatic white balance
CN102640489B (en) System and method for detecting and correcting defective pixels in an image sensor
CN102640184B (en) Temporal filtering techniques for image signal processing
US8639050B2 (en) Dynamic adjustment of noise filter strengths for use with dynamic range enhancement of images
US20100278423A1 (en) Methods and systems for contrast enhancement
CN112330531B (en) Image processing method, image processing device, electronic equipment and storage medium
US20220159162A1 (en) Imaging compensation device, imaging compensation method, and application
US8717460B2 (en) Methods and systems for automatic white balance
WO2023010754A1 (en) Image processing method and apparatus, terminal device, and storage medium
JP2008263475A (en) Image processing device, method, and program
CN113132695B (en) Lens shading correction method and device and electronic equipment
TW201433168A (en) Image processing method
CN111161188B (en) Method for reducing image color noise, computer device and readable storage medium
US8648937B2 (en) Image processing apparatus, image processing method, and camera module
US8384796B2 (en) Methods and systems for automatic white balance
JP2003304549A (en) Camera and image signal processing system
WO2019104047A1 (en) Global tone mapping
CN109727216A (en) Image processing method, device, terminal device and storage medium
CN111226256A (en) System and method for image dynamic range adjustment
WO2022218245A1 (en) Image processing method and apparatus, electronic device, and readable storage medium
CN114998122A (en) Low-illumination image enhancement method
CN113132562B (en) Lens shading correction method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant