CN114339187A - Image processing method, image processing apparatus, and storage medium - Google Patents

Info

Publication number: CN114339187A (application CN202011062405.XA; granted as CN114339187B)
Authority: CN (China)
Prior art keywords: gain value, channel, gain, determining, image
Other languages: Chinese (zh)
Inventor: 谢俊麒
Original and current assignee: Beijing Xiaomi Mobile Software Co Ltd
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Legal status: Active; granted (the legal status is an assumption and is not a legal conclusion)

Landscapes

  • Image Processing (AREA)
  • Processing Of Color Television Signals (AREA)

Abstract

The present disclosure relates to an image processing method, an image processing apparatus, and a storage medium. The image processing method includes: acquiring an image to be processed, and determining an edge area image of the image to be processed; performing joint correction processing of white balance correction and color uniformity correction on the edge area image, and determining a jointly corrected gain value; and performing gain compensation on a green channel and/or a blue channel of the edge area image based on the jointly corrected gain value. With this method and apparatus, gain compensation no longer needs to be applied to the red channel of the edge area image, the gain values by which the color channels are multiplied during gain compensation are reduced, and noise in the edge area image is reduced.

Description

Image processing method, image processing apparatus, and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, and a storage medium.
Background
Image processing technology is applied ever more widely, for example in professional cameras, smartphones, and even picture-editing software. As people's demands on image quality grow, so do the demands on image processing technology; the control of noise is one of its important aspects.
In the related art, image processing generally requires color uniformity correction (color shading correction) and white balance correction. However, because the two corrections are applied in succession, their gains multiply: in the edge region of the image, for example the four corner regions, a pixel is effectively multiplied by two gain values, so noise is easily amplified.
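As an illustrative sketch, not taken from the patent, the noise amplification can be seen by applying two gains in succession to noisy edge-region samples; the specific gain values below are hypothetical:

```python
import random
import statistics

random.seed(0)

# Hypothetical edge-region red-channel samples: signal plus sensor noise.
signal, noise_sigma = 100.0, 5.0
pixels = [signal + random.gauss(0.0, noise_sigma) for _ in range(10_000)]

shading_gain = 1.8   # color-shading correction gain for the edge region (assumed)
awb_gain = 1.5       # white-balance gain (assumed)

# Sequential correction multiplies both gains into the pixel values,
# so the noise standard deviation is scaled by shading_gain * awb_gain.
corrected = [p * shading_gain * awb_gain for p in pixels]
ratio = statistics.stdev(corrected) / statistics.stdev(pixels)
print(round(ratio, 2))  # 2.7 = 1.8 * 1.5
```

The noise scales by the product of the two gains regardless of the noise level, which is why collapsing the two steps into a single smaller gain reduces the amplification.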
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, an image processing apparatus, and a storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method, including:
acquiring an image to be processed, and determining an edge area image of the image to be processed; performing joint correction processing of white balance correction and color uniformity correction on the edge area image, and determining a jointly corrected gain value; and performing gain compensation on a green channel and/or a blue channel of the edge area image based on the jointly corrected gain value.
In one embodiment, the determining the jointly corrected gain value includes:
determining a gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed; and determining a gain value after joint correction based on the gain statistic and the calibration data.
In one embodiment, the determining a jointly corrected gain value based on the gain statistic and the calibration data includes:
determining a green channel gain value and a blue channel gain value based on the gain statistic and the calibration data; determining a first gain value for performing gain compensation on a green channel in the edge area image based on the green channel gain value; determining a second gain value for performing gain compensation on a blue channel in the edge area image based on the blue channel gain value; determining the first gain value and/or the second gain value as a jointly corrected gain value.
In one embodiment, determining a green channel gain value and a blue channel gain value based on the gain statistics and the calibration data comprises:
respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value based on the gain statistic value and the calibration data; determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value; determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value.
In one embodiment, determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value includes: determining a ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value.
Determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value, including: determining a ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
In one embodiment, the determining the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistic and the calibration data includes:
determining a product between a red channel gain statistic in the gain statistics and red channel calibration data in the calibration data as a red channel initial gain value, determining a product between a green channel gain statistic in the gain statistics and green channel calibration data in the calibration data as a green channel initial gain value, and determining a product between a blue channel gain statistic in the gain statistics and blue channel calibration data in the calibration data as a blue channel initial gain value.
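The gain computation described in the embodiments above can be sketched as follows. This is a minimal illustration with assumed function and variable names, not the patent's implementation: initial gains are the products of white-balance statistics and shading calibration data, and the final green and blue gains are the ratios of those initial gains to the red initial gain:

```python
def joint_gains(awb_stats, shading_cal):
    """Compute jointly corrected gain values for the edge region.

    awb_stats and shading_cal are dicts with per-channel values for
    keys 'r', 'g', 'b' (a hypothetical data layout).
    """
    # Initial gain per channel: white-balance statistic x calibration data.
    init = {c: awb_stats[c] * shading_cal[c] for c in ('r', 'g', 'b')}
    # Final gains are ratios to the red initial gain, so the red
    # channel itself needs no compensation (its own ratio is 1).
    g_gain = init['g'] / init['r']   # first gain value (green channel)
    b_gain = init['b'] / init['r']   # second gain value (blue channel)
    return g_gain, b_gain

gains = joint_gains({'r': 1.0, 'g': 2.0, 'b': 1.5},
                    {'r': 2.0, 'g': 1.5, 'b': 2.0})
print(gains)  # (1.5, 1.5)
```

Dividing by the red initial gain is what removes the red-channel compensation step: every remaining gain is expressed relative to red, so only green and blue are multiplied.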
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring an image to be processed and determining an edge area image of the image to be processed; a joint correction unit, configured to perform joint correction processing of white balance correction and color uniformity correction on the edge area image, and determine a gain value after joint correction; and the compensation unit is used for carrying out gain compensation on the green channel and/or the blue channel of the edge area image based on the gain value after the joint correction.
In one embodiment, the joint correction unit determines the jointly corrected gain value as follows:
determining a gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed; and determining a gain value after joint correction based on the gain statistic and the calibration data.
In one embodiment, the joint correction unit determines the jointly corrected gain value based on the gain statistic and the calibration data as follows:
determining a green channel gain value and a blue channel gain value based on the gain statistic and the calibration data; determining a first gain value for performing gain compensation on a green channel in the edge area image based on the green channel gain value; determining a second gain value for performing gain compensation on a blue channel in the edge area image based on the blue channel gain value; determining the first gain value and/or the second gain value as a jointly corrected gain value.
In one embodiment, the joint correction unit determines the green channel gain value and the blue channel gain value based on the gain statistic and the calibration data as follows:
respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value based on the gain statistic value and the calibration data; determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value; determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value.
In one embodiment, the joint correction unit determines the green channel gain value based on the green channel initial gain value and the red channel initial gain value as follows: determining a ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value; the joint correction unit determines the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value as follows: determining a ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
In one embodiment, the joint correction unit determines the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistic and the calibration data as follows:
determining a product between a red channel gain statistic in the gain statistics and red channel calibration data in the calibration data as a red channel initial gain value, determining a product between a green channel gain statistic in the gain statistics and green channel calibration data in the calibration data as a green channel initial gain value, and determining a product between a blue channel gain statistic in the gain statistics and blue channel calibration data in the calibration data as a blue channel initial gain value.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus comprising:
a processor; a memory for storing processor-executable instructions;
wherein the processor is configured to perform the image processing method of the first aspect or any one of the embodiments of the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium, wherein instructions of the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image processing method of the first aspect or any one of the implementation manners of the first aspect.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: joint correction processing of white balance correction and color uniformity correction is performed on the edge area image of the image to be processed, and gain compensation is performed on the green channel and/or the blue channel of the edge area image based on the jointly corrected gain value. As a result, the red channel of the edge area image no longer needs gain compensation, the gain values by which the color channels are multiplied during gain compensation are reduced, and the occurrence of edge area image noise is reduced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic flow diagram illustrating image processing in a conventional technique according to an exemplary embodiment.
FIG. 2 is a flow diagram illustrating an image processing method according to an exemplary embodiment.
FIG. 3 is a schematic diagram illustrating a process for determining jointly corrected gain values according to an exemplary embodiment.
FIG. 4 is a flowchart illustrating a process for determining jointly corrected gain values based on white balance corrected gain statistics and color uniformity corrected calibration data, according to an exemplary embodiment.
FIG. 5 is a diagram illustrating an image processing method according to an exemplary embodiment.
Fig. 6 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating an apparatus for image processing according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the related art, an image captured by an image acquisition device such as a camera is generally a raw image, and the display colors of its central region and edge region are often inconsistent; for example, in a dim-light shooting scene where the ambient light brightness is below a preset threshold, noise in the edge region is more obvious than in the central region. During image processing, the image is therefore subjected to color uniformity correction and white balance correction in sequence. However, because the two corrections are applied one after another, their gains multiply: in the edge region, the image is effectively multiplied by two gain values, which easily amplifies noise. Consider, for example, an image whose central region is reddish while its edge region displays gray; the process of applying color uniformity correction followed by white balance correction is shown in Fig. 1. Fig. 1 is a schematic flow diagram illustrating image processing in a conventional technique according to an exemplary embodiment. Referring to Fig. 1: a raw (RAW) image is input. Color uniformity correction is applied first to correct the color difference between the central and edge regions: the red channel of the edge region is multiplied by a red channel gain value, yielding a reddish image whose edge region matches the central region in color. White balance correction is then applied, multiplying the whole image by a green channel gain value to obtain a neutral gray image. For the edge area image, the green channel has thus been multiplied by a green channel gain value and the red channel by a red channel gain value.
In the related art, the red channel gain value, the green channel gain value, and the blue channel gain value multiplied by the gain compensation are all gain values greater than 1.
Each gain value multiplied in during image processing introduces more noise. In the related art, color uniformity correction and white balance correction are performed in sequence: the edge area image is first corrected to the same red as the central area image, which is equivalent to multiplying the red channel by a gain value and which enlarges the gap between the red and green channels and between the red and blue channels. White balance correction then multiplies the whole image by a green channel gain value and a blue channel gain value, on top of the red channel gain value already applied, producing more noise. This is especially problematic for images shot in a dim-light scene where the ambient light brightness is below a preset threshold: the noise of the edge area image is already relatively obvious, and after the gain compensation process with its large channel gain values, the edge area image ends up with excessive noise.
In view of the above, embodiments of the present disclosure provide an image processing method that performs joint correction processing of color uniformity correction and white balance correction. In effect, white balance correction is performed directly, and the separate color uniformity step that corrects the edge area image and the central area image to a uniform red is omitted. The red channel is no longer multiplied by a gain value, the gain values used for gain compensation are correspondingly reduced, and the noise of the edge area image can therefore be relatively reduced.
Fig. 2 is a flowchart illustrating an image processing method according to an exemplary embodiment, and as shown in fig. 2, the image processing method is applicable to an image processing apparatus, for example, used in a terminal, and includes the following steps.
In step S11, an image to be processed is acquired, and an edge area image of the image to be processed is determined.
The image to be processed in the embodiments of the present disclosure may be an image that needs noise reduction processing, for example an image captured in a dim-light scene where the ambient light brightness is below a preset threshold, though it may equally be an image captured in a bright scene. It may also be an image that has already been captured and optimized. In one example, the image subjected to noise reduction processing in the embodiments of the present disclosure has relatively significant base noise in the edge region, such as an image captured in a dim-light scene where the ambient light brightness is below a preset threshold.
The edge area image of the image to be processed may be determined based on an image color difference parameter of the image to be processed, and the image color difference parameter may in turn be determined from module parameters of the image acquisition device that captured the image. In one example, when determining the edge region image based on the image color difference parameter, the embodiments of the present disclosure may divide the image to be processed into a plurality of blocks, for example 9 × 16 blocks. Among the divided blocks, those within a first region range at the center of the image to be processed may be determined as the central region, and the blocks outside the central region may be determined as the edge area image of the image to be processed. In one example, the region outside the range of 60% from the center of the image to be processed may be determined as the edge region image.
It is further understood that, in the embodiment of the present disclosure, the edge region of the image to be processed may be understood as a peripheral region of the image to be processed. In an example, when the image to be processed is rectangular, the image of the edge area of the image to be processed may be a four-corner image of the image to be processed.
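The block-based edge-region selection described above can be sketched as follows. The 9 × 16 grid and the 60% central extent are taken from the examples in the text; the function name, the normalized-coordinate criterion, and the return layout are assumptions for illustration:

```python
def edge_region_mask(rows=9, cols=16, center_frac=0.6):
    """Return a rows x cols boolean grid marking edge-region blocks.

    A block is 'edge' if its center lies outside the central region
    spanning center_frac of each dimension around the image center.
    """
    mask = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Block center in normalized [0, 1) image coordinates.
            cy = (r + 0.5) / rows
            cx = (c + 0.5) / cols
            # Inside the central region if within center_frac/2 of the
            # image center along both axes.
            is_center = (abs(cy - 0.5) <= center_frac / 2 and
                         abs(cx - 0.5) <= center_frac / 2)
            row.append(not is_center)
        mask.append(row)
    return mask

mask = edge_region_mask()
print(mask[0][0], mask[4][8])  # True False: corner block is edge, middle is not
```

With these parameters the four corner blocks always fall in the edge region, matching the four-corner example in the text.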
In step S12, a joint correction process of white balance correction and color uniformity correction is performed on the edge region image, and a gain value after the joint correction is determined.
In the embodiments of the present disclosure, the joint correction processing of white balance correction and color uniformity correction on the edge area image may be understood as performing white balance correction directly on the basis of the calibration data for color uniformity correction, instead of applying color uniformity correction as a separate step first. The calibration data for color uniformity correction can be understood as inherent data calibrated into the image module for correcting the color uniformity of an image.
The gain value after the joint correction can be understood as a gain value obtained by comprehensively considering the color uniformity of the image in the edge area and performing the white balance correction.
In step S13, gain compensation is performed on the green channel and/or the blue channel of the edge region image based on the jointly corrected gain value.
It is understood that, in the embodiment of the present disclosure, the gain compensation is performed on the green channel of the edge area image, which may be understood as multiplying the green channel of the edge area image by a gain value. The gain compensation is performed on the blue channel of the edge area image, which can be understood as multiplying the blue channel of the edge area image by a gain value.
In the embodiment of the present disclosure, the edge area image of the image to be processed is subjected to the joint correction processing of white balance correction and color uniformity correction, and based on the gain value after the joint correction, the gain compensation is performed on the green channel and/or the blue channel of the edge area image. Because the edge area image is not corrected to be color uniform correction with the color consistent with that of the central area image, the red channel of the edge area image does not need to be subjected to gain compensation, the gain value multiplied by each color channel during the gain compensation is reduced, and the occurrence of edge area image noise is reduced.
The following describes an implementation process of performing a joint correction process of white balance correction and color uniformity correction and determining a gain value after the joint correction, with reference to practical applications.
FIG. 3 is a schematic diagram illustrating a process for determining jointly corrected gain values according to an exemplary embodiment. Referring to fig. 3, the following steps are included.
In step S21, a gain statistic for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed are determined.
In the embodiments of the present disclosure, the gain statistic for performing white balance correction on the edge region of the image to be processed may be determined in a conventional manner. For example, the edge area image may be divided into a plurality of blocks; a reference color temperature line may be used to determine the adjusted color temperature of those blocks falling within the white area; and the gain statistic for white balance correction may be obtained by accumulating red, green, and blue channel gain values to which different weights are applied according to the adjusted color temperature of each block in the white area.
The calibration data for performing color uniformity correction on the edge area image of the image to be processed can be acquired by acquiring the inherent attribute information of the image module.
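As a simplified illustration, not the patent's exact procedure, per-channel white-balance gain statistics can be accumulated over candidate white-area blocks with per-block weights; the weights here stand in for the color-temperature-based weights described above, and the data layout is assumed:

```python
def awb_gain_statistics(blocks):
    """Accumulate weighted per-channel AWB gain statistics.

    blocks: list of (weight, (r_avg, g_avg, b_avg)) tuples for blocks
    judged to lie in the white area; weights are assumed to come from
    the adjusted color temperature. Gains normalize R and B toward G.
    """
    total_w = sum(w for w, _ in blocks)
    stats = {'r': 0.0, 'g': 0.0, 'b': 0.0}
    for w, (r, g, b) in blocks:
        stats['r'] += w * (g / r)   # gain that maps R toward G
        stats['g'] += w * 1.0       # G is the reference channel
        stats['b'] += w * (g / b)   # gain that maps B toward G
    return {c: v / total_w for c, v in stats.items()}

stats = awb_gain_statistics([(1.0, (80.0, 120.0, 60.0)),
                             (1.0, (90.0, 120.0, 80.0))])
print(stats['g'])  # 1.0
```

Blocks with averages further from neutral contribute larger red and blue gains, pulling the statistic toward the correction needed to render the white area gray.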
In step S22, a jointly corrected gain value is determined based on the gain statistics of the white balance correction and the calibration data of the color uniformity correction.
In the embodiments of the present disclosure, the jointly corrected gain value can be determined from the gain statistic for performing white balance correction on the edge area image together with the calibration data for performing color uniformity correction on the edge area image.
In an implementation manner, the embodiment of the present disclosure may determine a green channel gain value and a blue channel gain value based on a gain statistic value of white balance correction and calibration data of color uniformity correction, and then determine gain values for performing gain compensation on an edge area image based on the green channel gain value and the blue channel gain value, respectively, to obtain a jointly corrected gain value.
FIG. 4 is a flowchart illustrating a process for determining jointly corrected gain values based on white balance corrected gain statistics and color uniformity corrected calibration data, according to an exemplary embodiment. Referring to fig. 4, the following steps are included.
In step S31, green and blue channel gain values are determined based on the gain statistics and the calibration data.
In one implementation, the embodiments of the present disclosure may determine a red channel initial gain value, a green channel initial gain value, and a blue channel initial gain value respectively based on the gain statistic and the calibration data. A green channel gain value is determined based on the green channel initial gain value and the red channel initial gain value, and a blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
In one example, the product of the red channel gain statistic in the gain statistics and the red channel calibration data in the calibration data is determined as the red channel initial gain value. In one example, a product between a green channel gain statistic in the gain statistics and green channel calibration data in the calibration data is determined as a green channel initial gain value. In one example, the product between the blue channel gain statistic in the gain statistics and the blue channel calibration data in the calibration data is determined as the blue channel initial gain value.
Further, in the embodiment of the present disclosure, in order to further reduce the gain value for gain compensation, when determining the green channel gain value and the blue channel gain value, a ratio between the green channel initial gain value and the red channel initial gain value may be determined as the green channel gain value. The ratio between the blue channel initial gain value and the red channel initial gain value is determined as the blue channel gain value.
In step S32a, a first gain value for gain compensating the green channel in the edge area image is determined based on the green channel gain value.
In the embodiments of the present disclosure, after the green channel gain value is determined, the gain value for performing gain compensation on the green channel in the edge area image can be determined. For convenience of description, the gain value for performing gain compensation on the green channel in the edge area image is hereinafter referred to as the first gain value. In the embodiments of the present disclosure, gain compensation using the first gain value may be understood as multiplying the green channel by the green channel gain value.
In step S32b, a second gain value for gain-compensating the blue channel in the edge area image is determined based on the blue channel gain value.
In the embodiments of the present disclosure, after the blue channel gain value is determined, the gain value for performing gain compensation on the blue channel in the edge region image can be determined. For convenience of description, the gain value for performing gain compensation on the blue channel in the edge area image is hereinafter referred to as the second gain value. In the embodiments of the present disclosure, gain compensation using the second gain value may be understood as multiplying the blue channel by the blue channel gain value.
In step S33, the first gain value and/or the second gain value is determined as the jointly corrected gain value.
In the embodiment of the present disclosure, in the process of performing the joint correction processing on the edge area image, gain compensation may be performed on the blue channel and/or the green channel. Therefore, in the embodiment of the present disclosure, the jointly corrected gain value may be the first gain value, may also be the second gain value, or may also be the first gain value and the second gain value.
The image processing method provided by the embodiments of the present disclosure relies on the principle that the smaller the gain values multiplied in during gain compensation, the less the noise presented in the image is amplified. By omitting the gain compensation of the red channel in the edge area image, the multiplied gain values are reduced, achieving the purpose of reducing the image noise of the edge area image.
FIG. 5 is a diagram illustrating an image processing method according to an exemplary embodiment. Referring to fig. 5, a raw image is input, and joint correction processing of white balance correction and color uniformity correction is performed on it. Based on the color uniformity correction calibration data and the gain statistic of white balance correction, the green channel of the edge area image is multiplied by the ratio of the green channel gain value to the red channel gain value, and the blue channel of the edge area image is multiplied by the ratio of the blue channel gain value to the red channel gain value. This completes the current noise reduction processing, and the result is output to the next image processing module.
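The pipeline of Fig. 5 can be sketched end to end as follows. This is a simplified model with an assumed data layout: an RGB image as nested lists, a precomputed boolean edge mask, and scalar per-channel initial gains (each the product of an AWB statistic and shading calibration data):

```python
def joint_correct(image, edge_mask, init_gains):
    """Apply jointly corrected gains to the edge region of an RGB image.

    image: H x W list of [r, g, b] pixel lists (modified in place).
    edge_mask: H x W booleans, True for edge-region pixels.
    init_gains: dict of initial per-channel gains for 'r', 'g', 'b'.
    """
    g_ratio = init_gains['g'] / init_gains['r']   # first gain value
    b_ratio = init_gains['b'] / init_gains['r']   # second gain value
    for row, mask_row in zip(image, edge_mask):
        for px, is_edge in zip(row, mask_row):
            if is_edge:
                px[1] *= g_ratio   # green channel compensation
                px[2] *= b_ratio   # blue channel compensation
                # red channel (px[0]) is deliberately left untouched
    return image

img = [[[100.0, 80.0, 60.0], [100.0, 80.0, 60.0]]]
mask = [[True, False]]
out = joint_correct(img, mask, {'r': 2.0, 'g': 3.0, 'b': 4.0})
print(out[0][0], out[0][1])  # [100.0, 120.0, 120.0] [100.0, 80.0, 60.0]
```

Only the edge pixel's green and blue channels are scaled; the red channel and the non-edge pixel pass through unchanged, which is the claimed source of the noise reduction.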
The image processing method provided by the embodiment of the disclosure can reduce the gain value multiplied by each color channel of the edge area image, thereby effectively reducing the noise presentation of the edge area image.
Based on the same conception, the embodiment of the disclosure also provides an image processing device.
It is understood that, in order to realize the above functions, the image processing apparatus provided by the embodiments of the present disclosure includes corresponding hardware structures and/or software modules for performing each function. In combination with the units and algorithm steps of the examples disclosed in the embodiments of the present disclosure, the embodiments can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementations should not be considered beyond the scope of the present disclosure.
Fig. 6 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment. Referring to fig. 6, the image processing apparatus 100 includes an acquisition unit 101, a joint correction unit 102, and a compensation unit 103.
The acquiring unit 101 is configured to acquire an image to be processed and determine an edge area image of the image to be processed. The joint correction unit 102 is configured to perform joint correction processing of white balance correction and color uniformity correction on the edge area image and to determine the jointly corrected gain value. The compensation unit 103 is configured to perform gain compensation on the green channel and/or the blue channel of the edge area image based on the jointly corrected gain value.
In one embodiment, the joint correction unit 102 determines the jointly corrected gain value as follows:
A gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on that edge area image are determined. The jointly corrected gain value is then determined based on the gain statistic value and the calibration data.
In one embodiment, the joint correction unit 102 determines the jointly corrected gain value based on the gain statistic and the calibration data as follows:
based on the gain statistics and the calibration data, a green channel gain value and a blue channel gain value are determined. Based on the green channel gain value, a first gain value for performing gain compensation on a green channel in the edge area image is determined. Based on the blue channel gain value, a second gain value for gain compensation of the blue channel in the edge area image is determined. The first gain value and/or the second gain value is/are determined as jointly corrected gain values.
In one embodiment, the joint correction unit 102 determines the green channel gain value and the blue channel gain value based on the gain statistic and the calibration data as follows:
A red channel initial gain value, a green channel initial gain value, and a blue channel initial gain value are determined based on the gain statistic value and the calibration data. The green channel gain value is determined based on the green channel initial gain value and the red channel initial gain value, and the blue channel gain value is determined based on the blue channel initial gain value and the red channel initial gain value.
In one embodiment, the joint correction unit 102 determines the green channel gain value based on the green channel initial gain value and the red channel initial gain value as follows: the ratio between the green channel initial gain value and the red channel initial gain value is determined as the green channel gain value. The joint correction unit 102 determines the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value as follows: the ratio between the blue channel initial gain value and the red channel initial gain value is determined as the blue channel gain value.
In one embodiment, the joint correction unit 102 determines the red channel initial gain value, the green channel initial gain value, and the blue channel initial gain value based on the gain statistic and the calibration data as follows:
The product of the red channel gain statistic in the gain statistics and the red channel calibration data in the calibration data is determined as the red channel initial gain value; the product of the green channel gain statistic and the green channel calibration data is determined as the green channel initial gain value; and the product of the blue channel gain statistic and the blue channel calibration data is determined as the blue channel initial gain value.
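This computation can be sketched as follows; the function name and dictionary keys are assumptions for illustration:

```python
def jointly_corrected_gains(gain_stats, calib):
    """Each channel's initial gain is the product of its white-balance gain
    statistic and its color-uniformity calibration datum; the jointly
    corrected gain values are the green and blue initial gains divided by
    the red initial gain."""
    r_init = gain_stats["r"] * calib["r"]  # red channel initial gain
    g_init = gain_stats["g"] * calib["g"]  # green channel initial gain
    b_init = gain_stats["b"] * calib["b"]  # blue channel initial gain
    first_gain = g_init / r_init   # compensates the green channel
    second_gain = b_init / r_init  # compensates the blue channel
    return first_gain, second_gain

# Sample statistics and calibration data (illustrative values only).
gains = jointly_corrected_gains({"r": 2.0, "g": 1.0, "b": 1.5},
                                {"r": 1.2, "g": 1.0, "b": 1.1})
```

The two returned ratios are the gain values applied to the green and blue channels of the edge area image; the red channel is left uncompensated.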
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 7 is a block diagram illustrating an apparatus for image processing according to an exemplary embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 7, the apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communications component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 806 provides power to the various components of device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800. The sensor assembly 814 may also detect a change in the position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is understood that "a plurality" in this disclosure means two or more, and similar terms are construed analogously. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It is further to be understood that while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. An image processing method, comprising:
acquiring an image to be processed, and determining an edge area image of the image to be processed;
performing joint correction processing of white balance correction and color uniformity correction on the edge area image, and determining a jointly corrected gain value;
performing gain compensation on the green channel and/or the blue channel of the edge area image based on the jointly corrected gain value.
2. The method of claim 1, wherein determining the jointly corrected gain value comprises:
determining a gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed;
determining a jointly corrected gain value based on the gain statistic and the calibration data.
3. The image processing method of claim 2, wherein said determining a jointly corrected gain value based on said gain statistic and said calibration data comprises:
determining a green channel gain value and a blue channel gain value based on the gain statistic and the calibration data;
determining a first gain value for performing gain compensation on a green channel in the edge area image based on the green channel gain value;
determining a second gain value for performing gain compensation on a blue channel in the edge area image based on the blue channel gain value;
determining the first gain value and/or the second gain value as a jointly corrected gain value.
4. The image processing method of claim 3, wherein determining a green channel gain value and a blue channel gain value based on the gain statistic and the calibration data comprises:
respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value based on the gain statistic value and the calibration data;
determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value;
determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value.
5. The image processing method of claim 4, wherein determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value comprises:
determining a ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value;
determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value, including:
determining a ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
6. The method according to claim 4, wherein said determining a red channel initial gain value, a green channel initial gain value, and a blue channel initial gain value based on said gain statistic and said calibration data, respectively, comprises:
determining a product between a red channel gain statistic of the gain statistics and red channel calibration data of the calibration data as a red channel initial gain value,
determining the product of the gain statistic of the green channel in the gain statistics and the calibration data of the green channel in the calibration data as the initial gain value of the green channel, and
determining the product of the blue channel gain statistic in the gain statistics and the blue channel calibration data in the calibration data as the blue channel initial gain value.
7. An image processing apparatus characterized by comprising:
an acquiring unit, configured to acquire an image to be processed and determine an edge area image of the image to be processed;
a joint correction unit, configured to perform joint correction processing of white balance correction and color uniformity correction on the edge area image, and determine a gain value after joint correction;
a compensation unit, configured to perform gain compensation on the green channel and/or the blue channel of the edge area image based on the jointly corrected gain value.
8. The image processing apparatus according to claim 7, wherein the joint correction unit determines the jointly corrected gain value by:
determining a gain statistic value for performing white balance correction on the edge area image of the image to be processed and calibration data for performing color uniformity correction on the edge area image of the image to be processed;
determining a jointly corrected gain value based on the gain statistic and the calibration data.
9. The image processing apparatus according to claim 8, wherein said joint correction unit determines a joint corrected gain value based on said gain statistic and said calibration data in the following manner:
determining a green channel gain value and a blue channel gain value based on the gain statistic and the calibration data;
determining a first gain value for performing gain compensation on a green channel in the edge area image based on the green channel gain value;
determining a second gain value for performing gain compensation on a blue channel in the edge area image based on the blue channel gain value;
determining the first gain value and/or the second gain value as a jointly corrected gain value.
10. The image processing apparatus according to claim 9, wherein said joint correction unit determines a green channel gain value and a blue channel gain value based on said gain statistic and said calibration data in the following manner:
respectively determining a red channel initial gain value, a green channel initial gain value and a blue channel initial gain value based on the gain statistic value and the calibration data;
determining the green channel gain value based on the green channel initial gain value and the red channel initial gain value;
determining the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value.
11. The image processing apparatus according to claim 10, wherein the joint correction unit determines the green channel gain value based on the green channel initial gain value and the red channel initial gain value in the following manner:
determining a ratio between the green channel initial gain value and the red channel initial gain value as the green channel gain value;
the joint correction unit determines the blue channel gain value based on the blue channel initial gain value and the red channel initial gain value as follows:
determining a ratio between the blue channel initial gain value and the red channel initial gain value as the blue channel gain value.
12. The image processing apparatus according to claim 10, wherein the joint correction unit determines a red channel initial gain value, a green channel initial gain value, and a blue channel initial gain value based on the gain statistic and the calibration data as follows:
determining a product between a red channel gain statistic of the gain statistics and red channel calibration data of the calibration data as a red channel initial gain value,
determining the product of the gain statistic of the green channel in the gain statistics and the calibration data of the green channel in the calibration data as the initial gain value of the green channel, and
determining the product of the blue channel gain statistic in the gain statistics and the blue channel calibration data in the calibration data as the blue channel initial gain value.
13. An image processing apparatus characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: performing the image processing method of any one of claims 1 to 6.
14. A non-transitory computer readable storage medium, instructions in which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image processing method of any one of claims 1 to 6.
CN202011062405.XA 2020-09-30 2020-09-30 Image processing method, image processing apparatus, and storage medium Active CN114339187B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011062405.XA CN114339187B (en) 2020-09-30 2020-09-30 Image processing method, image processing apparatus, and storage medium


Publications (2)

Publication Number Publication Date
CN114339187A true CN114339187A (en) 2022-04-12
CN114339187B CN114339187B (en) 2024-06-14

Family

ID=81031644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011062405.XA Active CN114339187B (en) 2020-09-30 2020-09-30 Image processing method, image processing apparatus, and storage medium

Country Status (1)

Country Link
CN (1) CN114339187B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004078652A (en) * 2002-08-20 2004-03-11 Matsushita Electric Ind Co Ltd Image processor and its method
JP2007214674A (en) * 2006-02-07 2007-08-23 Nikon Corp Imaging apparatus
CN101204083A (en) * 2005-03-07 2008-06-18 德克索实验室 Method of controlling an action, such as a sharpness modification, using a colour digital image
US20090027527A1 (en) * 2007-07-23 2009-01-29 Visera Technologies Company Limited Color filter arrays and image sensors using the same
CN102217069A (en) * 2008-11-17 2011-10-12 美商豪威科技股份有限公司 Backside illuminated imaging sensor with improved angular response
US20120188265A1 (en) * 2011-01-25 2012-07-26 Funai Electric Co., Ltd. Image Display Device and Method for Adjusting Correction Data in Look-Up Table
CN104796683A (en) * 2014-01-22 2015-07-22 中兴通讯股份有限公司 Image color calibration method and system
CN107690065A (en) * 2017-07-31 2018-02-13 努比亚技术有限公司 A kind of white balance correcting, device and computer-readable recording medium
WO2018031634A1 (en) * 2016-08-10 2018-02-15 FictionArt, Inc. Volume phase holographic waveguide for display


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DIL NASHIN ANWAR: "Constellation Design for Single Photodetector Based CSK With Probabilistic Shaping and White Color Balance", IEEE Access, vol. 8, 31 August 2020 (2020-08-31) *
DU LEI: "Research on 3A Algorithms Based on Image Analysis", Xidian University, 7 December 2014 (2014-12-07) *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant