CN115205159A - Image processing method and device, electronic device and storage medium - Google Patents
- Publication number: CN115205159A
- Application number: CN202210933151.7A (CN202210933151A)
- Authority: CN (China)
- Prior art keywords: color, information, temperature information, color temperature, image
- Legal status: Pending (assumed status, not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/77—Retouching; Inpainting; Scratch removal
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The embodiments of the present disclosure relate to an image processing method and apparatus, an electronic device, and a storage medium in the technical field of images. The image processing method comprises the following steps: acquiring an image to be processed, and partitioning the image to be processed to obtain a plurality of image blocks; acquiring color temperature information of each image block, and determining local color temperature information of each image block through a color sensor array; determining target color temperature information of each image block according to the color temperature information and the local color temperature information, and acquiring a color correction gain matrix of the target color temperature information; and processing the color information of the image block according to the color correction gain matrix to obtain the target color information of the image block so as to generate a target image corresponding to the image to be processed. The technical solution in the embodiments of the present disclosure can improve the accuracy of color correction.
Description
Technical Field
The present disclosure relates to the field of image technologies, and in particular, to an image processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
In image processing, color correction may need to be applied to an image to improve image quality.
In the related art, color correction is calibrated with a color card under light sources of different color temperatures, the color temperature of the current scene is then estimated in real time by the device, and the color correction matrix to be used is finally calculated by interpolation. This method has limitations: the actual use scenario of the device is a complex and changeable environment, and the color temperature estimate is prone to deviation, so that the resulting color correction matrix is inaccurate and the image quality is affected.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an image processing method and apparatus, an electronic device, and a computer-readable storage medium, which overcome, at least to some extent, the problem of poor image correction accuracy due to the limitations and disadvantages of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an image processing method comprising: acquiring an image to be processed, and blocking the image to be processed to obtain a plurality of image blocks; acquiring color temperature information of each image block, and determining local color temperature information of each image block through a color sensor array; determining target color temperature information of each image block according to the color temperature information and the local color temperature information, and acquiring a color correction gain matrix of the target color temperature information; and processing the color information of the image block according to the color correction gain matrix to obtain the target color information of the image block so as to generate a target image corresponding to the image to be processed.
According to a second aspect of the present disclosure, there is provided an image processing apparatus comprising: the image blocking module is used for acquiring an image to be processed and blocking the image to be processed to obtain a plurality of image blocks; the color temperature acquisition module is used for acquiring color temperature information of each image block and determining local color temperature information of each image block through the color sensor array; the matrix determining module is used for determining target color temperature information of each image block according to the color temperature information and the local color temperature information and acquiring a color correction gain matrix of the target color temperature information; and the color correction module is used for processing the color information of the image block according to the color correction gain matrix to obtain the target color information of the image block so as to generate a target image corresponding to the image to be processed.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: an image module comprising a color sensor array; a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the image processing method of the first aspect described above and possible implementations thereof via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image processing method of the first aspect described above and possible implementations thereof.
In the technical solutions provided in the embodiments of the present disclosure, first, the target color temperature information of each image block is obtained from the local color temperature information acquired by the color sensor array, and the color correction gain matrix corresponding to the target color temperature information is then determined. This avoids the deviation caused by the influence of the external environment on the color temperature, improves the accuracy and stability of the target color temperature information, and thus improves the accuracy of the color correction gain matrix, the color correction capability and reliability of the color correction module, and the image quality and image effect. Second, since a color correction gain matrix is calculated for each image block from its target color temperature information, each image block can be controlled independently, enabling local or global color correction and improving flexibility. Third, the limitation of judging the color temperature of the current scene only in real time by the device is avoided, which widens the application range and improves convenience of use.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 shows a schematic diagram of an application scenario to which the image processing method of the embodiment of the present disclosure may be applied.
Fig. 2 schematically illustrates a schematic diagram of an image processing method according to an embodiment of the present disclosure.
Fig. 3 schematically illustrates a schematic diagram of an image block in an embodiment of the present disclosure.
Fig. 4 schematically shows a flow chart of determining target color temperature information in different ways in the embodiment of the present disclosure.
Fig. 5 schematically shows a weight change diagram of target color temperature information in an embodiment of the present disclosure.
Fig. 6 schematically shows a schematic diagram of a color correction gain matrix for determining target color temperature information in an embodiment of the present disclosure.
Fig. 7 schematically illustrates a schematic diagram of performing color mapping according to an embodiment of the present disclosure.
Fig. 8 schematically shows a schematic structural diagram of an image signal processor in an embodiment of the present disclosure.
Fig. 9 schematically illustrates a flow chart of image color processing in the embodiment of the present disclosure.
Fig. 10 schematically illustrates a block diagram of an image processing apparatus in an embodiment of the present disclosure.
Fig. 11 schematically illustrates a block diagram of an electronic device in an embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Existing schemes perform color correction in the full RGB domain of the ISP and perform color adjustment through a 2D/3D LUT (2D/3D lookup table). Color correction is calibrated with a color card under light sources of different color temperatures, the color temperature of the current scene is estimated in real time by the device, and the color correction matrix to be used is finally calculated by interpolation. The 2D/3D LUT maps the color information in the current color coordinates through a predetermined mapping relationship to obtain the desired color effect.
For the color correction module, the initial calibration can only be performed under standard laboratory light sources. The actual use scenario of the device is a complex and changeable environment, and the color temperature estimate is prone to deviation, which makes the interpolated color correction coefficients less accurate. For the 2D/3D LUT, the actual scene colors need to be predicted when the preset mapping is applied to the color space, for example the skin tones of faces, green plants, red flowers, or blue sky that need to be processed. The color mapping is therefore limited and difficult to perform accurately.
In order to solve technical problems in the related art, an embodiment of the present disclosure provides an image processing method, which may be applied to an application scenario in which an image is processed in a photographing process. Fig. 1 is a schematic diagram illustrating a system architecture to which the image processing method and apparatus according to the embodiment of the present disclosure can be applied.
As shown in fig. 1, the terminal 101 may be a smart device with an image processing function, for example a smartphone, a computer, a tablet computer, a smart speaker, a smart watch, an in-vehicle device, a wearable device, or a monitoring device. The terminal may contain a camera of any type, as long as it can capture images. The number of cameras may be at least one, for example one or four. The image to be processed may be a captured image or a frame of a captured video.
In the disclosed embodiment, the terminal 101 may include a memory 102 and a processor 103. The memory is used for storing the image, and the processor is used for processing the image, such as white balance processing and the like. The memory 102 may store a to-be-processed image 104 therein. The terminal 101 acquires the image to be processed 104 from the memory 102 and sends the image to the processor 103, the image to be processed is subjected to block processing in the processor 103 to obtain a plurality of image blocks, and the local color temperature information and the color temperature information of the image to be processed are determined. And determining target color temperature information of each image block according to the color temperature information and the local color temperature information of each image block acquired by the color sensor array, and further determining a color correction gain matrix of the target color temperature information so as to perform color correction on the image to be processed through the color correction gain matrix, thereby generating a target image 105.
It should be noted that the image processing method provided by the embodiments of the present disclosure may be executed by the terminal 101, and the corresponding image processing apparatus may also be provided in the terminal.
Next, an image processing method in the embodiment of the present disclosure is explained in detail with reference to fig. 2.
In step S210, an image to be processed is obtained, and the image to be processed is partitioned into a plurality of image blocks.
In the embodiments of the present disclosure, the image to be processed may be an image of an object captured through the camera module of the terminal, or a frame of a captured video. The image to be processed may also be a frame of an image or video taken directly from an album or other storage location. The terminal may be any one of a smartphone, a digital camera, a smart watch, a wearable device, an in-vehicle device, or the camera of a monitoring device, as long as it can photograph the object and perform image processing; a smartphone is taken as an example here. The camera module may include at least one camera, for example any one or a combination of a main camera, a telephoto camera, a wide-angle camera, and a macro camera. The image to be processed may be of various types, for example a moving image or a still image.
The image to be processed may be an RGB image, i.e., an RGB three-channel image, where each pixel is composed of the three RGB color components. When the terminal is in photographing mode, the captured image obtained through the camera module may be a RAW image, i.e., the original image data collected by the camera module. In the embodiments of the present disclosure, the image captured by the terminal may be converted to obtain the image to be processed. For example, a captured image in RAW format may be format-converted with a general conversion algorithm to obtain an image to be processed in RGB format, which facilitates subsequent processing.
After the image to be processed is acquired, it may be partitioned into a plurality of image blocks. Each image block is a part of the image to be processed, and the image blocks do not overlap. The image blocks may all have the same size, and their number may be determined by the number of grid cells. For example, a grid area may be defined and applied to the image to be processed so as to divide it into a plurality of image blocks, with the image blocks corresponding one to one to the grid cells. Referring to fig. 3, a plurality of image blocks 301 may be included; for example, grid 00 corresponds to image block 00, grid 01 to image block 01, and so on. The size of the grid area can be set jointly according to actual requirements and the hardware structure, i.e., set according to the actual requirements within what the hardware can support, for example row rows by col columns. On this basis, a color sensor can be associated with an image block and a grid cell: the image to be processed is divided into row × col image blocks, each corresponding to a grid cell, and under the same field of view the multi-window spatial range represented by the grid area coincides with the image to be processed, so the grid area can be overlaid on the image. Each grid cell corresponds to a window, hence the term multi-window.
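As an illustration of the partitioning described above, the following Python sketch divides an image into a row × col grid of non-overlapping blocks. The function name, the block counts, and the use of NumPy are assumptions for illustration and are not part of the disclosed embodiments.

```python
import numpy as np

def partition_into_blocks(image: np.ndarray, rows: int, cols: int):
    """Split an H x W x 3 image into rows x cols non-overlapping blocks.

    Returns a dict mapping (grid_row, grid_col) -> image block, mirroring
    the one-to-one grid / image-block correspondence described above.
    """
    h, w = image.shape[:2]
    # Block boundaries; the last block absorbs any remainder pixels.
    row_edges = np.linspace(0, h, rows + 1, dtype=int)
    col_edges = np.linspace(0, w, cols + 1, dtype=int)
    blocks = {}
    for r in range(rows):
        for c in range(cols):
            blocks[(r, c)] = image[row_edges[r]:row_edges[r + 1],
                                   col_edges[c]:col_edges[c + 1]]
    return blocks

# Example: divide a 480 x 640 RGB image into a 4 x 6 grid of blocks.
img = np.zeros((480, 640, 3), dtype=np.float32)
blocks = partition_into_blocks(img, rows=4, cols=6)
print(len(blocks), blocks[(0, 0)].shape)  # 24 blocks; first block 120 x 106 x 3
```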
In the embodiments of the present disclosure, a color sensor array may be included, and the color sensor array may include a plurality of color sensors 302 arranged in an array. The number of color sensors may be determined according to the number of image blocks. The color sensor array may also be combined with the multi-window information, so each color sensor may be regarded as a multi-window color sensor. Each grid cell represents a color sensor, and each color sensor corresponds to one image block. Since the image to be processed is divided into a plurality of grid cells, a row × col color sensor array is spatially formed, which constitutes a multi-window color sensor.
The multi-window color sensor array is an independent sensor that can be arranged on one side of any camera of the camera module, close to the camera module. The camera module may be a rear camera module and may include at least one camera, for example any one or a combination of a main camera, a telephoto camera, a wide-angle camera, and a macro camera. The specific arrangement position and order of multiple cameras can be determined according to actual requirements and are not specifically limited here. The color sensor array may be placed, for example, on the left side of the telephoto camera, on the right side of the main camera, or below the last of the cameras. The camera module and the color sensor array can be arranged adjacently or at an interval. The specific position of the color sensor array may be determined from a calibration result obtained by calibrating the color sensor array in actual application, or according to actual requirements, and is not specifically limited here.
It should be noted that, for each module in the image signal processor, the image to be processed may be partitioned, and the image blocks obtained by partitioning may be consistent with the image blocks obtained by partitioning according to the grid, so as to ensure consistency and accuracy. For example, the color correction module may perform multi-window area division on the image to obtain a first image block; the 2D/3D lookup table module may also perform multi-window area division on the image to obtain the second image block. The image division modes between different modules are the same, and the first image block and the second image block are substantially the same as the image blocks divided in step S210, so as to ensure consistency and accuracy between the image blocks.
The multi-window area information of the color sensor array is designed to correspond to the color correction module; the window specification of the color correction module is equal to, or slightly larger or smaller than, the multi-window area of the color sensor array, i.e., the numbers of rows and columns of the multi-window area information are the same.
Next, with continued reference to fig. 2, in step S220, color temperature information of each image block is obtained, and local color temperature information of each of the image blocks is determined by the color sensor array.
In the embodiments of the present disclosure, when the light emitted by a light source matches the light radiated by a black body (e.g., platinum) at a certain temperature, the temperature of the black body at that moment is taken as the color temperature, denoted CCT (correlated color temperature), of the light source.
The color temperature information of each image block may first be obtained by the image signal processor (ISP) and is denoted cct1. In addition, the color temperature of a local area of the current scene can be calculated by the color sensor array, i.e., the local color temperature information of each image block is acquired, denoted cct2. A color sensor is a sensor for detecting information such as the color, color temperature, and spectrum of a scene, and may be configured to detect the color information of the object corresponding to each image block and the color temperature information of the current scene. The object may be any type of object included in each image block, such as an item or a person. The spectrum describes different wavelengths, the color temperature corresponds to a response curve over different wavelengths, and wavelength corresponds to color, so the spectrum, the color temperature, and the color are correlated with one another. The color temperature information and the local color temperature information may be the same or different, depending on the actual acquisition results.
In step S230, target color temperature information of each image block is determined according to the color temperature information and the local color temperature information, and a color correction gain matrix of the target color temperature information is obtained.
In the embodiment of the present disclosure, after the color temperature information and the local color temperature information are obtained, the target color temperature information of the image block may be determined jointly according to the color temperature information and the local color temperature information. For example, the color temperature information may be compared with the local color temperature information to determine difference information; and selecting different modes to obtain the target color temperature information according to the comparison result of the difference information and the threshold parameter. Specifically, the difference information between the color temperature information and the local color temperature information may be calculated from the absolute value of the difference therebetween. Further, the difference information may be compared with the threshold parameter to obtain a comparison result, and the target color temperature information of each image block may be determined in different manners according to the comparison result between the difference information and the threshold parameter. The threshold parameter may include a first threshold and a second threshold, and the first threshold Th1 is smaller than the second threshold Th2.
Fig. 4 schematically shows a flow chart for selecting different ways to determine the target color temperature information, and referring to fig. 4, the method mainly comprises the following steps:
in step S410, it is determined whether the difference information is smaller than a first threshold; if yes, go to step S420; if not, go to step S430;
in step S420, if the difference information is smaller than a first threshold, determining target color temperature information according to the color temperature information;
in step S430, it is determined whether the difference information is smaller than a second threshold; if yes, go to step S440; if not, go to step S450;
in step S440, if the difference information is greater than a first threshold and smaller than a second threshold, the color temperature information is adjusted by local color temperature information to determine the target color temperature information;
in step S450, if the difference information is greater than a second threshold, the target color temperature information is determined according to the local color temperature information.
In the embodiments of the present disclosure, as the difference information increases, the weight of the color temperature information in the target color temperature information gradually decreases, as shown in the weight change diagram of fig. 5. When the difference information is smaller than the first threshold, the weight of the color temperature information in the target color temperature information is 1; when the difference information is between the first threshold and the second threshold, this weight gradually decreases; when the difference information is greater than the second threshold, the weight of the color temperature information is 0, that is, the weight of the local color temperature information acquired by the color sensor array is 1.
On this basis, if the difference information between the two is smaller than the first threshold, the target color temperature information of each image block may be the color temperature information acquired by the image signal processor. In this case, the color sensor array does not need to correct the color temperature information acquired by the image signal processor, and the color temperature information is used directly as the target color temperature information of each image block.
If the difference information between the two is greater than the first threshold and less than the second threshold, the target color temperature information of each image block can be determined jointly from the color temperature information obtained by the image signal processor and the local color temperature information obtained by the color sensor array; that is, the color temperature information needs to be adjusted by interpolating it according to the local color temperature information obtained by the color sensor array. For example, the target color temperature information of each image block may be obtained by interpolating the color temperature information calculated in the different manners. The interpolation may be a weighted sum operation. The weighting parameters corresponding to the color temperature information obtained in the different manners differ, but the sum of all the weighting parameters is 1. On this basis, all the color temperature information can be fused according to the weight parameters. For example, the local color temperature information acquired by the color sensor array and the color temperature information may be weighted and fused according to the corresponding weighting parameters to obtain the target color temperature information, as shown in formula (1):
cct = cct1 × w1 + cct2 × w2    formula (1)
where cct1 represents the color temperature information acquired by the image signal processor, cct2 represents the local color temperature information of each image block acquired by the color sensor array, and w1 and w2 are the corresponding weight parameters, with w1 + w2 = 1.
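A minimal sketch of how the target color temperature of one image block could be selected is given below. It assumes the linear weight ramp suggested by fig. 5 between the two thresholds; the function name, the specific ramp, and the example threshold values Th1 and Th2 are illustrative assumptions.

```python
def target_cct(cct1: float, cct2: float, th1: float, th2: float) -> float:
    """Fuse ISP color temperature (cct1) and color-sensor-array local
    color temperature (cct2) into the target color temperature.

    - diff < th1         : trust cct1 (weight w1 = 1)
    - th1 <= diff <= th2 : blend, with w1 falling linearly from 1 to 0
    - diff > th2         : trust cct2 (weight w1 = 0)
    """
    diff = abs(cct1 - cct2)
    if diff < th1:
        w1 = 1.0
    elif diff > th2:
        w1 = 0.0
    else:
        w1 = (th2 - diff) / (th2 - th1)  # assumed linear ramp
    w2 = 1.0 - w1                        # weights sum to 1, as in formula (1)
    return cct1 * w1 + cct2 * w2

# Example: the ISP reports 5200 K, the local sensor reports 5600 K.
print(target_cct(5200.0, 5600.0, th1=200.0, th2=800.0))  # blended value
```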
The color temperature information is adjusted through the local color temperature information acquired by the color sensor array, so that the accuracy of the target color temperature information of each image block can be improved, the influence of the environment on the color temperature in the related technology is avoided, and the stability is improved.
The target color temperature information of each image block may be determined according to the local color temperature information acquired by the color sensor array if the difference information therebetween is greater than the second threshold.
In the embodiment of the disclosure, the difference information between the local color temperature information and the color temperature information acquired by the color sensor array is compared with the first threshold and the second threshold, and the target color temperature information of each image block is acquired in different manners based on the comparison result, so that the accuracy of the target color temperature information corresponding to each image block can be improved.
After the target color temperature information of each image block is determined, it can be smoothed to achieve a smooth transition among the target color temperature information of all image blocks and to reduce abrupt changes between the local blocks when different target color temperature information is applied. The smoothing may be implemented by low-pass filtering, a filtering mode in which low-frequency signals pass through normally while high-frequency signals above a set cutoff are blocked and attenuated. The amount of blocking and attenuation may vary with frequency and with the filtering procedure or purpose.
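As an illustration of the low-pass smoothing of the per-block target color temperatures, the sketch below applies a simple 3 × 3 averaging filter over the block grid. The kernel choice and edge handling are assumptions, since the disclosure only states that low-pass filtering may be used.

```python
import numpy as np

def smooth_block_values(values: np.ndarray) -> np.ndarray:
    """Low-pass (3 x 3 box) filter over a rows x cols grid of per-block
    values, e.g. target color temperatures, using edge replication."""
    padded = np.pad(values, 1, mode="edge")
    out = np.zeros_like(values, dtype=np.float64)
    rows, cols = values.shape
    for r in range(rows):
        for c in range(cols):
            out[r, c] = padded[r:r + 3, c:c + 3].mean()
    return out

# Example: a 4 x 6 grid of target color temperatures with one outlier block.
cct_grid = np.full((4, 6), 5000.0)
cct_grid[2, 3] = 6500.0
print(smooth_block_values(cct_grid)[2, 3])  # pulled back toward its neighbours
```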
Next, a color correction gain matrix of the target color temperature information represented by each image block may be obtained according to the target color temperature information of each image block to implement color correction. The color correction means that the difference between the current color and the target color of the image is corrected by a 24-color card.
Color correction means that the color values of all pixels in an image are changed in the same way to obtain a different display effect. When an image acquisition system acquires a digital image, the acquired image often differs greatly from the original scene due to the influence of ambient light or human factors; color correction can reduce this difference to some extent. Color correction may be implemented with a color correction gain matrix, which corrects the color parameters of each pixel so as to adjust them to the target color parameters. It is mainly used to convert the image data after white balance processing into a standard RGB color space. However, the RGB data after color correction is still linear, and the image data also needs to be converted, after Gamma processing, into the sRGB space, which is closer to human visual perception.
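As a brief illustration of the Gamma step mentioned above, the following sketch applies the standard sRGB transfer curve to linear RGB values after color correction. Treating the encoding as the piecewise sRGB curve is an assumption, since the disclosure only mentions Gamma processing in general terms.

```python
import numpy as np

def linear_to_srgb(rgb_linear: np.ndarray) -> np.ndarray:
    """Encode linear RGB (e.g. after color correction) with the sRGB
    transfer curve, which approximates human brightness perception."""
    rgb = np.clip(rgb_linear, 0.0, 1.0)
    low = rgb * 12.92
    high = 1.055 * np.power(rgb, 1.0 / 2.4) - 0.055
    return np.where(rgb <= 0.0031308, low, high)

# Example: a mid-gray linear value maps to a brighter encoded value.
print(linear_to_srgb(np.array([0.18, 0.18, 0.18])))  # roughly 0.46
```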
Specifically, the color correction gain matrix may be determined according to the target color temperature information of each image block and the calibrated light source result of the image signal processor, where the calibrated light source result refers to the color temperature information acquired by the image signal processor. In some embodiments, the color temperature information of each image block has a corresponding gain matrix, and the gain matrices of different image blocks may be the same or different. The local color temperature information of each image block also has a corresponding gain matrix. The gain matrices of the calibrated color temperature information and of the local color temperature information may each be, for example, a 3 × 3 matrix.
On this basis, the gain matrix of the color temperature information of each image block and the gain matrix of the local color temperature information can be interpolated to obtain the color correction gain matrix of the target color temperature information corresponding to each image block. For example, the gain matrix of the color temperature information and the gain matrix of the local color temperature information may be fused according to the corresponding weight parameters. The fusion here can be achieved by a weighted sum operation. The weight parameter is negatively correlated with the distance from the corresponding color temperature information to the target color temperature information, that is, the larger this distance, the smaller the weight parameter. The weight parameter of the gain matrix for each piece of color temperature information may be determined according to the distance of the other color temperature information from the target color temperature information. For example, referring to fig. 6, if the gain matrix of the color temperature information 2000 K at point A is A1 with weight parameter m2/(m1 + m2), and the gain matrix of the local color temperature information 3000 K at point B is B1 with weight parameter m1/(m1 + m2), then the color correction gain matrix of the target color temperature information at point C can be expressed as m2/(m1 + m2) × A1 + m1/(m1 + m2) × B1.
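A sketch of the gain-matrix interpolation of fig. 6 follows, assuming two calibrated 3 × 3 matrices that bracket the target color temperature; the distance-based weights follow the m2/(m1 + m2) and m1/(m1 + m2) form above, while the function name and example matrices are illustrative assumptions.

```python
import numpy as np

def interpolate_ccm(cct_a: float, ccm_a: np.ndarray,
                    cct_b: float, ccm_b: np.ndarray,
                    cct_target: float) -> np.ndarray:
    """Weighted fusion of two 3 x 3 color correction gain matrices.

    The weight of each matrix is proportional to the distance of the
    *other* color temperature from the target, i.e. the weight is
    negatively correlated with the matrix's own distance, as described above.
    """
    m1 = abs(cct_target - cct_a)  # distance from A to the target
    m2 = abs(cct_target - cct_b)  # distance from B to the target
    w_a = m2 / (m1 + m2)
    w_b = m1 / (m1 + m2)
    return w_a * ccm_a + w_b * ccm_b

# Example: calibrated matrices at 2000 K and 3000 K, target at 2600 K.
A1 = np.eye(3)
B1 = np.eye(3) * 1.2
print(interpolate_ccm(2000.0, A1, 3000.0, B1, 2600.0))  # closer to B1
```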
After the color correction gain matrix of each image block is obtained, it may be smoothed again, for example by low-pass filtering, to obtain the final color correction gain matrix of each image block. The color correction gain matrix may be a red, green, and blue gain matrix.
Further, the color correction gain matrices may be applied to the image to be processed. For example, target image blocks can be selected from the plurality of image blocks, and the color correction gain matrix of each selected image block is multiplied with the color parameters of the pixels of that image block, so that the color parameters of the pixels are adjusted to the required target color parameters; the image color then better meets the actual requirement and the image quality is improved. The target image blocks may be part or all of the plurality of image blocks, as determined by actual requirements.
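The following sketch shows how a per-block color correction gain matrix might be multiplied with the pixel colors of that block. The reshaping and the matrix layout (the matrix acting on RGB column vectors) are assumptions made for illustration.

```python
import numpy as np

def apply_ccm_to_block(block: np.ndarray, ccm: np.ndarray) -> np.ndarray:
    """Multiply every RGB pixel of an h x w x 3 block by a 3 x 3 color
    correction gain matrix, yielding the target color parameters."""
    h, w, _ = block.shape
    pixels = block.reshape(-1, 3)   # one RGB row vector per pixel
    corrected = pixels @ ccm.T      # equivalent to ccm @ rgb column vector
    return np.clip(corrected, 0.0, 1.0).reshape(h, w, 3)

# Example: boost red slightly and leave green and blue unchanged.
ccm = np.array([[1.1, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
block = np.full((2, 2, 3), 0.5, dtype=np.float32)
print(apply_ccm_to_block(block, ccm)[0, 0])  # [0.55, 0.5, 0.5]
```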
In the embodiments of the present disclosure, by calculating the color correction gain matrix of each image block, each image block can be controlled independently, and local or full color correction is realized through each image block, which improves flexibility. In addition, the target color temperature information of each image block is obtained from the local color temperature information acquired by the color sensor array, and the color correction gain matrix corresponding to the target color temperature information is then determined; this avoids the deviation caused by the influence of the external environment's color temperature on the color information, improves the accuracy and stability of the target color temperature information, and thus improves the accuracy of the color correction gain matrix and the color correction capability and reliability of the color correction module.
In some scenes, the original image captured by the camera is not vivid enough in overall color, and when images of different scenes are captured, for example green plants, grasslands, flowers, blue sky, clouds, sand, buildings, or animals, specific changes to the hue and saturation of certain colors may be desired. For example, when the scene contains green plants, the green may be expected to be denser while the other colors remain unchanged; color mapping can then be performed through the 2D/3D LUT. Color mapping refers to mapping the colors of the original image or video frame according to an adjusted color mapping table, where the color mapping table is a three-dimensional lookup table: the three IN_RGB color components are input, and the 3D LUT is looked up to directly obtain the three corresponding OUT_RGB color components. In the embodiments of the present disclosure, the color sensor array may also be associated with the 2D/3D LUT (2D/3D lookup table) to improve the capability of the 2D/3D LUT. It should be noted that the window size of the 2D/3D lookup table module should be equal to, or slightly larger or smaller than, the multi-window area of the color sensor. In addition, the manner of dividing the image blocks and the number of divided image blocks are substantially the same.
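A minimal 3D LUT lookup is sketched below using nearest-node indexing for brevity; practical implementations typically use trilinear interpolation, and the LUT size of 17 nodes per axis is an assumption rather than something specified by the disclosure.

```python
import numpy as np

def apply_3d_lut(rgb_in: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map IN_RGB -> OUT_RGB through a 3D lookup table.

    `rgb_in` holds values in [0, 1]; `lut` has shape (N, N, N, 3), where
    lut[r, g, b] stores the output color for the corresponding input node.
    Nearest-node lookup is used here; trilinear interpolation is typical.
    """
    n = lut.shape[0]
    idx = np.clip(np.rint(rgb_in * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Example: an identity LUT with 17 nodes per axis leaves colors unchanged.
n = 17
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)
print(apply_3d_lut(np.array([0.5, 0.25, 1.0]), identity_lut))
```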
Based on this, the method further comprises: performing color mapping on the color information of the image block by combining the local color information, corresponding to the local color temperature information acquired by the color sensor array, with the reference color information, so as to obtain the target color information. Since the color temperature is a response curve over different wavelengths and wavelength corresponds to color, the color temperature and the color are correlated with each other. For the color sensor array, each color sensor may acquire the local color temperature information and the local color information of its corresponding image block, where the local color information corresponds to the local color temperature information; it may be understood as the color information presented under the local color temperature information, and it may change along with changes in the local color temperature information.
In some embodiments, the color information may be color-mapped, to adjust it to the target color information, by combining the local color information obtained by the color sensor array with the reference color information obtained by the lookup table module. The color mapping may be used to implement color correction by adjusting color parameters, such as the values of the three RGB color sub-pixels. In the embodiments of the present disclosure, color mapping may be performed by combining the color sensor array with the parameter mapping relationship of the lookup table module itself. Specifically, when color mapping is performed by combining the local color information acquired by the color sensor array and the reference color information acquired by the lookup table module, the target color information may be determined in different mapping manners by taking a color threshold parameter and the region type into account. The color threshold parameter may include a first color threshold and a second color threshold, where the first color threshold is different from the first threshold, the second color threshold is different from the second threshold, and the first color threshold is smaller than the second color threshold.
Exemplarily, the local color information may be compared with the reference color information to determine color difference information; and selecting different mapping modes to obtain the target color information according to the comparison result of the color difference information and the color threshold parameter. Specifically, the color difference information between the reference color information and the local color information may be calculated from an absolute value of a difference therebetween. Further, the color difference information may be compared with the color threshold parameter to obtain a comparison result, and different mapping manners may be selected to determine the target color information of each image block according to the comparison result between the color difference information and the color threshold parameter. Specifically, the target color information of the pixel point in each image may be determined.
In the embodiments of the present disclosure, when the region type of the image block is determined to be the target region type, the target color information may be determined by different mapping manners that combine the local color information and the reference color information, according to the comparison result between the color difference information and the color threshold parameter. A flow chart for performing color mapping is schematically shown in fig. 7, and referring to fig. 7, it mainly includes the following steps:
in step S710, it is determined whether the area type of the image block is the target type.
The target type refers to a type of an area whose color needs to be adjusted, and may be determined specifically according to a type of an object included in the area, color information of the area, or whether a user operation is received. For example, if the object type included in the image block is a face skin color, a blue sky, a green plant, or the like, it may be determined that the region type of the image block belongs to the target type.
In step S720, it is determined whether the color difference information is smaller than a first color threshold; if yes, go to step S730; if not, go to step S740.
In step S730, if the color difference information is smaller than a first color threshold, the reference color information is mapped through a parameter mapping relationship to obtain the target color information.
In this step, the parameter mapping relationship may be a lookup table (2D/3D Look-Up Table, 2D/3D LUT for short), such as a color mapping table. For example, the color information of the image block may be input to the color mapping table, and the target color information corresponding to the color mapping table is output, where the color enhancement intensity may be the intensity set by the lookup table itself or may be adjusted according to actual requirements. For example, the current three RGB color components of a certain pixel in an image block may be input into the color mapping table, which then outputs the corresponding three RGB color mapping components. When color mapping is carried out through the mapping table, since the image block can contain a plurality of pixels, each pixel can be adjusted according to the corresponding color mapping table to obtain the target color information output after color adjustment.
Besides, the parameter mapping relationship may be a non-color mapping table, for example, infrared information of the input image is output as pseudo-color information through the non-color mapping table. The embodiment of the disclosure does not specially limit the specific type of the image parameter mapping relationship, and can select the corresponding parameter mapping relationship according to the correction requirement scene of the image parameter.
In step S740, it is determined whether the color difference information is smaller than a second color threshold; if yes, go to step S750; if not, go to step S760.
In step S750, if the color difference information is greater than the first color threshold and smaller than the second color threshold, the target color information after color mapping of the color information is determined by fusing the local color information and the reference color information.
In this step, if the color difference information is between the two color threshold parameters, the target color information may be determined jointly from the local color information and the reference color information. For example, the local color information and the reference color information may be weighted and fused according to corresponding weights, so as to obtain the target color information. As the difference information increases, the weight of the local color information gradually increases, and the weight of the reference color information gradually decreases.
In step S760, if the color difference information is greater than a second color threshold, the target color information is determined according to the local color information.
If the color difference information is greater than the second color threshold, it indicates that the mapped color information may be inaccurate, and at this time, the local color information output by the color sensor array needs to be used as the target color information, that is, the weight of the local color information is relatively large.
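The selection among the three mapping branches of fig. 7 could be sketched as follows. The fusion weight between the local and reference color information is assumed to ramp linearly with the color difference, since the disclosure only states that the local weight gradually increases while the reference weight decreases; the difference metric and threshold values are likewise illustrative assumptions.

```python
import numpy as np

def select_target_color(local_rgb: np.ndarray, reference_rgb: np.ndarray,
                        th1: float, th2: float) -> np.ndarray:
    """Pick the target color from the LUT reference color and the
    color-sensor-array local color, based on their difference."""
    diff = float(np.abs(local_rgb - reference_rgb).max())  # assumed metric
    if diff < th1:              # mapping is trustworthy: use the LUT output
        return reference_rgb
    if diff > th2:              # mapping likely inaccurate: use the sensor color
        return local_rgb
    w_local = (diff - th1) / (th2 - th1)  # assumed linear ramp
    return w_local * local_rgb + (1.0 - w_local) * reference_rgb

# Example: the LUT output and the sensor color disagree moderately.
print(select_target_color(np.array([0.6, 0.5, 0.4]),
                          np.array([0.5, 0.5, 0.4]),
                          th1=0.05, th2=0.25))  # blended: [0.525, 0.5, 0.4]
```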
Further, the target color information obtained after the respective color mapping of each image block may be transitioned smoothly. The smooth transition can be achieved by low-pass smoothing filtering, so that the transition between different image blocks in the spatial domain is smooth and abrupt changes in color effect between image blocks are avoided, which improves image quality. Processing in the spatial domain refers to processing at the pixel level.
It should be added that, during image processing, temporal smoothing can be performed for real-time processing modes such as video or preview. At each time t, an image is input into the image signal processing system for processing, so each time t or each frame corresponds to an image. In this process, a data smoothing filter may be applied to each image along the temporal dimension, i.e., smoothing in the direction of the image time series. Specifically, the smoothing may use IIR filtering in the temporal domain, where the output is computed as I = A × w + B × (1 − w). Here I represents the output result for the current frame (frame t), A is the data or parameter of the current frame, and w is the weight of the current frame. B is the data or parameter of the previous frame (frame t − 1), adjacent to the current frame on the time axis, and 1 − w is the weight of frame t − 1. In this way, smoothing over the temporal domain (the image sequence) reduces the difference between different times and avoids abrupt changes.
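A sketch of the temporal IIR smoothing I = A × w + B × (1 − w) is given below, applied here to a stream of per-frame parameters; the choice of w = 0.7 and the example values are assumptions for illustration.

```python
def iir_smooth(frames, w=0.7):
    """Temporally smooth a sequence of per-frame parameters.

    For each frame t: I_t = A_t * w + I_{t-1} * (1 - w), where A_t is the
    current frame's parameter and I_{t-1} is the previous output.
    """
    smoothed = []
    prev = None
    for a in frames:
        prev = a if prev is None else a * w + prev * (1.0 - w)
        smoothed.append(prev)
    return smoothed

# Example: a per-frame color temperature estimate with a sudden jump.
print(iir_smooth([5000.0, 5000.0, 6500.0, 6500.0]))
# the jump is spread over a few frames instead of appearing abruptly
```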
Fig. 8 schematically shows a block diagram of an image signal processor. Referring to fig. 8, the image signal processor may include a color sensor array 801, a grid 802, and an algorithm module 803, and may further include a color correction module 804, a tone mapping module 805, a 2D/3D LUT 806, and the like.
Referring to fig. 8, the grid information obtained by the color sensor array is associated with the grids obtained by the color correction module and the 2D/3D LUT, so that the divided image blocks are substantially consistent. The local color temperature information output by the color sensor array through the algorithm module is input to the color correction module, and the local color information output by the color sensor array through the algorithm module is input to the 2D/3D LUT. The color correction module and the 2D/3D LUT are typically located in the second stage of the ISP processing flow, the RGB domain stage.
Based on this hardware structure, the color sensor array can acquire the local color temperature information of each image block. The target color temperature information of each image block is then acquired in different manners according to the comparison result, against the threshold parameters, of the difference information between this local color temperature information and the color temperature information of the image block acquired by the image signal processor. The color correction gain matrix corresponding to the target color temperature information of each image block is further obtained from the gain matrix of the local color temperature information and the gain matrix of the color temperature information, and the color information of the image block is corrected according to this color correction gain matrix to obtain the target color information of the image block, so as to generate the target image corresponding to the image to be processed.
In addition, after the local color temperature information and the local color information of each image block are obtained, the color sensor array can also judge whether the area type of the image block is a target type, and when the area type of the image block is the target type, the color sensor array can determine the target color information corresponding to the color information of the image block by adopting different mapping modes according to a color threshold parameter, a comparison result between the local color information of each image block and the color difference information of the reference color information, and the area type of each image block. If the area type of the image block does not belong to the target type, no color mapping is required.
In the embodiment of the present disclosure, the local color temperature information and the local color information are obtained by introducing the color sensor array, so that the color adjustment capability of the color correction module and the 2D/3D LUT module can be improved by the color sensor array, and the accuracy of the target color information can be improved.
Fig. 9 schematically shows a flowchart for acquiring a target image, and referring to fig. 9, mainly includes the following steps:
in step S901, an image to be processed 910 is acquired.
In step S902, the image to be processed is divided into blocks to obtain a plurality of image blocks 920.
In step S903, color temperature information 930 for each image block is acquired.
In step S904, local color temperature information 950 of each image block is acquired based on the color sensor array 940.
In step S905, target color temperature information 960 for the image block is generated from the color temperature information 930 and the local color temperature information 950.
In step S906, a color correction gain matrix 970 of the target color temperature information 960 is acquired from the gain matrix 971 of the color temperature information 930 and the gain matrix 972 of the local color temperature information 950.
In step S907, color correction is performed on the image blocks in the image to be processed through the color correction gain matrix to obtain the target image 900.
In step S908, different mapping modes 990 are obtained by combining the local color information 980 of the color sensor array, and the color information of the image block is mapped to obtain the target image 900.
In step S909, the target image 900 is output.
In the embodiment of the disclosure, by introducing the color sensor array, the local color temperature information of each image block can be obtained according to the multi-window information of the color sensor array. And the color temperature information of each image block can be adjusted according to the local color temperature information of each image block to obtain the target color temperature information of each image block. And then obtaining a color correction gain matrix of the target color temperature information according to the gain matrix of the color temperature information and the gain matrix of the local color temperature information. By calculating the color correction gain matrix of each image block, each image block can be independently controlled, and local or complete color correction is realized through each image block, so that the flexibility is improved. In addition, the local color temperature information acquired by the color sensor array acquires the target color temperature information of each image block, so that the color correction gain matrix corresponding to the target color temperature information is determined, the deviation caused by the influence of the color temperature of the external environment on the color information is avoided, the accuracy and the stability of the target color temperature information can be improved, the accuracy of the color correction gain matrix is further improved, and the color correction capability and the reliability of the color correction module are improved. Furthermore, a proper mapping mode can be selected according to the local color information acquired by the color sensor array to map the color information to obtain the target color information, so that the accuracy and the authenticity of the target color information are improved, and the color richness is improved. The target color temperature information of each image block calculated by combining the acquired local color temperature information of the color sensor array can more accurately perform color processing on the image to be processed from multiple dimensions such as global, local and image contents, the effect and the authenticity of block white balance can be improved, smooth transition among the image blocks is realized, the accuracy of image processing is improved, and the image quality is improved.
An embodiment of the present disclosure provides an image processing apparatus, and referring to fig. 10, the image processing apparatus 1000 may include:
an image blocking module 1001, configured to obtain an image to be processed, and block the image to be processed to obtain a plurality of image blocks;
the color temperature acquisition module 1002 is configured to acquire color temperature information of each image block, and determine local color temperature information of each image block through a color sensor array;
a matrix determining module 1003, configured to determine target color temperature information of each image block according to the color temperature information and the local color temperature information, and obtain a color correction gain matrix of the target color temperature information;
the color correction module 1004 is configured to process the color information of the image block according to the color correction gain matrix, and acquire target color information of the image block to generate a target image corresponding to the image to be processed.
In an exemplary embodiment of the present disclosure, the matrix determination module includes: a difference information determining module, configured to compare the color temperature information with the local color temperature information to determine difference information; and a mode determining module, configured to select among different modes to obtain the target color temperature information according to a comparison of the difference information with threshold parameters.
In an exemplary embodiment of the present disclosure, the mode determining module includes: a first determining module, configured to determine the target color temperature information according to the color temperature information if the difference information is smaller than a first threshold; a second determining module, configured to adjust the color temperature information according to the local color temperature information to determine the target color temperature information if the difference information is larger than the first threshold and smaller than a second threshold; and a third determining module, configured to determine the target color temperature information according to the local color temperature information if the difference information is larger than the second threshold.
In an exemplary embodiment of the present disclosure, the second determining module includes: and the interpolation acquisition module is used for interpolating the color temperature information according to the local color temperature information to acquire the target color temperature information.
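A minimal sketch of this three-branch selection follows, under the assumptions that the difference information is the absolute CCT difference, that the interpolation is linear in CCT, and that the threshold values are placeholders not taken from the disclosure.

```python
def select_target_cct(cct, local_cct, t1=300.0, t2=1500.0):
    """Three-branch selection of the target color temperature from the
    image-side CCT and the sensor-side local CCT (thresholds in kelvin
    are illustrative placeholders)."""
    diff = abs(cct - local_cct)
    if diff < t1:
        return cct                               # small difference: keep the image-side estimate
    if diff < t2:
        a = (diff - t1) / (t2 - t1)              # 0 at the first threshold, 1 at the second
        return (1.0 - a) * cct + a * local_cct   # interpolate toward the sensor reading
    return local_cct                             # large difference: trust the color sensor array
```

Called as `select_target_cct(5200.0, 5900.0)`, the 700 K difference falls between the two example thresholds, so the result lands roughly a third of the way from the image-side estimate toward the sensor reading.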
In an exemplary embodiment of the present disclosure, the matrix determination module includes: and the matrix interpolation module is used for interpolating a gain matrix of the color temperature information of each image block and a gain matrix of the local color temperature information to obtain a color correction gain matrix of the target color temperature information corresponding to each image block.
In an exemplary embodiment of the present disclosure, the matrix interpolation module includes: and the fusion module is used for fusing the gain matrix of the color temperature information and the gain matrix of the local color temperature information according to the corresponding weight parameters to obtain the color correction gain matrix.
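A sketch of this weighted fusion, assuming 3×3 color correction matrices and a single scalar weight derived from how far the target CCT sits between the two source CCTs; the disclosure only states that the matrices are fused by weight parameters, so the specific weighting rule here is an assumption.

```python
import numpy as np

def fuse_gain_matrices(ccm_cct, ccm_local, cct, local_cct, target_cct):
    """Linearly interpolate two 3x3 color correction gain matrices at the
    target color temperature (illustrative weighting rule only)."""
    span = abs(local_cct - cct)
    if span < 1e-6:
        return np.asarray(ccm_cct, dtype=np.float32)   # identical CCTs: either matrix works
    w_local = float(np.clip(abs(target_cct - cct) / span, 0.0, 1.0))
    return (1.0 - w_local) * np.asarray(ccm_cct, dtype=np.float32) + \
           w_local * np.asarray(ccm_local, dtype=np.float32)
```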
In an exemplary embodiment of the present disclosure, the apparatus further includes: a color mapping module, configured to perform color mapping on the color information of the image block by combining the local color information, which corresponds to the local color temperature information acquired by the color sensor array, with reference color information, to obtain target color information.
In an exemplary embodiment of the present disclosure, the color mapping module includes: a mapping mode determining module, configured to determine the target color information by adopting different mapping modes according to the comparison result between color threshold parameters and the color difference information between the local color information of each image block and the reference color information, and according to the region type of each image block.
In an exemplary embodiment of the present disclosure, the mapping mode determining module is configured to: if the color difference information is smaller than a first color threshold and the region type is a target type, map the reference color information through a parameter mapping relation to obtain the target color information; if the color difference information is larger than the first color threshold and smaller than a second color threshold and the region type is the target type, fuse the local color information and the reference color information to obtain the target color information; and if the color difference information is larger than the second color threshold and the region type is the target type, determine the target color information according to the local color information.
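The three mapping branches can be sketched as below. The Euclidean color difference, the threshold values, the fixed blend weight, and treating the "parameter mapping relation" as simply passing the reference color through are all illustrative assumptions rather than details from the disclosure.

```python
import numpy as np

def map_block_color(local_color, ref_color, is_target_region,
                    c1=0.05, c2=0.20, blend=0.5):
    """Three-branch color mapping for one image block, choosing the target
    color from the sensor-side local color and a reference color
    (hypothetical thresholds and blend weight)."""
    local_color = np.asarray(local_color, dtype=np.float32)
    ref_color = np.asarray(ref_color, dtype=np.float32)
    if not is_target_region:
        return local_color                                       # non-target regions are left as-is
    diff = float(np.linalg.norm(local_color - ref_color))
    if diff < c1:
        return ref_color                                         # small difference: map via the parameter mapping
    if diff < c2:
        return blend * local_color + (1.0 - blend) * ref_color   # medium difference: fuse local and reference
    return local_color                                           # large difference: trust the local color
```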
It should be noted that, the specific details of each part in the image processing apparatus have been described in detail in the embodiment of the image processing method part, and details that are not disclosed may refer to the embodiment of the method part, and thus are not described again.
Exemplary embodiments of the present disclosure also provide an electronic device. The electronic device may be the terminal 101 described above. In general, the electronic device may include a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the above-described image processing method via execution of the executable instructions.
The configuration of the electronic device will be exemplarily described below, taking the mobile terminal 1100 in fig. 11 as an example. Those skilled in the art will appreciate that, apart from components specifically intended for mobile use, the configuration in fig. 11 can also be applied to fixed devices.
As shown in fig. 11, the mobile terminal 1100 may specifically include: processor 1101, memory 1102, bus 1103, mobile communication module 1104, antenna 1, wireless communication module 1105, antenna 2, display 1106, camera module 1107, audio module 1108, power module 1109, and sensor module 1110.
An encoder may encode (i.e., compress) an image or video to reduce its data size for storage or transmission. A decoder may decode (i.e., decompress) the encoded data to recover the image or video. The mobile terminal 1100 may support one or more encoders and decoders, for example for image formats such as JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics), and BMP (Bitmap), and for video formats such as MPEG (Moving Picture Experts Group), H.263, H.264, and HEVC (High Efficiency Video Coding).
The processor 1101 may be connected to the memory 1102 or other components by a bus 1103.
The communication function of the mobile terminal 1100 may be implemented by the mobile communication module 1104, the antenna 1, the wireless communication module 1105, the antenna 2, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 1104 may provide 3G, 4G, 5G, and other mobile communication solutions applied to the mobile terminal 1100. The wireless communication module 1105 may provide wireless communication solutions such as wireless LAN, Bluetooth, and near field communication applied to the mobile terminal 1100.
The display screen 1106 is used to implement display functions, such as displaying a user interface, images, video, and the like. The camera module 1107 is used to perform a shooting function, such as taking images, videos, etc., and may contain a color sensor array. The audio module 1108 is used to implement audio functions, such as playing audio, collecting voice, etc. The power module 1109 is used to implement power management functions, such as charging batteries, powering devices, monitoring battery status, and the like. The sensor module 1110 may include one or more sensors for implementing corresponding sensing functions. For example, the sensor module 1110 may include an inertial sensor for detecting a motion pose of the mobile terminal 1100 and outputting inertial sensing data.
It should be noted that, in the embodiments of the present disclosure, a computer-readable storage medium is also provided, and the computer-readable storage medium may be included in the electronic device described in the foregoing embodiments; or may be separate and not incorporated into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer-readable storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method described in the above embodiments.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described drawings are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes illustrated in the above figures are not intended to indicate or limit the temporal order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although several modules or units of the device for performing actions are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in a single module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims. It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.
Claims (12)
1. An image processing method, comprising:
acquiring an image to be processed, and partitioning the image to be processed to obtain a plurality of image blocks;
acquiring color temperature information of each image block, and determining local color temperature information of each image block through a color sensor array;
determining target color temperature information of each image block according to the color temperature information and the local color temperature information, and acquiring a color correction gain matrix of the target color temperature information;
and processing the color information of the image block according to the color correction gain matrix to obtain the target color information of the image block so as to generate a target image corresponding to the image to be processed.
2. The image processing method according to claim 1, wherein the determining the target color temperature information of each image block according to the color temperature information and the local color temperature information comprises:
comparing the color temperature information with the local color temperature information to determine difference information;
and selecting different modes to obtain the target color temperature information according to the comparison result of the difference information and the threshold parameter.
3. The image processing method according to claim 2, wherein the selecting different ways to obtain the target color temperature information according to the comparison result of the difference information and the threshold parameter comprises:
if the difference information is smaller than a first threshold value, determining target color temperature information according to the color temperature information;
if the difference information is larger than a first threshold value and smaller than a second threshold value, adjusting the color temperature information through local color temperature information to determine the target color temperature information;
and if the difference information is larger than a second threshold value, determining the target color temperature information according to the local color temperature information.
4. The method of claim 3, wherein the adjusting the color temperature information by the local color temperature information to determine the target color temperature information comprises:
and interpolating the color temperature information according to the local color temperature information to obtain the target color temperature information.
5. The image processing method according to claim 1, wherein said obtaining a color correction gain matrix of the target color temperature information comprises:
and interpolating the gain matrix of the color temperature information of each image block and the gain matrix of the local color temperature information to obtain a color correction gain matrix of the target color temperature information corresponding to each image block.
6. The image processing method according to claim 5, wherein the interpolating the gain matrix of the color temperature information of each image block and the gain matrix of the local color temperature information to obtain the color correction gain matrix of the target color temperature information corresponding to each image block comprises:
and fusing the gain matrix of the color temperature information and the gain matrix of the local color temperature information according to the corresponding weight parameters to obtain the color correction gain matrix.
7. The image processing method according to claim 1, characterized in that the method further comprises:
and performing color mapping on the color information of the image block by combining the local color information corresponding to the local color temperature information acquired by the color sensor array with the reference color information, to obtain target color information.
8. The image processing method according to claim 7, wherein the performing color mapping on the color information of the image block by combining the local color information corresponding to the local color temperature information acquired by the color sensor array with the reference color information to obtain the target color information comprises:
and determining the target color information by adopting different mapping modes according to the comparison result between the color threshold parameter and the color difference information between the local color information of each image block and the reference color information, and according to the region type of each image block.
9. The image processing method according to claim 8, wherein the determining the target color information by adopting different mapping modes according to the comparison result between the color threshold parameter and the color difference information between the local color information of each image block and the reference color information, and according to the region type of each image block, comprises:
if the color difference information is smaller than a first color threshold and the region type is a target type, mapping the reference color information through a parameter mapping relation to obtain the target color information;
if the color difference information is larger than a first color threshold and smaller than a second color threshold and the region type is a target type, fusing local color information and reference color information to obtain the target color information;
and if the color difference information is larger than a second color threshold value and the region type is a target type, determining the target color information according to the local color information.
10. An image processing apparatus characterized by comprising:
the image blocking module is used for acquiring an image to be processed and blocking the image to be processed to obtain a plurality of image blocks;
the color temperature acquisition module is used for acquiring color temperature information of each image block and determining local color temperature information of each image block through a color sensor array;
the matrix determining module is used for determining target color temperature information of each image block according to the color temperature information and the local color temperature information and acquiring a color correction gain matrix of the target color temperature information;
and the color correction module is used for processing the color information of the image block according to the color correction gain matrix to obtain the target color information of the image block so as to generate a target image corresponding to the image to be processed.
11. An electronic device, comprising:
an image module comprising a color sensor array;
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the image processing method of any of claims 1-9 via execution of the executable instructions.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the image processing method of any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210933151.7A CN115205159A (en) | 2022-08-04 | 2022-08-04 | Image processing method and device, electronic device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115205159A true CN115205159A (en) | 2022-10-18 |
Family
ID=83585056
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210933151.7A Pending CN115205159A (en) | 2022-08-04 | 2022-08-04 | Image processing method and device, electronic device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115205159A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116152361A (en) * | 2023-04-20 | 2023-05-23 | 高视科技(苏州)股份有限公司 | Method for estimating chromaticity, electronic device, and computer-readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11849224B2 (en) | Global tone mapping | |
US11317070B2 (en) | Saturation management for luminance gains in image processing | |
WO2024027287A9 (en) | Image processing system and method, and computer-readable medium and electronic device | |
KR102386385B1 (en) | Electronic device and method for compressing image thereof | |
US10600170B2 (en) | Method and device for producing a digital image | |
CN114693580B (en) | Image processing method and related device | |
CN111696039B (en) | Image processing method and device, storage medium and electronic equipment | |
KR20190010040A (en) | Electronic device and method for compressing high dynamic range image data in the electronic device | |
WO2019104047A1 (en) | Global tone mapping | |
KR102285756B1 (en) | Electronic system and image processing method | |
KR20120114899A (en) | Image processing method and image processing apparatus | |
CN115330633A (en) | Image tone mapping method and device, electronic equipment and storage medium | |
CN115205159A (en) | Image processing method and device, electronic device and storage medium | |
CN115835034A (en) | White balance processing method and electronic equipment | |
CN115187487A (en) | Image processing method and device, electronic device and storage medium | |
WO2020133331A1 (en) | Systems and methods for exposure control | |
EP4117282A1 (en) | Image sensor, imaging apparatus, electronic device, image processing system and signal processing method | |
CN115187488A (en) | Image processing method and device, electronic device and storage medium | |
JP6415094B2 (en) | Image processing apparatus, imaging apparatus, image processing method, and program | |
CN115239739A (en) | Image processing method and device, electronic equipment and computer readable medium | |
CN115278189A (en) | Image tone mapping method and apparatus, computer readable medium and electronic device | |
CN112967194B (en) | Target image generation method and device, computer readable medium and electronic equipment | |
CN115550575A (en) | Image processing method and related device | |
JP2023090492A (en) | Image processing device, image processing method, and imaging apparatus | |
CN109447925B (en) | Image processing method and device, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||