CN115187487A - Image processing method and device, electronic device and storage medium - Google Patents


Info

Publication number
CN115187487A
Authority
CN
China
Prior art keywords
color
target
image
information
temperature information
Prior art date
Legal status
Pending
Application number
CN202210934711.0A
Other languages
Chinese (zh)
Inventor
张科武
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210934711.0A priority Critical patent/CN115187487A/en
Publication of CN115187487A publication Critical patent/CN115187487A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/77 - Retouching; Inpainting; Scratch removal
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10024 - Color image
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30168 - Image quality inspection

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the disclosure relate to an image processing method and device, an electronic device and a storage medium, and relate to the technical field of images. The image processing method includes: acquiring an image to be processed, and partitioning the image to be processed to obtain a plurality of image blocks; determining target color temperature information of each image block according to the color temperature information of each image block and the local color temperature information of each image block determined by a color sensor array; acquiring regional color temperature information of a target region in the image to be processed according to the target color temperature information, and acquiring a color correction gain matrix of the target region based on the regional color temperature information; and correcting the color information of the target region according to the color correction gain matrix of the target region to acquire the target color information of the target region, so as to generate a target image corresponding to the image to be processed. The technical scheme in the embodiments of the disclosure can improve the accuracy of color correction.

Description

Image processing method and device, electronic device and storage medium
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to an image processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
In image processing, color correction may be required for an image to improve image quality.
In the related art, color correction is calibrated using a color card under light sources of different color temperatures; the color temperature of the current scene is then estimated in real time by the device, and the color correction matrix to be used is finally calculated by interpolation. This method has certain limitations: only overall correction can be performed, so the obtained color correction matrix is inaccurate and the image quality is affected.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an image processing method and apparatus, an electronic device, and a computer-readable storage medium, which overcome, at least to some extent, the problem of poor image correction accuracy due to the limitations and disadvantages of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an image processing method including: acquiring an image to be processed, and partitioning the image to be processed to obtain a plurality of image blocks; determining target color temperature information of each image block according to the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array; acquiring regional color temperature information of a target region in the image to be processed according to the target color temperature information, and acquiring a color correction gain matrix of the target region based on the regional color temperature information; and correcting the color information of the target area according to the color correction gain matrix of the target area to acquire the target color information of the target area so as to generate a target image corresponding to the image to be processed.
According to a second aspect of the present disclosure, there is provided an image processing apparatus comprising: the image blocking module is used for acquiring an image to be processed and blocking the image to be processed to obtain a plurality of image blocks; the color temperature acquisition module is used for determining target color temperature information of each image block according to the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array; the gain matrix determining module is used for acquiring the regional color temperature information of a target region in the image to be processed according to the target color temperature information and acquiring a color correction gain matrix of the target region based on the regional color temperature information; and the color correction module is used for correcting the color information of the target area according to the color correction gain matrix of the target area to acquire the target color information of the target area so as to generate a target image corresponding to the image to be processed.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: an image module including a color sensor array; a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the image processing method of the first aspect described above and possible implementations thereof via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image processing method of the first aspect described above and possible implementations thereof.
In the technical scheme provided in the embodiment of the disclosure, on one hand, the target color temperature information of each image block is acquired through the local color temperature information acquired by the color sensor array, and then the color correction gain matrix corresponding to the regional color temperature information of the target region is determined, so that the limitation that only integral correction can be performed is avoided, the local image block represented by the target region is independently controlled, the independence and flexibility of image processing are improved, the pertinence of image processing is also improved, the application range can be increased, and the application convenience is improved. On the other hand, the color correction gain matrix in the target area is calculated through the target color temperature information of each image block, each image block in the target area can be independently controlled, color correction can be performed through the color correction gain matrix of the target area, accuracy is improved, and the color correction effect and image quality are improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 shows a schematic diagram of an application scenario to which the image processing method of the embodiment of the present disclosure may be applied.
Fig. 2 schematically illustrates a schematic diagram of an image processing method according to an embodiment of the present disclosure.
Fig. 3 schematically illustrates a schematic diagram of an image block in an embodiment of the present disclosure.
Fig. 4 schematically shows a flow chart of determining target color temperature information in different ways in the embodiment of the present disclosure.
Fig. 5 schematically shows a weight change diagram of target color temperature information in an embodiment of the present disclosure.
Fig. 6 schematically illustrates a schematic diagram of determining a target area in an embodiment of the present disclosure.
Fig. 7 schematically shows a schematic diagram of color temperature transition of different regions in an embodiment of the present disclosure.
FIG. 8 schematically illustrates a schematic view of a radial transition in an embodiment of the present disclosure.
Fig. 9 schematically illustrates a schematic diagram of color transitions of different regions in an embodiment of the present disclosure.
Fig. 10 schematically shows a schematic configuration diagram of an image signal processor in an embodiment of the present disclosure.
Fig. 11 schematically illustrates a block diagram of an image processing apparatus in an embodiment of the present disclosure.
Fig. 12 schematically illustrates a block diagram of an electronic device in an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The embodiment of the disclosure provides an image processing method, which can be applied to an application scene for processing an image in a photographing process. Fig. 1 is a schematic diagram illustrating a system architecture to which the image processing method and apparatus according to the embodiment of the present disclosure can be applied.
As shown in fig. 1, the terminal 101 may be a smart device with an image processing function, for example, a smart phone, a computer, a tablet computer, a smart speaker, a smart watch, an in-vehicle device, a wearable device, or a monitoring device. The terminal may contain a camera, and the camera may be of any type as long as it can perform shooting. The number of cameras may be at least one, for example one or four, as long as photographing is possible. The image to be processed may be a captured image or each frame of a captured video.
In the disclosed embodiment, the terminal 101 may include a memory 102 and a processor 103. The memory is used for storing images, and the processor is used for processing images, for example white balance processing. The memory 102 may store the image to be processed 104. The terminal 101 acquires the image to be processed 104 from the memory 102 and sends it to the processor 103; the image to be processed is divided into blocks in the processor 103 to obtain a plurality of image blocks, and the local color temperature information and the color temperature information of the image to be processed are determined. The target color temperature information of each image block is then determined according to the color temperature information and the local color temperature information of each image block acquired by the color sensor array, the color correction gain matrix corresponding to the target color temperature information in the target region is further determined, and the target region of the image to be processed is color-corrected through the color correction gain matrix, thereby generating a target image 105.
It should be noted that the image processing method provided by the embodiment of the present disclosure may be executed by the terminal 101. Correspondingly, the image processing apparatus may also be provided in the terminal 101.
Next, an image processing method in the embodiment of the present disclosure is explained in detail with reference to fig. 2.
In step S210, an image to be processed is obtained, and the image to be processed is partitioned into a plurality of image blocks.
In the embodiment of the disclosure, the image to be processed may be an image obtained by shooting an object to be shot through the camera module of the terminal, or may be each frame of image in a captured video. The image to be processed may also be an image, or each frame of a video, obtained directly from an album or another storage location. The terminal may be any one of a smartphone, a digital camera, a smart watch, a wearable device, a vehicle-mounted device, or a camera of a monitoring device, as long as it can photograph an object to be photographed and can perform image processing; a smartphone is taken as an example here for description. The camera module may include at least one camera, for example any one or a combination of a main camera, a telephoto camera, a wide-angle camera, and a macro camera. The image to be processed may be various types of images, for example, a dynamic image or a static image.
After the image to be processed is acquired, the image to be processed may be divided into blocks to obtain a plurality of image blocks. An image block is a part of the image to be processed, and the plurality of image blocks do not overlap. The size of each image block may be the same, and the number of image blocks may be determined according to the number of grid cells. For example, a grid area may be provided and applied to the image to be processed, so as to divide the image to be processed into a plurality of image blocks according to the grid area, and the divided image blocks correspond to the grid cells one to one. Referring to fig. 3, a plurality of image blocks 301 may be included: for example, grid 00 corresponds to image block 00, grid 01 corresponds to image block 01, and so on. The size of the grid area can be set jointly according to actual requirements and the hardware structure, that is, within what the hardware structure can support, the grid area is set according to actual requirements, for example row × col cells. Based on this, the color sensors may be associated with the image blocks and the grid: the image to be processed may be divided into row × col image blocks, and each image block corresponds to one grid cell. Under the same field angle, the multi-window spatial range represented by the grid area coincides with the image to be processed, so the grid area can be overlaid on the image to be processed. Each grid cell may correspond to a window and may therefore be referred to as a multi-window.
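As a minimal sketch of this blocking step (the function name, the use of NumPy, and the 12 × 16 grid size are illustrative assumptions, not part of the disclosure), the image to be processed can be divided into row × col non-overlapping blocks as follows:

```python
import numpy as np

def split_into_blocks(image: np.ndarray, rows: int, cols: int):
    """Divide an H x W x 3 image into rows x cols non-overlapping image blocks.

    Returns a dict mapping (grid_row, grid_col) -> image block, so each grid
    cell corresponds to exactly one image block, as described above.
    """
    h, w = image.shape[:2]
    block_h, block_w = h // rows, w // cols
    blocks = {}
    for r in range(rows):
        for c in range(cols):
            blocks[(r, c)] = image[r * block_h:(r + 1) * block_h,
                                   c * block_w:(c + 1) * block_w]
    return blocks

# Example: a 12 x 16 grid overlaid on a 1080p image (grid size chosen arbitrarily here).
image = np.zeros((1080, 1920, 3), dtype=np.uint8)
blocks = split_into_blocks(image, rows=12, cols=16)
```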
In the embodiment of the present disclosure, a color sensor array may be included, and the color sensor array may include a plurality of color sensors 302 arranged in an array. The number of color sensors may be determined according to the number of image blocks. Also, the color sensor array may be combined with multi-window information, so each color sensor may be a multi-window color sensor. Each grid represents a color sensor, each color sensor corresponding to each image block. Since the image to be processed is divided into a plurality of grids, a row × col color sensor array is spatially formed, which forms a multi-window color sensor.
The multi-window color sensor array is an independent sensor and may be arranged on one side of any camera in the camera module, close to the camera module. The camera module may be a rear camera module and may include at least one camera, for example any one or a combination of a main camera, a telephoto camera, a wide-angle camera, and a macro camera. The specific arrangement position and order of the multiple cameras can be determined according to actual requirements, and are not specifically limited here. The color sensor array may be arranged, for example, on the left side of the telephoto camera, on the right side of the main camera, or below the last of the at least one camera. The camera module and the color sensor array can be arranged adjacently or at an interval. The specific position of the color sensor array may be determined according to a calibration result obtained by calibrating the color sensor array in actual application, or may be determined according to actual requirements, which is not specifically limited here.
It should be noted that, for each module in the image signal processor, the image to be processed may be partitioned, and the image blocks obtained by partitioning may be consistent with the image blocks obtained by partitioning according to the grid, so as to ensure consistency and accuracy. For example, the color correction module may perform multi-window area division on the image to obtain a first image block; the 2D/3D lookup table module may also perform multi-window area division on the image to obtain the second image block. The image division modes among different modules are the same, and the first image block and the second image block are basically the same as the image blocks divided in step S210, so as to ensure the consistency and accuracy among the image blocks.
The multi-window area information of the color sensor array is designed to correspond to the color correction module: the window specification of the color correction module is equal to, or slightly larger or smaller than, the multi-window area of the color sensor array, and the two multi-window areas have the same numbers of rows and columns.
Next, with continued reference to fig. 2, in step S220, target color temperature information of each image block is determined according to the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array.
In the embodiment of the present disclosure, the color temperature information of each image block may first be acquired by the image signal processor (ISP) and may be denoted cct1. In addition, the color temperature of a local area of the current scene can be calculated by the color sensor array, i.e. the local color temperature information of each image block is acquired, which can be denoted cct2. The color sensor may be a sensor for detecting information such as the color, color temperature, and spectrum of a scene, and may be configured to detect the color information of the object corresponding to each image block and the color temperature information of the current scene. The object may be an object included in each image block, and may be of any type, such as a thing or a person.
In the embodiment of the disclosure, the target color temperature information of the image block can be determined according to the color temperature information and the local color temperature information. For example, the color temperature information may be compared with the local color temperature information to determine difference information; and selecting different modes to obtain the target color temperature information according to the comparison result of the difference information and the threshold parameter. The threshold parameter may include a first threshold value and a second threshold value, and the first threshold value Th1 is smaller than the second threshold value Th2.
Fig. 4 schematically shows a flow chart for selecting different ways to determine the target color temperature information, and referring to fig. 4, the method mainly comprises the following steps:
in step S410, it is determined whether the difference information is smaller than a first threshold; if yes, go to step S420; if not, go to step S430;
in step S420, if the difference information is smaller than a first threshold, determining target color temperature information according to the color temperature information;
in step S430, it is determined whether the difference information is smaller than a second threshold; if yes, go to step S440; if not, go to step S450;
in step S440, if the difference information is greater than a first threshold and smaller than a second threshold, the color temperature information is adjusted by local color temperature information to determine the target color temperature information;
in step S450, if the difference information is greater than a second threshold, the target color temperature information is determined according to the local color temperature information.
In the embodiment of the present disclosure, as the difference information increases, the weight of the color temperature information in the target color temperature information gradually decreases, as shown in the weight change diagram of fig. 5. When the difference information is smaller than the first threshold, the weight of the color temperature information in the target color temperature information is 1; when the difference information is between the first threshold and the second threshold, the weight of the color temperature information gradually decreases; when the difference information is greater than the second threshold, the weight of the color temperature information is 0, that is, the weight of the local color temperature information acquired by the color sensor array in the target color temperature information is 1.
On the basis, if the difference information between the two is smaller than the first threshold, in this case, the color sensor array does not need to correct the color temperature information obtained by the image signal processor, so the color temperature information is directly used as the target color temperature information of each image block.
If the difference information between the two is greater than the first threshold and smaller than the second threshold, the color temperature information acquired by the image signal processor needs to be adjusted using the local color temperature information acquired by the color sensor array. Specifically, the target color temperature information of each image block can be determined jointly from the color temperature information acquired by the image signal processor and the local color temperature information acquired by the color sensor array, that is, the color temperature information is interpolated through the local color temperature information acquired by the color sensor array. For example, the target color temperature information of each image block may be obtained by interpolating the color temperature information calculated in the different manners. The interpolation may be a weighted sum operation: the local color temperature information acquired by the color sensor array and the color temperature information may be weighted and fused according to the corresponding weight parameters to obtain the target color temperature information, for example as shown in formula (1):
cct = cct1 × w1 + cct2 × w2    Formula (1)
Where cct1 represents the color temperature information acquired by the image signal processor, cct2 represents the local color temperature information of each image block acquired by the color sensor array, and w1 and w2 represent the corresponding weight parameters.
The color temperature information is adjusted through the local color temperature information acquired by the color sensor array, the accuracy of the target color temperature information of each image block can be improved, the influence of the environment on the color temperature in the related technology is avoided, and the stability is improved.
The target color temperature information of each image block may be determined according to the local color temperature information acquired by the color sensor array if the difference information therebetween is greater than the second threshold.
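The three cases above can be summarized in a small sketch (the linear weight ramp between Th1 and Th2, the function name, and the numeric values are assumptions used only to illustrate formula (1) and the weight curve of fig. 5):

```python
def fuse_color_temperature(cct1: float, cct2: float, th1: float, th2: float) -> float:
    """Fuse the ISP color temperature (cct1) with the color-sensor local color
    temperature (cct2) according to their difference, following formula (1)."""
    diff = abs(cct1 - cct2)
    if diff < th1:
        w1 = 1.0                          # use the ISP color temperature directly
    elif diff < th2:
        w1 = (th2 - diff) / (th2 - th1)   # assumed linear ramp between Th1 and Th2
    else:
        w1 = 0.0                          # rely entirely on the color sensor array
    w2 = 1.0 - w1
    return cct1 * w1 + cct2 * w2

# Example: the ISP reports 5000 K, the color sensor array reports 5600 K for a block.
target_cct = fuse_color_temperature(5000.0, 5600.0, th1=200.0, th2=1000.0)
```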
After the target color temperature information of each image block is determined, the target color temperature information of each image block can be smoothed to realize smooth transition among the target color temperature information of all the image blocks, and the abrupt change effect generated after different target color temperature information acts among local blocks represented by each image block is reduced. Wherein the smoothing process may be implemented by low-pass filtering.
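A sketch of the smoothing step is given below; the 3 × 3 box filter stands in for the low-pass filter, which the disclosure does not fix to a particular kernel, and all names are illustrative:

```python
import numpy as np

def smooth_cct_map(cct_map: np.ndarray) -> np.ndarray:
    """Low-pass filter a rows x cols map of per-block target color temperatures
    with a 3 x 3 box kernel so neighboring image blocks transition smoothly."""
    padded = np.pad(cct_map, 1, mode="edge")
    out = np.zeros_like(cct_map, dtype=float)
    rows, cols = cct_map.shape
    for r in range(rows):
        for c in range(cols):
            out[r, c] = padded[r:r + 3, c:c + 3].mean()
    return out

cct_map = np.full((12, 16), 5000.0)
cct_map[4:8, 6:10] = 6500.0          # a patch of blocks with a cooler estimate
smoothed = smooth_cct_map(cct_map)
```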
Next, in step S230, region color temperature information of a target region in the image to be processed is obtained according to the target color temperature information, and a color correction gain matrix of the target region is obtained.
In the disclosed embodiment, a color correction gain matrix is used to achieve color correction. Color correction refers to correcting the difference between the current color and the target color of an image through a 24-color card. The color correction gain matrix is used for correcting the color parameters of each pixel point so as to adjust the color parameters to be the target color parameters. The color parameters may be RGB values.
In order to solve the problem that the content of the partial region cannot be specifically processed in the related art, mask information may be provided to select a target region from the image to be processed for specific processing. The mask information may be at least one, and is determined according to actual requirements, and a mask information is taken as an example for description herein. The target area may be a partial area of the image to be processed that matches the mask information. Referring to fig. 6, a target region 602 may be selected from the image to be processed 600 by mask information 601, and a region other than the target region may be determined as a reference region 603. The target area may be an area that needs to focus on or an area that needs fine-grained adjustment, such as a face area or an area with more details. The target area may include at least one image block, and the image block may be a complete image block or a partial image block, which is specifically determined according to the size of the mask information.
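A minimal sketch of selecting the target region with mask information follows (representing the mask at block granularity as a boolean grid is an assumption; the disclosure only requires that the target region match the mask):

```python
import numpy as np

def select_target_blocks(mask: np.ndarray):
    """Given a rows x cols boolean mask aligned with the grid, return the grid
    coordinates of the blocks inside the target region and of the reference region."""
    target = [tuple(idx) for idx in np.argwhere(mask)]
    reference = [tuple(idx) for idx in np.argwhere(~mask)]
    return target, reference

mask = np.zeros((12, 16), dtype=bool)
mask[4:8, 6:10] = True               # e.g. the blocks covering a face region
target_blocks, reference_blocks = select_target_blocks(mask)
```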
After the target area is acquired, the area color temperature information of the target area, that is, the fourth color temperature information cct4, may be output according to the target color temperature information of the image block included in the target area. For example, the region color temperature information may be target color temperature information for each image block, for example, the target color temperature information of image block 1, image block 2, and image block 3 is included; the color temperature information obtained by integrating the target color temperature information of a plurality of image blocks may be used, and the description will be given by taking the regional color temperature information as the individual target color temperature information for each image block. Meanwhile, the color temperature information of the reference region other than the target region in the image to be processed may be determined as the third color temperature information cct3. The third color temperature information is different from the fourth color temperature information.
After obtaining the region color temperature information of the target region, the region color temperature information of the target region may be adjusted according to a first adjustment parameter to output the fourth color temperature information, while the color temperature information of the reference region is kept unchanged to obtain the third color temperature information. Illustratively, the first adjustment parameter is determined according to actual requirements, such as user requirements or system settings. The first adjustment parameter may include the image blocks to be adjusted and the adjustment degree. During adjustment, all or part of the image blocks that need to be adjusted can be adjusted by the required amplitude, so that the region color temperature information of the target region is adjusted to output the fourth color temperature information; the fourth color temperature information of the target region is thus accurately and independently controlled, and flexibility is improved. It should be noted that, for the image blocks included in the target region, since the proportion of each image block within the target region differs, the adjustment priority of each image block may differ; specifically, the adjustment priority may be positively correlated with the proportion of the image block, that is, the larger the proportion of an image block located in the target region, the higher its adjustment priority. The adjustment directions of different image blocks may also differ, and are determined according to actual requirements.
In order to avoid the difference between the color temperature information of different regions, the fourth color temperature information in the target region and the third color temperature information in the reference region outside the target region may be smoothly transitioned. The smooth transition may be a radial transition, or may be in other transition manners, and the radial transition is taken as an example for description here.
In the embodiment of the present disclosure, the central area of the fourth color temperature information in the target region is taken as the radial origin, and a radial transition is performed from the fourth color temperature information in the target region to the third color temperature information in the reference region, as shown in fig. 7. The transition between the two can use a series of weight transition manners such as a smooth-curve radial weight distribution or a linear distribution, as shown in fig. 8, which is not limited in detail here. A smooth-curve radial weight distribution means that the transition weights follow a smooth curve, and a linear distribution means that the transition weights are distributed linearly. When the radial distance is smaller than the first threshold Th1, the fourth color temperature information may be used; when the radial distance is greater than the first threshold Th1 and less than the second threshold Th2, the fourth color temperature information and the third color temperature information may be weighted and fused; and when the radial distance is greater than the second threshold Th2, the third color temperature information may be used. During the weighted fusion, the weight of the fourth color temperature information decreases as the radial distance increases, gradually transitioning to the third color temperature information.
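A sketch of this radial transition between cct4 and cct3 is shown below; a clamped linear ramp over the radial distance stands in for the smooth weight curve of fig. 8, and the distance thresholds and names are illustrative assumptions:

```python
def radial_blend_cct(cct4: float, cct3: float, distance: float,
                     th1: float, th2: float) -> float:
    """Blend the target-region color temperature (cct4) into the reference-region
    color temperature (cct3) as the radial distance from the target center grows."""
    if distance < th1:
        w4 = 1.0
    elif distance < th2:
        w4 = (th2 - distance) / (th2 - th1)   # weight of cct4 falls with distance
    else:
        w4 = 0.0
    return w4 * cct4 + (1.0 - w4) * cct3

# Example: sample the blended color temperature at increasing radial distances.
profile = [radial_blend_cct(6500.0, 5000.0, d, th1=2.0, th2=6.0) for d in range(10)]
```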
In the embodiment of the disclosure, the target region is obtained through the mask information, and then the region color temperature information of the target region can be obtained, and further, the color correction gain matrix of the target region can be determined based on the region color temperature information.
Specifically, the color correction gain matrix may be determined according to the regional color temperature information within the target region and the calibrated light source result of the image signal processor. The regional color temperature information in the target region may be the target color temperature information of each image block included in the target region, and the calibrated light source result refers to the color temperature information acquired by the image signal processor. In some embodiments, the calibrated color temperature information of each image block has a corresponding gain matrix, and the gain matrices of different image blocks may be the same or different. Likewise, the local color temperature information of each image block acquired by the color sensor array also has a corresponding gain matrix. The gain matrices of the calibrated color temperature information and of the local color temperature information may each be, for example, a 3 × 3 matrix.
On this basis, the gain matrix of the color temperature information of each image block and the gain matrix of the local color temperature information may be interpolated to obtain the color correction gain matrix of the target color temperature information corresponding to each image block, and the color correction gain matrices of all image blocks included in the target region are determined as the color correction gain matrix of the target region. For example, the gain matrix of the color temperature information and the gain matrix of the local color temperature information may be fused according to the corresponding weight parameters. The fusion here can be achieved by a weighted sum operation. A weight parameter is negatively correlated with the distance from the corresponding color temperature information to the target color temperature information, that is, the larger the distance from the corresponding color temperature information to the target color temperature information, the smaller the weight parameter. The weight parameter of the gain matrix of one piece of color temperature information may be determined according to the distance of the other piece of color temperature information from the target color temperature information. For example, let point A be the color temperature information, point B the local color temperature information, and point C the target color temperature information, with the distance between A and C being m1 and the distance between B and C being m2. If the color temperature information at point A is 2000 K with gain matrix A1 and weight parameter m2/(m1+m2), and the local color temperature information at point B is 3000 K with gain matrix B1 and weight parameter m1/(m1+m2), then the color correction gain matrix of the target color temperature information at point C can be expressed as (m2/(m1+m2)) × A1 + (m1/(m1+m2)) × B1.
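The distance-weighted interpolation of the two gain matrices can be sketched as follows (the near-identity 3 × 3 matrices and the color temperature values are placeholders, not calibration data):

```python
import numpy as np

def interpolate_gain_matrix(gain_a: np.ndarray, cct_a: float,
                            gain_b: np.ndarray, cct_b: float,
                            cct_target: float) -> np.ndarray:
    """Interpolate two 3 x 3 gain matrices; each matrix's weight is inversely
    related to the distance of its color temperature from the target, i.e.
    (m2 / (m1 + m2)) * A1 + (m1 / (m1 + m2)) * B1."""
    m1 = abs(cct_target - cct_a)
    m2 = abs(cct_target - cct_b)
    if m1 + m2 == 0:
        return gain_a.copy()
    return (m2 / (m1 + m2)) * gain_a + (m1 / (m1 + m2)) * gain_b

A1 = np.eye(3) * 1.10    # gain matrix calibrated at 2000 K (placeholder values)
B1 = np.eye(3) * 0.95    # gain matrix for the local 3000 K estimate (placeholder values)
ccm = interpolate_gain_matrix(A1, 2000.0, B1, 3000.0, cct_target=2600.0)
```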
After the color correction gain matrix of each image block is obtained, the color correction gain matrix of each image block may be smoothed again. The smoothing process may be low-pass filtering, so as to obtain a color correction gain matrix of each image block. The color correction gain matrix may be a red, green, and blue gain matrix. Based on this, the color correction gain matrix of the target region can be obtained according to the color correction gain matrices of all the image blocks included in the target region, that is, the color correction gain matrix of the target region is the color correction gain matrix of all the image blocks included in the target region. With the adjustment of the fourth color temperature information in the target region, the color correction gain matrix of the target region may also be changed, which is specifically determined according to the adjustment parameter. But the color correction gain matrix of the reference area is fixed.
Continuing to refer to fig. 2, in step S240, color information of the target region is corrected according to the color correction gain matrix of the target region, so as to generate a target image corresponding to the image to be processed.
Further, a color correction gain matrix may be applied to the target area of the image to be processed. For example, image blocks may be obtained in the target region, and color information of each image block may be corrected according to the color correction gain matrix of each image block. Specifically, the color parameter of the pixel point of each image block in the target region can be multiplied by the color correction gain matrix of each image block included in the target region, so that the color parameter of the pixel point of each image block in the target region is adjusted to the required target color information, the image color is more in line with the actual requirement, and the image quality is improved.
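A sketch of this correction step is given below: each pixel's RGB vector in a block is multiplied by that block's 3 × 3 color correction gain matrix; the clipping to the valid 8-bit range and the diagonal placeholder matrix are added assumptions:

```python
import numpy as np

def apply_ccm_to_block(block: np.ndarray, ccm: np.ndarray) -> np.ndarray:
    """Multiply every pixel's RGB vector in the block by the block's 3 x 3
    color correction gain matrix to obtain the target color information."""
    flat = block.reshape(-1, 3).astype(np.float32)
    corrected = flat @ ccm.T
    return np.clip(corrected, 0, 255).astype(block.dtype).reshape(block.shape)

block = np.full((90, 120, 3), 128, dtype=np.uint8)   # one image block in the target region
ccm = np.eye(3) * np.array([1.05, 1.00, 0.92])       # placeholder per-block gain matrix
corrected_block = apply_ccm_to_block(block, ccm)
```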
In the embodiment of the disclosure, by calculating the color correction gain matrix of each image block and obtaining the color correction gain matrix of the target region, part of the regions in the image to be processed can be controlled independently, so that the limitation in the related art that only overall processing can be performed is avoided, the flexibility and pertinence of image processing are improved, and the image quality is also improved.
In some scenes, the overall color of the original image captured by the camera is not vivid enough, while the user desires to capture images of different scenes well, such as green plants, grass, flowers, blue sky, clouds, beaches, buildings, and animals. Therefore, a specific transformation can be applied to the hue and saturation of certain colors; for example, when the scene is green, the green is expected to be deeper while other colors remain unchanged, and color mapping can be performed through a 2D/3D LUT. Color mapping refers to mapping the colors of the original image or video image according to an adjusted color mapping table; the color mapping table is a three-dimensional lookup table, the three IN_RGB color components are input, and the 3D LUT is looked up to directly obtain the three corresponding OUT_RGB color components. In the disclosed embodiment, the color sensor array may also be associated with the 2D/3D LUT (2D/3D lookup table) to improve the capability of the 2D/3D LUT. It should be noted that the window size of the 2D/3D lookup table module should be equal to, or slightly larger or smaller than, the color sensor multi-window area. In addition, the manner of dividing the image blocks and the number of divided image blocks are substantially the same.
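A minimal 3D LUT lookup sketch follows; it uses nearest-node lookup on a small identity cube, whereas practical implementations normally interpolate between LUT nodes, and the LUT size and contents here are placeholders:

```python
import numpy as np

LUT_SIZE = 17                                    # 17 x 17 x 17 nodes (a common 3D LUT size)
# Identity LUT as a placeholder: OUT_RGB equals IN_RGB at every node.
lut = np.stack(np.meshgrid(*([np.linspace(0, 255, LUT_SIZE)] * 3),
                           indexing="ij"), axis=-1)

def map_color_3dlut(in_rgb, lut=lut):
    """Map IN_RGB to OUT_RGB by indexing the 3D lookup table at the nearest node."""
    idx = np.round(np.asarray(in_rgb, dtype=float) / 255.0 * (LUT_SIZE - 1)).astype(int)
    return lut[idx[0], idx[1], idx[2]]

# With the identity LUT this returns roughly the input color; deepening greens would
# be done by editing the green-region nodes of the table.
out_rgb = map_color_3dlut([32, 180, 64])
```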
Based on this, the method further comprises: acquiring target color information of a target area in the image to be processed; and mapping the target color information according to the color difference information of the target area and a mapping mode corresponding to the area type. The target area may be an area that needs to be focused, such as a human face, a tree area, and the like, and the target area may be the same as or different from the target area of the color correction module, and is specifically determined according to actual processing requirements. It should be noted that, the target color information may be mapped, and the original color information in the image to be processed may also be mapped, which is not limited herein. That is, the 2D/3D LUT module may be executed concurrently with the color correction module or may be executed after the color correction module.
In some embodiments, the target color information of the image block of the target area may be color mapped in combination with the local color information acquired by the color sensor array and the reference color information acquired by the look-up table module to adjust the target color information to the required corrective color information. Or the original color information of the image block of the target area may be mapped. The color mapping may be used to implement color correction to adjust color parameters, such as adjusting the values of the RGB three color sub-pixels. In the embodiment of the present disclosure, color mapping may be performed by combining the parameter mapping relationships of the color sensor array and the lookup table module. Specifically, the color mapping performed in combination with the local color information acquired by the color sensor array and the reference color information acquired by the lookup table module may be performed by determining a mapping manner in combination with a color threshold parameter and a region type of the target region to perform color mapping. The color threshold parameter may be a first color threshold and a second color threshold.
Exemplarily, the local color information may be compared with the reference color information to determine color difference information; and selecting a mapping mode corresponding to the comparison result to acquire the corrected color information according to the comparison result of the color difference information and the color threshold parameter.
In the embodiment of the present disclosure, when the area type of an image block included in the target area is determined to be the target type, the corrected color information may be determined in different mapping manners that combine the local color information and the reference color information, according to the comparison result between the color difference information and the color threshold parameters.
In some embodiments, it is first determined whether the area type of the image block included in the target area is the target type. The target type refers to an area type whose color needs to be adjusted, and may be determined specifically according to an object type included in the image block, color information of the image block, or whether a user operation is received. For example, if the object type included in the image block is human face skin color, blue sky or green plant, etc., it may be determined that the area type of the image block belongs to the target type.
And if the color difference information is smaller than a first color threshold, mapping the target color information or the color information of the image block by acquiring a parameter mapping relation of reference color information to acquire the corrected color information. It can also be considered that the reference color information of the image block is directly determined as the correction color information. In this step, the parameter mapping relationship may be a look-up table 2D/3D LUT, such as a color mapping table. For example, the target color information of the image block included in the target area may be input to the color mapping table, and the corrected color information corresponding to the color mapping table is output, where the color enhancement intensity may be an intensity set by the lookup table itself, or may be adjusted according to an actual requirement. For example, the current three RGB color components of a certain pixel point in an image block may be input into the color mapping table, and then the corresponding three RGB color mapping components are output through the color mapping table. When color mapping is performed through the mapping table, since the image block can contain a plurality of pixel points, each pixel point can be adjusted according to the corresponding color mapping table, and corrected color information output after color adjustment is obtained.
And if the color difference information is larger than the first color threshold and smaller than the second color threshold, fusing the local color information and the reference color information to determine the corrected color information after color mapping is performed on the target color information. In this step, the local color information and the reference color information may be weighted and fused according to the corresponding weights, so as to obtain the corrected color information. As the color difference information increases, the weight of the local color information gradually increases, and the weight of the reference color information gradually decreases.
And if the color difference information is larger than the second color threshold, the corrected color information is determined according to the local color information. A color difference larger than the second color threshold indicates that the mapped color information may be inaccurate, and in this case the local color information output by the color sensor array is used as the corrected color information, that is, the weight of the local color information is relatively large.
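The three mapping cases can be summarized in the sketch below; the Euclidean color-difference metric, the linear blend weight between the two color thresholds, and all names and values are illustrative assumptions:

```python
import numpy as np

def corrected_color(local_rgb, reference_rgb, th1: float, th2: float):
    """Choose the corrected color according to the difference between the local
    color information (color sensor array) and the reference color information
    (lookup-table module), for an image block whose area type is the target type."""
    local_rgb = np.asarray(local_rgb, dtype=float)
    reference_rgb = np.asarray(reference_rgb, dtype=float)
    diff = np.linalg.norm(local_rgb - reference_rgb)   # assumed color-difference metric
    if diff < th1:
        return reference_rgb                  # trust the lookup-table mapping
    if diff < th2:
        w_local = (diff - th1) / (th2 - th1)  # local weight grows with the difference
        return w_local * local_rgb + (1.0 - w_local) * reference_rgb
    return local_rgb                          # mapped color deemed unreliable

rgb = corrected_color([40, 200, 60], [48, 188, 70], th1=5.0, th2=30.0)
```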
Further, the corrected color information obtained after the respective color mapping is obtained for each image block may be smoothly transitioned. The smooth transition can be low-pass smooth filtering, so that the transition between different image blocks in a space domain is smooth, the situation that the color effect is suddenly changed between the image blocks is avoided, the smooth transition can be realized, and the image quality is improved.
After obtaining the corrected color information of each image block, a mapping manner corresponding to the comparison result may be selected to map the target color information of each image block into the corrected color information, according to the color temperature and color difference information of the image blocks included in the target area and their comparison with the color threshold parameters. Specifically, the pixel information of each image block included in the target area may be mapped. On this basis, the corrected color information in the target region may be determined as first color information rgb1 and the corrected color information of the reference region as second color information rgb2, the first color information being different from the second color information. The corrected color information in the target region may be adjusted according to a second adjustment parameter to update the first color information, while the second color information of the reference region is kept unchanged. The second adjustment parameter may include the image blocks to be adjusted and the degree of adjustment. During adjustment, all or part of the image blocks that need to be adjusted can be adjusted by the required amplitude, so that the first color information of the target region is adjusted; the first color information of the target region is thus accurately and independently controlled, and flexibility is improved. In the embodiment of the present disclosure, to avoid abrupt changes between the corrected color information of different regions, a radial transition is performed from the first color information of the target region to the second color information of the reference region, with the central region of the first color information of the target region as the radial origin, as shown in fig. 9. The transition between the two can use a series of weight transition manners such as a smooth-curve radial weight distribution or a linear distribution, which is not limited here.
Based on this, a target area can be determined through the mask information, and then the mapping mode of the target area is determined according to the image blocks contained in the target area, so that the target color information of each image block in the target area is mapped to obtain the corrected color information, the dimension for distinguishing the image content can be provided, the targeted processing of partial areas such as the target area is realized, the richness and flexibility of color expression are improved, and the image quality is improved.
Fig. 10 schematically shows a block diagram of an image signal processor, and referring to fig. 10, the image signal processor may include a color sensor array 1001, a grid 1002, an algorithm module 1003, and may further include a color correction module 1004, a tone map 1005, a 2D/3D LUT 1006, a segmentation module 1007, and a sensor 1008.
Referring to fig. 10, the grid information obtained by the color sensor array is associated with the grid obtained by the color correction module and the 2D/3D LUT so that the divided image blocks substantially coincide. The local color temperature information output by the color sensor array through the algorithm module is input to the color correction module, and the local color information output by the color sensor array through the algorithm module is input to the 2D/3D LUT. The color correction module and the 2D/3D LUT are typically located in the second stage of the ISP processing flow, the RGB domain stage. The segmentation module is connected with the color correction module, the 2D/3D LUT and the sensor.
In the embodiment of the disclosure, the local color temperature information and the local color information are acquired by introducing the color sensor array, so that the color adjustment capability of the color correction module and the 2D/3D LUT module can be improved by the color sensor array, and the accuracy of the target color information can be improved.
In summary, according to the technical scheme in the embodiment of the present disclosure, after the segmentation module is introduced, the target region may be obtained in the image to be processed through the mask information, and then local color correction and color mapping processing may be applied to the image content of the local region represented by the target region in the image to be processed, so that the limitation in the related art that only overall processing can be performed is avoided, and the flexibility and pertinence of the image processing are improved; the accuracy of color restoration of the image can be effectively improved, and the image quality of the target image is improved.
An embodiment of the present disclosure provides an image processing apparatus, and referring to fig. 11, the image processing apparatus 1100 may include:
the image blocking module 1101 is configured to acquire an image to be processed, and block the image to be processed to obtain a plurality of image blocks;
the color temperature acquisition module 1102 is configured to determine target color temperature information of each image block according to color temperature information of each image block and local color temperature information of each image block determined by a color sensor array;
a gain matrix determining module 1103, configured to obtain, according to the target color temperature information, region color temperature information of a target region in the image to be processed, and obtain, based on the region color temperature information, a color correction gain matrix of the target region;
a color correction module 1104, configured to correct the color information of the target region according to the color correction gain matrix of the target region, and obtain the target color information of the target region, so as to generate a target image corresponding to the image to be processed.
In an exemplary embodiment of the present disclosure, the gain matrix determination module includes: and the region color temperature acquisition module is used for acquiring a target region from the image to be processed according to the mask information and determining the region color temperature information of the target region according to the target color temperature information of the image block contained in the target region.
In an exemplary embodiment of the present disclosure, the regional color temperature acquisition module includes: and the adjusting module is used for adjusting the target color temperature information of the image block contained in the target area according to the first adjusting parameter so as to adjust the area color temperature information of the target area.
In an exemplary embodiment of the present disclosure, the color temperature acquisition module includes: the difference calculation module is used for comparing the color temperature information with the local color temperature information to determine difference information; and the mode selection module is used for selecting different modes to acquire the target color temperature information according to the comparison result of the difference information and the threshold parameter.
In an exemplary embodiment of the present disclosure, the gain matrix determination module includes: and the interpolation module is used for interpolating the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array, and determining a color correction gain matrix of the target color temperature information of each image block so as to determine the color correction gain matrix in the target area.
In an exemplary embodiment of the present disclosure, the apparatus further includes: and the color mapping module is used for determining a mapping mode according to the color difference information of the target area and the area type and mapping the target color information into correction color information according to the mapping mode.
In an exemplary embodiment of the disclosure, the color mapping module is configured to: if the color difference information of the image block in the target area is smaller than a first color threshold and the area type is the target type, determining the corrected color information according to the parameter mapping relation of the acquired reference color information; if the color difference information is larger than a first color threshold and smaller than a second color threshold and the region type is a target type, fusing local color information and reference color information to obtain the correction color information; and if the color difference information is larger than a second color threshold value and the region type is a target type, determining the correction color information according to the local color information.
It should be noted that the specific details of each module of the image processing apparatus have been described in detail in the embodiments of the image processing method; for details not disclosed here, reference may be made to the method embodiments, and they are not repeated.
Exemplary embodiments of the present disclosure also provide an electronic device. The electronic device may be the terminal 101 described above. In general, the electronic device may include a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the above-mentioned image processing method via execution of the executable instructions.
The following takes the mobile terminal 1200 in fig. 12 as an example to illustrate the configuration of the electronic device. It will be appreciated by those skilled in the art that, apart from the components specifically intended for mobile purposes, the configuration in fig. 12 can also be applied to devices of a fixed type.
As shown in fig. 12, the mobile terminal 1200 may specifically include: a processor 1201, a memory 1202, a bus 1203, a mobile communication module 1204, an antenna 1, a wireless communication module 1205, an antenna 2, a display 1206, a camera module 1207, an audio module 1208, a power module 1209, and a sensor module 1210.
The processor 1201 may include one or more processing units, for example an AP (Application Processor), a modem processor, a GPU (Graphics Processing Unit), an ISP (Image Signal Processor), a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor, and/or an NPU (Neural-Network Processing Unit). The image processing method in the exemplary embodiments may be performed by the AP, the GPU, or the DSP; when the method involves neural-network-related processing, it may be performed by the NPU, which can, for example, load neural network parameters and execute neural-network-related algorithm instructions.
An encoder may encode (i.e., compress) an image or video to reduce its data size for storage or transmission, and a decoder may decode (i.e., decompress) the encoded data to recover the image or video. The mobile terminal 1200 may support one or more encoders and decoders, for example for image formats such as JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics), and BMP (Bitmap), and video formats such as MPEG (Moving Picture Experts Group)-1, MPEG-2, MPEG-4, H.263, H.264, and HEVC (High Efficiency Video Coding).
The processor 1201 may be connected to the memory 1202 or other component via the bus 1203.
The memory 1202 may be used to store computer-executable program code, which includes instructions. By executing the instructions stored in the memory 1202, the processor 1201 performs the various functional applications and data processing of the mobile terminal 1200. The memory 1202 may also store application data, such as image and video files.
The communication function of the mobile terminal 1200 may be implemented by the mobile communication module 1204, the antenna 1, the wireless communication module 1205, the antenna 2, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 1204 may provide a mobile communication solution of 3G, 4G, 5G, etc. applied to the mobile terminal 1200. The wireless communication module 1205 may provide a wireless communication solution for wireless local area network, bluetooth, near field communication, etc. applied to the mobile terminal 1200.
The display screen 1206 is used to implement display functions, such as displaying a user interface, images, videos, and the like. The camera module 1207 is used to perform a shooting function, such as shooting an image, a video, etc., and may include a color sensor array therein. The audio module 1208 is used to implement audio functions, such as playing audio, collecting voice, etc. The power module 1209 is used to implement power management functions, such as charging a battery, powering a device, monitoring a battery status, and so on. The sensor module 1210 may include one or more sensors for implementing corresponding sensing functions. For example, the sensor module 1210 may include an inertial sensor for detecting a motion pose of the mobile terminal 1200 and outputting inertial sensing data.
It should be noted that, in the embodiments of the present disclosure, a computer-readable storage medium is also provided, and the computer-readable storage medium may be included in the electronic device described in the foregoing embodiments; or may exist separately without being assembled into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer readable storage medium carries one or more programs which, when executed by such an electronic device, cause the electronic device to implement the method as described in the above embodiments.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims. It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An image processing method, comprising:
acquiring an image to be processed, and partitioning the image to be processed to obtain a plurality of image blocks;
determining target color temperature information of each image block according to the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array;
acquiring regional color temperature information of a target region in the image to be processed according to the target color temperature information, and acquiring a color correction gain matrix of the target region based on the regional color temperature information;
and correcting the color information of the target area according to the color correction gain matrix of the target area to acquire the target color information of the target area so as to generate a target image corresponding to the image to be processed.
2. The image processing method according to claim 1, wherein said obtaining region color temperature information of a target region in the image to be processed according to the target color temperature information comprises:
acquiring a target area from an image to be processed according to mask information, and determining the area color temperature information of the target area according to the target color temperature information of an image block contained in the target area.
3. The method according to claim 2, wherein the determining the region color temperature information of the target region according to target color temperature information of an image block included in the target region comprises:
and adjusting the target color temperature information of the image block contained in the target area according to the first adjustment parameter so as to adjust the area color temperature information of the target area.
4. The method of claim 1, wherein determining the target color temperature information of each image block according to the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array comprises:
comparing the color temperature information with the local color temperature information to determine difference information;
and selecting different modes to obtain the target color temperature information according to the comparison result of the difference information and the threshold parameter.
5. The image processing method according to claim 1, wherein the obtaining a color correction gain matrix of the target region based on the region color temperature information comprises:
and interpolating the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array, and determining a color correction gain matrix of the target color temperature information of each image block so as to determine the color correction gain matrix in the target area.
6. The image processing method according to claim 1, further comprising:
and determining a mapping mode according to the color difference information of the target area and the area type, and mapping the target color information into correction color information according to the mapping mode.
7. The image processing method according to claim 6, wherein the determining a mapping manner from the color difference information of the target region and the region type, and mapping the target color information to the corrected color information according to the mapping manner comprises:
if the color difference information of the image block in the target area is smaller than a first color threshold and the area type is the target type, determining the corrected color information according to the parameter mapping relation of the acquired reference color information;
if the color difference information is larger than a first color threshold and smaller than a second color threshold and the region type is a target type, fusing local color information and reference color information to obtain the corrected color information;
and if the color difference information is larger than the second color threshold and the region type is a target type, determining the corrected color information according to the local color information.
8. An image processing apparatus characterized by comprising:
the image blocking module is used for acquiring an image to be processed and blocking the image to be processed to obtain a plurality of image blocks;
the color temperature acquisition module is used for determining target color temperature information of each image block according to the color temperature information of each image block and the local color temperature information of each image block determined by the color sensor array;
the gain matrix determining module is used for acquiring the regional color temperature information of a target region in the image to be processed according to the target color temperature information and acquiring a color correction gain matrix of the target region based on the regional color temperature information;
and the color correction module is used for correcting the color information of the target area according to the color correction gain matrix of the target area to acquire the target color information of the target area so as to generate a target image corresponding to the image to be processed.
9. An electronic device, comprising:
an image module including a color sensor array;
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the image processing method of any one of claims 1-7 via execution of the executable instructions.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the image processing method of any one of claims 1 to 7.
CN202210934711.0A 2022-08-04 2022-08-04 Image processing method and device, electronic device and storage medium Pending CN115187487A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210934711.0A CN115187487A (en) 2022-08-04 2022-08-04 Image processing method and device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210934711.0A CN115187487A (en) 2022-08-04 2022-08-04 Image processing method and device, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN115187487A true CN115187487A (en) 2022-10-14

Family

ID=83520898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210934711.0A Pending CN115187487A (en) 2022-08-04 2022-08-04 Image processing method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN115187487A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116152361A (en) * 2023-04-20 2023-05-23 高视科技(苏州)股份有限公司 Method for estimating chromaticity, electronic device, and computer-readable storage medium

Similar Documents

Publication Publication Date Title
US11849224B2 (en) Global tone mapping
US10007967B2 (en) Temporal and spatial video noise reduction
US11317070B2 (en) Saturation management for luminance gains in image processing
KR20150099302A (en) Electronic device and control method of the same
WO2012108094A1 (en) Image processing device, image processing method, image processing program, and image pick-up device
KR102386385B1 (en) Electronic device and method for compressing image thereof
WO2024027287A9 (en) Image processing system and method, and computer-readable medium and electronic device
WO2019104047A1 (en) Global tone mapping
US10600170B2 (en) Method and device for producing a digital image
CN105578065A (en) Method for generating high-dynamic range image, photographing device and terminal
KR20120114899A (en) Image processing method and image processing apparatus
KR102285756B1 (en) Electronic system and image processing method
CN115187487A (en) Image processing method and device, electronic device and storage medium
WO2020133331A1 (en) Systems and methods for exposure control
CN115205159A (en) Image processing method and device, electronic device and storage medium
CN116614716A (en) Image processing method, image processing device, storage medium, and electronic apparatus
CN115239739A (en) Image processing method and device, electronic equipment and computer readable medium
CN115330633A (en) Image tone mapping method and device, electronic equipment and storage medium
CN115187488A (en) Image processing method and device, electronic device and storage medium
CN115278189A (en) Image tone mapping method and apparatus, computer readable medium and electronic device
CN112967194B (en) Target image generation method and device, computer readable medium and electronic equipment
JP2023090492A (en) Image processing device, image processing method, and imaging apparatus
WO2016200480A1 (en) Color filter array scaler
CN115314695A (en) Image white balance processing method and device, electronic equipment and storage medium
CN115278191B (en) Image white balance method and device, computer readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination