CN117979179A - Color correction method and system - Google Patents

Color correction method and system

Info

Publication number
CN117979179A
Authority
CN
China
Prior art keywords
matrix
color
sub
feature
calibrated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410059092.4A
Other languages
Chinese (zh)
Inventor
唐晓芳
李珂
隋庆成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Weijing Technology Co ltd
Original Assignee
Shanghai Weijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Weijing Technology Co ltd filed Critical Shanghai Weijing Technology Co ltd
Priority to CN202410059092.4A priority Critical patent/CN117979179A/en
Publication of CN117979179A publication Critical patent/CN117979179A/en
Pending legal-status Critical Current

Abstract

The application provides a color correction method and system, wherein the method comprises the following steps: obtaining sample colors, and adjusting calibration weights according to deviations of the sample colors from the corresponding memory colors; obtaining a theoretical optimal matrix for restoring the sample color according to the calibration weight; obtaining a color to be calibrated, obtaining a feature weight according to the features of the color to be calibrated, and calibrating the theoretical optimal matrix and a color temperature matrix with the feature weight to obtain a feature matrix; and fusing the feature matrix with the color temperature matrix to obtain an actual optimal matrix, and correcting the color to be calibrated with the actual optimal matrix. Based on the human eye's understanding of the memory colors of different scenes, the application optimizes the existing color correction matrix and adaptively adjusts its parameters. In plain terms, the application decides, according to the scene requirements, which colors are weaknesses to be sacrificed and which colors are to be emphasized, thereby making possible color styles that were otherwise unobtainable.

Description

Color correction method and system
Technical Field
The present application relates to the field of image correction, and more particularly, to a color correction method and system.
Background
The module in the ISP responsible for restoring the camera's color space to the standard color space is the color correction matrix (color correction matrix, CCM). Because picture color vividness differs greatly before and after correction, the CCM module has become an indispensable link in the field of image correction. How to optimize the color correction matrix so that the CCM module adapts better to different environmental conditions, and thus corrects image colors in a way that accords with human visual perception, is the core of the application.
Disclosure of Invention
The application aims to provide a color correction method and system that adaptively adjust the parameters in the color correction matrix according to the environmental characteristics of the actual image and the human eye's understanding of memory colors, so that the color-corrected image accords with common human cognition and is generally accepted.
The technical scheme provided by the invention is as follows:
A color correction method comprising the steps of: obtaining sample colors, and adjusting calibration weights according to deviations of the sample colors and the corresponding memory colors; obtaining a theoretical optimal matrix for restoring the sample color according to the calibration weight; obtaining a color to be calibrated, obtaining a characteristic weight according to the characteristic of the color to be calibrated, and calibrating a theoretical optimal matrix and a color temperature matrix by using the characteristic weight to obtain a characteristic matrix; and fusing the characteristic matrix with the color temperature matrix to obtain an actual optimal matrix, and correcting the color to be calibrated by using the actual optimal matrix.
The feature matrix in this color correction method incorporates the human eye's understanding of memory colors into the color correction process, and color features are extracted from the color distribution of the scene to control the direction of final color reproduction.
In some embodiments, when a theoretical optimal matrix for restoring the sample color is obtained according to the calibration weight, a plurality of sample colors are selected, a corresponding relationship between each sample color and its corresponding theoretical optimal matrix is established, and the corresponding relationship is stored.
These embodiments have the advantage that, through continuous accumulation, correspondences between more and more sample colors and their theoretical optimal matrices can be obtained and stored in a storage medium such as a database, which simplifies the subsequent calculation of the feature matrices of colors to be calibrated. Furthermore, these embodiments also provide theoretical support for adaptively adjusting the parameters in the color correction matrix.
In some embodiments, when obtaining the feature weight according to the features of the color to be calibrated, the number of sub-features of the color to be calibrated is first judged, and the sub-feature weight corresponding to each sub-feature is obtained; when the number of sub-features of the color to be calibrated is 1, the sub-feature weight is set to 1; when the number of sub-features of the color to be calibrated is larger than 1, the sub-feature weight corresponding to each sub-feature of the color to be calibrated is obtained, wherein the sub-feature weight is determined by the state of the corresponding sub-feature in the color to be calibrated, and the state includes: a proportion (area-ratio) value and a luminance value.
Further, when the feature weight is used for calibrating the theoretical optimal matrix and the color temperature matrix to obtain the feature matrix, the sub-feature matrix corresponding to each sub-feature weight is calculated according to the sub-feature weight obtained in the embodiment, and each sub-feature matrix is multiplied to obtain the feature matrix, wherein the calculation formula is as follows:
M_feat=I+α*(M_opt*M_CT^(-1)-I),
M_feat_i=I+α_i*(M_opt*M_CT^(-1)-I),
M_feat=Π_i M_feat_i
wherein M_feat represents a feature matrix, I represents an identity matrix, α represents a feature weight, M_opt represents a theoretical optimal matrix, M_CT represents a color temperature matrix, M_CT^(-1) represents the inverse matrix of the color temperature matrix, M_feat_i represents a sub-feature matrix, and α_i represents a sub-feature weight.
Typically, the color to be calibrated will contain more than one feature, such as blue sky, white clouds and beach in the same image; in this case, it is necessary to determine which sub-features are specifically contained in the color to be calibrated, then calculate the corresponding sub-feature weight and sub-feature matrix for each sub-feature constituting the color to be calibrated, and finally multiply all the sub-feature matrices to obtain the feature matrix of the color to be calibrated. When the color to be calibrated consists of only one feature, say only a blue sky in the image, the calculation formula above still applies, with only a single sub-feature weight retained.
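As an illustration of how the per-sub-feature matrices combine, here is a minimal numpy sketch; the 3x3 shapes, the weights, and the assumption that each sub-feature uses the theoretical optimal matrix stored for its own sample color are illustrative choices, not values taken from the application.

import numpy as np

def sub_feature_matrix(alpha_i, m_opt_i, m_ct):
    # M_feat_i = I + alpha_i * (M_opt * M_CT^(-1) - I)
    identity = np.eye(3)
    return identity + alpha_i * (m_opt_i @ np.linalg.inv(m_ct) - identity)

def feature_matrix(sub_weights, sub_opt_matrices, m_ct):
    # M_feat = product of the sub-feature matrices; with a single sub-feature
    # the loop runs once and the single-feature formula is recovered
    m_feat = np.eye(3)
    for alpha_i, m_opt_i in zip(sub_weights, sub_opt_matrices):
        m_feat = m_feat @ sub_feature_matrix(alpha_i, m_opt_i, m_ct)
    return m_feat

When only the blue-sky feature is present, sub_weights holds the single weight of 1 described above.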
Further, the calculation formula of the actual optimal matrix is:
M_t=M_feat*M_CT,
Where m_t represents the actual optimal matrix.
Further, when correcting the color to be calibrated by using the actual optimal matrix, fusing the actual optimal matrix with the saturation matrix to obtain a color correction matrix, and correcting the color to be calibrated, wherein the calculation formula is as follows:
M_final=M_sat*M_t,
wherein M_final represents the color correction matrix, M_sat represents the saturation matrix, and the saturation matrix is determined by the color saturation.
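To show how the matrices chain together, below is a minimal sketch of applying M_t=M_feat*M_CT and M_final=M_sat*M_t to image data; treating each pixel as an RGB column vector multiplied from the left is an assumption made for illustration only.

import numpy as np

def correct_colors(rgb, m_feat, m_ct, m_sat):
    # rgb: array of shape (N, 3) with one RGB triple per row
    m_t = m_feat @ m_ct        # actual optimal matrix
    m_final = m_sat @ m_t      # color correction matrix
    return rgb @ m_final.T     # apply M_final to every pixel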
The application also discloses a color correction system, which comprises: the acquisition module is used for acquiring the color of the sample; the calibration module is used for adjusting the calibration weight according to the deviation of the sample color and the corresponding memory color; the acquisition module is also used for acquiring the color to be calibrated; the processor is used for obtaining a theoretical optimal matrix for restoring the sample color according to the calibration weight obtained by the calibration module and obtaining a characteristic weight according to the characteristic of the color to be calibrated; the calibration module is also used for calibrating the theoretical optimal matrix and the color temperature matrix by using the characteristic weight obtained by the processor to obtain a characteristic matrix; the fusion module is used for fusing the characteristic matrix with the color temperature matrix according to the characteristic matrix obtained by the calibration module to obtain an actual optimal matrix; and the calibration module is also used for correcting the color to be calibrated by using the actual optimal matrix according to the actual optimal matrix of the fusion module.
Further, the processor is further configured to determine the number of sub-features of the color to be calibrated, and obtain corresponding sub-feature weights according to the determination result; when the number of the sub-features of the color to be calibrated is 1, the processor is used for setting the weight of the sub-features to be 1; when the number of the sub-features of the color to be calibrated is larger than 1, the processor is used for obtaining the sub-feature weight corresponding to each sub-feature of the color to be calibrated; the sub-feature weights are determined by the states of the corresponding sub-features in the color to be calibrated, wherein the states include: a proportion (area-ratio) value and a luminance value.
Further, the calibration module is further configured to calculate a sub-feature matrix corresponding to each sub-feature weight according to the sub-feature weights, multiply each sub-feature matrix to obtain a feature matrix, where the calculation formula is as follows:
M_feat=I+α*(M_opt*M_CT^(-1)-I),
M_feat_i=I+α_i*(M_opt*M_CT^(-1)-I),
M_feat=Π_i M_feat_i
wherein M_feat represents a feature matrix, I represents an identity matrix, α represents a feature weight, M_opt represents a theoretical optimal matrix, M_CT represents a color temperature matrix, M_CT^(-1) represents the inverse matrix of the color temperature matrix, M_feat_i represents a sub-feature matrix, and α_i represents a sub-feature weight.
Further, the fusion module is further configured to fuse the actual optimal matrix with the saturation matrix, obtain a color correction matrix, and transmit the color correction matrix to the calibration module, where a calculation formula is as follows:
M_t=M_feat*M_CT,
M_final=M_sat*M_t,
Wherein M_t represents the actual optimal matrix, M_final represents the color correction matrix, M_sat represents the saturation matrix, and the saturation matrix is determined by the color saturation.
Further, the calibration module is further configured to receive the color correction matrix from the fusion module, and correct the color to be calibrated using the color correction matrix.
Compared with the prior art, the method and the system disclosed by the application have at least the following beneficial effects:
(1) When the color correction matrix is calculated, besides taking the color saturation and the color temperature into consideration, the understanding of human eyes on the memory color is taken as an influence factor to be considered, so that the image processed by the color correction matrix is more accurate in correction of some key characteristics and can accord with the common cognition of people;
(2) When the color correction system has enough characteristic types (blue sky, white cloud, grassland, beach, skin color and the like), after each time a new color to be calibrated is acquired, the color correction can be adaptively performed according to the state of the characteristic in the color to be calibrated, so that the workload of personnel is reduced;
(3) The original ISP can be reused without modifying the existing ISP hardware algorithm; only the software algorithm needs to be modified, which reduces the implementation difficulty of the application;
(4) Every color correction matrix restores some colors well and others poorly; the application effectively decides, according to the scene requirements, which colors are weaknesses to be sacrificed and which colors are strengths to be emphasized, thereby enabling color styles that could not be obtained before.
Drawings
The drawings used in the description of the embodiments of the present application are briefly described below:
FIG. 1 is a schematic diagram of an embodiment of a color correction method according to the present application;
FIG. 2 is a schematic diagram of an embodiment of a color correction system according to the present application;
FIG. 3 is a schematic diagram of an embodiment of a color correction method according to the present application;
FIG. 4 is a schematic diagram of an embodiment of a color correction method according to the present application.
In the figure: 500-color correction system, 501-acquisition module, 502-calibration module, 503-processor, 504-fusion module.
Detailed Description
In order to more clearly describe the technical solution of the embodiment of the present application, a specific embodiment of the present application will be described below with reference to the accompanying drawings. The drawings described below are only examples of the present application, and it is apparent to those skilled in the art that other drawings and other embodiments can be made from these drawings without departing from the spirit of the present application.
For the sake of simplicity of the drawing, only the parts relevant to the corresponding embodiments are schematically represented in the figures, which do not represent their actual structure as a product. In addition, in order to simplify the drawing for understanding, components having the same structure or function are shown only in part schematically in some drawings, and more or fewer components having the same structure or function may actually be present.
In the present application, ordinal terms such as "first," "second," and the like, are used solely to distinguish between the associated objects and are not to be construed as indicating or implying a relative importance or order between such associated objects unless otherwise expressly specified and defined; in addition, the number of associated objects is not represented. "plurality" includes two or more, and the like. "/" is used to describe a relationship between associated objects, which represents an or relationship between associated objects. "and/or" is used to describe a relationship between associated objects that includes any combination of relationships between associated objects, e.g., "a and/or b" includes: "a alone", "b alone", or "a and b". "one or more" or "at least one" of the plurality of objects refers to any object or any combination of the plurality of objects, such as "one or more of a1, a2, a3" or "at least one of a1, a2, a3" includes: "individual a1", "individual a2", "individual a3", "a1 and a2", "a1 and a3", "a2 and a3", or "a1, a2 and a3".
In the prior art, the color correction matrix often cannot satisfy the accuracy of all colors, which results in inconsistent color rendering effects in different scenes. This is because the existing color correction matrix only considers the influence of color saturation and color temperature when calibrating the color to be calibrated. In the ISP pipeline, the final correction of the color signal on the board side is performed by a color correction matrix (CCM) obtained as the matrix product of a matrix determined by the current color temperature (color temperature, CT) and a matrix determined by the saturation input by the user. The formula is as follows: M=M_sat*M_CT, where M_sat represents the saturation matrix and M_CT represents the color temperature matrix.
Saturation generally varies with gain: Saturation=f_1(Gain); and the saturation matrix M_sat varies with saturation: M_sat=f_2(Saturation). Changing saturation with gain (that is, with ambient brightness) is consistent with the visual principle that the cone cells used by the human eye in bright environments are more sensitive to color, while color sensitivity drops in dim light.
In addition, M_CT is derived from calibration. The calibration process simulates the in-camera color processing on raw images of a standard calibration color chart shot under different color temperatures, compares the color difference between the color-corrected output signals and the standard chart, and iteratively reduces that color difference with an optimization algorithm, thereby obtaining a set of calibrated color correction matrices at different color temperatures. The final M_CT is then obtained by interpolating these calibrated matrices using the color temperature output by the AWB (Auto White Balance) algorithm. The formula is as follows: M_CT=f_3(CT, {M_CT0, M_CT1, M_CT2, …}).
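The text does not specify f_3; the sketch below assumes one common choice, piecewise-linear interpolation between the two calibrated matrices that bracket the AWB color temperature, with the calibration temperatures and matrices as placeholders.

import numpy as np

def interp_ct_matrix(ct, calibrated):
    # calibrated: dict mapping a calibration color temperature (K) to its 3x3 CCM (numpy array)
    temps = sorted(calibrated)
    ct = min(max(ct, temps[0]), temps[-1])     # clamp to the calibrated range
    lo = max(t for t in temps if t <= ct)
    hi = min(t for t in temps if t >= ct)
    if lo == hi:
        return calibrated[lo]
    w = (ct - lo) / (hi - lo)
    return (1 - w) * calibrated[lo] + w * calibrated[hi]

For example, interp_ct_matrix(5100, {2800: m_a, 5000: m_b, 6500: m_c}) blends the hypothetical calibrated matrices m_b and m_c.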
However, a color correction matrix obtained only by the above method has the following problems. First, the calibrated color correction matrix restores colors at the standard saturation, and operations such as increasing saturation can only follow the saturation-change rule specified by f_2, applied linearly in the calibrated RGB domain; yet the saturation behavior that matches human perception of the various hues is typically nonlinear and depends on the overall color appearance, so operating on saturation only through f_2 cannot meet people's requirements for saturation changes. Second, color correction matrix calibration algorithms generally cannot restore all standard colors exactly to the desired color coordinates; the maximum ΔE76 color error is typically between 4 and 5, which is the limit of the optimization algorithm under the linear mathematical model of the color correction matrix, and it may leave the hue, saturation and brightness of certain colors (e.g., skin tone, sky, grass) unsatisfactory. The inability to restore all colors accurately stems from the oversimplified mathematical model of the color correction matrix. Third, in different actual scenes, human vision pays different attention to different colors depending on the scene's color temperature, brightness and dominant colors, and a single calibrated color correction matrix cannot meet these color reproduction requirements.
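For reference, the ΔE76 mentioned above is the Euclidean distance between two CIELAB triples; a minimal sketch, assuming the Lab values come from an earlier RGB-to-Lab conversion step:

import numpy as np

def delta_e76(lab_ref, lab_test):
    # CIE76 color difference: straight-line distance in Lab space
    return float(np.linalg.norm(np.asarray(lab_ref, dtype=float) - np.asarray(lab_test, dtype=float)))

A maximum error of 4 to 5 therefore means the worst color patch sits 4 to 5 Lab units away from its target.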
By analyzing the problems of the existing color correction matrix, the application takes into account the characteristics of the human eye's response to color and extracts color features according to the color distribution of the scene to control the direction of final color reproduction. The human eye is more sensitive to large areas of color in a scene and to certain "memory colors" (e.g., skin tone, blue sky, vegetation, soil), which appear jarring when they deviate from the colors in human impression. Because the existing color correction matrix model cannot restore all colors perfectly, the application outputs, at calibration time, a color temperature matrix M_CT that is balanced under different color temperatures and a feature matrix M_feat that is optimized for various custom color patches. When the feature matrix and the color temperature matrix are fused, the emphasis on the designated colors is incorporated, and the strength of the CCM can be adjusted adaptively.
Note: when the individual features within the color to be calibrated are discussed separately, the concepts of feature, feature weight and feature matrix are referred to as sub-feature, sub-feature weight and sub-feature matrix, respectively, so that the two levels are not confused. This convention applies throughout the application.
In one embodiment, as shown in fig. 1, the color correction method provided by the application comprises the following steps:
S100, obtaining sample colors, and adjusting calibration weights according to deviations of the sample colors and the corresponding memory colors;
Generally, a sample color refers to a picture, i.e., a single frame of image; however, the color correction method of the present application can also be applied to video files, since a video is composed of multiple frames of images. As noted above, memory colors include the colors of scenes such as blue sky, white clouds, grassland and beach. In some embodiments, the sample color is a picture containing blue sky, but the blue sky in the sample is not as "blue" as humans remember it; the calibration weight is then adjusted until the blue sky in the sample is as "blue" as the blue sky in human cognition, and the calibration weight at that point is recorded.
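The application does not state a concrete update rule for the calibration weight; the sketch below is only one plausible illustration, in which a hypothetical render_with_weight() applies the current weight to the sample and the weight is increased until the mean of the memory-color region is close enough to the target memory color.

import numpy as np

def tune_calibration_weight(region_rgb, memory_rgb, render_with_weight,
                            step=0.05, tol=2.0, max_iter=100):
    # region_rgb: (N, 3) pixels of the memory-color region in the sample (e.g. the sky)
    # memory_rgb: the target color people remember; both names are illustrative
    weight = 0.0
    for _ in range(max_iter):
        rendered = render_with_weight(region_rgb, weight)   # assumed to return an (N, 3) array
        deviation = np.linalg.norm(rendered.mean(axis=0) - np.asarray(memory_rgb))
        if deviation <= tol:
            break
        weight += step
    return weight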
S200, obtaining a theoretical optimal matrix for restoring the sample color according to the calibration weight;
after the calibration weight is obtained, a theoretical optimal matrix is obtained according to the calibration weight, and after the sample color is fused with the theoretical optimal matrix, the required memory color can be restored.
S300, obtaining a color to be calibrated, obtaining a characteristic weight according to the characteristic of the color to be calibrated, and calibrating a theoretical optimal matrix and a color temperature matrix by using the characteristic weight to obtain a characteristic matrix;
A color to be calibrated may contain one or more features: for example, if the camera zooms in to photograph only the blue sky, the color to be calibrated contains only the blue-sky feature; when the camera zooms out, more than one feature, say blue sky, white clouds and beach, will be included in the color to be calibrated. For each feature in the color to be calibrated, the corresponding sub-feature weight needs to be calculated, and each obtained sub-feature weight is then used to calibrate the theoretical optimal matrix and the color temperature matrix to obtain the feature matrix or the sub-feature matrices.
S400, fusing the characteristic matrix and the color temperature matrix to obtain an actual optimal matrix, and correcting the color to be calibrated by using the actual optimal matrix.
After the feature matrix obtained by the method is fused with the color temperature matrix, the obtained actual optimal matrix contains the restoration of each feature in the color to be calibrated, and the specific restoration condition is determined by the feature weight, so that the purpose of correcting the color to be calibrated is achieved.
According to the color correction method disclosed by the embodiments of the application, when the color correction matrix is calculated, the human eye's understanding of memory colors is taken into account as an influence factor, so that correction of some key features is more accurate and better matches common human cognition. If the color correction system covers enough feature types (blue sky, white clouds, grassland, beach, skin tone and the like), then each time a new color to be calibrated is acquired, color correction can be performed adaptively according to the state of the features in that color, which reduces the manual workload. In addition, as can be seen from this embodiment, all of the improvements are algorithmic improvements at the software level of the ISP and the hardware structure is unchanged, so the original ISP can be reused and the implementation difficulty of the application is reduced. When calculating the feature weight of each sub-feature in the color to be calibrated, the state of each sub-feature is considered comprehensively, so that key features are given large weights and non-key features small weights; primary and secondary are thereby separated and the correction capacity is spent where it is most needed, making color styles that could not be obtained before possible.
In another embodiment, as shown in fig. 3, on the basis of S300 in the above embodiment, the method further includes:
S301, judging the number of sub-features of the color to be calibrated, and obtaining sub-feature weights corresponding to the sub-features;
When the number of the sub-features of the color to be calibrated is 1, obtaining a sub-feature weight of 1;
When the number of the sub-features of the color to be calibrated is larger than 1, the sub-feature weight corresponding to each sub-feature of the color to be calibrated is obtained, wherein the sub-feature weight is determined by the state of the corresponding sub-feature in the color to be calibrated, and the state includes: a proportion (area-ratio) value and a luminance value. For example, the blue-sky feature is counted as one of the sub-features only when the proportion of the current scene falling in the blue-sky color range reaches its threshold and the luminance threshold is also met. In some embodiments, the sub-feature weight corresponding to a sub-feature is obtained as follows: record the mean weight Mean; analyze the color histogram to obtain the number of peaks Num and the peak height Height; take the color temperature weight colorTem from the AWB statistics; convert the scene image into YUV space and record the UV distribution (Uvalue, Vvalue) to obtain the color weight uvStr; then normalize all of these quantities to a unified scale, so that the sub-feature weight of each sub-feature is given by the formula: α_i=(Mean_i+Num_i+Height_i+colorTem_i+uvStr_i)/5.
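A minimal sketch of the averaging in the formula above; how each statistic is normalized to the unified scale is not specified in the text, so the [0, 1] range assumed here is only a placeholder.

def sub_feature_weight(mean_w, num_w, height_w, color_tem_w, uv_str_w):
    # alpha_i = (Mean_i + Num_i + Height_i + colorTem_i + uvStr_i) / 5,
    # with every input already normalized to a common [0, 1] scale
    terms = (mean_w, num_w, height_w, color_tem_w, uv_str_w)
    assert all(0.0 <= t <= 1.0 for t in terms), "inputs must be pre-normalized"
    return sum(terms) / 5.0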
S302, calculating a sub-feature matrix corresponding to each sub-feature weight according to the sub-feature weights, and multiplying each sub-feature matrix to obtain a feature matrix;
the specific calculation formula of the steps is as follows:
M_feat*M_CT=α*M_opt+(1-α)*M_CT,
M_feat=I+α*(M_opt*M_CT^(-1)-I),
M_feat_i=I+α_i*(M_opt*M_CT^(-1)-I),
M_feat=Π_i M_feat_i
wherein M_feat represents a feature matrix, I represents an identity matrix, α represents a feature weight, M_opt represents a theoretical optimal matrix, M_CT represents a color temperature matrix, M_CT^(-1) represents the inverse matrix of the color temperature matrix, M_feat_i represents a sub-feature matrix, and α_i represents a sub-feature weight. Through these formulas, the correspondence among the parameters can be clearly obtained, so that the feature matrix of the color to be calibrated is obtained.
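The first formula above, M_feat*M_CT=α*M_opt+(1-α)*M_CT, is the blend that motivates the definition of M_feat; the algebra can be checked numerically with arbitrary matrices, as in this short sketch (the matrices are random placeholders, not calibration data):

import numpy as np

rng = np.random.default_rng(0)
alpha = 0.3
m_opt = rng.normal(size=(3, 3))
m_ct = rng.normal(size=(3, 3)) + 3.0 * np.eye(3)   # offset keeps M_CT comfortably invertible

m_feat = np.eye(3) + alpha * (m_opt @ np.linalg.inv(m_ct) - np.eye(3))
blend = alpha * m_opt + (1.0 - alpha) * m_ct

assert np.allclose(m_feat @ m_ct, blend)   # M_feat*M_CT equals the blend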
According to the method, when the color to be calibrated has a plurality of sub-features, the state of each sub-feature is analyzed one by one and different sub-feature weights are assigned according to those states; the result restores the color to be calibrated well while distinguishing primary from secondary: features that need emphasis are given large weights and unimportant features small weights, which satisfies customization requirements to a certain extent.
In another embodiment, as shown in fig. 1, 3 and 4, the method further includes, on the basis of the above embodiment:
S401, obtaining an actual optimal matrix, fusing the actual optimal matrix with a saturation matrix to obtain a color correction matrix, and correcting colors to be calibrated;
The calculation formula is as follows:
M_t=M_feat*M_CT,
M_final=M_sat*M_t,
Wherein M_t represents the actual optimal matrix, M_final represents the color correction matrix, M_sat represents the saturation matrix, and the saturation matrix is determined by the color saturation. After the actual optimal matrix is fused with the saturation matrix, the color to be calibrated can be calibrated better.
According to the method, the color correction matrix is obtained by fusing the actual optimal matrix with the saturation matrix. Although the actual optimal matrix already takes the various memory colors into consideration, so that color reproduction is more realistic and accords with human cognition, fusing in the saturation matrix lets the final color correction matrix achieve an even better effect. As noted above, saturation generally varies with gain, and the saturation matrix M_sat varies with saturation. Changing saturation with gain (ambient brightness) is consistent with the visual principle that the cone cells used by the human eye in bright environments are more sensitive to color, while color sensitivity drops in dim light. Therefore, this embodiment can provide a color correction matrix with a better correction effect for correcting the color to be calibrated. Meanwhile, the beneficial effects produced by the other embodiments described above are also produced by this embodiment and will not be repeated here.
Based on the same technical concept, the application also discloses a color correction system, as shown in fig. 2, the color correction system 500 comprises:
An obtaining module 501, configured to obtain a color of a sample;
Generally, a sample color refers to a picture, i.e., a frame of image; however, the color correction method of the present application can be applied to video files, considering that the video is formed by fusion of multiple frames of images.
The calibration module 502 is configured to adjust a calibration weight according to a deviation between a sample color and a corresponding memory color;
The calibration module 502 obtains the sample colors from the acquisition module 501. As noted above, memory colors include the colors of scenes such as blue sky, white clouds, grassland and beach. In some embodiments, the sample color is a picture containing blue sky, but the blue sky in the sample is not as "blue" as humans remember it; the calibration weight is then adjusted until the blue sky in the sample is as "blue" as the blue sky in human cognition, and the calibration weight at that point is recorded.
The obtaining module 501 is further configured to obtain a color to be calibrated.
A processor 503, configured to obtain a theoretical optimal matrix for restoring the color of the sample according to the calibration weight obtained by the calibration module 502, and obtain a feature weight according to the feature of the color to be calibrated;
For each feature in the color to be calibrated, the sub-feature weights corresponding to them need to be calculated.
The calibration module 502 is further configured to calibrate the theoretical optimal matrix and the color temperature matrix by using the feature weight obtained by the processor 503, so as to obtain a feature matrix;
The calibration module 502 uses each obtained sub-feature weight to calibrate the theoretical optimal matrix and the color temperature matrix, obtaining the feature matrix or the sub-feature matrices. A color to be calibrated may contain one or more features: for example, if the camera zooms in to photograph only the blue sky, the color to be calibrated contains only the blue-sky feature; when the camera zooms out, more than one feature, say blue sky, white clouds and beach, will be included. For each feature in the color to be calibrated, the corresponding sub-feature weight needs to be calculated.
The fusion module 504 is configured to fuse the feature matrix with the color temperature matrix according to the feature matrix obtained by the calibration module 502, so as to obtain an actual optimal matrix;
The calibration module 502 is further configured to correct the color to be calibrated according to the actual optimal matrix of the fusion module 504 using the actual optimal matrix.
The actual optimal matrix contains the restoration of each feature in the color to be calibrated, and the specific restoration condition is determined by the feature weight, so that the purpose of correcting the color to be calibrated is achieved.
According to the color correction system disclosed by the embodiments of the application, when the color correction matrix is calculated, the human eye's understanding of memory colors is taken into account as an influence factor, so that correction of some key features is more accurate and better matches common human cognition. If the color correction system covers enough feature types (blue sky, white clouds, grassland, beach, skin tone and the like), then each time a new color to be calibrated is acquired, color correction can be performed adaptively according to the state of the features in that color, which reduces the manual workload. In addition, as can be seen from this embodiment, all of the improvements are algorithmic improvements at the software level of the ISP and the hardware structure is unchanged, so the original ISP can be reused and the implementation difficulty of the application is reduced. When calculating the feature weight of each sub-feature in the color to be calibrated, the state of each sub-feature is considered comprehensively, so that key features are given large weights and non-key features small weights; primary and secondary are thereby separated and the correction capacity is spent where it is most needed, making color styles that could not be obtained before possible. This functionality relies mainly on the calibration module 502 and the processor 503 in the system, which obtain the feature matrix corresponding to the color to be calibrated through continuous calibration, thereby achieving the purpose of calibration.
In another embodiment of the system disclosed in the present application, as shown in FIG. 2, the processor is further configured to determine the number of sub-features of the color to be calibrated, and obtain corresponding sub-feature weights according to the result of the determination; when the number of the sub-features of the color to be calibrated is 1, the processor is used for setting the weight of the sub-features to be 1; when the number of the sub-features of the color to be calibrated is larger than 1, the processor is used for obtaining the sub-feature weight corresponding to each sub-feature of the color to be calibrated; the sub-feature weights are determined by the states of the corresponding sub-features in the color to be calibrated, wherein the states include: a proportion (area-ratio) value and a luminance value.
For example, the blue-sky feature is counted as one of the sub-features only when the proportion of the current scene falling in the blue-sky color range reaches its threshold and the luminance threshold is also met. In some embodiments, the sub-feature weight corresponding to a sub-feature is obtained as follows: record the mean weight Mean; analyze the color histogram to obtain the number of peaks Num and the peak height Height; take the color temperature weight colorTem from the AWB statistics; convert the scene image into YUV space and record the UV distribution (Uvalue, Vvalue) to obtain the color weight uvStr; then normalize all of these quantities to a unified scale, so that the sub-feature weight of each sub-feature is given by the formula: α_i=(Mean_i+Num_i+Height_i+colorTem_i+uvStr_i)/5. Overall, the value of α is particularly important: different values of α yield different M_feat, and adaptive control uses this to select a reasonable matrix. For example, a scene that is mainly a face is extremely sensitive to skin color, so the α weight corresponding to skin color needs to be increased; when shooting a blue-sky scene, the α weight corresponding to the blue sky needs to be increased.
In some embodiments, the calibration module is further configured to calculate a sub-feature matrix corresponding to each sub-feature weight according to the sub-feature weights, multiply each sub-feature matrix to obtain a feature matrix, where the calculation formula is:
M_feat=I+α*(M_opt*M_CT^(-1)-I),
M_feat_i=I+α_i*(M_opt*M_CT^(-1)-I),
M_feat=Π_i M_feat_i
wherein M_feat represents a feature matrix, I represents an identity matrix, α represents a feature weight, M_opt represents a theoretical optimal matrix, M_CT represents a color temperature matrix, M_CT^(-1) represents the inverse matrix of the color temperature matrix, M_feat_i represents a sub-feature matrix, and α_i represents a sub-feature weight.
On the basis of the above embodiment, the fusion module is further configured to fuse the actual optimal matrix with the saturation matrix, obtain a color correction matrix, and transmit the color correction matrix to the calibration module, where the calculation formula is as follows:
M_t=M_feat*M_CT,
M_final=M_sat*M_t,
Wherein M_t represents the actual optimal matrix, M_final represents the color correction matrix, M_sat represents the saturation matrix, and the saturation matrix is determined by the color saturation.
In other embodiments, the calibration module is further configured to receive a color correction matrix from the fusion module and correct the color to be calibrated using the color correction matrix.
According to the color correction system disclosed by the embodiments of the application, when the color correction matrix is calculated, the human eye's understanding of memory colors is taken into account as an influence factor, so that correction of some key features is more accurate and better matches common human cognition. If the color correction system covers enough feature types (blue sky, white clouds, grassland, beach, skin tone and the like), then each time a new color to be calibrated is acquired, color correction can be performed adaptively according to the state of the features in that color, which reduces the manual workload. In addition, as can be seen from this embodiment, all of the improvements are algorithmic improvements at the software level of the ISP and the hardware structure is unchanged, so the original ISP can be reused and the implementation difficulty of the application is reduced. When calculating the feature weight of each sub-feature in the color to be calibrated, the state of each sub-feature is considered comprehensively, so that key features are given large weights and non-key features small weights; primary and secondary are thereby separated and the correction capacity is spent where it is most needed, making color styles that could not be obtained before possible. After the actual optimal matrix is fused with the saturation matrix, the color correction matrix is obtained; although the actual optimal matrix already takes the various memory colors into consideration, so that color reproduction is more realistic and accords with human cognition, fusing in the saturation matrix lets the final color correction matrix achieve an even better effect. As noted above, saturation generally varies with gain, and the saturation matrix M_sat varies with saturation. Changing saturation with gain (ambient brightness) is consistent with the visual principle that the cone cells used by the human eye in bright environments are more sensitive to color, while color sensitivity drops in dim light. Therefore, this embodiment can provide a color correction matrix with a better correction effect for correcting the color to be calibrated.
In the foregoing embodiments, the descriptions of the embodiments are focused on, and the parts of a certain embodiment that are not described or depicted in detail may be referred to in the related descriptions of other embodiments. Furthermore, the above embodiments can be freely combined as needed.

Claims (10)

1. A color correction method, comprising the steps of:
Acquiring sample colors, and adjusting calibration weights according to deviations of the sample colors and the corresponding memory colors;
Obtaining a theoretical optimal matrix for restoring the sample color according to the calibration weight;
obtaining a color to be calibrated, obtaining a characteristic weight according to the characteristic of the color to be calibrated, and calibrating the theoretical optimal matrix and the color temperature matrix by using the characteristic weight to obtain a characteristic matrix;
And fusing the characteristic matrix with the color temperature matrix to obtain an actual optimal matrix, and correcting the color to be calibrated by using the actual optimal matrix.
2. The method of claim 1, wherein said deriving a theoretical optimal matrix for color reproduction of said sample based on said calibration weights, further comprises:
And obtaining a plurality of sample colors, establishing a corresponding relation between each sample color and a corresponding theoretical optimal matrix, and storing the corresponding relation.
3. The method for color correction according to claim 1, wherein the obtaining the feature weight according to the feature of the color to be calibrated specifically comprises:
Judging the number of the sub-features of the color to be calibrated, and obtaining the sub-feature weight corresponding to the sub-features;
When the number of the sub-features of the color to be calibrated is 1, obtaining the weight of the sub-features to be 1;
When the number of the sub-features of the color to be calibrated is greater than 1, a sub-feature weight corresponding to each sub-feature of the color to be calibrated is obtained, wherein the sub-feature weight is determined by a state of the corresponding sub-feature in the color to be calibrated, and the state comprises: a proportion (area-ratio) value and a luminance value.
4. A color correction method according to claim 3, wherein said calibrating the theoretical optimal matrix and the color temperature matrix using the feature weights to obtain a feature matrix comprises:
According to the sub-feature weights, calculating a sub-feature matrix corresponding to each sub-feature weight, multiplying each sub-feature matrix to obtain the feature matrix, wherein a calculation formula is as follows:
M_feat=I+α*(M_opt*M_CT^(-1)-I),
M_feat_i=I+α_i*(M_opt*M_CT^(-1)-I),
M_feat=Π_i M_feat_i
Wherein M_feat represents the feature matrix, I represents the identity matrix, α represents the feature weight, M_opt represents the theoretical optimal matrix, M_CT represents the color temperature matrix, M_CT^(-1) represents the inverse of the color temperature matrix, M_feat_i represents the sub-feature matrix, and α_i represents the sub-feature weight.
5. The method for color correction according to claim 4, wherein said fusing the feature matrix with the color temperature matrix to obtain an actual optimal matrix, and correcting the color to be calibrated using the actual optimal matrix, specifically comprises:
The actual optimal matrix is obtained, the actual optimal matrix is fused with a saturation matrix, a color correction matrix is obtained, the color to be calibrated is corrected, and a calculation formula is as follows:
M_t=M_feat*M_CT,
M_final=M_sat*M_t,
Wherein m_t represents the actual optimal matrix, m_final represents the color correction matrix, and m_sat represents the saturation matrix, which is determined by color saturation.
6. A color correction system, comprising:
the acquisition module is used for acquiring the color of the sample;
the calibration module is used for adjusting the calibration weight according to the deviation of the sample color and the corresponding memory color;
the acquisition module is also used for acquiring the color to be calibrated;
the processor is used for obtaining a theoretical optimal matrix for restoring the sample color according to the calibration weight obtained by the calibration module, and obtaining a characteristic weight according to the characteristic of the color to be calibrated;
the calibration module is further used for calibrating the theoretical optimal matrix and the color temperature matrix by using the characteristic weight obtained by the processor to obtain a characteristic matrix;
The fusion module is used for fusing the characteristic matrix with the color temperature matrix according to the characteristic matrix obtained by the calibration module to obtain an actual optimal matrix;
the calibration module is further configured to correct the color to be calibrated according to an actual optimal matrix of the fusion module by using the actual optimal matrix.
7. The color correction system of claim 6, wherein the processor is further configured to determine the number of sub-features of the color to be calibrated, and obtain corresponding sub-feature weights according to the determination result;
When the number of the sub-features of the color to be calibrated is 1, the processor is used for setting the weight of the sub-features to be 1;
When the number of the sub-features of the color to be calibrated is greater than 1, the processor is used for obtaining the sub-feature weight corresponding to each sub-feature of the color to be calibrated;
the sub-feature weights are determined by states of the corresponding sub-features in the color to be calibrated, wherein the states include: a proportion (area-ratio) value and a luminance value.
8. The color correction system of claim 7, wherein the calibration module is further configured to calculate a sub-feature matrix corresponding to each of the sub-feature weights according to the sub-feature weights, and multiply each of the sub-feature matrices to obtain the feature matrix, where a calculation formula is:
M_feat=I+α*(M_opt*M_CT^(-1)-I),
M_feat_i=I+α_i*(M_opt*M_CT^(-1)-I),
M_feat=Π_i M_feat_i
Wherein M_feat represents the feature matrix, I represents the identity matrix, α represents the feature weight, M_opt represents the theoretical optimal matrix, M_CT represents the color temperature matrix, M_CT^(-1) represents the inverse of the color temperature matrix, M_feat_i represents the sub-feature matrix, and α_i represents the sub-feature weight.
9. The color correction system according to claim 8, wherein the fusion module is further configured to fuse the actual optimal matrix with a saturation matrix to obtain a color correction matrix, and transmit the color correction matrix to the calibration module, where a calculation formula is:
M_t=M_feat*M_CT,
M_final=M_sat*M_t,
Wherein m_t represents the actual optimal matrix, m_final represents the color correction matrix, and m_sat represents the saturation matrix, which is determined by color saturation.
10. The color correction system of claim 9, wherein said calibration module is further configured to receive a color correction matrix from said fusion module, and to use said color correction matrix to correct said color to be calibrated.
CN202410059092.4A 2024-01-16 2024-01-16 Color correction method and system Pending CN117979179A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410059092.4A CN117979179A (en) 2024-01-16 2024-01-16 Color correction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410059092.4A CN117979179A (en) 2024-01-16 2024-01-16 Color correction method and system

Publications (1)

Publication Number Publication Date
CN117979179A true CN117979179A (en) 2024-05-03

Family

ID=90852827

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410059092.4A Pending CN117979179A (en) 2024-01-16 2024-01-16 Color correction method and system

Country Status (1)

Country Link
CN (1) CN117979179A (en)

Similar Documents

Publication Publication Date Title
US6947080B2 (en) Device for image processing, method of adjusting white-balance, and computer products
US6919924B1 (en) Image processing method and image processing apparatus
CN101179746B (en) Image processing apparatus, image processing method
CN111292246B (en) Image color correction method, storage medium, and endoscope
US20140240533A1 (en) Imaging device and image signal processor
CN112752023B (en) Image adjusting method and device, electronic equipment and storage medium
JP2000013626A (en) Image processing method, device and storage medium
US6850272B1 (en) Image processing method and system
CN104869380A (en) Image processing apparatus and image processing method
US7046400B2 (en) Adjusting the color, brightness, and tone scale of rendered digital images
JP2006203841A (en) Device for processing image, camera, device for outputting image, method for processing image, color-correction processing program and readable recording medium
KR20120016475A (en) Image processing method and image processing apparatus
US8639030B2 (en) Image processing using an adaptation rate
CN114500843A (en) Shooting method, shooting device, storage medium and electronic equipment
JP2019071568A (en) Image processing apparatus, image processing method, program, and storage medium
JP5132470B2 (en) Image processing device
US8164650B2 (en) Image processing apparatus and method thereof
US20070013714A1 (en) Simple and robust color saturation adjustment for digital images
CN117979179A (en) Color correction method and system
WO2022067761A1 (en) Image processing method and apparatus, capturing device, movable platform, and computer readable storage medium
JP2004364297A (en) Method and system for correcting color in image
JP2007267170A (en) Electronic camera with chroma saturation regulating function and image processing program
JP3539883B2 (en) Image processing method and apparatus, recording medium, imaging apparatus, and image reproducing apparatus
US11641525B2 (en) Image capturing apparatus capable of displaying live view image high in visibility, method of controlling image capturing apparatus, and storage medium
JP2009130630A (en) Color processor and method thereof

Legal Events

Date Code Title Description
PB01 Publication