CN114820863B - Intelligent color matching method and system based on color uniform coding - Google Patents
- Publication number: CN114820863B
- Application number: CN202210763964.6A
- Authority: CN (China)
- Legal status: Active
Classifications
- G06T11/001 — Texturing; Colouring; Generation of texture or colour
- G06N3/042 — Knowledge-based neural networks; Logical representations of neural networks
- G06N3/045 — Combinations of networks
- G06N3/08 — Learning methods
- G06N5/04 — Inference or reasoning models
- G06T7/90 — Determination of colour characteristics
- G06T2207/20081 — Training; Learning
- G06T2207/20084 — Artificial neural networks [ANN]
- Y02P90/30 — Computing systems specially adapted for manufacturing
Abstract
The invention relates to the technical field of image detection, and in particular to an intelligent color matching method and system based on color uniform coding. The method comprises the following steps: construct a color uniform encoder that takes the HSV channel values of each color as input and outputs a color uniform code; train the encoder: sum the products of each difference grade of a color group to be compared and its corresponding proportion to obtain the comprehensive difference degree of that group; supervise twin-network training with a loss function, and train the color uniform encoder through the twin network; obtain the loss function of the twin network from the comprehensive difference degree of the color groups to be compared and the difference degree between the color uniform codes of the colors in each group; obtain the color uniform codes of the raw material colors and the target color, input them into a color ratio inference network, and output the mixing ratio of each raw material. The invention obtains the proportions in which raw material pigments should be mixed to prepare various target colors, reduces the difficulty of blending colors, and helps improve the efficiency of manual color blending.
Description
Technical Field
The invention relates to the technical field of image detection, in particular to an intelligent color matching method and system based on color uniform coding.
Background
Most existing color matching methods are based on standard three primary colors, but pigments produced by different manufacturers differ to some extent in practice, so these methods cannot reliably determine mixing ratios from the actual raw material colors. Moreover, the human eye is not equally sensitive to all colors: different colors differ in how difficult they are to modulate and in the accuracy achievable, and colors with small differences require extra attention when blending pigments.
Existing color matching methods do not cope well with unevenly distributed colors and struggle to focus on colors with small differences, so manual color matching based on them is inefficient.
Disclosure of Invention
In order to solve the above technical problems, the present invention aims to provide an intelligent color matching method and system based on uniform color coding, and the adopted technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides an intelligent color matching method based on uniform color coding. The method comprises the following steps: constructing a color uniform encoder, inputting HSV channel values corresponding to all colors, and outputting color uniform codes;
the training process of the color uniform encoder comprises the following steps: randomly obtaining a preset number of colors to form color groups to be compared, artificially judging the difference grade of each color group to be compared, and counting the selected times of each difference grade; obtaining the proportion of the selected times of each difference grade of a color group to be compared to the sum of the selected times of all the difference grades; summing the products of the difference grades of the color group to be compared and the corresponding proportions to obtain the comprehensive difference degree of the color group to be compared, thereby obtaining the comprehensive difference degree of each color group to be compared; supervising twin network training by using a loss function, and training the color uniform encoder by using the twin network; and obtaining the loss function of the twin network according to the comprehensive difference degree of the color groups to be compared and the difference degree of the color uniform codes;
and obtaining the color uniform codes corresponding to the raw material colors and the target color, inputting the codes into a color ratio inference network, and outputting the mixing ratio of each raw material.
Preferably, randomly obtaining a predetermined number of colors to form color groups to be compared, and artificially judging the difference grade of each color group to be compared includes: generating a pure color contrast image by utilizing colors in a color group to be contrasted, wherein the image comprises a preset number of colors, and the occupied areas of the colors in the image are equal; and determining the difference grade of the color group to be compared based on the difference of colors in the pure color contrast image observed by human eyes.
Preferably, before obtaining the ratio of the selected number of the difference levels of the color groups to be compared to the sum of the selected numbers of all the difference levels, the method further comprises: recording the ratio of the selected times of the currently selected difference grade to the sum of the selected times of all the difference grades as a first ratio; recording the difference value between the currently selected difference level and the difference level with the maximum selected times as a first difference value; obtaining the abnormal degree of the current selected times of the difference grade by using the first ratio and the first difference; and setting an abnormal data removal threshold, and removing the current selected times of the difference grade with the abnormal degree greater than the abnormal data removal threshold.
Preferably, the color uniform encoder has a fully connected network structure, and the numbers of input neurons and output neurons are both equal to the number of HSV channels.
Preferably, before obtaining the loss function of the twin network according to the comprehensive difference degree and the color uniform coding difference degree of the color groups to be compared, the method further comprises: and obtaining the attention coefficient of the color group to be compared by utilizing the ratio of the comprehensive difference degree of the color group to be compared to the maximum difference grade, wherein the attention coefficient and the ratio are in a negative correlation relationship.
Preferably, the obtaining the color uniform coding difference degree comprises: and calculating the spatial distance of each color uniform code, wherein the ratio of the spatial distance of the color uniform codes to the maximum coding range of the color uniform coder is the color uniform code difference degree of the colors in the color group to be compared.
Preferably, the loss function of the twin network is:

e = (1/N) · Σ_{n=1}^N α_n (CY_n − BM_n)^2

wherein e represents the loss function; N represents the number of color groups to be compared; α_n represents the attention coefficient corresponding to the nth color group to be compared; CY_n represents the comprehensive difference degree of the colors in the nth color group to be compared; and BM_n represents the difference degree of the color uniform codes corresponding to the colors in the nth color group to be compared.
Preferably, the training process of the color ratio inference network is as follows: the color uniform codes of the raw material colors and of the target color are used as training data, the actual ratio of the raw materials when the target color was obtained is used as the label of the training data, and the labeled training data are used to train the color ratio inference network.
In a second aspect, another embodiment of the present invention provides an intelligent color matching system based on color uniformity coding. The system comprises: the color uniform encoder building module is used for building a color uniform encoder, inputting HSV channel values corresponding to all colors and outputting color uniform codes;
the color uniform encoder training module is used for training the color uniform encoder, and the training process is as follows: randomly obtaining a preset number of colors to form color groups to be compared, artificially judging the difference grade of each color group to be compared, and counting the selected times of each difference grade; obtaining the proportion of the selected times of each difference grade of a color group to be compared to the sum of the selected times of all the difference grades; summing the products of the difference grades of the color group to be compared and the corresponding proportions to obtain the comprehensive difference degree of the color group to be compared, thereby obtaining the comprehensive difference degree of each color group to be compared; supervising twin network training by using a loss function, and training the color uniform encoder by using the twin network; and obtaining the loss function of the twin network according to the comprehensive difference degree of the color groups to be compared and the difference degree of the color uniform codes;
and the color ratio obtaining module is used for obtaining the color uniform codes corresponding to the raw material colors and the target color, inputting them into the color ratio inference network, and outputting the mixing ratio of each raw material.
The color uniform encoder training module is further configured to supervise twin network training using a loss function, where the loss function is:

e = (1/N) · Σ_{n=1}^N α_n (CY_n − BM_n)^2

wherein e represents the loss function; N represents the number of color groups to be compared; α_n represents the attention coefficient corresponding to the nth color group to be compared; CY_n represents the comprehensive difference degree of the colors in the nth color group to be compared; and BM_n represents the difference degree of the color uniform codes corresponding to the colors in the nth color group to be compared.
The embodiment of the invention has at least the following beneficial effects: the invention uses the color uniform encoder to imitate the human eye and encode colors uniformly, so that the colors are distributed uniformly; while unifying colors according to human color perception, it pays particular attention to colors with small differences. After the uniform code of each color is obtained, the ratio inference network yields the mixing proportions of the raw material pigments needed to prepare various target colors, which reduces the difficulty of blending colors and helps improve blending efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart of an intelligent color matching method based on uniform color coding.
Fig. 2 is a block diagram of a uniform color encoder.
FIG. 3 is a diagram of a twin neural network.
Detailed Description
To further illustrate the technical means and effects adopted by the present invention to achieve its intended objects, the specific implementation, structure, features and effects of the intelligent color matching method and system based on uniform color coding according to the present invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of an intelligent color matching method and system based on uniform color coding in detail with reference to the accompanying drawings.
Example 1
The main application scenario of the invention is as follows: the color of the target pigment to be prepared is known, as are the colors of several pigments available as raw materials; the mixing ratio of those raw materials needed to prepare the target pigment is to be determined.
Referring to fig. 1, a flowchart of an intelligent color matching method based on color uniform coding according to an embodiment of the present invention is shown. The method includes the following steps:
the method comprises the following steps: and constructing a color uniform encoder, inputting HSV channel values corresponding to all colors, and outputting color uniform codes.
This embodiment requires a camera to capture an image of each pigment, i.e. of each existing usable color and of the target color to be prepared.
The pigment area is extracted from each acquired image to obtain the corresponding pure color, the color category is assigned, and the pure color is then converted from RGB space to HSV space to obtain the HSV coding information of each color. The colors obtained from the collected images include both the raw material pigment colors and the target color, so usable HSV (hue, saturation, value) coding information is obtained for both.
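As a minimal sketch of the color space conversion step, the RGB-to-HSV transformation can be performed with Python's standard colorsys module; the 8-bit input scaling and the degree/percent output units used here are assumptions, not values prescribed by the patent:

```python
import colorsys

def rgb_to_hsv8(r, g, b):
    """Convert 8-bit RGB channel values to HSV (hue in degrees, S and V in percent)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s * 100.0, v * 100.0

# Pure red maps to hue 0 with full saturation and value.
print(rgb_to_hsv8(255, 0, 0))  # (0.0, 100.0, 100.0)
```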
It is an object of the present invention to formulate pigments whose color is consistent with the target pigment, where "consistent" means consistent as observed by the human eye, not strictly identical in wavelength. The human eye has different sensitivity to different colors; that is, the range within which a color can change before the eye perceives the change differs from color to color. As long as a color varies within the eye's perception range for that color, the eye considers it unchanged. For example, for blue the eye may perceive a change after the HSV hue component shifts by 3 pixel values, whereas for green the hue component may need to shift by 5 pixel values before the change is perceived.
The prior art has continually constructed new color spaces in pursuit of a more uniform description of color distribution, for example from CIE 1931 XYZ to CIE 1964 and CIE 1976. The present invention instead trains a neural network on a large amount of training data so that it can reproduce the results of human visual judgment, encoding colors in accordance with human vision rather than deriving a uniform color space through three-dimensional coordinate transformations. Whereas a three-dimensional color space separates color intervals by linear means to distinguish colors of different degrees, a neural network can acquire color-discrimination knowledge in a high-dimensional, nonlinear manner.
The invention constructs a color uniform encoder that learns how the human eye distinguishes colors, so that known colors can be distributed in a uniform space according to the standard of the human eye. The uniform space of the invention refers to a space in which the color difference observed by the human eye is consistent with the spatial distance. Because the eye's sensitivity differs between colors, the degree of pixel change needed for the eye to perceive a change differs from color to color, i.e. it is not uniform. A color uniform space ensures that the color difference perceived by the human eye is consistent with the spatial distance, so that the color matching result accords with human observation.
A color uniform encoder is constructed. The coding network adopts a fully connected neural network structure: the number of input neurons is 3, corresponding to the values of the three HSV (hue, saturation, value) channels of a color, and the number of output neurons is likewise 3, representing the three-dimensional representation of the color in the uniform space. The structure of the color uniform encoder is shown in fig. 2, where H, S and V denote the values of the three HSV channels of a color, and U, V and W denote its three-dimensional representation in the uniform space, i.e. the color uniform code of the color. The color uniform encoder thus implements the conversion from the non-uniform color space (HSV) to the uniform color space (UVW). Once obtained, the uniform codes of the raw material colors and the target color represent the color information to be matched to obtain the target color; the color uniform encoder therefore needs to be trained, as described next.
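The encoder described above can be sketched as a small fully connected network with 3 inputs and 3 outputs; the hidden layer sizes and the ReLU nonlinearity below are illustrative assumptions, since the patent only fixes the input and output widths:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_encoder(sizes=(3, 16, 16, 3)):
    """Randomly initialise weights for a fully connected encoder (3 HSV inputs, 3 UVW outputs)."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def encode(params, hsv):
    """Forward pass: HSV channel values in, a three-dimensional uniform-space code out."""
    x = np.asarray(hsv, dtype=float)
    for i, (w, b) in enumerate(params):
        x = x @ w + b
        if i < len(params) - 1:   # ReLU on hidden layers only
            x = np.maximum(x, 0.0)
    return x

params = init_encoder()
print(encode(params, [210.0, 80.0, 90.0]).shape)  # (3,)
```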
Step two: training a color uniform encoder to learn the knowledge of color discrimination by human eyes; the training process of the color uniform coding comprises the following steps: randomly obtaining a preset number of colors to form color groups to be compared, artificially judging the difference grade of each color group to be compared, and counting the selected times of each difference grade; obtaining the proportion of the selected times of the difference grades of the color groups to be compared to the selected times of all the difference grades: summing the products of the difference grades of the color groups to be compared and the corresponding proportions to obtain the comprehensive difference degree of the color groups to be compared, and obtaining the comprehensive difference degree of each color group to be compared; monitoring twin network training by using a loss function, and training a color uniform encoder by using the twin network; and obtaining the loss function of the twin network according to the comprehensive difference degree of the color groups to be compared and the difference degree of the color uniform coding.
In HSV space, a color is obtained by randomly selecting the three channel components, and a preset number of such colors form a color group to be compared; preferably, two colors are selected to form each group. A corresponding pure color contrast image is generated from the colors to be compared, the two colors occupying equal areas and being distributed independently in the image. In this embodiment, the difference degree of the color groups to be compared must be obtained by observing the pure color contrast image, so the corresponding data are collected in question-and-answer form, i.e. the difference and the difference grade of the colors in each color group are judged by people. In this embodiment the difference grades of colors observed by the human eye are divided into 10 levels in the range [0, 9], and when the selected counts of the difference grades are collected, the participants give their observation judgments.
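The pure color contrast image described above can be generated in a few lines; splitting the canvas into equal left and right halves is one simple way to give the two colors equal areas (the patent does not fix the exact layout, so this arrangement is an assumption):

```python
import numpy as np

def contrast_image(color_a, color_b, size=200):
    """Build a solid-color contrast image with two colors in equal halves."""
    img = np.zeros((size, size, 3), dtype=np.uint8)
    img[:, : size // 2] = color_a   # left half
    img[:, size // 2 :] = color_b   # right half
    return img

img = contrast_image((30, 60, 200), (30, 80, 200))
print(img.shape)  # (200, 200, 3)
```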
Because the collected selected counts of the difference grades contain a certain amount of noise, the data must be denoised. The ratio of the selected count of the currently considered difference grade to the sum of the selected counts of all difference grades is obtained:

r_i = c_i / Σ_j c_j

and recorded as the first ratio, where c_i denotes the number of times the ith difference grade was selected and Σ_j c_j denotes the sum of the selected counts of all difference grades. The difference between the currently considered difference grade and the difference grade with the largest selected count is obtained:

d_i = |g_i − g_max|

and recorded as the first difference, where g_i denotes the ith, i.e. currently considered, difference grade and g_max denotes the difference grade with the largest selected count. The abnormality degree of the selected count of the current grade is then obtained from the first ratio and the first difference:
YC_i = (1 − r_i) · d_i

wherein YC_i denotes the abnormality degree of the ith selected count. The first factor of the abnormality formula is the proportion part: the smaller the share of the current grade's selected count in the sum of the selected counts of all difference grades, the higher the abnormality degree of the current selected count. The second factor is the distance part: the greater the distance between the grade of the current selected count and the grade with the largest selected count, the greater the abnormality degree of the current selected count.
An abnormal data removal threshold is set, and the selected counts of every difference grade whose abnormality degree exceeds the threshold are removed. After the abnormal data are removed, the comprehensive difference degree of the colors in each color group to be compared is calculated:
CY = Σ_{m=0}^9 m · p_m

wherein CY represents the comprehensive difference degree of the color group to be compared, m represents the difference grade, and p_m represents the ratio of the selected count of the mth difference grade to the sum of the selected counts of all difference grades. The comprehensive difference degree of every color group to be compared is obtained in the same way.
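The denoising and aggregation steps above can be sketched as follows; note that the multiplicative abnormality formula (1 − ratio) · distance is an assumed reading of the description, which only fixes the monotonic behaviour of its two parts, and the threshold value is left as a parameter:

```python
from collections import Counter

def comprehensive_difference(selections, threshold):
    """Drop abnormal grade counts, then return the comprehensive difference degree CY.

    `selections` is a list of difference grades (0-9) chosen by respondents.
    """
    counts = Counter(selections)
    total = sum(counts.values())
    g_max = max(counts, key=counts.get)              # grade with the largest count
    # Assumed abnormality degree: (1 - first ratio) * first difference.
    kept = {g: c for g, c in counts.items()
            if (1 - c / total) * abs(g - g_max) <= threshold}
    kept_total = sum(kept.values())
    return sum(g * c / kept_total for g, c in kept.items())

# Seven votes for grade 2 and three for grade 3: CY = 2*0.7 + 3*0.3 = 2.3
print(comprehensive_difference([2] * 7 + [3] * 3, threshold=2.0))
```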
After the differences between colors as observed by the human eye have been obtained, a neural network must be trained to simulate the observation effect of the human eye. The colors used are discrete samples of the color space; to obtain the mapping between HSV colors and their corresponding uniform codes, the underlying knowledge is learned by means of a neural network.
In this implementation a twin network is used to train the color uniform encoder. The structure of the twin network is shown in fig. 3, where HSV1 and HSV2 denote the three HSV channel values of the two colors in a color group to be compared; the weights of the left and right color uniform encoders in fig. 3 are kept synchronized and identical; UVW1 and UVW2 denote the uniform codes output by the two branches; and Loss denotes the loss function, which supervises the training of the twin network while measuring the difference between the two color uniform codes so as to ensure the uniformity of the coding.
The data used for training the twin network are three channel values of HSV corresponding to the colors in the color groups to be compared, and the label data are the comprehensive difference degree CY of the colors in the color groups to be compared.
The attention coefficient of each color group to be compared is obtained:

α_n = 1 − CY_n / 9

wherein α_n represents the attention coefficient of the nth color group to be compared, whose value range is [0, 1], CY_n represents the comprehensive difference degree of the nth input color group to be compared, and 9 is the maximum difference grade. That is, the more alike two colors are, i.e. the less difference the human eye observes between them, the more consistent their coding results need to be; in other words, the smaller the difference degree, the more attention the group must receive.
The loss function of an existing twin network only measures same-class and different-class data. In this embodiment, the network needs to adjust the coding difference after color uniform coding according to the obtained comprehensive difference degree of each color group to be compared, so the loss function is improved as follows:
e = (1/N) · Σ_{n=1}^N α_n (CY_n − BM_n)^2

wherein e is the loss value, which measures, in mean-square-error form, the consistency between the difference of the input data and the difference of the output data; N is the number of color groups to be compared in the same batch of the current batch processing; α_n is the attention coefficient of the nth color group to be compared; CY_n is the comprehensive difference degree of the nth input color group to be compared; and BM_n is the color uniform coding difference degree of the colors in the nth color group to be compared.
The calculation formula of the color uniform coding difference degree BM of the colors in the color group to be compared is as follows:
BM_n = D_n / L

wherein D_n represents the spatial distance between the color uniform codes of the colors in the nth color group to be compared, and L is the maximum length within the limited coding range of the current three-dimensional space. Wherein:
D_n = √((U_1 − U_2)^2 + (V_1 − V_2)^2 + (W_1 − W_2)^2)

wherein the color uniform code output by the left branch of the twin network is (U_1, V_1, W_1) and the color uniform code output by the right branch is (U_2, V_2, W_2).
Here U, V and W are each limited to the maximum coding range of the current coding space, which may be chosen according to practical requirements and is preferably 100 in this embodiment; that is, each color is uniformly coded within a 100 × 100 × 100 three-dimensional space.
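The coding difference degree and the improved loss can be sketched numerically as below; treating the space diagonal as the maximum length L and the linear weighting α_n = 1 − CY_n / 9 as the attention coefficient are assumed readings of the description:

```python
import numpy as np

RANGE = 100.0                # coding range per axis (a 100 x 100 x 100 space)
L_MAX = np.sqrt(3) * RANGE   # assumed maximum length: the space diagonal

def coding_difference(uvw1, uvw2):
    """BM: Euclidean distance between two uniform codes, normalised by L_MAX."""
    d = np.linalg.norm(np.asarray(uvw1, float) - np.asarray(uvw2, float))
    return d / L_MAX

def twin_loss(cy, bm, max_grade=9.0):
    """Attention-weighted mean-square loss over a batch of color groups."""
    cy, bm = np.asarray(cy, float), np.asarray(bm, float)
    alpha = 1.0 - cy / max_grade            # assumed attention coefficient
    return float(np.mean(alpha * (cy - bm) ** 2))

# Opposite corners of the coding space give the maximal difference degree 1.
print(round(coding_difference((0, 0, 0), (100, 100, 100)), 6))  # 1.0
```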
Thus the color uniform encoder is obtained: it takes the HSV channel values corresponding to a color as input and outputs that color's uniform code, a uniform UVW description, which ensures that the degree of difference between colors is uniform across the coding space.
Step three: obtain the color uniform codes corresponding to the raw material colors and the target color, input them into the color matching inference network, and output the proportion of each raw material.
The final purpose of the invention is to infer the mixing proportions of raw material colors that produce the target color, which is why a uniform coding space is needed: it makes color mixing simpler. Under non-uniform coding, the exact ratio (20, 30, 20) would have to be inferred precisely; under uniform coding, any ratio in the range (20-22, 28-32, 18-20) is indistinguishable from the target color to the human eye, which lowers the difficulty of inferring the proportions.
A corresponding color matching inference network is trained, with the following structure. The color matching inference network also adopts a fully connected structure. The number of input neurons is 3 × (T + 1) + 3 × M, where T is the total number of current raw material colors (for example, if 1 color is synthesized from 3 raw materials, then T = 3) and the added 1 represents the target color. M is the fault-tolerant blending count: the target color need not be obtained from the inferred ratio on the first attempt, and each blended color is then used as one of the raw materials. The value of M may be set manually before the neural network is trained; M = 3 is preferred in this embodiment. The number of output neurons is T + M, i.e., the proportion of each corresponding raw material. When there are not enough colors, the vacant raw material entries are padded with 0.
The input data is therefore of the form:

[(u_1, v_1, w_1), …, (u_T, v_T, w_T), (u_tar, v_tar, w_tar), (u'_1, v'_1, w'_1), …, (u'_M, v'_M, w'_M)]

and the output data is of the form:

[pb_1, pb_2, …, pb_(T+M)]

where pb is the mixing proportion of each raw material.
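As a sketch of how such an input vector can be assembled (the helper name build_input and the exact slot layout are illustrative assumptions; the vector length 3 × (T + 1) + 3 × M and the pad-vacant-slots-with-0 rule come from the description above):

```python
def build_input(raw_codes, target_code, blend_codes, T=3, M=3):
    """Flatten uniform (u, v, w) codes into one input vector of length
    3*(T+1) + 3*M; vacant raw-material or blend slots are padded with 0."""
    def pad(codes, count):
        rows = list(codes)[:count] + [(0.0, 0.0, 0.0)] * max(0, count - len(codes))
        return [channel for code in rows for channel in code]
    vec = pad(raw_codes, T) + list(target_code) + pad(blend_codes, M)
    assert len(vec) == 3 * (T + 1) + 3 * M
    return vec
```

With T = 3 and M = 3 this always yields a 21-element vector, regardless of how many raw materials or fault-tolerant blends have actually been used so far.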
For network training, color matching experiments are carried out in practice to obtain the ratios of raw material colors used when the target color is blended. The color codes of the raw material colors and the color code of the target color serve as training data, the raw material ratios serve as the labels of the training data, and the color ratio network is trained with this data. Since the network performs a regression task, a mean square error loss function is adopted.
After the color matching inference network is trained, the color uniform codes corresponding to the raw material colors and the target color are input, and the proportion of each raw material is output.
Example 2
This embodiment provides a system embodiment. An intelligent color matching system based on color uniform coding, the system comprising: a color uniform encoder building module, used for building a color uniform encoder that takes the HSV channel values corresponding to each color as input and outputs color uniform codes;

a color uniform encoder training module, used for training the color uniform encoder, the training process being as follows: randomly obtaining a preset number of colors to form color groups to be compared, manually judging the difference grade of each color group to be compared, and counting the number of times each difference grade is selected; obtaining the ratio of the selected count of each difference grade of a color group to be compared to the sum of the selected counts of all difference grades; summing the products of the difference grades of the color group to be compared and the corresponding ratios to obtain the comprehensive difference degree of that color group, thereby obtaining the comprehensive difference degree of each color group to be compared; supervising twin network training with a loss function, and training the color uniform encoder with the twin network; the loss function of the twin network being obtained from the comprehensive difference degrees of the color groups to be compared and the color uniform coding difference degrees;
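The weighted sum performed by the training module can be sketched as follows (a minimal illustration; the dictionary-of-vote-counts representation and the function name are assumptions, while the grade × ratio summation is taken from the description):

```python
def comprehensive_difference(grade_counts):
    """grade_counts maps a difference grade (a number) to how many times
    observers selected it for one color group. The comprehensive
    difference degree is the sum of each grade times its share of all
    selections, i.e. a vote-weighted average grade."""
    total = sum(grade_counts.values())
    return sum(grade * count / total for grade, count in grade_counts.items())
```

For example, if a group received ten votes for grade 1 and ten for grade 2, its comprehensive difference degree is 1.5, reflecting a human judgment halfway between the two grades.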
and a color matching obtaining module, used for obtaining the color uniform codes corresponding to the raw material colors and the target color, inputting them into the color matching inference network, and outputting the proportion of each raw material.
The color uniform encoder training module is further configured to supervise twin network training using a loss function, where the loss function is:

e = (1/N) · Σ_{n=1..N} β_n · (S_n − BM_n)²

where e represents the loss value; N represents the number of color groups to be compared; β_n represents the attention coefficient corresponding to the nth color group to be compared; S_n represents the comprehensive difference degree of the colors in the nth color group to be compared; and BM_n represents the color uniform coding difference degree corresponding to the colors in the nth color group to be compared.
It should be noted that the order of the above embodiments of the present invention is for description only and does not indicate their relative merits. Specific embodiments have been described above; other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (5)
1. An intelligent color matching method based on color uniform coding is characterized by comprising the following steps: constructing a color uniform encoder, inputting HSV channel values corresponding to all colors, and outputting color uniform codes;
wherein the training process of the color uniform encoder comprises: randomly obtaining a preset number of colors to form color groups to be compared, manually judging the difference grade of each color group to be compared, and counting the number of times each difference grade is selected; obtaining the ratio of the selected count of each difference grade of a color group to be compared to the sum of the selected counts of all difference grades; summing the products of the difference grades of the color group to be compared and the corresponding ratios to obtain the comprehensive difference degree of that color group, thereby obtaining the comprehensive difference degree of each color group to be compared; supervising twin network training with a loss function, and training the color uniform encoder with the twin network; obtaining the loss function of the twin network according to the comprehensive difference degrees of the color groups to be compared and the color uniform coding difference degrees; the loss function being:

e = (1/N) · Σ_{n=1..N} β_n · (S_n − BM_n)²

wherein e represents the loss value; N represents the number of color groups to be compared; β_n represents the attention coefficient corresponding to the nth color group to be compared; S_n represents the comprehensive difference degree of the colors in the nth color group to be compared; and BM_n represents the color uniform coding difference degree corresponding to the colors in the nth color group to be compared;
obtaining the color uniform codes corresponding to the raw material colors and the target color, inputting them into a color matching inference network, and outputting the proportion of each raw material; the color matching inference network has a fully connected network structure, the number of input neurons is 3 × (T + 1) + 3 × M, the value of T is 3, M is the fault-tolerant blending count, and the number of output neurons is T + M; the color codes of the raw material colors and the color code of the target color are used as training data, the raw material ratios actually used to obtain the target color are used as the labels of the training data, and the labeled training data is used to train the color ratio inference network; the color uniform encoder has a fully connected network structure, and its numbers of input neurons and output neurons are both equal to the number of HSV channels; the attention coefficient of each color group to be compared is as follows:
2. The intelligent color matching method based on color uniform coding according to claim 1, wherein the randomly obtained preset number of colors form color groups to be compared, and manually judging the difference grade of each color group to be compared comprises: generating a solid-color contrast image from the colors in a color group to be compared, wherein the image contains the preset number of colors and the areas occupied by the colors in the image are equal; and determining the difference grade of the color group to be compared based on the differences between the colors in the solid-color contrast image as observed by the human eye.
3. The intelligent color matching method based on color uniform coding according to claim 1, further comprising, before obtaining the ratio of the selected count of each difference grade of the color groups to be compared to the sum of the selected counts of all difference grades: recording the ratio of the selected count of the currently selected difference grade to the sum of the selected counts of all difference grades as a first ratio; recording the difference between the currently selected difference grade and the difference grade with the largest selected count as a first difference; obtaining the abnormality degree of the selected count of the current difference grade from the first ratio and the first difference; and setting an abnormal data removal threshold and discarding the selected counts of difference grades whose abnormality degree is greater than the abnormal data removal threshold.
4. The intelligent color matching method based on color uniform coding according to claim 1, wherein obtaining the color uniform coding difference degree comprises: calculating the spatial distance between the color uniform codes, wherein the ratio of this spatial distance to the maximum coding range of the color uniform encoder is the color uniform coding difference degree of the colors in the color group to be compared.
5. An intelligent color matching system based on uniform color coding, the system comprising: the color uniform encoder building module is used for building a color uniform encoder, inputting HSV channel values corresponding to all colors and outputting color uniform codes;
the color uniform encoder training module is used for training the color uniform encoder, the training process being as follows: randomly obtaining a preset number of colors to form color groups to be compared, manually judging the difference grade of each color group to be compared, and counting the number of times each difference grade is selected; obtaining the ratio of the selected count of each difference grade of a color group to be compared to the sum of the selected counts of all difference grades; summing the products of the difference grades of the color group to be compared and the corresponding ratios to obtain the comprehensive difference degree of that color group, thereby obtaining the comprehensive difference degree of each color group to be compared; supervising twin network training with a loss function, and training the color uniform encoder with the twin network; obtaining the loss function of the twin network according to the comprehensive difference degrees of the color groups to be compared and the color uniform coding difference degrees; the loss function being:

e = (1/N) · Σ_{n=1..N} β_n · (S_n − BM_n)²

wherein e represents the loss value; N represents the number of color groups to be compared; β_n represents the attention coefficient corresponding to the nth color group to be compared; S_n represents the comprehensive difference degree of the colors in the nth color group to be compared; and BM_n represents the color uniform coding difference degree corresponding to the colors in the nth color group to be compared; the color uniform encoder has a fully connected network structure, and its numbers of input neurons and output neurons are both equal to the number of HSV channels;

wherein β_n represents the attention coefficient of the nth color group to be compared, and S_n represents the comprehensive difference degree of the nth color group to be compared;
a color matching obtaining module, used for obtaining the color uniform codes corresponding to the raw material colors and the target color, inputting them into the color matching inference network, and outputting the proportion of each raw material; the color matching inference network has a fully connected network structure, the number of input neurons is 3 × (T + 1) + 3 × M, the value of T is 3, M is the fault-tolerant blending count, and the number of output neurons is T + M; the color codes of the raw material colors and the color code of the target color are used as training data, the raw material ratios actually used to obtain the target color are used as the labels of the training data, and the labeled training data is used to train the color ratio inference network.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210763964.6A CN114820863B (en) | 2022-07-01 | 2022-07-01 | Intelligent color matching method and system based on color uniform coding |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114820863A CN114820863A (en) | 2022-07-29 |
CN114820863B true CN114820863B (en) | 2022-09-09 |
Family
ID=82522393
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210763964.6A Active CN114820863B (en) | 2022-07-01 | 2022-07-01 | Intelligent color matching method and system based on color uniform coding |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114820863B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103167289A (en) * | 2013-03-06 | 2013-06-19 | 硅谷数模半导体(北京)有限公司 | Method and device for coding and decoding image |
CN108596984A (en) * | 2018-03-21 | 2018-09-28 | 李荣陆 | A kind of Automatic color matching device generated based on neural network |
CN112991493A (en) * | 2021-04-09 | 2021-06-18 | 华南理工大学 | Gray level image coloring method based on VAE-GAN and mixed density network |
CN113129390A (en) * | 2020-01-10 | 2021-07-16 | 山东工商学院 | Color blindness image re-coloring method and system based on joint significance |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113822830B (en) * | 2021-08-30 | 2023-06-06 | 天津大学 | Multi-exposure image fusion method based on depth perception enhancement |
CN113947640A (en) * | 2021-10-12 | 2022-01-18 | 华东师范大学 | Image-driven visual harmonious color matching generation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||