CN114972328B - Machine vision-based protective glove dyeing mixing control method and system - Google Patents


Info

Publication number
CN114972328B
CN114972328B (application CN202210818925.1A)
Authority
CN
China
Prior art keywords
entropy
overlapping
time
mixing
dyeing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210818925.1A
Other languages
Chinese (zh)
Other versions
CN114972328A (en)
Inventor
蔡康连
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Ruisida Security Protection Products Co ltd
Original Assignee
Jiangsu Ruisida Security Protection Products Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Ruisida Security Protection Products Co ltd filed Critical Jiangsu Ruisida Security Protection Products Co ltd
Priority to CN202210818925.1A priority Critical patent/CN114972328B/en
Publication of CN114972328A publication Critical patent/CN114972328A/en
Application granted granted Critical
Publication of CN114972328B publication Critical patent/CN114972328B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Abstract

The invention relates to the technical field of machine vision, in particular to a machine vision-based protective glove dyeing mixing control method and system, wherein the control method comprises the following steps: acquiring initial images of continuous multi-frame mixed pigment, wherein the mixed pigment is used for dyeing protective gloves; converting each frame of the initial image into a CMY image; acquiring the overlapping entropy of each frame of CMY image to obtain a time series composed of the overlapping entropies of the initial images; sampling the time series with time windows to obtain the multi-scale entropy of each time window, forming a curve of the multi-scale entropy changing over time; calculating the uniform mixing moment of the mixed pigment according to the fluctuation degree of the curve; and controlling the corresponding mixer to stop according to the uniform mixing moment. This solves the problem that mixing uniformity is judged differently at different resolutions, thereby avoiding judgment errors.

Description

Machine vision-based protective glove dyeing mixing control method and system
Technical Field
The invention relates to the technical field of machine vision, in particular to a method and a system for controlling dyeing and mixing of protective gloves based on machine vision.
Background
The nature of protective glove dyeing is color matching based on the three primary colors of pigment: magenta, yellow, and cyan, which are combined to produce the desired color. The most important step in dyeing is to pre-mix the dry color powder of the pigment thoroughly with the other functional powders, and then mix the pre-mixed powder with the plastic to meet the color-matching process requirement, thereby avoiding, to a certain extent, uneven coloring of the plastic.
In the prior art, since the powder particles are very fine, for a uniformly mixed mixture the pixel value of each pixel point in an image is a pixel of the mixture; that is, each pixel point presents the color of the mixture formed by a large number of uniformly mixed powder particles. Therefore, the effect of dry-toner mixing is generally characterized by the color difference of a picture of the mixture taken after sufficient mixing.
In practice, the inventors found that the above prior art has the following disadvantages:
the dry-toner mixing process actually mixes three primary-color pigments in different proportions to obtain different colors, and the dry toners cannot fuse with one another. When photographed at low resolution, the toners appear in the color of a fused toner: if the dry toners are mixed uniformly, the image shows the target color, but if they are mixed unevenly, a large number of primary-color regions appear. At higher resolution, the color obtained after toner fusion is not presented; a single pixel point shows only the color of a single primary-color toner rather than a fused color. Consequently, the judgment of mixing uniformity differs at different resolutions.
Disclosure of Invention
In order to solve the technical problems, the invention aims to provide a method and a system for controlling dyeing and mixing of protective gloves based on machine vision, and the adopted technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides a machine vision-based protective glove dyeing and mixing control method, which is characterized in that the control method includes:
acquiring initial images of continuous multiframe mixed pigments, wherein the mixed pigments are used for dyeing protective gloves; converting each frame of the initial image into a CMY image;
acquiring the overlapping entropy of each frame of the CMY images to obtain a time sequence consisting of the overlapping entropy of the initial images of each frame; sampling the time sequence by utilizing time windows to obtain the multi-scale entropy of each time window, and forming a change curve of the multi-scale entropy changing along with time; calculating the corresponding uniform mixing moment of the mixed pigment according to the fluctuation degree of the change curve;
controlling a corresponding mixer to stop according to the uniform mixing moment;
wherein the step of obtaining the overlapping entropy is as follows: taking each pixel point in the CMY image as a central pixel point, and acquiring a vector formed by the channel proportions of the central pixel point in each channel; and calculating the information entropy of a binary group consisting of the vector corresponding to the central pixel point and the mean vector corresponding to the neighborhood of the central pixel point, wherein the information entropy of the binary group is the overlapping entropy.
Further, the step of calculating the uniform mixing moment corresponding to the mixed pigment according to the fluctuation degree of the change curve comprises: obtaining a multi-scale entropy sequence arranged by frame number according to the time-series positions of the time windows, and fitting the multi-scale entropy sequence into a curve; the time of the sliding-window center corresponding to the minimum value of the curve is the uniform mixing moment.
Further, after the step of controlling the corresponding mixer to stop according to the uniform mixing moment, the method further comprises: acquiring an image of the mixture after shutdown and calculating its overlapping entropy; comparing the overlapping entropy of the image after shutdown with the overlapping entropy at the uniform mixing moment to obtain the difference in overlapping entropy before and after shutdown; and when the difference is greater than a threshold, continuing mixing.
Further, the step of obtaining the difference in overlapping entropy before and after shutdown comprises: recording the overlapping entropy at the uniform mixing moment as H1 and the overlapping entropy of the image acquired after shutdown as H2; the difference ratio is then

r = |H2 − H1| / max(H1, H2)
further, the step of obtaining a vector formed by the channel occupation ratios of the center pixel points in each channel includes: the channel occupancy for each channel is calculated by dividing the color value of the corresponding channel by the sum of the color values of its three channels.
Further, the step of obtaining the information entropy of the binary group as the overlapping entropy includes: acquiring the probability of the binary group appearing in the CMY image, and calculating the information entropy of the CMY image according to the probability.
Further, the step of acquiring the probability of the binary group appearing in the CMY image includes: obtaining the ratio of the frequency of occurrence of the binary group to the total number of occurrences of all binary groups in the CMY image.
In a second aspect, another embodiment of the present invention provides a machine vision-based protective glove dyeing and blending control system, which includes a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to implement the steps of the method according to any one of the above-mentioned items.
The invention has the following beneficial effects:
acquiring initial images of continuous multi-frame mixed pigment, wherein the mixed pigment is used for dyeing protective gloves; converting each frame of the initial image into a CMY image; acquiring the overlapping entropy of each frame of CMY image to obtain a time series composed of the overlapping entropies of the initial images; sampling the time series with time windows to obtain the multi-scale entropy of each time window, forming a curve of the multi-scale entropy changing over time; calculating the uniform mixing moment of the mixed pigment according to the fluctuation degree of the curve; and controlling the corresponding mixer to stop according to the uniform mixing moment. The real mixing condition of the powder is reflected through the overlapping entropy, which solves the problem that mixing uniformity is judged differently at different resolutions and thereby avoids judgment errors.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention; other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a flow chart of a method for controlling dyeing and mixing of protective gloves based on machine vision according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating the steps of obtaining overlapping entropy according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an edge pixel neighborhood;
FIG. 4 is a schematic diagram of a neighborhood of corner pixel points;
fig. 5 is a schematic diagram of a color space.
Detailed Description
In order to further illustrate the technical means adopted by the present invention to achieve the predetermined objects and their effects, the following detailed description of the embodiments, structures, features, and effects of the present invention is made with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The specific scheme of the protective glove dyeing mixing control method and system based on machine vision provided by the invention is specifically described below with reference to the accompanying drawings.
Referring to fig. 1, a flow chart of a method for controlling dyeing and mixing of protective gloves based on machine vision according to an embodiment of the present invention is shown, wherein the method comprises the following steps:
s001, acquiring initial images of continuous multi-frame mixed pigments, wherein the mixed pigments are used for dyeing the protective gloves; each frame of the initial image is converted to a CMY image.
The process of mixing the pigments for the protective gloves is monitored by a camera. Recording starts from the moment the batching is completed and the mixer is started, yielding an initial image for each of a succession of frames; these initial images are RGB images. The continuous frame images may be obtained by capturing video or by sampling images at a fixed time interval.
For a given target color match, the toner mixture consists of the three primary colors cyan, magenta, and yellow (CMY). Each acquired RGB image is first converted into a CMY image using the complement relation:

C = 255 − R, M = 255 − G, Y = 255 − B

where R, G, and B are the three channel components of the RGB image, and C, M, and Y are the three channel components of the CMY image obtained after the conversion.
Then for a certain pixel value in a video frame obtained, its corresponding CMY components can represent the primary color ratio of its mixed toner.
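The complement-based conversion above can be sketched in a few lines of NumPy; the function name and sample pixel value below are illustrative, not taken from the patent:

```python
import numpy as np

def rgb_to_cmy(rgb: np.ndarray) -> np.ndarray:
    """Convert an 8-bit RGB image (H x W x 3) to CMY using the
    complement relation C = 255 - R, M = 255 - G, Y = 255 - B."""
    # Promote to a signed type before subtracting to avoid uint8 wraparound.
    return 255 - rgb.astype(np.int32)

# One magenta-leaning pixel as a worked example.
frame = np.array([[[200, 40, 10]]], dtype=np.uint8)
print(rgb_to_cmy(frame)[0, 0])  # -> [ 55 215 245]
```

Each converted pixel's (C, M, Y) triple can then be read as the primary-color composition of the toner imaged at that point.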
For the obtained video of the mixing process, a plurality of frames of initial images of continuous video in a certain time period are intercepted and all converted into CMY images to form continuous frames of CMY images.
Step S002, acquiring the overlapping entropy of each frame of CMY image to obtain a time series composed of the overlapping entropies of the initial images; sampling the time series with time windows to obtain the multi-scale entropy of each time window, forming a curve of the multi-scale entropy changing over time; and calculating the uniform mixing moment of the mixed pigment according to the fluctuation degree of the curve. The step of obtaining the overlapping entropy is as follows: taking each pixel point in the CMY image as a central pixel point, and acquiring a vector formed by the channel proportions of the central pixel point in each channel; and calculating the information entropy of a binary group consisting of the vector corresponding to the central pixel point and the mean vector corresponding to the neighborhood of the central pixel point, wherein the information entropy of the binary group is the overlapping entropy.
In the mixing process, the three primary-color powders are poured into the mixing barrel according to the total-amount ratio. After the mixer is started, the powders are gradually mixed under its action and slowly fuse into a distinguishable color; that is, because of the camera resolution, the color presented at any pixel point of the image is the mixed color produced by the superposition of the reflected light of many different powders, i.e., the color of the uniformly mixed mixture. As the mixer acts over time, the same-colored regions of powder shrink from large areas to small areas, at which point the powder is considered uniformly distributed at the camera resolution, i.e., this is the moment of most uniform distribution.
The information entropy of an image reflects the complexity of its colors, but the classical information entropy, which judges complexity by the occurrence of gray levels, cannot distinguish the situation before mixing from that after sufficient, uniform mixing. For an unmixed image, the distribution regions of the colors are concentrated, so its information entropy is small; for an image collected after uniform mixing, because of the image resolution, the uniformly mixed mixture appears as a single color, so its information entropy is also relatively small. The classical method of obtaining information entropy is therefore not suitable for the embodiment of the present invention. To solve this problem, the embodiment of the present invention adopts the overlapping entropy; specifically, referring to fig. 2, obtaining the overlapping entropy includes the following steps:
step S201, taking each pixel point in the CMY image as a central pixel point, and obtaining a vector formed by the channel ratio of the central pixel point in each channel.
Whether the toner in the mixture region corresponding to one pixel point is uniformly mixed with the toner in the mixture regions corresponding to its surrounding neighborhood can be reflected jointly by the CMY proportions of that pixel point and of the pixel points in its surrounding neighborhood.
First, on each frame of the CMY image, the proportion of each of the C, M, and Y channels is computed for every pixel point and for its surrounding eight-neighborhood. Specifically, for a pixel point on a single-frame CMY image, the corresponding color proportion is a vector V = (c, m, y). The proportion of each channel is calculated by dividing the color value of that channel by the sum of the color values of the three channels:

c = C / (C + M + Y), m = M / (C + M + Y), y = Y / (C + M + Y)

where c, m, and y respectively denote the toner proportions of the primary colors in the proportion vector, and C, M, and Y are the three channel components of the CMY image.
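The per-pixel proportion vector can be sketched as follows; the epsilon guard against an all-zero (pure white) pixel is an implementation assumption, not part of the patent text:

```python
import numpy as np

def channel_ratios(cmy: np.ndarray) -> np.ndarray:
    """Per-pixel proportion vector V = (C, M, Y) / (C + M + Y)
    for an H x W x 3 CMY image."""
    cmy = cmy.astype(np.float64)
    total = cmy.sum(axis=-1, keepdims=True)
    # Guard against division by zero for all-zero pixels (assumption).
    return cmy / np.maximum(total, 1e-12)
```

For example, a pixel with CMY values (50, 100, 50) yields the proportion vector (0.25, 0.5, 0.25).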
Taking each pixel point in the CMY image as a central pixel point, the proportion vector V of every neighborhood pixel point in its eight-neighborhood is computed. For a central pixel point, the proportion vectors in its eight-neighborhood are {V1, V2, ..., V8}.

Preferably, referring to fig. 3 and fig. 4, when the central pixel point is an edge point or a corner point of the image, the neighborhood is truncated accordingly; in the figures, the black pixel block is the current pixel point and the white pixel blocks are its neighborhood pixel blocks. For the neighborhood of a central pixel point containing n neighborhood pixel points, the mean of the proportion vectors corresponding to those pixel points is computed as

V̄ = (1/n) Σ_{i=1..n} Vi

where Vi denotes the proportion vector of the i-th neighborhood pixel point around the central pixel point, computed from its three CMY components, and n is the total number of pixel points in the neighborhood.
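A minimal sketch of the neighborhood mean, with edge and corner pixels averaging only their valid neighbors (5 and 3 respectively, matching the truncated neighborhoods of figs. 3 and 4); the function name is illustrative:

```python
import numpy as np

def neighborhood_mean(ratios: np.ndarray) -> np.ndarray:
    """Mean proportion vector over the 8-neighborhood of every pixel
    of an H x W x 3 array of proportion vectors.  Pixels on the image
    border average only the neighbors that exist."""
    h, w, _ = ratios.shape
    mean = np.zeros_like(ratios, dtype=np.float64)
    for i in range(h):
        for j in range(w):
            acc, n = np.zeros(3), 0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    if di == 0 and dj == 0:
                        continue  # skip the central pixel itself
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        acc += ratios[ni, nj]
                        n += 1
            mean[i, j] = acc / n
    return mean
```

On an image whose proportion vectors are identical everywhere, the neighborhood mean equals that same vector at every pixel, including edges and corners.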
Step S202, calculating the information entropy of the binary group formed by the proportion vector of the central pixel point and the neighborhood mean vector, wherein the information entropy of the binary group is the overlapping entropy.
Referring to fig. 5, for a central pixel point, the corresponding CMY values form a vector. In CMY space, each of the three directions (C, M, and Y) takes values from 0 to 255, so the CMY values of all pixel points in a single-frame image lie inside a cube with side length 255, i.e., the color space. For a central pixel point, the three CMY components correspond to a vector in this space with the origin as its starting point and (C, M, Y) as its end point, and this vector corresponds to a proportion vector; if two vectors have the same direction but different lengths, their corresponding proportion vectors are the same. Based on this logic, every pixel point of a single-frame image corresponds to a vector in the color space and to that vector's proportion vector. Because of camera imaging, the components in the three CMY directions are all integers, i.e., every vector in the color space points from the origin to an integer lattice point, so the space contains 256³ possible vectors. Since vectors with the same direction have the same proportion vector, the 256³ vectors are classified by the cosine of the angle between pairs of vectors: if the cosine of the angle between two vectors is 1, the two vectors belong to the same class. The total number of vector classes obtained is K, corresponding to K distinct proportion vectors, i.e., all possible proportion vectors of pixel points in the image.

For a central pixel point, there are two corresponding proportion vectors: its own vector V and the mean vector V̄ of its surrounding neighborhood. A binary group (V, V̄) is then constructed. Each of the K proportion vectors is assigned a number (1, 2, ..., K), i.e., there is a correspondence table, so a binary group (V, V̄) corresponds to a pair of numbers. The overlapping entropy H of a single-frame image is calculated as

p(V, V̄) = n(V, V̄) / N
H = − Σ p(V, V̄) · log p(V, V̄)

where the numbers replace the corresponding proportion vectors, n(V, V̄) denotes the number of times the binary group (V, V̄) occurs, N is the number of all pixel points in the current frame, and p(V, V̄) is the probability that the binary group occurs over the entire image.
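The binary-group entropy can be sketched as below. Bucketing proportion vectors by rounding is a simplifying stand-in for the patent's cosine-similarity classification of color-space directions, and the function names are illustrative:

```python
import numpy as np
from collections import Counter

def overlap_entropy(ratios: np.ndarray, mean_ratios: np.ndarray,
                    decimals: int = 2) -> float:
    """Overlap entropy of one frame: Shannon entropy of the binary
    group (pixel proportion vector, neighborhood mean vector).
    Vectors are bucketed by rounding to `decimals` places -- a
    stand-in for the patent's cosine-based direction classes."""
    h, w, _ = ratios.shape
    pairs = Counter()
    for i in range(h):
        for j in range(w):
            key = (tuple(np.round(ratios[i, j], decimals)),
                   tuple(np.round(mean_ratios[i, j], decimals)))
            pairs[key] += 1
    n = h * w  # N: number of pixels in the frame
    probs = np.array([c / n for c in pairs.values()])
    return float(-(probs * np.log2(probs)).sum())
```

A perfectly uniform frame produces a single binary group, so its overlap entropy is 0; frames mixing several distinct local colors yield larger values.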
The overlapping entropy of each frame of CMY images is obtained according to step S201 and step S202, and a time series composed of the overlapping entropy of the initial images of each frame can be obtained. And sampling the time sequence by utilizing the time windows to obtain the multi-scale entropy of each time window, and forming a change curve of the multi-scale entropy along with the change of time.
Specifically, a time series is established from the overlapping entropies of the single-frame images:

S = {H1, H2, ..., HT}

The time series consists of T overlapping entropies, each obtained from consecutive video frames at equal time intervals. The time window is a sliding window that samples this time series. The sliding window has length l and step s, i.e., it advances s frames at a time. The sequence within the sliding window at a certain position is

{H_{t−(l−1)/2}, ..., H_t, ..., H_{t+(l−1)/2}}

where t, the center position of the sliding window, is its time-series position.

Note that the sliding-window length l is a preset parameter representing the number of consecutive frames and is an odd number; its value in the embodiment of the present invention is 5. The step s is a preset parameter; its value in the embodiment of the present invention is 1.
The multi-scale entropy of the sampling sequence in each sliding window is calculated, and a curve of the multi-scale entropy changing over time is formed according to the time-series positions of the sliding windows. Specifically, sampling starts from the window centered at t = (l+1)/2 with step s = 1, so for the entire overlapping-entropy time series of length T, the total number of sampling sequences obtained is T − l + 1. The multi-scale entropy of the sampling sequence in every sliding window is then computed; the calculation of multi-scale entropy is prior art and is not described in detail here.
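The windowing step can be sketched as follows. Since the patent defers the multi-scale entropy computation to prior art, the `measure` argument is a pluggable stand-in (the demo default, standard deviation, is an assumption used only to make the sketch runnable):

```python
import numpy as np

def windowed_measure(series, length=5, step=1, measure=np.std):
    """Slide a window of odd `length` over the overlapping-entropy
    series with the given `step`.  Returns (center index, value)
    pairs, where `measure` stands in for the multi-scale entropy."""
    assert length % 2 == 1, "window length must be odd"
    half = length // 2
    out = []
    for start in range(0, len(series) - length + 1, step):
        window = series[start:start + length]
        out.append((start + half, float(measure(np.asarray(window)))))
    return out
```

For a series of length T = 10 with l = 5 and s = 1, this yields T − l + 1 = 6 windows, centered at indices 2 through 7.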
And obtaining a multi-scale entropy sequence arranged along with the frame number according to the time sequence position corresponding to the time window, fitting the multi-scale entropy sequence to form a curve, wherein the time of the center of the sliding window corresponding to the minimum value of the curve is the uniform mixing time.
Specifically, a multi-scale entropy sequence arranged by frame number is obtained from the time-series positions of the sliding windows: from each multi-scale entropy value and the corresponding sliding-window center t, a multi-scale entropy versus frame-number sequence is established, where the frame number is the time-series index, i.e., the sliding-window center t. The sequence is smoothed and fitted into a curve. The sliding-window center t at which the curve attains its lowest value is taken as the moment of uniform mixing, and the mean of the overlapping entropies within the corresponding sliding window at that moment is obtained from the previously calculated overlapping-entropy time series; this mean is taken as the overlapping entropy at the moment of uniform mixing.
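The minimum-finding step can be sketched as below; the moving-average smoothing is a simple stand-in for the patent's unspecified smoothing and curve fitting, and the function name is illustrative:

```python
import numpy as np

def uniform_mixing_moment(centers, values, smooth=3):
    """Smooth the multi-scale-entropy sequence with a moving average
    (a stand-in for the patent's curve fitting) and return the
    sliding-window center at which the smoothed curve is lowest."""
    kernel = np.ones(smooth) / smooth
    smoothed = np.convolve(values, kernel, mode="same")
    return centers[int(np.argmin(smoothed))]
```

With a valley-shaped sequence such as [5, 4, 3, 2, 1, 2, 3, 4, 5, 6] over centers 2..11, the lowest smoothed value falls at the valley, so the reported uniform mixing moment is center 6.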
And step S003, controlling the corresponding mixer to stop according to the uniform mixing time.
Preferably, since the calculated uniform mixing moment lags the actual moment by a certain delay, the mixer does not stop within that delay range and continues to rotate. After the mixer actually stops, the overlapping entropy of the image collected at shutdown is compared with the overlapping entropy at the uniform mixing moment to obtain the difference between the two. If the mixing moment was determined correctly, the difference is small, because the overlapping entropy tends to remain constant no matter how much longer mixing continues; if the difference is large, caking still exists inside the container, and the larger the difference, the greater the amount of caking. The overlapping entropy at the uniform mixing moment is recorded as H1, and the overlapping entropy of the image acquired after shutdown is recorded as H2. The difference ratio is

r = |H2 − H1| / max(H1, H2)

Whether mixing continues is judged from the difference ratio: when the ratio approaches 1, the difference between the overlapping entropies at the uniform mixing moment and after shutdown is large, and mixing continues; when the ratio approaches 0, the difference is small, and the mixer may remain stopped. The mixing time may be further adjusted after evaluating the mixing result.
In conclusion, initial images of continuous multi-frame mixed pigment are acquired, wherein the mixed pigment is used for dyeing the protective gloves; each frame of the initial image is converted into a CMY image; the overlapping entropy of each frame of CMY image is acquired to obtain a time series composed of the overlapping entropies of the initial images; the time series is sampled with time windows to obtain the multi-scale entropy of each time window, forming a curve of the multi-scale entropy changing over time; the uniform mixing moment of the mixed pigment is calculated according to the fluctuation degree of the curve; and the corresponding mixer is controlled to stop according to the uniform mixing moment. The real mixing condition of the powder is reflected through the overlapping entropy, which solves the problem that mixing uniformity is judged differently at different scales and thereby avoids judgment errors.
Based on the same inventive concept as the method embodiment, an embodiment of the present invention further provides a machine vision-based protective glove dyeing mixing control system, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of any one of the machine vision-based protective glove dyeing mixing control methods described above. The machine vision-based protective glove dyeing mixing control method has been described in detail in the above embodiments and is not repeated here.
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in this specification are described in a progressive manner; for the same or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others.
The above description covers only preferred embodiments of the present invention and is not intended to limit the invention; any modifications, equivalent replacements, improvements, and the like made within the spirit and principle of the present invention shall be included within its scope of protection.

Claims (8)

1. A machine vision-based protective glove dyeing mixing control method, characterized by comprising the following steps:
acquiring initial images of continuous multiple frames of mixed pigment, wherein the mixed pigment is used for dyeing protective gloves; converting each frame of the initial image into a CMY image;
acquiring the overlapping entropy of each frame of the CMY image to obtain a time sequence consisting of the overlapping entropies of the initial images of the frames; sampling the time sequence with time windows to obtain the multi-scale entropy of each time window, and forming a curve of the multi-scale entropy changing over time; calculating the uniform mixing moment of the mixed pigment according to the fluctuation degree of the curve;
controlling the corresponding mixer to stop according to the uniform mixing moment;
wherein the step of obtaining the overlapping entropy is as follows: taking each pixel point in the CMY image as a central pixel point, and acquiring the vector formed by the channel proportions of the central pixel point in each channel; and calculating the information entropy of the duplet consisting of the vector corresponding to the central pixel point and the mean vector corresponding to the neighborhood of the central pixel point, wherein the information entropy of the duplet is the overlapping entropy.
2. The machine vision-based protective glove dyeing mixing control method according to claim 1, wherein the step of calculating the uniform mixing moment of the mixed pigment according to the fluctuation degree of the curve comprises:
obtaining a multi-scale entropy sequence arranged by frame number according to the time-sequence positions of the time windows, and fitting the multi-scale entropy sequence into a curve, wherein the moment corresponding to the center of the time window at the minimum value of the curve is the uniform mixing moment.
3. The machine vision-based protective glove dyeing mixing control method according to claim 1, characterized in that after the step of controlling the corresponding mixer to stop according to the uniform mixing moment, the method further comprises:
acquiring an image of the mixture after shutdown, and calculating the overlapping entropy of the image of the mixture;
comparing the overlapping entropy of the image acquired after shutdown with the overlapping entropy of the image at the uniform mixing moment to obtain the difference of the overlapping entropies before and after shutdown; when the difference is greater than a threshold, continuing mixing.
4. The machine vision-based protective glove dyeing mixing control method according to claim 3, characterized in that the step of obtaining the difference of the overlapping entropies before and after shutdown comprises:
recording the overlapping entropy at the uniform mixing moment as $E_1$ and the overlapping entropy of the image acquired after shutdown as $E_2$; the difference ratio is then:

$$\frac{\lvert E_2 - E_1 \rvert}{E_1}$$
5. The machine vision-based protective glove dyeing mixing control method according to claim 1, wherein the step of acquiring the vector formed by the channel proportions of the central pixel point in each channel comprises:
calculating the channel proportion of each channel by dividing the color value of that channel by the sum of the color values of the three channels.
6. The machine vision-based protective glove dyeing mixing control method according to claim 1, wherein the step of obtaining the information entropy of the duplet as the overlapping entropy comprises:
acquiring the probability of the duplet appearing in the CMY image, and calculating the information entropy of the CMY image according to the probability.
7. The machine vision-based protective glove dyeing mixing control method according to claim 6, characterized in that the step of acquiring the probability of the duplet appearing in the CMY image comprises: obtaining the ratio of the frequency of occurrence of the duplet to the total number of occurrences of all duplets in the CMY image.
8. A machine vision-based protective glove dyeing mixing control system, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 7.
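The stopping logic of claims 2–4 can be sketched as follows. This is a minimal illustration under stated assumptions: the per-window standard deviation stands in for the patent's multi-scale entropy fluctuation measure, and the `window` size, `threshold` value, and the $\lvert E_2 - E_1 \rvert / E_1$ form of the difference ratio are assumptions, not the patent's exact definitions.

```python
import numpy as np

def fluctuation_curve(entropies, window=5):
    """Fluctuation of the overlapping-entropy sequence in each sliding
    time window (standard deviation used as a simple stand-in for the
    multi-scale entropy described in the claims)."""
    e = np.asarray(entropies, dtype=np.float64)
    return np.array([e[i:i + window].std() for i in range(len(e) - window + 1)])

def uniform_mixing_index(entropies, window=5):
    """Frame index (window center) where the fluctuation curve is minimal,
    corresponding to the uniform mixing moment of claim 2."""
    curve = fluctuation_curve(entropies, window)
    return int(np.argmin(curve)) + window // 2

def should_continue_mixing(e_after_stop, e_uniform, threshold=0.05):
    """Post-shutdown check of claims 3-4: restart mixing when the relative
    change in overlapping entropy exceeds the threshold."""
    return abs(e_after_stop - e_uniform) / (e_uniform + 1e-12) > threshold
```

In use, the controller would append each new frame's overlapping entropy to the sequence, stop the mixer once the fluctuation minimum is reached, then re-check one more image after shutdown before declaring the batch done.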
CN202210818925.1A 2022-07-13 2022-07-13 Machine vision-based protective glove dyeing mixing control method and system Active CN114972328B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210818925.1A CN114972328B (en) 2022-07-13 2022-07-13 Machine vision-based protective glove dyeing mixing control method and system


Publications (2)

Publication Number Publication Date
CN114972328A CN114972328A (en) 2022-08-30
CN114972328B true CN114972328B (en) 2022-10-25

Family

ID=82969742





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant