CN115319979A - Gum dipping control method for protective gloves - Google Patents
- Publication number
- CN115319979A (application CN202211249088.1A, filed as CN202211249088A)
- Authority
- CN
- China
- Prior art keywords
- fluctuation
- area
- dipping
- image
- comparison
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C41/00—Shaping by coating a mould, core or other substrate, i.e. by depositing material and stripping-off the shaped article; Apparatus therefor
- B29C41/02—Shaping by coating a mould, core or other substrate, i.e. by depositing material and stripping-off the shaped article; Apparatus therefor for making articles of definite length, i.e. discrete articles
- B29C41/20—Shaping by coating a mould, core or other substrate, i.e. by depositing material and stripping-off the shaped article; Apparatus therefor for making articles of definite length, i.e. discrete articles incorporating preformed parts or layers, e.g. moulding inserts or for coating articles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C41/00—Shaping by coating a mould, core or other substrate, i.e. by depositing material and stripping-off the shaped article; Apparatus therefor
- B29C41/34—Component parts, details or accessories; Auxiliary operations
- B29C41/52—Measuring, controlling or regulating
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29L—INDEXING SCHEME ASSOCIATED WITH SUBCLASS B29C, RELATING TO PARTICULAR ARTICLES
- B29L2031/00—Other particular articles
- B29L2031/48—Wearing apparel
- B29L2031/4842—Outerwear
- B29L2031/4864—Gloves
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Gloves (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to the field of protective-glove dipping control, and in particular to a dipping control method for protective gloves, comprising the following steps: obtain a grayscale image of the dipping area of the knitted glove before dipping and a dipping-area difference image; obtain comparison area 1 and comparison area 2 corresponding to the two grayscale images; obtain the fluctuation positions of the rows and columns in the two comparison areas and the pixel difference values at those positions; use the fluctuation positions and pixel difference values to obtain the fluctuation consistency and the average fluctuation difference degree of the two comparison areas; obtain the knitting-thread visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference degree, and from it the dipping suitability of the knitted glove; and adjust the dipping mold according to the dipping suitability. The method is used to control the degree of dipping of protective gloves, and allows that degree to be controlled intelligently.
Description
Technical Field
The invention relates to the field of protective-glove dipping control, and in particular to a dipping control method for protective gloves.
Background
Gloves are indispensable articles in daily life, and protective gloves are widely used in industrial production because of characteristics such as good insulation. During production, in order to enhance the protective capability of the glove, a layer of rubber is often added over the knitted glove to provide protection. Different use scenarios place different requirements on the degree of dipping of protective gloves, so controlling the degree of dipping is essential.
At present, the degree of dipping is controlled manually during the production of protective gloves. Moreover, as gloves are dipped continuously, the liquid level of the rubber pool changes, and the dipping machine has to be adjusted from experience.
The existing manual method of controlling the dipping degree is labor-dependent, costly, and inefficient, and its accuracy relies on accumulated experience and cannot be guaranteed. A method that improves both the efficiency and the accuracy of dipping-degree control for protective gloves is therefore urgently needed.
Disclosure of Invention
The invention provides a dipping control method for protective gloves, comprising the following steps: obtain a grayscale image of the dipping area of the knitted glove before dipping and a dipping-area difference image; obtain comparison area 1 and comparison area 2 corresponding to the two grayscale images; obtain the fluctuation positions of the rows and columns in the two comparison areas and the pixel difference values at those positions; use the fluctuation positions and pixel difference values to obtain the fluctuation consistency and the average fluctuation difference degree of the two comparison areas; obtain the knitting-thread visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference degree, and from it the dipping suitability of the knitted glove; and adjust the control parameters of the dipping mold according to the dipping suitability. Compared with the prior art, the invention combines computer vision and image processing: the knitting-thread visibility is obtained by analyzing the differences in gray-value variation of adjacent pixels between the pre-dipping dipping-area image and the dipping-area difference image, from which the dipping suitability of the protective glove is calculated, effectively improving the accuracy of the dipping degree.
Furthermore, the obtained dipping suitability is used to judge whether the current dipping behavior is reasonable, finally realizing adaptive adjustment of the dipping process and effectively making the control of the dipping degree intelligent.
In order to achieve the above purpose, the invention adopts the following technical scheme. A dipping control method for protective gloves comprises the following steps:
Obtain grayscale images of the dipping area of the knitted glove before and after dipping.
Obtain the dipping-area difference image of the knitted glove by subtracting the grayscale images of the dipping area before and after dipping.
Mask the pre-dipping grayscale image of the dipping area to obtain the corresponding comparison area 1.
Mask the dipping-area difference image of the knitted glove to obtain the corresponding comparison area 2.
Calculate the gray-level differences of adjacent pixels in comparison area 1 and comparison area 2 respectively, and obtain the fluctuation positions of the rows and columns in the two comparison areas together with the pixel difference values at those positions.
Obtain the fluctuation consistency of the two comparison areas from the fluctuation positions of their rows and columns.
Obtain the average fluctuation difference degree of the two comparison areas from the pixel difference values at the fluctuation positions.
Obtain the knitting-thread visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference degree of the two comparison areas.
Calculate the dipping suitability of the knitted glove from its knitting-thread visibility.
Adjust the dipping mold according to the dipping suitability of the knitted glove.
Further, in the dipping control method for protective gloves, the grayscale images of the dipping area before and after dipping are obtained as follows:
Collect images of the knitted glove before and after dipping.
Perform semantic segmentation on the two images to obtain the dipping-area images of the knitted glove before and after dipping.
Convert the two dipping-area images to grayscale to obtain the grayscale images of the dipping area before and after dipping.
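The grayscale-conversion step above can be sketched in a few lines of numpy. The luminance weights and the QF/JF image names below are illustrative assumptions; the patent does not specify a particular conversion.

```python
import numpy as np

def to_gray(rgb):
    """Convert an H x W x 3 RGB image (uint8) to grayscale using the
    common ITU-R BT.601 luminance weights (an assumption; the patent
    does not specify the conversion)."""
    rgb = rgb.astype(np.float64)
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return np.rint(gray).astype(np.uint8)

# Illustrative dipping-area images before (QF) and after (JF) dipping.
QF_rgb = np.full((4, 4, 3), 120, dtype=np.uint8)
JF_rgb = np.full((4, 4, 3), 200, dtype=np.uint8)
QF = to_gray(QF_rgb)
JF = to_gray(JF_rgb)
```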
Further, in the dipping control method for protective gloves, the fluctuation positions of the rows and columns in the two comparison areas and the pixel difference values at those positions are obtained as follows:
Calculate the gray-level difference of adjacent pixels in each row of comparison area 1 to obtain its row adjacent-pixel difference image.
Count the gray values of the row adjacent-pixel difference image of comparison area 1 to obtain its row adjacent-pixel difference histogram.
Calculate the distance of the histogram peak from the center and the probability density at the peak to obtain the row fluctuation judgment threshold of comparison area 1.
Compare the gray differences of adjacent pixels in the row adjacent-pixel difference image of comparison area 1 against the row fluctuation judgment threshold to obtain the fluctuation positions of the rows in comparison area 1.
Repeat the above steps to obtain the fluctuation positions of the columns in comparison area 1.
Obtain the pixel difference values at the fluctuation positions of the rows and columns in comparison area 1 from those fluctuation positions.
Obtain the fluctuation positions of the rows and columns in comparison area 2 and their pixel difference values in the same manner.
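A minimal numpy sketch of the row case: compute the row adjacent-pixel difference image, then mark fluctuation positions where the absolute difference exceeds the judgment threshold. The threshold is passed in directly here (its histogram-based derivation is described later), and the example region and threshold value are assumptions.

```python
import numpy as np

def row_fluctuations(region, threshold):
    """Row adjacent-pixel difference image, the fluctuation positions where
    |difference| exceeds the judgment threshold, and the pixel difference
    values at those positions. The column case is analogous (subtract
    along axis 0 instead of axis 1)."""
    diff = region[:, 1:].astype(np.int32) - region[:, :-1].astype(np.int32)
    positions = np.abs(diff) > threshold   # boolean map of fluctuation positions
    values = diff[positions]               # fluctuation-position pixel differences
    return diff, positions, values

# Illustrative comparison area with one vertical gray-level step per row.
region1 = np.array([[10, 10, 60, 60],
                    [10, 10, 60, 60]], dtype=np.uint8)
diff, positions, values = row_fluctuations(region1, threshold=20)
```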
Further, in the dipping control method for protective gloves, the fluctuation consistency of the two comparison areas is obtained as follows:
Threshold the row adjacent-pixel difference image of comparison area 1 to obtain its row fluctuation-position image.
From the coordinates of the pixels in the row fluctuation-position image, obtain the transverse-position centerline of the image.
Compute the distance of each fluctuation position in the image from the transverse-position centerline to obtain the corresponding fluctuation-distance image.
Sum, for each row, the distances of the fluctuation positions from the centerline to obtain the row fluctuation sequence of comparison area 1.
Repeat the above steps to obtain the column fluctuation sequence of comparison area 1.
Obtain the row and column fluctuation sequences of comparison area 2 in the same manner.
Compute the Euclidean distance between the row fluctuation sequences and between the column fluctuation sequences of the two comparison areas.
Obtain the fluctuation consistency of the two comparison areas from these Euclidean distances.
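The last two steps can be sketched as follows. The example sequences are invented, and the final mapping from Euclidean distance to a consistency value (here 1/(1+d), so that identical sequences give 1) is an assumed form, since the patent's exact expression is not reproduced in the text.

```python
import numpy as np

def euclidean(a, b):
    """Euclidean distance between two equal-length fluctuation sequences."""
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

# Hypothetical row/column fluctuation sequences of the two comparison areas.
row_seq1, col_seq1 = [3.0, 4.0, 5.0], [2.0, 2.0]
row_seq2, col_seq2 = [3.0, 4.0, 7.0], [2.0, 3.0]

d_row = euclidean(row_seq1, row_seq2)
d_col = euclidean(col_seq1, col_seq2)

# Assumed consistency measure: closer sequences -> value nearer 1.
consistency = 1.0 / (1.0 + d_row + d_col)
```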
Further, in the dipping control method for protective gloves, the average fluctuation difference degree of the two comparison areas is obtained as follows:
Set the pixel values outside the row fluctuation positions of comparison area 1 to 0 to obtain its row fluctuation-difference image.
Compute the mean of the difference values in each row of this image to obtain the row fluctuation-difference sequence of comparison area 1.
Repeat the above steps to obtain the column fluctuation-difference sequence of comparison area 1.
Obtain the row and column fluctuation-difference sequences of comparison area 2 in the same manner.
Compute the Euclidean distance between the row fluctuation-difference sequences and between the column fluctuation-difference sequences of the two comparison areas.
Obtain the average fluctuation difference degree of the two comparison areas from these Euclidean distances.
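A sketch of the row fluctuation-difference sequence: non-fluctuation positions are zeroed and each row of the difference image is averaged; the Euclidean distance between the two areas' sequences then feeds the average fluctuation difference degree. The region contents and the threshold of 20 are assumptions.

```python
import numpy as np

def row_difference_sequence(region, positions):
    """Zero the row adjacent-pixel difference image outside the fluctuation
    positions and take the per-row mean, giving the row
    fluctuation-difference sequence."""
    diff = region[:, 1:].astype(np.float64) - region[:, :-1].astype(np.float64)
    masked = np.where(positions, diff, 0.0)
    return masked.mean(axis=1)

region = np.array([[10, 40, 40],
                   [10, 70, 70]], dtype=np.uint8)
positions = np.abs(np.diff(region.astype(np.int32), axis=1)) > 20
seq = row_difference_sequence(region, positions)
```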
Further, in the dipping control method for protective gloves, the knitting-thread visibility of the knitted glove is expressed in terms of the following quantities:
C, the fluctuation consistency of the two areas; D, the average fluctuation difference degree of the two areas; and V, the resulting knitting-thread visibility of the knitted glove.
Further, in the dipping control method for protective gloves, the dipping suitability of the knitted glove is expressed in terms of the following quantities:
an empirical value; the boundary value of the knitting-thread visibility range of a standard glove; the deviation of the final knitting-thread visibility from the boundary of that range; and the resulting dipping suitability of the knitted glove.
The invention has the following beneficial effects:
By combining computer vision and image processing, the knitting-thread visibility of the protective glove is obtained by analyzing the differences in gray-value variation of adjacent pixels between the pre-dipping dipping-area image and the dipping-area difference image, from which the dipping suitability of the protective glove is calculated; this effectively improves the accuracy of the dipping degree.
The method uses the obtained dipping suitability to judge whether the current dipping behavior is reasonable, finally realizing adaptive adjustment of the dipping process and effectively making the control of the dipping degree intelligent.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart of a method for controlling dipping of protective gloves according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a method for controlling dipping of protective gloves according to an embodiment of the present invention;
FIG. 3 is a schematic view illustrating the dipping effect of a protective glove according to an embodiment of the present invention;
FIG. 4 is a schematic view of the gum dipping effect of another protective glove provided by the embodiment of the invention;
fig. 5 is a schematic diagram of a difference value diagram of a dipping area of a knitted glove according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
The embodiment of the invention provides a dipping control method for protective gloves, which comprises the following steps:
S101, obtain grayscale images of the dipping area of the knitted glove before and after dipping.
A grayscale image is also called a gray-level image: the range between white and black is divided logarithmically into several levels, called gray levels, here 256 of them; an image represented by gray levels is a grayscale image.
S102, obtain the dipping-area difference image of the knitted glove by subtracting the grayscale images of the dipping area before and after dipping.
The difference image is obtained from the pre-dipping and post-dipping grayscale images of the dipping area, and reflects the change caused by dipping.
S103, mask the pre-dipping grayscale image of the dipping area of the knitted glove to obtain the corresponding comparison area 1.
Comparison area 1 is obtained by masking the difference image to produce a binary mask image and multiplying that mask image with the pre-dipping image.
S104, mask the dipping-area difference image of the knitted glove to obtain the corresponding comparison area 2.
Comparison area 2 refers to the corresponding connected-domain region in the difference image.
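The two comparison areas can be sketched as below: a binary mask derived from the difference image is multiplied with the pre-dipping grayscale image (comparison area 1), while the masked difference image itself serves as comparison area 2. The toy images are assumptions, and the morphological closing that would normally precede the mask is omitted here.

```python
import numpy as np

# Toy difference image: the 3x3 block is the dipped (changed) region.
diff = np.zeros((5, 5), dtype=np.uint8)
diff[1:4, 1:4] = 80

mask = (diff > 0).astype(np.uint8)        # binary dipping mask
pre_dip_gray = np.full((5, 5), 150, dtype=np.uint8)

comparison_area_1 = pre_dip_gray * mask   # masked pre-dipping grayscale image
comparison_area_2 = diff * mask           # connected region of the difference image
```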
S105, calculate the gray-level differences of adjacent pixels in comparison area 1 and comparison area 2 respectively, and obtain the fluctuation positions of the rows and columns in the two comparison areas together with the pixel difference values at those positions.
The fluctuation positions of the rows and columns are obtained from the gray-value differences of adjacent pixels in each comparison area.
S106, obtain the fluctuation consistency of the two comparison areas from the fluctuation positions of their rows and columns.
The fluctuation consistency judges whether the gray-level variation of the difference image is consistent with that of the pre-dipping image: the more consistent the two are, the smaller the area of texture occluded by rubber and the more clearly the knitting thread can be observed.
S107, obtain the average fluctuation difference degree of the two comparison areas from the pixel difference values at the fluctuation positions.
The average fluctuation difference degree judges the gray-level difference at the fluctuation positions: the larger the average difference, the more conspicuous the knitting thread.
S108, obtain the knitting-thread visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference degree of the two comparison areas.
The knitting-thread visibility is the visibility of the glove's original knitting thread after the rubber has been attached: the larger it is, the thinner the rubber layer on the glove, which may fail to meet the preset requirement. In other words, the difference between the dipping effect and the expected effect can be evaluated through the knitting-thread visibility after dipping, so that the dipping process can be controlled intelligently.
S109, calculate the dipping suitability of the knitted glove from its knitting-thread visibility.
The dipping suitability measures how suitable the glove's dipping operation is, and is obtained from the relation between the knitting-thread visibility of the dipped glove and that of a standard glove.
S110, adjust the dipping mold according to the dipping suitability of the knitted glove.
After the dipping suitability of the protective glove is obtained, the dipping mold is adjusted accordingly: whether the mold's dipping amount is suitable is judged from the obtained suitability, and the dipping machine is adjusted.
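The patent states only that the dipping machine is adjusted according to the suitability. As one illustrative closed-loop rule (entirely an assumption, including the parameter names, target, and gain), a proportional correction of the mold immersion depth could look like:

```python
def adjust_mold(suitability, depth_mm, target=1.0, gain=2.0):
    """Hypothetical proportional adjustment: if the suitability falls short
    of the target, immerse the mold deeper; if it exceeds the target, less
    deep. All parameter names and numbers are illustrative, not from the
    patent."""
    error = target - suitability
    return depth_mm + gain * error

new_depth = adjust_mold(suitability=0.8, depth_mm=35.0)
```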
The beneficial effects of this embodiment are as follows:
By combining computer vision and image processing, the knitting-thread visibility of the protective glove is obtained by analyzing the differences in gray-value variation of adjacent pixels between the pre-dipping dipping-area image and the dipping-area difference image, from which the dipping suitability of the protective glove is calculated; this effectively improves the accuracy of the dipping degree.
This embodiment uses the obtained dipping suitability to judge whether the current dipping behavior is reasonable, finally realizing adaptive adjustment of the dipping process and effectively making the control of the dipping degree intelligent.
Example 2
The embodiment of the invention provides a dipping control method for protective gloves, which comprises the following steps:
S201, collect images of the knitted glove before and after dipping.
The knitted glove to be dipped is fitted over a glove mold; by controlling the angle of the mold bracket, the glove enters the rubber pool so that rubber adheres to it, and the mold is then lifted to complete the dipping process.
After the glove is dipped, a track is set up facing the glove mold and a camera is mounted on it, so that the camera can acquire suitable images of the glove before and after dipping.
S202, obtain the dipping-area images of the knitted glove before and after dipping.
The factory environment is relatively complex and the background varied; to find the dipped glove in the image quickly and accurately, deep-learning neural network techniques are used to detect it.
This embodiment obtains the corresponding dipped-glove area with a DNN used for semantic segmentation. The DNN is specified as follows:
1) The network takes the form of semantic segmentation, with an Encoder-Decoder structure.
2) The dataset consists of actually acquired images of dipped gloves on the mold.
3) The labels are single-channel semantic labels: pixel positions corresponding to the dipped glove are marked 1, and the remaining scene background is marked 0.
4) The loss function is the cross-entropy loss.
The dipping-area image after dipping is thus acquired by the camera, the mask image of the dipped glove is obtained by inference with the trained neural network, and multiplying the mask image with the acquired RGB image extracts the dipped glove from the image.
S203, obtain the dipping-area difference image of the knitted glove.
Because of problems with rubber preparation or with the pressing angle of the mold, dipping results differ between gloves. The corresponding images are shown in fig. 3 and fig. 4: fig. 3 shows a poor dipping effect, in which the texture of the knitting thread can be clearly observed in the rubber portion; fig. 4 shows a good dipping effect, in which the rubber portion covers the underlying knitting thread.
To judge the dipping effect, images before and after dipping need to be collected for comparison.
First, the image of the glove's dipping area before dipping is acquired and denoted QF, and the image of the dipping area after dipping is denoted JF; the neural network above is used by inference to identify the glove target in each image.
The two images are converted to grayscale, and JF is subtracted from QF to obtain the difference image, which reflects the change caused by dipping. The difference image is shown in fig. 5.
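The subtraction step can be sketched as follows; performing it in a signed type avoids uint8 wrap-around. The pixel values are invented.

```python
import numpy as np

QF = np.array([[120, 125], [118, 122]], dtype=np.uint8)  # grayscale, before dipping
JF = np.array([[200, 204], [150, 126]], dtype=np.uint8)  # grayscale, after dipping

# Widen to a signed type before subtracting, then take the magnitude.
diff = np.abs(JF.astype(np.int16) - QF.astype(np.int16)).astype(np.uint8)
```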
S204, obtain comparison area 1 and comparison area 2 corresponding to the two grayscale images of the glove dipping area.
After the difference image is obtained, the knitting-thread visibility is obtained by analyzing the consistency and the fluctuation difference degree between the gray-level fluctuations of the difference image and those of the pre-dipping image.
1) First, morphological processing (a closing operation) is applied to the difference image to remove isolated noise and connect the difference regions into a closed region. This closed region is the dipping area, and from it the corresponding binary dipping mask image is obtained.
2) Multiplying the binary mask image with the pre-dipping image yields comparison area 1; the corresponding connected-domain region of the difference image is comparison area 2. The two regions are then analyzed against each other.
S205, obtain the fluctuation positions of the rows and columns in the two comparison areas and the pixel difference values at those positions.
Texture appears in a grayscale image as a regular variation of gray values. The two regions have the same area, and what is mainly evaluated is the proportion of the region in which the gray variation (i.e., the texture) is the same.
The texture information in a region is described by the row and column fluctuation sequences corresponding to the minimum bounding rectangle of the connected domain. First, the gray values within the connected domain are counted to obtain a fluctuation judgment threshold; whenever the gray difference of adjacent pixels exceeds this threshold, it counts as one fluctuation.
(1) Calculate the gray-level difference of adjacent pixels in each row and each column of comparison area 1 to obtain its row adjacent-pixel difference image and column adjacent-pixel difference image.
Subtracting adjacent pixels pairwise along each row gives the row adjacent-pixel difference image, with elements x_r(i, j) = p(i, j+1) − p(i, j), where p(i, j) is the pixel value at row i, column j, p(i, j+1) is the pixel value at row i, column j+1, and x_r(i, j) is the difference between the two.
Subtracting adjacent pixels pairwise along each column gives the column adjacent-pixel difference image, with elements x_c(i, j) = p(i+1, j) − p(i, j), where p(i, j) is the pixel value at row i, column j, p(i+1, j) is the pixel value at row i+1, column j, and x_c(i, j) is the difference between the two.
(2) Count the gray values of the row adjacent-pixel difference image of comparison area 1 to obtain its row adjacent-pixel difference histogram.
(3) Calculate the distance of the histogram peak from the center and the probability density at the peak to obtain the row fluctuation judgment threshold of comparison area 1.
The judgment threshold is determined from the distance of the peak of the pixel-value histogram from the center and from the probability density at the peak: when the histogram is biased to the left, the threshold is relatively small; when it is biased to the right, the threshold is relatively large.
(1) Count the gray values of the row adjacent-pixel difference image (or the column adjacent-pixel difference image) to obtain the abscissa x_max of the largest gray value and the abscissa x_min of the smallest, and compute the midpoint of the abscissa, x_mid = (x_max + x_min) / 2, where x_max is the abscissa of the maximum gray value of the image, x_min the abscissa of the minimum, and x_mid the midpoint of the gray-value abscissa.
(2) Compute the distance between the position of highest frequency and the midpoint, d = |x_peak − x_mid|, where x_peak is the abscissa of the gray value with the highest frequency, x_mid the midpoint of the gray-value abscissa, and d the distance between the two.
(3) Obtain the fluctuation judgment threshold T from ρ, x_mid, and d, where T denotes the fluctuation judgment threshold, ρ the probability density within the interval around the frequency peak, x_mid the midpoint of the gray-value abscissa, and d the distance between the highest-frequency position and the midpoint.
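The histogram statistics above can be sketched as follows. The final combination of rho, x_mid, and d into the threshold T is an assumed form chosen only to match the stated behavior (a left-biased histogram gives a smaller threshold); the patent's exact expression is not reproduced in the text.

```python
import numpy as np

# Toy row adjacent-pixel difference image, flattened (values invented).
diff_img = np.array([0, 0, 1, 2, 2, 2, 3, 40, 41], dtype=np.uint8)

hist, _ = np.histogram(diff_img, bins=256, range=(0, 256))
x_max = int(diff_img.max())        # abscissa of the largest gray value
x_min = int(diff_img.min())        # abscissa of the smallest gray value
x_mid = (x_max + x_min) / 2        # midpoint of the gray-value abscissa

x_peak = int(np.argmax(hist))      # abscissa with the highest frequency
d = abs(x_peak - x_mid)            # distance of the peak from the midpoint
rho = hist[x_peak] / hist.sum()    # probability density at the peak

# Assumed combination: a peak left of the midpoint pulls the threshold down.
T = x_mid - np.sign(x_mid - x_peak) * rho * d
```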
(4) Compare the gray differences of adjacent pixels in the row adjacent-pixel difference image of comparison area 1 against the row fluctuation judgment threshold to obtain the fluctuation positions of the rows in comparison area 1.
(5) Repeat the above steps to obtain the fluctuation positions of the columns in comparison area 1.
(6) Obtain the pixel difference values at the fluctuation positions of the rows and columns in comparison area 1 from those fluctuation positions.
(7) Obtain the fluctuation positions of the rows and columns in comparison area 2 and their pixel difference values in the same manner.
S206, obtain the row and column fluctuation sequences of the two comparison areas.
Taking comparison area 1 as an example, the specific process is as follows:
(2) And acquiring the position central line of the connected domain.
Previously obtained impregnation communicating domainExtracting minimum transverse coordinates of connected componentsAnd maximum lateral coordinateCalculating the mean value of two transverse coordinates
(3) The distance of each fluctuation position from the centerline is calculated.
(1) For the row fluctuation position image, the distance of each fluctuation position from the longitudinal position centerline is found: assuming coordinates (i, j) mark a fluctuation position, its distance from the centerline is d(i, j) = |j - x_c|.
In the formula: (i, j) denotes the fluctuation position coordinates in the i-th row and j-th column of the image, j is the transverse coordinate of the fluctuation position, and x_c is the mean of the transverse coordinates.
(2) For the column fluctuation position image, the distance of each fluctuation position from the transverse position centerline is found: assuming coordinates (i, j) mark a fluctuation position, its distance from the centerline is d(i, j) = |i - y_c|.
In the formula: (i, j) denotes the fluctuation position coordinates in the i-th row and j-th column of the image, i is the longitudinal coordinate of the fluctuation position, and y_c is the mean of the longitudinal coordinates.
This process yields the fluctuating distance image corresponding to the row fluctuation position image and the fluctuating distance image corresponding to the column fluctuation position image.
(4) The distances between the fluctuation positions of each row and each column and the position centerline are summed to obtain the row and column fluctuation sequences.
Summing the distance values of each row of the row fluctuating distance image yields the row fluctuation sequence. In the formula: (i, j) denotes the fluctuation position coordinates in the i-th row and j-th column of the image, d(i, j) is the distance of the fluctuation position from the longitudinal position centerline, and n is the number of columns in the image.
Summing the distance values of each column of the column fluctuating distance image yields the column fluctuation sequence. In the formula: (i, j) denotes the fluctuation position coordinates in the i-th row and j-th column of the image, d(i, j) is the distance of the fluctuation position from the transverse position centerline, and m is the number of rows in the image.
The above process is repeated to obtain the row and column fluctuation sequences of comparison area 2.
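The sequence construction of S206 can be sketched as below: measure each fluctuation position's distance to the transverse centerline of the marked region, then sum per row and per column (function and variable names are illustrative):

```python
import numpy as np

def fluctuation_sequences(mask):
    """From a boolean fluctuation-position image, build the per-row and
    per-column fluctuation sequences of summed centerline distances."""
    ys, xs = np.nonzero(mask)
    x_c = (xs.min() + xs.max()) / 2         # position centerline coordinate
    dist = np.zeros(mask.shape)
    dist[ys, xs] = np.abs(xs - x_c)         # fluctuating distance image
    row_seq = dist.sum(axis=1)              # one value per row
    col_seq = dist.sum(axis=0)              # one value per column
    return row_seq, col_seq

# Two symmetric fluctuation columns at x = 1 and x = 3 (centerline x_c = 2).
mask = np.zeros((4, 5), dtype=bool)
mask[:, 1] = True
mask[:, 3] = True
row_seq, col_seq = fluctuation_sequences(mask)
```

Each row contributes |1 - 2| + |3 - 2| = 2, so the row sequence is constant; the column sequence is nonzero only at the two marked columns.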
S207, acquiring the fluctuation consistency of the two comparison areas.
(1) The Euclidean distances between the row (column) fluctuation sequences of the two areas are calculated: the Euclidean distance d_r of the two areas' row fluctuation sequences and the Euclidean distance d_c of their column fluctuation sequences.
(2) The gray-level fluctuation consistency is judged from the Euclidean distance.
In the formula: d_r is the Euclidean distance of the two areas' row fluctuation sequences, d_c is the Euclidean distance of their column fluctuation sequences, and D is the final Euclidean distance of the two areas.
The fluctuation consistency of the gray values is thus:
In the formula: D is the final Euclidean distance of the two areas, and C is the fluctuation consistency of the two areas.
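A sketch of S207 follows. The patent's exact expressions were lost with the formula images, so both the combination of the two distances and the distance-to-consistency mapping 1/(1 + D) are stated assumptions (identical sequences then give consistency 1):

```python
import numpy as np

def fluctuation_consistency(row1, col1, row2, col2):
    """Combine the row/column sequence distances of two areas into one
    Euclidean distance, then map it to a consistency score in (0, 1].
    Both the hypot combination and the 1/(1+D) mapping are assumed forms."""
    d_row = np.linalg.norm(np.asarray(row1, float) - np.asarray(row2, float))
    d_col = np.linalg.norm(np.asarray(col1, float) - np.asarray(col2, float))
    d = np.hypot(d_row, d_col)      # final Euclidean distance of the two areas
    return 1.0 / (1.0 + d)          # higher distance -> lower consistency

c_same = fluctuation_consistency([1, 2], [3, 4], [1, 2], [3, 4])
c_diff = fluctuation_consistency([1, 2], [3, 4], [5, 2], [3, 1])
```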
S208, acquiring the average fluctuation difference degree of the two comparison areas.
The gray difference value at each fluctuation position was obtained above. Keeping the pixel values at the column fluctuation positions and setting the pixel values of all other positions to zero yields the column fluctuation difference image; the row fluctuation difference image is obtained in the same way. The mean difference value of each row of the row fluctuation difference image and the mean difference value of each column of the column fluctuation difference image are then calculated.
In this way the row fluctuation difference sequence and the column fluctuation difference sequence are obtained.
The Euclidean distance e_r between the two areas' row fluctuation difference sequences and the Euclidean distance e_c between their column fluctuation difference sequences are calculated.
The average fluctuation difference degree of the two areas is then:
In the formula: e_r is the Euclidean distance between the two areas' row fluctuation difference sequences, e_c is the Euclidean distance between their column fluctuation difference sequences, and A is the average fluctuation difference degree of the two areas.
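S208 can be sketched as below; averaging the two Euclidean distances into one degree is an assumed form, since the original expression is not recoverable:

```python
import numpy as np

def average_fluctuation_difference(diff1, mask1, diff2, mask2):
    """Keep difference values only at fluctuation positions (zero elsewhere),
    take per-row and per-column means, and combine the Euclidean distances of
    the resulting sequences; the final (d_row + d_col)/2 mean is assumed."""
    kept1 = np.where(mask1, diff1, 0.0)   # fluctuation difference image, area 1
    kept2 = np.where(mask2, diff2, 0.0)   # fluctuation difference image, area 2
    d_row = np.linalg.norm(kept1.mean(axis=1) - kept2.mean(axis=1))
    d_col = np.linalg.norm(kept1.mean(axis=0) - kept2.mean(axis=0))
    return (d_row + d_col) / 2

diff = np.full((3, 3), 9.0)
mask = np.eye(3, dtype=bool)
same = average_fluctuation_difference(diff, mask, diff, mask)
```

Two areas with identical fluctuation differences yield a difference degree of zero.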
S209, obtaining the knitting line visibility of the knitted glove.
The visibility of the knitting lines in the gum dipping area is calculated.
In the formula: C is the fluctuation consistency of the two areas, and A is the average fluctuation difference degree of the two areas.
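The visibility expression itself was lost with the formula image; the sketch below uses an assumed monotone form (visibility rises with consistency and falls with the average difference degree), not the patent's exact formula:

```python
def knit_line_visibility(consistency, avg_difference):
    """Map fluctuation consistency (higher = the two areas' fluctuation
    patterns agree) and average fluctuation difference degree (higher = the
    amplitudes differ) to a visibility score. The C/(1+A) form is assumed."""
    return consistency / (1.0 + avg_difference)

v_clear = knit_line_visibility(1.0, 0.0)   # identical fluctuations
v_faint = knit_line_visibility(0.2, 4.0)   # weak, dissimilar fluctuations
```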
S210, obtaining the dipping suitability of the knitted glove.
The knitting line visibility of a standard glove is extracted to obtain the standard glove's knitting line visibility range, and the deviation of the current visibility from that range is determined.
In the formula: d1 is the deviation of the knitting line visibility from the smaller boundary of the range, d2 is the deviation from the larger boundary, d is the final deviation of the knitting line visibility from the range boundaries, and d is the one of d1 and d2 with the smaller absolute value.
In the formula: k is an empirical value, d is the final deviation of the knitting line visibility from the range boundaries, and the remaining terms are the boundary values of the standard glove's knitting line visibility range.
The suitability of the current gum dipping operation is thus obtained.
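The "smaller absolute deviation from the range boundaries" rule can be sketched directly (the function name and example range are illustrative; the empirical constant k of the suitability formula is not recoverable here):

```python
def boundary_deviation(v, v_low, v_high):
    """Final deviation of the knit-line visibility from the standard range
    [v_low, v_high]: the boundary deviation with the smaller absolute value."""
    d_low = v - v_low     # deviation from the smaller boundary
    d_high = v - v_high   # deviation from the larger boundary
    return d_low if abs(d_low) <= abs(d_high) else d_high

dev = boundary_deviation(0.75, 0.3, 0.7)   # just above the range -> small positive deviation
```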
S211, adjusting the control parameters of the dipping mold according to the dipping suitability of the knitted glove to achieve intelligent control.
After the suitability of the gum dipping operation is obtained, the control parameters for the next mold cycle are adjusted according to the suitability, so that abnormalities are effectively corrected.
The dipping parameters of the mold are adjusted according to the suitability value:
When the suitability indicates that the gum dipping amount is too small, the control parameters are adjusted to increase the mold's dipping amount.
When the suitability indicates that the gum dipping amount is moderate, no adjustment is needed.
When the suitability indicates that the gum dipping amount is excessive, the control parameters are adjusted to reduce the mold's dipping amount.
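The three-way control rule of S211 can be sketched as below. The numeric thresholds lost with the formula images are replaced by hypothetical range bounds; high knit-line visibility is read as too little gum, low visibility as too much:

```python
def dipping_adjustment(visibility, v_low, v_high):
    """Decide the next mold control action from the knit-line visibility and
    a standard glove's visibility range [v_low, v_high] (hypothetical bounds)."""
    if visibility > v_high:          # knit lines too visible -> too little gum
        return "increase dipping amount"
    if visibility < v_low:           # knit lines hidden -> too much gum
        return "decrease dipping amount"
    return "keep current parameters" # visibility within the standard range

action = dipping_adjustment(0.9, v_low=0.3, v_high=0.7)
```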
The beneficial effects of this embodiment are:
By combining computer vision and image processing, the knitting line visibility of the protective glove is obtained by analyzing the gray-value variation differences of adjacent pixels in the pre-dipping gum dipping area image and in the gum dipping area difference image, from which the dipping suitability of the protective glove is calculated, effectively improving the accuracy of the dipping degree.
In this embodiment, the obtained dipping suitability is used to judge how reasonable the current dipping behavior is, finally realizing adaptive adjustment of the dipping process and effectively achieving intelligent control of the dipping degree.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (7)
1. A gum dipping control method for protective gloves, characterized by comprising the following steps:
obtaining gray-scale images of the gum dipping area before and after gum dipping of the knitted glove;
obtaining a gum dipping area difference image of the knitted glove by differencing the gray-scale images of the gum dipping area before and after gum dipping;
masking the gray-scale image of the gum dipping area before gum dipping to obtain comparison area 1 corresponding to the gray-scale image;
masking the gum dipping area difference image of the knitted glove to obtain comparison area 2 corresponding to the difference image;
respectively calculating the gray difference values of adjacent pixels in comparison area 1 and comparison area 2, and obtaining the row and column fluctuation positions and the fluctuation-position pixel difference values in the two comparison areas;
obtaining the fluctuation consistency of the two comparison areas from the row and column fluctuation positions in the two comparison areas;
obtaining the average fluctuation difference degree of the two comparison areas from the fluctuation-position pixel difference values in the two comparison areas;
obtaining the knitting line visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference degree of the two comparison areas;
calculating the dipping suitability of the knitted glove from the knitting line visibility;
and adjusting the dipping mold according to the dipping suitability of the knitted glove.
2. The gum dipping control method for protective gloves according to claim 1, wherein the gray-scale images of the gum dipping area before and after gum dipping are obtained as follows:
collecting images of the knitted glove before and after gum dipping;
performing semantic segmentation on the images of the knitted glove before and after gum dipping to obtain the gum dipping area images before and after gum dipping;
performing gray-scale processing on the gum dipping area images before and after gum dipping to obtain the gray-scale images of the gum dipping area before and after gum dipping.
3. The gum dipping control method for protective gloves according to claim 1, wherein the row and column fluctuation positions and the fluctuation-position pixel difference values in the two comparison areas are obtained as follows:
calculating the gray difference value of each row of adjacent pixels in comparison area 1 to obtain the row adjacent-pixel difference image of comparison area 1;
counting the gray values of the row adjacent-pixel difference image to obtain the row adjacent-pixel difference histogram of comparison area 1;
calculating the distance of the histogram peak from the center and the probability density at the peak to obtain the row fluctuation judgment threshold of comparison area 1;
comparing the gray difference values of adjacent pixels in the row adjacent-pixel difference image against the row fluctuation judgment threshold to obtain the row fluctuation positions in comparison area 1;
repeating the above steps to obtain the column fluctuation positions in comparison area 1;
obtaining the fluctuation-position pixel difference values of the rows and columns in comparison area 1 from the row and column fluctuation positions;
and obtaining the row and column fluctuation positions and fluctuation-position pixel difference values of comparison area 2 in the same way.
4. The gum dipping control method for protective gloves according to claim 1, wherein the fluctuation consistency of the two comparison areas is obtained as follows:
thresholding the row adjacent-pixel difference image of comparison area 1 to obtain the row fluctuation position image of comparison area 1;
obtaining the transverse position centerline of the image from the pixel coordinates in the row fluctuation position image of comparison area 1;
calculating the distance of each fluctuation position from the transverse position centerline to obtain the corresponding fluctuating distance image;
calculating the sum of the distances from the fluctuation positions of each row to the position centerline to obtain the row fluctuation sequence of comparison area 1;
repeating the above steps to obtain the column fluctuation sequence of comparison area 1;
obtaining the row and column fluctuation sequences of comparison area 2 in the same way;
respectively calculating the Euclidean distances of the row fluctuation sequences and of the column fluctuation sequences of the two comparison areas;
and obtaining the fluctuation consistency of the two comparison areas from the Euclidean distances of the fluctuation sequences of the two comparison areas.
5. The gum dipping control method for protective gloves according to claim 1, wherein the average fluctuation difference degree of the two comparison areas is obtained as follows:
setting the pixel values of positions outside the row fluctuation positions in comparison area 1 to 0 to obtain the row fluctuation difference image of comparison area 1;
calculating the mean of the difference values of each row of the row fluctuation difference image to obtain the row fluctuation difference sequence of comparison area 1;
repeating the above steps to obtain the column fluctuation difference sequence of comparison area 1;
obtaining the row and column fluctuation difference sequences of comparison area 2 in the same way;
respectively calculating the Euclidean distances of the row fluctuation difference sequences and of the column fluctuation difference sequences of the two comparison areas;
and obtaining the average fluctuation difference degree of the two comparison areas from the Euclidean distances of the fluctuation difference sequences of the two comparison areas.
6. The gum dipping control method for protective gloves according to claim 1, wherein the knitting line visibility of the knitted glove is expressed as follows:
7. The gum dipping control method for protective gloves according to claim 1, wherein the dipping suitability of the knitted glove is expressed as follows:
In the formula: k is an empirical value, the boundary values are those of the standard glove's knitting line visibility range, d is the final deviation of the knitting line visibility from the range boundaries, and the result is the dipping suitability of the knitted glove.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211249088.1A CN115319979B (en) | 2022-10-12 | 2022-10-12 | Gum dipping control method for protective gloves |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115319979A true CN115319979A (en) | 2022-11-11 |
CN115319979B CN115319979B (en) | 2023-01-03 |
Family
ID=83913187
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211249088.1A Active CN115319979B (en) | 2022-10-12 | 2022-10-12 | Gum dipping control method for protective gloves |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115319979B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN208180066U (en) * | 2018-01-25 | 2018-12-04 | 黄兆周 | A kind of full-automatic multi-functional safety and industrial gloves production line |
CN112198161A (en) * | 2020-10-10 | 2021-01-08 | 安徽和佳医疗用品科技有限公司 | PVC gloves real-time detection system based on machine vision |
CN114248385A (en) * | 2021-12-15 | 2022-03-29 | 张家港思淇科技有限公司 | Production process of gum dipping gloves |
CN114627117A (en) * | 2022-05-13 | 2022-06-14 | 启东市鸿盛纺织有限公司 | Knitted fabric defect detection method and system based on projection method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116721067A (en) * | 2023-05-29 | 2023-09-08 | 宿迁凯达环保设备制造有限公司 | Impregnated paper impregnation quality detection method based on machine vision |
CN116721067B (en) * | 2023-05-29 | 2024-04-12 | 宿迁凯达环保设备制造有限公司 | Impregnated paper impregnation quality detection method based on machine vision |
Also Published As
Publication number | Publication date |
---|---|
CN115319979B (en) | 2023-01-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||