CN115319979A - Gum dipping control method for protective gloves - Google Patents

Gum dipping control method for protective gloves

Info

Publication number
CN115319979A
CN115319979A (application CN202211249088.1A); granted as CN115319979B
Authority
CN
China
Prior art keywords
fluctuation
area
dipping
image
comparison
Prior art date
Legal status
Granted
Application number
CN202211249088.1A
Other languages
Chinese (zh)
Other versions
CN115319979B (en)
Inventor
王洁明 (Wang Jieming)
Current Assignee
Jiangsu Mingfeng Food Co ltd
Original Assignee
Jiangsu Mingfeng Food Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Mingfeng Food Co., Ltd.
Priority to CN202211249088.1A
Publication of CN115319979A
Application granted
Publication of CN115319979B
Status: Active

Classifications

    • B29C 41/20 — Shaping by coating a mould, core or other substrate, i.e. by depositing material and stripping off the shaped article, for making articles of definite length incorporating preformed parts or layers, e.g. moulding inserts or for coating articles
    • B29C 41/52 — Component parts, details or accessories; auxiliary operations: measuring, controlling or regulating
    • B29L 2031/4864 — Wearing apparel: outerwear: gloves


Abstract

The invention relates to the field of dipping control for protective gloves, and in particular to a dipping control method for protective gloves, comprising: obtaining a grayscale image of the dipping area of a knitted glove before dipping, and a difference image of the dipping area; obtaining comparison area 1 and comparison area 2 corresponding to these two grayscale images; obtaining the fluctuation positions of the rows and columns in the two comparison areas, and the pixel difference values at those fluctuation positions; using the fluctuation positions and difference values to obtain the fluctuation consistency and the average fluctuation difference of the two comparison areas; obtaining the knitting-thread visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference, and from the visibility the dipping suitability of the glove; and adjusting the dipping mold according to the dipping suitability. The method controls the degree of dipping of protective gloves, and does so intelligently.

Description

Gum dipping control method for protective gloves
Technical Field
The invention relates to the field of dipping control for protective gloves, and in particular to a dipping control method for protective gloves.
Background
Gloves are indispensable articles in daily life, and protective gloves are widely used in industrial production for their good insulation and other protective properties. During production, a layer of rubber is often added over the knitted glove to strengthen its protective capability, and different usage scenarios place different requirements on the degree of dipping. Controlling the degree to which protective gloves are dipped is therefore essential.
At present, the degree of dipping is controlled manually during the production of protective gloves. Moreover, as gloves are dipped one after another, the liquid level of the rubber pool changes, and the dipping machine must be adjusted by experience.
Manual control of the dipping degree is labor-dependent, costly, and inefficient, and its accuracy rests on operator experience and cannot be guaranteed. A method that improves both the efficiency and the accuracy of dipping control for protective gloves is therefore urgently needed.
Disclosure of Invention
The invention provides a dipping control method for protective gloves, comprising: obtaining a grayscale image of the dipping area of a knitted glove before dipping, and a difference image of the dipping area; obtaining comparison area 1 and comparison area 2 corresponding to these two grayscale images; obtaining the fluctuation positions of the rows and columns in the two comparison areas and the pixel difference values at those positions; using these to obtain the fluctuation consistency and the average fluctuation difference of the two areas; obtaining the knitting-thread visibility of the glove from the fluctuation consistency and the average fluctuation difference, and from the visibility the dipping suitability; and adjusting the control parameters of the dipping mold according to the dipping suitability. Compared with the prior art, the method combines computer vision and image processing: by analyzing the differences in gray-value variation between adjacent pixels in the pre-dipping image of the dipping area and in the difference image of the dipping area, it obtains the knitting-thread visibility of the glove and, from it, the dipping suitability, effectively improving the accuracy of the dipping degree.
Furthermore, the obtained dipping suitability is used to judge how reasonable the current dipping behavior is, so that the dipping process is adjusted adaptively and control of the dipping degree is effectively made intelligent.
In order to achieve this purpose, the invention adopts the following technical scheme. The dipping control method for protective gloves comprises the steps of:
obtaining grayscale images of the dipping area of the knitted glove before and after dipping;
obtaining the difference image of the dipping area by differencing the grayscale images of the dipping area taken before and after dipping;
masking the pre-dipping grayscale image of the dipping area to obtain the corresponding comparison area 1;
masking the difference image of the dipping area to obtain the corresponding comparison area 2;
computing the gray differences of adjacent pixels in comparison area 1 and comparison area 2 respectively, and obtaining the fluctuation positions of the rows and columns in the two comparison areas together with the pixel difference values at those positions;
obtaining the fluctuation consistency of the two comparison areas from the fluctuation positions of their rows and columns;
obtaining the average fluctuation difference of the two comparison areas from the pixel difference values at the fluctuation positions;
obtaining the knitting-thread visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference of the two comparison areas;
calculating the dipping suitability of the knitted glove from its knitting-thread visibility;
and adjusting the dipping mold according to the dipping suitability of the knitted glove.
Further, in this dipping control method, the grayscale images of the dipping area before and after dipping are obtained as follows:
collect images of the knitted glove before and after dipping;
apply semantic segmentation to both images to obtain the dipping-area images before and after dipping;
convert the two dipping-area images to grayscale to obtain the grayscale images of the dipping area before and after dipping.
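The grayscale conversion and the before/after differencing can be sketched as follows. The patent does not specify a grayscale formula, so the common luminosity weights are assumed, and the 4×4 arrays are toy stand-ins for real camera frames:

```python
import numpy as np

def to_gray(rgb):
    """Convert an HxWx3 RGB image to grayscale (assumed luminosity weights)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def dip_difference(gray_before, gray_after):
    """Difference image of the dipping area: per-pixel absolute gray difference."""
    return np.abs(gray_before.astype(np.int32) - gray_after.astype(np.int32))

# Toy stand-ins for the dipping-area images before and after dipping.
before = np.full((4, 4), 120, dtype=np.uint8)   # knitted texture region
after = before.copy()
after[2:, :] = 40                               # lower half now covered by rubber
diff = dip_difference(before, after)            # nonzero only where dipping changed the image
```

Regions where `diff` is large correspond to pixels altered by the rubber coating, which is what the later comparison-area analysis operates on.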
Further, in this dipping control method, the fluctuation positions of the rows and columns in the two comparison areas, and the pixel difference values at those positions, are obtained as follows:
compute the gray difference of horizontally adjacent pixels in each row of comparison area 1 to obtain its row adjacent-pixel difference image;
histogram the gray values of this difference image to obtain the row adjacent-pixel difference histogram of comparison area 1;
from the distance of the histogram peak to the center and the probability density at the peak, obtain the row fluctuation judgment threshold of comparison area 1;
compare the gray differences in the row adjacent-pixel difference image against this threshold to obtain the fluctuation positions of the rows in comparison area 1;
repeat the steps above to obtain the fluctuation positions of the columns in comparison area 1;
from the row and column fluctuation positions of comparison area 1, obtain the pixel difference values at those positions;
obtain the fluctuation positions and fluctuation-position pixel difference values of comparison area 2 in the same manner.
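The adjacent-pixel differencing and fluctuation-position extraction above can be sketched as follows. The judgment threshold is taken as a given parameter here; the patent derives it from the difference histogram in a later step:

```python
import numpy as np

def adjacent_diffs(area):
    """Row- and column-adjacent pixel difference images (signed)."""
    a = area.astype(np.int32)
    row_diff = a[:, :-1] - a[:, 1:]   # differences of horizontally adjacent pixels
    col_diff = a[:-1, :] - a[1:, :]   # differences of vertically adjacent pixels
    return row_diff, col_diff

def fluctuation_positions(diff_img, threshold):
    """Positions where the adjacent-pixel difference exceeds the judgment
    threshold in magnitude, plus the difference values at those positions."""
    mask = np.abs(diff_img) > threshold
    return np.argwhere(mask), diff_img[mask]

area = np.array([[10, 10, 60, 10],
                 [10, 10, 10, 10]])
row_diff, col_diff = adjacent_diffs(area)
pos, vals = fluctuation_positions(row_diff, threshold=20)
```

Each bright knitting thread produces a pair of large opposite-sign differences, so `pos` marks the thread edges within the comparison area.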
Further, in this dipping control method, the fluctuation consistency of the two comparison areas is obtained as follows:
threshold the row adjacent-pixel difference image of comparison area 1 to obtain its row fluctuation-position image $F_r$;
from the pixel coordinates in $F_r$, take the horizontal centerline of the image;
compute the distance of each fluctuation position in $F_r$ from the centerline to obtain the corresponding fluctuation-distance image $D_r$;
sum, for each row of $D_r$, the distances from its fluctuation positions to the centerline, giving the row fluctuation sequence of comparison area 1;
repeat the steps above to obtain the column fluctuation sequence of comparison area 1;
obtain the row and column fluctuation sequences of comparison area 2 in the same manner;
compute the Euclidean distance between the row fluctuation sequences and between the column fluctuation sequences of the two comparison areas;
obtain the fluctuation consistency of the two comparison areas from these Euclidean distances.
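The fluctuation-sequence construction above can be sketched as follows. The mapping from the Euclidean distance to a consistency score is an assumption (the patent gives its combining formula only as an image); here a simple `1/(1+d)` decay is used for illustration:

```python
import numpy as np

def row_fluct_sequence(fluct_mask):
    """Per-row sum of distances from fluctuation positions to the image's
    vertical centerline (the center column index)."""
    h, w = fluct_mask.shape
    center = (w - 1) / 2.0
    dist = np.abs(np.arange(w) - center)      # distance of each column to the centerline
    return (fluct_mask * dist).sum(axis=1)    # summed only over fluctuation positions

def consistency(seq1, seq2):
    """Assumed mapping: smaller Euclidean distance -> higher consistency."""
    d = np.linalg.norm(seq1 - seq2)
    return 1.0 / (1.0 + d)

m1 = np.array([[1, 0, 0, 1],
               [0, 1, 1, 0]])
s1 = row_fluct_sequence(m1)
s2 = row_fluct_sequence(m1)   # identical fluctuation pattern -> perfectly consistent
```

Two areas whose threads fluctuate at the same positions give identical sequences and therefore maximum consistency, matching the patent's interpretation that consistent fluctuation means the threads remain observable.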
Further, in this dipping control method, the average fluctuation difference of the two comparison areas is obtained as follows:
set the pixel values outside the row fluctuation positions of comparison area 1 to 0 to obtain its row fluctuation-difference image $E_r$;
compute the mean of the difference values in each row of $E_r$ to obtain the row fluctuation-difference sequence of comparison area 1;
repeat the steps above to obtain the column fluctuation-difference sequence of comparison area 1;
obtain the row and column fluctuation-difference sequences of comparison area 2 in the same manner;
compute the Euclidean distance between the row fluctuation-difference sequences and between the column fluctuation-difference sequences of the two comparison areas;
obtain the average fluctuation difference of the two comparison areas from these Euclidean distances.
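The per-row masking and averaging above can be sketched as follows (the final mapping from the Euclidean distance between the two areas' sequences to the average fluctuation difference appears only as an image in the source, so it is not reproduced here):

```python
import numpy as np

def row_fluct_diff_sequence(diff_img, fluct_mask):
    """Zero out non-fluctuation positions, then take the per-row mean of the
    remaining difference values (rows with no fluctuation contribute 0)."""
    masked = np.where(fluct_mask, diff_img, 0)
    counts = fluct_mask.sum(axis=1)
    sums = masked.sum(axis=1)
    return np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)

diff_img = np.array([[40, 0, 60],
                     [ 0, 0,  0]])
mask = diff_img > 20                 # fluctuation positions of each row
seq = row_fluct_diff_sequence(diff_img, mask)
```

The Euclidean distance between the sequences of comparison area 1 and comparison area 2 then quantifies how differently the two areas' gray values swing at their fluctuation positions.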
Further, in this dipping control method, the knitting-thread visibility of the knitted glove is expressed as a function of the fluctuation statistics of the two comparison areas. [The expression itself appears only as an image in the source and is not reproduced.] In the formula, $P$ denotes the fluctuation consistency of the two areas, $Q$ denotes the average fluctuation difference of the two areas, and $S$ denotes the knitting-thread visibility of the knitted glove.
Further, in this dipping control method, the dipping suitability of the knitted glove is expressed as a function of its knitting-thread visibility. [The expression itself appears only as an image in the source and is not reproduced.] In the formula, $a$ is an empirical value, $S_b$ is the boundary value of the visibility range of a standard glove's knitting threads, $\Delta S$ is the deviation of the measured knitting-thread visibility from the range boundary, and $R$ is the dipping suitability of the knitted glove.
The invention has the following beneficial effects:
By combining computer vision and image processing, the knitting-thread visibility of the protective glove is obtained by analyzing the differences in gray-value variation between adjacent pixels in the pre-dipping image of the dipping area and in the difference image of the dipping area; from the visibility the dipping suitability is calculated, effectively improving the accuracy of the dipping degree.
The method uses the obtained dipping suitability to judge how reasonable the current dipping behavior is, finally realizing adaptive adjustment of the dipping process and effectively making control of the dipping degree intelligent.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart of a method for controlling dipping of protective gloves according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a method for controlling dipping of protective gloves according to an embodiment of the present invention;
FIG. 3 is a schematic view illustrating the dipping effect of a protective glove according to an embodiment of the present invention;
FIG. 4 is a schematic view of the gum dipping effect of another protective glove provided by the embodiment of the invention;
fig. 5 is a schematic diagram of a difference value diagram of a dipping area of a knitted glove according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
The embodiment of the invention provides a method for controlling gum dipping of protective gloves, which comprises the following steps of:
s101, obtaining a gum dipping area gray scale image before and after gum dipping of the knitted gloves.
A grayscale image is also called a gray-level image. The range between white and black is divided logarithmically into levels called gray levels, commonly 256 of them; an image represented by these levels is a grayscale image.
S102, obtaining the difference image of the dipping area of the knitted glove by subtracting the post-dipping grayscale image of the dipping area from the pre-dipping one.
The difference image is computed from the grayscale images of the dipping area before and after dipping, and reflects the change caused by dipping.
S103, masking the pre-dipping grayscale image of the dipping area to obtain the corresponding comparison area 1.
Comparison area 1 is obtained by deriving a binary mask image from the difference image and multiplying that mask with the pre-dipping image.
S104, masking the difference image of the dipping area to obtain the corresponding comparison area 2.
Comparison area 2 is the corresponding connected-component region in the difference image.
S105, computing the gray differences of adjacent pixels in comparison area 1 and comparison area 2 respectively, and obtaining the fluctuation positions of the rows and columns in the two areas together with the pixel difference values at those positions.
The fluctuation positions of the rows and columns are obtained from the gray differences of adjacent pixels within each comparison area.
S106, obtaining the fluctuation consistency of the two comparison areas from the fluctuation positions of their rows and columns.
The fluctuation consistency judges whether the gray variation of the difference image matches the gray variation before dipping: the more consistent the two are, the smaller the area of texture occluded by rubber, and the more clearly the knitting threads can be observed.
S107, obtaining the average fluctuation difference of the two comparison areas from the pixel difference values at the fluctuation positions.
The average fluctuation difference judges the gray contrast at the fluctuation positions: the larger the average difference, the more pronounced the knitting threads.
S108, obtaining the knitting-thread visibility of the knitted glove from the fluctuation consistency and the average fluctuation difference of the two comparison areas.
The knitting-thread visibility measures how visible the glove's original knitting threads remain after the rubber is applied. The greater the visibility, the thinner or sparser the attached rubber, which may fail to meet the preset requirement. In other words, the gap between the actual and the expected dipping effect can be evaluated through the post-dipping visibility of the knitting threads, allowing the dipping process to be controlled intelligently.
S109, calculating the dipping suitability of the knitted glove from its knitting-thread visibility.
The dipping suitability measures how appropriate the glove's dipping operation was; it is obtained from the relation between the measured knitting-thread visibility and the knitting-thread visibility of a standard glove.
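The relation between measured and standard visibility can be sketched as follows. The patent's actual formula appears only as an image, so the range bounds `low`/`high` (standing in for the standard glove's visibility range), the decay constant `k` (standing in for the empirical value), and the exponential decay itself are all illustrative assumptions:

```python
import math

def dipping_suitability(visibility, low=0.2, high=0.4, k=5.0):
    """Illustrative suitability score (not the patent's formula): maximal when
    the measured visibility lies within the standard range, decaying with the
    deviation from the nearer range boundary otherwise."""
    if low <= visibility <= high:
        return 1.0                                  # within the standard range
    # deviation from the nearer boundary of the standard visibility range
    dev = (low - visibility) if visibility < low else (visibility - high)
    return math.exp(-k * dev)                       # suitability decays with deviation
```

A visibility far above the standard range (rubber too thin) or far below it (rubber too thick) both yield a low suitability, which is the signal the next step uses to adjust the mold.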
S110, adjusting the dipping mold according to the dipping suitability of the knitted glove.
Having obtained the dipping suitability of the protective glove, the dipping mold is adjusted accordingly: the suitability indicates whether the amount of rubber picked up by the mold is appropriate, and the dipping machine is adjusted on that basis.
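One way to close the loop from suitability to machine adjustment is a simple dead-band controller. The patent does not specify the adjustment rule, so the target, tolerance, step size, and sign conventions below are all assumptions for illustration:

```python
def adjust_dip_depth(current_depth_mm, suitability, target=1.0, tol=0.05, step_mm=0.5):
    """Illustrative control rule (not from the patent): nudge the mold's dip
    depth whenever the measured suitability leaves the tolerance band."""
    if suitability < target - tol:
        return current_depth_mm + step_mm   # under-dipped: dip deeper (assumed sign)
    if suitability > target + tol:
        return current_depth_mm - step_mm   # over-dipped: dip shallower (assumed sign)
    return current_depth_mm                 # within tolerance: leave the mold alone
```

The dead band prevents the machine from oscillating on every frame, while repeated small steps also compensate for the slowly falling rubber-pool level mentioned in the background.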
The beneficial effects of this embodiment are as follows:
By combining computer vision and image processing, the knitting-thread visibility of the protective glove is obtained by analyzing the differences in gray-value variation between adjacent pixels in the pre-dipping image of the dipping area and in the difference image of the dipping area; from the visibility the dipping suitability is calculated, effectively improving the accuracy of the dipping degree.
This embodiment uses the obtained dipping suitability to judge how reasonable the current dipping behavior is, finally realizing adaptive adjustment of the dipping process and effectively making control of the dipping degree intelligent.
Example 2
The embodiment of the invention provides a method for controlling gum dipping of protective gloves, which comprises the following steps of:
s201, collecting images of the knitted gloves before and after gum dipping.
The glove mold is sleeved with a thread knitted glove needing gum dipping, the glove enters the rubber pool by controlling the angle of the mold bracket, so that rubber can be dipped into the glove, and then the mold is lifted to finish the gum dipping process.
After the gloves are dipped, a corresponding track is required to be arranged opposite to the glove mold, and a camera is erected on the track, so that the camera can acquire suitable images of the gloves before and after dipping.
S202, obtaining gum dipping area images of the knitted gloves before and after gum dipping.
The factory environment is relatively complex and the background is highly variable, so a deep-learning neural network is used for target detection of the dipped glove in order to locate it quickly and accurately in the image.
This embodiment obtains the dipped-glove area with a DNN trained for semantic segmentation. The details of the network are as follows:
1) The network takes the semantic-segmentation form, with an Encoder-Decoder structure.
2) The dataset consists of actually acquired images of dipped gloves on the mold.
3) The label is a single-channel semantic label: pixels belonging to the dipped glove are marked 1, and the remaining background pixels are marked 0.
4) The loss function is the cross-entropy loss.
Thus the post-dipping image of the dipping area is captured by the camera, the trained network infers the mask image of the dipped glove, and multiplying the mask with the captured RGB image extracts the dipped glove from the image.
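The final mask-multiplication step can be sketched as follows, with a toy 2×2 frame and mask standing in for the camera image and the network's output:

```python
import numpy as np

def extract_glove(rgb, mask):
    """Multiply the single-channel segmentation mask (glove = 1, background = 0)
    with the RGB frame to keep only the dipped-glove pixels."""
    return rgb * mask[:, :, None]   # broadcast the mask across the 3 channels

rgb = np.full((2, 2, 3), 200, dtype=np.uint8)   # toy camera frame
mask = np.array([[1, 0],
                 [0, 1]], dtype=np.uint8)       # toy segmentation output
glove_only = extract_glove(rgb, mask)
```

Background pixels become zero while glove pixels keep their original values, so all later gray-value statistics are computed only over the glove region.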
S203, obtaining a gum dipping area difference diagram of the knitted gloves.
Because of problems with rubber preparation or with the pressing angle of the mold, dipping results vary from glove to glove. The corresponding images are shown in figs. 3 and 4: fig. 3 shows a poor dipping result, in which the knitting-thread texture is clearly visible through the rubber portion; fig. 4 shows a good dipping result, in which the rubber portion covers the knitting threads beneath it.
To judge the dipping effect, images taken before and after dipping must be collected and compared.
First, an image of the glove's dipping area before dipping is acquired and denoted QF, and an image of the same dipping area after dipping is acquired and denoted JF; the neural network above is used to identify the glove target in each image.
Both images are converted to grayscale, and JF is subtracted from QF to obtain the difference image, which reflects the change caused by dipping. The difference image is shown in fig. 5.
S204, obtaining a contrast area 1 and a contrast area 2 corresponding to the gray level images of the two glove dipping areas.
After the difference image is obtained, the knitting-thread visibility is derived by analyzing the consistency and the degree of difference between the gray fluctuations of the difference image and those of the pre-dipping image.
1) First, apply morphological processing (a closing operation) to the difference image. This removes isolated noise and joins the difference regions into one closed region, which is the dipping area; from it the binary dipping mask image is obtained.
2) Multiply the binary mask image with the pre-dipping image to obtain comparison area 1. The corresponding connected-component region in the difference image is comparison area 2. The two areas are then analyzed against each other.
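The closing operation in step 1) can be sketched directly in NumPy with an assumed 3×3 structuring element (a production pipeline would more likely call `cv2.morphologyEx` or `scipy.ndimage.binary_closing`):

```python
import numpy as np

def dilate(b):
    """3x3 binary dilation (background padding)."""
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    for di in range(3):
        for dj in range(3):
            out |= p[di:di + b.shape[0], dj:dj + b.shape[1]]
    return out

def erode(b):
    """3x3 binary erosion (foreground padding, so borders are not eaten away)."""
    p = np.pad(b, 1, constant_values=1)
    out = np.ones_like(b)
    for di in range(3):
        for dj in range(3):
            out &= p[di:di + b.shape[0], dj:dj + b.shape[1]]
    return out

def closing(b):
    """Closing = dilation followed by erosion: fills small gaps so the
    difference regions merge into one closed dipping-area mask."""
    return erode(dilate(b))

mask = np.array([[1, 0, 1, 1],
                 [1, 1, 1, 1]], dtype=np.uint8)   # toy difference mask with a gap
closed = closing(mask)
```

The one-pixel hole in the toy mask is filled, giving a single connected region from which the binary dipping mask is taken.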
S205, obtaining the fluctuation positions of the rows and columns in the two comparison areas and the pixel difference values at those positions.
In a grayscale image, texture appears as a regular variation of gray values. The two comparison areas cover the same region, and the evaluation chiefly concerns the proportion of the region in which the gray variation (i.e., the texture) is the same.
The texture inside each region is described by the row and column fluctuation sequences of the minimum bounding rectangle of its connected component. The gray values inside the connected component are first histogrammed to obtain a fluctuation judgment threshold; whenever the gray difference of adjacent pixels exceeds this threshold, one fluctuation is counted.
(1) Compute the gray differences of adjacent pixels along each row and each column of comparison area 1 to obtain its row and column adjacent-pixel difference images.
(a) Subtract horizontally adjacent pixels pairwise, $d_r(i,j) = g(i,j) - g(i,j+1)$, to obtain the row adjacent-pixel difference image, where $g(i,j)$ is the gray value of the pixel in row $i$, column $j$, $g(i,j+1)$ is the gray value of the pixel in row $i$, column $j+1$, and $d_r(i,j)$ is the difference between these two horizontally adjacent pixels.
(b) Subtract vertically adjacent pixels pairwise, $d_c(i,j) = g(i,j) - g(i+1,j)$, to obtain the column adjacent-pixel difference image, where $g(i+1,j)$ is the gray value of the pixel in row $i+1$, column $j$, and $d_c(i,j)$ is the difference between these two vertically adjacent pixels.
(2) And (4) counting the gray value of the difference image of the adjacent pixels in the row of the contrast area 1 to obtain a difference histogram of the adjacent pixels in the row of the contrast area 1.
(3) And calculating the distance of the peak value from the center and the probability density at the peak value in the difference value histogram of the adjacent pixels in the row of the comparison area 1, and acquiring the row motion judgment threshold value of the comparison area 1.
And calculating the distance of the peak value in the pixel value histogram from the center and the probability density at the peak value to determine a judgment threshold value, wherein the threshold value is relatively small when the histogram is biased to the left, and is relatively large when the histogram is biased to the right.
① Count the gray values of the row adjacent-pixel difference image (or the column adjacent-pixel difference image), obtain the abscissa g_max of the maximum gray value of the image and the abscissa g_min of the minimum gray value, and calculate the midpoint of the abscissa, g_mid = (g_max + g_min) / 2, wherein g_max is the abscissa of the maximum gray value of the image, g_min is the abscissa of the minimum gray value of the image, and g_mid is the midpoint of the abscissa of the image gray values.
② Find the abscissa g_peak at which the gray-value frequency is highest.
③ Calculate the distance between the frequency peak and the midpoint, L = |g_peak - g_mid|, wherein g_peak is the abscissa with the highest gray-value frequency, g_mid is the midpoint of the abscissa of the image gray values, and L is the distance between the frequency peak and the midpoint.
④ Calculate the variance of the pixel values.
⑤ Calculate the probability density p within the interval around the frequency peak.
⑥ Obtain the fluctuation judgment threshold T from p, g_mid and L, wherein T denotes the fluctuation judgment threshold, p denotes the probability density within the interval around the frequency peak, g_mid is the midpoint of the abscissa of the image gray values, and L is the distance between the frequency peak and the midpoint.
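The threshold derivation above (histogram extremes, midpoint, frequency peak, variance, probability density near the peak) can be sketched as follows. The patent gives the peak interval and the final combining formula only as embedded images, so the one-standard-deviation interval and the last line are assumed forms, chosen to match the stated behaviour (a histogram skewed left of center yields a smaller threshold, skewed right a larger one); all names are illustrative:

```python
import numpy as np

def fluctuation_threshold(diff_img):
    """Derive a fluctuation judgment threshold from the histogram of an
    adjacent-pixel difference image.  The final combining formula is an
    assumption; the patent shows it only as an image."""
    vals = diff_img.ravel()
    g_min, g_max = vals.min(), vals.max()      # abscissa extremes of the histogram
    g_mid = (g_min + g_max) / 2.0              # midpoint of the abscissa
    levels, counts = np.unique(vals, return_counts=True)
    g_peak = levels[np.argmax(counts)]         # abscissa with the highest frequency
    offset = g_peak - g_mid                    # signed peak offset from the midpoint
    sigma = vals.std()                         # spread of the pixel values
    lo, hi = g_peak - sigma, g_peak + sigma    # assumed interval around the peak
    p = np.mean((vals >= lo) & (vals <= hi))   # probability mass in that interval
    return g_mid + p * offset                  # assumed: left skew -> smaller threshold
```

For the toy difference data `[0, 0, 0, 0, 10]` the midpoint is 5, the peak sits at 0, and the assumed formula pulls the threshold down to 1.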
(4) Compare the gray difference values in the row adjacent-pixel difference image of comparison area 1 with the row fluctuation judgment threshold to acquire the row fluctuation positions in comparison area 1.
(5) Repeat the above steps to obtain the column fluctuation positions in comparison area 1.
(6) Acquire the fluctuation-position pixel differences of the rows and columns in comparison area 1 from the row and column fluctuation positions in comparison area 1.
(7) Obtain the row and column fluctuation positions and fluctuation-position pixel differences of comparison area 2 in the same way.
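Marking the fluctuation positions and keeping their pixel differences can be sketched as follows; comparing the absolute difference against the threshold is an assumption (the patent only says the difference is judged against the threshold), and the names are illustrative:

```python
import numpy as np

def fluctuation_positions(diff_img, threshold):
    """Mark positions whose absolute adjacent-pixel gray difference
    exceeds the fluctuation judgment threshold, and keep the pixel
    difference value at those positions (zero elsewhere)."""
    mask = np.abs(diff_img) > threshold   # True at fluctuation positions
    values = np.where(mask, diff_img, 0)  # fluctuation-position pixel differences
    return mask, values
```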
S206, obtaining the row-column fluctuation sequence of the two comparison areas.
Taking the comparison area 1 as an example, the specific process is as follows:
(1) Threshold the adjacent-pixel difference images to obtain the fluctuation position images: the column fluctuation position image F_c and the row fluctuation position image F_r.
(2) Acquire the position centerlines of the connected domain.
① Acquire the longitudinal position centerline of the F_c image. From the previously obtained dipping connected domain, extract the minimum transverse coordinate x_min and the maximum transverse coordinate x_max of the connected domain, and calculate the mean of the two transverse coordinates, x_c = (x_min + x_max) / 2; the line whose transverse coordinate is x_c is the longitudinal position centerline.
② Acquire the transverse position centerline of the F_r image. By analogy with step ①, the transverse position centerline with longitudinal coordinate y_c is obtained.
(3) Calculate the distance of each fluctuation position from the centerline.
① Find the distance of each fluctuation position in the F_c image from the longitudinal position centerline. For a fluctuation position with coordinates (i, j), the distance from the position centerline is l(i, j) = |j - x_c|, wherein (i, j) denotes the fluctuation position coordinates at row i, column j of the image, j is the transverse coordinate of the fluctuation position, and x_c is the mean of the transverse coordinates.
② Find the distance of each fluctuation position in the F_r image from the transverse position centerline. For a fluctuation position with coordinates (i, j), the distance from the position centerline is l(i, j) = |i - y_c|, wherein (i, j) denotes the fluctuation position coordinates at row i, column j of the image, i is the longitudinal coordinate of the fluctuation position, and y_c is the mean of the longitudinal coordinates.
This process yields the fluctuation distance image L_c corresponding to the F_c image and the fluctuation distance image L_r corresponding to the F_r image.
(4) Calculate the distance of the fluctuation positions of each row and each column from the position centerline, and sum the distances to obtain the fluctuation sequences.
Add up the pixel values of each row in the L_c image, that is, the sum of l(i, j) over j = 1, ..., n, to obtain the column fluctuation sequence, wherein (i, j) denotes the fluctuation position coordinates at row i, column j of the image, l(i, j) denotes the distance of each fluctuation position in the F_c image from the longitudinal position centerline, and n denotes the number of columns in the F_c image.
Add up the pixel values of each column in the L_r image, that is, the sum of l(i, j) over i = 1, ..., m, to obtain the row fluctuation sequence, wherein (i, j) denotes the fluctuation position coordinates at row i, column j of the image, l(i, j) denotes the distance of each fluctuation position in the F_r image from the transverse position centerline, and m denotes the number of rows in the F_r image.
The above process is repeated to obtain the row and column fluctuation sequences of comparison area 2.
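The centerline distances and per-row/per-column sums of S206 can be sketched as follows, assuming the fluctuation position image is a 0/1 mask and the connected-domain coordinate extents are known; the function names and the mask representation are illustrative:

```python
import numpy as np

def column_fluctuation_sequence(mask, x_min, x_max):
    """Distance of every fluctuation position from the vertical
    (longitudinal) centerline x_c = (x_min + x_max) / 2, summed along
    each row: one value per image row."""
    x_c = (x_min + x_max) / 2.0
    cols = np.arange(mask.shape[1])
    dist = np.abs(cols - x_c) * mask   # fluctuation distance image
    return dist.sum(axis=1)            # sum over the n columns of each row

def row_fluctuation_sequence(mask, y_min, y_max):
    """Distance of every fluctuation position from the horizontal
    (transverse) centerline y_c = (y_min + y_max) / 2, summed along
    each column: one value per image column."""
    y_c = (y_min + y_max) / 2.0
    rows = np.arange(mask.shape[0])[:, None]
    dist = np.abs(rows - y_c) * mask
    return dist.sum(axis=0)            # sum over the m rows of each column
```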
S207, acquire the fluctuation consistency of the two comparison areas.
(1) Calculate the Euclidean distances between the row (column) fluctuation sequences of the two areas: the Euclidean distance E_r of the two areas' row fluctuation sequences and the Euclidean distance E_c of the two areas' column fluctuation sequences.
(2) Judge the gray-level fluctuation consistency through the Euclidean distances. The final Euclidean distance E of the two areas is computed from E_r and E_c, wherein E_r denotes the Euclidean distance of the two areas' row fluctuation sequences, E_c denotes the Euclidean distance of the two areas' column fluctuation sequences, and E denotes the final Euclidean distance of the two areas.
The fluctuation consistency of the gray values is then computed from E, wherein E denotes the final Euclidean distance of the two areas and C denotes the fluctuation consistency of the two areas.
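S207 can be sketched as follows. The patent shows the formula combining the two Euclidean distances and the mapping from distance to consistency only as images, so the mean of the two distances and the exp(-E) mapping are assumed stand-ins (identical sequences then give a consistency of exactly 1):

```python
import numpy as np

def fluctuation_consistency(row_seq1, col_seq1, row_seq2, col_seq2):
    """Euclidean distances between the two areas' row and column
    fluctuation sequences, combined into one distance and mapped to a
    consistency score in (0, 1].  Combination and mapping are assumed."""
    e_r = np.linalg.norm(np.asarray(row_seq1, float) - np.asarray(row_seq2, float))
    e_c = np.linalg.norm(np.asarray(col_seq1, float) - np.asarray(col_seq2, float))
    e = (e_r + e_c) / 2.0   # assumed combination of the two distances
    return np.exp(-e)       # assumed mapping: larger distance -> lower consistency
```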
S208, acquire the average fluctuation difference degree of the two comparison areas.
The gray difference value of each fluctuation position was obtained above. In the required image, keep the pixel values at the fluctuation positions and set the pixel values of all other area positions to zero to obtain the column fluctuation difference image V_c; by analogy with this procedure, obtain the row fluctuation difference image V_r.
Find the mean value of each row of the V_c image and the mean value of the difference values of each column of the V_r image; in this way the row fluctuation difference sequence and the column fluctuation difference sequence can be obtained.
Calculate the Euclidean distance X_r between the two areas' row fluctuation difference sequences and the Euclidean distance X_c between the two areas' column fluctuation difference sequences. The average fluctuation difference degree of the two areas is then computed from X_r and X_c, wherein X_r denotes the Euclidean distance between the two areas' row fluctuation difference sequences, X_c denotes the Euclidean distance between the two areas' column fluctuation difference sequences, and H denotes the average fluctuation difference degree of the two areas.
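S208 can be sketched as follows; taking the mean of the two Euclidean distances as the "average fluctuation difference degree" is an assumed stand-in for the patent's image-only formula, and the names are illustrative:

```python
import numpy as np

def difference_sequences(col_diff_img, row_diff_img):
    """Per-row means of the column fluctuation difference image and
    per-column means of the row fluctuation difference image."""
    return col_diff_img.mean(axis=1), row_diff_img.mean(axis=0)

def average_fluctuation_difference(row_seq1, col_seq1, row_seq2, col_seq2):
    """Mean of the Euclidean distances between the two areas' row and
    column fluctuation difference sequences (combination assumed)."""
    x_r = np.linalg.norm(np.asarray(row_seq1) - np.asarray(row_seq2))
    x_c = np.linalg.norm(np.asarray(col_seq1) - np.asarray(col_seq2))
    return (x_r + x_c) / 2.0
```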
S209, acquire the knitting thread visibility of the knitted glove.
Calculate the visibility of the knitting threads in the dipping area. The visibility Q of the threads in the dipping area is computed from the fluctuation consistency and the average fluctuation difference degree of the two areas, wherein C denotes the fluctuation consistency of the two areas, H denotes the average fluctuation difference degree of the two areas, and Q denotes the knitting thread visibility.
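The patent's visibility expression appears only as an image. The sketch below is a purely assumed form in which the visibility rises with the fluctuation consistency C and falls with the average fluctuation difference degree H:

```python
def thread_visibility(consistency, avg_difference):
    """Assumed visibility form: identical fluctuation behaviour in the
    two comparison areas (C = 1, H = 0) gives maximal visibility."""
    return consistency / (1.0 + avg_difference)
```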
S210, acquire the knitted glove with appropriate dipping and gluing.
Extract the thread visibility of standard gloves to obtain the standard thread visibility range [a1, a2], and determine whether the thread visibility Q falls within [a1, a2].
When Q falls outside [a1, a2], calculate the deviation of the thread visibility from the range boundaries: the deviation b1 of the thread visibility from the smaller boundary of the range and the deviation b2 from the larger boundary; the final deviation b is whichever of b1 and b2 has the smaller absolute value.
Calculate the dipping suitability N by means of a standard-deviation-style formula from an empirical value, the final deviation b of the thread visibility from the range boundaries, and the boundary value of the standard glove thread visibility range.
Thus, the suitability of the current dipping operation is obtained.
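S210 can be sketched as follows. The patent's suitability formula is image-only, so the Gaussian-style falloff, the in-range score of 1, and the empirical value `sigma` are all assumptions:

```python
import math

def dipping_suitability(visibility, lo, hi, sigma=1.0):
    """Assumed sketch: full suitability inside the standard visibility
    range [lo, hi]; outside it, a Gaussian-style falloff driven by the
    smaller deviation from the two range boundaries."""
    if lo <= visibility <= hi:
        return 1.0                                       # within the standard range
    dev = min(abs(visibility - lo), abs(visibility - hi))  # final boundary deviation
    return math.exp(-(dev * dev) / (2.0 * sigma * sigma))
```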
S211, adjust the control parameters of the dipping mold according to the dipping suitability of the knitted glove to realize intelligent control.
After the suitability corresponding to the dipping operation is obtained, the next control parameters of the mold are adjusted according to the suitability, so that abnormalities are corrected effectively.
Adjust the dipping parameters of the mold according to the suitability value:
When the suitability indicates that the dipping amount is too small, the dipping amount of the mold needs to be increased by adjusting the control parameters.
When the suitability indicates that the dipping amount is appropriate, no adjustment is needed.
When the suitability indicates that the dipping amount is too large, the dipping amount of the mold is reduced by adjusting the control parameters.
The beneficial effects of this embodiment are as follows:
By combining computer vision and image processing, the knitting thread visibility of the protective glove is obtained by analyzing the differences in gray-value variation of adjacent pixel points in the pre-dipping dipping-area image and the dipping-area difference image, from which the dipping suitability of the protective glove is calculated; this effectively improves the accuracy of the dipping degree.
In this embodiment, the obtained dipping suitability is used to judge how reasonable the current dipping behavior is, so that adaptive adjustment of the dipping process is finally realized and intelligent control of the dipping degree is effectively achieved.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. The method for controlling gum dipping of the protective gloves is characterized by comprising the following steps:
obtaining a gum dipping area gray scale image before and after gum dipping of the knitted gloves;
obtaining a gum dipping area difference diagram of the knitted gloves by differentiating the gum dipping area gray level diagrams before and after gum dipping of the knitted gloves;
masking a gum dipping area gray scale image before gum dipping of the knitted gloves to obtain a comparison area 1 corresponding to the gum dipping area gray scale image;
masking the gum dipping area difference image of the knitted glove to obtain a comparison area 2 corresponding to the gum dipping area difference image;
respectively calculating gray level difference values of adjacent pixel points in the comparison area 1 and the comparison area 2, and acquiring fluctuation positions of rows and columns and fluctuation position pixel difference values in the two comparison areas;
acquiring fluctuation consistency of the two comparison areas by utilizing fluctuation positions of rows and columns in the two comparison areas;
obtaining the average fluctuation difference degree of the two comparison areas by utilizing the fluctuation position pixel difference values in the two comparison areas;
acquiring the knitting line visibility of the knitted gloves according to the fluctuation consistency and the average fluctuation difference degree of the two contrast areas;
calculating the dipping suitability of the knitted gloves according to the knitting thread visibility of the knitted gloves;
and adjusting the dipping mold according to the dipping suitability of the knitted gloves.
2. The method for controlling gum dipping of protective gloves according to claim 1, wherein the gray-scale maps of the gum dipping areas before and after the gum dipping of the knitted gloves are obtained as follows:
collecting images of the knitted gloves before and after gum dipping;
semantic segmentation is carried out on the images of the knitted gloves before and after gum dipping to obtain gum dipping area images of the knitted gloves before and after gum dipping;
the gray level images of the dipping areas before and after the knitted gloves are dipped are obtained by carrying out gray level processing on the images of the dipping areas before and after the knitted gloves are dipped.
3. The method for controlling dipping of protective gloves according to claim 1, wherein the fluctuation positions of the rows and the columns in the two comparison areas and the pixel difference value of the fluctuation positions are obtained as follows:
calculating the gray difference value of each row of adjacent pixel points in the comparison area 1, and acquiring a row adjacent pixel difference value image of the comparison area 1;
counting the gray value of the difference image of the adjacent pixels in the row of the comparison area 1 to obtain a difference histogram of the adjacent pixels in the row of the comparison area 1;
calculating the distance of the peak value from the center and the probability density at the peak value in the row adjacent pixel difference histogram of the comparison area 1, and acquiring a row fluctuation judgment threshold value of the comparison area 1;
comparing the gray level difference values of adjacent pixels in the row adjacent pixel difference image of the comparison area 1 with the row fluctuation judgment threshold value, and acquiring the fluctuation positions of the rows in the comparison area 1;
repeating the steps to obtain the fluctuation position of the columns in the comparison area 1;
acquiring a fluctuation position pixel difference value of the rows and the columns in the comparison area 1 according to the fluctuation positions of the rows and the columns in the comparison area 1;
the fluctuation positions of the rows and columns in the contrast area 2 and the fluctuation position pixel difference values are obtained in the manner described above.
4. The method for controlling dipping of protective gloves according to claim 1, wherein the fluctuation consistency of the two contrast areas is obtained as follows:
thresholding the row adjacent pixel difference image of the comparison area 1 to obtain the row fluctuation position image F of the comparison area 1;
obtaining the transverse position centerline of the F image from the coordinates of the pixel points in the row fluctuation position image F of the comparison area 1;
calculating the distance between each fluctuation position in the F image and the transverse position centerline to obtain the fluctuation distance image L corresponding to the F image;
calculating the sum of the distances from the fluctuation positions of each row in the L image to the position centerline to obtain the row fluctuation sequence of the comparison area 1;
repeating the above steps to obtain the column fluctuation sequence of the comparison area 1;
obtaining the row fluctuation sequence and the column fluctuation sequence of the comparison area 2 in the same manner;
respectively calculating the Euclidean distance of the row fluctuation sequences and the Euclidean distance of the column fluctuation sequences of the two comparison areas;
and acquiring the fluctuation consistency of the two comparison areas according to the Euclidean distances of the fluctuation sequences of the two comparison areas.
5. The method for controlling dipping of protective gloves according to claim 1, wherein the average fluctuation difference degree of the two contrast areas is obtained as follows:
setting the pixel values of the area positions other than the row fluctuation positions in the comparison area 1 to 0 to obtain the row fluctuation difference image V of the comparison area 1;
calculating the mean value of the difference values of each row in the V image to acquire the row fluctuation difference sequence of the comparison area 1;
repeating the above steps to obtain the column fluctuation difference sequence of the comparison area 1;
obtaining the row fluctuation difference sequence and the column fluctuation difference sequence of the comparison area 2 in the same manner;
respectively calculating the Euclidean distances of the row fluctuation difference sequences and the column fluctuation difference sequences of the two comparison areas;
and acquiring the average fluctuation difference degree of the two comparison areas according to the Euclidean distances of the fluctuation difference sequences of the two comparison areas.
6. The method for controlling gum dipping of protective gloves according to claim 1, wherein the knitting thread visibility Q of the knitted gloves is calculated from the fluctuation consistency C of the two areas and the average fluctuation difference degree H of the two areas.
7. The method for controlling gum dipping of protective gloves according to claim 1, wherein the dipping suitability of the knitted gloves is calculated, by means of a standard-deviation-style formula, from an empirical value, the boundary value of the standard glove thread visibility range, and the final deviation of the thread visibility from the range boundaries.
CN202211249088.1A 2022-10-12 2022-10-12 Gum dipping control method for protective gloves Active CN115319979B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211249088.1A CN115319979B (en) 2022-10-12 2022-10-12 Gum dipping control method for protective gloves

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211249088.1A CN115319979B (en) 2022-10-12 2022-10-12 Gum dipping control method for protective gloves

Publications (2)

Publication Number Publication Date
CN115319979A true CN115319979A (en) 2022-11-11
CN115319979B CN115319979B (en) 2023-01-03

Family

ID=83913187

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211249088.1A Active CN115319979B (en) 2022-10-12 2022-10-12 Gum dipping control method for protective gloves

Country Status (1)

Country Link
CN (1) CN115319979B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN208180066U (en) * 2018-01-25 2018-12-04 黄兆周 A kind of full-automatic multi-functional safety and industrial gloves production line
CN112198161A (en) * 2020-10-10 2021-01-08 安徽和佳医疗用品科技有限公司 PVC gloves real-time detection system based on machine vision
CN114248385A (en) * 2021-12-15 2022-03-29 张家港思淇科技有限公司 Production process of gum dipping gloves
CN114627117A (en) * 2022-05-13 2022-06-14 启东市鸿盛纺织有限公司 Knitted fabric defect detection method and system based on projection method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116721067A (en) * 2023-05-29 2023-09-08 宿迁凯达环保设备制造有限公司 Impregnated paper impregnation quality detection method based on machine vision
CN116721067B (en) * 2023-05-29 2024-04-12 宿迁凯达环保设备制造有限公司 Impregnated paper impregnation quality detection method based on machine vision

Also Published As

Publication number Publication date
CN115319979B (en) 2023-01-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant