CN115035116A - Waterproof cloth quality inspection method based on computer vision - Google Patents

Waterproof cloth quality inspection method based on computer vision

Info

Publication number
CN115035116A
CN115035116A
Authority
CN
China
Prior art keywords
waterproof cloth
value
water drop
edge
lbp
Prior art date
Legal status
Granted
Application number
CN202210959332.7A
Other languages
Chinese (zh)
Other versions
CN115035116B (en)
Inventor
李丽艳
Current Assignee
Nantong Yanlu Enterprise Management Consulting Co ltd
Original Assignee
Qidong Gude Waterproof Fabric Co ltd
Priority date
Filing date
Publication date
Application filed by Qidong Gude Waterproof Fabric Co ltd filed Critical Qidong Gude Waterproof Fabric Co ltd
Priority to CN202210959332.7A
Publication of CN115035116A
Application granted
Publication of CN115035116B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a waterproof cloth quality inspection method based on computer vision. The method obtains each water-drop edge in a waterproof cloth image and derives a primitive size coefficient and a primitive shape coefficient for each water drop from the minimum circumscribed rectangle of its edge. The rectangle is divided by a grid into primitives; for each of the eight neighborhood primitives, an arithmetic coding value is obtained from the arithmetic coding of the chain code of the water-drop edge pixels it contains, and an LBP value is computed for the primitive. The eight neighborhood primitives are traversed to obtain LBP value combinations, from which the LBP characteristic value of the water-drop edge is selected. An edge vector is formed from the primitive size coefficient, the primitive shape coefficient and the LBP characteristic value, and the water drops are classified by the similarity of their edge vectors. A characterization quantity of each piece of waterproof cloth is then obtained from the edge vectors of each class of water drops and the number of water drops of that class in the piece, the pieces of waterproof cloth are classified by their characterization quantities, and a sample of each class of waterproof cloth is selected to take part in machine detection. The method improves the reliability and representativeness of the samples selected for waterproof cloth material testing.

Description

Waterproof cloth quality inspection method based on computer vision
Technical Field
The application relates to the field of machine vision, in particular to a waterproof cloth quality inspection method based on computer vision.
Background
For waterproof cloth, the waterproofness of the material is the most important index to be inspected. Existing methods for detecting the waterproof performance of cloth usually measure waterproofness, air permeability and the like with professional instruments; they can only perform sampling inspection, so the detection results are rather random. On this basis, the present method enables batch inspection of waterproof cloth: representative pieces of waterproof cloth are selected according to the detection results on the cloth material and are then tested with professional instruments, so that the waterproof performance of all the waterproof cloth is known more clearly, which facilitates production and sale of the waterproof cloth and keeps its quality under control.
Disclosure of Invention
The invention provides a waterproof cloth quality inspection method based on computer vision, which solves the problem that random sampling inspection cannot accurately reflect the waterproof performance of the waterproof cloth during quality inspection. The following technical scheme is adopted:
acquiring a waterproof cloth image, and detecting each water-drop edge in the waterproof cloth image;
obtaining the primitive size coefficient and the primitive shape coefficient of each water drop according to the minimum circumscribed rectangle of its edge;
dividing the minimum circumscribed rectangle of each water-drop edge by a grid into a central primitive and eight neighborhood primitives of the water drop;
acquiring, for each of the eight neighborhood primitives, the arithmetic coding result interval corresponding to the chain code of the water-drop edge pixels in that primitive, and taking the middle value of the interval as the arithmetic coding value of the primitive;
obtaining the LBP value of each of the eight neighborhood primitives from its arithmetic coding value and the mean arithmetic coding value of all the primitives;
traversing the eight neighborhood primitives counterclockwise with each of them as a starting point to obtain eight LBP value combinations, converting each combination into a decimal number, and selecting the combination with the smallest decimal value as the LBP characteristic value of the water-drop edge;
obtaining the edge vector of each water drop from its primitive size coefficient, primitive shape coefficient and LBP characteristic value;
classifying the water drops according to the similarity of their edge vectors;
obtaining the characterization quantity of each piece of waterproof cloth from the edge vector of each class of water drops in that piece and the number of water drops of that class;
and classifying the pieces of waterproof cloth according to their characterization quantities, and selecting a sample from each class of waterproof cloth to take part in machine detection.
The primitive size coefficient and the primitive shape coefficient of the water drop are calculated as follows:
the primitive size coefficient is computed from the length and the width of the minimum circumscribed rectangle of the water-drop edge, and gives the size coefficient of the water drop [the formula is given only as an image in the original publication];
the primitive shape coefficient is likewise computed from the length and the width of the minimum circumscribed rectangle, and gives the shape coefficient of the primitive [formula image not reproduced].
The method for acquiring the LBP value of the eight neighborhood primitives comprises the following steps:
comparing the arithmetic coding value of each of the eight neighborhood primitives with the mean of the arithmetic coding values of the eight neighborhood primitives; if the arithmetic coding value of the primitive is greater than or equal to the mean of the arithmetic coding values, the LBP value of the primitive is 0, otherwise the LBP value of the primitive is 1.
The edge vector of a water drop is composed of its primitive size coefficient, its primitive shape coefficient and the LBP characteristic value of its edge [the vector expression is given only as an image in the original publication].
The method for classifying the water drops comprises the following steps:
computing the cosine similarity between the edge vectors of different water drops [the formula is given only as an image in the original publication];
computing the code similarity of different water drops from the LBP characteristic values of their edges, according to the number of corresponding bit positions that carry the same digit [formula image not reproduced], i.e. the number of bit positions at which the two binary numbers agree;
and computing the product of the cosine similarity and the code similarity; if the product is greater than 0.8, the two water drops being compared belong to the same class.
The method for acquiring the characterization quantity of each piece of waterproof cloth comprises the following steps:
counting the number of water drops of each category in each piece of waterproof cloth;
acquiring the average edge vector of each category of water drops;
taking the number of water drops of each category as the weight of that category;
and taking the sum, over all categories, of the product of each category's average edge vector and its weight as the characterization quantity of the waterproof cloth.
The method for classifying the waterproof cloth comprises the following steps:
and calculating cosine similarity among the characterization quantities of different waterproof cloths, and classifying the waterproof cloths with the cosine similarity larger than 0.8 into one class.
The method for selecting the sample of each type of waterproof cloth comprises the following steps:
for each image in each class of waterproof cloth, calculating the sum of the cosine similarities between its characterization quantity and the characterization quantities of the other images, and taking the waterproof cloth image with the minimum sum of cosine similarities as the sample of that class of waterproof cloth.
The invention has the following beneficial effects:
Surface detection of the waterproof cloth material is realized on the basis of machine vision and a representative sample is selected. Water drops on the waterproof cloth are detected, primitives are divided to obtain the primitive size coefficient and the primitive shape coefficient, and the arithmetic coding value of each primitive is expressed by the middle value of the arithmetic coding interval of the chain code of the water-drop edge pixels in that primitive. The LBP values of the eight neighborhood primitives are obtained from the arithmetic coding value of each primitive and the mean arithmetic coding value, the eight neighborhood primitives are traversed starting from each neighborhood primitive in turn to obtain LBP value combinations, and the combination with the smallest decimal value is selected as the LBP characteristic value of the water-drop edge. The edge vector of each water drop is obtained from its primitive size coefficient, primitive shape coefficient and LBP characteristic value, the water drops are classified by the similarity of their edge vectors, and the characterization quantity of each piece of waterproof cloth is obtained from the number of water drops of each class and the mean of their edge vectors. By detecting the water drops on the waterproof cloth on the basis of computer vision, the method avoids the uncertainty caused by random sampling, greatly improves the representativeness and reliability of the samples, and performs machine detection on the selected representative samples, so that accurate quality control of the waterproof cloth is achieved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow diagram of a method for inspecting waterproof cloth based on computer vision according to the present invention;
FIG. 2 is a schematic diagram of the elements of a computer vision-based tarpaulin quality inspection method of the present invention;
FIG. 3 is a schematic view of a chain code direction of a waterproof cloth quality inspection method based on computer vision according to the present invention;
fig. 4 is a schematic diagram of distribution of pixel points at the edge of a water droplet in the waterproof cloth quality inspection method based on computer vision.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An embodiment of a method for inspecting quality of waterproof cloth based on computer vision according to the present invention is shown in fig. 1, and includes:
the method comprises the following steps: acquiring the edge of each water drop in the waterproof cloth image;
the purpose of the step is to detect the edge of the water drop in the acquired waterproof cloth image through preprocessing.
In the usage scenario of this embodiment, a terrain model is set up, the waterproof cloth is laid over the terrain model according to the terrain conditions, and water is sprayed onto it by a water-spraying device. The pieces of waterproof cloth are then classified according to the roundness and the distribution of the water drops, and a representative piece of waterproof cloth is finally selected from each class for instrument testing.
The method for preprocessing the waterproof cloth image comprises the following steps:
Firstly, the objects in the waterproof cloth image are identified and segmented by means of DNN semantic segmentation.
The relevant details of the DNN network are as follows:
(1) the data set used is a waterproof cloth image data set acquired from a top-down view;
(2) the pixels to be segmented fall into two classes, so the labeling process of the training set is as follows: in the single-channel semantic label, a pixel belonging to the background class is labeled 0, and a pixel belonging to the waterproof cloth is labeled 1;
(3) the task of the network is classification, and the loss functions used are all cross-entropy loss functions.
Then, the 0-1 mask image obtained by semantic segmentation is multiplied by the original image to obtain an image containing only the waterproof cloth, removing the interference of the background.
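The masking step can be illustrated with a short sketch; the array layout assumed here (an H x W x 3 image and an H x W mask of 0/1 labels) is an illustrative assumption, not part of the patent.

import numpy as np

def remove_background(image, mask):
    # image: H x W x 3 waterproof cloth photograph, mask: H x W array of 0/1 segmentation labels
    mask3 = np.repeat(mask[:, :, None].astype(image.dtype), 3, axis=2)
    return image * mask3    # background pixels become 0, only the waterproof cloth remains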
The method for acquiring the edge of the water drop comprises the following steps:
(1) A YOLOv5 network is used to detect the water-drop targets in the image and obtain the bounding box of each water drop.
The relevant details of the network are as follows:
the applicable data set is a data set of different water-drop images acquired from a top-down view, covering water drops of various forms;
the pixels to be labeled fall into two classes, so the labeling process of the training set is as follows: in the single-channel semantic label, a pixel belonging to the background class is labeled 0 and a pixel belonging to a water drop is labeled 1;
the task of the network is classification, and the loss functions used are all cross-entropy loss functions;
the bounding box of each water drop on the waterproof cloth image is obtained through the YOLOv5 network.
(2) Canny edge detection is applied to the water drop inside each bounding box to obtain the water-drop edge within the box.
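A minimal detection-plus-edge sketch under stated assumptions is given below: the YOLOv5 weights loaded here ('yolov5s' via torch.hub) only stand in for the custom-trained water-drop detector described above, and the Canny thresholds (50, 150) are illustrative values not taken from the patent.

import cv2
import torch

model = torch.hub.load('ultralytics/yolov5', 'yolov5s')   # placeholder weights; a drop detector trained as described is assumed

def drop_edges(cloth_image):
    # cloth_image: BGR image of the waterproof cloth after background removal
    results = model(cv2.cvtColor(cloth_image, cv2.COLOR_BGR2RGB))
    edges_per_drop = []
    for x1, y1, x2, y2, conf, cls in results.xyxy[0].tolist():   # one row per detected water drop
        roi = cloth_image[int(y1):int(y2), int(x1):int(x2)]
        gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
        edges_per_drop.append(cv2.Canny(gray, 50, 150))          # binary edge map of the drop inside its bounding box
    return edges_per_drop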
Step two: obtaining the element size coefficient and the element shape coefficient of each water drop according to the minimum circumscribed rectangle of the edge of each water drop;
The purpose of this step is to calculate coefficients that reflect the size and shape characteristics of the water drop from the length and width of the minimum circumscribed rectangle of the water-drop edge.
The primitive size coefficient is computed from the length and the width of the minimum circumscribed rectangle of the water-drop edge, and gives the size coefficient of the water drop [the formula is given only as an image in the original publication].
The primitive shape coefficient is likewise computed from the length and the width of the minimum circumscribed rectangle, and gives the shape coefficient of the primitive [formula image not reproduced].
It should be noted that:
under normal circumstances the pixels of an image are square, i.e. their length and width are equal, but the water drops actually obtained are not necessarily standard spheres and may be aggregates of several drops, so their actual shapes vary; the primitive shape coefficient is therefore introduced to facilitate the subsequent calculation of the LBP edge features. A square has equal length and width and a primitive shape coefficient of 1, and the farther the shape of the water drop or drop aggregate departs from a square, the farther the primitive shape coefficient departs from 1;
different primitive shapes characterize the properties of different waterproof cloths: on a cloth with good waterproof performance the water drops are fairly round, whereas a cloth with poor waterproof performance absorbs some water, so its drops are less round and the drop aggregates spread out flat rather than standing in a more three-dimensional form.
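The exact formulas for the two coefficients appear only as images in the original publication, so the sketch below is an assumption-laden illustration: it uses OpenCV's minAreaRect for the minimum circumscribed rectangle and assumes the size coefficient to be the geometric mean of the two sides and the shape coefficient to be the side ratio, which is at least consistent with a square giving a shape coefficient of 1.

import cv2
import numpy as np

def size_and_shape_coefficients(edge_points):
    # edge_points: N x 2 array of (x, y) water-drop edge pixel coordinates
    rect = cv2.minAreaRect(np.asarray(edge_points, dtype=np.float32))
    w, h = rect[1]                       # side lengths of the minimum circumscribed rectangle
    H, W = max(w, h), min(w, h)          # call the longer side the length H and the shorter side the width W
    size_coef = float(np.sqrt(H * W))    # assumed definition of the primitive size coefficient
    shape_coef = W / H                   # assumed definition of the primitive shape coefficient (1 for a square)
    return size_coef, shape_coef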
Step three: dividing the minimum external rectangle at the edge of each water drop into a central element and eight neighborhood elements of the water drop by grids;
the purpose of this step is to get the primitive through the minimum bounding rectangle.
The method for converting the minimum bounding rectangle into the primitive comprises the following steps:
the minimum circumscribed rectangle of the edge of the bead is divided into a3 x 3 cell form by a grid, and the length and the width are uniformly divided into 3 parts to obtain a3 x 3 grid, wherein each grid can be used as a cell, and the cell positioned in the center is called as a center cell. The length of the minimum circumscribed rectangle is represented by H, the width of the minimum circumscribed rectangle is represented by W, as shown in FIG. 2, the edge of the bead is an ellipse part in the figure, the large rectangle in the figure is the minimum circumscribed rectangle of the edge of the bead, 9 small rectangles in the figure are 9 primitives, the small rectangle positioned in the center is the center primitive, 8 small rectangles around the small rectangle are eight neighborhood primitives of the center primitive, and the edge pixel points of the bead are in eight neighborhoods of the center primitive.
Step four: acquiring an arithmetic coding result interval corresponding to chain codes of pixel points at the edge of the water drop in each element in eight neighborhood elements, and taking a middle value of the arithmetic coding result interval as an arithmetic coding value of each element;
the purpose of this step is that the detail information of the original pixels at the edge of the bead is reflected by the primitive, and the adaptability information of the bead edge is increased relative to the original pixel grid.
The method for acquiring the chain code of the water drop edge pixel point in each element comprises the following steps:
according to the distribution positions of the pixel points of the water-drop edge and their adjacent pixel points, the 8-connected chain code of the water-drop pixels in each primitive is obtained as follows:
define 8 directions 1, 2, 3, 4, 5, 6, 7, 8, and take the first pixel in each primitive as the initial target pixel; determine in which of these directions the adjacent pixel of the target pixel lies; if the adjacent pixel lies in direction 1, the chain code so far is [1]; update the target pixel to the second pixel and determine the direction of the next adjacent pixel of the target pixel; if it lies in direction 5, the chain code becomes [1,5]; update the target pixel and continue this operation until the chain code of the water-drop edge pixels in the primitive is finally obtained.
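A minimal sketch of this chain-code extraction is given below. The mapping of the 8 direction labels to pixel offsets is defined by FIG. 3, which is not reproduced in this text, so the DIRECTIONS table used here is only an assumed example; the edge pixels are assumed to be given in order, each pixel 8-adjacent to the next.

# assumed mapping of the 8 chain-code direction labels to pixel offsets (dx, dy);
# the real mapping is defined by FIG. 3 and may differ
DIRECTIONS = {
    (1, 0): 1, (1, -1): 2, (0, -1): 3, (-1, -1): 4,
    (-1, 0): 5, (-1, 1): 6, (0, 1): 7, (1, 1): 8,
}

def chain_code(edge_pixels):
    # edge_pixels: ordered list of (x, y) coordinates of the water-drop edge pixels in one primitive
    code = []
    for (x0, y0), (x1, y1) in zip(edge_pixels, edge_pixels[1:]):
        code.append(DIRECTIONS[(x1 - x0, y1 - y0)])
    return code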
The method for acquiring the arithmetic coding value of the chain code of each primitive comprises the following steps:
(1) counting the number of occurrences and the probability of each value (direction) in the chain code of the primitive;
(2) dividing [0, 1) into several sub-intervals according to the probability of each value, the size of a sub-interval being proportional to the probability of its value (the larger the probability, the larger the sub-interval), so that all the sub-intervals together make up exactly [0, 1);
(3) reading the values of the chain code in the primitive one by one, taking the probability interval [L, H) of the current value from step (2) as the target interval, and dividing the target interval again according to the probability ratios of step (1);
(4) repeating step (3) until all the chain-code values of the water-drop edge pixels in the primitive have been processed, obtaining the final arithmetic coding interval of the primitive, and selecting the middle value of this interval as the arithmetic coding value of the primitive.
An example is given below.
The chain code is obtained as follows:
as shown in fig. 3, 8 chain code directions are defined. If the distribution of edge pixels is as shown in fig. 4, pixels 1, 2, 3, 4, 5 and 6 are the water-drop edge pixels in a certain primitive; analysis shows that pixel 2 is in direction 1 of pixel 1, pixel 3 is in direction 2 of pixel 2, pixel 4 is in direction 2 of pixel 3, pixel 5 is in direction 8 of pixel 4, and pixel 6 is in direction 7 of pixel 5, so the chain code of the water-drop edge pixels in the primitive is [1,2,2,8,7];
The arithmetic coding interval and the arithmetic coding value are obtained as follows:
(1) Counting the probability P (1) =0.2, P (2) =0.4, P (8) =0.2, P (7) =0.2 of the occurrence of elements in the chain code;
(2) dividing probability intervals: the probability interval of element 1 is [0,0.2 ], the probability interval of element 2 is [0.2,0.6 ], the probability interval of element 8 is [0.6,0.8 ], and the probability interval of element 7 is [0.8, 1);
(3) reading in chain code elements in sequence:
a. read in 1: its probability interval is [0,0.2), so the target probability interval becomes [0,0.2); according to the probability ratios of the chain-code elements in (2), the portions of the target probability interval allotted to the elements are P(1)=0.04, P(2)=0.08, P(8)=0.04 and P(7)=0.04, and the target probability interval is divided into:
the probability interval of element 1 is [0,0.04 ], the probability interval of element 2 is [0.04,0.12 ], the probability interval of element 8 is [0.12,0.16 ], and the probability interval of element 7 is [0.16, 0.2);
b. reading 2, taking the probability interval of the element 2 as a target probability interval, [0.04,0.12 ], and dividing the target probability interval into the following according to the probability ratio of the occurrence of the chain code element in (2):
the probability interval of element 1 is [0.04, 0.056), the probability interval of element 2 is [0.056,0.088), the probability interval of element 8 is [0.088,0.104), and the probability interval of element 7 is [0.104, 0.12);
c. reading 2 again, taking the probability interval of the element 2 in the previous step as a target probability interval [0.056,0.088), and dividing the target probability interval into the following probability intervals according to the probability ratio of the chain code elements in (2):
probability interval of element 1 is [0.056,0.0624), probability interval of element 2 is [0.0624,0.0752), probability interval of element 8 is [0.0752,0.0816), probability interval of element 7 is [0.0816, 0.088);
d. reading 8, taking the probability interval of the element 8 in the previous step as a target probability interval [0.0752,0.0816), and dividing the target probability interval into the following probability intervals according to the probability ratio of the occurrence of the chain code element in the step (2):
probability interval of element 1 is [0.0752,0.07648 ], probability interval of element 2 is [0.07648,0.07904 ], probability interval of element 8 is [0.07904,0.08032), probability interval of element 7 is [0.08032, 0.0816);
e. finally, reading 7, taking the probability interval of the element 7 in the previous step as a target probability interval [0.08032,0.0816 ], and dividing the target probability interval into the following probability intervals according to the probability ratio of the occurrence of the chain code elements in the step (2):
probability interval of element 1 is [0.08032,0.080576), probability interval of element 2 is [0.080576,0.081088), probability interval of element 8 is [0.081088,0.081344), probability interval of element 7 is [0.081344, 0.0816);
f. the final target probability interval obtained from operations a, b, c, d and e is [0.08032, 0.0816), the interval assigned to element 7, and this interval is taken as the arithmetic coding interval of the primitive.
g. the middle value of the arithmetic coding interval, 0.08096, is selected and used as the arithmetic coding value of the primitive.
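The arithmetic coding of a chain code described above can be sketched in a few lines; this is only an illustrative sketch (the symbol sub-intervals are laid out in order of first appearance, as in the worked example), not the patented implementation, and the printed value matches the example up to floating-point rounding.

from collections import Counter

def arithmetic_code_value(chain):
    counts = Counter(chain)
    total = len(chain)
    symbols = list(dict.fromkeys(chain))           # sub-interval order = order of first appearance
    probs = {s: counts[s] / total for s in symbols}
    cum, lows = 0.0, {}
    for s in symbols:                              # lower bound of each symbol's sub-interval in [0, 1)
        lows[s] = cum
        cum += probs[s]
    low, high = 0.0, 1.0
    for s in chain:                                # narrow the interval symbol by symbol
        width = high - low
        high = low + width * (lows[s] + probs[s])
        low = low + width * lows[s]
    return (low + high) / 2.0                      # middle value of the final arithmetic coding interval

print(arithmetic_code_value([1, 2, 2, 8, 7]))      # ≈ 0.08096, matching the worked example above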
Step five: obtaining the LBP value of each of the eight neighborhood primitives from its arithmetic coding value and the mean arithmetic coding value of all the primitives; traversing the eight neighborhood primitives counterclockwise with each of them in turn as the starting point to obtain eight LBP value combinations, converting each combination into a decimal number, and selecting the combination with the smallest decimal value as the LBP characteristic value of the water-drop edge;
the method comprises the following steps of obtaining LBP characteristics of a central primitive and eight neighborhood primitives, and further screening out an LBP characteristic value of each water drop edge;
the LBP value obtaining method of the eight neighborhood primitives comprises the following steps:
comparing the arithmetic coding value of each primitive with the average of the arithmetic coding values of the eight neighborhood primitives:
if the arithmetic coding value of the primitive is more than or equal to the average value, the LBP value of the primitive is 0; if the arithmetic coding value of the primitive is less than the mean, the LBP value of the primitive is 1.
The LBP characteristic value of the water-drop edge is obtained as follows:
take each of the eight neighborhood primitives in turn as the starting primitive and traverse all eight neighborhood primitives counterclockwise, obtaining 8 LBP value combinations; each combination contains the LBP values of the 8 primitives, but because the starting primitives differ, the order of the LBP values differs from combination to combination;
convert the binary number formed by each combination of LBP values into a decimal number, and select the LBP combination corresponding to the smallest decimal number as the LBP characteristic value of the water-drop edge.
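A minimal sketch of step five follows (thresholding against the mean, then the rotation-minimal selection); representing the result as a list of 8 bits is an assumption made for illustration.

def lbp_feature(neighbor_codes):
    # neighbor_codes: arithmetic coding values of the 8 neighborhood primitives,
    # listed counterclockwise around the central primitive
    mean = sum(neighbor_codes) / len(neighbor_codes)
    bits = [0 if c >= mean else 1 for c in neighbor_codes]   # LBP value of each primitive
    best = None
    for start in range(8):                                   # try every starting primitive
        rotated = bits[start:] + bits[:start]
        value = int(''.join(map(str, rotated)), 2)           # decimal value of this combination
        if best is None or value < best[0]:
            best = (value, rotated)
    return best[1]                                           # combination with the smallest decimal value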
Step six: obtaining an edge vector of each water drop according to the primitive size coefficient, the primitive shape coefficient and the LBP characteristic value of each water drop;
The purpose of this step is to analyze the primitive size coefficient, the primitive shape coefficient and the LBP characteristic value of each water drop together to obtain the edge vector of each water drop.
The edge vector of a water drop is composed of its primitive size coefficient, its primitive shape coefficient and the LBP characteristic value of its edge [the vector expression is given only as an image in the original publication].
Step seven: classifying the water drops according to the similarity of edge vectors among the water drops;
The purpose of this step is to calculate the similarity between water drops and classify them according to that similarity.
The method for classifying the water drops comprises the following steps:
(1) compute the cosine similarity between the edge vectors of different water drops [the formula is given only as an image in the original publication];
(2) compute the code similarity of different water drops from the LBP characteristic values of their edges, according to the number of corresponding bit positions that carry the same digit [formula image not reproduced]; here s is the number of positions at which the LBP characteristic values of the two water-drop edges have the same digit, i.e. the number of bit positions at which the two binary numbers agree; for example, when calculating the similarity of 11110000 and 11111000, the positions with the same digit are the upper 4 bits 1111 and the lower 3 bits 000, so s = 7;
(3) compute the product of the cosine similarity and the code similarity; if the product is greater than 0.8, the two water drops being compared belong to the same class.
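A sketch of this classification rule is given below. The exact formulas for the two similarities are shown only as images in the original publication, so two assumptions are made here: the cosine similarity is taken over the coefficient part of the edge vector, and the code similarity is taken as s divided by the number of bits (8).

import math

def cosine_sim(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def same_class(drop_a, drop_b, threshold=0.8):
    # drop = (size_coefficient, shape_coefficient, lbp_bits), lbp_bits being the 8-bit LBP characteristic value
    r_cos = cosine_sim(drop_a[:2], drop_b[:2])                # similarity of the coefficient part (assumption)
    s = sum(a == b for a, b in zip(drop_a[2], drop_b[2]))     # number of matching bit positions
    r_code = s / len(drop_a[2])                               # assumed normalization: s / 8
    return r_cos * r_code > threshold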
Step eight: obtaining the characterization quantity of each piece of waterproof cloth according to the edge vector of each type of water drop in each piece of waterproof cloth and the quantity of the water drops contained in the type of waterproof cloth;
The purpose of this step is to obtain the characterization quantity of each piece of waterproof cloth according to the classes and numbers of the water drops it contains.
The method for acquiring the characterization quantity of each piece of waterproof cloth comprises the following steps:
(1) counting the number of water drops of each category in each piece of waterproof cloth;
(2) acquiring an average edge vector of each category of water drops, wherein the average edge vector is calculated by dividing the sum of the edge vectors of all the water drops in the category by the number of the water drops in the category;
(3) taking the number of water drops of each category as the weight of that category (normalized by the total number of water drops, as in the example below);
(4) and taking the sum, over all categories, of the product of each category's average edge vector and its weight as the characterization quantity of the waterproof cloth.
For example: suppose three categories of water drops A, B and C exist in a piece of waterproof cloth, their numbers being 1, 1 and 2 respectively, and the average edge vectors of the category-A, category-B and category-C water drops are (a1, b1, c1), (a2, b2, c2) and (a3, b3, c3); the characterization quantity of this piece of waterproof cloth is then 0.25 × (a1, b1, c1) + 0.25 × (a2, b2, c2) + 0.5 × (a3, b3, c3).
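A small sketch of this weighted aggregation follows; the edge vectors are treated as plain numeric tuples (for instance with the decimal value of the LBP characteristic value as the third component), which is an assumption made for illustration.

def characterization(drops_by_category):
    # drops_by_category: {category id: list of edge vectors, each a tuple of numbers}
    total = sum(len(vectors) for vectors in drops_by_category.values())
    dim = len(next(iter(drops_by_category.values()))[0])
    result = [0.0] * dim
    for vectors in drops_by_category.values():
        weight = len(vectors) / total                                  # e.g. 0.25, 0.25, 0.5 in the example above
        mean = [sum(component) / len(vectors) for component in zip(*vectors)]
        result = [r + weight * m for r, m in zip(result, mean)]
    return result                                                      # characterization quantity of the piece of cloth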
Step nine: classifying the pieces of waterproof cloth according to their characterization quantities, and selecting a sample from each class of waterproof cloth to take part in machine detection.
The purpose of the step is to classify the waterproof cloth through the characterization quantity and select a proper sample to participate in detection.
The method for classifying the waterproof cloth comprises the following steps:
and calculating cosine similarity among the characterization quantities of different waterproof cloths, and classifying the waterproof cloths with the cosine similarity larger than 0.8 into one class.
The method for selecting the sample of each type of waterproof cloth comprises the following steps:
for each image in each class of waterproof cloth, calculating the sum of the cosine similarities between its characterization quantity and the characterization quantities of the other images, and taking the waterproof cloth image with the minimum sum of cosine similarities as the sample of that class of waterproof cloth.
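The sample selection can be sketched as follows; cosine_sim is the same helper as in the water-drop classification sketch, and the nested dictionary layout of the input is an assumption made for illustration.

import math

def cosine_sim(u, v):   # same helper as in the water-drop classification sketch
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def select_samples(characterizations_by_class):
    # characterizations_by_class: {cloth class id: {image id: characterization quantity vector}}
    samples = {}
    for cls, quantities in characterizations_by_class.items():
        def similarity_sum(img):
            return sum(cosine_sim(quantities[img], quantities[other])
                       for other in quantities if other != img)
        samples[cls] = min(quantities, key=similarity_sum)   # the patent selects the image with the minimum sum
    return samples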
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. A waterproof cloth quality inspection method based on computer vision is characterized by comprising the following steps:
acquiring a waterproof cloth image, and detecting the edge of each water drop in the waterproof cloth image;
obtaining the primitive size coefficient and the primitive shape coefficient of each water drop according to the minimum circumscribed rectangle of its edge;
dividing the minimum circumscribed rectangle of each water-drop edge by a grid into a central primitive and eight neighborhood primitives of the water drop;
acquiring, for each of the eight neighborhood primitives, the arithmetic coding result interval corresponding to the chain code of the water-drop edge pixels in that primitive, and taking the middle value of the interval as the arithmetic coding value of the primitive;
obtaining the LBP value of each of the eight neighborhood primitives from its arithmetic coding value and the mean arithmetic coding value of all the primitives;
traversing the eight neighborhood primitives counterclockwise with each of them as a starting point to obtain eight LBP value combinations, converting each combination into a decimal number, and selecting the combination with the smallest decimal value as the LBP characteristic value of the water-drop edge;
obtaining the edge vector of each water drop from its primitive size coefficient, primitive shape coefficient and LBP characteristic value;
classifying the water drops according to the similarity of their edge vectors;
obtaining the characterization quantity of each piece of waterproof cloth from the edge vector of each class of water drops in that piece and the number of water drops of that class;
and classifying the pieces of waterproof cloth according to their characterization quantities, and selecting a sample from each class of waterproof cloth to take part in machine detection.
2. The computer vision-based waterproof cloth quality inspection method of claim 1, wherein the primitive size coefficient and the primitive shape coefficient of the water drop are calculated as follows:
the primitive size coefficient is computed from the length and the width of the minimum circumscribed rectangle of the water-drop edge, and gives the size coefficient of the water drop [the formula is given only as an image in the original publication];
the primitive shape coefficient is likewise computed from the length and the width of the minimum circumscribed rectangle, and gives the shape coefficient of the primitive [formula image not reproduced].
3. The computer vision-based tarpaulin quality inspection method of claim 1, wherein the LBP values of the eight neighborhood primitives are obtained by:
comparing the arithmetic coding value of each of the eight neighborhood primitives with the mean of the arithmetic coding values of the eight neighborhood primitives; if the arithmetic coding value of the primitive is greater than or equal to the mean of the arithmetic coding values, the LBP value of the primitive is 0, otherwise the LBP value of the primitive is 1.
4. The computer vision-based waterproof cloth quality inspection method of claim 2, wherein the edge vector of the water drop is composed of its primitive size coefficient, its primitive shape coefficient and the LBP characteristic value of its edge [the vector expression is given only as an image in the original publication].
5. The computer vision-based waterproof cloth quality inspection method of claim 1, wherein the method for classifying the water drops comprises the following steps:
computing the cosine similarity between the edge vectors of different water drops [the formula is given only as an image in the original publication];
computing the code similarity of different water drops from the LBP characteristic values of their edges, according to the number of corresponding bit positions that carry the same digit [formula image not reproduced], i.e. the number of bit positions at which the two binary numbers agree;
and computing the product of the cosine similarity and the code similarity; if the product is greater than 0.8, the two water drops being compared belong to the same class.
6. The method for detecting the quality of the waterproof cloth based on the computer vision as claimed in claim 5, wherein the method for acquiring the characterization quantity of each piece of waterproof cloth is as follows:
counting the number of water drops of each category in each piece of waterproof cloth;
acquiring an average edge vector of each category of water drops;
taking the number of water drops of each category as weight;
and taking the sum, over all categories, of the product of each category's average edge vector and its weight as the characterization quantity of the waterproof cloth.
7. The method for inspecting quality of tarpaulin based on computer vision of claim 1, wherein the method for classifying tarpaulin is:
calculating cosine similarity among the characterization quantities of different waterproof cloths, and classifying the waterproof cloths with the cosine similarity larger than 0.8 into one type.
8. The method for detecting the quality of the tarpaulin based on the computer vision of claim 1, wherein the method for selecting the sample of each type of the tarpaulin comprises the following steps:
for each image in each class of waterproof cloth, calculating the sum of the cosine similarities between its characterization quantity and the characterization quantities of the other images, and taking the waterproof cloth image with the minimum sum of cosine similarities as the sample of that class of waterproof cloth.
CN202210959332.7A 2022-08-11 2022-08-11 Waterproof cloth quality inspection method based on computer vision Active CN115035116B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210959332.7A CN115035116B (en) 2022-08-11 2022-08-11 Waterproof cloth quality inspection method based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210959332.7A CN115035116B (en) 2022-08-11 2022-08-11 Waterproof cloth quality inspection method based on computer vision

Publications (2)

Publication Number Publication Date
CN115035116A true CN115035116A (en) 2022-09-09
CN115035116B CN115035116B (en) 2022-10-25

Family

ID=83130233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210959332.7A Active CN115035116B (en) 2022-08-11 2022-08-11 Waterproof cloth quality inspection method based on computer vision

Country Status (1)

Country Link
CN (1) CN115035116B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101329728A (en) * 2008-07-03 2008-12-24 深圳市康贝尔智能技术有限公司 LBP human face light irradiation preprocess method based on Hamming distance restriction
US20170343481A1 (en) * 2016-05-27 2017-11-30 Purdue Research Foundation Methods and systems for crack detection
CN110021024A (en) * 2019-03-14 2019-07-16 华南理工大学 A kind of image partition method based on LBP and chain code technology


Also Published As

Publication number Publication date
CN115035116B (en) 2022-10-25

Similar Documents

Publication Publication Date Title
CN110363182B (en) Deep learning-based lane line detection method
CN106529537B (en) A kind of digital instrument reading image-recognizing method
CN113240626B (en) Glass cover plate concave-convex type flaw detection and classification method based on neural network
CN103400151B (en) The optical remote sensing image of integration and GIS autoregistration and Clean water withdraw method
CN110458172A (en) A kind of Weakly supervised image, semantic dividing method based on region contrast detection
CN105809121A (en) Multi-characteristic synergic traffic sign detection and identification method
CN115082419A (en) Blow-molded luggage production defect detection method
CN114627052A (en) Infrared image air leakage and liquid leakage detection method and system based on deep learning
CN114332650B (en) Remote sensing image road identification method and system
CN109035196B (en) Saliency-based image local blur detection method
CN114882026B (en) Sensor shell defect detection method based on artificial intelligence
CN115311503B (en) Fiber classification method, system, computer device and medium
CN114022793A (en) Optical remote sensing image change detection method based on twin network
Veras et al. Discriminability tests for visualization effectiveness and scalability
Wang et al. An maize leaf segmentation algorithm based on image repairing technology
CN107992856A (en) High score remote sensing building effects detection method under City scenarios
CN115375690A (en) Tongue picture putrefaction classification and identification method
CN108280469A (en) A kind of supermarket's commodity image recognition methods based on rarefaction representation
CN116188756A (en) Instrument angle correction and indication recognition method based on deep learning
CN115731400A (en) X-ray image foreign matter detection method based on self-supervision learning
CN111210447B (en) Hematoxylin-eosin staining pathological image hierarchical segmentation method and terminal
CN111291818A (en) Non-uniform class sample equalization method for cloud mask
CN115546795A (en) Automatic reading method of circular pointer instrument based on deep learning
CN108596244A (en) A kind of high spectrum image label noise detecting method based on spectrum angle density peaks
JP3020973B2 (en) Image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20231221
Address after: Room 3414, Building 33, Zhongnan Century City, Chongchuan District, Nantong City, Jiangsu Province, 226000
Patentee after: Nantong Yanlu Enterprise Management Consulting Co.,Ltd.
Address before: 226000 No.7 Jinggong Road, Qidong Economic Development Zone, Nantong City, Jiangsu Province
Patentee before: Qidong Gude waterproof fabric Co.,Ltd.