CN113610852B - Yarn drafting quality monitoring method based on image processing - Google Patents
- Publication number: CN113610852B (application CN202111177959.9A)
- Authority: CN (China)
- Prior art keywords: image, value, gray, pixel, fiber
- Legal status: Active (an assumption, not a legal conclusion; no legal analysis has been performed)
Classifications
- G06T7/0004 — Industrial image inspection (under G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g. flaw detection)
- G06T5/30 — Erosion or dilatation, e.g. thinning (under G06T5/00 Image enhancement or restoration; G06T5/20 local operators)
- G06T5/70 — Denoising; Smoothing
- G06T7/13 — Edge detection (under G06T7/10 Segmentation; Edge detection)
- G06T2207/20004 — Adaptive image processing
- G06T2207/20192 — Edge enhancement; Edge preservation (under G06T2207/20172 Image enhancement details)
- G06T2207/30124 — Fabrics; Textile; Paper (under G06T2207/30108 Industrial image inspection)
- G06T2207/30168 — Image quality inspection
Abstract
The invention discloses a yarn drafting quality monitoring method based on image processing. The method comprises the following steps: acquiring a grayscale image of the drafted yarn as the original image and performing fiber analysis with the original image as the image to be processed, where the fiber analysis fuses an adaptive erosion image, a dilated image, a confidence binary image and the original image to obtain a new grayscale image; taking the new grayscale image produced by fiber analysis as the next image to be processed and iterating the fiber analysis until the new grayscale image stabilizes, then taking the stable image as the fiber distribution map; and obtaining a texture map from the fiber distribution map and judging the yarn quality from it. Compared with the prior art, the method iteratively optimizes the fiber edge features according to the edge distribution field using a mixture-model-based edge-adaptive erosion method, so that interwoven and folded fiber edges can be clearly distinguished and the yarn drafting quality monitoring result is more accurate.
Description
Technical Field
The invention relates to the fields of digital image processing, recognition and spinning, and in particular to a yarn drafting quality monitoring method based on image processing.
Background
In the spinning process, yarn is prepared from cotton slivers through processes such as blending, drafting and twisting, giving the yarn a certain twist and strength while keeping the fibers on it as straight and parallel as possible; the quality of the final yarn depends on these steps. In this process, defects and damage in the equipment and unreasonable control parameters affect the yarn quality, so in spinning and weaving the quality of each process link must be strictly controlled, and problems in production, such as equipment faults or control-parameter faults, must be found in time. In yarn production, the degree of fiber straightening after drafting is an important indicator for evaluating and controlling yarn quality.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a yarn drafting quality monitoring method based on image processing, which obtains the straightening degree of the fibers on the yarn, detects yarns whose low straightening degree indicates a quality problem, and allows production parameters to be adjusted or the equipment to be overhauled in time. The adopted technical scheme is as follows:
the invention provides a yarn drafting quality monitoring method based on image processing.
Acquire a grayscale image of the drafted yarn as the original image, and perform fiber analysis with the original image as the image to be processed. The fiber analysis comprises: obtaining a denoised image of the image to be processed and a distribution vector for each pixel point; sliding a window over the denoised image, centering the coordinates of all pixels in the window, and projecting the centered pixel points onto the normal vector of the distribution vector to obtain each pixel point's position along the normal-vector direction; obtaining the maximum points of the mapping between the gray values of all pixel points in the window and their positions along the normal-vector direction; constructing a Gaussian mixture model from the gray values of the pixels in the window, the number of single Gaussian models in the mixture being equal to the number of maximum points; determining a new pixel value for the center of the window from the output, at argument zero, of the single Gaussian model whose mean is closest to zero; traversing the denoised image with the sliding window to obtain new values for all pixel points, which form the adaptive erosion image; obtaining a dilated image of the original image, and obtaining a confidence binary image from the adaptive erosion image and the dilated image; and fusing the original image, the adaptive erosion image and the dilated image using the confidence binary image to obtain a new grayscale image;
take the new grayscale image produced by fiber analysis as the next image to be processed and repeat the fiber analysis, iterating until the new grayscale image stabilizes, and take the stable grayscale image as the fiber distribution map; then obtain a texture map from the fiber distribution map and judge the yarn quality from it.
Preferably, the coordinate centering specifically comprises: acquiring the pixel coordinates of the center of the sliding window and subtracting the center coordinates from the coordinates of every pixel point in the window.
Preferably, obtaining the texture map from the fiber distribution map and judging the yarn quality specifically comprises: obtaining the texture map from the fiber distribution map and calculating the fiber straightening degree at every pixel position in it; setting a degree threshold and binarizing the texture map according to the relationship between the fiber straightening degree and the degree threshold to obtain a binarized image; calculating the areas of the connected domains of the binarized image; and setting an area threshold: if a connected domain's area is larger than the area threshold, the yarn is judged to have a quality problem; otherwise the yarn quality is qualified.
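The quality judgment in this preferred scheme can be sketched as follows. The 4-connectivity, the return strings, and the particular threshold values are illustrative assumptions, not the patent's specification.

```python
import numpy as np
from collections import deque

def judge_yarn_quality(degree_map, degree_threshold, area_threshold):
    """Binarize a fiber-straightening-degree map by a degree threshold and
    flag the yarn if any connected domain exceeds an area threshold.
    4-connectivity and the return strings are illustrative assumptions."""
    # First set value (1) where the straightening degree exceeds the threshold.
    binary = (degree_map > degree_threshold).astype(np.uint8)
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not seen[sy, sx]:
                # BFS over one 4-connected component, measuring its area.
                area, q = 0, deque([(sy, sx)])
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    area += 1
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if area > area_threshold:
                    return "quality problem"
    return "qualified"
```

For example, a 3 × 3 flagged block has area 9 and trips an area threshold of 5 but not one of 20.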
Preferably, the new pixel value at the center of the window is acquired as follows: acquire the single Gaussian model whose mean is positive and closest to zero and the single Gaussian model whose mean is negative and closest to zero. If only one such model exists, determine the pixel gray value at the center of the window from that model's output at argument zero. If two models are acquired, correct the larger-variance model's output at argument zero with a first fusion weight to obtain a first correction value, correct the smaller-variance model's output at argument zero with a second fusion weight to obtain a second correction value, and obtain the new pixel value at the center of the window from the first and second correction values.
Preferably, the first fusion weight is a function of the parameters of the two single Gaussian models: it is determined by the smaller variance, the mean of the small-variance model, the larger variance, and the mean of the large-variance model.
Preferably, the second fusion weight is determined from the same means and variances of the two single Gaussian models.
Preferably, the confidence binary image is obtained from the adaptive erosion image and the dilated image as follows: obtain a confidence distribution map from the per-pixel gray-value difference between the adaptive erosion image and the dilated image; set a confidence threshold; set the gray value of the pixel points in the confidence distribution map that exceed the confidence threshold to a first set value and the gray value of the remaining pixel points to a second set value, giving the confidence binary image.
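A minimal sketch of this confidence-binary step. Taking the absolute per-pixel difference as the confidence distribution map, and the placeholder set values of 1 and 0, are assumptions about details the patent leaves to its formula images.

```python
import numpy as np

def confidence_binary(eroded, dilated, conf_threshold, first_val=1, second_val=0):
    """Build the confidence binary image described above: the confidence
    distribution map is taken as the absolute gray-value difference between
    the dilated and adaptive-erosion images (an assumed reading), then
    thresholded into two set values."""
    conf = np.abs(dilated.astype(np.float64) - eroded.astype(np.float64))
    out = np.where(conf > conf_threshold, first_val, second_val)
    return out.astype(np.uint8)
```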
Preferably, the original image, the adaptive erosion image and the dilated image are fused using the confidence binary image as follows: denote the new grayscale image by I′, the pixel gray matrix of the original image by I, the pixel matrix of the confidence binary image by B, the pixel gray matrix of the dilated image by D, and the pixel gray matrix of the adaptive erosion image by E. The fusion combines the Hadamard (element-wise) product of B with D and a Hadamard product involving E, so that pixels marked confident in B are taken from the dilated image while the remaining pixels are taken from the erosion result together with the original image.
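The fusion might look like the sketch below. The patent's exact formula is given as an image and is not recoverable here; routing high-confidence pixels to the dilated image and blending the original with the erosion result elsewhere (a 50/50 blend) is an assumed reading of the symbol list above.

```python
import numpy as np

def fuse_images(original, eroded, dilated, conf_binary):
    """Assumed fusion: where the confidence binary map B is 1, keep the
    dilated image; elsewhere blend the original and the adaptive-erosion
    image. The 50/50 blend is a placeholder for the patent's formula."""
    B = conf_binary.astype(np.float64)
    blend = 0.5 * (original.astype(np.float64) + eroded.astype(np.float64))
    return B * dilated + (1.0 - B) * blend  # element-wise (Hadamard) products
```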
Preferably, the fiber straightening degree is determined from the gray values of the pixel points in the texture map.

Preferably, the texture map is binarized according to the fiber straightening degree as follows: set a degree threshold; set the gray value of the pixel points whose fiber straightening degree exceeds the degree threshold to a first set value and the gray value of the remaining pixel points to a second set value, giving the binarized image.
The embodiments of the invention have at least the following beneficial effects:

The invention extracts the distribution information of the fibers on the yarn from collected yarn image data, obtains the yarn's fiber straightening distribution map, and evaluates the yarn quality accurately. In this process, the mixture-model-based edge-adaptive erosion method and a dilation step iteratively optimize the fiber edge features according to the edge distribution field, so that interwoven and folded fiber edges can be clearly distinguished and the subsequent fiber straightening distribution map is more accurate.
Drawings
To illustrate the embodiments of the invention and the technical solutions of the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings described here show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for monitoring yarn drafting quality based on image processing according to an embodiment of the present invention.
Detailed Description
To further illustrate the technical means adopted by the invention to achieve its objects and their effects, the embodiments, structures, features and effects of the image-processing-based yarn drafting quality monitoring method are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different mentions of "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The invention aims to provide a yarn drafting quality monitoring method based on image processing that obtains the straightening degree of the fibers on the yarn, detects yarns whose low straightening degree indicates a quality problem, and allows production parameters to be adjusted or equipment to be overhauled in time.
The specific scenario addressed by the invention is as follows: a high-definition, high-frame-rate camera is installed above the drafted yarn, looking down at it, and the yarn is illuminated vertically from above with parallel white light; the camera randomly acquires images (random sampling inspection), each of which is a grayscale image. Each image contains one yarn carrying many fibers. Ideally every fiber would be parallel and fully straightened, but this cannot be achieved in actual production; the fibers can only be made as straight and parallel as possible. Due to equipment or control problems, some fibers on the yarn cluster together and are not straightened, which affects the yarn quality.
The following describes a specific scheme of the yarn drafting quality monitoring method based on image processing in detail with reference to the accompanying drawings.
The specific embodiment is as follows:
referring to fig. 1, a flow chart of steps of a method for monitoring yarn draft quality based on image processing according to an embodiment of the present invention is shown, the method includes the following steps:
acquiring a gray image of the drafted yarn as an original image, and performing fiber analysis by taking the original image as an image to be processed; the fiber analysis comprises: respectively obtaining a de-noised image of an image to be processed and a distribution vector of each pixel point; selecting a window to slide the denoised image, performing coordinate centralization on all pixels in the window, and projecting the pixel points after coordinate centralization onto a normal vector of a distribution vector to obtain the positions of the pixel points in the normal vector direction; obtaining a maximum value point under a mapping relation according to the mapping relation between the gray values of all pixel points in the window and the positions of the pixel points in the normal vector direction; constructing a Gaussian mixture model according to the gray value of the pixels in the window, wherein the number of single Gaussian models in the Gaussian mixture model is the same as the number of maximum value points; determining a new pixel value of the center position of the window according to the output of zero taking of the independent variable of the single Gaussian model with the mean value closest to zero; traversing the denoised image by using a sliding window to obtain new pixel values of all pixel points to form a self-adaptive corrosion image; obtaining an expansion image of the original image, and obtaining a confidence binary image according to the self-adaptive corrosion image and the expansion image; fusing the original image, the self-adaptive corrosion image and the expansion image by using the confidence coefficient binary image to obtain a new gray level image;
take the new grayscale image produced by fiber analysis as the next image to be processed and repeat the fiber analysis, iterating until the new grayscale image stabilizes, and take the stable grayscale image as the fiber distribution map; then obtain a texture map from the fiber distribution map and judge the yarn quality from it.
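The outer loop above can be sketched as follows. The stopping rule (mean absolute change below a tolerance), the `tol` and `max_iter` values, and the `fiber_analysis` callable are illustrative assumptions, since the patent only states that iteration continues until the new grayscale image is stable.

```python
import numpy as np

def iterate_fiber_analysis(original, fiber_analysis, max_iter=50, tol=1.0):
    """Repeat the per-iteration fiber-analysis pipeline (denoise, adaptive
    erosion, dilation, confidence fusion) until the new grayscale image
    stabilizes; the stable image is the fiber distribution map."""
    current = original.astype(np.float64)
    for _ in range(max_iter):
        new = fiber_analysis(current)
        if np.mean(np.abs(new - current)) < tol:  # stable: stop iterating
            return new  # fiber distribution map
        current = new
    return current
```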
The specific implementation steps are as follows:
firstly, a gray image of the drafted yarn is obtained as an original image, and the original image is used as an image to be processed.
Specifically, a high-definition, high-frame-rate camera is installed above the drafted yarn, looking down at it, and the yarn is illuminated vertically from above with parallel white light; the camera randomly acquires one image (random sampling inspection). Each acquired image is a grayscale image, called the original image, and the original image is taken as the image to be processed.
At this point, an image to be processed is obtained.
Second, fiber analysis is performed on the image to be processed to obtain a new grayscale image.
Obtain a denoised image of the image to be processed and a distribution vector for each pixel point; slide a window over the denoised image, center the coordinates of all pixels in the window, and project the centered pixel points onto the normal vector of the distribution vector to obtain each pixel point's position along the normal-vector direction; obtain the maximum points of the mapping between the gray values of all pixel points in the window and their positions along the normal-vector direction; construct a Gaussian mixture model from the gray values of the pixels in the window, the number of single Gaussian models in the mixture being equal to the number of maximum points; determine a new pixel value for the center of the window from the output, at argument zero, of the single Gaussian model whose mean is closest to zero; traverse the denoised image with the sliding window to obtain new values for all pixel points, forming the adaptive erosion image; obtain a dilated image of the original image and a confidence binary image from the adaptive erosion image and the dilated image; and fuse the original image, the adaptive erosion image and the dilated image using the confidence binary image to obtain a new grayscale image.
The fiber analysis that produces a new grayscale image from the image to be processed proceeds as follows:
(1) Acquire the distribution vector of each pixel point of the image to be processed. First, the high-frequency features of the image to be processed are enhanced with an unsharp-mask algorithm. The high-frequency features comprise the edge features of the image and also its noise. Image enhancement is needed because the fibers on the yarn are interlaced and cover one another, the fiber structure is fine, and the yarn moves during production, so the fiber structure may be blurred and its details unclear; accurate fiber information cannot be acquired directly, so the details and high-frequency features must first be enhanced by unsharp masking to highlight the fibers' edge information. Second, the edges of the feature-enhanced image are extracted with a Canny operator to obtain an edge binary image, whose edges represent the distribution of the fibers. However, since the noise of the original image has also been enhanced, and the fibers are interlaced, overlapping and inherently noisy, the edges in the edge binary image may be discontinuous or stuck together; individual fibers are then hard to distinguish, the stretching condition of a whole fiber cannot be traced, and the straightening degree cannot be obtained. Subsequent processing is therefore required: for any pixel with gray value 1 in the edge binary image, the Hessian matrix at that pixel position is obtained, and its two eigenvalues and two corresponding eigenvectors are computed. The relevant eigenvector is a two-dimensional unit vector representing the direction along which the edge at that position is distributed, i.e. the pixel coordinates of the edge run approximately along the direction of this eigenvector; it is called the distribution vector of that position. Finally, the distribution vectors of all pixels with gray value 1 in the edge binary image are acquired, and the distribution vectors of the unknown positions (gray value 0) are inferred from those of the known positions (gray value 1) by two-dimensional linear interpolation. At this point a distribution vector is obtained for every position, representing the edge trend there; the distribution vectors of all positions are together regarded as an edge distribution field.
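The Hessian-based distribution vector can be illustrated at a single pixel as below. The finite-difference stencils and the choice of the eigenvector belonging to the smaller-magnitude eigenvalue (the along-edge direction for a ridge-like edge) are assumptions about the method's details.

```python
import numpy as np

def distribution_vector(gray, y, x):
    """Build the 2x2 Hessian from central finite differences at one interior
    pixel and return the unit eigenvector of the smaller-magnitude eigenvalue
    as the local edge-distribution direction (returned as (dy, dx))."""
    g = gray.astype(np.float64)
    dyy = g[y + 1, x] - 2 * g[y, x] + g[y - 1, x]
    dxx = g[y, x + 1] - 2 * g[y, x] + g[y, x - 1]
    dxy = (g[y + 1, x + 1] - g[y + 1, x - 1] - g[y - 1, x + 1] + g[y - 1, x - 1]) / 4.0
    H = np.array([[dyy, dxy], [dxy, dxx]])
    vals, vecs = np.linalg.eigh(H)
    v = vecs[:, np.argmin(np.abs(vals))]  # direction along the edge
    return v / np.linalg.norm(v)
```

On a horizontal bright line the returned vector points along the row, as expected of an along-edge direction.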
(2) Acquire a denoised image of the image to be processed, slide a window over it, center the coordinates of all pixels in the window, and project the centered pixel points onto the normal vector of the distribution vector of the window-center pixel to obtain each pixel point's position along the normal-vector direction. Specifically, first, the image to be processed (i.e. the original image) is blurred with a 3 × 3 Gaussian kernel to remove much of its noise. Second, a sliding window is run over the blurred image, i.e. each position corresponds to the center of one rectangular window; preferably the window width is 5 and the height is 11, and the window's width-direction axis is kept parallel to the distribution vector at each position. Third, all pixels in the window are obtained and their coordinates are centered: the pixel coordinates of the window center are acquired and subtracted from the coordinates of every pixel in the window. After centering, the coordinates of every pixel point in the window are used in turn to compute the inner product with the normal vector of the distribution vector of the window-center pixel. This step reduces the coordinates of all pixel points in the window to one dimension, i.e. projects all pixel points onto the normal vector of the distribution vector. Because the distribution vector represents the edge trend at the window center, its normal vector is perpendicular to that trend; each projected pixel point is regarded as a position x on the normal vector's number axis with a corresponding gray value y, and all the x and y together reflect the gray distribution perpendicular to the edge.
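The centering-and-projection step above reduces to a few lines; the (row, col) coordinate convention and taking the 90°-rotated vector as the normal are the only assumptions.

```python
import numpy as np

def project_onto_normal(coords, center, dist_vec):
    """Center window pixel coordinates on the window center and project them
    onto the normal of the distribution vector, giving each pixel a 1-D
    position x along the edge-normal axis."""
    d = np.asarray(dist_vec, dtype=np.float64)
    d = d / np.linalg.norm(d)
    normal = np.array([-d[1], d[0]])  # unit normal of the distribution vector
    centred = np.asarray(coords, dtype=np.float64) - np.asarray(center, dtype=np.float64)
    return centred @ normal           # inner product = position on the normal axis
```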
(3) Obtain the maximum points of the mapping between the gray values of all pixel points in the window and their positions along the normal-vector direction. Specifically, taking the gray values at all positions as the data set, a 5th-order polynomial curve is fitted by least squares to roughly describe the mapping between position x and gray value y; the number of maximum points of this curve, denoted K, is then obtained.
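The polynomial-fitting step can be sketched as follows; restricting the critical points to the sampled x-range and the numerical tolerance on root imaginary parts are assumptions.

```python
import numpy as np

def count_gray_maxima(x, y, degree=5):
    """Fit the 5th-order polynomial y(x) by least squares and count its
    maxima within the sampled range: real roots of y' where y'' < 0."""
    p = np.poly1d(np.polyfit(x, y, degree))
    crit = p.deriv().r                          # roots of the first derivative
    crit = crit[np.abs(crit.imag) < 1e-8].real  # keep (numerically) real roots
    crit = crit[(crit >= np.min(x)) & (crit <= np.max(x))]
    return int(np.sum(p.deriv(2)(crit) < 0))    # maxima: second derivative < 0
```

A half-period of a sine sampled on [0, π] has exactly one maximum, which the fit recovers.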
(4) Construct a Gaussian mixture model from the gray values of the pixels in the window, the number of single Gaussian models in the mixture being equal to the number of maximum points; determine a new pixel value for the center of the window from the output, at argument zero, of the single Gaussian model whose mean is closest to zero; and traverse the denoised image with the sliding window to obtain new values for all pixel points, forming the adaptive erosion image.
Specifically, the adaptive erosion image is obtained as follows:
first, the invention establishes a mixed Gaussian modelThe gaussian mixture model G is formed by superimposing K single gaussian models, and in particular, the K value is the maximum number of curves. The invention roughly evaluates how many single Gaussian models are superposed to form the mixed Gaussian model by utilizing a polynomial curve. Wherein,and the mean value and the variance of the kth single Gaussian model in the Gaussian mixture model and the position x of the projection of the pixel point on the numerical axis of the normal vector of the pixel point in the center of the window are represented. Solving a Gaussian mixture model by using gray values of all positions as a data set, pixel positions x in a window as independent variables and pixel gray values as dependent variables through an EM (effective electromagnetic radiation) algorithmAll of the parameters in. In the Gaussian mixture model, each single Gaussian model represents a possible edge, the mean value represents the position of the edge, and the variance represents the thickness of the edge.
Second, the new pixel value at the center of the window is determined from the output, at argument zero, of the single Gaussian model whose mean is closest to zero.

The new pixel value at the window center is acquired as follows: acquire the single Gaussian model whose mean is positive and closest to zero and the single Gaussian model whose mean is negative and closest to zero. If only one such model exists, determine the pixel gray value at the center of the window from that model's output at argument zero. If two models are acquired, correct the larger-variance model's output at argument zero with a first fusion weight to obtain a first correction value, correct the smaller-variance model's output at argument zero with a second fusion weight to obtain a second correction value, and obtain the new pixel value at the window center from the first and second correction values.
In particular, assume thatAndtwo parameters that are the most close to 0, one with positive sign and the other with negative sign. If it is notAndonly one of which exists, provided that the parameters existThen construct a single Gaussian modelThen the present invention considers that the new pixel at the center of the window is located atOn the corresponding edge, the gray value of the new pixel at the center of the window is set toWhereinis a single Gaussian modelAn output value when the argument x is 0.
If both μ₁ and μ₂ exist, the invention considers the window-center pixel to lie both on the edge corresponding to μ₁ and on the edge corresponding to μ₂. However, the invention expects the window-center pixel to belong to a single edge. The simple approach is to take whichever of μ₁ and μ₂ has the smaller absolute value, say μ₁, and set the new pixel value at the window center to g₁(0), where g₁(0) is the output value of the single Gaussian model g₁ when the argument is 0.
This simple approach is usable when the absolute values of μ₁ and μ₂ are both large. But when the absolute values of μ₁ and μ₂ are small, that is, when the edges corresponding to the two single Gaussian models almost overlap, the above method cannot determine which edge the window-center pixel belongs to, and therefore cannot determine its new pixel value. The invention provides a new method for determining the new pixel value at the window center. Without loss of generality, suppose the variances satisfy σ₁² ≥ σ₂², and set the new pixel value Y of the window-center pixel to Y = w₁·g₁(0) + w₂·g₂(0), where g₁(0) is the output value of the single Gaussian model g₁ when the argument is 0 and g₂(0) is the output value of g₂ when the argument is 0; that is, the new pixel value of the window-center pixel is the weighted fusion of g₁(0) and g₂(0). Here w₁ is the first fusion weight, w₂ is the second fusion weight, w₁·g₁(0) is the first correction value, and w₂·g₂(0) is the second correction value. Specifically, w₁ and w₂ are constructed as follows:
Here d denotes the difference of the absolute values of the means of the two single Gaussian models. The weights are constructed so that the closer the window-center pixel is to the mean of g₁, the more the new pixel value attends to g₁(0), and the closer it is to the mean of g₂, the more it attends to g₂(0). The larger d is, the farther apart the edges represented by the two single Gaussian models are, and the less attention needs to be paid to the variances; the smaller d is, the closer the two edges are, and the more the variances must be relied upon to decide which edge's value the new pixel should approach. For a given d, the smaller the ratio σ₂²/σ₁² is, i.e. the larger the difference between the thicknesses of the two edges, the more the new pixel value attends to g₂(0), the model with the smaller variance; that is, the thin edge receives more attention, which is advantageous for enhancing thin-edge structure. Note that if μ₂ does not exist, μ₂ and σ₂² are regarded as negative infinity and positive infinity respectively; i.e. g₂ is a single Gaussian model at infinity with infinite variance, in which case Y reduces to g₁(0), which demonstrates the general applicability of the method.
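A hedged sketch of the center-value rule described above. The single-model case and the candidate selection follow the text directly; the patent's exact fusion-weight formulas were not transcribed in this text, so the sketch substitutes an illustrative inverse-variance weight that has the stated behaviour (the thinner edge, i.e. smaller variance, dominates). The function and variable names are assumptions.

```python
import numpy as np

def gauss(x, a, mu, sigma):
    return a * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def center_value(models):
    """models: list of (amplitude, mean, sigma) single Gaussians fitted in the window.
    Returns the new gray value for the window-center pixel (argument x = 0)."""
    pos = [m for m in models if m[1] > 0]
    neg = [m for m in models if m[1] < 0]
    cands = []
    if pos:
        cands.append(min(pos, key=lambda m: m[1]))   # positive mean closest to zero
    if neg:
        cands.append(max(neg, key=lambda m: m[1]))   # negative mean closest to zero
    if len(cands) == 1:
        a, mu, s = cands[0]
        return float(gauss(0.0, a, mu, s))
    # two candidates: weighted fusion favouring the thinner edge (smaller sigma).
    # NOTE: the patent defines its own weights; 1/sigma^2 weighting is only an
    # illustrative stand-in with the qualitative behaviour described in the text.
    (a1, m1, s1), (a2, m2, s2) = cands
    w1 = (1 / s1 ** 2) / (1 / s1 ** 2 + 1 / s2 ** 2)
    return float(w1 * gauss(0.0, a1, m1, s1) + (1 - w1) * gauss(0.0, a2, m2, s2))
```

The fused value always lies between the two models' outputs at zero and sits closer to the output of the thinner edge.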
Thus, a new pixel value of the center pixel of the sliding window is obtained.
And finally, traversing the denoised image by using a sliding window to obtain new pixel values of all pixel points to form a self-adaptive corrosion image.
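The traversal itself can be sketched as a generic sliding-window pass; `center_value_fn` stands in for the mixture-model computation of the previous steps, and the edge padding is an assumption, since the patent does not specify border handling.

```python
import numpy as np

def edge_adaptive_erode(img, center_value_fn, h=3, w=11):
    # Slide an h x w window over every pixel; center_value_fn maps the window
    # contents to the new center value (the mixture-model step in the text).
    pad_h, pad_w = h // 2, w // 2
    padded = np.pad(img, ((pad_h, pad_h), (pad_w, pad_w)), mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = center_value_fn(padded[i:i + h, j:j + w])
    return out

# win.min() as a stand-in center rule: this reduces to ordinary gray-scale erosion
src = np.arange(64.0).reshape(8, 8)
eroded = edge_adaptive_erode(src, lambda win: win.min())
```

Because the window contains the center pixel, the stand-in rule never increases a pixel's value, matching the erosion-like behaviour described in the text.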
Thus, an adaptive corrosion image is obtained.
In summary, the new pixel value at the window center obtained by this method satisfies two conditions: first, the single Gaussian model close to the window center determines the pixel's gray value; second, if the variances of the two single Gaussian models differ, that is, the two edges have different thicknesses, the gray value of the window-center pixel preferentially depends on the value of the single Gaussian model corresponding to the thin edge. The benefit is that the edge to which the window-center pixel belongs can be estimated; after the gray value of the window-center pixel is reset, overlapping edges are distinguished and thin-edge structures are prevented from being submerged. In this method, each pixel of the original image corresponds to one window, and a gray value is assigned to the window-center pixel using the distribution characteristics of the gray values within the window. The reassigned gray value is smaller than the initial gray value, which is equivalent to applying an erosion filtering operation to the window-center pixel. The process of reassigning a gray value to the window-center pixel is therefore called edge-adaptive erosion based on a mixture model, or edge-adaptive erosion for short.
It should be added that, in practice, the gray-value distribution perpendicular to the edge direction is not exactly Gaussian. The invention nevertheless fits it with a Gaussian distribution, which still represents the edge information; moreover, since Gaussian filtering is used to remove noise, a large part of the image detail can be accurately restored with the Gaussian mixture model. It is therefore reasonable and sufficiently accurate for the invention to describe the gray distribution perpendicular to the edge direction with a Gaussian model.
(5) Obtain a dilated image of the original image, and obtain a confidence binary image from the self-adaptive corrosion image and the dilated image. A confidence distribution map is obtained from the gray-value difference at each pixel position between the self-adaptive corrosion image and the dilated image; a confidence threshold is set, the gray value of pixels in the confidence distribution map that exceed the confidence threshold is set to a first set value, and the gray value of the remaining pixels is set to a second set value, yielding the confidence binary image.
Specifically, first obtain the dilated image of the image to be processed (i.e., the original image): slide a rectangular window over the blurred image to be processed, so that each position corresponds to the center of a rectangular window; preferably the window width is 11 and the height is 3. The axis of the window along the width direction is parallel to the distribution vector at each position. A dilation operation is performed on the window-center pixel as follows: acquire the gray values of all pixels in the window, select the maximum gray value, and take it as the new gray value of the window-center pixel. The dilation operation can reconnect broken detail structures and thus restore part of the image's detail structure. Performing the dilation operation on all pixels of the blurred image yields the dilated image; thus every pixel of the original image corresponds to one erosion result and one dilation result. Secondly, obtain the confidence distribution map of the image to be processed. If the dilation and erosion results at a position do not differ significantly, the results are accurate; if the difference is significant, the erosion and dilation results are inconsistent. Denote by a the difference between the dilation result and the erosion result at the same position; exp(-3a) is used as the confidence of that position, representing the accuracy of the erosion/dilation result there after the original image has been processed. The confidences of all positions constitute the confidence distribution map. Finally, the confidence binary image is obtained.
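A minimal sketch of the dilation and confidence computation, with two stated simplifications: the 3 × 11 window is kept axis-aligned (the patent orients it along the local distribution vector), and plain gray-scale erosion stands in for the edge-adaptive erosion. Normalizing the difference before applying exp(-3a) is an assumption.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(32, 32)).astype(float)  # stand-in gray image

# 3 (high) x 11 (wide) window; axis-aligned here for simplicity
dilated = grey_dilation(img, size=(3, 11))
eroded = grey_erosion(img, size=(3, 11))   # stand-in for the edge-adaptive erosion

a = np.abs(dilated - eroded) / 255.0       # per-position disagreement (normalization assumed)
confidence = np.exp(-3.0 * a)              # confidence distribution map
```

Positions where dilation and erosion agree get confidence near 1; large disagreement drives the confidence toward exp(-3).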
The confidence distribution map is mean-filtered with a 3 × 3 window to obtain a filtered result map, which is then thresholded as follows: the gray value of pixels whose value is larger than the first threshold (0.3) is set to a first set value, preferably 1, and the gray values of the remaining pixels are set to a second set value, preferably 0. This yields a binary image, i.e., the confidence binary image. At pixel positions with gray value 1 on the confidence binary image, the erosion result and the dilation result agree and the confidence is high; at positions with gray value 0 they disagree and the confidence is low.
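The mean filtering and thresholding of the confidence map can be sketched directly; the random input merely stands in for a real confidence distribution map.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(2)
conf = rng.random((16, 16))                   # stand-in confidence map in [0, 1]

filtered = uniform_filter(conf, size=3)       # 3 x 3 mean filtering
binary = (filtered > 0.3).astype(np.uint8)    # first threshold 0.3 -> set values 1 / 0
```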
(6) The original image, the self-adaptive corrosion image and the dilated image are fused using the confidence binary image to obtain a new gray image. Let the confidence binary image be I1, the dilated image I2, the adaptive erosion image I3 and the original image I0, all regarded as two-dimensional matrices of equal size. The new gray image matrix is then: I_new = I1 ⊙ ((I2 + I3)/2) + (1 - I1) ⊙ I0, where ⊙ denotes the Hadamard product and 1 is the all-ones matrix of the same size.
Here I_new is the new gray image, I0 is the pixel gray matrix of the original image, I1 is the pixel matrix of the confidence binary image, I2 is the pixel gray matrix of the dilated image, and I3 is the pixel gray matrix of the self-adaptive corrosion image; (I2 + I3)/2 means that the dilated image and the adaptive erosion image are averaged, I1 ⊙ ((I2 + I3)/2) represents the Hadamard product of I1 and that mean, and (1 - I1) ⊙ I0 represents the Hadamard product of (1 - I1) and I0. The Hadamard product of any two matrices is the matrix formed by the products of their corresponding elements, and is itself a matrix. I1 ⊙ ((I2 + I3)/2) keeps the mean of the erosion and dilation results only at positions where the two agree, and is 0 at positions where they disagree; (1 - I1) ⊙ I0 keeps I0 at positions where the erosion and dilation results disagree, and is 0 at positions where they agree. Therefore, at each position of I_new, if the erosion and dilation results agree there, the gray value is determined by their mean; otherwise it is determined by the original image I0.
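The fusion rule described above is a single element-wise line in NumPy; the tiny arrays below exercise both branches (an agreeing position and a disagreeing one). The array values are illustrative.

```python
import numpy as np

def fuse(I0, I1, I2, I3):
    """I_new = I1 .* (I2 + I3)/2 + (1 - I1) .* I0 (element-wise / Hadamard)."""
    return I1 * (I2 + I3) / 2.0 + (1 - I1) * I0

I0 = np.array([[9.0, 9.0]])   # original image
I1 = np.array([[1.0, 0.0]])   # confidence binary image (1 = erosion/dilation agree)
I2 = np.array([[4.0, 4.0]])   # dilated image
I3 = np.array([[2.0, 2.0]])   # adaptive erosion image
I_new = fuse(I0, I1, I2, I3)
```

At the first position the results agree, so the mean (4 + 2)/2 = 3 is used; at the second they disagree, so the original value 9 is kept.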
Finally, the fiber analysis of the image to be processed is complete and the new gray-scale image is obtained.
Third, the new gray image obtained by the fiber analysis is taken as the image to be processed and the fiber analysis is performed again; this is iterated continuously until the new gray image obtained by the fiber analysis is stable, and the stable new gray image is taken as the fiber distribution map. Specifically, the new gray image I_new is regarded as the image to be processed, and the processing procedure applied to the original image is executed again. Preferably, the above steps are cycled four or more times, so that when the resulting I_new no longer changes, the last I_new, i.e. the stable new gray image, is taken as the fiber distribution map. In this iteration the edge information is constantly updated and the edge distribution field is continuously refined, so that the erosion result map and the dilation result map come to agree region by region. The final stable new gray image represents the fiber distribution on the yarn; it resolves the problems of unseparated and discontinuous fibers caused by interweaving coverage of fibers, noise and blur, and the resulting fiber distribution is clear, complete and easy to distinguish.
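The iterate-until-stable loop can be sketched generically. The toy `step` below is a simple contraction with fixed point 0.5 and merely stands in for the full fiber-analysis pipeline; `max_iter` and `tol` are assumed stopping parameters, since the patent only states that four or more cycles are preferred.

```python
import numpy as np

def iterate_until_stable(img, step, max_iter=50, tol=1e-6):
    """Re-apply the fiber-analysis step until the output stops changing."""
    cur = img
    for _ in range(max_iter):
        nxt = step(cur)
        if np.max(np.abs(nxt - cur)) <= tol:
            return nxt
        cur = nxt
    return cur

# toy stand-in step: contracts every pixel toward the fixed point 0.5
stable = iterate_until_stable(np.full((4, 4), 0.9),
                              lambda a: 0.5 + 0.1 * (a - 0.5))
```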
And obtaining a fiber distribution diagram according to the new gray-scale image.
And finally, obtaining a texture image map according to the fiber distribution map, and judging the yarn quality.
Preferably, the obtaining of the texture image map according to the fiber distribution map specifically comprises: obtaining a texture image map according to the fiber distribution map, and calculating the fiber straightening degree of all pixel positions in the texture image map; setting a degree threshold value, binarizing the texture image map according to the relationship between the fiber straightening degree and the degree threshold value to obtain a binarized image, and calculating the area of a connected domain of the binarized image; setting an area threshold, and judging that the yarn has a quality problem if the area of the connected domain is larger than the area threshold; otherwise, the yarn quality is qualified.
Preferably, the method for obtaining the binary image by binarizing the texture image according to the straightening degree of the fibers comprises the following steps: and setting a degree threshold value, setting the gray value of the pixel point with the fiber straightening degree in the texture image greater than the degree threshold value as a first set value, and setting the gray value of the rest pixel points as a second set value to obtain a binary image. The degree of fiber straightening is determined by the gray value of the pixel points in the texture image.
Specifically, first the texture image map is calculated from the fiber distribution map. The method is as follows: set up a 17 × 17 window; all pixels in the window form a sub-image. Compute the gray-level co-occurrence matrix of this small-window image and the entropy value of that co-occurrence matrix, then assign the entropy value to the window's center point, completing the texture feature calculation for the first small window. Then move the window by one pixel to form another small-window image and repeat the computation of the co-occurrence matrix and its entropy. Proceeding in this way, a texture feature image map is gradually generated from the fiber distribution map, with the same size as the original image. Each pixel of the texture image map represents the complexity or disorder of the local texture: the larger the value, the more complex the texture distribution, the more disordered the fiber distribution at that position, and the lower the straightening degree. Next, the fiber straightening profile is obtained. Let p be the value at any position of the texture image; the straightening degree of the fiber at that position is obtained from p, decreasing as the entropy p increases. The straightening degrees of all positions constitute the fiber straightening profile. Then the fiber straightening profile is thresholded: the gray value of pixels whose value is smaller than the second threshold is set to a first set value, and the gray values of the remaining pixels are set to a second set value, obtaining a binary image. Preferably, the second threshold is 0.2, the first set value is 1, and the second set value is 0. Finally, yarn drafting quality is monitored and yarn quality is judged.
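A self-contained sketch of the per-window texture measure: the entropy of a gray-level co-occurrence matrix (GLCM), computed here with plain NumPy for a single horizontal offset. The patent does not specify the offsets or the quantization, so `levels=8` and the one-pixel horizontal neighbour are assumptions.

```python
import numpy as np

def glcm_entropy(patch, levels=8):
    """Entropy of the horizontal-offset gray-level co-occurrence matrix."""
    patch = np.asarray(patch, dtype=float)
    q = (patch * levels / (patch.max() + 1.0)).astype(int)   # quantize gray levels
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):    # horizontal neighbours
        glcm[a, b] += 1
    p = glcm / glcm.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# a flat 17 x 17 patch has zero texture entropy; a noisy one is disordered
flat = np.full((17, 17), 120.0)
noisy = np.random.default_rng(3).integers(0, 256, (17, 17))
e_flat, e_noisy = glcm_entropy(flat), glcm_entropy(noisy)
```

Sliding this function over the fiber distribution map with stride 1 yields the texture image map described above: low entropy for straight, ordered fibers and high entropy for disordered regions.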
The connected-domain areas of the binary image are calculated; if an area is larger than an area threshold, the yarn quality has a problem. The area threshold is preferably one tenth of the area of the yarn ROI (the ROI is considered to be defined in advance). If the yarns in three consecutively acquired images all show quality problems, the production equipment or production parameters are at fault and need to be adjusted in time.
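The final area check can be sketched with `scipy.ndimage.label` (its default 4-connectivity is an assumption, as the patent does not state the connectivity); `frac=0.1` encodes the preferred one-tenth-of-ROI threshold.

```python
import numpy as np
from scipy.ndimage import label

def quality_problem(binary, roi_area, frac=0.1):
    """True if any connected domain exceeds frac * roi_area (assumed threshold rule)."""
    lab, n = label(binary)
    if n == 0:
        return False
    areas = np.bincount(lab.ravel())[1:]   # component sizes, background bin dropped
    return bool(areas.max() > frac * roi_area)

mask = np.zeros((10, 10), dtype=int)
mask[2:5, 3] = 1                           # one 3-pixel low-straightness region
```

With a yarn ROI area of 20, the 3-pixel region exceeds the threshold of 2 pixels and flags a quality problem.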
And finishing monitoring the yarn drafting quality.
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (10)
1. A yarn drafting quality monitoring method based on image processing comprises the following steps:
acquiring a gray image of the drafted yarn as an original image, and performing fiber analysis by taking the original image as an image to be processed; the fiber analysis comprises: respectively obtaining a de-noised image of an image to be processed and a distribution vector of each pixel point; selecting a window to slide the denoised image, performing coordinate centralization on all pixels in the window, and projecting the pixel points after coordinate centralization onto a normal vector of a distribution vector to obtain the positions of the pixel points in the normal vector direction; obtaining a maximum value point under a mapping relation according to the mapping relation between the gray values of all pixel points in the window and the positions of the pixel points in the normal vector direction; constructing a Gaussian mixture model according to the gray value of the pixels in the window, wherein the number of single Gaussian models in the Gaussian mixture model is the same as the number of maximum value points; determining a new pixel value of the center position of the window according to the output of zero taking of the independent variable of the single Gaussian model with the mean value closest to zero; traversing the denoised image by using a sliding window to obtain new pixel values of all pixel points to form a self-adaptive corrosion image; obtaining an expansion image of the original image, and obtaining a confidence binary image according to the self-adaptive corrosion image and the expansion image; fusing the original image, the self-adaptive corrosion image and the expansion image by using the confidence coefficient binary image to obtain a new gray level image;
taking a new gray-scale image obtained by fiber analysis as an image to be processed for fiber analysis, continuously iterating until the new gray-scale image obtained by fiber analysis is stable, and taking the stable new gray-scale image as a fiber distribution map; and obtaining a texture image map according to the fiber distribution diagram, and judging the yarn quality.
2. The yarn drafting quality monitoring method based on image processing as claimed in claim 1, wherein the coordinate centering specifically comprises: acquiring the pixel coordinates of the center of the sliding window, and subtracting the center coordinates from the coordinates of all pixel points in the window to realize the coordinate centering.
3. The yarn drafting quality monitoring method based on image processing according to claim 1, wherein the texture image map is obtained according to the fiber distribution map, and the judgment of the yarn quality specifically comprises:
obtaining a texture image map according to the fiber distribution map, and calculating the fiber straightening degree of all pixel positions in the texture image map; setting a degree threshold value, binarizing the texture image map according to the relationship between the fiber straightening degree and the degree threshold value to obtain a binarized image, and calculating the area of a connected domain of the binarized image; setting an area threshold, and judging that the yarn has a quality problem if the area of the connected domain is larger than the area threshold; otherwise, the yarn quality is qualified.
4. The yarn drafting quality monitoring method based on image processing as claimed in claim 1, wherein the new pixel value of the window center position is obtained by:
acquiring a single Gaussian model with a positive mean value and closest to zero and a single Gaussian model with a negative mean value and closest to zero, and determining the pixel gray value of the center position of the window according to the zero output of the acquired independent variable of the single Gaussian model if only one single Gaussian model is acquired;
if two single Gaussian models are obtained, then for the two single Gaussian models whose means are closest to zero, correcting the output value (at argument zero) of the single Gaussian model with the larger variance using a first fusion weight to obtain a first correction value; correcting the output value (at argument zero) of the single Gaussian model with the smaller variance using a second fusion weight to obtain a second correction value; and obtaining the new pixel value of the window center position according to the first correction value and the second correction value.
5. The method for monitoring the yarn drafting quality based on the image processing as claimed in claim 4, wherein the first blending weight is:
7. The method for monitoring the yarn drafting quality based on the image processing as claimed in claim 1, wherein the method for obtaining the confidence binary image according to the adaptive erosion image and the expansion image is as follows:
obtaining a confidence distribution diagram according to the gray value difference at each pixel position between the self-adaptive corrosion image and the expansion image, setting a confidence threshold, setting the gray value of pixel points larger than the confidence threshold in the confidence distribution diagram to a first set value, and setting the gray value of the remaining pixel points to a second set value to obtain the confidence binary image.
8. The method for monitoring the yarn drafting quality based on the image processing as claimed in claim 1, wherein the method for fusing the original image, the adaptive erosion image and the expansion image by using the confidence binary image to obtain the new gray level image comprises:
I_new = I1 ⊙ ((I2 + I3)/2) + (1 - I1) ⊙ I0, wherein I_new is the new gray image, I0 is the pixel gray matrix of the original image, I1 is the pixel matrix of the confidence binary image, I2 is the pixel gray matrix of the expansion image, I3 is the pixel gray matrix of the self-adaptive corrosion image, I1 ⊙ ((I2 + I3)/2) represents the Hadamard product of I1 and the mean of I2 and I3, and (1 - I1) ⊙ I0 represents the Hadamard product of (1 - I1) and I0.
9. The method of claim 3, wherein the fiber straightening degree is determined by gray-scale values of pixels in the texture image.
10. The yarn drafting quality monitoring method based on image processing as claimed in claim 3, characterized in that the method for binarizing the texture image map according to the fiber straightening degree to obtain a binarized image comprises:
and setting a degree threshold value, setting the gray value of the pixel point with the fiber straightening degree in the texture image greater than the degree threshold value as a first set value, and setting the gray value of the rest pixel points as a second set value to obtain a binary image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111177959.9A CN113610852B (en) | 2021-10-10 | 2021-10-10 | Yarn drafting quality monitoring method based on image processing |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111177959.9A CN113610852B (en) | 2021-10-10 | 2021-10-10 | Yarn drafting quality monitoring method based on image processing |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113610852A CN113610852A (en) | 2021-11-05 |
CN113610852B true CN113610852B (en) | 2021-12-10 |
Family
ID=78343412
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111177959.9A Active CN113610852B (en) | 2021-10-10 | 2021-10-10 | Yarn drafting quality monitoring method based on image processing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113610852B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114067227B (en) * | 2021-11-29 | 2024-06-28 | 黑龙江农垦建工路桥有限公司 | Slope potential safety hazard monitoring method based on unmanned aerial vehicle aerial photography |
CN115272303B (en) * | 2022-09-26 | 2023-03-10 | 睿贸恒诚(山东)科技发展有限责任公司 | Textile fabric defect degree evaluation method, device and system based on Gaussian blur |
CN115345882B (en) * | 2022-10-18 | 2023-03-24 | 南通莱晋纺织科技有限公司 | Textile compactness evaluation method based on image enhancement |
CN116934749B (en) * | 2023-09-15 | 2023-12-19 | 山东虹纬纺织有限公司 | Textile flaw rapid detection method based on image characteristics |
CN117994236B (en) * | 2024-02-21 | 2024-07-09 | 徐州普路通纺织科技有限公司 | Vortex spun broken yarn intelligent detection method and system based on image analysis |
CN118297914A (en) * | 2024-04-12 | 2024-07-05 | 乐昌市恒发纺织企业有限公司 | Detection system for quality of cotton web in cotton carding process |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107870172A (en) * | 2017-07-06 | 2018-04-03 | 黎明职业大学 | A kind of Fabric Defects Inspection detection method based on image procossing |
CN108346141A (en) * | 2018-01-11 | 2018-07-31 | 浙江理工大学 | Unilateral side incidence type light guide plate defect extracting method |
CN109461136A (en) * | 2018-09-20 | 2019-03-12 | 天津工业大学 | The detection method of fiber distribution situation in a kind of blended fibre products |
EP3732470A2 (en) * | 2017-12-26 | 2020-11-04 | Petr Perner | Devices and methods for yarn quality monitoring |
- 2021-10-10 CN CN202111177959.9A patent/CN113610852B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107870172A (en) * | 2017-07-06 | 2018-04-03 | 黎明职业大学 | A kind of Fabric Defects Inspection detection method based on image procossing |
EP3732470A2 (en) * | 2017-12-26 | 2020-11-04 | Petr Perner | Devices and methods for yarn quality monitoring |
CN108346141A (en) * | 2018-01-11 | 2018-07-31 | 浙江理工大学 | Unilateral side incidence type light guide plate defect extracting method |
CN109461136A (en) * | 2018-09-20 | 2019-03-12 | 天津工业大学 | The detection method of fiber distribution situation in a kind of blended fibre products |
Non-Patent Citations (3)
Title |
---|
Yarn Hairiness Evaluation Using Image Processing;Subhasish Roy等;《2014 International Conference on Control, Instrumentation, Energy & Communication(CIEC) 》;20141231;588-592 * |
Application of Image Processing Technology in Yarn Hairiness Detection; Zhang Guohong et al.; Journal of Hebei University of Science and Technology; 20160229; Vol. 37, No. 1; 76-82 *
Separation Method of Crossed Fibers Based on Digital Image Processing; Xie Deyi et al.; Journal of Shandong Institute of Light Industry; 20080331; Vol. 22, No. 1; 40-42 *
Also Published As
Publication number | Publication date |
---|---|
CN113610852A (en) | 2021-11-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113610852B (en) | Yarn drafting quality monitoring method based on image processing | |
CN110619618B (en) | Surface defect detection method and device and electronic equipment | |
CN114723681B (en) | Concrete crack defect detection method based on machine vision | |
CN110378313B (en) | Cell cluster identification method and device and electronic equipment | |
CN116934749B (en) | Textile flaw rapid detection method based on image characteristics | |
CN117764989B (en) | Visual-aided display screen defect detection method | |
CN117422712B (en) | Plastic master batch visual detection method and system based on image filtering processing | |
CN113592782A (en) | Method and system for extracting X-ray image defects of composite carbon fiber core rod | |
CN116152261B (en) | Visual inspection system for quality of printed product | |
CN113744142A (en) | Image restoration method, electronic device and storage medium | |
CN113610850A (en) | Decorative paper texture abnormity detection method based on image processing | |
CN114565607B (en) | Fabric defect image segmentation method based on neural network | |
CN107895371B (en) | Textile flaw detection method based on peak coverage value and Gabor characteristics | |
CN115008255B (en) | Tool wear identification method and device for machine tool | |
CN115937186A (en) | Textile defect identification method and system | |
CN114905712A (en) | Injection molding machine control method based on computer vision | |
CN117067112B (en) | Water cutting machine and control method thereof | |
WO2021260765A1 (en) | Dimension measuring device, semiconductor manufacturing device, and semiconductor device manufacturing system | |
CN111161228B (en) | Button surface defect detection method based on transfer learning | |
CN115311269B (en) | Textile abnormity detection method | |
Tao | Enhanced Canny Algorithm for Image Edge Detection in Print Quality Assessment | |
CN117911419A (en) | Method and device for detecting steel rotation angle enhancement of medium plate, medium and equipment | |
CN111882495A (en) | Image highlight processing method based on user-defined fuzzy logic and GAN | |
CN109064425A (en) | A kind of image de-noising method of adaptive non local total variation | |
CN112233130B (en) | Cladding pool morphology recognition and closed-loop control method based on instance segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||