CN109211918B - Fabric bow weft detection method based on weft trend
- Publication number: CN109211918B (application CN201810992722.8A)
- Authority: CN (China)
- Legal status: Active (assumed status; Google has not performed a legal analysis)
Classifications
- G01N21/8851 — Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8854 — Grading and classifying of flaws
- G01N2021/888 — Marking defects
- G01N2021/8887 — Scan or image signal processing based on image processing techniques
- G06T7/0004 — Industrial image inspection
- G06T2207/30108 — Industrial image inspection
- G06T2207/30124 — Fabrics; Textile; Paper
Abstract
The invention discloses a fabric bow weft (weft bow) detection method based on the weft trend, comprising the following steps. Step 1: process the collected original fabric image. Step 2: correct the reconstructed image until the warp yarns are perpendicular to the horizontal direction, obtaining a corrected image. Step 3: extract the effective weft yarn regions of the image corrected in step 2 and mark each effective weft yarn region with a rectangular frame. Step 4: obtain a group of discrete coordinate points for each effective weft yarn region by a stripe space constraint method, and fit the discrete coordinate points of one effective weft yarn region to a straight line, which is the axis curve of that region. Step 5: quantitatively analyze the trend of the axis curves to obtain the bending direction and bending degree of the weft bow, so as to display the weft bow defect. The invention can make targeted adjustments according to the weft trend, giving a better adjustment effect and stronger practicability.
Description
Technical Field
The invention relates to a fabric bow weft detection method based on weft trend, and belongs to the technical field of textile fabric detection.
Background
China is a major textile country, and the textile industry plays a very important role in its economy. During fabric processing, subsequent finishing, and processes such as washing, bleaching and dyeing, however, the weft yarns or knitted courses can deviate from the straight line perpendicular to the warp yarns or wales of the fabric. A uniform skew is called weft skew; one or more arc-shaped distortions across the fabric width are called weft bow (bow picks). Whether weft skew or weft bow, the defect, if not detected, degrades the quality of the finished fabric and can seriously affect the appearance and texture of garments. As requirements on textile quality rise, the fabric bow index receives increasing attention, and automatic detection of this index has become one of the problems the textile industry urgently needs to solve.
With the falling cost of computers and image-acquisition hardware and the continuing development of image processing and computer vision, the advantages of digital image processing for weft bow detection are increasingly prominent. Traditional photoelectric detection suffers from recognition dead angles, low detection precision, and complex installation and debugging; it is also limited by fabric type and cannot be applied to thicker fabrics. Image processing can detect fabric quality efficiently and quickly, improving weft detection efficiency and reducing labor consumption.
Disclosure of Invention
The purpose is as follows: to overcome the defects in the prior art, the invention provides a fabric weft bow detection method based on the weft trend.
The technical scheme is as follows: to solve the above technical problem, the invention adopts the following technical scheme:
A fabric bow weft detection method based on the weft trend comprises the following steps:
Step 1: process the collected original fabric image to obtain a reconstructed image;
Step 2: correct the reconstructed image: detect the inclination angle of the image from the warp direction, extract the vertical trend of the texture, detect the warp yarns, and rotate the image until the warp yarns are perpendicular to the horizontal direction, obtaining a corrected image;
Step 3: extract the effective weft yarn regions of the image corrected in step 2 and mark each effective weft yarn region with a rectangular frame;
Step 4: obtain a group of discrete coordinate points for each effective weft yarn region by the stripe space constraint method, and fit the discrete coordinate points of one effective weft yarn region to a straight line, the axis curve of that region;
Step 5: quantitatively analyze the trend of the axis curves to obtain the bending direction and bending degree of the fabric weft bow, so as to display the weft bow defect.
Preferably, step 1 comprises:
1.1: convert the fabric true-color RGB image into a gray intensity image using a weighted average of the R, G, B components: Gray = 0.29900*R + 0.58700*G + 0.11400*B;
1.2: perform super-resolution reconstruction on images with resolution below 60 pixels/inch, using bicubic interpolation to increase the pixel density.
Preferably, step 2 comprises:
2.1: filter the super-resolution reconstructed image and extract the vertical trend of the texture to obtain an original warp image;
2.2: erode and crop the original warp image, then binarize it with the maximum between-class variance method, converting it to a black-and-white image: a pixel is set to 255 when its gray value exceeds the optimal threshold tv, and to 0 when its gray value is less than or equal to tv;
2.3: obtain all vertical connected domains in the processed image, find the longest one, record its coordinates, and mark it with a rectangular frame;
2.4: track the longest vertical connected domain with the stripe space constraint method: take a group of y coordinates at fixed intervals, trace the two corresponding x coordinates on the connected domain at each y, and take the midpoint of the two x values as the x coordinate, yielding a group of discrete coordinate points; fit the discrete points on one warp yarn to a straight line representing the warp trend; construct a vertical reference line segment in the image and measure the angle between the fitted line and the reference, which is the angle by which the image must be rotated;
2.5: rotate the image by the angle calculated in step 2.4, correcting it until the warp yarns are perpendicular to the horizontal direction, to obtain a skew-corrected image.
Preferably, step 3 comprises: extract the horizontal trend of the texture of the image corrected in step 2, binarize the image, and obtain all horizontal connected domains; screen the horizontal connected domains by sorting them in descending order of length and deleting those shorter than 0.75 times the corrected image length; the remaining horizontal connected domains are the effective weft yarn regions, each marked with a rectangular frame.
Preferably, step 4 comprises:
4.1: determine the coordinates of each effective weft yarn region by the stripe space constraint method: take a group of y coordinates at fixed intervals, trace the two corresponding x coordinates on the region at each y, and take the midpoint of the two x values as the x coordinate, so that each effective weft yarn region yields a group of discrete coordinate points; fit the axis curve of each effective weft yarn region to these points;
4.2: divide the image using the rectangular frame of each effective weft yarn connected domain from step 3, display each weft yarn in its own image, and display the fitted axis curve in each weft yarn image.
Preferably, step 5 comprises: quantitatively analyze the axis curve fitted for each weft yarn and calculate its angle of deviation from the horizontal direction; the tangent of the deviation angle is the bow rate of that weft yarn.
Beneficial effects: the fabric bow weft detection method based on the weft trend proposes a new weft bow detection algorithm and uses visual perception technology to detect bow weft defects automatically. Compared with traditional photoelectric weft straightening, it is simpler to implement and cheaper, and it can make targeted adjustments for bow wefts of different materials and different severities, giving a better adjustment effect and stronger practicability.
Drawings
FIG. 1 shows the steps of analyzing fabric defects by visual perception technology;
FIG. 2 is the original fabric image;
FIG. 3 is the super-resolution reconstructed image;
FIG. 4 is the original warp image;
FIG. 5 is the corrected image;
FIG. 6 shows the result of extracting the effective weft yarn regions;
FIG. 7 shows fitted curves for some of the weft yarns.
Detailed Description
The present invention will be further described with reference to the accompanying drawings.
As shown in FIG. 1, a fabric bow weft detection method based on the weft trend comprises the following steps:
Step 1: as shown in FIG. 2, process the collected original fabric image.
1.1: converting a fabric true color image RGB into a Gray scale intensity image, and using a function of performing Gray scale processing on the image by eliminating hue and saturation information of the image while maintaining brightness, a spatial rectangular coordinate system is established with R, G, B as an axis, so that the color of each pixel of the RGB image can be represented by one point of a three-dimensional space, and the color of each pixel of a Gray image can be represented by one point of a straight line R ═ G ═ B, so that the RGB transfer Gray image is a mapping from the three-dimensional space to a one-dimensional space, and the most conceivable is projection (i.e. one point of the RGB space is perpendicular to the straight line R ═ G ═ B), and the Gray scale processing has a plurality of processing modes: the invention adopts a component method, a maximum method, an average method and a weighted average method, and adopts an algorithm for carrying out weighted average on R, G, B components: gray 0.29900R + 0.58700G + 0.11400B. The gray scale is a quantized value of brightness, the definition of RGB is three objective wavelength values, and sensitivity curves of human eyes to different wavelengths need to be considered during conversion, so that coefficients are not equal.
1.2: the image with the resolution lower than 60 pixels/inch is subjected to super-resolution reconstruction, the 'pixel' density of the image is increased by utilizing double cubic interpolation operation to improve the resolution, and the reconstructed image is shown in figure 3.
Step 2: and correcting the reconstructed image, detecting the inclination angle of the image by using the warp direction, extracting the vertical trend of the texture, detecting the warp, and correcting the image until the warp is vertical to the horizontal direction.
2.1: and filtering the image after super-resolution reconstruction, and extracting the vertical trend of the texture to obtain an original warp image, as shown in fig. 4.
2.2: and corroding and cutting the original warp image, and performing binarization processing by using a maximum inter-class variance method. And converting the image into an image with a black-and-white visual effect, modifying the gray value of each pixel point on the image to be 0 or 255, and modifying the pixel point to be 255 when the gray value of the pixel point is greater than the optimal threshold tv and is less than or equal to the optimal threshold tv, and modifying the pixel point to be 0.
The maximum between-class variance (Otsu) method is sensitive to image noise and to the size of the target, and gives a good segmentation when the between-class variance has a single peak over the threshold range.
Let the segmentation threshold between foreground and background be t. Denote the proportion of foreground pixels by w0 and their average gray level by u0, and the proportion of background pixels by w1 and their average gray level by u1.
The total average gray level of the image is:
u = w0*u0 + w1*u1
The between-class variance of the foreground and background is:
g = w0*(u0 - u)^2 + w1*(u1 - u)^2 = w0*w1*(u0 - u1)^2
When this variance is maximal, the difference between target and background is considered greatest, and the corresponding threshold t is taken as the optimal threshold tv.
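The maximum between-class variance threshold can be found by an exhaustive search over all 256 gray levels. This is a sketch in plain NumPy (the patent does not prescribe an implementation); `otsu_threshold` and the small bimodal test image are illustrative assumptions:

```python
import numpy as np

def otsu_threshold(gray):
    """Search for the threshold t maximizing the between-class
    variance g = w0*w1*(u0-u1)^2 of a uint8 image (step 2.2)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    best_t, best_g = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue  # one class is empty; variance undefined
        u0 = (np.arange(t) * prob[:t]).sum() / w0
        u1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        g = w0 * w1 * (u0 - u1) ** 2
        if g > best_g:
            best_t, best_g = t, g
    return best_t

# A clearly bimodal image: a dark cluster near 40, a bright cluster near 200.
img = np.array([[40, 42, 41, 200], [199, 201, 40, 200]], dtype=np.uint8)
tv = otsu_threshold(img)
binary = np.where(img > tv, 255, 0)  # pixels above tv become white (255)
```

Any threshold between the two clusters maximizes g, so the returned tv falls between 42 and 200 and the binarization separates the dark pixels from the bright ones.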
2.3: acquiring connected domains in all vertical directions in the processed image, marking the longest connected domain, recording the coordinate of the longest vertical connected domain, namely selecting a grain with the longest connected domain from a plurality of vertical connected domains in the image, namely the connected domain with the most obvious grain characteristic, marking by using a rectangular frame, and recording the coordinate of the largest connected domain.
2.4: tracking the maximum connected domain by adopting a stripe space constraint method, determining coordinate values of the connected domain, taking a group of y coordinate values with fixed intervals, sequentially tracking the two corresponding coordinate values in the x direction on the connected domain, taking the middle point of the two coordinate values in the x direction as the x coordinate value, and obtaining a group of discrete coordinate points. These discrete coordinate data points on a warp yarn are fitted to a straight line, which represents the warp yarn strike. And establishing a straight line segment vertical to the horizontal direction in the image, and detecting the angle of the fitted straight line deviating from the straight line segment to obtain the angle of the image required to rotate.
2.5: and (3) rotating the image after super-resolution reconstruction by the angle calculated in the step (2.4), and correcting the image until the warp is vertical to the horizontal direction to obtain an inclination correction image, as shown in fig. 5.
And step 3: and (3) extracting the effective weft yarn region of the image corrected in the step (2), extracting the horizontal trend of the texture, and obtaining all horizontal connected domains in the image after the image is binarized. And screening the horizontal connected domains, arranging the horizontal connected domains according to the descending order of the lengths, and deleting the connected domains with the lengths less than 0 and 75 times of the image length. Each of the remaining connected component areas is an effective weft yarn area, and each effective weft yarn area is marked with a rectangular frame as shown in fig. 6.
Connected domain analysis is a region-extraction method commonly used in image processing. A connected domain is generally an image region composed of adjacent pixels with the same pixel value. In the experiment, each connected region in the image is found and labeled. Taking the grain with the longest connected domain as the standard, the connected domains are screened to keep those with prominent texture features, namely the effective weft yarn regions, each marked with its own rectangular frame.
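The labeling and screening described above can be sketched with a plain flood fill, avoiding any image-processing dependency. This is an assumption-laden toy (in practice a library labeling routine would be used); `horizontal_components`, `effective_weft_regions`, and the comparison of each region's horizontal extent against 0.75 times the image width are illustrative choices:

```python
import numpy as np
from collections import deque

def horizontal_components(binary):
    """Label 4-connected regions of a binary image with BFS flood fill
    and return each region as a list of (y, x) pixel coordinates."""
    h, w = binary.shape
    seen = np.zeros_like(binary, dtype=bool)
    regions = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not seen[sy, sx]:
                q, region = deque([(sy, sx)]), []
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    region.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                regions.append(region)
    return regions

def effective_weft_regions(binary, ratio=0.75):
    """Keep only regions whose horizontal extent is at least `ratio`
    times the image width (the screening rule of step 3)."""
    w = binary.shape[1]
    keep = []
    for region in horizontal_components(binary):
        xs = [x for _, x in region]
        if max(xs) - min(xs) + 1 >= ratio * w:
            keep.append(region)
    return keep

# Two horizontal streaks: one spans the full 10-pixel width, one only 4 pixels.
img = np.zeros((5, 10), dtype=bool)
img[1, :] = True        # full-width weft -> kept
img[3, 2:6] = True      # short noise streak -> discarded
wefts = effective_weft_regions(img)
```

Only the full-width streak survives the 0.75 screening, matching the intent of deleting short connected domains.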
Step 4: obtain a group of discrete coordinate points for each effective weft yarn region by the stripe space constraint method, and fit the discrete coordinate points of one effective weft yarn region to a straight line, namely the axis curve of that region.
4.1: determine the coordinates of each effective weft yarn region by the stripe space constraint method: take a group of y coordinates at fixed intervals, trace the two corresponding x coordinates on the region at each y, and take the midpoint of the two x values as the x coordinate, so that each effective weft yarn region yields a group of discrete coordinate points. Fit the axis curve of each effective weft yarn region to these points.
Straight line fitting is widely used in graphical analysis and is a common tool in computational mathematics. Many fitting methods exist; three based on residual criteria are common, of which the least squares method is the most widely used because of its high fitting precision and simple calculation, and it has been discussed extensively in the literature. The other two methods are more complicated to compute and less discussed. In some cases, however, the least squares method does not necessarily give the best result, and the other two methods can also work well.
Line fitting approximates a series of discrete data points (x_i, y_i), i = 1, 2, ..., n, by a straight line. The line is not strictly required to pass through every data point; it should reflect the basic trend of the data. In a rectangular coordinate system the fitted line is L(x) = c + m*x, where m is the slope and c is the intercept on the y axis. The quantity O_i = L(x_i) - y_i is called the residual, and its size is an important measure of fit quality. Three criteria are commonly used:
1) minimize the sum of the absolute values of the residuals;
2) minimize the maximum absolute value of the residuals;
3) minimize the sum of the squares of the residuals.
The method that finds the fitted line according to criterion 3) is called the least squares method.
4.2: and 3, dividing the image by using the rectangular frame of each effective weft yarn connected domain in the step 3, displaying each weft yarn by using one image, and displaying the fitted axis curve in each weft yarn image.
And 5: and (5) quantitatively analyzing the trend of the axis curve to obtain the bending direction and the bending degree parameters of the fabric bowing latitude so as to display the fabric bowing latitude defects.
According to national standard GB/T 14801-2009 (test method for weft skew and weft bow of woven and knitted fabrics), the weft skew or weft bow rate is S = d / W * 100%, where d is the maximum vertical distance between the weft yarn (or knitted course) and a reference line perpendicular to the warp yarns (or wales), and W is the fabric width or the width of the measured part of the product.
Quantitatively analyze the axis curve fitted for each weft yarn and calculate its angle of deviation from the horizontal direction; the tangent of the deviation angle is the bow rate of that weft yarn.
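The two quantities above, the tangent of the deviation angle and the GB/T 14801-2009 ratio, reduce to one-line formulas. A hedged sketch with illustrative function names and sample values:

```python
import math

def bow_rate_from_angle(deg):
    """Weft bow rate as the tangent of the axis curve's deviation
    from the horizontal (step 5)."""
    return math.tan(math.radians(deg))

def bow_rate_gb(d, W):
    """GB/T 14801-2009 definition: S = d / W * 100%, where d is the
    maximum vertical deviation of the weft and W is the fabric width."""
    return d / W * 100.0

rate = bow_rate_from_angle(45.0)  # a 45-degree skew gives a rate of 1.0
s = bow_rate_gb(3.0, 150.0)       # assumed 3 mm bow over a 150 mm width
```

The two measures agree in spirit: both grow with the deviation of the weft from a horizontal reference.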
Experiments show that this fabric weft bow detection method based on the weft trend can quantitatively analyze and detect the degree of weft bow accurately and quickly; it is simple to implement and inexpensive, improves weft bow detection efficiency, and reduces labor consumption.
The above description is only of the preferred embodiments of the present invention, and it should be noted that: it will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the principles of the invention and these are intended to be within the scope of the invention.
Claims (1)
1. A fabric bow weft detection method based on the weft trend, characterized by comprising the following steps:
Step 1: process the collected original fabric image to obtain a reconstructed image;
Step 2: correct the reconstructed image: detect the inclination angle of the image from the warp direction, extract the vertical trend of the texture, detect the warp yarns, and rotate the image until the warp yarns are perpendicular to the horizontal direction, obtaining a corrected image;
Step 3: extract the effective weft yarn regions of the image corrected in step 2 and mark each effective weft yarn region with a rectangular frame;
Step 4: obtain a group of discrete coordinate points for each effective weft yarn region by the stripe space constraint method, and fit the discrete coordinate points of one effective weft yarn region to a straight line, the axis curve of that region;
Step 5: quantitatively analyze the trend of the axis curves to obtain the bending direction and bending degree of the fabric weft bow, so as to display the weft bow defect;
the step 1 comprises the following steps:
1.1: converting the fabric true color image RGB into a gray intensity image, and adopting an R, G, B component weighted average algorithm: 0.29900R + 0.58700G + 0.11400B;
1.2: performing super-resolution reconstruction on an image with the resolution lower than 60 pixels/inch, and increasing the pixel density of the image by utilizing double cubic interpolation operation;
the step 2 comprises the following steps:
2.1: filtering the image after super-resolution reconstruction, and extracting the vertical direction of the texture to obtain an original warp image;
2.2: corroding and cutting an original warp image, performing binarization processing by using a maximum inter-class variance method, converting the image into an image with a black-white visual effect, and modifying the pixel point to be 255 when the gray value of the pixel point is greater than an optimal threshold tv and modifying the pixel point to be 0 when the gray value of the pixel point is less than or equal to the optimal threshold tv;
2.3: acquiring connected domains in all vertical directions in the processed image, marking the longest connected domain, recording the coordinates of the longest vertical connected domain, and marking by using a rectangular frame;
2.4: tracking the longest vertical connected domain by adopting a stripe space constraint method, taking a group of y coordinate values with fixed intervals, sequentially tracking the coordinate values in two corresponding x directions on the vertical connected domain, and taking the midpoint of the coordinate values in the two x directions as an x coordinate value to obtain a group of discrete coordinate points; fitting the discrete coordinate data points on one warp yarn into a straight line to represent the trend of the warp yarn; establishing a straight line segment vertical to the horizontal direction in the image, and detecting the angle of the fitted straight line deviating from the straight line segment to obtain the angle of the image needing to be rotated;
2.5: rotating the image by the angle calculated in the step 2.4, and correcting the image until the warp is vertical to the horizontal direction to obtain an inclination correction image;
the step 3 comprises the following steps: extracting the texture level trend of the image corrected in the step 2, and obtaining all horizontal connected domains in the image after binarization of the image; screening the horizontal connected domains, arranging the horizontal connected domains in descending order according to the length, and deleting the horizontal connected domains with the length less than 0 and 75 times of the corrected image length; the rest horizontal connected domains are effective weft yarn regions and are marked by using rectangular frames;
the step 4 comprises the following steps:
4.1: determining coordinate values of each effective weft yarn region by a stripe space constraint method, taking a group of y coordinate values with fixed intervals, sequentially tracking the corresponding two coordinate values in the x direction on the effective weft yarn region, taking the midpoint of the two coordinate values in the x direction as the x coordinate value, thus obtaining a group of discrete coordinate points in each effective weft yarn region, and fitting the axis curve of each effective weft yarn region according to the coordinate points;
4.2: dividing the image by using the rectangular frame of each effective weft yarn connected domain in the step 3, displaying each weft yarn by using one image, and displaying the fitted axis curve in each weft yarn image;
the step 5 comprises the following steps: and (3) quantitatively analyzing an axis curve fitted by each weft yarn, and calculating the angle of the axis curve deviating from the horizontal direction, wherein the tan value of the deviation angle is the bowing rate of the weft yarn.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810992722.8A CN109211918B (en) | 2018-08-28 | 2018-08-28 | Fabric bow weft detection method based on weft trend |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109211918A CN109211918A (en) | 2019-01-15 |
CN109211918B true CN109211918B (en) | 2021-02-05 |
Family
ID=64985527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810992722.8A Active CN109211918B (en) | 2018-08-28 | 2018-08-28 | Fabric bow weft detection method based on weft trend |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109211918B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110390675A (en) * | 2019-07-26 | 2019-10-29 | 常州弘仁智能科技有限公司 | A kind of fabric weft inclination detection method based on image analysing computer |
CN113643276B (en) * | 2021-08-29 | 2024-02-02 | 浙江工业大学 | Textile texture defect automatic detection method based on statistical analysis |
CN113780185B (en) * | 2021-09-13 | 2022-06-21 | 常州市宏发纵横新材料科技股份有限公司 | Weft angle detection method and device based on carbon fibers and storage medium |
CN114913180B (en) * | 2022-07-19 | 2022-09-30 | 海门市芳华纺织有限公司 | Intelligent detection method for defect of cotton cloth reed mark |
CN117670888A (en) * | 2024-02-01 | 2024-03-08 | 天津滨海雷克斯激光科技发展有限公司 | Pipeline inner wall defect detection method, device, equipment and medium |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03249264A (en) * | 1990-02-21 | 1991-11-07 | Kawashima Textile Manuf Ltd | Method for detecting cloth strain |
JP2909192B2 (en) * | 1990-11-09 | 1999-06-23 | 株式会社豊田中央研究所 | Fabric bend detector |
CN1715551B (en) * | 2004-06-28 | 2010-08-04 | 宫元九 | Detecting method for textile bias |
JP2008089534A (en) * | 2006-10-05 | 2008-04-17 | Toray Ind Inc | Method and device for inspecting fabric of carbon fiber |
CN100492396C (en) * | 2006-10-27 | 2009-05-27 | 东华大学 | Method for identifying fabric grain image facing camara weft straightener |
CN102393633A (en) * | 2011-11-10 | 2012-03-28 | 戴红 | Fuzzy control system of automatic photoelectric weft straightener |
CN102660862A (en) * | 2012-05-25 | 2012-09-12 | 常州信息职业技术学院 | Method and device for detecting fabric skewness |
CN102778414B (en) * | 2012-08-14 | 2014-07-09 | 顾金华 | Machine vision-based fabric physical property detection method and device |
CN102901466A (en) * | 2012-10-31 | 2013-01-30 | 江南大学 | Fabric weft inclination detection method based on image analysis |
CN103234969B (en) * | 2013-04-12 | 2015-03-04 | 江苏大学 | Method for measuring fabric weft density based on machine vision |
CN103866551B (en) * | 2014-03-28 | 2016-04-20 | 南京理工大学 | Based on the fabric skew quick detecting method of machine vision |
CN103924432A (en) * | 2014-04-28 | 2014-07-16 | 辽宁大学 | Woven fabric weft skewing detection method |
CN206599692U (en) * | 2017-03-22 | 2017-10-31 | 河海大学常州校区 | A kind of fabric skew of weft on-line detecting system |
- 2018-08-28: CN application CN201810992722.8A, patent CN109211918B (active)
Also Published As
Publication number | Publication date |
---|---|
CN109211918A (en) | 2019-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109211918B (en) | Fabric bow weft detection method based on weft trend | |
CN116843688B (en) | Visual detection method for quality of textile | |
CN110866924B (en) | Line structured light center line extraction method and storage medium | |
CN108280823A (en) | The detection method and system of the weak edge faults of cable surface in a kind of industrial production | |
CN104021561A (en) | Fabric fuzzing and pilling image segmentation method based on wavelet transformation and morphological algorithm | |
CN115100206B (en) | Printing defect identification method for textile with periodic pattern | |
CN109993797B (en) | Door and window position detection method and device | |
CN108921819A (en) | A kind of cloth examination device and method based on machine vision | |
CN114820631B (en) | Fabric defect detection method capable of resisting texture interference | |
CN103866551B (en) | Based on the fabric skew quick detecting method of machine vision | |
Deborah et al. | Detection of fake currency using image processing | |
CN115131353B (en) | Flat screen printing textile production abnormity identification and positioning method and system | |
CN114549441A (en) | Sucker defect detection method based on image processing | |
CN111738931B (en) | Shadow removal algorithm for aerial image of photovoltaic array unmanned aerial vehicle | |
CN114972575A (en) | Linear fitting algorithm based on contour edge | |
CN111739012A (en) | Camera module white spot detecting system based on turntable | |
CN111665199A (en) | Wire and cable color detection and identification method based on machine vision | |
CN115049671A (en) | Cloth surface defect detection method and system based on computer vision | |
CN115272256A (en) | Sub-pixel level sensing optical fiber path Gaussian extraction method and system | |
CN115018785A (en) | Hoisting steel wire rope tension detection method based on visual vibration frequency identification | |
Colom et al. | Analysis and extension of the percentile method, estimating a noise curve from a single image | |
CN109961432A (en) | A kind of detection method and system of filter cloth breakage | |
CN108805854B (en) | Method for rapidly counting tablets and detecting completeness of tablets in complex environment | |
CN116805312B (en) | Knitted fabric quality detection method based on image processing | |
CN116740579A (en) | Intelligent collection method for territorial space planning data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||