CN115237083B - Textile singeing process control method and system based on computer vision - Google Patents

Info

Publication number: CN115237083B (granted publication of application CN115237083A)
Application number: CN202211161434.0A
Authority: CN (China)
Language: Chinese (zh)
Inventor: 李友芬
Original and current assignee: Nantong Mumuxingchen Textile Co ltd
Legal status: Active (granted)
Prior art keywords: fuzz, pixel point, value, textile, gray
Priority: CN202211161434.0A

Classifications

    • G05B19/418 — Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41865 — Total factory control characterised by job scheduling, process planning, material flow
    • G05B2219/32252 — Scheduling production, machining, job shop


Abstract

The invention relates to the technical field of textile processing, in particular to a textile singeing process control method and system based on computer vision. The method acquires a textile surface image and, from it, a fuzz region image and the textile edge; acquires a gray image of the fuzz region image and determines, from the gray value of each pixel in the gray image, a fuzz region binary image and the fuzz line pixels and fuzz block pixels in the gray image; determines a closeness value of the fuzz line pixels and a closeness value of the fuzz block pixels from the gray image, the fuzz region binary image, the position of the textile edge, and the two sets of pixels; and determines the fuzz flammability coefficient of the textile from the two closeness values, from which the singeing machine temperature for the textile is then determined. The invention can accurately determine the temperature of the singeing machine and ensure singeing quality.

Description

Textile singeing process control method and system based on computer vision
Technical Field
The invention relates to the technical field of textile processing, in particular to a textile singeing process control method and system based on computer vision.
Background
Because textiles develop fuzz during earlier production processes, and this fuzz adversely affects subsequent processes such as dyeing, textiles usually require a singeing treatment. In existing textile singeing technology, the temperature of the singeing machine is usually determined manually: when the temperature is too low, singeing is insufficient and the smoothness of the textile is poor; when it is too high, the textile is damaged. Manual determination is therefore unreliable.
Disclosure of Invention
The invention aims to provide a textile singeing process control method and system based on computer vision, which solve the poor reliability of manually determined singeing machine temperature.
In order to solve the technical problems, the invention provides a textile singeing process control method based on computer vision, which comprises the following steps:
acquiring a textile surface image, and obtaining a fuzz region image and the textile edge from the textile surface image;
acquiring a gray image of the fuzz region image, and determining, from the gray value of each pixel in the gray image, a fuzz region binary image and the fuzz line pixels and fuzz block pixels in the gray image;
determining a closeness value of the fuzz line pixels and a closeness value of the fuzz block pixels from the gray image, the fuzz region binary image, the position of the textile edge, and the fuzz line and fuzz block pixels;
and determining the fuzz flammability coefficient of the textile from the closeness value of the fuzz line pixels and the closeness value of the fuzz block pixels, and then determining the singeing machine temperature for the textile from the fuzz flammability coefficient.
The step of determining the closeness value of the fuzz line pixels comprises:
determining the fuzz line mapping pixel corresponding to each fuzz line pixel in the fuzz region binary image, and determining the Hessian matrix of each fuzz line mapping pixel; determining the direction of each fuzz line pixel from the position of the textile edge and the Hessian matrix of its mapping pixel; in the gray image, sliding from each fuzz line pixel, as starting point, along its direction until the first fuzz block pixel is reached, thereby determining the sliding distance value of each fuzz line pixel and the total number of times each fuzz line pixel is passed during the sliding of the other fuzz line pixels; determining, from the position of the textile edge, the distance from each fuzz line pixel to the textile edge and the distance from its corresponding first fuzz block pixel to the textile edge; and calculating the closeness value of the fuzz line pixels from these sliding distance values, pass counts, and distances;
the calculation formula corresponding to the pixel point close degree value of the fuzz imaging line is as follows:
$$A = \sum_{i=1}^{N} \frac{h_i + j_i}{p_i} \cdot \frac{1}{1 + c_i}$$
wherein $A$ is the closeness value of the fuzz line pixels, $h_i$ is the sliding distance value of the $i$-th fuzz line pixel, $j_i$ is the distance from the first fuzz block pixel corresponding to the $i$-th fuzz line pixel to the textile edge, $p_i$ is the distance from the $i$-th fuzz line pixel to the textile edge, $c_i$ is the total number of times the $i$-th fuzz line pixel is passed during the sliding of the other fuzz line pixels, and $N$ is the number of fuzz line pixels;
the step of determining the fuzz flammability coefficient of the textile comprises:
acquiring a first weight value for the closeness value of the fuzz line pixels, and a second weight value for the closeness value of the fuzz block pixels; and weighting and summing the two closeness values with the first and second weight values, thereby obtaining the fuzz flammability coefficient of the textile.
Further, the step of determining the closeness value of the fuzz block pixels comprises:
determining the neighborhood gray difference value of each fuzz block pixel in the gray image from the gray value of each pixel in the gray image;
calculating the gray value variance of the fuzz block pixels in the gray image from their gray values;
and calculating the closeness value of the fuzz block pixels from the gray value, the neighborhood gray difference value, and the gray value variance of the fuzz block pixels in the gray image.
Further, the calculation formula for the closeness value of the fuzz block pixels is:
$$B = \frac{1}{1+\sigma} \sum_{t=1}^{M} \frac{1}{1 + g_t \cdot d_t}$$
wherein $B$ is the closeness value of the fuzz block pixels, $\sigma$ is the gray value variance of the fuzz block pixels in the gray image, $g_t$ is the gray value of the $t$-th fuzz block pixel in the gray image, $d_t$ is the neighborhood gray difference value of the $t$-th fuzz block pixel in the gray image, and $M$ is the number of fuzz block pixels.
Further, the step of determining the fuzz line pixels and fuzz block pixels in the gray image comprises:
determining the fuzz pixels among the pixels of the gray image from their gray values, and then calculating the neighborhood gray value difference index value of each fuzz pixel in the gray image;
and classifying the fuzz pixels in the gray image by their neighborhood gray value difference index values, thereby obtaining the fuzz line pixels and the fuzz block pixels.
Further, the step of determining the fuzz region binary image comprises:
judging whether the gray value of each pixel in the gray image is smaller than a gray threshold;
if so, marking the corresponding pixel in the gray image 0, and otherwise marking it 1, thereby obtaining the fuzz region binary image.
Further, the step of determining the singeing machine temperature for the textile comprises:
acquiring a temperature set value of the singeing machine for a set fuzz state of the textile, and then determining the singeing machine temperature for the textile from the temperature set value and the fuzz flammability coefficient of the textile.
In order to solve the above technical problems, the invention also provides a textile singeing process control system based on computer vision, comprising a processor and a memory, the processor executing instructions stored in the memory to implement the above textile singeing process control method based on computer vision.
The invention has the following beneficial effects: a surface image of the textile is acquired, from which the fuzz region image and the textile edge are obtained; the fuzz line pixels and fuzz block pixels are determined from the gray image of the fuzz region image; by combining the gray image, the fuzz region binary image, and the position of the textile edge, the characteristics of the fuzz line pixels and fuzz block pixels are analyzed to determine a closeness value that accurately characterizes the fuzz line region and a closeness value that accurately characterizes the fuzz block region; and from these two closeness values a reasonable singeing machine temperature is determined. The invention can accurately evaluate the closeness of the fuzz on the textile, accurately determine the singeing machine temperature, and thus ensure the singeing quality of the textile.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a method for controlling a singeing process of a textile based on computer vision according to the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the present invention to achieve the preset purpose, the following detailed description is given below of the specific implementation, structure, features and effects of the technical solution according to the present invention with reference to the accompanying drawings and preferred embodiments. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The embodiment provides a textile singeing process control method based on computer vision, and a corresponding flow chart is shown in fig. 1, and comprises the following steps:
(1) Acquire a textile surface image, and obtain a fuzz region image and the textile edge from the textile surface image.
In this embodiment, to facilitate acquisition of the textile surface image, a roller structure is provided whose length is not less than the width of the textile. The textile is draped over the roller, with the length direction of the roller aligned with the width direction of the textile. A high-speed CCD camera then captures, from one side of the textile, an image of the textile surface draped over the roller, yielding the textile surface image.
After the textile surface image is acquired, an image containing only the textile is obtained with an existing image segmentation algorithm. Since the fuzz regions in this image lie along one side of the textile, a graph cut algorithm is used to separate the textile from the fuzz, yielding the fuzz region image. The fuzz region image is then subtracted from the image containing only the textile to obtain the textile edge, and the textile edge is fitted by polynomial fitting to obtain a fitting function. Note that the textile edge is in fact the line formed by the textile at the uppermost part of the roller.
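The polynomial fitting of the textile edge can be sketched with NumPy. The helper name `fit_edge` and the default degree are assumptions, since the embodiment does not fix a degree:

```python
import numpy as np

def fit_edge(edge_points, degree=3):
    """Fit a polynomial y = f(x) through textile-edge pixel
    coordinates. edge_points is an (N, 2) array of (x, y) pixels;
    returns a callable evaluating the fitted edge at any x."""
    pts = np.asarray(edge_points, dtype=float)
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], degree)
    return np.poly1d(coeffs)

# A perfectly straight edge at y = 10 is recovered exactly.
edge = [(x, 10.0) for x in range(50)]
f = fit_edge(edge, degree=1)
```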
(2) Acquire a gray image of the fuzz region image, and determine, from the gray value of each pixel in the gray image, a fuzz region binary image and the fuzz line pixels and fuzz block pixels in the gray image.
From the obtained fuzz region image, the corresponding gray image is obtained, and the fuzz line pixels and fuzz block pixels are determined from it. The specific implementation steps comprise:
(2-1) Determine the fuzz pixels among the pixels of the gray image from their gray values, and then calculate the neighborhood gray value difference index value of each fuzz pixel in the gray image.
In the gray image the fuzz pixels differ markedly from the background pixels, so the two can be distinguished by the gray value of each pixel, thereby determining the fuzz pixels in the gray image; the specific distinguishing method belongs to the prior art and is not repeated here. After the fuzz pixels are determined, the gray values of the 8 neighborhood pixels of each fuzz pixel are obtained, the absolute gray difference between the fuzz pixel and each of its 8 neighbors is computed, and these differences are summed; the sum is the neighborhood gray value difference index value of that fuzz pixel.
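The neighborhood gray value difference index described above can be sketched directly; `neighborhood_diff_index` is a hypothetical helper name:

```python
import numpy as np

def neighborhood_diff_index(gray, r, c):
    """Sum of absolute gray differences between pixel (r, c) and its
    8 neighborhood pixels: the neighborhood gray value difference
    index value of that fuzz pixel. Assumes (r, c) is not on the
    image border."""
    center = int(gray[r, c])
    total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            total += abs(int(gray[r + dr, c + dc]) - center)
    return total

gray = np.array([[10, 10, 10],
                 [10, 50, 10],
                 [10, 10, 10]], dtype=np.uint8)
```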
(2-2) Classify the fuzz pixels in the gray image by their neighborhood gray value difference index values, thereby obtaining the fuzz line pixels and the fuzz block pixels.
With the neighborhood gray value difference index values obtained in step (2-1), a k-means classification with k = 2 separates the fuzz line region from the fuzz block region among the fuzz pixels: the category with the larger difference index values consists of the fuzz line pixels, and the category with the smaller values consists of the fuzz block pixels.
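The k = 2 split of the index values can be sketched with a tiny one-dimensional k-means; `two_means_split` is a hypothetical helper, and by the convention above the True cluster (larger index values) corresponds to the fuzz line pixels:

```python
import numpy as np

def two_means_split(values, iters=20):
    """1-D k-means with k = 2. Returns a boolean mask: True for the
    high-valued cluster (fuzz line pixels), False for the low one
    (fuzz block pixels)."""
    v = np.asarray(values, dtype=float)
    lo, hi = v.min(), v.max()          # initial cluster centres
    for _ in range(iters):
        mask = np.abs(v - hi) < np.abs(v - lo)   # closer to high centre
        if mask.all() or (~mask).all():
            break
        lo, hi = v[~mask].mean(), v[mask].mean()
    return mask

mask = two_means_split([1, 2, 1, 40, 42, 41, 2])
```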
In addition, to facilitate the subsequent determination of the direction of each fuzz line pixel, the fuzz region binary image corresponding to the gray image must be determined from the gray value of each pixel in the gray image. The specific implementation steps comprise:
judging whether the gray value of each pixel in the gray image is smaller than a gray threshold; if so, marking the corresponding pixel in the gray image 0, and otherwise marking it 1, thereby obtaining the fuzz region binary image.
The gray threshold is a hyperparameter that an implementer can adjust to the specific scene; this embodiment sets the gray threshold r = 3 and performs threshold segmentation of the gray image with it. Thresholding with r yields a complete fuzz region binary image in which the fuzz pixels are set to 1 and the non-fuzz pixels to 0.
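The threshold segmentation with gray threshold r can be sketched in one line of NumPy, following the convention above that pixels below r become 0 and all others 1; `fuzz_binary` is a hypothetical helper name:

```python
import numpy as np

def fuzz_binary(gray, r=3):
    """Threshold the gray image: pixels with gray value below r are
    marked 0, all others 1, giving the fuzz region binary image."""
    return (np.asarray(gray) >= r).astype(np.uint8)

img = np.array([[0, 2, 3],
                [5, 1, 7]])
b = fuzz_binary(img)
```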
(3) Determine the closeness value of the fuzz line pixels and the closeness value of the fuzz block pixels from the gray image, the fuzz region binary image, the position of the textile edge, and the fuzz line and fuzz block pixels.
The more closely the fuzz lies against the textile surface, the harder it is to burn off cleanly; likewise, the larger its bulk and the higher its density, the harder it is to burn off cleanly. The closeness of the fuzz is therefore obtained from its form, the flammability of the fuzz is obtained from the overall closeness, and the flame control of the singeing machine is completed on that basis. Since fuzz lines and fuzz blocks are distributed differently, their closeness values must be determined separately. The step of determining the closeness value of the fuzz line pixels comprises:
(3-1) Determine the fuzz line mapping pixel corresponding to each fuzz line pixel in the fuzz region binary image, and determine the Hessian matrix of each fuzz line mapping pixel.
For each fuzz line pixel determined in step (2), the corresponding pixel in the fuzz region binary image is obtained and called its fuzz line mapping pixel. The Hessian matrix of each fuzz line mapping pixel on the binary image is then obtained; the Hessian is a 2×2 symmetric matrix of the second derivatives of the image at the mapping pixel. The specific process of obtaining the Hessian matrix of a pixel belongs to the prior art and is not repeated here.
(3-2) Determine the direction of each fuzz line pixel from the position of the textile edge and the Hessian matrix of its mapping pixel.
After the Hessian matrix of each fuzz line mapping pixel on the binary image is obtained, its eigenvalues and eigenvectors are computed, and the eigenvector belonging to the minimum eigenvalue is taken. This two-dimensional unit vector represents the direction of minimum curvature of the gray value change at the fuzz line pixel on the binary image, and in this embodiment it represents the likely course of the fuzz at that pixel. The direction of each fuzz line pixel is then fixed using the position of the textile edge relative to the pixel: it is the course given by the eigenvector of the minimum eigenvalue of the corresponding Hessian matrix, oriented towards the textile edge.
Note that in this embodiment the Hessian matrix is not applied directly to the gray image to obtain the direction of each fuzz line pixel, because the fuzz images with uniform gray there, so the direction cannot be obtained accurately from the gray image; applying the Hessian matrix to the binary image yields the direction of each fuzz line pixel accurately.
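The direction estimate from the Hessian eigenvector can be sketched with finite differences on the binary image. `min_curvature_direction` is a hypothetical helper; orienting the sign of the resulting unit vector towards the textile edge is omitted for brevity:

```python
import numpy as np

def min_curvature_direction(binary, r, c):
    """Unit eigenvector of the 2x2 Hessian at (r, c) belonging to the
    smallest-magnitude eigenvalue: the direction of least gray-value
    curvature, taken here as the local fuzz-line trend. Assumes
    (r, c) is not on the image border."""
    b = np.asarray(binary, dtype=float)
    # central-difference second derivatives
    dyy = b[r + 1, c] - 2 * b[r, c] + b[r - 1, c]
    dxx = b[r, c + 1] - 2 * b[r, c] + b[r, c - 1]
    dxy = (b[r + 1, c + 1] - b[r + 1, c - 1]
           - b[r - 1, c + 1] + b[r - 1, c - 1]) / 4.0
    H = np.array([[dyy, dxy],
                  [dxy, dxx]])
    w, v = np.linalg.eigh(H)
    k = int(np.argmin(np.abs(w)))
    return v[:, k]                     # (row, col) components

binary = np.zeros((5, 5))
binary[:, 2] = 1.0                     # a one-pixel-wide vertical "fuzz line"
d = min_curvature_direction(binary, 2, 2)
```

For the vertical line, the returned vector points along the line (row direction), matching the "minimum curvature" reading.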
(3-3) In the gray image, slide from each fuzz line pixel, as starting point, along its direction until the first fuzz block pixel is reached, thereby determining the sliding distance value of each fuzz line pixel and the total number of times each fuzz line pixel is passed during the sliding of the other fuzz line pixels.
After the direction of each fuzz line pixel is determined in step (3-2), it is transferred to the gray image. Each fuzz line pixel then slides from its own position along its direction, from the fuzz line region towards the region formed by the fuzz block pixels, and the first fuzz block pixel of that region it reaches is determined. From the sliding process of each fuzz line pixel, its sliding distance value h is determined; at the same time, the total number of times c that each fuzz line pixel is passed during the sliding of the other fuzz line pixels is determined.
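The sliding step can be sketched as a discrete walk along the pixel direction. `slide_to_block` is a hypothetical helper returning the sliding distance h and the pixels passed on the way, from which the pass counts c can be accumulated:

```python
import numpy as np

def slide_to_block(start, direction, is_block, max_steps=500):
    """From a fuzz-line pixel `start`, step one pixel at a time along
    the unit vector `direction` until the first fuzz-block pixel (or
    the image border). Returns (sliding distance h, pixels passed)."""
    rows, cols = is_block.shape
    r, c = start
    dr, dc = direction
    passed = []
    for step in range(1, max_steps + 1):
        rr = int(round(r + step * dr))
        cc = int(round(c + step * dc))
        if not (0 <= rr < rows and 0 <= cc < cols) or is_block[rr, cc]:
            return float(step), passed   # first block pixel (or border) hit
        passed.append((rr, cc))
    return float(max_steps), passed

is_block = np.zeros((1, 10), dtype=bool)
is_block[0, 7] = True
h, passed = slide_to_block((0, 2), (0.0, 1.0), is_block)
```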
(3-4) Determine, from the position of the textile edge, the distance from each fuzz line pixel to the textile edge and the distance from its corresponding first fuzz block pixel to the textile edge.
After the first fuzz block pixel reached by each fuzz line pixel's slide has been determined in step (3-3), the distance j from that first fuzz block pixel to the textile edge is calculated using the textile edge position. At the same time, the distance p from each fuzz line pixel to the textile edge is calculated.
(3-5) Calculate the closeness value of the fuzz line pixels from the sliding distance value of each fuzz line pixel, the total number of times each fuzz line pixel is passed during the sliding of the other fuzz line pixels, the distance from each fuzz line pixel to the textile edge, and the distance from its first fuzz block pixel to the textile edge; the corresponding calculation formula is:
$$A = \sum_{i=1}^{N} \frac{h_i + j_i}{p_i} \cdot \frac{1}{1 + c_i}$$
wherein $A$ is the closeness value of the fuzz line pixels, $h_i$ is the sliding distance value of the $i$-th fuzz line pixel, $j_i$ is the distance from the first fuzz block pixel corresponding to the $i$-th fuzz line pixel to the textile edge, $p_i$ is the distance from the $i$-th fuzz line pixel to the textile edge, $c_i$ is the total number of times the $i$-th fuzz line pixel is passed during the sliding of the other fuzz line pixels, and $N$ is the number of fuzz line pixels.
In this calculation formula, the sliding distance value h of the i-th fuzz line pixel characterizes the distance slid along the course of the fuzz from the fuzz line region to the fuzz block region; the larger the value, the longer the fuzz.
The distance j from the first fuzz block pixel corresponding to the i-th fuzz line pixel to the textile edge characterizes the distance from the sliding end point of the i-th fuzz line pixel to the textile edge, and together with the sliding distance it approximates the length of the fuzz. The longer the fuzz, the more likely it is to twist, and the more likely it is to lie closely packed.
For the distance p from the i-th fuzz line pixel to the textile edge: the larger p, the farther the i-th fuzz line pixel is from the textile edge, the less likely the fuzz is closely packed, and the smaller the closeness value; the smaller p, the closer the pixel is to the textile edge, the more likely the fuzz is closely packed, and the larger the closeness value. At equal p, the longer the fuzz, the more closely packed it is and the larger the closeness value.
For the total number of times c that the i-th fuzz line pixel is passed during the sliding of the other fuzz line pixels: the closer a fuzz line pixel lies to the fuzz block region, the more often it is passed by the slides of the other fuzz line pixels, so a frequently passed pixel should carry a lower closeness weight; c is therefore mapped negatively, so that the lower c, the greater the corresponding closeness weight and, ultimately, the greater the closeness of the whole fuzz.
The closeness value of the fuzz line pixels indicates how closely the fuzz line region currently lies against the textile; the larger the value, the higher the closeness. Rather than using only the distance between the fuzz line pixels and the textile edge, this embodiment combines the binary image and the gray image to determine the sliding distance of each fuzz line pixel, and accumulates the closeness expressed by every fuzz line pixel; the larger the accumulated value, the closer the fuzz.
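Since the original formula figure is not reproduced in the text, one plausible reading consistent with the stated monotonicities is to accumulate the tortuosity (h + j) / p of each fuzz line pixel, damped by its pass count c; this sketch implements exactly that assumed form:

```python
import numpy as np

def line_closeness(h, j, p, c):
    """Closeness value of the fuzz line pixels (assumed functional
    form): grows with sliding distance h and end-point distance j,
    shrinks with edge distance p and pass count c, summed over all
    N fuzz line pixels."""
    h, j, p, c = (np.asarray(a, dtype=float) for a in (h, j, p, c))
    return float(np.sum((h + j) / p / (1.0 + c)))

value = line_closeness(h=[2.0], j=[3.0], p=[5.0], c=[0.0])
```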
Fuzz on the textile surface not only forms lines but also forms blocks. The more uniform the gray values inside a blocked region, the more tightly packed the fuzz there, and the harder it is to burn off cleanly during singeing; the closeness of the blocked fuzz therefore also has to be obtained. A larger gray value means a brighter pixel, better light transmission through the current fuzz block, and a less tight block; likewise, a large gray difference among the surrounding pixels indicates a less tight block. Based on this analysis, the step of determining the closeness value of the fuzz block pixels comprises:
From the gray value of each pixel in the gray image, determine the neighborhood gray difference value of each fuzz block pixel, namely the absolute difference between the maximum and minimum gray values among its 8 neighborhood pixels. From the gray values of the fuzz block pixels in the gray image, calculate their gray value variance. Then, from the gray value, the neighborhood gray difference value, and the gray value variance of the fuzz block pixels in the gray image, calculate the closeness value of the fuzz block pixels; the corresponding calculation formula is:
where Q2 is the fuzz-block pixel closeness value, σ² is the gray-value variance of the fuzz-block pixels in the gray image, Gt is the gray value of the t-th fuzz-block pixel in the gray image, ΔGt is the neighborhood gray difference of the t-th fuzz-block pixel in the gray image, and M is the number of fuzz-block pixels.
In the calculation of the fuzz-block pixel closeness value Q2, a larger gray value Gt means the t-th fuzz-block pixel is brighter, so the current fuzz block transmits light better and is less compact. A larger neighborhood gray difference ΔGt means the gray levels around the t-th fuzz-block pixel vary more, so the fuzz block is less uniform. The gray-value variance σ² characterizes how uniformly the gray values are distributed over the fuzz-block pixels; a larger variance means less uniform light transmittance within the block region. All three quantities are therefore mapped with negative correlation, so that the larger σ², Gt, and ΔGt are, the smaller the value of Q2, and the less tightly packed the fuzz it represents.
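A sketch of this computation follows. The patent's exact formula is not recoverable from this text, so the final negative-correlation mapping below (a negative exponential over the variance and the mean of Gt·ΔGt) is only an assumed form that reproduces the stated monotonicity; the function name is likewise illustrative.

```python
import numpy as np

def block_closeness(gray, block_mask):
    """Fuzz-block pixel closeness value Q2 (assumed functional form).

    Larger gray values, larger 8-neighborhood gray differences, and a
    larger gray-value variance all indicate a looser agglomeration, so
    all three push the returned value down.
    """
    ys, xs = np.nonzero(block_mask)
    vals = gray[ys, xs].astype(float)
    var = vals.var()  # gray-value variance over the fuzz-block pixels
    h, w = gray.shape
    diffs = np.empty(len(ys))
    for k, (y, x) in enumerate(zip(ys, xs)):
        # max-minus-min gray difference over the 8-neighborhood (clipped at borders)
        nb = gray[max(y - 1, 0):min(y + 2, h),
                  max(x - 1, 0):min(x + 2, w)].astype(float)
        diffs[k] = nb.max() - nb.min()
    # assumed negative-correlation mapping; 255**2 only normalizes the scale
    return float(np.exp(-(var + (vals * diffs).mean()) / 255.0 ** 2))
```

A perfectly uniform block yields the maximum closeness of 1.0; any brightness variation inside or around the block lowers the value.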
(4) Determine the fuzz flammability coefficient of the textile from the fuzz-line and fuzz-block pixel closeness values, and then determine the singeing-machine temperature for burning off the textile's fuzz from that coefficient.
Obtain a first weight value corresponding to the fuzz-line pixel closeness value and a second weight value corresponding to the fuzz-block pixel closeness value. Using these weights, take the weighted sum of the two closeness values to obtain the fuzz flammability coefficient of the textile:
S = a1·Q1 + a2·Q2, where S is the fuzz flammability coefficient of the textile, Q1 is the fuzz-line pixel closeness value, Q2 is the fuzz-block pixel closeness value, a1 is the first flammability coefficient, i.e. the first weight value applied to Q1, and a2 is the second flammability coefficient, i.e. the second weight value applied to Q2.
To determine the first flammability coefficient a1, count the fuzz-line pixels, use convex-hull detection to obtain the area A1 of the bounding region enclosing the fuzz-line area, and take the ratio of the pixel count N (representing the fuzz area) to A1, i.e. a1 = N/A1. The larger a1 is, the more individual fuzz strands the line-forming region contains, the less readily it burns, and the worse its flammability.
Similarly, the ratio of the number of fuzz-block pixels M (representing the fuzz area) to the area A2 of the bounding region enclosing the fuzz-block area gives the second flammability coefficient a2 = M/A2. The larger a2 is, the more compact the fuzz agglomeration, the less readily it burns, and the worse its combustibility.
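Both coefficients are fill ratios of fuzz pixels over the area enclosing them. The sketch below uses an axis-aligned bounding box for simplicity; the patent obtains the bounding region via convex-hull detection (e.g. OpenCV's cv2.convexHull and cv2.contourArea would give the hull area instead). The function name is an assumption.

```python
import numpy as np

def fill_ratio(mask):
    """Ratio of the number of set pixels to the area of the axis-aligned
    bounding box around them; a stand-in for the patent's convex-hull
    bounding region. Applied to the fuzz-line mask it approximates
    a1 = N / A1, and applied to the fuzz-block mask, a2 = M / A2."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return 0.0
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return float(len(ys)) / float(box_area)
```

A fully filled region gives a ratio of 1.0; sparse fuzz spread over a large region gives a ratio near 0.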
Obtain the singeing-machine temperature set value T0 for a textile in a reference fuzz state, and then determine the singeing temperature T of the textile from T0, the fuzz flammability coefficient S, and a hyper-parameter c. The practitioner can adjust c according to the material coefficient of the textile; the higher its value, the higher the temperature required when the textile is singed. Here c = 0.5.
Once the singeing temperature T of the textile is determined, the singeing flame temperature is controlled accordingly, completing flame control for the singeing process.
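The temperature rule itself did not survive extraction, so the sketch below assumes a simple linear scaling of the set value T0 by the flammability coefficient S and the hyper-parameter c; only the stated monotonicity (a higher c gives a higher singeing temperature) is taken from the text, while the functional form and function name are assumptions.

```python
def singe_temperature(t_set, flammability, c=0.5):
    """Assumed linear form: scale the set-point temperature t_set by the
    fuzz flammability coefficient and the material hyper-parameter c,
    so a larger c yields a higher singeing temperature."""
    return t_set * (1.0 + c * flammability)
```

With S = 0 the set-point is returned unchanged; increasing S or c raises the commanded flame temperature.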
This embodiment also provides a computer-vision-based textile singeing process control system comprising a processor and a memory, the processor processing instructions stored in the memory to implement the computer-vision-based textile singeing process control method. Since that method is described in detail above, it is not repeated here.
According to the invention, a fuzz-area image is obtained from the textile surface image, the fuzz closeness is obtained from the fuzz-area image, the fuzz flammability is obtained from the fuzz closeness, and the singeing-machine temperature is controlled according to the fuzz flammability. Compared with conventional methods, the fuzz flammability is obtained more accurately, so the singeing-machine flame can be controlled more precisely and singeing quality is ensured.
It should be noted that: the sequence of the embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments. And the foregoing description has been directed to specific embodiments of this specification. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.
The foregoing description of preferred embodiments is not intended to limit the invention to the precise form disclosed; any modifications, equivalents, and improvements made within the spirit and scope of the invention are intended to be included within its scope of protection.

Claims (6)

1. The control method of the textile singeing process based on computer vision is characterized by comprising the following steps of:
acquiring a textile surface image, and acquiring a fuzz area image and a textile edge according to the textile surface image;
acquiring a gray level image of the fuzz area image, and determining a fuzz area binary image and each fuzz imaging line pixel point and a fuzz imaging block pixel point in the gray level image according to gray level values of each pixel point in the gray level image;
determining the close degree value of the fuzz line pixel point and the close degree value of the fuzz block pixel point according to the gray level image, the fuzz area binary image, the position of the textile edge, each fuzz line pixel point and each fuzz block pixel point;
determining the fuzz flammability coefficient of the textile according to the fuzz imaging line pixel point close degree value and the fuzz imaging block pixel point close degree value, and further determining the fuzz burning singeing machine temperature of the textile according to the fuzz flammability coefficient of the textile;
the step of determining the pixel point close degree value of the fuzz imaging line comprises the following steps:
determining the corresponding fuzz-line mapping pixel of each fuzz-line pixel in the fuzz-region binary image, and determining the Hessian matrix of each fuzz-line mapping pixel; determining the direction of each fuzz-line pixel according to the position of the textile edge and the Hessian matrix of each fuzz-line mapping pixel; in the gray image, sliding from each fuzz-line pixel, as a starting point, along its direction toward the region formed by the fuzz-block pixels, and determining the first fuzz-block pixel reached, so as to determine the sliding distance value of each fuzz-line pixel and the total number of times each fuzz-line pixel is passed through during the sliding processes of the other fuzz-line pixels; determining, according to the position of the textile edge, the distance from each fuzz-line pixel to the textile edge and the distance from its corresponding first fuzz-block pixel to the textile edge; and calculating the fuzz-line pixel closeness value according to the sliding distance value of each fuzz-line pixel, the total number of times each fuzz-line pixel is passed through during the sliding processes of the other fuzz-line pixels, the distance from each fuzz-line pixel to the textile edge, and the distance from its corresponding first fuzz-block pixel to the textile edge;
the step of determining the pixel point close degree value of the fuzz forming block comprises the following steps:
determining a neighborhood gray level difference value of each fuzz-blocking pixel point in the gray level image according to the gray level value of each pixel point in the gray level image;
calculating the gray value variance of each fuzz block pixel in the gray image according to the gray value of each fuzz block pixel in the gray image;
calculating the close degree value of each fuzz block pixel point according to the gray value, the neighborhood gray difference value and the gray value variance of each fuzz block pixel point in the gray image;
the step of determining the fuzz flammability factor of the textile comprises:
acquiring a first weight value corresponding to a pixel point of the fuzz imaging line, and acquiring a second weight value corresponding to a pixel point of the fuzz imaging block; according to the first weight value and the second weight value, weighting and summing the fuzz imaging line pixel point close degree value and the fuzz imaging block pixel point close degree value, so as to obtain a fuzz flammability coefficient of the textile;
the step of determining the temperature of the fuzz combustion singeing machine of the textile comprises the following steps:
and acquiring a temperature set value of a fuzz combustion singeing machine of the textile in a set fuzz state, and further determining the temperature of the fuzz combustion singeing machine of the textile according to the temperature set value of the fuzz combustion singeing machine and the fuzz combustibility coefficient of the textile.
2. The method for controlling a singeing process of a textile product based on computer vision according to claim 1, wherein,
the fuzz-line pixel closeness value Q1 is calculated from: the sliding distance value di of the i-th fuzz-line pixel; the distance Li from the first fuzz-block pixel corresponding to the i-th fuzz-line pixel to the textile edge; the distance li from the i-th fuzz-line pixel to the textile edge; the total number of times ni that the i-th fuzz-line pixel is passed through during the sliding processes of the other fuzz-line pixels; and the number N of fuzz-line pixels.
3. The method of claim 2, wherein the fuzz-block pixel closeness value Q2 is calculated from: the gray-value variance σ² of the fuzz-block pixels in the gray image; the gray value Gt of the t-th fuzz-block pixel in the gray image; the neighborhood gray difference ΔGt of the t-th fuzz-block pixel in the gray image; and the number M of fuzz-block pixels.
4. The method of claim 1, wherein determining individual fuzz-line pixels and fuzz-block pixels in the gray scale image comprises:
according to the gray value of each pixel point in the gray image, determining each fuzz pixel point in each pixel point in the gray image, and further calculating the neighborhood gray value difference index value of each fuzz pixel point in the gray image;
and distinguishing each fuzz pixel point in the gray level image according to the neighborhood gray level value difference index value of each fuzz pixel point in the gray level image, so as to obtain each fuzz imaging line pixel point and each fuzz imaging block pixel point.
5. The method of controlling a singeing process for textile fabrics based on computer vision according to claim 1, wherein the step of determining a binary map of fuzz areas comprises:
respectively judging whether the gray value of each pixel point in the gray image is smaller than a gray threshold value;
if the gray level threshold value is smaller than the gray level threshold value, the corresponding pixel point in the gray level image is marked as 0, otherwise, the corresponding pixel point in the gray level image is marked as 1, and therefore a fuzz area binary image is obtained.
6. A computer vision based textile singeing process control system comprising a processor and a memory, the processor for processing instructions stored in the memory to implement the computer vision based textile singeing process control method of any of claims 1-5.
CN202211161434.0A 2022-09-23 2022-09-23 Textile singeing process control method and system based on computer vision Active CN115237083B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211161434.0A CN115237083B (en) 2022-09-23 2022-09-23 Textile singeing process control method and system based on computer vision


Publications (2)

Publication Number Publication Date
CN115237083A CN115237083A (en) 2022-10-25
CN115237083B true CN115237083B (en) 2024-01-12

Family

ID=83667542


Country Status (1)

Country Link
CN (1) CN115237083B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005082616A1 (en) * 2004-02-24 2005-09-09 Milliken & Company Treated textile substrate and method for making a textile substrate
CN101419706A (en) * 2008-12-11 2009-04-29 天津工业大学 Jersey wear flokkit and balling up grading method based on image analysis
CN104021561A (en) * 2014-06-17 2014-09-03 浙江理工大学 Fabric fuzzing and pilling image segmentation method based on wavelet transformation and morphological algorithm
CN104727050A (en) * 2015-04-07 2015-06-24 苏州市晨彩纺织研发有限公司 Laser scanning singeing device
CN108344859A (en) * 2018-01-08 2018-07-31 武汉纺织大学 A kind of online test method of circulating friction fabric surface hairiness
CN109727230A (en) * 2018-11-30 2019-05-07 西安工程大学 A kind of pile textile surface apparatus for measuring quality and measurement method
CN112458677A (en) * 2020-11-09 2021-03-09 济南红英贸易有限公司 Singeing device for manufacturing biotechnical textiles
CN214060967U (en) * 2020-11-17 2021-08-27 无锡宇一精工科技有限公司 Textile singeing device for removing fuzz on surface of fabric
CN113610849A (en) * 2021-10-09 2021-11-05 海门市恒创织带有限公司 Intelligent operation method and system for textile singeing process based on image processing
CN216809262U (en) * 2021-12-04 2022-06-24 苏州润骢科技股份有限公司 Singeing device for removing fuzz on fabric
CN114723704A (en) * 2022-04-01 2022-07-08 南通百杭纺织品有限公司 Textile quality evaluation method based on image processing
CN114998227A (en) * 2022-05-20 2022-09-02 江苏博腾家用纺织品有限公司 Cloth printing and dyeing defect detection method and system based on image processing
CN114998321A (en) * 2022-07-19 2022-09-02 南通博纳纺织品有限公司 Textile material surface hairiness degree identification method based on optical means

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19642712A1 (en) * 1996-10-16 1998-04-23 Saechsisches Textilforsch Inst Method and device for measuring and quality evaluation of surface effects on textile webs
US9359721B2 (en) * 2013-03-13 2016-06-07 WestPoint Home LLC Soft feel printed fabric and method of producing same


Also Published As

Publication number Publication date
CN115237083A (en) 2022-10-25

Similar Documents

Publication Publication Date Title
CN114972357B (en) Roller surface defect detection method and system based on image processing
JP4997252B2 (en) How to identify the illumination area in an image
US10713780B2 (en) Color quality assessment based on multispectral imaging
TWI224287B (en) Iris extraction method
US20090034824A1 (en) Computerized image analysis for acetic acid induced Cervical Intraepithelial Neoplasia
CN106971153B (en) Illumination compensation method for face image
US20100195902A1 (en) System and method for calibration of image colors
CN113570602B (en) Hot-rolled steel coil curling evaluation method based on artificial intelligence
CN114998227A (en) Cloth printing and dyeing defect detection method and system based on image processing
CN108710832B (en) Reference-free iris image definition detection method
JP6405124B2 (en) Inspection device, inspection method, and program
CN115237083B (en) Textile singeing process control method and system based on computer vision
CN115272256A (en) Sub-pixel level sensing optical fiber path Gaussian extraction method and system
CN114513607A (en) Method, device and system for self-adjusting field range of high-temperature industrial endoscope
JP6585793B2 (en) Inspection device, inspection method, and program
Zhang et al. Automated microwave tomography (Mwt) image segmentation: State-of-the-art implementation and evaluation
Zulkeflee et al. Detection of a new crescent moon using the Maximally Stable Extremal Regions (MSER) technique
CN115471504B (en) Automatic thread end identification method based on textile fabric
CN107886549A (en) A kind of dermatoglyphic pattern of the fabric color transfer method based on braiding grain details enhancing
CN115880297A (en) Quilt cover dyeing quality evaluation method based on machine vision
CN107705284B (en) Bayesian small sample learning-based surface defect detection method
KR101151154B1 (en) Image processing method for determinimg the skin elasticity using moireimage
Hill et al. Dimensional change measurement and stain segmentation in printed fabrics
JP2009217798A (en) Contour detection method, contour detection device, and contour detection program
JP2009217799A (en) Contour detection method, contour detection device, and contour detection program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant