CN114998321A - Textile material surface hairiness degree identification method based on optical means - Google Patents
- Publication number
- CN114998321A (application number CN202210844374.6A)
- Authority
- CN
- China
- Prior art keywords
- hairiness
- bright area
- pixel points
- area connected
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06T7/0004—Industrial image inspection
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T7/11—Region-based segmentation
- G06T7/187—Segmentation; Edge detection involving region growing, region merging or connected component labelling
- G06V10/761—Proximity, similarity or dissimilarity measures
- G06V10/762—Pattern recognition or machine learning using clustering
- G06T2207/30108—Industrial image inspection
- G06T2207/30124—Fabrics; Textile; Paper
- G06T2207/30204—Marker
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention relates to the field of data processing, in particular to a method for identifying the degree of hairiness on the surface of a textile material by optical means. The method comprises the following steps: clustering the pixel points whose attention degree is greater than or equal to a threshold value to obtain connected domains, and calculating the main direction vector of each connected domain; for any connected domain, first calculating the similarity between its main direction and that of its nearest adjacent connected domain, and then calculating the consistency between the gradient directions of the pixel points around the main direction of the connected domain and the main direction of the bright area; next, judging whether the pixel points between the connected domains and the pixel points inside them belong to the same hairiness; and finally calculating a severity index from the number and length of the hairiness. The method detects the textile material by optical means, specifically the presence of hairiness flaws on the surface of the textile material. It can be applied to services related to new materials, such as detection, metering, standardization, certification and approval. The invention improves detection efficiency.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a textile material surface hairiness degree identification method based on an optical means.
Background
Yarn is formed by twisting and winding one or more long or short fibers, so some fibers inevitably protrude from the yarn body; yarn hairiness is produced by the combined factors of fiber movement, process configuration and mechanical action during production. Yarn hairiness is one of the characteristics that describe the basic structure and appearance of a yarn, and is also an important reference index for evaluating yarn quality. A proper amount of short hairiness makes the fabric appear full and the garment more comfortable and soft; too much hairiness gives the yarn a rough appearance, and adjacent warp yarns become intertwined, reducing the effective height of the loom shed. An unclear shed causes false warp hanging, increased end breakage and difficult weft insertion, and blocked flight of the weft yarn brings the loom to a stop.
The traditional methods for detecting hairiness of textile yarns include: acquiring a yarn image by optical means and manually measuring the magnified image, or projecting a microscopic yarn image onto a large screen or photographing it, and then judging the degree of the hairiness defect from physical properties of the hairiness in the image, such as color and length.
Disclosure of Invention
In order to solve the problem that existing methods cannot automatically detect hairiness defects, the invention provides a textile material surface hairiness degree identification method based on optical means, adopting the following technical scheme:
the invention provides a textile material surface hairiness degree identification method based on an optical means, which comprises the following steps:
acquiring a surface gray image of the textile yarns, and segmenting an image of a yarn evenness area in the surface gray image of the textile yarns by adopting an image segmentation algorithm to obtain an image of a target area;
dividing the image of the target area into a set number of area images: for any region image: calculating the attention degree of a pixel point corresponding to any gray value in the region according to the gray value of the pixel point in the image of the region, and marking the pixel point of which the attention degree is more than or equal to a set threshold value; clustering the marking pixel points according to the attention degree of the marking pixel points in the area to obtain each bright area connected area corresponding to the area;
calculating the main direction vector of each bright-area connected domain according to the direction vectors of the gradient change directions of the marked pixel points in the connected domain; for any bright-area connected domain in any area image: calculating the similarity of the main directions of the connected domain and its nearest adjacent bright-area connected domain according to their main direction vectors;
judging whether the similarity is greater than a first threshold value; if so, calculating a consistency index between the gradient directions of the pixel points within a set range around the main directions of the two bright-area connected domains and those main directions; judging whether the consistency index is greater than a second threshold value; and if so, judging, according to the gradient amplitude of each pixel point between the two bright-area connected domains, whether those pixel points and the pixel points in the two bright-area connected domains belong to the same hairiness;
and calculating the severity index of the defect of the hairiness according to the number of the hairiness in the images of the regions and the length of the hairiness.
Preferably, the attention degree of the pixel points corresponding to any gray value in the region is calculated by the following formula:

Q = e^{-(g_max - g)/T}

where Q is the attention degree of the pixel points corresponding to the gray value, g is that gray value, g_max is the maximum gray value of the pixel points in the region, e is the base of the natural logarithm, and T is a gray-scale threshold.
Preferably, the main direction vector of each bright-area connected domain is calculated by the following formula:

D = (1/n) * sum_{i=1}^{n} d_i

where D is the main direction vector of the bright-area connected domain, n is the number of marked pixel points in the connected domain, and d_i is the direction vector of the gradient change direction of the i-th marked pixel point in the connected domain.
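As a sketch, the main direction of a connected domain can be taken as the normalised mean of its marked pixels' gradient-direction vectors. The averaging form is an assumption (the patent's equation images are not preserved in this text):

```python
import numpy as np

def main_direction(gradient_vectors):
    """Principal direction vector D of one bright-area connected domain,
    computed as the normalised mean of the n marked pixels' gradient
    change direction vectors d_i (the averaging form is an assumption)."""
    d = np.asarray(gradient_vectors, dtype=np.float64)
    v = d.mean(axis=0)              # average over the n marked pixels
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v
```

For example, two identical gradient vectors along the x-axis yield the unit x-vector as the main direction.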
Preferably, the consistency index between the gradient directions of the pixel points within the set range around the main directions of the two bright-area connected domains and those main directions is calculated by the following formula:

P = (1/2) * [ (1/n1) * sum_{i=1}^{n1} |cos(d_i^1, D_1)| + (1/n2) * sum_{j=1}^{n2} |cos(d_j^2, D_2)| ]

where P is the consistency index, n1 is the number of pixel points within the set range around the main direction of one bright-area connected domain, d_i^1 is the direction vector of the gradient change direction of the i-th such pixel point, D_1 is the main direction vector of that connected domain, and n2, d_j^2 and D_2 are the corresponding quantities for the other bright-area connected domain.
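A minimal sketch of such a consistency index, assuming it combines, per domain, the mean absolute cosine similarity between each surrounding pixel's gradient direction and the domain's main direction (the exact combination rule in the patent is not preserved here):

```python
import numpy as np

def cosine_sim(u, v):
    """Absolute cosine similarity between two direction vectors."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

def consistency_index(dirs1, D1, dirs2, D2):
    """Mean alignment between surrounding pixels' gradient directions and
    each domain's principal direction, averaged over the two domains
    (the averaging over both domains is an assumption)."""
    s1 = np.mean([cosine_sim(d, D1) for d in dirs1])
    s2 = np.mean([cosine_sim(d, D2) for d in dirs2])
    return 0.5 * (s1 + s2)
```

With perfectly aligned gradient directions the index reaches its maximum of 1, which matches the role of the second threshold in the claim.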
Preferably, if the similarity is less than or equal to the first threshold, it is determined that the pixel points in the two bright-area connected domains do not belong to the same hairiness.
Preferably, judging whether the pixel points between two bright-area connected domains and the pixel points in those domains belong to the same hairiness comprises:
detecting the pixel points between the main directions of the two bright-area connected domains with a sliding window: connecting the two closest pixel points on the main direction vectors of the two domains, and taking the midpoint of the connecting line as the initial center of the sliding window;
calculating the variance of the gradient amplitudes of the pixel points within the sliding window, and taking this variance as the fluctuation degree of those pixel points;
and judging whether the fluctuation degree is greater than or equal to a fluctuation-degree threshold; if so, the pixel points in the sliding window belong to the same hairiness as the pixel points in the two bright-area connected domains.
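The sliding-window test above can be sketched as follows; the fluctuation-degree threshold value is a hypothetical placeholder, not one given by the patent:

```python
import numpy as np

def belongs_to_same_hairiness(grad_mags_in_window, fluctuation_threshold=50.0):
    """Fluctuation degree = variance of the gradient magnitudes inside the
    sliding window; the window belongs to the same hairiness as the two
    bright-area connected domains when the fluctuation degree reaches the
    threshold (threshold value is an assumed placeholder)."""
    fluctuation = float(np.var(grad_mags_in_window))
    return fluctuation >= fluctuation_threshold, fluctuation
```

A window alternating between bright and dark gradient magnitudes fluctuates strongly and is accepted; a flat window is rejected.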
Preferably, if the consistency index is less than or equal to the second threshold, it is determined that the pixel points in the two bright-area connected domains do not belong to the same hairiness.
Preferably, the severity index of the hairiness defect is calculated by the following formula:

S = sum_{i=1}^{m} a_i / sum_{i=1}^{m} (a_i + b_i)

where S is the severity index of the hairiness defect, a_i is the number of hairs longer than the set length in the i-th region, b_i is the number of hairs of length less than or equal to the set length in the i-th region, and m is the total number of regions.
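A sketch of the severity index under the assumption (supported by the description, which judges the defect by "the proportion of long hairiness") that it is the ratio of long hairs to all hairs over all regions:

```python
def severity_index(long_counts, short_counts):
    """Severity S of the hairiness defect as the proportion of hairs longer
    than the set length across all m regions; long_counts[i] and
    short_counts[i] are the per-region counts a_i and b_i. The ratio form
    is a reconstruction from the symbol definitions."""
    total_long = sum(long_counts)
    total = total_long + sum(short_counts)
    return total_long / total if total else 0.0
```

For example, 3 long hairs out of 20 total gives a severity index of 0.15.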
The invention has the following beneficial effects. The invention detects the severity of defects on the surface of textile yarns. Considering the influence of hairiness length and long-hairiness density on yarn quality, defect detection mainly consists of measuring the length of the hairiness and the number of long hairs, and judging the degree of the hairiness defect from the hairiness length and the proportion of long hairiness. The method first acquires a gray image of the yarn surface by optical means, then segments out the yarn evenness area to obtain the image of the target area, eliminating the interference of useless pixel points on the measurement result and reducing the amount of calculation. The invention then obtains the connected domains formed by the brighter pixel points of the hairiness. If the same hairiness contains several bright-area connected domains, the main directions of those domains are highly similar, so the similarity of the main directions of adjacent bright-area connected domains is calculated; if the similarity exceeds a threshold, the two connected domains probably belong to the same hairiness. The consistency index between the pixel points around the two adjacent bright-area connected domains and the main directions of those domains is then calculated; if it exceeds a threshold, it is judged whether the pixel points with smaller gray values between the two connected domains and the pixel points inside them belong to the same hairiness. After this judgment, the skeleton of the hairiness is extracted with a skeleton extraction algorithm; the number and length of the hairs are measured, and the severity index of the hairiness defect is calculated from them. The method detects the textile material by optical means (specifically, visible-light images), and in particular detects the presence of hairiness flaws on its surface. It can be applied to services related to new materials, such as detection, metering, standardization, certification and approval. The method detects hairiness flaws automatically, without manual inspection of the severity of the yarn's hairiness flaws, saving time and improving detection efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions and advantages of the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of a method for identifying the hairiness degree of the surface of a textile material based on an optical means provided by the invention;
FIG. 2 is a schematic diagram of an image acquisition system provided by the present invention;
in the figure: 1. a yarn drive system; 2. a first light source; 3. a second light source; 4. a yarn; 5. a background plate; 6. an image acquisition device.
Detailed Description
In order to further illustrate the technical means and effects of the present invention adopted to achieve the predetermined object, the following detailed description of the method for identifying the hairiness on the surface of a textile material based on optical means according to the present invention is provided with the accompanying drawings and the preferred embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of the method for identifying the hairiness degree on the surface of the textile material based on the optical means in detail with reference to the accompanying drawings.
Embodiment of textile material surface hairiness degree identification method based on optical means
Existing methods cannot automatically detect hairiness flaws. To solve this problem, this embodiment proposes a method for identifying the degree of hairiness on the surface of a textile material based on optical means; as shown in fig. 1, the method comprises the following steps:
And step S1, obtaining a gray image of the surface of the textile yarn, and segmenting the yarn evenness area from the gray image with a graph cut algorithm to obtain the image of the target area.
Yarn hairiness defects arise when, during yarn production, fibers fail to be wound into the yarn evenness and protrude from the yarn body, or when hairiness fibers break because improper mechanical control creates excessive tension. In practice, the shape, length, quantity and distribution of yarn hairiness strongly influence the efficiency of subsequent processes and the quality of the end product. Yarn hairiness is generally between 1 mm and 3 mm long, and different products place different requirements on it: for soft fabrics, moderately longer hairiness increases softness, but hairiness longer than 3 mm gives the product a rough appearance and reduces its quality and value, so hairiness longer than 3 mm is not allowed.
The specific scenario of this embodiment is as follows: the produced textile yarn is conveyed into a detection area by a transmission system, the detection area is illuminated and imaged to obtain gray images of the yarn surface, the acquired images are analyzed, the content of long hairiness is detected, and the severity index of the hairiness flaw is calculated.
Because the length and inclination angle of yarn hairiness vary, the hairiness in a captured yarn image differs in form and appears disordered. Capturing a real, complete and clear image of the yarn hairiness is the basis of image-based hairiness detection. Like other machine vision detection systems, the yarn hairiness image measuring system of this embodiment comprises several components working together.
In this embodiment, hairiness defects of different forms must be identified, so the surface image of the textile yarn is collected by optical means. As shown in fig. 2, the yarn 4 enters the image acquisition system through the yarn transmission system 1. The image acquisition system collects images of the yarn surface and comprises an electrostatic device, a first light source 2, a second light source 3, a background plate 5 and an image acquisition device 6 (comprising an industrial CCD camera, a video microscope and a high-speed digital camera). Most common hairiness is end hairiness; the electrostatic device ensures that the yarn hairiness opens to the maximum under static electricity and prevents several hairs from winding together as far as possible, which facilitates the subsequent measurement of the real hairiness length and the counting of the hairs. The camera assembly is placed directly above the yarn to capture top-view images of its surface; an OPT machine-vision light source, positioned obliquely above the yarn, provides illumination compensation for the inspected yarn surface; a black background plate is placed directly below the yarn; and the field of view of the camera is adjusted in advance so that the captured images lie within the area of the black background plate.
In the embodiment, the collected yarn surface image is processed to remove the influence of other environmental interference and noise, and the obtained image is subjected to graying processing to obtain the surface grayscale image of the textile yarn.
A skeleton extraction algorithm is effective for obtaining the main skeleton of a target: analyzing the skeleton reduces unnecessary calculation and makes the length measurement of the target more accurate. A distance-field-based skeleton extraction algorithm works well when the target boundary is well defined, but when the boundary is fuzzy, or the gray levels inside the target differ greatly and the target lacks integrity, the extracted skeleton is easily inaccurate: several small skeletons may be extracted from the same target, losing the reference value of the skeleton analysis. Such algorithms therefore place high demands on the definition of the target boundary. Because the length, radiation direction and inclination angle of yarn hairiness vary, a hairiness in the captured image contains both brighter and darker parts; since the darker parts have lower gray values, threshold segmentation would divide the same hairiness into several small sections. For the same hairiness, the directions perpendicular to the local gradient changes of its pixel points are similar, and the gradient changes follow a certain similarity rule; this is used to determine the approximate area where a hairiness lies, after which skeleton extraction yields the hairiness.
The purpose of this embodiment is to judge whether the yarn hairiness is defective. The pixel points on the yarn evenness interfere with hairiness detection and increase the amount of calculation, so the yarn evenness is first segmented out of the image of the yarn surface for the subsequent detection. Hairiness generally lies within a certain range on both sides of the yarn evenness, so segmenting the evenness also delimits the image range in which hairiness can exist, reducing the amount of calculation. Specifically, a graph cut algorithm distinguishes the pixel points on the yarn evenness from the others: it selects a globally optimal threshold to split the image into sub-regions by computing weights over all segmentations into object and background. The image is built as an undirected graph in which nodes represent pixels and edge weights represent boundary energy. The boundary between object and background is defined by the max-flow/min-cut theorem: when the maximum flow passes from source S to sink T through the graph, the saturated edges determine the boundary between object and background. The graph cut algorithm is prior art and is not described further here. At this point, the image of the yarn evenness area has been segmented.
Because yarn hairiness exists only within a certain range near the yarn evenness, a large part of the captured image shows the background plate; processing all of it would greatly increase the amount of calculation, and the image may also be affected by noise points or other illumination factors. Therefore, taking the segmented yarn body as the image center, the region is extended vertically by x pixel points from the edge pixel points of the yarn evenness, delimiting a rough interval in which hairiness exists. This area is the target area; it does not contain the yarn evenness, and only the pixel points within it are processed. In this embodiment x is set to 300; in a specific application it is set according to actual needs.
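A sketch of this cropping step, assuming a horizontal yarn whose evenness occupies rows `yarn_top` through `yarn_bottom` of the gray image (the band layout is an assumption; the patent only fixes the extension width x):

```python
import numpy as np

def target_bands(gray, yarn_top, yarn_bottom, x=300):
    """Return the two target-area bands of up to x rows above and below the
    segmented yarn evenness (rows yarn_top..yarn_bottom inclusive). The
    yarn body itself is excluded; bands are clipped at the image border."""
    h = gray.shape[0]
    upper = gray[max(0, yarn_top - x):yarn_top, :]
    lower = gray[yarn_bottom + 1:min(h, yarn_bottom + 1 + x), :]
    return upper, lower
```

On a 10x10 test image with the yarn in rows 4 to 6 and x = 2, the bands cover rows 2 to 3 and rows 7 to 8.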
Step S2, the image of the target area is divided into a set number of area images: for any region image: calculating the attention degree of a pixel point corresponding to any gray value in the region according to the gray value of the pixel point in the region image, and marking the pixel point of which the attention degree is more than or equal to a set threshold value; and clustering the marking pixel points according to the attention degree of the marking pixel points in the area to obtain each bright area connected domain corresponding to the area.
This embodiment divides the image of the target region obtained in step S1 into N area images of the same size, analyses each area image independently, and finally judges the severity of the yarn hairiness defect from the hairiness length and hairiness density in each area image. The value of N is set on a case-by-case basis.
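This division can be sketched as an equal split; the patent does not specify the tiling layout, so equal vertical strips along the yarn direction are an assumption:

```python
import numpy as np

def split_into_regions(img, n):
    """Split the target-area image into n (near-)equal area images as
    vertical strips (the actual tiling used by the patent is assumed)."""
    return np.array_split(img, n, axis=1)
```

Each strip is then processed independently before the per-region hairiness counts are combined into the severity index.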
For yarn hairiness, under the action of illumination, gray values of hairiness pixel points are larger relative to pixel points on a background plate, and due to the fact that the length, the radiation direction and the inclination angle of the yarn hairiness are different, the gray values of the pixel points on the same hairiness are different greatly, namely a bright area with a larger gray value exists and a dark area with a smaller gray value exists, and therefore the gray level of the pixel point which is most likely to be the hairiness is determined. For any region image: calculating the attention degree of a pixel point corresponding to any gray value in the region, namely:
where Q is the attention degree of the pixel points corresponding to any gray value; the formula further uses the maximum gray value of the pixel points in the region, the gray value of the pixel point, the base e of the natural logarithm, and a gray threshold, whose value is set in this embodiment and, in a specific application, according to the specific situation.
Considering that white noise may exist in the target area image (although the background is black, the gray values of the background pixel points differ slightly), and in order to prevent noise points and background pixel points from being identified as hairiness pixel points, this embodiment first focuses on the brighter pixel points within the hairiness and assigns pixel points of this type a higher attention degree. That is, the larger the value of Q, the more attention the pixel points corresponding to that gray value receive. This embodiment sets an attention degree threshold; when the attention degree of a pixel point is greater than or equal to this threshold, the pixel point is judged to be a pixel point with a larger gray value on the yarn hairiness and is marked. The marked pixel points are the pixel points in the bright areas of the yarn hairiness. In this embodiment the threshold is set to 0.8, and in a specific application it is set as the case may be.
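The exact attention-degree formula appears only as an image in the original and cannot be recovered, so the sketch below uses a purely hypothetical stand-in Q(g) that merely matches the described behaviour: it uses the regional maximum gray value, the base e of the natural logarithm and a gray threshold T (the value 180 here is an assumption, not from the patent), is zero below T, and approaches 1 as the gray value approaches the maximum:

```python
import numpy as np

def attention_degree(gray_region, T=180):
    """Hypothetical attention degree Q(g): zero below the gray threshold T,
    rising toward 1 as g approaches the regional maximum gray value.
    This is a stand-in; the patent's actual formula is not reproduced."""
    g = gray_region.astype(np.float64)
    g_max = g.max()
    return np.where(g >= T, np.exp(-(g_max - g) / max(g_max - T, 1.0)), 0.0)

def mark_pixels(gray_region, T=180, q_thresh=0.8):
    # Mark pixel points whose attention degree reaches the embodiment's
    # threshold of 0.8; these are the bright-area hairiness candidates.
    return attention_degree(gray_region, T) >= q_thresh
```

Any monotone function with these properties would play the same role in the later clustering step.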
The pixel points in the bright area of a hairiness generally occur in clusters, that is, they form a connected domain, so this embodiment clusters the marked pixel points with a region growing method based on their attention degree. Specifically, the marked pixel point with the largest attention degree is selected as the growth seed point; if several pixel points share the maximum attention degree, one of them is selected at random. The 8-neighborhood of the selected pixel point is searched, and the neighborhood pixel points whose attention degree meets the threshold are retained and merged into the region. The enlarged region is then taken as the new seed region and its neighborhood is searched again in the same way, the region being updated after each search; the iteration continues until the neighborhood contains no further pixel points meeting the attention degree condition, at which point the first connected domain is obtained. The same procedure is applied to the remaining marked pixel points to obtain the second connected domain, and so on, until all marked pixel points have been clustered. At this point all the bright area connected domains of the region have been obtained.
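The clustering procedure above can be sketched as a seeded 8-neighborhood region growing over the attention-degree map. This is a minimal version assuming Q is a 2-D array of attention degrees and the embodiment's threshold of 0.8:

```python
import numpy as np

def grow_bright_regions(Q, q_thresh=0.8):
    """Cluster marked pixel points (Q >= q_thresh) into bright-area
    connected domains by 8-neighborhood region growing, seeding each
    round at the remaining pixel with the largest attention degree."""
    marked = Q >= q_thresh
    labels = np.zeros(Q.shape, dtype=int)
    current = 0
    while marked.any():
        current += 1
        # Seed: the remaining marked pixel with the largest attention degree.
        seed = np.unravel_index(np.argmax(np.where(marked, Q, -1.0)), Q.shape)
        stack = [seed]
        marked[seed] = False
        labels[seed] = current
        while stack:
            r, c = stack.pop()
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < Q.shape[0] and 0 <= cc < Q.shape[1]
                            and marked[rr, cc]):
                        marked[rr, cc] = False
                        labels[rr, cc] = current
                        stack.append((rr, cc))
    return labels
```

Each positive label in the result corresponds to one bright area connected domain; unmarked pixels keep label 0.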
Step S3, calculating the main direction vector of each bright area connected domain according to the direction vector of the gradient change direction of each marked pixel point in each bright area connected domain; for any bright area connected region in any area image: and calculating the similarity of the main directions of the bright area connected region and the adjacent bright area connected region closest to the bright area connected region according to the main direction vector of the bright area connected region and the main direction vector of the adjacent bright area connected region closest to the bright area connected region.
Considering that the gradient change direction of the pixel points in a bright area of a hairiness is often perpendicular to the trunk of the hairiness, the trunk direction of the hairiness is obtained from the gradient changes of the marked pixel points. The Sobel operator is used to calculate, for every pixel point in the target area image, the x-direction gradient Gx and the y-direction gradient Gy; the gradient amplitude of a pixel point is then sqrt(Gx^2 + Gy^2), and its gradient direction is arctan(Gy/Gx).
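The Sobel step can be sketched in plain numpy as below (a real implementation would use a library convolution; here the borders are simply zero-padded):

```python
import numpy as np

def sobel_gradients(img):
    """Sobel x/y gradients, gradient amplitude and gradient direction for
    every pixel of a small grayscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    padded = np.pad(img.astype(np.float64), 1)
    gx = np.zeros(img.shape, dtype=np.float64)
    gy = np.zeros(img.shape, dtype=np.float64)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            win = padded[r:r + 3, c:c + 3]
            gx[r, c] = (win * kx).sum()
            gy[r, c] = (win * ky).sum()
    mag = np.hypot(gx, gy)       # gradient amplitude sqrt(Gx^2 + Gy^2)
    theta = np.arctan2(gy, gx)   # gradient direction
    return gx, gy, mag, theta
```

For a vertical intensity step, the gradient points horizontally (theta = 0), which is the perpendicular-to-trunk behaviour the patent relies on.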
Next, the main direction vector of each bright area connected domain is calculated, namely: the main direction vector of any bright area connected domain is the mean of the direction vectors of the gradient change directions of the marked pixel points in the connected domain, i.e., the sum of the direction vectors of the gradient change directions of the marked pixel points divided by the number of marked pixel points in the connected domain.
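Consistent with the symbol descriptions (a count of marked pixel points and the per-pixel direction vectors of the gradient change directions), the main direction vector can be sketched as the mean of unit gradient-direction vectors:

```python
import numpy as np

def principal_direction(thetas):
    """Main direction vector of one bright-area connected domain: the mean
    of the unit direction vectors d_i of the gradient change directions
    of its N marked pixel points, D = (1/N) * sum(d_i)."""
    d = np.column_stack([np.cos(thetas), np.sin(thetas)])  # unit vectors d_i
    return d.mean(axis=0)
```

`thetas` would be the gradient directions of the marked pixel points of one connected domain, as produced by the Sobel step.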
When the main directions of two adjacent bright area connected domains are similar, the two connected domains are more likely to belong to the same hairiness. Hairiness is generally thin and about 3 mm long, so the number of bright area connected domains on one hairiness is not particularly large. If a hairiness carries more than two bright area connected domains, adjacent ones are very close together, and the distance between two adjacent hairinesses is generally larger than the distance between two adjacent bright area connected domains on the same hairiness. Therefore the distance between adjacent bright area connected domains on the same hairiness is mostly smaller than that between bright area connected domains on different hairinesses; the two nearest adjacent bright area connected domains may belong to the same hairiness, and the main directions of bright area connected domains on the same hairiness are similar. In this embodiment, the adjacent bright area connected domains mentioned hereafter are always the nearest adjacent bright area connected domains. For any bright area connected domain, the similarity between its main direction and that of its nearest adjacent bright area connected domain is calculated, namely:
where the similarity of the main directions of the two adjacent connected domains is calculated from the main direction vector of the gradient change of the bright area connected domain and the main direction vector of the gradient change of its nearest adjacent bright area connected domain.
Thus, the similarity of the main directions of the adjacent bright area connected regions is obtained.
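The similarity formula itself is elided in the original; since the text later states that the similarity tends to 1 when the directions agree, one natural stand-in (an assumption, not the patent's formula) is the absolute cosine similarity of the two main direction vectors, taking the absolute value because a trunk direction is axial:

```python
import numpy as np

def main_direction_similarity(D1, D2):
    """Hedged stand-in for the patent's main-direction similarity:
    absolute cosine similarity of the two main direction vectors,
    approaching 1 when the main directions agree."""
    D1 = np.asarray(D1, dtype=np.float64)
    D2 = np.asarray(D2, dtype=np.float64)
    return abs(D1 @ D2) / (np.linalg.norm(D1) * np.linalg.norm(D2))
```

This keeps the comparison with the first threshold (0-to-1 scale) meaningful regardless of vector magnitudes.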
Step S4, judging whether the similarity is greater than a first threshold; if so, calculating a consistency index between the gradient directions of the pixel points within a set range around the main directions of the corresponding two bright area connected domains and those main directions; judging whether the consistency index is greater than a second threshold; and if so, judging, according to the gradient amplitudes of the pixel points between the corresponding two bright area connected domains, whether those pixel points and the pixel points in the two bright area connected domains belong to the same hairiness.
The more similar the main directions of gradient change of two adjacent bright area connected domains, that is, the closer their main-direction similarity is to 1, the greater the probability that the two connected domains belong to the same hairiness. This embodiment sets a similarity threshold (the first threshold) and judges whether the main-direction similarity of any two adjacent bright area connected domains exceeds it; if so, the two adjacent bright area connected domains are more likely to belong to the same hairiness. However, because of noise points and other factors, judging only by the main-direction similarity of the two adjacent bright area connected domains is inaccurate. Analysis shows that, in the transition from a bright area of the hairiness to a dark area, the gradient change directions of the pixel points between the bright and dark areas have high similarity with the main direction of the hairiness. The pixel points around the main direction of any bright area connected domain are obtained as follows: a sliding window is established at each pixel point on the edge of the connected domain along its main direction, and all pixel points falling inside these sliding windows are the surrounding pixel points of the main direction. In this embodiment the window side length is set to 7, and in a specific application it is set according to the specific situation. Then the consistency index between the gradient directions of the pixel points around the main directions of the two adjacent bright area connected domains and those main directions is calculated, namely:
where the consistency index between the gradient directions of the pixel points around the main directions of the two adjacent bright area connected domains and those main directions is computed from the number of pixel points around the main direction of one bright area connected domain and the direction vectors of the gradient change directions of those pixel points, and likewise from the number and direction vectors for the other bright area connected domain. The larger the consistency index, the greater the probability that the two bright area connected domains, together with the pixel points around their main directions, belong to the same hairiness. This embodiment sets a consistency-index threshold (the second threshold) and judges whether the consistency index exceeds it; if it is larger, subsequent processing is performed, and if it is smaller, the two bright area connected domains do not belong to the same hairiness.
When the consistency index is greater than the second threshold, this embodiment uses a sliding window to detect the pixel points between the two adjacent bright area connected domains that have smaller gray values but may still belong to the hairiness, and judges whether they belong to the same hairiness as the pixel points in the two bright area connected domains. Specifically, a sliding window of a set size is established; the pixel point on the main direction vector of the first bright area connected domain closest to the main direction vector of the second bright area connected domain is taken as the first reference point, and the pixel point on the main direction vector of the second bright area connected domain closest to the main direction vector of the first is taken as the second reference point. The two reference points are connected, the midpoint of the connecting line is taken as the initial center of the sliding window, and the fluctuation degree of the gradient amplitudes of the pixel points in the window is calculated, namely:
where the fluctuation degree of the gradient amplitudes of the pixel points in the sliding window is their variance: the sum, over the pixel points in the window of set side length, of the squared difference between the gradient amplitude of the r-th pixel point and the mean gradient amplitude of the pixel points in the window, divided by the number of pixel points in the window. The larger the fluctuation degree, the more likely the pixel points in the sliding window are hairiness pixel points: the background plate is black, so the gradient amplitudes of background pixel points fluctuate little, and noise points are usually isolated points whose fluctuation degree is also small. When the fluctuation degree is greater than or equal to the fluctuation threshold, the pixel points are judged to belong to the hairiness. In this embodiment the fluctuation threshold is set to 3, and in a specific application it is set as the case may be.
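Since claim 6 states that the fluctuation degree is the variance of the gradient amplitudes in the window, it can be sketched directly:

```python
import numpy as np

def gradient_fluctuation(mag_window):
    """Fluctuation degree of the gradient amplitudes inside the sliding
    window: their variance, i.e. the mean squared deviation from the
    window mean (per claim 6)."""
    m = np.asarray(mag_window, dtype=np.float64)
    return ((m - m.mean()) ** 2).mean()
```

A flat background window yields 0, so only windows whose variance reaches the threshold (3 in the embodiment) are accepted as hairiness.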
In this embodiment, a region growing method based on the membership of the pixel points is then used to obtain the hairiness region, taking the bright-area target points as seed points. When the fluctuation degree of the pixel points in the sliding window over the eight-neighborhood is greater than or equal to the threshold, the pixel points in the sliding window are judged to belong to the hairiness; the qualifying pixel points are merged with the bright area, the seed points are updated, and the iteration continues until the seed-point region no longer meets the iteration requirement, at which point the hairiness region is obtained.
Step S5, calculating the severity index of the hairiness defects according to the number of hairiness in each region image and the lengths of the hairiness.
In this embodiment, in order to obtain the length of each hairiness, a skeleton extraction algorithm based on the distance field is used: for each point P inside a hairiness region, the distance to the boundary is d(P) = min(d(P, O)) over all boundary points O, where d(P, O) is the Euclidean distance from point P to point O. Both the region growing method and the distance-field-based skeleton extraction algorithm are conventional and are not described further here. In this way the skeleton of each hairiness is obtained.
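The distance field d(P) above can be sketched by brute force for small regions (a real implementation would use a Euclidean distance transform). Here a boundary point is taken to be a region pixel with a 4-neighbor outside the region, an assumption the patent leaves unstated:

```python
import numpy as np

def boundary_distance_field(region_mask):
    """d(P) = min over boundary points O of the Euclidean distance
    d(P, O), computed by brute force over a boolean region mask.
    The skeleton can then be read off as the ridge (local maxima) of d."""
    h, w = region_mask.shape
    rows, cols = np.nonzero(region_mask)
    boundary = []
    for r, c in zip(rows, cols):
        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if not (0 <= rr < h and 0 <= cc < w and region_mask[rr, cc]):
                boundary.append((r, c))
                break
    boundary = np.array(boundary, dtype=np.float64)
    D = np.zeros(region_mask.shape)
    for r, c in zip(rows, cols):
        D[r, c] = np.sqrt(((boundary - (r, c)) ** 2).sum(axis=1)).min()
    return D
```

Boundary pixels get distance 0 and interior pixels grow toward the medial axis, which is what the skeleton extraction traces.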
In this embodiment, after the skeleton of the yarn hairiness in each region is obtained, the length of each hairiness is measured, and the severity index of the yarn hairiness defect is calculated from the lengths of all hairiness in each region and the density of the long hairiness, namely:
where the severity index of yarn hairiness defects is computed from the number of long hairiness in each region, the number of short hairiness in each region, and the number of regions; a long hairiness is a hairiness longer than 3 mm, and a short hairiness is a hairiness of length less than or equal to 3 mm. The denser the distribution of long hairiness, that is, the larger their number, the greater the severity of the yarn hairiness defect.
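The severity formula itself is elided in the original. Purely as an illustration consistent with the stated behaviour (it grows with both the number and the density of long hairiness across the regions), a hypothetical stand-in could be:

```python
def hairiness_severity(long_counts, short_counts):
    """Hypothetical severity index (the patent's formula is not
    recoverable): average, over the regions, of the long-hairiness count
    weighted by the long-hairiness share, so it grows with both the
    number and the density of long hairs (> 3 mm)."""
    n = len(long_counts)
    total = 0.0
    for l, s in zip(long_counts, short_counts):
        total += l * (l / (l + s)) if (l + s) else 0.0
    return total / n
```

Any formula monotone in the long-hairiness count and density per region would support the same grading and feedback step described next.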
In this embodiment, quality classification and negative-feedback adjustment are performed according to the severity index of the hairiness defect: when the hairiness defect is severe, the yarn is recovered, re-twisted and rewound; when the hairiness defect is slight, corresponding singeing is performed after spinning is completed.
The purpose of this embodiment is to detect the severity of defects on the surface of textile yarns; since the length of yarn hairiness and the density of long hairiness affect yarn quality, defect detection needs to be performed on new materials such as textile yarns. The method first acquires a surface grayscale image of the textile yarn by optical means, then segments the yarn body region to obtain the image of the target area, eliminating the interference of useless pixel points on the measurement result and reducing the amount of calculation. Some pixel points on a hairiness are brighter and some are darker, so this embodiment first obtains the connected domains formed by the brighter pixel points on the hairiness. If the same hairiness carries several bright area connected domains, the main directions of those connected domains have high similarity; this embodiment therefore calculates the similarity of the main directions of adjacent bright area connected domains and judges whether it exceeds a threshold. If so, the two corresponding bright area connected domains are likely to belong to the same hairiness, and the consistency index between the pixel points around the main directions of the two adjacent bright area connected domains and those main directions is calculated and judged against a threshold. If the consistency index also exceeds its threshold, the method judges whether the pixel points with smaller gray values between the two bright area connected domains and the pixel points in the two bright area connected domains belong to the same hairiness. After this judgment, the skeleton of each hairiness is extracted with a skeleton extraction algorithm, the number and length of the hairiness are measured, and the severity index of the hairiness defect is calculated from the number and length of the hairiness. The method detects textile material by optical means (specifically, visible-light images), and in particular detects the presence of hairiness defects on the surface of the textile material. It can be applied to services related to new materials, such as new-material detection, metering, related standardization, certification and approval services. The method of this embodiment detects hairiness defects automatically, so the severity of yarn hairiness defects does not need to be checked manually, which saves time and improves detection efficiency.
It should be noted that: the above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (8)
1. A method for identifying the hairiness degree of the surface of a textile material based on an optical means is characterized by comprising the following steps:
acquiring a surface gray image of the textile yarns, and segmenting an image of a yarn evenness area in the surface gray image of the textile yarns by adopting an image segmentation algorithm to obtain an image of a target area;
dividing the image of the target area into a set number of area images: for any region image: calculating the attention degree of a pixel point corresponding to any gray value in the region according to the gray value of the pixel point in the image of the region, and marking the pixel point of which the attention degree is more than or equal to a set threshold value; clustering the marking pixel points according to the attention degree of the marking pixel points in the area to obtain each bright area connected area corresponding to the area;
calculating a main direction vector of each bright area connected domain according to the direction vector of the gradient change direction of each marked pixel point in each bright area connected domain; for any bright area connected region in any area image: calculating the similarity of the main directions of the bright area communicating region and the adjacent bright area communicating region closest to the bright area communicating region according to the main direction vector of the bright area communicating region and the main direction vector of the adjacent bright area communicating region closest to the bright area communicating region;
judging whether the similarity is greater than a first threshold; if so, calculating a consistency index between the gradient directions of the pixel points within a set range around the main directions of the corresponding two bright area connected domains and those main directions; judging whether the consistency index is greater than a second threshold; and if so, judging, according to the gradient amplitudes of the pixel points between the corresponding two bright area connected domains, whether those pixel points and the pixel points in the two bright area connected domains belong to the same hairiness;
and calculating the severity index of the defect of the hairiness according to the number of the hairiness in the images of the regions and the length of the hairiness.
2. The method for identifying the hairiness degree of the surface of a textile material based on an optical means as claimed in claim 1, wherein the attention degree of a pixel point corresponding to any gray value in the region is calculated using the following formula:
wherein the attention degree of the pixel points corresponding to any gray value is calculated from the gray value of the pixel point, the maximum gray value of the pixel points in the region, the base e of the natural logarithm, and a gray threshold.
3. The method for identifying the hairiness degree on the surface of the textile material based on the optical means as claimed in claim 1, wherein the following formula is adopted to calculate the main direction vector of each bright area connected region:
4. The method for identifying the degree of the hairiness on the surface of the textile material based on the optical means as claimed in claim 1, wherein the consistency index of the gradient direction of the pixel points in the set range corresponding to the main directions of the two bright area connected regions and the main direction of the bright area is calculated by adopting the following formula:
wherein the consistency index between the gradient directions of the pixel points within the set range around the main directions of the corresponding two bright area connected domains and those main directions is calculated from: the number of pixel points within the set range around the main direction of one bright area connected domain, the direction vectors of the gradient change directions of those pixel points, and the main direction vector of that bright area connected domain; and likewise the number of pixel points, the direction vectors of the gradient change directions, and the main direction vector for the other bright area connected domain.
5. The method as claimed in claim 1, wherein if the similarity is less than or equal to the first threshold, it is determined that the pixel points in the two bright area connected regions do not belong to the same hairiness.
6. The method for identifying the degree of hairiness on the surface of textile material based on optical means as claimed in claim 1, wherein said determining whether each pixel point between two bright-area connected regions and the corresponding pixel point in the two bright-area connected regions belong to the same hairiness comprises:
detecting pixel points between main directions of two corresponding bright area connected domains by using a sliding window, connecting two pixel points which are closest to each other on main direction vectors of the two bright area connected domains, and taking a middle point on a connecting line as an initial central point of the sliding window;
calculating the variance of the gradient amplitude of the pixel points in the sliding window, and taking the variance as the fluctuation degree of the corresponding pixel points in the sliding window;
and judging whether the fluctuation degree is greater than or equal to a fluctuation degree threshold value, and if so, judging that the pixel points in the sliding window and the pixel points in the corresponding two bright area connected regions belong to the same hairiness.
7. The method as claimed in claim 1, wherein if the consistency index is less than or equal to a second threshold, it is determined that the pixels corresponding to the two bright area connected regions do not belong to the same hairiness.
8. The method for identifying the hairiness degree on the surface of the textile material based on the optical means as claimed in claim 1, wherein the severity index of the hairiness defect is calculated by using the following formula:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210844374.6A CN114998321B (en) | 2022-07-19 | 2022-07-19 | Textile material surface hairiness degree identification method based on optical means |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210844374.6A CN114998321B (en) | 2022-07-19 | 2022-07-19 | Textile material surface hairiness degree identification method based on optical means |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114998321A true CN114998321A (en) | 2022-09-02 |
CN114998321B CN114998321B (en) | 2023-12-29 |
Family
ID=83021021
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210844374.6A Active CN114998321B (en) | 2022-07-19 | 2022-07-19 | Textile material surface hairiness degree identification method based on optical means |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114998321B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115237083A (en) * | 2022-09-23 | 2022-10-25 | 南通沐沐兴晨纺织品有限公司 | Textile singeing process control method and system based on computer vision |
CN117173162A (en) * | 2023-11-01 | 2023-12-05 | 南通杰元纺织品有限公司 | Textile flaw detection method and system |
CN117422716A (en) * | 2023-12-19 | 2024-01-19 | 沂水友邦养殖服务有限公司 | Ecological early warning method and system for broiler chicken breeding based on artificial intelligence |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020054694A1 (en) * | 1999-03-26 | 2002-05-09 | George J. Vachtsevanos | Method and apparatus for analyzing an image to direct and identify patterns |
CN113012105A (en) * | 2021-02-08 | 2021-06-22 | 武汉纺织大学 | Yarn hairiness detection and rating method based on image processing |
CN114627111A (en) * | 2022-05-12 | 2022-06-14 | 南通英伦家纺有限公司 | Textile defect detection and identification device |
CN114897890A (en) * | 2022-07-08 | 2022-08-12 | 南通华烨塑料工业有限公司 | Artificial intelligence-based modified plastic production regulation and control method |
-
2022
- 2022-07-19 CN CN202210844374.6A patent/CN114998321B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020054694A1 (en) * | 1999-03-26 | 2002-05-09 | George J. Vachtsevanos | Method and apparatus for analyzing an image to direct and identify patterns |
CN113012105A (en) * | 2021-02-08 | 2021-06-22 | 武汉纺织大学 | Yarn hairiness detection and rating method based on image processing |
CN114627111A (en) * | 2022-05-12 | 2022-06-14 | 南通英伦家纺有限公司 | Textile defect detection and identification device |
CN114897890A (en) * | 2022-07-08 | 2022-08-12 | 南通华烨塑料工业有限公司 | Artificial intelligence-based modified plastic production regulation and control method |
Non-Patent Citations (2)
Title |
---|
王文帝等: "单一视角下自适应阈值法的纱线毛羽识别及其应用", 《纺织学报》 * |
王文帝等: "单一视角下自适应阈值法的纱线毛羽识别及其应用", 《纺织学报》, no. 05, 15 May 2019 (2019-05-15) * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115237083A (en) * | 2022-09-23 | 2022-10-25 | 南通沐沐兴晨纺织品有限公司 | Textile singeing process control method and system based on computer vision |
CN115237083B (en) * | 2022-09-23 | 2024-01-12 | 南通沐沐兴晨纺织品有限公司 | Textile singeing process control method and system based on computer vision |
CN117173162A (en) * | 2023-11-01 | 2023-12-05 | 南通杰元纺织品有限公司 | Textile flaw detection method and system |
CN117173162B (en) * | 2023-11-01 | 2024-02-13 | 南通杰元纺织品有限公司 | Textile flaw detection method and system |
CN117422716A (en) * | 2023-12-19 | 2024-01-19 | 沂水友邦养殖服务有限公司 | Ecological early warning method and system for broiler chicken breeding based on artificial intelligence |
CN117422716B (en) * | 2023-12-19 | 2024-03-08 | 沂水友邦养殖服务有限公司 | Ecological early warning method and system for broiler chicken breeding based on artificial intelligence |
Also Published As
Publication number | Publication date |
---|---|
CN114998321B (en) | 2023-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114998321B (en) | Textile material surface hairiness degree identification method based on optical means | |
CN107515220B (en) | Yarn blackboard hairiness amount detection and evaluation method based on image processing | |
CN114842007B (en) | Textile wear defect detection method based on image processing | |
CN116309671B (en) | Geosynthetic fabric quality detection system | |
CN113724241B (en) | Broken filament detection method and device for carbon fiber warp-knitted fabric and storage medium | |
CN109509171A (en) | A kind of Fabric Defects Inspection detection method based on GMM and image pyramid | |
CN114581376B (en) | Automatic sorting method and system for textile silkworm cocoons based on image recognition | |
CN116523899A (en) | Textile flaw detection method and system based on machine vision | |
CN113936001B (en) | Textile surface flaw detection method based on image processing technology | |
CN109919939B (en) | Yarn defect detection method and device based on genetic algorithm | |
Anila et al. | Fabric texture analysis and weave pattern recognition by intelligent processing | |
CN116894840B (en) | Spinning proofing machine product quality detection method and system | |
CN110838113B (en) | Method for detecting monofilament count and monofilament thickness consistency in multifilament synthesis | |
TWI417437B (en) | Yarn detecting method | |
Zhang et al. | An investigation of ramie fiber cross-section image analysis methodology based on edge-enhanced image fusion | |
CN114913180B (en) | Intelligent detection method for defect of cotton cloth reed mark | |
CN114897788B (en) | Yarn package hairiness detection method based on guided filtering and discrete difference | |
CN116563276A (en) | Chemical fiber filament online defect detection method and detection system | |
Jeffrey Kuo et al. | Self-organizing map network for automatically recognizing color texture fabric nature | |
CN115294165A (en) | Intelligent operation method for textile singeing process based on machine vision | |
CN115524337A (en) | Cloth inspecting method based on machine vision | |
Ramakrishnan et al. | A Novel Fabric Defect Detection Network in textile fabrics based on DLT | |
Niles et al. | A system for analysis, categorisation and grading of fabric defects using computer vision | |
CN117392132B (en) | Visual detection method for sewing defects of garment fabric | |
CN113393444B (en) | Polyester DTY network point detection method based on image processing technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |