CN113610848B - Digital cloth processing system, cloth flaw detection method, device and medium - Google Patents

Digital cloth processing system, cloth flaw detection method, device and medium Download PDF

Info

Publication number
CN113610848B
CN113610848B (application CN202111175162.5A)
Authority
CN
China
Prior art keywords
cloth
feature
scale
target
feature map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111175162.5A
Other languages
Chinese (zh)
Other versions
CN113610848A (en)
Inventor
何梁博 (He Liangbo)
孙凯 (Sun Kai)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba China Co Ltd
Priority to CN202111175162.5A
Publication of CN113610848A
Application granted
Publication of CN113610848B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20112 Image segmentation details
    • G06T2207/20132 Image cropping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30124 Fabrics; Textile; Paper

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The embodiments of the present application provide a digital cloth processing system, a cloth flaw detection method, a device and a medium. In the embodiments of the present application, while the target cloth is being loosened, a cloth image of each cloth area is collected and a flaw detection task for the target cloth is executed using a flaw detection model with a mesh network structure; alternatively, the cut piece areas on the target cloth are determined from the cut piece information of the garment to be produced in the order information, and targeted flaw detection is automatically performed on those cut piece areas while the target cloth is being loosened. Automatic detection of cloth flaws is thus achieved in the cloth loosening step, which improves the accuracy and efficiency of flaw detection and reduces labor cost. In addition, the flaw detection model detects flaws by means of multi-scale feature fusion, so more image features can be fused during the multi-scale feature fusion processing, further improving flaw detection accuracy.

Description

Digital cloth processing system, cloth flaw detection method, device and medium
Technical Field
The application relates to the technical field of intelligent manufacturing, in particular to a digital cloth processing system, a cloth defect detection method, equipment and a medium.
Background
With the continuous development of technologies such as cloud computing, the Internet of Things and artificial intelligence, more and more digital factories are emerging. A digital factory can digitize the whole production chain of a product, from raw material purchasing and product design to production and processing, and can also manufacture in a flexible manufacturing mode. In the flexible manufacturing mode, the production system adapts quickly to changes in market demand by improving its system structure, personnel organization, operation mode and marketing, while eliminating redundant and useless losses so that the enterprise obtains greater benefits. In this mode, a digital factory takes consumer demand as its core, restructures the traditional production-and-marketing model, and realizes on-demand intelligent manufacturing.
In digital production, the quality of raw materials or semi-finished products needs to be inspected to ensure the quality of the final product. For example, in the clothing manufacturing industry, a fabric needs to undergo flaw detection before it is cut, and removing or cutting out flawed fabric improves the quality of the finished garments. At present, cloth flaw detection is mostly done by manual cloth inspection: the cloth is sampled and assigned a quality grade, and if the quality grade of a roll of cloth is high enough, the roll is considered flaw-free. This approach has high labor cost, and the stability and accuracy of the detection results are poor and lack objectivity.
Disclosure of Invention
Aspects of the present application provide a digital cloth processing system, a cloth defect detection method, a device and a medium, which are used to improve the accuracy and efficiency of defect detection and reduce the labor cost.
An embodiment of the present application provides a digital cloth processing system, including: a cloth loosening device and an edge computing device located in an edge cluster. The cloth loosening device comprises a cloth loosening mechanism for performing cloth loosening treatment on a target cloth, and a vision acquisition system arranged on the cloth loosening path of the cloth loosening mechanism. The vision acquisition system is used for acquiring an image of each cloth area entering its visual range and sending the acquired cloth images to the edge computing device. The edge computing device runs a defect detection model that performs multi-scale fusion with a mesh network structure, and is used for performing multi-scale feature fusion processing on each of the plurality of cloth images according to the mesh network structure adopted by the defect detection model to obtain a plurality of defect feature maps, and for generating a defect map corresponding to the target cloth from the plurality of defect feature maps.
The embodiment of the application further provides a cloth defect detection method, which includes: in the process of loosening a target fabric, collecting a plurality of fabric images for the target fabric, wherein the fabric image is an image of a fabric area of the target fabric; respectively carrying out multi-scale feature fusion processing on the plurality of cloth images according to a mesh network structure adopted by a flaw detection model to obtain a plurality of flaw feature maps, wherein the flaw detection model is a neural network model which adopts the mesh network structure to carry out multi-scale fusion; and generating a defect map corresponding to the target cloth according to the plurality of defect characteristic maps.
The embodiment of the application further provides a cloth cutting method, which comprises the following steps: receiving order information of clothing production, wherein the order information comprises cutting piece information required by clothing to be produced; determining a cutting piece area on the target fabric according to the cutting piece information; in the process of loosening the target fabric, flaw detection is carried out on the cut piece region; and cutting the cut piece area passing the flaw detection to obtain a clothing cut piece for the processing process of the clothing to be produced.
An embodiment of the present application further provides a computer device, including: a memory and a processor; the memory for storing a computer program; the processor is coupled to the memory for executing the computer program for performing the steps in the methods provided by the embodiments of the present application.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program, which, when executed by a processor, causes the processor to implement the steps in the method provided by the embodiments of the present application.
In the embodiments of the present application, while the target cloth is being loosened by the cloth loosening mechanism of the cloth loosening device, a cloth image of each cloth area is collected, and a defect detection task for the target cloth is executed using a defect detection model with a mesh network structure. Alternatively, the cut piece areas on the target fabric are determined from the cut piece information of the clothing to be produced in the order information, and targeted flaw detection is automatically carried out on those cut piece areas while the target fabric is being loosened by the cloth loosening mechanism. Automatic detection of cloth flaws is thus achieved in the cloth loosening step, which improves flaw detection accuracy and efficiency and reduces labor cost. In addition, the flaw detection model detects flaws by means of multi-scale feature fusion, and further adopts a mesh network structure when the multi-scale features are fused, so that more image features can be fused during the multi-scale feature fusion processing, further improving flaw detection accuracy.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic structural diagram of a digital cloth processing system according to an exemplary embodiment of the present disclosure;
FIG. 2 is a receptive field comparison diagram of ordinary convolution and hole convolution provided by an exemplary embodiment of the present application;
FIG. 3 is a schematic structural diagram of another digital cloth processing system according to an exemplary embodiment of the present disclosure;
fig. 4 is a schematic diagram of a mesh network structure provided by an exemplary embodiment of the present application;
fig. 5a is a schematic flowchart of a cloth defect detecting method according to an exemplary embodiment of the present disclosure;
FIG. 5b is a schematic flow chart illustrating a method for cutting a fabric according to an exemplary embodiment of the present disclosure;
fig. 6a is a schematic structural diagram of a cloth defect detecting apparatus according to an exemplary embodiment of the present disclosure;
FIG. 6b is a schematic structural diagram of a cloth cutting apparatus according to an exemplary embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Aiming at the technical problems of poor accuracy and low efficiency of the conventional cloth defect detection, the embodiment of the application provides a digital cloth processing system, a cloth defect detection method, equipment and a medium. In the embodiment of the application, when the target cloth is loosened by a cloth loosening mechanism in the cloth loosening equipment, a vision acquisition system arranged on a cloth loosening path of the cloth loosening mechanism is used for acquiring cloth images of each cloth area entering a visual range of the vision acquisition system, and the acquired cloth images are sent to the edge computing equipment; the edge computing device performs the task of detecting the target cloth defect by adopting a defect detection model of a mesh network structure. Therefore, automatic detection of the cloth flaws is achieved in the cloth loosening link of the target cloth, the flaw detection accuracy and efficiency are improved, and the labor cost is reduced. In addition, the flaw detection model adopts a multi-scale feature fusion means to detect the feature flaws, and further adopts a mesh network structure when the multi-scale features are fused, so that more image features can be fused in the multi-scale feature fusion processing process, and the flaw detection accuracy can be further improved. The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of a digital cloth processing system according to an exemplary embodiment of the present disclosure. As shown in fig. 1, the system includes: a cloth loosening device 10 and an edge computing device 20 located in an edge cluster. The cloth loosening device 10 includes a cloth loosening mechanism capable of performing cloth loosening processing, and a vision acquisition system disposed on the cloth loosening path of the cloth loosening mechanism. The cloth loosening path may refer to the path that the cloth to be loosened travels between entering and leaving the cloth loosening mechanism. The vision acquisition system may be communicatively connected to the edge computing device 20 in the edge cluster through a wired network or a wireless network. For example, the wired network may include a coaxial cable, a twisted pair, an optical fiber, and the like, and the wireless network may be a 2G, 3G, 4G or 5G network, a Wireless Fidelity (WIFI) network, and the like, which is not limited in this application.
Wherein, the cloth loosening mechanism can be any existing equipment with the cloth loosening treatment function. Because the cloth is generally a roll, the cloth has tension when being rolled, and particularly the tension has a large influence on the dimensional stability of knitted and elastic cloth, before the cloth is cut, a cloth loosening mechanism is required to loosen the cloth, and the cloth is waited for a certain time in a natural state to ensure the accuracy of the cutting dimension of subsequent ready-made clothes.
The vision acquisition system may be any system having a vision acquisition function. Illustratively, the vision acquisition system may include an image acquisition device. The image capturing device may be any device having an image capturing function. For example, from the structural characteristics of the sensor, the image capturing apparatus of the present embodiment may employ an area-array camera or a line-array camera. For another example, the image capturing device of the present embodiment may employ a standard definition camera or a high definition camera in view of the picture resolution supported by the camera. For another example, the image capturing apparatus of the present embodiment may employ an analog camera or a digital camera in view of the types of signals supported. For another example, the image capturing apparatus of the present embodiment may employ a monocular camera or a binocular camera in view of the number of cameras included in the camera.
Considering that a line-array camera tends to produce non-uniform illumination in its images, which are bright in the middle and dark on both sides, the collected image needs brightness adjustment, and that adjustment may aggravate image noise. Therefore, in the above or following embodiments of the present application, the image acquisition device is preferably an area-array camera with relatively uniform imaging light, but is not limited thereto.
In addition, considering that image clarity is closely related to the accuracy of the detection result, in the above or following embodiments of the present application the image capturing apparatus may preferably be an HD 720P high definition camera with a resolution of 1280 × 720, or an HD 960P high definition camera with a resolution of 1280 × 960, but is not limited thereto.
Further optionally, the vision acquisition system may further comprise a light source. The light source can provide illumination for the visual range of the visual acquisition system so as to improve the quality of the image acquired by the visual acquisition system.
In the embodiment of the present application, the edge computing device 20 may be any computing device with a communication function and a certain data processing capability; for example, it may be a gateway device deployed in the production environment that forwards and processes data, a management device in the production environment, a computer or server specially deployed in the production environment for defect detection, or a server deployed in a machine room close to the production environment. The edge computing device 20 may be deployed in an edge cluster, which sits on the side close to the object or data source, for example inside the digital factory or at another location close to it, and provides various data analysis and processing services nearby as an open platform with network, computing, storage and application capabilities. The edge computing device 20 can implement local linkage and data processing and analysis even without networking, and can also effectively share the cloud load.
In the embodiment of the present application, the target cloth to be subjected to the cloth loosening process by the cloth loosening mechanism may be a batch of cloth. If a batch of material includes multiple rolls of material in the form of a roll, the target material may be a half roll of material, a roll of material, or multiple rolls of material. Of course, the target cloth may also be a part of a roll of cloth, such as several meters of cloth, but not limited thereto.
In the embodiment of the present application, the target fabric may be a knitted fabric, a woven fabric or a non-woven fabric, but not limited thereto. In addition, the target fabric may be a solid color fabric or a multi-color fabric. The pure color cloth refers to a certain single color cloth, and the multi-color cloth refers to two or more colors of cloth.
In the embodiment of the application, in the process of loosening the target cloth by using the cloth loosening mechanism, flaw detection can be performed on the target cloth. When detecting flaws of a target fabric, a vision acquisition system is used to acquire images of each fabric area within a visible range of the vision acquisition system, and a plurality of acquired fabric images are sent to an edge computing device 20, as shown in fig. 1. Wherein, the visual range of the visual acquisition system is determined by the visual range of the image acquisition equipment in the visual acquisition system. In addition, the cloth area refers to a local area where the target cloth enters the visual range of the visual acquisition system, and it should be understood that the target cloth is divided into a plurality of cloth areas by the visual range of the visual acquisition system. Further optionally, the light source in the visual acquisition system can be controlled to illuminate the cloth loosening path within the visual range in the cloth image acquisition stage, so that the quality of the cloth image is improved, and the follow-up improvement of the accuracy of flaw detection is facilitated.
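As a concrete illustration of the acquisition step above, the following is a minimal sketch of how a vision acquisition system might grab one frame per cloth area and forward it to the edge computing device. The camera index, number of areas and upload endpoint are hypothetical placeholders; this application does not specify the transport or API.

```python
# Minimal sketch of the acquisition loop: one frame per cloth area, forwarded
# to the edge computing device. EDGE_ENDPOINT and the loop bounds are assumed
# for illustration only.
import cv2
import requests

EDGE_ENDPOINT = "http://edge-node.local/defect/upload"  # hypothetical URL

def capture_and_forward(camera_index: int = 0, num_areas: int = 10) -> None:
    cam = cv2.VideoCapture(camera_index)
    try:
        for area_id in range(num_areas):
            ok, frame = cam.read()              # one frame per cloth area in view
            if not ok:
                break
            ok, buf = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            # send the encoded cloth image to the edge computing device
            requests.post(EDGE_ENDPOINT,
                          files={"image": buf.tobytes()},
                          data={"area_id": str(area_id)},
                          timeout=5)
    finally:
        cam.release()
```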
The edge computing device 20 runs a defect detection model which adopts a mesh network structure to perform multi-scale fusion, and the edge computing device 20 performs multi-scale feature fusion processing on a plurality of cloth images respectively according to the mesh network structure adopted by the defect detection model to obtain a plurality of defect feature maps, as shown in fig. 1; and generating a defect map corresponding to the target cloth according to the plurality of defect characteristic maps, as shown in the fourth step in fig. 1.
In the embodiment of the application, a mesh network structure adopted by the flaw detection model performs multi-scale feature fusion processing on each cloth image to obtain a flaw feature map corresponding to the cloth image. The multi-scale feature fusion processing may be understood as performing fusion processing on a plurality of different scale feature maps.
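To make the notion of multi-scale feature fusion concrete, the snippet below is a minimal PyTorch sketch of fusing two feature maps of different scales by up-sampling the smaller one and concatenating along the channel dimension. The tensor shapes are illustrative; this is not the exact fusion operation of the model described here.

```python
# Minimal sketch of multi-scale feature fusion: resize to a common resolution,
# then concatenate along the channel dimension. Shapes are illustrative.
import torch
import torch.nn.functional as F

f_large = torch.randn(1, 32, 128, 128)   # feature map at the larger scale
f_small = torch.randn(1, 64, 64, 64)     # feature map at the smaller scale

# up-sample the small-scale map to the larger resolution, then fuse
f_small_up = F.interpolate(f_small, size=f_large.shape[-2:],
                           mode="bilinear", align_corners=False)
fused = torch.cat([f_large, f_small_up], dim=1)   # (1, 96, 128, 128)
print(fused.shape)
```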
Further optionally, the defect detection model may be a semantic segmentation model capable of performing defect detection on each pixel point in the detected cloth image, and at this time, the defect feature map may be a semantic segmentation map obtained by performing defect detection on the cloth image. The flaw characteristic diagram can reflect the flaw distribution information of the corresponding cloth area. The flaw distribution information includes, for example, but not limited to, flaw position information in a cloth area, flaw category, flaw profile information, and the like.
In the embodiment of the application, the flaw map is an electronic map capable of reflecting flaw distribution information of the target cloth. In practical application, the flaw characteristic maps of the target cloth can be spliced according to the image acquisition sequence to obtain a flaw map. Or splicing a plurality of cloth images of the target cloth according to the image acquisition sequence to obtain an integral image of the target cloth; and then, carrying out statistical analysis on the plurality of flaw characteristic maps, determining flaw distribution information of the target cloth, and marking each flaw in the overall image of the target cloth according to the flaw distribution information of the target cloth to obtain a flaw map. For example, taking a roll of knitted fabric with a length of about 40 meters and a width of about 1.6 meters as an example, statistical analysis is performed on a plurality of defect feature maps of the roll of knitted fabric, so that defect distribution information of a target fabric can be determined; and generating a defect map corresponding to the knitted fabric based on the defect distribution information of the target fabric.
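One simple way to realize the splicing described above is to concatenate the per-area flaw feature maps along the cloth length direction in acquisition order, as in the NumPy sketch below; the shapes are illustrative and the maps are assumed to share the same width (the cloth width direction).

```python
# Sketch of splicing per-area flaw feature maps into a flaw map for the whole
# target cloth, assuming acquisition order and a shared width.
import numpy as np

def build_flaw_map(flaw_feature_maps: list[np.ndarray]) -> np.ndarray:
    # each map is (h_i, w) with per-pixel flaw information for one cloth area
    return np.concatenate(flaw_feature_maps, axis=0)  # stack along cloth length

areas = [np.random.rand(256, 512) for _ in range(5)]   # illustrative data
flaw_map = build_flaw_map(areas)
print(flaw_map.shape)    # (1280, 512)
```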
In the embodiment of the present application, the defect detection model includes a plurality of network layers, including but not limited to at least one convolutional neural network layer, at least one pooling layer, and the like, wherein the convolutional neural network layer may be a convolutional neural network layer capable of performing space-time convolutional processing, a convolutional neural network layer capable of performing spatial convolutional processing, or a hybrid neural network layer, and accordingly, the pooling layer may be a pooling layer capable of performing upsampling processing, a pooling layer capable of performing downsampling, and the like, but is not limited thereto. Wherein the hybrid neural network layer is capable of performing at least two different convolution processes of a spatial convolution process, a space-time convolution process, or a hole convolution process. In the embodiment of the application, when the defect detection model performs defect detection on the fabric image by using the plurality of network layers, the multi-scale features of the fabric image can be extracted, the fabric defects are detected in a multi-scale feature fusion mode, and finally, a defect feature map corresponding to the fabric image is output, wherein the defect feature map comprises defect probability corresponding to each pixel point in the fabric image. And the pixel points with the defect probability greater than the set probability value are regarded as defect points. Further optionally, in the embodiment of the present application, a mesh network structure is adopted when multi-scale features are fused, nodes in the mesh network structure represent feature maps on various scales, an association relationship between nodes in the network structure represents a fusion relationship between the feature maps, and the fusion relationship includes both a comprehensive fusion relationship between feature maps on the same scale and a comprehensive fusion relationship between feature maps on different scales. Therefore, the flaw detection model sampling mesh network structure can make the feature fusion mode and scale more abundant and comprehensive in the multi-scale feature fusion processing process, can fuse more image features, and is beneficial to improving the flaw detection accuracy.
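The per-pixel decision mentioned above (pixels whose defect probability exceeds a set value are regarded as defect points) can be written as a one-line threshold; the 0.5 value below is an assumed example rather than a value from this application.

```python
# Sketch of turning a per-pixel flaw probability map into flaw points.
import numpy as np

prob_map = np.random.rand(256, 512)      # stand-in for the model's output
flaw_mask = prob_map > 0.5               # True where a flaw point is declared
flaw_coords = np.argwhere(flaw_mask)     # (row, col) of each flaw point
```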
Further optionally, in the defect detection model of the embodiment of the application, in the process of fusing the multi-scale features by using the mesh network structure, a hole convolution technique may be further fused, that is, the convolutional neural network layer for feature fusion in the defect detection model includes one or more convolutional layers capable of performing hole convolution processing, so as to increase the receptive field of the defect detection model and help the defect detection model to detect more defects in the cloth image. Taking fig. 2 as an example, in two images corresponding to the ordinary convolution, an image located at a lower layer is an original image, an image located at an upper layer is a feature image extracted through the ordinary convolution processing, and an area with a darker color in the original image is a receptive field of the feature image; in the two images corresponding to the cavity convolution, the image positioned at the lower layer is an original image, the image positioned at the upper layer is a feature image extracted through the cavity convolution processing, and the area with darker color in the original image is the receptive field of the feature image; as can be seen from fig. 2, the field of void convolution covers more image areas in the original image than the field of ordinary convolution. The general convolution is, for example, a spatial convolution.
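The receptive-field difference between ordinary convolution and hole convolution can be seen directly in a framework such as PyTorch: a 3 × 3 kernel with dilation 2 covers a 5 × 5 input area while keeping the same number of parameters. The snippet below is an illustrative sketch, not the network of this application.

```python
# Sketch contrasting ordinary convolution with hole (dilated) convolution.
# A 3x3 kernel with dilation=2 covers a 5x5 input area, enlarging the
# receptive field without adding parameters.
import torch
import torch.nn as nn

x = torch.randn(1, 16, 64, 64)

ordinary = nn.Conv2d(16, 16, kernel_size=3, padding=1)              # 3x3 field
dilated  = nn.Conv2d(16, 16, kernel_size=3, padding=2, dilation=2)  # 5x5 field

print(ordinary(x).shape, dilated(x).shape)   # both keep the 64x64 size
```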
Further optionally, the mesh network structure adopted in the flaw detection model is a high-resolution mesh network structure, that is, on one hand, the original feature map extracted for the first time needs to be downsampled to obtain feature maps on multiple scales in the mesh network structure, on the other hand, the feature maps on different scales need to be upsampled from bottom to top, and the number of upsampling processes is greater than that of downsampling processes, so that the mesh network structure can perform feature fusion not only between different feature scales, but also perform more feature fusion on each feature scale, and the feature fusion has a high-resolution characteristic. Because the image up-sampling processing times of the high-resolution mesh network structure are greater than the image down-sampling processing times, the flaw detection model can fuse more high-resolution image features in the multi-scale feature fusion processing process on each feature scale and among different feature scales by adopting the high-resolution mesh network structure, and the flaw detection accuracy can be further improved.
The digital cloth processing system provided by the embodiment of the present application includes: a cloth loosening device 10 and an edge computing device 20 located in an edge cluster. When the target cloth is subjected to cloth loosening processing by the cloth loosening mechanism in the cloth loosening device 10, a vision acquisition system arranged on the cloth loosening path of the cloth loosening mechanism acquires a cloth image of each cloth area entering its visual range and sends the acquired cloth images to the edge computing device 20; the edge computing device 20 performs the defect detection task for the target cloth using a defect detection model with a mesh network structure. Automatic detection of cloth flaws is thus achieved in the cloth loosening step of the target cloth, which improves flaw detection accuracy and efficiency and reduces labor cost. In addition, the flaw detection model detects flaws by means of multi-scale feature fusion, and further adopts a mesh network structure when the multi-scale features are fused, so that more image features can be fused during the multi-scale feature fusion processing, further improving flaw detection accuracy.
It should be noted that the digital cloth processing system provided by the embodiment of the present application can be applied to defect detection of various cloth materials, and particularly has a good defect detection effect for a pure color cloth material, and can detect different types of defects of various sizes of the pure color cloth material, such as black dots, stains, yarn draws, holes and other defects.
In the above or following embodiments of the present application, the digital cloth processing system further comprises a cutting device 30 for cutting the cloth, and the cutting device 30 can interact with the edge computing device 20. Specifically, the edge computing device 20 provides the flaw map of the target cloth to the cutting device 30, as shown in fig. 3. The cutting device 30 cuts the target fabric according to the defect map and the order information from the order system to obtain cut pieces meeting the order requirements, so that the cut pieces can be used for garment processing; a feedback closed loop is thus formed between the edge computing device 20 and the cutting process.
The order system maintains and manages orders submitted by b (business) end users or c (customer) end users, for example, the order system may maintain the following order information: the style, the size specification, the quantity, the raw materials, the quality requirement, the price, the ordering time, the delivery time, the ordering user name and other information of the clothes to be produced. In a specific application, the cutting device 30 first determines, based on the order information, order requirements that the cut pieces to be cut need to meet, where the order requirements are, for example, style requirements, size specifications, quantity, raw materials, or quality requirements of the cut pieces. Then, the cutting device 30 cuts out the defective fabric region in the target fabric based on the defect map, and cuts out cut pieces meeting the order requirement from the non-defective fabric region in the target fabric. And finally, providing the cut pieces for production equipment in a subsequent manufacturing link for continuous processing. The subsequent manufacturing process includes, but is not limited to, a printing process, a sewing process, or a ironing process.
In the above or following embodiments of the present application, the digital cloth processing system further comprises a scheduling system 40 for performing production scheduling tasks, and the scheduling system 40 can interact with the edge computing device 20. Specifically, the edge computing device 20 is further configured to: determine the quality grade of the target cloth according to the defect map, generate scheduling guidance information according to that quality grade, and provide the scheduling guidance information to the scheduling system 40, as shown in fig. 3. The scheduling system 40 schedules the orders to be produced that subsequently depend on the target cloth according to the scheduling guidance information, so that a feedback closed loop is formed between the edge computing device 20 and the scheduling system 40.
For example, when determining the quality level of the target fabric according to the defect map, the edge computing device 20 may count the number of defects appearing in the target fabric and the types thereof, count the area ratio between the defects and the target fabric, and evaluate the quality level of the target fabric based on multiple dimensions, such as the number of defects and the types thereof, the area ratio between the defects and the target fabric, and the like.
For example, a mapping relation between the number of different defects and the quality grade, a mapping relation between the types of the different defects and the quality grade, and a mapping relation between the area ratio between the defects and the target cloth and the quality grade are established in advance. Determining the quality grades of the target cloth evaluated under different dimensions based on the mapping relation, and carrying out weighting and averaging on the quality scores corresponding to the quality grades of different dimensions to obtain the final quality score of the target cloth; and inquiring the mapping relation between the quality score and the quality grade according to the final quality score of the target cloth, and determining the final quality grade corresponding to the target cloth. It should be noted that if a plurality of defect types appear on the target cloth, when the quality level of the target cloth is evaluated based on the dimension of the defect types, the quality level of the target cloth can be evaluated by using only the defect type appearing most frequently.
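The weighted-average grading just described can be sketched as follows; the per-dimension scoring functions, weights and grade thresholds are illustrative assumptions, since this application only specifies that per-dimension grades are scored, weighted, averaged and mapped back to a final grade.

```python
# Sketch of the weighted quality scoring described above. All mappings,
# weights and thresholds are assumed placeholders.
def score_by_count(num_flaws: int) -> float:
    return 100.0 if num_flaws == 0 else max(0.0, 100.0 - 10.0 * num_flaws)

def score_by_type(main_type: str) -> float:
    # main_type: the most frequently occurring flaw type on the cloth
    return {"black dot": 80.0, "stain": 70.0, "yarn draw": 60.0, "hole": 40.0}.get(main_type, 90.0)

def score_by_area_ratio(ratio: float) -> float:
    return max(0.0, 100.0 * (1.0 - 20.0 * ratio))

def quality_grade(num_flaws: int, main_type: str, area_ratio: float) -> str:
    weights = (0.4, 0.3, 0.3)                      # assumed dimension weights
    scores = (score_by_count(num_flaws),
              score_by_type(main_type),
              score_by_area_ratio(area_ratio))
    final = sum(w * s for w, s in zip(weights, scores))
    if final >= 90:
        return "A"
    if final >= 75:
        return "B"
    return "C"

print(quality_grade(num_flaws=3, main_type="stain", area_ratio=0.002))
```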
For another example, the edge computing device 20 may pre-train a cloth quality evaluation model for quality grade evaluation based on a defect map, and the edge computing device 20 inputs the defect map of the target cloth into the cloth quality evaluation model to obtain the quality grade of the target cloth. Further optionally, the cloth quality evaluation model accurately evaluates the quality grade of the target cloth based on multiple dimensions, such as the number and type of flaws, and the area ratio between the flaws and the target cloth.
In the embodiment of the present application, the edge computing device 20 analyzes whether the target fabric needs to be scrapped or can be used in the subsequent manufacturing links, which subsequent manufacturing links the target fabric can be used in, and whether the target fabric is suitable for a high-quality garment or a general-quality garment according to the quality grade of the target fabric, and sends the analysis information as the scheduling guidance information to the scheduling system 40, and the scheduling system 40 performs scheduling processing on the subsequent to-be-produced orders depending on the target fabric according to the scheduling guidance information. The order to be produced is a garment production order which needs to be subjected to production scheduling. For example, if the order to be produced requires production of a high quality garment, if the target fabric is discarded, the scheduling system 40 may schedule the order to be produced for delayed production; and if the quality grade of the target material distribution is higher, reasonably arranging the production of the order to be produced according to the order requirement of the order to be produced. For another example, if the order to be produced requires production of a ready-made garment with a common quality and the quality level of the target fabric is low, the order to be produced can be reasonably arranged according to the order requirement of the order to be produced.
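A minimal sketch of the scheduling-guidance logic described above is given below; the grade labels and the exact rules are assumptions used for illustration only.

```python
# Sketch of scheduling guidance derived from the cloth quality grade.
def scheduling_guidance(cloth_grade: str, order_needs_high_quality: bool) -> str:
    if cloth_grade == "scrapped":
        return "delay production until replacement fabric arrives"
    if order_needs_high_quality and cloth_grade != "A":
        return "delay production until higher-grade fabric is available"
    return "schedule production per the order requirements"

print(scheduling_guidance("A", order_needs_high_quality=True))
```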
In this embodiment, the edge computing device 20 feeds the scheduling guidance information generated from the quality grade of the target cloth back to the scheduling system 40, so that the quality grade of the target cloth can guide the scheduling system 40, making scheduling more reasonable and improving the overall productivity and efficiency of the digital factory.
In the above or following embodiments of the present application, the digital fabric processing system further comprises a cloth management system 50, and the cloth management system 50 may interact with the edge computing device 20. Specifically, the edge computing device 20 is further configured to provide the quality grade of the target fabric to the cloth management system 50, as shown in fig. 3; and the cloth management system 50 is configured to generate cloth supply requirement information according to the quality grade of the target cloth and provide it to a cloth supplier, so that the cloth supplier subsequently supplies cloth according to the cloth supply requirement information. The cloth management system 50 may provide a cloth management service. For example, the cloth management system 50 analyzes, based on the quality grade of the target cloth, whether the cloth currently provided by the cloth supplier is up to standard and, if not, what quality grade of cloth the supplier is expected to provide; it then generates cloth supply requirement information from this analysis and provides it to the cloth supplier, who decides whether to strengthen quality control in the cloth production process and supplies cloth accordingly.
In the above or following embodiments of the present application, the digital cloth processing system further includes a central scheduling node 60, where the central scheduling node 60 is deployed in the cloud, for example in a central cloud or a traditional data center, and may take the form of a cloud server, a server array, or a virtual machine. In addition, the central scheduling node 60 may interact with the edge computing device 20, the production devices, or the production management systems via a wired or wireless network. The production devices include, but are not limited to, the cloth loosening device 10 and the cutting device 30. The production management systems include, for example but not limited to, the scheduling system 40 and the cloth management system 50. The central scheduling node 60, the edge computing device 20 and the production devices or production management systems may form a cloud-edge-end cooperative network system, in which the central scheduling node 60 performs global scheduling and management of resources such as production devices, personnel and production lines in the entire production environment, and makes full use of the computing power of the edge computing device 20 to meet the real-time requirement of flaw detection.
For example, the central scheduling node 60 may obtain the scheduling plan information related to the cloth loosening step uploaded by the scheduling system 40 and issue a cloth loosening task to the cloth loosening device 10 that needs to participate in it, and the cloth loosening device 10 executes the corresponding task. Meanwhile, the cloth loosening device 10 may periodically upload its own device status log information to the central scheduling node 60, so that the central scheduling node 60 can analyze whether the cloth loosening device 10 has a fault, generate scheduling guidance information for the cloth loosening device 10 based on the fault analysis result, and send it to the scheduling system 40, so that the scheduling system 40 updates the scheduling plan information related to the cloth loosening step. Of course, the central scheduling node 60 and the cloth loosening device 10 may also interact in more dimensions, set according to the actual application requirements.
For example, the central scheduling node 60 may obtain scheduling plan information related to the cutting link uploaded by the scheduling system 40, issue a cutting task to the cutting device 30 that needs to participate in the cutting task, and execute the corresponding cutting task by the cutting device 30. Meanwhile, the cutting device 30 may periodically upload device status log information of itself to the central scheduling node 60, so that the central scheduling node 60 analyzes whether the cutting device 30 has a fault, and generates scheduling guidance information for the cutting device 30 based on a fault analysis result and sends the scheduling guidance information to the scheduling system 40, so that the scheduling system 40 updates scheduling plan information related to a cutting link. Of course, the central scheduling node 60 and the clipping device 30 may also perform more dimensional interaction, specifically set according to the actual application requirements.
For example, the central scheduling node 60 may perform big data analysis on the production status data of the production environment, and generate the scheduling guidance information based on the big data analysis result to send to the scheduling system 40, so that the scheduling system 40 optimizes the production schedule of the production environment to improve the production efficiency. Of course, the central scheduling node 60 and the scheduling system 40 may also perform more dimensional interaction, specifically set according to the actual application requirements.
For example, the central scheduling node 60 may obtain the historical cloth defect results of the respective cloth suppliers from the edge computing device 20, perform big data analysis to determine which suppliers deliver better quality and which deliver poorer quality, formulate product quality control plans for the different cloth suppliers based on the analysis results, and issue them to the cloth management system 50, so that the cloth management system 50 manages the cloth suppliers based on these plans. Of course, the central scheduling node 60 and the cloth management system 50 may also interact in more dimensions, set according to the actual application requirements.
It is worth noting that deploying the flaw detection model on the edge computing device 20 can satisfy the real-time requirement of flaw detection. With the continuous development of communication technology, the flaw detection model can also be deployed in the cloud (for example on the central scheduling node 60) while still meeting the real-time requirement. For example, fifth-generation mobile communication technology (5G) is a new generation of broadband mobile communication technology featuring high speed, low latency and massive connectivity, and serves as network infrastructure for human-machine interconnection. With 5G, the real-time requirement of defect detection can be met even when the defect detection model is deployed in the cloud.
In the foregoing or the following embodiments of the present application, the edge computing device 20 performs multi-scale feature fusion processing on a plurality of cloth images according to a mesh network structure adopted by a flaw detection model to obtain a plurality of flaw feature maps, which specifically includes: inputting a cloth image into a flaw detection model for any cloth image, and performing multi-scale feature extraction on the cloth image to obtain feature images to be fused on N scales, wherein N is an integer greater than or equal to 2; based on the mesh network structure, according to the sequence of the scale of the feature map from small to large, the feature maps to be fused on the N scales are fused from bottom to top, and the flaw feature map corresponding to the cloth image is obtained.
Further optionally, in order to improve the recognition accuracy of the flaw detection model, gray level equalization processing may be performed on the cloth image, and the equalized cloth image is then input to the flaw detection model. Gray level equalization is a gray scale conversion process that, through a conversion function, turns the current gray distribution into an image whose gray levels span a wider range and are more evenly distributed. After gray level equalization, the gray distribution of the cloth image is more uniform, which improves flaw detection accuracy. Note that performing gray level equalization on the fabric image also reduces the probability of misrecognizing fly (loose fibers), wrinkles, or the like as flaws.
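As one concrete realization of the gray level equalization step, OpenCV's histogram equalization can be used as in the sketch below; the library choice and the file path are assumptions, since this application only requires a gray scale transform that spreads the gray distribution more evenly.

```python
# Sketch of gray level equalization pre-processing using OpenCV histogram
# equalization; the image path is illustrative, with a synthetic fallback.
import cv2
import numpy as np

cloth_bgr = cv2.imread("cloth_area.jpg")            # illustrative path
if cloth_bgr is None:                                # fall back to synthetic data
    cloth_bgr = np.random.randint(0, 255, (256, 512, 3), dtype=np.uint8)

gray = cv2.cvtColor(cloth_bgr, cv2.COLOR_BGR2GRAY)
equalized = cv2.equalizeHist(gray)                   # wider, flatter histogram
# `equalized` would then be fed to the flaw detection model
```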
In the embodiment of the application, both the N scales and the number of feature maps to be fused are determined by the mesh network structure. The more nodes the mesh network structure has, the larger N is and the more feature maps there are to be fused. The number of feature maps to be fused on each scale may be one or more. Further optionally, the larger the scale, the larger the number of corresponding feature maps to be fused.
In an optional implementation manner, the inputting, by the edge computing device 20, the cloth image into the defect detection model, and performing multi-scale feature extraction on the cloth image to obtain feature maps to be fused on N scales, includes: inputting the cloth image into a flaw detection model, and performing primary feature extraction on the cloth image to obtain an original feature map on the largest scale in N scales; sequentially carrying out N-1 times of downsampling on the original feature map on the maximum scale to obtain original feature maps on the rest N-1 scales; for any non-maximum scale, performing up-sampling on the target feature map on the non-maximum scale to obtain a sampling feature map on the previous scale, wherein the sampling feature map on the previous scale and the original feature map thereof form a feature map to be fused on the previous scale; the target feature map on the non-maximum scale comprises an original feature map thereof, or comprises the original feature map thereof and a fused feature map obtained by fusing the original feature map and the sampling feature map.
In the embodiment of the application, after the cloth image undergoes the first feature extraction to obtain the original feature map on the largest of the N scales, feature extraction may be performed on N-1 regions of different sizes of that largest-scale original feature map to obtain the original feature maps on the remaining N-1 scales. Alternatively, the largest-scale original feature map is down-sampled to obtain a second original feature map; the second original feature map is then down-sampled to obtain a third original feature map, and the step of down-sampling the previous original feature map to obtain the next one is repeated until the original feature maps on the remaining N-1 scales are obtained. For example, in fig. 4, X0,0 is the original feature map obtained by feature extraction from the cloth image; X1,0 is the original feature map obtained by down-sampling X0,0; X2,0 is obtained by down-sampling X1,0; X3,0 is obtained by down-sampling X2,0; and X4,0 is obtained by down-sampling X3,0. X0,0, X1,0, X2,0, X3,0 and X4,0 are original feature maps of different sizes: X0,0 is larger than X1,0, X1,0 is larger than X2,0, X2,0 is larger than X3,0, and X3,0 is larger than X4,0. Suppose the size of the cloth image is H × W; then X0,0 has size H × W × C1, X1,0 has size H/2 × W/2 × C2, X2,0 has size H/4 × W/4 × C3, X3,0 has size H/8 × W/8 × C4, and X4,0 has size H/16 × W/16 × C5, where H is the image height, W is the image width, and C1, C2, C3, C4 and C5 are the numbers of channels.
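A minimal PyTorch sketch of the down-sampling path that produces the original feature maps X0,0 to X4,0 with the sizes listed above is given below; the channel counts and the use of strided convolutions are placeholders, not the exact layers of this application.

```python
# Sketch of the encoder producing X0,0 .. X4,0 (HxWxC1 down to H/16xW/16xC5).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, channels=(32, 64, 128, 256, 512)):
        super().__init__()
        c1, c2, c3, c4, c5 = channels
        self.stem = nn.Conv2d(3, c1, 3, padding=1)          # X0,0: H x W x C1
        self.down = nn.ModuleList([
            nn.Conv2d(c1, c2, 3, stride=2, padding=1),      # X1,0: H/2 x W/2 x C2
            nn.Conv2d(c2, c3, 3, stride=2, padding=1),      # X2,0: H/4 x W/4 x C3
            nn.Conv2d(c3, c4, 3, stride=2, padding=1),      # X3,0: H/8 x W/8 x C4
            nn.Conv2d(c4, c5, 3, stride=2, padding=1),      # X4,0: H/16 x W/16 x C5
        ])

    def forward(self, x):
        feats = [self.stem(x)]
        for layer in self.down:
            feats.append(layer(feats[-1]))
        return feats                                        # [X0,0 .. X4,0]

enc = Encoder()
for f in enc(torch.randn(1, 3, 256, 256)):
    print(tuple(f.shape))
```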
In the embodiment of the present application, any non-maximum scale refers to any feature scale other than the largest scale. Continuing with fig. 4 as an example, the scale corresponding to the X0,0 original feature map is the largest scale, while the scales corresponding to the X1,0, X2,0, X3,0 and X4,0 original feature maps are all non-maximum scales. The previous scale of any scale is larger than that scale; for example, the previous scale of the scale corresponding to X4,0 is the scale corresponding to X3,0, the previous scale of the scale corresponding to X3,0 is the scale corresponding to X2,0, the previous scale of the scale corresponding to X2,0 is the scale corresponding to X1,0, and the previous scale of the scale corresponding to X1,0 is the scale corresponding to X0,0.
Continuing with fig. 4 as an example: for the scale corresponding to X4,0, the target feature map on that scale is the X4,0 original feature map.
For the scale corresponding to X3,0, the target feature maps on that scale are the X3,0 original feature map and the X3,1 fused feature map. The X3,1 fused feature map is obtained by fusing the feature maps to be fused on that scale, which include at least the sampled feature map obtained by up-sampling the X4,0 original feature map and the X3,0 original feature map.
For the scale corresponding to X2,0, the target feature maps on that scale are the X2,0 original feature map and the X2,1 and X2,2 fused feature maps. The X2,1 and X2,2 fused feature maps are obtained by fusing the feature maps to be fused on that scale, which include at least the sampled feature map obtained by up-sampling the X3,0 original feature map, the sampled feature map obtained by up-sampling the X3,1 fused feature map, and the X2,0 original feature map.
For the scale corresponding to X1,0, the target feature maps on that scale are the X1,0 original feature map and the X1,1, X1,2 and X1,3 fused feature maps. The X1,1, X1,2 and X1,3 fused feature maps are obtained by fusing the feature maps to be fused on that scale, which include at least the sampled feature maps obtained by up-sampling the X2,0 original feature map and the X2,1 and X2,2 fused feature maps, and the X1,0 original feature map.
For the scale corresponding to X0,0, the target feature maps on that scale are the X0,0 original feature map and the X0,1, X0,2, X0,3 and X0,4 fused feature maps. The X0,1, X0,2, X0,3 and X0,4 fused feature maps are obtained by fusing the feature maps to be fused on that scale, which include at least the sampled feature maps obtained by up-sampling the X1,0 original feature map and the X1,1, X1,2 and X1,3 fused feature maps, and the X0,0 original feature map.
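The fusion relationships enumerated above follow a nested pattern: each fused node on a scale takes every earlier map on the same scale (skip connections) plus the up-sampled map from the scale below. The sketch below only records this wiring with placeholder functions; the assumption that exactly this pattern is used is inferred from the enumeration and from fig. 4.

```python
# Sketch of the mesh fusion bookkeeping: fused node X[i][j] takes every
# earlier map on its own scale X[i][0..j-1] plus the up-sampled X[i+1][j-1].
# "fuse" and "upsample" are placeholders for the real layers.
N = 5  # number of scales, matching X0,0 .. X4,0 in fig. 4

def fuse(inputs):                 # placeholder for the actual fusion node
    return "fused(" + ", ".join(inputs) + ")"

def upsample(name):               # placeholder for 2x up-sampling
    return f"up({name})"

X = {(i, 0): f"X{i},0" for i in range(N)}      # original feature maps
for j in range(1, N):                          # fusion steps, bottom-up
    for i in range(N - j):                     # scales that get a new node
        same_scale = [X[(i, k)] for k in range(j)]        # skip connections
        from_below = upsample(X[(i + 1, j - 1)])          # smaller scale below
        X[(i, j)] = fuse(same_scale + [from_below])

print(X[(0, 4)])   # the final target feature map on the largest scale
```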
In the foregoing or the following embodiments of the present application, the edge computing device 20 performs fusion processing on feature maps to be fused on N scales from bottom to top according to a sequence of feature map scales from small to large based on a mesh network structure, to obtain a defect feature map corresponding to a fabric image, and specifically includes: based on a mesh network structure, according to the sequence of the scale of the feature map from small to large, fusion processing is carried out on feature maps to be fused on N scales from bottom to top by adopting a cavity convolution and self-attention mechanism, and a flaw feature map corresponding to a cloth image is obtained.
Specifically, the mesh network structure includes N layers corresponding to the N scales, and the scale corresponding to each layer decreases from top to bottom. Each layer includes at least one node; each node represents either a feature map to be fused or a target feature map on the corresponding scale: before feature fusion the node represents a feature map to be fused, and after feature fusion it represents a target feature map. A jump-link relation between nodes in the same layer represents a fusion relation between the feature maps to be fused before feature fusion is performed in that layer, and a fusion relation between the target feature maps after feature fusion is performed in that layer. Correspondingly, a directional relation between nodes of adjacent layers represents, before feature fusion is performed in the layer of the previous (larger) scale, an up-sampling relation between the target feature map on the next (smaller) scale and the feature map to be fused on the previous scale, and, after feature fusion is performed in that layer, a fusion relation between the target feature map on the next scale and the target feature map on the previous scale.
Therefore, in an optional embodiment, the edge computing device 20 performs fusion processing on feature maps to be fused on N scales from bottom to top by using a void convolution and a self-attention mechanism according to a sequence of feature map scales from small to large based on a mesh network structure to obtain a defect feature map corresponding to the cloth image, and specifically includes: for any non-minimum scale in the N scales, according to the feature map fusion relation corresponding to the non-minimum scale in the mesh network structure, carrying out fusion processing between the original feature map and the sampling feature map on the non-minimum scale by adopting cavity convolution to obtain an intermediate feature map on the non-minimum scale; performing feature correlation calculation on the intermediate feature map on the non-minimum scale by adopting an attention mechanism to obtain a fusion feature map on the non-minimum scale; and the flaw characteristic diagram corresponding to the cloth image is the target characteristic diagram finally obtained by fusion on the maximum scale.
Taking fig. 4 as an example, dotted arrows indicate a jump-link relation, i.e. a fusion relation, between nodes; arrows in the oblique direction indicate an up-sampling relation or a fusion relation between nodes; and arrows in the vertical direction indicate a down-sampling relation between nodes. It should be noted that, for the node where a target feature map is located, the nodes having a jump-link relation with it and the node having an up-sampling relation with it all participate in the fusion task of that target feature map when the target feature map is obtained by fusion. Specifically, when a target feature map needs to be obtained by fusion, the target feature maps corresponding to the nodes having a jump-link relation with it are obtained, the target feature map corresponding to the node having an up-sampling relation with it is obtained, and convolution processing at least involving hole convolution is performed on the sampled feature map of the target feature map corresponding to the node with the up-sampling relation together with the target feature maps corresponding to the nodes with the jump-link relation, so as to obtain an intermediate feature map; a self-attention mechanism is then used to perform a dot-product operation of the intermediate feature map with itself, so as to obtain the target feature map to be fused.
Continuing with the example of fig. 4, the X3,1 fused feature map is obtained as follows: convolution processing at least involving hole convolution is performed on the sampled feature map obtained by up-sampling the X4,0 original feature map and on the X3,0 original feature map to obtain the intermediate feature map x3,1; the self-attention mechanism is then used to perform a dot-product operation of the intermediate feature map x3,1 with itself, so as to obtain the X3,1 fused feature map.
The X2,1 fused feature map is obtained as follows: convolution processing at least involving hole convolution is performed on the sampled feature map obtained by up-sampling the X3,0 original feature map and on the X2,0 original feature map to obtain the intermediate feature map x2,1; the self-attention mechanism is then used to perform a dot-product operation of the intermediate feature map x2,1 with itself, so as to obtain the X2,1 fused feature map.
The X2,2 fused feature map is obtained as follows: convolution processing at least involving hole convolution is performed on the sampled feature map obtained by up-sampling the X3,1 fused feature map, on the X2,1 fused feature map and on the X2,0 original feature map to obtain the intermediate feature map x2,2; the self-attention mechanism is then used to perform a dot-product operation of the intermediate feature map x2,2 with itself, so as to obtain the X2,2 fused feature map. In fig. 4, the intermediate feature maps x3,1, x2,1, x2,2 and so on are not shown.
By analogy, the fused feature maps on each scale can be obtained. In fig. 4, the fused feature map obtained by the final fusion on the scale corresponding to the X0,0 original feature map is X0,4. X0,4 is obtained as follows: convolution processing at least involving hole convolution is performed on the sampled feature map obtained by up-sampling the X1,3 fused feature map, on the X0,1, X0,2 and X0,3 fused feature maps, and on the X0,0 original feature map to obtain the intermediate feature map x0,4; the self-attention mechanism is then used to perform a dot-product operation of the intermediate feature map x0,4 with itself, so as to obtain the X0,4 fused feature map. The X0,4 fused feature map is the target feature map obtained by the final fusion on the scale corresponding to the X0,0 original feature map, i.e. the flaw feature map obtained by the flaw detection model performing flaw detection on the cloth image.
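As a concrete illustration of the fusion step described above, the following PyTorch-style sketch fuses the jump-linked feature maps on one scale with the up-sampled target map from the next smaller scale, using a dilated (hole) convolution followed by a dot-product self-attention of the intermediate map with itself. The channel counts, dilation rate and bilinear up-sampling are assumptions made for the example and are not values given in the patent:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MeshFusionBlock(nn.Module):
    def __init__(self, in_channels, out_channels, dilation=2):
        super().__init__()
        # Dilated ("hole") convolution that fuses the concatenated inputs into an intermediate map.
        self.fuse = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                              padding=dilation, dilation=dilation)

    def forward(self, skip_maps, lower_map):
        # skip_maps: maps on this scale with a jump-link relation (original and earlier fused maps).
        # lower_map: target feature map from the next smaller scale (up-sampling relation).
        up = F.interpolate(lower_map, size=skip_maps[0].shape[-2:],
                           mode="bilinear", align_corners=False)
        x = self.fuse(torch.cat(list(skip_maps) + [up], dim=1))    # intermediate feature map
        b, c, h, w = x.shape
        flat = x.view(b, c, h * w)
        attn = torch.softmax(flat.transpose(1, 2) @ flat, dim=-1)  # dot product of the map with itself
        return (flat @ attn).view(b, c, h, w)                      # fused feature map on this scale
```

For example, under these assumptions X2,2 would be produced by calling the block with skip_maps = (X2,0, X2,1) and lower_map = X3,1.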
In the foregoing or the following embodiments of the present application, the edge calculation device 20 generates a defect map corresponding to the target fabric according to a plurality of defect feature maps, specifically including: determining a suspected defect area in the corresponding cloth area and outline information thereof according to the defect probability corresponding to each pixel point in each defect feature map; comparing the profile information of the suspected defect area with the profile information of the existing defects in the defect library to obtain a target defect area from the suspected defect area; and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cloth area.
It should be noted that the probability of flaw misidentification can be further reduced by comparing the contour information of the suspected defect area with the contour information of the existing defects in the defect library to determine the target defect area. For example, in the defect detection of solid-color cloth, defect contour matching can reduce the probability of misrecognizing phenomena such as flying lint or wrinkles as defects.
In the embodiment of the application, when a suspected defect area in a corresponding cloth area and profile information thereof are determined according to defect probability corresponding to each pixel point in each defect feature map, a target defect type corresponding to each pixel point in each defect feature map can be determined according to the defect probability corresponding to each pixel point in each defect feature map; determining a suspected flaw area in the corresponding cloth area according to the target flaw type corresponding to each pixel point and the position information of each pixel point in the flaw characteristic map; carrying out binarization processing on the suspected defect area image, and carrying out contour extraction processing on the suspected defect area image after binarization processing to obtain contour information of the suspected defect area; comparing the profile information of the suspected defect area with the profile information of the existing defects in the defect library to obtain a target defect area from the suspected defect area; and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cloth area.
When the target flaw type corresponding to each pixel point in each flaw feature map is determined according to the flaw probability corresponding to each pixel point in each flaw feature map, aiming at each pixel point in the flaw feature map, the flaw detection model can identify the probability that the pixel point belongs to each flaw type, and the flaw type with the highest probability is used as the target flaw type of the pixel point.
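A minimal post-processing sketch of the steps just described (per-pixel argmax, binarisation, contour extraction, and contour matching against a defect library) could look as follows. OpenCV is used for the contour operations; the thresholds, the "class 0 = no defect" convention and the library format are assumptions for illustration:

```python
import cv2
import numpy as np

def locate_target_defects(prob_map, defect_library, score_thresh=0.5, match_thresh=0.2):
    # prob_map: (H, W, K) per-pixel probabilities over K defect types (class 0 assumed "no defect").
    defect_type = prob_map.argmax(axis=-1)                 # target defect type per pixel
    confidence = prob_map.max(axis=-1)
    suspect = ((defect_type > 0) & (confidence > score_thresh)).astype(np.uint8) * 255  # binarised mask
    contours, _ = cv2.findContours(suspect, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    target_regions = []
    for contour in contours:
        # Keep a suspected region only if its contour matches a known defect contour in the library.
        if any(cv2.matchShapes(contour, ref, cv2.CONTOURS_MATCH_I1, 0.0) < match_thresh
               for ref in defect_library):
            target_regions.append(cv2.boundingRect(contour))  # (x, y, w, h) within the cloth region
    return target_regions
```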
In practical application, when training the flaw detection model, the edge computing device 20 may input a pre-labeled sample cloth image into the initial flaw detection model to obtain a model output result of the sample cloth image; the labeling result of the sample cloth image comprises whether each pixel point in the sample cloth image has a flaw or not and a flaw type; calculating a target loss function according to the model output result of the sample cloth image and the corresponding labeling result; and adjusting the model parameters of the initial flaw detection model according to the target loss function to obtain the flaw detection model.
In the embodiment of the present application, one or more loss functions may be used in the model training process. As an example, to improve the identification accuracy of the flaw detection model, the edge computing device 20 is further configured to: and performing model training on the flaw detection model by adopting a plurality of loss functions, wherein the plurality of loss functions comprise at least two of a loss function for relieving imbalance of positive and negative samples, a loss function for paying attention to a difficult-to-divide sample in the training process, a cross entropy loss function and a loss function for paying attention to the average cross-over ratio of the model.
The loss function for relieving the imbalance between positive and negative samples is, for example, the Focal Loss function. On the basis of the balanced cross-entropy loss function, Focal Loss adds a modulating factor that reduces the weight of easily classified samples and focuses training on hard samples. For more description of the Focal Loss function, see the related art.
The loss function that focuses on hard-to-separate samples during training is, for example, the OHEM (Online Hard Example Mining) cross-entropy loss function. Using OHEM Loss ensures that the large number of easily separable samples does not contribute to the loss during training, so that training concentrates on the hard-to-separate samples.
The cross entropy Loss function is, for example, Large Margin Softmax Loss, which is an improved cross entropy Loss function, and can increase the difficulty of model learning, and compel the model to continuously learn more distinctive features, so that the inter-class distance is larger, and the intra-class distance is smaller. More description about the Large Margin Softmax Loss function is described in detail in the related art.
The loss function concerning the mean intersection-over-union of the model is, for example, Dice Loss. The Dice coefficient is a metric function used to measure the similarity of sets and is typically used to calculate the similarity between two samples. The mean Intersection over Union (mIoU) refers to the ratio of the intersection to the union of the two sets formed by the ground-truth values and the predicted values.
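For reference, minimal PyTorch formulations of two of the losses mentioned above (Focal Loss and Dice Loss) are sketched below; hyper-parameters such as gamma, alpha and the smoothing term are common defaults and are not specified by the patent:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, target, gamma=2.0, alpha=0.25):
    # Focal Loss: the (1 - p_t)^gamma factor down-weights easily classified pixels.
    ce = F.cross_entropy(logits, target, reduction="none")
    p_t = torch.exp(-ce)
    return (alpha * (1.0 - p_t) ** gamma * ce).mean()

def dice_loss(logits, target, eps=1.0):
    # Dice Loss: 1 - Dice coefficient between predicted and ground-truth per-class masks.
    probs = torch.softmax(logits, dim=1)                                    # (B, K, H, W)
    one_hot = F.one_hot(target, probs.shape[1]).permute(0, 3, 1, 2).float()
    inter = (probs * one_hot).sum(dim=(2, 3))
    union = probs.sum(dim=(2, 3)) + one_hot.sum(dim=(2, 3))
    return (1.0 - (2.0 * inter + eps) / (union + eps)).mean()
```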
It should be noted that, if the edge computing device 20 adopts a plurality of loss functions of different dimensions when training the flaw detection model, the losses of the plurality of dimensions may be calculated separately and summed to obtain the target loss function. When the model parameters are adjusted based on the target loss function, whether the target loss function is larger than a preset loss value may be judged; if not, model training is stopped and the flaw detection model obtained from the current training is used as the trained flaw detection model; if so, model training continues until the target loss function is smaller than the preset loss value. The preset loss value is the judgment index for stopping or continuing model training and is set according to the actual situation.
Further, when training the flaw detection model, the edge computing device 20 may also use Stochastic Weight Averaging (SWA) to average the weights obtained during training, so as to improve the generalization of the model.
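A hedged sketch of how the summed target loss, the preset loss value and SWA could be combined in a training loop is shown below (using torch.optim.swa_utils; the optimizer, data loader, model, epoch count and threshold value are placeholders, and the 1:1 weighting of the two losses is an assumption):

```python
import torch
from torch.optim.swa_utils import AveragedModel, update_bn

optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)   # model: the initial flaw detection model
swa_model = AveragedModel(model)                           # running average of the weights (SWA)
loss_threshold = 0.05                                      # hypothetical preset loss value

for epoch in range(max_epochs):
    for images, labels in train_loader:                    # pre-labeled sample cloth images
        logits = model(images)
        loss = focal_loss(logits, labels) + dice_loss(logits, labels)   # summed target loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    swa_model.update_parameters(model)                     # stochastic weight averaging step
    if loss.item() < loss_threshold:                       # stop once the target loss is small enough
        break

update_bn(train_loader, swa_model)                         # refresh BatchNorm statistics of the averaged model
```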
In the above system embodiment, the target cloth is first subjected to cloth loosening treatment, flaw detection is performed on the target cloth during the cloth loosening process, and the target cloth is then cut according to the order information to obtain clothing cut pieces used for garment processing; however, the overall processing logic is not limited thereto. The system can also operate in a mode in which order information for garment production is received from the order system, where the order information includes, but is not limited to, the style, size specification, quantity, raw materials, quality requirements, price, order placing time, delivery time and name of the ordering user of the clothing to be produced. From the order information, such as the style, size specification, quantity, raw-material and quality requirements of the clothing to be produced, the cut piece information of the clothing to be produced can be determined, where the cut piece information includes, but is not limited to, the size specification and contour shape of the cut piece, the fabric information of the cut piece, or the processing technology of the cut piece. Based on the cut piece information required by the clothing to be produced, the cut piece areas used for cutting out clothing cut pieces are located on the target cloth, where the number of cut piece areas may be one or more. After the cut piece areas in the target cloth are located, flaw detection is performed on the cut piece areas while the target cloth undergoes the cloth loosening process, and the cut piece areas that pass flaw detection are cut to obtain clothing cut pieces for the processing of the clothing to be produced.
A cut piece area that passes flaw detection refers to an area of qualified quality, which has no flaws or has a small number of flaws that fall within the allowable range; a cut piece area that does not pass flaw detection refers to an area of unqualified quality with more flaws. The embodiment of the present application does not limit how a cut piece area is evaluated as passing flaw detection. For example, for any cut piece area to be evaluated, its quality grade may be evaluated according to multiple dimensions such as the number and types of flaws in the flaw detection result of that area and the area ratio between the flaws and the cut piece area, and whether the area passes flaw detection may then be determined according to its quality grade and the preset correlation between quality grade and quality qualification. The quality grade of the cut piece area based on the multi-dimensional flaw detection information may be evaluated in the same way as the quality grade of the target cloth, which is not described herein again.
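As an illustration only, a toy grading rule over the dimensions mentioned above (number of flaws, flaw types, and the flaw-to-region area ratio) might look like the following; the thresholds and type names are invented for the example and are not taken from the patent:

```python
def cut_piece_quality_grade(defects, piece_area, max_count=3, max_area_ratio=0.01,
                            blocking_types=("hole", "stain")):
    # defects: list of (defect_type, defect_area) tuples found in one cut piece region.
    if any(defect_type in blocking_types for defect_type, _ in defects):
        return "unqualified"
    area_ratio = sum(area for _, area in defects) / piece_area
    return "qualified" if len(defects) <= max_count and area_ratio <= max_area_ratio else "unqualified"
```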
In one possible implementation, the interaction process of the components in the digital cloth processing system is as follows: the edge computing device receives order information for garment production, where the order information includes the cut piece information required by the clothing to be produced; a cut piece region on the target cloth is determined according to the cut piece information; the cloth loosening device performs cloth loosening treatment on the target cloth; during the loosening of the target cloth, a cloth image corresponding to the cut piece region is collected and reported to the edge computing device; the edge computing device performs flaw detection on the cut piece region based on the cloth image; and the flaw detection result of the cut piece region is sent to the cutting device, so that the cutting device cuts the cut piece region that passes flaw detection on the target cloth based on the cut piece information to obtain the clothing cut piece for the processing of the clothing to be produced. In the above embodiments, the specific implementation means for detecting flaws in the cut piece region is not limited. Optionally, a cloth image corresponding to the cut piece region may be acquired and input into a flaw detection model that performs multi-scale fusion with a mesh network structure, and multi-scale feature fusion processing is performed on the cloth image according to the mesh network structure to obtain a flaw feature map reflecting the flaw detection result of the cut piece region. The flaw detection of the cut piece region is implemented in the same manner as the cloth flaw detection in the foregoing embodiments, to which reference may be made.
When the cut piece region of the target cloth is being loosened by the cloth loosening mechanism, the vision acquisition system may directly acquire the cloth image corresponding to the cut piece region and send it to the edge computing device for flaw detection. Alternatively, the vision acquisition system may collect an initial cloth image corresponding to the cloth region that enters its visual range and send the initial cloth image to the edge computing device, and the edge computing device crops the cloth image corresponding to the cut piece region out of the initial cloth image according to the cut piece information.
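Under the second option, the crop can be as simple as slicing the initial image with a bounding box derived from the cut piece information; the (x, y, w, h) box format below is an assumption for illustration:

```python
def crop_cut_piece_image(initial_image, piece_box):
    # initial_image: H x W x C array from the vision acquisition system; piece_box: (x, y, w, h).
    x, y, w, h = piece_box
    return initial_image[y:y + h, x:x + w]
```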
Further optionally, the edge computing device may evaluate the following information according to the flaw detection results of the cut piece regions: how many of the garments to be produced the target cloth can yield, whether the quantity of garments required by the garment order can be produced, and whether delivery can proceed normally according to the delivery date of the garment order or whether the normal scheduling of the tailoring process of the garment order will be affected. This information is sent to the scheduling system as scheduling guidance information for the tailoring process, so that the scheduling system can update the scheduling plan of the garment order in the tailoring process, achieve reasonable scheduling of the tailoring process, and improve the overall productivity and efficiency of the digital factory. See the previous embodiments for the interaction between the edge computing device and the scheduling system.
Further optionally, the central scheduling node may also, based on the device state information of the cutting devices on the production line, schedule an idle and fault-free cutting device to execute the cutting task for a cut piece region that passes flaw detection. In addition, the cutting device executing the cutting task may return cutting progress information to the central scheduling node, so that the central scheduling node can track the cutting progress in real time. For more description of the central scheduling node, reference may be made to the foregoing embodiments.
In the above embodiment, the cut piece regions on the target cloth are first locked with the help of the cut piece information contained in the order information, the cloth images corresponding to the cut piece regions are collected during the loosening of the target cloth, and targeted flaw detection is then performed automatically on the cut piece regions based on these cloth images. This reduces the amount of cloth that needs flaw detection and the amount of computation, improves the accuracy and efficiency of flaw detection, reduces labor cost, and thereby improves the production efficiency of digital garment manufacturing.
Fig. 5a is a schematic flow chart of a cloth defect detecting method according to an exemplary embodiment of the present disclosure. As shown in fig. 5a, the method may comprise the steps of:
501a, in the process of loosening the target cloth, collecting a plurality of cloth images aiming at the target cloth, wherein the cloth images are images of a cloth area of the target cloth.
502a, respectively carrying out multi-scale feature fusion processing on a plurality of cloth images according to a mesh network structure adopted by a flaw detection model to obtain a plurality of flaw feature maps, wherein the flaw detection model is a neural network model for carrying out multi-scale fusion by adopting the mesh network structure.
503a, generating a defect map corresponding to the target fabric according to the plurality of defect characteristic maps.
In some embodiments of the present application, the multi-scale feature fusion processing is performed on a plurality of cloth images respectively according to a mesh network structure adopted by a flaw detection model to obtain a plurality of flaw feature maps, which specifically includes: inputting a cloth image into a flaw detection model for any cloth image, and performing multi-scale feature extraction on the cloth image to obtain feature images to be fused on N scales, wherein N is an integer greater than or equal to 2; based on the mesh network structure, according to the sequence of the scale of the feature map from small to large, the feature maps to be fused on the N scales are fused from bottom to top, and the flaw feature map corresponding to the cloth image is obtained.
In some embodiments of the present application, the mesh network structure includes N layers corresponding to N dimensions, and the dimensions of each layer decrease from top to bottom; each layer comprises at least one node, each node represents a feature graph to be fused or a target feature graph on a corresponding scale, a jump link relation between nodes in the same layer represents a fusion relation between the feature graphs to be fused, and a direction relation between nodes in adjacent layers represents an up-sampling relation between the target feature graph on the next scale and the feature graph to be fused or the target feature graph on the previous scale.
In some embodiments of the present application, inputting a cloth image into a flaw detection model, performing multi-scale feature extraction on the cloth image, and obtaining feature maps to be fused on N scales, includes: inputting the cloth image into a flaw detection model, and performing primary feature extraction on the cloth image to obtain an original feature map on the largest scale in N scales; sequentially carrying out N-1 times of downsampling on the original feature map on the maximum scale to obtain original feature maps on the rest N-1 scales;
for any non-maximum scale, performing up-sampling on the target feature map on the non-maximum scale to obtain a sampling feature map on the previous scale, wherein the sampling feature map on the previous scale and the original feature map thereof form a feature map to be fused on the previous scale; the target feature map on the non-maximum scale comprises an original feature map thereof, or comprises the original feature map thereof and a fused feature map obtained by fusing the original feature map and the sampling feature map.
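The extraction of the original feature maps on the N scales and the up-sampling described in the two paragraphs above can be sketched as follows; the stem block standing in for the preliminary feature extraction, the use of max pooling for down-sampling, and the bilinear up-sampling are assumptions:

```python
import torch.nn.functional as F

def build_original_pyramid(cloth_image, stem, num_scales=5):
    # stem: any convolutional block producing the largest-scale original feature map
    # (a stand-in for the model's preliminary feature extraction).
    feats = [stem(cloth_image)]                        # original feature map on the maximum scale
    for _ in range(num_scales - 1):                    # N-1 successive down-samplings
        feats.append(F.max_pool2d(feats[-1], kernel_size=2))
    return feats                                       # feats[i] is the original map on scale i

def upsample_to_previous_scale(target_map, previous_map):
    # Sampled feature map on the previous (larger) scale, to be fused with previous_map.
    return F.interpolate(target_map, size=previous_map.shape[-2:],
                         mode="bilinear", align_corners=False)
```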
In some embodiments of the present application, based on a mesh network structure, performing fusion processing from bottom to top on feature maps to be fused on N scales according to a sequence of feature map scales from small to large to obtain a defect feature map corresponding to a cloth image, including: based on a mesh network structure, according to the sequence of the scale of the feature map from small to large, fusion processing is carried out on feature maps to be fused on N scales from bottom to top by adopting a cavity convolution and self-attention mechanism, and a flaw feature map corresponding to a cloth image is obtained.
In some embodiments of the present application, based on a mesh network structure, according to a sequence of feature map scales from small to large, a void convolution and a self-attention mechanism are used to perform fusion processing from bottom to top on feature maps to be fused on N scales, so as to obtain a defect feature map corresponding to a fabric image, which specifically includes: for any non-minimum scale in the N scales, according to the feature map fusion relation corresponding to the non-minimum scale in the mesh network structure, carrying out fusion processing between the original feature map and the sampling feature map on the non-minimum scale by adopting cavity convolution to obtain an intermediate feature map on the non-minimum scale; performing feature correlation calculation on the intermediate feature map on the non-minimum scale by adopting an attention mechanism to obtain a fusion feature map on the non-minimum scale; and the flaw characteristic diagram corresponding to the cloth image is the target characteristic diagram finally obtained by fusion on the maximum scale.
In some embodiments of the present application, generating a defect map corresponding to a target fabric according to a plurality of defect feature maps specifically includes: determining a suspected defect area in the corresponding cloth area and outline information thereof according to the defect probability corresponding to each pixel point in each defect feature map; comparing the profile information of the suspected defect area with the profile information of the existing defects in the defect library to obtain a target defect area from the suspected defect area; and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cloth area.
In some embodiments of the present application, the method further comprises: and performing model training on the flaw detection model by adopting a plurality of loss functions, wherein the plurality of loss functions comprise at least two of a loss function for relieving imbalance of positive and negative samples, a loss function for paying attention to a difficult-to-divide sample in the training process, a cross entropy loss function and a loss function for paying attention to the average cross-over ratio of the model.
In some embodiments of the present application, the method further comprises: and providing the flaw map for cutting equipment, wherein the cutting equipment cuts the target fabric according to the flaw map and the order information from the order system to obtain cut pieces meeting the order requirements, and the cut pieces are used for garment processing.
In some embodiments of the present application, the method further comprises: and determining the quality grade of the target cloth according to the flaw map, generating production scheduling guide information according to the quality grade of the target cloth, and providing the production scheduling guide information to a production scheduling system so that the production scheduling system can perform production scheduling processing on a subsequent to-be-produced order depending on the target cloth according to the production scheduling guide information.
In some embodiments of the present application, the method further comprises: providing the quality grade of the target cloth to a cloth management system; the cloth supply management system generates cloth supply requirement information according to the quality grade of the target cloth, and provides the cloth supply requirement information to the cloth supplier, so that the cloth supplier can subsequently provide the target cloth according to the cloth supply requirement information. The detailed implementation of the cloth defect detecting method has been described in detail in the embodiments of the digital cloth processing system, and will not be described in detail here.
Fig. 5b is a schematic flow chart of a cloth cutting method according to an exemplary embodiment of the present application. As shown in fig. 5b, the method may comprise the steps of:
501b, receiving order information of clothing production, wherein the order information comprises cutting piece information required by clothing to be produced.
502b, determining the cutting piece area on the target cloth according to the cutting piece information.
503b, in the process of loosening the target fabric, performing flaw detection on the cut piece region.
And 504b, cutting the cut piece region passing the flaw detection to obtain a clothing cut piece for the processing process of the clothing to be produced.
In some embodiments of the present application, in the process of loosening the target fabric, the flaw detection is performed on the cut piece region, including: collecting cloth images corresponding to the cut piece regions;
inputting the cloth image into a flaw detection model which adopts a mesh network structure to perform multi-scale fusion, and respectively performing multi-scale feature fusion processing on the cloth image according to the mesh network structure to obtain a flaw feature map which reflects a flaw detection result of the cutting piece area.
In some embodiments of the present application, respectively performing multi-scale feature fusion processing on the fabric image according to the mesh network structure to obtain a defect feature map, specifically including: inputting a cloth image into a flaw detection model, and performing multi-scale feature extraction on the cloth image to obtain feature maps to be fused on N scales, wherein N is an integer not less than 2; based on the mesh network structure, according to the sequence of the scale of the feature map from small to large, the feature maps to be fused on the N scales are fused from bottom to top, and the flaw feature map corresponding to the cloth image is obtained.
In some embodiments of the present application, the mesh network structure includes N layers corresponding to N dimensions, and the dimensions of each layer decrease from top to bottom; each layer comprises at least one node, each node represents a feature graph to be fused or a target feature graph on a corresponding scale, a jump link relation between nodes in the same layer represents a fusion relation between the feature graphs to be fused, and a direction relation between nodes in adjacent layers represents an up-sampling relation between the target feature graph on the next scale and the feature graph to be fused or the target feature graph on the previous scale.
In some embodiments of the present application, inputting a cloth image into a flaw detection model, performing multi-scale feature extraction on the cloth image, and obtaining feature maps to be fused on N scales, includes: inputting the cloth image into a flaw detection model, and performing primary feature extraction on the cloth image to obtain an original feature map on the largest scale in N scales; sequentially carrying out N-1 times of downsampling on the original feature map on the maximum scale to obtain original feature maps on the rest N-1 scales;
for any non-maximum scale, performing up-sampling on the target feature map on the non-maximum scale to obtain a sampling feature map on the previous scale, wherein the sampling feature map on the previous scale and the original feature map thereof form a feature map to be fused on the previous scale; the target feature map on the non-maximum scale comprises an original feature map thereof, or comprises the original feature map thereof and a fused feature map obtained by fusing the original feature map and the sampling feature map.
In some embodiments of the present application, based on a mesh network structure, performing fusion processing from bottom to top on feature maps to be fused on N scales according to a sequence of feature map scales from small to large to obtain a defect feature map corresponding to a cloth image, including: based on a mesh network structure, according to the sequence of the scale of the feature map from small to large, fusion processing is carried out on feature maps to be fused on N scales from bottom to top by adopting a cavity convolution and self-attention mechanism, and a flaw feature map corresponding to a cloth image is obtained.
In some embodiments of the present application, based on a mesh network structure, according to a sequence of feature map scales from small to large, a void convolution and a self-attention mechanism are used to perform fusion processing from bottom to top on feature maps to be fused on N scales, so as to obtain a defect feature map corresponding to a fabric image, which specifically includes: for any non-minimum scale in the N scales, according to the feature map fusion relation corresponding to the non-minimum scale in the mesh network structure, carrying out fusion processing between the original feature map and the sampling feature map on the non-minimum scale by adopting cavity convolution to obtain an intermediate feature map on the non-minimum scale; performing feature correlation calculation on the intermediate feature map on the non-minimum scale by adopting an attention mechanism to obtain a fusion feature map on the non-minimum scale; and the flaw characteristic diagram corresponding to the cloth image is the target characteristic diagram finally obtained by fusion on the maximum scale.
In some embodiments of the present application, the method further comprises: and performing model training on the flaw detection model by adopting a plurality of loss functions, wherein the plurality of loss functions comprise at least two of a loss function for relieving imbalance of positive and negative samples, a loss function for paying attention to a difficult-to-divide sample in the training process, a cross entropy loss function and a loss function for paying attention to the average cross-over ratio of the model. In some embodiments of the present application, the method further comprises: and generating a flaw map corresponding to the target fabric according to the flaw characteristic maps of the plurality of cut-parts.
In some embodiments of the present application, generating a defect map corresponding to the target fabric according to a defect feature map of a plurality of cut parts includes: determining suspected defect areas in the corresponding cutting areas and outline information thereof according to the defect probability corresponding to each pixel point in each defect feature map; comparing the profile information of the suspected defect area with the profile information of the existing defects in a defect library to obtain a target defect area from the suspected defect area; and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cutting area.
In some embodiments of the present application, the method further comprises: and determining the quality grade of the target cloth according to the flaw map, generating production scheduling guide information according to the quality grade of the target cloth, and providing the production scheduling guide information to a production scheduling system so that the production scheduling system can perform production scheduling processing on a subsequent to-be-produced order depending on the target cloth according to the production scheduling guide information.
In some embodiments of the present application, the method further comprises: providing the quality grade of the target cloth to a cloth management system; the cloth supply management system generates cloth supply requirement information according to the quality grade of the target cloth, and provides the cloth supply requirement information to the cloth supplier, so that the cloth supplier can subsequently provide the target cloth according to the cloth supply requirement information. The detailed implementation of the cloth cutting method has been described in detail in the embodiment of the digital cloth processing system, and will not be elaborated herein.
Fig. 6a is a schematic structural diagram of a cloth defect detecting apparatus according to an exemplary embodiment of the present disclosure. As shown in fig. 6a, the cloth defect detecting apparatus may include:
the collecting module 61a is configured to collect a plurality of cloth images for a target cloth in a cloth loosening process of the target cloth, where the cloth image is an image of a cloth region of the target cloth.
The processing module 62a is configured to perform multi-scale feature fusion processing on the multiple cloth images respectively according to a mesh network structure adopted by the flaw detection model to obtain multiple flaw feature maps, where the flaw detection model is a neural network model that performs multi-scale fusion by adopting the mesh network structure; and generating a defect map corresponding to the target cloth according to the plurality of defect characteristic maps.
In some embodiments of the present application, the processing module 62a performs multi-scale feature fusion processing on a plurality of cloth images according to a mesh network structure adopted by the flaw detection model, and when obtaining a plurality of flaw feature maps, is specifically configured to: inputting a cloth image into a flaw detection model for any cloth image, and performing multi-scale feature extraction on the cloth image to obtain feature images to be fused on N scales, wherein N is an integer greater than or equal to 2; based on the mesh network structure, according to the sequence of the scale of the feature map from small to large, the feature maps to be fused on the N scales are fused from bottom to top, and the flaw feature map corresponding to the cloth image is obtained.
In some embodiments of the present application, the mesh network structure includes N layers corresponding to N dimensions, and the dimensions of each layer decrease from top to bottom; each layer comprises at least one node, each node represents a feature graph to be fused or a target feature graph on a corresponding scale, a jump link relation between nodes in the same layer represents a fusion relation between the feature graphs to be fused, and a direction relation between nodes in adjacent layers represents an up-sampling relation between the target feature graph on the next scale and the feature graph to be fused or the target feature graph on the previous scale.
In some embodiments of the present application, the processing module 62a inputs the cloth image into the flaw detection model, and performs multi-scale feature extraction on the cloth image to obtain feature maps to be fused on N scales, where the feature maps are specifically configured to: inputting the cloth image into a flaw detection model, and performing primary feature extraction on the cloth image to obtain an original feature map on the largest scale in N scales; sequentially carrying out N-1 times of downsampling on the original feature map on the maximum scale to obtain original feature maps on the rest N-1 scales; for any non-maximum scale, performing up-sampling on the target feature map on the non-maximum scale to obtain a sampling feature map on the previous scale, wherein the sampling feature map on the previous scale and the original feature map thereof form a feature map to be fused on the previous scale; the target feature map on the non-maximum scale comprises an original feature map thereof, or comprises the original feature map thereof and a fused feature map obtained by fusing the original feature map and the sampling feature map.
In some embodiments of the present application, the processing module 62a is configured to perform fusion processing from bottom to top on feature maps to be fused on N scales according to a sequence from small to large of feature map scales based on a mesh network structure, and when obtaining a defect feature map corresponding to a cloth image, specifically: based on a mesh network structure, according to the sequence of the scale of the feature map from small to large, fusion processing is carried out on feature maps to be fused on N scales from bottom to top by adopting a cavity convolution and self-attention mechanism, and a flaw feature map corresponding to a cloth image is obtained.
In some embodiments of the present application, the processing module 62a is based on a mesh network structure, and performs fusion processing on feature maps to be fused on N scales from bottom to top by using a cavity convolution and a self-attention mechanism according to a sequence from small to large in feature map scale, so as to obtain a defect feature map corresponding to a cloth image, and is specifically configured to: for any non-minimum scale in the N scales, according to the feature map fusion relation corresponding to the non-minimum scale in the mesh network structure, carrying out fusion processing between the original feature map and the sampling feature map on the non-minimum scale by adopting cavity convolution to obtain an intermediate feature map on the non-minimum scale; performing feature correlation calculation on the intermediate feature map on the non-minimum scale by adopting an attention mechanism to obtain a fusion feature map on the non-minimum scale; and the flaw characteristic diagram corresponding to the cloth image is the target characteristic diagram finally obtained by fusion on the maximum scale.
In some embodiments of the present application, when the processing module 62a generates the defect map corresponding to the target fabric according to the plurality of defect feature maps, the processing module is specifically configured to: determining a suspected defect area in the corresponding cloth area and outline information thereof according to the defect probability corresponding to each pixel point in each defect feature map; comparing the profile information of the suspected defect area with the profile information of the existing defects in the defect library to obtain a target defect area from the suspected defect area; and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cloth area.
In some embodiments of the present application, the processing module 62a is further configured to: and performing model training on the flaw detection model by adopting a plurality of loss functions, wherein the plurality of loss functions comprise at least two of a loss function for relieving imbalance of positive and negative samples, a loss function for paying attention to a difficult-to-divide sample in the training process, a cross entropy loss function and a loss function for paying attention to the average cross-over ratio of the model. The detailed implementation of the cloth defect detecting method has been described in detail in the embodiments of the digital cloth processing system, and will not be described in detail here.
The cloth defect detecting apparatus in fig. 6a can execute the cloth defect detecting method in the embodiment shown in fig. 5a, and the implementation principle and the technical effect thereof are not described again. The detailed operation of the cloth defect detecting device in the above embodiments has been described in detail in the embodiments related to the digital cloth processing system, and will not be described in detail here.
Fig. 6b is a schematic structural diagram of a cloth cutting apparatus according to an exemplary embodiment of the present application. As shown in fig. 6b, the cloth cutting apparatus may include: the receiving module 61b is configured to receive order information for clothing production, where the order information includes cut-parts information required by clothing to be produced. And the processing module 62b is used for determining the cutting piece area on the target fabric according to the cutting piece information. And the processing module 62b is further configured to perform flaw detection on the cut piece region in the process of loosening the target fabric. And the cutting module 63b is used for cutting the cut piece region passing the flaw detection to obtain a clothing cut piece for the processing process of the clothing to be produced.
In some embodiments of the present application, the processing module 62b is specifically configured to, during the loosening process of the target fabric, perform defect detection on the cut segment region: collecting cloth images corresponding to the cut piece regions; inputting the cloth image into a flaw detection model which adopts a mesh network structure to perform multi-scale fusion, and respectively performing multi-scale feature fusion processing on the cloth image according to the mesh network structure to obtain a flaw feature map which reflects a flaw detection result of the cutting piece area. For a detailed implementation manner of the processing module 62b performing multi-scale feature fusion processing on the fabric image according to the mesh network structure to obtain the flaw feature map, reference may be made to the foregoing embodiment, which is not described herein again.
In some embodiments of the present application, the processing module 62b is further configured to: and generating a flaw map corresponding to the target fabric according to the flaw characteristic maps of the plurality of cut-parts. Further optionally, when the processing module 62b generates the defect map corresponding to the target fabric according to the defect feature maps of the multiple cut parts, the processing module is specifically configured to: determining suspected defect areas in the corresponding cutting areas and outline information thereof according to the defect probability corresponding to each pixel point in each defect feature map; comparing the profile information of the suspected defect area with the profile information of the existing defects in a defect library to obtain a target defect area from the suspected defect area; and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cutting area.
In some embodiments of the present application, the processing module 62b is further configured to: and determining the quality grade of the target cloth according to the flaw map, generating production scheduling guide information according to the quality grade of the target cloth, and providing the production scheduling guide information to a production scheduling system so that the production scheduling system can perform production scheduling processing on a subsequent to-be-produced order depending on the target cloth according to the production scheduling guide information.
In some embodiments of the present application, the processing module 62b is further configured to: providing the quality grade of the target cloth to a cloth management system; the cloth supply management system generates cloth supply requirement information according to the quality grade of the target cloth, and provides the cloth supply requirement information to the cloth supplier, so that the cloth supplier can subsequently provide the target cloth according to the cloth supply requirement information. The detailed implementation of the cloth cutting device has been described in detail in the embodiments related to the digital cloth processing system, and will not be elaborated herein.
Fig. 7 is a schematic structural diagram of a computer device according to an exemplary embodiment of the present application. As shown in fig. 7, the computer apparatus may include: a vision acquisition system 70, a memory 71 and a processor 72.
The memory 71 is used for storing computer programs and may be configured to store other various data to support operations on the computing platform. Examples of such data include instructions for any application or method operating on the computing platform, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 71 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 72, coupled to the memory 71, for executing computer programs in the memory 71 for: in the process of loosening the target cloth, a plurality of cloth images are acquired by a visual acquisition system 70 aiming at the target cloth, wherein the cloth image is an image of a cloth area of the target cloth; respectively carrying out multi-scale feature fusion processing on a plurality of cloth images according to a mesh network structure adopted by a flaw detection model to obtain a plurality of flaw feature maps, wherein the flaw detection model is a neural network model which adopts the mesh network structure to carry out multi-scale fusion; and generating a defect map corresponding to the target cloth according to the plurality of defect characteristic maps.
In some embodiments of the present application, the processor 72 performs multi-scale feature fusion processing on a plurality of cloth images according to a mesh network structure adopted by the flaw detection model, and when obtaining a plurality of flaw feature maps, is specifically configured to: inputting a cloth image into a flaw detection model for any cloth image, and performing multi-scale feature extraction on the cloth image to obtain feature images to be fused on N scales, wherein N is an integer greater than or equal to 2; based on the mesh network structure, according to the sequence of the scale of the feature map from small to large, the feature maps to be fused on the N scales are fused from bottom to top, and the flaw feature map corresponding to the cloth image is obtained.
In some embodiments of the present application, the processor 72 inputs the cloth image into the flaw detection model, and performs multi-scale feature extraction on the cloth image to obtain feature maps to be fused on N scales, which is specifically configured to: inputting the cloth image into a flaw detection model, and performing primary feature extraction on the cloth image to obtain an original feature map on the largest scale in N scales; sequentially carrying out N-1 times of downsampling on the original feature map on the maximum scale to obtain original feature maps on the rest N-1 scales; for any non-maximum scale, performing up-sampling on the target feature map on the non-maximum scale to obtain a sampling feature map on the previous scale, wherein the sampling feature map on the previous scale and the original feature map thereof form a feature map to be fused on the previous scale; the target feature map on the non-maximum scale comprises an original feature map thereof, or comprises the original feature map thereof and a fused feature map obtained by fusing the original feature map and the sampling feature map.
In some embodiments of the present application, the processor 72, based on the mesh network structure, performs fusion processing from bottom to top on feature maps to be fused on N scales according to a sequence from small to large in the feature map scale, and when obtaining a defect feature map corresponding to a cloth image, is specifically configured to: based on a mesh network structure, according to the sequence of the scale of the feature map from small to large, fusion processing is carried out on feature maps to be fused on N scales from bottom to top by adopting a cavity convolution and self-attention mechanism, and a flaw feature map corresponding to a cloth image is obtained.
In some embodiments of the present application, the processor 72 is configured to perform fusion processing from bottom to top on feature maps to be fused on N scales by using a cavity convolution and a self-attention mechanism according to a sequence from small to large in feature map scale based on a mesh network structure, and when obtaining a defect feature map corresponding to a cloth image, the processor is specifically configured to: for any non-minimum scale in the N scales, according to the feature map fusion relation corresponding to the non-minimum scale in the mesh network structure, carrying out fusion processing between the original feature map and the sampling feature map on the non-minimum scale by adopting cavity convolution to obtain an intermediate feature map on the non-minimum scale; performing feature correlation calculation on the intermediate feature map on the non-minimum scale by adopting an attention mechanism to obtain a fusion feature map on the non-minimum scale; and the flaw characteristic diagram corresponding to the cloth image is the target characteristic diagram finally obtained by fusion on the maximum scale.
In some embodiments of the present application, when the processor 72 generates the defect map corresponding to the target fabric according to the plurality of defect feature maps, the processor is specifically configured to: determining a suspected defect area in the corresponding cloth area and outline information thereof according to the defect probability corresponding to each pixel point in each defect feature map; comparing the profile information of the suspected defect area with the profile information of the existing defects in the defect library to obtain a target defect area from the suspected defect area; and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cloth area.
In some embodiments of the present application, the processor 72 is further configured to: and performing model training on the flaw detection model by adopting a plurality of loss functions, wherein the plurality of loss functions comprise at least two of a loss function for relieving imbalance of positive and negative samples, a loss function for paying attention to a difficult-to-divide sample in the training process, a cross entropy loss function and a loss function for paying attention to the average cross-over ratio of the model.
Further, as shown in fig. 7, the computer device further includes: a communication component 73, a display 74, a power supply component 75, an audio component 76, and the like. Only some components are shown schematically in fig. 7, which does not mean that the computer device includes only the components shown. In addition, the components within the dashed box in fig. 7 are optional rather than mandatory, depending on the product form of the computer device. The computer device of this embodiment may be implemented as a terminal device such as a desktop computer, a notebook computer, a smart phone, or an IoT device, or as a server-side device such as a conventional server, a cloud server, or a server array. If implemented as a terminal device such as a desktop computer, a notebook computer, or a smart phone, the computer device may include the components within the dashed box in fig. 7; if implemented as a server-side device such as a conventional server, a cloud server, or a server array, it may omit them.
An embodiment of the present application further provides a computer device that has the same structure as the computer device shown in fig. 7 but different processing logic. Specifically, the computer device includes: a vision acquisition system, a memory, and a processor. The processor is coupled to the memory and executes the computer program in the memory to: receive order information for garment production, the order information including the cut-piece information required by the garment to be produced; determine cut-piece regions on the target fabric according to the cut-piece information; perform flaw detection on the cut-piece regions while the target fabric is being loosened; and cut the cut-piece regions that pass flaw detection to obtain garment cut pieces for the subsequent processing of the garment to be produced.
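By way of illustration only, the order-driven flow described above might be orchestrated as in the following sketch; `vision_system`, `detector`, and `cutter` are hypothetical interfaces standing in for the vision acquisition system, the flaw detection model, and the cutting equipment, and the shape of `order_info` is likewise an assumption.

```python
def process_order(order_info, vision_system, detector, cutter):
    """Illustrative sketch: inspect each cut-piece region during loosening, cut only passing ones."""
    regions = [piece["region"] for piece in order_info["cut_pieces"]]
    cut_pieces = []
    for region in regions:
        image = vision_system.capture(region)       # cloth image of the cut-piece region
        defect_map = detector(image)                # flaw detection model inference
        if not defect_map.any():                    # region passes flaw detection
            cut_pieces.append(cutter.cut(region))   # cut the qualified region
    return cut_pieces
```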
In some embodiments of the present application, when performing flaw detection on the cut-piece regions while the target fabric is being loosened, the processor is specifically configured to: collect cloth images corresponding to the cut-piece regions; and input each cloth image into a flaw detection model that performs multi-scale fusion with a mesh network structure, where multi-scale feature fusion processing is performed on the cloth image according to the mesh network structure to obtain a flaw feature map reflecting the flaw detection result of the cut-piece region. For the detailed process by which the processor performs the multi-scale feature fusion processing on the cloth image according to the mesh network structure to obtain the flaw feature map, reference may be made to the foregoing embodiments, which are not repeated here.
In some embodiments of the present application, the processor is further configured to: train the flaw detection model with a plurality of loss functions, the plurality of loss functions comprising at least two of a loss function that alleviates the imbalance between positive and negative samples, a loss function that emphasizes hard-to-classify samples during training, a cross-entropy loss function, and a loss function that targets the model's mean intersection-over-union (mIoU). In some embodiments of the present application, the processor is further configured to generate a flaw map corresponding to the target fabric according to the flaw feature maps of the plurality of cut pieces. For generating and using the flaw map, reference may be made to the foregoing embodiments, which are not repeated here.
Accordingly, the present application further provides a computer readable storage medium storing a computer program, which when executed by a processor, causes the processor to implement the steps in the above method embodiments.
Accordingly, the present application also provides a computer program product comprising a computer program/instructions which, when executed by a processor, cause the processor to implement the steps in the above method embodiments.
The communication component of fig. 7 described above is configured to facilitate wired or wireless communication between the device in which it is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, or a 2G, 3G, 4G/LTE, or 5G mobile communication network, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display in fig. 7 described above includes a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply assembly of fig. 7 described above provides power to the various components of the device in which the power supply assembly is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio component of fig. 7 described above may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (13)

1. A digital cloth processing system, comprising: a cloth loosening device and an edge computing device located in an edge cluster;
the cloth loosening device comprises a cloth loosening mechanism for performing cloth loosening treatment on a target cloth and a vision acquisition system arranged on a cloth loosening path of the cloth loosening mechanism; the vision acquisition system is used for acquiring an image of each cloth area entering its visual range and sending the acquired cloth images to the edge computing device;
the edge computing device runs a defect detection model that performs multi-scale fusion with a mesh network structure, and is used for respectively performing multi-scale feature fusion processing on the plurality of cloth images according to the mesh network structure adopted by the defect detection model to obtain a plurality of defect feature maps, and for generating a flaw map corresponding to the target cloth according to the plurality of defect feature maps; the mesh network structure comprises N layers corresponding to N scales, wherein N is an integer greater than or equal to 2, and the scale corresponding to each layer decreases gradually from top to bottom; each layer comprises at least one node, and each node represents a target feature map or a feature map to be fused on the corresponding scale; a skip-link relation between nodes in the same layer represents a fusion relation between the feature maps to be fused before feature fusion is carried out in that layer; a directed relation between nodes in adjacent layers represents an up-sampling relation between the target feature map on the next scale and the feature map to be fused on the previous scale before feature fusion is carried out in the layer of the previous scale, and represents a fusion relation between the target feature map on the next scale and the target feature map on the previous scale after feature fusion is carried out in the layer of the previous scale.
2. The system of claim 1, further comprising: cutting equipment;
the edge computing device is further configured to provide the flaw map to the cutting equipment; and the cutting equipment is configured to cut the target fabric according to the flaw map, in combination with order information from an order system, to obtain cut pieces that meet the order requirements, so that the cut pieces are used for garment processing.
3. The system of claim 1, further comprising: a scheduling system;
the edge computing device is further configured to: determine the quality grade of the target cloth according to the flaw map, generate scheduling guidance information according to the quality grade of the target cloth, and provide the scheduling guidance information to the scheduling system;
and the scheduling system is configured to perform scheduling processing, according to the scheduling guidance information, on subsequent orders to be produced that depend on the target cloth.
4. A cloth flaw detection method is characterized by comprising the following steps:
in the process of loosening a target fabric, collecting a plurality of cloth images of the target fabric, wherein each cloth image is an image of a cloth area of the target fabric;
respectively carrying out multi-scale feature fusion processing on the plurality of cloth images according to a mesh network structure adopted by a flaw detection model to obtain a plurality of flaw feature maps, wherein the flaw detection model is a neural network model which adopts the mesh network structure to carry out multi-scale fusion;
generating a flaw map corresponding to the target cloth according to the plurality of flaw feature maps; the mesh network structure comprises N layers corresponding to N scales, wherein N is an integer greater than or equal to 2, and the scale corresponding to each layer decreases gradually from top to bottom; each layer comprises at least one node, and each node represents a target feature map or a feature map to be fused on the corresponding scale; a skip-link relation between nodes in the same layer represents a fusion relation between the feature maps to be fused before feature fusion is carried out in that layer; a directed relation between nodes in adjacent layers represents an up-sampling relation between the target feature map on the next scale and the feature map to be fused on the previous scale before feature fusion is carried out in the layer of the previous scale, and represents a fusion relation between the target feature map on the next scale and the target feature map on the previous scale after feature fusion is carried out in the layer of the previous scale.
5. The method according to claim 4, wherein the obtaining a plurality of flaw feature maps by performing multi-scale feature fusion processing on the plurality of cloth images according to a mesh network structure adopted by the flaw detection model specifically comprises:
inputting the cloth image into the flaw detection model for any cloth image, and performing multi-scale feature extraction on the cloth image to obtain feature maps to be fused on N scales;
and based on the mesh network structure, performing fusion processing from bottom to top on the feature maps to be fused on the N scales according to the sequence of the scale of the feature maps from small to large to obtain the flaw feature map corresponding to the cloth image.
6. The method according to claim 5, wherein the inputting the cloth image into the defect detection model, performing multi-scale feature extraction on the cloth image, and obtaining feature maps to be fused on N scales comprises:
inputting the cloth image into the flaw detection model, and performing primary feature extraction on the cloth image to obtain an original feature map on the largest scale in N scales;
sequentially carrying out N-1 times of downsampling on the original feature map on the maximum scale to obtain original feature maps on the rest N-1 scales;
for any non-maximum scale, performing up-sampling on the target feature map on the non-maximum scale to obtain a sampling feature map on the previous scale, wherein the sampling feature map on the previous scale and an original feature map thereof form a feature map to be fused on the previous scale;
the target feature map on the non-maximum scale comprises an original feature map of the target feature map, or comprises the original feature map and a fused feature map obtained by fusing the original feature map and the sampling feature map.
7. The method according to claim 6, wherein the obtaining the flaw feature map corresponding to the cloth image by performing bottom-to-top fusion processing on the feature maps to be fused on the N scales according to the sequence of the feature map scales from small to large based on the mesh network structure comprises:
and based on the mesh network structure, according to the sequence of the scale of the feature map from small to large, performing fusion processing on the feature maps to be fused on the N scales from bottom to top by adopting dilated convolution and a self-attention mechanism to obtain a flaw feature map corresponding to the cloth image.
8. The method according to claim 7, wherein based on the mesh network structure, according to a sequence of feature map scales from small to large, performing fusion processing from bottom to top on the feature maps to be fused on the N scales by using dilated convolution and a self-attention mechanism to obtain a flaw feature map corresponding to the cloth image specifically comprises:
for any non-minimum scale in the N scales, according to a feature map fusion relation corresponding to the non-minimum scale in the mesh network structure, performing fusion processing between an original feature map and a sampling feature map on the non-minimum scale by adopting dilated convolution to obtain an intermediate feature map on the non-minimum scale;
performing feature correlation calculation on the intermediate feature map on the non-minimum scale by adopting an attention mechanism to obtain a fused feature map on the non-minimum scale; and the fusion feature map and the original feature map on the non-minimum scale form a target feature map on the non-minimum scale, and the flaw feature map corresponding to the cloth image is the target feature map finally obtained by fusion on the maximum scale.
9. The method according to any one of claims 4 to 8, wherein the generating a defect map corresponding to the target fabric according to the plurality of defect feature maps specifically comprises:
determining a suspected defect area in the corresponding cloth area and outline information thereof according to the defect probability corresponding to each pixel point in each defect feature map;
comparing the profile information of the suspected defect area with the profile information of the existing defects in a defect library to obtain a target defect area from the suspected defect area;
and generating a defect map corresponding to the target cloth according to the position of the target defect area in the corresponding cloth area.
10. The method according to any one of claims 4-8, further comprising: performing model training on the flaw detection model by adopting a plurality of loss functions, wherein the plurality of loss functions comprise at least two of a loss function for alleviating the imbalance between positive and negative samples, a loss function for emphasizing hard-to-classify samples during training, a cross-entropy loss function, and a loss function targeting the model's mean intersection-over-union.
11. A cloth cutting method is characterized by comprising the following steps:
receiving order information of clothing production, wherein the order information comprises cutting piece information required by clothing to be produced;
determining a cutting piece area on the target fabric according to the cutting piece information;
in the process of loosening the target fabric, flaw detection is carried out on the cut piece region;
cutting the cut piece area passing the flaw detection to obtain a cut piece of the garment for the processing process of the garment to be produced;
in the process of loosening the target fabric, flaw detection is carried out on the cut piece region, and the flaw detection method comprises the following steps:
collecting cloth images corresponding to the cut piece regions;
inputting the cloth image into a defect detection model which adopts a mesh network structure to perform multi-scale fusion, and respectively performing multi-scale feature fusion processing on the cloth image according to the mesh network structure to obtain a defect feature map, wherein the defect feature map reflects a defect detection result of the cut piece region;
the mesh network structure comprises N layers corresponding to N scales, wherein N is an integer greater than or equal to 2, and the scale corresponding to each layer decreases gradually from top to bottom; each layer comprises at least one node, and each node represents a target feature map or a feature map to be fused on the corresponding scale; a skip-link relation between nodes in the same layer represents a fusion relation between the feature maps to be fused before feature fusion is carried out in that layer; a directed relation between nodes in adjacent layers represents an up-sampling relation between the target feature map on the next scale and the feature map to be fused on the previous scale before feature fusion is carried out in the layer of the previous scale, and represents a fusion relation between the target feature map on the next scale and the target feature map on the previous scale after feature fusion is carried out in the layer of the previous scale.
12. A computer device, comprising: a memory and a processor;
the memory for storing a computer program;
the processor is coupled to the memory for executing the computer program for performing the steps of the method of any of claims 4-11.
13. A computer-readable storage medium having a computer program stored thereon, which, when executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 4 to 11.
CN202111175162.5A 2021-10-09 2021-10-09 Digital cloth processing system, cloth flaw detection method, device and medium Active CN113610848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111175162.5A CN113610848B (en) 2021-10-09 2021-10-09 Digital cloth processing system, cloth flaw detection method, device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111175162.5A CN113610848B (en) 2021-10-09 2021-10-09 Digital cloth processing system, cloth flaw detection method, device and medium

Publications (2)

Publication Number Publication Date
CN113610848A CN113610848A (en) 2021-11-05
CN113610848B true CN113610848B (en) 2022-04-12

Family

ID=78343392

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111175162.5A Active CN113610848B (en) 2021-10-09 2021-10-09 Digital cloth processing system, cloth flaw detection method, device and medium

Country Status (1)

Country Link
CN (1) CN113610848B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113610848B (en) * 2021-10-09 2022-04-12 阿里巴巴(中国)有限公司 Digital cloth processing system, cloth flaw detection method, device and medium
CN114990763A (en) * 2022-06-27 2022-09-02 佛山市光华织造有限公司 Rapier loom
CN116485790B (en) * 2023-06-16 2023-08-22 吉林省艾优数字科技有限公司 Intelligent detection system for personal protective clothing production abnormality for epidemic prevention

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109767443A (en) * 2019-01-17 2019-05-17 深圳码隆科技有限公司 A kind of fabric flaws method of data capture and device
CN111915593A (en) * 2020-08-04 2020-11-10 中国科学院微电子研究所 Model establishing method and device, electronic equipment and storage medium
CN112200790A (en) * 2020-10-16 2021-01-08 鲸斛(上海)智能科技有限公司 Cloth defect detection method, device and medium
CN112686869A (en) * 2020-12-31 2021-04-20 上海智臻智能网络科技股份有限公司 Cloth flaw detection method and device
CN113610848A (en) * 2021-10-09 2021-11-05 阿里巴巴(中国)有限公司 Digital cloth processing system, cloth flaw detection method, device and medium

Also Published As

Publication number Publication date
CN113610848A (en) 2021-11-05

Similar Documents

Publication Publication Date Title
CN113610848B (en) Digital cloth processing system, cloth flaw detection method, device and medium
KR102229594B1 (en) Display screen quality detection method, device, electronic device and storage medium
US11380232B2 (en) Display screen quality detection method, apparatus, electronic device and storage medium
CN113406092B (en) Digital production detection system, method, device, equipment and storage medium
CN105372581A (en) Flexible circuit board manufacturing process automatic monitoring and intelligent analysis system and method
CN106530284A (en) Solder joint type detection method and apparatus based on image identification
KR20060128979A (en) Maximization of yield for web-based articles
CN108681667A (en) A kind of unit type recognition methods, device and processing equipment
CN109241030A (en) Robot manipulating task data analytics server and robot manipulating task data analysing method
Arikan et al. Surface defect classification in real-time using convolutional neural networks
CN114697212A (en) Device parameter processing method, device, system and medium
KR101206290B1 (en) Quality test equipment using fabric pattern algorithm and method for controlling thereof
CN113487247B (en) Digitalized production management system, video processing method, equipment and storage medium
CN113696641B (en) Digital printing system, method, equipment and storage medium
CN117115138A (en) Intelligent control system and method in clothing production process
CN110567967B (en) Display panel detection method, system, terminal device and computer readable medium
CN116385430A (en) Machine vision flaw detection method, device, medium and equipment
Huynh Online defect prognostic model for textile manufacturing
CN114708584A (en) Big data based cigarette product quality defect prevention and control learning system and method
US20100194562A1 (en) Failure recognition system
Luo et al. RBD-Net: robust breakage detection algorithm for industrial leather
CN113916899B (en) Method, system and device for detecting large transfusion soft bag product based on visual identification
CN212846839U (en) Fabric information matching system
KR20220067924A (en) Loan regular auditing system using artificia intellicence
CN217332186U (en) Plank AI visual defect detection device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant