CN113283499B - Three-dimensional woven fabric weaving density detection method based on deep learning - Google Patents

Three-dimensional woven fabric weaving density detection method based on deep learning

Info

Publication number
CN113283499B
CN113283499B (application CN202110564520.5A)
Authority
CN
China
Prior art keywords
longitudinal
dimensional
density
deep learning
warp
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110564520.5A
Other languages
Chinese (zh)
Other versions
CN113283499A (en)
Inventor
汪俊 (Wang Jun)
单忠德 (Shan Zhongde)
谢乾 (Xie Qian)
涂启帆 (Tu Qifan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN202110564520.5A
Publication of CN113283499A
Application granted
Publication of CN113283499B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning

Abstract

The invention discloses a deep-learning-based method for detecting the weaving density of a three-dimensional woven fabric, which comprises the following steps: dividing a three-dimensional fabric surface data set into a training set and a verification set, and constructing a deep learning network model for training and verification. The deep learning network model can accurately detect the category and position information of each individual warp and weft (longitudinal and transverse line) on the surface image of the three-dimensional fabric; based on the detection result output by the model, the warp and weft densities are then calculated with a statistical method from the coordinates of the predicted bounding boxes. The method detects the weaving density of a three-dimensional woven fabric with high accuracy and efficiency, achieves real-time detection, reduces the interference of human subjectivity, improves the quality of the three-dimensional woven fabric, and ensures the reliability of product manufacture. The invention also fills a gap in automatic detection technology for three-dimensional woven fabric weaving density and improves the efficiency of consistency detection of three-dimensional woven fabrics.

Description

Three-dimensional woven fabric weaving density detection method based on deep learning
Technical Field
The invention belongs to the technical field of performance consistency detection of three-dimensional fabrics, and particularly relates to a three-dimensional fabric weaving density detection method based on deep learning.
Background
With the continuous development of the automobile industry, composite materials have attracted attention for their special vibration-damping properties, which effectively reduce vibration and noise, and for their superior fatigue resistance; they are increasingly used in the manufacture of automobile bodies, load-bearing assemblies, drive shafts, and internal structures.
The use of three-dimensional fabrics as composite preforms is a current research focus. Three-dimensional weaving can produce a variety of composite preforms according to actual requirements. With the continuous development of composite three-dimensional weaving technology, smaller and lighter three-dimensional woven fabrics face higher strength requirements, which in turn demand high-quality manufacturing. In the manufacturing process of a three-dimensional woven fabric, the weaving density of the surface is an important quality index. Traditional weaving density detection relies on manual calibration, which is time-consuming and labor-intensive, depends heavily on the subjective judgment of inspection personnel, and adversely affects the manufacturing quality of the three-dimensional woven fabric.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a three-dimensional fabric weaving density detection method based on deep learning.
The technical scheme adopted by the invention is as follows:
a three-dimensional fabric weaving density detection method based on deep learning comprises the following steps:
step 1, acquiring a three-dimensional fabric surface image, and carrying out category and position marking on the acquired three-dimensional fabric surface image according to priori knowledge to obtain a three-dimensional fabric surface data set; the three-dimensional fabric surface data set is divided into a training set and a verification set;
step 2, constructing a deep learning network model, inputting the training set into the deep learning network model for training, and stopping training when the loss function value is less than 1 and does not decrease any more;
inputting the verification set into a trained deep learning network model, outputting a three-dimensional fabric surface image marked with the category and the position of a longitudinal line and a transverse line, comparing the category and the position of the output three-dimensional fabric surface image with the category and the position mark of the verification set, and finishing the training of the deep learning network model if the accuracy rate is higher than a set value;
and 3, acquiring a three-dimensional fabric surface image in real time, inputting the image into the trained deep learning network model, outputting an image marked with longitudinal and transverse categories and positions, and calculating a fabric density value according to the position coordinates of each longitudinal and transverse line monomer.
Further, in step 2, the deep learning network model comprises a feature extraction module for extracting a feature map and a detection module for predicting longitudinal and transverse categories and position coordinates on a surface image of the three-dimensional fabric; the output end of the feature extraction module is connected with the input end of the detection module.
Further, the detection module applies a dilated (hole) convolution on a single feature layer.
Further, in step 2, the loss function L of the deep learning network model is:
L = (1/N) * Σ (L_cls + α * L_reg(x, y))
where N represents the total number of extracted prediction frames (the sum runs over all N frames), L_cls represents the focal loss function, L_reg represents the cross-entropy loss function, and (x, y) represents the position coordinates of the predicted warp/weft monomer in the feature map; α is a hyperparameter used to balance the two losses, and its value is 1.
Further, in step 2, the set value is 90%.
Further, in step 3, the weaving density calculation method comprises: according to the predicted warp and weft monomer coordinates on the three-dimensional woven fabric surface image output by the deep learning network model, first extracting the monomers on the same longitudinal or transverse line, then calculating the number of monomers per pixel according to the coordinate values, and finally obtaining the weaving density according to the calibrated size.
The invention has the beneficial effects that:
the three-dimensional woven fabric weaving density detection method is high in accuracy and efficiency, can achieve the effect of real-time detection, reduces interference of artificial subjectivity, improves the quality of the three-dimensional woven fabric, ensures the reliability of product production, and improves the efficiency of three-dimensional woven fabric consistency detection.
Drawings
FIG. 1 is a diagram of a deep learning network model according to the present invention;
FIG. 2 is a schematic diagram of a method for calculating the longitudinal and transverse linear density according to the present invention;
FIG. 3 shows the detection results of the three-dimensional woven fabric weaving density detection method of the present invention on transverse wefts;
FIG. 4 shows the test results of the three-dimensional woven fabric weaving density test method of the present invention on longitudinal warps.
Detailed Description
The deep-learning-based method for detecting the weaving density of a three-dimensional woven fabric according to the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
In the deep-learning-based method for detecting the weaving density of a three-dimensional woven fabric, the density of the warps and wefts is calculated based on a deep learning network: each individual warp and weft monomer is accurately detected through analysis of its category and position coordinates, and the density value is calculated from the distribution of the warps and wefts on the image.
A three-dimensional fabric weaving density detection method based on deep learning comprises the following steps:
step 1, collecting a three-dimensional fabric surface image, and carrying out category and position labeling on the collected three-dimensional fabric surface image according to prior knowledge to obtain a three-dimensional fabric surface data set. The three-dimensional fabric surface data set is divided into a training set and a validation set.
And 2, constructing a deep learning network model, inputting the training set into the deep learning network model for training, and stopping training when the loss function value is less than 1 and does not decrease any more. Inputting the verification set into the trained deep learning network model, outputting a three-dimensional fabric surface image marked with the longitudinal and transverse line type and position, comparing the type and position of the output three-dimensional fabric surface image with the type and position mark of the verification set, and finishing the training of the deep learning network model if the accuracy is higher than 90%.
In this embodiment, the deep learning network model includes a feature extraction module and a detection module, and the detection module applies a dilated (hole) convolution on a single feature layer. The output end of the feature extraction module is connected to the input end of the detection module. The feature extraction module extracts the corresponding feature maps from the input three-dimensional fabric surface images; the feature maps are upsampled at different scales to form a feature-layer pyramid. The topmost feature map of the pyramid is then taken, and a single 3 x 3 dilated convolution ensures that a global receptive field is obtained on this single feature map. The feature map after the dilated convolution is fed to the detection module, which predicts, based on this feature map, the category, the corresponding confidence, and the position coordinates of each warp and weft monomer on the surface image of the three-dimensional woven fabric.
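The single-layer dilated convolution described above can be sketched as follows. This is a minimal illustrative implementation, not the patent's code; the function name `dilated_conv2d`, the zero padding, and the use of plain Python lists are assumptions made for the sketch. A 3 x 3 kernel with dilation rate r samples the input at offsets spaced r pixels apart, so its effective receptive field grows to (2r + 1) x (2r + 1) without adding parameters — which is how a single convolution over the small P5 map can approach a global receptive field.

```python
def dilated_conv2d(feature, kernel, rate):
    """3x3 convolution with dilation `rate` (stride 1, zero padding) —
    a sketch of the single-layer hole convolution applied to the P5 map.
    `feature` and `kernel` are 2-D lists of floats."""
    h, w = len(feature), len(feature[0])
    kh, kw = len(kernel), len(kernel[0])
    ch, cw = kh // 2, kw // 2  # kernel centre

    def px(i, j):
        # zero padding outside the feature map
        if 0 <= i < h and 0 <= j < w:
            return feature[i][j]
        return 0.0

    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # sample the input at offsets spaced `rate` pixels apart,
            # giving a (2*rate + 1)-wide receptive field per 3x3 kernel
            out[i][j] = sum(kernel[u][v] * px(i + (u - ch) * rate,
                                              j + (v - cw) * rate)
                            for u in range(kh) for v in range(kw))
    return out
```

In a real network this would be a learned convolution layer (e.g. one with a `dilation` argument in a deep learning framework); the loop form above only makes the sampling pattern explicit.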
Specifically, the method comprises the following steps:
Step S201, inputting a three-dimensional fabric surface image from the training set into the feature extraction module to extract a feature map, and upsampling the feature map at different scales to form a feature-layer pyramid, as shown in FIG. 1;
Step S202, taking the feature map of the topmost pyramid layer (C5) to obtain the corresponding feature map P5, performing a single 3 x 3 dilated convolution on P5 to ensure that a global receptive field is obtained on this single feature map, and feeding the feature map after the dilated convolution to the detection module for subsequent detection;
Step S203, using the detection module to detect and output the warp and weft categories and position coordinates on the surface image of the three-dimensional fabric;
Step S204, sequentially inputting the three-dimensional fabric surface images of the training set into the deep learning network model and repeating steps S201 to S203; training is stopped once the loss function value is less than 1 and no longer decreases, because an excessive number of iterations would overfit the deep learning network model and reduce its accuracy.
In this step, the loss function L of the deep learning network model is:
L = (1/N) * Σ (L_cls + α * L_reg(x, y))
where N represents the total number of extracted prediction frames (the sum runs over all N frames), L_cls represents the focal loss function, L_reg represents the cross-entropy loss function, and (x, y) represents the position coordinates of the predicted warp/weft monomer in the feature map. α is a hyperparameter used to balance the two losses, and its value is 1.
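The combined loss can be sketched as below. This is only an illustration of the formula's structure, not the patent's implementation: the focal-loss exponent `gamma = 2` and the exact form of the cross-entropy regression term (here binary cross-entropy on normalised coordinates) are assumptions, since the description only names the loss types.

```python
import math

def focal_loss(p, y, gamma=2.0):
    """Binary focal loss for one prediction frame; p = predicted
    probability, y in {0, 1}. gamma = 2 is an assumed default."""
    pt = p if y == 1 else 1.0 - p
    pt = max(pt, 1e-7)
    return -((1.0 - pt) ** gamma) * math.log(pt)

def detection_loss(frames, alpha=1.0):
    """L = (1/N) * sum(L_cls + alpha * L_reg) over the N prediction
    frames, matching the description above with alpha = 1. Each frame
    is (p, y, reg_pred, reg_true), where reg_pred / reg_true are the
    predicted and ground-truth normalised (x, y) coordinates."""
    def bce(q, t):
        # cross-entropy between predicted and true normalised coordinates
        q = min(max(q, 1e-7), 1.0 - 1e-7)
        return -(t * math.log(q) + (1.0 - t) * math.log(1.0 - q))

    n = len(frames)
    total = 0.0
    for p, y, reg_pred, reg_true in frames:
        l_cls = focal_loss(p, y)
        l_reg = sum(bce(q, t) for q, t in zip(reg_pred, reg_true))
        total += l_cls + alpha * l_reg
    return total / n
```

The focal term down-weights well-classified monomers so the dense, easy background does not dominate training, which is consistent with detecting many small warp/weft monomers per image.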
And 3, acquiring a three-dimensional fabric surface image in real time, inputting the image into the trained deep learning network model, outputting an image marked with longitudinal and transverse categories and positions, and calculating a fabric density value according to the position coordinates of each longitudinal and transverse line monomer.
In step 3, the weaving density calculation method comprises the following steps: according to the predicted warp and weft monomer coordinates on the three-dimensional woven fabric surface image output by the deep learning network model, first extract the monomers lying on the same longitudinal or transverse line, then calculate the number of monomers per pixel from the coordinate values, and finally convert this to the weaving density according to the calibrated size.
Specifically, the method comprises the following steps:
in step S301, for the horizontal wefts, such as the solid rectangular frame in fig. 2, all the weft monomers on the same horizontal line need to be found, and according to statistics, the upper left coordinate in the position coordinates of the weft monomers is taken, and when the difference value of the vertical coordinates is ± 20, all the weft monomers are considered to be on the same horizontal line. Counting the number x of weft single bodies on the same transverse line, taking the width w of a surface image of the three-dimensional fabric, obtaining the density value of the weft on a pixel level as x/w, and expanding the density unit to the density unit in the real world according to the calibrated size. And (4) counting the density values of all transverse lines on the surface image of the whole three-dimensional fabric, and calculating the average value to obtain the weft density value of the image.
Step S302, for the longitudinal warps (the dotted rectangular frames in FIG. 2), all warp monomers on the same longitudinal line need to be found. The top-left and bottom-right coordinates of each warp monomer's position are taken (the black dots on the dotted rectangular frame in FIG. 2). The warp monomer with the smallest top-left coordinate value is taken as the starting end of the topmost longitudinal line of the image; among the remaining warp monomers, the monomer whose top-left coordinate differs from the current monomer's bottom-right coordinate by no more than ±15 pixels in both the horizontal and vertical directions is taken as the next link of the longitudinal line. This is repeated until no next warp monomer can be found, yielding all warp monomers on one complete longitudinal line. The number y of warp monomers on the same longitudinal line is counted and the height h of the three-dimensional fabric surface image is taken, giving a pixel-level warp density of y/h (in warps per pixel); this is converted to a real-world density unit according to the calibrated size. Because some longitudinal warps may be partially missing from the current image, the warp density value of the whole image is finally taken from the longitudinal line with the largest number of warp monomers.
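The chaining rule of step S302 can be sketched as below. Trying every monomer as a chain start and keeping the longest chain is an assumption made so the sketch directly yields the "longitudinal line with the largest number of warp monomers"; the ±15-pixel link rule and the y/h density follow the step above.

```python
def warp_chain_density(boxes, image_height, tol=15):
    """Pixel-level warp density via the chaining rule of step S302.

    boxes: list of (x1, y1, x2, y2) predicted warp monomers. A chain
    grows by taking any remaining monomer whose top-left corner lies
    within ±tol pixels (both axes) of the current monomer's bottom-right
    corner. The longest chain found is used, since some warps may be
    partially missing from the image.
    """
    best = 0
    for k, start in enumerate(boxes):
        remaining = [b for i, b in enumerate(boxes) if i != k]
        current, length = start, 1
        while True:
            nxt = next((b for b in remaining
                        if abs(b[0] - current[2]) <= tol      # x1 vs x2
                        and abs(b[1] - current[3]) <= tol),   # y1 vs y2
                       None)
            if nxt is None:
                break  # no next-stage warp monomer: chain is complete
            remaining.remove(nxt)
            current, length = nxt, length + 1
        best = max(best, length)
    return best / image_height  # y / h, in warps per pixel
```

As with the weft case, the calibrated size then scales this pixel-level value to a real-world warp density.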
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any alternatives or modifications that can be easily conceived by those skilled in the art within the technical scope of the present invention should be covered within the scope of the present invention.

Claims (1)

1. A three-dimensional fabric weaving density detection method based on deep learning is characterized by comprising the following steps:
step 1, acquiring a three-dimensional fabric surface image, and carrying out category and position marking on the acquired three-dimensional fabric surface image according to priori knowledge to obtain a three-dimensional fabric surface data set; the three-dimensional fabric surface data set is divided into a training set and a verification set;
step 2, constructing a deep learning network model, inputting the training set into the deep learning network model for training, and stopping training when the loss function value is less than 1 and does not decrease any more;
inputting the verification set into a trained deep learning network model, outputting a three-dimensional fabric surface image marked with longitudinal and transverse line categories and positions, comparing the categories and the positions of the output three-dimensional fabric surface image with the categories and the position marks of the verification set, and finishing the training of the deep learning network model if the accuracy is higher than a set value; the set value is 90%;
step 3, acquiring a three-dimensional fabric surface image in real time, inputting the image into a trained deep learning network model, outputting an image marked with longitudinal and transverse categories and positions, and calculating a fabric density value according to the position coordinates of each longitudinal and transverse line monomer;
in step 2, the deep learning network model comprises a feature extraction module for extracting a feature map and a detection module for predicting longitudinal and transverse categories and position coordinates on a surface image of the three-dimensional fabric; the output end of the feature extraction module is connected with the input end of the detection module;
the detection module applies a dilated (hole) convolution on a single feature layer;
in step 2, the loss function L of the deep learning network model is:
L = (1/N) * Σ (L_cls + α * L_reg(x, y))
where N represents the total number of extracted prediction frames (the sum runs over all N frames), L_cls represents the focal loss function, L_reg represents the cross-entropy loss function, and (x, y) represents the position coordinates of the predicted warp/weft monomer in the feature map; α is a hyperparameter used to balance the two losses, and its value is 1;
in step 3, the weaving density calculation method comprises: according to the predicted warp and weft monomer coordinates on the three-dimensional woven fabric surface image output by the deep learning network model, first extracting the monomers on the same longitudinal or transverse line, then calculating the number of monomers per pixel according to the coordinate values, and finally obtaining the weaving density according to the calibrated size;
specifically, the method comprises the following steps:
s301, for transverse wefts, all weft monomers on the same transverse line need to be found, the upper left coordinates in the position coordinates of the weft monomers are taken, when the difference value of vertical coordinates is +/-20, the weft monomers on the same transverse line are considered to be on the same transverse line, the number x of the weft monomers on the same transverse line is counted, the width w of a three-dimensional woven fabric surface image is taken, the density value of the weft on a pixel level is obtained and divided by w, the unit is 'number of wefts/each pixel', the density unit is expanded to a density unit in the real world according to the calibrated size, the density values of all transverse lines on the whole three-dimensional woven fabric surface image are counted, and the average value is obtained to obtain the weft density value of the image;
step S302, for longitudinal warps, all warp monomers on the same longitudinal line need to be found, an upper left coordinate and a lower right coordinate in position coordinates of the warp monomers are taken, the warp monomer with the minimum upper left coordinate value is taken as a starting end of the longitudinal line at the topmost end of the image, the warp monomers with the upper left coordinate value being compared with the lower right coordinate of the warp monomer in the rest warp monomers and the difference value of the horizontal and vertical coordinates being +/-15 are taken as the next-stage warps of the longitudinal line, the steps are repeated until the next-stage warp monomer cannot be found, all warp monomers on a complete longitudinal line are obtained, the number y of the warp monomers on the same longitudinal line is counted, the height h of the surface image of the three-dimensional woven object is taken, the density value of the weft on a pixel level is obtained as y/h, the unit is 'number of warps/pixel', and the density unit in the real world is expanded according to the calibrated size, because there is a case where there is a missing portion in the current image for the longitudinal warps, the warp density value of the entire image is finally taken where the number of warp monomers on the longitudinal warp is the largest.
CN202110564520.5A 2021-05-24 2021-05-24 Three-dimensional woven fabric weaving density detection method based on deep learning Active CN113283499B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110564520.5A CN113283499B (en) 2021-05-24 2021-05-24 Three-dimensional woven fabric weaving density detection method based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110564520.5A CN113283499B (en) 2021-05-24 2021-05-24 Three-dimensional woven fabric weaving density detection method based on deep learning

Publications (2)

Publication Number Publication Date
CN113283499A (en) 2021-08-20
CN113283499B (en) 2022-09-13

Family

ID=77281015

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110564520.5A Active CN113283499B (en) 2021-05-24 2021-05-24 Three-dimensional woven fabric weaving density detection method based on deep learning

Country Status (1)

Country Link
CN (1) CN113283499B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113715330B (en) * 2021-09-02 2022-07-08 北京理工大学 Interlayer penetrating continuous fiber composite material additive manufacturing equipment and method

Citations (2)

Publication number Priority date Publication date Assignee Title
CN101270544A (en) * 2008-04-29 2008-09-24 上海好力纺织机电制造有限公司 Parameter detecting device and method for textile weaving process
CN102788793A (en) * 2012-03-31 2012-11-21 江南大学 Method for measuring density of weft knitted fabric based on spectrum analysis

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CN1844550B (en) * 2006-01-26 2013-05-01 香港理工大学 Textile and yarn analysis system based on two-side scanning technology
CN103149210B (en) * 2013-02-25 2015-09-30 东华大学 A kind of fabric cashmere content detection system and method based on scale picture and text feature
CN109344821A (en) * 2018-08-30 2019-02-15 西安电子科技大学 Small target detecting method based on Fusion Features and deep learning
CN109785314A (en) * 2019-01-22 2019-05-21 中科院金华信息技术有限公司 A kind of pck count detection system and method based on u-net network
CN110472599B (en) * 2019-08-20 2021-09-03 北京海益同展信息科技有限公司 Object quantity determination method and device, storage medium and electronic equipment
CN111830036B (en) * 2020-07-01 2023-04-07 湖北省纤维检验局 Fabric density analysis method and system
CN112183673A (en) * 2020-11-06 2021-01-05 携程计算机技术(上海)有限公司 Weather time interval classification method, system, equipment and storage medium

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN101270544A (en) * 2008-04-29 2008-09-24 上海好力纺织机电制造有限公司 Parameter detecting device and method for textile weaving process
CN102788793A (en) * 2012-03-31 2012-11-21 江南大学 Method for measuring density of weft knitted fabric based on spectrum analysis

Also Published As

Publication number Publication date
CN113283499A (en) 2021-08-20

Similar Documents

Publication Publication Date Title
CN107644415B (en) A kind of text image method for evaluating quality and equipment
CN103604809B (en) A kind of online visible detection method of pattern cloth flaw
CN111402226A (en) Surface defect detection method based on cascade convolution neural network
CN104361611B (en) Group sparsity robust PCA-based moving object detecting method
CN107123111B (en) Deep residual error network construction method for mobile phone screen defect detection
CN110263705A (en) Towards two phase of remote sensing technology field high-resolution remote sensing image change detecting method
CN111402197B (en) Detection method for colored fabric cut-parts defect area
CN106875373A (en) Mobile phone screen MURA defect inspection methods based on convolutional neural networks pruning algorithms
CN101866427A (en) Method for detecting and classifying fabric defects
CN107093205A (en) A kind of three dimensions building window detection method for reconstructing based on unmanned plane image
CN111127383A (en) Digital printing online defect detection system and implementation method thereof
CN115797354B (en) Method for detecting appearance defects of laser welding seam
CN107563274A (en) A kind of vehicle checking method and method of counting of the video based on confrontation e-learning
CN104504381B (en) Non-rigid object detection method and its system
CN105718866A (en) Visual target detection and identification method
CN113283499B (en) Three-dimensional woven fabric weaving density detection method based on deep learning
CN111862027B (en) Textile flaw detection method based on low-rank sparse matrix decomposition
CN109543672A (en) Object detecting method based on dense characteristic pyramid network
CN109635726A (en) A kind of landslide identification method based on the symmetrical multiple dimensioned pond of depth network integration
CN112102224A (en) Cloth defect identification method based on deep convolutional neural network
CN104091327A (en) Method and system for generating dendritic shrinkage porosity defect simulation image of casting
CN105205816A (en) Method for extracting high-resolution SAR image building zone through multi-feature weighted fusion
CN110544253A (en) fabric flaw detection method based on image pyramid and column template
CN107247954A (en) A kind of image outlier detection method based on deep neural network
CN115100214A (en) Textile quality detection method based on image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant