CN115471480A - Intelligent production process of hip-up sport pants - Google Patents

Intelligent production process of hip-up sport pants

Info

Publication number
CN115471480A
CN115471480A (application CN202211141108.3A)
Authority
CN
China
Prior art keywords
convolution
pattern
characteristic diagram
feature map
hip
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202211141108.3A
Other languages
Chinese (zh)
Inventor
刘迪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Resonance Sports Technology Co ltd
Original Assignee
Ningbo Resonance Sports Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Resonance Sports Technology Co ltd filed Critical Ningbo Resonance Sports Technology Co ltd
Priority to CN202211141108.3A
Publication of CN115471480A
Legal status: Withdrawn

Classifications

    • G06T 7/0004 — Industrial image inspection (G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 7/00 Image analysis › G06T 7/0002 Inspection of images, e.g. flaw detection)
    • G06N 3/08 — Learning methods (G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS › G06N 3/00 Computing arrangements based on biological models › G06N 3/02 Neural networks)
    • G06V 10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components (G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING › G06V 10/00 Arrangements for image or video recognition or understanding › G06V 10/40 Extraction of image or video features)
    • G06V 10/764 — Recognition or understanding using pattern recognition or machine learning: classification, e.g. of video objects (G06V 10/70)
    • G06V 10/82 — Recognition or understanding using pattern recognition or machine learning: neural networks (G06V 10/70)
    • G06T 2207/20081 — Training; learning (G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/20 Special algorithmic details)
    • G06T 2207/20084 — Artificial neural networks [ANN] (G06T 2207/20 Special algorithmic details)
    • G06T 2207/30124 — Fabrics; textile; paper (G06T 2207/30 Subject of image; context of image processing › G06T 2207/30108 Industrial image inspection)
    • Y02P 90/30 — Computing systems specially adapted for manufacturing (Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS › Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE › Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS › Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Quality & Reliability (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to the field of intelligent garment design and manufacture, and in particular discloses an intelligent production process for hip-up sports pants. A deep convolutional neural network model is used as a domain converter to map an image of the pattern of the hip-up sports pants to be detected from the source domain into a high-dimensional feature domain to obtain a pattern feature map, and a reference pattern feature map is obtained in the same way. A difference feature map between the reference pattern feature map and the pattern feature map is then calculated to represent, at the level of essential features, the difference between the burnt pattern of the sports pants to be detected and the designed burnt pattern, and the difference feature map is passed through a classifier to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements. In this way, an intelligent scheme for the production process of hip-up sports pants is established on the basis of artificial intelligence technology, and the accuracy of quality inspection of burnt patterns on sportswear is improved.

Description

Intelligent production process of hip-up sport pants
Technical Field
The application relates to the field of intelligent garment design and manufacture, and more particularly to an intelligent production process for hip-up sports pants.
Background Art
In the pattern-burning process for sportswear, fibers are corroded and damaged by a chemical reagent so that, through a series of chemical reactions, various patterns are formed on the surface of the fabric. Fabric treated by the pattern-burning process has good body and appearance, but the work flow of the whole process is complicated. Because of this complexity, a problem in any process step, or a step that is not controlled properly, makes it difficult for the final pattern produced by the pattern-burning process to meet the design requirements.
Quality inspection for the traditional pattern-burning process is performed manually. This approach is inefficient and error-prone, and fine defects are difficult to find because of the limited resolution of the human eye.
Therefore, an optimized intelligent production process for hip-up sports pants that can perform intelligent quality inspection of the burnt patterns on sportswear is desired.
Disclosure of Invention
The present application is proposed to solve the above technical problems. Embodiments of the present application provide an intelligent production process for hip-up sports pants, in which a deep convolutional neural network model is used as a domain converter to map an image of the pattern of the hip-up sports pants to be detected from the source domain into a high-dimensional feature domain to obtain a pattern feature map, and a reference pattern feature map is obtained in the same way. A difference feature map between the reference pattern feature map and the pattern feature map is then calculated to represent, at the level of essential features, the difference between the burnt pattern of the sports pants to be detected and the designed burnt pattern, and the difference feature map is passed through a classifier to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements. In this way, an intelligent scheme for the hip-up sports pants production process is established on the basis of artificial intelligence technology, and the accuracy of quality inspection of burnt patterns on sportswear is improved.
Accordingly, according to an aspect of the present application, there is provided an intelligent production process of hip-up sports pants, comprising:
acquiring an image of the pattern of the hip-up sports pants to be detected;
passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map;
extracting, from a database, an image of a reference pattern matching the pattern of the hip-up sports pants to be detected;
passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map;
correcting the feature value of each element in the pattern feature map based on the position information of each element in the pattern feature map to obtain a corrected pattern feature map;
correcting the feature value of each element in the reference pattern feature map based on the position information of each element in the reference pattern feature map to obtain a corrected reference pattern feature map;
calculating a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map; and
passing the difference feature map through a classifier to obtain a classification result, wherein the classification result indicates whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
In the above intelligent production process of hip-up sports pants, passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map includes: using each layer of the convolutional neural network model to perform, on the input data in the forward pass of that layer: convolution of the input data with a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; convolution of the input data with a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; convolution of the input data with a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; convolution of the input data with a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; concatenation of the first, second, third and fourth convolution feature maps to obtain a multi-scale convolution feature map; pooling of the multi-scale convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the pattern feature map.
In the intelligent production process of the hip-up sports pants, the size of the first convolution kernel is 7 × 7, the size of the second convolution kernel is 5 × 5, the size of the third convolution kernel is 3 × 3, and the size of the fourth convolution kernel is 1 × 1.
In the above intelligent production process of hip-up sports pants, passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map includes: using each layer of the convolutional neural network model to perform, on the input data in the forward pass of that layer: convolution of the input data with a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; convolution of the input data with a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; convolution of the input data with a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; convolution of the input data with a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; concatenation of the first, second, third and fourth convolution feature maps to obtain a multi-scale convolution feature map; pooling of the multi-scale convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the reference pattern feature map, and the input of the first layer of the convolutional neural network model is the image of the reference pattern.
In the above intelligent production process of hip-up sports pants, correcting the feature value of each element in the pattern feature map based on the position information of each element in the pattern feature map to obtain a corrected pattern feature map includes: correcting, based on the position information of each element in the pattern feature map, the feature value of each element in the pattern feature map with the following formula to obtain the corrected pattern feature map; wherein the formula is:
[formula shown as image Figure BDA0003853554820000031]
where M_1 is each feature matrix of the pattern feature map along the channel dimension; Cov_1(·) and Cov_2(·) are each a single convolution layer; the mapping shown as Figure BDA0003853554820000032 maps two-dimensional position coordinates to one-dimensional values; and P_M is the coordinate matrix of each feature matrix of the pattern feature map along the channel dimension.
In the above intelligent production process of hip-up sports pants, correcting the feature value of each element in the reference pattern feature map based on the position information of each element in the reference pattern feature map to obtain a corrected reference pattern feature map includes: correcting, based on the position information of each element in the reference pattern feature map, the feature value of each element in the reference pattern feature map with the following formula to obtain the corrected reference pattern feature map; wherein the formula is:
[formula shown as image Figure BDA0003853554820000033]
where M_2 is each feature matrix of the reference pattern feature map along the channel dimension; Cov_1(·) and Cov_2(·) are each a single convolution layer; the mapping shown as Figure BDA0003853554820000034 maps two-dimensional position coordinates to one-dimensional values; and P_M is the coordinate matrix of each feature matrix of the reference pattern feature map along the channel dimension.
In the above intelligent production process of hip-up sports pants, calculating a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map includes: calculating the difference feature map between the corrected pattern feature map and the corrected reference pattern feature map according to the following formula; wherein the formula is:
[formula shown as image Figure BDA0003853554820000041]
where F'_1 denotes the corrected pattern feature map, F'_2 denotes the corrected reference pattern feature map, and the operator shown as Figure BDA0003853554820000042 denotes position-wise subtraction.
In the above intelligent production process of hip-up sports pants, passing the difference feature map through a classifier to obtain a classification result includes: processing the difference feature map with the classifier according to the following formula to obtain the classification result;
wherein the formula is: O = Softmax{(W_n, B_n) : ⋯ : (W_1, B_1) | Project(F_c)}, where Project(F_c) denotes projecting the difference feature map into a vector, W_1 to W_n are the weight matrices of the fully connected layers of each layer, and B_1 to B_n are the bias matrices of the fully connected layers of each layer.
According to another aspect of the present application, there is also provided a system for an intelligent production process of hip-up sport pants, comprising:
an image acquisition unit for acquiring an image of the pattern of the hip-up sports pants to be detected;
a pattern feature map generation unit for passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map;
a reference image extraction unit for extracting, from a database, an image of a reference pattern matching the pattern of the hip-up sports pants to be detected;
a reference pattern feature map generation unit for passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map;
a corrected pattern feature map generation unit for correcting the feature value of each element in the pattern feature map based on the position information of each element in the pattern feature map to obtain a corrected pattern feature map;
a corrected reference pattern feature map generation unit for correcting the feature value of each element in the reference pattern feature map based on the position information of each element in the reference pattern feature map to obtain a corrected reference pattern feature map;
a difference unit for calculating a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map; and
a result generation unit for passing the difference feature map through a classifier to obtain a classification result, wherein the classification result indicates whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
In the above system of the intelligent production process of hip-up sports pants, the pattern feature map generation unit is further configured to: use each layer of the convolutional neural network model to perform, on the input data in the forward pass of that layer: convolution of the input data with a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; convolution of the input data with a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; convolution of the input data with a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; convolution of the input data with a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; concatenation of the first, second, third and fourth convolution feature maps to obtain a multi-scale convolution feature map; pooling of the multi-scale convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the pattern feature map.
In the above system for the intelligent production process of the hip-up sport pants, the size of the first convolution kernel is 7 × 7, the size of the second convolution kernel is 5 × 5, the size of the third convolution kernel is 3 × 3, and the size of the fourth convolution kernel is 1 × 1.
In the above system of the intelligent production process of hip-up sports pants, the reference pattern feature map generation unit is further configured to: use each layer of the convolutional neural network model to perform, on the input data in the forward pass of that layer: convolution of the input data with a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; convolution of the input data with a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; convolution of the input data with a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; convolution of the input data with a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; concatenation of the first, second, third and fourth convolution feature maps to obtain a multi-scale convolution feature map; pooling of the multi-scale convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the reference pattern feature map, and the input of the first layer of the convolutional neural network model is the image of the reference pattern.
In the above system of the intelligent production process of hip-up sports pants, the corrected pattern feature map generation unit is further configured to: correct, based on the position information of each element in the pattern feature map, the feature value of each element in the pattern feature map with the following formula to obtain the corrected pattern feature map; wherein the formula is:
[formula shown as image Figure BDA0003853554820000061]
where M_1 is each feature matrix of the pattern feature map along the channel dimension; Cov_1(·) and Cov_2(·) are each a single convolution layer; the mapping shown as Figure BDA0003853554820000062 maps two-dimensional position coordinates to one-dimensional values; and P_M is the coordinate matrix of each feature matrix of the pattern feature map along the channel dimension.
In the above system of the intelligent production process of hip-up sports pants, the corrected reference pattern feature map generation unit is further configured to: correct, based on the position information of each element in the reference pattern feature map, the feature value of each element in the reference pattern feature map with the following formula to obtain the corrected reference pattern feature map; wherein the formula is:
[formula shown as image Figure BDA0003853554820000063]
where M_2 is each feature matrix of the reference pattern feature map along the channel dimension; Cov_1(·) and Cov_2(·) are each a single convolution layer; the mapping shown as Figure BDA0003853554820000064 maps two-dimensional position coordinates to one-dimensional values; and P_M is the coordinate matrix of each feature matrix of the reference pattern feature map along the channel dimension.
In the above system of the intelligent production process of hip-up sports pants, the difference unit is further configured to: calculate the difference feature map between the corrected pattern feature map and the corrected reference pattern feature map according to the following formula; wherein the formula is:
[formula shown as image Figure BDA0003853554820000065]
where F'_1 denotes the corrected pattern feature map, F'_2 denotes the corrected reference pattern feature map, and the operator shown as Figure BDA0003853554820000066 denotes position-wise subtraction.
According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory having computer program instructions stored therein which, when executed by the processor, cause the processor to perform the intelligent production process of hip-up sports pants as described above.
According to yet another aspect of the present application, there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the intelligent production process of hip-up sports pants as described above.
Compared with the prior art, the intelligent production process of hip-up sports pants provided by the present application uses a deep convolutional neural network model as a domain converter to map an image of the pattern of the hip-up sports pants to be detected from the source domain into a high-dimensional feature domain to obtain a pattern feature map, and obtains a reference pattern feature map in the same way. It then calculates a difference feature map between the reference pattern feature map and the pattern feature map to represent, at the level of essential features, the difference between the burnt pattern of the sports pants to be detected and the designed burnt pattern, and passes the difference feature map through a classifier to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements. In this way, an intelligent scheme for the hip-up sports pants production process is established on the basis of artificial intelligence technology, and the accuracy of quality inspection of burnt patterns on sportswear is improved.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally indicate like parts or steps.
Fig. 1 illustrates a scene schematic diagram of an intelligent production process of hip-up sports pants according to an embodiment of the present application.
Fig. 2 illustrates a flow chart of an intelligent production process of hip-up sports pants according to an embodiment of the present application.
Fig. 3 illustrates a schematic architecture diagram of an intelligent production process of hip-up sports pants according to an embodiment of the present application.
Fig. 4 illustrates a block diagram of a system for an intelligent production process of hip-up pants according to an embodiment of the present application.
FIG. 5 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Summary of the application
For the quality inspection of the pattern-burning process of sportswear, inspection can be performed by comparing the burnt pattern of the sportswear to be detected with the designed burnt pattern. However, if the difference between the burnt pattern of the sportswear to be detected and the designed burnt pattern is calculated directly in the image domain, the inspection result is poor. The reason is that, when the burnt pattern of the sportswear to be detected is captured, occlusion, ambient light and other influences introduce noise and interference, and differences in image scale between the captured burnt pattern and the designed burnt pattern further introduce scale discrepancies.
Based on this, in the technical solution of the present application, the comparison between the burnt pattern of the sportswear to be detected and the designed burnt pattern is performed in a high-dimensional feature space. Compared with the image source domain, the feature representation of the pattern in the high-dimensional feature domain is more abstract, but interference present in the image source domain can be eliminated, so that the difference between the burnt pattern of the sportswear to be detected and the designed burnt pattern can be compared directly at the level of essential features, thereby improving the accuracy of quality inspection of the burnt pattern of the sportswear.
Specifically, an image of the pattern of the hip-up sports pants to be detected and an image of a reference pattern are first obtained, where the image of the reference pattern may be a CAD garment design drawing. Then, the deep convolutional neural network model is used as a feature extractor to extract a pattern feature map from the image of the pattern of the hip-up sports pants to be detected; that is, the deep convolutional neural network model is used as a domain converter to map the image of the pattern of the hip-up sports pants to be detected from the source domain into a high-dimensional feature domain to obtain the pattern feature map.
In particular, in the technical solution of the present application, since the pattern is highly complex and may present different feature representations and high-dimensional information at different image scales, the layer structure of each layer of the deep convolutional neural network is improved so that each layer has multiple receptive fields, in order to improve the richness and accuracy of the extracted pattern information. Specifically, each layer of the deep convolutional neural network has a multi-scale convolution structure: a group of convolution kernels of different sizes performs parallel feature extraction on the input data with different receptive fields to obtain feature maps at different feature scales, and these feature maps are concatenated to obtain a multi-scale feature map.
For the image of the reference pattern, the same convolutional neural network model with the multi-scale convolution structure is applied to obtain a reference pattern feature map. Then, in the high-dimensional feature domain, a difference feature map between the reference pattern feature map and the pattern feature map is calculated to represent, at the level of essential features, the difference between the burnt pattern of the sportswear to be detected and the designed burnt pattern, and the difference feature map is passed through a classifier to obtain a classification result indicating whether the pattern of the sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
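To make the overall flow concrete, the following Python sketch (using PyTorch) strings the stages together at inference time. It is only an illustration of the flow described above: the `backbone`, `correct` and `classifier` modules stand in for the multi-scale convolutional feature extractor, the position-based correction and the classifier sketched later in this description, and all names and the class-index convention are hypothetical.

```python
import torch
import torch.nn as nn

def inspect_burnt_pattern(pattern_image: torch.Tensor,
                          reference_image: torch.Tensor,
                          backbone: nn.Module,
                          correct: nn.Module,
                          classifier: nn.Module) -> bool:
    """Illustrative end-to-end inference flow for one image pair (batch size 1).

    `backbone`, `correct` and `classifier` are stand-ins for the multi-scale
    convolutional feature extractor, the position-based correction and the
    classifier sketched later; treating class index 0 as "meets the design
    requirements" is an assumption.
    """
    f1 = backbone(pattern_image)          # pattern feature map (high-dimensional domain)
    f2 = backbone(reference_image)        # reference pattern feature map (same extractor)
    f1c, f2c = correct(f1), correct(f2)   # position-based correction of both maps
    f_c = f1c - f2c                       # difference feature map (position-wise subtraction)
    probs = classifier(f_c)               # softmax output over the two classes
    return bool(probs.argmax(dim=1).item() == 0)
```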
In particular, although the convolutional neural network model with the multi-scale convolution structure can extract pixel-level semantic association features of the image at multiple scales, it is still desirable to further strengthen the global semantic expression in order to improve the classification effect.
Therefore, a position-proposal local inference transformation is applied to the pattern feature map and to the reference pattern feature map respectively, namely:
[formulas shown as images Figure BDA0003853554820000081 and Figure BDA0003853554820000091]
where M_1 and M_2 are the feature matrices, along the channel dimension, of the pattern feature map and of the reference pattern feature map respectively, each expressing image pixel semantics; Cov_1(·) and Cov_2(·) are each a single convolution layer; the mapping shown as Figure BDA0003853554820000092 maps two-dimensional position coordinates to one-dimensional values; and P_M is the (x, y) coordinate matrix of the matrix M.
The position-proposal local inference transformation uses position information as a proposal and infers global scene semantics through the local receptive field of a convolution layer, so that the captured local semantics are comprehensively fused and the global semantics are further derived, thereby realizing local-to-global migration of image semantics and inferential prediction of the global semantics. In this way, the accuracy of detecting the forming quality of the pattern-burning process is improved.
Based on this, the present application provides an intelligent production process for hip-up sports pants, which includes: acquiring an image of the pattern of the hip-up sports pants to be detected; passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map; extracting, from a database, an image of a reference pattern matching the pattern of the hip-up sports pants to be detected; passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map; correcting the feature value of each element in the pattern feature map based on the position information of each element in the pattern feature map to obtain a corrected pattern feature map; correcting the feature value of each element in the reference pattern feature map based on the position information of each element in the reference pattern feature map to obtain a corrected reference pattern feature map; calculating a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map; and passing the difference feature map through a classifier to obtain a classification result, wherein the classification result indicates whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
Fig. 1 illustrates a schematic scene diagram of an intelligent production process of hip-up sports pants according to an embodiment of the present application. As shown in Fig. 1, in this application scenario, an image of the pattern of the hip-up sports pants to be detected is first acquired by a camera (e.g., C in Fig. 1). The image is then input into a server (e.g., S in Fig. 1) on which the intelligent production process algorithm for hip-up sports pants is deployed, and the server processes the image with this algorithm to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
Having described the general principles of the present application, various non-limiting embodiments of the present application will now be described with reference to the accompanying drawings.
Exemplary method
Fig. 2 illustrates a flowchart of an intelligent production process of hip-up sports pants according to an embodiment of the present application. As shown in Fig. 2, the intelligent production process of hip-up sports pants according to the embodiment of the present application includes the following steps: S110, acquiring an image of the pattern of the hip-up sports pants to be detected; S120, passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map; S130, extracting, from a database, an image of a reference pattern matching the pattern of the hip-up sports pants to be detected; S140, passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map; S150, correcting the feature value of each element in the pattern feature map based on the position information of each element in the pattern feature map to obtain a corrected pattern feature map; S160, correcting the feature value of each element in the reference pattern feature map based on the position information of each element in the reference pattern feature map to obtain a corrected reference pattern feature map; S170, calculating a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map; and S180, passing the difference feature map through a classifier to obtain a classification result, wherein the classification result indicates whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
Fig. 3 illustrates an architecture diagram of an intelligent production process of hip-up sports pants according to an embodiment of the present application. As shown in Fig. 3, an image of the pattern of the hip-up sports pants to be detected is first acquired by a camera. The image is then passed through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map. Next, an image of a reference pattern matching the pattern of the hip-up sports pants to be detected is extracted from a database and passed through the same convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map. The feature values of the elements in the pattern feature map and in the reference pattern feature map are then corrected based on the position information of the respective elements to obtain a corrected pattern feature map and a corrected reference pattern feature map. Finally, a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map is calculated and passed through a classifier to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
In step S110, an image of the pattern of the hip-up sports pants to be detected is acquired. For the quality inspection of the pattern-burning process of sportswear, inspection can be performed by comparing the burnt pattern of the sportswear to be detected with the designed burnt pattern. The image of the pattern of the hip-up sports pants to be detected can be acquired by a camera.
In step S120, the image of the pattern of the hip-up sports pants to be detected is passed through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map. When the burnt pattern of the sportswear to be detected is captured, occlusion, ambient light and other influences introduce noise and interference, and differences in image scale between the captured image and the designed burnt pattern introduce scale discrepancies; therefore, calculating the difference between the burnt pattern of the sportswear to be detected and the designed burnt pattern directly in the image domain gives poor inspection results. If the comparison is instead performed in a high-dimensional feature space, the feature representation of the pattern is more abstract than in the image source domain, but interference present in the image source domain can be eliminated, so that the difference between the burnt pattern of the sportswear to be detected and the designed burnt pattern can be compared directly at the level of essential features, thereby improving the accuracy of quality inspection of the burnt pattern of the sportswear.
In one example, in the above intelligent production process of hip-up sports pants, passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map includes: using each layer of the convolutional neural network model to perform, on the input data in the forward pass of that layer: convolution of the input data with a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; convolution of the input data with a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; convolution of the input data with a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; convolution of the input data with a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; concatenation of the first, second, third and fourth convolution feature maps to obtain a multi-scale convolution feature map; pooling of the multi-scale convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the pattern feature map.
In particular, in one example, in the intelligent production process of the hip-up sports pants, the size of the first convolution kernel is 7 × 7, the size of the second convolution kernel is 5 × 5, the size of the third convolution kernel is 3 × 3, and the size of the fourth convolution kernel is 1 × 1.
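As an illustration of one such multi-scale layer, the following PyTorch sketch applies the four kernel sizes stated above in parallel, concatenates the results along the channel dimension, and then pools and activates them. The branch channel counts, the use of "same" padding, max pooling and ReLU, and the backbone depth are assumptions not fixed by the text.

```python
import torch
import torch.nn as nn

class MultiScaleConvLayer(nn.Module):
    """One layer of the multi-scale convolution structure described above.

    Four convolution kernels (7x7, 5x5, 3x3, 1x1) are applied to the same input
    in parallel, their outputs are concatenated along the channel dimension,
    then pooled and passed through a nonlinear activation. Channel counts and
    the choice of max pooling / ReLU are assumptions.
    """

    def __init__(self, in_channels: int, branch_channels: int = 16):
        super().__init__()
        # "Same" padding keeps the four branch outputs spatially aligned.
        self.conv7 = nn.Conv2d(in_channels, branch_channels, kernel_size=7, padding=3)
        self.conv5 = nn.Conv2d(in_channels, branch_channels, kernel_size=5, padding=2)
        self.conv3 = nn.Conv2d(in_channels, branch_channels, kernel_size=3, padding=1)
        self.conv1 = nn.Conv2d(in_channels, branch_channels, kernel_size=1, padding=0)
        self.pool = nn.MaxPool2d(kernel_size=2)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Parallel feature extraction with different receptive fields.
        f1, f2, f3, f4 = self.conv7(x), self.conv5(x), self.conv3(x), self.conv1(x)
        # Concatenate the four convolution feature maps into a multi-scale map.
        multi_scale = torch.cat([f1, f2, f3, f4], dim=1)
        # Pooling followed by nonlinear activation, as in the layer description.
        return self.act(self.pool(multi_scale))


# The feature extractor is then a stack of such layers; the depth is an assumption.
def build_backbone(in_channels: int = 3, depth: int = 4) -> nn.Sequential:
    layers, c = [], in_channels
    for _ in range(depth):
        layers.append(MultiScaleConvLayer(c, branch_channels=16))
        c = 16 * 4  # four concatenated branches
    return nn.Sequential(*layers)
```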
In step S130, an image of a reference pattern matching the pattern of the hip-up sports pants to be detected is extracted from a database. As noted above, quality inspection of the pattern-burning process of sportswear is performed by comparing the burnt pattern of the sportswear to be detected with the designed burnt pattern, so an image of a reference pattern matching the pattern of the sports pants to be detected is acquired. Here, the image of the reference pattern may be a CAD garment design drawing.
In step S140, the image of the reference pattern is passed through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map. The reference pattern feature map is obtained in the same way as the pattern feature map: the deep convolutional neural network model is used as a domain converter to map the image of the reference pattern from the source domain into the high-dimensional feature domain. In particular, in the technical solution of the present application, since the pattern is highly complex and may present different feature representations and high-dimensional information at different image scales, the layer structure of each layer of the deep convolutional neural network is improved so that each layer has multiple receptive fields, in order to improve the richness and accuracy of the extracted pattern information. Specifically, each layer of the deep convolutional neural network has a multi-scale convolution structure: a group of convolution kernels of different sizes performs parallel feature extraction on the input data with different receptive fields to obtain feature maps at different feature scales, and these feature maps are concatenated to obtain a multi-scale feature map.
In one example, in the above intelligent production process of hip-up sports pants, passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map includes: using each layer of the convolutional neural network model to perform, on the input data in the forward pass of that layer: convolution of the input data with a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; convolution of the input data with a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; convolution of the input data with a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; convolution of the input data with a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; concatenation of the first, second, third and fourth convolution feature maps to obtain a multi-scale convolution feature map; pooling of the multi-scale convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the reference pattern feature map, and the input of the first layer of the convolutional neural network model is the image of the reference pattern.
In step S150, the feature value of each element in the pattern feature map is corrected based on the position information of each element in the pattern feature map to obtain a corrected pattern feature map. In particular, in the technical solution of the present application, the convolutional neural network model with the multi-scale convolution structure can extract pixel-level semantic association features of the image at multiple scales, but it is still desirable to further strengthen the global semantic expression in order to improve the classification effect. Therefore, a position-proposal local inference transformation is applied to the pattern feature map. This transformation uses position information as a proposal and infers global scene semantics through the local receptive field of a convolution layer, so that the captured local semantics are comprehensively fused and the global semantics are further derived, thereby realizing local-to-global migration of image semantics and inferential prediction of the global semantics.
In one example, in the above intelligent production process of hip-up sports pants, correcting the feature value of each element in the pattern feature map based on the position information of each element in the pattern feature map to obtain a corrected pattern feature map includes: correcting, based on the position information of each element in the pattern feature map, the feature value of each element in the pattern feature map with the following formula to obtain the corrected pattern feature map; wherein the formula is:
[formula shown as image Figure BDA0003853554820000131]
where M_1 is each feature matrix of the pattern feature map along the channel dimension; Cov_1(·) and Cov_2(·) are each a single convolution layer; the mapping shown as Figure BDA0003853554820000132 maps two-dimensional position coordinates to one-dimensional values; and P_M is the coordinate matrix of each feature matrix of the pattern feature map along the channel dimension.
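Because the correction formula itself is published only as an image, the sketch below is a hypothetical reading of the position-proposal local inference transformation: it builds the coordinate matrix P_M for each channel-wise feature matrix M, maps the two-dimensional coordinates to one-dimensional values, passes the feature matrix and the mapped coordinates through the single convolution layers Cov_1 and Cov_2, and fuses the two results. The particular fusion used here (a sigmoid-gated residual) is an assumption, not the patent's formula.

```python
import torch
import torch.nn as nn

class PositionProposalCorrection(nn.Module):
    """Hypothetical sketch of the position-proposal local inference transformation.

    For each feature matrix M taken along the channel dimension of a feature map,
    an (x, y) coordinate matrix P_M is built, the two-dimensional coordinates are
    mapped to one-dimensional values, and both M and the mapped coordinates are
    passed through single convolution layers Cov_1 and Cov_2. How the two results
    are combined is shown only as an image in the published text, so the
    sigmoid-gated residual fusion below is an assumption.
    """

    def __init__(self, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2
        self.cov1 = nn.Conv2d(1, 1, kernel_size, padding=padding)  # Cov_1: single conv layer
        self.cov2 = nn.Conv2d(1, 1, kernel_size, padding=padding)  # Cov_2: single conv layer

    @staticmethod
    def _position_values(h: int, w: int, device: torch.device) -> torch.Tensor:
        # P_M mapped to one-dimensional values; the mapping (x, y) -> y * W + x,
        # normalized to [0, 1), is an assumed choice.
        ys, xs = torch.meshgrid(
            torch.arange(h, device=device), torch.arange(w, device=device), indexing="ij"
        )
        values = (ys * w + xs).float() / float(h * w)
        return values.view(1, 1, h, w)

    def forward(self, feature_map: torch.Tensor) -> torch.Tensor:
        # feature_map: (B, C, H, W); each channel is one feature matrix M.
        b, c, h, w = feature_map.shape
        pos = self._position_values(h, w, feature_map.device).expand(b * c, 1, h, w)
        m = feature_map.reshape(b * c, 1, h, w)
        # Local inference from the feature matrix and from the position proposal,
        # fused as an assumed gated residual correction of each element.
        corrected = m + torch.sigmoid(self.cov1(m) + self.cov2(pos)) * m
        return corrected.reshape(b, c, h, w)
```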
In step S160, the feature value of each element in the reference pattern feature map is corrected based on the position information of each element in the reference pattern feature map to obtain a corrected reference pattern feature map. In the same way, the position-proposal local inference transformation is applied to the reference pattern feature map to obtain the corrected reference pattern feature map. In this way, the accuracy of detecting the forming quality of the pattern-burning process is improved.
In one example, in the above intelligent production process of hip-up sports pants, correcting the feature value of each element in the reference pattern feature map based on the position information of each element in the reference pattern feature map to obtain a corrected reference pattern feature map includes: correcting, based on the position information of each element in the reference pattern feature map, the feature value of each element in the reference pattern feature map with the following formula to obtain the corrected reference pattern feature map; wherein the formula is:
[formula shown as image Figure BDA0003853554820000133]
where M_2 is each feature matrix of the reference pattern feature map along the channel dimension; Cov_1(·) and Cov_2(·) are each a single convolution layer; the mapping shown as Figure BDA0003853554820000134 maps two-dimensional position coordinates to one-dimensional values; and P_M is the coordinate matrix of each feature matrix of the reference pattern feature map along the channel dimension.
In step S170, a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map is calculated. In the high-dimensional feature domain, the difference between the burnt pattern of the sportswear to be detected and the designed burnt pattern can be represented by calculating the difference feature map between the reference pattern feature map and the pattern feature map.
In one example, in the above intelligent production process of hip-up sports pants, calculating a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map includes: calculating the difference feature map between the corrected pattern feature map and the corrected reference pattern feature map according to the following formula; wherein the formula is:
[formula shown as image Figure BDA0003853554820000141]
where F'_1 denotes the corrected pattern feature map, F'_2 denotes the corrected reference pattern feature map, and the operator shown as Figure BDA0003853554820000142 denotes position-wise subtraction.
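A minimal sketch of the position-wise difference F_c between the two corrected feature maps follows; taking a signed element-wise subtraction (rather than, say, an absolute difference) is an assumption.

```python
import torch

def difference_feature_map(f1: torch.Tensor, f2: torch.Tensor) -> torch.Tensor:
    """Position-wise difference F_c between the corrected pattern feature map F'_1
    and the corrected reference pattern feature map F'_2. Both tensors must have
    the same shape (B, C, H, W)."""
    if f1.shape != f2.shape:
        raise ValueError("corrected feature maps must have identical shapes")
    return f1 - f2
```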
In step S180, the difference feature map is passed through a classifier to obtain a classification result, wherein the classification result indicates whether the pattern of the sports pants to be detected, produced by the pattern-burning process, meets the design requirements.
In one example, in the above intelligent production process of hip-up sports pants, passing the difference feature map through a classifier to obtain a classification result includes: processing the difference feature map with the classifier according to the following formula to obtain the classification result;
wherein the formula is: O = Softmax{(W_n, B_n) : ⋯ : (W_1, B_1) | Project(F_c)}, where Project(F_c) denotes projecting the difference feature map into a vector, W_1 to W_n are the weight matrices of the fully connected layers of each layer, and B_1 to B_n are the bias matrices of the fully connected layers of each layer.
In summary, the intelligent production process of the hip-up sports pants according to the embodiment of the present application has been elucidated. It uses a deep convolutional neural network model as a domain converter to map an image of the pattern of the hip-up sports pants to be detected from the source domain into a high-dimensional feature domain to obtain a pattern feature map, and obtains a reference pattern feature map in the same manner. It then calculates a difference feature map between the reference pattern feature map and the pattern feature map to represent, at the level of essential features, the difference between the burnt pattern of the sports pants to be detected and the designed pattern, and passes the difference feature map through a classifier to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced through the pattern burning process, meets the design requirements. In this way, an intelligent scheme for the hip-up sports pants production process is constructed based on artificial intelligence technology, and the accuracy of quality inspection of the pattern burnt on the sportswear is improved.
Exemplary System
Fig. 4 illustrates a block diagram of a system of the intelligent production process of hip-up sports pants according to an embodiment of the present application. As shown in fig. 4, the system 100 for the intelligent production process of hip-up sports pants according to the embodiment of the present application includes: an image acquisition unit 110 for acquiring an image of the pattern of the hip-up sports pants to be detected; a pattern feature map generating unit 120 for passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern feature map; a reference image extracting unit 130 for extracting, from a database, an image of a reference pattern that matches the pattern of the hip-up sports pants to be detected; a reference pattern feature map generating unit 140 for passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern feature map; a corrected pattern feature map generating unit 150 for correcting the feature values of the elements in the pattern feature map based on the position information of those elements to obtain a corrected pattern feature map; a corrected reference pattern feature map generating unit 160 for correcting the feature values of the elements in the reference pattern feature map based on the position information of those elements to obtain a corrected reference pattern feature map; a difference unit 170 for calculating a difference feature map between the corrected pattern feature map and the corrected reference pattern feature map; and a result generating unit 180 for passing the difference feature map through a classifier to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced through the pattern burning process, meets the design requirements.
In one example, in the system of the intelligent production process of hip-up sports pants, the pattern feature map generating unit 120 is further configured to: use each layer of the convolutional neural network model to respectively perform the following operations on the input data in the forward pass of that layer: performing convolution processing on the input data by using a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; performing convolution processing on the input data by using a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; performing convolution processing on the input data by using a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; performing convolution processing on the input data by using a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; cascading the first convolution feature map, the second convolution feature map, the third convolution feature map and the fourth convolution feature map to obtain a multi-scale convolution feature map; pooling the multi-scale convolution feature map to obtain a pooled feature map; and performing nonlinear activation processing on the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the pattern feature map.
In one example, in the system for the intelligent production process of the hip-up sport pants, the size of the first convolution kernel is 7 × 7, the size of the second convolution kernel is 5 × 5, the size of the third convolution kernel is 3 × 3, and the size of the fourth convolution kernel is 1 × 1.
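As a sketch of how one layer of such a multi-scale convolution structure could look, the following PyTorch block applies the four parallel convolutions with 7×7, 5×5, 3×3 and 1×1 kernels, cascades (concatenates) their outputs along the channel dimension, and then applies pooling and a nonlinear activation, as described above. The number of channels per branch, the padding scheme, and the choice of max pooling and ReLU are assumptions rather than details taken from the patent.

import torch
import torch.nn as nn

class MultiScaleConvBlock(nn.Module):
    """One layer of the multi-scale convolution structure described above:
    parallel 7x7 / 5x5 / 3x3 / 1x1 convolutions, channel-wise cascading,
    pooling, and nonlinear activation (channel counts are assumed)."""

    def __init__(self, in_channels, out_channels_per_branch=16):
        super().__init__()
        def branch(kernel_size):
            return nn.Conv2d(in_channels, out_channels_per_branch,
                             kernel_size, padding=kernel_size // 2)
        self.conv7 = branch(7)   # first convolution kernel
        self.conv5 = branch(5)   # second convolution kernel
        self.conv3 = branch(3)   # third convolution kernel
        self.conv1 = branch(1)   # fourth convolution kernel
        self.pool = nn.MaxPool2d(kernel_size=2)
        self.act = nn.ReLU()

    def forward(self, x):
        # Cascade (concatenate) the four single-scale feature maps.
        multi_scale = torch.cat(
            [self.conv7(x), self.conv5(x), self.conv3(x), self.conv1(x)], dim=1)
        pooled = self.pool(multi_scale)       # pooled feature map
        return self.act(pooled)               # activated feature map

# Usage: stack several such blocks to obtain the pattern feature map.
if __name__ == "__main__":
    img = torch.randn(1, 3, 224, 224)          # image of the pattern
    block = MultiScaleConvBlock(in_channels=3)
    feat = block(img)
    print(feat.shape)                          # torch.Size([1, 64, 112, 112])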
In one example, in the system for the intelligent production process of hip-up sports pants, the reference pattern feature map generating unit 140 is further configured to: use each layer of the convolutional neural network model to respectively perform the following operations on the input data in the forward pass of that layer: performing convolution processing on the input data by using a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map; performing convolution processing on the input data by using a second convolution kernel of the multi-scale convolution structure to obtain a second convolution feature map; performing convolution processing on the input data by using a third convolution kernel of the multi-scale convolution structure to obtain a third convolution feature map; performing convolution processing on the input data by using a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map; cascading the first convolution feature map, the second convolution feature map, the third convolution feature map and the fourth convolution feature map to obtain a multi-scale convolution feature map; pooling the multi-scale convolution feature map to obtain a pooled feature map; and performing nonlinear activation processing on the pooled feature map to obtain an activation feature map; wherein the activation feature map output by the last layer of the convolutional neural network model is the reference pattern feature map, and the input of the first layer of the convolutional neural network model is the image of the reference pattern.
In one example, in the system of the intelligent production process of hip-up sports pants, the corrected pattern feature map generating unit 150 is further configured to: correct the feature value of each element in the pattern feature map by the following formula, based on the position information of each element in the pattern feature map, to obtain the corrected pattern feature map; wherein the formula is:
[the correction formula is given only as an image (BDA0003853554820000161) in the original publication]
wherein M_1 denotes each feature matrix of the pattern feature map along its channel dimension, Cov_1(·) and Cov_2(·) are each a single convolution layer, the position-mapping function in the formula maps two-dimensional position coordinates to one-dimensional values, and P_M denotes the coordinate matrix of the respective feature matrices along the channels of the pattern feature map.
In one example, in the system of the intelligent production process of hip-up sports pants, the corrected reference pattern feature map generating unit 160 is further configured to: correct the feature value of each element in the reference pattern feature map by the following formula, based on the position information of each element in the reference pattern feature map, to obtain the corrected reference pattern feature map; wherein the formula is:
[the correction formula is given only as an image (BDA0003853554820000163) in the original publication]
wherein M_2 denotes each feature matrix of the reference pattern feature map along its channel dimension, Cov_1(·) and Cov_2(·) are each a single convolution layer, the position-mapping function in the formula maps two-dimensional position coordinates to one-dimensional values, and P_M denotes the coordinate matrix of the respective feature matrices along the channels of the reference pattern feature map.
In one example, in the system of the intelligent production process of hip-up sports pants, the difference unit 170 is further configured to: calculate the difference feature map between the corrected pattern feature map and the corrected reference pattern feature map according to the following formula; wherein the formula is:
F_c = F'_1 ⊖ F'_2
wherein F'_1 represents the corrected pattern feature map, F'_2 represents the corrected reference pattern feature map, and ⊖ indicates that the difference is taken position by position.
In summary, the system 100 for the intelligent production process of the hip-up sports pants according to the embodiment of the present application has been illustrated. It uses a deep convolutional neural network model as a domain converter to map an image of the pattern of the hip-up sports pants to be detected from the source domain into a high-dimensional feature domain to obtain the pattern feature map, and obtains the reference pattern feature map in the same manner. It then calculates a difference feature map between the reference pattern feature map and the pattern feature map to represent, at the level of essential features, the difference between the burnt pattern of the sports pants to be detected and the designed pattern, and passes the difference feature map through a classifier to obtain a classification result indicating whether the pattern of the hip-up sports pants to be detected, produced through the pattern burning process, meets the design requirements. In this way, an intelligent scheme for the hip-up sports pants production process is constructed based on artificial intelligence technology, and the accuracy of quality inspection of the pattern burnt on the sportswear is improved.
As described above, the system 100 for the intelligent production process of the hip-up sports pants according to the embodiment of the present application may be implemented in various terminal devices, such as an intelligent instrument for the intelligent production process of the hip-up sports pants. In one example, the system 100 for the intelligent production process of hip-up sports pants according to the embodiment of the present application may be integrated into a terminal device as one software module and/or hardware module. For example, the system 100 for the intelligent hip-up sport pants manufacturing process may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the system 100 for the intelligent hip-up sport pants production process can also be one of the hardware modules of the terminal device.
Alternatively, in another example, the system 100 for the intelligent production process of the hip-up sport pants and the terminal device may be separate devices, and the system 100 for the intelligent production process of the hip-up sport pants may be connected to the terminal device through a wired and/or wireless network and transmit the interactive information according to an agreed data format.
Exemplary electronic device
Next, an electronic apparatus according to an embodiment of the present application is described with reference to fig. 5. FIG. 5 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
As shown in fig. 5, the electronic device 10 includes one or more processors 11 and memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
Memory 12 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory (cache). The non-volatile memory may include, for example, Read-Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 11 to implement the functions of the intelligent production process of hip-up sports pants of the various embodiments of the present application described above and/or other desired functions. Various contents, such as the image of the pattern of the hip-up sports pants to be detected, may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 13 may include, for example, a keyboard, a mouse, and the like.
The output device 14 can output various information, including the classification result, to the outside. The output devices 14 may include, for example, a display, speakers, a printer, a communication network and its connected remote output devices, and the like.
Of course, for simplicity, only some of the components of the electronic device 10 relevant to the present application are shown in fig. 5, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer-readable storage Medium
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps of the intelligent production process of hip-up sports pants according to the various embodiments of the present application described in the "exemplary methods" section above of this specification.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps of the intelligent production process of hip-up sports pants according to the various embodiments of the present application described in the "exemplary methods" section above of this specification.
The computer readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present application have been described above with reference to specific embodiments. However, it should be noted that the advantages, effects, and the like mentioned in the present application are merely examples and are not limiting, and they should not be considered as necessarily possessed by the various embodiments of the present application. Furthermore, the specific details disclosed above are provided only for purposes of illustration and ease of understanding, and are not intended to limit the application to embodiments that must adopt those specific details.
The block diagrams of devices, apparatuses, and systems referred to in this application are only illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and may be used interchangeably therewith. As used herein, the word "or" refers to, and is used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as, but not limited to."
It should also be noted that in the devices, apparatuses, and methods of the present application, each component or step can be decomposed and/or re-combined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (8)

1. An intelligent production process of hip-up sports pants is characterized by comprising the following steps:
acquiring an image of a pattern of the hip-up sports pants to be detected;
enabling the image of the pattern of the hip-up sports pants to be detected to pass through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern characteristic diagram;
extracting an image of a reference pattern matched with the pattern of the hip lifting sports pants to be detected from a database;
enabling the image of the reference pattern to pass through the convolutional neural network model with the multi-scale convolution structure to obtain a reference pattern characteristic diagram;
correcting the characteristic value of each element in the pattern characteristic diagram based on the position information of each element in the pattern characteristic diagram to obtain a corrected pattern characteristic diagram;
correcting the characteristic values of each element in the reference pattern characteristic diagram based on the position information of each element in the reference pattern characteristic diagram to obtain a corrected reference pattern characteristic diagram;
calculating a difference characteristic diagram between the corrected pattern characteristic diagram and the corrected reference pattern characteristic diagram; and
passing the difference characteristic diagram through a classifier to obtain a classification result, wherein the classification result indicates whether the pattern of the hip-up sports pants to be detected, which is manufactured through the pattern burning process, meets the design requirements.
2. The intelligent production process of hip-up sports pants according to claim 1, wherein passing the image of the pattern of the hip-up sports pants to be detected through a convolutional neural network model with a multi-scale convolution structure to obtain a pattern characteristic diagram comprises: using each layer of the convolutional neural network model to respectively perform the following steps on the input data in the forward pass of that layer:
performing convolution processing on the input data by using a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map;
performing convolution processing on the input data by using a second convolution kernel of the multi-scale convolution structure to obtain a second convolution characteristic diagram;
performing convolution processing on the input data by using a third convolution kernel of the multi-scale convolution structure to obtain a third convolution characteristic diagram;
performing convolution processing on the input data by using a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map;
cascading the first convolution feature map, the second convolution feature map, the third convolution feature map and the fourth convolution feature map to obtain a multi-scale convolution feature map;
pooling the multi-scale convolution characteristic map to obtain a pooled characteristic map; and
carrying out nonlinear activation processing on the pooled feature map to obtain an activated feature map;
and the activation characteristic graph output by the last layer of the convolutional neural network model is the pattern characteristic graph.
3. The intelligent hip-up athletic pant production process of claim 2, wherein the first convolution kernel is 7 x 7 in size, the second convolution kernel is 5 x 5 in size, the third convolution kernel is 3 x 3 in size, and the fourth convolution kernel is 1 x 1 in size.
4. The intelligent hip-up athletic pant production process of claim 1, wherein passing the image of the reference pattern through the convolutional neural network model with the multi-scale convolutional structure to obtain a reference pattern feature map comprises:
using each layer of the convolutional neural network model to respectively perform the following steps on input data in forward transmission of the layer:
performing convolution processing on the input data by using a first convolution kernel of the multi-scale convolution structure to obtain a first convolution feature map;
performing convolution processing on the input data by using a second convolution kernel of the multi-scale convolution structure to obtain a second convolution characteristic diagram;
performing convolution processing on the input data by using a third convolution kernel of the multi-scale convolution structure to obtain a third convolution characteristic diagram;
performing convolution processing on the input data by using a fourth convolution kernel of the multi-scale convolution structure to obtain a fourth convolution feature map;
cascading the first convolution feature map, the second convolution feature map, the third convolution feature map and the fourth convolution feature map to obtain a multi-scale convolution feature map;
pooling the multi-scale convolution characteristic map to obtain a pooled characteristic map; and
carrying out nonlinear activation processing on the pooled feature map to obtain an activated feature map;
the activation characteristic graph output by the last layer of the convolutional neural network model is the reference pattern type characteristic graph, and the input of the first layer of the convolutional neural network model is an image of the reference pattern type pattern.
5. The intelligent hip-up sports pants production process according to claim 1, wherein the step of correcting the characteristic values of the elements in the pattern feature map based on the position information of the elements in the pattern feature map to obtain a corrected pattern feature map comprises:
based on the position information of each element in the pattern characteristic diagram, correcting the characteristic value of each element in the pattern characteristic diagram by the following formula to obtain the corrected pattern characteristic diagram;
wherein the formula is
[the correction formula is given only as an image (FDA0003853554810000031) in the original publication]
wherein M_1 denotes each feature matrix of the pattern characteristic diagram along its channel dimension, Cov_1(·) and Cov_2(·) are each a single convolution layer, the position-mapping function in the formula maps two-dimensional position coordinates to one-dimensional values, and P_M denotes the coordinate matrix of the respective feature matrices along the channels of the pattern characteristic diagram.
6. The intelligent production process of the hip-up sports pants according to claim 1, wherein the step of correcting the characteristic values of the elements in the reference pattern characteristic diagram based on the position information of the elements in the reference pattern characteristic diagram to obtain a corrected reference pattern characteristic diagram comprises:
based on the position information of each element in the reference pattern characteristic diagram, correcting the characteristic value of each element in the reference pattern characteristic diagram by the following formula to obtain the corrected reference pattern characteristic diagram;
wherein the formula is:
[the correction formula is given only as an image (FDA0003853554810000033) in the original publication]
wherein M_2 denotes each feature matrix of the reference pattern characteristic diagram along its channel dimension, Cov_1(·) and Cov_2(·) are each a single convolution layer, the position-mapping function in the formula maps two-dimensional position coordinates to one-dimensional values, and P_M denotes the coordinate matrix of the respective feature matrices along the channels of the reference pattern characteristic diagram.
7. The intelligent hip-up sport pants production process according to claim 1, wherein said calculating a difference feature map between said corrected pattern feature map and said corrected reference pattern feature map comprises:
calculating the difference feature map between the corrected pattern feature map and the corrected reference pattern feature map according to the following formula;
wherein the formula is:
F_c = F'_1 ⊖ F'_2
wherein F'_1 represents the corrected pattern feature map, F'_2 represents the corrected reference pattern feature map, and ⊖ indicates that the difference is taken position by position.
8. The intelligent production process of the hip-up sports pants according to claim 1, wherein the step of passing the differential feature map through a classifier to obtain a classification result comprises the steps of:
processing the differential feature map by using the classifier according to the following formula to obtain the classification result;
wherein the formula is: o = softmax { (W) n ,B n ):…:(W 1 ,B 1 )|Project(F c ) In which Project (F) c ) Representing the projection of the difference profile as a vector, W 1 To W n As a weight matrix for all connected layers of each layer, B 1 To B n A bias matrix representing the layers of the fully connected layer.
CN202211141108.3A 2022-09-20 2022-09-20 Intelligent production process of hip-up sport pants Withdrawn CN115471480A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211141108.3A CN115471480A (en) 2022-09-20 2022-09-20 Intelligent production process of hip-up sport pants

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211141108.3A CN115471480A (en) 2022-09-20 2022-09-20 Intelligent production process of hip-up sport pants

Publications (1)

Publication Number Publication Date
CN115471480A true CN115471480A (en) 2022-12-13

Family

ID=84332470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211141108.3A Withdrawn CN115471480A (en) 2022-09-20 2022-09-20 Intelligent production process of hip-up sport pants

Country Status (1)

Country Link
CN (1) CN115471480A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051506A (en) * 2023-01-28 2023-05-02 东莞市言科新能源有限公司 Intelligent production system and method for polymer lithium ion battery
CN116536906A (en) * 2023-05-11 2023-08-04 杭州探观科技有限公司 Three-dimensional cutting method for underpants
CN116536906B (en) * 2023-05-11 2023-09-22 杭州探观科技有限公司 Forming quality detection method of three-dimensional cutting underpants

Similar Documents

Publication Publication Date Title
CN115471480A (en) Intelligent production process of hip-up sport pants
CN109376631B (en) Loop detection method and device based on neural network
Wu et al. Automatic fabric defect detection using a wide-and-light network
CN111832383B (en) Training method of gesture key point recognition model, gesture recognition method and device
CN106446933B (en) Multi-target detection method based on contextual information
US11977604B2 (en) Method, device and apparatus for recognizing, categorizing and searching for garment, and storage medium
CN112215201A (en) Method and device for evaluating face recognition model and classification model aiming at image
CN111667459B (en) Medical sign detection method, system, terminal and storage medium based on 3D variable convolution and time sequence feature fusion
WO2023108418A1 (en) Brain atlas construction and neural circuit detection method and related product
Jia et al. Learning to appreciate the aesthetic effects of clothing
CN115841594B (en) Attention mechanism-based coal gangue hyperspectral variable image domain data identification method
CN115861715A (en) Knowledge representation enhancement-based image target relation recognition algorithm
CN116091414A (en) Cardiovascular image recognition method and system based on deep learning
CN114118303B (en) Face key point detection method and device based on prior constraint
CN117238026A (en) Gesture reconstruction interactive behavior understanding method based on skeleton and image features
CN115223239A (en) Gesture recognition method and system, computer equipment and readable storage medium
JP7552287B2 (en) OBJECT DETECTION METHOD, OBJECT DETECTION DEVICE, AND COMPUTER PROGRAM
CN112861678A (en) Image identification method and device
CN117173154A (en) Online image detection system and method for glass bottle
CN117036933A (en) Intelligent crop pest control system and method
CN117079305A (en) Posture estimation method, posture estimation device, and computer-readable storage medium
CN115631176A (en) Art design teaching system and method
CN111753915B (en) Image processing device, method, equipment and medium
CN112560712B (en) Behavior recognition method, device and medium based on time enhancement graph convolutional network
CN115063598A (en) Key point detection method, neural network, device, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20221213