CN115221571A - Clothing pattern generation method and device, electronic equipment and storage medium

Clothing pattern generation method and device, electronic equipment and storage medium

Info

Publication number
CN115221571A
CN115221571A
Authority
CN
China
Prior art keywords
clothing
target
pattern
information
printed
Prior art date
Legal status
Pending
Application number
CN202210680226.5A
Other languages
Chinese (zh)
Inventor
卜祥技
方思宇
Current Assignee
Shenzhen Yijia Technology Co ltd
Original Assignee
Shenzhen Yijia Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Yijia Technology Co ltd filed Critical Shenzhen Yijia Technology Co ltd
Priority to CN202210680226.5A
Publication of CN115221571A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/10 - Geometric CAD
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00 - Details relating to CAD techniques
    • G06F2111/20 - Configuration CAD, e.g. designing by assembling or positioning modules selected from libraries of predesigned modules
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Computational Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application is applicable to the technical field of data processing, and provides a clothing pattern generation method, a device, electronic equipment and a storage medium, wherein the method comprises the following steps: acquiring a garment picture to be printed; determining target clothing design information, target human body characteristic information and target fabric characteristic information according to the clothing picture to be printed; determining a target pattern model according to the target garment design information; determining a pattern adjustment parameter according to the target garment design information, the target human body characteristic information, the target fabric characteristic information and a preset pattern parameter knowledge graph; and generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjusting parameters. The embodiment of the application can generate the clothing pattern efficiently and accurately.

Description

Clothing pattern generation method and device, electronic equipment and storage medium
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a clothing pattern generation method and device, electronic equipment and a storage medium.
Background
With the increasing diversification of user demands for clothing, the trends toward small batches, multiple varieties, fashion and individuation are more and more obvious, and small-order, quick-response flexible production has gradually become the mainstream production mode in the garment industry. Under this production mode, the demand for rapid generation of clothing patterns is becoming increasingly strong.
The clothing pattern, also called a sloper or clothing template, is the most concrete form of presentation of the clothing structure, and making the pattern is the most important link in the production process. At present, clothing patterns are usually drafted manually by professional pattern makers, and even a skilled pattern maker takes hours to draft a single pattern. The existing clothing pattern generation process is therefore inefficient and costly in labor, and because the quality of the pattern depends heavily on the skill of the pattern maker, the accuracy of the generated pattern is difficult to guarantee.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for generating a clothing pattern, an electronic device, and a storage medium, so as to solve the problem in the prior art of how to efficiently and accurately generate a clothing pattern.
A first aspect of an embodiment of the present application provides a clothing pattern generating method, including:
acquiring a garment picture to be printed;
determining target garment design information, target human body characteristic information and target fabric characteristic information according to the garment picture to be printed;
determining a target pattern model according to the target garment design information;
determining a pattern adjustment parameter according to the target garment design information, the target human body characteristic information, the target fabric characteristic information and a preset pattern parameter knowledge graph;
and generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjusting parameters.
Optionally, the determining target garment design information, target human body characteristic information and target fabric characteristic information according to the garment picture to be printed includes:
processing the clothing picture to be printed by a preset clothing recognition algorithm to obtain the target clothing design information;
processing the clothing picture to be printed by a preset target human body feature recognition algorithm to obtain target human body feature information;
and processing the clothing picture to be printed by a preset fabric feature recognition algorithm to obtain the target fabric feature information.
Optionally, the target garment design information includes garment style information, fit information, and garment part modeling information, the garment identification algorithm includes a garment style identification algorithm, a garment part detection algorithm, and a part modeling identification algorithm, and the processing of the garment picture to be printed by using a preset garment identification algorithm to obtain the target garment design information includes:
processing the clothing picture to be printed by the clothing style identification algorithm to obtain clothing style information and fit information;
processing the clothing picture to be printed by the clothing component detection algorithm to obtain the position information of each clothing component in the clothing picture to be printed;
and for each clothing part, determining part modeling information corresponding to the clothing part according to the position information of the clothing part and the part modeling identification algorithm corresponding to the clothing part.
Optionally, the determining the target pattern model according to the target garment design information includes:
determining a target body pattern model from a preset body pattern library according to the clothing style information and the fit information;
and determining a target component pattern model from a preset component pattern library according to the component modeling information corresponding to the clothing component.
Optionally, the fabric characteristic information comprises any one or more of fabric type, thickness, softness, elasticity, stiffness, glossiness and drapability.
Optionally, the pattern adjustment parameters include a human body size adjustment parameter, a release amount adjustment parameter, and a key point adjustment parameter for adjusting the garment shape.
Optionally, the generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjustment parameter includes:
if the personalized configuration information input by the user is obtained, generating target personalized parameters according to the personalized configuration information and the pattern adjustment parameters; the personalized configuration information comprises human body size modification information, loosening modification information and fabric modification information;
and generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the target personalized parameters.
A second aspect of an embodiment of the present application provides a clothing pattern generating apparatus, including:
the picture acquisition unit is used for acquiring a garment picture to be printed;
the identification unit is used for determining target garment design information, target human body characteristic information and target fabric characteristic information according to the garment picture to be printed;
the pattern model determining unit is used for determining a target pattern model according to the target garment design information;
the pattern adjustment parameter determining unit is used for determining pattern adjustment parameters according to the target garment design information, the target human body characteristic information, the target fabric characteristic information and a preset pattern parameter knowledge graph;
and the pattern generating unit is used for generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjusting parameters.
A third aspect of embodiments of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the electronic device is enabled to implement the steps of the clothing pattern generation method.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program, which, when executed by a processor, causes an electronic device to implement the steps of the clothing pattern generation method as described.
A fifth aspect of embodiments of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the clothing pattern generating method according to any one of the first aspect.
Compared with the prior art, the embodiment of the application has the advantages that: in the embodiment of the application, a clothing picture to be printed is obtained, and the current target clothing design information, the target human body characteristic information and the target fabric characteristic information are determined according to the clothing picture to be printed. And then, according to the target garment design information, determining a target paper pattern model of the current garment picture to be printed, and according to the target garment design information, the target human body characteristic information, the target fabric characteristic information and a preset paper pattern parameter knowledge graph, determining paper pattern adjustment parameters. And finally, generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjusting parameters. Because the basic design structure of the clothing to be printed can be embodied by the target clothing design information of the clothing picture to be printed, the basic target pattern model can be automatically and accurately determined according to the target clothing design information; on the basis, the detailed influence of the clothing design, the human body characteristics and the fabric on the pattern is further comprehensively considered, and quantized pattern adjustment parameters corresponding to qualitative characteristic information such as target clothing design information, target human body characteristic information and target fabric characteristic information of the clothing picture to be printed are determined by using a preset pattern parameter knowledge graph, and the pattern adjustment parameters can accurately adjust the details of the target pattern model. In other words, by the method of the embodiment of the application, the target pattern model can be accurately determined without the participation of professional pattern technicians, and further, the detail adjustment can be accurately performed by combining quantized pattern adjustment parameters, so that the automatic generation of the clothing pattern can be efficiently and accurately realized on the premise of saving labor cost.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the embodiments or the description of the prior art will be briefly described below.
Fig. 1 is a schematic flow chart illustrating an implementation process of a clothing pattern generating method according to an embodiment of the present application;
fig. 2 is a schematic view of a clothing pattern generating device provided in an embodiment of the present application;
fig. 3 is a schematic diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Traditional clothing pattern generation usually relies on professional pattern makers drafting patterns by hand, which is inefficient, labor-intensive, and makes accuracy difficult to guarantee.
In order to improve the generation efficiency of the clothing pattern, the following two automatic printing methods are considered:
1) Parameterized pattern making: the paper pattern is expressed as formulas and parameters; when it is used, the corresponding size information is input and the paper pattern is generated automatically.
2) Automatic grading ("code setting"): a base paper pattern library and a grading rule model are established; when a pattern is made, the corresponding sizes and adjustment parameters are input, and the system adjusts the base paper pattern according to the rules output by the rule model.
Both of these methods offer some degree of automation, but a garment is usually made up of multiple garment components, and both methods require an operator to manually select the correct pattern component or formula from a vast database. In actual operation, the operator usually needs to refer to a standard clothing design picture to find the corresponding pattern in the database, and even with graphical-interface software the operator still needs a certain pattern-making background to select the correct pattern component from the clothing design picture. That is, although the two automatic methods can improve pattern-making speed to some extent, they still require a professional pattern operator and therefore still incur a certain labor cost.
Therefore, in order to solve the problem of how to efficiently and accurately generate the clothing pattern, the clothing pattern generating method provided by the embodiment of the application is provided based on the automatic printing method and further considering the requirements of intelligent printing in the whole process without depending on a paper pattern master. The clothing pattern generating method comprises the following steps: acquiring a garment picture to be printed; determining target clothing design information, target human body characteristic information and target fabric characteristic information according to the clothing picture to be printed; determining a target pattern model according to the target garment design information; determining a pattern adjustment parameter according to the target garment design information, the target human body characteristic information, the target fabric characteristic information and a preset pattern parameter knowledge graph; and generating a target paper pattern corresponding to the clothing picture to be printed according to the target paper pattern model and the paper pattern adjusting parameters.
Because the basic design structure of the clothing to be printed can be embodied by the target clothing design information of the clothing picture to be printed, the basic target pattern model can be automatically and accurately determined according to the target clothing design information; on the basis, the detailed influence of the clothing design, the human body characteristics and the fabric on the pattern is further comprehensively considered, and quantized pattern adjustment parameters corresponding to qualitative characteristic information such as target clothing design information, target human body characteristic information and target fabric characteristic information of the clothing picture to be printed are determined by using a preset pattern parameter knowledge graph, and the pattern adjustment parameters can accurately adjust the details of the target pattern model. Namely, the method of the embodiment of the application can efficiently and accurately realize the intelligent printing of the whole process from the picture to the pattern, and the whole process does not need the participation of professional paper stylists, so that the generation efficiency of the clothing pattern can be improved on the premise of saving labor cost.
The first embodiment is as follows:
fig. 1 shows a schematic flow chart of a clothing pattern generation method provided in an embodiment of the present application, where an execution subject of the clothing pattern generation method is an electronic device, and details are as follows:
in S101, a garment picture to be printed is obtained.
In the embodiment of the application, the clothing picture to be printed can be a standard clothing design drawing (a design line drawing) or a photograph of a person wearing the clothing to be printed (hereinafter referred to as a clothing shot picture for short). That is, the clothing picture to be printed in the embodiment of the application can be any form of picture containing complete information of the clothing to be printed; unlike traditional pattern making, it is not strictly limited to a clothing design drawing.
In one embodiment, the electronic device of the embodiment of the application obtains a clothing design drawing or a clothing shot picture uploaded by a user as the clothing picture to be printed. In another embodiment, the electronic device acquires picture website information input by a user and obtains the clothing picture to be printed from the specified website according to that information. In yet another embodiment, the electronic device photographs a person wearing the clothing to be printed, through its own camera or a connected camera, to obtain the clothing picture to be printed.
In S102, target clothing design information, target human body characteristic information and target fabric characteristic information are determined according to the clothing picture to be printed.
In the embodiment of the application, after the clothing picture to be printed is obtained, the clothing picture to be printed is identified, and target clothing design information, target human body characteristic information and target fabric characteristic information of the clothing to be printed are obtained.
The target garment design information is information indicating a garment design structure of a garment to be patterned, and may generally include any one or more items of garment style information, fit information, and garment component model information of each garment component. The target human body characteristic information is information which is carried in the clothing picture to be printed and is related to the human body characteristic; in general, the target human characteristic information may include any one or more of body type, age group, sex, and the like. When the garment picture to be printed is a garment shot picture, the target human body characteristic information represents the human body characteristic information of a person wearing the garment picture to be printed, so that when the garment picture to be printed is the garment shot picture, the target human body characteristic information can be acquired more accurately, and the accuracy of generating a garment pattern is further improved.
The target fabric characteristic information of the embodiment of the application is the fabric information of the clothing to be printed that is identified from the clothing picture to be printed. In one embodiment, the target fabric characteristic information includes any one or more of fabric type, thickness, softness, elasticity, stiffness, gloss and drape. The fabric type may include cotton, linen, silk, wool, leather and the like; thickness may be divided into thin, thick and very thick; softness may be divided into soft, medium and hard; elasticity may be divided into no elasticity, slight elasticity, medium elasticity and high elasticity; stiffness may be divided into poor, medium, good and so on. By identifying one or more items of fabric characteristic information describing the details of the clothing to be printed, the accuracy of the generated clothing pattern can be further improved.
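For illustration only, the three kinds of recognized information described above could be organized into simple data structures such as the following Python sketch; the field names and enumerated values are assumptions drawn from the examples in this description rather than a prescribed schema.

from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class GarmentDesignInfo:
    style: str                      # e.g. "shirt", "down jacket"
    fit: str                        # e.g. "tight", "fit", "loose", "oversized"
    # component type -> modeling description, e.g. {"collar": "stand collar"}
    component_modeling: Dict[str, str] = field(default_factory=dict)

@dataclass
class HumanFeatureInfo:
    body_type: str                  # e.g. "Y", "A", "B", "C"
    age_group: str                  # e.g. "young", "middle-aged"
    gender: str                     # e.g. "male", "female", "neutral"

@dataclass
class FabricFeatureInfo:
    fabric_type: Optional[str] = None   # e.g. "cotton", "silk"
    thickness: Optional[str] = None     # "thin" / "thick" / "very thick"
    softness: Optional[str] = None
    elasticity: Optional[str] = None    # "none" / "slight" / "medium" / "high"
    stiffness: Optional[str] = None
    gloss: Optional[str] = None
    drape: Optional[str] = None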
In S103, a target pattern model is determined according to the target dress design information.
In the embodiment of the application, after the target clothing design information is determined, according to the target clothing design information, a paper pattern model matched with the target clothing design information can be automatically obtained from a preset paper pattern model base to serve as the target paper pattern model. The pattern model library stores various pattern models carrying various garment design information labels in advance, and each pattern model can store related formulas, rules and/or basic graphs for generating patterns of the type of garments.
In S104, determining a pattern adjustment parameter according to the target garment design information, the target human body characteristic information, the target fabric characteristic information and a preset pattern parameter knowledge graph.
In the embodiment of the present application, the pattern adjustment parameters are quantified parameters (i.e., parameters with specific numerical values) used to adjust the basic target pattern model. In general, the pattern adjustment parameters may include: a human body size adjustment parameter for adjusting the human body size to which the pattern applies, and/or a release amount adjustment parameter for adjusting the release amount (ease) of the pattern. The preset pattern parameter knowledge graph is a pre-constructed, network-structured knowledge base covering garment design information, human body characteristic information, fabric characteristic information and pattern adjustment parameters, and it represents the coupling relations among these four kinds of information.
This step and step S103 may be performed simultaneously, or in either order. After the information that qualitatively represents the characteristics of the clothing to be printed, namely the target clothing design information, the target human body characteristic information and the target fabric characteristic information, has been determined, the corresponding quantified pattern adjustment parameters can be obtained by indexing the preset pattern parameter knowledge graph with these three kinds of qualitative characteristic information.
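As a minimal sketch of this indexing step, the pattern parameter knowledge graph could be flattened into keyed relations and queried with the recognized qualitative features; the representation and the numeric values below are illustrative assumptions, not the actual graph or values of this application.

from typing import Dict, Tuple

# Hypothetical knowledge "graph" flattened into keyed relations:
# (fit, elasticity, thickness) -> chest ease in cm. Values are illustrative only.
EASE_RELATIONS: Dict[Tuple[str, str, str], float] = {
    ("loose", "none", "thick"): 16.0,
    ("loose", "high", "thin"): 8.0,
    ("fit", "none", "thick"): 10.0,
    ("fit", "high", "thin"): 4.0,
}

def lookup_ease(fit: str, elasticity: str, thickness: str, default: float = 10.0) -> float:
    """Index the qualitative features into a quantified ease adjustment."""
    return EASE_RELATIONS.get((fit, elasticity, thickness), default)

pattern_adjustment = {
    "chest_ease_cm": lookup_ease("loose", "high", "thin"),
    # body-size and key-point adjustments would be resolved the same way
}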
And in S105, generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjusting parameters.
After the pattern adjustment parameters are determined, they may be input into the target pattern model determined in step S103, and the target pattern model may call a preset drawing software interface to draw the target pattern according to the pattern adjustment parameters. In one embodiment, a pattern in DWG or DXF format is output as the target pattern. DWG is a proprietary file format used by the computer-aided design software AutoCAD and AutoCAD-based software to store design data, and DXF is a CAD data file format used to exchange CAD data between AutoCAD and other software.
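As one possible realization of the drawing step, a pattern piece expressed as a sequence of 2D outline points could be written to a DXF file with the open-source ezdxf package, as in the sketch below; this assumes the pattern model ultimately yields point sequences and is not the specific drawing interface referred to above.

import ezdxf  # open-source DXF library, assumed here as the drawing backend

def export_pattern_piece(points, path="target_pattern.dxf"):
    """Write one pattern piece outline (list of (x, y) points in cm) to a DXF file."""
    doc = ezdxf.new(dxfversion="R2010")
    msp = doc.modelspace()
    msp.add_lwpolyline(points)          # outline of the piece
    doc.saveas(path)

# Illustrative rectangle standing in for a real pattern outline.
export_pattern_piece([(0, 0), (50, 0), (50, 70), (0, 70), (0, 0)])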
In the embodiment of the application, the basic design structure of the clothing to be printed can be embodied by the target clothing design information of the clothing picture to be printed, so that the basic target paper pattern model can be automatically and accurately determined according to the target clothing design information; on the basis, the detailed influence of the clothing design, the human body characteristics and the fabric on the pattern is further comprehensively considered, and quantized pattern adjustment parameters corresponding to qualitative characteristic information such as target clothing design information, target human body characteristic information and target fabric characteristic information of the clothing picture to be printed are determined by using a preset pattern parameter knowledge graph, and the pattern adjustment parameters can accurately adjust the details of the target pattern model. Namely, by the method of the embodiment of the application, the target pattern model can be accurately determined without the participation of professional pattern technicians, and further, the detail adjustment can be accurately carried out by combining quantized pattern adjustment parameters, so that the automatic generation of the clothing pattern can be efficiently and accurately realized on the premise of saving labor cost.
Optionally, the determining target garment design information, target human body characteristic information and target fabric characteristic information according to the garment picture to be printed includes:
processing the clothing picture to be printed by a preset clothing recognition algorithm to obtain the target clothing design information;
processing the clothing picture to be printed by a preset target human body feature recognition algorithm to obtain target human body feature information;
and processing the clothing picture to be printed by a preset fabric feature recognition algorithm to obtain the target fabric feature information.
In the embodiment of the application, the preset clothing recognition algorithm can be realized based on a clothing recognition network model, and the clothing recognition network model can be a neural network model obtained by training in advance by taking a clothing picture carrying a clothing design information label as sample data. And inputting the current clothing picture to be printed into the trained clothing recognition network model for processing, so as to obtain the current target clothing design information.
In the embodiment of the application, the preset target human body feature recognition algorithm may be a preset multi-label classification algorithm for recognizing multi-dimensional information such as body type, age group and gender from a clothing picture; that is, the multi-label classification algorithm combines the multi-dimensional human body feature recognition tasks into one network and directly outputs the multi-dimensional recognition results. For convenience of description, this multi-label classification algorithm is referred to as the human body feature multi-label classification model. Model training is performed with clothing pictures carrying the three kinds of label information (body type, age group and gender) as sample data, giving the trained human body feature multi-label classification model. In this step, the currently acquired clothing picture to be printed is input into the trained human body feature multi-label classification model for processing, and the body type, age group, gender and other characteristic information of the human body is obtained as the target human body characteristic information. Human body types may be divided into type A, type B, type C and type Y; age groups may be divided into teenagers, young people, middle-aged people and the elderly; gender may be classified as male, female or neutral. Furthermore, because the output target human body characteristic information contains only the body type, age group and gender and no specific height information, the subsequent pattern generation is not limited to producing a pattern of the same size as the garment in the current clothing picture to be printed: the sizes for all heights, or for a specified height, can be flexibly selected from a standard human body type library, thereby realizing automatic grading (code setting) of the pattern. Illustratively, the height may range from 160 cm to 195 cm, with one size every 5 cm. The standard human body type library generally contains the following size characteristic information: height, cervical vertebra height, sitting cervical vertebra height, full arm length, waist height, chest circumference, neck circumference, total shoulder width, waist circumference, hip circumference, thigh length, thigh circumference, and so on.
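A minimal sketch of how the multi-label outputs might be decoded into body type, age group and gender labels, assuming one sigmoid score per label and taking the highest score within each label group; the grouping logic and example scores are assumptions.

import numpy as np

# Label groups from the description; the network is assumed to emit one sigmoid
# score per label, and the highest score within each group is taken.
LABEL_GROUPS = {
    "body_type": ["A", "B", "C", "Y"],
    "age_group": ["teenager", "young", "middle-aged", "elderly"],
    "gender":    ["male", "female", "neutral"],
}

def decode_human_features(scores: np.ndarray) -> dict:
    """scores: sigmoid outputs, concatenated in the group order above."""
    result, offset = {}, 0
    for group, labels in LABEL_GROUPS.items():
        group_scores = scores[offset:offset + len(labels)]
        result[group] = labels[int(np.argmax(group_scores))]
        offset += len(labels)
    return result

print(decode_human_features(np.array([0.1, 0.8, 0.2, 0.1,   # body type
                                       0.2, 0.7, 0.3, 0.1,   # age group
                                       0.9, 0.1, 0.2])))     # gender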
Similarly, the fabric feature identification algorithm in the embodiment of the present application may be a preset multi-label classification algorithm, which is simply referred to as a fabric feature multi-label classification model and is used for identifying the fabric features of multiple dimensions of the fabric type, thickness, softness, elasticity, stiffness, glossiness, drapability, and the like of the garment from the garment picture. In one embodiment, the 7 tags of fabric type, thickness, softness, elasticity, stiffness, gloss, drape are identified for each garment picture at the time the dataset is created; and then performing model training by taking the clothing pictures in the data set as sample data to obtain a trained fabric feature multi-label classification model. In the step, the currently acquired garment picture to be printed is input into the trained fabric characteristic multi-label classification model for processing, and current target fabric characteristic information is obtained.
Exemplarily, the backbone network of the multi-label classification algorithm adopts a structure combining a depthwise separable convolution mechanism with an attention mechanism, so that the model maintains recognition accuracy with a smaller number of parameters; this gives the algorithm a faster inference speed while still fully extracting the image features. Finally, a sigmoid activation function in the network determines each classification result.
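The following PyTorch sketch shows one way such a backbone block combining depthwise separable convolution with a lightweight channel-attention mechanism might look, together with a sigmoid multi-label head; the attention variant (squeeze-and-excitation style) and the layer sizes are assumptions, since the application does not specify them.

import torch
import torch.nn as nn

class DSConvAttnBlock(nn.Module):
    """Depthwise separable convolution followed by channel attention (illustrative)."""
    def __init__(self, in_ch: int, out_ch: int, reduction: int = 4):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)
        # Squeeze-and-excitation style channel attention (assumed variant).
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch // reduction, out_ch, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = self.act(self.bn(self.pointwise(self.depthwise(x))))
        return x * self.attn(x)

# Multi-label head: one sigmoid score per label, e.g. 7 fabric labels.
head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 7), nn.Sigmoid())
features = DSConvAttnBlock(3, 64)(torch.randn(1, 3, 224, 224))
scores = head(features)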
In the embodiment of the application, the target garment design information, the target human body characteristic information and the target fabric characteristic information of the garment to be printed are not simultaneously identified by directly using one identification algorithm, but the target garment design information, the target human body characteristic information and the target fabric characteristic information are respectively identified by using three preset algorithms, namely a garment identification algorithm, a target human body characteristic identification algorithm and a fabric characteristic identification algorithm, so that the identification complexity can be reduced, and the characteristic identification efficiency can be improved.
Optionally, the target garment design information includes garment style information, fit information, and garment part modeling information, the garment identification algorithm includes a garment style identification algorithm, a garment part detection algorithm, and a part modeling identification algorithm, and the processing of the garment picture to be printed by using a preset garment identification algorithm to obtain the target garment design information includes:
processing the clothing picture to be printed by the clothing style identification algorithm to obtain clothing style information and fit information;
processing the clothing picture to be printed by the clothing component detection algorithm to obtain the position information of each clothing component in the clothing picture to be printed;
and for each clothing part, determining part modeling information corresponding to the clothing part according to the position information of the clothing part and the part modeling identification algorithm corresponding to the clothing part.
In the embodiment of the application, the target garment design information specifically includes garment style information, fit information and garment component modeling information, and the garment identification algorithm is correspondingly split into a garment style identification algorithm, a garment component detection algorithm and a component modeling identification algorithm, rather than being a single general image segmentation algorithm.
Specifically, clothing style identification is carried out on the clothing picture to be printed through the clothing style identification algorithm, so that the clothing style information and the fit information are obtained. In the embodiment of the present application, the garment style information may be any one of the following 21 garment styles: sweaters, T-shirts, vests, shirts, polo shirts (also known as tennis shirts or golf shirts), knitwear, sweaters, braces, jackets, coats, weatherdresses, down jackets, suits, coats, pants, shorts, half-dresses, jumpsuits, swimsuits, pajamas. Fit information can be divided into: tight, fit, loose, oversized, etc. In one embodiment, the clothing style identification algorithm is implemented based on a clothing style identification model, which may be a neural network model trained in advance with clothing pictures carrying clothing style information labels and fit labels as sample data.
In the embodiment of the application, the clothing component detection algorithm is a target detection model which is trained in advance and takes clothing components in clothing pictures as detection targets. And inputting the clothing picture to be printed into the target detection model for processing, so as to obtain the type and position information of each clothing component in the clothing picture to be printed. The garment components may include collars, sleeves, caps, pockets, waistbands, hems, trim components, and the like.
Illustratively, the object detection model of the embodiment of the present application is composed of a backbone network (backbone) and a detection head (header). The backbone part adopts a convolutional neural network with a residual structure, and enhances the algorithm's ability to extract clothing features by combining an attention mechanism with skip connections. The header part adopts a feature pyramid structure to extract multi-layer features, extracting more position information while preserving semantic information; the classification branch adopts an improved loss function (generated loss) to raise accuracy, and the regression branch uses an anchor-free mode instead of an anchor-based mode to improve running speed. The loss function for network training may use GIoU Loss (a regression loss function for object detection). During training of the object detection model, data augmentation may be applied to improve algorithm accuracy, for example a method in which four images are stitched into one image in a 2x2 layout with a variable stitching center, and a method in which two images are overlaid and mixed in a certain channel proportion.
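For reference, the GIoU loss mentioned above can be written for axis-aligned boxes in (x1, y1, x2, y2) form as in the following sketch; this is the standard formulation rather than the application's own implementation.

import torch

def giou_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Generalized IoU loss for boxes given as (x1, y1, x2, y2); shapes (N, 4)."""
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)

    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    union = area_p + area_t - inter
    iou = inter / union.clamp(min=1e-7)

    # Smallest enclosing box.
    ex1 = torch.min(pred[:, 0], target[:, 0])
    ey1 = torch.min(pred[:, 1], target[:, 1])
    ex2 = torch.max(pred[:, 2], target[:, 2])
    ey2 = torch.max(pred[:, 3], target[:, 3])
    enclose = ((ex2 - ex1) * (ey2 - ey1)).clamp(min=1e-7)

    giou = iou - (enclose - union) / enclose
    return (1.0 - giou).mean()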
After the types and positions of all the clothing components in the clothing picture to be printed have been detected by the clothing component detection algorithm, for each detected clothing component a local picture corresponding to that component is cropped from the clothing picture to be printed according to its detected position; a component modeling recognition algorithm is selected according to the component type; and the local picture is then processed with that component modeling recognition algorithm to obtain the component modeling information for the component. The component modeling recognition algorithm can also be implemented with the multi-label classification algorithm described above, i.e. as a component modeling multi-label classification model; model training with clothing pictures carrying component modeling labels as sample data gives the trained component modeling multi-label classification model.
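The hand-off from component detection to component modeling recognition could look roughly like the sketch below, where detect_components and the per-component classifiers are placeholders standing in for the trained models described above; their names and signatures are assumptions.

from typing import Callable, Dict, List, Tuple

import numpy as np

Box = Tuple[int, int, int, int]  # x1, y1, x2, y2 in pixel coordinates

def recognize_component_modeling(
    image: np.ndarray,
    detect_components: Callable[[np.ndarray], List[Tuple[str, Box]]],
    modeling_classifiers: Dict[str, Callable[[np.ndarray], str]],
) -> Dict[str, str]:
    """Detect garment components, crop each one, then run its modeling classifier."""
    results = {}
    for component_type, (x1, y1, x2, y2) in detect_components(image):
        crop = image[y1:y2, x1:x2]
        classifier = modeling_classifiers.get(component_type)
        if classifier is not None:
            results[component_type] = classifier(crop)   # e.g. "collar" -> "stand collar"
    return results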
In one embodiment, in order to improve the speed of data interaction between the clothing component detection algorithm and the component modeling recognition algorithm, the multiple algorithm models may be fused and run in parallel using a multithreaded parallel acceleration technique based on Compute Unified Device Architecture (CUDA) cores.
In the embodiment of the application, the clothing style information and the fit information are identified by the clothing style identification algorithm, and the component modeling information of each clothing component is identified by the clothing component detection algorithm together with the component modeling identification algorithm. Compared with the existing approach of realizing clothing identification with a single image segmentation algorithm (i.e., adding a multi-label classification algorithm on top of an image segmentation algorithm to judge style and component modeling), this approach greatly speeds up data annotation, and the separately executed but cooperating identification and detection algorithms can also greatly improve identification accuracy.
Optionally, the determining the target pattern model according to the target garment design information includes:
determining a target clothes body pattern model from a preset clothes body pattern library according to the clothes style information and the fit information;
and determining a target component pattern model from a preset component pattern library according to the component modeling information corresponding to the clothing component.
In the embodiment of the application, parameterized paper pattern models are stored in a preset paper pattern model library, including a body paper pattern library storing various body paper pattern models and a part paper pattern library storing various part paper pattern models. In some embodiments, the preset paper pattern model library further stores uniformly defined interface parameters, the interface parameters include a front arm hole line, a rear arm hole line, a front collar line, a rear collar line, a waist curve and a splicing line, and the connection of the various parts of the garment can be realized through the interface parameters.
In the embodiment of the application, each preset body pattern model in the body pattern library carries a corresponding clothing style label and a matching property label, and after the clothing style information and the matching property information corresponding to the current clothing picture to be printed are determined, the body pattern model with the clothing style label matched with the current clothing style information and the matching property label matched with the current matching property information is searched from the body pattern library and serves as the target body pattern model of the current clothing picture to be printed.
And after determining the part modeling information corresponding to each clothing part in the current clothing to be printed, searching for the part paper pattern model of which the part modeling label is matched with the current part modeling information from the part paper pattern library corresponding to each clothing part as a target part paper pattern model of the current clothing part.
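A minimal sketch of the label-matching lookup described above, with the body and component pattern libraries held as lists of tagged entries; the entry fields and example values are assumptions based on the labels named in this description.

from typing import Dict, List, Optional

# Hypothetical library entries: each pattern model is tagged with the labels it matches.
BODY_PATTERN_LIBRARY: List[Dict[str, str]] = [
    {"style": "shirt", "fit": "loose", "model_id": "body_shirt_loose"},
    {"style": "shirt", "fit": "fit",   "model_id": "body_shirt_fit"},
]
COMPONENT_PATTERN_LIBRARY: List[Dict[str, str]] = [
    {"component": "collar", "modeling": "stand collar",  "model_id": "collar_stand"},
    {"component": "sleeve", "modeling": "raglan sleeve", "model_id": "sleeve_raglan"},
]

def find_body_pattern(style: str, fit: str) -> Optional[str]:
    for entry in BODY_PATTERN_LIBRARY:
        if entry["style"] == style and entry["fit"] == fit:
            return entry["model_id"]
    return None

def find_component_pattern(component: str, modeling: str) -> Optional[str]:
    for entry in COMPONENT_PATTERN_LIBRARY:
        if entry["component"] == component and entry["modeling"] == modeling:
            return entry["model_id"]
    return None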
In the embodiment of the application, the target body pattern model can be determined according to the clothing style information and the fit information, and the target component pattern model is determined according to the component modeling information corresponding to the clothing component, namely, the pattern model of the corresponding part is determined according to the characteristic information of different parts of the clothing, so that the target pattern model can be obtained more accurately, and the accuracy of the generated clothing pattern is further improved.
Optionally, the pattern adjustment parameters include a human body size adjustment parameter, a release amount adjustment parameter, and a key point adjustment parameter for adjusting the garment shape.
In the embodiment of the application, the human body size adjustment parameter is a parameter for adjusting the size of the pattern, determined according to target human body characteristic information such as body type, age group and gender; the release amount adjustment parameter is a parameter for adjusting the release amount (ease) of the pattern, determined according to the target human body characteristic information, the fit information in the target garment design information and the target fabric characteristic information. For example, a highly elastic fabric should be given a smaller release amount than an inelastic one; a thicker fabric calls for an appropriately larger release amount; and an outer garment should be given a larger release amount than an inner garment. Illustratively, the relationship between the circumference release amounts for a women's garment is shown in Table 1.
Table 1: circumference release amounts for women's garments (provided as an image in the original publication; the values are not recoverable from the text).
the key point adjusting parameters in the embodiment of the application are parameters for adjusting key points influencing the garment modeling. These key points can be defined when creating the pattern models, each pattern model first determines its possible variation profile and then defines the key points according to the position of the desired variation. In one embodiment, the keypoint adjustment parameter is specifically used to specify actions and values for keypoint adjustments, the actions including: translation, zooming and rotation; the value is the magnitude of the particular adjustment. The action and the numerical value in the key point adjusting parameters are determined according to the target clothing design information, the target human body characteristic information and the target fabric characteristic information in combination with a preset paper pattern parameter knowledge picture, and the key points are adjusted according to the action and the numerical value, so that the modeling style of the generated paper pattern can be changed.
Illustratively, the aforementioned key points may include: the front midpoint, back middle vertex, side neck point, shoulder point, axillary (underarm) point, armhole curve control points, elbow point, cuff point, chest point, abdominal point, waist point, hip point, knee point, front side curve control points, and so on. The key point adjustment parameters are determined from any one or more of the identified target garment design information, target human body characteristic information and target fabric characteristic information, so that the garment shape can be adjusted flexibly and accurately. For example, the armhole curve control points and axillary points can adjust the forward or backward tilt of the sleeve by controlling the shape of the armhole curve: if the currently identified garment style is a suit jacket, the sleeve shape needs a slight forward tilt, which can be achieved by adjusting these two key points; if, according to the currently recognized body type information, the cuff instead needs a slight backward tilt, the same two key points are adjusted with a different action and value. As another example, the elbow point and cuff point control the shape of the sleeve curve, and the back middle vertex and side neck points control how much the back piece of the garment is raised or lowered, so as to suit muscular or slight body shapes and prevent the back of the garment from riding up at an angle or pulling too tight.
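The three key point actions named above (translation, scaling, rotation) can be expressed as simple 2D point transforms, as in the sketch below; the parameter format is assumed for illustration.

import math
from typing import Tuple

Point = Tuple[float, float]

def adjust_keypoint(point: Point, action: str, value, origin: Point = (0.0, 0.0)) -> Point:
    """Apply a translation, scaling or rotation adjustment to one pattern key point."""
    x, y = point
    ox, oy = origin
    if action == "translate":        # value: (dx, dy) in cm
        dx, dy = value
        return (x + dx, y + dy)
    if action == "scale":            # value: scale factor about `origin`
        return (ox + (x - ox) * value, oy + (y - oy) * value)
    if action == "rotate":           # value: angle in degrees about `origin`
        a = math.radians(value)
        return (ox + (x - ox) * math.cos(a) - (y - oy) * math.sin(a),
                oy + (x - ox) * math.sin(a) + (y - oy) * math.cos(a))
    raise ValueError(f"unknown action: {action}")

# e.g. tilt a cuff point slightly forward by rotating it about the elbow point.
print(adjust_keypoint((10.0, -30.0), "rotate", 3.0, origin=(8.0, -15.0)))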
In the embodiment of the application, the pattern adjustment parameters specifically comprise a human body size adjustment parameter for adjusting the size of the pattern, a release amount adjustment parameter for adjusting the release amount of the pattern and a key point adjustment parameter for adjusting the shape of the garment, so that after a basic target pattern model is determined, the size, the release amount and the detail shape of the pattern can be further adjusted by combining the detail characteristics of the garment, and the accuracy of generating the pattern of the garment is further improved.
Optionally, the generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjustment parameter includes:
if the personalized configuration information input by the user is obtained, generating target personalized parameters according to the personalized configuration information and the pattern adjustment parameters; the personalized configuration information comprises human body size modification information, relaxation amount modification information and fabric modification information;
and generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the target personalized parameters.
In the embodiment of the application, the electronic equipment is provided with the personalized configuration interface, and a user can input the human body size modification information, the loosening amount modification information and the fabric modification information as personalized configuration information through the personalized configuration interface. And then, adjusting the generated pattern adjustment parameters according to the personalized configuration information to generate target personalized parameters.
And after the target personalized parameters are generated, inputting the target personalized parameters into a target pattern model for processing to generate a target pattern corresponding to the clothing picture to be printed.
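The merge of the user-entered personalized configuration with the automatically derived adjustment parameters could be as simple as an override step, as in the following sketch; the parameter names are assumptions carried over from the earlier sketches.

def build_personalized_parameters(pattern_adjustment: dict, personalized_config: dict) -> dict:
    """User-supplied modifications override the automatically derived parameters."""
    target = dict(pattern_adjustment)          # start from the derived parameters
    target.update(personalized_config)         # user modifications take precedence
    return target

target_params = build_personalized_parameters(
    {"chest_ease_cm": 8.0, "height_cm": 170},
    {"chest_ease_cm": 10.0},                   # e.g. the user asks for extra ease
)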
In the embodiment of the application, the personalized customization requirement is met, and the corresponding target pattern is generated according to the personalized configuration information input by the user, so that the flexibility of generating the clothing pattern can be improved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by functions and internal logic of the process, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Example two:
fig. 2 shows a schematic structural diagram of a clothing pattern generating device provided in an embodiment of the present application, and for convenience of description, only parts related to the embodiment of the present application are shown:
the clothing pattern generating device includes: the image acquisition unit 21, the recognition unit 22, the pattern model determination unit 23, the pattern adjustment parameter unit 24, and the pattern generation unit 25. Wherein:
and the picture acquisition unit 21 is used for acquiring a picture of the garment to be printed.
And the identification unit 22 is used for determining target clothing design information, target human body characteristic information and target fabric characteristic information according to the clothing picture to be printed.
And the pattern model determining unit 23 is configured to determine a target pattern model according to the target clothing design information.
And the pattern adjustment parameter determining unit 24 is configured to determine a pattern adjustment parameter according to the target clothing design information, the target human body characteristic information, the target fabric characteristic information, and a preset pattern parameter knowledge graph.
And the pattern generating unit 25 is configured to generate a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjustment parameters.
Optionally, the identification unit 22 includes:
the first identification module is used for processing the clothing picture to be printed by a preset clothing identification algorithm to obtain the target clothing design information;
the second identification module is used for processing the clothing picture to be printed by a preset target human body feature identification algorithm to obtain target human body feature information;
and the third identification module is used for processing the clothing picture to be printed by a preset fabric characteristic identification algorithm to obtain the target fabric characteristic information.
Optionally, the target clothing design information includes clothing style information, fit information, and clothing component modeling information; the clothing identification algorithm includes a clothing style identification algorithm, a clothing component detection algorithm, and a component modeling identification algorithm; and the first identification module includes:
The style identification module is used for processing the clothing picture to be printed with the clothing style identification algorithm to obtain the clothing style information and the fit information;
The component detection module is used for processing the clothing picture to be printed with the clothing component detection algorithm to obtain the position information of each clothing component in the clothing picture to be printed;
The component modeling identification module is used for determining, for each clothing component, the component modeling information corresponding to that clothing component according to the position information of the clothing component and the component modeling identification algorithm corresponding to that clothing component.
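Purely for illustration, this sub-pipeline might be organized as follows; style_net, detector, and modeling_nets are hypothetical stand-ins for the preset algorithms, and the picture is assumed to be a numpy-style image array:

```python
# Illustrative sketch only: style recognition, component detection, then
# per-component modeling recognition. The three models are placeholders.
from typing import Callable, Dict, List, Tuple

Box = Tuple[int, int, int, int]  # (x_min, y_min, x_max, y_max)

def identify_garment_design(picture,
                            style_net: Callable,
                            detector: Callable[..., List[Tuple[str, Box]]],
                            modeling_nets: Dict[str, Callable]) -> Dict:
    # Clothing style identification: style label plus fit (e.g. slim / loose).
    style, fit = style_net(picture)

    # Clothing component detection: name and bounding box of each component.
    components = detector(picture)

    # Component modeling identification: crop each component region and run
    # the modeling classifier registered for that component type.
    modeling_info = {}
    for name, (x0, y0, x1, y1) in components:
        crop = picture[y0:y1, x0:x1]
        modeling_info[name] = modeling_nets[name](crop)

    return {"style": style, "fit": fit, "component_modeling": modeling_info}
```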
Optionally, the pattern model determining unit 23 includes:
The body pattern model determining module is used for determining a target body pattern model from a preset body pattern library according to the clothing style information and the fit information;
The component pattern model determining module is used for determining a target component pattern model from a preset component pattern library according to the component modeling information corresponding to the clothing component.
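A toy sketch of this pattern model lookup; the library contents and file names are invented solely for illustration:

```python
# Hypothetical pattern libraries: the body pattern is keyed by (style, fit),
# each component pattern by (component name, modeling label).
BODY_PATTERN_LIBRARY = {
    ("shirt", "slim"): "body_shirt_slim.dxf",
    ("shirt", "loose"): "body_shirt_loose.dxf",
}
COMPONENT_PATTERN_LIBRARY = {
    ("collar", "stand_collar"): "collar_stand.dxf",
    ("sleeve", "raglan"): "sleeve_raglan.dxf",
}

def select_pattern_models(style: str, fit: str, component_modeling: dict) -> dict:
    # Target body pattern model from the preset body pattern library.
    body = BODY_PATTERN_LIBRARY[(style, fit)]
    # Target component pattern model for every detected clothing component.
    components = {
        name: COMPONENT_PATTERN_LIBRARY[(name, modeling)]
        for name, modeling in component_modeling.items()
    }
    return {"body": body, "components": components}
```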
Optionally, the fabric characteristic information comprises any one or more of fabric type, thickness, softness, elasticity, stiffness, glossiness and drapability.
Optionally, the pattern adjustment parameters include a human body size adjustment parameter, a slack adjustment parameter, and a key point adjustment parameter for adjusting the garment shape.
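A simplified illustration of how these adjustment parameters might be read from a pattern parameter knowledge graph; the graph layout and the numeric values are assumptions, not figures from the disclosure:

```python
# Toy knowledge graph: (style, fabric type) pairs carry adjustment rules.
PATTERN_PARAMETER_GRAPH = {
    ("shirt", "cotton"): {"slack_cm": 6.0, "keypoint_offset_cm": 0.5},
    ("shirt", "knit"):   {"slack_cm": 2.0, "keypoint_offset_cm": 0.2},
}

def derive_adjustment_params(style: str, fabric_type: str,
                             body_measurements_cm: dict) -> dict:
    # Fall back to a default rule when the pair is not in the graph.
    rules = PATTERN_PARAMETER_GRAPH.get(
        (style, fabric_type), {"slack_cm": 4.0, "keypoint_offset_cm": 0.3})
    return {
        "body_size": body_measurements_cm,         # human body size adjustment
        "slack": rules["slack_cm"],                # slack adjustment
        "keypoints": rules["keypoint_offset_cm"],  # key point adjustment
    }
```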
Optionally, the pattern generating unit 25 includes:
The configuration module is used for generating target personalized parameters according to the personalized configuration information and the pattern adjustment parameters if personalized configuration information input by the user is obtained, where the personalized configuration information includes human body size modification information, slack amount modification information, and fabric modification information;
The generating module is used for generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the target personalized parameters.
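A minimal sketch of this optional personalization step, with field names assumed only for illustration:

```python
from typing import Optional

def build_target_parameters(adjustment_params: dict,
                            user_config: Optional[dict]) -> dict:
    # Start from the parameters produced by the pattern adjustment parameter
    # determining unit, then overlay any user-supplied personalized settings.
    target = dict(adjustment_params)
    if user_config:
        target["body_size"] = user_config.get("body_size", target.get("body_size"))
        target["slack"] = user_config.get("slack", target.get("slack"))
        target["fabric"] = user_config.get("fabric", target.get("fabric"))
    return target
```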
It should be noted that the information interaction and execution processes between the above devices/units, as well as their specific functions and technical effects, are based on the same concept as the method embodiments of the present application; for details, reference may be made to the method embodiment section, which is not repeated here.
Example three:
Fig. 3 is a schematic diagram of an electronic device according to an embodiment of the present application. As shown in Fig. 3, the electronic device 3 of this embodiment includes: a processor 30, a memory 31, and a computer program 32, such as a pattern making program, stored in the memory 31 and executable on the processor 30. When executing the computer program 32, the processor 30 implements the steps in the above embodiments of the clothing pattern generation method, such as steps S101 to S105 shown in Fig. 1. Alternatively, when the processor 30 executes the computer program 32, the functions of the modules/units in the above device embodiments are implemented, such as the functions of the picture acquisition unit 21 to the pattern generating unit 25 shown in Fig. 2.
Illustratively, the computer program 32 may be divided into one or more modules/units, which are stored in the memory 31 and executed by the processor 30 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 32 in the electronic device 3.
The electronic device 3 may be a desktop computer, a notebook computer, a palm computer, a cloud server, or another computing device. The electronic device may include, but is not limited to, the processor 30 and the memory 31. Those skilled in the art will appreciate that Fig. 3 is merely an example of the electronic device 3 and does not constitute a limitation of it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, the electronic device may also include input/output devices, network access devices, buses, and the like.
The processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 31 may be an internal storage unit of the electronic device 3, such as a hard disk or memory of the electronic device 3. The memory 31 may also be an external storage device of the electronic device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) equipped on the electronic device 3. Further, the memory 31 may include both an internal storage unit and an external storage device of the electronic device 3. The memory 31 is used to store the computer program and other programs and data required by the electronic device. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other ways. For example, the above-described apparatus/electronic device embodiments are merely illustrative, and for example, the division of the modules or units is only one type of logical function division, and other division manners may exist in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated module/unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods in the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the above method embodiments are implemented. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals according to legislation and patent practice.
The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and replacements do not depart from the spirit and scope of the embodiments of the present application and should be included within the protection scope of the present application.

Claims (10)

1. A clothing pattern generating method is characterized by comprising the following steps:
acquiring a garment picture to be printed;
determining target garment design information, target human body characteristic information and target fabric characteristic information according to the garment picture to be printed;
determining a target pattern model according to the target garment design information;
determining pattern adjustment parameters according to the target clothing design information, the target human body characteristic information, the target fabric characteristic information and a preset pattern parameter knowledge graph;
and generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjustment parameters.
2. The method for generating a clothing pattern according to claim 1, wherein the determining target clothing design information, target human body characteristic information and target fabric characteristic information according to the clothing picture to be printed comprises:
processing the clothing picture to be printed by a preset clothing recognition algorithm to obtain the target clothing design information;
processing the clothing picture to be printed by a preset target human body feature recognition algorithm to obtain target human body feature information;
and processing the clothing picture to be printed by a preset fabric feature recognition algorithm to obtain the target fabric feature information.
3. The method for generating a clothing pattern according to claim 2, wherein the target clothing design information includes clothing style information, fit information, and clothing component modeling information, the clothing identification algorithm includes a clothing style identification algorithm, a clothing component detection algorithm, and a component modeling identification algorithm, and the processing of the clothing picture to be printed by a preset clothing identification algorithm to obtain the target clothing design information includes:
processing the clothing picture to be printed by the clothing style identification algorithm to obtain clothing style information and fit information;
processing the clothing picture to be printed by the clothing component detection algorithm to obtain the position information of each clothing component in the clothing picture to be printed;
and for each clothing part, determining part modeling information corresponding to the clothing part according to the position information of the clothing part and the part modeling identification algorithm corresponding to the clothing part.
4. The method for generating a clothing pattern according to claim 3, wherein the target pattern model includes a target body pattern model and a target component pattern model, and the determining the target pattern model based on the target clothing design information includes:
determining a target body pattern model from a preset body pattern library according to the clothing style information and the fit information;
and determining a target component pattern model from a preset component pattern library according to the component modeling information corresponding to the clothing component.
5. The clothing pattern generating method according to claim 1, wherein the fabric characteristic information includes any one or more of a fabric type, a thickness, a softness, an elasticity, a stiffness, a glossiness, and a drape.
6. The method for generating a clothing pattern according to claim 1, wherein the pattern adjustment parameters include a body size adjustment parameter, a slack adjustment parameter, and a key point adjustment parameter for adjusting the clothing style.
7. The method for generating a clothing pattern according to any one of claims 1 to 6, wherein the generating of the target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjustment parameters comprises:
if personalized configuration information input by the user is obtained, generating target personalized parameters according to the personalized configuration information and the pattern adjustment parameters; wherein the personalized configuration information comprises human body size modification information, slack amount modification information and fabric modification information;
and generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the target personalized parameters.
8. A clothing pattern generating apparatus, comprising:
the picture acquisition unit is used for acquiring a garment picture to be printed;
the identification unit is used for determining target garment design information, target human body characteristic information and target fabric characteristic information according to the garment picture to be printed;
the pattern model determining unit is used for determining a target pattern model according to the target garment design information;
the pattern adjustment parameter determining unit is used for determining pattern adjustment parameters according to the target garment design information, the target human body characteristic information, the target fabric characteristic information and a preset pattern parameter knowledge graph;
and the pattern generating unit is used for generating a target pattern corresponding to the clothing picture to be printed according to the target pattern model and the pattern adjustment parameters.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the computer program, when executed by the processor, causes the electronic device to carry out the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, causes an electronic device to carry out the steps of the method according to any one of claims 1 to 7.
CN202210680226.5A 2022-06-16 2022-06-16 Clothing pattern generation method and device, electronic equipment and storage medium Pending CN115221571A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210680226.5A CN115221571A (en) 2022-06-16 2022-06-16 Clothing pattern generation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210680226.5A CN115221571A (en) 2022-06-16 2022-06-16 Clothing pattern generation method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115221571A true CN115221571A (en) 2022-10-21

Family

ID=83608470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210680226.5A Pending CN115221571A (en) 2022-06-16 2022-06-16 Clothing pattern generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115221571A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117291111A (en) * 2023-11-24 2023-12-26 宁波博洋服饰集团有限公司 Digital fabric simulation optimization method combined with garment fabric cloud computing platform
CN117291111B (en) * 2023-11-24 2024-04-05 宁波博洋服饰集团有限公司 Digital fabric simulation optimization method combined with garment fabric cloud computing platform


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination