WO2020098982A1 - Method and devices for fitting a garment onto a body - Google Patents


Info

Publication number
WO2020098982A1
Authority
WO
WIPO (PCT)
Application number
PCT/EP2019/066898
Other languages
French (fr)
Inventor
Victor CONSTANTIN
Erhan GÜNDOGU
Mathieu SALZMANN
Pascal Fua
Minh Dang
Amrollah SEIFODDINI BANADKOOKI
Original Assignee
Fision Ag
Application filed by Fision Ag
Publication of WO2020098982A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 - Commerce
    • G06Q 30/06 - Buying, selling or leasing transactions
    • G06Q 30/0601 - Electronic shopping [e-shopping]
    • G06Q 30/0641 - Shopping interfaces
    • G06Q 30/0643 - Graphical representation of items or shoppers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 - Computer-aided design [CAD]
    • G06F 30/20 - Design optimisation, verification or simulation
    • G06F 30/27 - Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2113/00 - Details relating to the application field
    • G06F 2113/12 - Cloth
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 - Indexing scheme for image generation or computer graphics
    • G06T 2210/16 - Cloth
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2219/00 - Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/20 - Indexing scheme for editing of 3D models
    • G06T 2219/2021 - Shape modification

Definitions

  • the present invention relates to a method and devices for fitting a garment onto a body. Specifically, the present invention relates to a method, a computer system, a communication terminal, and a computer program product for fitting a 3D model of a garment onto a 3D model of a body.
  • Generating realistic 3D simulations remains computationally expensive, both in the cost of specialized hardware, such as graphics processing units, required to efficiently simulate physical forces acting on rigid and non-rigid 3D objects, and in the time and energy required for such hardware to execute such simulations.
  • special software tools and libraries are required.
  • Of special difficulty is generating 3D simulations of non-rigid bodies such as fabrics, as the demand for realism requires simulation software to consider the laws of physics and how they relate to non-rigid material properties and behavior.
  • Neural networks have in the last decade, with the advent of more powerful hardware and more advanced learning algorithms, proven both powerful and useful in implementing applications of machine learning.
  • Today, neural networks are employed in a vast array of fields - from speech and character recognition, image classification, and machine translation, to driving vehicles and aiding doctors in medical diagnoses.
  • the architectures of neural networks differ widely, and developing a neural network architecture with the aim of solving a specific problem requires significant ingenuity.
  • a successful neural network, however, requires more than an appropriate architecture. It must be trained with carefully selected training samples and tuned such that it can produce reliable outputs for previously unknown inputs.
  • WO 2017/203262 A2 describes a system of predicting attributes of clothing pieces or accessories using deep learning techniques, wherein a deep learning model is trained to predict discrete and continuous attributes of digital photos of a garment or accessory, and this deep learning model is used for feature extraction. These features may be used for finding similar items to a queried item, identifying an item provided in a query, or searching items based on attributes.

Summary of the Invention
  • a computer-implemented method of fitting a garment onto a body comprises storing a training dataset of fitted garments, the training dataset of fitted garments generated by fitting 3D representations of garments from a garment dataset onto 3D representations of bodies from a body dataset.
  • the method further comprises generating a trained neural network by training a neural network to fit the garments onto the bodies, using the training dataset.
  • the method further comprises receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset.
  • the method further comprises fitting the particular garment onto the particular body using the trained neural network, and showing on a display the fitting of the particular garment onto the particular body.
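The four steps above (generate a training set with an expensive reference simulation, train an approximator, receive an unseen garment/body selection, fit it with the trained model) can be sketched end to end. The code below is a toy illustration only, not the patented system: `physics_fit` stands in for the physics-based simulation, the "network" merely learns a single scaling factor, and all names and data structures are invented for illustration.

```python
# Hedged sketch of the claimed pipeline. All names are illustrative.

def physics_fit(garment, body):
    """Stand-in for the expensive physics-based simulation: drape each
    garment 'vertex' onto the body (here: clamp it to the body's radius)."""
    return [min(v, body["radius"]) for v in garment]

def train(garment_dataset, body_dataset):
    # Step 1: training set of every garment fitted onto every body.
    training_set = [(g, b, physics_fit(g, b))
                    for g in garment_dataset for b in body_dataset]
    # Step 2: "training" here just learns the average clamp ratio; a real
    # system would fit neural-network weights against this dataset instead.
    ratios = [f / v for g, b, fit in training_set
              for v, f in zip(g, fit) if v > 0]
    scale = sum(ratios) / len(ratios)
    return lambda garment, body: [v * scale for v in garment]

garments = [[1.0, 2.0], [2.0, 4.0]]
bodies = [{"radius": 1.5}, {"radius": 3.0}]
model = train(garments, bodies)               # steps 1-2
fitted = model([3.0, 6.0], {"radius": 2.0})   # steps 3-4: unseen pair
```

The point of the sketch is the division of labour: the costly simulator runs only offline to build the dataset, while the cheap learned approximator serves unseen inputs.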
  • the present invention in particular enables the generation of the trained neural network which can perform a close approximation of a computationally expensive, physics-based simulation used to generate the training dataset of fitted garments.
  • This close approximation, as generated by the trained neural network, is such that a user can judge, based on a fitting by the trained neural network, whether a garment is likely to fit his or her body well.
  • the trained neural network may generate the fitted garment more quickly and/or with less computational processing power.
  • the fitted garment may also be generated on a mobile user device, e.g. a communication terminal of the user, which has limited computational power, such as a portable phone or a tablet device.
  • the computation may also be performed quickly on a server computer and transmitted to the communication terminal.
  • generating the trained neural network is executed on a computer system and the method further comprises a communication terminal receiving the trained neural network from the computer system via a communication network.
  • the method further comprises the communication terminal receiving the selection of the particular garment and the particular body from the user.
  • the method further comprises the communication terminal fitting the particular garment onto the particular body using the trained neural network.
  • the method further comprises the communication terminal showing on the display the fitting of the particular garment onto the particular body.
  • training the neural network comprises training a plurality of sub-networks.
  • the sub-networks include a garment sub-network configured to receive the garment as an input.
  • the garment sub-network comprises a plurality of layers.
  • the sub-networks also include a body sub-network configured to receive the body as an input.
  • the body sub-network comprises a plurality of layers.
  • An output of the body sub-network is provided to at least one layer of the garment sub-network, and the garment sub-network is configured to use the output of the body sub-network to fit the garment onto the body.
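The two-branch design above (a body sub-network whose output is injected into a layer of the garment sub-network) can be sketched minimally. The feature functions and the deformation rule below are invented stand-ins for the learned layers, not from the patent.

```python
# Minimal sketch of the body/garment sub-network split. The body branch
# produces features; one garment layer consumes them to deform the garment.

def body_subnetwork(body_vertices):
    # Toy "layers": the centroid of the body mesh as a global body feature.
    n = len(body_vertices)
    return [sum(c) / n for c in zip(*body_vertices)]

def garment_subnetwork(garment_vertices, body_features):
    # Stand-in for the learned deformation: shift every garment vertex
    # halfway toward the body feature produced by the body sub-network.
    return [[0.5 * (g + b) for g, b in zip(v, body_features)]
            for v in garment_vertices]

body = [[0.0, 0.0, 0.0], [2.0, 2.0, 2.0]]
garment = [[4.0, 4.0, 4.0]]
fitted = garment_subnetwork(garment, body_subnetwork(body))
```

The design point mirrored here is that the two branches keep separate parameters for semantically different inputs, connected only through the body branch's output.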
  • training the neural network comprises training a plurality of sub-networks and/or layers.
  • Each sub-network and/or layer has inputs and outputs.
  • the sub-networks and/or layers include a spatial transformer sub-network, a multi-layer perceptron sub-network, and/or a maximum pooling layer.
  • the spatial transformer sub-network is configured to apply a spatial transformation to a 3D input.
  • the multi-layer perceptron sub-network enables detection of geometric features in an input.
  • the maximum pooling layer is configured to reduce a number of parameters in an output from the number of parameters in an input.
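Of the components above, the spatial transformer's role of normalising a 3D input can be illustrated with the simplest possible transform. A learned transformer would predict rotation, scale, and skew as well; the sketch below only removes translation (subtracting the centroid), which is an assumption made for brevity.

```python
# Hedged sketch of what a spatial transformer contributes: a per-input
# transformation that normalises the 3D point cloud, so downstream layers
# see a translation-invariant input. Centroid subtraction only.

def spatial_transform(points):
    n = len(points)
    centroid = [sum(c) / n for c in zip(*points)]
    return [[p - c for p, c in zip(pt, centroid)] for pt in points]

spatial_transform([[1.0, 1.0, 1.0], [3.0, 3.0, 3.0]])
# -> [[-1.0, -1.0, -1.0], [1.0, 1.0, 1.0]], centred at the origin
```

Because the output is the same wherever the input cloud sits in space, subsequent layers need not learn translated copies of every feature.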
  • training the neural network comprises determining a fitting error by comparing the garments fitted onto the bodies by the neural network to corresponding fitted garments from the training dataset, and adjusting the neural network using the fitting error.
  • determining the fitting error comprises determining a distance term which represents distances between vertices of the garments fitted onto the body by the neural network and corresponding vertices of the corresponding fitted garments from the training dataset. Determining the fitting error further comprises determining an orientation term which represents angular differences between vertex normals of the garments fitted onto the body by the neural network and the corresponding vertex normals of the corresponding fitted garments from the training dataset. Determining the fitting error further comprises determining a penetration term which indicates whether vertices of the garment fitted by the neural network are inside the body onto which the garment has been fitted.
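The three error terms just described can be sketched concretely. The text does not specify weights, normalisation, or how "inside the body" is tested; the unit weights and the signed-distance-function penetration test below are assumptions for illustration.

```python
import math

# Hedged sketch of the three-term fitting error: distance between
# corresponding vertices, angle between corresponding unit vertex normals,
# and a penalty for vertices inside the body (negative signed distance).

def fitting_error(pred_verts, pred_normals, ref_verts, ref_normals, body_sdf):
    # Distance term: vertex-to-vertex distances to the reference fit.
    dist = sum(math.dist(p, r) for p, r in zip(pred_verts, ref_verts))
    # Orientation term: angular difference between unit vertex normals.
    orient = sum(
        math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(p, r)))))
        for p, r in zip(pred_normals, ref_normals))
    # Penetration term: how far vertices sit inside the body.
    pen = sum(max(0.0, -body_sdf(v)) for v in pred_verts)
    return dist + orient + pen  # unit weights are an assumption

# Toy body: a unit sphere centred at the origin.
sphere = lambda v: math.dist(v, (0.0, 0.0, 0.0)) - 1.0
```

With identical prediction and reference the error is zero; a vertex pushed inside the sphere is penalised by both the distance and penetration terms.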
  • the present invention also relates to a computer system comprising one or more processors configured to fit a garment onto a body.
  • the one or more processors are configured to fit the garment onto the body by storing a training dataset of fitted garments, generated by fitting 3D representations of garments from a garment dataset onto 3D representations of bodies from a body dataset; generating a trained neural network by training a neural network to fit the garments onto the bodies, using the training dataset; receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset; fitting the particular garment onto the particular body using the trained neural network; and showing on a display the fitting of the particular garment onto the particular body.
  • the processors are configured to implement the neural network to comprise a plurality of sub-networks.
  • the sub-networks include a garment sub-network configured to receive the garment as an input.
  • the garment sub-network comprises a plurality of layers.
  • the sub-networks further include a body sub-network configured to receive the body as an input.
  • the body sub-network comprises a plurality of layers.
  • An output of the body sub-network is provided to at least one layer of the garment sub-network, and the garment sub-network is configured to use the output of the body sub-network to fit the garment onto the body.
  • the processors are configured to implement the neural network to comprise a plurality of sub-networks and/or layers.
  • Each sub-network and/or layer has inputs and outputs.
  • the sub-networks and/or layers include a spatial transformer sub-network, a multi-layer perceptron sub-network, and/or a maximum pooling layer.
  • the spatial transformer sub-network is configured to apply a spatial transformation to a 3D input.
  • the multi-layer perceptron sub-network enables detection of geometric features in an input.
  • the maximum pooling layer is configured to reduce a number of parameters in an output from the number of parameters in an input.
  • the processors are configured to train the neural network by determining a fitting error by comparing the garments fitted onto the bodies by the neural network to corresponding fitted garments from the training dataset, and adjusting the neural network using the fitting error.
  • At least one of the processors is arranged on a networked computer system and configured to store the training dataset of fitted garments, to generate the trained neural network, and to transfer the trained neural network via a communication network to a communication terminal. Furthermore, at least one of the processors is arranged on the communication terminal and configured to receive the trained neural network from the networked computer system via the communication network, to receive the selection of the particular garment and the particular body from the user, to fit the particular garment onto the particular body using the trained neural network, and to show on the display the fitting of the particular garment onto the particular body.
  • the present invention also relates to a computer program product comprising a non-transitory computer readable medium having stored thereon computer code configured to control a processor of a computer system to fit a garment onto a body by performing the steps of: storing a training dataset of fitted garments, generated by fitting 3D representations of garments from a garment dataset onto 3D representations of bodies from a body dataset; generating a trained neural network by training a neural network to fit the garments onto the bodies, using the training dataset; receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset; fitting the particular garment onto the particular body using the trained neural network; and showing on a display the fitting of the particular garment onto the particular body.
  • the present invention also relates to a computer program product comprising a non-transitory computer readable medium having stored thereon computer code configured to control a processor of a communication terminal to fit a garment onto a body by performing the steps of: receiving from a computer system a trained neural network, trained with a training dataset of fitted garments to fit garments onto bodies; receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset; fitting the particular garment onto the particular body using the trained neural network; and showing on a display the fitting of the particular garment onto the particular body.
  • Figure 1 shows a block diagram illustrating a computer system in communicative connection with a communication terminal for fitting a garment onto a body;
  • Figure 2 shows a block diagram illustrating schematically a computer system comprising a processor, a neural network, a garment dataset, a body dataset, and a training dataset;
  • Figure 3 shows a flow diagram illustrating a step for generating a training dataset of fitted garments;
  • Figure 4 shows a flow diagram illustrating a step for training a neural network;
  • Figure 5 shows a flow diagram illustrating a step for fitting a particular garment using a trained neural network;
  • Figure 6 shows a flow diagram illustrating an exemplary sequence of steps for training a neural network;
  • Figure 7 shows a flow diagram illustrating an exemplary sequence of steps for fitting a garment using a neural network, including transmission of a trained neural network via a communication network;
  • Figure 8 shows a block diagram illustrating a neural network for fitting a garment onto a body;
  • Figure 9 shows a flow diagram illustrating an exemplary sequence of steps for fitting a garment using a neural network;
  • Figure 10 shows a flow diagram illustrating an exemplary sequence of steps for fitting a garment using a neural network.
  • reference numeral 1 refers to a computer system.
  • the computer system 1 is a desktop computer or a server computer, which comprises one or more processors 2, a memory, and a communications interface for communicating with a communication terminal 7 via a communication network 4.
  • the computer system 1 is a virtual machine being executed on a server computing system.
  • the communication terminal 7, more specifically an electronic communication device such as a desktop computer, a laptop computer, a tablet computer, a mobile radio phone, a smart watch, or the like, comprises a processor 71 and a display 72.
  • the communication network 4 comprises one or more wired or wireless networks (or set of networks), such as the Internet, LAN (Local Area Network), WLAN (Wireless Local Area Network), and mobile radio networks, such as GSM (Global System for Mobile Communication) or UMTS (Universal Mobile Telephone System) networks.
  • reference numeral 3 refers to a neural network, in particular a feed-forward neural network.
  • the neural network 3 implements a function which can be decomposed into other functions.
  • the neural network 3 comprises a set of parameters which are adjusted during training of the neural network 3.
  • the neural network 3 can be represented as a network structure comprising a set of connected layers, which layers may be grouped into sub-networks. Each layer comprises a set of nodes, each node having adjustable parameters including one or more weight parameters and a bias parameter.
  • a layer may be either an input layer, an output layer, or a hidden layer.
  • the neural network 3 comprises one or more input layers, one or more output layers, and one or more hidden layers.
  • Data is processed by a given layer and provided to one or more subsequent layers.
  • Data being input into the neural network 3 is passed to an input layer.
  • the output of the input layer is passed to a succession of hidden layers, each taking as an input the data output from an antecedent layer, and providing as an input to a subsequent layer its own output.
  • An output of a given node is a nonlinear weighted sum of its inputs. Specifically, the output of a given node is computed by using an activation function, the inputs of which are the one or more weight parameters of the node, the bias parameter of the node, and all inputs to the given node. If the layer to which the given node belongs is an input layer, then the inputs to the given node are data to be processed by the neural network 3. If the layer to which the given node belongs is an output layer or a hidden layer, then the inputs to the given node are outputs from a previous layer of the neural network 3. The output of the neural network 3 is the data which the one or more output layers provide.
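The node computation just described can be written out directly: an activation function applied to the weighted sum of the node's inputs plus its bias. The sigmoid below is one common choice of activation; the text does not prescribe a particular function.

```python
import math

# A single node: nonlinear activation of (weights . inputs + bias).

def node_output(weights, bias, inputs):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

node_output([0.0, 0.0], 0.0, [1.0, 2.0])  # zero weighted sum -> 0.5
```

Stacking many such nodes into layers, with each layer's outputs feeding the next layer's inputs, yields exactly the feed-forward structure described above.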
  • reference numeral 4 refers to a garment dataset, which is stored in the computer system 1.
  • the garment dataset 4 is a set of garments 41.
  • a garment 41 is a data file comprising a 3D representation of an item of clothing, and comprises a collection of points in 3D space connected by lines, triangles, curved surfaces, or other geometric entities to form a mesh.
  • the garment 41 may further comprise data describing other garment properties, such as garment dimensions, garment orientations, garment boundaries, colours, and material properties such as weight, texture, and diffusion/reflection properties of the garment 41 enabling an accurate 3D representation of the garment 41 to be generated.
  • reference numeral 5 refers to a body dataset, which is stored in the computer system 1.
  • the body dataset 5 is a set of bodies 51.
  • a body is a data file comprising a 3D representation of a human body, or part of a human body, and comprises a collection of points in 3D space connected by lines, triangles, curved surfaces, or other geometric entities to form a mesh.
  • the body 51 may further comprise data describing other body properties such as sex, height, age, weight, or specific measurements such as arm length, leg length, neck circumference, chest circumference, and waist circumference, etc.
  • reference numeral 6 refers to a training dataset, which is stored in the computer system 1.
  • the training dataset 6 comprises a set of fitted garments 61.
  • a fitted garment 61 is a data file comprising a 3D representation of a specific garment 41 which has been fitted onto a specific body 51 of the training dataset 6. Specifically, the fitted garment has been modified, i.e. deformed, such that it represents the specific garment 41 dressed on or worn by the specific body 51.
  • This training dataset 6 is generated by a computer program, such as a computer-aided design (CAD) program, configured to generate realistic garments which are fitted onto bodies such that they appear natural.
  • This CAD program preferably uses a physics engine which applies the known behaviour of soft bodies to garments lying on top of, or being worn by, a body.
  • This computer program in particular does not necessarily comprise any learnable parameters, such as those comprised in a neural network.
  • the fitted garment 61 comprises a collection of points in 3D space connected by lines, triangles, curved surfaces, or other geometric entities to form a mesh.
  • the fitted garment 61 is defined by differential data (defining differences with respect to the garment 41 in its initial non-fitted state).
  • the fitted garment 61 may further comprise data describing other properties of the fitted garment 61, such as areas where the fitted garment 61 has been stretched or bunched during fitting.
  • reference numeral 31 refers to a garment sub-network of the neural network 3.
  • Reference numeral 32 refers to a body sub-network of the neural network 3.
  • the body sub-network 32 has as an input the body 51, and the output of the body sub-network 32 is provided to the garment sub-network 31.
  • the garment sub-network 31 has as inputs the garment 41 as well as the output of the body sub-network 32.
  • the output of the garment sub-network is the fitted garment 61.
  • the separation of the neural network 3 into a body sub-network 32 and a garment sub-network 31 enables the neural network 3 to maintain two separate sets of layers and sub-networks for feature detection of semantically different inputs.
  • the garment sub-network 31 and the body sub-network 32 each comprise a plurality of sub-networks and/or layers including at least a spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352, a multi-layer perceptron sub-network 312, 318, 323, 314, 325, 341, 343, 345, 351, 353, and a maximum pooling layer 315, 326, 344, 354.
  • the spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352 enables the neural network 3 to normalize a 3D input, such that the neural network 3 learns features of a 3D input invariant to spatial transformations of a 3D input, which spatial transformations include translation, scale, rotation, and skew.
  • the spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352 implements a dynamic mechanism which transforms a 3D input by producing an appropriate transformation for each 3D input.
  • the spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352 therefore simplifies the subsequent neural network architecture for fitting a garment 41 onto a body 51, leading to superior performance of fitting the garment 41 onto the body 51.
  • the multi-layer perceptron sub-network 312, 318, 323, 314, 325, 341, 343, 345, 351, 353 consists of one or more layers which are fully connected with each other, and fully connected to antecedent and subsequent layers of the neural network 3.
  • a given layer of the neural network 3 is considered to be fully connected with an antecedent layer if all outputs of the antecedent layer are connected with all inputs of the given layer, and if all inputs of the subsequent layer are connected with all outputs of the given layer. If a given layer or a given sub-network of the neural network 3 has multiple inputs, the inputs may be concatenated before being processed by the given layer.
  • the maximum pooling layer 315, 326, 344, 354 is a layer which produces a non-linear down-sampling of an input to the maximum pooling layer 315, 326, 344, 354. The down-sampling occurs by defining a filter size, partitioning the input into a set of non-overlapping sub-regions according to the filter size, and outputting only the maximum value of the input within each respective sub-region.
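The down-sampling just described can be shown directly: partition the input into non-overlapping sub-regions of the filter size and keep each region's maximum. A 1-D case is used for brevity, and the input length is assumed divisible by the filter size.

```python
# Max pooling as described: non-overlapping sub-regions, keep the maximum.

def max_pool(values, filter_size):
    return [max(values[i:i + filter_size])
            for i in range(0, len(values), filter_size)]

max_pool([1, 5, 2, 8, 3, 3], 2)  # -> [5, 8, 3]
```

Note how the output has half as many parameters as the input for a filter size of 2, which is the parameter-reduction role attributed to the pooling layer above.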
  • the neural network 3 further comprises one or more mesh-convolutional networks 342.
  • Mesh convolutional networks 342 extend the well-known concept of convolutional networks from a 2D image space to a 3D image space. The inclusion of one or more mesh-convolutional networks 342 in the neural network 3 enables the neural network 3 to learn local features of 3D data.
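One simple way to picture a mesh convolution is neighbourhood aggregation: each vertex combines features from itself and its mesh neighbours, the 3D-mesh analogue of a 2D image filter sliding over neighbouring pixels. The uniform averaging below is an assumption for illustration; a learned mesh convolution would use trainable weights.

```python
# Hedged sketch of a mesh convolution as uniform 1-ring aggregation.
# `neighbours` maps each vertex index to its adjacent vertex indices.

def mesh_conv(features, neighbours):
    out = []
    for v, feat in enumerate(features):
        ring = [features[u] for u in neighbours[v]] + [feat]
        out.append(sum(ring) / len(ring))  # average over vertex + 1-ring
    return out

# Triangle mesh: three vertices, each adjacent to the other two.
mesh_conv([0.0, 3.0, 6.0], {0: [1, 2], 1: [0, 2], 2: [0, 1]})
```

Because each output depends only on a vertex's local neighbourhood, the operation captures local surface features, which is the capability the text attributes to the mesh-convolutional layers.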
  • the body sub-network 32 comprises a spatial transformer network C 321, which takes as an input the body 51 and is connected to a local body features A layer 322.
  • the local body features A layer 322 is provided to the garment sub-network 31 and the multi-layer perceptron D 323.
  • the multi-layer perceptron D 323 is connected to a spatial transformer network D 324.
  • the spatial transformer network D 324 is connected to a multi-layer perceptron D 325.
  • the multi-layer perceptron D 325 is connected to a maximum pooling layer B 326.
  • the maximum pooling layer B 326 is connected to a global body features A layer 327, which global body features A layer 327 is an output of the body sub-network 32 and is provided to the garment sub-network 31.
  • the garment sub-network 31 comprises a spatial transformer network A 311, which takes as an input the garment 41 and is connected to a multi-layer perceptron A 312.
  • the multi-layer perceptron A 312 also takes as an input the output of the local body features A layer 322 and is connected to a spatial transformer network B 313.
  • the spatial transformer network B 313 is connected to a multi-layer perceptron B 314.
  • the multi-layer perceptron B 314 is connected to a maximum pooling layer A 315.
  • the maximum pooling layer A 315 is connected to the multi-layer perceptron C 316.
  • the multi-layer perceptron C 316 also takes as an input the output of the global body features A layer 327.
  • the output of the multi-layer perceptron C 316 is the fitted garment 61.
  • the body sub-network 32 comprises a spatial transformer network G 328, which takes as an input the body 51, which is connected to a local body features B layer 329.
  • the output of the local body features B layer 329 is provided to the garment sub-network 31.
  • the local body features B layer 329 is connected to a multi-layer perceptron I 351.
  • the multi-layer perceptron I 351 is connected to a spatial transformer network H 352.
  • the spatial transformer network H 352 is connected to a multi-layer perceptron J 353.
  • the multi-layer perceptron J 353 is connected to a maximum pooling layer D 354.
  • the maximum pooling layer D 354 is connected to a global body features B layer 355.
  • the output of the global body features B layer 355 is provided to the garment sub-network 31.
  • a selective pooling sub-network 33 takes as inputs the body 51 and the garment 41.
  • the output of the selective pooling sub-network 33 is provided to the garment sub-network 31.
  • the garment sub-network 31 comprises a spatial transformer network E 317 which takes as an input the garment 41 and is connected to a multi-layer perceptron E 318.
  • the multi-layer perceptron E 318 also takes as inputs the output of the local body features B layer 329 and the output of the global body features B layer 355.
  • the multi-layer perceptron E 318 is connected to a spatial transformer network F 319.
  • the spatial transformer network F 319 is connected to a multi-layer perceptron F 341.
  • the multi-layer perceptron F 341 also takes as inputs the output of the local body features B layer 329 and the output of the global body features B layer 355.
  • the multi-layer perceptron F 341 is connected to a mesh convolutional layer 342.
  • the mesh convolutional layer 342 is connected to a multi-layer perceptron G 343.
  • the multi-layer perceptron G 343 also takes as an input the output of the global body features B layer 355.
  • the multi-layer perceptron G 343 is connected to a maximum pooling layer C 344.
  • the maximum pooling layer C 344 is connected to a multi-layer perceptron H 345.
  • the multi-layer perceptron H 345 also takes as inputs the output of the global body features B layer 355 and the output of the selective pooling sub-network 33.
  • In step S1, the computer system 1, or its processor(s) 2, respectively, generates a training dataset 6 of fitted garments 61.
  • a garment 41 of the garment dataset 4 and a body 51 of the body dataset 5 are input into a "fitting" program, a computer program e.g. implemented and running in the computer system 1, or its processor(s) 2, respectively, and configured to generate a fitted garment 61.
  • the "fitting" program implements a physically based simulation program, which is configured to render the fitted garment 61 in accordance with laws of physics which govern how the garment 41, which is a soft body with a given form, shape, and material properties, drapes and deforms under the influence of gravity when placed over, i.e. worn by, a solid body 51.
  • Many commercially available software packages allow such a rendering, for example the NvCloth library of NVidia. This rendering is computationally expensive, such that it requires the computer system 1 to comprise specific graphics processing units, and such that it takes the computer system 1 considerable electrical energy and computational time to generate the fitting.
  • Each garment 41 is fitted onto a plurality of bodies 51 to create the training dataset 6 of fitted garments 61.
  • In step S2, as seen in Figures 4, 6, and 7, a trained neural network 31 is generated, by the computer system 1, from the neural network 3 by training the neural network 3.
  • the inputs to the training of the neural network 3 are the neural network 3 (its architecture and untrained parameters), the garment dataset 4, the body dataset 5, and the training dataset 6.
  • the output of the training of the neural network 3 is the trained neural network 31 (its architecture and trained parameters).
  • Step S2 comprises a plurality of steps.
  • In step S21, a predicted fitted garment 42 is generated by the neural network 3 from the garment 41 and the body 51.
  • the garment 41 and the body 51 are passed as inputs to the input layers of the neural network 3.
  • the neural network 3 then processes the inputs and generates the predicted fitted garment 42 as an output in an output layer of the neural network 3.
  • Step S22 the predicted fitted garment 42 is compared to a fitted garment 61 of the training dataset 6 which corresponds to the garment 41 and body 51, the fitted garment 61 having been generated by the physically based simulation as described above.
  • the predicted fitted garment 42 is compared to the fitted garment 61 to determine a fitting error.
  • the fitting error describes how similar the predicted fitted garment 42 is to the fitted garment 61.
  • the fitting error is computed by a function whose inputs comprise a distance term, an orientation term, and a penetration term.
  • the distance term represents the distances between vertices of the predicted fitted garment 42 and the corresponding vertices of the fitted garment 61, and gives a first indication as to how similar the predicted fitted garment 42 is to the fitted garment 61.
  • the orientation term represents angular differences between vertex normals of the predicted fitted garment 42 and the corresponding vertex normals of the fitted garment 61.
  • Vertex normals are vectors which lie perpendicular to the mesh of the predicted fitted garment 42, or the fitted garment 61, at a given vertex of the predicted fitted garment 42, or the fitted garment 61, respectively. Determining a difference in the orientation between vertex normals better indicates how similar the predicted fitted garment 42 is to the fitted garment 61.
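The vertex normals referred to above can be computed from the mesh itself. The following sketch uses a common convention (area-weighted averaging of face normals); this weighting is an assumption, as the text does not specify how the vertex normals are obtained.

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Area-weighted vertex normals of a triangle mesh (one common
    convention; the weighting scheme is an assumption)."""
    normals = np.zeros_like(vertices)
    for i, j, k in faces:
        # The cross product of two triangle edges gives a face normal
        # whose length is proportional to the triangle's area.
        fn = np.cross(vertices[j] - vertices[i], vertices[k] - vertices[i])
        normals[i] += fn
        normals[j] += fn
        normals[k] += fn
    lens = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.maximum(lens, 1e-12)  # normalize to unit length
```

For a single triangle lying in the xy-plane, every vertex normal comes out as the unit z-vector, as expected.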
  • the penetration term indicates whether the predicted fitted garment 42 lies within the body 51 onto which the garment 41 has been fitted. To check whether the predicted fitted garment 42 lies within the body 51, the distance between the vertices which comprise the predicted fitted garment 42 and the body 51 is determined.
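The distance, orientation, and penetration terms described above can be combined into a single fitting error, for example as in the following sketch. The function name, the term weights, and the margin-based penetration test are illustrative assumptions, not the actual function used.

```python
import numpy as np

def fitting_error(pred_verts, pred_normals, true_verts, true_normals,
                  body_verts, w_dist=1.0, w_orient=0.1, w_pen=10.0):
    """Illustrative fitting error combining a distance term, an
    orientation term, and a penetration term (weights are assumed)."""
    # Distance term: mean distance between corresponding vertices.
    dist = np.linalg.norm(pred_verts - true_verts, axis=1).mean()

    # Orientation term: angle between corresponding vertex normals
    # (both assumed to be unit length).
    cos = np.clip((pred_normals * true_normals).sum(axis=1), -1.0, 1.0)
    orient = np.arccos(cos).mean()

    # Penetration term: penalize garment vertices closer to the body
    # than a small margin (a crude stand-in for a proper
    # inside/outside test against the body surface).
    d_body = np.linalg.norm(
        pred_verts[:, None, :] - body_verts[None, :, :], axis=2).min(axis=1)
    pen = np.maximum(1e-3 - d_body, 0.0).mean()

    return w_dist * dist + w_orient * orient + w_pen * pen
```

A perfect prediction (identical vertices and normals, garment clear of the body) yields an error of zero; perturbing the predicted vertices increases it.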
  • a plurality of predicted fitted garments 42 are generated by the neural network 3 and compared with a plurality of fitted garments 61 to determine an average fitting error.
  • Step S23 the average fitting error is compared to a predetermined fitting error threshold.
  • the fitting error threshold is a measure of how similar the predicted fitted garment 42 must be to the fitted garment 61. If the average fitting error is smaller than the fitting error threshold, then the neural network 3 is considered trained. If the fitting error is larger than the fitting error threshold, then the neural network 3 is not yet considered trained.
  • Step S24 the parameters, i.e. the weight parameters and the bias parameters of the nodes comprising each layer of the neural network 3, are adjusted by back-propagation using the average fitting error.
  • Back-propagation is an algorithm known in the art of neural networks by which the average fitting error is propagated backwards through the neural network 3 and is used to update the parameters of each node.
  • Training the neural network 3 comprises repeating a cycle of individual steps S21, S22, S23, and S24 until the parameters of the neural network 3 have been adjusted such that the average fitting error of the neural network 3 lies below the fitting error threshold, upon which the neural network 3 is considered to be a trained neural network 31.
  • Further detailed aspects of training the neural network 3 are known in the art of neural networks. Such aspects include dividing the garment dataset 4, body dataset 5, and training dataset 6 into training, validation, and test sets, and selecting appropriate hyper-parameters for training.
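The cycle of steps S21 to S24 corresponds to a conventional training loop. The following sketch illustrates that loop on a toy problem in which the "simulation" is an unknown linear map; the model, learning rate, and threshold are placeholders, not those of the neural network 3.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: the "physically based simulation" is an unknown linear
# map, and the "garments"/"bodies" are random feature vectors.
X = rng.normal(size=(64, 8))      # inputs (garment and body features)
W_true = rng.normal(size=(8, 3))
Y = X @ W_true                    # "fitted garments" from the simulator

W = np.zeros((8, 3))              # untrained parameters (input to step S2)
threshold, lr = 1e-4, 0.05

for step in range(100_000):
    pred = X @ W                      # S21: predict fitted garments
    err = pred - Y                    # S22: compare with training data
    avg_error = (err ** 2).mean()     # ... giving an average fitting error
    if avg_error < threshold:         # S23: below threshold -> trained
        break
    W -= lr * (X.T @ err) / len(X)    # S24: back-propagation/gradient step
```

Once the loop exits, the learned parameters closely approximate the map used to generate the training data, mirroring how the trained neural network 31 approximates the physically based simulation.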
  • Step S3 illustrated in Figure 1, in an embodiment and/or configuration where the fitting of a garment onto a body is executed on a communication terminal 7, remote from the computer system 1, the trained neural network 31 is transmitted by the computer system 1, via the communication network 4, to the communication terminal 7.
  • Step S4 as illustrated in Figure 5, the trained neural network 31 - implemented on the computer system 1 or on the communication terminal 7, or their respective processors 2, 71, respectively - fits a particular garment 8 onto a particular body 9, resulting in a particular fitted garment 10.
  • the particular garment 8 and the particular body 9 are, analogously to the garment 41 and the body 51, 3D representations of a garment and a body, respectively, as detailed above.
  • a user of the computer system 1 or the communication terminal 7 selects the particular garment 8, which particular garment 8 is not necessarily part of the garment dataset 4, to be fitted onto a user-selected or defined particular body 9, which is not necessarily part of the body dataset 5.
  • the particular garment 8 and the particular body 9 may be either stored on a memory of the communication terminal 7, or received, via the communication network 4, from the computer system 1.
  • the particular body 9 is defined and/or generated by the communication terminal 7, or its processors 71, respectively, e.g. through a 3D scanning process implemented on the communication terminal 7.
  • the processor 71 of the communication terminal 7 uses the trained neural network 31, along with the inputs of the particular garment 8 and the particular body 9, to generate a fitted garment 10.
  • the processor(s) 2 of the computer system 1 generates the fitted garment 10 using the user-selected garment 8 and body 9.
  • the generation of the fitted garment 10 by the processor 71 is computationally efficient, in terms of energy consumed and computational time required, in comparison to the initial rendering of the fitted garment 61 by the computer system 1.
  • the fitted garment 10 is then shown on the display 72 of the communication terminal 7 by the processor 71.

Abstract

For fitting a garment onto a body, a training dataset of fitted garments is generated (S1) by a computer system fitting 3D representations of garments from a garment dataset onto 3D representations of bodies from a body dataset. The trained neural network is generated by training (S2) a neural network to fit a plurality of garments onto a plurality of bodies, using the training dataset. A selection of a particular garment and a particular body are received from a user of a communication terminal, which particular garment and which particular body are not in the training dataset. The particular garment is fitted (S4) onto the particular body using the trained neural network and shown on a display of the communication terminal.

Description

METHOD AND DEVICES FOR FITTING A GARMENT ONTO A BODY
Field of the Invention
The present invention relates to a method and devices for fitting a garment onto a body. Specifically, the present invention relates to a method, a computer system, a communication terminal, and a computer program product for fitting a 3D model of a garment onto a 3D model of a body.
Background of the Invention
Generating realistic 3D simulations remains computationally expensive, both in terms of the cost of specialized hardware, such as graphics processing units, required to efficiently simulate physical forces which act on rigid and non-rigid 3D objects, and also in the time and energy required for such hardware to execute such simulations. In addition to the specific computational hardware required for 3D simulations, special software tools and libraries are required. Of special difficulty is generating 3D simulations of non-rigid bodies such as fabrics, as the demand for realism requires simulation software to consider the laws of physics and how they relate to non-rigid material properties and behavior.
Neural networks have in the last decade, with the advent of more powerful hardware and more advanced learning algorithms, proven both powerful and useful in implementing applications of machine learning. Today, neural networks are employed in a vast array of fields - from speech and character recognition, to image classification, machine translation, to driving vehicles and aiding doctors in medical diagnoses. The architectures of neural networks differ widely, and developing a neural network architecture with the aim of solving a specific problem requires significant ingenuity. A successful neural network, however, requires more than an appropriate architecture. It must be trained with carefully selected training samples and tuned such that it can produce reliable outputs for previously unknown inputs.
WO 2017/203262 A2 describes a system of predicting attributes of clothing pieces or accessories using deep learning techniques, wherein a deep learning model is trained to predict discrete and continuous attributes of digital photos of a garment or accessory, and this deep learning model is used for feature extraction. These features may be used for finding similar items to a queried item, identifying an item provided in a query, or searching items based on attributes.

Summary of the Invention
It is an object of this invention to provide a method, a computer system, a communication terminal, and a computer program product for fitting a garment onto a body which does not have at least some of the disadvantages of the prior art. In particular, it is an object of the present invention to provide a computer implemented method, a computer system comprising a processor and a memory, a communication terminal comprising a processor and a display, and a computer program product.
According to the present invention, these objects are achieved through the features of the independent claims. In addition, further advantageous embodiments follow from the dependent claims and the description. According to the present invention, the above-mentioned objects are particularly achieved in that a computer-implemented method of fitting a garment onto a body comprises storing a training dataset of fitted garments, the training dataset of fitted garments generated by fitting 3D representations of garments from a garment dataset onto 3D representations of bodies from a body dataset. The method further comprises generating a trained neural network by training a neural network to fit the garments onto the bodies, using the training dataset. The method further comprises receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset. The method further comprises fitting the particular garment onto the particular body using the trained neural network, and showing on a display the fitting of the particular garment onto the particular body.
Other inventions use neural networks to solve different tasks that do not directly relate to the present invention. However, the present invention in particular enables the generation of the trained neural network which can perform a close approximation of a computationally expensive, physics based, simulation used to generate the training dataset of fitted garments. This close approximation, as generated by the trained neural network, is such that a user can judge, based on a fitting by the trained neural network, whether a garment is likely to fit his or her body well. As the trained neural network generates the fitted garment in a computationally more efficient manner than the physics based simulation, the fitted garment may be generated more quickly and/or with less computational processing power. The fitted garment may also be generated on a mobile user device, e.g. a communication terminal of the user, which has limited computational power, such as a portable phone or a tablet device. Alternatively, the computation may also be performed quickly on a server computer and transmitted to the communication terminal.
In an embodiment, generating the trained neural network is executed on a computer system and the method further comprises a communication terminal receiving the trained neural network from the computer system via a communication network. The method further comprises the communication terminal receiving the selection of the particular garment and the particular body from the user. The method further comprises the communication terminal fitting the particular garment onto the particular body using the trained neural network. The method further comprises the communication terminal showing on the display the fitting of the particular garment onto the particular body.
In an embodiment, training the neural network comprises training a plurality of sub-networks. The sub-networks include a garment sub-network configured to receive the garment as an input. The garment sub-network comprises a plurality of layers. The sub-networks also include a body sub-network configured to receive the body as an input. The body sub-network comprises a plurality of layers. An output of the body sub-network is provided to at least one layer of the garment sub-network, and the garment sub-network is configured to use the output of the body sub-network to fit the garment onto the body.

In an embodiment, training the neural network comprises training a plurality of sub-networks and/or layers. Each sub-network and/or layer has inputs and outputs. The sub-networks and/or layers include a spatial transformer sub-network, a multi-layer perceptron sub-network, and/or a maximum pooling layer. The spatial transformer sub-network is configured to apply a spatial transformation to a 3D input. The multi-layer perceptron sub-network enables detection of geometric features in an input. The maximum pooling layer is configured to reduce a number of parameters in an output from the number of parameters in an input.
In an embodiment, training the neural network comprises determining a fitting error by comparing the garments fitted onto the bodies by the neural network to corresponding fitted garments from the training dataset, and adjusting the neural network using the fitting error.

In an embodiment, determining the fitting error comprises determining a distance term which represents distances between vertices of the garments fitted onto the body by the neural network and corresponding vertices of the corresponding fitted garments from the training dataset. Determining the fitting error further comprises determining an orientation term which represents angular differences between vertex normals of the garments fitted onto the body by the neural network and the corresponding vertex normals of the corresponding fitted garments from the training dataset. Determining the fitting error further comprises determining a penetration term which indicates whether vertices of the garment fitted by the neural network are inside the body onto which the garment has been fitted.
In addition to the computer-implemented method, the present invention also relates to a computer system comprising one or more processors configured to fit a garment onto a body. The one or more processors are configured to fit the garment onto the body by storing a training dataset of fitted garments, generated by fitting 3D representations of garments from a garment dataset onto 3D representations of bodies from a body dataset; generating a trained neural network by training a neural network to fit the garments onto the bodies, using the training dataset; receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset; fitting the particular garment onto the particular body using the trained neural network; and showing on a display the fitting of the particular garment onto the particular body.

In an embodiment, the processors are configured to implement the neural network to comprise a plurality of sub-networks. The sub-networks include a garment sub-network configured to receive the garment as an input. The garment sub-network comprises a plurality of layers. The sub-networks further include a body sub-network configured to receive the body as an input. The body sub-network comprises a plurality of layers. An output of the body sub-network is provided to at least one layer of the garment sub-network, and the garment sub-network is configured to use the output of the body sub-network to fit the garment onto the body.
In an embodiment, the processors are configured to implement the neural network to comprise a plurality of sub-networks and/or layers. Each sub-network and/or layer has inputs and outputs. The sub-networks and/or layers include a spatial transformer sub-network, a multi-layer perceptron sub-network, and/or a maximum pooling layer. The spatial transformer sub-network is configured to apply a spatial transformation to a 3D input. The multi-layer perceptron sub-network enables detection of geometric features in an input. The maximum pooling layer is configured to reduce a number of parameters in an output from the number of parameters in an input.

In an embodiment, the processors are configured to train the neural network by determining a fitting error by comparing the garments fitted onto the bodies by the neural network to corresponding fitted garments from the training dataset, and adjusting the neural network using the fitting error.

In an embodiment, the processors are configured to determine the fitting error by performing the steps of: determining a distance term which represents distances between vertices of the garments fitted onto the body by the neural network and corresponding vertices of the corresponding fitted garments from the training dataset; determining an orientation term which represents angular differences between vertex normals of the garments fitted onto the body by the neural network and the corresponding vertex normals of the corresponding fitted garments from the training dataset; and determining a penetration term which indicates whether vertices of the garment fitted by the neural network are inside the body onto which the garment has been fitted.
In an embodiment, at least one of the processors is arranged on a networked computer system and configured to store the training dataset of fitted garments, to generate the trained neural network, and to transfer the trained neural network via a communication network to a communication terminal. Furthermore, at least one of the processors is arranged on the communication terminal and configured to receive the trained neural network from the networked computer system via the communication network, to receive the selection of the particular garment and the particular body from the user, to fit the particular garment onto the particular body using the trained neural network, and to show on the display the fitting of the particular garment onto the particular body.
Furthermore, in addition to the computer-implemented method and the computer system, the present invention also relates to a computer program product comprising a non-transitory computer readable medium having stored thereon computer code configured to control a processor of a computer system to fit a garment onto a body by performing the steps of: storing a training dataset of fitted garments, generated by fitting 3D representations of garments from a garment dataset onto 3D representations of bodies from a body dataset; generating a trained neural network by training a neural network to fit the garments onto the bodies, using the training dataset; receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset; fitting the particular garment onto the particular body using the trained neural network; and showing on a display the fitting of the particular garment onto the particular body.

Furthermore, in addition to the computer-implemented method and the computer system, the present invention also relates to a computer program product comprising a non-transitory computer readable medium having stored thereon computer code configured to control a processor of a communication terminal to fit a garment onto a body by performing the steps of: receiving from a computer system a trained neural network, trained with a training dataset of fitted garments to fit garments onto bodies; receiving, from a user, a selection of a particular garment and a particular body, which particular garment and which particular body are not in the training dataset; fitting the particular garment onto the particular body using the trained neural network; and showing on a display the fitting of the particular garment onto the particular body.

Brief Description of the Drawings
The present invention will be explained in more detail, by way of example, with reference to the drawings in which:
Figure 1: shows a block diagram illustrating a computer system in communicative connection with a communication terminal for fitting a garment onto a body;

Figure 2: shows a block diagram illustrating schematically a computer system comprising a processor, a neural network, a garment dataset, a body dataset, and a training dataset;

Figure 3: shows a flow diagram illustrating a step for generating a training dataset of fitted garments;
Figure 4: shows a flow diagram illustrating a step for training a neural network;
Figure 5: shows a flow diagram illustrating a step for fitting a particular garment using a trained neural network;
Figure 6: shows a flow diagram illustrating an exemplary sequence of steps for training a neural network;
Figure 7: shows a flow diagram illustrating an exemplary sequence of steps for fitting a garment using a neural network, including transmission of a trained neural network via a communication network;
Figure 8: shows a block diagram illustrating a neural network for fitting a garment onto a body;
Figure 9: shows a flow diagram illustrating an exemplary sequence of steps for fitting a garment using a neural network; and

Figure 10: shows a flow diagram illustrating an exemplary sequence of steps for fitting a garment using a neural network.
Detailed Description of the Preferred Embodiments
In Figures 1 and 2, reference numeral 1 refers to a computer system. The computer system 1 is a desktop computer or a server computer, which comprises one or more processors 2, a memory, and a communications interface for communicating with a communication terminal 7 via a communication network 4. In an embodiment, the computer system 1 is a virtual machine being executed on a server computing system.
As illustrated in Figure 1 , the communication terminal 7, more specifically an electronic communication device such as a desktop computer, a laptop computer, a tablet computer, a mobile radio phone, a smart watch, or the like, comprises a processor 71 and a display 72.
The communication network 4 comprises one or more wired or wireless networks (or set of networks), such as the Internet, LAN (Local Area Network), WLAN (Wireless Local Area Network), and mobile radio networks, such as GSM (Global System for Mobile Communication) or UMTS (Universal Mobile Telephone System) networks.
In Figures 2, 4, 5, 6, 8, 9, and 10, reference numeral 3 refers to a neural network, in particular a feed-forward neural network. The neural network 3 implements a function which can be decomposed into other functions. The neural network 3 comprises a set of parameters which are adjusted during training of the neural network 3. The neural network 3 can be represented as a network structure comprising a set of connected layers, which layers may be grouped into sub-networks. Each layer comprises a set of nodes, each node having adjustable parameters including one or more weight parameters and a bias parameter. A layer may be either an input layer, an output layer, or a hidden layer. The neural network 3 comprises one or more input layers, one or more output layers, and one or more hidden layers. Data is processed by a given layer and provided to one or more subsequent layers. Data being input into the neural network 3 is passed to an input layer. The output of the input layer is passed to a succession of hidden layers, each taking as an input the data output from an antecedent layer, and providing as an input to a subsequent layer its own output.
An output of a given node is a nonlinear weighted sum of its inputs. Specifically, the output of a given node is computed by using an activation function, the inputs of which are the one or more weight parameters of the node, the bias parameter of the node, and all inputs to the given node. If the layer to which the given node belongs is an input layer, then the inputs to the given node are data to be processed by the neural network 3. If the layer to which the given node belongs is an output layer or a hidden layer, then the inputs to the given node are outputs from a previous layer of the neural network 3. The output of the neural network 3 is the data which the one or more output layers provide.
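The node computation described above can be sketched as follows; the choice of ReLU as the activation function is an assumption made here, since the text does not name a specific activation function.

```python
import numpy as np

def node_output(inputs, weights, bias):
    """Output of a single node: an activation function applied to the
    weighted sum of the inputs plus a bias. ReLU is an assumed choice
    of activation; the text does not specify one."""
    z = np.dot(weights, inputs) + bias
    return max(z, 0.0)  # ReLU: pass positive sums, clamp the rest to 0
```

For a node with weights (0.5, -0.25) and bias 0.1, the inputs (1, 2) yield max(0.5·1 - 0.25·2 + 0.1, 0) = 0.1.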
In Figures 2 and 4, reference numeral 4 refers to a garment dataset, which is stored in the computer system 1. The garment dataset 4 is a set of garments 41. A garment 41 is a data file comprising a 3D representation of an item of clothing, and comprises a collection of points in 3D space connected by lines, triangles, curved surfaces, or other geometric entities to form a mesh. The garment 41 may further comprise data describing other garment properties, such as garment dimensions, garment orientations, garment boundaries, colours, and material properties such as weight, texture, and diffusion/reflection properties of the garment 41 enabling an accurate 3D representation of the garment 41 to be generated.
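A garment data file of the kind described can be modelled, for illustration, as a simple mesh record. All field names and default values here are assumptions; the patent does not prescribe a file format.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class GarmentMesh:
    """Illustrative garment data file; field names are assumptions."""
    vertices: np.ndarray             # (N, 3) points in 3D space
    faces: np.ndarray                # (M, 3) vertex indices forming triangles
    colour: tuple = (0.5, 0.5, 0.5)  # RGB colour (example property)
    weight_g_per_m2: float = 150.0   # fabric weight (example property)

# A minimal single-triangle "garment".
shirt = GarmentMesh(
    vertices=np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]]),
    faces=np.array([[0, 1, 2]]),
)
```

A body 51 could be represented analogously, with the body-specific properties (sex, height, measurements, etc.) as additional fields.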
In Figures 2 and 4, reference numeral 5 refers to a body dataset, which is stored in the computer system 1. The body dataset 5 is a set of bodies 51. A body is a data file comprising a 3D representation of a human body, or part of a human body, and comprises a collection of points in 3D space connected by lines, triangles, curved surfaces, or other geometric entities to form a mesh. The body 51 may further comprise data describing other body properties such as sex, height, age, weight, or specific measurements such as arm length, leg length, neck circumference, chest circumference, and waist circumference, etc.
In Figures 2 and 4, reference numeral 6 refers to a training dataset, which is stored in the computer system 1. The training dataset 6 comprises a set of fitted garments 61. A fitted garment 61 is a data file comprising a 3D representation of a specific garment 41 which has been fitted onto a specific body 51 of the training dataset 6. Specifically, the fitted garment has been modified, i.e. deformed, such that it represents the specific garment 41 dressed on or worn by the specific body 51. This training dataset 6 is generated by a computer program, such as a computer-aided design (CAD) program, configured to generate realistic garments which are fitted onto bodies such that they appear natural. This CAD program, or other computer program, preferably uses a physics engine which applies the known behaviour of soft bodies to garments lying on top of, or being worn by, a body. This computer program in particular does not necessarily comprise any learnable parameters, such as comprised in a neural network. The fitted garment 61 comprises a collection of points in 3D space connected by lines, triangles, curved surfaces, or other geometric entities to form a mesh. In an embodiment, the fitted garment 61 is defined by differential data (defining differences with respect to a garment 41 in its initial non-fitted state). The fitted garment 61 may further comprise data describing other properties of the fitted garment 61, such as areas where the fitted garment 61 has been stretched or bunched during fitting.

In Figures 8, 9, and 10, reference numeral 31 refers to a garment sub-network of the neural network 3. Reference numeral 32 refers to a body sub-network of the neural network 3. The body sub-network 32 has as an input the body 51, and the output of the body sub-network 32 is provided to the garment sub-network 31. The garment sub-network 31 has as inputs the garment 41 as well as the output of the body sub-network 32.
The output of the garment sub-network is the fitted garment 61. The separation of the neural network 3 into a body sub-network 32 and a garment sub-network 31 enables the neural network 3 to maintain two separate sets of layers and sub-networks for feature detection of semantically different inputs. The garment sub-network 31 and the body sub-network 32 each comprise a plurality of sub-networks and/or layers including at least a spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352, a multi-layer perceptron sub-network 312, 318, 323, 314, 325, 341, 343, 345, 351, 353, and a maximum pooling layer 315, 326, 344, 354. The spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352 enables the neural network 3 to normalize a 3D input, such that the neural network 3 learns features of a 3D input invariant to spatial transformations of a 3D input, which spatial transformations include translation, scale, rotation, and skew. The spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352 implements a dynamic mechanism which transforms a 3D input by producing an appropriate transformation for each 3D input. The spatial transformer network 311, 313, 317, 319, 321, 324, 328, 352 therefore simplifies the subsequent neural network architecture for fitting a garment 41 onto a body 51, leading to superior performance of fitting the garment 41 onto the body 51. The multi-layer perceptron sub-network 312, 318, 323, 314, 325, 341, 343, 345, 351, 353 consists of one or more layers which are fully connected with each other, and fully connected to antecedent and subsequent layers of the neural network 3. A given layer of the neural network 3 is considered to be fully connected with an antecedent layer if all outputs of the antecedent layer are connected with all inputs of the given layer, and if all inputs of the subsequent layer are connected with all outputs of the given layer.
If a given layer or a given sub-network of the neural network 3 has multiple inputs, the inputs may be concatenated before being processed by the given layer. The maximum pooling layer 315, 326, 344, 354 is a layer which produces a non-linear down-sampling of an input to the maximum pooling layer 315, 326, 344, 354. The down-sampling occurs by defining a filter size, partitioning the input into a set of non-overlapping sub-regions according to the filter size, and outputting only the maximum value of the input within each respective sub-region.
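The down-sampling performed by a maximum pooling layer can be illustrated in one dimension as follows:

```python
import numpy as np

def max_pool_1d(x, filter_size):
    """Non-overlapping 1D max pooling: partition x into sub-regions of
    length filter_size and output only each region's maximum value."""
    n = len(x) - len(x) % filter_size   # drop any ragged tail
    return x[:n].reshape(-1, filter_size).max(axis=1)
```

For example, `max_pool_1d(np.array([1, 3, 2, 5, 4, 0]), 2)` partitions the input into (1, 3), (2, 5), (4, 0) and returns (3, 5, 4), reducing six values to three.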
In an embodiment, in addition to the above-mentioned sub-networks and layers, the neural network 3 further comprises one or more mesh-convolutional networks 342. Mesh-convolutional networks 342 extend the well-known concept of convolutional networks from a 2D image space to a 3D image space. The inclusion of one or more mesh-convolutional networks 342 in the neural network 3 enables the neural network 3 to learn local features of 3D data.
In an embodiment, as illustrated in Figure 9, the body sub-network 32 comprises a spatial transformer network C 321, which takes as an input the body 51 and is connected to a local body features A layer 322. The local body features A layer 322 is provided to the garment sub-network 31 and the multi-layer perceptron D 323. The multi-layer perceptron D 323 is connected to a spatial transformer network D 324. The spatial transformer network D 324 is connected to a multi-layer perceptron E 325. The multi-layer perceptron E 325 is connected to a maximum pooling layer B 326. The maximum pooling layer B 326 is connected to a global body features A layer 327, which global body features layer A 327 is an output of the body sub-network 32 and is provided to the garment sub-network 31.
The garment sub-network 31 comprises a spatial transformer network A 311, which takes as an input the garment 41 and is connected to a multi-layer perceptron A 312. The multi-layer perceptron A 312 also takes as an input the output of the local body features A layer 322 and is connected to a spatial transformer network B 313. The spatial transformer network B 313 is connected to a multi-layer perceptron B 314. The multi-layer perceptron B 314 is connected to a maximum pooling layer A 315. The maximum pooling layer A 315 is connected to the multi-layer perceptron C 316. The multi-layer perceptron C 316 also takes as an input the output of the global body features A layer 327. The output of the multi-layer perceptron C 316 is the fitted garment 61.
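The data flow of the Figure 9 embodiment can be sketched as a composition of placeholder functions. All layer widths are invented, the weights are random (untrained), the spatial transformers are reduced to plain layers with spatial transformer network D (324) omitted, and the way the per-vertex local body features are paired with garment vertices (here: max-pooled into a summary vector and re-broadcast) is an assumption; only the wiring between blocks follows the text.

```python
import numpy as np

rng = np.random.default_rng(1)

def dense(d_in, d_out):
    """A fixed random layer standing in for a trained sub-network."""
    W = rng.normal(size=(d_in, d_out)) / np.sqrt(d_in)
    return lambda x: np.tanh(x @ W)

# Placeholder blocks of Figure 9; widths are invented, weights untrained.
stn_c = dense(3, 3)    # spatial transformer network C (321)
mlp_d = dense(3, 8)    # multi-layer perceptron D (323)
mlp_e = dense(8, 8)    # multi-layer perceptron (325); STN D (324) omitted
stn_a = dense(3, 3)    # spatial transformer network A (311)
mlp_a = dense(6, 16)   # multi-layer perceptron A (312)
stn_b = dense(16, 16)  # spatial transformer network B (313)
mlp_b = dense(16, 16)  # multi-layer perceptron B (314)
mlp_c = dense(40, 3)   # multi-layer perceptron C (316)

def fit(garment, body):
    # Body sub-network 32: 321 -> local body features A (322) -> 323 ->
    # 325 -> max pooling layer B (326) -> global body features A (327).
    local_body = stn_c(body)
    global_body = mlp_e(mlp_d(local_body)).max(axis=0)

    # Garment sub-network 31. How the per-vertex local body features
    # (322) are paired with garment vertices is not specified; pooling
    # them into one summary vector and re-broadcasting is an assumption.
    body_summary = local_body.max(axis=0)
    g = stn_a(garment)                                         # 311
    g = mlp_a(np.concatenate(
        [g, np.broadcast_to(body_summary, g.shape)], axis=1))  # 312
    g = mlp_b(stn_b(g))                                        # 313 -> 314
    pooled = g.max(axis=0)                                     # 315
    # 316: per-vertex features, the pooled garment feature, and the
    # global body features yield one 3D position per garment vertex.
    n = len(g)
    feats = np.concatenate([g,
                            np.broadcast_to(pooled, (n, 16)),
                            np.broadcast_to(global_body, (n, 8))], axis=1)
    return mlp_c(feats)

garment = rng.normal(size=(120, 3))  # 120 garment vertices (illustrative)
body = rng.normal(size=(200, 3))     # 200 body vertices (illustrative)
fitted = fit(garment, body)          # one 3D vertex per garment vertex
```

The sketch preserves the key property of the architecture: the output has one 3D position per garment vertex, conditioned on both local and globally pooled body features.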
In an embodiment, as illustrated in Figure 10, the body sub-network 32 comprises a spatial transformer network G 328, which takes as an input the body 51 and is connected to a local body features B layer 329. The output of the local body features B layer 329 is provided to the garment sub-network 31. The local body features B layer 329 is connected to a multi-layer perceptron I 351. The multi-layer perceptron I 351 is connected to a spatial transformer network H 352. The spatial transformer network H 352 is connected to a multi-layer perceptron J 353. The multi-layer perceptron J 353 is connected to a maximum pooling layer D 354. The maximum pooling layer D 354 is connected to a global body features B layer 355. The output of the global body features B layer 355 is provided to the garment sub-network 31. A selective pooling sub-network 33 takes as inputs the body 51 and the garment 41. The output of the selective pooling sub-network 33 is provided to the garment sub-network 31.
The garment sub-network 31 comprises a spatial transformer network E 317, which takes as an input the garment 41 and is connected to a multi-layer perceptron E 318. The multi-layer perceptron E 318 also takes as inputs the output of the local body features B layer 329 and the output of the global body features B layer 355. The multi-layer perceptron E 318 is connected to a spatial transformer network F 319. The spatial transformer network F 319 is connected to a multi-layer perceptron F 341. The multi-layer perceptron F 341 also takes as inputs the output of the local body features B layer 329 and the output of the global body features B layer 355. The multi-layer perceptron F 341 is connected to a mesh-convolutional layer 342. The mesh-convolutional layer 342 is connected to a multi-layer perceptron G 343. The multi-layer perceptron G 343 also takes as an input the output of the global body features B layer 355. The multi-layer perceptron G 343 is connected to a maximum pooling layer C 344. The maximum pooling layer C 344 is connected to a multi-layer perceptron H 345. The multi-layer perceptron H 345 also takes as inputs the output of the global body features B layer 355 and the output of the selective pooling sub-network 33.
In the following paragraphs, described with reference to Figures 1, 3, 4, 5, 6, 7, 8, 9 and 10, are possible sequences of steps performed by the computer system 1 and/or the communication terminal 7, or their processors 2, 71, respectively, for fitting a garment onto a body.
In Step S1, as seen in Figures 3 and 7, the computer system 1, or its processor(s) 2, respectively, generates a training dataset 6 of fitted garments 61. To generate each fitted garment 61, a garment 41 of the garment dataset 4 and a body 51 of the body dataset 5 are input into a "fitting" program, a computer program e.g. implemented and running in the computer system 1, or its processor(s) 2, respectively, and configured to generate a fitted garment 61. Specifically, the "fitting" program implements a physically based simulation program, which is configured to render the fitted garment 61 in accordance with the laws of physics which govern how the garment 41, which is a soft body with a given form, shape, and material properties, drapes and deforms under the influence of gravity when placed over, i.e. worn by, a solid body 51. Many commercially available software packages allow such a rendering, for example the NvCloth library of NVIDIA. This rendering is computationally expensive, such that it requires the computer system 1 to comprise specific graphics processing units, and such that it takes the computer system 1 considerable electrical energy and computational time to generate the fitting.
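The patent does not disclose the simulator's internals. As a rough illustration of why physically based draping is costly, a single explicit integration step of a toy mass-spring cloth model might look like the following; all parameter values are illustrative assumptions, and real simulators such as NvCloth additionally handle collisions, bending and shear stiffness, and use more stable implicit solvers, which is where most of the cost arises:

```python
import numpy as np

def cloth_step(pos, vel, springs, rest_len, dt=1e-3, k=500.0, mass=0.01,
               gravity=np.array([0.0, -9.81, 0.0]), damping=0.99):
    """One explicit-Euler step of a toy mass-spring cloth: spring forces
    pull vertex pairs toward their rest length while gravity pulls every
    vertex down. Must be repeated thousands of times per drape."""
    forces = np.tile(gravity * mass, (pos.shape[0], 1))  # gravity on every vertex
    for (i, j), l0 in zip(springs, rest_len):
        d = pos[j] - pos[i]
        dist = np.linalg.norm(d)
        if dist > 1e-9:
            f = k * (dist - l0) * d / dist  # Hooke's law along the spring
            forces[i] += f
            forces[j] -= f
    vel = damping * (vel + dt * forces / mass)
    return pos + dt * vel, vel
```

Even this stripped-down step is O(number of springs) per iteration; a production drape with collision detection against the body mesh is far heavier, which motivates replacing the simulator with a trained neural network at inference time.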
Each garment 41 is fitted onto a plurality of bodies 51 to create the training dataset 6 of fitted garments 61. In Step S2, as seen in Figures 4, 6, and 7, a trained neural network 31 is generated, by the computer system 1, from the neural network 3 by training the neural network 3. The inputs to the training of the neural network 3 are the neural network 3 (its architecture and untrained parameters), the garment dataset 4, the body dataset 5, and the training dataset 6. The output of the training of the neural network 3 is the trained neural network 31 (its architecture and trained parameters). As illustrated in Figure 6, Step S2 comprises a plurality of steps.
In Step S21 , a predicted fitted garment 42 is generated by the neural network 3 from the garment 41 and the body 51 . The garment 41 and the body 51 are passed as inputs to the input layers of the neural network 3. The neural network 3 then processes the inputs and generates the predicted fitted garment 42 as an output in an output layer of the neural network 3.
In Step S22, the predicted fitted garment 42 is compared to a fitted garment 61 of the training dataset 6 which corresponds to the garment 41 and body 51, the fitted garment 61 having been generated by the physically based simulation as described above. The predicted fitted garment 42 is compared to the fitted garment 61 to determine a fitting error. The fitting error describes how similar the predicted fitted garment 42 is to the fitted garment 61. The fitting error is computed by a function whose inputs comprise a distance term, an orientation term, and a penetration term. The distance term represents the distances between vertices of the predicted fitted garment 42 and the corresponding vertices of the fitted garment 61, and gives a first indication as to how similar the predicted fitted garment 42 is to the fitted garment 61. The orientation term represents angular differences between vertex normals of the predicted fitted garment 42 and the corresponding vertex normals of the fitted garment 61. Vertex normals are vectors which lie perpendicular to the mesh of the predicted fitted garment 42, or the fitted garment 61, at a given vertex of the predicted fitted garment 42, or the fitted garment 61, respectively. Determining a difference in orientation between vertex normals better indicates how similar the predicted fitted garment 42 is to the fitted garment 61. The penetration term indicates whether the predicted fitted garment 42 lies within the body 51 onto which the garment 41 has been fitted. To check whether the predicted fitted garment 42 lies within the body 51, the distance between the vertices which comprise the predicted fitted garment 42 and the body 51 is determined. A plurality of predicted fitted garments 42 are generated by the neural network 3 and compared with a plurality of fitted garments 61 to generate a fitting error representing an average fitting error.
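The three terms described above might be combined as in the following sketch. The relative weights and the use of a signed-distance function to represent the body interior are assumptions of this illustration, not details disclosed by the patent:

```python
import numpy as np

def fitting_error(pred, target, pred_normals, target_normals,
                  body_sdf, w_dist=1.0, w_orient=0.1, w_pen=10.0):
    """Fitting error combining a distance term, an orientation term, and a
    penetration term. `body_sdf` is a caller-supplied signed-distance
    function of the body (negative inside the body); its exact form and
    the term weights are assumptions of this sketch."""
    # distance term: mean distance between corresponding garment vertices
    dist = np.linalg.norm(pred - target, axis=1).mean()
    # orientation term: mean angular difference between corresponding vertex normals
    cos = np.clip(np.sum(pred_normals * target_normals, axis=1), -1.0, 1.0)
    orient = np.arccos(cos).mean()
    # penetration term: penalize predicted vertices lying inside the body
    pen = np.maximum(-body_sdf(pred), 0.0).mean()
    return w_dist * dist + w_orient * orient + w_pen * pen
```

A perfect prediction (identical vertices and normals, no vertex inside the body) yields an error of zero, and the penetration weight is typically chosen large so the network strongly avoids garment-body interpenetration.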
In Step S23, the average fitting error is compared to a predetermined fitting errorthreshold. The fitting error threshold is a measure of how similar the predicted fitted garment 42 is to the fitted garment 61 . If the average fitting error is smaller than the fitting error threshold, then the neural network 3 is considered trained. If the fitting error is larger than the fitting error threshold, then the neural network 3 is not yet considered trained.
In Step S24, the parameters, i.e. the weight parameters and the bias parameters of the nodes comprising each layer of the neural network 3, are adjusted by back-propagation using the average fitting error. Back-propagation is an algorithm known in the art of neural networks by which the average fitting error is propagated backwards through the neural network 3 and is used to update the parameters of each node.
Training the neural network 3 comprises repeating a cycle of the individual steps S21, S22, S23, and S24 until the parameters of the neural network 3 have been adjusted such that the average fitting error of the neural network 3 lies below the fitting error threshold, upon which the neural network 3 is considered to be a trained neural network 31. Further detailed aspects of training the neural network 3 are known in the art of neural networks. Such aspects include dividing the garment dataset 4, body dataset 5, and training dataset 6 into training, validation, and test sets, and selecting appropriate hyper-parameters for training.
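The S21 to S24 cycle can be sketched in miniature as follows. Here a plain gradient step stands in for back-propagation through the full network, and `predict`, `grad_fn`, and the data layout are assumptions of this illustration:

```python
import numpy as np

def train(predict, params, grad_fn, batches, error_threshold, lr=1e-2,
          max_epochs=1000):
    """The training cycle in miniature: predict (S21), measure the average
    fitting error (S22), compare it to the threshold (S23), and update the
    parameters from gradients (S24, standing in for back-propagation)."""
    avg_error = np.inf
    for _ in range(max_epochs):
        errors = []
        for x, y in batches:
            pred = predict(params, x)                # S21: predict fitted garment
            errors.append(np.mean((pred - y) ** 2))  # S22: per-sample fitting error
        avg_error = np.mean(errors)
        if avg_error < error_threshold:              # S23: compare to threshold
            return params, avg_error                 # network considered trained
        for x, y in batches:                         # S24: parameter update
            params = params - lr * grad_fn(params, x, y)
    return params, avg_error
```

Fitting, say, a one-parameter linear model with this loop converges in a handful of epochs; the real network repeats the same cycle with the fitting error of Step S22 in place of the squared error used here.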
In optional Step S3, illustrated in Figure 1, in an embodiment and/or configuration where the fitting of a garment onto a body is executed on a communication terminal 7, remote from the computer system 1, the trained neural network 31 is transmitted by the computer system 1, via the communication network 4, to the communication terminal 7. In Step S4, as illustrated in Figure 5, the trained neural network 31 - implemented on the computer system 1 or on the communication terminal 7, or their processors 2, 71, respectively - fits a particular garment 8 onto a particular body 9, resulting in a particular fitted garment 10. The particular garment 8 and the particular body 9 are, analogously to the garment 41 and the body 51, 3D representations of a garment and a body, respectively, as detailed above. A user of the computer system 1 or the communication terminal 7 selects the particular garment 8, which particular garment 8 is not necessarily part of the garment dataset 4, to be fitted onto a user-selected or defined particular body 9, which is not necessarily part of the body dataset 5. The particular garment 8 and the particular body 9 may be either stored on a memory of the communication terminal 7, or received, via the communication network 4, from the computer system 1. In an embodiment, the particular body 9 is defined and/or generated by the communication terminal 7, or its processors 71, respectively, e.g. through a 3D scanning process implemented on the communication terminal 7. The processor 71 of the communication terminal 7 uses the trained neural network 31, along with the inputs of the particular garment 8 and the particular body 9, to generate a fitted garment 10.
One skilled in the art will understand that, in an alternative configuration where the trained neural network 31 is implemented on the computer system 1, the processor(s) 2 of the computer system 1 generates the fitted garment 10 using the user-selected garment 8 and body 9. The generation of the fitted garment 10 by the processor 71 is computationally efficient, in terms of energy consumed and computational time required, in comparison to the initial rendering of the fitted garment 61 by the computer system 1. The fitted garment 10 is then shown on the display 72 of the communication terminal 7 by the processor 71. It should be noted that, although the sequence of the steps has been presented in a specific order in the description, one skilled in the art will understand that the computer program code may be structured differently and that the order of at least some of the steps could be altered without deviating from the scope of the invention.

Claims

1. A computer-implemented method of fitting a garment (41, 8) onto a body (51, 9), the method comprising: storing a training dataset (6) of fitted garments (61), the training dataset (6) generated (S1) by fitting 3D representations of garments (41) from a garment dataset (4) onto 3D representations of bodies (51) from a body dataset (5); generating (S2) a trained neural network (31) by training a neural network (3) to fit the garments (41) onto the bodies (51), using the training dataset (6); receiving, from a user, a selection of a particular garment (8) and a particular body (9), which particular garment (8) and which particular body (9) are not in the training dataset (6); fitting (S4) the particular garment (8) onto the particular body (9) using the trained neural network (31); and showing on a display (72) the fitting of the particular garment (8) onto the particular body (9).
2. The method of claim 1, wherein generating (S2) the trained neural network (31) is executed on a computer system (1); and the method further comprises a communication terminal (7) receiving (S3) the trained neural network (31) from the computer system (1) via a communication network (4); the communication terminal (7) receiving the selection of the particular garment (8) and the particular body (9) from the user; the communication terminal (7) fitting (S4) the particular garment (8) onto the particular body (9) using the trained neural network (31); and the communication terminal (7) showing on the display (72) the fitting of the particular garment (8) onto the particular body (9).
3. The method of one of claims 1 or 2, wherein training the neural network (3) comprises training a plurality of sub-networks, including a garment sub-network (31) configured to receive the garment (41, 8) as an input and comprising a plurality of layers, and a body sub-network (32) configured to receive the body (51, 9) as an input and comprising a plurality of layers, whereby an output of the body sub-network (32) is provided to at least one layer of the garment sub-network (31), and the garment sub-network (31) is configured to use the output of the body sub-network (32) to fit the garment (41, 8) onto the body (51, 9).
4. The method of one of claims 1 to 3, wherein training the neural network (3) comprises training a plurality of sub-networks and/or layers, each sub-network and/or layer having inputs and outputs, and the sub-networks and/or layers including at least one of the following: a spatial transformer sub-network (311, 313, 321, 324, 317, 319, 328, 352), configured to apply a spatial transformation to a 3D input; a multi-layer perceptron sub-network (312, 314, 316, 323, 325, 318, 341, 343, 345, 351, 353), enabling detection of geometric features in an input; and a maximum pooling layer (315, 326, 344, 354), configured to reduce a number of parameters in an output from the number of parameters in an input.
5. The method of one of claims 1 to 4, wherein training the neural network (3) comprises: determining (S22) a fitting error by comparing the garments fitted (42) onto the bodies (51) by the neural network (3) to corresponding fitted garments (61) from the training dataset (6); and adjusting (S24) the neural network (3) using the fitting error.
6. The method of claim 5, wherein determining (S22) the fitting error comprises: determining a distance term which represents distances between vertices of the gar ments fitted (42) onto the body ( 51 ) by the neural network (3 ) and corresponding vertices of the corresponding fitted garments (61 ) from the training dataset ( 6); determining an orientation term which represents angular differences between ver tex normals of the garments fitted (42) onto the body ( 51 ) by the neural network ( 3 ) and the corresponding vertex normals of the corresponding fitted garments ( 61 ) from the training dataset ( 6); and determining a penetration term which indicates whether vertices of the garment fit ted (42) by the neural network (3 ) are inside the body ( 51 ) onto which the garment (42) has been fitted.
7. A computer system (1) comprising one or more processors (2, 71) configured to fit a garment (41, 8) onto a body (51, 9) by performing the following steps: storing a training dataset (6) of fitted garments (61), generated (S1) by fitting 3D representations of garments (41) from a garment dataset (4) onto 3D representations of bodies (51) from a body dataset (5); generating (S2) a trained neural network (31) by training a neural network (3) to fit the garments (41) onto the bodies (51), using the training dataset (6); receiving, from a user, a selection of a particular garment (8) and a particular body (9), which particular garment (8) and which particular body (9) are not in the training dataset (6); fitting (S4) the particular garment (8) onto the particular body (9) using the trained neural network (31); and showing on a display (72) the fitting of the particular garment (8) onto the particular body (9).

8. The computer system of claim 7, wherein the processors (2) are configured to implement the neural network (3) to comprise a plurality of sub-networks, including a garment sub-network (31) configured to receive the garment (41, 8) as an input and comprising a plurality of layers, and a body sub-network (32) configured to receive the body (51, 9) as an input and comprising a plurality of layers, whereby an output of the body sub-network (32) is provided to at least one layer of the garment sub-network (31), and the garment sub-network (31) is configured to use the output of the body sub-network (32) to fit the garment (41, 8) onto the body (51, 9).
9. The computer system of claims 7 or 8, wherein the processors (2) are configured to implement the neural network (3) to comprise a plurality of sub-networks and/or layers, each sub-network and/or layer having inputs and outputs, and the sub-networks and/or layers including at least one of the following: a spatial transformer sub-network (311, 313, 321, 324, 317, 319, 328, 352), configured to apply a spatial transformation to a 3D input; a multi-layer perceptron sub-network (312, 314, 316, 323, 325, 318, 341, 343, 345, 351, 353), enabling detection of geometric features in an input; and a maximum pooling layer (315, 326, 344, 354), configured to reduce a number of parameters in an output from the number of parameters in an input.
10. The computer system of one of claims 7 to 9, wherein the processors (2) are configured to train the neural network (3) by performing the steps: determining (S22) a fitting error by comparing the garments fitted (42) onto the bodies (51) by the neural network (3) to corresponding fitted garments (61) from the training dataset (6); and adjusting (S24) the neural network (3) using the fitting error.
11. The computer system of claim 10, wherein the processors (2) are configured to determine (S22) the fitting error by performing the steps: determining a distance term which represents distances between vertices of the garments fitted (42) onto the body (51) by the neural network (3) and corresponding vertices of the corresponding fitted garments (61) from the training dataset (6); determining an orientation term which represents angular differences between vertex normals of the garments fitted (42) onto the body (51) by the neural network (3) and the corresponding vertex normals of the corresponding fitted garments (61) from the training dataset (6); and determining a penetration term which indicates whether vertices of the garment fitted (42) by the neural network (3) are inside the body (51) onto which the garment (42) has been fitted.
12. The computer system of one of claims 7 to 11, wherein at least one of the processors (2) is arranged on a networked computer system (1) and configured to store the training dataset (6) of fitted garments (61), generate the trained neural network (31), and transfer the trained neural network (31) via a communication network (4) to a communication terminal (7); and at least one of the processors (71) is arranged on the communication terminal (7) and configured to receive the trained neural network (31) from the networked computer system (1) via the communication network (4), receive the selection of the particular garment (8) and the particular body (9) from the user, fit (S4) the particular garment (8) onto the particular body (9) using the trained neural network (31), and show on the display (72) the fitting of the particular garment (8) onto the particular body (9).
13. A computer program product comprising a non-transitory computer readable medium having stored thereon computer code configured to control a processor (2) of a computer system (1) to fit a garment (41, 8) onto a body (51, 9) by performing the steps: storing a training dataset (6) of fitted garments (61), generated (S1) by fitting 3D representations of garments (41) from a garment dataset (4) onto 3D representations of bodies (51) from a body dataset (5); generating (S2) a trained neural network (31) by training a neural network (3) to fit the garments (41) onto the bodies (51), using the training dataset (6); receiving, from a user, a selection of a particular garment (8) and a particular body (9), which particular garment (8) and which particular body (9) are not in the training dataset (6); fitting (S4) the particular garment (8) onto the particular body (9) using the trained neural network (31); and showing on a display (72) the fitting of the particular garment (8) onto the particular body (9).
14. A computer program product comprising a non-transitory computer readable medium having stored thereon computer code configured to control a processor (71) of a communication terminal (7) to fit a garment (41, 8) onto a body (51, 9) by performing the steps: receiving (S3) from a computer system (1) a trained neural network (31), trained with a training dataset (6) of fitted garments (61) to fit garments (41, 8) onto bodies (51, 9); receiving, from a user, a selection of a particular garment (8) and a particular body (9), which particular garment (8) and which particular body (9) are not in the training dataset (6); fitting (S4) the particular garment (8) onto the particular body (9) using the trained neural network (31); and showing on a display (72) the fitting of the particular garment (8) onto the particular body (9).
PCT/EP2019/066898 2018-11-16 2019-06-25 Method and devices for fitting a garment onto a body WO2020098982A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CH01427/18 2018-11-16
CH14272018 2018-11-16

Publications (1)

Publication Number Publication Date
WO2020098982A1 true WO2020098982A1 (en) 2020-05-22



