WO2022024200A1 - 3D data system and 3D data generation method - Google Patents
- Publication number
- WO2022024200A1 (PCT/JP2020/028760)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
Definitions
- the present invention relates to a technique for generating 3D modeling data of clothing.
- Patent Document 1 discloses applying a 3D clothing model to a 3D person model.
- the present invention provides a technique for more easily acquiring data for giving motion to 3D modeling data.
- One aspect of the present disclosure is a 3D data system having: an access means for accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to the input layer and the 3D modeling data, bone data, and skin weights of the garment to the output layer as teacher data; an acquisition means for inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the 3D modeling data, bone data, and skin weights of the target garment; and a storage means for storing the 3D modeling data, bone data, and skin weights of the target garment.
- Another aspect of the present disclosure is a 3D data system having: an access means for accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to the input layer and the pattern data used to manufacture the garment to the output layer as teacher data; a first acquisition means for inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the pattern data of the target garment; an output means for outputting, to a 3D model generation system that outputs the 3D modeling data, bone data, and skin weights of a garment from its pattern data, a request for 3D model generation that includes the pattern data of the target garment; a second acquisition means for acquiring the 3D modeling data, bone data, and skin weights of the target garment from the 3D model generation system; and a storage means for storing the 3D modeling data, bone data, and skin weights of the target garment.
- This 3D data system may have a third acquisition means for acquiring the user's 3D modeling data, and a synthesis means for synthesizing the 3D model with the target clothes on the user's 3D model.
- Yet another aspect of the present disclosure is a 3D data generation method including: a step of accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to the input layer and the 3D modeling data, bone data, and skin weights of the garment to the output layer as teacher data; a step of inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the 3D modeling data, bone data, and skin weights of the target garment; and a step of storing the 3D modeling data, bone data, and skin weights of the target garment.
- Yet another aspect of the present disclosure is a 3D data generation method including: a step of accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to the input layer and the pattern data used to manufacture the garment to the output layer as teacher data; a step of inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the pattern data of the target garment; a step of outputting, to a 3D model generation system that outputs the 3D modeling data, bone data, and skin weights of a garment from its pattern data, a request for 3D model generation that includes the pattern data of the target garment; a step of acquiring the 3D modeling data, bone data, and skin weights of the target garment from the 3D model generation system; and a step of storing the 3D modeling data, bone data, and skin weights of the target garment.
- A diagram illustrating the hardware configuration of the server 10; a flowchart illustrating processing related to learning.
- 401...CPU, 402...Memory, 403...Storage, 404...Communication IF, 405...Display, 406...Input device, 410...Housing, 420...Camera, 430...Distance sensor, 440...Computer, 441...CPU
- FIG. 1 is a diagram illustrating an outline of a 3D data system 1 according to an embodiment.
- the 3D data system 1 scans clothes and generates 3D modeling data.
- a machine learning model is used to generate clothing bone data and skin weights.
- the 3D data system 1 has a server 10, a server 20, a terminal device 30, and a 3D scanner 40.
- the 3D scanner 40 is a device that scans an object and generates 3D modeling data representing a 3D model of the object.
- the object is clothing. Scanning an object means photographing its appearance from multiple directions, measuring the distance from a reference position (for example, the position of a depth sensor) to the surface of the object at multiple points, and associating points on the photographed images with the distance data (or depth information). That is, scanning an object yields images with distance data.
- a 3D model is a virtual object that represents an object in virtual space.
- the 3D modeling data is data representing a 3D model.
- Server 10 is a server that manages 3D modeling data.
- the server 10 is managed by the management company of the 3D data system 1.
- the server 20 is a server that provides an application using a 3D model. This application may be provided by the management company of the 3D data system 1 itself, or may be provided by another company.
- the terminal device 30 is a user terminal that uses an application (that is, uses a 3D model).
- the terminal device 30 is an information processing device such as a smartphone, a tablet terminal, or a personal computer.
- the 3D data system 1 may have a plurality of servers 20 and/or a plurality of terminal devices 30.
- the 3D scanner 40 is a device that scans an object and generates 3D modeling data.
- the 3D scanner 40 uploads the generated 3D modeling data to the server 10.
- the 3D data system 1 includes a storage means 11, a learning means 12, a scanning means 41, a processing means 42, an access means 43, a storage means 44, a communication means 45, an output means 31, a control means 32, and an acquisition means 33.
- the storage means 11 stores various data.
- the data stored by the storage means 11 includes the database 111, the machine learning model 112, and the teacher data 113.
- Database 111 is a database that stores 3D data sets for a plurality of objects (clothes in this example).
- the 3D dataset contains, for each of the plurality of objects, 3D modeling data and accompanying data of the object.
- the accompanying data includes, for example, object attributes and an update date.
- the object attribute indicates the attribute of the garment such as the garment ID, the garment name, the brand, the release date, the target gender, and the size.
- the update date indicates the date and time when the 3D data set was updated.
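As an illustration only (the patent does not specify a schema), one record of such a 3D data set, together with its accompanying attributes, might be modeled as follows; all field names and values are assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class GarmentRecord:
    """One 3D data set entry in database 111 (field names are illustrative)."""
    garment_id: str
    name: str
    brand: str
    release_date: str
    target_gender: str
    size: str
    modeling_data: bytes = b""   # serialized 3D modeling data
    updated_at: datetime = field(default_factory=datetime.now)  # update date

# hypothetical example record
record = GarmentRecord("G-001", "Denim Jacket", "ExampleBrand",
                       "2020-07-27", "unisex", "M")
```

A database row would then carry both the 3D modeling data and the attributes used later as search keys (garment ID, brand, and so on).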
- the machine learning model 112 is a model used for machine learning.
- the machine learning model 112 has an input layer, an intermediate layer, and an output layer.
- the machine learning model 112 is, at this point, an untrained model.
- the teacher data 113 includes, for each of the plurality of clothes, an image of the clothes, 3D modeling data of the clothes, bone data, and skin weights. This image was taken by the 3D scanner 40, and the 3D modeling data was generated by the 3D scanner 40.
- a bone represents an element that is a unit of movement for moving a 3D model.
- a correspondence relationship between polygon vertices and bones (so-called skin weights) is set; this setting process is called skinning.
- Bone data and skin weights are, in one example, manually created by a worker (not a general user, but a trained expert).
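Skin weights of this kind are commonly applied with linear blend skinning, where each deformed vertex is the weight-blended sum of its per-bone transformed positions. The patent does not prescribe this formula; the sketch below is a generic illustration with made-up bones and weights:

```python
import numpy as np

def linear_blend_skinning(vertices, weights, bone_transforms):
    """Deform vertices by blending per-bone transforms with skin weights.

    vertices:        (V, 3) rest-pose positions
    weights:         (V, B) skin weights, each row summing to 1
    bone_transforms: (B, 4, 4) homogeneous bone transforms
    """
    V = vertices.shape[0]
    homo = np.hstack([vertices, np.ones((V, 1))])               # (V, 4)
    per_bone = np.einsum("bij,vj->bvi", bone_transforms, homo)  # (B, V, 4)
    blended = np.einsum("vb,bvi->vi", weights, per_bone)        # (V, 4)
    return blended[:, :3]

# Two bones: identity, and a translation of +1 along x.
T = np.stack([np.eye(4), np.eye(4)])
T[1, 0, 3] = 1.0
verts = np.array([[0.0, 0.0, 0.0]])
w = np.array([[0.5, 0.5]])   # vertex influenced equally by both bones
out = linear_blend_skinning(verts, w, T)
# the vertex moves halfway with the translated bone: (0.5, 0, 0)
```

This is why, once bone data and skin weights are set, moving the bones moves the garment mesh with them.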
- the learning means 12 causes the machine learning model 112 to learn using the teacher data 113.
- the teacher data 113 may be added or updated at a predetermined timing, and the machine learning model 112 may be added or relearned using the added or updated teacher data 113.
- the scanning means 41 scans the object.
- the processing means 42 processes the image data with the distance data obtained by scanning the object.
- the access means 43 accesses the machine learning model 112.
- the acquisition means 46 inputs an image of the garment to be processed (hereinafter referred to as “target garment”) into the machine learning model 112, and acquires 3D modeling data, bone data, and skin weight of the target garment.
- the storage means 44 stores various programs and data.
- the communication means 45 communicates with another device such as the server 10 or the terminal device 30.
- the output means 31 outputs 3D modeling data, bone data, and an image according to the skin weight of the target garment.
- FIG. 3 is a diagram illustrating the hardware configuration of the 3D scanner 40.
- the 3D scanner 40 includes a housing 410, a camera 420, a distance sensor (or depth sensor) 430, and a computer 440.
- the camera 420 captures the appearance of the object and outputs image data.
- the distance sensor 430 measures the distance from a reference position (sensor position) to a plurality of points on the surface of the object. The positional relationship between the portion measured by the distance sensor 430 and the portion photographed by the camera 420 is defined in advance.
- the 3D scanner 40 may have a plurality of cameras 420 and a plurality of distance sensors 430.
- the housing 410 supports the camera 420 and the distance sensor 430.
- the housing 410 may have a mechanism for relatively rotating the object and the camera 420 and the distance sensor 430 according to the number and arrangement of the cameras 420 and the distance sensor 430.
- the computer 440 processes the image data output from the camera 420 and the distance data output from the distance sensor 430.
- This process may include a process of mapping distance data on the image. Further, this process may include a process of applying image data and distance data to a predetermined algorithm to generate 3D modeling data.
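As an illustrative sketch of the first of these processes, depth measurements can be mapped onto the image by projecting each 3D point through a pinhole camera model and recording its depth at the resulting pixel. The intrinsic matrix, sizes, and function name below are assumptions, not details from the patent:

```python
import numpy as np

def map_depth_to_image(points_3d, K, image_shape):
    """Project depth-sensor points into the camera image, producing a
    sparse per-pixel depth map (NaN where no measurement landed)."""
    H, W = image_shape
    depth = np.full((H, W), np.nan)
    uvw = (K @ points_3d.T).T              # (N, 3) homogeneous pixel coords
    uv = uvw[:, :2] / uvw[:, 2:3]          # perspective divide
    for (u, v), z in zip(uv, points_3d[:, 2]):
        ui, vi = int(round(u)), int(round(v))
        if 0 <= vi < H and 0 <= ui < W:
            # keep the nearest surface if two points project to one pixel
            if np.isnan(depth[vi, ui]) or z < depth[vi, ui]:
                depth[vi, ui] = z
    return depth

# Assumed intrinsics: focal length 100 px, principal point at (32, 32).
K = np.array([[100.0,   0.0, 32.0],
              [  0.0, 100.0, 32.0],
              [  0.0,   0.0,  1.0]])
pts = np.array([[0.0, 0.0, 2.0]])          # one point on the optical axis, 2 m away
dmap = map_depth_to_image(pts, K, (64, 64))
# the principal-point pixel now carries depth 2.0
```

The predefined positional relationship between camera 420 and distance sensor 430 is what makes such a projection possible.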
- the computer 440 has a CPU (Central Processing Unit) 401, a memory 402, a storage 403, a communication IF 404, a display 405, and an input device 406.
- the CPU 401 is a control device that performs various operations according to a program.
- the memory 402 is a main storage device that functions as a work area when the CPU 401 executes processing.
- the memory 402 includes, for example, RAM (Random Access Memory) and ROM (Read Only Memory).
- the storage 403 is an auxiliary storage device that stores various data and programs. Storage 403 includes, for example, SSD and / or HDD.
- the communication IF 404 is a device that communicates with another device according to a predetermined communication standard (for example, Ethernet), and includes, for example, a NIC (Network Interface Card).
- the display 405 is a device that outputs visual information, and includes, for example, an LCD (Liquid Crystal Display).
- the input device 406 is a device for inputting instructions or information to the computer 440 in response to a user operation, and includes, for example, at least one of a touch screen, a keyboard, a keypad, a mouse, and a microphone.
- the program stored in the storage 403 includes a program for making the computer device function as the computer 440 in the 3D data system 1 (hereinafter referred to as "3D model generation program").
- when the CPU 401 executes the 3D model generation program, the functions of FIG. 2 are implemented in the computer device.
- the camera 420, the distance sensor 430, and the CPU 401 are examples of the scanning means 41 while the CPU 401 is executing the 3D model generation program.
- the CPU 401 is an example of the processing means 42, the access means 43, and the acquisition means 46.
- At least one of the storage 403 and the memory 402 is an example of the storage means 44.
- the communication IF 404 is an example of the communication means 45.
- FIG. 4 is a diagram illustrating a hardware configuration of the server 10.
- the server 10 is a computer device having a CPU 101, a memory 102, a storage 103, and a communication IF 404.
- the CPU 101 is a control device that performs various operations according to a program.
- the memory 102 is a main storage device that functions as a work area when the CPU 101 executes processing.
- the memory 102 includes, for example, RAM and ROM.
- the storage 103 is an auxiliary storage device that stores various data and programs.
- the storage 103 includes, for example, SSD and / or HDD.
- the communication IF 104 is a device that communicates with another device according to a predetermined communication standard (for example, Ethernet), and includes, for example, a NIC.
- the program stored in the storage 103 includes a program for making the computer device function as the server 10 in the 3D data system 1 (hereinafter referred to as "server program").
- when the CPU 101 executes the server program, the functions of FIG. 2 are implemented in the computer device.
- At least one of the storage 103 and the memory 102 is an example of the storage means 11 in a state where the CPU 101 is executing the server program.
- the CPU 101 is an example of the learning means 12.
- each of the server 20 and the terminal device 30 has a hardware configuration as a computer device.
- the display included in the terminal device 30 is an example of the output means 31.
- the CPU included in the terminal device 30 is an example of the acquisition means 33.
- the operation of the 3D data system 1 will be described below.
- the operation of the 3D data system 1 is broadly divided into learning and data generation.
- Learning is a process of training the machine learning model 112.
- the data generation is a process of generating 3D modeling data using the machine learning model 112.
- FIG. 5 is a flowchart illustrating a process related to learning.
- the flow of FIG. 5 is started, for example, when a predetermined start condition is satisfied.
- This start condition is, for example, a condition that new data is added to the teacher data 113.
- step S101 the storage means 11 stores the machine learning model 112.
- the machine learning model 112 is a model that has not been trained.
- step S102 the storage means 11 stores the teacher data 113.
- the learning means 12 gives the teacher data 113 to the machine learning model 112 and trains it. Specifically, the learning means 12 gives the photographed images of the clothes to the input layer, and the 3D modeling data, bone data, and skin weights of the clothes to the output layer, as teacher data.
- the learning means 12 ends the learning.
- the machine learning model 112 becomes a trained model.
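The supervised setup of steps S101 onward (scan data given to the input layer, 3D modeling data, bone data, and skin weights given to the output layer as teacher data) can be illustrated in miniature with a toy regression model. Everything below — the sizes, the synthetic data, and the single linear layer — is a stand-in, not the patent's actual network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for teacher data 113: per garment, a flattened scan feature
# vector (image + distance data) and a flattened target vector (3D modeling
# data, bone data, skin weights). All sizes are illustrative.
X = rng.normal(size=(200, 16))                     # scan features
true_W = rng.normal(size=(16, 8))                  # hidden ground truth
Y = X @ true_W + 0.01 * rng.normal(size=(200, 8))  # teacher targets

# One linear layer fitted by gradient descent on mean squared error --
# a toy stand-in for the trained input-to-output mapping.
W = np.zeros((16, 8))
lr = 0.1
losses = []
for _ in range(300):
    err = X @ W - Y                                # prediction error
    losses.append(float(np.mean(err ** 2)))
    W -= lr * (X.T @ err) / len(X)                 # gradient step

# after training, predictions match the teacher targets far better
# than at the start, i.e. the model has become a "trained model"
```

The real system would use a multi-layer model with an intermediate layer, as stated above, but the training loop has the same shape: feed inputs, compare outputs against teacher data, adjust parameters.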
- an example of learning the machine learning model 112 using the teacher data 113 has been described, but the same applies to the case of learning the machine learning model 212 using the teacher data 213.
- FIG. 6 is a sequence chart illustrating a process related to data generation. The process of FIG. 6 is started, for example, when a predetermined start condition is satisfied. This start condition is, for example, a condition that the user has instructed the 3D scanner 40 to generate 3D modeling data.
- step S201 the scanning means 41 scans an object, that is, clothes.
- the storage means 44 stores the image data and the distance data obtained by the scanning means 41 (step S202).
- step S203 the access means 43 accesses the machine learning model 112. Specifically, the access means 43 inputs the image data and the distance data obtained by the scanning means 41 into the machine learning model 112.
- the machine learning model 112 outputs 3D modeling data, bone data, and skin weights corresponding to the input image data and distance data (step S204).
- the access means 43 requests that the 3D modeling data, bone data, and skin weights output from the machine learning model 112 be written to the database 111 (step S205). This request includes ancillary data about this garment, such as a garment ID.
- the garment ID is input, for example, by a user who operates the 3D scanner 40.
- the storage means 11 stores the 3D modeling data, bone data, skin weights, and attached data output from the machine learning model 112 in the database 111 (step S206).
- the acquisition means 33 requests the server 10 to provide output data, in response to an instruction from the user or automatically by a program (step S206).
- the user of the terminal device 30 may be a user different from the user who has operated the 3D scanner 40 to scan the target clothing, or may be the same user.
- This request includes information that identifies the garment, such as the garment ID.
- the output data refers to the 3D modeling data itself, or to data showing the result of processing the 3D modeling data (for example, a moving image in which the 3D model performs a predetermined or instructed motion).
- the server 10 outputs the requested output data. For example, the server 10 outputs the 3D modeling data read from the database 111 to the terminal device 30.
- the acquisition means 33 acquires output data from the server 10 (step S207).
- the output means 31 outputs an image using the output data (step S208).
- the output means 31 may output an image in which the target garment is put on the user's 3D model, or an image of the target garment alone (without the user).
- the control means 32 acquires the user's 3D modeling data (an example of the third acquisition means) and performs a process of synthesizing the user's 3D model and the garment's 3D model (an example of the synthesis means). Since bone data and skin weights are set in the 3D modeling data of the target garment, the garment can move following the movement of the user's 3D model.
- the user's 3D modeling data is stored in a database such as database 111.
- FIG. 7 is a diagram illustrating the functional configuration of the 3D data system 2 according to the second embodiment.
- the 3D data system 2 includes a storage means 11, a learning means 12, a scanning means 41, a processing means 42, an access means 43, a storage means 44, a communication means 45, a generation means 47, an output means 48, an output means 31, and a control means 32. And the acquisition means 33.
- the storage means 11 stores various data.
- the data stored by the storage means 11 includes the database 111, the machine learning model 212, and the teacher data 213.
- Machine learning model 212 is a model used for machine learning.
- the machine learning model 212 has an input layer, an intermediate layer, and an output layer.
- the machine learning model 212 is, at this point, an untrained model.
- the teacher data 213 includes, for each of the plurality of clothes, an image of the clothes, 3D modeling data of the clothes, and pattern data.
- the pattern data is data showing the shape and size of the paper pattern used when making the clothes.
- the pattern data is provided, for example, by the business operator that made the garment. Alternatively, the pattern data may be created manually by an operator of the 3D data system 2, for example by disassembling the clothes.
- the learning means 12 causes the machine learning model 212 to learn using the teacher data 213.
- the teacher data 213 may be added or updated at a predetermined timing, and the machine learning model 212 may be added or relearned using the added or updated teacher data 213.
- the access means 43 accesses the machine learning model 212.
- the access means 43 inputs an image of the target garment into the machine learning model 212 and acquires the pattern data of the target garment (an example of the first acquisition means).
- the output means 48 outputs a request for 3D model generation to the generation means 47.
- This request includes the pattern data of the target garment.
- when pattern data of clothes is input, the generation means 47 generates the 3D modeling data, bone data, and skin weights of the clothes.
- the generation means 47 generates the 3D modeling data, bone data, and skin weights from the clothing pattern data by a predetermined algorithm. Alternatively, the generation means 47 may generate them using machine learning.
- in that case, for each of a plurality of clothes, pattern data is given to the input layer, and 3D modeling data, bone data, and skin weights are given to the output layer, as teacher data.
- the access means 43 acquires 3D modeling data, bone data, and skin weights from the generation means 47 (an example of the second acquisition means).
- FIG. 8 is a sequence chart illustrating the data generation process according to the second embodiment.
- the process of FIG. 8 is started, for example, when a predetermined start condition is satisfied.
- This start condition is, for example, a condition that the user has instructed the 3D scanner 40 to generate 3D modeling data.
- step S301 the scanning means 41 scans an object, that is, clothes.
- the storage means 44 stores the image data and the distance data obtained by the scanning means 41 (step S302).
- step S303 the access means 43 accesses the machine learning model 212. Specifically, the access means 43 inputs the image data and the distance data obtained by the scanning means 41 into the machine learning model 212.
- the machine learning model 212 outputs pattern data corresponding to the input image data and distance data (step S304).
- the access means 43 requests the generation means 47 to generate a 3D model (step S305). This request includes the pattern data of the target garment.
- in response to this request, the generation means 47 generates 3D modeling data, bone data, and skin weights for the target garment (step S306).
- the access means 43 acquires the 3D modeling data, bone data, and skin weights of the target garment generated by the generation means 47 (step S307).
- the access means 43 requests that the 3D modeling data, bone data, and skin weights of the target garment be written to the database 111 (step S308). This request includes accompanying data about the garment, such as a garment ID, which is input, for example, by the user who operates the 3D scanner 40.
- in the server 10, the storage means 11 stores the 3D modeling data, bone data, skin weights, and accompanying data in the database 111 (step S309).
- the acquisition means 33 requests the server 10 to provide output data, in response to an instruction from the user or automatically by a program (step S310).
- the server 10 provides output data to the terminal device 30 (step S310).
- the output means 31 outputs an image using the output data (step S311).
- the devices constituting the 3D data system 1 and the assignment of functions to each device are not limited to the examples described in the embodiments.
- the machine learning model 112 may be mounted on a device different from the server 10.
- the machine learning model 212 and the generation means 47 may be mounted on different devices.
- the functions corresponding to the processing means 42 and the access means 43 may be implemented in the server 10 or another server instead of the 3D scanner 40.
- in this case, the 3D scanner 40 only scans the target clothing and accepts input of accompanying data, and the subsequent processing is performed by that server.
- at least a part of the functions described as having the server 10 in the embodiment may be implemented in the 3D scanner 40.
- the 3D data system 1 may change the degree of automation according to the amount of teacher data 113. For example, if the amount of teacher data 113 is less than a first threshold and the machine learning model 112 has not been trained, the access means 43 of the 3D scanner 40 accesses the database 111 and searches for clothing similar to the scanned clothing. Similar clothes are searched for using, for example, the image data as a key. In addition, or instead, similar garments may be searched for using information contained in the accompanying data, such as a brand name or model number, as a key. When similar 3D modeling data is found, the 3D scanner 40 outputs a screen for editing the found 3D modeling data and the corresponding bone data and skin weights.
- on this screen, the user can edit the 3D modeling data of a garment similar to the target garment to generate the 3D modeling data, bone data, and skin weights of the target garment. The same applies to the 3D data system 2.
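The similarity search in this modification could, as one hedged sketch, be implemented as a nearest-neighbor lookup over feature vectors (derived from the image data or accompanying data) using cosine similarity. The feature vectors and their extraction are assumptions; the patent only specifies that similar garments are searched for by key:

```python
import numpy as np

def most_similar(query_feat, db_feats):
    """Return the index of the database garment whose feature vector is
    closest to the query by cosine similarity."""
    q = query_feat / np.linalg.norm(query_feat)
    D = db_feats / np.linalg.norm(db_feats, axis=1, keepdims=True)
    return int(np.argmax(D @ q))

# Hypothetical 2-D feature vectors for three stored garments.
db = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [0.7, 0.7]])
idx = most_similar(np.array([0.9, 0.1]), db)  # nearest: index 0
```

The found garment's 3D modeling data, bone data, and skin weights would then be offered on the editing screen as a starting point.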
- when the amount of teacher data 113 exceeds the first threshold, the 3D scanner 40 outputs a screen for the user to edit the 3D modeling data, bone data, and skin weights generated for the target garment. On this screen, the user can edit the 3D modeling data, bone data, and skin weights of the target garment. Note that this process may be performed when the amount of teacher data 113 is larger than the first threshold and smaller than a second threshold in the above modification.
- the server 10 may be a physical server or a so-called virtual server on the cloud.
- the program executed by the CPU 441 or the like may be downloaded via a network such as the Internet, or may be provided in a state of being recorded on a recording medium such as a CD-ROM.
Description
Next, a second embodiment will be described. Elements whose general functions are the same as in the first embodiment are given the same reference signs, and detailed description of them is omitted.
The present invention is not limited to the embodiments described above, and various modifications are possible. Several modifications are described below. Two or more of the items described in the following modifications may be applied in combination.
Claims (5)
- A 3D data system comprising: an access means for accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to an input layer and the 3D modeling data, bone data, and skin weights of the garment to an output layer as teacher data; an acquisition means for inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the 3D modeling data, bone data, and skin weights of the target garment; and a storage means for storing the 3D modeling data, bone data, and skin weights of the target garment.
- A 3D data system comprising: an access means for accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to an input layer and the pattern data used to manufacture the garment to an output layer as teacher data; a first acquisition means for inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the pattern data of the target garment; an output means for outputting, to a 3D model generation system that outputs the 3D modeling data, bone data, and skin weights of a garment using the pattern data of the garment, a request for generation of a 3D model, the request including the pattern data of the target garment; a second acquisition means for acquiring the 3D modeling data, bone data, and skin weights of the target garment from the 3D model generation system; and a storage means for storing the 3D modeling data, bone data, and skin weights of the target garment.
- The 3D data system according to claim 1 or 2, further comprising: a third acquisition means for acquiring a user's 3D modeling data; and a synthesis means for synthesizing a 3D model in which the target garment is put on the user's 3D model.
- A 3D data generation method comprising: a step of accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to an input layer and the 3D modeling data, bone data, and skin weights of the garment to an output layer as teacher data; a step of inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the 3D modeling data, bone data, and skin weights of the target garment; and a step of storing the 3D modeling data, bone data, and skin weights of the target garment.
- A 3D data generation method comprising: a step of accessing a machine learning model trained, for each of a plurality of garments, by giving image data and distance data obtained by scanning the garment to an input layer and the pattern data used to manufacture the garment to an output layer as teacher data; a step of inputting image data and distance data obtained by scanning a target garment into the machine learning model and acquiring the pattern data of the target garment; a step of outputting, to a 3D model generation system that outputs the 3D modeling data, bone data, and skin weights of a garment using the pattern data of the garment, a request for generation of a 3D model, the request including the pattern data of the target garment; a step of acquiring the 3D modeling data, bone data, and skin weights of the target garment from the 3D model generation system; and a step of storing the 3D modeling data, bone data, and skin weights of the target garment.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/028760 WO2022024200A1 (ja) | 2020-07-27 | 2020-07-27 | 3D data system and 3D data generation method |
US18/017,773 US20230196667A1 (en) | 2020-07-27 | 2020-07-27 | 3D data system and 3D data generating method |
CN202080069529.2A CN115004240A (zh) | 2020-07-27 | 2020-07-27 | 3D data system and 3D data generation method |
JP2020545614A JP6804125B1 (ja) | 2020-07-27 | 2020-07-27 | 3D data system and 3D data generation method |
EP20946641.6A EP4191540A1 (en) | 2020-07-27 | 2020-07-27 | 3D data system and 3D data generation method |
TW110126281A TWI771106B (zh) | 2020-07-27 | 2021-07-16 | 3D data system and 3D data generation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/028760 WO2022024200A1 (ja) | 2020-07-27 | 2020-07-27 | 3D data system and 3D data generation method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022024200A1 (ja) | 2022-02-03 |
Family
ID=73836106
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/028760 WO2022024200A1 (ja) | 2020-07-27 | 2020-07-27 | 3dデータシステム及び3dデータ生成方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20230196667A1 (ja) |
EP (1) | EP4191540A1 (ja) |
JP (1) | JP6804125B1 (ja) |
CN (1) | CN115004240A (ja) |
TW (1) | TWI771106B (ja) |
WO (1) | WO2022024200A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022269741A1 (ja) * | 2021-06-22 | 2022-12-29 | 株式会社Vrc | Information processing device, 3D system, and information processing method |
JP7418677B1 (ja) | 2023-07-10 | 2024-01-22 | 株式会社Laila | Information processing system, information processing method, and information processing program |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011113135A (ja) * | 2009-11-24 | 2011-06-09 | Nippon Hoso Kyokai <Nhk> | Virtual character generation device and virtual character generation program |
JP2017037637A (ja) | 2015-07-22 | 2017-02-16 | アディダス アーゲー | Method and apparatus for generating an artificial picture |
JP2017037424A (ja) * | 2015-08-07 | 2017-02-16 | 日本放送協会 | Learning device, recognition device, learning program, and recognition program |
JP2019204476A (ja) * | 2018-05-17 | 2019-11-28 | 株式会社Preferred Networks | Image generation device, image generation method, and program |
JP2020005192A (ja) * | 2018-06-29 | 2020-01-09 | キヤノン株式会社 | Information processing device, information processing method, and program |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
NL1037949C2 (nl) * | 2010-05-10 | 2011-11-14 | Suitsupply B V | Method for remotely determining clothing sizes |
CN103366402B (zh) * | 2013-08-05 | 2015-12-09 | 上海趣搭网络科技有限公司 | Rapid pose synchronization method for three-dimensional virtual clothing |
DE102015210453B3 (de) * | 2015-06-08 | 2016-10-13 | Bitmanagement Software GmbH | Method and device for generating data for a two- or three-dimensional representation of at least part of an object and for generating the two- or three-dimensional representation of at least that part of the object |
CN105139446A (zh) * | 2015-08-07 | 2015-12-09 | 河海大学常州校区 | Kinect-based holographic virtual fitting system |
CN107067460A (zh) * | 2016-01-07 | 2017-08-18 | 广东京腾科技有限公司 | Virtual fitting method, device, and system |
CN105528808B (zh) * | 2016-02-29 | 2018-07-24 | 华中师范大学 | Method and system for synthesizing digital three-dimensional models of clay sculpture figures from Jingchu folk tales |
CN106228592A (zh) * | 2016-09-12 | 2016-12-14 | 武汉布偶猫科技有限公司 | Method for automatically binding skinning information to a three-dimensional garment model |
CN110021061B (zh) * | 2018-01-08 | 2021-10-29 | Oppo广东移动通信有限公司 | Outfit-matching model construction method, clothing recommendation method, device, medium, and terminal |
CN108537888B (zh) * | 2018-04-09 | 2020-05-12 | 浙江大学 | Skeleton-based rapid garment fitting method |
CN110634177A (zh) * | 2018-06-21 | 2019-12-31 | 华为技术有限公司 | Object modeling and motion method, device, and equipment |
CN109615683A (zh) * | 2018-08-30 | 2019-04-12 | 广州多维魔镜高新科技有限公司 | Method for producing 3D game animation models based on 3D garment models |
CN109427007B (zh) * | 2018-09-17 | 2022-03-18 | 叠境数字科技(上海)有限公司 | Multi-view-based virtual garment fitting method |
CN111369649B (zh) * | 2018-12-26 | 2023-09-01 | 苏州笛卡测试技术有限公司 | Method for producing computer skinning animation based on a high-precision three-dimensional scan model |
CN109871589A (zh) * | 2019-01-23 | 2019-06-11 | 广东康云科技有限公司 | Intelligent garment-making system and method based on three-dimensional human body modeling |
CN109949208B (zh) * | 2019-02-21 | 2023-02-07 | 深圳市广德教育科技股份有限公司 | Internet-based automatic 3D garment pattern generation system |
CN110189413A (zh) * | 2019-05-31 | 2019-08-30 | 广东元一科技实业有限公司 | Method and system for generating garment deformation models |
CN110838182B (zh) * | 2019-11-13 | 2024-04-09 | 恒信东方文化股份有限公司 | Method and system for fitting an image to a mannequin |
CN111028321A (zh) * | 2019-12-16 | 2020-04-17 | 网易(杭州)网络有限公司 | Skinning detection method, device, and electronic terminal |
CN111311751A (zh) * | 2020-02-12 | 2020-06-19 | 叠境数字科技(上海)有限公司 | Deep-neural-network-based method for reconstructing three-dimensional garment models |
2020
- 2020-07-27 EP EP20946641.6A patent/EP4191540A1/en active Pending
- 2020-07-27 WO PCT/JP2020/028760 patent/WO2022024200A1/ja active Application Filing
- 2020-07-27 CN CN202080069529.2A patent/CN115004240A/zh active Pending
- 2020-07-27 JP JP2020545614A patent/JP6804125B1/ja active Active
- 2020-07-27 US US18/017,773 patent/US20230196667A1/en active Pending
2021
- 2021-07-16 TW TW110126281A patent/TWI771106B/zh active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7220324B1 (ja) * | 2022-09-30 | 2023-02-09 | 株式会社Zozo | System, method, and program for grading |
WO2024070000A1 (ja) * | 2022-09-30 | 2024-04-04 | 株式会社Zozo | System, method, and program for grading |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022024200A1 (ja) | 2022-02-03 |
CN115004240A (zh) | 2022-09-02 |
US20230196667A1 (en) | 2023-06-22 |
JP6804125B1 (ja) | 2020-12-23 |
TW202209266A (zh) | 2022-03-01 |
TWI771106B (zh) | 2022-07-11 |
EP4191540A1 (en) | 2023-06-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022024200A1 (ja) | 3D data system and 3D data generation method | |
Osman et al. | Star: Sparse trained articulated human body regressor | |
JP5895703B2 (ja) | Image processing device, image processing method, and computer program | |
CN112598785B (zh) | Method, device, equipment, and storage medium for generating a three-dimensional model of an avatar | |
EP3454302A1 (en) | Approximating mesh deformation for character rigs | |
JP2018532216A (ja) | Image regularization and retargeting system | |
CA3090747C (en) | Automatic rig creation process | |
JPH04195476A (ja) | Computer graphics display method and information processing device | |
JP2019204476A (ja) | Image generation device, image generation method, and program | |
KR20230004837A (ko) | Generative nonlinear human shape model | |
Danckaers et al. | Posture normalisation of 3D body scans | |
JP2019175321A (ja) | Image evaluation device, image evaluation method, and computer program | |
JP7287038B2 (ja) | Font selection device and program | |
US20230079478A1 (en) | Face mesh deformation with detailed wrinkles | |
JP2018197927A (ja) | Information processing device, information processing system, information processing method, and program | |
WO2020261531A1 (ja) | Information processing device, method for generating a trained makeup-simulation model, method for executing makeup simulation, and program | |
CN111696180A (zh) | Method, system, device, and storage medium for generating a virtual simulated human | |
JP2022024189A (ja) | Training data creation method, training data creation device, and program | |
JP7482551B2 (ja) | Mental image visualization method, mental image visualization device, and program | |
WO2020170845A1 (ja) | Information processing device and program | |
JP6593112B2 (ja) | Image processing device, display system, and program | |
JP2023080963A (ja) | Learning system, learning system control method, and program | |
CN114556435A (zh) | Information processing equipment, information processing method, and program | |
Nolte | Mosquito popper: a multiplayer online game for 3D human body scan data segmentation | |
JP2023110395A (ja) | Virtual garment generation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | ENP | Entry into the national phase | Ref document number: 2020545614; Country of ref document: JP; Kind code of ref document: A |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20946641; Country of ref document: EP; Kind code of ref document: A1 |
 | WWE | Wipo information: entry into national phase | Ref document number: 2020946641; Country of ref document: EP |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | ENP | Entry into the national phase | Ref document number: 2020946641; Country of ref document: EP; Effective date: 20230227 |