WO2021107202A1 - Three-dimensional modeling method of clothes - Google Patents
Three-dimensional modeling method of clothes
- Publication number
- WO2021107202A1 (PCT/KR2019/016643)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display shape
- space
- clothes
- shape
- display
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/503—Blending, e.g. for anti-aliasing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2024—Style variation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- The present invention relates to a three-dimensional modeling method of clothes in consideration of a background element.
- The clothing-related business of designing, manufacturing, and selling clothes, traditionally conducted offline, has expanded widely through online shopping malls, which have grown with the recent development of the Internet.
- This clothing-related industry is expected to continue developing in the future.
- The present invention aims to make it possible to visually examine, at the manufacturing stage, whether the clothes being produced fit bodies of various sizes.
- The present invention aims to model more naturally the situation in which clothes are worn on a person's body.
- The present invention aims to model clothes in various spaces.
- The present invention aims to model clothes in consideration of the lighting conditions of various spaces.
- A 3D modeling method of clothes in consideration of a background element includes: determining, based on a user's input, a first observation direction in which to observe the clothes in a first space serving as a background for displaying the clothes; determining a display shape of the clothes according to the first observation direction; determining a display shape of the first space according to the first observation direction; when the display shape of the clothes is overlapped and displayed on the display shape of the first space, extracting, from the display shape of the clothes, a fusion part requiring fusion of the display shape of the first space and the display shape of the clothes, and determining the shape of the extracted fusion part; and displaying the display shape of the clothes overlapped on the display shape of the first space, including the fusion part.
- The determining of the shape of the fusion part may include: determining, on the display shape of the clothes, a first fusion part, which is a portion where the display shape of the clothes and the display shape of the first space are in contact; and determining, on the display shape of the clothes, a second fusion part, which is a portion where the display shape of the first space shows through the display shape of the clothes.
- The determining of the shape of the fusion part may also include: determining the color of the fusion part by mixing, at a predetermined ratio, the display color according to the display shape of the clothes in the fusion part and the display color according to the display shape of the first space; and determining the pattern of the fusion part by mixing, at a predetermined ratio, the display pattern according to the display shape of the clothes in the fusion part and the display pattern according to the display shape of the first space.
- The method may further include, in response to a user input changing the observation direction from the first observation direction to a second observation direction, determining and displaying each of the display shape of the clothes, the display shape of the first space, and the fusion part according to the second observation direction.
- The three-dimensional modeling method of clothes may further include, after determining the first observation direction, determining a light irradiation direction, which is the direction in which light is irradiated in the first space.
- In this case, the determining of the display shape of the clothes may determine the display shape of the clothes in consideration of the light along the light irradiation direction, and the determining of the display shape of the first space may determine the display shape of the first space in consideration of the light along the light irradiation direction.
- The determining of the light irradiation direction may include: matching the first space to a sphere-shaped space centered on the position of the clothes; and determining, as the light irradiation direction, any one of a plurality of directions passing through the spherical space.
- The determining of the display shape of the clothes may include determining the display shape of the clothes in consideration of the clothes positioned at the center of the spherical space, the first observation direction, and the light irradiation direction.
- The determining of the display shape of the first space may include determining the display shape of the first space in consideration of the shape of the first space projected on the inner surface of the spherical space, the first observation direction, and the light irradiation direction.
- According to the present invention, it is possible to visually examine, at the manufacturing stage, whether the clothes being manufactured fit bodies of various sizes.
- The present invention can more naturally model the situation in which clothes are worn on a person's body.
- The present invention can model clothes in various spaces and, in particular, makes it possible to model clothes naturally.
- The present invention enables modeling of clothes in consideration of the lighting conditions of various spaces.
- The present invention makes it possible to easily modify the clothes at each stage of clothing modeling.
- FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention.
- FIG. 2 is a block diagram for explaining the internal configuration of a user terminal and a server according to an embodiment of the present invention.
- FIG. 3 is an example of the pattern data generation screen 410.
- FIG. 4 is an example of a screen 420 on which a three-dimensional shape of clothes is displayed.
- FIG. 5 is an example of a screen 430 that determines a first observation direction based on a user's input.
- FIG. 6 is a view for explaining a process in which the processor 112 determines a light irradiation direction according to an embodiment of the present invention.
- FIGS. 7 and 8 are diagrams for explaining a process in which the processor 112 determines the first fusion part according to an embodiment of the present invention.
- FIGS. 9 and 10 are diagrams for explaining a process in which the processor 112 determines the second fusion part according to an embodiment of the present invention.
- FIG. 11 is a flowchart illustrating a 3D modeling method of clothes performed by the user terminal 100 according to an embodiment of the present invention.
- A 3D modeling method of clothes in consideration of a background element includes: determining, based on a user's input, a first observation direction in which to observe the clothes in a first space serving as a background for displaying the clothes; determining a display shape of the clothes according to the first observation direction; determining a display shape of the first space according to the first observation direction; when the display shape of the clothes is overlapped and displayed on the display shape of the first space, extracting, from the display shape of the clothes, a fusion part requiring fusion of the display shape of the first space and the display shape of the clothes, and determining the shape of the extracted fusion part; and displaying the display shape of the clothes overlapped on the display shape of the first space, including the fusion part.
- FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention.
- the network environment of FIG. 1 shows an example including a plurality of user terminals 101 , 102 , 103 , 104 , a server 200 , and a network 300 .
- FIG. 1 is an example for describing the invention, and the number of user terminals and servers is not limited to that shown in FIG. 1.
- the plurality of user terminals 101 , 102 , 103 , and 104 may be a fixed terminal implemented as a computer device or a mobile terminal.
- Examples of the plurality of user terminals 101, 102, 103, and 104 include a smartphone, a mobile phone, a navigation device, a computer, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a tablet PC, and the like.
- The plurality of user terminals 101, 102, 103, and 104 may communicate with each other and/or with the server 200 through the network 300 using a wireless or wired communication method.
- The communication method of the plurality of user terminals 101, 102, 103, and 104 is not limited; it may utilize a communication network that the network 300 may include (e.g., a mobile communication network, the wired Internet, the wireless Internet, or a broadcasting network), and may also include short-range wireless communication between devices.
- For example, the network 300 may be any of a personal area network (PAN), a local area network (LAN), a campus area network (CAN), a metropolitan area network (MAN), a wide area network (WAN), a broadband network (BBN), the Internet, and the like.
- The network 300 may include any one or more network topologies, including a bus network, a star network, a ring network, a mesh network, a star-bus network, and a tree or hierarchical network, but is not limited thereto.
- Hereinafter, the plurality of user terminals 101, 102, 103, and 104 will be referred to as the user terminal 100.
- the server 200 may be implemented as a computer device or a plurality of computer devices that provide commands, codes, files, contents, services, etc. to the user terminal 100 through the network 300 .
- the server 200 may provide a file for installing an application to the user terminal 100 accessed through the network 300 .
- the user terminal 100 may install an application using a file provided from the server 200 .
- the application may be an application for performing a 3D modeling method of clothes.
- The user terminal 100 may access the server 200 under the control of an operating system (OS) and at least one program (e.g., a browser or an installed application) and request a service or content provided by the server 200.
- the server 200 may transmit at least one pre-generated pattern data to the user terminal 100 in response to the request.
- the user terminal 100 may display it according to the control of the application and provide it to the user.
- FIG. 2 is a block diagram for explaining the internal configuration of the user terminal 100 and the server 200 according to an embodiment of the present invention.
- the user terminal 100 and the server 200 may include memories 111 and 211 , processors 112 and 212 , communication modules 113 and 213 , and input/output interfaces 114 and 214 .
- the memories 111 and 211 are computer-readable recording media and may include random access memory (RAM), read only memory (ROM), and permanent mass storage devices such as disk drives.
- an operating system and at least one program code (eg, a code for 3D modeling of clothes installed and driven in the user terminal 100 ) may be stored in the memories 111 and 211 .
- These software components may be loaded from a computer-readable recording medium separate from the memories 111 and 211 using a drive mechanism.
- the separate computer-readable recording medium may include a computer-readable recording medium such as a floppy drive, a disk, a tape, a DVD/CD-ROM drive, and a memory card.
- the software components may be loaded into the memories 111 and 211 through the communication modules 113 and 213 rather than the computer-readable recording medium.
- The at least one program may be loaded into the memories 111 and 211 based on a program installed from files provided through the network 300 by a file distribution system (e.g., the above-described server 200) that distributes installation files of developers or applications.
- the processors 112 and 212 may be configured to process instructions of a computer program by performing basic arithmetic, logic, and input/output operations.
- the instructions may be provided to the processors 112 and 212 by the memories 111 and 211 or the communication modules 113 and 213 .
- the processors 112 and 212 may be configured to execute received instructions according to program codes stored in a recording device such as the memories 111 and 211 .
- The communication modules 113 and 213 may provide a function for the user terminal 100 and the server 200 to communicate with each other through the network 300, and may provide a function to communicate with another user terminal (not shown) or another server (not shown). For example, a request generated by the processor 112 of the user terminal 100 according to a program code stored in a recording device such as the memory 111 may be transmitted to the server 200 through the network 300 under the control of the communication module 113. Conversely, a control signal, command, content, file, or the like provided under the control of the processor 212 of the server 200 may be received by the user terminal 100 through the communication module 113 of the user terminal 100 via the communication module 213 and the network 300.
- the input/output interfaces 114 and 214 may be means for interfacing with the input/output device 115 .
- the input device may include, for example, a device such as a keyboard or a mouse
- the output device may include a device such as a display for displaying clothes modeled in 3D.
- the input/output interfaces 114 and 214 may be means for an interface with a device in which functions for input and output are integrated into one, such as a touch screen.
- the user terminal 100 and the server 200 may include more components than those of FIG. 2 .
- For example, the user terminal 100 may be implemented to include at least a portion of the above-described input/output device 115, or may further include other components such as a transceiver, a global positioning system (GPS) module, a camera, various sensors, and a database.
- the processor 112 may generate pattern data for clothes based on a user's input.
- 'pattern data' of clothes may mean a data set including various information for manufacturing clothes.
- the pattern data may include at least one of shape, dimension information, stitch information, material information, and landmark information of at least one part constituting the garment as an attribute.
- A 'part' constituting the clothes may mean a portion of the clothes used for manufacturing the corresponding clothes.
- For example, a part may mean a piece of fabric cut for the production of the corresponding clothes, or may mean a button, a zipper, or another member for joining used in the production of the corresponding clothes.
- this is an example, and the spirit of the present invention is not limited thereto.
- 'stitch information' is information for combining the above-described parts, and may refer to, for example, information on seams of cut pieces of fabric.
- the stitch information may include information about a material used when combining parts, as well as information on a form of use when combining the corresponding material.
- the stitch information may include information about the number of stitches and information about the color, thickness, and material of a thread used for sewing.
- the stitch information may include information on physical properties of bonding, such as bonding method between parts, bonding elasticity, and bonding strength.
- this is an example, and the spirit of the present invention is not limited thereto.
- 'material information' may include visual information on the corresponding material and physical property information on the corresponding material.
- the visual information about the material may include a color of the material, a pattern of the material, and the like.
- The physical property information about the material may include the thickness, density, elasticity, breathability, abrasion resistance, and light transmittance of the material.
- The above-mentioned material information is exemplary, and any property that can represent the unique characteristics of a material may be included in the material information of the present invention.
- the processor 112 may receive such material information from a manufacturer server (not shown) of the corresponding material or from another user terminal (not shown) and store it in the memory 111 .
- the processor 112 may generate the above-described pattern data based on a user's input.
- the processor 112 may acquire the above-described pattern data from equipment that generates pattern data for clothes.
- the device for generating the pattern data of the clothes may include, for example, a plurality of image sensors, at least one light source, and a distance sensor, and may be a device for generating the pattern data in a manner of scanning 3D information of the clothes.
- a pattern data generating device is exemplary, and the spirit of the present invention is not limited thereto.
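- To make the structure of such pattern data concrete, the following is a minimal Python sketch of how a data set like the one described above could be organized. All class and field names (PatternData, Part, StitchInfo, MaterialInfo and their attributes) are hypothetical illustrations for this example, not identifiers taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class MaterialInfo:
    # Visual information about the material.
    color: Tuple[int, int, int] = (255, 255, 255)   # RGB
    pattern: Optional[str] = None                   # e.g. "lace", "plaid"
    # Physical property information about the material.
    thickness_mm: float = 0.5
    density_gsm: float = 150.0        # grams per square meter
    elasticity: float = 0.1           # 0 (rigid) .. 1 (very stretchy)
    breathability: float = 0.5
    light_transmittance: float = 0.0  # 0 (opaque) .. 1 (transparent)

@dataclass
class StitchInfo:
    part_a: str                # ids of the two parts joined by this seam
    part_b: str
    seam_a: List[int]          # indices of the joined outline vertices
    seam_b: List[int]
    thread_color: Tuple[int, int, int] = (0, 0, 0)
    stitch_count: int = 0
    bond_strength: float = 1.0  # physical property of the join

@dataclass
class Part:
    part_id: str
    outline_2d: List[Tuple[float, float]]  # cut shape of the fabric piece
    material: MaterialInfo = field(default_factory=MaterialInfo)
    landmarks: Dict[str, int] = field(default_factory=dict)  # named points

@dataclass
class PatternData:
    parts: List[Part] = field(default_factory=list)
    stitches: List[StitchInfo] = field(default_factory=list)
```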
- FIG. 3 is an example of the pattern data generation screen 410.
- The screen 410 may include an area 411 for editing the shape of the parts constituting the clothes, an area 412 for displaying the shape of the clothes in a three-dimensional space, and an area 413 for setting various attribute values for displaying the clothes or for setting attributes of parts or stitch information between parts.
- the processor 112 may generate pattern data for the corresponding clothes based on a user's input through an interface such as the screen 410 .
- the user may modify the shape of the parts 411a to 411g in the two-dimensional space or generate pattern data for the clothes by adding a new part.
- the processor 112 may generate and/or edit the pattern data of the clothes based on user inputs corresponding to various items constituting the pattern data of the clothes.
- the processor 112 may store in the memory 111 a series of inputs and/or manipulations of the user for generating pattern data of a specific garment. Of course, the processor 112 may transmit such pattern data to the server 200 and/or another user terminal (not shown) through the communication module 113 .
- the processor 112 may load the pattern data of the clothes generated through the above-described process.
- Here, 'loading' the pattern data may mean loading the corresponding pattern data from the memory 111 and/or the server 200 in order to display, edit, and/or modify the corresponding pattern data.
- the loading of the pattern data may be performed by a user's selection of any one pattern data among a plurality of pattern data stored in the memory 111 .
- the loading of the pattern data may be performed by the user downloading specific pattern data from the server 200 .
- this is an example, and the spirit of the present invention is not limited thereto.
- the processor 112 may load body data for a body to be clothed with clothes corresponding to the corresponding pattern data.
- the 'body data' may include information on at least one of the size of the corresponding body, the ratio of each part, race, gender, and skin color.
- the processor 112 may modify at least one of the aforementioned items included in the body data based on the user's body data correction input. For example, the processor 112 may correct the gender information included in the body data from male to female based on an input for changing the user's gender from male to female.
- the processor 112 may display the three-dimensional shape of the corresponding garment based on the pattern data of the garment loaded by the above-described process.
- the three-dimensional shape may mean the shape of the garment in a three-dimensional space based on the pattern data.
- the processor 112 according to an embodiment of the present invention may display the three-dimensional shape of the garment in consideration of the body data loaded separately from the pattern data.
- the processor 112 according to an embodiment of the present invention may display the three-dimensional shape of the garment based on the body data and the pattern data.
- FIG. 4 is an example of a screen 420 on which a three-dimensional shape of clothes is displayed.
- The screen 420 may include an area 421 for displaying the shape of the clothes in a three-dimensional space, an area 422 for displaying or editing the shape of the parts constituting the clothes, and an area 423 for setting various attribute values for displaying the clothes.
- The processor 112 may display a three-dimensional shape 421a of the body based on the body data, together with the three-dimensional shape 421b of the clothes determined in consideration of the three-dimensional shape 421a.
- the processor 112 may consider the three-dimensional shape 421a of the body when displaying the three-dimensional shape 421b of the garment.
- the processor 112 may consider a space serving as a background of the clothes display when displaying the three-dimensional shape 421b of the clothes.
- the processor 112 may determine a first observation direction in which the clothes are to be observed based on a user's input in the first space serving as the background of the clothes display.
- the first space is a virtual space in which the user intends to model the clothes, and may be set by the user.
- the first space may be an indoor space having a specific floor shape as shown in FIG. 4 , or an outdoor space such as a beach as shown in FIG. 5 .
- The screen 430 may include a three-dimensional display area 431 displaying the three-dimensional shape 432 of the body, the display (three-dimensional) shape 433 of the clothes, and the display shape 434 of the first space.
- The processor 112 may determine the first observation direction based on a user's input to the display area 431. For example, the processor 112 may change the observation direction from a third observation direction to the first observation direction according to a user's drag input on any one point of the display area 431, or may change it from the first observation direction to a second observation direction.
- The processor 112 may also determine or change the observation direction based on a user input to a predetermined observation direction switching interface (not shown) provided on the screen 430.
- The processor 112 may display a direction indicator 435 for informing the user of the first observation direction, which is the current observation direction. For example, by observing the indicator 435 while performing a drag input on any one point of the display area 431, the user may better perceive the change in the observation direction. One way such a drag could be mapped to an observation direction is sketched below.
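- As an illustration of how a drag input might be turned into an observation direction, the sketch below maps horizontal and vertical drag deltas to the azimuth and elevation of a viewpoint orbiting the clothes. The function names, the angle parameterization, and the 0.005 rad/pixel sensitivity are assumptions made for this example, not details taken from the patent.

```python
import math

def update_observation_direction(azimuth, elevation, dx_px, dy_px,
                                 sensitivity=0.005):
    """Turn a drag of (dx_px, dy_px) pixels into new orbit angles."""
    azimuth = (azimuth + dx_px * sensitivity) % (2.0 * math.pi)
    # Clamp elevation so the viewpoint cannot flip over the poles.
    elevation = max(-math.pi / 2 + 1e-3,
                    min(math.pi / 2 - 1e-3,
                        elevation + dy_px * sensitivity))
    return azimuth, elevation

def observation_vector(azimuth, elevation):
    """Unit view direction pointing from the viewpoint toward the clothes."""
    return (-math.cos(elevation) * math.cos(azimuth),
            -math.sin(elevation),
            -math.cos(elevation) * math.sin(azimuth))
```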
- the processor 112 may determine a light irradiation direction that is a light irradiation direction in the first space.
- FIG. 6 is a view for explaining a process in which the processor 112 determines a light irradiation direction according to an embodiment of the present invention.
- the processor 112 may match the first space 461 to the sphere-shaped space 460 centered on the position of the garment 462 to be modeled.
- matching the first space 461 to the spherical space 460 may mean, for example, projecting an image representing the first space 461 onto the inner surface of the spherical space 460 .
- The processor 112 may determine, as the light irradiation direction, any one direction 463 among a plurality of directions passing through the spherical space 460.
- the light irradiation direction may be determined based on a user input specifying a direction for the spherical space 460 displayed on the screen.
- the determined light irradiation direction may be used to determine the display shape of the clothes and the display shape of the first space, and a detailed description thereof will be described later.
- Through this, the present invention enables three-dimensional modeling of clothes that takes even the lighting conditions into consideration. One way the sphere matching described above could be realized is sketched below.
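- One common way to realize such matching is to treat an image of the first space as an equirectangular environment map projected onto the inner surface of the sphere, so that a light irradiation direction is simply a unit vector through that sphere. The sketch below assumes exactly that representation; the function names and the sample light direction are illustrative, not prescribed by the patent.

```python
import math

def direction_to_sphere_uv(direction):
    """Map a unit direction through the sphere to (u, v) coordinates of an
    equirectangular image of the first space."""
    x, y, z = direction
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)            # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi   # latitude
    return u, v

def sample_first_space(env_image, direction):
    """Look up the color of the first space seen along `direction`.
    `env_image` is an H x W grid of RGB values projected on the inner
    surface of the spherical space."""
    h, w = len(env_image), len(env_image[0])
    u, v = direction_to_sphere_uv(direction)
    return env_image[min(int(v * h), h - 1)][min(int(u * w), w - 1)]

# The light irradiation direction can be any unit vector through the
# sphere, e.g. picked by the user from the upper left toward the clothes:
light_dir = (-0.5, -0.7071, 0.5)
```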
- The processor 112 may determine the display shape of the clothes according to the first observation direction. For example, when the user performs an input to change the observation direction from an observation direction for observing the side of the clothes to the first observation direction for observing the front of the clothes, as shown in FIG. 5, the processor 112 may determine the display shape of the clothes as observed from the front.
- In an optional embodiment, when the spherical space 460 shown in FIG. 6 is provided (displayed) to the user, the first observation direction may also be input by obtaining the user's input of a direction 464 with respect to the displayed spherical space 460. However, such a direction 464 input method is exemplary, and the spirit of the present invention is not limited thereto.
- the processor 112 may determine the display shape of the garment in consideration of the light according to the determined light irradiation direction. For example, the processor 112 may adjust the brightness of at least a portion of the display shape of the garment in consideration of the light irradiation direction, or may perform a process corresponding to the shadow on at least a portion.
- this is an example, and the spirit of the present invention is not limited thereto.
- The processor 112 may determine the display shape of the first space according to the first observation direction. For example, when the user performs an input to change the observation direction from an observation direction for observing the side of the clothes to the first observation direction for observing the front of the clothes, as shown in FIG. 5, the processor 112 may determine the display shape of the first space in consideration of the first observation direction.
- the processor 112 may determine the display shape of the first space in consideration of the light according to the determined light irradiation direction. For example, the processor 112 may adjust the brightness of at least a portion of the display shape of the first space in consideration of the light irradiation direction, or may perform a process corresponding to the shadow on at least a portion.
- this is an example, and the spirit of the present invention is not limited thereto.
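- One plausible reading of "determining the display shape in consideration of the light" is simple diffuse (Lambertian) shading: the brightness of each surface point of the clothes or the first space is scaled by how directly the surface faces the incoming light. The sketch below illustrates that idea; the ambient term and its 0.2 value are assumptions for the example, not values from the patent.

```python
def lambert_shade(base_color, normal, light_dir, ambient=0.2):
    """Darken or brighten `base_color` (RGB, 0-255) according to how
    directly the surface at `normal` faces light arriving along `light_dir`."""
    # Diffuse term: cosine of the angle between the surface normal and the
    # direction toward the light, clamped so back-facing points get only
    # the ambient contribution (a crude stand-in for shadow).
    dot = sum(n * (-l) for n, l in zip(normal, light_dir))
    scale = ambient + (1.0 - ambient) * max(0.0, dot)
    return tuple(min(255, int(c * scale)) for c in base_color)

# Example: a point on the clothes facing straight up, lit from above.
print(lambert_shade((200, 60, 60), (0.0, 1.0, 0.0), (0.0, -1.0, 0.0)))
```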
- When the processor 112 according to an embodiment of the present invention overlaps and displays the display shape of the clothes on the display shape of the first space, the processor 112 may extract, from the display shape of the clothes, a fusion part requiring fusion of the display shape of the first space and the display shape of the clothes.
- The processor 112 may determine, as the fusion part, a first fusion part, which is a portion on the display shape of the clothes where the display shape of the clothes and the display shape of the first space are in contact.
- The processor 112 may also determine, as the fusion part, a second fusion part, which is a portion on the display shape of the clothes where the display shape of the first space shows through the display shape of the clothes.
- the first fusion portion and the second fusion portion will be described with reference to FIGS. 7 to 10 .
- FIGS. 7 and 8 are diagrams for explaining a process in which the processor 112 determines the first fusion part according to an embodiment of the present invention.
- The processor 112 may determine, as the fusion part, the first fusion part, which is a portion on the display shape 441 of the clothes where the display shape 441 of the clothes and the display shape 442 of the first space are in contact.
- Here, the 'part in contact' between the display shape 441 of the clothes and the display shape 442 of the first space may mean a part serving as the boundary between the two shapes.
- For example, the processor 112 may determine a portion of the right shoulder of the display shape 441 of the clothes as the first fusion part 443.
- FIGS. 9 and 10 are diagrams for explaining a process in which the processor 112 determines the second fusion part according to an embodiment of the present invention.
- The processor 112 may determine, as the fusion part, the second fusion part, which is a portion on the display shape 451 of the clothes where the display shape 452 of the first space shows through the display shape 451 of the clothes. In this case, the display shape 452 of the first space 'showing through' the display shape 451 of the clothes may mean that what is behind the clothes is visible through the clothes due to the material characteristics of the clothes. For example, the processor 112 may determine a portion of the left leg of the display shape 451 of the clothes as the second fusion part 453, as shown on the partial screen 450P. A per-pixel sketch of this classification follows.
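- In image-space terms, the two kinds of fusion parts can be found per pixel once the clothes have been rendered over the first space: a pixel belongs to the first fusion part if it lies on the clothes but borders a background pixel, and to the second fusion part if the material transmits light there. Below is a minimal sketch of that classification, assuming a boolean coverage mask and a per-pixel transmittance map as inputs; both names are hypothetical.

```python
def extract_fusion_parts(garment_mask, transmittance, eps=0.01):
    """Classify pixels of the clothes' display shape into fusion parts.

    garment_mask:  2D list of bools, True where the clothes are drawn.
    transmittance: 2D list of floats in [0, 1]; how much of the first
                   space shows through the clothes at each pixel.
    Returns two sets of (row, col) pixels: (first_fusion, second_fusion).
    """
    h, w = len(garment_mask), len(garment_mask[0])
    first, second = set(), set()
    for r in range(h):
        for c in range(w):
            if not garment_mask[r][c]:
                continue
            # First fusion part: a clothes pixel touching the background,
            # i.e. lying on the boundary between the two display shapes.
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= rr < h and 0 <= cc < w and not garment_mask[rr][cc]
                   for rr, cc in neighbors):
                first.add((r, c))
            # Second fusion part: the material lets the first space show
            # through (e.g. lace), beyond a small threshold.
            if transmittance[r][c] > eps:
                second.add((r, c))
    return first, second
```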
- the processor 112 may determine the shape of the fusion part determined by the above-described process.
- the processor 112 may determine the color of the fusion part by mixing the display color according to the display shape of the clothes in the fusion part and the display color according to the display shape of the first space at a predetermined ratio.
- the predetermined ratio may be determined with reference to the light irradiation direction and pattern data of the clothes.
- Similarly, the processor 112 may determine the pattern of the fusion part by mixing, at a predetermined ratio, the display pattern according to the display shape of the clothes in the fusion part and the display pattern according to the display shape of the first space.
- For example, in determining the color of the first fusion part 443 in the situations shown in FIGS. 7 and 8, the processor 112 may mix, at a predetermined ratio, the display color according to the display shape 441 of the clothes and the display color according to the display shape 442 of the first space.
- Likewise, in determining the color of the second fusion part 453 in the situations shown in FIGS. 9 and 10, the processor 112 may mix, at a predetermined ratio, the display color according to the display shape 451 of the clothes and the display color according to the display shape 452 of the first space. Also, in determining the display pattern of the second fusion part 453, the processor 112 may mix, at a predetermined ratio, the display pattern according to the display shape 451 of the clothes (i.e., the lace pattern) and the display pattern according to the display shape 452 of the first space (the shapes of the land and the sea).
- the processor 112 may determine the shape of the fusion portion in units of a predetermined unit size. For example, the processor 112 may determine the shape of the fusion portion in units of pixel blocks including a predetermined number of pixels, or may determine the shape of the fusion portion in units of pixels.
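- The mixing "at a predetermined ratio" reads like alpha blending: each fusion-part pixel (or pixel block) takes a weighted average of the clothes' color and the first space's color. The sketch below works under that assumption, with a block-wise variant for the pixel-block unit mentioned above; the 0.7 ratio and block size of 4 are illustrative.

```python
def mix_colors(garment_rgb, space_rgb, ratio):
    """Blend clothes and first-space colors; `ratio` is the clothes' share
    (e.g. 0.7 keeps the fusion part mostly clothes-colored)."""
    return tuple(int(g * ratio + s * (1.0 - ratio))
                 for g, s in zip(garment_rgb, space_rgb))

def mix_fusion_block(garment_img, space_img, pixels, ratio=0.7, block=4):
    """Recolor each fusion-part pixel block by block, one possible reading
    of determining the shape 'in units of pixel blocks'."""
    for r, c in pixels:
        br, bc = (r // block) * block, (c // block) * block  # block origin
        garment_img[r][c] = mix_colors(garment_img[br][bc],
                                       space_img[br][bc], ratio)
    return garment_img
```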
- the processor 112 may overlap and display the display shape of the garment on the display shape of the first space, including the fusion portion determined by the above-described process.
- the part corresponding to the fusion part on the display shape of the garment may be displayed as the shape of the fusion part determined by the above-described process.
- In response to a user input changing the observation direction from the first observation direction to a second observation direction, the processor 112 may determine and display each of the display shape of the clothes, the display shape of the first space, and the fusion part according to the second observation direction, following the above-described process.
- In other words, the processor 112 may update and display the display shape of the clothes, the display shape of the first space, and the shape of the fusion part according to the change in observation direction.
- When the light irradiation direction has been determined, the processor 112 may display the updated shapes (the display shape of the clothes, the display shape of the first space, and the shape of the fusion part) in consideration of the light irradiation direction as well as the changed observation direction.
- For example, the processor 112 may determine the display shape of the clothes in consideration of the clothes positioned at the center of the spherical space of FIG. 6, the second observation direction, and the light irradiation direction.
- Similarly, the processor 112 may determine the display shape of the first space in consideration of the shape of the first space projected on the inner surface of the spherical space, the second observation direction, and the light irradiation direction.
- the processor 112 may display the determined display shape of the clothes and the display shape of the first space together.
- Through this, the present invention can model clothes in various spaces and, in particular, makes it possible to model clothes naturally.
- the present invention enables modeling of clothes in consideration of light conditions in various spaces.
- FIG. 11 is a flowchart illustrating a 3D modeling method of clothes performed by the user terminal 100 according to an embodiment of the present invention. Hereinafter, it will be described with reference to FIGS. 1 to 10 together, but a description of contents overlapping with FIGS. 1 to 10 will be omitted.
- The user terminal 100 may determine a first observation direction in which to observe the clothes based on the user's input in a first space serving as the background of the clothes display (S1110).
- The first space is a virtual space in which the user intends to model the clothes, and may be set by the user.
- the first space may be an indoor space having a specific floor shape as shown in FIG. 4 , or an outdoor space such as a beach as shown in FIG. 5 .
- The screen 430 may include a three-dimensional display area 431 displaying the three-dimensional shape 432 of the body, the display (three-dimensional) shape 433 of the clothes, and the display shape 434 of the first space.
- The user terminal 100 may determine the first observation direction based on the user's input to the display area 431.
- For example, the user terminal 100 may change the observation direction from a third observation direction to the first observation direction according to the user's drag input on any one point of the display area 431, or may change it from the first observation direction to a second observation direction.
- the user terminal 100 may determine the observation direction or change the observation direction based on a user input to a predetermined viewing direction switching interface (not shown) provided on the screen 430 .
- the user terminal 100 may display a direction indicator 435 for informing the user of the first observation direction, which is the current observation direction. For example, by observing the indicator 435 while performing a drag input to any one point on the display area 431 , the user may better confirm the change in the observation direction.
- the user terminal 100 may determine the light irradiation direction, which is the light irradiation direction in the first space.
- FIG. 6 is a view for explaining a process in which the user terminal 100 determines a light irradiation direction according to an embodiment of the present invention.
- The user terminal 100 may match the first space 461 to a sphere-shaped space 460 centered on the position of the garment 462 to be modeled.
- matching the first space 461 to the spherical space 460 may mean, for example, projecting an image representing the first space 461 onto the inner surface of the spherical space 460 .
- Any one direction 463 among a plurality of directions passing through the spherical space 460 may be determined as the light irradiation direction.
- the light irradiation direction may be determined based on a user input specifying a direction for the spherical space 460 displayed on the screen.
- the determined light irradiation direction may be used to determine the display shape of the clothes and the display shape of the first space, and a detailed description thereof will be described later.
- Through this, the present invention enables three-dimensional modeling of clothes that takes even the lighting conditions into consideration.
- The user terminal 100 may determine the display shape of the clothes according to the first observation direction (S1120). For example, when the user performs an input to change the observation direction from an observation direction for observing the side of the clothes to the first observation direction for observing the front of the clothes, as shown in FIG. 5, the user terminal 100 may determine the display shape of the clothes as observed from the front.
- In an optional embodiment, when the spherical space 460 shown in FIG. 6 is provided (displayed) to the user, the first observation direction may also be input by obtaining the user's input of a direction 464 with respect to the displayed spherical space 460. However, such a direction 464 input method is exemplary, and the spirit of the present invention is not limited thereto.
- the user terminal 100 may determine the display shape of the clothes in consideration of the light according to the determined light irradiation direction. For example, the user terminal 100 may adjust the brightness of at least a portion of the display shape of the garment in consideration of the light irradiation direction, or may perform a process corresponding to the shadow on at least a portion.
- this is an example, and the spirit of the present invention is not limited thereto.
- The user terminal 100 may determine the display shape of the first space according to the first observation direction (S1130). For example, when the user performs an input to change the observation direction from an observation direction for observing the side of the clothes to the first observation direction for observing the front of the clothes, as shown in FIG. 5, the user terminal 100 may determine the display shape of the first space in consideration of the first observation direction.
- the user terminal 100 may determine the display shape of the first space in consideration of the light according to the determined light irradiation direction. For example, the user terminal 100 may adjust the brightness of at least a portion of the display shape of the first space in consideration of the light irradiation direction, or may perform a process corresponding to the shadow on at least a portion.
- this is an example, and the spirit of the present invention is not limited thereto.
- When the display shape of the clothes is overlapped and displayed on the display shape of the first space, the user terminal 100 may extract, from the display shape of the clothes, a fusion part requiring fusion of the display shape of the first space and the display shape of the clothes (S1140).
- The user terminal 100 may determine, as the fusion part, a first fusion part, which is a portion on the display shape of the clothes where the display shape of the clothes and the display shape of the first space are in contact.
- The user terminal 100 may also determine, as the fusion part, a second fusion part, which is a portion on the display shape of the clothes where the display shape of the first space shows through the display shape of the clothes.
- the first fusion portion and the second fusion portion will be described with reference to FIGS. 7 to 10 .
- FIGS. 7 and 8 are diagrams for explaining a process in which the user terminal 100 determines the first fusion part according to an embodiment of the present invention.
- The user terminal 100 may determine, as the fusion part, the first fusion part, which is a portion on the display shape 441 of the clothes where the display shape 441 of the clothes and the display shape 442 of the first space are in contact.
- the 'part in contact' between the display shape 441 of the garment and the display shape 442 of the first space may mean a part serving as a boundary between the two shapes.
- the user terminal 100 may determine a portion of the right shoulder portion of the display shape 441 of the garment as the first fusion portion 443 .
- FIGS. 9 and 10 are diagrams for explaining a process in which the user terminal 100 determines the second fusion part according to an embodiment of the present invention.
- The user terminal 100 may determine, as the fusion part, the second fusion part, which is a portion on the display shape 451 of the clothes where the display shape 452 of the first space shows through the display shape 451 of the clothes.
- In this case, the display shape 452 of the first space 'showing through' the display shape 451 of the clothes may mean that what is behind the clothes is visible through the clothes due to the material characteristics of the clothes.
- the user terminal 100 may determine a part of the left leg part of the display shape 451 of the garment as the second fusion part 453 .
- the user terminal 100 may determine the shape of the fusion portion determined by the above-described process.
- the user terminal 100 may determine the color of the fusion part by mixing the display color according to the display shape of the clothes in the fusion part and the display color according to the display shape of the first space at a predetermined ratio.
- the predetermined ratio may be determined with reference to the light irradiation direction and pattern data of the clothes.
- Similarly, the user terminal 100 may determine the pattern of the fusion part by mixing, at a predetermined ratio, the display pattern according to the display shape of the clothes in the fusion part and the display pattern according to the display shape of the first space.
- For example, in determining the color of the first fusion part 443, the user terminal 100 may mix, at a predetermined ratio, the display color according to the display shape 441 of the clothes and the display color according to the display shape 442 of the first space.
- Likewise, in determining the color of the second fusion part 453, the user terminal 100 may mix, at a predetermined ratio, the display color according to the display shape 451 of the clothes and the display color according to the display shape 452 of the first space.
- In determining the display pattern of the second fusion part 453, the user terminal 100 may mix, at a predetermined ratio, the display pattern according to the display shape 451 of the clothes (i.e., the lace pattern) and the display pattern according to the display shape 452 of the first space (the shapes of the land and the sea).
- the user terminal 100 may determine the shape of the fusion part in units of a predetermined unit size.
- the user terminal 100 may determine the shape of the fusion portion in units of pixel blocks including a predetermined number of pixels, or may determine the shape of the fusion portion in units of pixels.
- The user terminal 100 may overlap and display the display shape of the clothes on the display shape of the first space, including the fusion part determined by the above-described process (S1150). At this time, the part corresponding to the fusion part on the display shape of the clothes may be displayed as the shape of the fusion part determined by the above-described process.
- In response to the user's input changing the observation direction from the first observation direction to a second observation direction, the user terminal 100 may determine and display each of the display shape of the clothes, the display shape of the first space, and the fusion part according to the second observation direction, following the above-described process.
- the user terminal 100 may update and display the display shape of the clothes, the display shape of the first space, and the shape of the fusion part according to the observation change.
- When the light irradiation direction has been determined, the user terminal 100 may display the updated shapes (the display shape of the clothes, the display shape of the first space, and the shape of the fusion part) in consideration of the light irradiation direction as well as the changed observation direction.
- For example, the user terminal 100 may determine the display shape of the clothes in consideration of the clothes positioned at the center of the spherical space of FIG. 6, the second observation direction, and the light irradiation direction.
- Similarly, the user terminal 100 may determine the display shape of the first space in consideration of the shape of the first space projected on the inner surface of the spherical space, the second observation direction, and the light irradiation direction.
- the user terminal 100 may display the determined display shape of the clothes and the display shape of the first space together.
- Through this, the present invention can model clothes in various spaces and, in particular, makes it possible to model clothes naturally.
- The present invention also enables modeling of clothes in consideration of the lighting conditions of various spaces. The overall flow of steps S1110 to S1150 is sketched below.
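- To tie steps S1110 to S1150 together, the sketch below shows one possible shape of a single pass of the method on the user terminal 100, reusing the hypothetical helpers from the earlier sketches (update_observation_direction, observation_vector, extract_fusion_parts, mix_fusion_block). The two render callbacks stand in for whatever renderer an actual implementation uses; nothing here is prescribed by the patent.

```python
def model_clothes_3d(render_garment, render_first_space, drag, light_dir):
    """One pass of steps S1110 to S1150. `render_garment` and
    `render_first_space` are caller-supplied rendering callbacks."""
    # S1110: determine the first observation direction from the drag input.
    azimuth, elevation = update_observation_direction(0.0, 0.0, *drag)
    view_dir = observation_vector(azimuth, elevation)

    # S1120: display shape of the clothes for this view and light.
    garment_img, garment_mask, transmittance = render_garment(view_dir,
                                                              light_dir)
    # S1130: display shape of the first space for the same view and light.
    space_img = render_first_space(view_dir, light_dir)

    # S1140: extract the fusion parts and determine their shape.
    first, second = extract_fusion_parts(garment_mask, transmittance)
    garment_img = mix_fusion_block(garment_img, space_img, first | second)

    # S1150: overlap the clothes' display shape on the first space.
    h, w = len(space_img), len(space_img[0])
    return [[garment_img[r][c] if garment_mask[r][c] else space_img[r][c]
             for c in range(w)] for r in range(h)]
```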
- the device described above may be implemented as a hardware component, a software component, and/or a combination of the hardware component and the software component.
- Devices and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
- The processing device may execute an operating system (OS) and one or more software applications running on the operating system.
- The processing device may also access, store, manipulate, process, and generate data in response to execution of the software.
- Although a single processing device is sometimes described as being used, one of ordinary skill in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.
- Software may comprise a computer program, code, instructions, or a combination of one or more thereof, which configures the processing device to operate as desired or which commands the processing device independently or collectively.
- The software and/or data may be embodied, permanently or temporarily, in any kind of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device.
- the software may be distributed over networked computer systems, and stored or executed in a distributed manner. Software and data may be stored in one or more computer-readable recording media.
- the method according to the embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium.
- the computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
- the program instructions recorded on the medium may be specially designed and configured for the embodiment, or may be known and available to those skilled in the art of computer software.
- Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Examples of program instructions include not only machine language codes such as those generated by a compiler, but also high-level language codes that can be executed by a computer using an interpreter or the like.
- the hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
Claims (5)
- A three-dimensional modeling method of clothes in consideration of a background element, the method comprising: determining, in a first space serving as a background for displaying the clothes, a first observation direction in which to observe the clothes based on a user's input; determining a display shape of the clothes according to the first observation direction; determining a display shape of the first space according to the first observation direction; when the display shape of the clothes is overlapped and displayed on the display shape of the first space, extracting, on the display shape of the clothes, a fusion part requiring fusion of the display shape of the first space and the display shape of the clothes, and determining a shape of the extracted fusion part; and displaying, including the fusion part, the display shape of the clothes overlapped on the display shape of the first space.
- The method of claim 1, wherein the determining of the shape of the fusion part comprises: determining, on the display shape of the clothes, a first fusion part, which is a portion where the display shape of the clothes and the display shape of the first space are in contact; and determining, on the display shape of the clothes, a second fusion part, which is a portion where the display shape of the first space shows through the display shape of the clothes.
- The method of claim 1, wherein the determining of the shape of the fusion part comprises: determining a color of the fusion part by mixing, at a predetermined ratio, a display color according to the display shape of the clothes in the fusion part and a display color according to the display shape of the first space; and determining a pattern of the fusion part by mixing, at a predetermined ratio, a display pattern according to the display shape of the clothes in the fusion part and a display pattern according to the display shape of the first space.
- The method of claim 1, further comprising, after the displaying: in response to a user's input changing the observation direction from the first observation direction to a second observation direction, determining and displaying each of the display shape of the clothes, the display shape of the first space, and the fusion part according to the second observation direction.
- The method of claim 1, further comprising, after the determining of the first observation direction: determining a light irradiation direction, which is the direction in which light is irradiated in the first space, wherein the determining of the display shape of the clothes determines the display shape of the clothes in consideration of the light along the light irradiation direction, and the determining of the display shape of the first space determines the display shape of the first space in consideration of the light along the light irradiation direction.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/050,037 US11321935B2 (en) | 2019-11-28 | 2019-11-28 | Three-dimensional (3D) modeling method of clothing |
CN201980028804.3A CN113196343A (zh) | 2019-11-28 | 2019-11-28 | Three-dimensional (3D) modeling method of clothing |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2019-0155797 | 2019-11-28 | ||
KR1020190155797A KR20210066494A (ko) | 2019-11-28 | 2019-11-28 | Three-dimensional (3D) modeling method of clothing |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021107202A1 (ko) | 2021-06-03 |
Family
ID=76130594
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/016643 WO2021107202A1 (ko) | Three-dimensional (3D) modeling method of clothing | 2019-11-28 | 2019-11-28 |
Country Status (4)
Country | Link |
---|---|
US (1) | US11321935B2 (ko) |
KR (2) | KR20210066494A (ko) |
CN (1) | CN113196343A (ko) |
WO (1) | WO2021107202A1 (ko) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023229348A1 * | 2022-05-23 | 2023-11-30 | (주)클로버추얼패션 | Texture image simulation method and apparatus therefor |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160275724A1 (en) * | 2011-02-17 | 2016-09-22 | Metail Limited | Computer implemented methods and systems for generating virtual body models for garment fit visualisation |
KR20180069786A * | 2015-08-14 | 2018-06-25 | 미테일 리미티드 | Method and system for generating an image file of a 3D garment model on a 3D body model |
KR20190028827A * | 2017-07-31 | 2019-03-20 | 주식회사 자이언소프트 | Virtual fitting system |
KR102033161B1 * | 2018-03-30 | 2019-10-16 | (주)클로버추얼패션 | Method for transferring and dressing clothes between avatars |
KR102044348B1 * | 2017-11-09 | 2019-11-13 | (주)코아시아 | Virtual clothes fitting mirror apparatus system using augmented reality |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6839463B1 (en) * | 2000-12-22 | 2005-01-04 | Microsoft Corporation | System and method providing subpixel-edge-offset-based determination of opacity |
US7064755B2 (en) * | 2002-05-24 | 2006-06-20 | Silicon Graphics, Inc. | System and method for implementing shadows using pre-computed textures |
KR20090012649A * | 2007-07-31 | 2009-02-04 | 주식회사 에프엑스코드 | Method and system for simultaneously scanning the shape and microstructure of clothes |
US9024966B2 (en) * | 2007-09-07 | 2015-05-05 | Qualcomm Incorporated | Video blending using time-averaged color keys |
KR101120820B1 * | 2009-11-19 | 2012-03-22 | 삼성메디슨 주식회사 | Ultrasound system and method for providing an ultrasound spatial compound image |
KR101671185B1 * | 2010-03-22 | 2016-11-01 | 삼성전자주식회사 | Apparatus and method for extracting light and texture for rendering, and rendering apparatus using light and texture |
US8194072B2 (en) * | 2010-03-26 | 2012-06-05 | Mitsubishi Electric Research Laboratories, Inc. | Method for synthetically relighting images of objects |
US8884984B2 (en) * | 2010-10-15 | 2014-11-11 | Microsoft Corporation | Fusing virtual content into real content |
JP6102122B2 * | 2012-08-24 | 2017-03-29 | ソニー株式会社 | Image processing apparatus and method, and program |
US9773274B2 (en) * | 2013-12-02 | 2017-09-26 | Scott William Curry | System and method for online virtual fitting room |
JP5992957B2 * | 2014-06-10 | 2016-09-14 | 美津濃株式会社 | Information processing apparatus, information processing method, information processing program, and method of manufacturing clothes using the same |
US10636206B2 (en) | 2015-08-14 | 2020-04-28 | Metail Limited | Method and system for generating an image file of a 3D garment model on a 3D body model |
US20170161950A1 (en) * | 2015-12-08 | 2017-06-08 | GM Global Technology Operations LLC | Augmented reality system and image processing of obscured objects |
US20180068473A1 (en) * | 2016-09-06 | 2018-03-08 | Apple Inc. | Image fusion techniques |
CN109035259B * | 2018-07-23 | 2021-06-29 | 西安建筑科技大学 | Three-dimensional multi-angle clothes fitting device and fitting method |
US10949959B2 (en) * | 2019-02-18 | 2021-03-16 | Samsung Electronics Co., Ltd. | Processing image data in a composite image |
US10902670B1 (en) * | 2019-11-12 | 2021-01-26 | Facebook Technologies, Llc | Systems and methods for graphics rendering based on machine learning |
KR20220017017A * | 2020-08-03 | 2022-02-11 | 삼성디스플레이 주식회사 | Display device and method of manufacturing the same |
2019
- 2019-11-28 KR KR1020190155797A patent/KR20210066494A/ko not_active IP Right Cessation
- 2019-11-28 US US17/050,037 patent/US11321935B2/en active Active
- 2019-11-28 WO PCT/KR2019/016643 patent/WO2021107202A1/ko active Application Filing
- 2019-11-28 CN CN201980028804.3A patent/CN113196343A/zh active Pending
2021
- 2021-06-24 KR KR1020210082444A patent/KR102458303B1/ko active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
US11321935B2 (en) | 2022-05-03 |
KR102458303B1 (ko) | 2022-10-25 |
KR20210082407A (ko) | 2021-07-05 |
US20210166492A1 (en) | 2021-06-03 |
CN113196343A (zh) | 2021-07-30 |
KR20210066494A (ko) | 2021-06-07 |
Similar Documents
Publication | Title |
---|---|
WO2021107204A1 (ko) | Three-dimensional (3D) modeling method of clothing |
WO2021006656A1 (ko) | Method and system for generating a clothing design, and integrated application program therefor |
WO2020180084A1 (ко) | Method, apparatus, and computer program for completing coloring of a target image |
WO2020149581A1 (en) | Electronic device for generating avatar and method thereof |
WO2019139270A1 (ko) | Display device and method of providing content thereof |
WO2020251238A1 (ko) | Method for obtaining user interest information based on input image data, and method for customizing a design of a target object |
WO2017030255A1 (en) | Large format display apparatus and control method thereof |
WO2020091207A1 (ko) | Method, apparatus, and computer program for completing coloring of an image, and method, apparatus, and computer program for training an artificial neural network |
EP3632119A1 (en) | Display apparatus and server, and control methods thereof |
WO2018182068A1 (ko) | Method and apparatus for providing recommendation information for an item |
WO2021133053A1 (ko) | Electronic device and control method thereof |
WO2021107205A1 (ko) | Body shape display method for clothing modeling |
WO2021107202A1 (ko) | Three-dimensional (3D) modeling method of clothing |
WO2019103285A1 (ко) | Electronic device and method of providing an augmented reality service in the electronic device |
WO2020242047A1 (en) | Method and apparatus for acquiring virtual object data in augmented reality |
WO2014109483A1 (ko) | Method for providing a total product solution in which product design, design sharing, and product manufacturing and marketing based on the designs are carried out online |
WO2021107203A1 (ko) | Three-dimensional (3D) modeling method of clothing |
WO2018026082A1 (ко) | Animation production apparatus and method |
WO2021040256A1 (ко) | Electronic device and clothing recommendation method thereof |
WO2018182053A1 (ко) | Method and apparatus for obtaining information about the shape of an object |
WO2009131361A2 (ко) | Apparatus and method for editing map data in a three-dimensional map service |
WO2021235635A1 (ко) | Method and apparatus for rendering a realistic scene image based on virtual content data |
WO2020141808A1 (ко) | Electronic device and method for editing content of an external device |
KR102083504B1 (ко) | Three-dimensional modeling method of clothes in consideration of environmental elements, and computer program |
WO2020175760A1 (en) | Electronic device and content generation method |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 19953907; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: PCT application non-entry in European phase | Ref document number: 19953907; Country of ref document: EP; Kind code of ref document: A1 |
32PN | Ep: public notification in the EP bulletin as the address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.12.2022) |