US20220207828A1 - Systems and methods of three-dimensional modeling for use in generating a realistic computer avatar and garments - Google Patents
- Publication number
- US20220207828A1 (U.S. application Ser. No. 17/357,394)
- Authority
- US
- United States
- Prior art keywords
- mesh
- resulting
- geometry
- topology
- shells
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- the present disclosure is directed to a method comprising determining a topology mesh and a geometry mesh and determining a resulting mesh that has a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh.
- determining a topology mesh and a geometry mesh and determining a resulting mesh that has a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh.
- the present disclosure is directed to a 3D system comprising: a processor and a memory for storing instructions, the processor executing the instructions to determine a topology mesh and a geometry mesh; determine a resulting mesh that has a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh by: copying the topology mesh to the resulting mesh; copying a UV boundary of the geometry mesh to the resulting mesh; restoring a UV parametrization of the resulting mesh to the topology mesh; copying a three-dimensional geometry of the geometry mesh to the resulting mesh; and outputting the resulting mesh.
- a 3D system comprising: a processor and a memory for storing instructions, the processor executing the instructions to determine a topology mesh and a geometry mesh; determine a resulting mesh that has a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh by: copying the topology mesh to the resulting mesh; copying a UV
- a method includes determining a topology mesh and a geometry mesh; and determining a resulting mesh that has a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh.
- FIG. 1 is a screenshot of a topology mesh T and UV shells of the topology mesh T.
- FIG. 2 is a screenshot of an XYZ representation of the geometry mesh G; UV shells of the geometry mesh G are also illustrated.
- FIG. 3 is a screenshot of UV shells of an intermediate mesh.
- FIG. 4 is a screenshot illustrating consistent 2-D parametrization of UV shells, achieved by rearranging interior UV vertices.
- FIG. 5 is a screenshot illustrating computed positions of a resulting mesh R's vertices by mapping them from UV shell space to the 3D space of a garment.
- FIG. 6 is a flowchart of an example method of the present disclosure.
- FIG. 7 is a flowchart of another example method of the present disclosure.
- FIG. 8 is a flowchart of yet another example method of the present disclosure.
- FIG. 9 is a flowchart of an additional example method of the present disclosure.
- FIG. 10 is a flowchart of a variation of the method of FIG. 6 .
- FIG. 11 is a flowchart of another variation of the method of FIG. 6 .
- FIG. 12 illustrates example code used to implement aspects of the present disclosure.
- FIG. 13 is a schematic diagram of an example computerized system of the present disclosure.
- the present disclosure pertains to systems and methods for the generation of a realistic customized computer animation of a user wearing a particular one or more cloth garments.
- an avatar representing the user's likeness in body shape and size can be generated, and different garments can be draped on the avatar of the user to simulate how the garment(s) would look on the actual body of the human user.
- the computer animation can be generated for any user, wearing any size garment, in substantially real-time.
- FIG. 13 illustrates an example computer system that can be programmed for use in accordance with the present disclosure. That is, the computer system can be configured to provide three-dimensional modeling for use in generating a realistic computer avatar of a user wearing a particular garment.
- the specifically programmed system disclosed herein will be referenced as the “3D system”.
- an example 3D system of the present disclosure utilizes mesh analysis to represent a three-dimensional surface.
- the mesh can comprise a plurality of vertices, also referred to as points. Vertices can be connected by edges. In some instances, vertices and edges can form shapes such as triangles or other similar polygons. Thus, when edges that connect vertices form a closed loop, a face is created.
- the three-dimensional surface may be conceptually thought of as a pattern of faces.
- Any surface topology can be represented using vertices, edges, and faces.
- the creation of a mesh allows for accurate representation of a surface in three dimensions. Mesh-based methods are more efficient in representing three-dimensional surfaces than point-clouds. To be sure, while a three-dimensional surface can be sampled to identify points and create a point-cloud, this process can be quite data intensive depending on the number of vertices/points obtained.
- the modeling disclosed herein provides advantages over point-cloud processes: it does not require the same amount of data, yet still provides accurate 3D representations.
- FIG. 1 is a screenshot of a topology mesh T related to a three-dimensional object, such as a shirt 100 .
- An XYZ representation 102 of the topology mesh T and UV shells 104 are also illustrated.
- the topology includes vertices, such as vertex 101 , edges, such as edge 103 , and example faces, such as face 105 .
- the topology includes a plurality of faces.
- the 3D system can assign each vertex/point a unique identifier. Edges connecting between two vertices can also be numbered uniquely, as well as identified in terms of the vertices they span between. The same identification process can be used for faces, allowing the 3D system to know which edges and vertices belong to which faces.
- This collective information is referred to as topology. Further, when referring to geometry, this will be understood to include the surface/face (or group of faces) that are implied by the relative 3D position of each of the vertices associated with the face(s).
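The identification scheme described above can be sketched in a few lines. The Python below is an illustrative sketch only, not the patent's implementation; the class and method names are invented for this example, and edges are keyed by their sorted vertex pair rather than a separate integer id.

```python
from dataclasses import dataclass, field

@dataclass
class Mesh:
    """Minimal indexed mesh: every vertex, edge, and face is uniquely
    identified, and faces record which vertices (and hence edges) they own."""
    vertices: dict = field(default_factory=dict)  # vertex id -> (x, y, z)
    edges: dict = field(default_factory=dict)     # sorted vertex pair -> oriented edge
    faces: dict = field(default_factory=dict)     # face id -> tuple of vertex ids

    def add_vertex(self, vid, xyz):
        self.vertices[vid] = xyz

    def add_face(self, fid, vids):
        """A closed loop of edges over `vids` defines one face."""
        self.faces[fid] = tuple(vids)
        # Register each boundary edge of the face, keyed by its vertex pair.
        for a, b in zip(vids, vids[1:] + vids[:1]):
            self.edges.setdefault(tuple(sorted((a, b))), (a, b))

# A single triangle: 3 vertices, 3 edges, 1 face.
m = Mesh()
m.add_vertex(0, (0.0, 0.0, 0.0))
m.add_vertex(1, (1.0, 0.0, 0.0))
m.add_vertex(2, (0.0, 1.0, 0.0))
m.add_face(0, [0, 1, 2])
print(len(m.vertices), len(m.edges), len(m.faces))  # 3 3 1
```

With this bookkeeping, the system can answer which edges and vertices belong to which faces, which is the "topology" the text refers to.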
- the dark lines or edges form a topology and the shirt is the underlying geometry.
- the vertices, edges, and faces of the topology form a consistent representation of the underlying geometry of the shirt 100 .
- a topology mesh comprises a mesh having a desired topology and desired UV shell density.
- a geometry mesh includes a mesh having a desired geometry and desired UV shell boundaries. The use of both geometry and topology meshes will be described in greater detail below.
- FIG. 2 illustrates an XYZ representation 102 of the geometry mesh G; UV shells 104 of the geometry mesh G are also illustrated.
- the 3D system can determine a resulting mesh R that has a desired topology and desired UV shell density.
- the resulting mesh R combines the topology of T with the UV and geometry of G. It will be understood that T and G have the same UV shells in terms of the number of boundaries and boundary topological (connectivity) information.
- the 3D system can create R by initializing it to T. This process can include establishing consistency in a 1-D parametrization of the boundaries of the UV shells of the meshes R and G, by parametrizing each boundary from 0 to 1 and forcing a consistent direction. The 3D system can, based on this consistent 1-D boundary parametrization, project the mesh R's UV boundary vertices onto G's UV boundaries to ensure the same UV boundary contours. UV shells of the mesh R are illustrated in FIG. 3 as representations 106 .
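One simple way to realize a 0-to-1 boundary parametrization is normalized arc length along the boundary polyline, after which a vertex of R's boundary at parameter t can be projected onto the point of G's boundary at the same t. The sketch below is illustrative only; the patent does not prescribe a particular parametrization, and the helper names are invented.

```python
import math

def boundary_params(loop):
    """Normalized arc-length parameter in [0, 1) for each point of a closed
    UV boundary loop given as a list of (u, v) pairs."""
    n = len(loop)
    seg = [math.dist(loop[i], loop[(i + 1) % n]) for i in range(n)]
    total = sum(seg)
    t, params = 0.0, []
    for s in seg:
        params.append(t / total)
        t += s
    return params

def point_at(loop, t):
    """Map a parameter t in [0, 1) back onto the boundary polyline, so R's
    boundary vertices can be placed on G's boundary at matching parameters."""
    params = boundary_params(loop) + [1.0]
    for i in range(len(loop)):
        if params[i] <= t < params[i + 1]:
            a, b = loop[i], loop[(i + 1) % len(loop)]
            w = (t - params[i]) / (params[i + 1] - params[i])
            return (a[0] + w * (b[0] - a[0]), a[1] + w * (b[1] - a[1]))
    return loop[0]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(boundary_params(square))   # [0.0, 0.25, 0.5, 0.75]
print(point_at(square, 0.375))   # (1.0, 0.5) -- halfway up the right edge
```

Forcing a consistent traversal direction (e.g., counter-clockwise) on both loops before parametrizing ensures that equal t values land on corresponding boundary locations.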
- the 3D system can establish a consistent 2-D parametrization of UV shells by rearranging interior UV vertices. Rearranged interior UV vertices 108 are also illustrated.
- the 3D system can, based on the consistent 2-D shell parametrization, compute positions of R's vertices by mapping them from UV shell space to the 3D space of a garment.
- An XYZ representation 110 of the mesh R prior to rearranging interior UV vertices, as well as an XYZ representation 112 of the mesh R, after rearranging interior UV vertices is also illustrated.
- FIGS. 6-9 collectively correspond to the illustrations and descriptions of FIGS. 1-5 .
- FIG. 6 is a flowchart of an example method of the present disclosure.
- the method can include a step 602 of determining a topology mesh and a geometry mesh, as well as a step 604 of determining a resulting mesh that combines a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh.
- step 604 includes sub-step 604.1 of copying the topology mesh to the resulting mesh.
- the method includes a step 604.2 of copying a UV boundary of the geometry mesh to the resulting mesh, along with a step 604.3 of restoring a UV parametrization of the resulting mesh to the topology mesh.
- the method includes a step 604.4 of copying a three-dimensional geometry of the geometry mesh to the resulting mesh.
- the resulting mesh comprises (1) a topology of the topology mesh of both 3D and UV representations; (2) UV parametrization of the topology mesh; (3) a 3D geometry of the geometry mesh; and (4) UV boundaries of UV shells of the geometry mesh.
- the method can also include a step 606 of outputting the resulting mesh.
- the resulting mesh is utilized to generate a representation of a garment that is applied to an avatar.
- the method can include resizing the avatar and the garment by recreating the resulting mesh by altering at least one of the UV boundary, the UV parameterization, or the three-dimensional geometry.
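The steps of the method of FIG. 6 can be sketched as a single driver function. The helper callables and their signatures below are assumptions made for illustration; the patent specifies these operations only at the flowchart level, so this is a sketch of the control flow, not the actual implementation.

```python
import copy

def build_resulting_mesh(topology_mesh, geometry_mesh,
                         copy_uv_boundary, restore_uv_param, copy_geometry):
    """Sketch of steps 604.1-604.4. The three step functions are passed in
    as callables that modify the resulting mesh in place; their exact
    signatures are illustrative assumptions."""
    resulting = copy.deepcopy(topology_mesh)    # 604.1: copy topology mesh to R
    copy_uv_boundary(resulting, geometry_mesh)  # 604.2: copy UV boundary of G to R
    restore_uv_param(resulting, topology_mesh)  # 604.3: restore UV parametrization to T
    copy_geometry(resulting, geometry_mesh)     # 604.4: copy 3D geometry of G to R
    return resulting                            # 606:   output the resulting mesh

# Demonstrate the call order with no-op stand-ins for the sub-steps.
order = []
resulting = build_resulting_mesh(
    {"t": 1}, {"g": 2},
    lambda r, g: order.append("boundary"),
    lambda r, t: order.append("param"),
    lambda r, g: order.append("geometry"))
print(resulting, order)
```

The deep copy in step 604.1 ensures later sub-steps mutate R without touching the original topology mesh T.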
- FIG. 7 is a flowchart of a method for copying a UV boundary.
- the method can include a step 702 of generating a modified resulting mesh by mapping UV boundaries of the resulting mesh onto UV boundaries of the geometry mesh.
- the generation of a modified resulting mesh can include a step 704.1 of determining geometry UV shells and resulting UV shells.
- the method can include a step 704.2 of determining one or more of the resulting UV shells that matches a geometry UV shell.
- the method can also include a step 704.3 of building UV boundaries comprising an array of corresponding pairs of the geometry UV shells and the resulting UV shells.
- the method can include a step 704.4 of redefining UV positions of resulting boundary points by mapping the resulting boundary points onto a curve defined by points of the geometry boundary.
- FIG. 8 is a flowchart of a method for restoring a UV parameterization.
- the method can include a step 802 of generating a modified resulting mesh by restoring the UV parameterization of the resulting mesh to be consistent with a UV parameterization of the topology mesh.
- the method can include a step 804.1 of determining topology UV shells and resulting UV shells. For each topology UV shell of the topology UV shells, the method can include a step 804.2 of subdividing the topology UV shells into one or more overlapping regions. In some embodiments, the method includes a step 804.
- the method can also include a step 804.4 of subdividing resulting UV shells into one or more overlapping regions, as well as a step 804.5 of building an array of pairs, wherein each element of the array comprises UV region(s) of the topology mesh and corresponding UV region(s) of the resulting mesh.
- the method can include a step 804.6 of processing pairs of topology and resulting regions, redefining UV positions of points of the resulting regions by enforcing the same relative point density in the resulting regions as in the paired topology regions.
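One common way to rearrange interior UV vertices while keeping boundary vertices pinned, so that the interior approaches a consistent relative point density, is uniform Laplacian smoothing. The sketch below is one possible realization and is not taken from the patent; the function and argument names are invented.

```python
def relax_interior_uv(uv, neighbors, boundary, iters=100):
    """Repeatedly move each interior UV vertex to the average of its
    neighbors, with boundary vertices held fixed.

    uv:        dict vertex id -> [u, v]
    neighbors: dict vertex id -> list of adjacent vertex ids
    boundary:  set of pinned (boundary) vertex ids
    """
    for _ in range(iters):
        for vid, nbrs in neighbors.items():
            if vid in boundary or not nbrs:
                continue
            u = sum(uv[n][0] for n in nbrs) / len(nbrs)
            v = sum(uv[n][1] for n in nbrs) / len(nbrs)
            uv[vid] = [u, v]
    return uv

# A displaced interior vertex surrounded by four pinned corners is pulled
# back to the centroid (0.5, 0.5) of its neighbors.
uv = {0: [0, 0], 1: [1, 0], 2: [1, 1], 3: [0, 1], 4: [0.9, 0.1]}
neighbors = {4: [0, 1, 2, 3]}
relax_interior_uv(uv, neighbors, boundary={0, 1, 2, 3})
print(uv[4])  # [0.5, 0.5]
```

More elaborate weighting schemes (e.g., cotangent or mean-value weights) would better preserve the topology mesh's original density pattern, at the cost of extra bookkeeping.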
- FIG. 9 is a flowchart of a method for copying a three-dimensional geometry.
- the method can include a step 902 of generating a modified resulting mesh by having points of the resulting mesh described in a three-dimensional shape of the geometry mesh, wherein the geometry mesh and the resulting mesh comprise similar or identical UV parameterization based on the UV boundary restoration process described above.
- the method can include a step 904.1 of determining source geometry UV shells and target/resulting UV shells. For each resulting UV shell in the target/resulting UV shells, the method can include a step 904.2 of determining one or more of a source/geometry UV shell of the source/geometry UV shells. For each resulting vertex of the resulting/target UV shells, the method can include a step 904.3 of mapping a UV vertex of the target/resulting mesh to a geometry UV shell of the source/geometry UV shells. The method can also include a step 904.
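The mapping of a vertex from UV shell space to the 3-D space of the garment can be done by locating the source-shell triangle containing the UV point and interpolating that triangle's 3-D corner positions with barycentric weights. The sketch below illustrates only the interpolation for a single, already-located triangle (triangle lookup is omitted); it is an illustrative assumption, not the patent's code.

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates (wa, wb, wc) of 2-D point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    d = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / d
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / d
    return wa, wb, 1.0 - wa - wb

def uv_to_3d(p_uv, tri_uv, tri_xyz):
    """Map a UV point inside a source-shell triangle to 3-D by interpolating
    the triangle's 3-D corner positions with the same barycentric weights."""
    w = barycentric(p_uv, *tri_uv)
    return tuple(sum(wi * corner[k] for wi, corner in zip(w, tri_xyz))
                 for k in range(3))

tri_uv = [(0, 0), (1, 0), (0, 1)]
tri_xyz = [(0, 0, 0), (2, 0, 0), (0, 0, 2)]
print(uv_to_3d((0.25, 0.25), tri_uv, tri_xyz))  # (0.5, 0.0, 0.5)
```

Because both meshes share the same UV parameterization after the boundary and parametrization steps, every resulting vertex falls inside (or on) some source triangle, so this per-triangle interpolation suffices to transfer the full 3-D shape.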
- FIG. 10 is a flowchart of an example method of the present disclosure. This method is a variation of the method disclosed above with respect to FIG. 6 .
- the method can include a step 1002 of determining a topology mesh and a geometry mesh, as well as a step 1004 of determining a resulting mesh that combines a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh.
- step 1004 includes sub-step 1004.1 of copying the topology mesh to the resulting mesh.
- the method includes a step 1004.2 of copying a UV boundary of the geometry mesh to the resulting mesh, along with a step 1004.3 of copying/restoring a UV parametrization of the resulting mesh to the geometry mesh.
- the method includes a step 1004.4 of copying a three-dimensional geometry of the geometry mesh to the resulting mesh.
- the method can also include a step 1006 of outputting the resulting mesh.
- the resulting mesh is utilized to generate a representation of a garment that is applied to an avatar.
- the method can include resizing the avatar and the garment by recreating the resulting mesh by altering at least one of the UV boundary, the UV parameterization, or the three-dimensional geometry.
- the steps of copying UV boundaries, UV parameterization, and 3D geometry can be accomplished using methods of FIGS. 7-9 . In contrast to the method of FIG. 6 , this method involves copying/restoring UV parameterization (e.g., density) of the geometry, rather than the topology.
- FIG. 11 is a flowchart of an example method of the present disclosure. This method is a variation of the method disclosed above with respect to FIG. 6 .
- the method can include a step 1102 of determining a topology mesh and a geometry mesh, as well as a step 1104 of determining a resulting mesh that combines a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh.
- step 1104 includes sub-step 1104.1 of copying the topology mesh to the resulting mesh.
- the method includes a step 1104.2 of copying a three-dimensional geometry of the geometry mesh to the resulting mesh.
- the method can also include a step 1106 of outputting the resulting mesh.
- the resulting mesh is utilized to generate a representation of a garment that is applied to an avatar.
- the method can include resizing the avatar and the garment by recreating the resulting mesh by altering at least one of the UV boundary, the UV parameterization, or the three-dimensional geometry.
- the step of copying geometry can be performed using the geometry copying methods set forth above. In contrast to the methods of FIGS. 6 and 10, this method excludes the steps of copying UV boundaries and restoring UV parameterization.
- FIG. 12 illustrates example pseudocode for implementing the methods disclosed herein for measurement space deformation and three-dimensional interpolation.
- the pseudocode correlates, in part, or in whole, to the methods of FIGS. 7-11 , as well as other descriptions provided herein.
- the term “source” shall be understood to include either the geometry mesh or the topology mesh based on context.
- the “target” will be understood to be the resulting mesh.
- VARIATION 1 involves creating a resulting mesh having a topology of the topology mesh, geometry of the geometry mesh, as well as UV boundaries of the geometry mesh and UV parametrization of topology mesh.
- An example of VARIATION 1 is illustrated and described with respect to FIG. 6 , and set forth more fully in the sections above.
- VARIATION 2 involves creating a resulting mesh having a topology of topology mesh and a geometry of the geometry mesh, as well as UV boundaries and UV parameterization of the geometry mesh.
- VARIATION 3 is similar to the methods of VARIATIONS 1 and 2, and involves creating a resulting mesh having a topology of topology mesh and a geometry of the geometry mesh, as well as UV boundaries and UV parameterization of the topology mesh.
- FIG. 13 is a diagrammatic representation of an example machine in the form of a computer system 1 , within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
- the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
- the computer system 1 includes a processor or multiple processor(s) 5 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), and a main memory 10 and static memory 15 , which communicate with each other via a bus 20 .
- the computer system 1 may further include a video display 35 (e.g., a liquid crystal display (LCD)).
- the computer system 1 may also include an alpha-numeric input device(s) 30 (e.g., a keyboard), a cursor control device (e.g., a mouse), a voice recognition or biometric verification unit (not shown), a drive unit 37 (also referred to as disk drive unit), a signal generation device 40 (e.g., a speaker), and a network interface device 45 .
- the computer system 1 may further include a data encryption module (not shown) to encrypt data.
- the drive unit 37 includes a computer or machine-readable medium 50 on which is stored one or more sets of instructions and data structures (e.g., instructions 55 ) embodying or utilizing any one or more of the methodologies or functions described herein.
- the instructions 55 may also reside, completely or at least partially, within the main memory 10 and/or within the processor(s) 5 during execution thereof by the computer system 1 .
- the main memory 10 and the processor(s) 5 may also constitute machine-readable media.
- the instructions 55 may further be transmitted or received over a network via the network interface device 45 utilizing any one of a number of well-known transfer protocols (e.g., Hyper Text Transfer Protocol (HTTP)).
- While the machine-readable medium 50 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions.
- The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions.
- the term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Such media may also include, without limitation, hard disks, flash memory cards, digital video disks, random access memory (RAM), read only memory (ROM), and the like.
- the example embodiments described herein may be implemented in an operating environment comprising software installed on a computer, in hardware, or in a combination of software and hardware.
- the components provided in the computer system 1 are those typically found in computer systems that may be suitable for use with embodiments of the present disclosure and are intended to represent a broad category of such computer components that are known in the art.
- the computer system 1 can be a server, minicomputer, mainframe computer, or any other computer system.
- the computer may also include different bus configurations, networked platforms, multi-processor platforms, and the like.
- Various operating systems may be used, including UNIX, LINUX, WINDOWS, QNX, ANDROID, IOS, CHROME, TIZEN, and other suitable operating systems.
- Some of the above-described functions may be composed of instructions that are stored on storage media (e.g., computer-readable medium).
- the instructions may be retrieved and executed by the processor.
- Some examples of storage media are memory devices, tapes, disks, and the like.
- the instructions are operational when executed by the processor to direct the processor to operate in accord with the technology. Those skilled in the art are familiar with instructions, processor(s), and storage media.
- the computer system 1 may be implemented as a cloud-based computing environment, such as a virtual machine operating within a computing cloud.
- the computer system 1 may itself include a cloud-based computing environment, where the functionalities of the computer system 1 are executed in a distributed fashion.
- the computer system 1 when configured as a computing cloud, may include pluralities of computing devices in various forms, as will be described in greater detail below.
- a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors (such as within web servers) and/or that combines the storage capacity of a large grouping of computer memories or storage devices.
- Systems that provide cloud-based resources may be utilized exclusively by their owners or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.
- the cloud is formed, for example, by a network of web servers that comprise a plurality of computing devices, such as the computer system 1 , with each server (or at least a plurality thereof) providing processor and/or storage resources.
- These servers manage workloads provided by multiple users (e.g., cloud resource customers or other users).
- each user places workload demands upon the cloud that vary in real-time, sometimes dramatically. The nature and extent of these variations typically depends on the type of business associated with the user.
- Non-volatile media include, for example, optical or magnetic disks, such as a fixed disk.
- Volatile media include dynamic memory, such as system RAM.
- Transmission media include coaxial cables, copper wire and fiber optics, among others, including the wires that comprise one embodiment of a bus.
- Transmission media can also take the form of acoustic or light waves, such as those generated during radio frequency (RF) and infrared (IR) data communications.
- Common forms of computer-readable media include, for example, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, any other physical medium with patterns of marks or holes, a RAM, a PROM, an EPROM, an EEPROM, a FLASH EPROM, any other memory chip or data exchange adapter, a carrier wave, or any other medium from which a computer can read.
- a bus carries the data to system RAM, from which a CPU retrieves and executes the instructions.
- the instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU.
- Computer program code for carrying out operations for aspects of the present technology may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Architecture (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
- This application claims the benefit and priority of U.S. Provisional Application Ser. No. 63/132,173, filed on Dec. 30, 2020, which is hereby incorporated by reference herein in its entirety, including all references and appendices cited therein, for all purposes, as if fully set forth herein.
- The detailed description is set forth regarding the accompanying drawings. The use of the same reference numerals may indicate similar or identical items. Various embodiments may utilize elements and/or components other than those illustrated in the drawings, and some elements and/or components may not be present in various embodiments. Elements and/or components in the figures are not necessarily drawn to scale. Throughout this disclosure, depending on the context, singular and plural terminology may be used interchangeably.
-
FIG. 1 is a screenshot of a topology mesh T and UV shells of the topology mesh T. -
FIG. 2 is a screenshot of an XYZ representation of the geometry mesh G; UV shells of the geometry mesh G are also illustrated. -
FIG. 3 is a screenshot of UV shells of an intermediate mesh. -
FIG. 4 is a screenshot illustrating consistent 2-D parametrization of UV shells achieved by rearranging interior UV vertices. -
FIG. 5 is a screenshot illustrating computed positions of a resulting mesh R's vertices by mapping them from UV shell space to the 3D space of a garment. -
FIG. 6 is a flowchart of an example method of the present disclosure. -
FIG. 7 is a flowchart of another example method of the present disclosure. -
FIG. 8 is a flowchart of yet another example method of the present disclosure. -
FIG. 9 is a flowchart of an additional example method of the present disclosure. -
FIG. 10 is a flowchart of a variation of the method of FIG. 6. -
FIG. 11 is a flowchart of another variation of the method of FIG. 6. -
FIG. 12 illustrates example code used to implement aspects of the present disclosure. -
FIG. 13 is a schematic diagram of an example computerized system of the present disclosure. - The present disclosure pertains to systems and methods for the generation of a realistic customized computer animation of a user wearing one or more particular cloth garments.
- In exemplary embodiments, an avatar representing the user's likeness in body shape and size can be generated, and different garments can be draped on the avatar of the user to simulate how the garment(s) would look on the actual body of the human user. The computer animation can be generated for any user, wearing any size garment, in substantially real-time.
- For context, the methods of the present disclosure can be performed by a specifically configured computing system.
FIG. 13 illustrates an example computer system that can be programmed for use in accordance with the present disclosure. That is, the computer system can be configured to provide three-dimensional modeling for use in generating a realistic computer avatar of a user wearing a particular garment. For clarity, the specifically programmed system disclosed herein will be referenced as the “3D system”. - Generally speaking, an example 3D system of the present disclosure utilizes mesh analysis to represent a three-dimensional surface. In some instances, the mesh can comprise a plurality of vertices, also referred to as points. Vertices can be connected by edges. In some instances, vertices and edges can form shapes such as triangles or other similar polygons. Thus, when edges that connect vertices form a closed loop, a face is created. The three-dimensional surface may be conceptually thought of as a pattern of faces.
- Any surface topology can be represented using vertices, edges, and faces. The creation of a mesh allows for accurate representation of a surface in three dimensions. These methods are more efficient in representing three-dimensional surfaces than point clouds. To be sure, while a three-dimensional surface can be sampled to identify points and create a point cloud, this process can be quite data intensive depending on the number of vertices/points obtained. The modeling disclosed herein provides advantages over point-cloud processes: it does not require the same amount of data, but still provides accurate 3D representations.
-
FIG. 1 is a screenshot of a topology mesh T related to a three-dimensional object, such as a shirt 100. An XYZ representation 102 of the topology mesh T and UV shells 104 are also illustrated. The topology includes vertices, such as vertex 101, edges, such as edge 103, and faces, such as face 105. As noted, the topology includes a plurality of faces. In some embodiments, the 3D system can assign each vertex/point a unique identifier. Edges connecting two vertices can also be numbered uniquely, as well as identified in terms of the vertices they span. The same identification process can be used for faces, allowing the 3D system to know which edges and vertices belong to which faces. This collective information is referred to as topology. Further, when referring to geometry, this will be understood to include the surface/face (or group of faces) implied by the relative 3D position of each of the vertices associated with the face(s). In FIG. 1, the dark lines or edges form a topology and the shirt is the underlying geometry. The vertices, edges, and faces of the topology form a consistent representation of the underlying geometry of the shirt 100. - For context, a topology mesh comprises a mesh having a desired topology and desired UV shell density. Also, for context, a geometry mesh includes a mesh having a desired geometry and desired UV shell boundaries. The use of both geometry and topology meshes will be described in greater detail below.
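- For illustration only, the vertex/edge/face bookkeeping described above might be sketched as follows; the class and field names are hypothetical assumptions of this sketch, not structures required by the present disclosure:

```python
# Illustrative only: a hypothetical mesh container with uniquely indexed
# vertices, faces that reference vertex ids, and edges derived from faces.
class Mesh:
    def __init__(self, vertices, faces):
        self.vertices = vertices          # vertex id -> (x, y, z)
        self.faces = faces                # face id -> tuple of vertex ids
        self.edges = self._build_edges()  # {v_a, v_b} -> ids of faces sharing it

    def _build_edges(self):
        edges = {}
        for fid, face in self.faces.items():
            for i in range(len(face)):
                edge = frozenset((face[i], face[(i + 1) % len(face)]))
                edges.setdefault(edge, []).append(fid)
        return edges

# A single triangular face: three vertices, three edges, one face.
tri = Mesh({0: (0.0, 0.0, 0.0), 1: (1.0, 0.0, 0.0), 2: (0.0, 1.0, 0.0)},
           {0: (0, 1, 2)})
```

Deriving edges from faces in this way records which faces share each edge, which is the kind of connectivity information referred to above as topology.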
-
FIG. 2 illustrates an XYZ representation 102 of the geometry mesh G; UV shells 104 of the geometry mesh G are also illustrated. The 3D system can determine a resulting mesh R that has a desired topology and desired UV shell density. - That is, the resulting mesh R combines the topology of T with the UV and geometry of G. It will be understood that T and G have the same UV shells in terms of the number of boundaries and boundary topological (connectivity) information. The 3D system can create R by initializing it to T. This process can include establishing consistency in a 1-D parametrization of the boundaries of the UV shells of the meshes R and G, by parametrizing each boundary from 0 to 1 and forcing a consistent direction. The 3D system can, based on the consistency of the 1-D boundary parametrization, project the mesh R's UV boundary vertices onto G's UV boundaries to ensure the same UV boundary contours. UV shells of the mesh R are illustrated in
FIG. 3 as representations 106. - As illustrated in
FIG. 4, the 3D system can establish a consistent 2-D parametrization of the UV shells by rearranging interior UV vertices. Rearranged interior UV vertices 108 are also illustrated. - As best illustrated in
FIG. 5, the 3D system can, based on the consistent 2-D shell parametrization, compute positions of R's vertices by mapping them from UV shell space to the 3D space of a garment. An XYZ representation 110 of the mesh R prior to rearranging interior UV vertices, as well as an XYZ representation 112 of the mesh R after rearranging interior UV vertices, is also illustrated. - With the disclosed techniques, a realistic customized computer animation of a user wearing one or more particular cloth garments can be generated in substantially real-time for a user of any shape and size. To be sure,
FIGS. 6-9 collectively correspond to the illustrations and descriptions of FIGS. 1-5. -
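- For illustration only, the 1-D boundary parametrization described above (each UV shell boundary parametrized from 0 to 1 in a consistent direction) can be read as normalized cumulative arc length. The following sketch illustrates that reading; it is not asserted to be the disclosure's implementation:

```python
import math

def boundary_parameters(points):
    """Normalized cumulative arc-length parameters in [0, 1) for a closed
    boundary loop of UV points, assuming a consistent winding direction."""
    n = len(points)
    seg = [math.dist(points[i], points[(i + 1) % n]) for i in range(n)]
    total = sum(seg)
    params, t = [], 0.0
    for s in seg:
        params.append(t / total)
        t += s
    return params

# A unit-square boundary: its corners land at t = 0, 0.25, 0.5 and 0.75.
print(boundary_parameters([(0, 0), (1, 0), (1, 1), (0, 1)]))
# [0.0, 0.25, 0.5, 0.75]
```

Because both R's and G's boundaries receive parameters in the same [0, 1) range with the same direction, corresponding boundary vertices can be matched by parameter value alone.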
FIG. 6 is a flowchart of an example method of the present disclosure. The method can include a step 602 of determining a topology mesh and a geometry mesh, as well as a step 604 of determining a resulting mesh that combines a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh. In some instances, step 604 includes sub-step 604.1 of copying the topology mesh to the resulting mesh. Next, the method includes a step 604.2 of copying a UV boundary of the geometry mesh to the resulting mesh, along with a step 604.3 of restoring a UV parametrization of the resulting mesh to the topology mesh. In some embodiments, the method includes a step 604.4 of copying a three-dimensional geometry of the geometry mesh to the resulting mesh. The resulting mesh comprises (1) a topology of the topology mesh of both 3D and UV representations; (2) UV parametrization of the topology mesh; (3) a 3D geometry of the geometry mesh; and (4) UV boundaries of UV shells of the geometry mesh. - The method can also include a
step 606 of outputting the resulting mesh. The resulting mesh is utilized to generate a representation of a garment that is applied to an avatar. In some instances, the method can include resizing the avatar and the garment by recreating the resulting mesh by altering at least one of the UV boundary, the UV parameterization, or the three-dimensional geometry. -
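- For illustration only, sub-steps 604.1-604.4 compose into a simple pipeline. The sketch below uses plain dictionaries as stand-in meshes, and each helper is a deliberately trivial placeholder for the corresponding method of FIGS. 7-9:

```python
# Schematic composition of sub-steps 604.1-604.4. Dicts stand in for meshes;
# each helper is a trivial placeholder for the methods of FIGS. 7-9.
def copy_uv_boundary(geometry_mesh, resulting):            # step 604.2
    resulting["uv_boundary"] = dict(geometry_mesh["uv_boundary"])

def restore_uv_parametrization(topology_mesh, resulting):  # step 604.3
    resulting["uv"] = dict(topology_mesh["uv"])

def copy_3d_geometry(geometry_mesh, resulting):            # step 604.4
    resulting["xyz"] = dict(geometry_mesh["xyz"])

def build_resulting_mesh(topology_mesh, geometry_mesh):
    resulting = {"faces": list(topology_mesh["faces"]),    # step 604.1
                 "uv": dict(topology_mesh["uv"])}
    copy_uv_boundary(geometry_mesh, resulting)
    restore_uv_parametrization(topology_mesh, resulting)
    copy_3d_geometry(geometry_mesh, resulting)
    return resulting                                       # step 606: output

T = {"faces": [(0, 1, 2)], "uv": {0: (0, 0), 1: (1, 0), 2: (0, 1)}}
G = {"uv_boundary": {0: (0, 0), 1: (1, 0), 2: (0, 1)},
     "xyz": {0: (0, 0, 0), 1: (1, 0, 0), 2: (0, 1, 0)}}
R = build_resulting_mesh(T, G)
print(sorted(R))  # ['faces', 'uv', 'uv_boundary', 'xyz']
```

The resulting stand-in mesh carries exactly the four components enumerated above: topology and UV parametrization from T, and 3D geometry and UV boundaries from G.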
FIG. 7 is a flowchart of a method for copying a UV boundary. The method can include a step 702 of generating a modified resulting mesh by mapping UV boundaries of the resulting mesh onto UV boundaries of the geometry mesh. - In some embodiments, the generation of a modified resulting mesh can include a step 704.1 of determining geometry UV shells and resulting UV shells.
- In some instances, for each geometry UV shell of the geometry UV shells, the method can include a step 704.2 of determining one or more of the resulting UV shells that matches a geometry UV shell. The method can also include a step 704.3 of building UV boundaries comprising an array of corresponding pairs of the geometry UV shells and the resulting UV shells. For each resulting boundary and geometry boundary in the UV boundaries, the method can include a step 704.4 of redefining UV positions of resulting boundary points by mapping the resulting boundary points onto a curve defined by points of the geometry boundary.
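- For illustration only, step 704.4 (redefining UV positions of resulting boundary points by mapping them onto the curve defined by the geometry boundary) can be pictured as evaluating the geometry boundary polyline at each point's normalized 1-D parameter. The function below is an illustrative sketch under that reading:

```python
import math

def point_at_parameter(boundary, t):
    """Evaluate a closed boundary polyline at normalized arc length t."""
    n = len(boundary)
    seg = [math.dist(boundary[i], boundary[(i + 1) % n]) for i in range(n)]
    target = (t % 1.0) * sum(seg)
    for i, s in enumerate(seg):
        if target <= s or i == n - 1:
            a, b = boundary[i], boundary[(i + 1) % n]
            w = target / s if s else 0.0
            return (a[0] + w * (b[0] - a[0]), a[1] + w * (b[1] - a[1]))
        target -= s

# Each resulting boundary point would be re-placed at its own parameter on
# the geometry boundary; e.g., the point at t = 0.125 on a unit square:
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(point_at_parameter(square, 0.125))  # (0.5, 0.0)
```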
-
FIG. 8 is a flowchart of a method for restoring a UV parameterization. Conceptually, this process restores or preserves density information that may have been modified during the process of copying UV boundaries. The method can include a step 802 of generating a modified resulting mesh by restoring the UV parameterization of the resulting mesh to be consistent with a UV parameterization of the topology mesh. The method can include a step 804.1 of determining topology UV shells and resulting UV shells. For each topology UV shell of the topology UV shells, the method can include a step 804.2 of subdividing the topology UV shells into one or more overlapping regions. In some embodiments, the method includes a step 804.3 of determining one or more resulting UV shells that correspond to a topology UV shell. The method can also include a step 804.4 of subdividing resulting UV shells into one or more overlapping regions, as well as a step 804.5 of building an array of pairs, wherein each element of the array comprises UV region(s) of the topology mesh and corresponding UV region(s) of the resulting mesh. In some embodiments, for each pair created from the overlapping topology regions and resulting regions in the array, the method can include a step 804.6 of processing pairs of topology and resulting regions, redefining UV positions of points of the resulting regions by enforcing a same relative point density in the resulting regions as in the paired topology regions. -
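- For illustration only, the density enforcement of step 804.6 can be pictured in one dimension: the relative spacing of the topology region's points is transferred onto the resulting region's parameter interval. The sketch below is a deliberately simplified 1-D analogue of the 2-D rearrangement described above:

```python
def match_relative_density(topology_params, lo, hi):
    """Transfer the relative spacing of topology_params (values in [0, 1])
    onto the resulting region's parameter interval [lo, hi]."""
    return [lo + t * (hi - lo) for t in topology_params]

# Points clustered near the start of the topology region stay clustered
# near the start of the resulting region.
print(match_relative_density([0.0, 0.1, 0.3, 1.0], 0.5, 1.0))
```

The full 2-D case rearranges interior UV vertices over a region rather than along an interval, but the invariant is the same: relative point density in the resulting region matches the paired topology region.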
FIG. 9 is a flowchart of a method for copying a three-dimensional geometry. The method can include a step 902 of generating a modified resulting mesh by having points of the resulting mesh described in a three-dimensional shape of the geometry mesh, wherein the geometry mesh and the resulting mesh comprise similar or identical UV parameterization based on the UV boundary restoration process described above. - Generating a modified target/resulting mesh can include various sub-steps. Thus, the method can include a step 904.1 of determining source geometry UV shells and target/resulting UV shells. For each resulting UV shell in the target/resulting UV shells, the method can include a step 904.2 of determining one or more source/geometry UV shells of the source/geometry UV shells. For each resulting vertex of the resulting/target UV shells, the method can include a step 904.3 of mapping a UV vertex of the target/resulting mesh to a geometry UV shell of the source/geometry UV shells. The method can also include a step 904.4 of expressing a projection of the UV vertex in barycentric coordinates of the source/geometry mesh in UV space. It will be understood that a three-dimensional shape of the target/resulting mesh is defined by XYZ components of the target/resulting vertices. It will be further understood that XYZ components of the target/resulting vertices may be the result of evaluating the barycentric coordinates of the source/geometry mesh in three-dimensional space.
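- For illustration only, the barycentric evaluation of step 904.4 can be sketched directly: a UV vertex is expressed in barycentric coordinates of a UV triangle of the source/geometry mesh, and the same weights are then evaluated against that triangle's XYZ vertices. The sketch below omits the search for the containing triangle:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of 2-D point p in the UV triangle (a, b, c)."""
    det = (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])
    w1 = ((p[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (p[1] - a[1])) / det
    w2 = ((b[0] - a[0]) * (p[1] - a[1]) - (p[0] - a[0]) * (b[1] - a[1])) / det
    return (1.0 - w1 - w2, w1, w2)

def uv_to_xyz(uv, tri_uv, tri_xyz):
    """Evaluate UV-space barycentric weights against the triangle's XYZ
    vertices (the evaluation referred to in step 904.4)."""
    w = barycentric(uv, *tri_uv)
    return tuple(sum(w[i] * tri_xyz[i][k] for i in range(3)) for k in range(3))

tri_uv = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
tri_xyz = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 0.0, 2.0)]
print(uv_to_xyz((0.5, 0.5), tri_uv, tri_xyz))  # (1.0, 0.0, 1.0)
```

Because the weights are computed in UV space but evaluated in 3D space, identical UV parameterizations of the two meshes carry the geometry mesh's shape onto the resulting mesh's vertices.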
FIG. 10 is a flowchart of an example method of the present disclosure. This method is a variation of the method disclosed above with respect to FIG. 6. The method can include a step 1002 of determining a topology mesh and a geometry mesh, as well as a step 1004 of determining a resulting mesh that combines a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh. In some instances, step 1004 includes sub-step 1004.1 of copying the topology mesh to the resulting mesh. Next, the method includes a step 1004.2 of copying a UV boundary of the geometry mesh to the resulting mesh, along with a step 1004.3 of copying/restoring a UV parametrization of the resulting mesh to the geometry mesh. In some embodiments, the method includes a step 1004.4 of copying a three-dimensional geometry of the geometry mesh to the resulting mesh. - The method can also include a
step 1006 of outputting the resulting mesh. The resulting mesh is utilized to generate a representation of a garment that is applied to an avatar. In some instances, the method can include resizing the avatar and the garment by recreating the resulting mesh by altering at least one of the UV boundary, the UV parameterization, or the three-dimensional geometry. The steps of copying UV boundaries, UV parameterization, and 3D geometry can be accomplished using the methods of FIGS. 7-9. In contrast to the method of FIG. 6, this method involves copying/restoring the UV parameterization (e.g., density) of the geometry, rather than that of the topology. -
FIG. 11 is a flowchart of an example method of the present disclosure. This method is a variation of the method disclosed above with respect to FIG. 6. The method can include a step 1102 of determining a topology mesh and a geometry mesh, as well as a step 1104 of determining a resulting mesh that combines a topology of the topology mesh with a bi-dimensional (UV) coordinate space, and a geometry of the geometry mesh. In some instances, step 1104 includes sub-step 1104.1 of copying the topology mesh to the resulting mesh. In some embodiments, the method includes a step 1104.2 of copying a three-dimensional geometry of the geometry mesh to the resulting mesh. - The method can also include a
step 1106 of outputting the resulting mesh. The resulting mesh is utilized to generate a representation of a garment that is applied to an avatar. In some instances, the method can include resizing the avatar and the garment by recreating the resulting mesh by altering at least one of the UV boundary, the UV parameterization, or the three-dimensional geometry. It will be understood that the step of copying geometry can be performed using the geometry copying methods set forth above, with the exception that, as in the method related to FIG. 10, the UV parameterization of the geometry is copied/restored rather than that of the topology. This method excludes the steps related to copying UV boundaries and UV parameterization restoration. -
FIG. 12 illustrates example pseudocode for implementing the methods disclosed herein for measurement space deformation and three-dimensional interpolation. The pseudocode correlates, in part or in whole, to the methods of FIGS. 7-11, as well as other descriptions provided herein. As used herein, the term “source” shall be understood to include either the geometry mesh or the topology mesh based on context. The “target” will be understood to be the resulting mesh. - In general, there are three variations of general methods that can be used in accordance with the present disclosure. The first variation (VARIATION 1) involves creating a resulting mesh having a topology of the topology mesh and a geometry of the geometry mesh, as well as UV boundaries of the geometry mesh and UV parametrization of the topology mesh. An example of
VARIATION 1 is illustrated and described with respect to FIG. 6, and set forth more fully in the sections above. -
VARIATION 2 involves creating a resulting mesh having a topology of the topology mesh and a geometry of the geometry mesh, as well as UV boundaries and UV parameterization of the geometry mesh. - A third variation (VARIATION 3) is similar to the methods of
VARIATIONS 1 and 2, but excludes the steps of copying UV boundaries and restoring UV parameterization. - Exemplary Computing System
-
FIG. 13 is a diagrammatic representation of an example machine in the form of a computer system 1, within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In various example embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein. - The
computer system 1 includes a processor or multiple processor(s) 5 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), and a main memory 10 and static memory 15, which communicate with each other via a bus 20. The computer system 1 may further include a video display 35 (e.g., a liquid crystal display (LCD)). The computer system 1 may also include an alpha-numeric input device(s) 30 (e.g., a keyboard), a cursor control device (e.g., a mouse), a voice recognition or biometric verification unit (not shown), a drive unit 37 (also referred to as disk drive unit), a signal generation device 40 (e.g., a speaker), and a network interface device 45. The computer system 1 may further include a data encryption module (not shown) to encrypt data. - The
drive unit 37 includes a computer or machine-readable medium 50 on which is stored one or more sets of instructions and data structures (e.g., instructions 55) embodying or utilizing any one or more of the methodologies or functions described herein. The instructions 55 may also reside, completely or at least partially, within the main memory 10 and/or within the processor(s) 5 during execution thereof by the computer system 1. The main memory 10 and the processor(s) 5 may also constitute machine-readable media. - The
instructions 55 may further be transmitted or received over a network via the network interface device 45 utilizing any one of a number of well-known transfer protocols (e.g., Hyper Text Transfer Protocol (HTTP)). While the machine-readable medium 50 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions. - The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions. The term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Such media may also include, without limitation, hard disks, flash memory cards, digital video disks, random access memory (RAM), read only memory (ROM), and the like. The example embodiments described herein may be implemented in an operating environment comprising software installed on a computer, in hardware, or in a combination of software and hardware.
- The components provided in the
computer system 1 are those typically found in computer systems that may be suitable for use with embodiments of the present disclosure and are intended to represent a broad category of such computer components that are known in the art. Thus, the computer system 1 can be a server, minicomputer, mainframe computer, or any other computer system. The computer may also include different bus configurations, networked platforms, multi-processor platforms, and the like. Various operating systems may be used including UNIX, LINUX, WINDOWS, QNX, ANDROID, IOS, CHROME, TIZEN, and other suitable operating systems. - Some of the above-described functions may be composed of instructions that are stored on storage media (e.g., computer-readable medium). The instructions may be retrieved and executed by the processor. Some examples of storage media are memory devices, tapes, disks, and the like. The instructions are operational when executed by the processor to direct the processor to operate in accord with the technology. Those skilled in the art are familiar with instructions, processor(s), and storage media.
- In some embodiments, the
computer system 1 may be implemented as a cloud-based computing environment, such as a virtual machine operating within a computing cloud. In other embodiments, the computer system 1 may itself include a cloud-based computing environment, where the functionalities of the computer system 1 are executed in a distributed fashion. Thus, the computer system 1, when configured as a computing cloud, may include pluralities of computing devices in various forms, as will be described in greater detail below. - In general, a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors (such as within web servers) and/or that combines the storage capacity of a large grouping of computer memories or storage devices. Systems that provide cloud-based resources may be utilized exclusively by their owners or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.
- The cloud is formed, for example, by a network of web servers that comprise a plurality of computing devices, such as the
computer device 1, with each server (or at least a plurality thereof) providing processor and/or storage resources. These servers manage workloads provided by multiple users (e.g., cloud resource customers or other users). Typically, each user places workload demands upon the cloud that vary in real-time, sometimes dramatically. The nature and extent of these variations typically depends on the type of business associated with the user. - It is noteworthy that any hardware platform suitable for performing the processing described herein is suitable for use with the technology. The terms “computer-readable storage medium” and “computer-readable storage media” as used herein refer to any medium or media that participate in providing instructions to a CPU for execution. Such media can take many forms, including, but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks, such as a fixed disk. Volatile media include dynamic memory, such as system RAM. Transmission media include coaxial cables, copper wire and fiber optics, among others, including the wires that comprise one embodiment of a bus. Transmission media can also take the form of acoustic or light waves, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media include, for example, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, any other physical medium with patterns of marks or holes, a RAM, a PROM, an EPROM, an EEPROM, a FLASHEPROM, any other memory chip or data exchange adapter, a carrier wave, or any other medium from which a computer can read.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU.
- Computer program code for carrying out operations for aspects of the present technology may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- The foregoing detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show illustrations in accordance with exemplary embodiments. These example embodiments, which are also referred to herein as “examples,” are described in enough detail to enable those skilled in the art to practice the present subject matter.
- The embodiments can be combined, other embodiments can be utilized, or structural, logical, and electrical changes can be made without departing from the scope of what is claimed. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope is defined by the appended claims and their equivalents. In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one. In this document, the term “or” is used to refer to a nonexclusive “or,” such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. Furthermore, all publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated reference(s) should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.
- The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present technology has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. Exemplary embodiments were chosen and described to best explain the principles of the present technology and its practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
- While various embodiments have been described above, they have been presented by way of example only, and not limitation. The descriptions are not intended to limit the scope of the technology to the forms set forth herein. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments. The above description is illustrative and not restrictive.
Claims (18)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/357,394 US20220207828A1 (en) | 2020-12-30 | 2021-06-24 | Systems and methods of three-dimensional modeling for use in generating a realistic computer avatar and garments |
US17/920,716 US20230169726A1 (en) | 2020-12-30 | 2021-12-14 | Measurement space deformation interpolation |
PCT/US2021/063353 WO2022146681A1 (en) | 2020-12-30 | 2021-12-14 | Three-dimensional modeling for use in generating a realistic garmented avatar |
US17/719,934 US20220237846A1 (en) | 2020-12-30 | 2022-04-13 | Generation and simultaneous display of multiple digitally garmented avatars |
PCT/US2022/024601 WO2022221398A1 (en) | 2021-04-14 | 2022-04-13 | Generation and simultaneous display of multiple digitally garmented avatars |
EP22788848.4A EP4275167A1 (en) | 2021-04-14 | 2022-04-13 | Generation and simultaneous display of multiple digitally garmented avatars |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063132173P | 2020-12-30 | 2020-12-30 | |
US17/357,394 US20220207828A1 (en) | 2020-12-30 | 2021-06-24 | Systems and methods of three-dimensional modeling for use in generating a realistic computer avatar and garments |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/231,325 Continuation-In-Part US11663764B2 (en) | 2020-12-30 | 2021-04-15 | Automatic creation of a photorealistic customized animated garmented avatar |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/920,716 Continuation US20230169726A1 (en) | 2020-12-30 | 2021-12-14 | Measurement space deformation interpolation |
US17/706,420 Continuation-In-Part US20220237857A1 (en) | 2020-12-30 | 2022-03-28 | Producing a digital image representation of a body |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220207828A1 true US20220207828A1 (en) | 2022-06-30 |
Family
ID=82119415
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/357,394 Abandoned US20220207828A1 (en) | 2020-12-30 | 2021-06-24 | Systems and methods of three-dimensional modeling for use in generating a realistic computer avatar and garments |
US17/920,716 Pending US20230169726A1 (en) | 2020-12-30 | 2021-12-14 | Measurement space deformation interpolation |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/920,716 Pending US20230169726A1 (en) | 2020-12-30 | 2021-12-14 | Measurement space deformation interpolation |
Country Status (2)
Country | Link |
---|---|
US (2) | US20220207828A1 (en) |
WO (1) | WO2022146681A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20240020935A1 (en) * | 2022-07-15 | 2024-01-18 | The Boeing Company | Modeling system for 3d virtual model |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6546309B1 (en) * | 2000-06-29 | 2003-04-08 | Kinney & Lange, P.A. | Virtual fitting room |
US6614431B1 (en) * | 2001-01-18 | 2003-09-02 | David J. Collodi | Method and system for improved per-pixel shading in a computer graphics system |
US20150351477A1 (en) * | 2014-06-09 | 2015-12-10 | GroupeSTAHL | Apparatuses And Methods Of Interacting With 2D Design Documents And 3D Models And Generating Production Textures for Wrapping Artwork Around Portions of 3D Objects |
US20160209929A1 (en) * | 2015-01-20 | 2016-07-21 | Jahja I. Trisnadi | Method and system for three-dimensional motion-tracking |
US20180240280A1 (en) * | 2015-08-14 | 2018-08-23 | Metail Limited | Method and system for generating an image file of a 3d garment model on a 3d body model |
US20200151807A1 (en) * | 2018-11-14 | 2020-05-14 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for automatically generating three-dimensional virtual garment model using product description |
US10706636B2 (en) * | 2017-06-26 | 2020-07-07 | v Personalize Inc. | System and method for creating editable configurations of 3D model |
US20210056767A1 (en) * | 2019-08-19 | 2021-02-25 | Clo Virtual Fashion Inc. | Automated grading of clothing patterns of garment |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6731287B1 (en) * | 2000-10-12 | 2004-05-04 | Momentum Bilgisayar, Yazilim, Danismanlik, Ticaret A.S. | Method for animating a 3-D model of a face |
GB2458388A (en) * | 2008-03-21 | 2009-09-23 | Dressbot Inc | A collaborative online shopping environment, virtual mall, store, etc. in which payments may be shared, products recommended and users modelled. |
US8982122B2 (en) * | 2008-11-24 | 2015-03-17 | Mixamo, Inc. | Real time concurrent design of shape, texture, and motion for 3D character animation |
US8700477B2 (en) * | 2009-05-26 | 2014-04-15 | Embodee Corp. | Garment fit portrayal system and method |
US20130314401A1 (en) * | 2012-05-23 | 2013-11-28 | 1-800 Contacts, Inc. | Systems and methods for generating a 3-d model of a user for a virtual try-on product |
US10657709B2 (en) * | 2017-10-23 | 2020-05-19 | Fit3D, Inc. | Generation of body models and measurements |
EP3706989A1 (en) * | 2017-10-27 | 2020-09-16 | Technische Universität Berlin | Auxetic structure and a method for manufacturing an auxetic structure |
US10777021B2 (en) * | 2018-02-27 | 2020-09-15 | Soul Vision Creations Private Limited | Virtual representation creation of user for fit and style of apparel and accessories |
2021
- 2021-06-24 US US17/357,394 patent/US20220207828A1/en not_active Abandoned
- 2021-12-14 US US17/920,716 patent/US20230169726A1/en active Pending
- 2021-12-14 WO PCT/US2021/063353 patent/WO2022146681A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20230169726A1 (en) | 2023-06-01 |
WO2022146681A1 (en) | 2022-07-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11972529B2 (en) | Augmented reality system | |
US20200364937A1 (en) | System-adaptive augmented reality | |
CA3090747C (en) | Automatic rig creation process | |
CN109636919B (en) | Holographic technology-based virtual exhibition hall construction method, system and storage medium | |
US20230085468A1 (en) | Advanced Automatic Rig Creation Processes | |
CN109844818B (en) | Method for building deformable 3d model of element and related relation | |
AU2017272304B2 (en) | Auto vr: an assistant system for virtual reality painting | |
US10571893B2 (en) | Orientation optimization in 3D printing | |
CN109493431B (en) | 3D model data processing method, device and system | |
CN111583379A (en) | Rendering method and device of virtual model, storage medium and electronic equipment | |
US20220207828A1 (en) | Systems and methods of three-dimensional modeling for use in generating a realistic computer avatar and garments | |
CN111382618A (en) | Illumination detection method, device, equipment and storage medium for face image | |
CN114219001A (en) | Model fusion method and related device | |
US20210082192A1 (en) | Topology-change-aware volumetric fusion for real-time dynamic 4d reconstruction | |
CN111026895A (en) | Data visualization processing method and device and storage medium | |
US11694414B2 (en) | Method and apparatus for providing guide for combining pattern pieces of clothing | |
Mezhenin et al. | The synthesis of virtual space in the context of insufficient data | |
Vierjahn et al. | Surface-reconstructing growing neural gas: A method for online construction of textured triangle meshes | |
JP2020013390A (en) | Information processing apparatus, information processing program, and information processing method | |
WO2023179091A1 (en) | Three-dimensional model rendering method and apparatus, and device, storage medium and program product | |
US10957118B2 (en) | Terahertz sensors and photogrammetry applications | |
CN115713609A (en) | Image generation method, image generation device, electronic equipment and storage medium | |
CN113888394A (en) | Image deformation method and device and electronic equipment | |
Mashalkar et al. | Creating Personalized Avatars | |
JP2023011218A (en) | Map information generation device, position identifying device, map information generation method, position identifying method, program, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | | |
Owner name: SPREE3D CORPORATION, NEVADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PINSKIY, DMITRIY VLADLENOVICH;REEL/FRAME:057124/0799 Effective date: 20210805 |
STPP | Information on status: patent application and granting procedure in general | | |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment | | |
Owner name: SPREE3D CORPORATION, NEVADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PINSKIY, DMITRIY VLADLENOVICH;REEL/FRAME:061565/0517 Effective date: 20221026 |
STPP | Information on status: patent application and granting procedure in general | | |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation | | |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |