US20150269291A1 - Data processing apparatus and data processing program - Google Patents
- Publication number: US20150269291A1 (application Ser. No. 14/641,570)
- Authority: US (United States)
- Prior art keywords
- model
- deformation
- points
- garment
- position coordinates
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F17/5009—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2113/00—Details relating to the application field
- G06F2113/12—Cloth
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2024—Style variation
Definitions
- Embodiments described herein relate generally to a data processing apparatus and data processing program.
- a body shape and a posture of a human body are sensed from a real video to generate a human body model.
- a garment model is deformed and combined with the human body model according to the shape of the human body model. Consequently, a person can have virtual experience as if the person actually tries on a garment.
- furniture or bedding such as a table or a bed is sensed from a real video to generate a furniture or bedding model.
- a model of a tablecloth, a sheet, or the like is deformed and combined with the furniture or bedding model according to the shape of the furniture or bedding model. Consequently, a person can have virtual experience as if the person actually changes an interior of a room.
- when both of an object to be combined (the human body, the table, the bed, or the like) and a combining object (the garment, the tablecloth, the sheet, or the like) are visualized by the CG, VR representation is realized.
- when the object to be combined is shown as a real video and the combining object is visualized by the CG and superimposed on the video, AR representation is realized.
- a technique for virtually deforming the model of the combining object according to a model shape of the object to be combined is necessary.
- examples of a method of deforming a model include a method of deforming the model by a physical simulation that takes into account a mechanical characteristic of the combining object, the gravity, and the like, and a method of assuming a plurality of kinds of objects to be combined in advance, calculating the deformation that occurs when the combining object is matched to each of them, accumulating the results of the calculation, and, when an object to be combined actually appears, selecting the calculation result closest to the real object to be combined.
- the method by the physical simulation requires a lot of computer resources and a long calculation time.
- the method of accumulating the calculation results in advance requires vast simulations beforehand and uses a calculation result obtained from assumed objects to be combined that differ from the real object to be combined. Therefore, the accuracy of the calculation tends to deteriorate.
- FIG. 1 is a block diagram illustrating a data processing apparatus according to a first embodiment
- FIG. 2 is a diagram schematically illustrating a change of data in a data processing method according to the first embodiment
- FIG. 3 is a flowchart illustrating the data processing method according to the first embodiment
- FIG. 4 is a diagram illustrating a garment model in the first embodiment
- FIG. 5 is a diagram illustrating control weight information of a texture format
- FIG. 6 is a diagram illustrating designation of gap information as an absolute value
- FIG. 7 is a diagram illustrating designation of the gap information as a relative value
- FIG. 8 is a diagram illustrating a human body model
- FIG. 9 is a block diagram illustrating a data processing apparatus according to a second embodiment
- FIG. 10A is a diagram illustrating a deformation history at time (t−1)
- FIG. 10B is a diagram illustrating a control-point calculating method at time t
- FIG. 11 is a time chart illustrating a data processing method according to the second embodiment
- FIG. 12 is a flowchart illustrating the data processing method according to the second embodiment
- a data processing apparatus includes a control-point calculating unit and a deformation processing unit.
- the control-point calculating unit calculates target position coordinates on the basis of a first model representing a shape of a first object, deformation parameters representing characteristics of deformation of the first object, and a second model representing a shape of a second object.
- the target position coordinates are the coordinates to which points of the first model should move according to the second model when the first object is deformed according to the second object.
- the deformation processing unit calculates reaching position coordinates, where the points actually reach, to minimize a sum of absolute values of differences between the target position coordinates and the reaching position coordinates. The sum is obtained by taking into account importance levels of the points.
- a series of data processing for deforming a model of a combining object (a first object) according to the shape of an object to be combined (a second object) is specifically described.
- an example of the object to be combined is a human body and an example of the combining object is a garment.
- contents of deformation parameters and a method of using the deformation parameters are described in detail.
- a data processing apparatus is a data processing apparatus that simulates a shape after deformation of a combining object deformed according to an object to be combined when the combining object is applied to the object to be combined. More specifically, the data processing apparatus is an apparatus that simulates deformation of a garment when the garment is virtually worn on a human body.
- “the combining object is applied to the object to be combined” means deforming the shape of the combining object to fit the shape of the object to be combined and is a concept including, for example, “the garment is worn on the human body”.
- FIG. 1 is a block diagram illustrating the data processing apparatus according to the embodiment.
- a data processing apparatus 1 includes a garment-model acquiring unit 11 , a human-body-model acquiring unit 12 , a deformation-parameter acquiring unit 13 , a control-point calculating unit 14 , and a deformation processing unit 15 .
- a garment model D 1 , which is a combining model (a first model), a human body model D 2 , which is a model to be combined (a second model), and deformation parameters D 3 of the garment model are input to the data processing apparatus 1 .
- the garment model D 1 is data representing the shape of the garment, which is the combining object.
- the deformation parameters D 3 are data representing characteristics of deformation of the garment.
- the human body model D 2 is data representing the shape of the human body, which is the object to be combined. Details of the garment model D 1 , the human body model D 2 , and the deformation parameters D 3 are described below.
- the garment-model acquiring unit 11 acquires the garment model D 1 from the outside of the data processing apparatus 1 .
- the human-body-model acquiring unit 12 acquires the human body model D 2 from the outside of the data processing apparatus 1 .
- the deformation-parameter acquiring unit 13 acquires the deformation parameters D 3 from the outside of the data processing apparatus 1 .
- the control-point calculating unit 14 calculates, on the basis of the garment model D 1 , the human body model D 2 , and the deformation parameters D 3 , target position coordinates to which points of the garment model D 1 should move according to the human body model D 2 when the garment is worn on the human body.
- the deformation processing unit 15 calculates reaching position coordinates, where the points actually reach, to minimize a sum of absolute values of differences between the target position coordinates of the points of the garment model D 1 and the reaching position coordinates, i.e., a sum obtained by taking into account importance levels of the points.
- the deformation of the garment is limited by a relation among points of the garment, an allowable amount of extension and contraction of a material of the garment, and the like. Therefore, the reaching position coordinates of the points in the garment model after the deformation are likely to be different from the target position coordinates.
- the data processing apparatus 1 can be realized by, for example, dedicated hardware.
- the garment-model acquiring unit 11 , the human-body-model acquiring unit 12 , the deformation-parameter acquiring unit 13 , the control-point calculating unit 14 , and the deformation processing unit 15 may be configured separately from one another.
- the data processing apparatus 1 may be realized by causing a general-purpose personal computer to execute a computer program.
- the garment-model acquiring unit 11 , the human-body-model acquiring unit 12 , and the deformation-parameter acquiring unit 13 may be realized by cooperation of, for example, an optical drive, a LAN (Local Area Network) terminal or a USB (Universal Serial Bus) terminal, a CPU (central processing unit), and a RAM (Random Access Memory).
- the control-point calculating unit 14 and the deformation processing unit 15 may be realized by a CPU and a RAM.
- the operation of the data processing apparatus 1 , that is, a data processing method according to the embodiment, is described.
- FIG. 2 is a diagram schematically illustrating a change of data in the data processing method according to the embodiment.
- FIG. 3 is a flowchart illustrating the data processing method according to the embodiment.
- the data processing method according to the embodiment is a method of simulating the deformation of a garment Ob 1 , which is a combining object, that occurs when the garment Ob 1 is virtually worn on a human body Ob 2 , which is an object to be combined.
- the garment model D 1 representing the shape of the garment Ob 1 is created.
- the garment model D 1 is created by, for example, an operator using CG modeling software, CAD software, or the like. It is also possible to photograph the garment Ob 1 with photographing means, such as a camera or an infrared camera, attached with a depth sensor to acquire a garment image G 1 and create the garment model D 1 with the CG modeling software, the CAD software, or the like on the basis of the garment image G 1 .
- the garment model D 1 may be automatically generated by estimating a three-dimensional structure from depth data.
- the deformation parameters D 3 representing characteristics of deformation of the garment model D 1 are created from the garment Ob 1 .
- the human body Ob 2 is photographed by the photographing means attached with the depth sensor to acquire a human body image G 2 .
- the human body model D 2 representing the shape of the human body Ob 2 is generated on the basis of the human body image G 2 .
- the garment-model acquiring unit 11 of the data processing apparatus 1 acquires the garment model D 1 .
- in step S 102 , the human-body-model acquiring unit 12 acquires the human body model D 2 .
- the deformation-parameter acquiring unit 13 acquires the deformation parameters D 3 .
- the control-point calculating unit 14 calculates, on the basis of the garment model D 1 , the deformation parameters D 3 , and the human body model D 2 , target position coordinates, which are positions to which points of the garment model D 1 should move according to the human body model D 2 when the garment is deformed according to the human body by putting the garment on the human body.
- the deformation processing unit 15 calculates reaching position coordinates of the points of the garment model after the deformation.
- the deformation processing unit 15 adjusts the reaching position coordinates to minimize a sum of absolute values of differences between the target position coordinates and the reaching position coordinates, i.e., a sum obtained by taking into account importance levels of the points of the garment model D 1 .
- a garment model D 4 after the deformation is obtained.
- at least a part of a calculation result that can be calculated on the basis of the garment model D 1 and the deformation parameters D 3 in a calculation formula used for a simulation is calculated and included in the deformation parameters D 3 in advance. Consequently, it is possible to realize the simulation at high speed.
- a combined image G 3 can be created by superimposing the garment model D 4 after the deformation on the human body image G 2 .
- processing for the superimposing is performed on the outside of the data processing apparatus 1 .
- FIG. 4 is a diagram illustrating the garment model in the embodiment.
- the garment model D 1 which is a combining model to be deformed, is configured by data of computer graphics.
- a plurality of polygon data representing the shape of the garment are configured by a vertex coordinate list indicating three-dimensional position coordinates of a plurality of vertexes and a vertex index list indicating which vertexes are used to form a polygon.
- Crossing points of a lattice shown in FIG. 4 are the vertexes.
- the garment model D 1 may be configured by only a vertex coordinate list, which takes into account order of forming polygons, without using the vertex index list.
- normal vectors of the vertexes and the polygons may be included in advance or may be calculated in the data processing apparatus 1 .
- texture coordinates for associating the texture data with the vertexes may be included.
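As a rough sketch of this data layout (the class and field names are illustrative, not from the patent), a garment model of this kind can be held as a vertex coordinate list plus a vertex index list, with polygon normals computed on demand as the text permits:

```python
import numpy as np

# Hypothetical minimal container for a garment model: a vertex
# coordinate list (N x 3) and a vertex index list (M x 3 triangles),
# with optional texture coordinates for associating texture data.
class GarmentModel:
    def __init__(self, vertices, triangles, texcoords=None):
        self.vertices = np.asarray(vertices, dtype=float)   # 3D positions
        self.triangles = np.asarray(triangles, dtype=int)   # polygon indices
        self.texcoords = texcoords                          # optional UVs

    def polygon_normals(self):
        # Per-triangle unit normals from the vertex positions; the text
        # notes normals may be calculated in the data processing apparatus.
        a, b, c = (self.vertices[self.triangles[:, i]] for i in range(3))
        n = np.cross(b - a, c - a)
        return n / np.linalg.norm(n, axis=1, keepdims=True)

# A single right triangle in the z = 0 plane.
model = GarmentModel([[0, 0, 0], [1, 0, 0], [0, 1, 0]], [[0, 1, 2]])
print(model.polygon_normals())  # normal points along +z
```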
- the deformation parameters D 3 are described.
- in the deformation parameters D 3 , for example, control weight information, corresponding position information, gap information, and deforming flexibility information are included.
- in the deformation parameters D 3 , only a part of the information may be included, or information other than the information may be included.
- the control weight information is information indicating, when the garment model D 1 is deformed with respect to the vertexes of the garment model D 1 , at which importance level the garment model D 1 should be controlled.
- as the control weight information, a truth value (true/false or 1/0) indicating whether a certain vertex is set as a control point, or a weight value (a value between 0.0 and 1.0) indicating an importance level of control, is designated.
- ornamental parts such as a collar, a pocket, and a button of the garment model D 1 should not be deformed according to the shape of the human body model D 2 and should be deformed according to deformation of the other parts of the garment model D 1 . Therefore, the ornamental parts are not set as control points. Therefore, as the control weight information, 0 or a value close to 0 is set.
- the shoulders and an upper part of the back of the garment model D 1 should be relatively strictly deformed according to the shape of the human body model. Therefore, the shoulders and the upper part of the back are set as control points having high importance levels. Therefore, as the control weight information, 1 or a value close to 1 is set.
- the sides and a lower part of the back of the garment model D 1 are portions that are deformed according to the shape of the human body but may be deformed with a certain degree of freedom. Therefore, the sides and the lower part of the back are set as control points having low importance levels. Therefore, as the control weight information, an intermediate value such as 0.4 or 0.6 is set.
- values of the control weight information are set relatively high for structural parts and values of the control weight information are set relatively low for ornamental parts.
- values of the control weight information are set higher for portions closely attached to the object to be combined by the action of the gravity or the like.
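A minimal illustration of how such per-vertex control weights might be stored and used to pick the control points; the part names and weight values are hypothetical examples in the spirit of the description above:

```python
import numpy as np

# Hypothetical per-vertex control weights alpha_i in [0.0, 1.0]:
# ~1 for structural parts that must follow the body (shoulders, upper
# back), intermediate for parts deformed with some freedom (sides,
# lower back), ~0 for ornamental parts (pockets, buttons).
part_of_vertex = ["shoulder", "upper_back", "side", "lower_back", "pocket", "button"]
weight_by_part = {
    "shoulder": 1.0, "upper_back": 0.9,   # strictly follow the body
    "side": 0.5, "lower_back": 0.4,       # deform with a degree of freedom
    "pocket": 0.0, "button": 0.0,         # not set as control points
}
alpha = np.array([weight_by_part[p] for p in part_of_vertex])
control_points = np.nonzero(alpha > 0.0)[0]  # vertexes used as control points
print(control_points)  # ornamental vertexes are excluded
```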
- FIG. 5 is a diagram illustrating control weight information of a texture format.
- the garment model D 1 is disassembled into parts of the garment. Values of the control weight information of portions of the parts are indicated by gradation. That is, in dark gray regions, the control weight information is 1 or a value close to 1. In light gray regions, the control weight information is an intermediate value. In white regions, the control weight information is 0 or a value close to 0.
- the corresponding position information is information representing positions on the human body model D 2 corresponding to the vertexes on the garment model D 1 .
- the human body model is divided into a plurality of parts, for example, the forehead part, the head top part, the head side part, the head back part, the neck, the right shoulder, the left shoulder, the right upper arm, the left upper arm, the right forearm, the left forearm, the right hand, the left hand, the chest, the back, the belly, the waist, the right thigh, the left thigh, the right lower leg, the left lower leg, the right foot, and the left foot.
- Part IDs are attached to the parts.
- the part IDs are recorded as attributes of the vertexes of the garment model D 1 .
- the part IDs do not need to be associated with all the vertexes of the garment model D 1 and may be associated with only a part of the vertexes, for example, only the vertexes where values of the control weight information are large.
- as the corresponding position information, corresponding part weight indicating priority for searching for a corresponding position for each of the part IDs of the human body model D 2 may be used.
- corresponding point weight indicating priority for searching for corresponding positions among the vertexes of the human body model D 2 may also be used.
- not only the part IDs corresponding to the parts of the human body but also IDs in finer units may be used. For example, IDs corresponding to a single polygon or a group consisting of a plurality of polygons of the garment model D 1 may be used.
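The corresponding-point search that the part IDs enable can be sketched as follows; restricting the nearest-neighbor search to the part recorded on the garment vertex is what the corresponding position information buys (function and part names are illustrative, not from the patent):

```python
import numpy as np

def corresponding_point(garment_vertex, part_id, body_vertices, body_part_ids):
    """Nearest human-body vertex restricted to the given part ID.

    Restricting the search to one part (e.g. 'right_shoulder') keeps the
    correspondence fast and unambiguous compared with searching the
    whole human body model.
    """
    mask = body_part_ids == part_id
    candidates = body_vertices[mask]
    d = np.linalg.norm(candidates - garment_vertex, axis=1)
    return candidates[np.argmin(d)]

# Toy human body model: three vertexes, two parts.
body = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0]])
ids = np.array(["neck", "right_shoulder", "right_shoulder"])
print(corresponding_point(np.array([1.9, 0, 0]), "right_shoulder", body, ids))
```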
- the gap information is information representing setting values of distances between the points of the garment model D 1 and the human body model D 2 ; concerning the control points of the garment model D 1 , it indicates how large a gap is provided with respect to the human body model D 2 when target positions after deformation are set.
- the gap information is spacing amounts indicating distances by which target positions of the control points after deformation of the garment model D 1 are spaced from the surface of the human body model in the normal direction of the human body model.
- the gap information describes the spacing amount as an absolute value or a relative value.
- FIG. 6 is a diagram illustrating designation of the gap information as an absolute value.
- a target position of a control point P D1 on the garment model D 1 is a position spaced from a corresponding point P D2 of the human body model D 2 by a distance g along a normal direction N of the corresponding point P D2 .
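The absolute-value designation can be written directly as an offset along the body normal (a minimal sketch following the symbols of FIG. 6; the function name is illustrative):

```python
import numpy as np

def target_position_absolute(p_body, normal, g):
    """Target position of a garment control point: the corresponding
    point on the human body model offset by the gap distance g along
    the normal direction N of that point (absolute designation)."""
    n = normal / np.linalg.norm(normal)
    return p_body + g * n

# Corresponding point at the origin, normal along +z, 3 cm gap.
c = target_position_absolute(np.array([0.0, 0.0, 0.0]),
                             np.array([0.0, 0.0, 2.0]), g=0.03)
print(c)
```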
- FIG. 7 is a diagram illustrating designation of the gap information as a relative value.
- two kinds of human body models are prepared.
- an inner garment worn on the inner side of the garment Ob 1 is assumed.
- a human body model D 20 not wearing the inner garment and a human body model D 21 wearing the inner garment are prepared.
- a distance d between a corresponding point P D20 of the human body model D 20 corresponding to the control point P D1 of the garment model D 1 and a corresponding point P D21 of the human body model D 21 is calculated.
- the coefficient r is the gap information of the control point P D1 .
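A sketch of the relative-value designation of FIG. 7. The text states only that the distance d between the two human body models is computed and that the coefficient r is the gap information, so taking the gap as r times d is an assumption here:

```python
import numpy as np

def target_position_relative(p_bare, p_inner, normal, r):
    """Relative designation: the gap is scaled from the thickness d
    between the bare human body model (D20) and the model wearing an
    inner garment (D21). g = r * d is an assumption of this sketch."""
    d = np.linalg.norm(p_inner - p_bare)          # inner-garment thickness
    n = normal / np.linalg.norm(normal)
    return p_bare + r * d * n

c = target_position_relative(np.array([0.0, 0.0, 0.0]),
                             np.array([0.0, 0.0, 0.01]),  # 1 cm inner garment
                             np.array([0.0, 0.0, 1.0]), r=1.5)
print(c)
```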
- when the gap information is set, a region of the garment and a type of the garment are taken into account.
- the distance g is set relatively short concerning a portion of the combining object (e.g., a garment) disposed above the object to be combined (e.g., a human body).
- the distance g is set relatively long concerning a portion disposed on a side of or below the object to be combined.
- the distance g is set relatively short for the parts of the shoulders and the back of the garment model such that the parts are closely attached to the human body model.
- the distance g is set relatively long for the parts such as the arms and the sides of the garment model such that the garment model is loosely worn on the human body model.
- the distance g is set shorter for the combining object disposed in a position closer to the object to be combined.
- the distance g is set taking into account a type of the garment, such as a T-shirt, a dress shirt, a sweater, a jacket, or a coat, on the basis of the order of layered wearing and taking into account the thickness from the human body model.
- the distance g of the T-shirt or the dress shirt is set relatively short such that the T-shirt or the dress shirt is closely attached to the human body model.
- the distance g of the sweater is set longer than the distance g of the T-shirt or the dress shirt taking into account that the sweater is worn over the T-shirt or the dress shirt.
- the distance g of the jacket or the coat is set longer than the distances g of the T-shirt, the dress shirt, and the sweater taking into account that the jacket or the coat is worn over the T-shirt, the dress shirt, or the sweater.
- the deforming flexibility information is information representing a mechanical characteristic of the garment.
- the deforming flexibility information is set, for example, according to softness and a degree of expansion and contraction of a material of the garment model.
- the deforming flexibility information designates an allowable range of a change vector or a change amount before and after deformation among vertexes adjacent to one another in the vertexes on the garment model. Specifically, in the case of a material easily distorted or expanded and contracted like a sweater, the allowable range of the change vector or the change amount is set large. In the case of a material less easily distorted or expanded and contracted like leather, the allowable range of the change vector or the change amount is set small.
- the deformation parameters D 3 are allocated to the vertexes of the garment model D 1 .
- the deformation parameters corresponding to the vertexes of the garment model D 1 may be retained as numerical value data corresponding to the vertexes like normal vectors or may be retained as the texture format shown in FIG. 5 .
- when the texture format is used, texture coordinates need to be set in the garment model D 1 .
- the deformation parameters can be associated with the vertexes of the garment model by performing texture mapping on the basis of the texture coordinates set in the garment model.
- Various kinds of information included in the deformation parameters may be embedded in a single texture as data or may be embedded in separate textures as data.
- the human body model is a model used as a reference for deforming the garment model D 1 and is configured by data of computer graphics.
- FIG. 8 is a diagram illustrating the human body model.
- the human body model D 2 is configured by a vertex coordinate list indicating three-dimensional position coordinates concerning a plurality of vertexes of a plurality of polygons representing the shape of a human body and a vertex index list indicating which vertexes are used to form a polygon. Crossing points of a lattice shown in FIG. 8 are the vertexes.
- the part IDs allocated to the respective regions are given to the human body model D 2 .
- two kinds of human body models are prepared, i.e., the human body model D 20 not wearing an inner garment and the human body model D 21 wearing the inner garment.
- the human body model D 2 may be configured by only the vertex coordinate list, which takes into account order of forming polygons, without using the vertex index list.
- normal vectors of the vertexes or the polygons may be included. The normal vectors may be calculated after being input to the data processing apparatus 1 .
- in step S 104 , considering an energy function indicated by Expression 1, a formula for calculating a solution for minimizing energy of the energy function is set up.
- in step S 105 , the formula is solved to simulate the deformation of the garment.
- E represents the energy function
- m represents the number of vertexes set as control points among vertexes of a garment model
- c i represents a target position coordinate after deformation of an i-th control point
- x i represents a reaching position coordinate after the deformation of the i-th control point
- α i represents control weight information representing an importance level of control of the i-th control point.
- the energy function E is obtained by weighting a square of a difference between a target position coordinate and a reaching position coordinate with respect to all the control points and totaling the weighted squares, i.e., E = Σ i=1 m α i ∥ c i − x i ∥ 2 .
- the target position coordinate c i is determined on the basis of the human body model D 2 , the gap information, and the corresponding position information. Therefore, Expression 1 includes the human body model D 2 and the control weight information, the gap information, and the corresponding position information among the deformation parameters D 3 .
- the reaching position coordinate x i is calculated such that the energy function E is minimized, that is, the garment model D 1 fits in an ideal position determined on the basis of the human body model D 2 as much as possible.
- the matrix equations shown in Expressions 2 to 4 are solved in order to calculate the reaching position coordinate x i for minimizing the energy function E shown in Expression 1.
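The minimization of Expression 1 is an ordinary weighted linear least-squares problem. A toy instance with two vertexes and three control points (all values invented for illustration; real garment models have thousands of each) can be solved as:

```python
import numpy as np

# Minimize E = sum_i alpha_i * (c_i - x_i)^2, written as the weighted
# system A x = b with one row per control point. Coordinates are 1D
# here for brevity; the 3D case solves each coordinate the same way.
alpha = np.array([1.0, 0.5, 0.5])          # control weights alpha_i
A = np.sqrt(alpha)[:, None] * np.array([[1.0, 0.0],
                                        [0.0, 1.0],
                                        [1.0, 1.0]])
c = np.array([1.0, 2.0, 3.5])              # target positions c_i
b = np.sqrt(alpha) * c
# x = (A^T A)^-1 A^T b, computed stably with a least-squares solver.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))  # reaching positions balancing the weighted targets
```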
- the number of rows of a matrix A is equivalent to the number of control points of the garment model and the number of columns is equivalent to the number of vertexes of the garment model.
- the number of control points is, for example, approximately 3000.
- the number of rows of a matrix b is equivalent to the number of control points of the garment model.
- among the parameters concerning the matrix A, in particular, the control weight information determines beforehand which vertexes of the garment model are set as control points and with which importance level the control points are controlled. If the matrix A is determined beforehand, a portion that can be determined by only the information of the matrix A, that is, the matrix (A T A) −1 A T in Expression 5, can be calculated beforehand and a result of the calculation can be retained as a part of the deformation parameters D 3 . Therefore, it is possible to markedly reduce the processing time during the execution.
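The precomputation described here can be sketched as follows: because A depends only on the deformation parameters, its pseudoinverse is computed once offline, and each frame then costs only one matrix-vector product (matrix values are invented for illustration):

```python
import numpy as np

# Offline: the control weights fix the matrix A, so the pseudoinverse
# (A^T A)^-1 A^T can be computed once and retained with the
# deformation parameters.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
A_pinv = np.linalg.pinv(A)            # (A^T A)^-1 A^T, done once

# Online: only the target vector b changes from frame to frame.
for b in (np.array([1.0, 2.0, 3.0]),  # targets at frame t
          np.array([1.0, 2.0, 3.5])): # targets at frame t+1
    x = A_pinv @ b                    # per-frame cost: one mat-vec product
    print(np.round(x, 3))
```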
- concerning the target position coordinate c i , it is important whether the target position coordinate c i can be calculated at high speed and at high accuracy during the execution.
- the target position coordinate c i after deformation of the i-th control point is calculated with reference to a point on the human body model corresponding thereto. Therefore, it is important to calculate the corresponding point on the human body model at high speed and high accuracy.
- the determination of in which direction and by which distance the target position coordinate is shifted from the corresponding point on the human body model greatly affects the quality of the garment model after the deformation. Because of the presence of the corresponding position information, when the target position coordinate c i is set in Expression 1 or Expression 6, it is possible to determine at high speed and at high accuracy to which positions of the human body model D 2 the control points of the garment model D 1 correspond. Further, by including the gap information in the deformation parameters D 3 , it is possible to set the target position coordinate c i at high accuracy in Expression 1 or Expression 6.
- the Laplacian L shown in Expression 6 can be calculated as indicated by Expression 7 and Expression 8.
- e represents a set of vertexes connected to a vertex v j by edges, and ω jk represents weight at a vertex v k adjacent to the vertex v j .
- L(p j ) represents Laplacian of the garment model before the deformation and L(x j ) represents Laplacian of the garment model after the deformation desired to be finally calculated.
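A sketch of the Laplacian of a vertex. Expressions 7 and 8 are not reproduced in this text, so uniform neighbor weights ω jk = 1/|e| are assumed here; the function and variable names are illustrative:

```python
import numpy as np

def laplacian(points, neighbors, j, weights=None):
    """Graph Laplacian L(p_j): the difference between p_j and the
    weighted average of its edge-connected neighbors. Preserving this
    quantity across the deformation keeps local shape detail."""
    nbrs = neighbors[j]
    if weights is None:
        weights = np.full(len(nbrs), 1.0 / len(nbrs))  # uniform omega_jk
    return points[j] - np.sum(weights[:, None] * points[nbrs], axis=0)

# Vertex 0 with three neighbors; its Laplacian is its deviation
# from the neighbor centroid.
p = np.array([[0.0, 0, 0], [1.0, 0, 0], [-1.0, 0, 0], [0.0, 2.0, 0]])
nbh = {0: np.array([1, 2, 3])}
print(laplacian(p, nbh, 0))
```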
- in the matrix A shown in Expression 9, the number of rows is equivalent to a sum of the number of control points and the number of vertexes on the garment model.
- the number of columns is equivalent to the number of vertexes on the garment model.
- in the matrix b, the number of rows is likewise equivalent to the sum of the number of control points and the number of vertexes on the garment model.
- β j represents weight for indicating an importance level for maintaining the positional relation among the vertexes adjacent to the j-th vertex.
- the weight ⁇ j is calculated by Expression 11.
- l represents the number of adjacent vertexes
- S represents a threshold for setting the importance level β j to 1 with respect to an average of the allowable ranges s k of expansion and contraction.
- the processing of the control-point calculating unit 14 is described in detail.
- the control-point calculating unit 14 substitutes the values in the energy function shown in Expression 1 or Expression 6 and sets up a formula for calculating the reaching position coordinate x i for minimizing the energy function.
- the control-point calculating unit 14 determines, using the control weight information, whether the vertexes of the garment model should be included in the control points and, if they are included, how α i should be set in Expression 1 or Expression 6. If the control weight information is given, α i can be set in advance. When the energy function in Expression 1 is used, the matrix A of Expression 2 is determined. Therefore, it is possible to calculate the matrix (A T A) −1 A T in Expression 5 beforehand.
- the control-point calculating unit 14 calculates corresponding points on the human body model D 2 using the corresponding position information and calculates the target position coordinate c i using the gap information.
- the control-point calculating unit 14 may calculate the value g of the gap taking into account a relation between the direction of the normal vector of the corresponding points of the human body model D 2 and the direction of the gravity. Consequently, the matrix b in Expression 3 is determined and Expression 5 can be calculated.
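The patent does not give the exact formula for g, only that it may depend on the relation between the surface normal and gravity. One plausible choice, sketched below with purely illustrative gap bounds, gives a larger gap where the surface faces downward (where cloth hangs away from the body):

```python
import numpy as np

def gap_value(normal, g_min=0.005, g_max=0.02, gravity=(0.0, -1.0, 0.0)):
    """Gap g between garment and body; bounds g_min/g_max are assumptions."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    g = np.asarray(gravity, dtype=float)
    cos = float(np.dot(n, g))          # +1 when the surface faces downward
    t = 0.5 * (cos + 1.0)              # map [-1, 1] -> [0, 1]
    return g_min + t * (g_max - g_min)

def target_coordinate(corresponding_point, normal):
    """Target position c_i: corresponding point pushed out by g along the normal."""
    p = np.asarray(corresponding_point, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return p + gap_value(n) * n
```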
- If the corresponding position information is not included in the deformation parameters D3, the corresponding points on the human body model have to be searched for instead; in that case, the computational complexity is large and the time required for the calculation increases.
- If the gap information is not included in the deformation parameters D3, it is conceivable not to provide a gap or to set the gap amount to a fixed value. However, the accuracy of the simulation deteriorates.
- The weight λ_j, that is, the importance level for maintaining the positional relation among the vertexes adjacent to the j-th vertex, is calculated using the deforming flexibility information. If the deforming flexibility information is given, λ_j can be set in advance and the matrix A shown in Expression 9 is determined. Therefore, the matrix (A^T A)^{-1} A^T shown in Expression 5 can be calculated beforehand. In this way, if the deforming flexibility information of the garment material is included in the deformation parameters D3, the deformation of the garment model D1 can be simulated with higher accuracy.
- Otherwise, λ_j is set to a fixed value; therefore, the accuracy of the simulation deteriorates slightly.
- The deformation processing unit 15 calculates the reaching position coordinates x_i on the basis of the determined control points and the target position coordinates c_i of the control points so as to minimize the sum of absolute values of differences between the target position coordinates and the reaching position coordinates, i.e., a sum weighted by the importance levels of the points. Specifically, the deformation processing unit 15 executes the calculation of Expression 5 completed by substituting the values. After the calculation, it is also possible to remove abnormal values and recalculate Expression 5, or to calculate and correct the positional relation between the vertexes of the garment model and the human body model.
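A minimal sketch of this solve step, assuming the precomputed matrix form of Expression 5. The outlier pass (dropping constraint rows whose residual is abnormally large, then re-solving) is one possible reading of "remove abnormal values and recalculate":

```python
import numpy as np

def solve_deformation(A, b, sigma=3.0):
    """Solve for reaching positions x; A is the stacked constraint matrix,
    b the stacked targets (one row per constraint, one column per axis)."""
    M = np.linalg.pinv(A)              # equals (A^T A)^{-1} A^T for full column rank
    x = M @ b                          # reaching position coordinates
    residual = np.abs(A @ x - b).sum(axis=1)
    keep = residual <= residual.mean() + sigma * residual.std()
    if not keep.all():                 # drop abnormal constraints and re-solve
        x = np.linalg.pinv(A[keep]) @ b[keep]
    return x
```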
- the data processing method is configured by procedures described below.
- <1> A garment model representing the shape of a garment, deformation parameters representing characteristics of the deformation of the garment, and a human body model representing the shape of a human body are acquired (steps S101 to S103).
- <2> Control points of the garment model are determined and target position coordinates of the control points after the deformation are calculated (step S104).
- <3> Reaching position coordinates are calculated so as to minimize a sum of absolute values of differences between the target position coordinates and the reaching position coordinates where the points of the garment model reach, i.e., a sum weighted by the importance levels of the points of the garment model (step S105).
- the data processing apparatus 1 can be realized by causing a general-purpose computer to execute a computer program.
- a data processing program used in this case is a program for causing the computer to execute the procedures ⁇ 1> to ⁇ 3>.
- a data processing apparatus is an apparatus for creating an animation (a moving image).
- a deformation history is stored after deformation of a garment model and used for deformation of the next frame. Consequently, it is possible to deform a garment following the movement of a human body and create a high-quality animation.
- FIG. 9 is a block diagram illustrating the data processing apparatus according to the embodiment.
- a deformation-history storing unit 16 is provided in addition to the components of the data processing apparatus 1 (see FIG. 1 ) according to the first embodiment.
- The deformation-history storing unit 16 stores, as a deformation history, the result of a deformation simulation of the garment model D1 performed by the deformation processing unit 15.
- the deformation-history storing unit 16 can be configured by, for example, a RAM.
- The control-point calculating unit 14 calculates target position coordinates c_i at points of the garment model D1 taking into account the deformation history at the first point in time, in addition to the garment model D1, the deformation parameters D3, and the human body model D2 at the second point in time.
- the deformation-history storing unit 16 stores, as a deformation history, the garment model D 4 after deformation calculated by the deformation processing unit 15 .
- The deformation history includes, in addition to the garment model D4 after the deformation calculated by the deformation processing unit 15, the calculated matrix (A^T A)^{-1} A^T used by the control-point calculating unit 14 in deriving Expression 5, information concerning the corresponding points on the human body model at the control points used in deriving the matrix b described in Expression 3 or Expression 10, and information concerning the target position coordinates c_i after the deformation at the i-th control point.
- the control-point calculating unit 14 and the deformation processing unit 15 use these kinds of history information in performing processing of the next frame.
- the control-point calculating unit 14 is described.
- the control-point calculating unit 14 determines control points taking into account the deformation history read out from the deformation-history storing unit 16 in addition to the acquired garment model D 1 , deformation parameters D 3 , and human body model D 2 and calculates target position coordinates after the deformation at the control points.
- The calculated matrix (A^T A)^{-1} A^T stored in the deformation-history storing unit 16 can always be reused. Therefore, the calculated matrix (A^T A)^{-1} A^T is reused in all frames.
- the other deformation histories are classified into three patterns described below according to reuse methods for the deformation histories.
- FIG. 10A is a diagram illustrating a deformation history at time (t−1).
- FIG. 10B is a diagram illustrating a control-point calculating method at time t.
- Time (t−1) is the time one frame before time t.
- The calculation of a target position coordinate of a control point of the garment model D1 is described.
- A target position at a certain control point is represented as p_1 and a reaching position is represented as p_2.
- The target position p_1 and the reaching position p_2 are in a predetermined positional relation with respect to a polygon of the human body model, its normal vector, and a specific vector on the polygon surface at time (t−1).
- The control-point calculating unit 14 calculates a position p_1′ and a position p_2′ that are in the same predetermined positional relation with respect to the polygon of the human body model, its normal vector, and the specific vector on the polygon surface at time t.
- The control-point calculating unit 14 sets the position p_1′ or the position p_2′ as the target position at time t. Simply by using the history of past frames, Expression 5 can be calculated.
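The reuse illustrated in FIG. 10 can be sketched as a local-frame transfer. The frame construction below (polygon vertex as origin, one edge as the specific vector, plus the normal) is an assumed concrete choice, not taken from the patent:

```python
import numpy as np

def local_frame(v0, v1, v2):
    """Orthonormal frame of a triangle: origin v0, tangent along edge v0->v1."""
    t = v1 - v0                          # specific vector on the polygon surface
    n = np.cross(v1 - v0, v2 - v0)       # polygon normal
    t, n = t / np.linalg.norm(t), n / np.linalg.norm(n)
    bitan = np.cross(n, t)
    return v0, np.stack([t, bitan, n])   # origin and 3x3 basis (rows)

def to_local(p, frame):
    origin, basis = frame
    return basis @ (p - origin)

def from_local(q, frame):
    origin, basis = frame
    return origin + basis.T @ q

# p_1 recorded at time (t-1) -> same relative position p_1' at time t
tri_prev = [np.array(v, float) for v in [(0, 0, 0), (1, 0, 0), (0, 1, 0)]]
tri_now  = [np.array(v, float) for v in [(0, 0, 1), (1, 0, 1), (0, 1, 1)]]
p1 = np.array([0.2, 0.2, 0.1])
p1_new = from_local(to_local(p1, local_frame(*tri_prev)), local_frame(*tri_now))
```

Here the polygon simply translated upward by one unit, so p_1′ is p_1 translated the same way.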
- FIG. 11 is a time chart illustrating a data processing method according to the embodiment.
- Every time a fixed time (number of frames) T3 elapses, target position coordinates of the control points are calculated anew, without inheriting the past deformation histories, according to the pattern (3). Consequently, the accuracy of the simulation can be guaranteed.
- In frames other than those in which the control points are calculated according to the pattern (3), every time a fixed time (number of frames) T2 elapses, the past deformation histories are partially inherited according to the pattern (2): a part of the deformation histories is calculated anew and target position coordinates of the control points are calculated.
- The time T2 is shorter than the time T3.
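The schedule described above can be sketched as follows; the concrete values of T2 and T3 are illustrative:

```python
def select_pattern(t, T2=10, T3=60):
    """Pick the control-point calculation pattern for frame t (T2 < T3)."""
    assert T2 < T3
    if t % T3 == 0:
        return 3        # recalculate control points anew (guarantees accuracy)
    if t % T2 == 0:
        return 2        # reuse corresponding points, recompute target positions
    return 1            # reuse both corresponding points and target positions
```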
- The deformation processing unit 15 is described. After performing the deformation simulation at time t, the deformation processing unit 15 may perform filtering in the time direction to correct the garment model using a deformation history before time (t−1). That is, the deformation processing unit 15 mixes the simulation result at time t and the deformation history before time (t−1) to create the garment model at time t. For example, the deformation processing unit 15 performs the filtering according to Expression 12. Consequently, the continuity among frames can be further improved.
- x′_t represents the reaching position coordinate after the correction at time t.
- x_t represents the reaching position coordinate before the correction (after the normal deformation processing) at time t.
- r represents the number of past frames referred to in the filtering.
- k represents an interpolation coefficient.
- a filtering method by Expression 12 is an example. General filtering in the time direction can also be used.
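Since Expression 12 is not reproduced here, the following is one plausible filter consistent with the listed symbols: blend the uncorrected x_t with the mean of the last r corrected frames using the interpolation coefficient k.

```python
import numpy as np

def temporal_filter(x_t, history, r=3, k=0.7):
    """history: corrected coordinates of past frames, oldest first.
    Returns x'_t = k * x_t + (1 - k) * mean of the last r history entries."""
    if not history:
        return np.asarray(x_t)         # first frame: nothing to blend with
    past = np.mean(history[-r:], axis=0)
    return k * np.asarray(x_t) + (1.0 - k) * past
```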
- The operation of the data processing apparatus 2, that is, a data processing method according to the embodiment, is described.
- FIG. 12 is a flowchart illustrating the data processing method according to the embodiment.
- a plurality of frames arrayed in time series are present in the human body model D 2 .
- the garment-model acquiring unit 11 acquires the garment model D 1 .
- In step S103, the deformation-parameter acquiring unit 13 acquires the deformation parameters D3.
- the human-body-model acquiring unit 12 sets an initial frame, that is, sets a value of a time parameter t to 0.
- the human-body-model acquiring unit 12 acquires the human body model D 2 in a t-th frame.
- The control-point calculating unit 14 acquires a deformation history before the (t−1)-th frame from the deformation-history storing unit 16.
- The deformation history before the (t−1)-th frame is data generated when deformation processing before the (t−1)-th frame was performed, and is stored in the deformation-history storing unit 16.
- the control-point calculating unit 14 selects a control point calculation pattern corresponding to time t. That is, the control-point calculating unit 14 selects any one of the patterns (1) to (3).
- If the pattern (1) is selected, the processing proceeds to step S205.
- If the pattern (2) is selected, the processing proceeds to step S206.
- If the pattern (3) is selected, the processing proceeds to step S207.
- In step S205, the control-point calculating unit 14 calculates control points in the t-th frame reusing both the information concerning the corresponding points and the target position coordinates.
- The control-point calculating unit 14 determines the control points on the basis of the deformation history before the (t−1)-th frame, in addition to the garment model D1, the deformation parameters D3, and the human body model D2 acquired in the t-th frame, and calculates target position coordinates after the deformation at the respective control points. Thereafter, the processing proceeds to step S208.
- In step S206, the control-point calculating unit 14 determines control points in the t-th frame reusing the information concerning the corresponding points and calculates target position coordinates at the control points. Thereafter, the processing proceeds to step S208.
- In step S207, the control-point calculating unit 14 determines control points in the t-th frame anew without reusing the past deformation history and calculates target position coordinates at the control points. Thereafter, the processing proceeds to step S208.
- the deformation processing unit 15 performs the deformation processing in the t-th frame.
- The deformation processing unit 15 performs the calculation of Expression 5 on the basis of the control points determined for the human body model D2 in the t-th frame and the target position coordinates after the deformation at the respective control points, and calculates reaching position coordinates at the control points.
- the deformation processing unit 15 stores a deformation history in the t-th frame in the deformation-history storing unit 16 .
- the human-body-model acquiring unit 12 changes the frame to the next frame. That is, the human-body-model acquiring unit 12 changes the time parameter t to (t+1).
- The human-body-model acquiring unit 12 determines whether the present frame has reached the last frame.
- a total number of frames of the human body model D 2 is represented as N.
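The frame loop of FIG. 12 (steps S202 to S212) can be skeletonized as follows; the helper callables are trivial stand-ins passed in as parameters, not APIs from the patent:

```python
def run_animation(garment, body_frames, select_pattern, calc_controls, deform):
    """Deform `garment` against each of the N body frames, storing the
    per-frame deformation history and reusing it for the next frame."""
    history = {}                                  # deformation-history store
    for t in range(len(body_frames)):             # loop until the last frame N
        body = body_frames[t]                     # acquire body model (S202)
        prev = history.get(t - 1)                 # read history (S203)
        pattern = select_pattern(t)               # choose pattern (S204)
        controls = calc_controls(garment, body, prev, pattern)  # S205-S207
        garment = deform(garment, controls)       # Expression 5 (S208)
        history[t] = garment                      # store history (S209)
    return history

# illustrative stand-in helpers so the skeleton runs
frames = run_animation(
    garment=[0.0, 0.0],
    body_frames=[1.0, 2.0, 3.0],
    select_pattern=lambda t: 3 if t == 0 else 1,
    calc_controls=lambda g, b, prev, p: b,
    deform=lambda g, c: [v + c for v in g],
)
```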
- The deformation history of the garment model in a certain frame is stored in the deformation-history storing unit and used for the deformation simulation of the garment model in the next frame. Consequently, an animation of a garment model that follows the movement of a human body can be created at high speed and with high accuracy.
- the present invention is not limited to the embodiments per se.
- the constituent elements can be changed and embodied without departing from the spirit of the present invention.
- Various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the embodiments.
- the example is described in which the first object, which is the combining object, is the garment and the second object, which is the object to be combined, is the human body.
- the first object only has to be an object that is deformed according to the shape of the second object.
- the first object may be a cloth cover and the second object may be furniture or bedding.
- both of the first model and the second model target one kind of object.
- one or both of the first model and the second model may simultaneously target a plurality of kinds of objects.
- If a combining unit that combines the deformed first model and the second model and a presenting unit that presents the combination result are added to the data processing apparatus according to the embodiments, a video combining apparatus realizing VR representation of the combination result can be obtained.
- If a combining unit that combines the deformed garment D4 and the human body image G2 to generate the combined image G3 (see FIG. 2) and a presenting unit that presents the combined image G3 are added to the data processing apparatus according to the embodiments, a video combining apparatus realizing AR representation can be obtained.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-060026 | 2014-03-24 | ||
JP2014060026A JP2015184875A (ja) | 2014-03-24 | 2014-03-24 | データ処理装置及びデータ処理プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150269291A1 true US20150269291A1 (en) | 2015-09-24 |
Family
ID=54142354
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/641,570 Abandoned US20150269291A1 (en) | 2014-03-24 | 2015-03-09 | Data processing apparatus and data processing program |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150269291A1 (ja) |
JP (1) | JP2015184875A (ja) |
CN (1) | CN104952112A (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018148525A (ja) * | 2017-03-09 | 2018-09-20 | エイディシーテクノロジー株式会社 | 仮想立体物生成装置 |
KR101995277B1 (ko) * | 2017-07-31 | 2019-10-02 | 주식회사 자이언소프트 | 가상신체 구현 시스템 |
CN109426780A (zh) * | 2017-08-28 | 2019-03-05 | 青岛海尔洗衣机有限公司 | 穿戴物品信息采集系统和方法 |
CN109427090A (zh) * | 2017-08-28 | 2019-03-05 | 青岛海尔洗衣机有限公司 | 穿戴物品3d模型构建系统和方法 |
JP7008557B2 (ja) * | 2018-03-26 | 2022-01-25 | 株式会社コーエーテクモゲームス | 画像生成プログラム、記録媒体、画像生成方法 |
JP7293036B2 (ja) * | 2019-08-09 | 2023-06-19 | 任天堂株式会社 | 情報処理装置、情報処理プログラム、情報処理システム及び情報処理方法 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060290693A1 (en) * | 2005-06-22 | 2006-12-28 | Microsoft Corporation | Large mesh deformation using the volumetric graph laplacian |
US20150130795A1 (en) * | 2013-11-14 | 2015-05-14 | Ebay Inc. | Garment simulation using thread and data level parallelism |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0934952A (ja) * | 1995-07-20 | 1997-02-07 | Toyobo Co Ltd | 着装シミュレーション方法および着装シミュレーション装置 |
JP2002117414A (ja) * | 2000-10-11 | 2002-04-19 | Toyobo Co Ltd | 衣服衝突処理方法および衣服衝突処理プログラムを記録したコンピュータ読み取り可能な記録媒体 |
- 2014-03-24 JP JP2014060026A patent/JP2015184875A/ja active Pending
- 2015-03-09 US US14/641,570 patent/US20150269291A1/en not_active Abandoned
- 2015-03-18 CN CN201510119281.7A patent/CN104952112A/zh not_active Withdrawn
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10395404B2 (en) | 2014-09-04 | 2019-08-27 | Kabushiki Kaisha Toshiba | Image processing device for composite images, image processing system and storage medium |
KR20190118213A (ko) * | 2017-02-15 | 2019-10-18 | 스타일미 리미티드 | 의류 맞음새 시각화를 위한 3차원 의류 메시 변형 및 레이어링을 위한 시스템 및 방법 |
US9754410B2 (en) * | 2017-02-15 | 2017-09-05 | StyleMe Limited | System and method for three-dimensional garment mesh deformation and layering for garment fit visualization |
US20170161948A1 (en) * | 2017-02-15 | 2017-06-08 | StyleMe Limited | System and method for three-dimensional garment mesh deformation and layering for garment fit visualization |
KR102353776B1 (ko) * | 2017-02-15 | 2022-01-19 | 스타일미 리미티드 | 의류 맞음새 시각화를 위한 3차원 의류 메시 변형 및 레이어링을 위한 시스템 및 방법 |
CN107229780A (zh) * | 2017-05-18 | 2017-10-03 | 广东溢达纺织有限公司 | 参数化服装纸样的加缩水方法及装置 |
CN107464289A (zh) * | 2017-08-03 | 2017-12-12 | 厦门幻世网络科技有限公司 | 一种虚拟服饰穿戴方法、装置、设备和存储介质 |
US10242498B1 (en) | 2017-11-07 | 2019-03-26 | StyleMe Limited | Physics based garment simulation systems and methods |
US10373373B2 (en) | 2017-11-07 | 2019-08-06 | StyleMe Limited | Systems and methods for reducing the stimulation time of physics based garment simulations |
CN110766603A (zh) * | 2018-07-25 | 2020-02-07 | 北京市商汤科技开发有限公司 | 一种图像处理方法、装置和计算机存储介质 |
US11551646B2 (en) * | 2019-07-05 | 2023-01-10 | Lg Electronics Inc. | Artificial intelligence apparatus for calibrating output position of display panel of user and method for the same |
CN110737913A (zh) * | 2019-09-02 | 2020-01-31 | 深圳壹账通智能科技有限公司 | 基于时间日期数据的安全脱敏方法、装置和计算机设备 |
US11595739B2 (en) * | 2019-11-29 | 2023-02-28 | Gree, Inc. | Video distribution system, information processing method, and computer program |
US12022165B2 (en) | 2019-11-29 | 2024-06-25 | Gree, Inc. | Video distribution system, information processing method, and computer program |
WO2021179936A1 (en) * | 2020-03-09 | 2021-09-16 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | System and method for virtual fitting |
US11282290B1 (en) * | 2020-11-19 | 2022-03-22 | Adobe Inc. | Generating suggested edits for three-dimensional graphics based on deformations of prior edits |
CN113797529A (zh) * | 2021-09-18 | 2021-12-17 | 珠海金山网络游戏科技有限公司 | 目标展示方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
CN104952112A (zh) | 2015-09-30 |
JP2015184875A (ja) | 2015-10-22 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SEKINE, MASAHIRO;SUGITA, KAORU;NISHIYAMA, MASASHI;SIGNING DATES FROM 20150217 TO 20150220;REEL/FRAME:035571/0806 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |