US9053553B2 - Methods and apparatus for manipulating images and objects within images - Google Patents
- Publication number
- US9053553B2 (application US12/714,028)
- Authority
- US
- United States
- Prior art keywords
- control points
- mesh
- deformed
- control point
- properties
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/18—Image warping, e.g. rearranging pixels individually
- G06T3/0093
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
Definitions
- a general term for these operations is shape or object deformation.
- one approach to object deformation is to define a skeleton for the object; the skeleton can then be manipulated to deform the object.
- properly defining a skeleton for an object may be difficult and is not effective for many shapes of objects or for entire images.
- another approach to shape deformation is free-form deformation (FFD), in which an object is divided into polygons; each polygon may be deformed by manipulating its vertices.
- manually defining FFD polygons is tedious, and the user must manipulate many vertices to obtain a desired deformation for the object.
- methods for object deformation have been proposed that do not require the manual definition of skeletons or FFD polygons.
- an object is represented by a triangle mesh.
- the user moves several vertices of the mesh as constrained handles.
- the system then computes the positions of the remaining free vertices by minimizing the distortion of each triangle.
- This method employs a two-step closed-form algorithm in which the first step finds an appropriate rotation for each triangle and the second step adjusts its scale.
- Embodiments may provide a warping module and a user interface to the warping module that enable the manipulation of a digital image or a specified object or region in a digital image by selectively deforming or warping portions of the image or selected object or region while maintaining local rigidity.
- embodiments may allow a portion of an image to be moved or adjusted while reducing or minimizing undesirable distortions in the image.
- Embodiments may be implemented according to a criterion that the surface under deformation should behave rigidly. If the surface behaves rigidly, the original image is not adversely distorted. To help achieve this goal of rigidity, embodiments may allow the user to add control points to an image.
- the control points may serve both the purposes of providing handles whereby the surface can be manipulated, and of providing pins or anchors to constrain deformation of the surface at certain points.
- the user may thus position multiple control points on a surface to constrain deformation so that parts of the surface that the user wants to move do move, and parts of the surface that the user does not want to move do not move.
- Embodiments of the warping module may move the underlying points in a surface given the positions of the control points.
- embodiments may allow the user to specify multiple properties at each control point.
- these properties may include, but are not limited to, translation, rotation, depth, and scale.
- the depth property may be used to specify parts of the surface that should be in front of (occlude) other parts of the surface, or to specify parts of the surface that should be behind (be occluded by) other parts of the surface.
- Embodiments of the warping module may overlay a coarser grid or mesh on top of the pixels of the surface to be deformed. For example, in some embodiments, a triangle mesh may be used.
- the warping module takes the control point locations and the information (properties) of the control points. By moving a control point or control points, for example with a mouse or other cursor control device or via user interaction with a touchpad or multitouch device, the user may specify a new position for the control point(s). The user may also specify a rotation around a control point, or one or more other parameters such as a depth of a control point. These properties are stored or associated with each control point. The user can modify those properties (e.g., position, rotation, depth, and scale) via a user interface.
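The per-control-point state described above can be pictured as a small record. This is an illustrative sketch only; the field names are invented here, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ControlPoint:
    """Per-control-point state; field names are illustrative, not the
    patent's own identifiers."""
    vertex_index: int                 # mesh vertex the control point pins
    position: Tuple[float, float]     # translation target (x, y)
    rotation: Optional[float] = None  # radians; None = infer by optimization
    depth: float = 0.0                # rendering-order hint
    scale: float = 1.0                # local edge-length multiplier

# Dragging a control point with the mouse just changes its stored
# position property; the mesh is then re-initialized and re-optimized.
cp = ControlPoint(vertex_index=42, position=(10.0, 20.0))
cp.position = (12.5, 18.0)
```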
- the warping module may perform an initialization in which the warping module propagates the information to some or all of the other vertices in the mesh to generate an initial deformed mesh.
- the warping module may then perform an iterative optimization operation on the deformed mesh to improve the deformation while retaining local rigidity.
- embodiments move or adjust coordinates of the vertices of the mesh.
- the surface is then deformed according to the deformed mesh.
- a user may specify rotation at the control points. However, if the user does not specify a rotation, the warping module may infer the rotation by solving the optimization as described herein.
- the rotation information may be propagated to some or all of the other vertices in the mesh. For example, the rotation information may be propagated in the initialization step using the coordinates computed by solving a Laplacian problem.
- the user may specify a depth at one or more control points. The depth may be propagated throughout the mesh in the same way as the rotation is propagated. If two portions of a surface overlap, the depth values may be compared to determine which is in front and which is behind.
- a scale property may be similarly specified and propagated.
- a scale property modifies or scales the distance, or length of an edge, between two neighboring vertices.
- the objective function stays the same, but the constraints are no longer just positions; certain edges would have to be of a certain length because of the scale assigned to the vertices.
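The edge-length constraint implied by per-vertex scales can be sketched as follows. Averaging the two endpoint scales is one plausible reading of "scale modifies the length of an edge"; the passage does not give the exact rule:

```python
import math

def target_edge_length(p_i, p_j, s_i, s_j):
    """Scaled rest length for edge (i, j): the rest-state distance
    multiplied by the average of the two vertex scale factors
    (an illustrative choice)."""
    rest = math.dist(p_i, p_j)
    return 0.5 * (s_i + s_j) * rest

# a scale of 2.0 at both endpoints doubles the edge-length constraint
print(target_edge_length((0.0, 0.0), (3.0, 4.0), 2.0, 2.0))  # -> 10.0
```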
- a user may specify one, two, three or more properties at each control point. One or more of these properties may then be propagated throughout the mesh.
- Some embodiments may provide a method for the user to specify certain areas of a surface to which propagation of one or more of the control point parameters will be restricted or prohibited so that the area receives less deformation or remains completely rigid.
- some embodiments may provide one or more user interface elements via which a user may paint or otherwise indicate an area of the surface to prevent the area from warping or to restrict warping in the area.
- some embodiments may allow the user to specify an amount of rigidity or elasticity that the user desires.
- initialization may be performed to produce a deformed mesh, and the surface is then deformed according to the results of the initialization.
- the optimization may be allowed to run until, for example, a convergence criterion is reached, and the surface is then deformed according to the results of the optimization to provide local rigidity.
- Some embodiments may provide one or more modes or levels in between the two extremes that set a certain number of iterations of the optimization function.
- FIG. 1 is a high-level flowchart of a method of operation of a warping module according to some embodiments.
- FIG. 2 is a more detailed flowchart of a method of operation of a warping module according to some embodiments.
- FIGS. 3A through 3H illustrate examples of manipulating a selected object within an image according to the warping methods described above implemented in an example warping module with an example user interface, according to some embodiments.
- FIGS. 4A through 4H illustrate examples of manipulating an entire image according to the warping methods described above implemented in an example warping module with an example user interface, according to some embodiments.
- FIG. 5 illustrates an example user interface to a warping module according to some embodiments.
- FIG. 6 illustrates an example embodiment of a warping tool.
- FIG. 7 illustrates an example of a computer system that may be used in embodiments.
- such quantities may take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared or otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to such signals as bits, data, values, elements, symbols, characters, terms, numbers, numerals or the like. It should be understood, however, that all of these or similar terms are to be associated with appropriate physical quantities and are merely convenient labels. Unless specifically stated otherwise, as apparent from the following discussion, it is appreciated that throughout this specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining” or the like refer to actions or processes of a specific apparatus, such as a special purpose computer or a similar special purpose electronic computing device.
- a special purpose computer or a similar special purpose electronic computing device is capable of manipulating or transforming signals, typically represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the special purpose computer or similar special purpose electronic computing device.
- initialization takes a property, e.g. rotation, at each control point, and propagates that property to some or all other vertices in the mesh.
- initialization determines the rotation of some or all vertices in the mesh given the rotation of the control point(s).
- this may be performed by solving Laplace's equation, which can be solved as a linear system.
- the linear system, once solved, assigns a set of coordinates to each vertex in the mesh. These coordinates may be used to propagate the information of the control point(s).
- each vertex in the mesh has some coordinates, some weighting coefficients, that inform the vertex how to weigh the properties at each of the control points in order to compute the property value(s) at the vertex.
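The weighting-coefficient idea above can be sketched concretely. A uniform graph Laplacian is assumed here for simplicity (the patent does not specify the weighting scheme); each control point contributes one harmonic "coordinate" per vertex, and the coordinates at each vertex sum to one:

```python
import numpy as np

def harmonic_weights(n, edges, control_idx):
    """Solve a discrete Laplace equation per control point: L w = 0 at
    free vertices, w = 1 at that control point and 0 at the others.
    Returns an (n, k) matrix of per-vertex weights whose rows sum to 1."""
    L = np.zeros((n, n))                 # uniform graph Laplacian
    for i, j in edges:
        L[i, j] = L[j, i] = -1.0
    np.fill_diagonal(L, -L.sum(axis=1))

    free = [v for v in range(n) if v not in control_idx]
    W = np.zeros((n, len(control_idx)))
    for c, ci in enumerate(control_idx):
        b = np.zeros(n)
        b[ci] = 1.0                      # boundary value at this control point
        rhs = -L[np.ix_(free, list(control_idx))] @ b[list(control_idx)]
        w = np.zeros(n)
        w[list(control_idx)] = b[list(control_idx)]
        w[free] = np.linalg.solve(L[np.ix_(free, free)], rhs)
        W[:, c] = w
    return W

# Path mesh 0-1-2-3-4 with control points at the ends: a property value
# (e.g. a rotation angle) at each vertex is the weighted sum of the
# control-point values.
W = harmonic_weights(5, [(0, 1), (1, 2), (2, 3), (3, 4)], [0, 4])
rot = W @ np.array([0.0, 1.0])
```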
- embodiments may solve an optimization problem that tries to improve the initialization.
- the propagation generates an initial deformation of the mesh, but does not necessarily minimize the distortion of the original image.
- the initial deformation of the mesh is used as an initial guess, and an iterative optimization is performed that iteratively improves the local rigidity of the deformation.
- optimization may solve the following equation:
- the optimization process tries to simultaneously solve for the best position of each vertex and, in this case, the best rotation of each vertex in order to minimize the distortion subject to matching the specified locations of the control points.
- This optimization proceeds iteratively so that it starts off with an initial guess, iterates and improves until satisfied with the results or until some termination criterion (e.g., a specified number of iterations, or a convergence criterion) is met.
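One way to picture this simultaneous solve for positions and rotations is an alternating local/global scheme in the spirit of as-rigid-as-possible deformation. This is an illustrative sketch under assumed uniform edge weights, not the patent's exact solver; the function names are invented:

```python
import numpy as np

def arap_step(P, V, edges, pinned):
    """One local/global iteration: (1) fit a best rotation R_i per vertex
    from its current edge vectors, (2) re-solve positions so deformed
    edges match the rotated rest edges, with pinned control points as
    hard constraints. P: (n,2) rest positions; V: (n,2) current guess."""
    n = len(P)
    nbrs = [[] for _ in range(n)]
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)

    # Local step: nearest rotation per vertex via SVD (Kabsch).
    R = np.zeros((n, 2, 2))
    for i in range(n):
        S = np.zeros((2, 2))
        for j in nbrs[i]:
            S += np.outer(P[j] - P[i], V[j] - V[i])
        U, _, Vt = np.linalg.svd(S)
        Ri = Vt.T @ U.T
        if np.linalg.det(Ri) < 0:        # keep det = +1 (no reflections)
            Vt[-1] *= -1.0
            Ri = Vt.T @ U.T
        R[i] = Ri

    # Global step: linear (Laplacian-like) system for the free vertices.
    A = np.zeros((n, n))
    b = np.zeros((n, 2))
    for i in range(n):
        if i in pinned:
            A[i, i] = 1.0
            b[i] = pinned[i]
            continue
        for j in nbrs[i]:
            A[i, i] += 1.0
            A[i, j] -= 1.0
            b[i] += 0.5 * (R[i] + R[j]) @ (P[i] - P[j])
    return np.linalg.solve(A, b)

# Two pins consistent with a 0.3-radian rigid rotation: iterating from
# the rest pose, the free vertices converge toward that rotation.
P = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
c, s = np.cos(0.3), np.sin(0.3)
Rg = np.array([[c, -s], [s, c]])
pinned = {0: P[0], 1: Rg @ P[1]}
V = P.copy()
for _ in range(50):
    V = arap_step(P, V, edges, pinned)
```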
- a user may specify rotation at the control points. For example, in some embodiments, the user may click an option key to open a rotation widget that allows the user to rotate around a control point and fix a particular rotation for the control point. However, if the user does not specify a rotation, the warping module may infer the rotation by solving the optimization as described herein. In cases where the user specifies a rotation, the rotation information may be propagated to some or all of the other vertices in the mesh. For example, the rotation information may be propagated in the initialization step using the coordinates computed by solving a Laplacian problem, as described herein.
- the user may specify a depth at one or more control points.
- the warping module may use a strategy that assigns a random depth to each control point; the user may override the depth assignment by specifying depth ordering.
- the depth may be propagated throughout the mesh in the same way as the rotation is propagated. If two portions of a surface overlap, the depth values may be compared to determine which is in front and which is behind. Two triangles in the mesh may have slightly different depths at the vertices; these triangles may be examined, and depth order may be based on a comparison of the depths so that the triangles are rendered appropriately.
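The triangle depth comparison above can be sketched as a painter's-algorithm ordering. Averaging the three vertex depths per triangle is an illustrative choice; the text only says the depths "may be compared":

```python
def triangle_render_order(triangles, vertex_depth):
    """Back-to-front ordering of triangles from propagated per-vertex
    depths (larger depth = farther away, drawn first)."""
    def tri_depth(tri):
        return sum(vertex_depth[v] for v in tri) / 3.0
    return sorted(range(len(triangles)),
                  key=lambda t: -tri_depth(triangles[t]))

tris = [(0, 1, 2), (2, 3, 4)]
depths = {0: 0.0, 1: 0.0, 2: 0.5, 3: 1.0, 4: 1.0}
print(triangle_render_order(tris, depths))  # -> [1, 0] (deeper drawn first)
```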
- initialization may be performed to produce a deformed mesh, but the optimization may not be performed, to allow maximum elasticity.
- the surface is then deformed according to the results of the initialization.
- the optimization may be allowed to run until, for example, a convergence criterion is reached.
- the surface is then deformed according to the results of the optimization to provide local rigidity.
- Some embodiments may provide one or more modes in between the two extremes that set a certain number of iterations of the optimization function.
- Some embodiments may allow any number of iterations to be specified by the user to allow a smooth transition between elasticity and rigidity.
- FIG. 1 is a high-level flowchart of a method of operation of a warping module according to some embodiments.
- a plurality of control points, each at a vertex of a polygon mesh overlaid on an image, may be obtained.
- two or more properties of one or more of the control points may be specified by the user, or otherwise inferred.
- a change in a property of a control point may be obtained. For example, a user may move one of the control points in the image, and/or may specify a property (e.g., a rotation) at a control point or points.
- an initialization may be performed that generates an initial deformation of the mesh according to the changed control point, where the other control points act as pins that anchor the mesh.
- the initialization may propagate one or more properties at the control point(s) to other vertices of the mesh.
- some embodiments may provide a method for the user to specify certain areas of a surface to which propagation of one or more of the control point parameters will be restricted or prohibited.
- an iterative optimization of the initial deformation of the mesh may be performed to improve local rigidity.
- the surface underneath the mesh may then be deformed or adjusted according to the optimized deformed mesh.
- additional control points may be added as pins and/or for use in manipulating the image, and one or more other control points may be adjusted or properties of control points may be specified by the user to further manipulate the image.
- Each adjustment of a control point may result in elements 104 through 108 being performed.
- FIG. 2 is a more detailed flowchart of a method of operation of a warping module according to some embodiments.
- a surface to be deformed may be obtained.
- the warping module obtains an image or an object or region within an image to be manipulated.
- a user may specify the image or object/region via the user interface.
- a specified object or region may, for example, be the figure of a human, an animal, or any other object or region that appears in the image.
- Manual and automated methods for selecting or specifying objects or regions of images are known in the art; any such method may be used in embodiments to specify an object in an image to be manipulated.
- what is to be manipulated (e.g., an entire image or an object or region within an image) may be referred to herein as a surface.
- the warping module may generate a polygon mesh on the surface.
- a triangle mesh may be used; a triangle mesh is used as an example in this document.
- FIG. 4B shows an example image with a triangle mesh overlaid on the surface.
- other polygon meshes may be used in some embodiments.
- each polygon is defined by three (or more) vertices and three (or more) corresponding edges that connect the vertices; neighboring polygons may share at least one vertex.
- Methods for generating such polygon meshes for surfaces are known in the art; any such method may be used in embodiments to generate a polygon mesh for a surface to be manipulated.
- control points may be obtained at vertices of the mesh.
- the warping module obtains one or more control points on the surface.
- FIG. 4C shows an example image with control points on the surface.
- a user may specify control points on the surface to be manipulated via the user interface to the warping module.
- Each control point may correspond to a different one of the vertices of the mesh.
- the control points are placed for one (or both) of two reasons: to be used to adjust a part of the surface that is to be manipulated, or to constrain or hold a part of the surface in place that should not be moved when another control point is adjusted.
- embodiments may accept one or two control points: a single control point results in a simple translation of the entire surface, while two control points result in a simple rotation about one of the two control points.
- embodiments are generally directed to the application of image deformation or distortion using three or more control points, where one control point is moved to adjust the surface as desired while the other two or more control points act as pins that constrain the surface adjustment as desired.
- one or more properties of the control points may be obtained.
- these properties may include one or more of, but are not limited to, translation (or position), rotation, depth, and scale.
- one or more user interface elements may be provided whereby a user may specify values for translation, rotation, depth, and/or scale properties at each control point. Each of these properties is discussed in more depth throughout this document. While the values for the properties may be specified, for example by a user via the user interface, in some embodiments one or more properties that are not so specified may be inferred or calculated at control points. For example, if the user does not specify a rotation value at a control point, a rotation value may be automatically calculated for the control point by the warping module.
- the warping module may, for example, detect movement of a control point or control points to a new position.
- a user may use a mouse or other cursor control device, or via user interaction with a touchpad or multitouch device, to move one or more of the control points to a new position on the surface.
- Movement of a control point from a position A to another position B specifies a translation A→B of the control point. In other words, at least one property (translation) of the control point is changed.
- the other control points act as pins that constrain deformation of the mesh that results from the detected movement of the control point(s).
- an initialization is performed to generate an initial “guess” at a deformation of the polygon mesh after the adjustment of the control point and one or more of its properties.
- one or more properties may be propagated from an adjusted control point, and/or from one or more other control points, to the other vertices of the mesh.
- initialization involves propagating a property or properties of the control point(s) to all other vertices of the mesh, subject to the constraints of the other control points and the shape of the surface.
- an interpolation method may be used to adjust values of a property or properties of other vertices according to the values of a property or properties of the control point(s).
- the boundary of the surface being manipulated may influence how the property or properties are propagated. See, for example, FIG. 3B, where the human shape represents a surface being manipulated. While control point 300A is closer in Euclidean distance to control point 300C and its nearby vertices than it is to control point 300B and its nearby vertices, the propagation method does not propagate to control point 300C before propagating to control point 300B; propagation instead follows the boundary of the shape, and thus propagates down the arm to reach control point 300B before reaching control points 300D and 300C. As previously noted, some embodiments may provide a method for the user to specify certain areas of a surface to which propagation of one or more of the control point parameters will be restricted or prohibited.
- an iterative optimization may be performed to improve the local rigidity of the initial deformation of the mesh.
- the warping module may perform an iterative optimization operation to generate an improved deformed mesh.
- the warping module starts with an initial guess generated by the initialization process and performs an optimization that iterates to improve the local rigidity of the deformation.
- the optimization simultaneously solves for the best position of each vertex and the best rotation of each vertex in order to minimize distortion subject to matching the specified locations of the pins (control points).
- This optimization starts with the initial guess, iterates, and improves until satisfied with the results (e.g., until some termination criterion or criteria such as a convergence criterion or a specified number of iterations have been performed). See equation (4) and the discussion of optimization below for a discrete optimization technique that may be used in some embodiments to perform optimization.
- the surface underlying the mesh may be deformed according to the improved deformed mesh to render an adjusted or “deformed” surface.
- energy minimization is used to compute a deformation that not only agrees with the user-specified constraints (i.e., the control points), but that is also preferable to other candidate deformations:
- Some embodiments may use a polygon mesh, for example a triangle mesh, to represent the domain for the deformation.
- the location of any point x ∈ Ω in the domain may be computed using a barycentric combination of its three triangle vertices.
- a barycentric combination may be defined as a weighted sum of points where the coefficients of the points sum to one.
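The definition above can be made concrete: in 2D, the barycentric coordinates of a point with respect to a triangle solve a small linear system, and the three coefficients sum to one. A minimal sketch:

```python
import numpy as np

def barycentric(p, a, b, c):
    """Barycentric coordinates (wa, wb, wc) of point p in triangle
    (a, b, c): weights sum to one and p = wa*a + wb*b + wc*c."""
    a, b, c, p = (np.asarray(x, float) for x in (a, b, c, p))
    # solve the 2x2 system  p - c = wa*(a - c) + wb*(b - c)
    T = np.column_stack((a - c, b - c))
    wa, wb = np.linalg.solve(T, p - c)
    return wa, wb, 1.0 - wa - wb

w = barycentric((0.25, 0.25), (0, 0), (1, 0), (0, 1))
print(w)  # weights sum to 1; here (0.5, 0.25, 0.25)
```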
- the deformation φ and the gradients g may be extended to the interior of any triangle by associating values φ_1, …, φ_n and g_1, …, g_n with the corresponding vertices.
- the set N(i) denotes the indices of all vertices v_j connected to v_i.
- the gradients at each vertex may be required to be rotations g_i ∈ SO(2), expressing that a desired deformation should ideally be a distortion-free transformation of the local neighborhood.
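One standard way to enforce a constraint of the form g_i ∈ SO(2) during iteration is to project each estimated 2x2 gradient matrix onto the nearest rotation via the SVD (polar decomposition). This is a sketch of that standard technique, not necessarily the patent's exact method:

```python
import numpy as np

def nearest_rotation_2d(G):
    """Closest pure rotation (Frobenius norm) to a 2x2 matrix: the
    orthogonal polar factor, with a determinant fix to exclude
    reflections so the result lies in SO(2)."""
    U, _, Vt = np.linalg.svd(np.asarray(G, float))
    R = U @ Vt
    if np.linalg.det(R) < 0:   # reflection: flip the last left singular vector
        U[:, -1] *= -1.0
        R = U @ Vt
    return R

# a rotation contaminated by uniform scaling snaps back to the rotation
G = 1.3 * np.array([[np.cos(0.4), -np.sin(0.4)],
                    [np.sin(0.4),  np.cos(0.4)]])
R = nearest_rotation_2d(G)
```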
- the above formulation may be solved efficiently with a variety of optimization methods, but since the optimization may have several local minima it may require careful initialization.
- Some embodiments may initialize the iterative optimization process by propagating inferred gradients near the few user-specified constraints (i.e., control points) to all the remaining mesh vertices. These rotations may be smoothly propagated by computing generalized barycentric coordinates. Barycentric coordinates may be defined as the coefficients in a barycentric combination.
- φ_i = ∑_{j ∈ B} a_ij φ_j,  i ∈ V  (6)
- the same approach can be used to initialize gradients by first estimating gradients near user-specified constraints and then propagating them to the remaining vertices.
- the same interpolation may also be used to propagate other user-specified information. For example, a user may assign rendering depths for some subset of vertices {d_j : j ∈ B}, and these could then be propagated elsewhere:
- the user may identify rendering order without having to specify depth for every single vertex.
- the user may specify a scaling factor to indicate how much to deviate from the built-in distortion-free energy metric. In that case, scaling factors from one set of vertices may be propagated to all the others.
- FIGS. 3A through 3H and FIGS. 4A through 4H illustrate examples of manipulating surfaces according to the warping methods described above implemented in an example warping module with an example user interface, according to some embodiments.
- FIG. 3A shows an example figure of a man (a referee). Manual and automated methods for selecting or specifying objects or regions of images are known in the art; any such method may be used in embodiments to specify an object in an image to be manipulated.
- the selected object (the referee) may be referred to as a surface.
- control points 300 A through 300 E have been obtained, for example by user selection using a user interface to the warping module.
- a control point selection mode may be entered via the user interface, and the user may select the control points using cursor 302 .
- cursor 302 is shown at control point 300 A.
- a polygon mesh (not shown in FIG. 3B ) may be or may have been generated for the surface, and each control point may correspond to a vertex of the mesh.
- the user may, via the user interface, specify values for one or more properties of each control point.
- Example properties include translation, rotation, depth, and scale.
- control points 300 A through 300 E are still present.
- an example triangle mesh is shown displayed on the surface.
- control point 300 A has been moved, for example by the user using a cursor control device (e.g., a mouse, trackball, keyboard keys, etc.) to move the cursor 302 while the cursor is associated with the control point 300 A.
- an initialization is performed in which one or more properties of the control point 300 A are propagated to other vertices of the mesh to initialize a deformation or warp of the mesh.
- an interpolation technique may be used to propagate property values to other vertices, and the propagation may be constrained by the shape of the surface and by the other control points 300 B through 300 E that act as pins or anchors.
- an iterative optimization is performed as previously explained to improve the local rigidity of the initial deformation of the mesh.
- the surface underlying the mesh may be deformed according to the improved deformed mesh to render an adjusted surface. In this instance, the referee's arm is moved slightly inward.
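The rendering of the deformed surface can be sketched as a per-triangle inverse warp: each output pixel inside a deformed triangle is mapped back to the rest pose with its barycentric coordinates and sampled from the source image. This is a simplified, hypothetical sketch (nearest-neighbor sampling, CPU loops); a production renderer would rasterize with filtering, typically on the GPU.

```python
import numpy as np

def barycentric(p, a, b, c):
    # Barycentric coordinates of point p in triangle (a, b, c).
    M = np.array([[b[0] - a[0], c[0] - a[0]],
                  [b[1] - a[1], c[1] - a[1]]])
    u, v = np.linalg.solve(M, np.asarray(p, float) - np.asarray(a, float))
    return 1 - u - v, u, v

def warp_image(src, tris, rest, deformed, out_shape):
    """Render the deformed surface: each output pixel that falls in a
    deformed triangle looks up the corresponding source pixel via its
    barycentric coordinates (nearest-neighbor sampling)."""
    out = np.zeros(out_shape + src.shape[2:], src.dtype)
    H, W = out_shape
    for t in tris:
        a, b, c = (deformed[i] for i in t)
        ra, rb, rc = (rest[i] for i in t)
        # bounding box of the deformed triangle
        x0, x1 = int(min(a[0], b[0], c[0])), int(np.ceil(max(a[0], b[0], c[0])))
        y0, y1 = int(min(a[1], b[1], c[1])), int(np.ceil(max(a[1], b[1], c[1])))
        for y in range(max(y0, 0), min(y1 + 1, H)):
            for x in range(max(x0, 0), min(x1 + 1, W)):
                w0, w1, w2 = barycentric((x, y), a, b, c)
                if min(w0, w1, w2) < -1e-9:
                    continue                    # pixel outside this triangle
                sx, sy = w0 * ra + w1 * rb + w2 * rc   # same weights, rest pose
                out[y, x] = src[int(round(sy)), int(round(sx))]
    return out
```

With `deformed` equal to `rest` the warp is the identity, which makes the mapping easy to verify before applying an actual deformation.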
- control points 300 A through 300 E are still present.
- control point 300 A has again been moved slightly inward by the user.
- the warping module again performs the warping method as described, resulting in the referee's arm moving a bit more inward.
- control points 300 A through 300 E are still present.
- an example triangle mesh is shown displayed on the surface.
- control point 300 A has been moved by the user to over the referee's chest.
- the warping module again performs the warping method as described, resulting in the referee's arm moving as illustrated.
- a depth property at control point 300 A and/or at other control points 300 may be used to determine that the left arm should be rendered in front of the head and torso, rather than behind.
- control points 300 A through 300 E are still present.
- the user has added control points 300 F and 300 G at the referee's elbows.
- control point 300 A has been moved by the user to over the referee's head.
- the warping module again performs the warping method as described, resulting in the referee's arm moving as illustrated.
- control points 300 A through 300 G are still present.
- the user has added control points 300 H, 300 I, and 300 J.
- control points 300 H and 300 I have been moved by the user to move the referee's right arm inward, and control point 300 A has been slightly adjusted as well.
- the warping module again performs the warping method as described for each moved control point, resulting in the referee's arm moving as illustrated.
- a depth property at control point 300 A and/or at control point 300 H may be used to determine that the left arm should be rendered in front of the right arm, rather than behind.
- FIG. 3H illustrates a final image of the referee that may be generated after further manipulation of the control points 300 .
- the control points 300 (not shown in FIG. 3H ) can be added, removed, and manipulated as described to perform detailed control of the surface.
- the warping method as described maintains local rigidity, thus reducing the distortion commonly seen with conventional warping methods.
- FIGS. 4A through 4H illustrate examples of manipulating an entire image according to the warping methods described above implemented in an example warping module with an example user interface, according to some embodiments.
- FIG. 4A shows an example image of a football field. The image shows distortion that is common in images captured using wide-angle or fisheye lenses. Embodiments of the warping methods as described herein may be used to reduce or remove this distortion.
- the image may be referred to as a surface.
- a polygon mesh may be generated for the surface.
- An example triangle mesh is shown displayed on the surface in FIG. 4B .
- control points 400 A through 400 E have been obtained, for example by user selection using a user interface to the warping module.
- a control point selection mode may be entered via the user interface, and the user may select the control points using cursor 402 .
- cursor 402 is shown at control point 400 A.
- the user may, via the user interface, specify values for one or more properties of each control point.
- Example properties include translation, rotation, depth, and scale.
- control points 400 A through 400 E are still present.
- the cursor 402 is shown at control point 400 B.
- the user may, for example, move control point 400 B downward and to the left a bit to slightly straighten the yard lines on the right side of the field.
- the warping module performs the warping method as described.
- the other control points 400 pin or anchor the image so that the warping does not affect undesired portions of the image. In other words, the other control points constrain the deformation.
- control points 400 A and 400 C through 400 E are still present.
- Control point 400 B has been removed by the user, and a new control point 400 F has been added.
- the user may, for example, move control point 400 F up and to the right a bit to slightly straighten the yard lines on the right side of the field.
- the warping module performs the warping method as described.
- the other control points 400 again constrain the deformation.
- control points 400 A and 400 C through 400 F are still present.
- the user has added control points 400 G and 400 H.
- the user may move any of the control points 400 to adjust the image to reduce distortion.
- the warping module performs the warping method as described for each moved control point 400 , with the other control points 400 constraining the deformation.
- control points 400 A and 400 C through 400 H are still present.
- the user has added control point 400 I.
- the user may move any of the control points 400 to adjust the image to further reduce distortion.
- the warping module performs the warping method as described for each moved control point 400 , with the other control points 400 constraining the deformation.
- FIG. 4H illustrates a final image that may be generated after manipulation of the control points 400 .
- the control points 400 (not shown in FIG. 4H ) can be added, removed, and manipulated as described to perform detailed control of the surface.
- the warping method as described maintains local rigidity, thus reducing the distortion commonly seen with conventional warping methods.
- Embodiments of a warping module and user interface to the warping module may be implemented as a warping tool.
- portions or all of the warping module may be implemented by program instructions configured for execution on a graphics processing unit (GPU), or parallel execution on two or more GPUs.
- Applications of the warping tool may include, but are not limited to, manipulating or warping digital photographs to, for example, repair camera distortion effects, manipulating or animating parts of selected objects such as human or animal figures within digital images, modeling shapes in synthetically generated two-dimensional (2D) or three-dimensional (3D) images, and in general in any application where digital images or portions of images may need to be warped or deformed.
- Some embodiments of the warping tool may be implemented, for example, as a module in or plug-in for art design tools such as Adobe® Illustrator® technology and GNU Gimp technology, as a module in or plug-in for other types of image processing applications such as Adobe® Photoshop® technology and Adobe® After Effects® technology, or as a module or plug-in for computer-aided design (CAD) systems.
- Other embodiments may be otherwise implemented, for example as a stand-alone program or utility, or as a library function.
- Various embodiments of the warping tool may obtain, manipulate, and output digital images in any of various digital image formats.
- FIG. 5 illustrates an example user interface to a warping module according to some embodiments.
- User interface 500 is given as an example and is not intended to be limiting.
- User interface 500 may include an image pane 502 in which a surface (e.g., an entire image or, as shown, a selected object 504 from an image) may be displayed.
- User interface 500 may also include one or more textual and/or graphical user interface elements, modes, controls, or techniques via which the user may perform various operations of the warping methods as described herein.
- user interface 500 may include an add control points 510 user interface element that the user may select to enter an add control points mode in which the user may add control points 506 to the displayed surface, for example by manipulating cursor 508 with a cursor control device (not shown) or by touching a touchpad or multitouch device.
- user interface 500 may include an adjust control points properties 512 user interface element that the user may use to specify values for one or more properties (e.g. translation, rotation, depth and scale) of a selected control point.
- the user interface 500 may provide a method whereby the user may cause popup menu controls 520 to be displayed for a control point 506 , via which the user may specify the one or more properties for the control point.
- user interface 500 may include a move control points 514 user interface element that the user may select to enter a mode within which the user may move one or more control points 506 , for example using a cursor control device, touchpad, or multitouch device.
- User interface 500 may include one or more other user interface elements 516 .
- user interface 500 may include one or more user interface elements that may be used to control manual or automatic methods for selecting an object in an image for manipulation, and/or one or more user interface elements whereby the user can paint or otherwise indicate areas of the surface (object 504 in this example) to which propagation of control point parameters will be restricted or prohibited so that deformation of these areas is reduced or prevented. In other words, the user may specify areas of the surface that will remain more or completely rigid.
- multitouch technology may be integrated with the warping tool user interface and warping module.
- Multitouch technology provides hardware and software that allow computer users to control various applications via the manipulation of multiple digits on the surface of (or, for some devices, proximate to) a multitouch-enabled device.
- Multitouch technology generally consists of a touch-enabled device (referred to as a multitouch device) such as a touch-sensitive display device (computer display, screen, table, wall, etc.), touchpad, tablet, etc., as well as software that recognizes multiple, substantially simultaneous touch points on the surface of the multitouch device.
- conventional touch-enabled technologies (e.g., a computer touchpad, ATM screen, etc.) recognize only one touch point.
- a user may simultaneously move two or more control points 506 on the image pane 502 .
- the warping module may perform the warping method as described herein separately for each of the simultaneously moved control points 506 , with the surface deforming accordingly.
- the warping module may apply the warping method as described herein for two or more control points 506 .
- the warping module may perform an initialization with propagation based on the information received from two or more moved control points to generate an initial deformed mesh, and then perform optimization as described on the initial deformed mesh.
- FIG. 6 illustrates an example embodiment of a warping tool that may include a warping module implementing the workflows and methods for manipulating surfaces as described herein.
- Warping tool 600 may include a warping module 602 that implements an embodiment of a warping method as described herein, for example as described in FIGS. 1 and/or 2 , and a user interface 610 to the warping module 602 , for example as illustrated in FIG. 5 , that provides one or more textual and/or graphical user interface elements, modes or techniques via which a user may provide input to and/or control various aspects of surface manipulation as described herein using embodiments of the warping module 602 .
- warping tool 600 may implement any subset, or all, of the features described herein for embodiments of the warping method.
- warping tool 600 may provide real-time or near-real-time feedback to the user via dynamic display on a display device(s) 630 of modifications to an input image 604 made according to the user input received via user interface 610 .
- the user may perform manipulation of the surface using the warping tool 600 , as illustrated by FIGS. 3A-3H and FIGS. 4A-4H , with results being dynamically displayed on a display device 630 .
- Results may be output as an example output image 608 .
- Output image 608 may, for example, be displayed on a display device 630 , printed, and/or written to or stored on any of various types of memory media, such as storage media or storage devices 620 .
- One such computer system is illustrated by FIG. 7 .
- computer system 700 includes one or more processors 710 coupled to a system memory 720 via an input/output (I/O) interface 730 .
- Computer system 700 further includes a network interface 740 coupled to I/O interface 730 , and one or more input/output devices 750 , such as cursor control device 760 , touchpad (not shown), keyboard 770 , display(s) 780 , and multitouch-enabled device(s) 790 .
- embodiments may be implemented using a single instance of computer system 700 , while in other embodiments multiple such systems, or multiple nodes making up computer system 700 , may be configured to host different portions or instances of embodiments.
- some elements may be implemented via one or more nodes of computer system 700 that are distinct from those nodes implementing other elements.
- computer system 700 may be a uniprocessor system including one processor 710 , or a multiprocessor system including several processors 710 (e.g., two, four, eight, or another suitable number).
- processors 710 may be any suitable processor capable of executing instructions.
- processors 710 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA.
- each of processors 710 may commonly, but not necessarily, implement the same ISA.
- At least one processor 710 may be a graphics processing unit.
- a graphics processing unit or GPU may be considered a dedicated graphics-rendering device for a personal computer, workstation, game console or other computer system.
- Modern GPUs may be very efficient at manipulating and displaying computer graphics, and their highly parallel structure may make them more effective than typical CPUs for a range of complex graphical algorithms.
- a graphics processor may implement a number of graphics primitive operations in a way that makes executing them much faster than drawing directly to the screen with a host central processing unit (CPU).
- the methods disclosed herein for manipulating surfaces may be implemented by program instructions configured for execution on one of, or parallel execution on two or more of, such GPUs.
- the GPU(s) may implement one or more application programmer interfaces (APIs) that permit programmers to invoke the functionality of the GPU(s). Suitable GPUs may be commercially available from vendors such as NVIDIA Corporation, ATI Technologies, and others.
- System memory 720 may be configured to store program instructions and/or data accessible by processor 710 .
- system memory 720 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory.
- program instructions and data implementing desired functions are shown stored within system memory 720 as program instructions 725 and data storage 735 , respectively.
- program instructions and/or data may be received, sent or stored upon different types of computer-accessible media or on similar media separate from system memory 720 or computer system 700 .
- a computer-accessible medium may include storage media or memory media such as magnetic or optical media, e.g., disk or CD/DVD-ROM coupled to computer system 700 via I/O interface 730 .
- Program instructions and data stored via a computer-accessible medium may be transmitted by transmission media or signals such as electrical, electromagnetic, or digital signals, which may be conveyed via a communication medium such as a network and/or a wireless link, such as may be implemented via network interface 740 .
- I/O interface 730 may be configured to coordinate I/O traffic between processor 710 , system memory 720 , and any peripheral devices in the device, including network interface 740 or other peripheral interfaces, such as input/output devices 750 .
- I/O interface 730 may perform any necessary protocol, timing or other data transformations to convert data signals from one component (e.g., system memory 720 ) into a format suitable for use by another component (e.g., processor 710 ).
- I/O interface 730 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard, for example.
- I/O interface 730 may be split into two or more separate components, such as a north bridge and a south bridge, for example.
- some or all of the functionality of I/O interface 730 such as an interface to system memory 720 , may be incorporated directly into processor 710 .
- Network interface 740 may be configured to allow data to be exchanged between computer system 700 and other devices attached to a network, such as other computer systems, or between nodes of computer system 700 .
- network interface 740 may support communication via wired or wireless general data networks, such as any suitable type of Ethernet network, for example; via telecommunications/telephony networks such as analog voice networks or digital fiber communications networks; via storage area networks such as Fibre Channel SANs, or via any other suitable type of network and/or protocol.
- Input/output devices 750 may, in some embodiments, include one or more display terminals, keyboards, keypads, touchpads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or retrieving data by one or more computer systems 700 .
- Multiple input/output devices 750 may be present in computer system 700 or may be distributed on various nodes of computer system 700 .
- similar input/output devices may be separate from computer system 700 and may interact with one or more nodes of computer system 700 through a wired or wireless connection, such as over network interface 740 .
- memory 720 may include program instructions 725 , configured to implement embodiments of the methods for manipulating surfaces, warping module, and/or warping tool as described herein, and data storage 735 , comprising various data accessible by program instructions 725 .
- program instructions 725 may include software elements of the methods for manipulating surfaces, warping module, and/or warping tool illustrated in the above Figures.
- Data storage 735 may include data that may be used in embodiments. In other embodiments, other or different software elements and data may be included.
- computer system 700 is merely illustrative and is not intended to limit the scope of the methods for manipulating surfaces, warping module, and/or warping tool as described herein.
- the computer system and devices may include any combination of hardware or software that can perform the indicated functions, including computers, network devices, internet appliances, PDAs, wireless phones, pagers, etc.
- Computer system 700 may also be connected to other devices that are not illustrated, or instead may operate as a stand-alone system.
- the functionality provided by the illustrated components may in some embodiments be combined in fewer components or distributed in additional components.
- the functionality of some of the illustrated components may not be provided and/or other additional functionality may be available.
- instructions stored on a computer-accessible medium separate from computer system 700 may be transmitted to computer system 700 via transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.
- Various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-accessible medium. Accordingly, the present invention may be practiced with other computer system configurations.
- a computer-accessible medium may include storage media or memory media such as magnetic or optical media, e.g., disk or DVD/CD-ROM, volatile or non-volatile media such as RAM (e.g. SDRAM, DDR, RDRAM, SRAM, etc.), ROM, etc., as well as transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as network and/or a wireless link.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
The unknowns in this equation are both the coordinates of each of the points (ƒ), which are needed to specify the deformation, and some other property at each vertex (g), for example rotation. C represents the control points as constraints.
E[ƒ] = ∥∇ƒ_1(x) − g_1(x)∥ + ∥∇ƒ_2(x) − g_2(x)∥ (2)
where the linear functions g1 and g2 specify the gradients of the ideal isometric deformation. These gradients and the ideal deformation are not known in advance. Instead, some embodiments may minimize the energy to compute both the deformation ƒ and the gradients g:
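One common way to carry out this joint minimization over ƒ and g is to alternate two steps. This is a sketch of a standard alternating scheme, not necessarily the solver used in the patent (its Optimization section leaves the choice open), and the squared norms below are an assumption made so the global step is a least-squares problem:

```latex
% Step 1 (global): with the gradients g fixed, the energy is quadratic
% in f; solve a least-squares problem subject to the control points C:
f^{(k+1)} = \arg\min_{f,\; f|_C \text{ fixed}} \;
    \sum_{i=1}^{2} \left\lVert \nabla f_i(x) - g_i^{(k)}(x) \right\rVert^2
% Step 2 (local): with f fixed, project each gradient back onto the
% rotation group, keeping the deformation locally distortion-free:
g^{(k+1)}(x) = \Pi_{\mathrm{SO}(2)}\!\left( \nabla f^{(k+1)}(x) \right)
```

Each step can only decrease the energy, so the alternation converges to a local minimum; this is why the careful initialization discussed above matters.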
Optimization
The mesh vertices V = {v_1, v_2, …, v_n} may be divided
into triangular faces:
T = {t_1, t_2, …, t_m}
with each triangle identified by a triple of vertex indices:
t_i ∈ V × V × V.
Δa_j(x) = 0, j ∈ B (7)
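A discrete version of equation (7) can be sketched as follows. Each control vertex j gets a coordinate function a_j that is 1 at j, 0 at the other control vertices, and harmonic at every free vertex. The uniform graph Laplacian here is a simplifying stand-in for the cotangent Laplacian typically used on triangle meshes, and the function name is illustrative, not from the patent.

```python
import numpy as np

def harmonic_coordinates(n, edges, B):
    # Uniform graph Laplacian over the mesh edges (stand-in for a
    # cotangent Laplacian).
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1.0; L[j, j] += 1.0
        L[i, j] -= 1.0; L[j, i] -= 1.0
    free = [i for i in range(n) if i not in B]
    A = np.zeros((n, len(B)))
    for k, j in enumerate(B):
        a = np.zeros(n)
        a[j] = 1.0                       # a_j is 1 at its own control vertex
        # Harmonicity at the free vertices: L_ff a_f = -L_fB a_B
        rhs = -L[np.ix_(free, B)] @ a[B]
        a[free] = np.linalg.solve(L[np.ix_(free, free)], rhs)
        A[:, k] = a                      # column k is the function a_j
    return A
```

The rows of the returned matrix sum to 1, so they can be used directly as the barycentric combination weights of equation (6) to propagate gradients, depths, or scale factors from the control vertices to the rest of the mesh.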
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/714,028 US9053553B2 (en) | 2010-02-26 | 2010-02-26 | Methods and apparatus for manipulating images and objects within images |
US14/733,090 US9454797B2 (en) | 2010-02-26 | 2015-06-08 | Deforming a surface via a control point |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/714,028 US9053553B2 (en) | 2010-02-26 | 2010-02-26 | Methods and apparatus for manipulating images and objects within images |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/733,090 Continuation US9454797B2 (en) | 2010-02-26 | 2015-06-08 | Deforming a surface via a control point |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130120457A1 US20130120457A1 (en) | 2013-05-16 |
US9053553B2 true US9053553B2 (en) | 2015-06-09 |
Family
ID=48280208
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/714,028 Active 2033-01-25 US9053553B2 (en) | 2010-02-26 | 2010-02-26 | Methods and apparatus for manipulating images and objects within images |
US14/733,090 Active US9454797B2 (en) | 2010-02-26 | 2015-06-08 | Deforming a surface via a control point |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/733,090 Active US9454797B2 (en) | 2010-02-26 | 2015-06-08 | Deforming a surface via a control point |
Country Status (1)
Country | Link |
---|---|
US (2) | US9053553B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9454797B2 (en) | 2010-02-26 | 2016-09-27 | Adobe Systems Incorporated | Deforming a surface via a control point |
US10599320B2 (en) | 2017-05-15 | 2020-03-24 | Microsoft Technology Licensing, Llc | Ink Anchoring |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130016098A1 (en) * | 2011-07-17 | 2013-01-17 | Raster Labs, Inc. | Method for creating a 3-dimensional model from a 2-dimensional source image |
JP5375897B2 (en) * | 2011-08-25 | 2013-12-25 | カシオ計算機株式会社 | Image generation method, image generation apparatus, and program |
KR20150015680A (en) * | 2013-08-01 | 2015-02-11 | 씨제이씨지브이 주식회사 | Method and apparatus for correcting image based on generating feature point |
US20150089446A1 (en) * | 2013-09-24 | 2015-03-26 | Google Inc. | Providing control points in images |
CN107924571A (en) * | 2015-08-14 | 2018-04-17 | 汤姆逊许可公司 | Three-dimensional reconstruction is carried out to human ear from a cloud |
US10930086B2 (en) | 2016-11-01 | 2021-02-23 | Dg Holdings, Inc. | Comparative virtual asset adjustment systems and methods |
US10275941B2 (en) * | 2016-11-01 | 2019-04-30 | Dg Holdings, Inc. | Multi-layered depth and volume preservation of stacked meshes |
US10140764B2 (en) * | 2016-11-10 | 2018-11-27 | Adobe Systems Incorporated | Generating efficient, stylized mesh deformations using a plurality of input meshes |
CN106846477B (en) * | 2017-02-10 | 2020-03-31 | 中国电建集团成都勘测设计研究院有限公司 | Geological marker interpretation modeling method for compiling and recording field geological image |
US10510186B2 (en) * | 2017-12-22 | 2019-12-17 | Adobe Inc. | Digital media environment for intuitive modifications of digital graphics |
US10388045B2 (en) | 2018-01-04 | 2019-08-20 | Adobe Inc. | Generating a triangle mesh for an image represented by curves |
US10410317B1 (en) | 2018-03-26 | 2019-09-10 | Adobe Inc. | Digital image transformation environment using spline handles |
US10628918B2 (en) | 2018-09-25 | 2020-04-21 | Adobe Inc. | Generating enhanced digital content using piecewise parametric patch deformations |
US10706500B2 (en) * | 2018-09-25 | 2020-07-07 | Adobe Inc. | Generating enhanced digital content using piecewise parametric patch deformations |
US10832376B2 (en) | 2018-09-25 | 2020-11-10 | Adobe Inc. | Generating enhanced digital content using piecewise parametric patch deformations |
US10832446B2 (en) | 2019-01-07 | 2020-11-10 | Adobe Inc. | Bone handle generation |
US10943375B2 (en) | 2019-04-17 | 2021-03-09 | Adobe Inc. | Multi-state vector graphics |
US11315299B1 (en) * | 2020-11-13 | 2022-04-26 | Unity Technologies Sf | Method for computation of local densities for virtual fibers |
US11170553B1 (en) * | 2020-07-24 | 2021-11-09 | Weta Digital Limited | Methods and systems for generating an animation control rig |
US11282277B1 (en) * | 2020-09-28 | 2022-03-22 | Adobe Inc. | Systems for shading vector objects |
CN112530016B (en) * | 2020-10-30 | 2022-11-11 | 北京字跳网络技术有限公司 | Method, device, equipment and storage medium for adsorbing road fittings |
US11631207B2 (en) | 2021-09-09 | 2023-04-18 | Adobe Inc. | Vector object stylization from raster objects |
US11875456B2 (en) * | 2021-09-30 | 2024-01-16 | Ephere, Inc. | System and method of generating graft surface files and graft groom files and fitting the same onto a target surface to provide an improved way of generating and customizing grooms |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6768486B1 (en) * | 2001-05-18 | 2004-07-27 | Autodesk, Inc. | Modifying subobjects of geometry objects based on per-subobject objects |
US20070035541A1 (en) * | 2005-07-29 | 2007-02-15 | Michael Isner | Three-dimensional animation of soft tissue of characters using controls associated with a surface mesh |
US20090002376A1 (en) * | 2007-06-29 | 2009-01-01 | Microsoft Corporation | Gradient Domain Editing of Animated Meshes |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8514238B2 (en) * | 2008-11-21 | 2013-08-20 | Adobe Systems Incorporated | System and method for adding vector textures to vector graphics images |
US9053553B2 (en) | 2010-02-26 | 2015-06-09 | Adobe Systems Incorporated | Methods and apparatus for manipulating images and objects within images |
- 2010-02-26: US US12/714,028 patent/US9053553B2/en active Active
- 2015-06-08: US US14/733,090 patent/US9454797B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6768486B1 (en) * | 2001-05-18 | 2004-07-27 | Autodesk, Inc. | Modifying subobjects of geometry objects based on per-subobject objects |
US20070035541A1 (en) * | 2005-07-29 | 2007-02-15 | Michael Isner | Three-dimensional animation of soft tissue of characters using controls associated with a surface mesh |
US20090002376A1 (en) * | 2007-06-29 | 2009-01-01 | Microsoft Corporation | Gradient Domain Editing of Animated Meshes |
Non-Patent Citations (8)
Title |
---|
A. Cuno, C. Esperanca, A. Oliveira, and P. R. Cavalcanti: 3D as-rigid-as-possible deformations using MLS. Proceedings of the 27th Computer Graphics International Conference, pp. 115-122, Petropolis, RJ, Brazil, May 2007. |
Marc Alexa, Daniel Cohen-Or, David Levin: As-Rigid-As-Possible Shape Interpolation. SIGGRAPH 2000 Conference Proceedings, pp. 157-164, 2000. |
Olga Sorkine, Marc Alexa: As-rigid-as-possible surface modeling. Proceedings of the Fifth Eurographics Symposium on Geometry Processing, Jul. 4-6, 2007. |
Robert W. Sumner, Jovan Popovic: Deformation transfer for triangle meshes. ACM Transactions on Graphics (TOG), vol. 23, no. 3, Aug. 2004. |
Robert W. Sumner, Matthias Zwicker, Craig Gotsman, Jovan Popovic: Mesh-based inverse kinematics. ACM Transactions on Graphics (TOG), vol. 24, no. 3, Jul. 2005. |
Takeo Igarashi, Tomer Moscovich, John F. Hughes: As-rigid-as-possible shape manipulation. ACM Transactions on Graphics (TOG), vol. 24, no. 3, Jul. 2005. |
Ullman, S.: Maximizing rigidity: the incremental recovery of 3D structure from rigid and rubbery motion. Technical Report A.I. Memo No. 721, MIT, 1983. |
Xiaohan Shi, Kun Zhou, Yiying Tong, Mathieu Desbrun, Hujun Bao, Baining Guo: Mesh puppetry: cascading optimization of mesh deformation with inverse kinematics. ACM Transactions on Graphics (TOG), vol. 26, no. 3, Jul. 2007. |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9454797B2 (en) | 2010-02-26 | 2016-09-27 | Adobe Systems Incorporated | Deforming a surface via a control point |
US10599320B2 (en) | 2017-05-15 | 2020-03-24 | Microsoft Technology Licensing, Llc | Ink Anchoring |
Also Published As
Publication number | Publication date |
---|---|
US20130120457A1 (en) | 2013-05-16 |
US9454797B2 (en) | 2016-09-27 |
US20150269706A1 (en) | 2015-09-24 |
Similar Documents
Publication | Title |
---|---|
US9454797B2 (en) | Deforming a surface via a control point |
US8334868B2 (en) | Method and apparatus for surface inflation using mean curvature constraints |
US8711150B2 (en) | Methods and apparatus for deactivating internal constraint curves when inflating an N-sided patch |
US8731876B2 (en) | Creating editable feature curves for a multi-dimensional model |
Wang et al. | Feature based 3D garment design through 2D sketches |
US10796497B2 (en) | Distance field coupled fitted deformation lattices for shape modification |
US10255381B2 (en) | 3D modeled object defined by a grid of control points |
Wu et al. | ViSizer: a visualization resizing framework |
US20130124148A1 (en) | System and Method for Generating Editable Constraints for Image-based Models |
US8766978B2 (en) | Methods and apparatus for generating curved extrusions |
US10943375B2 (en) | Multi-state vector graphics |
US20130127824A1 (en) | Object Selection in Stereo Image Pairs |
KR20080034420A (en) | Large mesh deformation using the volumetric graph laplacian |
US9965843B2 (en) | Methods and systems for characterizing concept drawings and estimating three-dimensional information therefrom |
US20150154797A1 (en) | Method, apparatus and system for tessellating a parametric patch |
Huang et al. | Transformation guided image completion |
US20150113453A1 (en) | Methods and devices for simplified graphical object editing |
US8681147B1 (en) | Fractured texture coordinates |
US9805499B2 (en) | 3D-consistent 2D manipulation of images |
US9311755B2 (en) | Self-disclosing control points |
US8334869B1 (en) | Method and apparatus for modeling 3-D shapes from a user drawn curve |
Pakdel et al. | Incremental subdivision for triangle meshes |
McDonnell et al. | PB-FFD: a point-based technique for free-form deformation |
CN111047666B (en) | Unified digital content selection system for vector graphics and grid graphics |
JP2014525313A (en) | Method of accessing 3D transparent video data for high-speed driving and user interface device therefor |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: ADOBE SYSTEMS INCORPORATED, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:POPOVIC, JOVAN;CHIEN, JEN-CHAN;INTWALA, CHINTAN;AND OTHERS;SIGNING DATES FROM 20100225 TO 20100226;REEL/FRAME:024008/0347 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
AS | Assignment | Owner name: ADOBE INC., CALIFORNIA. Free format text: CHANGE OF NAME;ASSIGNOR:ADOBE SYSTEMS INCORPORATED;REEL/FRAME:048867/0882. Effective date: 20181008 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |