CN110069195B - Image dragging deformation method and device - Google Patents

Image dragging deformation method and device

Info

Publication number
CN110069195B
CN110069195B (application CN201910101346.3A)
Authority
CN
China
Prior art keywords
dragging
point
image
grid
deformed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910101346.3A
Other languages
Chinese (zh)
Other versions
CN110069195A (en)
Inventor
倪光耀 (Ni Guangyao)
杨辉 (Yang Hui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Douyin Vision Co Ltd
Douyin Vision Beijing Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN201910101346.3A
Publication of CN110069195A
Application granted
Publication of CN110069195B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides an image dragging deformation method and apparatus, an electronic device, and a computer-readable storage medium. The image dragging deformation method comprises the following steps: determining the position of a dragging point in an image to be deformed, wherein the dragging point comprises a dragging start point and a dragging end point; performing gridding processing on the image to be deformed to obtain at least one grid point; determining the target position of each grid point according to the original position of the grid point, the position of the dragging start point, and the position of the dragging end point; and dragging and deforming the image to be deformed according to the target position. By gridding the image to be deformed, determining the target positions of the grid points from their original positions and the positions of the dragging start and end points, and deforming the image accordingly, the method and apparatus solve the technical problems in the prior art that image deformation is computationally complex and images cannot be deformed by dragging.

Description

Image dragging deformation method and device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image dragging and deforming method and apparatus, an electronic device, and a computer-readable storage medium.
Background
Image deformation is a common operation in image processing: changing one image into another according to a certain rule or method. Common transformations include, for example, affine transformations, perspective transformations, and bilinear transformations. Image deformation has a wide range of applications. It is an important tool in medical imaging, where many surgical procedures require simulating and deforming models, so the technology is of great help to medical research. It can also be used for entertainment, such as face deformation: for a given face image, it can generate complex expression changes such as happiness, anger, sorrow, and joy.
In the prior art, a triangulation method is generally used: the positions of the key points of a human face are recorded, and the face is deformed according to those key-point positions. However, this deformation transformation is computationally complex, and the face image cannot be deformed by dragging.
Disclosure of Invention
In a first aspect, an embodiment of the present disclosure provides an image dragging and deforming method, including: determining the position of a dragging point in an image to be deformed, wherein the dragging point comprises a dragging starting point and a dragging ending point; carrying out gridding processing on the image to be deformed to obtain at least one grid point; determining the target position of the grid point according to the original position of the grid point, the position of the dragging start point and the position of the dragging end point; and dragging and deforming the image to be deformed according to the target position.
Further, the determining the target position of the grid point according to the original position of the grid point, the position of the drag start point, and the position of the drag end point includes:
determining a transformation matrix and a translation vector of the grid point according to the original position of the grid point, the position of the dragging start point and the position of the dragging end point;
and determining the target position according to the transformation matrix and the translation vector.
Further, the determining a transformation matrix and a translation vector of the grid point according to the original position of the grid point, the position of the drag start point, and the position of the drag end point includes:
determining the weight of the grid point and the drag starting point according to the original position of the grid point and the position of the drag starting point;
and determining a transformation matrix and a translation vector of the grid points according to the weight, the position of the dragging start point and the position of the dragging end point.
Further, the determining a transformation matrix and a translation vector of the grid points according to the weight, the position of the drag start point, and the position of the drag end point includes:
optimizing the weight;
and determining a transformation matrix and a translation vector of the grid point according to the optimized weight, the position of the dragging start point and the position of the dragging end point.
Further, the determining the weight of the grid point and the drag start point according to the original position of the grid point and the position of the drag start point includes:
according to the formula
Figure RE-GDA0002007967850000021
Calculating to obtain the weight; wherein, | - | represents an absolute value, wijIs the weight, p, of the jth grid point and the ith drag pointiCoordinates of the start point of the i-th drag point, vjThe original coordinates of the jth grid point, α is the weight attenuation factor.
Further, the optimizing the weight includes:
optimizing the weight using the formula
$$w'_{ij} = \gamma \, w_{ij} + \beta$$
wherein γ is the optimization weight coefficient and β is the optimization weight offset.
Further, the determining a transformation matrix and a translation vector of the grid point according to the original position of the grid point, the position of the drag start point, and the position of the drag end point includes:
solving, according to the objective function
$$(M_j, T_j) = \operatorname{argmin}_{M_j, T_j} \sum_i w_{ij} \left| p_i M_j + T_j - q_i \right|^2$$
and the constraint $M_j^{\mathrm{T}} M_j = \lambda^2 I$, to obtain the transformation matrix and the translation vector, wherein argmin takes the arguments that minimize the objective, Σ denotes summation over the dragging points, M_j is the transformation matrix of the jth grid point, T_j is the translation vector of the jth grid point, q_i is the end-point coordinate of the ith dragging point, the superscript T denotes matrix transposition, λ is a scalar, and I is the identity matrix.
Further, the determining the target position according to the transformation matrix and the translation vector includes:
according to the formula fα(vj)=vjMj+TjCalculating target position coordinates of the grid points, wherein fα(vj) Is the target position coordinate of the jth grid point.
Further, the image to be deformed is a face image.
In a second aspect, an embodiment of the present disclosure provides an image dragging deformation apparatus, including: a dragging position determining module, configured to determine the position of a dragging point in an image to be deformed, wherein the dragging point comprises a dragging start point and a dragging end point; a grid processing module, configured to perform gridding processing on the image to be deformed to obtain at least one grid point; a target position determining module, configured to determine a target position of the grid point according to the original position of the grid point, the position of the dragging start point, and the position of the dragging end point; and a dragging deformation module, configured to drag and deform the image to be deformed according to the target position.
Further, the position determination module includes:
a mapping relation determining unit, configured to determine a transformation matrix and a translation vector of the grid point according to an original position of the grid point, a position of the drag start point, and a position of the drag end point;
and the position determining unit is used for determining the target position according to the transformation matrix and the translation vector.
Further, the mapping relationship determining unit is specifically configured to: determining the weight of the grid point and the drag starting point according to the original position of the grid point and the position of the drag starting point; and determining a transformation matrix and a translation vector of the grid points according to the weight, the position of the dragging start point and the position of the dragging end point.
Further, the mapping relationship determining unit is specifically configured to: optimizing the weight; and determining a transformation matrix and a translation vector of the grid point according to the optimized weight, the position of the dragging start point and the position of the dragging end point.
Further, the mapping relationship determining unit is specifically configured to calculate the weight according to the formula
$$w_{ij} = \frac{1}{\left| p_i - v_j \right|^{2\alpha}}$$
wherein |·| denotes the distance between two points, w_ij is the weight of the jth grid point with respect to the ith dragging point, p_i is the start-point coordinate of the ith dragging point, v_j is the original coordinate of the jth grid point, and α is the weight attenuation coefficient.
Further, the mapping relationship determining unit is specifically configured to optimize the weights using the formula
$$w'_{ij} = \gamma \, w_{ij} + \beta$$
wherein γ is the optimization weight coefficient and β is the optimization weight offset.
Further, the mapping relationship determining unit is specifically configured to solve, according to the objective function
$$(M_j, T_j) = \operatorname{argmin}_{M_j, T_j} \sum_i w_{ij} \left| p_i M_j + T_j - q_i \right|^2$$
and the constraint $M_j^{\mathrm{T}} M_j = \lambda^2 I$, to obtain the transformation matrix and the translation vector, wherein argmin takes the arguments that minimize the objective, Σ denotes summation over the dragging points, M_j is the transformation matrix of the jth grid point, T_j is the translation vector of the jth grid point, q_i is the end-point coordinate of the ith dragging point, the superscript T denotes matrix transposition, λ is a scalar, and I is the identity matrix.
Further, the position determining unit is specifically configured to calculate the target position coordinates of the grid points according to the formula f_α(v_j) = v_j M_j + T_j, wherein f_α(v_j) is the target position coordinate of the jth grid point.
Further, the image to be deformed is a face image.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform the image dragging deformation method of any one of the foregoing first aspects.
In a fourth aspect, the present disclosure provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the image dragging deformation method of any one of the foregoing first aspects.
The present disclosure provides an image dragging deformation method and apparatus, an electronic device, and a computer-readable storage medium. The image dragging deformation method comprises the following steps: determining the position of a dragging point in an image to be deformed, wherein the dragging point comprises a dragging start point and a dragging end point; performing gridding processing on the image to be deformed to obtain at least one grid point; determining the target position of each grid point according to the original position of the grid point, the position of the dragging start point, and the position of the dragging end point; and dragging and deforming the image to be deformed according to the target position. By gridding the image to be deformed, determining the target positions of the grid points from their original positions and the positions of the dragging start and end points, and deforming the image accordingly, the method and apparatus solve the technical problems in the prior art that image deformation is computationally complex and images cannot be deformed by dragging.
The foregoing is a summary of the present disclosure, provided to promote a clear understanding of its technical means; the disclosure may be embodied in other specific forms without departing from its spirit or essential attributes.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1a is a flowchart of an image dragging and deforming method provided in an embodiment of the present disclosure;
FIG. 1b is a flowchart of an image dragging and distorting method according to another embodiment of the disclosure;
fig. 2a is a schematic structural diagram of an image dragging and deforming device provided in the embodiment of the present disclosure;
fig. 2b is a schematic structural diagram of an image dragging and deforming apparatus according to another embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
The embodiments of the present disclosure are described below with specific examples, and other advantages and effects of the present disclosure will be readily apparent to those skilled in the art from the disclosure in the specification. It is to be understood that the described embodiments are merely illustrative of some, and not restrictive, of the embodiments of the disclosure. The disclosure may be embodied or carried out in various other specific embodiments, and various modifications and changes may be made in the details within the description without departing from the spirit of the disclosure. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
It is noted that various aspects of the embodiments are described below within the scope of the appended claims. It should be apparent that the aspects described herein may be embodied in a wide variety of forms and that any specific structure and/or function described herein is merely illustrative. Based on the disclosure, one skilled in the art should appreciate that one aspect described herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method practiced using any number of the aspects set forth herein. Additionally, such an apparatus may be implemented and/or such a method may be practiced using other structure and/or functionality in addition to one or more of the aspects set forth herein.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present disclosure, and the drawings only show the components related to the present disclosure rather than the number, shape and size of the components in actual implementation, and the type, amount and ratio of the components in actual implementation may be changed arbitrarily, and the layout of the components may be more complicated.
In addition, in the following description, specific details are provided to facilitate a thorough understanding of the examples. However, it will be understood by those skilled in the art that the aspects may be practiced without these specific details.
Fig. 1a is a flowchart of an image dragging and deforming method according to an embodiment of the present disclosure, where the image dragging and deforming method according to the embodiment may be executed by an image dragging and deforming device, the image dragging and deforming device may be implemented as software or as a combination of software and hardware, and the image dragging and deforming device may be integrally disposed in a certain device in an image dragging and deforming system, such as an image dragging and deforming server or an image dragging and deforming terminal device. As shown in fig. 1a, the method comprises the steps of:
step S1: determining the position of a dragging point in an image to be deformed, wherein the dragging point comprises a dragging starting point and a dragging ending point;
in this embodiment, the image to be deformed is an image before deformation, and may be a human face image or a human face video image. And selecting any pixel point on the image to be deformed as the dragging point. Specifically, the positions of the drag start point and the drag end point may be set by a user.
In an alternative embodiment, step S1 includes:
step S11: determining a transformation matrix and a translation vector of a dragging point according to the position of the dragging point on a template image, the position of a source key point on the template image and the position of a target key point on an image to be deformed;
in this embodiment, the template image is a preset image, including but not limited to a face image, such as a face front image. The dragging point can be a dragging starting point or a dragging ending point, and any pixel point on the template image can be selected as the dragging point. Specifically, a selection can be made on the template image by the user.
In this embodiment, the template image and the image to be deformed are associated images, for example, both are face images, and faces in the images are the same face. The image to be deformed may be a face video image.
The source key points are pixel points on the template image; specifically, they may be all the pixel points on the template image or a selected preset number of pixel points. The target key points are key points on the image to be deformed; specifically, they may be all the pixel points on the image to be deformed or a selected preset number of pixel points. The source key points and the target key points are equal in number and are pixel points having the same features in the two images. The key points can be set by the user or obtained through machine-learning algorithms, and the number of pixel points can be customized, for example, 106. When a selected preset number of pixel points is used, the amount of calculation can be effectively reduced.
Step S12: and determining the position of the dragging point mapped to the image to be deformed according to the transformation matrix and the translation vector.
Since a certain position relationship exists between the pixel points on the template image and the pixel points on the image to be deformed, the position of the dragging point mapped to the image to be deformed can be determined through the position mapping relationship between the source key point and the target key point.
Further, step S11 includes:
s111: determining a transformation matrix and a translation vector of the dragged point according to the position of the dragged point, the position of the source key point and the position of the target key point;
s112: and determining the position of the dragging point mapped to the image to be deformed according to the transformation matrix and the translation vector.
Further, step S111 may be implemented by:
determining the weight of the dragging point and the source key point according to the position of the dragging point and the position of the source key point;
and determining a transformation matrix and a translation vector of the dragging point according to the weight, the position of the source key point and the position of the target key point.
Further, the weight can be calculated according to the formula
$$w_{ij} = \frac{1}{\left| p_i - v_j \right|^{2\alpha}}$$
wherein |·| denotes the distance between two points, w_ij is the weight of the jth dragging point with respect to the ith source key point, p_i is the coordinate of the ith source key point, v_j is the coordinate of the jth dragging point, and α is the weight attenuation coefficient. Alternatively, to achieve smoother image deformation, the weights can be optimized with the formula
$$w'_{ij} = \gamma \, w_{ij} + \beta$$
wherein γ is the optimization weight coefficient and β is the optimization weight offset; the optimized weights are then used to locate the dragging point.
Further, the transformation matrix and the translation vector can be obtained by solving the objective function
$$(M_j, T_j) = \operatorname{argmin}_{M_j, T_j} \sum_i w_{ij} \left| p_i M_j + T_j - q_i \right|^2$$
wherein argmin takes the arguments that minimize the objective, Σ denotes summation over the source key points, M_j is the transformation matrix of the jth dragging point, T_j is the translation vector of the jth dragging point, and q_i is the coordinate of the ith target key point.
Further, the coordinate to which the dragging point is mapped on the image to be deformed can be calculated according to the formula f_α(v_j) = v_j M_j + T_j, wherein f_α(v_j) is the coordinate of the jth dragging point mapped onto the image to be deformed.
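A minimal sketch of this unconstrained (affine) solve, assuming the standard weighted least-squares closed form via centroid-centered normal equations; the row-vector convention matches f_α(v_j) = v_j M_j + T_j, and all names are illustrative:

```python
def solve_affine(p, q, w_j):
    """Closed-form minimizer of sum_i w_ij |p_i M + T - q_i|^2 for one
    point j, in the row-vector convention f(v) = v M + T.

    p: (n, 2) source coordinates, q: (n, 2) target coordinates,
    w_j: (n,) weights of point j. Returns the 2x2 matrix M and 2-vector T.
    """
    wsum = w_j.sum()
    p_star = (w_j[:, None] * p).sum(axis=0) / wsum  # weighted centroid of p
    q_star = (w_j[:, None] * q).sum(axis=0) / wsum  # weighted centroid of q
    p_hat, q_hat = p - p_star, q - q_star
    # Normal equations: (sum_i w_i outer(p_hat_i, p_hat_i)) M
    #                 =  sum_i w_i outer(p_hat_i, q_hat_i)
    A = (w_j[:, None, None] * p_hat[:, :, None] * p_hat[:, None, :]).sum(axis=0)
    B = (w_j[:, None, None] * p_hat[:, :, None] * q_hat[:, None, :]).sum(axis=0)
    M = np.linalg.solve(A, B)
    T = q_star - p_star @ M  # translation recovered from the centroids
    return M, T
```

The mapped coordinate of the jth point is then v[j] @ M + T, matching f_α(v_j) = v_j M_j + T_j.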
Step S2: carrying out gridding processing on the image to be deformed to obtain at least one grid point;
step S3: determining the target position of the grid point according to the original position of the grid point, the position of the dragging start point and the position of the dragging end point;
step S4: and dragging and deforming the image to be deformed according to the target position.
Specifically, the target positions of the grid points can be used to calculate the target position of each pixel point in the image to be deformed through an interpolation algorithm, and each pixel point is moved from the original position to the target position to complete the deformation of the image.
In this embodiment, the deformation speed can be increased by performing the gridding processing on the image to be deformed and performing the deformation processing only on the pixel points corresponding to the grid points.
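As a sketch of this gridding-plus-interpolation step, the following assumes OpenCV is available and approximates the forward grid displacement with a negated backward map for cv2.remap; the approximation and all names are assumptions of the sketch, not the patent's prescribed interpolation:

```python
import cv2  # OpenCV, assumed available

def warp_with_grid(img, grid_src, grid_dst):
    """Warp img given grid-point original positions grid_src and target
    positions grid_dst, both (gh, gw, 2) arrays of (x, y) pixel coords.
    """
    h, w = img.shape[:2]
    # Per-grid-point displacement, bilinearly upsampled to every pixel:
    # the "interpolation algorithm" step of the text.
    disp = (grid_dst - grid_src).astype(np.float32)
    disp_full = cv2.resize(disp, (w, h), interpolation=cv2.INTER_LINEAR)
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # cv2.remap samples backwards (dst pixel <- src coordinate), so the
    # forward displacement is negated here as a first-order approximation.
    map_x = xs - disp_full[..., 0]
    map_y = ys - disp_full[..., 1]
    return cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```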
In an alternative embodiment, as shown in fig. 1b, step S3 includes:
step S31: determining the weight of the grid point and the drag starting point according to the original position of the grid point and the position of the drag starting point;
step S32: and determining a transformation matrix and a translation vector of the grid points according to the weight, the position of the dragging start point and the position of the dragging end point.
Further, step S32 includes:
optimizing the weight;
and determining a transformation matrix and a translation vector of the grid point according to the optimized weight, the position of the dragging start point and the position of the dragging end point.
Further, the weight can be calculated according to the formula
$$w_{ij} = \frac{1}{\left| p_i - v_j \right|^{2\alpha}}$$
wherein |·| denotes the distance between two points, w_ij is the weight of the jth grid point with respect to the ith dragging point, p_i is the start-point coordinate of the ith dragging point, v_j is the original coordinate of the jth grid point, and α is the weight attenuation coefficient.
Further, the weights can be optimized using the formula
$$w'_{ij} = \gamma \, w_{ij} + \beta$$
wherein γ is the optimization weight coefficient and β is the optimization weight offset.
Further, the transformation matrix and the translation vector can be obtained by solving the objective function
$$(M_j, T_j) = \operatorname{argmin}_{M_j, T_j} \sum_i w_{ij} \left| p_i M_j + T_j - q_i \right|^2$$
subject to the constraint $M_j^{\mathrm{T}} M_j = \lambda^2 I$, wherein argmin takes the arguments that minimize the objective, Σ denotes summation over the dragging points, M_j is the transformation matrix of the jth grid point, T_j is the translation vector of the jth grid point, q_i is the end-point coordinate of the ith dragging point, the superscript T denotes matrix transposition, λ is a scalar, and I is the identity matrix.
Further, the target position coordinates of the grid points can be calculated according to the formula f_α(v_j) = v_j M_j + T_j, wherein f_α(v_j) is the target position coordinate of the jth grid point.
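The constraint M_j^T M_j = λ²I restricts M_j to a scaled rotation, for which the objective has a closed form. A sketch under the assumption that reflections are excluded, so M_j = [[a, b], [-b, a]]:

```python
def solve_similarity(p, q, w_j):
    """Minimizer of sum_i w_ij |p_i M + T - q_i|^2 subject to
    M^T M = lambda^2 I, assuming M is a scaled rotation
    [[a, b], [-b, a]] (reflections excluded).
    """
    wsum = w_j.sum()
    p_star = (w_j[:, None] * p).sum(axis=0) / wsum
    q_star = (w_j[:, None] * q).sum(axis=0) / wsum
    (px, py), (qx, qy) = (p - p_star).T, (q - q_star).T
    mu = (w_j * (px ** 2 + py ** 2)).sum()
    a = (w_j * (px * qx + py * qy)).sum() / mu  # lambda * cos(theta)
    b = (w_j * (px * qy - py * qx)).sum() / mu  # lambda * sin(theta)
    M = np.array([[a, b], [-b, a]])  # M^T M = (a^2 + b^2) I = lambda^2 I
    T = q_star - p_star @ M
    return M, T
```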
In this embodiment, by gridding the image to be deformed, determining the target positions of the grid points from their original positions and the positions of the dragging start and end points, and dragging and deforming the image according to those target positions, the technical problems in the prior art that image deformation is computationally complex and images cannot be deformed by dragging are solved.
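Putting the sketches above together, a hypothetical end-to-end run with two drag points (one moved, one pinned) and a 16x16 control grid might look like this; all sizes and coordinates are made up for illustration:

```python
# Stand-in image and two drag points: one dragged, one pinned in place.
img = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)
p = np.array([[120.0, 140.0], [400.0, 300.0]])  # drag start points
q = np.array([[150.0, 160.0], [400.0, 300.0]])  # drag end points

# Gridding: a 16x16 lattice of control points over the image.
gx, gy = np.meshgrid(np.linspace(0, 639, 16), np.linspace(0, 479, 16))
grid_src = np.stack([gx, gy], axis=-1)          # (16, 16, 2)
v = grid_src.reshape(-1, 2)

w = optimize_weights(mls_weights(p, v, alpha=1.0))  # (2, 256) weights
grid_dst = np.empty_like(v)
for j in range(len(v)):
    M, T = solve_similarity(p, q, w[:, j])
    grid_dst[j] = v[j] @ M + T                  # target position f_alpha(v_j)

warped = warp_with_grid(img, grid_src, grid_dst.reshape(grid_src.shape))
```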
Fig. 2a is a schematic structural diagram of an image dragging deformation apparatus provided in an embodiment of the present disclosure. The apparatus may be implemented as software or as a combination of software and hardware, and may be integrated in a device in an image dragging deformation system, such as an image dragging deformation server or terminal device. As shown in fig. 2a, the apparatus comprises: a drag position determination module 21, a mesh processing module 22, a target position determination module 23, and a drag deformation module 24. Wherein:
the dragging position determining module 21 is configured to determine a position of a dragging point in the image to be deformed, where the dragging point includes a dragging start point and a dragging end point;
the grid processing module 22 is configured to perform a grid processing on the image to be deformed to obtain at least one grid point;
the target position determining module 23 is configured to determine a target position of the grid point according to the original position of the grid point, the position of the drag start point, and the position of the drag end point;
the dragging deformation module 24 is configured to drag and deform the image to be deformed according to the target position.
In this embodiment, the image to be deformed is an image before deformation, and may be a human face image or a human face video image. And selecting any pixel point on the image to be deformed as the dragging point. Specifically, the positions of the drag start point and the drag end point may be set by a user.
In an alternative embodiment, the drag position determination module 21 includes a mapping relation determination submodule 211 and a drag point positioning submodule 212. Wherein:
the mapping relation determining submodule 211 is configured to determine a transformation matrix and a translation vector of a dragging point according to a position of the dragging point on a template image, a position of a source key point on the template image, and a position of a target key point on an image to be deformed;
the dragging point positioning submodule 212 is configured to determine, according to the transformation matrix and the translation vector, a position where the dragging point is mapped onto the image to be deformed.
In this embodiment, the template image is a preset image, including but not limited to a face image, such as a face front image. The dragging point can be a dragging starting point or a dragging ending point, and any pixel point on the template image can be selected as the dragging point. Specifically, a selection can be made on the template image by the user.
In this embodiment, the template image and the image to be deformed are associated images, for example, both are face images, and faces in the images are the same face. The image to be deformed may be a face video image.
The source key points are pixel points on the template image; specifically, they may be all the pixel points on the template image or a selected preset number of pixel points. The target key points are key points on the image to be deformed; specifically, they may be all the pixel points on the image to be deformed or a selected preset number of pixel points. The source key points and the target key points are equal in number and are pixel points having the same features in the two images. The key points can be set by the user or obtained through machine-learning algorithms, and the number of pixel points can be customized, for example, 106. When a selected preset number of pixel points is used, the amount of calculation can be effectively reduced.
Since a certain position relationship exists between the pixel points on the template image and the pixel points on the image to be deformed, the position of the dragging point mapped to the image to be deformed can be determined through the position mapping relationship between the source key point and the target key point.
Further, the mapping relation determining sub-module 211 includes: a weight determination unit 2111 and a mapping relationship determination unit 2112. Wherein:
the weight determining unit 2111 is configured to determine the weight of the dragged point and the source key point according to the position of the dragged point and the position of the source key point;
the mapping relation determining unit 2112 is configured to determine a transformation matrix and a translation vector of the dragging point according to the weight, the position of the source key point, and the position of the target key point.
Further, the weight determination unit 2111 is specifically configured to determine the weight of the dragging point and the source key point according to the position of the dragging point and the position of the source key point, and the mapping relationship determination unit 2112 is configured to determine a transformation matrix and a translation vector of the dragging point according to the weight, the position of the source key point, and the position of the target key point.
Further, the mapping relationship determining unit 2112 is specifically configured to calculate the weight according to the formula
$$w_{ij} = \frac{1}{\left| p_i - v_j \right|^{2\alpha}}$$
wherein |·| denotes the distance between two points, w_ij is the weight of the jth dragging point with respect to the ith source key point, p_i is the coordinate of the ith source key point, v_j is the coordinate of the jth dragging point, and α is the weight attenuation coefficient.
Further, the mapping relationship determining unit 2112 is specifically configured to solve the objective function
$$(M_j, T_j) = \operatorname{argmin}_{M_j, T_j} \sum_i w_{ij} \left| p_i M_j + T_j - q_i \right|^2$$
to obtain the transformation matrix and the translation vector, wherein argmin takes the arguments that minimize the objective, Σ denotes summation over the source key points, M_j is the transformation matrix of the jth dragging point, T_j is the translation vector of the jth dragging point, and q_i is the coordinate of the ith target key point.
Further, the drag point positioning sub-module 212 is specifically configured to calculate, according to the formula f_α(v_j) = v_j M_j + T_j, the coordinate to which the dragging point is mapped on the image to be deformed, wherein f_α(v_j) is the coordinate of the jth dragging point mapped onto the image to be deformed.
In an alternative embodiment, as shown in fig. 2b, the target position determining module 23 comprises: a mapping relation determination unit 231 and a position determination unit 232. Wherein:
the mapping relation determining unit 231 is configured to determine a transformation matrix and a translation vector of the grid point according to an original position of the grid point, a position of the drag start point, and a position of the drag end point;
the position determining unit 232 is configured to determine the target position according to the transformation matrix and the translation vector.
Further, the mapping relationship determining unit 231 is specifically configured to: determining the weight of the grid point and the drag starting point according to the original position of the grid point and the position of the drag starting point; and determining a transformation matrix and a translation vector of the grid points according to the weight, the position of the dragging start point and the position of the dragging end point.
Further, the mapping relationship determining unit 231 is specifically configured to: optimizing the weight; and determining a transformation matrix and a translation vector of the grid point according to the optimized weight, the position of the dragging start point and the position of the dragging end point.
Further, the mapping relationship determining unit 231 is specifically configured to calculate the weight according to the formula
$$w_{ij} = \frac{1}{\left| p_i - v_j \right|^{2\alpha}}$$
wherein |·| denotes the distance between two points, w_ij is the weight of the jth grid point with respect to the ith dragging point, p_i is the start-point coordinate of the ith dragging point, v_j is the original coordinate of the jth grid point, and α is the weight attenuation coefficient.
Further, the mapping relationship determining unit 231 is specifically configured to optimize the weights using the formula
$$w'_{ij} = \gamma \, w_{ij} + \beta$$
wherein γ is the optimization weight coefficient and β is the optimization weight offset.
Further, the mapping relationship determining unit 231 is specifically configured to solve the objective function
$$(M_j, T_j) = \operatorname{argmin}_{M_j, T_j} \sum_i w_{ij} \left| p_i M_j + T_j - q_i \right|^2$$
subject to the constraint $M_j^{\mathrm{T}} M_j = \lambda^2 I$, to obtain the transformation matrix and the translation vector, wherein argmin takes the arguments that minimize the objective, Σ denotes summation over the dragging points, M_j is the transformation matrix of the jth grid point, T_j is the translation vector of the jth grid point, q_i is the end-point coordinate of the ith dragging point, the superscript T denotes matrix transposition, λ is a scalar, and I is the identity matrix.
Further, the position determining unit 232 is specifically configured to calculate the target position coordinates of the grid points according to the formula f_α(v_j) = v_j M_j + T_j, wherein f_α(v_j) is the target position coordinate of the jth grid point.
For detailed descriptions of the working principle, the realized technical effect, and the like of the embodiment of the image dragging and deforming device, reference may be made to the related descriptions in the foregoing embodiment of the image dragging and deforming method, and further description is omitted here.
Referring now to FIG. 3, shown is a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 302 or a program loaded from a storage device 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the electronic device are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
Generally, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touch pad, keyboard, mouse, image sensor, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, etc.; and a communication device 309. The communication means 309 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 309, or installed from the storage means 308, or installed from the ROM 302. The computer program, when executed by the processing device 301, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: determine the position of a dragging point in an image to be deformed, wherein the dragging point comprises a dragging start point and a dragging end point; and drag and deform the image to be deformed according to the position of the dragging start point and the position of the dragging end point.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation of the unit itself; for example, the position determination module may also be described as "a module for determining the position of the dragging point in the image to be deformed".
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to the particular combination of features described above, but also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by interchanging the above features with (but not limited to) features having similar functions disclosed in this disclosure.

Claims (9)

1. An image dragging deformation method, characterized by comprising the following steps:
determining the position of a dragging point in an image to be deformed, wherein the dragging point comprises a dragging starting point and a dragging ending point;
carrying out gridding processing on the image to be deformed to obtain at least one grid point;
calculating, according to the formula
$$w_{ij} = \frac{1}{\left| p_i - v_j \right|^{2\alpha}}$$
the weight of the grid point with respect to the dragging start point, wherein |·| denotes the distance between two points, w_ij is the weight of the jth grid point with respect to the ith dragging point, p_i is the start-point coordinate of the ith dragging point, v_j is the original coordinate of the jth grid point, and α is the weight attenuation coefficient;
determining a transformation matrix and a translation vector of the grid point according to the weight, the position of the dragging start point and the position of the dragging end point;
determining a target position according to the transformation matrix and the translation vector;
and dragging and deforming the image to be deformed according to the target position.
2. The image dragging deformation method according to claim 1, wherein the determining a transformation matrix and a translation vector of the grid points according to the weight, the position of the dragging start point, and the position of the dragging end point comprises:
optimizing the weight;
and determining a transformation matrix and a translation vector of the grid point according to the optimized weight, the position of the dragging start point and the position of the dragging end point.
3. The image dragging deformation method according to claim 2, wherein the optimizing the weight comprises:
optimizing the weights using the formula
$$w'_{ij} = \gamma \, w_{ij} + \beta$$
wherein γ is the optimization weight coefficient and β is the optimization weight offset.
4. The image dragging deformation method according to claim 3, wherein the determining a transformation matrix and a translation vector of the grid points according to the weight, the position of the dragging start point, and the position of the dragging end point comprises:
solving, according to the objective function
$$(M_j, T_j) = \operatorname{argmin}_{M_j, T_j} \sum_i w_{ij} \left| p_i M_j + T_j - q_i \right|^2$$
and the constraint $M_j^{\mathrm{T}} M_j = \lambda^2 I$, to obtain the transformation matrix and the translation vector, wherein argmin takes the arguments that minimize the objective, Σ denotes summation, M_j is the transformation matrix of the jth grid point, T_j is the translation vector of the jth grid point, q_i is the end-point coordinate of the ith dragging point, the superscript T denotes matrix transposition, λ is a scalar, and I is the identity matrix.
5. The image dragging deformation method according to claim 4, wherein the determining a target position according to the transformation matrix and the translation vector comprises:
calculating the target position coordinates of the grid points according to the formula f_α(v_j) = v_j M_j + T_j, wherein f_α(v_j) is the target position coordinate of the jth grid point.
6. The image dragging deformation method according to any one of claims 1-5, wherein the image to be deformed is a human face image.
7. An image dragging deformation apparatus, comprising:
the position determining module is used for determining the position of a dragging point in the image to be deformed, wherein the dragging point comprises a dragging starting point and a dragging ending point;
the grid processing module is used for carrying out grid processing on the image to be deformed to obtain at least one grid point;
a position determination module, configured to: calculate, according to the formula
$$w_{ij} = \frac{1}{\left| p_i - v_j \right|^{2\alpha}}$$
the weight of the grid point with respect to the dragging start point, wherein |·| denotes the distance between two points, w_ij is the weight of the jth grid point with respect to the ith dragging point, p_i is the start-point coordinate of the ith dragging point, v_j is the original coordinate of the jth grid point, and α is the weight attenuation coefficient; determine a transformation matrix and a translation vector of the grid point according to the weight, the position of the dragging start point, and the position of the dragging end point; and determine a target position according to the transformation matrix and the translation vector;
and the dragging deformation module is used for dragging and deforming the image to be deformed according to the target position.
8. An electronic device, comprising:
a memory for storing non-transitory computer readable instructions; and
a processor for executing the computer-readable instructions, such that the processor, when executing them, implements the image dragging deformation method according to any one of claims 1-6.
9. A computer-readable storage medium storing non-transitory computer-readable instructions which, when executed by a computer, cause the computer to perform the image dragging deformation method according to any one of claims 1-6.
CN201910101346.3A 2019-01-31 2019-01-31 Image dragging deformation method and device Active CN110069195B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910101346.3A CN110069195B (en) 2019-01-31 2019-01-31 Image dragging deformation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910101346.3A CN110069195B (en) 2019-01-31 2019-01-31 Image dragging deformation method and device

Publications (2)

Publication Number Publication Date
CN110069195A CN110069195A (en) 2019-07-30
CN110069195B (en) 2020-06-30

Family

ID=67366120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910101346.3A Active CN110069195B (en) 2019-01-31 2019-01-31 Image dragging deformation method and device

Country Status (1)

Country Link
CN (1) CN110069195B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112200716B (en) * 2020-10-15 2022-03-22 广州博冠信息科技有限公司 Image processing method, device, electronic equipment and nonvolatile storage medium
CN112258653A (en) * 2020-10-28 2021-01-22 北京字跳网络技术有限公司 Rendering method, device and equipment of elastic object and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104143204A (en) * 2014-07-02 2014-11-12 浙江工商大学 Moving least square two-dimensional character deformation method considering topological structure
CN104978707A (en) * 2014-04-03 2015-10-14 陈鹏飞 Image deformation technique based on contour
CN105184735A (en) * 2014-06-19 2015-12-23 腾讯科技(深圳)有限公司 Portrait deformation method and device
CN108830787A (en) * 2018-06-20 2018-11-16 北京微播视界科技有限公司 The method, apparatus and electronic equipment of anamorphose
CN109242765A (en) * 2018-08-31 2019-01-18 腾讯科技(深圳)有限公司 A kind of face image processing process, device and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020154143A1 (en) * 2001-04-06 2002-10-24 Christopher Maier Method of using wood to render images onto surfaces
US8970740B2 (en) * 2012-12-14 2015-03-03 In View Technology Corporation Overlap patterns and image stitching for multiple-detector compressive-sensing camera
CN107798713B (en) * 2017-09-04 2021-04-09 昆明理工大学 Image deformation method for two-dimensional virtual fitting

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104978707A (en) * 2014-04-03 2015-10-14 陈鹏飞 Image deformation technique based on contour
CN105184735A (en) * 2014-06-19 2015-12-23 腾讯科技(深圳)有限公司 Portrait deformation method and device
CN104143204A (en) * 2014-07-02 2014-11-12 浙江工商大学 Moving least square two-dimensional character deformation method considering topological structure
CN108830787A (en) * 2018-06-20 2018-11-16 北京微播视界科技有限公司 The method, apparatus and electronic equipment of anamorphose
CN109242765A (en) * 2018-08-31 2019-01-18 腾讯科技(深圳)有限公司 A kind of face image processing process, device and storage medium

Also Published As

Publication number Publication date
CN110069195A (en) 2019-07-30

Similar Documents

Publication Publication Date Title
CN110069191B (en) Terminal-based image dragging deformation implementation method and device
CN109981787B (en) Method and device for displaying information
CN110288705B (en) Method and device for generating three-dimensional model
CN109754464B (en) Method and apparatus for generating information
CN110069195B (en) Image dragging deformation method and device
CN111459364A (en) Icon updating method and device and electronic equipment
CN110211017B (en) Image processing method and device and electronic equipment
CN110555861B (en) Optical flow calculation method and device and electronic equipment
CN110288625B (en) Method and apparatus for processing image
CN110288532B (en) Method, apparatus, device and computer readable storage medium for generating whole body image
CN111652675A (en) Display method and device and electronic equipment
CN113129366B (en) Monocular SLAM initialization method and device and electronic equipment
CN111368668B (en) Three-dimensional hand recognition method and device, electronic equipment and storage medium
CN110070479B (en) Method and device for positioning image deformation dragging point
CN111325668B (en) Training method and device for image processing deep learning model and electronic equipment
CN110288523B (en) Image generation method and device
CN110060324B (en) Image rendering method and device and electronic equipment
CN114234984B (en) Indoor positioning track smoothing method, system and equipment based on difference matrix
CN110222777B (en) Image feature processing method and device, electronic equipment and storage medium
CN110619615A (en) Method and apparatus for processing image
CN111680754B (en) Image classification method, device, electronic equipment and computer readable storage medium
CN114049403A (en) Multi-angle three-dimensional face reconstruction method and device and storage medium
CN111311665B (en) Video processing method and device and electronic equipment
CN112395826B (en) Text special effect processing method and device
CN110929571B (en) Handwriting fitting method, handwriting fitting device, handwriting fitting medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Tiktok vision (Beijing) Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Douyin Vision Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: Tiktok vision (Beijing) Co.,Ltd.