CN110705346B - Large-scale human face deformation method - Google Patents


Info

Publication number: CN110705346B
Authority: CN (China)
Application number: CN201910776710.6A
Filing date: 2019-08-22
Priority date: 2019-08-22
Publication dates: CN110705346A (application) 2020-01-17; CN110705346B (grant) 2022-08-05
Legal status: Active (granted)
Inventors: 熊永春, 鲍宏鑫, 张金矿
Applicant/Assignee: Hangzhou Xiaoying Innovation Technology Co., Ltd.
Other languages: Chinese (zh)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168 Feature extraction; Face representation
    • G06V 40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships

Abstract

The invention discloses a large-scale face deformation method, which comprises the following steps: obtaining a plurality of face feature points with a face feature point recognition algorithm; calculating forehead feature points from the eyebrow feature points and cheek feature points from the nose-wing and face contour points, thereby enlarging the deformable region of the face; selecting n control vertices from all the feature points, where 3 ≤ n ≤ the number of feature points; computing all deformed face feature points from the deformation constraints and control vertices by the moving least squares (MLS) method, so that the face remains natural after deformation; applying B-spline interpolation to the feature points before and after deformation, which increases feature point density and improves the smoothness of the face edge after large-scale deformation; triangulating the interpolated, undeformed feature points to obtain the vertex indices of all triangles; and converting the undeformed feature point positions into texture coordinates and the deformed feature point positions into vertex positions, then rendering through OpenGL to obtain the final large-scale face deformation effect.

Description

Large-scale human face deformation method
Technical Field
The invention belongs to the technical field of graphic processing, and particularly relates to a large-scale face deformation method.
Background
Research on face image deformation techniques and algorithms has long been an important topic in the image processing field. Face image deformation algorithms are complex; their core is to generate a deformation mapping function under given constraint conditions and to produce a smooth, natural deformation effect by manipulating control feature points.
Nowadays people pay increasing attention to the details of face image deformation, yet existing deformation techniques focus mainly on small-scale deformation of local areas, such as eye enlargement and face thinning. When large-scale face deformation is involved, problems such as jagged deformation edges and an unnatural deformed face arise, making the overall processed image look unnatural. In addition, the feature points produced by existing mature face feature point recognition algorithms only cover the part of the face below the eyebrows, so feature-point-based deformation can only act below the eyebrows and the forehead cannot be deformed.
Disclosure of Invention
In view of these technical problems, the invention provides a large-scale face deformation method, mainly solving the prior-art problems of unnatural deformed regions and unsmooth face edges after large-scale face deformation.
In order to solve the technical problems, the invention adopts the following technical scheme:
a large-scale human face deformation method comprises the following steps:
S10, obtaining a plurality of face feature points with an existing face feature point recognition algorithm;
S20, calculating the forehead feature points from the eyebrow feature points and the size of the current face frame, using the formulas:
P_head.x = P_eyebrow.x
P_head.y = P_eyebrow.y - α·H_face
where α is the gain factor and H_face is the current face frame height; P_head.x, P_head.y are the x- and y-axis coordinates of the computed forehead feature point; P_eyebrow.x, P_eyebrow.y are the x- and y-axis coordinates of the corresponding eyebrow feature point;
calculating the cheek feature points from the nose-wing feature points and the face contour, using the formula:
P_face.xy = (P_profile.xy + P_nose.xy) / 2.0
where P_face.xy is the coordinate of the computed cheek feature point, P_profile.xy is the coordinate of the corresponding face contour point, and P_nose.xy is the coordinate of the corresponding nose-wing feature point;
these additional feature points cover the regions that the current face recognition algorithm cannot mark, thereby enlarging the deformable region of the face;
S30, selecting n points from all of the above feature points as control vertices, where 3 ≤ n ≤ the number of feature points; if the expected mode is symmetric face deformation, only the feature points on one side need to be selected, and their symmetric counterparts automatically become control vertices; in the asymmetric deformation mode, the control points are exactly the selected feature points;
S40, computing the coordinates of all deformed face feature points by the moving least squares (MLS) method from the given deformation constraints and control vertices; the feature points obtained by this algorithm make the face look natural after deformation. The MLS expression is:
f_v(x) = (x - p*)·M + q*
where x is the coordinate of the current point v, f_v(x) is its deformed coordinate, p* and q* are the weighted centroids of the control points before and after deformation respectively, and M is an affine transformation matrix;
S50, interpolating the feature points before and after deformation with B-spline interpolation, which increases feature point density, reduces the sharp corners produced by deformation (especially stretching and enlarging), and improves the smoothness of the face edge after large-scale deformation;
S60, triangulating the interpolated, undeformed feature points to obtain the vertex indices of all triangles;
S70, converting the undeformed feature point positions into texture coordinates:
UV_x = P_x / width
UV_y = P_y / height
where UV_x, UV_y are the texture x- and y-axis coordinates; P_x, P_y are the x- and y-axis coordinates of the feature point before deformation; width and height are the texture width and height;
and converting the deformed feature point positions into vertex positions; the published text renders these formulas only as equation images, and the standard pixel-to-normalized-device-coordinate mapping consistent with the texture formulas above is:
Pos_x = 2·P'_x / width - 1
Pos_y = 1 - 2·P'_y / height
where Pos_x, Pos_y are the vertex x- and y-axis coordinates, P'_x, P'_y are the x- and y-axis coordinates of the deformed feature point, and width and height are the texture width and height;
finally, the vertex indices obtained by triangulation are added, and rendering through OpenGL produces the final large-scale face deformation effect.
Preferably, the B-spline interpolation expression, rendered only as an equation image in the published text, is the standard B-spline curve:
P(u) = Σ_{i=0}^{n} d_i · N_{i,k}(u)
where d_i (i = 0, 1, …, n) are the control vertices (feature point coordinates) and N_{i,k} (i = 0, 1, …, n) are the k-th order normalized B-spline basis functions, with highest order k.
Preferably, the deformation constraints include the deformation strength, whether the deformation is symmetric, the deformation amount, and the deformation point indices.
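The constraint bundle named above can be modeled as a small structure; the field names and types below are illustrative assumptions, since the patent only lists the constraint categories:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DeformConstraint:
    strength: float                     # deformation strength
    symmetric: bool                     # whether to deform symmetrically
    amount: float                       # deformation amount
    point_indices: List[int] = field(default_factory=list)  # deformation point indices

# Example: a symmetric deformation acting on two selected control points.
c = DeformConstraint(strength=0.5, symmetric=True, amount=1.0, point_indices=[3, 7])
```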
The invention has the following beneficial effects:
(1) The forehead and cheek feature points are computed from the existing face feature points, ensuring that every region of the face is marked by feature points and making deformation of the forehead possible;
(2) All deformed feature points are computed with the moving least squares (MLS) method combined with the deformation constraints, so that the deformed face looks natural;
(3) The original and deformed face feature points are each interpolated with B-spline interpolation, which removes the sharp edges caused by large-scale deformation and improves the smoothness of the deformed face edge.
Drawings
Fig. 1 is a flowchart illustrating steps of a large-scale face morphing method according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a large-scale face deformation method disclosed by the embodiment of the invention is shown, and includes the following steps:
S10, obtaining a plurality of face feature points with an existing face feature point recognition algorithm;
S20, calculating the forehead feature points from the eyebrow feature points and the size of the current face frame, using the formulas:
P_head.x = P_eyebrow.x
P_head.y = P_eyebrow.y - α·H_face
where α is the gain factor and H_face is the current face frame height; P_head.x, P_head.y are the x- and y-axis coordinates of the computed forehead feature point; P_eyebrow.x, P_eyebrow.y are the x- and y-axis coordinates of the corresponding eyebrow feature point.
Calculating the cheek feature points from the nose-wing feature points and the face contour, using the formula:
P_face.xy = (P_profile.xy + P_nose.xy) / 2.0
where P_face.xy is the coordinate of the computed cheek feature point, P_profile.xy is the coordinate of the corresponding face contour point, and P_nose.xy is the coordinate of the corresponding nose-wing feature point.
These additional feature points cover the regions that the current face recognition algorithm cannot mark, thereby enlarging the deformable region of the face;
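The forehead and cheek constructions of step S20 can be sketched as follows; the concrete value of the gain factor α is an assumed choice, since the patent leaves it open, and image coordinates with the y axis pointing down are assumed (so subtracting α·H_face moves a point above the eyebrow):

```python
import numpy as np

ALPHA = 0.3  # gain factor alpha; the patent does not fix its value, 0.3 is assumed

def forehead_point(p_eyebrow, face_height):
    """P_head: shift an eyebrow landmark straight up by alpha * face-frame height."""
    return np.array([p_eyebrow[0], p_eyebrow[1] - ALPHA * face_height])

def cheek_point(p_profile, p_nose):
    """P_face: midpoint of a face-contour point and a nose-wing point."""
    return (np.asarray(p_profile, dtype=float) + np.asarray(p_nose, dtype=float)) / 2.0
```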
S30, selecting n points from all of the above feature points as control vertices, where 3 ≤ n ≤ the number of feature points; if the expected mode is symmetric face deformation, only the feature points on one side need to be selected, and their symmetric counterparts automatically become control vertices; in the asymmetric deformation mode, the control points are exactly the selected feature points;
S40, computing the coordinates of all deformed face feature points by the moving least squares (MLS) method from the given deformation constraints (including deformation strength, whether the deformation is symmetric, deformation amount, and deformation point indices) and the control vertices; the feature points obtained by this algorithm make the face look natural after deformation. The MLS expression is:
f_v(x) = (x - p*)·M + q*
where x is the coordinate of the current point v, f_v(x) is its deformed coordinate, p* and q* are the weighted centroids of the control points before and after deformation respectively, and M is an affine transformation matrix.
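The expression f_v(x) = (x - p*)·M + q* matches the affine variant of moving least squares image deformation; a minimal sketch is below. The inverse-distance weighting exponent and the epsilon regularizer are assumed parameters, not taken from the patent:

```python
import numpy as np

def mls_affine(v, p, q, alpha=1.0, eps=1e-8):
    """Affine moving-least-squares deformation of a single point v.

    p, q : (n, 2) arrays of control points before / after deformation (n >= 3,
    not all collinear).  Returns the deformed position f_v(v)."""
    v = np.asarray(v, dtype=float)
    d2 = np.sum((p - v) ** 2, axis=1) + eps      # squared distances to controls
    w = 1.0 / d2 ** alpha                         # MLS weights
    p_star = w @ p / w.sum()                      # weighted centroid before deformation
    q_star = w @ q / w.sum()                      # weighted centroid after deformation
    ph, qh = p - p_star, q - q_star               # centred control points
    A = (ph * w[:, None]).T @ ph                  # 2x2 weighted moment matrix
    B = (ph * w[:, None]).T @ qh
    M = np.linalg.solve(A, B)                     # affine matrix minimizing the MLS error
    return (v - p_star) @ M + q_star              # f_v(x) = (x - p*) M + q*
```

With identical control points before and after (q = p), the map reduces to the identity; with a pure translation of the controls, every point is translated by the same amount, which is the expected MLS behavior.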
S50, interpolating the feature points before and after deformation with B-spline interpolation, which increases feature point density, reduces the sharp corners produced by deformation (especially stretching and enlarging), and improves the smoothness of the face edge after large-scale deformation. The B-spline interpolation expression, rendered only as an equation image in the published text, is the standard B-spline curve:
P(u) = Σ_{i=0}^{n} d_i · N_{i,k}(u)
where d_i (i = 0, 1, …, n) are the control vertices (feature point coordinates) and N_{i,k} (i = 0, 1, …, n) are the k-th order normalized B-spline basis functions, with highest order k.
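The densification step can be sketched with the Cox-de Boor recursion for N_{i,k}; treating the feature points as control vertices d_i on a clamped uniform knot vector is one common reading of the patent's expression, and the sample count is an assumed parameter:

```python
import numpy as np

def bspline_basis(i, k, u, knots):
    """Cox-de Boor recursion for the normalized B-spline basis N_{i,k}(u)."""
    if k == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    den = knots[i + k] - knots[i]
    if den > 0:
        left = (u - knots[i]) / den * bspline_basis(i, k - 1, u, knots)
    den = knots[i + k + 1] - knots[i + 1]
    if den > 0:
        right = (knots[i + k + 1] - u) / den * bspline_basis(i + 1, k - 1, u, knots)
    return left + right

def densify(points, k=3, samples=100):
    """Evaluate P(u) = sum_i d_i N_{i,k}(u) on a clamped uniform knot vector,
    turning a sparse feature-point polyline into a dense, smooth one."""
    d = np.asarray(points, dtype=float)
    n = len(d) - 1                                # n + 1 control vertices
    knots = np.concatenate([np.zeros(k), np.linspace(0.0, 1.0, n - k + 2), np.ones(k)])
    us = np.linspace(0.0, 1.0 - 1e-9, samples)    # stay inside the half-open last span
    return np.array([sum(bspline_basis(i, k, u, knots) * d[i] for i in range(n + 1))
                     for u in us])
```

Because the knot vector is clamped, the densified curve starts at the first feature point and ends (up to numerical precision) at the last one, which keeps the face contour endpoints fixed.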
S60, triangulating the interpolated, undeformed feature points to obtain the vertex indices of all triangles;
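Step S60 does not name a triangulation routine; a common choice, assumed here, is Delaunay triangulation from SciPy. Triangulating the undeformed points once yields vertex indices that can be reused for the deformed point set, since both sets share the same ordering:

```python
import numpy as np
from scipy.spatial import Delaunay

# Toy stand-in for the interpolated, undeformed feature points: a unit
# square plus its centre.
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])

tri = Delaunay(points)      # triangulate once, before deformation
indices = tri.simplices     # (m, 3) array of triangle vertex indices, reused afterwards
```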
S70, converting the undeformed feature point positions into texture coordinates:
UV_x = P_x / width
UV_y = P_y / height
where UV_x, UV_y are the texture x- and y-axis coordinates; P_x, P_y are the x- and y-axis coordinates of the feature point before deformation; width and height are the texture width and height.
The deformed feature point positions are converted into vertex positions; the published text renders these formulas only as equation images, and the standard pixel-to-normalized-device-coordinate mapping consistent with the texture formulas above is:
Pos_x = 2·P'_x / width - 1
Pos_y = 1 - 2·P'_y / height
where Pos_x, Pos_y are the vertex x- and y-axis coordinates, P'_x, P'_y are the x- and y-axis coordinates of the deformed feature point, and width and height are the texture width and height.
Finally, the vertex indices obtained by triangulation are added, and rendering through OpenGL produces the final large-scale face deformation effect.
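The two coordinate conversions in S70 can be sketched as below. Since the vertex formulas appear only as equation images in the published text, the standard pixel-to-NDC mapping with an assumed y-axis flip stands in for them:

```python
import numpy as np

def to_uv(p, width, height):
    """Pixel position of an undeformed feature point -> OpenGL texture coordinate."""
    return np.array([p[0] / width, p[1] / height])

def to_ndc(p, width, height):
    """Pixel position of a deformed feature point -> clip-space vertex position.

    Assumption: image y grows downward while NDC y grows upward, hence the flip;
    the patent's exact formula is not recoverable from the published images."""
    return np.array([2.0 * p[0] / width - 1.0, 1.0 - 2.0 * p[1] / height])
```

The image centre then maps to UV (0.5, 0.5) and vertex (0, 0), and the top-left pixel to vertex (-1, 1), which is what a full-screen textured quad in OpenGL expects.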
Through the above technical scheme, the embodiment of the invention computes forehead and cheek feature points from the existing face feature points so that every region of the face is marked by feature points, making deformation of the forehead possible; computes all deformed feature points with MLS combined with the deformation constraints, so that the deformed face looks natural; and interpolates the original and deformed face feature points with B-spline interpolation, removing the sharp edges caused by large-scale deformation and improving the smoothness of the deformed face edge.
It is to be understood that the exemplary embodiments described herein are illustrative and not restrictive. Although one or more embodiments of the present invention have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

Claims (3)

1. A large-scale human face deformation method is characterized by comprising the following steps:
S10, obtaining a plurality of face feature points with an existing face feature point recognition algorithm;
S20, calculating the forehead feature points from the eyebrow feature points and the size of the current face frame, using the formulas:
P_head.x = P_eyebrow.x
P_head.y = P_eyebrow.y - α·H_face
where α is the gain factor and H_face is the current face frame height; P_head.x, P_head.y are the x- and y-axis coordinates of the computed forehead feature point; P_eyebrow.x, P_eyebrow.y are the x- and y-axis coordinates of the corresponding eyebrow feature point;
calculating the cheek feature points from the nose-wing feature points and the face contour, using the formula:
P_face.xy = (P_profile.xy + P_nose.xy) / 2.0
where P_face.xy is the coordinate of the computed cheek feature point, P_profile.xy is the coordinate of the corresponding face contour point, and P_nose.xy is the coordinate of the corresponding nose-wing feature point;
these additional feature points cover the regions that the current face recognition algorithm cannot mark, thereby enlarging the deformable region of the face;
S30, selecting n points from all of the feature points as control vertices, where 3 ≤ n ≤ the number of feature points; if the expected mode is symmetric face deformation, only the feature points on one side need to be selected, and their symmetric counterparts automatically become control vertices; in the asymmetric deformation mode, the control points are exactly the selected feature points;
S40, computing the coordinates of all deformed face feature points by the moving least squares (MLS) method from the given deformation constraints and control vertices, the feature points obtained by this algorithm making the face look natural after deformation; the MLS expression is:
f_v(x) = (x - p*)·M + q*
where x is the coordinate of the current point v, f_v(x) is its deformed coordinate, p* and q* are the weighted centroids of the control points before and after deformation respectively, and M is an affine transformation matrix;
S50, interpolating the feature points before and after deformation with B-spline interpolation, which increases feature point density, reduces the sharp corners produced by deformation, especially stretching and enlarging, and improves the smoothness of the face edge after large-scale deformation;
S60, triangulating the interpolated, undeformed feature points to obtain the vertex indices of all triangles;
S70, converting the undeformed feature point positions into texture coordinates:
UV_x = P_x / width
UV_y = P_y / height
where UV_x, UV_y are the texture x- and y-axis coordinates; P_x, P_y are the x- and y-axis coordinates of the feature point before deformation; width and height are the texture width and height;
converting the deformed feature point positions into vertex positions; the published text renders these formulas only as equation images, and the standard pixel-to-normalized-device-coordinate mapping consistent with the texture formulas above is:
Pos_x = 2·P'_x / width - 1
Pos_y = 1 - 2·P'_y / height
where Pos_x, Pos_y are the vertex x- and y-axis coordinates, P'_x, P'_y are the x- and y-axis coordinates of the deformed feature point, and width and height are the texture width and height;
and finally adding the vertex indices obtained by triangulation and rendering through OpenGL to obtain the final large-scale face deformation effect.
2. The large-scale face deformation method of claim 1, wherein the B-spline interpolation expression, rendered only as an equation image in the published text, is the standard B-spline curve:
P(u) = Σ_{i=0}^{n} d_i · N_{i,k}(u)
where d_i (i = 0, 1, …, n) are the control vertices (feature point coordinates) and N_{i,k} (i = 0, 1, …, n) are the k-th order normalized B-spline basis functions, with highest order k.
3. The large-scale face deformation method according to claim 1 or 2, wherein the deformation constraint conditions include deformation strength, whether deformation is symmetric, deformation amount and deformation point index.
CN201910776710.6A 2019-08-22 2019-08-22 Large-scale human face deformation method Active CN110705346B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910776710.6A CN110705346B (en) 2019-08-22 2019-08-22 Large-scale human face deformation method


Publications (2)

Publication Number Publication Date
CN110705346A CN110705346A (en) 2020-01-17
CN110705346B (en) 2022-08-05

Family

ID=69193364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910776710.6A Active CN110705346B (en) 2019-08-22 2019-08-22 Large-scale human face deformation method

Country Status (1)

Country Link
CN (1) CN110705346B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103208133B (en) * 2013-04-02 2015-08-19 浙江大学 The method of adjustment that in a kind of image, face is fat or thin
CN104778712B (en) * 2015-04-27 2018-05-01 厦门美图之家科技有限公司 A kind of face chart pasting method and system based on affine transformation
CN106296571B (en) * 2016-07-29 2019-06-04 厦门美图之家科技有限公司 A kind of diminution wing of nose method, apparatus based on face grid and calculating equipment
CN106296572A (en) * 2016-08-01 2017-01-04 南京信息工程大学 A kind of face editor based on parts and beautification method
CN107273837B (en) * 2017-06-07 2019-05-07 广州视源电子科技股份有限公司 The method and system virtually made up
CN109389682A (en) * 2017-08-09 2019-02-26 上海影子智能科技有限公司 A kind of three-dimensional face model automatic adjusting method

Also Published As

Publication number Publication date
CN110705346A (en) 2020-01-17

Similar Documents

Publication Publication Date Title
CN103208133B (en) The method of adjustment that in a kind of image, face is fat or thin
CN112669447B (en) Model head portrait creation method and device, electronic equipment and storage medium
CN107730449B (en) Method and system for beautifying facial features
CN101853523A (en) Method for adopting rough drawings to establish three-dimensional human face molds
CN110264396B (en) Video face replacement method, system and computer readable storage medium
CN109389682A (en) A kind of three-dimensional face model automatic adjusting method
CN111091624B (en) Method for generating high-precision drivable human face three-dimensional model from single picture
CN107689254B (en) Digital generation method for outer surface of full-crown prosthesis
CN106447763A (en) Face image three-dimensional reconstruction method for fusion of sparse deformation model and principal component regression algorithm
CN108596992B (en) Rapid real-time lip gloss makeup method
CN109461197B (en) Cloud real-time drawing optimization method based on spherical UV and re-projection
CN115601097A (en) Two-dimensional virtual fitting method for free dressing change
CN110910308A (en) Image processing method, apparatus, device and medium
Yu et al. An rbf-based reparameterization method for constrained texture mapping
CN111652795A (en) Face shape adjusting method, face shape adjusting device, live broadcast method, live broadcast device, electronic equipment and storage medium
CN110705346B (en) Large-scale human face deformation method
CN108961408A (en) Digital rubbing production method, system and storage medium based on triangle grid model
JP3951061B2 (en) Face image processing method and face image processing apparatus
CN113808006A (en) Method and device for reconstructing three-dimensional grid model based on two-dimensional image
JP3915844B2 (en) Face image processing method and face image processing apparatus
Ono et al. 3D character model creation from cel animation
Mingming et al. The 3D caricature face modeling based on aesthetic formulae
CN112561780B (en) City scene grid model optimization method with additional multi-sight feature constraint
Wang et al. Creating animatable MPEG4 face
Gong et al. An automatic approach for pixel-wise correspondence between 3D faces

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 22nd floor, block a, Huaxing Times Square, 478 Wensan Road, Xihu District, Hangzhou, Zhejiang 310000
Applicant after: Hangzhou Xiaoying Innovation Technology Co.,Ltd.
Address before: 310000 16 / F, HANGGANG metallurgical technology building, 294 Tianmushan Road, Xihu District, Hangzhou City, Zhejiang Province
Applicant before: Hangzhou Xiaoying Innovation Technology Co.,Ltd.

Address after: 310000 16 / F, HANGGANG metallurgical technology building, 294 Tianmushan Road, Xihu District, Hangzhou City, Zhejiang Province
Applicant after: Hangzhou Xiaoying Innovation Technology Co.,Ltd.
Address before: 310007 16th floor, HANGGANG and metallurgical technology building, No. 294, Tianmushan Road, Xihu District, Hangzhou City, Zhejiang Province
Applicant before: HANGZHOU QUWEI SCIENCE & TECHNOLOGY Co.,Ltd.

GR01 Patent grant