CN113658298A - Method and device for generating special-effect program file package and special effect - Google Patents

Method and device for generating a special effect program file package and a special effect

Info

Publication number: CN113658298A
Application number: CN202110977497.2A
Authority: CN (China)
Prior art keywords: special effect, makeup, parameter, face, image
Legal status: Pending (the status listed is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 许亲亲, 杨瑞健
Current and original assignee: Beijing Sensetime Technology Development Co Ltd
Application filed by Beijing Sensetime Technology Development Co Ltd
Priority: CN202110977497.2A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G06T3/04

Abstract

The embodiments of the invention disclose a method and a device for generating a special effect program file package and for generating a special effect. The method for generating the special effect program file package comprises the following steps: displaying a reference image and preset key points on the reference image; importing makeup and/or face-changing sub-materials and displaying the imported sub-materials; acquiring parameter values of the special effect parameters of the makeup and/or face-changing sub-materials, and establishing a correspondence between the display attributes of the sub-materials and the key points within the display position coverage range; and generating a makeup and/or face-changing special effect program file package according to the sub-materials, the parameter values of the special effect parameters, and the correspondence. Because the embodiments generate a makeup and/or face-changing special effect program file that a rendering engine can execute without the program file being written by hand, the overall efficiency of realizing the special effect is improved, and the errors that manual writing of the program file may introduce are avoided.

Description

Method and device for generating special-effect program file package and special effect
Technical Field
The invention relates to artificial intelligence technologies, and in particular to a method and a device for generating a special effect program file package and a special effect.
Background
Augmented Reality (AR) is a technology that seamlessly integrates real-world information and virtual-world information: virtual information that would otherwise be absent from a certain time and space range of the real world is simulated and superimposed onto the real world, so that real characters and environments and virtual objects coexist in the same picture or space in real time, producing a sensory experience beyond reality.
Disclosure of Invention
The embodiment of the invention provides a technical scheme for generating a makeup and/or face changing special effect program file package and generating a makeup and/or face changing special effect.
According to an aspect of the embodiments of the present invention, a method for generating a makeup and/or face-changing special effect program file package is provided, comprising:
displaying a reference image and preset key points on the reference image, wherein the reference image comprises at least a part of an image of a reference person;
importing makeup and/or face-changing sub-materials, and displaying the imported makeup and/or face-changing sub-materials;
acquiring parameter values of the special effect parameters of the makeup and/or face-changing sub-materials, the special effect parameters comprising a superposition mode parameter, and establishing a correspondence between the display attributes of the makeup and/or face-changing sub-materials and the key points within the display position coverage range, the display attributes comprising size and/or display position;
and generating a makeup and/or face-changing special effect program file package according to the makeup and/or face-changing sub-materials, the parameter values of the special effect parameters, and the correspondence.
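The steps above can be sketched as a small data-assembly routine. The field names below are illustrative assumptions, not taken from the patent; the sketch only shows the kind of information the generated package bundles (the sub-material, the parameter values including the superposition mode, and the display-attribute/key-point correspondence).

```python
import json

# Hypothetical structure for a makeup/face-changing special effect program
# file package: the field names are assumptions for illustration only.
def build_effect_package(sub_material, keypoint_ids, overlay_mode="normal"):
    return {
        "sub_material": sub_material,              # imported sub-material asset
        "effect_parameters": {
            "overlay_mode": overlay_mode,          # superposition mode parameter
        },
        "correspondence": {
            "display_attributes": ["size", "position"],
            "keypoints": keypoint_ids,             # key points within the display coverage
        },
    }

package = build_effect_package("lipstick.png", [84, 85, 86, 87])
package_json = json.dumps(package, indent=2)
```

A rendering engine could then parse `package_json` and position the sub-material relative to the listed key points.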
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the at least a part of the image of the reference person includes an image of any one or more of the following parts of the reference person: a full-body image, a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, and a foot image.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the makeup and/or face-changing sub-material includes: makeup and/or face-changing sub-materials for one part of the person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of the person.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the preset key points include any one or more of the following: head key points, face key points, shoulder key points, arm key points, gesture key points, waist key points, leg key points, foot key points, and human skeleton key points.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the head key point includes at least one of: vertex key points, nose tip key points, and chin key points; and/or
The face key points include at least one of: face contour key points, eye key points, eyebrow key points, nose key points, and mouth key points; and/or
The shoulder keypoints comprise at least one of: a shoulder-head convergence key point located at a convergence position of the shoulder and the head, and a shoulder-contour midpoint key point located at a midpoint position between the arm-root-contour key point and the shoulder-head convergence key point; and/or
The arm keypoints comprise at least one of: a wrist contour key point, an elbow contour key point, an arm root contour key point, a forearm midpoint key point located at the midpoint between the wrist contour key point and the elbow contour key point, and an upper-arm midpoint key point located at the midpoint between the elbow contour key point and the arm root contour key point; and/or
The gesture keypoints comprise at least one of: four vertex key points of the gesture box and a central key point of the gesture box; and/or
The leg keypoints comprise at least one of: a crotch key point, a knee contour key point, an ankle contour key point, a thigh root outer contour key point, a calf contour midpoint key point located at the midpoint between the knee contour key point and the ankle contour key point, a thigh inner contour midpoint key point located at the midpoint between the knee contour key point and the crotch key point, and a thigh outer contour midpoint key point located at the midpoint between the knee contour key point and the thigh root outer contour key point; and/or
The waist key points include at least one of: the N key points generated by dividing the section between the thigh root outer contour key point and the arm root contour key point into N equal parts, where N is greater than 1; and/or
The foot keypoints comprise at least one of: toe keypoints and heel keypoints; and/or
The human skeleton key points include at least one of: a right shoulder skeleton key point, a right elbow skeleton key point, a right wrist skeleton key point, a left shoulder skeleton key point, a left elbow skeleton key point, a left wrist skeleton key point, a right hip skeleton key point, a right knee skeleton key point, a right ankle skeleton key point, a left hip skeleton key point, a left knee skeleton key point, a left ankle skeleton key point, a vertex skeleton key point, and a neck skeleton key point.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the eye key points include at least one of: a left eye orbit key point, a left eye pupil center key point, a left eye center key point, a right eye orbit key point, a right eye pupil center key point, and a right eye center key point; and/or
The eyebrow key points include at least one of: a left eyebrow key point and a right eyebrow key point; and/or
The nose key points include at least one of: a nose bridge key point, a nose lower edge key point, and a nose outer contour key point; and/or
The mouth keypoints comprise at least one of: upper lip keypoints, and lower lip keypoints.
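Several of the key points listed above are "midpoint" key points derived from two detected contour key points. A minimal sketch of that derivation follows; the key-point names and coordinates are illustrative, not from the patent.

```python
# Derived midpoint key points: each lies halfway between two detected
# contour key points, as described for the arm, shoulder and leg above.
def midpoint(p1, p2):
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)

# Illustrative detected contour key points (x, y), in pixels.
detected = {
    "wrist_contour": (120.0, 340.0),
    "elbow_contour": (100.0, 260.0),
    "arm_root_contour": (90.0, 180.0),
}

# Forearm midpoint key point: midway between wrist and elbow contour key points.
forearm_mid = midpoint(detected["wrist_contour"], detected["elbow_contour"])
# Upper-arm midpoint key point: midway between elbow and arm-root contour key points.
upper_arm_mid = midpoint(detected["elbow_contour"], detected["arm_root_contour"])
```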
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the importing a makeup and/or face-changing sub-material includes:
and receiving an import instruction input through an interactive interface of the operation bar, and importing makeup and/or face changing sub-materials in the material folder pointed by the import instruction.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the receiving an import instruction input through an interactive interface of an operation bar, and importing a makeup and/or face-changing child material in a material folder pointed by the import instruction includes:
receiving a selection instruction sent through the interactive interface of the operation bar, taking the reference part selected by the selection instruction as the current target part to which the makeup and/or face-changing special effect is to be added, and displaying the special effect parameter setting interface for the target part in the operation bar;
and receiving an import instruction sent through an interactive interface in the special effect parameter setting interface, and importing makeup and/or face changing sub-materials in the material folder pointed by the import instruction.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, importing makeup and/or face-changing child materials in a material folder pointed by the import instruction includes:
receiving an import instruction sent through the interactive interface, and acquiring and displaying a material folder pointed by the import instruction;
in response to receiving a selection operation of cosmetic and/or face changing sub-materials in the material folder, importing one or more cosmetic and/or face changing sub-materials selected by the cosmetic and/or face changing sub-material selection operation; and/or
In response to the fact that no operation for selecting makeup and/or face changing sub-materials in the material folder is received, selecting one or more makeup and/or face changing sub-materials in the material folder according to preset setting, and importing the selected makeup and/or face changing sub-materials according to the preset setting;
the plurality of makeup and/or face-changing sub-materials form a group of makeup and/or face-changing sub-materials with preset playing time sequence.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the playing timing of a plurality of makeup and/or face-changing sub-materials in the set of makeup and/or face-changing sub-materials is determined based on the file names of the plurality of makeup and/or face-changing sub-materials.
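Deriving the play order from the file names can be sketched as a sort keyed on a numeric suffix. The naming scheme (an index before the extension) is an assumption; the patent only states that the order is determined from the file names.

```python
import re

# Sketch: order a group of sub-materials by the trailing number in each
# file name (naming convention assumed, not specified by the patent).
def play_sequence(filenames):
    def frame_index(name):
        m = re.search(r"(\d+)\.", name)
        return int(m.group(1)) if m else 0
    return sorted(filenames, key=frame_index)

frames = play_sequence(["blush_10.png", "blush_2.png", "blush_1.png"])
```

A plain lexicographic sort would misplace `blush_10.png` before `blush_2.png`, which is why the numeric key is used.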
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the displaying the imported makeup and/or face-changing sub-material includes:
and displaying the imported makeup and/or face changing sub-materials on the target part according to the parameter value of the superposition mode.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the obtaining parameter values of special effect parameters of the makeup and/or face-changing sub-material includes:
in response to receiving, through the interactive interface of the operation bar, a parameter value set for a special effect parameter of the makeup and/or face-changing sub-material, taking the set parameter value as the parameter value of that special effect parameter; and/or
in response to receiving no parameter value set for a special effect parameter of the makeup and/or face-changing sub-material through the interactive interface of the operation bar, taking a preset parameter value as the parameter value of that special effect parameter.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the special effect parameters include any one or more of:
a display parameter, used to indicate whether the makeup and/or face-changing sub-material is displayed;
a superposition mode parameter, used to indicate the superposition mode of the makeup and/or face-changing sub-material;
a trigger mode parameter, used to indicate the trigger event that triggers display of the makeup and/or face-changing sub-material;
a loop parameter, used to indicate the number of times the makeup and/or face-changing sub-material is displayed in a loop;
a play frame number parameter, used to indicate the number of frames for which the makeup and/or face-changing sub-material is played;
a delay trigger parameter, used to indicate the time by which display of the makeup and/or face-changing sub-material is delayed;
a trigger end parameter, used to indicate the trigger event that ends display of the makeup and/or face-changing sub-material;
a deformation special effect parameter, used to indicate the deformation effect generated in a deformation area of the image when the makeup and/or face-changing sub-material is displayed;
a sticker special effect parameter, used to indicate the sticker special effect generated on the image when the makeup and/or face-changing sub-material is displayed;
and a stroke special effect parameter, used to indicate the stroke special effect generated on the image when the makeup and/or face-changing sub-material is displayed.
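A compact way to picture this parameter set is a typed container with defaults. All default values and field names below are hypothetical; the patent lists the parameters but does not fix defaults.

```python
from dataclasses import dataclass

# Illustrative container for the special effect parameters listed above.
# Defaults are assumptions for the sketch, not values from the patent.
@dataclass
class EffectParams:
    display: bool = True            # whether the sub-material is displayed
    overlay_mode: str = "normal"    # superposition mode
    trigger: str = "none"           # trigger event that starts display
    loop_count: int = 1             # number of looped displays
    play_frames: int = 0            # frames played (0 = until trigger end)
    delay_frames: int = 0           # delayed-display time, in frames
    trigger_end: str = "none"       # trigger event that ends display

params = EffectParams(trigger="mouth_open", loop_count=3)
```

Unset parameters keep their preset values, matching the embodiment in which a preset parameter value is used when none is received through the operation bar.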
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the trigger event includes any one or more of: no-action trigger, an eye action, a head action, an eyebrow action, a hand action, a mouth action, a shoulder action, a deformation special effect, a sticker special effect, a sound special effect, and a stroke special effect.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, establishing a correspondence between display attributes of the makeup and/or face-changing sub-materials and key points in the display position coverage range includes:
and establishing a corresponding relation between the display attribute of the makeup and/or face changing sub-material and at least two key points in the display position coverage range.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the method further includes:
in response to receiving a start instruction, displaying an operation interface, wherein the operation interface comprises: an operation bar, a content display bar and/or a program file bar;
the displaying of the reference image and the preset key points on the reference image includes: displaying the reference image and the preset key points on the reference image in the content display bar;
the displaying of the imported makeup and/or face-changing sub-materials includes: displaying the imported makeup and/or face-changing sub-materials in the content display bar.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each embodiment, the operation interface includes three areas, namely, a left area, a middle area and a right area;
the display operation interface comprises:
and displaying the operation bar on the left side of the operation interface, displaying the content display bar in the middle of the operation interface, and displaying the program file bar on the right side of the operation interface.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the method further includes:
and updating the display positions of the makeup and/or face changing sub-materials and the corresponding relation according to the position moving operation of the makeup and/or face changing sub-materials received through the content display bar.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the method further includes:
and updating the display size of the makeup and/or face changing sub-material in the content display column according to the size adjustment operation of the makeup and/or face changing sub-material received through the content display column.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the method further includes:
adjusting the occlusion relationship between the two or more makeup and/or face-changing sub-materials according to a layer parameter adjustment instruction for the two or more sub-materials received through the interactive interface of the operation bar, and displaying the two or more sub-materials according to the adjusted occlusion relationship and the parameter values of their superposition mode parameters.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, before generating the makeup and/or face-changing special effect program file package, the method further includes:
and generating a special effect program file of the makeup and/or face changing sub-material according to a preset special effect program file, the parameter value of the special effect parameter of the makeup and/or face changing sub-material and the corresponding relation, and displaying the special effect program file of the makeup and/or face changing sub-material through a program file column.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the special effect program file includes: a special effect program file generated in the json format.
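A json-format special effect program file for one sub-material could look like the following; every field name is an assumption made for illustration, since the patent specifies the format (json) but not a schema.

```python
import json

# Illustrative json content of a single sub-material's special effect
# program file. Field names are hypothetical.
program_text = """
{
  "material": "eyeshadow.png",
  "params": {"display": 1, "overlay_mode": "multiply", "loop": 2},
  "binding": {"keypoints": [52, 53, 57, 58], "attributes": ["size", "position"]}
}
"""
program = json.loads(program_text)
```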
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, after the generating of the makeup and/or face-changing special effect program file package, the method further includes:
and saving the makeup and/or face changing special effect program file package at the position pointed by the saving instruction according to the received saving instruction.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the saving the makeup and/or face-changing special effect program file package at a position pointed by the saving instruction according to the received saving instruction includes:
in response to receiving a save instruction, displaying a save path selection interface and a compression interface;
receiving a save location sent through the save path selection interface; receiving a compression mode sent by the compression interface, and compressing the makeup and/or face changing special effect program file package according to the compression mode to generate a compressed file package;
and storing the compressed file package into the folder pointed by the saving position.
Optionally, in the method for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the size of the makeup and/or face-changing sub-material in the makeup and/or face-changing special effect program file package is maintained as the size of the makeup and/or face-changing sub-material before the makeup and/or face-changing sub-material is imported.
According to another aspect of the embodiments of the present invention, a method for generating a makeup and/or face-changing special effect is provided, comprising:
acquiring a makeup and/or face changing sub-material, a parameter value of a special effect parameter of the makeup and/or face changing sub-material, and a corresponding relation between a display attribute and a key point of the makeup and/or face changing sub-material; the display attributes include: size and/or display location;
and generating a special effect of the makeup and/or face changing sub-material on the image based on the corresponding relation, key points in the image related to the corresponding relation and parameter values of the special effect parameters.
Optionally, in the makeup and/or face-changing special effect generating method in each of the above embodiments, the makeup and/or face-changing sub-material includes: makeup and/or face-changing sub-materials for one part of the person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of the person.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the image includes any one or more of the following: a still image, and an image in a video.
Optionally, in the method for generating a special effect of makeup and/or face changing according to each of the above embodiments, the method further includes:
importing a makeup and/or face changing special effect program file package; the makeup and/or face changing special effect program file package comprises: the makeup and/or face changing sub-material, the parameter value of the special effect parameter of the makeup and/or face changing sub-material, and the corresponding relation between the display attribute and the key point of the makeup and/or face changing sub-material;
the acquiring of the makeup and/or face-changing sub-material, the parameter value of the special effect parameter of the makeup and/or face-changing sub-material, and the corresponding relationship between the display attribute and the key point of the makeup and/or face-changing sub-material comprises: and acquiring makeup and/or face changing sub-materials, parameter values of the special effect parameters and the corresponding relation from the makeup and/or face changing special effect program file package.
Optionally, in the makeup and/or face-changing special effect generating method according to each embodiment, the makeup and/or face-changing special effect program file package is a makeup and/or face-changing special effect program file package generated by using the makeup and/or face-changing special effect program file package generating method according to any one of the embodiments.
Optionally, in the method for generating a special effect of makeup and/or face changing according to each of the above embodiments, the method further includes:
and detecting key points related to the corresponding relation of the image through a neural network, and outputting a key point detection result.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the key point detection result includes any one or more of the following:
the positions, in the images in the video, of the key points involved in the correspondence;
and the preset numbers of the key points involved in the correspondence.
Optionally, in the makeup and/or face-changing special effect generating method in each of the above embodiments, the makeup and/or face-changing sub-material includes: a group of makeup and/or face changing sub-materials with preset playing time sequence.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the generating a special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters includes:
determining the playing time sequence of a plurality of makeup and/or face changing sub-materials based on the file names of the makeup and/or face changing sub-materials in the group of makeup and/or face changing sub-materials;
and generating a special effect of the makeup and/or face changing sub-material on the image according to the determined playing time sequence based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the special effect parameters include: a superposition mode parameter, used to indicate the superposition mode of the makeup and/or face-changing sub-material.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the special effect parameters include: a display parameter, used to indicate whether the makeup and/or face-changing sub-material is displayed;
generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters includes:
when the parameter value of the display parameter is a value indicating that the makeup and/or face-changing sub-material is displayed, generating the special effect of the sub-material on the image based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the special effect parameters include: a trigger mode parameter, used to indicate the trigger event that triggers display of the makeup and/or face-changing sub-material;
the method further comprises: detecting whether a trigger event corresponding to the parameter value of the trigger mode parameter appears in the image;
generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters includes:
in response to detecting that a trigger event corresponding to the parameter value of the trigger mode parameter appears in the image, generating the special effect of the sub-material on the image based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the special effect parameters include: a delay trigger parameter, used to indicate the time by which display of the makeup and/or face-changing sub-material is delayed;
generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters includes:
in response to the display condition of the makeup and/or face-changing sub-material being met, delaying generation of the special effect of the sub-material on the image by the delayed play time corresponding to the parameter value of the delay trigger parameter, based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters; the display condition of the makeup and/or face-changing sub-material being met includes: the parameter value of the display parameter indicating that the sub-material is displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
Optionally, in the makeup and/or face-changing special effect generating method according to each of the above embodiments, the special effect parameters include: a loop parameter, used to indicate the number of times the makeup and/or face-changing sub-material is played in a loop;
generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters includes:
in response to the display condition of the makeup and/or face-changing sub-material being met, cyclically displaying the sub-material on the image the number of times corresponding to the parameter value of the loop parameter, based on the correspondence, the key points in the image involved in the correspondence, and the parameter values of the special effect parameters, so as to generate the special effect of the sub-material; the display condition of the makeup and/or face-changing sub-material being met includes: the parameter value of the display parameter indicating that the sub-material is displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
Optionally, in the makeup and/or face changing special effect generating method according to each of the above embodiments, the special effect parameter includes: a playing frame number parameter, used for representing the number of frames for which the makeup and/or face changing sub-material is played;
generating the special effect of the makeup and/or face changing sub-material on the image based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters includes:
in response to a playing condition of the makeup and/or face changing sub-material being satisfied, generating the special effect of the makeup and/or face changing sub-material on the images corresponding to the playing frame number in a video, according to the playing frame number corresponding to the parameter value of the playing frame number parameter, based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters; the playing condition of the makeup and/or face changing sub-material being satisfied includes: the parameter value of the display parameter being a parameter value for displaying the makeup and/or face changing sub-material, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
Optionally, in the makeup and/or face changing special effect generating method according to each of the above embodiments, the special effect parameter includes: a trigger end parameter, used for representing a trigger event that ends display of the makeup and/or face changing sub-material;
the method further includes:
detecting whether a trigger event corresponding to the parameter value of the trigger end parameter occurs;
in response to detecting that the trigger event corresponding to the parameter value of the trigger end parameter occurs, stopping generating the special effect of the makeup and/or face changing sub-material.
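The trigger-end behavior described above can be sketched as a small state machine: the effect keeps rendering until an event matching the trigger end parameter value is detected, then stays stopped. This is an illustrative sketch only; the `EffectPlayer` class and the event names are assumptions, not the claimed implementation.

```python
# Minimal sketch of the trigger-end logic: the special effect is rendered
# on every frame until an event equal to the trigger-end parameter value
# is detected, after which it stays stopped.
# Event names ("mouth_open", "eye_blink") are illustrative assumptions.
class EffectPlayer:
    def __init__(self, trigger_end_event):
        self.trigger_end_event = trigger_end_event
        self.active = True

    def on_frame(self, detected_events):
        # Stop generating the special effect once the trigger-end
        # event is observed in the current frame's detections.
        if self.trigger_end_event in detected_events:
            self.active = False
        return self.active  # whether the effect is rendered on this frame

player = EffectPlayer(trigger_end_event="mouth_open")
assert player.on_frame({"eye_blink"}) is True    # effect still shown
assert player.on_frame({"mouth_open"}) is False  # trigger-end event: stop
assert player.on_frame({"eye_blink"}) is False   # remains stopped
```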
Optionally, in the makeup and/or face changing special effect generating method according to each of the above embodiments, the special effect parameter includes: a deformation special effect parameter, used for representing a deformation effect of a deformation area generated on the image when the makeup and/or face changing sub-material is displayed;
the method further includes:
generating the deformation effect of the deformation area in the image according to the deformation special effect parameter when generating the special effect of the makeup and/or face changing sub-material on the image based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters.
Optionally, in the makeup and/or face changing special effect generating method according to each of the above embodiments, the special effect parameter includes: a sticker special effect parameter, used for representing a special effect of the sub-material generated on the image when the makeup and/or face changing sub-material is displayed;
the method further includes:
generating the special effect of the sub-material in the image according to the sticker special effect parameter when generating the special effect of the makeup and/or face changing sub-material on the image based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters.
According to another aspect of the embodiments of the present invention, an apparatus for generating a cosmetic and/or face-changing special effect program file package is provided, including:
the display module is used for displaying the reference image and the preset key points on the reference image, the reference image including at least a partial image of a reference person; and for displaying the imported makeup and/or face changing sub-materials;
the first import module is used for importing makeup and/or face changing sub-materials;
the first acquisition module is used for acquiring parameter values of special effect parameters of the makeup and/or face changing sub-material, wherein the special effect parameters comprise superposition mode parameters;
the establishing module is used for establishing the corresponding relation between the display attribute of the makeup and/or face changing sub-material and the key point in the display position coverage range; the display attributes include: size and/or display location;
the first generation module is used for generating the makeup and/or face changing special effect program file package according to the makeup and/or face changing sub-material, the parameter values of the special effect parameters and the corresponding relation.
Optionally, in the apparatus for generating a makeup and/or face-changing special effect program file package according to the above embodiments, the at least one part of the image of the reference person includes an image of any one or more of the following parts of the reference person: a full image, a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, a foot image.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the makeup and/or face-changing sub-material includes: makeup and/or face-changing sub-materials for one part of the person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of the person.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the preset key points include any one or more of the following: head key points, face key points, shoulder key points, arm key points, gesture key points, waist key points, leg key points, foot key points, and human skeleton key points.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the head key point includes at least one of: vertex key points, nose tip key points, and chin key points; and/or
The face key points include at least one of: face contour key points, eye key points, eyebrow key points, nose key points, and mouth key points; and/or
The shoulder keypoints comprise at least one of: a shoulder-head convergence key point located at a convergence position of the shoulder and the head, and a shoulder-contour midpoint key point located at a midpoint position between the arm-root-contour key point and the shoulder-head convergence key point; and/or
The arm keypoints comprise at least one of: a wrist contour key point, an elbow contour key point, an arm root contour key point, a forearm contour middle point key point located at a middle point position between the wrist contour key point and the elbow contour key point, and a forearm middle point key point located at a middle point position between the elbow contour key point and the arm root contour key point; and/or
The gesture keypoints comprise at least one of: four vertex key points of the gesture box and a central key point of the gesture box; and/or
The leg keypoints comprise at least one of: a crotch critical point, a knee contour critical point, an ankle contour critical point, a thigh root lateral contour critical point, a calf contour midpoint critical point located at a midpoint position between the knee contour critical point and the ankle contour critical point, an intra-thigh contour midpoint critical point located at a midpoint position between the knee contour critical point and the crotch critical point, and a thigh outer contour midpoint critical point located at a midpoint position between the knee contour critical point and the thigh root lateral contour critical point; and/or
The waist key points include at least one of: key points generated by dividing the section between the thigh root outer contour key point and the arm root contour key point into N equal parts, where N is greater than 1; and/or
The foot keypoints comprise at least one of: toe keypoints and heel keypoints; and/or
The human skeleton key points include at least one of: a right shoulder skeleton keypoint, a right elbow skeleton keypoint, a right carpal skeleton keypoint, a left shoulder skeleton keypoint, a left elbow skeleton keypoint, a left carpal skeleton keypoint, a right hip skeleton keypoint, a right knee skeleton keypoint, a right ankle skeleton keypoint, a left hip skeleton keypoint, a left knee skeleton keypoint, a left ankle skeleton keypoint, a vertex skeleton keypoint, and a neck skeleton keypoint.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the eye key points include at least one of: a left eye orbit key point, a left eye pupil center key point, a left eye center key point, a right eye orbit key point, a right eye pupil center key point, and a right eye center key point; and/or
The eyebrow key points include at least one of: a left eyebrow key point and a right eyebrow key point; and/or
The nose key points include at least one of: a nose bridge key point, a nose lower edge key point, and a nose outer contour key point; and/or
The mouth keypoints comprise at least one of: upper lip keypoints, and lower lip keypoints.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the first import module is specifically configured to receive an import instruction input through an interactive interface of an operation bar, and import a makeup and/or face-changing child material in a material folder pointed by the import instruction.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the first import module is specifically configured to: receive a selection instruction sent through an interactive interface of the operation bar, take the reference part selected by the selection instruction as the current target part to which the makeup and/or face-changing special effect is to be added, and display a special effect parameter setting interface for that target part in the operation bar; and receive an import instruction sent through an interactive interface in the special effect parameter setting interface, and import the makeup and/or face changing sub-materials in the material folder pointed to by the import instruction.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, when the first import module imports a makeup and/or face-changing child material in the material folder pointed by the import instruction, the first import module is specifically configured to:
receiving an import instruction sent through the interactive interface, and acquiring and displaying a material folder pointed by the import instruction;
in response to receiving a selection operation on the makeup and/or face changing sub-materials in the material folder, importing the one or more makeup and/or face changing sub-materials selected by the selection operation; and/or
in response to receiving no selection operation on the makeup and/or face changing sub-materials in the material folder, selecting and importing one or more makeup and/or face changing sub-materials in the material folder according to a preset setting;
the plurality of makeup and/or face-changing sub-materials form a group of makeup and/or face-changing sub-materials with preset playing time sequence.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the playing timing of the plurality of makeup and/or face-changing sub-materials in the set of makeup and/or face-changing sub-materials is determined based on the file names of the plurality of makeup and/or face-changing sub-materials.
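The claim above derives the playing order of a group of sub-materials from their file names. A minimal sketch of one way to do this, assuming the file names carry a numeric index (an illustrative assumption, not the claimed scheme):

```python
# Sketch: determine the playing timing of a group of sub-materials from
# their file names. Assumes (for illustration) that names embed a frame
# index; falls back to lexicographic order when no number is present.
import re

def play_order(filenames):
    def key(name):
        m = re.search(r"\d+", name)
        # Numeric sort for indexed names so "10" sorts after "2".
        return (0, int(m.group())) if m else (1, name)
    return sorted(filenames, key=key)

frames = ["blush_10.png", "blush_2.png", "blush_1.png"]
assert play_order(frames) == ["blush_1.png", "blush_2.png", "blush_10.png"]
```

The numeric key avoids the common pitfall of pure string sorting, where `blush_10.png` would incorrectly precede `blush_2.png`.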
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, when the display module displays the introduced makeup and/or face-changing sub-material, the display module is specifically configured to display the introduced makeup and/or face-changing sub-material on the target portion according to the parameter value of the overlay manner.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the embodiments, the first obtaining module is specifically configured to:
in response to receiving a parameter value set, through the interactive interface of the operation bar, for a special effect parameter of the makeup and/or face changing sub-material, taking the set parameter value as the parameter value of that special effect parameter; and/or
in response to receiving no parameter value set, through the interactive interface of the operation bar, for a special effect parameter of the makeup and/or face changing sub-material, taking a preset parameter value as the parameter value of that special effect parameter.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the special effect parameters include any one or more of:
a display parameter: used for representing whether the makeup and/or face changing sub-material is displayed;
a superposition mode parameter: used for representing the superposition mode of the makeup and/or face changing sub-material;
a trigger mode parameter: used for representing a trigger event that triggers display of the makeup and/or face changing sub-material;
a circulation parameter: used for representing the number of times the makeup and/or face changing sub-material is displayed in a loop;
a playing frame number parameter: used for representing the number of frames for which the makeup and/or face changing sub-material is played;
a delay trigger parameter: used for representing the time by which display of the makeup and/or face changing sub-material is delayed;
a trigger end parameter: used for representing a trigger event that ends display of the makeup and/or face changing sub-material;
a deformation special effect parameter: used for representing a deformation effect of a deformation area generated on the image when the makeup and/or face changing sub-material is displayed;
a sticker special effect parameter: used for representing a special effect of the sub-material generated on the image when the makeup and/or face changing sub-material is displayed;
a stroke special effect parameter: used for representing a stroke special effect generated on the image when the makeup and/or face changing sub-material is displayed.
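The parameter set listed above can be sketched as a plain dictionary with per-material overrides. The key names and default values below are illustrative assumptions, not the patented file format:

```python
# Illustrative sketch of the special effect parameters as a Python dict.
# Keys and defaults are assumptions chosen for readability.
default_effect_params = {
    "display": True,          # display parameter: show the sub-material?
    "blend_mode": "normal",   # superposition mode parameter
    "trigger_type": "none",   # trigger mode parameter (event that starts display)
    "loop_count": 1,          # circulation parameter: looped plays
    "frame_count": 1,         # playing frame number parameter
    "delay_seconds": 0.0,     # delay trigger parameter
    "trigger_end": None,      # trigger end parameter (event that ends display)
    "deformation": None,      # deformation special effect parameter
    "sticker": None,          # sticker special effect parameter
    "stroke": None,           # stroke special effect parameter
}

def with_overrides(**overrides):
    # Copy the defaults so each sub-material gets its own parameter values.
    params = dict(default_effect_params)
    params.update(overrides)
    return params

p = with_overrides(loop_count=3, trigger_type="mouth_open")
assert p["loop_count"] == 3 and p["blend_mode"] == "normal"
```

Copying the defaults before updating keeps preset values intact for sub-materials whose parameters were never set, matching the fallback behavior described above.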
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to the above embodiments, the trigger event includes any one or more of: no-action trigger, eye action, head action, eyebrow action, hand action, mouth action, shoulder action, deformation special effect, sticker special effect, sound special effect, and stroke special effect.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the establishing module is specifically configured to establish a correspondence between a display attribute of the makeup and/or face-changing sub-material and at least two key points within the display position coverage range.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the display module includes:
an operation interface, displayed in response to a received start instruction, including: an operation bar, a content display bar and/or a program file bar;
the display module is specifically used for displaying the reference image and the preset key points on the reference image in the content display bar, and for displaying the imported makeup and/or face changing sub-materials in the content display bar.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the operation interface includes three areas, namely a left area, a middle area and a right area;
the operation bar is displayed on the left side of the operation interface, the content display bar in the middle of the operation interface, and the program file bar on the right side of the operation interface.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the device further includes:
the first updating module is used for updating the display position of the makeup and/or face changing sub-material and the corresponding relation according to a position moving operation on the makeup and/or face changing sub-material received through the content display bar.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the device further includes:
the second updating module is used for updating the display size of the makeup and/or face changing sub-material in the content display bar according to a size adjusting operation on the makeup and/or face changing sub-material received through the content display bar.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the device further includes:
the adjusting module is used for adjusting the occlusion relation between two or more makeup and/or face changing sub-materials according to a layer parameter adjusting instruction for the two or more makeup and/or face changing sub-materials received through the interactive interface of the operation bar;
the display module is further used for displaying the two or more makeup and/or face changing sub-materials according to the adjusted occlusion relation and the parameter values of the superposition mode of the two or more makeup and/or face changing sub-materials.
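The layer-based occlusion adjustment above amounts to drawing sub-materials in layer order, so later layers cover earlier ones. A minimal sketch, with illustrative field names not taken from the patent:

```python
# Sketch: order sub-materials by a layer parameter so that drawing them
# in this order makes higher layers occlude lower ones.
# The dict field names ("name", "layer") are illustrative assumptions.
def render_order(sub_materials):
    # Lower layer values are drawn first; later draws paint over them.
    return [m["name"] for m in sorted(sub_materials, key=lambda m: m["layer"])]

materials = [
    {"name": "lipstick", "layer": 2},
    {"name": "foundation", "layer": 0},
    {"name": "blush", "layer": 1},
]
assert render_order(materials) == ["foundation", "blush", "lipstick"]
```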
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the embodiments, the first generating module is further configured to generate a special effect program file of the makeup and/or face-changing sub-material according to a preset special effect program file and a parameter value of a special effect parameter of the makeup and/or face-changing sub-material and the corresponding relationship before generating the makeup and/or face-changing special effect program file package, and display the special effect program file of the makeup and/or face-changing sub-material through a program file column.
Optionally, in the apparatus for generating a cosmetic and/or face-changing special effect program file package according to each of the above embodiments, the special effect program file includes: a special effect program file generated using a json program.
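A json special effect program file of the kind mentioned above could be produced with a standard JSON serializer. The schema below (field names, keypoint numbers) is an illustrative assumption, not the patented format:

```python
# Sketch: serialize a sub-material's special effect parameters and its
# keypoint correspondence into a JSON program file text.
# The schema and keypoint numbers are illustrative assumptions.
import json

effect = {
    "sub_material": "blush.png",
    "params": {"blend_mode": "multiply", "loop_count": 2},
    # display attributes bound to keypoints within the display area
    "correspondence": {"display_position": [74, 77], "size": [0.2, 0.1]},
}

program_text = json.dumps(effect, indent=2, sort_keys=True)

# Round-trip to confirm the file is machine-readable by a rendering engine.
decoded = json.loads(program_text)
assert decoded["params"]["loop_count"] == 2
assert decoded["correspondence"]["display_position"] == [74, 77]
```

Generating this text programmatically, rather than writing it by hand, is exactly the error-avoidance benefit the abstract claims.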
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the device further includes:
the saving module is used for saving the makeup and/or face changing special effect program file package at the position pointed to by a received save instruction.
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the saving module is specifically configured to:
in response to receiving a save instruction, displaying a save path selection interface and a compression interface;
receiving a save location sent through the save path selection interface; receiving a compression mode sent by the compression interface, and compressing the makeup and/or face changing special effect program file package according to the compression mode to generate a compressed file package;
storing the compressed file package into the folder pointed to by the save location.
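The save flow described above — compress the package, then store it in the folder pointed to by the save location — can be sketched with the standard library. File and folder names here are illustrative assumptions:

```python
# Sketch of the save flow: compress the program file and its sub-material
# into a package, then write it into the chosen save folder.
# Archive layout and names are illustrative assumptions.
import json
import os
import tempfile
import zipfile

def save_package(save_dir, program_json, material_bytes):
    os.makedirs(save_dir, exist_ok=True)
    archive = os.path.join(save_dir, "effect_package.zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        # The json program file plus the raw sub-material data.
        zf.writestr("effect.json", json.dumps(program_json))
        zf.writestr("materials/blush.png", material_bytes)
    return archive

with tempfile.TemporaryDirectory() as d:
    path = save_package(d, {"loop_count": 1}, b"\x89PNG")
    with zipfile.ZipFile(path) as zf:
        assert sorted(zf.namelist()) == ["effect.json", "materials/blush.png"]
```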
Optionally, in the device for generating a makeup and/or face-changing special effect program file package according to each of the above embodiments, the size of the makeup and/or face-changing sub-material in the makeup and/or face-changing special effect program file package is maintained as the size of the makeup and/or face-changing sub-material before the makeup and/or face-changing sub-material is imported.
According to another aspect of the embodiments of the present invention, there is provided a makeup and/or face-changing special effect generating device, including:
the second acquisition module is used for acquiring makeup and/or face changing sub-materials, parameter values of special effect parameters of the makeup and/or face changing sub-materials and corresponding relations between display attributes and key points of the makeup and/or face changing sub-materials; the display attributes include: size and/or display location;
the second generation module is used for generating the special effects of the makeup and/or face changing sub-materials on the images based on the corresponding relation, the key points in the images related to the corresponding relation and the parameter values of the special effect parameters.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the makeup and/or face-changing sub-element includes: makeup and/or face-changing sub-materials for one part of the person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of the person.
Optionally, in the makeup and/or face-changing special effect generating device of each of the above embodiments, the image includes any one or more of the following items: still images, images in video.
Optionally, in the make-up and/or face-changing special effect generating device according to each of the above embodiments, the device further includes:
the second import module is used for importing a makeup and/or face changing special effect program file package; the makeup and/or face changing special effect program file package comprises: the makeup and/or face changing sub-material, the parameter value of the special effect parameter of the makeup and/or face changing sub-material, and the corresponding relation between the display attribute and the key point of the makeup and/or face changing sub-material;
the second obtaining module is specifically configured to obtain makeup and/or face-changing sub-materials, parameter values of the special effect parameters, and the corresponding relationship from the makeup and/or face-changing special effect program file package.
Optionally, in the makeup and/or face-changing special effect generating device according to each embodiment of the present invention, the makeup and/or face-changing special effect program file package is a makeup and/or face-changing special effect program file package generated by using the method for generating a makeup and/or face-changing special effect program file package or the device for generating a makeup and/or face-changing special effect program file package according to any embodiment of the present invention.
Optionally, in the make-up and/or face-changing special effect generating device according to each of the above embodiments, the device further includes:
a key point detection module, used for detecting, via a neural network, the key points of the image related to the corresponding relation, and outputting a key point detection result.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the key point detection result includes any one or more of the following:
the positions, in the images in the video, of the key points related to the corresponding relation;
preset numbers of the key points related to the corresponding relation.
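A keypoint detection result carrying both items named above — per-image positions and the preset numbers of the keypoints involved in the correspondence — could look like the following sketch. The numbering scheme and normalized coordinates are illustrative assumptions:

```python
# Sketch of a keypoint detection result: each entry pairs a preset
# keypoint number with its position in the current video image.
# Numbers 74/77 and normalized (x, y) coordinates are assumptions.
detection_result = {
    "frame_index": 12,
    "keypoints": [
        {"number": 74, "position": (0.42, 0.55)},  # e.g. a mouth keypoint
        {"number": 77, "position": (0.58, 0.55)},
    ],
}

numbers = [kp["number"] for kp in detection_result["keypoints"]]
assert numbers == [74, 77]
# Positions stay inside the image when expressed as normalized coordinates.
assert all(0.0 <= x <= 1.0
           for kp in detection_result["keypoints"]
           for x in kp["position"])
```

Carrying the preset numbers alongside positions is what lets the rendering engine match a detected point to the keypoint named in the correspondence.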
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the makeup and/or face-changing sub-element includes: a group of makeup and/or face changing sub-materials with preset playing time sequence.
Optionally, in the make-up and/or face-changing special effect generating device according to each of the above embodiments, the second generating module is specifically configured to:
determining the playing timing of the plurality of makeup and/or face changing sub-materials based on the file names of the makeup and/or face changing sub-materials in the group of makeup and/or face changing sub-materials; and
generating the special effect of the makeup and/or face changing sub-materials on the image according to the determined playing timing, based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: and the superposition mode parameter is used for representing the superposition mode of the makeup and/or face changing sub-materials.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: displaying parameters, wherein the display parameters are used for indicating whether the makeup and/or face changing sub-materials are displayed or not;
the second generating module is specifically configured to, when the parameter value of the display parameter is a parameter value for displaying the makeup and/or face changing sub-material, generate a special effect of the makeup and/or face changing sub-material on the image based on the correspondence, the key point in the image related to the correspondence, and the parameter value of the special effect parameter.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: a triggering mode parameter, wherein the triggering mode parameter is used for representing a triggering event for triggering and displaying the makeup and/or face changing sub-material;
the device further comprises:
the first detection module is used for detecting whether a trigger event corresponding to the parameter value of the trigger mode parameter appears in the image;
the second generation module is specifically configured to, in response to a trigger event corresponding to the parameter value of the trigger mode parameter appearing in the image, generate the special effect of the makeup and/or face changing sub-material on the image based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: a delay triggering parameter, wherein the delay triggering parameter is used for representing the time for delaying the display of the makeup and/or face changing sub-materials;
the second generating module is specifically configured to, in response to a display condition of the makeup and/or face changing sub-material being satisfied, delay generation of the special effect of the makeup and/or face changing sub-material on the image by the delayed playing time corresponding to the parameter value of the delay trigger parameter, based on the corresponding relation, the key points in the image related to the corresponding relation, and the parameter values of the special effect parameters; the display condition of the makeup and/or face changing sub-material being satisfied includes: the parameter value of the display parameter being a parameter value for displaying the makeup and/or face changing sub-material, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: the circulation parameter is used for representing the circulation playing times of the makeup and/or face changing sub-material;
the second generating module is specifically configured to, in response to a display condition of the makeup and/or face changing sub-material being satisfied, circularly display the makeup and/or face changing sub-material on the image, for the number of loops corresponding to the parameter value of the circulation parameter, based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters, so as to generate the special effect of the makeup and/or face changing sub-material; the display condition of the makeup and/or face changing sub-material being satisfied includes: the parameter value of the display parameter being a parameter value for displaying the makeup and/or face changing sub-material, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: a playing frame number parameter, used for representing the number of frames for which the makeup and/or face changing sub-material is played;
the second generating module is specifically configured to, in response to a playing condition of the makeup and/or face changing sub-material being satisfied, generate the special effect of the makeup and/or face changing sub-material on the images corresponding to the playing frame number in a video, according to the playing frame number corresponding to the parameter value of the playing frame number parameter, based on the corresponding relation, the key points in the image related to the corresponding relation and the parameter values of the special effect parameters; the playing condition of the makeup and/or face changing sub-material being satisfied includes: the parameter value of the display parameter being a parameter value for displaying the makeup and/or face changing sub-material, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
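The circulation parameter and the playing frame number parameter interact as follows: once the playing condition is met, the sub-material's frames play in sequence, the whole sequence repeats for the given loop count, and the effect then ends. A minimal sketch under those assumptions (function and parameter names are illustrative):

```python
# Sketch: combine the playing frame number parameter and the circulation
# parameter to decide which sub-material frame (if any) to show on a
# given video frame, counting from when the display condition was met.
# Names are illustrative assumptions.
def material_frame(video_frame, frames_per_play, loop_count):
    total = frames_per_play * loop_count
    if video_frame >= total:
        return None  # all loops finished: stop generating the effect
    # Wrap within one play so the sequence repeats on each loop.
    return video_frame % frames_per_play

# 3 frames per play, looped twice -> shown on video frames 0..5, then done.
assert [material_frame(i, 3, 2) for i in range(7)] == [0, 1, 2, 0, 1, 2, None]
```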
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: a trigger end parameter, used for representing a trigger event that ends display of the makeup and/or face changing sub-material;
the device further comprises:
the second detection module is used for detecting whether a trigger event corresponding to the parameter value of the trigger end parameter occurs or not;
the second generating module is further configured to stop generating the special effect of the makeup and/or face changing sub-material in response to the second detecting module detecting that the trigger event corresponding to the parameter value of the trigger end parameter occurs.
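A minimal sketch of the trigger-start/trigger-end behaviour described by the second detection module and second generating module above. The per-frame event sets and event names are hypothetical inputs, not the patent's data format:

```python
def frames_with_effect(frame_events, start_trigger, end_trigger):
    """Return indices of frames on which the sub-material effect is rendered.

    frame_events: list of per-frame sets of detected event names.
    The effect starts when start_trigger occurs and stops being generated
    as soon as the trigger-end event is detected.
    """
    active = False
    rendered = []
    for i, events in enumerate(frame_events):
        if not active and start_trigger in events:
            active = True
        # Stop generating the special effect once the trigger-end event occurs.
        if active and end_trigger in events:
            active = False
        if active:
            rendered.append(i)
    return rendered

events = [set(), {"open_mouth"}, set(), set(), {"close_mouth"}, set()]
print(frames_with_effect(events, "open_mouth", "close_mouth"))  # [1, 2, 3]
```

The frame carrying the end event itself is no longer rendered, which is one possible reading of "stop generating the special effect".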
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: a deformation special effect parameter used for representing a deformation effect of a deformation area generated on the image when the makeup and/or face changing sub-material is displayed;
the second generating module is further configured to generate a deformation effect of the deformation region in the image according to the deformation special effect parameter when generating a special effect of the makeup and/or face changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter value of the special effect parameter.
Optionally, in the makeup and/or face-changing special effect generating device according to each of the above embodiments, the special effect parameters include: a sticker special effect parameter used for representing a sticker special effect generated on the image when the makeup and/or face changing sub-material is displayed;
the second generating module is further configured to generate a special effect of the sub-material in the image according to the sticker special effect parameter when generating a special effect of the makeup and/or face changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameter.
According to still another aspect of an embodiment of the present invention, there is provided an electronic apparatus including:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, and the computer program, when executed, implements the method of any of the above embodiments of the invention.
According to a further aspect of an embodiment of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of the above-mentioned embodiments of the present invention.
According to a further aspect of an embodiment of the present invention, there is provided a computer program comprising computer instructions for implementing the method according to any one of the above embodiments of the present invention when the computer instructions are run in a processor of a device.
Based on the method and device for generating the makeup and/or face changing special effect program file package, the electronic device, the program and the medium provided by the embodiments of the present invention, when the makeup and/or face changing special effect program file package is generated, the reference image and the preset key points on the reference image are displayed, a makeup and/or face changing sub-material is imported, and the imported makeup and/or face changing sub-material is displayed; parameter values of the special effect parameters of the makeup and/or face changing sub-material are acquired, and a correspondence between the display attributes of the makeup and/or face changing sub-material and the key points within the display position coverage range is established; and a makeup and/or face changing special effect program file package is generated according to the makeup and/or face changing sub-material, the parameter values of the special effect parameters and the correspondence, so as to realize the makeup and/or face changing special effect on an image. In addition, when the makeup and/or face changing special effect program file package is generated, a makeup and/or face changing special effect program file executable by a rendering engine can be generated simply by importing the makeup and/or face changing sub-material to the corresponding display attributes, without manually writing a program file. The operation is simple and takes little time, the overall efficiency of realizing makeup and/or face changing special effects is improved, errors possibly introduced by manually writing a program file are avoided, and the accuracy of the makeup and/or face changing special effects is effectively guaranteed.
Based on the method and device for generating the makeup and/or face changing special effect, the electronic device, the program and the medium provided by the embodiments of the present invention, the makeup and/or face changing sub-material, the parameter values of its special effect parameters, and the correspondence between the display attributes of the makeup and/or face changing sub-material and the key points are acquired, wherein the display attributes include the size and the display position; and a special effect of the makeup and/or face changing sub-material is generated on the image based on the correspondence, the related key points in the image, and the parameter values of the special effect parameters. According to the embodiments of the present invention, makeup and/or face changing special effects can be generated on an image through preset makeup and/or face changing sub-materials, the parameter values of their special effect parameters, and the correspondence between the display attributes of the sub-materials and the key points, thereby enriching the overall atmosphere of image playback, enhancing the user's sense of entertainment and immersion, and improving the playing effect.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
The invention will be more clearly understood from the following detailed description, taken with reference to the accompanying drawings, in which:
fig. 1 is a flowchart of an embodiment of a method for generating a cosmetic and/or face-changing special effect program package according to the present invention.
Fig. 2 is an exemplary diagram of a makeup and/or face-changing sub-material according to an embodiment of the invention.
FIG. 3 is an exemplary diagram of facial keypoints in an embodiment of the invention.
Fig. 4 is an exemplary diagram of an operation interface of a device for generating a makeup and/or face-changing special effect program file package according to an embodiment of the present invention.
Fig. 5 is an exemplary schematic diagram of a special effect parameter setting interface in an embodiment of the present invention.
Fig. 6 is an exemplary diagram of hand movements in an embodiment of the present invention.
Fig. 7 is an exemplary diagram of a special deformation effect in the embodiment of the present invention.
FIG. 8 is an exemplary diagram of a special effect of a sticker in an embodiment of the present invention.
Fig. 9 is an exemplary diagram of a stroking effect in an embodiment of the present invention.
Fig. 10 is a flowchart of another embodiment of a method for generating a cosmetic and/or face-changing special effect program package according to the present invention.
Fig. 11 is a flowchart illustrating an embodiment of a method for generating a cosmetic and/or face-changing effect according to the present invention.
Fig. 12 is an exemplary diagram of generating effects of makeup and/or face-changing sub-materials on an image according to an embodiment.
Fig. 13 is a flowchart illustrating another embodiment of a method for generating a cosmetic and/or face-changing effect according to the present invention.
Fig. 14 is a schematic structural diagram of an embodiment of a device for generating a cosmetic and/or face-changing special effect program file package according to the present invention.
Fig. 15 is a schematic structural diagram of another embodiment of a device for generating a cosmetic and/or face-changing special effect program file package according to the present invention.
Fig. 16 is a schematic structural diagram of an embodiment of a makeup and/or face-changing effect generating device according to the present invention.
Fig. 17 is a schematic structural diagram of another embodiment of the makeup and/or face-changing effect generating device according to the present invention.
Fig. 18 is a schematic structural diagram of an embodiment of an electronic device according to the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Embodiments of the invention are operational with numerous other general purpose or special purpose computing system environments or configurations, and with numerous other electronic devices, such as terminal devices, computer systems, servers, etc. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with electronic devices, such as terminal devices, computer systems, servers, and the like, include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, networked personal computers, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
Fig. 1 is a flowchart of an embodiment of a method for generating a makeup and/or face-changing special effect program file package according to the present invention. The method may be performed, for example, but not limited to, by a device (hereinafter referred to in the embodiments of the present invention as the makeup and/or face-changing special effect program file package generation device). As shown in fig. 1, the method for generating a makeup and/or face-changing special effect program file package of this embodiment includes:
and 102, displaying the reference image and preset key points on the reference image.
The reference image comprises an image of at least a part of a person, for example any one or more of the following: a full-body image, a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, a foot image, and the like.
And 104, importing makeup and/or face changing sub-materials, and displaying the imported makeup and/or face changing sub-materials.
The makeup and/or face changing sub-material in each embodiment of the invention can be a picture or an animation composed of multiple pictures.
And 106, acquiring parameter values of special effect parameters of the makeup and/or face changing sub-materials, and establishing a corresponding relation between the display attributes of the makeup and/or face changing sub-materials and key points in the display position coverage range.
The special effect parameters comprise superposition mode parameters.
The display attributes include: size and/or display position, i.e.: the size and the display position may be included, or only the size or the display position may be included.
The correspondence between the display attributes of the makeup and/or face changing sub-material and the key points within the display position coverage range may be a correspondence between the display attributes and at least one key point (for example, two or more key points) within the display position coverage range, or a binding relationship between the display attributes and at least one key point within the display position coverage range. After the display attributes of the makeup and/or face changing sub-material are corresponded or bound to the key points within the display position coverage range, a close fit between the makeup and/or face changing sub-material and the part within the display position coverage range can be achieved.
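One way to picture the correspondence described above is as a small binding record that ties a sub-material's display attributes to the key points within its coverage range. The schema below is purely illustrative; the patent does not prescribe field names:

```python
from dataclasses import dataclass, field

@dataclass
class SubMaterialBinding:
    """Binds a sub-material's display attributes to the covered key points."""
    material: str                # sub-material file name
    size: tuple                  # (width, height) display size
    position: tuple              # (x, y) display position
    keypoint_ids: list = field(default_factory=list)  # key points in coverage

# A red-lip makeup sub-material bound to the upper- and lower-lip key points
# (ids 84-103 in the face key point numbering used later in this document):
lips = SubMaterialBinding(
    material="red_lip.png",
    size=(120, 60),
    position=(300, 420),
    keypoint_ids=list(range(84, 104)),
)
print(len(lips.keypoint_ids))  # 20
```

With such a binding, moving key points at runtime moves the sub-material with the part, giving the "close fit" effect described.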
And 108, generating a makeup and/or face changing special effect program file package according to the makeup and/or face changing sub-materials and the parameter values and the corresponding relation of the special effect parameters.
In the embodiment of the present invention, the makeup and/or face-changing special effect program file package may be used to perform special effect processing on an image, generate a special effect of a makeup and/or face-changing sub-material on the image, and perform rendering processing of an AR effect on a still image or a video image, for example, when the makeup sub-material is a red lip, a red lip may be generated on a lip of a person in the image.
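As an illustration of how a sub-material such as a red lip might be positioned from its bound key points, the display position can be derived from the key points within the coverage range, for example as their centroid. The helper and the coordinates are hypothetical, not the patent's implementation:

```python
def anchor_from_keypoints(keypoints, ids):
    """Place a sub-material at the centroid of its bound key points.

    keypoints: dict mapping key point id -> (x, y) image coordinates,
    as produced by a face detector on the processed image.
    """
    xs = [keypoints[i][0] for i in ids]
    ys = [keypoints[i][1] for i in ids]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Three mouth key points detected on a face image (made-up coordinates):
kp = {84: (100, 200), 90: (140, 200), 98: (120, 210)}
print(anchor_from_keypoints(kp, [84, 90, 98]))  # ≈ (120.0, 203.3)
```

A renderer would then draw the sub-material, scaled to its display size, centered at this anchor so the effect tracks the person's lips frame by frame.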
Based on the method for generating the special effect program file package for makeup and/or face change provided by the embodiment of the invention, when the special effect program file package for makeup and/or face change is generated, preset key points on a reference image and the reference image are displayed, a sub-material for makeup and/or face change is imported, and the imported sub-material for makeup and/or face change is displayed; acquiring parameter values of special effect parameters of the makeup and/or face changing sub-materials, and establishing a corresponding relation between display attributes of the makeup and/or face changing sub-materials and key points in a display position coverage range; and generating a makeup and/or face changing special effect program file package according to the makeup and/or face changing sub-materials and the parameter values and the corresponding relation of the special effect parameters so as to realize the makeup and/or face changing special effect on the image. In addition, when the makeup and/or face changing special effect program file package is generated, the makeup and/or face changing special effect program file executable by the rendering engine can be generated only by importing the makeup and/or face changing sub-materials to the corresponding display attributes without manually writing the program file, the operation is simple, the required time is short, the overall efficiency of realizing the makeup and/or face changing special effects is improved, the possible errors of manually writing the program file are avoided, and the accuracy of the makeup and/or face changing special effects is effectively guaranteed.
In embodiments of the present invention, the makeup and/or face-changing sub-materials may include: a makeup and/or face-changing sub-material for one part of a person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of a person. For example, the makeup and/or face-changing sub-material may be a makeup sub-material for the lips, a combination of makeup sub-materials for the eyebrows and eyelashes, a combination of makeup sub-materials for eye shadow and blush, or a combination of makeup sub-materials for all parts of the face. As shown in fig. 2, from left to right: an exemplary combination of makeup sub-materials for eyebrows and eyelashes, a combination of makeup sub-materials for eye shadow and blush, a makeup sub-material for the lips, and a combination of makeup sub-materials for the various parts of the whole face.
In each embodiment of the present invention, the positions of a plurality of key points may be preset, so as to correspond the display attributes of the makeup and/or face changing sub-material to the key points within the display position coverage range. For example, in one embodiment, a plurality of key points may be defined for a face and a gesture (hand) respectively based on face detection and gesture detection, so as to implement correspondence of a position relationship based on the face key points or the gesture key points in makeup and/or face changing special effect generation.
In some implementations of embodiments of the present invention, the predetermined key points may include, for example, but are not limited to, any one or more of the following: head keypoints, face keypoints, shoulder keypoints, arm keypoints, gesture keypoints, waist keypoints, leg keypoints, foot keypoints, human skeleton keypoints, and the like.
In one optional example, the head keypoints may include, for example, but are not limited to, at least one of: vertex key points, tip key points, and chin key points, etc.
In one optional example, the facial keypoints may include, for example, but are not limited to, at least one of: face contour keypoints, eye keypoints, eyebrow keypoints, nose keypoints, mouth keypoints, and so forth.
Illustratively, eye key points may include, for example, but are not limited to, at least one of: a left eye orbit keypoint, a left eye pupil center keypoint, a left eye center keypoint, a right eye orbit keypoint, a right eye pupil center keypoint, and a right eye center keypoint, and so on. Eyebrow key points may include, for example, but are not limited to, at least one of: left eyebrow keypoints and right eyebrow keypoints, and so on. The nose key points may include, for example, but are not limited to, at least one of: nasal bridge keypoints, nasal inferior border keypoints, and nasal lateral contour keypoints, among others. The mouth keypoints may include, for example, but are not limited to, at least one of: upper lip keypoints, and lower lip keypoints, and so on.
In one optional example, the shoulder keypoints may include, for example, but are not limited to, at least one of: a shoulder-head junction key point located at the junction of the shoulder and the head, and a shoulder-contour midpoint key point located at the midpoint between the arm root contour key point and the shoulder-head junction key point, and so on.
In one optional example, the arm keypoints may include, for example, but are not limited to, at least one of: wrist contour keypoints, elbow contour keypoints, arm root contour keypoints, forearm contour midpoint keypoints at a midpoint location between the wrist contour keypoints and the elbow contour keypoints, and forearm midpoint keypoints at a midpoint location between the elbow contour keypoints and the arm root contour keypoints, and so on.
In one optional example, the gesture keypoints may include, for example, but are not limited to, at least one of: four vertex keypoints of the gesture box (i.e., the gesture detection box), and a center keypoint of the gesture box, etc.
In one optional example, the leg keypoints may include, for example, but are not limited to, at least one of: a crotch key point, a knee contour key point, an ankle contour key point, a thigh root lateral contour key point, a calf contour midpoint key point located at the midpoint between the knee contour key point and the ankle contour key point, a thigh inner contour midpoint key point located at the midpoint between the knee contour key point and the crotch key point, and a thigh outer contour midpoint key point located at the midpoint between the knee contour key point and the thigh root lateral contour key point, and so on.
In one optional example, the waist keypoints may include, for example, but are not limited to, at least one of: the key points generated by dividing the segment between the thigh root lateral contour key point and the arm root contour key point into N equal parts, where N is greater than 1.
In one optional example, the foot keypoints may include, for example, but are not limited to, at least one of: toe keypoints and heel keypoints, and so on.
In one optional example, the human skeletal key points may include, but are not limited to, at least one of: a right shoulder skeleton key point, a right elbow skeleton key point, a right carpal skeleton key point, a left shoulder skeleton key point, a left elbow skeleton key point, a left carpal skeleton key point, a right hip skeleton key point, a right knee skeleton key point, a right ankle skeleton key point, a left hip skeleton key point, a left knee skeleton key point, a left ankle skeleton key point, a parietal skeleton key point, and a neck skeleton key point, and so forth.
For example, fig. 3 is an exemplary schematic diagram of face key points in an embodiment of the present invention, and in combination with fig. 3, in an alternative example, the face key points may be defined as follows:
Key point item                 Key point numbering
Face frame (face contour)      0-32
Left eyebrow                   33-37, 64-67
Right eyebrow                  38-42, 68-71
Nose bridge                    43-46
Lower edge of nose             47-51
Outer contour of nose          78-83
Left eye socket                52-57, 72-73
Right eye socket               58-63, 75-76
Left pupil                     74, 104
Right pupil                    77, 105
Upper lip                      84-90, 96-100
Lower lip                      91-95, 101-103
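For lookup purposes, the face key point numbering above can be expressed as index groups. This Python rendering of the table is given only as an illustration and is not part of the patent:

```python
# Face key point numbering, transcribed directly from the table above.
FACE_KEYPOINTS = {
    "face_contour": list(range(0, 33)),
    "left_eyebrow": list(range(33, 38)) + list(range(64, 68)),
    "right_eyebrow": list(range(38, 43)) + list(range(68, 72)),
    "nose_bridge": list(range(43, 47)),
    "nose_lower_edge": list(range(47, 52)),
    "nose_outer_contour": list(range(78, 84)),
    "left_eye_socket": list(range(52, 58)) + [72, 73],
    "right_eye_socket": list(range(58, 64)) + [75, 76],
    "left_pupil": [74, 104],
    "right_pupil": [77, 105],
    "upper_lip": list(range(84, 91)) + list(range(96, 101)),
    "lower_lip": list(range(91, 96)) + list(range(101, 104)),
}
print(len(FACE_KEYPOINTS["face_contour"]))  # 33
```

Such a table lets a makeup sub-material for, say, the lips resolve its bound key points by name rather than by raw index ranges.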
In one alternative example, the hand keypoints may be defined as follows:
Key point item    Key point numbering
Gesture frame     110-113
Center            114
The key points numbered 110 to 113 are the four vertices of the gesture detection box (i.e., the bounding frame of the hand), and the key point numbered 114 is the center of the gesture detection box.
In some implementations of the embodiments of the present invention, importing makeup and/or face-changing sub-materials may include: and receiving an import instruction input through an interactive interface of the operation bar, and importing makeup and/or face changing sub-materials in the material folder pointed by the import instruction.
The makeup and/or face-changing sub-materials in the embodiments of the present invention may be preset. For example, when the makeup and/or face-changing sub-materials are prepared in the computer graphics software Adobe Photoshop and exported to a material folder, the sub-materials may be exported separately according to the different superposition modes applied to the original image, or two or more makeup and/or face-changing sub-materials with the same superposition mode may be merged into one layer for export. For example, the makeup sub-materials for eyebrows and eyelashes may be combined into one layer and exported to a material folder, the eye shadow and blush makeup sub-materials may be combined into one layer and exported to a material folder, and the makeup sub-material for the lips may be exported as a separate layer.
In an implementation manner of the embodiments of the present invention, the makeup and/or face-changing special effect program file package generation device may include a preset special effect program file, which may be, for example, a lightweight JavaScript Object Notation (json) file, or any other executable program file. The parameter values of the special effect parameters in the special effect program file may be left empty or preset to default values; when a parameter value set for a special effect parameter of the makeup and/or face-changing sub-material is received, the corresponding parameter value in the special effect program file is automatically updated to the received value. Optionally, the generation device may include an operation bar provided with at least one interactive interface for receiving the parameter values set for the special effect parameters of the makeup and/or face-changing sub-material; in addition, the generation device may further include a program file display column for displaying the special effect program file. Fig. 4 is an exemplary diagram of the operation interface of the makeup and/or face-changing special effect program file package generation device according to an embodiment of the present invention, where the operation interface includes an operation bar and a program file display column.
After the makeup and/or face-changing special effect program file package generation device is started, corresponding to the special effect parameter setting interface of the makeup and/or face-changing sub-material in the operation bar, the program file display column displays the special effect program file with the special effect parameter values left empty or preset as default values. When a parameter value set for a special effect parameter of the makeup and/or face-changing sub-material is received through the interactive interface of the operation bar, that parameter value is updated to the most recently received value, and the program file display column displays the updated special effect program file in real time. It should be noted that the program file displayed in the program file display column in fig. 4 is only used to exemplarily illustrate the operation interface displayed by the makeup and/or face-changing special effect program file package generation device; the present invention does not concern, and does not limit, the representation form or specific content of the special effect program file.
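As a hedged illustration of this behaviour — a json special effect program file whose default parameter values are overwritten as values arrive from the interactive interface, with the updated text shown in the display column — the field names below are assumptions, since the patent does not fix a schema:

```python
import json

# A skeletal special effect program file with default parameter values
# (hypothetical structure; the patent only says it may be a json file).
effect_file = {
    "part": "mouth",
    "materials": ["lip001.png"],
    "params": {"tag": "overlay", "display": True, "trigger_type": None},
}

def set_param(effect, name, value):
    """Update one parameter value in the effect file, mirroring the
    automatic update described above, and return the json text that the
    program file display column would show."""
    effect["params"][name] = value
    return json.dumps(effect, indent=2)

print(set_param(effect_file, "trigger_type", "open_mouth"))
```

Serializing after every update is what makes the display column reflect the latest received value in real time.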
In an implementation manner of each embodiment of the method for generating a makeup and/or face-changing special effect program file package according to the present invention, operation 104 may include: receiving an import instruction sent through the interactive interface of the operation bar, and importing the makeup and/or face-changing sub-materials in the material folder pointed to by the import instruction.
As an optional but non-limiting example of the embodiments of the present invention, as shown in fig. 4, the operation bar may include a special effect parameter setting interface containing at least one interactive interface; it may also include other areas, for example a reference part display area, in which case the special effect parameter setting interface may be a special effect parameter setting interface under each reference part. The reference parts in the embodiments of the present invention may include, for example, but are not limited to, any one or more of the following: ear, hand, face, hair, neck, and limbs. Fig. 5 is an exemplary diagram of the special effect parameter setting interface of a makeup and/or face-changing (Makeup/Facetrans) sub-material when the reference part is the eye. In an optional example of the foregoing embodiment, receiving an import instruction input through the interactive interface of the operation bar and importing the makeup and/or face-changing sub-materials in the material folder pointed to by the import instruction may include: receiving a selection instruction sent through the interactive interface of the operation bar, taking the reference part selected by the selection instruction as the target part to which the makeup and/or face-changing special effect is currently to be added, and displaying the special effect parameter setting interface under the target part in the operation bar; and receiving an import instruction sent through an interactive interface in the special effect parameter setting interface, and importing the makeup and/or face-changing sub-materials in the material folder pointed to by the import instruction.
In some optional examples, importing makeup and/or face changing sub-materials in the material folder pointed to by the import instruction may include:
receiving an import instruction sent through an interactive interface, and acquiring and displaying a material folder pointed by the import instruction;
in response to receiving cosmetic and/or face changing sub-material selection operations in the material folder, importing one or more cosmetic and/or face changing sub-materials selected by the cosmetic and/or face changing sub-material selection operations; and/or
in response to receiving no makeup and/or face changing sub-material selection operation in the material folder, selecting one or more makeup and/or face changing sub-materials in the material folder according to a preset setting, and importing the selected makeup and/or face changing sub-materials;
when a plurality of makeup and/or face changing sub-materials in the material folder are selected and imported, the plurality of makeup and/or face changing sub-materials may form a group of makeup and/or face changing sub-materials with a preset playing sequence. The playing sequence of the sub-materials in the group may be determined based on their file names; for example, if the file names of the sub-materials in the group are face001, face003 and face009, the playing sequence may be determined, based on an exemplary preset rule, to be face001 → face003 → face009.
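The file-name-based playing sequence in the example above (face001 → face003 → face009) could be derived, for instance, by sorting on the numeric part of each name. This is one possible preset rule, not necessarily the one the patent intends:

```python
import re

def play_order(filenames):
    """Order a group of sub-materials by the number embedded in their
    file names, e.g. 'face001' before 'face009'."""
    def number(name):
        m = re.search(r"(\d+)", name)
        return int(m.group(1)) if m else 0  # names without digits sort first
    return sorted(filenames, key=number)

print(play_order(["face009", "face001", "face003"]))
# ['face001', 'face003', 'face009']
```

Parsing the digits as integers (rather than sorting lexically) keeps, say, face010 after face009.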
Each material folder may include a plurality of makeup and/or face-changing sub-materials; for example, if the target portion is a mouth, the material folder may include lip sub-materials of different shapes and colors. In an implementation manner of each embodiment of the present invention, when makeup and/or face-changing sub-materials are imported and no selection operation on the sub-materials in the material folder is received from the user, a preset position or a preset serial number in the material folder pointed to by the import instruction may be preset, and the makeup and/or face-changing sub-materials at that preset position or preset serial number are imported. For example, when the user selects no makeup and/or face-changing sub-material, the first makeup and/or face-changing sub-material in the material folder is selected and imported by default, so that a corresponding makeup and/or face-changing effect can still be achieved even though the user makes no selection.
Accordingly, in some optional examples, displaying the imported makeup and/or face changing sub-materials may include: displaying the imported makeup and/or face changing sub-materials on the target part according to the parameter value of the superposition mode parameter (tag) in the special effect parameters.
Wherein the parameter value of the superposition mode parameter defines the corresponding superposition mode. The superposition mode comprises a combination of a blend mode and/or a transparency in the computer graphics software Adobe Photoshop. The "Overlay" mode is one blend mode in Adobe Photoshop, present in the "Overlay" group of blend modes available for color, channel, and layer blending. Acting between an image pixel and the surrounding pixels, the "Overlay" mode increases or decreases image contrast: the base color determines the blending effect, and the lightness or darkness of the base color determines how the blend color is mixed. The "Overlay" blend mode generally produces no tonal-level overflow and no loss of image detail, and exchanging the positions of the base color and the blend color yields different result colors.
In some optional examples, the superposition mode may include, for example, but is not limited to, any one or more of the standard blend modes: normal, dissolve, multiply, linear burn, overlay, soft light, vivid light, hard light, linear light, luminosity, color, saturation, difference, and the like. The value of the transparency may be in [ 0%, 100% ]. A corresponding superposition mode is determined by selecting any one blend mode together with any one transparency value.
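For illustration, three of these blend modes can be written as per-channel formulas on values in [0, 1]. The formulas follow the commonly published Photoshop definitions; the function names are illustrative, not from the patent:

```python
def blend_normal(base, top, opacity):
    """'Normal' mode with transparency: plain alpha blend of the sub-material."""
    return base * (1.0 - opacity) + top * opacity

def blend_multiply(base, top):
    """'Multiply' mode: darkens by multiplying base and blend values."""
    return base * top

def blend_overlay(base, top):
    """'Overlay' mode: multiply in the shadows, screen in the highlights,
    which raises contrast; the base color decides which branch is taken."""
    if base < 0.5:
        return 2.0 * base * top
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - top)
```

Applying such a function per pixel channel, with the configured transparency, realizes the superposition of a sub-material on the target part.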
Makeup and/or face changing sub-materials suitable for different target parts may need different superposition modes to realize the corresponding makeup and/or face changing effects. Accordingly, the superposition modes of the makeup and/or face changing sub-materials suitable for different target parts can be preset, with the blend mode and/or transparency corresponding to each superposition mode set in advance, so that when the parameter value of the superposition mode parameter is set for a makeup and/or face changing sub-material imported for a target part, the superposition mode corresponding to that sub-material can be selected directly. An example of preset makeup and/or face changing sub-materials and their corresponding superposition modes is shown below:
Makeup and/or face changing sub-material (Face Transplant): Superposition mode
Eye (Makeup-eye): Normal, with transparency
Eyeshadow (Makeup-eyeshadow): Overlay
Blush (Makeup-blush): Multiply
Nose (Makeup-nose): Normal, with transparency
Lips (Makeup-lips): Overlay, hard light
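The presets above can be expressed as a lookup that an authoring tool might consult when a sub-material is imported for a target part. The dictionary encoding and the `default_mode` helper are illustrative assumptions, not the patent's file format:

```python
# Preset superposition modes per makeup/face-changing sub-material (illustrative).
PRESET_BLEND_MODES = {
    "Makeup-eye":       {"mode": "normal",   "transparency": True},
    "Makeup-eyeshadow": {"mode": "overlay",  "transparency": False},
    "Makeup-blush":     {"mode": "multiply", "transparency": False},
    "Makeup-nose":      {"mode": "normal",   "transparency": True},
    "Makeup-lips":      {"mode": "overlay",  "transparency": False},  # hard light is an alternative
}

def default_mode(sub_material):
    """Return the preset superposition mode for a sub-material, falling back to normal."""
    return PRESET_BLEND_MODES.get(sub_material, {"mode": "normal", "transparency": True})
```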
In some implementations of the embodiments of the present invention, obtaining parameter values of special effect parameters of make-up and/or face-changing sub-materials may include: responding to the received parameter value set for the special effect parameter of the makeup and/or face changing sub-material sent through the interactive interface of the operation bar, and taking the set parameter value as the parameter value of the special effect parameter of the makeup and/or face changing sub-material; and/or responding to the condition that the parameter value set for the special effect parameter of the makeup and/or face changing sub-material sent through the interactive interface of the operation bar is not received, and taking the preset parameter value as the parameter value of the special effect parameter of the makeup and/or face changing sub-material.
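The fallback rule just described (use the value received through the interactive interface if one was set, otherwise the preset value) can be sketched as follows; the function and its argument names are illustrative:

```python
def resolve_parameter(user_values, name, defaults):
    """Return the user-set value for a special effect parameter if received,
    otherwise the preset default value."""
    if name in user_values:   # value set via the operation-bar interactive interface
        return user_values[name]
    return defaults[name]     # no value received: fall back to the preset value
```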
According to the embodiment of the invention, the generation of the makeup and/or face changing special effect program package can be realized based on the selection operation of the user on the makeup and/or face changing sub-material and the setting operation of the parameter value in the operation bar without manually writing the program file, the operation is simple, the required time is short, the overall efficiency of the makeup and/or face changing special effect realization is improved, the possible errors of manually writing the program file are avoided, and the accuracy of the makeup and/or face changing effect is effectively ensured.
In some implementations of embodiments of the invention, the special effects parameters may include, for example, but are not limited to, any one or more of:
1. Display parameter (Display): used for indicating whether the makeup and/or face changing sub-material is displayed. The parameter values comprise two options, "Yes" and "No": when "Yes" is selected, the corresponding makeup and/or face changing sub-material needs to be displayed during image or video playing; when "No" is selected, the corresponding makeup and/or face changing sub-material does not need to be displayed during image or video playing;
2. Superposition mode parameter (Tag): used for representing the superposition mode of the makeup and/or face changing sub-material;
3. Trigger mode parameter (TriggerType): used for representing the trigger event that triggers display of the makeup and/or face changing sub-material, that is, which trigger event causes the sub-material to be displayed. The parameter value may be any trigger event, and the user may select at least one event from a preset event set as the trigger event. Namely: in the process of playing an image or video, when the corresponding trigger event is detected, display of the corresponding makeup and/or face changing sub-material may be triggered. For example, when the "open mouth" action specified in the trigger mode parameter is detected in the video, a red-lips makeup and/or face changing sub-material starts to be displayed. The starting display time, ending display time, display duration, and the like of the sub-material may be determined according to the parameter values of other parameters, for example, according to the parameter values of the delay trigger parameter, the trigger end parameter, and the loop parameter, respectively;
4. Loop parameter (TriggerLoop): used for indicating the number of times the makeup and/or face changing sub-material is cyclically displayed. A specific number of loop playing times may be set or selected as the parameter value, for example, 1, 5, etc.; a parameter value of 0 indicates infinite loop playing;
5. Play frame number parameter (Frames): used for representing the number of frames over which the makeup and/or face changing sub-material is displayed, that is, on how many frames of images the corresponding sub-material needs to be displayed during video playing. The starting display time of the sub-material may be determined according to the parameter values of other parameters, for example, according to the parameter values of the trigger mode parameter and the delay trigger parameter, respectively;
6. Delay trigger parameter (TriggerDelay): used for representing the time by which display of the makeup and/or face changing sub-material is delayed, namely: after the trigger event in the trigger mode parameter is detected, the sub-material is displayed after a delay of a certain time or a certain number of frames during video playing; the specific delay time or frame number may be set or selected as the parameter value;
7. Trigger end parameter (TriggerStop): used for representing the trigger event that ends display of the makeup and/or face changing sub-material, that is, which trigger event causes display of the sub-material to end. The parameter value may be any trigger event, and the user may select at least one event from a preset event set as the trigger event that ends display of the sub-material;
8. Deformation special effect parameter (Deformation): used for representing the deformation effect produced in a deformation area of the image when the makeup and/or face changing sub-material is displayed. It may include a display position parameter and a deformation effect parameter of the deformation area. The display position of the deformation area may be determined by the position of at least one predetermined key point corresponding to the deformation area, so the display position parameter may be the position or number of the corresponding key point; the deformation effect parameter may represent a deformation effect such as inward stretching or outward stretching. The parameter value of the deformation special effect parameter may include: the positions or numbers of the key points corresponding to the deformation area, and the parameter value of the deformation effect parameter. Setting these parameter values can be used to perform deformation processing on the image and generate the deformation effect (also called the morph effect) of the deformation area on the image, for example, performing AR-effect rendering on a video image;
9. Sticker special effect parameter: used for indicating the sub-material effect generated on the image when the makeup and/or face changing sub-material is displayed. It may include the display position and the play parameter of the sticker special effect sub-material (e.g., an earring, a hat, etc.). The display position of the sticker sub-material may be determined by the position of at least one predetermined key point corresponding to it, so the display position parameter may be the position or number of the corresponding key point; the play parameter may represent the play effect of the sticker sub-material (e.g., the number of loops, the number of play frames, and the like). The parameter value of the sticker special effect parameter may include: the positions or numbers of the key points corresponding to the sticker sub-material, and the parameter value of the play parameter. Setting these parameter values can be used to generate the sticker sub-material in the video or image, for example, playing an earring on a person's ear or a hat on the top of the head, and to perform AR-effect rendering on the video or image based on the parameter values of the sticker special effect parameter;
10. Stroke special effect parameter: used for indicating that a stroke (outline) effect is generated on the image when the makeup and/or face changing sub-material is displayed. It may include the target object to be stroked in the image (for example, a face, clothes, a hand, an ear, etc.) and the stroke effect parameter (for example, the thickness and color of the stroke). The parameter value of the stroke special effect parameter may include: the number or name of the target object, and the parameter values of the stroke effect parameter (for example, a thickness value and a color value of the stroke). Setting these parameter values can be used to perform stroke processing on the target object in the image, add a stroke to the target object, realize the stroke special effect, and perform AR-effect rendering on the target object in a video image; for example, to highlight a certain object in a game, a stroke effect can be added to a certain target object in the game.
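As a hypothetical illustration of how these parameters might be combined in a special effect program file for one sub-material (the patent mentions json as one possible format but does not fix a schema; all field values below are invented):

```python
import json

# Field names echo the parameter names enumerated above; values are illustrative.
lips_effect = {
    "name": "Makeup-lips",
    "Display": "Yes",              # whether to show the sub-material during playback
    "Tag": "overlay",              # superposition (blend) mode
    "TriggerType": "open_mouth",   # event that triggers the display
    "TriggerLoop": 0,              # number of display loops; 0 = infinite loop
    "Frames": 30,                  # number of image frames the sub-material spans
    "TriggerDelay": 5,             # delay (in frames) after the trigger fires
    "TriggerStop": "close_mouth",  # event that ends the display
}

print(json.dumps(lips_effect, indent=2))
```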
In some optional examples, the trigger event may include, but is not limited to, any one or more of the following:
no action trigger (NULL), i.e.: the sub-material is displayed without requiring any action;
eye movements, e.g., blinking, closing, opening, etc.;
head movements, such as pan, nod, tilt, turn, etc.;
eyebrow movements, for example, raising the eyebrows;
hand movements, for example, a two-handed heart gesture, a raised hand, an open palm, a thumbs-up, a blessing (fist-and-palm) gesture, a one-handed heart, an OK gesture, a scissors (V) gesture, a finger-gun gesture, a pointing index finger, etc. FIG. 6 is a schematic diagram illustrating an exemplary hand motion in an embodiment of the present invention;
mouth movements, e.g., mouth opening, mouth closing, etc.;
shoulder action, e.g., shoulder shrugging, etc.;
a special morph effect, for example, a special morph effect displayed on the face of a person, as shown in fig. 7, is an exemplary diagram of the special morph effect in the embodiment of the present invention;
pasting a special effect, for example, displaying a hat or the like on the top of a person's head, as shown in FIG. 8, is an exemplary illustration of a pasting special effect in an embodiment of the present invention;
a sound special effect, for example, a certain sound occurring in the video; fig. 9 is an exemplary schematic diagram of the sound special effect in the embodiment of the present invention;
a special effect of edge tracing, for example, an edge tracing occurs on a certain target object in a video;
other actions.
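As an illustrative sketch (not the patent's implementation) of how the trigger mode parameter could gate display of a sub-material once one of the above events is detected in the frame stream; event names are assumptions:

```python
def should_display(effect, detected_events, already_triggered):
    """Decide whether a sub-material should be shown given the events seen so far."""
    trigger = effect.get("TriggerType", "NULL")
    if trigger == "NULL":  # no-action trigger: display without requiring any action
        return True
    # otherwise display once the configured trigger event has been detected
    return already_triggered or trigger in detected_events
```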
In some optional examples of the embodiments of the present invention, establishing a correspondence between display attributes of makeup and/or face changing sub-materials and key points within a display position coverage range may include: and establishing a corresponding relation between the display attribute of the makeup and/or face changing sub-material and at least two key points in the display position coverage range.
Fig. 10 is a flowchart illustrating another embodiment of a method for generating a makeup and/or face-changing special effect program file package according to the present invention. As shown in fig. 10, the method for generating a makeup and/or face-changing special effect program file package according to this embodiment includes:
and 302, responding to the received starting instruction, and displaying an operation interface.
The operation interface includes: an operation bar, a content display bar and/or a program file bar.
And 304, displaying the reference image and the preset key points on the reference image in the content display column.
And 306, importing makeup and/or face changing sub-materials, and displaying the imported makeup and/or face changing sub-materials in the content display column.
For example, the imported makeup and/or face-changing sub-material may be displayed in the content display section in accordance with a display position and a display size of the makeup and/or face-changing sub-material set in advance.
308, obtaining parameter values of special effect parameters of the makeup and/or face changing sub-materials, and establishing a corresponding relation between the display attributes of the makeup and/or face changing sub-materials and key points in the display position coverage range.
The special effect parameters comprise superposition mode parameters. The display attributes include: size and display position.
In some embodiments of the present invention, after the content display bar displays the imported makeup and/or face-changing sub-material, the sub-material may be selected on the content display bar, after which its position may be moved or its size changed. Once the position and size of the makeup and/or face-changing sub-material are determined, its display attributes may be determined, the key points within the display position coverage range may be detected, and the display attributes of the sub-material may be put in correspondence with those key points, thereby achieving a close fit between the makeup and/or face-changing sub-material and the part within the display position coverage range.
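One possible realization of this correspondence is to store the sub-material's display position as offsets from the covered key points, so that the sub-material follows those key points when they move in a new frame. The anchoring and averaging scheme below is an assumption for illustration:

```python
def bind_to_keypoints(display_pos, keypoints_in_range):
    """Record the offset from each covered key point to the sub-material's position."""
    return {kp_id: (display_pos[0] - x, display_pos[1] - y)
            for kp_id, (x, y) in keypoints_in_range.items()}

def resolve_position(binding, detected_keypoints):
    """Re-derive the display position by averaging over the bound key points."""
    xs, ys = [], []
    for kp_id, (dx, dy) in binding.items():
        x, y = detected_keypoints[kp_id]
        xs.append(x + dx)
        ys.append(y + dy)
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

With this scheme, if the detected key points shift between frames, the resolved display position shifts with them, keeping the sub-material fitted to the part.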
And 310, generating a makeup and/or face changing special effect program file package according to the makeup and/or face changing sub-materials and the parameter values and the corresponding relation of the special effect parameters.
As shown in fig. 4, in some embodiments of the present invention, the operation interface may include three areas of a left side, a middle portion, and a right side. Accordingly, in operation 302, an operation bar is displayed on the left side of the operation interface, a content display bar is displayed in the middle of the operation interface, and a program file bar is displayed on the right side of the operation interface.
After the content display bar displays the imported makeup and/or face-changing sub-materials, the user can change the display attributes of the displayed makeup and/or face-changing sub-materials in the content display bar, including adjusting the display position or size of the displayed makeup and/or face-changing sub-materials.
Therefore, in another embodiment of the method for generating a makeup and/or face-changing special effect program file package of the present invention, the method may further include: and updating the display position and the corresponding relation of the makeup and/or face changing sub-material according to the position moving operation of the makeup and/or face changing sub-material received through the content display column.
In another embodiment of the method for generating a cosmetic and/or face-changing special effect program file package of the present invention, the method may further include: and updating the display size of the makeup and/or face changing sub-materials in the content display column according to the size adjustment operation of the makeup and/or face changing sub-materials received through the content display column.
For example, a user can select a makeup and/or face-changing sub-material displayed in the content display bar with the mouse, move the mouse to the small frame at the lower right corner of the sub-material, and zoom the sub-material by dragging that small frame, thereby adjusting its display size; the user can also select a displayed sub-material with the mouse and directly move it to the correct or desired position. During subsequent playing of the special effect program file package, the position and display proportion of the makeup and/or face-changing sub-material on the playing terminal are consistent with the position and display proportion in the content display column.

Based on any of the above embodiments of the present invention, a user may add special effects to multiple reference portions; for example, the user may take the ears, face, and hands as the target portions to which special effects currently need to be added and carry out any of the above embodiments, thereby achieving special effects of makeup and/or face-changing sub-materials on the ears, face, and hands.
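Keeping the playing terminal's position and display proportion consistent with the content display column can be achieved, for example, by storing the placement in coordinates normalized to the reference image and rescaling them to the target frame. This normalization scheme is an illustrative assumption:

```python
def normalize(pos, size, canvas):
    """Convert an absolute placement to coordinates relative to the editor canvas."""
    w, h = canvas
    return (pos[0] / w, pos[1] / h), (size[0] / w, size[1] / h)

def denormalize(npos, nsize, canvas):
    """Map a normalized placement onto the playback terminal's frame size."""
    w, h = canvas
    return (npos[0] * w, npos[1] * h), (nsize[0] * w, nsize[1] * h)
```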
When the user imports two or more makeup and/or face changing sub-materials, the display layers (i.e., the occlusion relationship) of the makeup and/or face changing sub-materials can be adjusted. Therefore, in another embodiment of the method for generating a makeup and/or face-changing special effect program file package of the present invention, the method may further include: adjusting the shielding relation between the two or more makeup and/or face changing sub-materials according to a layer parameter adjusting instruction sent by an interactive interface of the operation bar aiming at the two or more makeup and/or face changing sub-materials, and displaying the two or more makeup and/or face changing sub-materials according to the adjusted shielding relation and the parameter value of the superposition mode of the two or more makeup and/or face changing sub-materials.
As shown in fig. 4, make-up and/or face-changing sub-materials may be introduced through the interactive interface 20 in the left operation bar, the occlusion relationship between the layers of the make-up and/or face-changing sub-materials may be adjusted through the interactive interface 21, the layer parameters of each make-up and/or face-changing sub-material may be set, and the parameter values may be set for the special effect parameters of each make-up and/or face-changing sub-material through the interactive interface 22; the content display column takes the average face as a reference face, all the imported makeup and/or face changing sub-materials are directly displayed, the display positions of the makeup and/or face changing sub-materials can be moved through a mouse, and the sizes of small frames at the lower right corner of the makeup and/or face changing sub-materials are adjusted through the mouse; the program file display field on the right side is used for displaying the content of the playing program file of the makeup and/or face changing child material with the currently set parameter values through the display area 23, and the special effect program file package can be exported through the storage instruction interface 24 in the program file display field, that is: and generating and storing the special-effect program file package.
In addition, in another embodiment of the method for generating a makeup and/or face-changing special effect program file package according to the present invention, before generating the makeup and/or face-changing special effect program file package, the method may further include: and generating a special effect program file of the makeup and/or face changing sub-material according to the preset special effect program file and the parameter values and the corresponding relation of the special effect parameters of the makeup and/or face changing sub-material, and displaying the special effect program file of the makeup and/or face changing sub-material through a program file column.
The special effect program file may be, for example, a special effect program file generated by a json program or other executable program.
In another embodiment, after the generating of the makeup and/or face-changing special effect program file package according to the above embodiment of the present invention, the method may further include:
and saving the makeup and/or face changing special effect program file package at the position pointed by the saving instruction according to the received saving instruction.
In some embodiments, saving the makeup and/or face-changing special effect program file package at the position pointed by the saving instruction according to the received saving instruction may include:
in response to receiving a save instruction, displaying a save path selection interface and a compression interface;
receiving a save location sent through a save path selection interface; receiving a compression mode sent by a compression interface, and compressing the makeup and/or face changing special effect program file package according to the compression mode to generate a compressed file package;
and storing the compressed file package into the folder pointed by the saving position.
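A minimal sketch of this save step, under the assumption that the compressed package is an ordinary zip archive containing a json program file named effect.json (the patent does not name the archive format or file layout):

```python
import io
import json
import zipfile

def save_effect_package(path_or_file, program, sub_materials):
    """program: dict for the special effect program file; sub_materials: {name: bytes}."""
    with zipfile.ZipFile(path_or_file, "w", compression=zipfile.ZIP_DEFLATED) as pkg:
        # the program file (e.g. json) with parameter values and correspondences
        pkg.writestr("effect.json", json.dumps(program))
        # sub-material images are stored unchanged, at their original size
        for name, data in sub_materials.items():
            pkg.writestr(name, data)

# demonstrate with an in-memory buffer instead of a real save location
buf = io.BytesIO()
save_effect_package(buf, {"Display": "Yes"}, {"face001.png": b"\x89PNG"})
with zipfile.ZipFile(buf) as pkg:
    stored = sorted(pkg.namelist())
print(stored)
# prints ['effect.json', 'face001.png']
```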
When the makeup and/or face-changing special effect program file package is large, it is not suitable for running on a mobile phone terminal. In some embodiments of the present invention, only the makeup and/or face-changing special effect program file package itself is compressed, and the size of the makeup and/or face-changing sub-materials in the package is not changed, that is: the makeup and/or face-changing sub-materials in the package keep the size they had before being imported.
After the makeup and/or face-changing special effect program file package is generated based on the embodiments of the invention, the package can be imported into a terminal, and special effects can be generated on the video played by the terminal.
Fig. 11 is a flowchart illustrating an embodiment of a method for generating a cosmetic and/or face-changing effect according to the present invention. The makeup and/or face-changing special effect generating method according to each embodiment of the present invention may be implemented by a device (hereinafter, referred to as a makeup and/or face-changing special effect generating device according to the following embodiment of the present invention). As shown in fig. 11, the method for generating a special effect of makeup and/or face replacement according to this embodiment includes:
402, acquiring a makeup and/or face changing sub-material, a parameter value of a special effect parameter of the makeup and/or face changing sub-material, and a corresponding relation between a display attribute and a key point of the makeup and/or face changing sub-material.
The display attributes include: size and/or display position, i.e.: the size and the display position may be included, or only the size or the display position may be included.
The correspondence between the display attribute of the makeup and/or face changing sub-material and the key point in the display position coverage range may be a one-to-one correspondence between the display attribute and at least one key point (for example, two or more key points) in the display position coverage range, or a one-to-one binding relationship between the display attribute and at least one key point in the display position coverage range.
The makeup and/or face-changing sub-materials may include: makeup and/or face-changing sub-materials for one part of the person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of the person.
In some embodiments, the special effect parameters may include: and the superposition mode parameter is used for representing the superposition mode of the makeup and/or face changing sub-materials.
And 404, generating special effects of makeup and/or face changing sub-materials on the images based on the corresponding relation, the key points in the images related to the corresponding relation and the parameter values of the special effect parameters.
The images in embodiments of the present invention may include, but are not limited to, any one or more of the following: still images, images in video. Fig. 12 is a schematic diagram illustrating an exemplary effect of generating a makeup and/or face-changing sub-material on an image according to an embodiment.
Based on the method for generating the makeup and/or face-changing special effect, the makeup and/or face-changing sub-material and the parameter value of the special effect parameter thereof, and the corresponding relation between the display attribute of the makeup and/or face-changing sub-material and the key point are obtained, wherein the display attribute comprises the size and the display position; and generating a special effect of the makeup and/or face changing sub-material on the image based on the corresponding relation and the related key points in the image and the parameter values of the special effect parameters. According to the embodiment of the invention, the makeup and/or face changing special effects can be generated on the image through the preset makeup and/or face changing sub-materials and the parameter values of the special effect parameters thereof and the corresponding relation between the display attributes and the key points of the makeup and/or face changing sub-materials, so that the overall atmosphere effect of image playing is increased, the entertainment of a user is enhanced, the immersion of the user is improved, and the playing effect is improved.
Optionally, before the process of the embodiment of the method for generating a makeup and/or face-changing special effect of the present invention, the method may further include: importing a makeup and/or face changing special effect program file package; the makeup and/or face-changing special effect program file package comprises: the makeup and/or face changing sub-material, the parameter value of the special effect parameter of the makeup and/or face changing sub-material, and the corresponding relation between the display attribute and the key point of the makeup and/or face changing sub-material. Accordingly, in this embodiment, operation 402 may comprise: and acquiring makeup and/or face changing sub-materials, parameter values of the special effect parameters and corresponding relations from the makeup and/or face changing special effect program file package.
In one embodiment, importing the makeup and/or face-changing special effect program file package may include: reading the makeup and/or face changing special effect program file package into a memory by calling an interface function for reading the file package; analyzing the makeup and/or face changing special effect program file package to obtain a makeup and/or face changing sub-material and a special effect program file, wherein the special effect program file comprises parameter values of special effect parameters of the makeup and/or face changing sub-material.
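The import step can be mirrored under the same illustrative assumptions (zip archive, effect.json program file; neither is fixed by the patent): read the package into memory, then split it into the parsed program file and the sub-materials:

```python
import io
import json
import zipfile

def load_effect_package(data):
    """data: the package bytes after being read into memory."""
    with zipfile.ZipFile(io.BytesIO(data)) as pkg:
        # parse the program file to obtain the parameter values of the special effect parameters
        program = json.loads(pkg.read("effect.json"))
        # everything else in the package is treated as a sub-material
        materials = {n: pkg.read(n) for n in pkg.namelist() if n != "effect.json"}
    return program, materials

# build a tiny in-memory package to demonstrate the round trip
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as pkg:
    pkg.writestr("effect.json", json.dumps({"Display": "Yes", "Tag": "overlay"}))
    pkg.writestr("face001.png", b"\x89PNG")

program, materials = load_effect_package(buf.getvalue())
```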
The makeup and/or face-changing special effect program file package can be generated by adopting the method for generating the makeup and/or face-changing special effect program file package in any embodiment of the invention.
In addition, in another embodiment of the method for generating a makeup and/or face-changing effect of the present invention, the method may further include: and detecting the key points related to the corresponding relation of the image through a neural network, and outputting a key point detection result.
In some embodiments, the above-mentioned key point detection result may include, for example, but not limited to, any one or more of the following: the positions of the key points related to the corresponding relation in the images in the video; and presetting numbers of key points related to the corresponding relation.
In some implementations of the foregoing embodiments of the present invention, the makeup and/or face-changing sub-material may be a single makeup and/or face-changing sub-material, a combination of two or more makeup and/or face-changing sub-materials, or a group of makeup and/or face-changing sub-materials with a predetermined play timing.
Where there is a group of makeup and/or face-changing sub-materials having a predetermined play timing, in one optional example, operation 404 may comprise:
determining the playing time sequence of a plurality of makeup and/or face changing sub-materials based on the file names of the plurality of makeup and/or face changing sub-materials in a group of makeup and/or face changing sub-materials;
and generating the special effects of the makeup and/or face-changing sub-materials on the images according to the determined play timing, based on the correspondence, the key points in the images related to the correspondence, and the parameter values of the special effect parameters.
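The file-name-based ordering in the two steps above can be sketched as follows. The numeric-suffix naming convention is an assumption; the patent only states that play timing is determined from the file names.

```python
import re

def play_order(filenames):
    """Order a group of sub-materials by the index embedded in their file
    names (e.g. 'blush_0.png' .. 'blush_11.png'), an assumed convention."""
    def index_of(name):
        m = re.search(r"(\d+)", name)
        return int(m.group(1)) if m else 0
    return sorted(filenames, key=index_of)
```

Sorting numerically rather than lexicographically keeps `blush_10.png` after `blush_2.png`, which a plain string sort would get wrong.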
In some implementations of the foregoing embodiments of the present invention, the special effect parameters may include a display parameter, which is used to indicate whether the makeup and/or face-changing sub-material is displayed. Correspondingly, in the above embodiment, generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters may include: when the parameter value of the display parameter is the parameter value for displaying the sub-material, generating the special effect of the sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters.
Fig. 13 is a flowchart illustrating another embodiment of a method for generating a makeup and/or face-changing special effect according to the present invention. As shown in fig. 13, the method for generating a makeup and/or face-changing special effect of this embodiment includes:
502, acquiring a makeup and/or face-changing sub-material, a parameter value of a special effect parameter of the makeup and/or face-changing sub-material, and a corresponding relation between a display attribute and a key point of the makeup and/or face-changing sub-material.
The display attributes include: size and display position. The special effect parameters may include a trigger mode parameter, which is used to indicate the trigger event that triggers display of the makeup and/or face-changing sub-material.
504, detecting the key points related to the correspondence on the image through a neural network, and outputting a key point detection result.
506, in response to detecting, in the image, a trigger event corresponding to the parameter value of the trigger mode parameter, generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters.
In some implementations of the foregoing embodiments of the present invention, the special effect parameters may include a delay trigger parameter, which is used to indicate the time by which display of the makeup and/or face-changing sub-material is delayed. Correspondingly, in the above embodiment, generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters may include: in response to the display condition of the sub-material being met, generating, with a delay according to the delayed play time corresponding to the parameter value of the delay trigger parameter, the special effect of the sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters. The display condition of the sub-material being met includes: the parameter value of the display parameter indicating that the sub-material is displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
In some implementations of the foregoing embodiments of the present invention, the special effect parameters may include a loop parameter, which is used to indicate the number of times the makeup and/or face-changing sub-material is cyclically played. Correspondingly, in the above embodiment, generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters may include: cyclically displaying the sub-material on the image for the number of times corresponding to the parameter value of the loop parameter, based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters, so as to generate the special effect of the sub-material. The display condition of the sub-material being met includes: the parameter value of the display parameter indicating that the sub-material is displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
In some implementations of the foregoing embodiments of the present invention, the special effect parameters may include a play frame number parameter, which is used to indicate the number of frames for which the makeup and/or face-changing sub-material is played. Correspondingly, in the above embodiment, generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters may include: in response to the play condition of the sub-material being met, generating the special effect of the sub-material on the images in the video corresponding to the play frame number, according to the play frame number corresponding to the parameter value of the play frame number parameter, based on the correspondence, the key points in the images related to the correspondence, and the parameter values of the special effect parameters. The display condition of the sub-material being met includes: the parameter value of the display parameter indicating that the sub-material is displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
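The interplay of the delay trigger, loop, and play frame number parameters described above can be sketched as a per-frame scheduling function. Counting the delay in video frames (the patent also allows delay by time) and treating a loop count of 0 as "loop without limit" are assumed conventions for illustration.

```python
def frame_to_show(video_frame, trigger_frame, num_sub_frames,
                  trigger_delay=0, trigger_loop=1):
    """Decide which sub-material frame (if any) to render at a video frame,
    given the frame at which the trigger event occurred."""
    elapsed = video_frame - trigger_frame - trigger_delay
    if elapsed < 0:
        return None                      # still inside the delay window
    if trigger_loop and elapsed >= num_sub_frames * trigger_loop:
        return None                      # all loop plays are finished
    return elapsed % num_sub_frames      # current sub-material frame index
```

A renderer would call this once per video frame and skip drawing whenever it returns `None`.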
In addition, referring to fig. 13 again, in some implementations of the foregoing embodiments of the present invention, the special effect parameters may include a trigger end parameter, which is used to indicate the trigger event that ends display of the makeup and/or face-changing sub-material. Correspondingly, after the special effect of the makeup and/or face-changing sub-material is generated on the image, the method may further include:
and 508, detecting whether a trigger event corresponding to the parameter value of the trigger end parameter occurs.
510, in response to detecting the trigger event corresponding to the parameter value of the trigger end parameter, stopping generating the special effect of the makeup and/or face-changing sub-material.
If the trigger event corresponding to the parameter value of the trigger end parameter is not detected, the operation 510 is not executed.
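Operations 506 to 510 amount to a small start/stop state machine per sub-material. A minimal sketch, assuming trigger events are reported as plain name strings:

```python
class EffectState:
    """Start/stop state for one sub-material, driven by the trigger-mode
    and trigger-end parameter values (assumed to be event-name strings)."""
    def __init__(self, trigger_type, trigger_stop):
        self.trigger_type = trigger_type
        self.trigger_stop = trigger_stop
        self.active = False

    def on_events(self, detected_events):
        if not self.active and self.trigger_type in detected_events:
            self.active = True           # operation 506: start rendering
        elif self.active and self.trigger_stop in detected_events:
            self.active = False          # operation 510: stop rendering
        return self.active
```

Each video frame, the detected events for that frame are fed in, and the sub-material is drawn only while `active` holds.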
In some implementations of the foregoing embodiments of the present invention, the special effect parameters may include a deformation special effect parameter, which is used to indicate a deformation effect of generating a deformation area on the image when the makeup and/or face-changing sub-material is displayed. Correspondingly, in this embodiment, when the special effect of the makeup and/or face-changing sub-material is generated on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters, the deformation effect of the deformation area is also generated in the image according to the deformation special effect parameter.
In some implementations of the foregoing embodiments of the present invention, the special effect parameters may include a sticker special effect parameter, which is used to indicate a special effect of generating a sticker sub-material on the image when the makeup and/or face-changing sub-material is displayed. Correspondingly, in this embodiment, when the special effect of the makeup and/or face-changing sub-material is generated on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters, the special effect of the sticker sub-material is also generated in the image according to the sticker special effect parameter.
In some implementations of the foregoing method for generating a makeup and/or face-changing special effect, acquiring the makeup and/or face-changing sub-material, the parameter values of the special effect parameters of the sub-material, and the correspondence between the display attributes of the sub-material and the key points may include: creating a sticker handle by calling an interface function for creating a sticker handle; and reading the makeup and/or face-changing sub-material, the parameter values of its special effect parameters, and the correspondence between its display attributes and the key points, and storing them into the sticker handle.
Accordingly, in some other embodiments, generating a special effect of makeup and/or face changing sub-materials on an image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters may include: reading a parameter value of a special effect parameter, a corresponding relation between display attributes and key points of makeup and/or face changing sub-materials and the makeup and/or face changing sub-materials needing to be played from a sticker handle by calling an interface function for rendering the makeup and/or face changing materials; and generating special effects of makeup and/or face changing sub-materials on the image based on the parameter values and the corresponding relation.
In some optional examples of the foregoing embodiment, when the image is an image in a video, generating the special effect of the makeup and/or face-changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters may further include: acquiring, according to the parameter values of the special effect parameters, the video frame numbers on which the special effect of the sub-material is to be generated, and reading the images corresponding to these video frame numbers from the video in advance, so as to generate the special effect of the sub-material on the images based on the parameter values of the special effect parameters and the correspondence.
Further, in some other embodiments, the method may further include: in response to completion of playing of the makeup and/or face-changing special effect program file package, destroying the sticker handle by calling an interface function for destroying the sticker handle.
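The sticker-handle lifecycle described in the last few paragraphs (create the handle, fill it from the package, render per frame, destroy it after playback) can be sketched as a context manager. All names here are hypothetical stand-ins for the SDK interface functions the text mentions, not an actual API.

```python
class StickerHandle:
    """Sketch of the sticker-handle lifecycle: create, read package
    contents into the handle, render per frame, destroy at the end."""
    def __init__(self, package):
        # stand-ins for "create sticker handle" + "read into handle"
        self.materials = dict(package.get("sub_materials", {}))
        self.params = dict(package.get("params", {}))
        self.destroyed = False

    def render(self, frame):
        # stand-in for the rendering interface function: would overlay
        # the sub-materials on `frame` using params + keypoint mapping
        return frame if not self.destroyed else None

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.destroyed = True            # stand-in for "destroy sticker handle"
        return False
```

Using a `with` block guarantees the destroy step runs once playback of the package completes, mirroring the interface-function sequence above.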
The embodiments of the method for generating a makeup and/or face-changing special effect can be applied to various image or video playing scenarios, for example, a live video scenario containing persons: a makeup and/or face-changing special effect is generated for the live video, and the corresponding makeup and/or face-changing sub-materials are overlaid on the live video and played according to the makeup and/or face-changing special effect program file package, which enlivens the atmosphere, enhances entertainment for the client, and improves the sense of immersion.
Any of the methods provided by the embodiments of the present invention may be performed by any suitable device having data processing capabilities, including but not limited to: a terminal device, a server, and the like. Alternatively, any of the methods and special effect generation methods provided by the embodiments of the present invention may be executed by a processor; for example, the processor may execute any method mentioned in the embodiments of the present invention by calling corresponding instructions stored in a memory. Details are not described below.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Fig. 14 is a schematic structural diagram of an embodiment of a device for generating a makeup and/or face-changing special effect program file package according to the present invention. The device for generating a makeup and/or face-changing special effect program file package according to the embodiments of the present invention can be used to implement the corresponding method for generating a makeup and/or face-changing special effect program file package. As shown in fig. 14, the generating device of this embodiment may include: a display module, a first import module, a first acquisition module, an establishing module, and a first generation module. Wherein:
and the display module is used for displaying the reference image and preset key points on the reference image, and displaying the imported makeup and/or face changing sub-materials. The reference image comprises: reference to at least a portion of an image of a person, for example an image of any one or more of: a full image, a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, a foot image, and the like.
The first import module is used for importing makeup and/or face changing sub-materials. The makeup and/or face changing sub-material in each embodiment of the invention can be a picture or an animation composed of multiple pictures. In embodiments of the present invention, the makeup and/or face-changing sub-materials may include: makeup and/or face-changing sub-materials for one part of the person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of the person.
The first acquisition module is used for acquiring parameter values of special effect parameters of the makeup and/or face changing sub-materials, wherein the special effect parameters comprise superposition mode parameters.
The establishing module is used for establishing the corresponding relation between the display attribute of the makeup and/or face changing sub-material and the key point in the display position coverage range; the display attributes include: size and/or display location.
In some implementations of embodiments of the present invention, the predetermined key points may include, for example, but are not limited to, any one or more of the following: head keypoints, face keypoints, shoulder keypoints, arm keypoints, gesture keypoints, waist keypoints, leg keypoints, foot keypoints, human skeleton keypoints, and the like.
In one optional example, the head key points may include, for example but not limited to, at least one of: vertex key points, nose tip key points, chin key points, and the like.
In one optional example, the facial keypoints may include, for example, but are not limited to, at least one of: face contour keypoints, eye keypoints, eyebrow keypoints, nose keypoints, mouth keypoints, and so forth.
Illustratively, eye key points may include, for example, but are not limited to, at least one of: a left eye orbit keypoint, a left eye pupil center keypoint, a left eye center keypoint, a right eye orbit keypoint, a right eye pupil center keypoint, and a right eye center keypoint, and so on. Eyebrow key points may include, for example, but are not limited to, at least one of: left eyebrow keypoints and right eyebrow keypoints, and so on. The nose key points may include, for example, but are not limited to, at least one of: nasal bridge keypoints, nasal inferior border keypoints, and nasal lateral contour keypoints, among others. The mouth keypoints may include, for example, but are not limited to, at least one of: upper lip keypoints, and lower lip keypoints, and so on.
In one optional example, the shoulder key points may include, for example but not limited to, at least one of: a shoulder-head junction key point located at the junction of the shoulder and the head, and a shoulder-contour midpoint key point located at the midpoint between the arm-root contour key point and the shoulder-head junction key point, and so on.
In one optional example, the arm key points may include, for example but not limited to, at least one of: wrist contour key points, elbow contour key points, arm-root contour key points, forearm contour midpoint key points located at the midpoint between the wrist contour key points and the elbow contour key points, and upper-arm midpoint key points located at the midpoint between the elbow contour key points and the arm-root contour key points, and so on.
In one optional example, the gesture keypoints may include, for example, but are not limited to, at least one of: four vertex keypoints of the gesture box (i.e., the gesture detection box), and a center keypoint of the gesture box, etc.
In one optional example, the leg key points may include, for example but not limited to, at least one of: crotch key points, knee contour key points, ankle contour key points, thigh-root outer contour key points, calf contour midpoint key points located at the midpoint between the knee contour key points and the ankle contour key points, thigh inner contour midpoint key points located at the midpoint between the knee contour key points and the crotch key points, and thigh outer contour midpoint key points located at the midpoint between the knee contour key points and the thigh-root outer contour key points, and so on.
In one optional example, the waist key points may include, for example but not limited to: the division points generated by dividing the segment between the thigh-root outer contour key point and the arm-root contour key point into N equal parts, where N is greater than 1.
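The N-equal-division rule above is plain linear interpolation between two known key points. A short sketch, assuming 2-D coordinates and taking the N-1 interior division points as the waist key points (one possible reading of the rule):

```python
def waist_keypoints(thigh_root_outer, arm_root, n):
    """Generate waist key points by dividing the segment between the
    thigh-root outer contour key point and the arm-root contour key
    point into n equal parts (n > 1); returns the interior points."""
    (x0, y0), (x1, y1) = thigh_root_outer, arm_root
    return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n)
            for i in range(1, n)]
```

For n = 4 this yields the three quarter-points of the segment, evenly spaced along the torso side.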
In one optional example, the foot keypoints may include, for example, but are not limited to, at least one of: toe keypoints and heel keypoints, and so on.
In one optional example, the human skeletal key points may include, but are not limited to, at least one of: a right shoulder skeleton key point, a right elbow skeleton key point, a right carpal skeleton key point, a left shoulder skeleton key point, a left elbow skeleton key point, a left carpal skeleton key point, a right hip skeleton key point, a right knee skeleton key point, a right ankle skeleton key point, a left hip skeleton key point, a left knee skeleton key point, a left ankle skeleton key point, a parietal skeleton key point, and a neck skeleton key point, and so forth.
The first generation module is configured to generate the makeup and/or face-changing special effect program file package according to the makeup and/or face-changing sub-material, the parameter values of the special effect parameters, and the correspondence.
With the device for generating a makeup and/or face-changing special effect program file package provided by the embodiments of the present invention, when the package is generated, the reference image and the preset key points on the reference image are displayed; the makeup and/or face-changing sub-materials are imported and displayed; the parameter values of the special effect parameters of the sub-materials are acquired, and the correspondence between the display attributes of the sub-materials and the key points within the display position coverage range is established; and the makeup and/or face-changing special effect program file package is generated according to the sub-materials, the parameter values of the special effect parameters, and the correspondence, so as to realize the makeup and/or face-changing special effect on an image. In addition, when the package is generated, a makeup and/or face-changing special effect program file executable by a rendering engine can be produced simply by importing the sub-materials to the corresponding display attributes, without manually writing a program file. The operation is simple and quick, which improves the overall efficiency of realizing the makeup and/or face-changing special effect, avoids errors possibly introduced by manually writing the program file, and effectively guarantees the accuracy of the special effect.
In some embodiments, the first import module is specifically configured to receive an import instruction input through an interactive interface of the operation bar, and import the makeup and/or face-changing sub-materials in the material folder pointed to by the import instruction.
In some optional examples, the first import module is specifically configured to: receiving a selection instruction sent through an interactive interface of an operation bar, taking a reference part selected by the selection instruction as a target part needing to add makeup and/or change a special effect of a face at present, and displaying a special effect parameter setting interface under the target part in the operation bar; and receiving an import instruction sent through an interactive interface in the special effect parameter setting interface, and importing makeup and/or face changing sub-materials in the material folder pointed by the import instruction.
In some optional examples, when importing the makeup and/or face-changing sub-materials in the material folder pointed to by the import instruction, the first import module is specifically configured to: receive the import instruction sent through the interactive interface, and acquire and display the material folder pointed to by the import instruction; in response to receiving a sub-material selection operation in the material folder, import the one or more makeup and/or face-changing sub-materials selected by the selection operation; and/or, in response to not receiving a sub-material selection operation in the material folder, select one or more makeup and/or face-changing sub-materials in the material folder according to a preset setting and import them; where a plurality of makeup and/or face-changing sub-materials form a group of sub-materials with a predetermined play timing.
For example, the play timing of the plurality of makeup and/or face-changing sub-materials in the group is determined based on the file names of the plurality of sub-materials.
In some embodiments, when displaying the imported makeup and/or face-changing sub-material, the display module is specifically configured to display the imported sub-material on the target part according to the parameter value of the superposition mode parameter.
In some embodiments, the first obtaining module is specifically configured to: responding to the received parameter value set for the special effect parameter of the makeup and/or face changing sub-material sent through the interactive interface of the operation bar, and taking the set parameter value as the parameter value of the special effect parameter of the makeup and/or face changing sub-material; and/or responding to the condition that the parameter value set for the special effect parameter of the makeup and/or face changing sub-material sent through the interactive interface of the operation bar is not received, and taking the preset parameter value as the parameter value of the special effect parameter of the makeup and/or face changing sub-material.
In some implementations of embodiments of the invention, the special effects parameters may include, for example, but are not limited to, any one or more of:
1, Display parameter (Display): used to indicate whether the makeup and/or face-changing sub-material is displayed. 2, Superposition mode parameter (Tag): used to indicate the superposition mode of the makeup and/or face-changing sub-material.
3, Trigger mode parameter (TriggerType): used to indicate the trigger event by which display of the makeup and/or face-changing sub-material is triggered;
4, Loop parameter (TriggerLoop): used to indicate the number of times the makeup and/or face-changing sub-material is cyclically displayed; a specific number of loop plays can be set or selected as the parameter value;
5, Play frame number parameter (Frames): used to indicate the number of frames for which the makeup and/or face-changing sub-material is played, i.e., on how many image frames the corresponding sub-material needs to be displayed during video playing;
Delay trigger parameter (TriggerDelay): used to indicate the time by which display of the makeup and/or face-changing sub-material is delayed, namely: after the trigger event in the trigger mode parameter is detected, display of the sub-material is delayed by a certain time, or by a certain number of frames during video playing;
Trigger end parameter (TriggerStop): used to indicate the trigger event by which display of the makeup and/or face-changing sub-material is ended;
Deformation special effect parameter (Deformation): used to indicate a deformation effect of generating a deformation area on the image when the makeup and/or face-changing sub-material is displayed. It may include a display position parameter and a deformation effect parameter of the deformation area. The display position of the deformation area may be determined by the positions of at least one predetermined key point corresponding to the deformation area, so the display position parameter may be the positions or numbers of the corresponding key points; the deformation effect parameter may indicate a deformation effect such as inward stretching or outward stretching. The parameter value of the deformation special effect parameter may include: the positions or numbers of the key points corresponding to the deformation area, and the parameter value of the deformation effect parameter. By setting these parameter values, deformation processing can be applied to the image to generate the deformation effect of the deformation area on the image;
Sticker special effect parameter: used to indicate a special effect of generating a sticker sub-material on the image when the makeup and/or face-changing sub-material is displayed. It may include the display position and play parameters of the sticker special effect sub-material (e.g., an earring or a hat). The display position may be determined by the positions of at least one predetermined key point corresponding to the sticker sub-material, so the display position parameter may be the positions or numbers of the corresponding key points; the play parameter may indicate the play effect of the sticker sub-material (e.g., the number of loops and the number of play frames). The parameter value of the sticker special effect parameter may include: the positions or numbers of the key points corresponding to the sticker sub-material and the parameter values of the play parameters, which can be used to generate the sticker special effect sub-material in the video or image based on the parameter values of the sticker special effect parameter;
Stroke special effect parameter: used to indicate a stroke (outline) effect generated on the image when the makeup and/or face-changing sub-material is displayed. The stroke special effect parameter may include the target object to be stroked in the image (for example, a face, clothes, a hand, or an ear) and stroke effect parameters (for example, the thickness and color of the stroke). The parameter value of the stroke special effect parameter may include: the number or name of the target object and the parameter values of the stroke effect parameters (such as the thickness value and color value of the stroke). By setting these parameter values, stroke special effect processing can be performed on the target object in the image, adding a stroke to the target object to realize the stroke special effect and rendering an AR effect on the target object in the video image.
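Pulling the parameters above together, the parameter-value set for one sub-material might be serialized in the effect program file as follows. The key names mirror the identifiers listed above, but the concrete schema and value conventions are assumptions made for illustration, not the patent's normative format.

```python
import json

# Illustrative parameter values for one sub-material; key points are
# referenced by assumed numbers, and 0 for TriggerLoop is an assumed
# "loop forever" convention.
lipstick_params = {
    "Display": 1,                 # 1 = show the sub-material
    "Tag": "mouth",               # superposition mode / target part
    "TriggerType": "mouth_open",  # event that starts the display
    "TriggerLoop": 3,             # number of loop plays
    "Frames": 12,                 # sub-material frames to play
    "TriggerDelay": 5,            # delay (frames) before display
    "TriggerStop": "eye_blink",   # event that ends the display
    "Deformation": {"keypoints": [84, 90], "effect": "stretch_out"},
}

effect_program_file = json.dumps(lipstick_params, indent=2)
```

A rendering engine would parse this JSON out of the package and feed the values into the display/trigger logic described in the method embodiments.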
In some optional examples, the trigger event may include, but is not limited to, any one or more of the following: no motion trigger, eye motion, head motion, eyebrow motion, hand motion, mouth motion, shoulder motion, special deformation effect, special sticker effect, special sound effect, special edge tracing effect, and other motions.
In some implementations of the embodiments of the present invention, the establishing module is specifically configured to establish a correspondence between display attributes of the makeup and/or face changing sub-material and at least two key points within a display position coverage range.
Fig. 15 is a schematic structural diagram of another embodiment of a device for generating a makeup and/or face-changing special effect program file package according to the present invention. As shown in fig. 15, in the generating device of this embodiment, the display module includes: an operation interface displayed in response to a received start instruction. The operation interface may include: an operation bar, a content display bar, and/or a program file bar. Fig. 4 shows an optional example of the operation interface. Correspondingly, in this embodiment, the display module is specifically configured to display the reference image and the preset key points on the reference image in the content display bar, and to display the imported makeup and/or face-changing sub-materials in the content display bar.
In some implementations of the embodiments of the present invention, the operation interface includes three areas, namely a left area, a middle area and a right area, wherein the operation bar is displayed on the left side of the operation interface, the content display bar is displayed in the middle of the operation interface, and the program file bar is displayed on the right side of the operation interface.
In addition, referring to fig. 15 again, in another embodiment of the device for generating a makeup and/or face-changing special effect program file package, the device may further include: a first updating module, configured to update the display position and the corresponding relation of the makeup and/or face changing sub-material according to a position moving operation on the makeup and/or face changing sub-material received through the content display bar.
In addition, referring to fig. 15 again, in another embodiment of the device for generating a makeup and/or face-changing special effect program file package, the device may further include: a second updating module, configured to update the display size of the makeup and/or face changing sub-material in the content display bar according to a size adjusting operation on the makeup and/or face changing sub-material received through the content display bar.
In addition, referring to fig. 15 again, in another embodiment of the device for generating a makeup and/or face-changing special effect program file package, the device may further include: an adjusting module, configured to adjust the occlusion relation between two or more makeup and/or face changing sub-materials according to a layer parameter adjusting instruction, received through the interactive interface of the operation bar, that is issued for the two or more makeup and/or face changing sub-materials. Correspondingly, in this embodiment, the display module is further configured to display the two or more makeup and/or face changing sub-materials according to the adjusted occlusion relation and the parameter values of the superposition mode of the two or more makeup and/or face changing sub-materials.
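The occlusion adjustment described above amounts to ordering sub-materials by a layer parameter before display; a minimal sketch, assuming each sub-material carries a numeric `layer` value (an assumption, not the patent's actual format):

```python
def render_order(sub_materials):
    """Sort sub-materials so higher layer values occlude lower ones when drawn last."""
    return sorted(sub_materials, key=lambda m: m["layer"])

ordered = render_order([
    {"name": "eyeshadow", "layer": 2},
    {"name": "foundation", "layer": 0},
    {"name": "blush", "layer": 1},
])
names = [m["name"] for m in ordered]  # drawn bottom-up
```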
In addition, in the embodiments of the device for generating a makeup and/or face-changing special effect program file package, before the makeup and/or face-changing special effect program file package is generated, the first generation module may be further configured to generate a special effect program file of the makeup and/or face changing sub-material according to a preset special effect program file, the parameter values of the special effect parameters of the makeup and/or face changing sub-material, and the corresponding relation, and to display the special effect program file of the makeup and/or face changing sub-material through the program file bar. The special effect program file may, for example, be a special effect program file generated as a json program.
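Since the special effect program file may be generated as a json program, a purely illustrative serialization sketch (every key name here is an assumption, not the patent's actual schema) could look like:

```python
import json

def build_effect_program(sub_material, effect_params, correspondence):
    """Serialize one sub-material's description into a JSON program file string."""
    program = {
        "subMaterial": sub_material,       # e.g. file name of the sub-material
        "effectParams": effect_params,     # parameter values of the special effect parameters
        "correspondence": correspondence,  # display attribute -> key point bindings
    }
    return json.dumps(program, indent=2)

text = build_effect_program(
    "blush.png",
    {"blendMode": "multiply", "display": True},
    {"displayPosition": [33, 34], "size": [33, 34]},  # preset key point numbers
)
```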
In addition, referring to fig. 15 again, in another embodiment of the device for generating a makeup and/or face-changing special effect program file package, the device may further include: a saving module, configured to save the makeup and/or face-changing special effect program file package at the position pointed to by a save instruction according to the received save instruction.
In some implementations of embodiments of the present invention, the saving module is specifically configured to: in response to receiving a save instruction, displaying a save path selection interface and a compression interface; receiving a save location sent through a save path selection interface; receiving a compression mode sent by a compression interface, and compressing the makeup and/or face changing special effect program file package according to the compression mode to generate a compressed file package; and storing the compressed file package into the folder pointed by the saving position.
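The save flow above (pick a save location, pick a compression mode, write the compressed package into the folder pointed to by the save position) can be sketched with the standard library; the use of ZIP compression and the file layout are assumptions, not the patent's specified format:

```python
import os
import zipfile

def save_effect_package(file_paths, save_dir, package_name="effect_package.zip"):
    """Compress the program file package and store it in the chosen folder."""
    os.makedirs(save_dir, exist_ok=True)  # folder pointed to by the save location
    out_path = os.path.join(save_dir, package_name)
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in file_paths:
            zf.write(path, arcname=os.path.basename(path))
    return out_path
```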
In some implementations of embodiments of the invention, the size of the makeup and/or face change sub-material in the makeup and/or face change special effects program file package is maintained as the size of the makeup and/or face change sub-material before the makeup and/or face change sub-material is imported.
Fig. 16 is a schematic structural diagram of an embodiment of a makeup and/or face-changing effect generating device according to the present invention. The makeup and/or face-changing special effect generation device in each embodiment of the invention can be used for realizing the makeup and/or face-changing special effect generation method in each embodiment of the invention. As shown in fig. 16, the generating means of this embodiment may include: the device comprises a second acquisition module and a second generation module. Wherein:
the second acquisition module is configured to acquire the makeup and/or face changing sub-material, the parameter values of the special effect parameters of the makeup and/or face changing sub-material, and the corresponding relation between the display attributes of the makeup and/or face changing sub-material and the key points. The display attributes include: size and/or display position, that is, the display attributes may include both the size and the display position, or only the size or only the display position.
The correspondence between the display attributes of the makeup and/or face changing sub-material and the key points in the display position coverage range may be a correspondence between the display attributes and at least one key point (for example, two or more key points) in the display position coverage range, or a binding relationship between the display attributes and at least one key point in the display position coverage range. The makeup and/or face changing sub-material may include: a makeup and/or face-changing sub-material for one part of a person, and/or a combination of makeup and/or face-changing sub-materials for two or more parts of a person.
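A hedged sketch of such a binding (the key point numbers and attribute names are illustrative assumptions): the display attributes are stored together with the key points covered by the display position, so that the sub-material can later follow those points.

```python
def bind_display_attributes(size, display_position, covered_keypoints):
    """Bind display attributes to at least one key point in the coverage range."""
    if not covered_keypoints:
        raise ValueError("display position must cover at least one key point")
    return {
        "size": size,                          # e.g. (width, height) in pixels
        "displayPosition": display_position,   # e.g. (x, y) of the sub-material
        "keyPoints": list(covered_keypoints),  # preset key point numbers
    }

# bind a 120x60 material placed at (200, 180) to three covered key points
binding = bind_display_attributes((120, 60), (200, 180), [52, 53, 54])
```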
In some embodiments, the special effect parameters may include: and the superposition mode parameter is used for representing the superposition mode of the makeup and/or face changing sub-materials.
And the second generation module is used for generating special effects of makeup and/or face changing sub-materials on the images based on the corresponding relation, key points in the images related to the corresponding relation and parameter values of the special effect parameters.
The images may include, but are not limited to, any one or more of the following: still images, images in video.
Based on the makeup and/or face-changing special effect generating device provided by the above embodiment of the invention, the makeup and/or face changing sub-material, the parameter values of its special effect parameters, and the corresponding relation between the display attributes of the sub-material and the key points are acquired, wherein the display attributes include size and/or display position; and the special effect of the makeup and/or face changing sub-material is generated on the image based on the corresponding relation, the key points in the image related to the corresponding relation, and the parameter values of the special effect parameters. According to the embodiments of the invention, makeup and/or face-changing special effects can be generated on an image through preset makeup and/or face changing sub-materials, the parameter values of their special effect parameters, and the corresponding relation between the display attributes of the sub-materials and the key points, thereby enriching the overall atmosphere of image playing, enhancing entertainment, improving the user's immersion, and improving the playing effect.
Fig. 17 is a schematic structural diagram of another embodiment of the makeup and/or face-changing special effect generating device according to the present invention. As shown in fig. 17, compared with the embodiment shown in fig. 16, the makeup and/or face-changing special effect generating device of this embodiment may further include: a second import module, used for importing a makeup and/or face-changing special effect program file package; the makeup and/or face-changing special effect program file package includes: the makeup and/or face changing sub-material, the parameter values of the special effect parameters of the makeup and/or face changing sub-material, and the corresponding relation between the display attributes of the makeup and/or face changing sub-material and the key points. Optionally, the makeup and/or face-changing special effect program file package may be one generated by the method or the device for generating a makeup and/or face-changing special effect program file package according to any embodiment of the present invention.
Correspondingly, in this embodiment, the second obtaining module is specifically configured to obtain makeup and/or face-changing sub-materials, parameter values of the special effect parameters, and a corresponding relationship from the makeup and/or face-changing special effect program file package.
In addition, referring to fig. 17 again, in yet another embodiment of the makeup and/or face-changing special effect generating device, the device further includes: a key point detection module, configured to detect, through a neural network, the key points related to the corresponding relation in the image, and to output a key point detection result. The key point detection result may include, but is not limited to, any one or more of the following: the positions, in the images in the video, of the key points related to the corresponding relation; and the preset numbers of the key points related to the corresponding relation.
In some embodiments, the makeup and/or face changing sub-material includes: a group of makeup and/or face changing sub-materials with a preset playing time sequence. Accordingly, in some embodiments, the second generating module is specifically configured to: determine the playing time sequence of the plurality of makeup and/or face changing sub-materials based on the file names of the plurality of makeup and/or face changing sub-materials in the group; and generate the special effect of the makeup and/or face changing sub-materials on the image according to the determined playing time sequence, based on the corresponding relation, the key points in the image related to the corresponding relation, and the parameter values of the special effect parameters.
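Determining the playing time sequence from file names can be sketched as a numeric sort on a trailing index in each name (the `name_N.ext` naming scheme below is an assumption; the patent does not fix a scheme):

```python
import re

def play_order(file_names):
    """Order a group of sub-materials by the trailing number in their file names."""
    def frame_index(name):
        match = re.search(r"(\d+)\.\w+$", name)
        return int(match.group(1)) if match else 0
    return sorted(file_names, key=frame_index)

# numeric sort avoids the lexical trap where "10" would sort before "2"
order = play_order(["blush_10.png", "blush_2.png", "blush_1.png"])
```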
In some embodiments, the special effect parameters include: and the superposition mode parameter is used for representing the superposition mode of the makeup and/or face changing sub-materials.
In some embodiments, the special effect parameters include: and displaying parameters, wherein the display parameters are used for indicating whether makeup and/or face changing sub-materials are displayed or not. Correspondingly, in this embodiment, the second generating module is specifically configured to, when the parameter value of the display parameter is a parameter value for displaying make-up and/or face-changing sub-material, generate a special effect of the make-up and/or face-changing sub-material on the image based on the correspondence, the key point in the image involved in the correspondence, and the parameter value of the special effect parameter.
In some embodiments, the special effect parameters include: and the triggering mode parameter is used for representing a triggering event for triggering the display of makeup and/or face changing sub-materials. Accordingly, referring again to fig. 17, the apparatus of this embodiment further comprises: the first detection module is used for detecting whether a trigger event corresponding to the parameter value of the trigger mode parameter occurs in the image. Correspondingly, the second generating module is specifically configured to generate a special effect of the makeup and/or face changing sub-material on the image based on the corresponding relationship, the key point in the image related to the corresponding relationship, and the parameter value of the special effect parameter in response to detecting that a trigger event corresponding to the parameter value of the trigger mode parameter occurs in the image.
In some embodiments, the special effect parameters include: a delay trigger parameter, used for indicating a time by which display of the makeup and/or face changing sub-material is delayed. Correspondingly, in this embodiment, the second generating module is specifically configured to, in response to the display condition of the makeup and/or face changing sub-material being satisfied, generate the special effect of the makeup and/or face changing sub-material on the image after a delay, according to the delayed play time corresponding to the parameter value of the delay trigger parameter, based on the corresponding relation, the key points in the image related to the corresponding relation, and the parameter values of the special effect parameters; the display condition of the makeup and/or face changing sub-material being satisfied includes: the parameter value of the display parameter being a value for displaying the makeup and/or face changing sub-material, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
In some embodiments, the special effect parameters include: a loop parameter, used for representing the number of times the makeup and/or face changing sub-material is cyclically played. Correspondingly, in this embodiment, the second generating module is specifically configured to, in response to the display condition of the makeup and/or face changing sub-material being satisfied, cyclically display the makeup and/or face changing sub-material on the image according to the number of cycles corresponding to the parameter value of the loop parameter, based on the corresponding relation, the key points in the image related to the corresponding relation, and the parameter values of the special effect parameters, so as to generate the special effect of the makeup and/or face changing sub-material; the display condition of the makeup and/or face changing sub-material being satisfied includes: the parameter value of the display parameter being a value for displaying the makeup and/or face changing sub-material, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
In some embodiments, the special effect parameters include: a playing frame number parameter, used for representing the number of frames for which the makeup and/or face changing sub-material is played. Correspondingly, in this embodiment, the second generating module is specifically configured to, in response to the display condition of the makeup and/or face changing sub-material being satisfied, generate the special effect of the makeup and/or face changing sub-material on the images corresponding to the playing frame number in the video, according to the playing frame number corresponding to the parameter value of the playing frame number parameter, based on the corresponding relation, the key points in the images related to the corresponding relation, and the parameter values of the special effect parameters; the display condition of the makeup and/or face changing sub-material being satisfied includes: the parameter value of the display parameter being a value for displaying the makeup and/or face changing sub-material, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
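Taken together, the delay-trigger, loop, and playing-frame-number parameters decide on which video frames the sub-material appears; a minimal sketch under the assumption that all three quantities are counted in frames:

```python
def frames_to_render(delay_frames, loop_count, frames_per_loop, current_frame):
    """Decide whether the sub-material is shown on a given video frame.

    Combines the delay-trigger, loop, and playing-frame-number parameters:
    after `delay_frames`, the material plays `frames_per_loop` frames per
    loop, repeated `loop_count` times.
    """
    t = current_frame - delay_frames
    if t < 0:
        return False  # still within the delay window
    return t < loop_count * frames_per_loop

# material delayed 5 frames, played 3 frames per loop, looped twice
shown = [frames_to_render(5, 2, 3, f) for f in range(12)]
```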
In some embodiments, the special effect parameters include: and triggering an ending parameter, wherein the triggering ending parameter is used for indicating a triggering event for ending the display of the makeup and/or the face changing sub-material. Accordingly, referring again to fig. 17, the apparatus of this embodiment further comprises: and the second detection module is used for detecting whether a trigger event corresponding to the parameter value of the trigger ending parameter occurs or not. In this embodiment, the second generating module is further configured to stop generating the special effect of the makeup and/or face changing sub-material in response to the second detecting module detecting that the trigger event corresponding to the parameter value of the trigger end parameter occurs.
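The trigger-mode and trigger-end parameters together behave like a small display state machine; a hedged sketch (the event names are assumptions, not the patent's enumerated triggers):

```python
class EffectState:
    """Toggle sub-material display on start/end trigger events."""

    def __init__(self, start_event, end_event):
        self.start_event = start_event  # trigger-mode parameter value
        self.end_event = end_event      # trigger-end parameter value
        self.showing = False

    def observe(self, event):
        """Update and return display state for one detected event."""
        if event == self.start_event:
            self.showing = True
        elif event == self.end_event:
            self.showing = False
        return self.showing

state = EffectState("mouth_open", "eyes_closed")
trace = [state.observe(e) for e in ["head_turn", "mouth_open", "head_turn", "eyes_closed"]]
```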
In some embodiments, the special effect parameters include: and the deformation special effect parameter is used for representing the deformation effect of generating a deformation area on the image when the makeup is displayed and/or the face sub-material is changed. Correspondingly, in this embodiment, the second generating module is further configured to generate a deformation effect of the deformation region in the image according to the deformation special effect parameter when generating a special effect of make-up and/or face changing sub-material on the image based on the correspondence, the key point in the image related to the correspondence, and the parameter value of the special effect parameter.
In some embodiments, the special effect parameters include: and the paster special effect parameter is used for representing the special effect of generating the sub-materials on the image when the makeup is displayed and/or the sub-materials are changed. Correspondingly, in this embodiment, the second generating module is further configured to generate a special effect of the sub-material in the image according to the sticker special effect parameter when generating a special effect of make-up and/or face changing sub-material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameter.
In addition, another electronic device provided in an embodiment of the present invention includes:
a memory for storing a computer program;
and a processor, configured to execute the computer program stored in the memory, and when the computer program is executed, implement the method for generating a makeup and/or face-changing special effect program file package or the method for generating a makeup and/or face-changing special effect according to any embodiment of the present invention.
Fig. 18 is a schematic structural diagram of an embodiment of an electronic device according to the present invention. Referring now to fig. 18, shown is a schematic diagram of an electronic device suitable for implementing a terminal device or server of an embodiment of the present application. As shown in fig. 18, the electronic device includes one or more processors, a communication part, and the like, for example: one or more Central Processing Units (CPUs) and/or one or more Graphics Processing Units (GPUs), which may perform various appropriate actions and processes according to executable instructions stored in a Read-Only Memory (ROM) or loaded from a storage section into a Random Access Memory (RAM). The communication part may include, but is not limited to, a network card, which may include, but is not limited to, an IB (InfiniBand) network card. The processor may communicate with the read-only memory and/or the random access memory to execute the executable instructions, connect with the communication part through the bus, and communicate with other target devices through the communication part, so as to complete the operations corresponding to any method provided by the embodiments of the present application. For example: displaying a reference image and preset key points on the reference image, the reference image including at least a part of an image of a reference person; importing a makeup and/or face changing sub-material, and displaying the imported makeup and/or face changing sub-material; acquiring parameter values of special effect parameters of the makeup and/or face changing sub-material, the special effect parameters including a superposition mode parameter, and establishing a corresponding relation between the display attributes of the makeup and/or face changing sub-material and the key points in the display position coverage range, the display attributes including size and/or display position; and generating a makeup and/or face-changing special effect program file package according to the makeup and/or face changing sub-material, the parameter values of the special effect parameters, and the corresponding relation. For another example: acquiring a makeup and/or face changing sub-material, parameter values of special effect parameters of the makeup and/or face changing sub-material, and a corresponding relation between the display attributes of the makeup and/or face changing sub-material and key points, the display attributes including size and/or display position; and generating the special effect of the makeup and/or face changing sub-material on the image based on the corresponding relation, the key points in the image related to the corresponding relation, and the parameter values of the special effect parameters.
In addition, the RAM may also store various programs and data necessary for the operation of the device. The CPU, the ROM, and the RAM are connected to each other via a bus. When a RAM is present, the ROM is an optional module: the RAM stores executable instructions, or executable instructions are written into the ROM at runtime, and the executable instructions cause the processor to execute the operations corresponding to any of the above methods of the invention. An input/output (I/O) interface is also connected to the bus. The communication part may be integrated, or may be provided with a plurality of sub-modules (e.g., a plurality of IB network cards) separately connected to the bus.
The following components are connected to the I/O interface: an input section including a keyboard, a mouse, and the like; an output section including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section including a hard disk and the like; and a communication section including a network interface card such as a LAN card, a modem, or the like. The communication section performs communication processing via a network such as the internet. The drive is also connected to the I/O interface as needed. A removable medium such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive as necessary, so that a computer program read out therefrom is mounted into the storage section as necessary.
It should be noted that the architecture shown in fig. 18 is only an optional implementation manner, and in a specific practical process, the number and types of the components in fig. 18 may be selected, deleted, added or replaced according to actual needs; in different functional component settings, separate settings or integrated settings may also be used, for example, the GPU and the CPU may be separately set or the GPU may be integrated on the CPU, the communication part may be separately set or integrated on the CPU or the GPU, and so on. These alternative embodiments are all within the scope of the present disclosure.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium; the computer program comprises program code for performing the method illustrated in the flowchart, and the program code may include instructions corresponding to the steps of the methods provided by the embodiments of the present application. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program, when executed by the CPU, performs the above-described functions defined in the methods of the present application.
In addition, an embodiment of the present invention further provides a computer program, which includes computer instructions, and when the computer instructions are executed in a processor of a device, the method for generating a makeup and/or face-changing special effect program file package, or the method for generating a makeup and/or face-changing special effect according to any one of the above embodiments of the present invention is implemented.
In addition, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a method for generating a makeup and/or face-changing special effect program file package, or a method for generating a makeup and/or face-changing special effect according to any of the above embodiments of the present invention.
In the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts in the embodiments are referred to each other. For the system embodiment, since it basically corresponds to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The method and apparatus of the present invention may be implemented in a number of ways. For example, the methods and apparatus of the present invention may be implemented in software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustrative purposes only, and the steps of the method of the present invention are not limited to the order specifically described above unless specifically indicated otherwise. Furthermore, in some embodiments, the present invention may also be embodied as a program recorded in a recording medium, the program including machine-readable instructions for implementing a method according to the present invention. Thus, the present invention also covers a recording medium storing a program for executing the method according to the present invention.
The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the invention to the form disclosed. Many modifications and variations will be apparent to practitioners skilled in this art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, and to enable others of ordinary skill in the art to understand the invention in its various embodiments and with the various modifications suited to the particular use contemplated.

Claims (37)

1. A method for generating a special effect program file package is characterized by comprising the following steps:
displaying the reference image; the reference image includes: referencing at least a portion of an image of a person;
acquiring a special effect material and displaying the special effect material;
acquiring parameter values of special effect parameters of the special effect materials through an interactive interface, and establishing a corresponding relation between display attributes of the special effect materials and key points in a display position coverage range; the display attributes include: size and/or display location;
and generating a special effect program file package executable by a rendering engine according to the special effect material, the parameter value of the special effect parameter and the corresponding relation, wherein the special effect program file package is used for generating the special effect of the special effect material on the image when being executed.
2. The method of claim 1, wherein obtaining special effects material comprises:
and receiving an import instruction input through the interactive interface, and importing the special effect materials in the material folder pointed by the import instruction.
3. The method according to claim 2, wherein the receiving an import instruction input through the interactive interface, and importing the special effects material in the material folder pointed by the import instruction comprises:
receiving a selection instruction sent through the interactive interface, taking a reference part selected by the selection instruction as a target part to be added with a special effect, and displaying a special effect parameter setting interface under the target part;
and receiving an import instruction sent by the interactive interface in the special-effect parameter setting interface, and importing the special-effect materials in the material folder pointed by the import instruction.
4. The method of claim 2, wherein importing the special effects material in the material folder pointed to by the import instruction comprises:
receiving an import instruction sent through the interactive interface, and acquiring and displaying a material folder pointed by the import instruction;
in response to receiving a special effect material selection operation for the material folder, importing one or more special effect materials selected by the special effect material selection operation; and/or
in response to no special effect material selection operation for the material folder being received, selecting one or more special effect materials in the material folder according to a preset setting, and importing the selected special effect materials according to the preset setting;
wherein the plurality of special effect materials form a group of special effect materials with a predetermined playing time sequence.
5. The method according to claim 4, wherein the playback timing of a plurality of special effect materials in the set of special effect materials having a predetermined playback timing is determined based on file names of the special effect materials.
6. The method of claim 1, wherein the special effect parameters include a superposition mode parameter; the displaying the special effect material comprises:
and displaying the special effect material on the target part of the reference person in the reference image according to the parameter value of the superposition mode parameter.
7. The method according to claim 1, wherein the interactive interface is an interactive interface provided on an operation bar; the obtaining of the parameter value of the special effect parameter of the special effect material through the interactive interface includes:
in response to receiving, through the interactive interface of the operation bar, a parameter value set for the special effect parameter of the special effect material, taking the set parameter value as the parameter value of the special effect parameter of the special effect material; and/or
in response to not receiving, through the interactive interface of the operation bar, a parameter value set for the special effect parameter of the special effect material, taking a preset parameter value as the parameter value of the special effect parameter of the special effect material.
8. The method of claim 1, wherein establishing correspondence between display attributes of the special effects material and keypoints within the coverage of the display position comprises:
and establishing a corresponding relation between the display attribute of the special effect material and at least two key points in the display position coverage range.
9. The method of claim 1, further comprising:
in response to receiving a starting instruction, displaying an operation interface, wherein the operation interface comprises a content display bar;
the displaying of the special effect material comprises: displaying the special effect material in the content display bar.
10. The method according to claim 9, further comprising at least one of:
updating the display position of the special effect material and the correspondence according to a position-moving operation on the special effect material received through the content display bar;
updating the display size of the special effect material in the content display bar according to a size adjustment operation on the special effect material received through the content display bar;
adjusting the occlusion relationship between two or more special effect materials according to a layer parameter adjustment instruction for the two or more special effect materials received through the interactive interface, and displaying the two or more special effect materials according to the adjusted occlusion relationship and the parameter values of their superposition mode parameters.
11. The method of claim 1, wherein before generating the special effects program package executable by the rendering engine, the method further comprises:
generating a special effect program file of the special effect material according to a preset special effect program file, the parameter values of the special effect parameters of the special effect material, and the correspondence.
12. The method of claim 1, wherein after generating the special effects program package executable by the rendering engine, the method further comprises:
saving the special effect program file package at a location pointed to by a received save instruction.
13. The method of claim 12, wherein saving the special effect program file package at the location pointed to by the received save instruction comprises:
in response to receiving the save instruction, displaying a save path selection interface and a compression interface;
receiving a save location sent through the save path selection interface; receiving a compression mode sent through the compression interface, and compressing the special effect program file package according to the compression mode to generate a compressed file package;
storing the compressed file package in the folder pointed to by the save location.
14. The method of any of claims 1 to 13, wherein the at least a portion of the image of the reference person comprises any one or more of the following images: a full-body image, a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, a foot image.
15. The method of any of claims 1 to 13, wherein the special effect material comprises: special effect material for one part of a person, and/or a combination of special effect materials for two or more parts of a person.
16. The method of any of claims 1 to 13, wherein displaying the reference image comprises: displaying the reference image and preset key points on the reference image;
wherein the preset key points comprise any one or more of the following: head key points, face key points, shoulder key points, arm key points, gesture key points, waist key points, leg key points, foot key points, and human skeleton key points.
17. The method according to any of claims 1 to 13, wherein the special effect parameters comprise any one or more of:
a display parameter, used to indicate whether the special effect material is displayed;
a superposition mode parameter, used to indicate the superposition mode of the special effect material;
a trigger mode parameter, used to indicate a trigger event for displaying the special effect material;
a loop parameter, used to indicate the number of times the special effect material is displayed in a loop;
a playing frame number parameter, used to indicate the number of frames for which the special effect material is played;
a delayed trigger parameter, used to indicate the time by which display of the special effect material is delayed;
a trigger end parameter, used to indicate a trigger event for ending display of the special effect material;
a deformation special effect parameter, used to indicate a deformation effect that generates a deformation area on the image when the special effect material is displayed;
a sticker special effect parameter, used to indicate a special effect that generates a sub-material on the image when the special effect material is displayed;
a stroke special effect parameter, used to indicate a stroke special effect generated on the image when the special effect material is displayed.
18. The method of claim 17, wherein the trigger event comprises any one or more of: no action trigger, an eye action, a head action, an eyebrow action, a hand action, a mouth action, a shoulder action, a deformation special effect, a sticker special effect, a sound special effect, and a stroke special effect.
19. A special effect generation method, comprising:
importing a pre-generated special effect program file package which can be executed by a rendering engine;
acquiring, from the special effect program file package, the special effect material, the parameter values of the special effect parameters of the special effect material, and the correspondence between the display attributes of the special effect material and key points; the display attributes include: size and/or display position;
generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters.
20. The method of claim 19, wherein the special effect material comprises: special effect material for one part of a person, and/or a combination of special effect materials for two or more parts of a person.
21. The method according to claim 19, wherein the special effects program package is a special effects program package generated by the method according to any one of claims 1 to 18.
22. The method of claim 19, further comprising:
and detecting key points related to the corresponding relation of the image through a neural network, and outputting a key point detection result.
23. The method of claim 22, wherein the keypoint detection results comprise any one or more of:
the positions, in the image, of the key points related to the correspondence;
the preset numbers of the key points related to the correspondence.
24. The method of claim 19, wherein the special effect material comprises a group of special effect materials with a predetermined playing time sequence, and generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters comprises:
determining the playing time sequence of the plurality of special effect materials in the group based on the file names of the plurality of special effect materials;
generating the special effect of the special effect materials on the image in the determined playing time sequence, based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters.
25. The method according to any of claims 19-24, wherein the special effect parameters comprise: a superposition mode parameter, used to indicate the superposition mode of the special effect material.
26. The method according to any of claims 19-24, wherein the special effect parameters comprise: a display parameter, used to indicate whether the special effect material is displayed;
generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters comprises:
when the parameter value of the display parameter indicates that the special effect material is to be displayed, generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters.
27. The method according to any of claims 19-24, wherein the special effect parameters comprise: a trigger mode parameter, used to indicate a trigger event that triggers display of the special effect material;
the method further comprises: detecting whether a trigger event corresponding to the parameter value of the trigger mode parameter occurs in the image;
generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters comprises:
in response to detecting that a trigger event corresponding to the parameter value of the trigger mode parameter occurs in the image, generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters.
28. The method of claim 27, wherein the special effect parameters comprise: a delayed trigger parameter, used to indicate the time by which display of the special effect material is delayed;
generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters comprises:
in response to the display condition of the special effect material being met, generating the special effect of the special effect material on the image with a delay corresponding to the parameter value of the delayed trigger parameter, based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters;
wherein the display condition of the special effect material comprises: the parameter value of the display parameter indicating that the special effect material is to be displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
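The delayed-trigger behaviour of claim 28 reduces to a timing check once the display condition is met; a minimal sketch, assuming timestamps in seconds:

```python
def should_display(now, trigger_time, delay, display_condition=True):
    """Return True once the display condition has been met and the
    configured delay since the trigger event has elapsed."""
    if not display_condition or trigger_time is None:
        return False
    return (now - trigger_time) >= delay

# Trigger event fired at t=2.0 with a 0.5 s delay:
print(should_display(2.3, 2.0, 0.5))  # still within the delay window
print(should_display(2.6, 2.0, 0.5))  # delay elapsed, display begins
```

Per frame, the renderer would evaluate this gate before drawing the material.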
29. The method of claim 27, wherein the special effect parameters comprise: a loop parameter, used to indicate the number of times the special effect material is played in a loop;
generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters comprises:
in response to the display condition of the special effect material being met, cyclically displaying the special effect material on the image for the number of loops corresponding to the parameter value of the loop parameter, according to the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters, so as to generate the special effect of the special effect material;
wherein the display condition of the special effect material comprises: the parameter value of the display parameter indicating that the special effect material is to be displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
30. The method of claim 27, wherein the special effect parameters comprise: a playing frame number parameter, used to indicate the number of frames for which the special effect material is played;
generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters comprises:
in response to the display condition of the special effect material being met, generating the special effect of the special effect material on the images in the video corresponding to the playing frame number, according to the playing frame number corresponding to the parameter value of the playing frame number parameter, based on the correspondence, the key points in the images related to the correspondence, and the parameter values of the special effect parameters;
wherein the display condition of the special effect material comprises: the parameter value of the display parameter indicating that the special effect material is to be displayed, and/or a trigger event corresponding to the parameter value of the trigger mode parameter occurring.
31. The method according to any of claims 19-24, wherein the special effect parameters comprise: a trigger end parameter, used to indicate a trigger event for ending display of the special effect material;
the method further comprises:
detecting whether a trigger event corresponding to the parameter value of the trigger end parameter occurs;
in response to detecting a trigger event corresponding to the parameter value of the trigger end parameter, stopping generation of the special effect of the special effect material.
32. The method according to any of claims 19-24, wherein the special effect parameters comprise: a deformation special effect parameter, used to indicate a deformation effect that generates a deformation area on the image when the special effect material is displayed;
the method further comprises:
when generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters, generating the deformation effect of the deformation area in the image according to the deformation special effect parameter.
33. The method according to any of claims 19-24, wherein the special effect parameters comprise: a sticker special effect parameter, used to indicate a special effect of generating a sub-material on the image when the special effect material is displayed;
the method further comprises:
when generating the special effect of the special effect material on the image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters, generating the special effect of the sub-material in the image according to the sticker special effect parameter.
34. An apparatus for generating a special effect program package, comprising:
a display module, configured to display a reference image, the reference image comprising at least a portion of an image of a reference person, and to display the obtained special effect material;
a first import module, configured to obtain the special effect material;
a first acquisition module, configured to obtain parameter values of special effect parameters of the special effect material through an interactive interface;
an establishing module, configured to establish a correspondence between display attributes of the special effect material and key points within the coverage of the display position, the display attributes comprising: size and/or display position;
a first generation module, configured to generate, according to the special effect material, the parameter values of the special effect parameters, and the correspondence, a special effect program file package executable by a rendering engine, the special effect program file package being used, when executed, to generate the special effect of the special effect material on an image.
35. A special effect generation apparatus, comprising:
a second import module, configured to import a pre-generated special effect program file package executable by a rendering engine;
a second acquisition module, configured to acquire, from the special effect program file package, the special effect material, the parameter values of the special effect parameters of the special effect material, and the correspondence between the display attributes of the special effect material and key points, the display attributes comprising: size and/or display position;
a second generation module, configured to generate the special effect of the special effect material on an image based on the correspondence, the key points in the image related to the correspondence, and the parameter values of the special effect parameters.
36. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing a computer program stored in the memory, and when executed, implementing the method of any of the preceding claims 1-33.
37. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of the preceding claims 1 to 33.
CN202110977497.2A 2018-05-02 2018-05-02 Method and device for generating special-effect program file package and special effect Pending CN113658298A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110977497.2A CN113658298A (en) 2018-05-02 2018-05-02 Method and device for generating special-effect program file package and special effect

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810411198.0A CN108711180B (en) 2018-05-02 2018-05-02 Method and device for generating makeup and/or face-changing special effect program file package and method and device for generating makeup and/or face-changing special effect
CN202110977497.2A CN113658298A (en) 2018-05-02 2018-05-02 Method and device for generating special-effect program file package and special effect

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810411198.0A Division CN108711180B (en) 2018-05-02 2018-05-02 Method and device for generating makeup and/or face-changing special effect program file package and method and device for generating makeup and/or face-changing special effect

Publications (1)

Publication Number Publication Date
CN113658298A true CN113658298A (en) 2021-11-16

Family

ID=63868672

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110977497.2A Pending CN113658298A (en) 2018-05-02 2018-05-02 Method and device for generating special-effect program file package and special effect
CN201810411198.0A Active CN108711180B (en) 2018-05-02 2018-05-02 Method and device for generating makeup and/or face-changing special effect program file package and method and device for generating makeup and/or face-changing special effect

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810411198.0A Active CN108711180B (en) 2018-05-02 2018-05-02 Method and device for generating makeup and/or face-changing special effect program file package and method and device for generating makeup and/or face-changing special effect

Country Status (1)

Country Link
CN (2) CN113658298A (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109167936A (en) * 2018-10-29 2019-01-08 Oppo广东移动通信有限公司 A kind of image processing method, terminal and storage medium
CN111444743A (en) * 2018-12-27 2020-07-24 北京奇虎科技有限公司 Video portrait replacing method and device
CN109803165A (en) * 2019-02-01 2019-05-24 北京达佳互联信息技术有限公司 Method, apparatus, terminal and the storage medium of video processing
CN110070592B (en) * 2019-02-28 2020-05-05 北京字节跳动网络技术有限公司 Generation method and device of special effect package and hardware device
US11074733B2 (en) 2019-03-15 2021-07-27 Neocortext, Inc. Face-swapping apparatus and method
CN110321849B (en) * 2019-07-05 2023-12-22 腾讯科技(深圳)有限公司 Image data processing method, device and computer readable storage medium
CN110503724A (en) * 2019-08-19 2019-11-26 北京猫眼视觉科技有限公司 A kind of AR expression resource construction management system and method based on human face characteristic point
CN110704007A (en) * 2019-09-27 2020-01-17 成都星时代宇航科技有限公司 Double-picture display method and device, terminal and storage medium
CN112784622B (en) * 2019-11-01 2023-07-25 抖音视界有限公司 Image processing method and device, electronic equipment and storage medium
CN113709383A (en) * 2020-05-21 2021-11-26 北京字节跳动网络技术有限公司 Method, device and equipment for configuring video special effects and storage medium
CN112486263A (en) * 2020-11-30 2021-03-12 科珑诗菁生物科技(上海)有限公司 Eye protection makeup method based on projection and projection makeup dressing wearing equipment
CN113240777A (en) * 2021-04-25 2021-08-10 北京达佳互联信息技术有限公司 Special effect material processing method and device, electronic equipment and storage medium
CN115239845A (en) * 2021-04-25 2022-10-25 北京字跳网络技术有限公司 Method, device, equipment and medium for generating special effect configuration file
CN113709549A (en) * 2021-08-24 2021-11-26 北京市商汤科技开发有限公司 Special effect data packet generation method, special effect data packet generation device, special effect data packet image processing method, special effect data packet image processing device, special effect data packet image processing equipment and storage medium
CN113760161A (en) * 2021-08-31 2021-12-07 北京市商汤科技开发有限公司 Data generation method, data generation device, image processing method, image processing device, equipment and storage medium
CN113938618A (en) * 2021-09-29 2022-01-14 北京达佳互联信息技术有限公司 Special effect manufacturing method and device, electronic equipment and storage medium

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101436306A (en) * 2008-12-19 2009-05-20 北京中星微电子有限公司 Method, apparatus and video display card for implementing image special effect
US20130007669A1 (en) * 2011-06-29 2013-01-03 Yu-Ling Lu System and method for editing interactive three-dimension multimedia, and online editing and exchanging architecture and method thereof
CN104469179A (en) * 2014-12-22 2015-03-25 杭州短趣网络传媒技术有限公司 Method for combining dynamic pictures into mobile phone video
CN104778712A (en) * 2015-04-27 2015-07-15 厦门美图之家科技有限公司 Method and system for pasting image to human face based on affine transformation
CN104899825A (en) * 2014-03-06 2015-09-09 腾讯科技(深圳)有限公司 Method and device for modeling picture figure
CN105451090A (en) * 2014-08-26 2016-03-30 联想(北京)有限公司 Image processing method and image processing device
CN105493152A (en) * 2013-07-22 2016-04-13 株式会社得那 Image processing device and image processing program
CN105791692A (en) * 2016-03-14 2016-07-20 腾讯科技(深圳)有限公司 Information processing method and terminal
CN105975935A (en) * 2016-05-04 2016-09-28 腾讯科技(深圳)有限公司 Face image processing method and apparatus
CN106204696A (en) * 2016-07-05 2016-12-07 网易(杭州)网络有限公司 A kind of specially good effect implementation method and device
CN106296781A (en) * 2015-05-27 2017-01-04 深圳超多维光电子有限公司 Specially good effect image generating method and electronic equipment
CN106373170A (en) * 2016-08-31 2017-02-01 北京云图微动科技有限公司 Video making method and video making device
CN107343220A (en) * 2016-08-19 2017-11-10 北京市商汤科技开发有限公司 Data processing method, device and terminal device
CN107551549A (en) * 2017-08-09 2018-01-09 广东欧珀移动通信有限公司 Video game image method of adjustment and its device
WO2018033154A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Gesture control method, device, and electronic apparatus
CN107820027A (en) * 2017-11-02 2018-03-20 北京奇虎科技有限公司 Video character dress-up method, apparatus, computing device and computer storage medium
CN107945219A (en) * 2017-11-23 2018-04-20 翔创科技(北京)有限公司 Face image alignment schemes, computer program, storage medium and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5029852B2 (en) * 2010-01-07 2012-09-19 花王株式会社 Makeup simulation method
CN102708575A (en) * 2012-05-17 2012-10-03 彭强 Daily makeup design method and system based on face feature region recognition
CN103731583B (en) * 2013-12-17 2016-05-18 四川金手指时代投资管理有限公司 Intelligent synthetic, print processing method is used for taking pictures
CN104123749A (en) * 2014-07-23 2014-10-29 邢小月 Picture processing method and system
CN107424115B (en) * 2017-05-31 2020-10-27 成都品果科技有限公司 Skin color correction algorithm based on face key points
CN107274493B (en) * 2017-06-28 2020-06-19 河海大学常州校区 Three-dimensional virtual trial type face reconstruction method based on mobile platform


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHIHU: "How to evaluate Facebook's developer conference F8 2017?", HTTPS://WWW.ZHIHU.COM/QUESTION/58692500/, pages 274 - 275 *

Also Published As

Publication number Publication date
CN108711180B (en) 2021-08-06
CN108711180A (en) 2018-10-26

Similar Documents

Publication Publication Date Title
CN108711180B (en) Method and device for generating makeup and/or face-changing special effect program file package and method and device for generating makeup and/or face-changing special effect
CN108259496B (en) Method and device for generating special-effect program file package and special effect, and electronic equipment
CN108388434B (en) Method and device for generating special-effect program file package and special effect, and electronic equipment
CN108280883B (en) Method and device for generating special-effect-of-deformation program file package and method and device for generating special effect of deformation
CN109035373B (en) Method and device for generating three-dimensional special effect program file package and method and device for generating three-dimensional special effect
KR102241153B1 (en) Method, apparatus, and system generating 3d avartar from 2d image
CN108399654B (en) Method and device for generating drawing special effect program file package and drawing special effect
CN108986227B (en) Particle special effect program file package generation method and device and particle special effect generation method and device
CN117218250A (en) Animation model generation method and device
KR20240030109A (en) An electronic apparatus for providing avatar based on an user's face and a method for operating the same
CN117830527A (en) Digital person customizable portrait implementing method, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination