WO2019154339A1 - Method and apparatus for generating a special effect program file package and for generating special effects, and electronic device - Google Patents
- Publication number
- WO2019154339A1 (PCT/CN2019/074503)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sub
- materials
- key point
- parameter
- program file
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
- H04N21/4351—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream involving reassembling additional data, e.g. rebuilding an executable program from recovered modules
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/60—Software deployment
- G06F8/61—Installation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/70—Software maintenance or management
- G06F8/72—Code refactoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2213/00—Indexing scheme for animation
- G06T2213/12—Rule based animation
Definitions
- the present application relates to computer vision technology, and in particular to methods and devices for generating a special effect program file package and for generating special effects, and to an electronic device.
- Augmented Reality (AR) is a technology that "seamlessly" integrates real-world information and virtual-world information: it simulates entity information within a certain time and space of the real world, superimposes virtual information onto it, and applies that virtual information to the real world, so that real-world characters, environments, and virtual objects are superimposed in real time on the same picture or in the same space, achieving a sensory experience beyond reality.
- the embodiment of the present application provides a technical solution for generating a special effect program file package and a technical solution for generating a special effect.
- a method for generating a special effect program file package includes: importing a set of sub-materials, where the set of sub-materials includes a plurality of sub-materials;
- acquiring parameter values of play parameters of the set of sub-materials; and
- generating a special effect program file package according to the set of sub-materials and the parameter values of the play parameters.
- the plurality of sub-materials have a predetermined playing sequence.
- the playing timing of the plurality of sub-materials is determined based on file names of the plurality of sub-materials.
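The claimed generation flow (import a set of sub-materials, acquire parameter values of their play parameters, generate the package) can be sketched as follows. The function name, the json layout, and the use of a zip archive as the "package" are illustrative assumptions; the patent does not fix a concrete on-disk format.

```python
import json
import os
import zipfile

def generate_effect_package(material_dir, play_params, out_path):
    """Sketch of the claimed flow: import a set of sub-materials from a
    material folder, attach the parameter values of the play parameters,
    and emit a special effect program file package (here assumed to be a
    zip archive containing a json program file plus the sub-materials)."""
    # Import the set of sub-materials (each image file is one sub-material).
    sub_materials = sorted(
        f for f in os.listdir(material_dir)
        if f.lower().endswith((".png", ".jpg"))
    )
    # The program file records the sub-materials and their play parameters.
    program = {"sub_materials": sub_materials, "play_params": play_params}
    with zipfile.ZipFile(out_path, "w") as pkg:
        pkg.writestr("program.json", json.dumps(program, indent=2))
        for name in sub_materials:
            pkg.write(os.path.join(material_dir, name), arcname=name)
    return out_path
```

A rendering engine can then read `program.json` from the package to drive playback, without any hand-written program file.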
- a special effect generating method includes: obtaining parameter values of play parameters of at least one set of sub-materials in a special effect program file package, where the set of sub-materials includes a plurality of sub-materials;
- performing key point detection on a video image; and
- generating a special effect based on the at least one set of sub-materials on the video image according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials.
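The generation step, anchoring the sub-materials to the detected key points, can be sketched as follows. The key-point naming and the centring rule are assumptions; the actual key point detector is outside this sketch.

```python
def place_effect(key_points, anchor, material_size):
    """Given the key points detected on the video image (a mapping from
    key-point name to (x, y) pixel coordinates), return the top-left
    corner at which to draw the current sub-material so that it is
    centred on the chosen anchor key point."""
    x, y = key_points[anchor]
    w, h = material_size
    return (x - w // 2, y - h // 2)  # top-left corner for drawing
```

Repeating this per frame, with the sub-material chosen according to the play parameters, yields the dynamic effect following the reference part.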
- a device for generating a special effect program file package includes:
- a first import module configured to import a set of sub-materials; the set of sub-materials includes a plurality of sub-materials;
- a first acquiring module configured to acquire a parameter value of a playing parameter of the set of sub-materials
- a first generating module configured to generate a special effect program file package according to the set of sub-materials and the parameter values of the playing parameters.
- a special effect generating apparatus including:
- a second obtaining module configured to obtain a parameter value of a playing parameter of at least one set of sub-materials in the special effect program file package; wherein the set of sub-materials includes a plurality of sub-materials;
- a first detecting module configured to perform key point detection on the video image
- a second generating module configured to generate an effect based on the at least one set of sub-materials on the video image according to the detected key points and parameter values of the playing parameters of the at least one set of sub-materials.
- an electronic device including:
- a memory for storing a computer program
- a processor configured to execute the computer program stored in the memory; when the computer program is executed, the method described in any one of the embodiments of the present application is implemented.
- a computer readable storage medium having stored thereon a computer program, which when executed by a processor, implements the method of any of the embodiments of the present application.
- a computer program comprising computer instructions that, when executed in a processor of a device, implement the method of any of the embodiments of the present application.
- according to the method and device for generating a special effect program file package, the electronic device, the program, and the medium provided by the foregoing embodiments of the present application: when generating the special effect program file package, a group of sub-materials is imported, the group including a plurality of sub-materials; the parameter values of the play parameters of the group are acquired; and the special effect program file package is generated according to the group of sub-materials and the parameter values of the play parameters, so that dynamic special effect processing can be performed on a video based on the special effect program file package and dynamic special effects can be realized on the played video.
- the embodiments of the present application can generate the special effect program file executable by a rendering engine without manually writing a program file; the operation is simple and takes little time, which improves the overall efficiency of dynamic special effect implementation and effectively ensures the accuracy of the special effects.
- the method and device for generating a special effect, the electronic device, the program, and the medium provided by the foregoing embodiments of the present application obtain parameter values of the play parameters of at least one set of sub-materials in the special effect program file package, where each set of sub-materials includes multiple sub-materials; perform key point detection on the video image; and generate an effect based on the at least one set of sub-materials on the video image according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials.
- a dynamic special effect is thus generated on the video by using the parameter values of the play parameters of at least one set of sub-materials in the pre-generated special effect program file package together with the key points in the video image, so that dynamic special effect playing is implemented in the video and the playback effect of the video is improved.
- FIG. 1 is a flowchart of an embodiment of a method for generating a special effect program file package of the present application.
- FIG. 2 is a diagram showing an example of an operation interface of a device for generating a special effect program file package according to an embodiment of the present application.
- FIG. 3 is an exemplary schematic diagram of a play parameter setting interface of a group of sub-materials when the reference part is a hand in the embodiment of the present application.
- FIG. 4 is an exemplary schematic diagram of a hand motion in an embodiment of the present application.
- FIG. 5 is an exemplary schematic diagram of key points of a face in an embodiment of the present application.
- FIG. 6 is a flowchart of another embodiment of a method for generating a special effect program file package of the present application.
- FIG. 7 is a flowchart of an embodiment of a method for generating a special effect of the present application.
- FIG. 8 is a flowchart of another embodiment of a method for generating a special effect of the present application.
- FIG. 9 is a schematic structural diagram of an embodiment of a device for generating a special effect program file package according to the present application.
- FIG. 10 is a schematic structural diagram of another embodiment of a device for generating a special effect program file package according to the present application.
- FIG. 11 is a schematic structural diagram of an embodiment of a special effect generating apparatus of the present application.
- FIG. 12 is a schematic structural diagram of another embodiment of a special effect generating apparatus of the present application.
- FIG. 13 is a schematic structural diagram of an application embodiment of an electronic device according to the present application.
- "a plurality" may mean two or more, and "at least one" may mean one, two, or more.
- the term "and/or" in the disclosure merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may indicate three cases: A exists alone, A and B exist simultaneously, or B exists alone.
- the character "/" in the present application generally indicates that the associated objects before and after it are in an "or" relationship.
- Embodiments of the present application can be applied to electronic devices such as terminal devices, computer systems, servers, etc., which can operate with numerous other general purpose or special purpose computing system environments or configurations.
- Examples of well-known terminal devices, computing systems, environments, and/or configurations suitable for use with electronic devices such as terminal devices, computer systems, and servers include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, networked personal computers, small computer systems, mainframe computer systems, and distributed cloud computing environments including any of the above.
- Electronic devices such as terminal devices, computer systems, servers, etc., can be described in the general context of computer system executable instructions (such as program modules) being executed by a computer system.
- program modules may include routines, programs, target programs, components, logic, data structures, and the like that perform particular tasks or implement particular abstract data types.
- the computer system/server can be implemented in a distributed cloud computing environment where tasks are performed by remote processing devices that are linked through a communication network.
- program modules may be located on a local or remote computing system storage medium including storage devices.
- FIG. 1 is a flowchart of an embodiment of a method for generating a special effect program file package of the present application.
- the method for generating the special effect program file package in each embodiment of the present application can be implemented by, for example, but not limited to, a single device (referred to in the embodiments of the present application as a device for generating a special effect program file package).
- the method for generating the special effect program file package of the embodiment includes:
- a plurality of sub-materials in a set of sub-materials have predetermined playback timings.
- the playing timing of the plurality of sub-materials in the set of sub-materials may be determined based on the file names of the plurality of sub-materials.
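Determining the playing timing from the file names, as the embodiment above describes, can be sketched as a natural sort on the trailing number in each name. The naming rule (a numeric suffix before the extension) is an assumption; the patent does not mandate a specific convention.

```python
import re

def play_order_from_names(file_names):
    """Derive the playing timing of the sub-materials from their file
    names by sorting on the trailing number in each name, so that
    'fx_2.png' plays before 'fx_10.png' (plain lexicographic sorting
    would invert them)."""
    def key(name):
        # Capture the digits immediately before the file extension.
        m = re.search(r"(\d+)(?=\.\w+$)", name)
        return int(m.group(1)) if m else 0
    return sorted(file_names, key=key)
```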
- the operation 102 may be performed by a processor invoking a corresponding instruction stored in a memory or by a first import module executed by the processor.
- the operation 104 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a first acquisition module executed by the processor.
- the operation 106 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a first generation module executed by the processor.
- a set of sub-materials may be imported, or multiple sets of sub-materials may be imported.
- operations 102-104 may be performed for each group of sub-materials, and operation 106 is then performed for the plurality of groups, generating a special effect program file package from the plurality of groups of sub-materials and the parameter values of their play parameters.
- a special effect program file package may include one group of sub-materials, or may include multiple groups of sub-materials.
- the special effect program file package can be used for special effect processing of a video, generating a dynamic special effect of a set of sub-materials on the video, for example, an AR effect rendered by an AR engine or an electronic device having an AR rendering function.
- according to the method for generating the special effect program file package provided by the above embodiment of the present application, when generating the special effect program file package, a set of sub-materials is imported, the set including a plurality of sub-materials; the parameter values of the play parameters of the set are obtained; and the special effect program file package is generated according to the set of sub-materials and the parameter values of the play parameters, so that dynamic special effect processing can be performed on a video based on the special effect program file package and dynamic special effects can be realized on the played video.
- the embodiment of the present application can generate the special effect program file executable by the rendering engine without manually writing the program file; the operation is simple, the time required is short, the overall efficiency of dynamic special effect realization is improved, and the accuracy of the special effect is effectively guaranteed.
- optionally, the device for generating the special effect program file package may include a preset effect program file, which may be, for example, a file in JavaScript Object Notation (json), a lightweight data exchange format based on the JavaScript language, or any other executable program file.
- the parameter values of the play parameters in the special effect program file may be vacant or preset to default values.
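A preset json program file with every play parameter at a default value might be built as below. The parameter names (`Display`, `Interval`, `TriggerType`, `Loop`) and the default values are illustrative assumptions; the patent only states that values may be vacant or preset to defaults.

```python
import json

DEFAULT_PLAY_PARAMS = {
    # Illustrative defaults; the patent does not specify concrete values.
    "Display": "Yes",
    "Interval": 1,
    "TriggerType": None,  # vacant until the user sets a trigger action
    "Loop": 0,            # 0 taken here to mean infinite loop playback
}

def preset_program_file(group_name):
    """Build the preset json-format effect program file for one group of
    sub-materials, with every play parameter at its default value."""
    return json.dumps({group_name: dict(DEFAULT_PLAY_PARAMS)}, indent=2)
```

When the user sets a parameter value through the operation bar, the corresponding entry is overwritten and the file is re-serialized for display.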
- the generating device of the special effect program file package may include an operation bar, where the operation bar is provided with at least one interaction interface for receiving parameter values set for the play parameters of a group of sub-materials; in addition, the generating device may further include a program file display field for displaying the program file of the play parameters of the set of sub-materials, as shown in FIG. 2, which is an example of an operation interface of the device for generating the special effect program file package in the embodiment of the present application.
- the operation interface of the generating device of the special effect program file package includes an operation bar and a program file display column.
- when the play parameters of the set of sub-materials are vacant or preset to default values, the program file display column displays the corresponding special effect program file; when a parameter value set for a play parameter of the group of sub-materials is received through the interactive interface of the operation bar, the parameter value of that play parameter is updated to the most recently received value, and the program file display column displays the program file with the updated parameter value in real time.
- optionally, the operation 102 may include: receiving an import instruction sent through an interaction interface of the operation bar, and importing a plurality of sub-materials in the material folder pointed to by the import instruction as the above set of sub-materials.
- optionally, the operation bar may include a play parameter setting interface including at least one interactive interface, and may further include other areas, such as a reference part display area; in this case, the play parameter setting interface may be a play parameter setting interface under each reference part.
- Reference sites in various embodiments of the present application may include, but are not limited to, any one or more of the following: ear, hand, face, hair, neck, limb.
- FIG. 3 is an exemplary schematic diagram of a play parameter setting interface of a group of sub-materials when the reference part is a hand in the embodiment of the present application.
- optionally, receiving an import instruction input through an interaction interface of the operation bar, and importing a plurality of sub-materials in the material folder pointed to by the import instruction, may include: receiving an import instruction sent through the interactive interface in the play parameter setting interface of the operation bar, and importing the plurality of sub-materials in the material folder pointed to by the import instruction.
- optionally, receiving an import instruction input through an interaction interface of the operation bar, and importing a plurality of sub-materials in the material folder pointed to by the import instruction, may include: receiving a selection instruction sent through the interaction interface of the operation bar, taking the reference part selected by the selection instruction as the target part to which the special effect is currently to be added, and displaying the play parameter setting interface under the target part in the operation bar; and receiving an import instruction sent through the interactive interface in the play parameter setting interface, and importing the plurality of sub-materials in the material folder pointed to by the import instruction.
- Each material folder may include a plurality of sub-materials.
- in one embodiment of the present application, the material folder may include a plurality of sub-materials such as earrings and earmuffs of different shapes and colors.
- sub-materials at preset positions or with preset serial numbers in the material folder pointed to by the import instruction may be imported by default; for example, when the user does not select any sub-material, the first to fifth sub-materials in the material folder are selected and imported by default.
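The default-selection behavior just described can be sketched as a simple fallback; treating the preset count (five in the example) as a configurable value is an assumption.

```python
def default_selection(sub_materials, user_selected=None, preset_count=5):
    """If the user selected sub-materials, import those; otherwise fall
    back to the sub-materials at the preset positions, e.g. the first to
    fifth files in the material folder."""
    if user_selected:
        return list(user_selected)
    return sub_materials[:preset_count]
```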
- optionally, obtaining the parameter values of the play parameters in operation 104 may include: in response to receiving, through the interaction interface in the play parameter setting interface, a parameter value set for a play parameter of the group of sub-materials, using the set parameter value as the parameter value of that play parameter; and/or, in response to not receiving, through the interactive interface in the play parameter setting interface, a parameter value set for a play parameter of the group of sub-materials, using the preset parameter value as the parameter value of that play parameter.
- the embodiment of the present application can generate a special effect program file package executable by a rendering engine based on the user's selection of a set of sub-materials in the operation bar and the setting of parameter values, without manually writing a program file; the operation is simple and takes little time, which improves the overall efficiency of dynamic special effect implementation and effectively guarantees the accuracy of the special effects.
- the same parameter values of the play parameters may be applied to all the sub-materials in the set of sub-materials.
- optionally, the play parameters of a set of sub-materials may include, for example, but are not limited to, any one or more of the following:
- Display parameter: used to indicate whether to display the above plurality of sub-materials. The parameter values include "Yes" and "No": when the parameter value is "Yes", the corresponding plurality of sub-materials are displayed during video playback; when the parameter value is "No", the plurality of sub-materials are not displayed during video playback;
- Interval parameter: used to indicate the number of frames between the playing of two adjacent sub-materials among the plurality of sub-materials;
- Trigger action parameter: used to indicate the trigger action that triggers the display of the plurality of sub-materials. The parameter value may include each trigger action, and the user may select at least one action from a preset action set as the trigger action; that is, during video playing, when the corresponding trigger action is detected, display of the corresponding plurality of sub-materials is triggered. For example, when the trigger action "opening mouth" specified in the trigger action parameter is detected in the video, the animation of spitting a rainbow is started;
- the start display time, end display time, display duration, and the like of the plurality of sub-materials may be determined according to the parameter values of other parameters, for example, the delay trigger parameter, the trigger end parameter, and the loop parameter;
- Loop parameter: used to indicate the number of times the plurality of sub-materials are played in a loop; a specific value, for example 1 or 5, may be set or selected as the parameter value, and it may be agreed that a parameter value of 0 means infinite loop playback;
- Delay trigger parameter: used to indicate the time by which display of the plurality of sub-materials is delayed, that is, when a trigger action in the trigger action parameter is detected from a certain frame in the video, how many frames later the display of the plurality of sub-materials starts; the delay time may be set or selected as the parameter value;
- Trigger end parameter: used to indicate the action that ends the display of the plurality of sub-materials. The parameter value includes each trigger action, and the user may select at least one action from the preset action set as the action that ends the display. That is, during video playing, when the trigger action specified by the trigger end parameter is detected, display/playing of the corresponding plurality of sub-materials is ended. For example, if the trigger action "opening mouth" specified in the trigger action parameter starts playing the rainbow formed by the plurality of sub-materials, the parameter value of the trigger end parameter can be set to "closing mouth", so that when the mouth-closing action occurs in the video, the rainbow disappears;
- Display size parameter (Scale): used to indicate the reference basis for changes in the display size of the plurality of sub-materials, so as to achieve a display effect in which the plurality of sub-materials grow larger and smaller.
- The parameter value of the display size parameter (i.e., the reference basis for changes in the display size of the plurality of sub-materials) may be two or more key points among the preset key points (which may be represented as PointA and PointB). In this case, the display size of the plurality of sub-materials varies with the size of the region formed by those two or more key points in the video. For example, if the plurality of sub-materials are glasses and the parameter values of the display size parameter are two key points representing the left eye, the display size of the glasses changes as the length of the line connecting those two key points changes. If no parameter value is set for the display size parameter, the default parameter value may be two key points on the reference part corresponding to the plurality of sub-materials;
- Position type parameter (PositionType): used to indicate the type of positional relationship of the plurality of sub-materials;
- Position correlation parameter: used to indicate whether the plurality of sub-materials move following the position of the preset reference part. It may include the options "Yes (Move With Position)" and "No". When the parameter value is "Yes", the plurality of sub-materials follow the position of the reference part; if the parameter value of the position type parameter is foreground, the parameter value is "No", indicating that the plurality of sub-materials do not follow the position of the reference part;
- Position parameter: used to indicate the position binding relationship between the plurality of sub-materials and the preset key points, that is, the positional relationship between the plurality of sub-materials and the preset key points during video playback. The positions of the plurality of sub-materials may be bound to selected key points among the preset key points;
- Rotation center parameter (RotateCenter): used to indicate the key point around which the plurality of sub-materials rotate, that is, which key point the plurality of sub-materials rotate around during video playback.
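As a hedged illustration of how such playing parameters might be recorded in a json special effect program file: only "Scale", "PositionType", and "RotateCenter" are named above, so every other key name and every value in this sketch is a hypothetical assumption, not the patented format.

```python
import json

# Hypothetical playing-parameter block for one set of sub-materials.
play_params = {
    "TriggerAction": "open_mouth",   # trigger action parameter (assumed name)
    "TriggerEnd": "close_mouth",     # trigger end parameter (assumed name)
    "TriggerDelay": 10,              # delay trigger: frames to wait (assumed name)
    "Loop": 0,                       # loop parameter: 0 agreed to mean infinite loop
    "Scale": ["PointA", "PointB"],   # display size follows the PointA-PointB distance
    "PositionType": "face",          # sub-materials follow the face position
    "RotateCenter": 46,              # key point the sub-materials rotate around
}

serialized = json.dumps(play_params, indent=2)  # content of the program file
restored = json.loads(serialized)               # what a player would parse back
```

A player importing the package would read such a block back to drive triggering, looping, scaling, and rotation of the set of sub-materials.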
- the trigger action corresponding to the trigger action parameter includes any one or more of the following:
- No action trigger (NULL);
- Eye movements, for example, blinking, closing the eyes, opening the eyes, etc.;
- Head movements, for example, shaking the head, nodding, tilting the head, turning the head, etc.;
- Eyebrow movements, for example, raising the eyebrows, etc.;
- Hand movements, for example, heart hands, palms together, an open palm, thumbs up, a congratulation (fist-in-palm) gesture, a one-handed heart, the OK gesture, the scissors hand, the finger-gun gesture, a pointing index finger, etc.;
- Mouth movements, for example, opening the mouth, closing the mouth, etc.
- FIG. 4 is an exemplary schematic diagram of hand movements in an embodiment of the present application.
- The position type parameter includes, for example, any one of the following:
- A parameter for representing the foreground (Foreground): in this case, the corresponding plurality of sub-materials are displayed as foreground during video playback. The plurality of sub-materials are associated with the screen position of the display of the playback terminal during playback, and the position of their center point on the screen remains unchanged;
- A parameter for indicating that the plurality of sub-materials are positioned and/or moved following the face position: indicates that the reference part corresponding to the plurality of sub-materials is a face, and the plurality of sub-materials are positioned and/or moved following the face position during video playback;
- A parameter for indicating that the plurality of sub-materials are positioned and/or moved following the hand position: indicates that the reference part corresponding to the plurality of sub-materials is a gesture (i.e., a hand), and the plurality of sub-materials are positioned and/or moved following the hand position during video playback;
- A parameter for indicating that the plurality of sub-materials are positioned and/or moved following the foot position: indicates that the plurality of sub-materials are positioned and/or moved following the foot position during video playback;
- A parameter for indicating that the plurality of sub-materials are positioned and/or moved following the position of the human skeleton: indicates that the plurality of sub-materials are positioned and/or moved following the position of the human skeleton during video playback;
- The playback positional relationship associated with the reference part may include, for example, any one of the following: the plurality of sub-materials move following the position of the reference part; the plurality of sub-materials move following the position of the reference part and are scaled following the size (Size) of the reference part; the plurality of sub-materials move following the position of the reference part, are scaled following the size of the reference part, and are depth-scaled (Depth) following the rotation of the reference part; or the plurality of sub-materials move following the position of the reference part, are scaled following the size of the reference part, are depth-scaled (Depth) following the rotation of the reference part, and rotate (Rotation) following the in-plane rotation of the reference part;
- A parameter for indicating the background: indicates that the plurality of sub-materials are displayed as background during video playback. The plurality of sub-materials are associated with the screen position of the display of the playback terminal during video playback, and their size is adjusted so that the four vertex coordinates of the plurality of sub-materials coincide with the four vertices of the screen of the display.
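A minimal sketch of how a player might interpret the position type options above; the function and its arguments are illustrative assumptions, not the patented implementation.

```python
def anchor_position(position_type, screen_center, part_position):
    """Return the anchor point of a set of sub-materials for one frame.

    "foreground" and "background" sub-materials are associated with the
    screen of the playback terminal, so their center stays fixed on the
    display; the other types follow the detected reference part (face,
    hand, foot, or human skeleton key points).
    """
    if position_type in ("foreground", "background"):
        return screen_center
    return part_position
```

For a face-following sticker, `anchor_position("face", (540, 960), (300, 400))` returns the detected face position rather than the fixed screen center.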
- The playing parameters further include: a correspondence between the display position of the set of sub-materials and at least one predetermined key point.
- The key points may include, for example but not limited to, any one or more of the following: head key points, face key points, shoulder key points, arm key points, gesture key points, waist key points, leg key points, foot key points, human skeleton key points, and so on.
- The head key points may include, for example but not limited to, any one or more of the following: a top-of-head key point, a nose tip key point, a chin key point, and so on.
- The face key points may include, for example but not limited to, any one or more of the following: face contour key points, eye key points, eyebrow key points, nose key points, mouth key points, and so on.
- The eye key points may include, for example but not limited to, any one or more of the following: a left eyelid key point, a left eye pupil center key point, a left eye center key point, a right eyelid key point, a right eye pupil center key point, a right eye center key point, and so on.
- the eyebrow key points may include, but are not limited to, any one or more of the following: a left eyebrow key point and a right eyebrow key point, and the like.
- The nose key points may include, for example but not limited to, any one or more of the following: a nose bridge key point, a key point below the nose, key points on the outer sides of the nose, and so on.
- the key points of the mouth may include, for example but are not limited to, any one or more of the following: an upper lip key point, a lower lip key point, and the like.
- The shoulder key points may include, for example but not limited to, any one or more of the following: a shoulder-head intersection key point located at the intersection of the shoulder and the head, and a shoulder contour midpoint key point located at the midpoint between the arm root contour key point and the shoulder-head intersection key point, and so on.
- The arm key points may include, for example but not limited to, any one or more of the following: wrist contour key points, elbow contour key points, arm root contour key points, a forearm contour midpoint key point located at the midpoint between the wrist contour key point and the elbow contour key point, and an upper arm contour midpoint key point located at the midpoint between the elbow contour key point and the arm root contour key point, and so on.
- The gesture key points may include, for example but not limited to, any one or more of the following: the four vertex key points of the gesture box (i.e., the gesture detection box), the center key point of the gesture box, and so on.
- The leg key points may include, for example but not limited to, any one or more of the following: a crotch key point, knee contour key points, ankle contour key points, thigh root outer contour key points, a calf contour midpoint key point located at the midpoint between the knee contour key point and the ankle contour key point, a thigh inner contour midpoint key point located at the midpoint between the knee inner contour key point and the crotch key point, and a thigh outer contour midpoint key point located at the midpoint between the knee outer contour key point and the thigh root outer contour key point, and so on.
- The waist key points may include, for example but not limited to, N points generated by dividing, into N equal parts, the segment between the thigh root outer contour key point and the arm root contour key point, where N is greater than 1.
- the foot key points can include, for example, but are not limited to, any one or more of the following: toe key points and heel key points, and the like.
- The human skeleton key points may include, for example but not limited to, any one or more of the following: a right shoulder bone key point, a right elbow bone key point, a right wrist bone key point, a left shoulder bone key point, a left elbow bone key point, a left wrist bone key point, a right hip bone key point, a right knee bone key point, a right ankle bone key point, a left hip bone key point, a left knee bone key point, a left ankle bone key point, a head bone key point, a neck bone key point, and so on.
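The waist key points above are produced by N-equal division of a line segment. A small sketch of that interpolation follows; it assumes the points lie on the straight segment between the two contour key points and returns the interior division points (whether an endpoint is also counted is not specified in the text).

```python
def divide_segment(p_start, p_end, n):
    """Interior points dividing the segment p_start-p_end into n equal parts.

    For the waist key points, p_start and p_end would be the thigh root
    outer contour key point and the arm root contour key point (n > 1).
    """
    (x0, y0), (x1, y1) = p_start, p_end
    return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n)
            for i in range(1, n)]
```

For example, dividing the segment from (0, 0) to (10, 0) into 5 equal parts yields the four interior points at x = 2, 4, 6, 8.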
- The correspondence between the display positions of the plurality of sub-materials and the positions of the key points may be set in advance.
- Key points may be directly selected from the preset key point set as the parameter values of the corresponding playing parameters.
- Multiple key points may be defined for the face and the gesture (hand) based on face detection and gesture detection, respectively; during special effect generation, the correspondence of positional relationships is established based on the face key points or the gesture key points.
- FIG. 5 is an exemplary schematic diagram of a key point of a face in the embodiment of the present application.
- The face key points may be defined as follows:
- The key points of the hand may be defined as follows:
- The key points numbered 110-113 are the four vertices of the gesture detection box (i.e., the outer frame of the hand), and the key point numbered 114 is the center of the gesture detection box.
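The hand numbering above can be restated in code. The assignment of 110-113 to particular corners is an assumption (the text only says they are the four vertices of the gesture detection box), and the center computation is a plain average of the four vertices.

```python
# 110-113: four vertices of the gesture detection box; 114: its center.
HAND_KEYPOINTS = {
    110: "gesture box vertex",  # which corner each number denotes is assumed
    111: "gesture box vertex",
    112: "gesture box vertex",
    113: "gesture box vertex",
    114: "gesture box center",
}

def box_center(vertices):
    """Center key point (number 114) computed from the four box vertices."""
    xs = [x for x, _ in vertices]
    ys = [y for _, y in vertices]
    return (sum(xs) / 4.0, sum(ys) / 4.0)
```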
- The method may further include: establishing a correspondence between the display positions of the plurality of sub-materials and at least one predetermined key point; and/or establishing a correspondence between the display positions of the plurality of sub-materials and the center key point of a detection box.
- When the key points are head key points, face key points, shoulder key points, arm key points, waist key points, leg key points, foot key points, or human skeleton key points, the correspondence between the display positions of the plurality of sub-materials and at least one of these key points may be established. When the key points are the center key points of detection boxes, the correspondence between the display positions of the plurality of sub-materials and the center key point of the corresponding detection box (for example, a head detection box, a face detection box, a gesture detection box, or a human body detection box) may be established.
- the device for generating the special effect program package of the embodiment of the present application may further include a content display column.
- the method further includes: displaying a reference image through the content display column, and displaying a key point on the reference image.
- the reference image includes at least one reference portion.
- The reference part may include, for example, any one or more of the following: ears, hands, face, hair, neck, shoulders, and so on.
- The reference image may be, for example, at least a partial image of a reference character, such as any one or more of the following: a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, a foot image, or a complete image of the reference character, and so on.
- The method further includes: according to the parameter values of the playing parameters of the set of sub-materials and a preset display policy, displaying each imported sub-material in sequence in the content display column, or simultaneously displaying the plurality of sub-materials of the imported set of sub-materials; or, receiving a selection operation on a sub-material of the set of sub-materials, and displaying the sub-material selected by the selection operation in the content display column.
- The method further includes: in response to a position moving operation, received through the content display column, on the set of sub-materials or one of the sub-materials, updating the display position of the set of sub-materials in the content display column, and updating the corresponding parameter values in the playing parameters of the set of sub-materials.
- The user may select, with the mouse, the set of sub-materials or one of the sub-materials displayed in the content display column, move the mouse to the small box at the lower right corner of the selected sub-material, and zoom the selected sub-material by dragging the small box.
- The user may select, with the mouse, a set of sub-materials or one of the sub-materials displayed in the content display column and directly move it to the correct or desired position.
- the position and the display ratio of the plurality of sub-materials on the playback terminal will be consistent with the position and display ratio in the content display column.
- the user may add special effects to multiple reference parts.
- The ears, the face, and the hands may be used as target parts to which special effects need to be added, and the operations of any of the above embodiments may be performed for each of these target parts.
- The method further includes: in response to a layer parameter adjustment instruction sent through the interaction interface of the operation column for two or more sub-materials, adjusting the occlusion relationship between the two or more sub-materials, and displaying the two or more sub-materials in the content display column according to the adjusted occlusion relationship and the parameter values of the playing parameters.
- the method may further include:
- The special effect program file of the set of sub-materials is generated according to the preset special effect program file and the parameter values of the playing parameters of the set of sub-materials, and the special effect program file of the set of sub-materials is displayed through the program file column.
- the above-mentioned effect program file may include, for example, but not limited to, a special effect program file generated by a json program or any other executable program.
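Since the text states only that the special effect program file may be a json program, the following is a hedged sketch of filling a preset template with the parameter values of one set of sub-materials; the template and all field names are assumptions for illustration.

```python
import json

# Hypothetical preset template for a json special effect program file.
TEMPLATE = {"version": 1, "materials": []}

def build_program_file(group_name, sub_materials, play_params):
    """Produce the json program file text for one set of sub-materials."""
    program = dict(TEMPLATE)  # shallow copy; "materials" is reassigned below
    program["materials"] = [
        {"name": group_name, "frames": sub_materials, "params": play_params}
    ]
    return json.dumps(program, indent=2)
```

The returned string is what the program file column would display and what gets bundled into the exported special effect program file package.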
- The method further includes: displaying, by the device that generates the special effect program file package, an operation interface, where the operation interface includes: an operation column, a content display column, and a program file column.
- The operation interface includes three areas: a left area, a middle area, and a right area.
- the displaying the operation interface may include: displaying an operation column on a left side of the operation interface, displaying a content display column in a middle portion of the operation interface, and displaying the program file column on a right side of the operation interface.
- The set of sub-materials may be imported through the interaction interface 20 in the left operation column; the occlusion relationship between the layers of multiple sets of sub-materials may be adjusted through the interaction interface 21, and the layer parameters of each set of sub-materials may be set; and the parameter values of the playing parameters of a set of sub-materials are set through the interaction interface 23;
- The content display column uses an average human face as the reference face and directly displays all imported sub-materials; the position of a displayed sub-material can be moved with the mouse;
- The program file column on the right displays, through the display area 24, the content of the special effect program file of the set of sub-materials whose parameter values are currently being set; the special effect program file package can be exported through the save instruction interface 25 in the program file column, that is, the special effect program file package is generated and saved.
- FIG. 6 is a flowchart of another embodiment of a method for generating a special effect program file package of the present application. As shown in FIG. 6, the method for generating the special effect program file package of the embodiment includes:
- the generating device of the special effect program file package is started according to the received startup command, and displays an operation interface.
- the operation interface includes: an operation bar, a content display bar and a program file bar.
- the reference image includes at least one reference portion.
- The operation 304 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by an operation interface or a content display column run by the processor.
- the operation 306 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by the first import module executed by the processor.
- the operation 308 may be performed by a processor invoking a corresponding instruction stored in a memory or by a first acquisition module executed by the processor.
- The operation 310 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by an operation interface or a program file column run by the processor.
- the operation 312 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a first generation module executed by the processor.
- the method may further include: saving the special effect program file package at a location pointed by the save instruction according to the received save instruction.
- The saving of the special effect program file package at the location pointed to by the save instruction according to the received save instruction may include: compressing the special effect program file package and saving it at the location pointed to by the save instruction, so that it can be imported into a mobile phone terminal for special effect generation.
- The embodiment of the present application compresses only the special effect program file package itself and does not change the size of the sub-materials in the package; that is, the sub-materials in the package keep the size they had before being imported.
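A hedged sketch of the compress-and-save step, assuming a zip container and png sub-materials (the text specifies neither the archive format nor the file types): compression reduces the package file size while the image files inside keep their original bytes, and hence their original sub-material size.

```python
import zipfile
from pathlib import Path

def export_package(material_dir, program_file, out_path):
    """Compress the sub-materials and the effect program file into one package."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as pkg:
        pkg.write(program_file, arcname=Path(program_file).name)
        for sub in sorted(Path(material_dir).glob("*.png")):
            # the sub-material bytes are stored unchanged, only deflated
            pkg.write(sub, arcname=sub.name)
```

Extracting the package on the terminal side would therefore recover sub-materials identical to the imported originals.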
- The special effect program file package can be imported into a terminal to generate dynamic special effects on the video played by the terminal.
- FIG. 7 is a flowchart of an embodiment of a method for generating a special effect of the present application.
- the special effect generating method of this embodiment includes:
- a group of sub-materials includes multiple sub-materials.
- the plurality of sub-materials have a predetermined playback timing.
- the operation 402 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a second acquisition module executed by the processor.
- Key point detection related to the correspondence may be performed on the video image by a neural network, and a key point detection result may be output.
- The key point detection result may include, for example but not limited to, any one or more of the following: the positions, in the image in the video, of the key points involved in the correspondence; and the preset numbers, in the special effect program file package, of the key points involved in the correspondence.
- the operation 404 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a first detection module executed by the processor.
- the operation 406 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a second generation module executed by the processor.
- Parameter values of the playing parameters of at least one set of sub-materials in the special effect program file package are obtained, where a set of sub-materials includes a plurality of sub-materials; key point detection is performed on the video image; and, based on the detected key points and the parameter values of the playing parameters of the at least one set of sub-materials, a special effect based on the at least one set of sub-materials is generated on the video image.
- A dynamic special effect is generated on the video by using the parameter values of the playing parameters of at least one set of sub-materials in the pre-generated special effect program file package and the key points in the video image, thereby implementing dynamic special effect playback in the video and improving the video playback effect.
- the method may further include: importing a special effect program file package.
- The special effect program file package may include at least one set of sub-materials and parameter values of the playing parameters of the at least one set of sub-materials, and the parameter values of the playing parameters of a set of sub-materials include a correspondence between the display positions of the set of sub-materials and at least one predetermined key point.
- Importing the special effect program file package may include: reading the special effect program file package into memory by calling a first interface function for reading a sticker material; and parsing the special effect program file package to obtain the at least one set of sub-materials and a special effect program file, where the special effect program file includes the parameter values of the playing parameters of the at least one set of sub-materials.
- the above-mentioned effect program file may include: a json program or an effect program file of other executable programs.
- the special effect program file package in the embodiment of the special effect generation method of the present application may be the special effect program file package generated by the embodiment of the method for generating the special effect program file package of the present application.
- The operation 402 may include: creating a sticker handle by calling a second interface function for creating a sticker handle; and reading the plurality of sub-materials and the parameter values of the playing parameters in the special effect program file package, and storing them in the sticker handle.
- The method further includes: determining the playing timing of the plurality of sub-materials according to the file names of the plurality of sub-materials; and, according to the parameter values of the playing parameters of at least one set of sub-materials in the special effect program file in the sticker handle and the playing timing of the plurality of sub-materials, acquiring the display position and video frame number of each sub-material of the at least one set of sub-materials in the video, and reading the video image corresponding to the video frame number from the video in advance.
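The timing step above, determining playing order from file names, could look like the following sketch; it assumes file names carry a trailing frame index (e.g. "rainbow_003.png"), a hypothetical naming scheme the text does not specify.

```python
import re

def play_order(filenames):
    """Sort sub-material file names by a trailing numeric frame index."""
    def frame_index(name):
        # match digits just before the extension, e.g. "rainbow_003.png" -> 3
        m = re.search(r"(\d+)\.\w+$", name)
        return int(m.group(1)) if m else 0
    return sorted(filenames, key=frame_index)
```

Numeric sorting avoids the lexicographic pitfall where "rainbow_010.png" would otherwise sort before "rainbow_002.png".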
- The operation 406 may include: reading, by calling a third interface function for rendering the sticker material, the sub-materials that need to be displayed on the current video image of the video from the sticker handle; determining, according to the detected key points and the parameter values of the playing parameters, the display positions of those sub-materials on the current video image; and displaying the sub-materials that need to be displayed at the determined display positions on the current video image.
- The method further includes: in response to the special effect program file package having finished playing, destroying the sticker handle by calling a fourth interface function for destroying the sticker handle.
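The interface functions above (read the sticker material, create a sticker handle, render per frame, destroy the handle) can be sketched as the following lifecycle; every class and function name here is a hypothetical stand-in, since the text does not name the interface functions.

```python
class StickerHandle:
    """Holds the sub-materials and playing-parameter values (second interface)."""
    def __init__(self, sub_materials, play_params):
        self.sub_materials = sub_materials
        self.play_params = play_params
        self.destroyed = False

def create_sticker_handle(sub_materials, play_params):
    # second interface function: create the handle and store the parsed data
    return StickerHandle(sub_materials, play_params)

def render_sticker(handle, frame, keypoints):
    # third interface function: pick the sub-materials due on this frame and
    # position them according to the detected key points and playing parameters
    return frame  # placeholder: a real renderer would composite onto the frame

def destroy_sticker_handle(handle):
    # fourth interface function: release the handle once playback has ended
    handle.destroyed = True
```

A player would call `create_sticker_handle` once per imported package, `render_sticker` once per video frame, and `destroy_sticker_handle` when playback of the special effect program file package ends.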
- FIG. 8 is a flowchart of another embodiment of a method for generating a special effect of the present application.
- the special effect generating method of this embodiment includes:
- the parameter value of the play parameter of the set of sub-materials includes a correspondence between a display position of the set of sub-materials and a predetermined at least one key point.
- the above-mentioned effect program file may include: a json program or an effect program file of other executable programs.
- The special effect program file package in the embodiments of the special effect generation method of the present application may be the special effect program file package generated by the method for generating the special effect program file package according to any one of the embodiments of the present application.
- the operations 502-504 may be performed by a processor invoking a corresponding instruction stored in a memory or by a second import module executed by the processor.
- the operations 506-508 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a second acquisition module executed by the processor.
- the operation 510 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a third acquisition module executed by the processor.
- the operation 512 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a first detection module executed by the processor.
- the operations 514-518 may be performed by a processor invoking a corresponding instruction stored in a memory, or may be performed by a second generation module executed by the processor.
- The embodiments of the special effect generation method of the present application can be used in various video playing scenarios. For example, for a live video scene containing a character, a dynamic special effect is generated for the live video, and at least one set of sub-materials in the special effect program file package is superimposed on the corresponding part of the character for playback.
- the corresponding parts can be, for example, ears, hands, face, hair, neck, shoulders, and the like.
- the method for generating the special effect program file package and the special effect generation method provided by the embodiments of the present application may be performed by any suitable device having data processing capability, including but not limited to: a terminal device, a server, and the like.
- The method for generating the special effect program file package and the special effect generation method provided by the embodiments of the present application may be executed by a processor; for example, the processor executes any method for generating a special effect program file package or any special effect generation method mentioned in the embodiments of the present application by calling corresponding instructions stored in a memory. This will not be repeated below.
- The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the foregoing method embodiments; and the foregoing storage medium includes media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
- FIG. 9 is a schematic structural diagram of an embodiment of a device for generating a special effect program file package according to the present application.
- the apparatus for generating the special effect program file package of the embodiment can be used to implement the embodiment of the method for generating the special effect program file package of the present application.
- The apparatus for generating a special effect program file package of this embodiment includes: a first import module, a first acquisition module, and a first generation module, where:
- a first import module configured to import a set of sub-materials; the set of sub-materials includes a plurality of sub-materials.
- a plurality of sub-materials in a group of sub-materials have a predetermined playing timing, and a playing timing of the plurality of sub-materials may be determined based on file names of the plurality of sub-materials.
- the first obtaining module is configured to obtain a parameter value of a playing parameter of the set of sub-materials.
- a first generating module configured to generate a special effect program file package according to the parameter values of the set of sub-materials and the playing parameters.
- the special effects package may include a set of sub-materials or sets of sub-materials.
- The device for generating a special effect program file package imports a set of sub-materials, where the set of sub-materials includes a plurality of sub-materials; obtains parameter values of the playing parameters of the set of sub-materials; and generates a special effect program file package according to the set of sub-materials and the parameter values of the playing parameters, so that dynamic special effect processing can be performed on a video based on the special effect program file package, implementing dynamic special effects on the played video.
- The special effect program file executed by the rendering engine can be generated without manually writing the program file; the operation is simple and takes little time, which improves the overall efficiency of dynamic special effect realization and effectively guarantees the accuracy of the special effects.
- FIG. 10 is a schematic structural diagram of another embodiment of a device for generating a special effect program file package according to the present application.
- Compared with the foregoing embodiment, the generating apparatus of this embodiment further includes an operation interface including an operation column, as shown in FIG. 2, which is an example diagram of the operation interface.
- the first importing module is configured to receive an importing instruction input through the interactive interface of the operation bar, and import a plurality of sub-materials in the material folder pointed by the importing instruction as the set of sub-materials.
- the first import module is configured to: receive an import instruction sent through an interaction interface in a play parameter setting interface under the operation bar, and import the plurality of sub-materials in the material folder pointed to by the import instruction; or, receive a selection instruction sent through the interaction interface of the operation bar, take the reference part selected by the selection instruction as the target part to which a special effect currently needs to be added, and display the play parameter setting interface under the target part in the operation bar; then receive an import instruction sent through the interaction interface in the play parameter setting interface and import the plurality of sub-materials in the material folder pointed to by the import instruction.
- the first import module is configured to: receive an import instruction sent through the interaction interface, and acquire and display the material folder pointed to by the import instruction; in response to receiving a selection operation on sub-materials in the material folder, import the plurality of sub-materials selected by the selection operation; and/or, in response to not receiving a selection operation on sub-materials in the material folder, select all or some of the sub-materials in the material folder according to a preset setting, and import the sub-materials selected according to the preset setting.
- when importing the plurality of sub-materials in the material folder pointed to by the import instruction, the first import module is configured to: in response to the import instruction including a display order of the plurality of sub-materials in the material folder, read and import the plurality of sub-materials in that display order, and display the file names of the imported sub-materials in the operation bar in the same order; and/or, in response to the import instruction not including a display order of the plurality of sub-materials in the material folder, read and import the plurality of sub-materials in a preset order, and display the file names of the imported sub-materials in the operation bar in the preset order.
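The two branches above (explicit display order in the instruction, or a preset fallback order) can be sketched as follows; the helper name and the choice of sorted file names as the preset order are illustrative assumptions:

```python
def resolve_display_order(folder_files, instruction_order=None):
    """Return the order in which imported sub-materials are listed.

    If the import instruction carries an explicit display order, use it;
    otherwise fall back to a preset order (here: sorted file names).
    """
    if instruction_order:
        # Keep only names that actually exist in the folder, preserving
        # the order given by the instruction.
        existing = set(folder_files)
        return [name for name in instruction_order if name in existing]
    return sorted(folder_files)
```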
- the first obtaining module is configured to: in response to receiving parameter values set for the play parameters of the set of sub-materials through the interaction interface in the play parameter setting interface, take the set parameter values as the parameter values of the play parameters of the set of sub-materials; and/or, in response to not receiving such parameter values through the interaction interface in the play parameter setting interface, take preset parameter values as the parameter values of the play parameters of the set of sub-materials.
- the play parameter may further include: a correspondence between a display position of the set of sub-materials and a predetermined at least one key point.
- the first obtaining module may be further configured to establish a correspondence between the display position of the set of sub-materials and at least one key point; and/or, establish a correspondence between the display position of the set of sub-materials and a center key point of a detection frame.
- the operation interface may further include a content display bar for displaying the reference image and displaying key points on the reference image, wherein the reference image includes at least one reference portion.
- the reference image may be, for example, at least a partial image of a reference character, such as an image of any one or more of the following: a complete image, a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, a foot image, and so on.
- the content display bar may be further configured to display, according to the parameter values of the play parameters of the set of sub-materials and a preset display strategy, each sub-material of the imported set of sub-materials in sequence, or to display multiple sub-materials of the imported set of sub-materials simultaneously; or, to receive a selection operation on a sub-material in the set of sub-materials and display the sub-material selected by the selection operation.
- the generating apparatus of the foregoing embodiments of the present application may further include: a first updating module, configured to, according to a position moving operation on the set of sub-materials or one of the sub-materials received through the content display bar, update the display position of the set of sub-materials in the content display bar, and update the corresponding parameter values in the play parameters of the set of sub-materials.
- the generating apparatus of the foregoing embodiments of the present application may further include: a second updating module, configured to, according to a resizing operation on the set of sub-materials or one of the sub-materials received through the content display bar, update the display size of the set of sub-materials in the content display bar, and update the corresponding parameter values in the play parameters of the set of sub-materials.
- the generating apparatus of the foregoing embodiments of the present application may further include: an adjusting module, configured to, according to a layer parameter adjustment instruction sent for two or more sets of sub-materials through the interaction interface of the operation bar, adjust the occlusion relationship between the two or more sets of sub-materials, and display the two or more sets of sub-materials according to the adjusted occlusion relationship and the parameter values of the play parameters.
- the operation interface may further include: a program file bar, configured to generate a special effect program file of the set of sub-materials according to a preset special effect program file and the parameter values of the play parameters of the set of sub-materials, and to display the special effect program file of the set of sub-materials.
- the special effect program file may include, for example but not limited to: a special effect program file generated as a json program.
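The application does not disclose the schema of the json special effect program file; purely as an illustration, such a file could record one set of sub-materials together with its play parameters (all field names below are hypothetical):

```python
import json

# Hypothetical structure of a json special effect program file; the field
# names are illustrative, not taken from the application.
effect_program = {
    "sub_materials": ["heart_0.png", "heart_1.png", "heart_2.png"],
    "play_params": {
        "display": True,
        "interval_frames": 2,      # frames between adjacent sub-materials
        "trigger_action": "mouth_open",
        "loop_count": 3,
        "position_type": "follow_face",
        "key_points": [84, 90],    # key points the display position binds to
    },
}
print(json.dumps(effect_program, indent=2))
```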
- the operation interface may include three areas of the left side, the middle side, and the right side.
- the operation bar is displayed on the left side of the operation interface
- the content display column is displayed in the middle of the operation interface
- the program file column is displayed on the right side of the operation interface.
- the generating apparatus of the foregoing embodiments of the present application may further include: a saving module, configured to save the special effect program file package at a location pointed by the save instruction according to the received save instruction.
- the saving module is configured to: in response to receiving the save instruction, display a save path selection interface and a compression interface; receive a save location sent through the save path selection interface; receive a compression mode sent through the compression interface, and compress the special effect program file package of the set of sub-materials according to the compression mode to generate a compressed file package; and store the compressed file package in the folder pointed to by the save location.
- the size of each sub-material in the special effect program file package is kept at the size the sub-material had before being imported.
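The save-and-compress step can be sketched with a standard zip archive; the function name and archive layout are assumptions, and storing each sub-material file unchanged is what keeps it at its pre-import size:

```python
import json
import os
import zipfile

def save_effect_package(package_dir, program_file, save_path):
    """Compress a material folder plus its json program file into one package.

    Hypothetical sketch: sub-material files are archived as-is, so each
    keeps the size it had before import.
    """
    with zipfile.ZipFile(save_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("effect.json", json.dumps(program_file))
        for name in sorted(os.listdir(package_dir)):
            zf.write(os.path.join(package_dir, name), arcname=name)
    return save_path
```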
- the special effect generating apparatus may be used to implement the foregoing special effect generating method embodiments of the present application, and may be, but is not limited to, an AR engine or an electronic device having an AR special effect drawing function.
- FIG. 11 is a schematic structural diagram of an embodiment of a special effect generating apparatus of the present application.
- the special effect generating apparatus of this embodiment includes: a second acquiring module, a first detecting module, and a second generating module. Specifically:
- the second obtaining module is configured to obtain a parameter value of the playing parameter of the at least one set of the sub-materials in the special effect program file package, where the set of sub-materials includes a plurality of sub-materials.
- the plurality of sub-materials have a predetermined playback timing.
- the special effect program file package in the embodiments of the special effect generating apparatus of the present application may be a special effect program file package generated by any of the foregoing embodiments of the method or apparatus for generating a special effect program file package.
- the first detecting module is configured to perform key point detection on the video image.
- a second generating module configured to generate an effect based on the at least one set of sub-materials on the video image according to the detected key points and parameter values of the playing parameters of the at least one set of sub-materials.
- the special effect generating apparatus acquires parameter values of play parameters of at least one set of sub-materials in the special effect program file package, wherein a set of sub-materials includes a plurality of sub-materials; performs key point detection on a video image; and generates, on the video image, a special effect based on the at least one set of sub-materials according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials.
- a dynamic special effect is thus generated on the video from the parameter values of the play parameters of at least one set of sub-materials in the pre-generated special effect program file package and the key points in the video image, realizing dynamic special effect playback in the video and improving the playback effect.
- FIG. 12 is a schematic structural diagram of another embodiment of a special effect generating apparatus of the present application.
- compared with the embodiment shown in FIG. 11, the special effect generating apparatus of this embodiment further includes: a second importing module for importing the special effect program file package.
- the special effect program file package includes at least one set of sub-materials and parameter values of play parameters of the at least one set of sub-materials; the parameter values of the play parameters of a set of sub-materials include a correspondence between the display position of the set of sub-materials and at least one predetermined key point.
- the playing parameter includes: a triggering event parameter, and the triggering event parameter is used to indicate a triggering event that triggers display of the plurality of sub-materials.
- the special effect generating apparatus of the embodiment may further include: a second detecting module, configured to detect whether a triggering action corresponding to the parameter value of the triggering action parameter occurs in the video image.
- the second generating module is configured to: in response to detecting, in the video image, the trigger action corresponding to the parameter value of the trigger action parameter, generate, on the video, a special effect based on the at least one set of sub-materials according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials.
- the play parameters further include: a trigger end parameter, used to indicate an action that ends the display of the plurality of sub-materials.
- the second detecting module is further configured to detect whether a trigger action corresponding to the parameter value of the trigger end parameter occurs in the video image.
- the second generating module is further configured to: in response to detecting, in the video image, the trigger action corresponding to the parameter value of the trigger end parameter, end the generation of the special effect of the at least one set of sub-materials on the currently playing video.
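The trigger-start/trigger-end behavior described above amounts to a small per-frame state machine; a minimal sketch, with illustrative action names:

```python
def update_effect_state(active, detected_actions, trigger_action, end_action):
    """Return whether the effect should be active on the next frame.

    `detected_actions` is the set of actions recognized in the current
    video image (e.g. {"mouth_open"}); all names are illustrative.
    """
    if not active and trigger_action in detected_actions:
        return True   # trigger action seen: start generating the effect
    if active and end_action in detected_actions:
        return False  # trigger-end action seen: stop the effect
    return active
```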
- the play parameters further include: a beauty/makeup effect parameter, used to indicate the beauty/makeup effect displayed at a preset part when the sub-materials are displayed.
- the second generating module is further configured to: when generating the special effect based on the at least one set of sub-materials on the video image according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials, display the beauty/makeup effect at the preset part in the video image according to the beauty/makeup effect parameter.
- the second importing module is configured to: read the special effect program file package into memory by calling a first interface function for reading sticker materials; and parse the special effect program file package to obtain at least one set of sub-materials and a special effect program file, the special effect program file including the parameter values of the play parameters of the at least one set of sub-materials.
- the special effect program file therein may include, for example, a special effect program file of the json program.
- the second obtaining module is configured to: create a sticker handle by using a second interface function for creating a sticker handle; and read the at least one set of sub-materials and the parameter values of the play parameters of the at least one set of sub-materials in the special effect program file package, and store them in the sticker handle.
- the special effect generating apparatus may further include: a determining module and a third acquiring module.
- the determining module is configured to determine a playing sequence of the plurality of sub-materials according to file names of the plurality of sub-materials.
- a third obtaining module, configured to acquire, according to the parameter values of the play parameters of the at least one set of sub-materials in the special effect program file in the sticker handle and the play timing of the plurality of sub-materials, the position and the number of video frames at which each sub-material of the at least one set of sub-materials is displayed in the video, and to read in advance from the video the video images corresponding to the number of video frames.
- the second generating module is configured to: read, by calling a third interface function for rendering sticker materials, from the sticker handle the sub-materials that need to be displayed on the current video image of the video; determine, according to the detected key points and the parameter values of the play parameters, the positions at which those sub-materials appear on the current video image; and display the sub-materials at the determined positions on the current video image.
- the second obtaining module is further configured to: in response to the playback of the special effect program file package being completed, destroy the sticker handle by calling a fourth interface function for destroying the sticker handle.
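The four interface functions form a read–create–render–destroy lifecycle. The sketch below mimics that call sequence with placeholder implementations, since the application does not name the actual engine API:

```python
class StickerHandle:
    """Placeholder for the handle that stores sub-materials and play params."""
    def __init__(self, materials, params):
        self.materials = materials
        self.params = params
        self.alive = True

def read_package(path):                 # 1st interface: read package into memory
    # Placeholder: a real implementation would parse the package at `path`.
    return {"materials": ["a.png"], "params": {"loop_count": 1}}

def create_handle(package):             # 2nd interface: create the sticker handle
    return StickerHandle(package["materials"], package["params"])

def render_frame(handle, frame_index):  # 3rd interface: render on one video frame
    return handle.materials[frame_index % len(handle.materials)]

def destroy_handle(handle):             # 4th interface: destroy after playback ends
    handle.alive = False
```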
- the first detecting module is configured to perform key point detection on the video image by using a neural network, and output a key point detection result.
- the key point detection result may include any one or more of the following: the positions, in the images of the video, of the key points involved in the correspondence; and the preset numbers of the key points involved in the correspondence.
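A detection result of the kind described (positions plus preset numbers of the key points involved in the correspondence) could be represented as follows; the structure is an assumption for illustration only:

```python
# Hypothetical key point detection result for one video image; the numbers
# follow a preset key point numbering scheme.
detection_result = {
    "key_points": [
        {"number": 43, "position": (312, 188)},  # nose bridge
        {"number": 74, "position": (280, 170)},  # left pupil
    ]
}

def positions_for(result, numbers):
    """Look up positions of the key points with the given preset numbers."""
    wanted = set(numbers)
    return {kp["number"]: kp["position"]
            for kp in result["key_points"] if kp["number"] in wanted}
```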
- another electronic device provided by the embodiment of the present application includes:
- a memory for storing a computer program
- the processor is configured to execute a computer program stored in a memory, and when the computer program is executed, implement a method for generating a special effect program file package or a special effect generation method according to any one of the embodiments of the present application.
- FIG. 13 is a schematic structural diagram of an application embodiment of an electronic device according to the present application.
- the electronic device includes one or more processors, a communication unit, and the like, for example one or more central processing units (CPUs) and/or one or more graphics processors (GPUs); the processor can perform various appropriate actions and processes according to executable instructions stored in a read-only memory (ROM) or executable instructions loaded from a storage portion into a random access memory (RAM).
- the communication portion may include, but is not limited to, a network card, which may include, but is not limited to, an IB (Infiniband) network card; the processor may communicate with the read-only memory and/or the random access memory to execute executable instructions, connect to the communication portion through a bus, and communicate with other target devices through the communication portion, so as to complete the operations corresponding to any method provided by the embodiments of the present application, for example: importing a set of sub-materials, the set of sub-materials including a plurality of sub-materials; acquiring parameter values of play parameters of the set of sub-materials; and generating a special effect program file package according to the set of sub-materials and the parameter values of the play parameters.
- the CPU, ROM, and RAM are connected to each other through a bus.
- the ROM is an optional module.
- the RAM stores executable instructions, or writes executable instructions to the ROM at runtime, the executable instructions causing the processor to perform operations corresponding to the methods described in any of the embodiments of the present application.
- An input/output (I/O) interface is also connected to the bus.
- the communication unit can be integrated, or can be provided with multiple sub-modules (e.g., multiple IB network cards) attached to the bus link.
- the following components are connected to the I/O interface: an input portion including a keyboard, a mouse, and the like; an output portion including a cathode ray tube (CRT), a liquid crystal display (LCD), and the like, and a speaker; a storage portion including a hard disk or the like; and a communication portion including a network interface card such as a LAN card or a modem.
- the communication section performs communication processing via a network such as the Internet.
- the drive is also connected to the I/O interface as needed.
- a removable medium such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory or the like is mounted on the drive as needed so that a computer program read therefrom is installed into the storage portion as needed.
- FIG. 13 is only an optional implementation manner.
- the number and type of the components in FIG. 13 may be selected, deleted, added, or replaced according to actual needs;
- different functional components can also be implemented separately or in an integrated manner; for example, the GPU and the CPU can be provided separately, or the GPU can be integrated on the CPU; the communication portion can be provided separately, or integrated on the CPU or the GPU; and so on.
- an embodiment of the present application includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for executing the method illustrated in the flowchart, the program code including instructions for executing the steps of the methods provided by the embodiments of the present application.
- the computer program can be downloaded and installed from the network via a communication portion, and/or installed from a removable medium.
- the embodiment of the present application further provides a computer program including computer instructions that, when run in a processor of a device, implement the method for generating a special effect program file package or the special effect generation method according to any embodiment of the present application.
- the embodiment of the present application further provides a computer readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method for generating a special effect program file package or the special effect generation method described in any one of the foregoing embodiments of the present application is implemented.
- the methods and apparatus of the present application may be implemented in a number of ways.
- the methods and apparatus of the present application can be implemented in software, hardware, firmware, or any combination of software, hardware, and firmware.
- the above-described sequence of steps for the method is for illustrative purposes only, and the steps of the method of the present application are not limited to the order described above unless otherwise specifically stated.
- the present application can also be implemented as a program recorded in a recording medium, the programs including machine readable instructions for implementing the method according to the present application.
- the present application also covers a recording medium storing a program for executing the method according to the present application.
Description
Key point item | Key point number | Key point item | Key point number |
Face frame (face contour key points) | 0-32 | Nose bridge | 43-46 |
Left eyebrow | 33-37, 64-67 | Right eyebrow | 38-42, 68-71 |
Left eye socket | 52-57, 72-73 | Right eye socket | 58-63, 75-76 |
Left pupil | 74, 104 | Right pupil | 77, 105 |
Lower edge of nose | 47-51 | Outer nose contour | 78-83 |
Upper lip | 84-90, 96-100 | Lower lip | 91-95, 101-103 |
Key point item | Key point number | Key point item | Key point number |
Gesture frame | 110-113 | Center | 114 |
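The numbering in the two tables above can be captured directly in code, which is convenient when a play parameter binds a display position to key point numbers (the ranges below are inclusive, matching the tables; the dictionary layout is an illustrative assumption):

```python
# Face and gesture key point numbering from the tables above.
FACE_KEY_POINTS = {
    "face_contour": list(range(0, 33)),                       # 0-32
    "nose_bridge": list(range(43, 47)),                       # 43-46
    "left_eyebrow": list(range(33, 38)) + list(range(64, 68)),
    "right_eyebrow": list(range(38, 43)) + list(range(68, 72)),
    "left_eye_socket": list(range(52, 58)) + [72, 73],
    "right_eye_socket": list(range(58, 64)) + [75, 76],
    "left_pupil": [74, 104],
    "right_pupil": [77, 105],
    "lower_nose_edge": list(range(47, 52)),                   # 47-51
    "outer_nose_contour": list(range(78, 84)),                # 78-83
    "upper_lip": list(range(84, 91)) + list(range(96, 101)),
    "lower_lip": list(range(91, 96)) + list(range(101, 104)),
}
GESTURE_KEY_POINTS = {"frame": [110, 111, 112, 113], "center": [114]}
```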
Claims (93)
- A method for generating a special effect program file package, comprising: importing a set of sub-materials, the set of sub-materials comprising a plurality of sub-materials; acquiring parameter values of play parameters of the set of sub-materials; and generating a special effect program file package according to the set of sub-materials and the parameter values of the play parameters.
- The method according to claim 1, wherein the plurality of sub-materials have a predetermined play timing.
- The method according to claim 2, wherein the play timing of the plurality of sub-materials is determined based on file names of the plurality of sub-materials.
- The method according to any one of claims 1-3, wherein the importing a set of sub-materials comprises: receiving an import instruction input through an interaction interface of an operation bar, and importing a plurality of sub-materials in a material folder pointed to by the import instruction as the set of sub-materials.
- The method according to claim 4, wherein the receiving an import instruction input through an interaction interface of an operation bar and importing a plurality of sub-materials in the material folder pointed to by the import instruction comprises: receiving an import instruction sent through an interaction interface in a play parameter setting interface under the operation bar, and importing the plurality of sub-materials in the material folder pointed to by the import instruction; or, receiving a selection instruction sent through the interaction interface of the operation bar, taking the reference part selected by the selection instruction as the target part to which a special effect currently needs to be added, and displaying, in the operation bar, a play parameter setting interface under the target part; and receiving an import instruction sent through an interaction interface in the play parameter setting interface, and importing the plurality of sub-materials in the material folder pointed to by the import instruction.
- The method according to claim 4 or 5, wherein receiving an import instruction sent through an interaction interface and importing a plurality of sub-materials in the material folder pointed to by the import instruction comprises: receiving the import instruction sent through the interaction interface, and acquiring and displaying the material folder pointed to by the import instruction; in response to receiving a selection operation on sub-materials in the material folder, importing the plurality of sub-materials selected by the selection operation; and/or, in response to not receiving a selection operation on sub-materials in the material folder, selecting all or some of the sub-materials in the material folder according to a preset setting, and importing the sub-materials selected according to the preset setting.
- The method according to any one of claims 1-6, wherein the special effect program file package comprises one set of sub-materials; or, the special effect program file package comprises multiple sets of sub-materials.
- The method according to any one of claims 4-7, wherein the importing the plurality of sub-materials in the material folder pointed to by the import instruction comprises: in response to the import instruction including a display order of the plurality of sub-materials in the material folder pointed to by the import instruction, reading and importing the plurality of sub-materials in the display order, and displaying file names of the imported plurality of sub-materials in the operation bar in the display order; and/or, in response to the import instruction not including a display order of the plurality of sub-materials in the material folder pointed to by the import instruction, reading and importing the plurality of sub-materials in a preset order, and displaying the file names of the imported plurality of sub-materials in the operation bar in the preset order.
- The method according to any one of claims 5-8, wherein the acquiring parameter values of play parameters of the set of sub-materials comprises: in response to receiving parameter values set for the play parameters of the set of sub-materials and sent through the interaction interface in the play parameter setting interface, taking the set parameter values as the parameter values of the play parameters of the set of sub-materials; and/or, in response to not receiving parameter values set for the play parameters of the set of sub-materials and sent through the interaction interface in the play parameter setting interface, taking preset parameter values as the parameter values of the play parameters of the set of sub-materials.
- The method according to any one of claims 1-9, wherein the play parameters of the set of sub-materials include any one or more of the following: a display parameter, used to indicate whether to display the plurality of sub-materials; an interval parameter, used to indicate the number of frames between two adjacent sub-materials displayed among the plurality of sub-materials; a trigger action parameter, used to indicate a trigger action that triggers display of the plurality of sub-materials; a loop parameter, used to indicate the number of times the plurality of sub-materials are played in a loop; a delayed trigger parameter, used to indicate the time by which display of the plurality of sub-materials is delayed; a trigger end parameter, used to indicate an action that ends the display of the plurality of sub-materials; a display size parameter, used to indicate a reference basis for changes in the display size of the plurality of sub-materials; a position type parameter, used to indicate the type of relationship between the plurality of sub-materials and positions; a position association parameter, used to indicate whether the plurality of sub-materials move following a preset reference part; a position parameter, used to indicate a position binding relationship between the plurality of sub-materials and preset key points; a rotation parameter, used to indicate the key points on which rotation of the plurality of sub-materials is based; a beauty/makeup effect parameter, used to indicate the beauty/makeup effect displayed at a preset part when the sub-materials are displayed.
- The method according to claim 10, wherein the trigger action corresponding to the trigger action parameter includes any one or more of the following: no-action trigger, an eye action, a head action, an eyebrow action, a hand action, a mouth action, a shoulder action.
- The method according to claim 10 or 11, wherein the position type parameter includes any one of the following: a parameter used to indicate a foreground; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of a face; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of a hand; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of a head; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of a shoulder; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of an arm; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of a waist; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of legs; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of feet; a parameter used to indicate that the plurality of sub-materials are positioned and/or moved following the position of a human skeleton; a play position relationship related to a reference part; a parameter used to indicate a background.
- The method according to claim 12, wherein the play position relationship related to the reference part includes any one or more of the following: the plurality of sub-materials move following the position of the reference part, and the plurality of sub-materials scale following the size of the reference part; the plurality of sub-materials move following the position of the reference part, scale following the size of the reference part, and perform depth scaling following the rotation of the reference part; the plurality of sub-materials move following the position of the reference part, scale following the size of the reference part, perform depth scaling following the rotation of the reference part, and rotate following the in-plane rotation of the reference part.
- The method according to any one of claims 1-13, wherein the play parameters include: a correspondence between the display position of the set of sub-materials and at least one predetermined key point; the key point includes any one or more of the following: a head key point, a face key point, a shoulder key point, an arm key point, a gesture key point, a waist key point, a leg key point, a foot key point, a human skeleton key point.
- The method according to claim 14, wherein the head key points include any one or more of the following: a top-of-head key point, a nose tip key point, and a chin key point; and/or, the face key points include any one or more of the following: face contour key points, eye key points, eyebrow key points, nose key points, mouth key points; and/or, the shoulder key points include any one or more of the following: a shoulder-head intersection key point located where the shoulder meets the head, and a shoulder contour midpoint key point located at the midpoint between the arm root contour key point and the shoulder-head intersection key point; and/or, the arm key points include any one or more of the following: a wrist contour key point, an elbow contour key point, an arm root contour key point, a forearm contour midpoint key point located at the midpoint between the wrist contour key point and the elbow contour key point, and an upper-arm midpoint key point located at the midpoint between the elbow contour key point and the arm root contour key point; and/or, the gesture key points include any one or more of the following: four vertex key points of a gesture frame, and a center key point of the gesture frame; and/or, the leg key points include any one or more of the following: a crotch key point, a knee contour key point, an ankle contour key point, a thigh root outer contour key point, a calf contour midpoint key point located at the midpoint between the knee contour key point and the ankle contour key point, an inner thigh contour midpoint key point located at the midpoint between the inner knee contour key point and the crotch key point, and an outer thigh contour midpoint key point located at the midpoint between the outer knee contour key point and the thigh root outer contour key point; and/or, the waist key points include any one or more of the following: the N equal-division points produced by dividing the section between the thigh root outer contour key point and the arm root contour key point into N equal parts, where N is greater than 1; and/or, the foot key points include any one or more of the following: a toe key point and a heel key point; and/or, the human skeleton key points include any one or more of the following: a right shoulder skeleton key point, a right elbow skeleton key point, a right wrist skeleton key point, a left shoulder skeleton key point, a left elbow skeleton key point, a left wrist skeleton key point, a right hip skeleton key point, a right knee skeleton key point, a right ankle skeleton key point, a left hip skeleton key point, a left knee skeleton key point, a left ankle skeleton key point, a top-of-head skeleton key point, and a neck skeleton key point.
- The method according to claim 15, wherein the eye key points include any one or more of the following: a left eye socket key point, a left pupil center key point, a left eye center key point, a right eye socket key point, a right pupil center key point, and a right eye center key point; and/or, the eyebrow key points include any one or more of the following: a left eyebrow key point and a right eyebrow key point; and/or, the nose key points include any one or more of the following: a nose bridge key point, a lower nose edge key point, and an outer nose contour key point; and/or, the mouth key points include any one or more of the following: upper lip key points and lower lip key points.
- The method according to any one of claims 14-16, further comprising: establishing the correspondence between the display position of the set of sub-materials and the at least one key point; and/or, establishing a correspondence between the display position of the set of sub-materials and a center key point of a detection frame.
- The method according to any one of claims 14-17, further comprising: displaying a reference image through a content display bar, and displaying key points on the reference image; the reference image includes at least one reference part.
- The method according to claim 18, wherein the reference image comprises: at least a partial image of a reference person.
- The method according to claim 19, wherein the at least partial image of the reference person includes an image of any one or more of the following of the reference person: a complete image, a head image, a face image, a shoulder image, an arm image, a gesture image, a waist image, a leg image, a foot image.
- The method according to any one of claims 18-20, wherein, after the importing a set of sub-materials, the method further comprises: according to the parameter values of the play parameters of the set of sub-materials, displaying, in the content display bar according to a preset display strategy, each sub-material of the imported set of sub-materials in sequence, or simultaneously displaying multiple sub-materials of the imported set of sub-materials; or, receiving a selection operation on a sub-material in the set of sub-materials, and displaying the sub-material selected by the selection operation in the content display bar.
- The method according to claim 21, further comprising: according to a position moving operation on the set of sub-materials or one of the sub-materials received through the content display bar, updating the display position of the set of sub-materials in the content display bar, and updating the corresponding parameter values in the play parameters of the set of sub-materials.
- The method according to claim 21 or 22, further comprising: according to a resizing operation on the set of sub-materials or one of the sub-materials received through the content display bar, updating the display size of the set of sub-materials in the content display bar, and updating the corresponding parameter values in the play parameters of the set of sub-materials.
- The method according to any one of claims 1-23, further comprising: according to a layer parameter adjustment instruction sent for two or more sets of sub-materials and received through the interaction interface of the operation bar, adjusting the occlusion relationship between the two or more sets of sub-materials, and displaying the two or more sets of sub-materials according to the adjusted occlusion relationship and the parameter values of the play parameters.
- The method according to any one of claims 1-24, wherein, before the generating a special effect program file package, the method further comprises: generating a special effect program file of the set of sub-materials according to a preset special effect program file and the parameter values of the play parameters of the set of sub-materials, and displaying the special effect program file of the set of sub-materials through a program file bar.
- The method according to claim 25, wherein the special effect program file comprises: a special effect program file generated as a json program.
- The method according to any one of claims 1-26, further comprising: starting according to a received start instruction, and displaying an operation interface, the operation interface including: an operation bar, a content display bar, and/or a program file bar.
- The method according to claim 27, wherein the operation interface includes three areas: a left side, a middle, and a right side; the displaying an operation interface comprises: displaying the operation bar on the left side of the operation interface, displaying the content display bar in the middle of the operation interface, and displaying the program file bar on the right side of the operation interface.
- The method according to any one of claims 1-28, wherein, after the generating a special effect program file package, the method further comprises: saving the special effect program file package at the location pointed to by a received save instruction.
- The method according to claim 29, wherein the saving the special effect program file package at the location pointed to by the received save instruction comprises: in response to receiving the save instruction, displaying a save path selection interface and a compression interface; receiving a save location sent through the save path selection interface; receiving a compression mode sent through the compression interface, and compressing the special effect program file package according to the compression mode to generate a compressed file package; and storing the compressed file package in the folder pointed to by the save location.
- The method according to any one of claims 1-30, wherein the size of each sub-material in the special effect program file package is kept at the size the sub-material had before being imported.
- A special effect generation method, comprising: acquiring parameter values of play parameters of at least one set of sub-materials in a special effect program file package, wherein a set of sub-materials includes a plurality of sub-materials; performing key point detection on a video image; and generating, on the video image, a special effect based on the at least one set of sub-materials according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials.
- The method according to claim 32, further comprising: importing the special effect program file package; the special effect program file package includes: at least one set of sub-materials and parameter values of play parameters of the at least one set of sub-materials, the parameter values of the play parameters of a set of sub-materials including a correspondence between the display position of the set of sub-materials and at least one predetermined key point.
- The method according to claim 32 or 33, wherein the plurality of sub-materials have a predetermined play timing.
- The method according to claim 34, wherein the play parameters include: a trigger event parameter, used to indicate a trigger event that triggers display of the plurality of sub-materials; the method further comprises: detecting whether a trigger action corresponding to the parameter value of the trigger action parameter occurs in the video image; the generating, on the currently playing video, the special effect of the at least one set of sub-materials based on the detected key points and the parameter values of the play parameters of the at least one set of sub-materials comprises: in response to detecting that the trigger action corresponding to the parameter value of the trigger action parameter occurs in the video image, generating, on the video, a special effect based on the at least one set of sub-materials according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials.
- The method according to claim 35, wherein the play parameters further include: a trigger end parameter, used to indicate an action that ends the display of the plurality of sub-materials; the method further comprises: detecting whether a trigger action corresponding to the parameter value of the trigger end parameter occurs in the video image; and in response to detecting that the trigger action corresponding to the parameter value of the trigger end parameter occurs in the video image, ending the generation of the special effect of the at least one set of sub-materials on the currently playing video.
- The method according to claim 35 or 36, wherein the play parameters include: a beauty/makeup effect parameter, used to indicate the beauty/makeup effect displayed at a preset part when the sub-materials are displayed; the method further comprises: when generating, on the video image, the special effect based on the at least one set of sub-materials according to the detected key points and the parameter values of the play parameters of the at least one set of sub-materials, displaying the beauty/makeup effect at the preset part in the video image according to the beauty/makeup effect parameter.
- The method according to any one of claims 32-37, wherein the importing the special effect program file package comprises: reading the special effect program file package into memory by calling a first interface function for reading sticker materials; and parsing the special effect program file package to obtain the at least one set of sub-materials and a special effect program file, the special effect program file including the parameter values of the play parameters of the at least one set of sub-materials.
- The method according to claim 38, wherein the special effect program file comprises: a special effect program file of a json program.
- The method according to claim 38 or 39, wherein the acquiring parameter values of play parameters of at least one set of sub-materials in the special effect program file package comprises: creating a sticker handle through a second interface function for creating a sticker handle; and reading the at least one set of sub-materials and the parameter values of the play parameters of the at least one set of sub-materials in the special effect program file package, and storing them into the sticker handle.
- The method according to claim 40, further comprising: determining the play timing of the plurality of sub-materials according to the file names of the plurality of sub-materials; and acquiring, according to the parameter values of the play parameters of the at least one set of sub-materials in the special effect program file in the sticker handle and the play timing of the plurality of sub-materials, the position and the number of video frames at which each sub-material of the at least one set of sub-materials is displayed in the video, and reading in advance from the video the video images corresponding to the number of video frames.
- The method according to claim 40 or 41, wherein the generating, on the currently playing video, the special effect of the at least one set of sub-materials based on the detected key points and the parameter values of the play parameters of the at least one set of sub-materials comprises: reading, by calling a third interface function for rendering sticker materials, from the sticker handle the sub-materials that need to be displayed on the current video image of the video; determining, according to the detected key points and the parameter values of the play parameters, the positions at which the sub-materials to be displayed appear on the current video image; and displaying the sub-materials that need to be displayed on the current video image at the determined display positions on the current video image.
- The method according to any one of claims 40-42, further comprising: in response to the playback of the special effect program file package being completed, destroying the sticker handle by calling a fourth interface function for destroying the sticker handle.
- The method according to any one of claims 33-43, wherein the performing key point detection on the video image comprises: performing, through a neural network, the key point detection involved in the correspondence on the video image, and outputting a key point detection result.
- The method according to claim 44, wherein the key point detection result includes any one or more of the following: the positions, in the images of the video, of the key points involved in the correspondence; the preset numbers of the key points involved in the correspondence.
- The method according to any one of claims 32-45, wherein the special effect program file package is a special effect program file package generated by using the method according to any one of claims 1-31.
- An apparatus for generating a special effect program file package, comprising: a first import module, configured to import a set of sub-materials, the set of sub-materials including a plurality of sub-materials; a first acquisition module, configured to acquire parameter values of play parameters of the set of sub-materials; and a first generation module, configured to generate a special effect program file package according to the set of sub-materials and the parameter values of the play parameters.
- The apparatus according to claim 47, wherein the plurality of sub-materials have a predetermined play timing.
- The apparatus according to claim 48, wherein the play timing of the plurality of sub-materials is determined based on file names of the plurality of sub-materials.
- The apparatus according to any one of claims 47-49, further comprising: an operation interface including an operation bar; the first import module is configured to receive an import instruction input through an interaction interface of the operation bar, and import a plurality of sub-materials in the material folder pointed to by the import instruction as the set of sub-materials.
- The apparatus according to claim 50, wherein the first import module is configured to: receive an import instruction sent through an interaction interface in a play parameter setting interface under the operation bar, and import the plurality of sub-materials in the material folder pointed to by the import instruction; or, receive a selection instruction sent through the interaction interface of the operation bar, take the reference part selected by the selection instruction as the target part to which a special effect currently needs to be added, and display, in the operation bar, the play parameter setting interface under the target part; and receive an import instruction sent through the interaction interface in the play parameter setting interface, and import the plurality of sub-materials in the material folder pointed to by the import instruction.
- The apparatus according to claim 50 or 51, wherein the first import module is configured to: receive an import instruction sent through the interaction interface, and acquire and display the material folder pointed to by the import instruction; in response to receiving a selection operation on sub-materials in the material folder, import the plurality of sub-materials selected by the selection operation; and/or, in response to not receiving a selection operation on sub-materials in the material folder, select all or some of the sub-materials in the material folder according to a preset setting, and import the sub-materials selected according to the preset setting.
- The apparatus according to any one of claims 47-52, wherein the special effect program file package comprises one set of sub-materials; or, the special effect program file package comprises multiple sets of sub-materials.
- The apparatus according to any one of claims 50-53, wherein, when importing the plurality of sub-materials in the material folder pointed to by the import instruction, the first import module is configured to: in response to the import instruction including a display order of the plurality of sub-materials in the material folder pointed to by the import instruction, read and import the plurality of sub-materials in the display order, and display the file names of the imported plurality of sub-materials in the operation bar in the display order; and/or, in response to the import instruction not including a display order of the plurality of sub-materials in the material folder pointed to by the import instruction, read and import the plurality of sub-materials in a preset order, and display the file names of the imported plurality of sub-materials in the operation bar in the preset order.
- The apparatus according to any one of claims 51-54, wherein the first acquisition module is configured to: in response to receiving parameter values set for the play parameters of the set of sub-materials and sent through the interaction interface in the play parameter setting interface, take the set parameter values as the parameter values of the play parameters of the set of sub-materials; and/or, in response to not receiving parameter values set for the play parameters of the set of sub-materials and sent through the interaction interface in the play parameter setting interface, take preset parameter values as the parameter values of the play parameters of the set of sub-materials.
- 根据权利要求51-55任一所述的装置,其特征在于,所述一组子素材的播放参数包括以下任意一项或多项;其中:显示参数:用于表示是否显示所述多个子素材;间隔参数:用于表示显示所述多个子素材中相邻两个子素材间隔的帧数;触发动作参数:用于表示触发所述多个子素材显示的触发动作;循环参数:用于表示所述多个子素材的循环播放次数;延迟触发参数:用于表示延迟显示所述多个子素材的时间;触发结束参数:用于表示结束所述多个子素材显示的动作;显示尺寸参数:用于表示多个子素材的显示大小变化的参考依据;位置类型参数:用于表示多个子素材和位置的关系类型;位置关联参数:用于表示多个子素材是否跟随预设参考部位移动;位置参数:用于表示多个子素材与预设关键点之间的位置绑定关系;旋转参数:用于表示多个子素材旋转依据的关键点;美颜/美妆效果参数:用于表示显示子素材时在预设部位显示的美颜/美妆效果。
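权利要求56列举了一组子素材可携带的播放参数。这些参数值可以按如下方式序列化进权利要求72提到的 json 特效程序文件中；此处所有键名与取值均为示意（包括关键点编号），并非公开的实际文件格式：

```python
import json

# Hypothetical serialization of the play-parameter values of one group
# of sub-materials into the JSON effect program file.
effect_program = {
    "play_params": {
        "display": True,                     # 显示参数
        "interval_frames": 2,                # 间隔参数: frames between adjacent sub-materials
        "trigger_action": "mouth_open",      # 触发动作参数
        "loop_count": 3,                     # 循环参数
        "delay_frames": 10,                  # 延迟触发参数
        "trigger_end_action": "head_shake",  # 触发结束参数
        "position_type": "face",             # 位置类型参数
        "follows_reference": True,           # 位置关联参数
        "bound_keypoints": [43, 44],         # 位置参数: keypoint binding (illustrative numbers)
        "rotation_keypoint": 43,             # 旋转参数
        "beauty_effect": "smooth_skin",      # 美颜/美妆效果参数
    }
}

serialized = json.dumps(effect_program)
```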
- 根据权利要求56所述的装置,其特征在于,所述触发动作参数对应的触发动作包括以下任意一项或多项:无动作触发,眼部动作,头部动作,眉部动作,手部动作,嘴部动作,肩部动作。
- 根据权利要求55或56所述的装置,其特征在于,所述位置类型参数包括以下任意一项:用于表示前景的参数;用于表示所述多个子素材跟随脸部位置进行定位和/或移动的参数;用于表示所述多个子素材跟随手的位置进行定位和/或移动的参数;用于表示所述多个子素材跟随头部的位置进行定位和/或移动的参数;用于表示所述多个子素材跟随肩部的位置进行定位和/或移动的参数;用于表示所述多个子素材跟随手臂的位置进行定位和/或移动的参数;用于表示所述多个子素材跟随腰部的位置进行定位和/或移动的参数;用于表示所述多个子素材跟随腿部的位置进行定位和/或移动的参数;用于表示所述多个子素材跟随脚部的位置进行定位和/或移动的参数;用于表示所述多个子素材跟随人体骨骼的位置进行定位和/或移动的参数;与参考部位相关的播放位置关系;用于表示背景的参数。
- 根据权利要求58所述的装置，其特征在于，所述与参考部位相关的播放位置关系包括以下任意一项或多项：所述多个子素材跟随所述参考部位的位置进行移动，所述多个子素材跟随所述参考部位的大小进行缩放；所述多个子素材跟随所述参考部位的位置进行移动，所述多个子素材跟随所述参考部位的大小进行缩放，所述多个子素材跟随所述参考部位的旋转进行纵深缩放；所述多个子素材跟随所述参考部位的位置进行移动，所述多个子素材跟随所述参考部位的大小进行缩放，所述多个子素材跟随所述参考部位的旋转进行纵深缩放，所述多个子素材跟随所述参考部位的平面旋转进行旋转。
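权利要求59描述的"跟随参考部位移动、随其大小缩放、随其平面旋转而旋转"可以示意如下（仅为假设性草图：取参考部位尺寸与 100 像素标称尺寸之比作为缩放因子是本示例自设的约定，权利要求并未规定具体计算方式）：

```python
def place_sticker(ref_center, ref_size, ref_angle, base_size):
    # Move with the reference part's position, scale with its detected
    # size, and rotate with its in-plane rotation (claim 59).
    scale = ref_size / 100.0  # nominal size of 100 px is an assumption
    w = base_size[0] * scale
    h = base_size[1] * scale
    return {"center": ref_center, "size": (w, h), "angle": ref_angle}
```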
- 根据权利要求47-59任一所述的装置,其特征在于,所述播放参数中包括:所述一组子素材的显示位置和预定的至少一关键点之间的对应关系;所述关键点包括以下任意一种或多种:头部关键点,脸部关键点,肩部关键点,手臂关键点,手势关键点,腰部关键点,腿部关键点,脚部关键点,人体骨骼关键点。
- 根据权利要求60所述的装置，其特征在于，所述头部关键点包括以下任意一项或多项：头顶关键点，鼻尖关键点，以及下巴关键点；和/或，所述脸部关键点包括以下任意一项或多项：脸部轮廓关键点，眼睛关键点，眉毛关键点，鼻子关键点，嘴部关键点；和/或，所述肩部关键点包括以下任意一项或多项：位于肩部与头部交汇位置处的肩头交汇关键点，以及位于臂根轮廓关键点与肩头交汇关键点之间的中点位置处的肩轮廓中点关键点；和/或，所述手臂关键点包括以下任意一项或多项：手腕轮廓关键点，胳膊肘轮廓关键点，臂根轮廓关键点，位于手腕轮廓关键点与胳膊肘轮廓关键点之间的中点位置处的小臂轮廓中点关键点，以及位于胳膊肘轮廓关键点与臂根轮廓关键点之间的中点位置处的大臂中点关键点；和/或，所述手势关键点包括以下任意一项或多项：手势框的四个顶点关键点，以及手势框的中心关键点；和/或，所述腿部关键点包括以下任意一项或多项：裆部关键点，膝盖轮廓关键点，脚踝轮廓关键点，大腿根部外侧轮廓关键点，位于膝盖轮廓关键点与脚踝轮廓关键点之间的中点位置处的小腿轮廓中点关键点，位于膝盖内轮廓关键点与裆部关键点之间的中点位置处的大腿内轮廓中点关键点，以及位于膝盖外轮廓关键点与大腿根部外侧轮廓关键点之间的中点位置处的大腿外轮廓中点关键点；和/或，所述腰部关键点包括以下任意一项或多项：将大腿根部外侧轮廓关键点与臂根轮廓关键点之间N等分，所产生的N个等分点；其中，所述N大于1；和/或，所述脚部关键点包括以下任意一项或多项：脚尖关键点以及足跟关键点；和/或，所述人体骨骼关键点包括以下任意一项或多项：右肩骨骼关键点，右肘骨骼关键点，右腕骨骼关键点，左肩骨骼关键点，左肘骨骼关键点，左腕骨骼关键点，右髋骨骼关键点，右膝骨骼关键点，右踝骨骼关键点，左髋骨骼关键点，左膝骨骼关键点，左踝骨骼关键点，头顶骨骼关键点，以及脖子骨骼关键点。
- 根据权利要求61所述的装置,其特征在于,所述眼睛关键点包括以下任意一项或多项:左眼眶关键点,左眼瞳孔中心关键点,左眼中心关键点,右眼眶关键点,右眼瞳孔中心关键点,以及右眼中心关键点;和/或,所述眉毛关键点包括以下任意一项或多项:左眉毛关键点以及右眉毛关键点;和/或,所述鼻子关键点包括以下任意一项或多项:鼻梁关键点,鼻子下沿关键点,以及鼻子外侧轮廓关键点;和/或,所述嘴部关键点包括以下任意一项或多项:上嘴唇关键点,以及下嘴唇关键点。
- 根据权利要求47-62任一所述的装置,其特征在于,所述第一获取模块,还用于:建立所述一组子素材的显示位置和所述至少一关键点之间的对应关系;和/或,建立所述一组子素材的显示位置和检测框的中心关键点之间的对应关系。
- 根据权利要求60-63任一所述的装置,其特征在于,所述操作界面还包括内容显示栏,用于显示参考图像,并显示所述参考图像上的关键点;所述参考图像包括至少一个参考部位。
- 根据权利要求64所述的装置,其特征在于,所述参考图像包括:参考人物的至少一部分图像。
- 根据权利要求65所述的装置,其特征在于,所述参考人物的至少一部分图像包括所述参考人物的以下任意一项或多项的图像:完整图像,头部图像,脸部图像,肩部图像,手臂图像,手势图像,腰部图像,腿部图像,脚部图像。
- 根据权利要求64-66任一所述的装置,其特征在于,所述内容显示栏,还用于:根据所述一组子素材的播放参数的参数值,按照预设显示策略在所述内容显示栏依次显示导入的所述一组子素材中的每个子素材、或者同时显示导入的所述一组子素材中的多个子素材;或者,接收到对所述一组子素材中子素材的选取操作,在所述内容显示栏显示所述选取操作选取的子素材。
- 根据权利要求67所述的装置,其特征在于,还包括:第一更新模块,用于根据通过所述内容显示栏接收到的对所述一组子素材或其中一个子素材的位置移动操作,更新所述一组子素材在所述内容显示栏的显示位置,并对所述一组子素材的播放参数中的相应参数值进行更新。
- 根据权利要求67或68所述的装置,其特征在于,还包括:第二更新模块,用于根据通过所述内容显示栏接收到的对所述一组子素材或其中一个子素材的大小调整操作,更新所述一组子素材在所述内容显示栏的显示大小,并对所述一组子素材的播放参数中的相应参数值进行更新。
- 根据权利要求47-69任一所述的装置,其特征在于,还包括:调整模块,用于根据通过所述操作栏的交互接口接收到的针对两组或以上子素材发送的图层参数调整指令,调整所述两组或以上子素材之间的遮挡关系,并根据调整后的遮挡关系和所述播放参数的参数值显示所述两组或以上子素材。
- 根据权利要求47-70任一所述的装置,其特征在于,所述操作界面还包括:程序文件栏,用于根据预先设置的特效程序文件和所述一组子素材的播放参数的参数值,生成所述一组子素材的特效程序文件,并通过程序文件栏显示所述一组子素材的特效程序文件。
- 根据权利要求71所述的装置,其特征在于,所述特效程序文件包括:以json程序生成的特效程序文件。
- 根据权利要求47-72任一所述的装置,其特征在于,所述操作界面包括左侧、中部和右侧三个区域;在所述操作界面的左侧显示所述操作栏,在所述操作界面的中部显示所述内容显示栏,在所述操作界面右侧显示所述程序文件栏。
- 根据权利要求47-73任一所述的装置，其特征在于，还包括：保存模块，用于根据接收到的保存指令在所述保存指令指向的位置保存所述特效程序文件包。
- 根据权利要求74所述的装置,其特征在于,所述保存模块,用于:响应于接收到保存指令,显示保存路径选择接口和压缩接口;接收通过所述保存路径选择接口发送的保存位置;以及接收基于所述压缩接口发送的压缩方式,并根据所述压缩方式对所述子素材的特效程序文件包进行压缩,生成压缩文件包;将所述压缩文件包存储至所述保存位置指向的文件夹中。
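权利要求75描述的"接收压缩方式、按该方式压缩特效程序文件包并存入保存位置"可以示意如下（仅为假设性草图：此处只示范 deflate 与 store 两种压缩方式，函数名与参数均为虚构）：

```python
import os
import zipfile

def save_effect_package(folder, save_path, compression="deflate"):
    # On a save instruction: compress the program file package folder
    # with the chosen method and write it to the selected location.
    method = zipfile.ZIP_DEFLATED if compression == "deflate" else zipfile.ZIP_STORED
    with zipfile.ZipFile(save_path, "w", method) as pkg:
        for root, _, files in os.walk(folder):
            for name in files:
                full = os.path.join(root, name)
                pkg.write(full, os.path.relpath(full, folder))
    return save_path
```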
- 根据权利要求47-75任一所述的装置,其特征在于,所述特效程序文件包中子素材的大小保持为所述子素材被导入前的大小。
- 一种特效生成装置,其特征在于,包括:第二获取模块,用于获取特效程序文件包中至少一组子素材的播放参数的参数值;其中,一组子素材包括多个子素材;第一检测模块,用于对视频图像进行关键点检测;第二生成模块,用于根据检测到的关键点和所述至少一组子素材的播放参数的参数值,在视频图像上生成基于所述至少一组子素材的特效。
- 根据权利要求77所述的装置，其特征在于，还包括：第二导入模块，用于导入所述特效程序文件包；所述特效程序文件包包括至少一组子素材和所述至少一组子素材的播放参数的参数值，所述一组子素材的播放参数的参数值包括所述一组子素材的显示位置和预定的至少一关键点之间的对应关系。
- 根据权利要求77或78所述的装置,其特征在于,所述多个子素材具有预定的播放时序。
- 根据权利要求79所述的装置，其特征在于，所述播放参数包括：触发动作参数，所述触发动作参数用于表示触发所述多个子素材显示的触发动作；所述装置还包括：第二检测模块，用于检测所述视频图像中是否出现所述触发动作参数的参数值对应的触发动作；所述第二生成模块用于：响应于检测到所述视频图像中出现所述触发动作参数的参数值对应的触发动作，根据检测到的关键点和所述至少一组子素材的播放参数的参数值，在所述视频上生成基于所述至少一组子素材的特效。
- 根据权利要求80所述的装置，其特征在于，所述播放参数还包括：触发结束参数，所述触发结束参数用于表示结束所述多个子素材显示的动作；所述第二检测模块，还用于检测所述视频图像中是否出现所述触发结束参数的参数值对应的触发动作；所述第二生成模块，还用于响应于检测到所述视频图像中出现所述触发结束参数的参数值对应的触发动作，结束在所述当前正在播放的视频上生成所述至少一组子素材的特效。
- 根据权利要求80或81所述的装置,其特征在于,所述播放参数包括:美颜/美妆效果参数,所述美颜/美妆效果参数用于表示显示子素材时在预设部位显示的美颜/美妆效果;所述第二生成模块,还用于根据检测到的关键点和所述至少一组子素材的播放参数的参数值,在视频图像上生成基于所述至少一组子素材的特效时,根据所述美颜/美妆效果参数,在所述视频图像中的所述预设部位显示美颜/美妆效果。
- 根据权利要求78-82任一所述的装置，其特征在于，所述第二导入模块，用于：通过调用用于读取贴纸素材的第一接口函数，将所述特效程序文件包读入内存；解析所述特效程序文件包，获得所述至少一组子素材和特效程序文件，所述特效程序文件包括所述至少一组子素材的播放参数的参数值。
- 根据权利要求83所述的装置,其特征在于,所述特效程序文件包括:json程序的特效程序文件。
- 根据权利要求83或84所述的装置,其特征在于,所述第二获取模块,用于:通过用于创建贴纸句柄的第二接口函数创建贴纸句柄;读取所述至少一组子素材和所述特效程序文件包中至少一组子素材的播放参数的参数值、并存储至所述贴纸句柄中。
- 根据权利要求85所述的装置,其特征在于,还包括:确定模块,用于根据所述多个子素材的文件名确定所述多个子素材的播放时序;第三获取模块,用于根据所述贴纸句柄中所述特效程序文件中至少一组子素材的播放参数的参数值和所述多个子素材的播放时序,获取所述至少一组子素材中每个子素材在所述视频中显示的位置和视频帧数,并预先从所述视频中读取所述视频帧数对应的视频图像。
- 根据权利要求85或86所述的装置,其特征在于,所述第二生成模块,用于:通过用于调用渲染贴纸素材的第三接口函数,从所述贴纸句柄中读取需要显示在所述视频的当前视频图像上的子素材;根据所述检测到的关键点和所述播放参数的参数值,确定所述需要显示的子素材在当前视频图像上显示的位置;将所述需要显示在所述当前视频图像上的子素材显示在所述当前视频图像上的所述显示的位置上。
- 根据权利要求85-87任一所述的装置,其特征在于,所述第二获取模块,还用于响应于所述特效程序文件包播放完毕,通过用于调用销毁贴纸句柄的第四接口函数销毁所述贴纸句柄。
- 根据权利要求77-88任一所述的装置,其特征在于,所述第一检测模块,用于通过神经网络,对所述视频图像进行所述对应关系涉及的关键点检测,并输出关键点检测结果。
- 根据权利要求89所述的装置，其特征在于，所述关键点检测结果包括以下任意一项或多项：所述对应关系涉及的关键点在所述视频图像中的位置；所述对应关系涉及的关键点的预设编号。
- 根据权利要求77-90任一所述的装置,其特征在于,所述特效程序文件包为采用如权利要求1-31任一所述的方法或者如权利要求47-76任一所述的装置生成的特效程序文件包。
- 一种电子设备,其特征在于,包括:存储器,用于存储计算机程序;处理器,用于执行所述存储器中存储的计算机程序,且所述计算机程序被执行时,实现上述权利要求1-46中任一项所述的方法。
- 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该计算机程序被处理器执行时,实现上述权利要求1-46中任一项所述的方法。
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020536227A JP7167165B2 (ja) | 2018-02-08 | 2019-02-01 | 特殊効果プログラムファイルパッケージの生成及び特殊効果生成方法、装置並びに電子機器 |
AU2019218423A AU2019218423A1 (en) | 2018-02-08 | 2019-02-01 | Method and device for generating special effect program file package, method and device for generating special effect, and electronic device |
SG11202006351VA SG11202006351VA (en) | 2018-02-08 | 2019-02-01 | Method and device for generating special effect program file package, method and device for generating special effect, and electronic device |
KR1020207019275A KR102466689B1 (ko) | 2018-02-08 | 2019-02-01 | 특수 효과 프로그램 파일 패키지 및 특수 효과의 생성 방법과 장치, 전자 기기 |
EP19750743.7A EP3751413A4 (en) | 2018-02-08 | 2019-02-01 | METHOD AND DEVICE FOR GENERATING A SET OF SPECIAL EFFECT PROGRAM FILES, METHOD AND DEVICE FOR GENERATING A SPECIAL EFFECT, AND ELECTRONIC DEVICE |
US16/914,622 US11368746B2 (en) | 2018-02-08 | 2020-06-29 | Method and device for generating special effect program file package, method and device for generating special effect, and electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810129969.7A CN108388434B (zh) | 2018-02-08 | 2018-02-08 | 特效程序文件包的生成及特效生成方法与装置、电子设备 |
CN201810129969.7 | 2018-02-08 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/914,622 Continuation US11368746B2 (en) | 2018-02-08 | 2020-06-29 | Method and device for generating special effect program file package, method and device for generating special effect, and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019154339A1 true WO2019154339A1 (zh) | 2019-08-15 |
Family
ID=63075383
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/074503 WO2019154339A1 (zh) | 2018-02-08 | 2019-02-01 | 特效程序文件包的生成及特效生成方法与装置、电子设备 |
Country Status (8)
Country | Link |
---|---|
US (1) | US11368746B2 (zh) |
EP (1) | EP3751413A4 (zh) |
JP (1) | JP7167165B2 (zh) |
KR (1) | KR102466689B1 (zh) |
CN (2) | CN108388434B (zh) |
AU (1) | AU2019218423A1 (zh) |
SG (1) | SG11202006351VA (zh) |
WO (1) | WO2019154339A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113591267A (zh) * | 2021-06-17 | 2021-11-02 | 东风汽车集团股份有限公司 | 一种变速箱壳体悬置强度的分析方法及装置 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108259496B (zh) | 2018-01-19 | 2021-06-04 | 北京市商汤科技开发有限公司 | 特效程序文件包的生成及特效生成方法与装置、电子设备 |
CN108388434B (zh) * | 2018-02-08 | 2021-03-02 | 北京市商汤科技开发有限公司 | 特效程序文件包的生成及特效生成方法与装置、电子设备 |
CN110858409A (zh) * | 2018-08-24 | 2020-03-03 | 北京微播视界科技有限公司 | 动画生成方法和装置 |
CN109167936A (zh) * | 2018-10-29 | 2019-01-08 | Oppo广东移动通信有限公司 | 一种图像处理方法、终端及存储介质 |
CN110428390B (zh) * | 2019-07-18 | 2022-08-26 | 北京达佳互联信息技术有限公司 | 一种素材展示方法、装置、电子设备和存储介质 |
CN113497898B (zh) | 2020-04-02 | 2023-04-07 | 抖音视界有限公司 | 视频特效配置文件生成方法、视频渲染方法及装置 |
CN112637518B (zh) * | 2020-12-21 | 2023-03-24 | 北京字跳网络技术有限公司 | 模拟拍照特效的生成方法、装置、设备及介质 |
CN115239845A (zh) * | 2021-04-25 | 2022-10-25 | 北京字跳网络技术有限公司 | 一种特效配置文件的生成方法、装置、设备及介质 |
CN113760161A (zh) * | 2021-08-31 | 2021-12-07 | 北京市商汤科技开发有限公司 | 数据生成、图像处理方法、装置、设备及存储介质 |
CN116225267A (zh) * | 2021-11-30 | 2023-06-06 | 北京字节跳动网络技术有限公司 | 图像特效包的生成方法、装置、设备及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102801924A (zh) * | 2012-07-20 | 2012-11-28 | 合肥工业大学 | 一种基于Kinect的电视节目主持互动系统 |
CN102984465A (zh) * | 2012-12-20 | 2013-03-20 | 北京中科大洋科技发展股份有限公司 | 一种适用于网络化智能化数字媒体的节目合成系统及方法 |
CN107341435A (zh) * | 2016-08-19 | 2017-11-10 | 北京市商汤科技开发有限公司 | 视频图像的处理方法、装置和终端设备 |
CN108388434A (zh) * | 2018-02-08 | 2018-08-10 | 北京市商汤科技开发有限公司 | 特效程序文件包的生成及特效生成方法与装置、电子设备 |
Family Cites Families (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1118005A (ja) | 1997-06-26 | 1999-01-22 | Nagano Nippon Denki Software Kk | 画像効果処理方式およびコンピュータ |
GB2340358B (en) | 1998-07-31 | 2002-11-13 | Sony Uk Ltd | Video special effects |
GB2340360B (en) | 1998-07-31 | 2002-11-06 | Sony Uk Ltd | Animation of video special effects |
JP2001307123A (ja) | 2000-02-18 | 2001-11-02 | Nippon Telegr & Teleph Corp <Ntt> | 表情変形のある似顔絵作成方法及び装置、似顔絵作成システム、似顔絵作成システム用送信機及び受信機、並びに、似顔絵作成プログラム及び似顔絵作成プログラムを記録した記録媒体 |
WO2002050657A1 (en) * | 2000-12-19 | 2002-06-27 | Coolernet, Inc. | System and method for multimedia authoring and playback |
WO2003005201A1 (fr) | 2001-07-04 | 2003-01-16 | Okyz | Procede et systeme d'exportation de donnees associees a des entites geometriques bidimensionnelles ou tridimensionnelles |
JP2003092706A (ja) * | 2001-09-18 | 2003-03-28 | Sony Corp | 効果付加装置、効果付加方法、及び効果付加プログラム |
US8737810B2 (en) | 2002-11-15 | 2014-05-27 | Thomson Licensing | Method and apparatus for cropping of subtitle elements |
JP2004171184A (ja) * | 2002-11-19 | 2004-06-17 | Toppan Printing Co Ltd | Webサーバ及びWebコンテンツ配信方法 |
TWI227444B (en) * | 2003-12-19 | 2005-02-01 | Inst Information Industry | Simulation method for make-up trial and the device thereof |
JP2005242566A (ja) * | 2004-02-25 | 2005-09-08 | Canon Inc | 画像合成装置及び方法 |
CN1564202A (zh) * | 2004-03-16 | 2005-01-12 | 无敌科技(西安)有限公司 | 图像过场动画特效的生成及播放方法 |
US7903927B2 (en) * | 2004-07-08 | 2011-03-08 | Sony Corporation | Editing apparatus and control method thereof, and program and recording medium |
JP2006260198A (ja) * | 2005-03-17 | 2006-09-28 | Toshiba Corp | 仮想化粧装置、仮想化粧方法および仮想化粧プログラム |
FR2884008A1 (fr) | 2005-03-31 | 2006-10-06 | France Telecom | Systeme et procede de localisation de points d'interet dans une image d'objet mettant en oeuvre un reseau de neurones |
JP4799105B2 (ja) * | 2005-09-26 | 2011-10-26 | キヤノン株式会社 | 情報処理装置及びその制御方法、コンピュータプログラム、記憶媒体 |
JP4760349B2 (ja) | 2005-12-07 | 2011-08-31 | ソニー株式会社 | 画像処理装置および画像処理方法、並びに、プログラム |
US20070153091A1 (en) | 2005-12-29 | 2007-07-05 | John Watlington | Methods and apparatus for providing privacy in a communication system |
JP2007257585A (ja) | 2006-03-27 | 2007-10-04 | Fujifilm Corp | 画像処理方法および装置ならびにプログラム |
KR20100069648A (ko) | 2007-09-12 | 2010-06-24 | 휴베이스-아이.인크 | 플래쉬 파일 생성 시스템 및 오리지널 화상 정보 생성 시스템 |
US20110289455A1 (en) * | 2010-05-18 | 2011-11-24 | Microsoft Corporation | Gestures And Gesture Recognition For Manipulating A User-Interface |
JP2012113677A (ja) * | 2010-11-05 | 2012-06-14 | Aitia Corp | 情報処理装置および情報処理プログラム |
CN102567031A (zh) | 2012-03-01 | 2012-07-11 | 盛乐信息技术(上海)有限公司 | 视频特效扩充方法及系统 |
US8824793B2 (en) | 2012-03-02 | 2014-09-02 | Adobe Systems Incorporated | Methods and apparatus for applying a bokeh effect to images |
CN102760303A (zh) * | 2012-07-24 | 2012-10-31 | 南京仕坤文化传媒有限公司 | 一种虚拟现实动态场景视频的拍摄技术与嵌入方法 |
US9076247B2 (en) | 2012-08-10 | 2015-07-07 | Ppg Industries Ohio, Inc. | System and method for visualizing an object in a simulated environment |
WO2014028009A1 (en) | 2012-08-15 | 2014-02-20 | Empire Technology Development Llc | Digital media privacy protection |
CN104637078B (zh) | 2013-11-14 | 2017-12-15 | 腾讯科技(深圳)有限公司 | 一种图像处理方法及装置 |
US9501701B2 (en) * | 2014-01-31 | 2016-11-22 | The Charles Stark Draper Technology, Inc. | Systems and methods for detecting and tracking objects in a video stream |
CN103928039B (zh) * | 2014-04-15 | 2016-09-21 | 北京奇艺世纪科技有限公司 | 一种视频合成方法及装置 |
CN104967893B (zh) * | 2014-07-10 | 2019-03-29 | 腾讯科技(北京)有限公司 | 便携电子设备的视频生成方法和装置 |
CN105451090B (zh) * | 2014-08-26 | 2019-03-29 | 联想(北京)有限公司 | 图像处理方法和图像处理装置 |
EP2993895A1 (en) * | 2014-09-05 | 2016-03-09 | Canon Kabushiki Kaisha | Image capturing apparatus and control method therefor |
CN104394331A (zh) * | 2014-12-05 | 2015-03-04 | 厦门美图之家科技有限公司 | 一种画面视频中添加匹配音效的视频处理方法 |
CN104469179B (zh) * | 2014-12-22 | 2017-08-04 | 杭州短趣网络传媒技术有限公司 | 一种将动态图片结合到手机视频中的方法 |
CN104778712B (zh) * | 2015-04-27 | 2018-05-01 | 厦门美图之家科技有限公司 | 一种基于仿射变换的人脸贴图方法和系统 |
CN106296781B (zh) | 2015-05-27 | 2020-09-22 | 深圳超多维科技有限公司 | 特效图像生成方法及电子设备 |
JP6754619B2 (ja) | 2015-06-24 | 2020-09-16 | 三星電子株式会社Samsung Electronics Co.,Ltd. | 顔認識方法及び装置 |
CN105975935B (zh) * | 2016-05-04 | 2019-06-25 | 腾讯科技(深圳)有限公司 | 一种人脸图像处理方法和装置 |
CN106097417B (zh) | 2016-06-07 | 2018-07-27 | 腾讯科技(深圳)有限公司 | 主题生成方法、装置、设备 |
CN106101576B (zh) | 2016-06-28 | 2019-07-16 | Oppo广东移动通信有限公司 | 一种增强现实照片的拍摄方法、装置及移动终端 |
CN106231205B (zh) | 2016-08-10 | 2019-07-30 | 苏州黑盒子智能科技有限公司 | 增强现实移动终端 |
CN106341720B (zh) * | 2016-08-18 | 2019-07-26 | 北京奇虎科技有限公司 | 一种在视频直播中添加脸部特效的方法及装置 |
CN106373170A (zh) | 2016-08-31 | 2017-02-01 | 北京云图微动科技有限公司 | 一种视频制作方法及装置 |
EP3321846A1 (en) | 2016-11-15 | 2018-05-16 | Mastercard International Incorporated | Systems and methods for secure biometric sample raw data storage |
US10733699B2 (en) * | 2017-10-24 | 2020-08-04 | Deep North, Inc. | Face replacement and alignment |
CN108259496B (zh) * | 2018-01-19 | 2021-06-04 | 北京市商汤科技开发有限公司 | 特效程序文件包的生成及特效生成方法与装置、电子设备 |
US20190236547A1 (en) * | 2018-02-01 | 2019-08-01 | Moxtra, Inc. | Record and playback for online collaboration sessions |
2018
- 2018-02-08 CN CN201810129969.7A patent/CN108388434B/zh active Active
- 2018-02-08 CN CN202110252012.3A patent/CN112860168B/zh active Active

2019
- 2019-02-01 EP EP19750743.7A patent/EP3751413A4/en active Pending
- 2019-02-01 SG SG11202006351VA patent/SG11202006351VA/en unknown
- 2019-02-01 JP JP2020536227A patent/JP7167165B2/ja active Active
- 2019-02-01 WO PCT/CN2019/074503 patent/WO2019154339A1/zh unknown
- 2019-02-01 AU AU2019218423A patent/AU2019218423A1/en not_active Abandoned
- 2019-02-01 KR KR1020207019275A patent/KR102466689B1/ko active IP Right Grant

2020
- 2020-06-29 US US16/914,622 patent/US11368746B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102801924A (zh) * | 2012-07-20 | 2012-11-28 | 合肥工业大学 | 一种基于Kinect的电视节目主持互动系统 |
CN102984465A (zh) * | 2012-12-20 | 2013-03-20 | 北京中科大洋科技发展股份有限公司 | 一种适用于网络化智能化数字媒体的节目合成系统及方法 |
CN107341435A (zh) * | 2016-08-19 | 2017-11-10 | 北京市商汤科技开发有限公司 | 视频图像的处理方法、装置和终端设备 |
CN108388434A (zh) * | 2018-02-08 | 2018-08-10 | 北京市商汤科技开发有限公司 | 特效程序文件包的生成及特效生成方法与装置、电子设备 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3751413A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113591267A (zh) * | 2021-06-17 | 2021-11-02 | 东风汽车集团股份有限公司 | 一种变速箱壳体悬置强度的分析方法及装置 |
CN113591267B (zh) * | 2021-06-17 | 2023-12-19 | 东风汽车集团股份有限公司 | 一种变速箱壳体悬置强度的分析方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
AU2019218423A1 (en) | 2020-09-24 |
JP2021508883A (ja) | 2021-03-11 |
KR102466689B1 (ko) | 2022-11-14 |
KR20200093034A (ko) | 2020-08-04 |
US20200329272A1 (en) | 2020-10-15 |
CN108388434A (zh) | 2018-08-10 |
CN108388434B (zh) | 2021-03-02 |
EP3751413A1 (en) | 2020-12-16 |
US11368746B2 (en) | 2022-06-21 |
JP7167165B2 (ja) | 2022-11-08 |
EP3751413A4 (en) | 2021-04-07 |
SG11202006351VA (en) | 2020-07-29 |
CN112860168B (zh) | 2022-08-02 |
CN112860168A (zh) | 2021-05-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019154339A1 (zh) | 特效程序文件包的生成及特效生成方法与装置、电子设备 | |
WO2019141126A1 (zh) | 特效程序文件包的生成及特效生成方法与装置、电子设备 | |
WO2019154337A1 (zh) | 变形特效程序文件包的生成及变形特效生成方法与装置 | |
CN108711180B (zh) | 美妆和/或换脸特效程序文件包的生成及美妆和/或换脸特效生成方法与装置 | |
WO2019154338A1 (zh) | 描边特效程序文件包的生成及描边特效生成方法与装置 | |
CN109035373B (zh) | 三维特效程序文件包的生成及三维特效生成方法与装置 | |
US20220327755A1 (en) | Artificial intelligence for capturing facial expressions and generating mesh data | |
CN116437137B (zh) | 直播处理方法、装置、电子设备及存储介质 | |
US20230063681A1 (en) | Dynamic augmentation of stimuli based on profile of user | |
TW202247107A (zh) | 用於訓練模型之臉部擷取人工智慧 | |
CN115272631A (zh) | 一种虚拟面部模型处理方法及相关设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19750743 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020536227 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20207019275 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2019750743 Country of ref document: EP Effective date: 20200907 |
|
ENP | Entry into the national phase |
Ref document number: 2019218423 Country of ref document: AU Date of ref document: 20190201 Kind code of ref document: A |