CN112509100A - Optimization method and device for dynamic character production

Optimization method and device for dynamic character production

Info

Publication number
CN112509100A
CN112509100A (application CN202011520914.2A)
Authority
CN
China
Prior art keywords
static
motion
action
character
action instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011520914.2A
Other languages
Chinese (zh)
Inventor
邵猛 (Shao Meng)
魏博 (Wei Bo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qianhai Hand Painted Technology and Culture Co Ltd
Original Assignee
Shenzhen Qianhai Hand Painted Technology and Culture Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Qianhai Hand Painted Technology and Culture Co Ltd filed Critical Shenzhen Qianhai Hand Painted Technology and Culture Co Ltd
Priority to CN202011520914.2A
Publication of CN112509100A
Priority to PCT/CN2021/101686 (published as WO2022134506A1)
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06T 13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/10021 Stereoscopic video; Stereoscopic image sequence

Abstract

The invention provides an optimization method for producing a dynamic character. The method first parses each part of the static character material and matches received action instructions to the corresponding parts, then parses the action instruction packet to obtain the key nodes and motion path of each part of the static character, and finally derives the motion track of the static character from the dynamic values of two adjacent key nodes and the motion path between them, thereby completing the production of the dynamic character.

Description

Optimization method and device for dynamic character production
Technical Field
The invention belongs to the technical field of short video production, and particularly relates to an optimization method and device for dynamic character production, an electronic device, and a storage medium.
Background
In animated video production, dynamic characters are frequently needed. These dynamic characters must be produced and provided by the platform side. In the conventional production method, a static material is first produced, different parts of the static material are assigned different animations in each animation frame, and the dynamic character material is formed by playing the animation frames continuously. A dynamic character file produced in this way contains the information of every animation frame, so the animation material is large, network transmission, loading, and parsing are inefficient, the animation production process is not smooth, and the playback performance overhead of the animation video is high.
Therefore, how to reduce the animation frame information stored in the material file by optimizing the way dynamic character animation frames are represented, while still guaranteeing the playback performance of the animation video when the reduced frames are loaded and parsed, is a technical problem that currently needs to be solved.
Disclosure of Invention
In order to solve the technical defects, the invention provides an optimization method for dynamic character production, which comprises the following steps:
loading the static character materials to record initial values of the static character materials in an initial state;
receiving an action instruction packet, and driving each part of the static character to execute a specific action according to the action instructions;
parsing the action instruction packet to obtain the key nodes and motion paths of the motion of each part of the static character;
and matching action instructions, namely matching the action instructions to each part of the static character material according to a first dynamic value and a second dynamic value of the static character material at an adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character.
Correspondingly, the invention also provides an optimizing device for dynamic character production, which comprises:
the static character material loading module is used for recording the initial values of the static character material in the initial state;
the action instruction packet receiving module is used for driving each part of the static character to execute specific actions according to the action instructions;
the action instruction packet parsing module is used for obtaining the key nodes and motion paths of the motion of each part of the static character;
and the action instruction matching module is used for matching the action instructions to each part of the static character material according to the first dynamic value and the second dynamic value of the static character material at the adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character.
Description of technical effects: the method first parses each part of the static character material and matches the received action instructions to the corresponding parts, then parses the action instruction packet to obtain the key nodes and motion path of each part of the static character, and finally derives the motion track of the static character from the dynamic values of two adjacent key nodes and the motion path between them, thereby completing the production of the dynamic character.
It should be noted that by recording the dynamic values and the motion trajectories of two adjacent key nodes, the dynamic values and the animation frames at the corresponding points of each part of the static character material between the two key nodes can be obtained.
It should be further explained that, in this way, the animation effect of each point on the motion trajectory does not need to be recorded or actually obtained, and the dynamic value of any point on the motion trajectory and the corresponding animation frame can be calculated only by knowing the key node and the motion trajectory.
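To make the idea concrete, the following is a minimal sketch of how a frame between two adjacent key nodes could be computed. It assumes a simple linear motion path, and all names in it (PartState, KeyNode, interpolatePart) are illustrative assumptions rather than definitions taken from the patent.

```typescript
// A minimal sketch assuming linear interpolation between two key nodes;
// types and names are hypothetical, not part of the patent disclosure.
interface PartState {
  x: number;
  y: number;
  rotation: number;
  scale: number;
  opacity: number;
}

interface KeyNode {
  time: number;      // seconds from the start of the action stage
  value: PartState;  // dynamic value of one part at this key node
}

// Compute the dynamic value of a part at time t between two adjacent key
// nodes; a real motion path could use an easing curve instead of a straight
// linear blend.
function interpolatePart(a: KeyNode, b: KeyNode, t: number): PartState {
  const u = (t - a.time) / (b.time - a.time); // normalized position in [0, 1]
  const lerp = (p: number, q: number) => p + (q - p) * u;
  return {
    x: lerp(a.value.x, b.value.x),
    y: lerp(a.value.y, b.value.y),
    rotation: lerp(a.value.rotation, b.value.rotation),
    scale: lerp(a.value.scale, b.value.scale),
    opacity: lerp(a.value.opacity, b.value.opacity),
  };
}
```

Under this scheme only the two key nodes and the motion path need to be stored; every intermediate animation frame is derived on demand at playback time.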
Further, the step of loading the static character material to record the initial values of the static character material in the initial state comprises:
parsing the specific materials of the static character;
and acquiring the initial values of the static character material.
Correspondingly, the static character material loading module comprises:
the static character parsing unit is used for parsing the specific materials of the static character;
and the initial value acquisition unit is used for acquiring the initial values of the static character material.
Description of technical effects: each material of the static character is loaded and parsed, and the initial state values of each part of the static character material are obtained for comparison with subsequent motion values.
It should be noted that separating the parts of the static character helps the character complete various actions during motion; how each part should move is determined by the action instructions, so that different actions can be realized according to different instructions.
It should be further explained that a static character is decomposed into different materials, and the action values of the different materials at various time points are combined into the action value of the whole static character.
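As an illustration of this decomposition (a sketch under assumed names, not the patent's actual data format), a static character could be represented as a set of part materials, each carrying the initial value recorded when the material is loaded:

```typescript
// Hypothetical representation of a static character decomposed into part
// materials; field names are illustrative only.
interface PartState {
  x: number;
  y: number;
  rotation: number;
  scale: number;
  opacity: number;
}

interface PartMaterial {
  name: string;        // e.g. "head", "leftArm", "torso" (example part names)
  imageUrl: string;    // the static image asset for this part
  initial: PartState;  // initial value recorded when the material is loaded
}

interface StaticCharacter {
  parts: PartMaterial[];
}

// Record the initial value of every part so that later motion values can be
// compared against the rest state.
function recordInitialValues(character: StaticCharacter): Map<string, PartState> {
  const initialValues = new Map<string, PartState>();
  for (const part of character.parts) {
    initialValues.set(part.name, { ...part.initial });
  }
  return initialValues;
}
```

Combining the per-part values at each time point then yields the action value of the whole character, as described above.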
Further, the step of parsing the action instruction packet to obtain the key nodes and motion paths of the motion of each part of the static character includes:
determining the type of the animation;
and dividing the action instructions into different action execution stages according to the animation type.
Correspondingly, the action instruction packet analysis module comprises:
a judging unit that judges a type of the animation;
and the segmentation unit is used for dividing the action instruction into different stages according to the animation type.
Description of technical effects: because animation types differ, the action instructions can be divided in time into different stages according to the different actions the static character must complete during the animation.
It should be noted that dividing the motion process of the static character into different action execution stages according to the action instructions means that, within a given stage, the static character executes a single action instruction, which reduces that stage to a simple animation.
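A rough sketch of this staging step is shown below; the animation type values and field names are assumptions made for illustration, not definitions from the patent.

```typescript
// Hypothetical sketch: grouping an action instruction packet into execution
// stages by animation type, so each stage is one simple animation.
type AnimationType = "walk" | "wave" | "jump"; // example types only

interface ActionInstruction {
  part: string;          // which part of the static character it drives
  animationType: AnimationType;
  startTime: number;     // seconds
  endTime: number;       // seconds
}

interface ExecutionStage {
  animationType: AnimationType;
  instructions: ActionInstruction[];
}

// Group instructions by animation type, then order the stages by their
// earliest start time.
function splitIntoStages(packet: ActionInstruction[]): ExecutionStage[] {
  const byType = new Map<AnimationType, ActionInstruction[]>();
  for (const instruction of packet) {
    const bucket = byType.get(instruction.animationType) ?? [];
    bucket.push(instruction);
    byType.set(instruction.animationType, bucket);
  }
  return Array.from(byType.entries())
    .map(([animationType, instructions]) => ({ animationType, instructions }))
    .sort(
      (a, b) =>
        Math.min(...a.instructions.map(i => i.startTime)) -
        Math.min(...b.instructions.map(i => i.startTime))
    );
}
```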
Further, the step of matching the action instructions to each part of the static character material according to the first dynamic value and the second dynamic value of the static character material at the adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character, includes:
obtaining the key nodes in the motion process of the static character;
and matching the action instructions to each part of the static character material according to two adjacent key nodes and the motion path.
Correspondingly, the action instruction matching module comprises:
the key node acquisition unit is used for acquiring the key nodes in the motion process of the static character;
and the action instruction matching unit is used for matching the action instructions to each part of the static character material according to the two adjacent key nodes and the motion path.
Description of technical effects: the animation frame at any point is calculated from the two adjacent key nodes and the motion trajectory between them.
It should be noted that any point in the motion process is obtained from the motion trajectory together with its start point and end point.
The invention also provides an electronic device comprising a memory and a processor, the memory storing a computer program, the computer program being executable in the processor to implement any of the methods described above.
The present invention also provides a storage medium storing a computer program which is executed in a processor to implement any of the methods described above.
The invention provides an optimization method for producing a dynamic character. The method first parses each part of the static character material and matches received action instructions to the corresponding parts, then parses the action instruction packet to obtain the key nodes and motion path of each part of the static character, and finally derives the motion track of the static character from the dynamic values of two adjacent key nodes and the motion path between them, thereby completing the production of the dynamic character.
Drawings
FIG. 1 is a flowchart of an embodiment of a method for optimizing dynamic character production;
FIG. 2 is a flowchart of a further embodiment refining the method of FIG. 1;
FIG. 3 is a flowchart of a further embodiment refining the method of FIG. 1;
FIG. 4 is a flowchart of a further embodiment refining the method of FIG. 1;
FIG. 5 is an architecture diagram of an optimized device for dynamic character production according to an embodiment;
FIG. 6 is an architecture diagram of the static character material loading module in FIG. 5;
FIG. 7 is an architecture diagram of the action command packet parsing module of FIG. 5;
fig. 8 is an architecture diagram of the action command matching module in fig. 5.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that in the description of the present invention, unless otherwise explicitly specified or limited, the term "storage medium" may be various media capable of storing a computer program, such as a ROM, a RAM, a magnetic disk, an optical disk, or the like. The term "processor" may be a chip or a circuit having a data processing function, such as a CPLD (Complex Programmable Logic Device), an FPGA (Field-Programmable Gate Array), an MCU (Microcontroller Unit), a PLC (Programmable Logic Controller), or a CPU (Central Processing Unit). The term "electronic device" may be any device having data processing and storage functions, and generally includes fixed terminals and mobile terminals: fixed terminals such as desktop computers, and mobile terminals such as mobile phones, tablets (PADs), and mobile robots. Furthermore, the technical features mentioned in the different embodiments of the invention described later can be combined with each other as long as they do not conflict with each other.
In the following, the present invention proposes some preferred embodiments to teach those skilled in the art to implement.
Embodiment one:
referring to fig. 1, the present embodiment provides an optimization method for dynamic character production, including the steps of:
s1, loading the static character materials to record initial values of the static character materials in the initial state;
s2, receiving an action instruction packet, and driving each part of the static figure to execute a specific action according to the action instruction;
s3, analyzing the action instruction packet to obtain key nodes and motion paths of motion of each part of the static character;
and S4, matching action instructions, and matching the action instructions to all parts of the static character materials according to the first dynamic value and the second dynamic value of the static character materials under the adjacent first key node and the second key node and the motion path between the two key nodes to obtain the motion track of the static character.
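The sketch below strings steps S1 to S4 together at a high level. Every helper function is a placeholder declared only to show the data flow; none of the names come from the patent, and the concrete types are simplified stand-ins for the structures sketched earlier in this description.

```typescript
// Hypothetical, type-level sketch of steps S1-S4; helpers are declared as
// placeholders to show the data flow only.
type StaticCharacter = { parts: { name: string }[] };
type PartState = Record<string, number>;
type ActionInstruction = { part: string; animationType: string };
type ExecutionStage = { animationType: string; instructions: ActionInstruction[] };
type KeyNode = { time: number; value: PartState };

declare function loadStaticCharacter(url: string): Promise<StaticCharacter>;
declare function recordInitialValues(c: StaticCharacter): Map<string, PartState>;
declare function receiveActionPacket(): Promise<ActionInstruction[]>;
declare function splitIntoStages(packet: ActionInstruction[]): ExecutionStage[];
declare function matchInstructionsToParts(
  character: StaticCharacter,
  stages: ExecutionStage[]
): Map<string, KeyNode[]>; // key nodes per part, defining its motion track

async function buildDynamicCharacter(materialUrl: string) {
  // S1: load the static character material and record its initial values.
  const character = await loadStaticCharacter(materialUrl);
  const initialValues = recordInitialValues(character);

  // S2: receive the action instruction packet that drives each part.
  const packet = await receiveActionPacket();

  // S3: parse the packet into execution stages with key nodes and motion paths.
  const stages = splitIntoStages(packet);

  // S4: match the instructions to each part; the key nodes plus the motion
  // path between adjacent key nodes give the motion track of the character.
  const keyNodesPerPart = matchInstructionsToParts(character, stages);

  return { character, initialValues, keyNodesPerPart };
}
```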
Embodiment two:
referring to fig. 5, correspondingly, the present invention further provides an optimizing device for dynamic character production, including:
the static character material loading module 1 is used for recording the initial values of the static character material in the initial state;
the action instruction packet receiving module 2 is used for driving each part of the static character to execute specific actions according to the action instructions;
the action instruction packet parsing module 3 is used for obtaining the key nodes and motion paths of the motion of each part of the static character;
and the action instruction matching module 4 is used for matching the action instructions to each part of the static character material according to the first dynamic value and the second dynamic value of the static character material at the adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character.
Description of technical effects: the device first parses each part of the static character material and matches the received action instructions to the corresponding parts, then parses the action instruction packet to obtain the key nodes and motion path of each part of the static character, and finally derives the motion track of the static character from the dynamic values of two adjacent key nodes and the motion path between them, thereby completing the production of the dynamic character.
It should be noted that by recording the dynamic values and the motion trajectories of two adjacent key nodes, the dynamic values and the animation frames at the corresponding points of each part of the static character material between the two key nodes can be obtained.
It should be further explained that, in this way, the animation effect of each point on the motion trajectory does not need to be recorded or actually obtained, and the dynamic value of any point on the motion trajectory and the corresponding animation frame can be calculated only by knowing the key node and the motion trajectory.
Embodiment three:
Referring to fig. 2, further, the step of loading the static character material to record the initial values of the static character material in the initial state comprises:
S11, parsing the specific materials of the static character;
and S12, acquiring the initial values of the static character material.
Embodiment four:
Referring to fig. 6, correspondingly, the static character material loading module includes:
the static character parsing unit 11, which is used for parsing the specific materials of the static character;
and the initial value acquisition unit 12, which is used for acquiring the initial values of the static character material.
Description of technical effects: each material of the static character is loaded and parsed, and the initial state values of each part of the static character material are obtained for comparison with subsequent motion values.
It should be noted that separating the parts of the static character helps the character complete various actions during motion; how each part should move is determined by the action instructions, so that different actions can be realized according to different instructions.
It should be further explained that a static character is decomposed into different materials, and the action values of the different materials at various time points are combined into the action value of the whole static character. The first dynamic value and the second dynamic value of the static character material at two adjacent key nodes, together with the motion path, are used to calculate the dynamic values of each part of the static character material between those two key nodes, so as to achieve the complete action of the dynamic character.
Embodiment five:
As shown in fig. 3, the step of parsing the action instruction packet to obtain key nodes and movement paths of the movement of each part of the static character further includes:
s31, judging the type of the animation;
and S32, dividing the action command into different action execution stages according to the animation type.
Embodiment six:
As shown in fig. 7, the action instruction packet parsing module correspondingly includes:
a judging unit 31 that judges a type of the animation;
and the segmentation unit 32 is used for dividing the action instruction into different stages according to the animation type.
Description of technical effects: because animation types differ, the action instructions can be divided in time into different stages according to the different actions the static character must complete during the animation.
It should be noted that dividing the motion process of the static character into different action execution stages according to the action instructions means that, within a given stage, the static character executes a single action instruction, which reduces that stage to a simple animation.
Embodiment seven:
As shown in fig. 4, the step of matching the action instructions to each part of the static character material according to the first dynamic value and the second dynamic value of the static character material at the adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character, further includes:
S41, obtaining the key nodes in the motion process of the static character;
and S42, matching the action instructions to each part of the static character material according to the two adjacent key nodes and the motion path.
Embodiment eight:
As shown in fig. 8, the action instruction matching module correspondingly includes:
a key node obtaining unit 41, configured to obtain a key node in a motion process of a static character;
and the action instruction matching unit 42 is used for matching the action instructions to the parts of the static character materials according to the two adjacent key nodes and the motion path.
Description of technical effects: the animation frame at any point is calculated from the two adjacent key nodes and the motion trajectory between them.
It should be noted that any point in the motion process is obtained from the motion trajectory together with its start point and end point.
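One way to picture this (purely illustrative; the patent does not specify how the motion path is represented) is a quadratic Bezier path between the two adjacent key nodes, evaluated at any normalized time:

```typescript
// Hypothetical sketch: the motion path between two adjacent key nodes is
// modeled here as a quadratic Bezier curve; only the two key nodes and one
// control point need to be stored to recover any intermediate frame position.
interface Point {
  x: number;
  y: number;
}

// u is the normalized time in [0, 1] between the first and second key node.
function pointOnPath(start: Point, control: Point, end: Point, u: number): Point {
  const v = 1 - u;
  return {
    x: v * v * start.x + 2 * v * u * control.x + u * u * end.x,
    y: v * v * start.y + 2 * v * u * control.y + u * u * end.y,
  };
}

// Example: the position of a part 40% of the way along the path, computed on
// demand without any stored intermediate animation frame.
const framePosition = pointOnPath(
  { x: 0, y: 0 },     // position at the first key node
  { x: 50, y: 120 },  // control point describing the curved motion path
  { x: 100, y: 0 },   // position at the second key node
  0.4
);
console.log(framePosition); // { x: 40, y: 57.6 }
```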
The invention also provides an electronic device comprising a memory and a processor, the memory storing a computer program, the computer program being executable in the processor to implement any of the methods described above. The electronic device may be a mobile terminal or a web server.
The present invention also provides a storage medium storing a computer program which is executed in a processor to implement any of the methods described above.

Claims (10)

1. An optimization method for dynamic character production, comprising the steps of:
loading the static character materials to record initial values of the static character materials in an initial state;
receiving an action instruction packet, and driving each part of the static character to execute a specific action according to the action instructions;
parsing the action instruction packet to obtain the key nodes and motion paths of the motion of each part of the static character;
and matching action instructions, namely matching the action instructions to each part of the static character material according to a first dynamic value and a second dynamic value of the static character material at an adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character.
2. The method of claim 1, wherein the step of loading the static character material to record the initial values of the static character material in the initial state comprises:
parsing the specific materials of the static character;
and acquiring the initial values of the static character material.
3. The method of claim 1, wherein the step of parsing the action instruction packet to obtain the key nodes and motion paths of the motion of each part of the static character comprises:
determining the type of the animation;
and dividing the action instructions into different action execution stages according to the animation type.
4. The method of claim 1, wherein the step of matching the action instructions to each part of the static character material according to the first dynamic value and the second dynamic value of the static character material at the adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character, comprises:
obtaining the key nodes in the motion process of the static character;
and matching the action instructions to each part of the static character material according to two adjacent key nodes and the motion path.
5. An optimization apparatus for dynamic character production, comprising:
the static character material loading module is used for recording the initial values of the static character material in the initial state;
the action instruction packet receiving module is used for driving each part of the static character to execute specific actions according to the action instructions;
the action instruction packet parsing module is used for obtaining the key nodes and motion paths of the motion of each part of the static character;
and the action instruction matching module is used for matching the action instructions to each part of the static character material according to the first dynamic value and the second dynamic value of the static character material at the adjacent first key node and second key node and the motion path between the two key nodes, to obtain the motion track of the static character.
6. The apparatus of claim 5, wherein the static character material loading module comprises:
the static character parsing unit is used for parsing the specific materials of the static character;
and the initial value acquisition unit is used for acquiring the initial values of the static character material.
7. The apparatus of claim 5, wherein the action instruction packet parsing module comprises:
a judging unit that judges a type of the animation;
and the segmentation unit is used for dividing the action instruction into different stages according to the animation type.
8. The apparatus of claim 5, wherein the action instruction matching module comprises:
the key node acquisition unit is used for acquiring the key nodes in the motion process of the static character;
and the action instruction matching unit is used for matching the action instructions to each part of the static character material according to the two adjacent key nodes and the motion path.
9. An electronic device comprising a memory and a processor, the memory storing a computer program, wherein the computer program is executed in the processor to perform the method of any of claims 1-4.
10. A storage medium storing a computer program, characterized in that the computer program is executed in a processor to implement the method of any of claims 1-4.
CN202011520914.2A 2020-12-21 2020-12-21 Optimization method and device for dynamic character production Pending CN112509100A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011520914.2A CN112509100A (en) 2020-12-21 2020-12-21 Optimization method and device for dynamic character production
PCT/CN2021/101686 WO2022134506A1 (en) 2020-12-21 2021-06-23 Optimization method and device for fabricating dynamic characters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011520914.2A CN112509100A (en) 2020-12-21 2020-12-21 Optimization method and device for dynamic character production

Publications (1)

Publication Number Publication Date
CN112509100A true CN112509100A (en) 2021-03-16

Family

ID=74923132

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011520914.2A Pending CN112509100A (en) 2020-12-21 2020-12-21 Optimization method and device for dynamic character production

Country Status (2)

Country Link
CN (1) CN112509100A (en)
WO (1) WO2022134506A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022134506A1 (en) * 2020-12-21 2022-06-30 深圳市前海手绘科技文化有限公司 Optimization method and device for fabricating dynamic characters

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101833459A (en) * 2010-04-14 2010-09-15 四川真视信息技术有限公司 Dynamic 2D bone personage realizing system based on webpage
CN107180446A (en) * 2016-03-10 2017-09-19 腾讯科技(深圳)有限公司 The expression animation generation method and device of character face's model
CN108335346A (en) * 2018-03-01 2018-07-27 黄淮学院 A kind of interactive animation generation system
CN108619724A (en) * 2018-04-17 2018-10-09 苏州万代信息科技有限公司 Game charater editing system and edit methods
CN111951358A (en) * 2020-08-11 2020-11-17 深圳市前海手绘科技文化有限公司 Application method of combined material in hand-drawn animation

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0307953D0 (en) * 2003-04-05 2003-05-14 Wicks Nigel E Amusement or educational device
CN104123742A (en) * 2014-07-21 2014-10-29 徐才 Method and player for translating static cartoon picture into two dimensional animation
CN106530372A (en) * 2016-09-18 2017-03-22 中山大学 Method for generating frame animation
CN109068069A (en) * 2018-07-03 2018-12-21 百度在线网络技术(北京)有限公司 Video generation method, device, equipment and storage medium
CN111862276B (en) * 2020-07-02 2023-12-05 南京师范大学 Automatic skeletal animation production method based on formalized action description text
CN112037311B (en) * 2020-09-08 2024-02-20 腾讯科技(深圳)有限公司 Animation generation method, animation playing method and related devices
CN112509100A (en) * 2020-12-21 2021-03-16 深圳市前海手绘科技文化有限公司 Optimization method and device for dynamic character production

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101833459A (en) * 2010-04-14 2010-09-15 四川真视信息技术有限公司 Dynamic 2D bone personage realizing system based on webpage
CN107180446A (en) * 2016-03-10 2017-09-19 腾讯科技(深圳)有限公司 The expression animation generation method and device of character face's model
CN108335346A (en) * 2018-03-01 2018-07-27 黄淮学院 A kind of interactive animation generation system
CN108619724A (en) * 2018-04-17 2018-10-09 苏州万代信息科技有限公司 Game charater editing system and edit methods
CN111951358A (en) * 2020-08-11 2020-11-17 深圳市前海手绘科技文化有限公司 Application method of combined material in hand-drawn animation

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022134506A1 (en) * 2020-12-21 2022-06-30 深圳市前海手绘科技文化有限公司 Optimization method and device for fabricating dynamic characters

Also Published As

Publication number Publication date
WO2022134506A1 (en) 2022-06-30

Similar Documents

Publication Publication Date Title
TW202016693A (en) Human-computer interaction processing system, method, storage medium and electronic device
KR20190116199A (en) Video data processing method, device and readable storage medium
CN111489737B (en) Voice command recognition method and device, storage medium and computer equipment
CN112367403B (en) Online preservation and optimization method and device for animation draft
CN109086126B (en) Task scheduling processing method and device, server, client and electronic equipment
CN113069769B (en) Cloud game interface display method and device, electronic equipment and storage medium
WO2021232958A1 (en) Method and apparatus for executing operation, electronic device, and storage medium
CN111489738B (en) Feature extraction method and voice command identification method based on multi-head attention mechanism
CN112509100A (en) Optimization method and device for dynamic character production
CN102426567A (en) Graphical editing and debugging system of automatic answer system
CN116151363A (en) Distributed reinforcement learning system
CN111488813A (en) Video emotion marking method and device, electronic equipment and storage medium
CN110222755A (en) Deep learning scene recognition method based on Fusion Features
CN114546804A (en) Information push effect evaluation method and device, electronic equipment and storage medium
CN115830633A (en) Pedestrian re-identification method and system based on multitask learning residual error neural network
CN111935548A (en) Interactive hand-drawn video production method
CN109542729A (en) Device performance parameters data analysing method and device
CN112509101A (en) Method for realizing motion transition of multiple dynamic character materials in animation video
CN112652039A (en) Animation segmentation data acquisition method, segmentation method, device, equipment and medium
CN104915663A (en) Method, system, and mobile terminal for improving face identification and display
CN110415015A (en) Product degree of recognition analysis method, device, terminal and computer readable storage medium
CN107147947A (en) Key frame recognition methods and device
CN117351354B (en) Lightweight remote sensing image target detection method based on improved MobileViT
CN115858698B (en) Agent profile analysis method, system and readable storage medium
CN117033150A (en) Recording method and device for user operation, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210316