CN106780681B - Role action generation method and device

Role action generation method and device

Info

Publication number
CN106780681B
Authority
CN
China
Prior art keywords
action
data
characteristic
skeleton
key frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611091834.3A
Other languages
Chinese (zh)
Other versions
CN106780681A (en)
Inventor
王彦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Pixel Software Technology Co Ltd
Original Assignee
Beijing Pixel Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Pixel Software Technology Co Ltd filed Critical Beijing Pixel Software Technology Co Ltd
Priority to CN201611091834.3A priority Critical patent/CN106780681B/en
Publication of CN106780681A publication Critical patent/CN106780681A/en
Application granted granted Critical
Publication of CN106780681B publication Critical patent/CN106780681B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Abstract

The invention discloses a role action generation method and device. The role action generation method comprises the following steps: acquiring key frame posture information of a character action, wherein the key frame posture information is used for describing the skeleton characteristics of the character action; and generating a skeleton animation file from the key frame posture information according to a preset algorithm, wherein the skeleton animation file is used for executing a preset action of a game character. By generating a new skeleton animation file from the acquired key frame posture information of the character action, the method and device solve the problem that character actions in a game are fixed and cannot meet players' demand for personalized character images, effectively avoid monotonous and repetitive characters, and enable distinctive character images to be created according to the user's preferences so that the user's favorite character image can be designed, thereby increasing the interest of the game and improving the user experience.

Description

Role action generation method and device
Technical Field
The embodiment of the invention relates to the technical field of game development, in particular to a role action generation method and device.
Background
In the development of large-scale 3D online games, abundant expressions and actions are designed for game characters in order to increase the interest and social appeal of the game. For example, in a shooting game a character usually has fixed preset actions such as running, jumping and shooting, while in a role-playing game the character may additionally have preset combat skill actions. Traditionally, character actions are created by professional animators using professional editing tools. In recent years, many games have also let the player adjust details of a character's appearance; for example, Blade & Soul ("Jianling") offers a face-pinching system for facial shapes, and Overwatch lets the player choose character expression actions from a selection list. As players demand ever more control over the virtual world, designing a character action editing system for game players has become a technical problem to be solved.
For the setting of character actions, the basic technical solution is to store a number of complete animation files in the game package. When the game is running, the player can either set an expression action to be played or trigger a preset skill action through skill key operations. This solution is mature and is used in most 3D online games, for example World of Warcraft.
In this solution, all character actions are fixed and unchangeable: for the same character controlled by different players, the action performance triggered by a given operation is identical, which cannot meet the demand for personalized character actions in the virtual game world.
Disclosure of Invention
The invention provides a character action generation method and a character action generation device, which aim to solve the problem that the action of a character in a game is fixed and the requirement of a player on an individualized character image cannot be met.
In a first aspect, an embodiment of the present invention provides a role action generation method, including:
acquiring key frame attitude information of a role action, wherein the key frame attitude information is used for describing the skeleton characteristics of the role action;
and generating a skeleton animation file according to the key frame posture information by a preset algorithm, wherein the skeleton animation file is used for executing the preset action of the game role.
In a second aspect, an embodiment of the present invention further provides a role action generating apparatus, including:
The key frame attitude information acquisition module is used for acquiring key frame attitude information of the role action, and the key frame attitude information is used for describing the skeleton characteristics of the role action;
and the skeleton animation file generation module is used for generating a skeleton animation file according to the key frame posture information by a preset algorithm, and the skeleton animation file is used for executing the preset action of the game role.
According to the embodiments of the invention, a new skeleton animation file is generated from the acquired key frame posture information of the character action. This solves the problem that character actions in the game are fixed and cannot meet players' demand for personalized character images, effectively avoids monotonous and repetitive characters, and makes it possible to model distinctive character images according to the user's preferences and to generate the character image the user likes, thereby increasing the interest of the game and improving the user experience.
Drawings
Fig. 1 is a flowchart illustrating a role action generation method according to a first embodiment of the present invention.
FIG. 2 is a schematic view of a skeleton animation file generation operation flow in a character action generation method according to a second embodiment of the present invention.
Fig. 3 is a schematic structural diagram of the skeleton characteristic structure according to the second embodiment of the present invention.
Fig. 4 is a schematic diagram of distribution of two-dimensional key points in the action sketch stroke in the second embodiment of the present invention.
Fig. 5 is a schematic structural diagram of a character action generating device in the third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flow chart of a character action generation method according to the first embodiment of the present invention. This embodiment is applicable to situations where game character actions are generated automatically. The method may be executed by a character action generation apparatus, which may be implemented by software and/or hardware. As shown in fig. 1, the method includes:
s110, obtaining key frame posture information of the role action, wherein the key frame posture information is used for describing the skeleton characteristics of the role action.
The character action may be an action performed by a game character in the game, and in this embodiment, the character action is a personalized character action set by a game player according to the requirement of the game player. The key frame posture information is information which is input to the terminal by a game player and corresponds to the personalized role action set by the game player. The key frame posture information may be image information or an adjustment instruction for adjusting the existing character action in the game.
When the key frame posture information is image information, the terminal extracts corresponding skeleton characteristics according to the action information in the image and generates corresponding role actions according to the skeleton characteristics. When the key frame posture information is an adjusting instruction for adjusting the existing role action in the game, the terminal can adjust the skeleton characteristic of the role action according to the adjusting instruction and obtain a new skeleton characteristic.
And S120, generating a skeleton animation file according to the key frame posture information by a preset algorithm, wherein the skeleton animation file is used for executing preset actions of the game role.
And the skeleton animation file is obtained by processing the acquired key frame attitude information by the terminal according to a preset algorithm.
Further, one skeleton animation file may correspond to at least two pieces of key frame posture information.
When the character action set by the player is simple, the skeleton animation file can be obtained by processing two pieces of key frame posture information, namely start posture information and end posture information, where the start posture information represents the posture of the character at the start of the action and the end posture information represents the posture of the character at the end of the action. When the character action set by the player is complex, the skeleton animation file is obtained by processing one piece of start posture information, one piece of end posture information and at least one piece of intermediate posture information. The higher the complexity of the character action, the more pieces of intermediate posture information are required.
In the technical solution of this embodiment, a new skeleton animation file is generated from the acquired key frame posture information of the character action. This solves the problem that character actions in the game are fixed and cannot meet players' demand for personalized character images, effectively avoids monotonous and repetitive characters, and achieves the purpose of creating distinctive character images according to the user's preferences and generating the character image the user likes, thereby increasing the interest of the game and improving the user experience.
Example two
FIG. 2 is a schematic view of a skeleton animation file generation operation flow in a character action generation method according to a second embodiment of the present invention. The embodiment provides a character action generation method on the basis of the first embodiment, and further describes a specific step of generating a skeleton animation file by using the key frame posture information. As shown in fig. 2, specifically:
s210, extracting key frame actions from the preset skeleton animation file, converting the key frame actions into action characteristic data and storing the action characteristic data into an action characteristic library, wherein the action characteristic library comprises the action characteristic data, an action file name and an action frame sequence number.
The preset skeleton animation file is an original skeleton animation file of the game. A key frame action is the information extracted from the preset skeleton animation file at the moment the game character performs a well-defined action, such as running, jumping, shooting, kicking, waving, rolling or bending.
Converting a key frame action into action characteristic data is a feature-simplification process: key bone nodes are extracted from the key frame action, their positions and states are determined, and information matching the skeleton characteristic structure is constructed.
The skeleton characteristic structure is a two-dimensional feature skeleton that represents the positions and states of the main joints of the game character. The more bone nodes the skeleton characteristic structure contains, the more detailed the expression of the game character's motion posture and the more vivid the generated character action; at the same time, the generation process from key frame posture information to skeleton animation file becomes more complex, takes longer and places higher demands on the terminal processor. When the skeleton characteristic structure contains few bone nodes, the expressed motion posture and the generated character action are relatively simple. The number of bone nodes in the skeleton characteristic structure should therefore be chosen by weighing game performance against the accuracy of the generated character action.
In this embodiment, the skeleton characteristic structure preferably consists of 17 bone nodes. Fig. 3 is a schematic structural diagram of the skeleton characteristic structure according to the second embodiment of the present invention. As shown in fig. 3, the skeleton characteristic structure (KeyFrame) includes 16 limb nodes and 1 head node and is represented in vector form, i.e. by 15 limb unit vectors and 1 head node vector, which can be expressed as KeyFrame = {(e0, e1, e2), ..., (e12, e13, e14), head}, where e0 to e14 denote the 15 limb unit vectors and head denotes the head node vector. It should be noted that the influence of the head node vector on character actions is small and can be ignored in practical applications.
Preferably, the skeletal features comprise at least 15 limb unit vectors.
The motion feature data may be a set of skeletal feature vectors corresponding to key frame motions.
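As a rough illustration only, the following sketch shows how such a skeleton characteristic structure might be held in memory. It is written in Python with hypothetical names (KeyFramePose, limb_vectors, normalize); the patent does not prescribe any particular implementation or vector dimensionality.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple
import math

Vec = Tuple[float, ...]  # a 2D or 3D vector, depending on the source data

def normalize(v: Vec) -> Vec:
    """Scale a vector to unit length (zero vectors are returned unchanged)."""
    length = math.sqrt(sum(c * c for c in v))
    return v if length == 0 else tuple(c / length for c in v)

@dataclass
class KeyFramePose:
    """Skeleton characteristic structure: 15 limb unit vectors plus an optional head vector.

    Mirrors KeyFrame = {(e0, e1, e2), ..., (e12, e13, e14), head} from the description;
    the head vector has little influence on the action and may be omitted.
    """
    limb_vectors: List[Vec]     # e0 .. e14, one unit vector per limb segment
    head: Optional[Vec] = None  # head node vector, often ignored

    def __post_init__(self) -> None:
        assert len(self.limb_vectors) == 15, "expected 15 limb unit vectors"
        self.limb_vectors = [normalize(v) for v in self.limb_vectors]
```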
Further, the extracting key frame actions from the preset skeleton animation file, converting the key frame actions into action feature data and storing the action feature data into an action feature library includes:
and extracting at least one key frame action from a preset skeleton animation file based on a limit posture algorithm.
The extreme attitude algorithm is used for judging whether the current role action is in an extreme attitude, specifically, when the movement or the rotation acceleration of a key bone node of the game role body is close to a 0 value, the current role action is judged as the extreme attitude, and a key frame action is extracted from the current role action.
There are multiple extreme poses in a preset skeletal animation file, so that multiple keyframe actions can be extracted from one skeletal animation file.
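A minimal sketch of how this extreme-posture test could be applied to a sampled animation, assuming the animation is available as per-frame positions of the key bone nodes; the function name, threshold and finite-difference approximation are illustrative choices, not taken from the patent.

```python
from typing import Dict, List, Sequence, Tuple

Vec3 = Tuple[float, float, float]

def extract_extreme_pose_frames(
    frames: Sequence[Dict[str, Vec3]],  # per-frame positions of the key bone nodes
    dt: float,                          # time step between consecutive frames
    eps: float = 1e-2,                  # threshold for "acceleration close to zero"
) -> List[int]:
    """Return indices of frames whose key bone nodes all have near-zero acceleration.

    Acceleration is approximated by the second finite difference of node positions,
    so the first and last frames are skipped.
    """
    extreme: List[int] = []
    for i in range(1, len(frames) - 1):
        max_acc = 0.0
        for node, p in frames[i].items():
            prev, nxt = frames[i - 1][node], frames[i + 1][node]
            acc = [(nxt[k] - 2 * p[k] + prev[k]) / (dt * dt) for k in range(3)]
            max_acc = max(max_acc, sum(a * a for a in acc) ** 0.5)
        if max_acc < eps:
            extreme.append(i)  # candidate key frame action at an extreme posture
    return extreme
```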
And extracting a plurality of key motion nodes for each key frame motion, and normalizing the key motion nodes according to the skeleton characteristic structure to obtain motion characteristic data.
The key motion nodes are the major bone nodes involved in the key frame action, such as the neck, elbow, waist and knee nodes. The key motion nodes are normalized according to the skeleton characteristic structure to obtain the action characteristic data; normalization converts the key motion nodes into data that can be computed with and compared directly.
And establishing a corresponding relation among the action characteristic data, the key frame action file name and the action frame sequence number corresponding to the key frame action, and storing the corresponding relation in an action characteristic library.
Specifically, the key frame action is converted into action characteristic data; the frame index of the key frame action within the preset skeleton animation file is recorded to obtain the action frame sequence number; the action key frame data is named to obtain the action file name; the correspondence among the action characteristic data, the action frame sequence number and the action file name is then established and stored in the action characteristic library.
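One plausible in-memory layout for the action characteristic library and its correspondence records is sketched below; the class and field names (ActionFeatureEntry, ActionFeatureLibrary) are illustrative and assume the KeyFramePose structure sketched earlier.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActionFeatureEntry:
    """One record of the action characteristic library."""
    feature_data: "KeyFramePose"  # normalized skeleton feature vectors (see earlier sketch)
    action_file_name: str         # name assigned to the action key frame data
    action_frame_index: int       # frame sequence number inside the preset animation file

@dataclass
class ActionFeatureLibrary:
    entries: List[ActionFeatureEntry] = field(default_factory=list)

    def add(self, feature_data: "KeyFramePose", file_name: str, frame_index: int) -> None:
        """Store the correspondence among feature data, file name and frame number."""
        self.entries.append(ActionFeatureEntry(feature_data, file_name, frame_index))
```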
S220, obtaining key frame posture information of the role action, wherein the key frame posture information is used for describing the skeleton characteristics of the role action.
S230, obtaining a skeleton characteristic structure of the key frame posture information, wherein the key frame posture information comprises action sketch strokes.
An action sketch stroke is a simple outline of a character action drawn by the game player according to his or her own requirements; it can also be a simple outline of part of a character action.
The action sketch strokes can be input by the user through a drawing tablet connected to the terminal, or through a virtual drawing board or the touch screen of the terminal.
S240, a, extracting two-dimensional key points from the action sketch strokes, and obtaining the sketch stroke characteristic data according to the two-dimensional key points.
The two-dimensional key points are used to determine the skeleton characteristics. At least two key points, a start key point and an end key point, can be extracted from each action sketch stroke.
In the process of acquiring the action sketch strokes, the user is prompted to perform five input operations, yielding five sketch strokes which represent, in order, the torso, the left arm, the right arm, the left leg and the right leg of the game character.
Preferably, in this embodiment four two-dimensional key points are extracted from each action sketch stroke. Fig. 4 is a schematic diagram of the distribution of the two-dimensional key points on an action sketch stroke in the second embodiment of the present invention. As shown in fig. 4, the four two-dimensional key points are P0, Pmaxdis, Pmaxdis' and PN: P0 is the starting key point of the action sketch stroke, PN is the ending key point, Pmaxdis is the point on the stroke farthest from the line segment P0PN, and Pmaxdis' is the point on the stroke section P0Pmaxdis farthest from the line segment P0Pmaxdis. Three feature vectors can be obtained from these four two-dimensional key points.
In total, 20 two-dimensional key points and 15 feature vectors, namely the 15 limb vectors, are extracted from the five action sketch strokes, and the sketch stroke characteristic data is obtained from these 15 limb vectors.
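The sketch below illustrates one way this key-point extraction could be implemented, assuming each stroke arrives as an ordered list of 2D points and that the three vectors of a stroke connect the key points in the order P0, Pmaxdis', Pmaxdis, PN (that ordering, and the helper names, are assumptions for illustration).

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def point_to_segment_distance(p: Point, a: Point, b: Point) -> float:
    """Distance from point p to the line segment a-b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def stroke_key_points(stroke: List[Point]) -> List[Point]:
    """Return the four key points [P0, Pmaxdis', Pmaxdis, PN] of one action sketch stroke."""
    p0, pn = stroke[0], stroke[-1]
    # Pmaxdis: stroke point farthest from segment P0-PN
    i_max = max(range(len(stroke)), key=lambda i: point_to_segment_distance(stroke[i], p0, pn))
    p_maxdis = stroke[i_max]
    # Pmaxdis': point of the sub-stroke P0..Pmaxdis farthest from segment P0-Pmaxdis
    sub = stroke[: i_max + 1]
    p_maxdis2 = max(sub, key=lambda q: point_to_segment_distance(q, p0, p_maxdis))
    return [p0, p_maxdis2, p_maxdis, pn]

def stroke_feature_vectors(stroke: List[Point]) -> List[Point]:
    """Three unit vectors between consecutive key points; five strokes give the 15 limb vectors."""
    pts = stroke_key_points(stroke)
    vectors = []
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        length = math.hypot(x1 - x0, y1 - y0) or 1.0
        vectors.append(((x1 - x0) / length, (y1 - y0) / length))
    return vectors
```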
S250, b, calculating the characteristic distance between the sketch stroke characteristic data and the action characteristic data in the action characteristic library, selecting the action characteristic data whose characteristic distance is closest to 0 as the optimal characteristic data, and obtaining three-dimensional skeleton posture frame data according to the action file name and action frame number corresponding to the optimal characteristic data in the action characteristic library.
The three-dimensional skeleton posture frame data is the frame data whose action characteristic data in the action characteristic library best matches the sketch stroke characteristic data.
Further, the calculation formula of the characteristic distance includes:
dis = Δ(P_sketch, P_keyframe) = Σ(i = 1..n) w_i · ‖e_sketch_i - e_keyframe_i‖
where dis denotes the characteristic distance; P_sketch denotes the sketch stroke characteristic data; P_keyframe denotes the action characteristic data; n denotes the number of characteristic nodes in the skeleton characteristic structure; Δ(P_sketch, P_keyframe) is the characteristic distance between the sketch stroke characteristic data and the action characteristic data; e_sketch_i denotes the i-th characteristic node data in the sketch stroke characteristic data; e_keyframe_i denotes the i-th characteristic node data in the action characteristic data; and w_i denotes the weight of the i-th characteristic node data in the sketch stroke characteristic data.
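A minimal matching sketch under the weighted-sum reading of the formula above (a different per-node norm or weighting scheme could equally be intended); ActionFeatureLibrary and ActionFeatureEntry refer to the illustrative structures sketched earlier, and both vector sets are assumed to share the same dimensionality.

```python
import math
from typing import Sequence, Tuple

Vec = Tuple[float, ...]

def feature_distance(sketch: Sequence[Vec], keyframe: Sequence[Vec],
                     weights: Sequence[float]) -> float:
    """dis = sum over i of w_i * ||e_sketch_i - e_keyframe_i|| for the n characteristic nodes."""
    return sum(
        w * math.sqrt(sum((a - b) ** 2 for a, b in zip(es, ek)))
        for es, ek, w in zip(sketch, keyframe, weights)
    )

def best_match(sketch: Sequence[Vec], library: "ActionFeatureLibrary",
               weights: Sequence[float]) -> "ActionFeatureEntry":
    """Select the library entry whose characteristic distance to the sketch is closest to zero."""
    return min(
        library.entries,
        key=lambda e: feature_distance(sketch, e.feature_data.limb_vectors, weights),
    )
```

The action file name and action frame sequence number of the selected entry are then used to look up the corresponding three-dimensional skeleton posture frame data in the preset skeleton animation file.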
And S260, repeating the step a and the step b to obtain a three-dimensional bone posture frame data set.
The three-dimensional skeleton posture frame data set consists of at least one item of three-dimensional skeleton posture frame data.
Steps a and b are repeated according to the complexity of the personalized character action required by the game player, yielding multiple sets of sketch stroke characteristic data; the more complex the character action, the more times steps a and b are repeated.
S270, performing interpolation on the three-dimensional skeleton posture frame data according to the preset execution time and execution sequence of the three-dimensional skeleton posture frame data in the skeleton animation file, so as to obtain the skeleton animation file.
Illustratively, let the three-dimensional skeleton posture frame data set be {KeyFrame_0, KeyFrame_1, ..., KeyFrame_N} and the corresponding set of motion times be {KeyTime_0, KeyTime_1, ..., KeyTime_N}. Each bone posture (Bone) of the key frames (KeyFrame) is interpolated at every moment to obtain the motion data {Frame_0, Frame_1, ..., Frame_N} for each moment, from which the skeleton animation file is obtained.
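A simplified interpolation pass over the matched key postures is sketched below. It assumes the limb unit vectors are linearly interpolated and re-normalized between consecutive key times; a production skeletal-animation pipeline would more likely interpolate joint rotations (e.g. quaternion slerp), so this is only an illustration of the sampling loop.

```python
from typing import List, Sequence, Tuple

Vec = Tuple[float, ...]

def lerp_unit(a: Vec, b: Vec, t: float) -> Vec:
    """Linearly interpolate two unit vectors and renormalize the result."""
    v = [(1 - t) * x + t * y for x, y in zip(a, b)]
    length = sum(c * c for c in v) ** 0.5 or 1.0
    return tuple(c / length for c in v)

def build_frames(key_poses: Sequence[List[Vec]],  # KeyFrame_0 .. KeyFrame_N (limb vectors)
                 key_times: Sequence[float],      # KeyTime_0 .. KeyTime_N
                 fps: float = 30.0) -> List[List[Vec]]:
    """Sample every bone posture at every moment to obtain the per-frame motion data."""
    if len(key_times) < 2:
        return [list(key_poses[0])]
    frames: List[List[Vec]] = []
    t, end, seg = key_times[0], key_times[-1], 0
    while t <= end:
        while seg + 1 < len(key_times) - 1 and t > key_times[seg + 1]:
            seg += 1  # advance to the key-time interval containing t
        t0, t1 = key_times[seg], key_times[seg + 1]
        alpha = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
        frames.append([lerp_unit(a, b, alpha)
                       for a, b in zip(key_poses[seg], key_poses[seg + 1])])
        t += 1.0 / fps
    return frames
```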
In the technical solution of this embodiment, the specific structures chosen for the action sketch strokes and the skeleton characteristic structure make the game player's input operations and the matching of the sketch stroke characteristic data against the action characteristic library simpler, faster and more efficient. By performing interpolation on the three-dimensional skeleton posture frame data, character actions with richer and more lifelike expression are obtained, the character action that best suits the game player's requirements is generated, and distinctive character images are created, thereby increasing the interest of the game and improving the user experience.
EXAMPLE III
Fig. 5 is a schematic structural diagram of a character action generating device according to the third embodiment of the present invention. As shown in fig. 5, the character action generating device includes: a key frame posture information acquisition module 310 and a skeleton animation file generation module 320.
The key frame posture information obtaining module 310 is configured to obtain key frame posture information of a character action, where the key frame posture information is used to describe a skeleton feature of the character action.
And a skeleton animation file generating module 320, configured to generate a skeleton animation file from the keyframe posture information according to a preset algorithm, where the skeleton animation file is used to execute a preset action of the game character.
Further, the character action generating device further includes:
the action characteristic library acquisition module is used for extracting the key frame action from a preset skeleton animation file before acquiring the key frame posture information of the role action, converting the key frame action into action characteristic data and storing the action characteristic data into an action characteristic library, wherein the action characteristic library comprises action characteristic data, an action file name and an action frame sequence number.
Further, the skeletal animation file generation module comprises:
the skeleton characteristic structure acquisition unit is used for acquiring a skeleton characteristic structure of the key frame posture information, and the key frame posture information comprises action simplified strokes;
the system comprises a simple stroke feature data acquisition unit, a simple stroke feature data acquisition unit and a simple stroke feature data acquisition unit, wherein the simple stroke feature data acquisition unit is used for extracting two-dimensional key points from an action simple stroke and obtaining simple stroke feature data according to the two-dimensional key points;
the three-dimensional skeleton posture frame data acquisition unit is used for calculating the characteristic distance between the sketch feature data and the action feature data in the action feature library, selecting the action feature data with the characteristic distance value closest to a 0 value as optimal feature data, and acquiring three-dimensional skeleton posture frame data according to the action file name and the action frame number corresponding to the optimal feature data in the action feature library;
the gesture frame data set acquisition unit is used for starting or closing the stroke characteristic data acquisition unit and the three-dimensional skeleton gesture frame data acquisition unit according to the acquisition times of the action stroke to obtain a three-dimensional skeleton gesture frame data set;
and the interpolation operation execution unit is used for carrying out interpolation operation on the three-dimensional skeleton attitude frame data according to the execution time and the execution sequence of the preset three-dimensional skeleton attitude frame data in the skeleton animation file, and obtaining the skeleton animation file.
Further, the action characteristic library obtaining module includes:
the key frame action extraction unit is used for extracting at least one key frame action from the original skeleton animation file based on a limit posture algorithm;
the normalization processing unit is used for extracting a plurality of key motion nodes for each key frame action and normalizing the key motion nodes according to the skeleton characteristic structure to obtain action characteristic data;
and the corresponding relation establishing unit is used for establishing a corresponding relation among the action characteristic data, the key frame action file name and the action frame sequence number corresponding to the key frame action, and storing the corresponding relation into an action characteristic library.
Further, the calculation formula of the characteristic distance includes:
dis = Δ(P_sketch, P_keyframe) = Σ(i = 1..n) w_i · ‖e_sketch_i - e_keyframe_i‖
where dis denotes the characteristic distance; P_sketch denotes the sketch stroke characteristic data; P_keyframe denotes the action characteristic data; n denotes the number of characteristic nodes in the skeleton characteristic structure; Δ(P_sketch, P_keyframe) is the characteristic distance between the sketch stroke characteristic data and the action characteristic data; e_sketch_i denotes the i-th characteristic node data in the sketch stroke characteristic data; e_keyframe_i denotes the i-th characteristic node data in the action characteristic data; and w_i denotes the weight of the i-th characteristic node data in the sketch stroke characteristic data.
Further, the skeletal features include at least 15 limb unit vectors.
According to the technical solution of this embodiment, a new skeleton animation file is generated from the acquired key frame posture information of the character action. This solves the problem that character actions in the game are fixed and cannot meet players' demand for personalized character images, effectively avoids monotonous and repetitive characters, and enables distinctive character images to be created according to the user's preferences and the character image the user likes to be generated, thereby increasing the interest of the game and improving the user experience.
The role action generating device provided by the embodiment of the invention can be used for executing the role action generating method provided by the embodiment of the invention, and has corresponding functions and beneficial effects. For technical details that are not described in detail in this embodiment, reference may be made to a role action generation method provided in any embodiment of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (8)

1. A character action generation method is characterized by comprising the following steps:
acquiring key frame attitude information of a role action, wherein the key frame attitude information is used for describing the skeleton characteristics of the role action;
generating a skeleton animation file from the key frame posture information according to a preset algorithm, wherein the skeleton animation file is used for executing preset actions of game roles;
before obtaining the key frame attitude information of the role action, the method further comprises the following steps:
extracting key frame actions from a preset skeleton animation file, converting the key frame actions into action characteristic data and storing the action characteristic data into an action characteristic library, wherein the action characteristic library comprises the action characteristic data, an action file name and an action frame sequence number;
generating a skeleton animation file from the key frame posture information according to a preset algorithm, wherein the method comprises the following steps:
acquiring a skeleton characteristic structure of the key frame posture information, wherein the key frame posture information comprises action sketch strokes;
a. extracting two-dimensional key points from the action simplified strokes, and obtaining the characteristic data of the simplified strokes according to the two-dimensional key points;
b. calculating the characteristic distance between the character data of the sketching and the action character data in the action character library, selecting the action character data with the characteristic distance value closest to 0 value as the optimal character data, and obtaining three-dimensional skeleton attitude frame data according to the action file name and the action frame number corresponding to the optimal character data in the action character library;
repeating the step a and the step b to obtain a three-dimensional skeleton posture frame data set;
and carrying out interpolation operation on the three-dimensional skeleton attitude frame data according to the execution time and the execution sequence of the preset three-dimensional skeleton attitude frame data in the skeleton animation file, and obtaining the skeleton animation file.
2. The character action generating method according to claim 1, wherein the extracting key frame actions from the preset skeleton animation file, converting the key frame actions into action feature data and storing the action feature data in an action feature library comprises:
extracting at least one key frame action from a preset skeleton animation file based on a limit posture algorithm, wherein the limit posture algorithm is used for judging whether the current role action is in a limit posture or not;
extracting a plurality of key motion nodes for each key frame motion, and normalizing the key motion nodes according to the skeleton feature structure to obtain motion feature data;
and establishing a corresponding relation among the action characteristic data, the key frame action file name and the action frame sequence number corresponding to the key frame action, and storing the corresponding relation in an action characteristic library.
3. The character action generating method according to claim 1,
the calculation formula of the characteristic distance comprises the following steps:
dis = Δ(P_sketch, P_keyframe) = Σ(i = 1..n) w_i · ‖e_sketch_i - e_keyframe_i‖
where dis denotes the characteristic distance; P_sketch denotes the sketch stroke characteristic data; P_keyframe denotes the action characteristic data; n denotes the number of characteristic nodes in the skeleton characteristic structure; Δ(P_sketch, P_keyframe) is the characteristic distance between the sketch stroke characteristic data and the action characteristic data; e_sketch_i denotes the i-th characteristic node data in the sketch stroke characteristic data; e_keyframe_i denotes the i-th characteristic node data in the action characteristic data; and w_i denotes the weight of the i-th characteristic node data in the sketch stroke characteristic data.
4. The character action generating method according to claim 1,
the skeletal features include at least 15 limb unit vectors.
5. A character motion generation apparatus, comprising:
the key frame attitude information acquisition module is used for acquiring key frame attitude information of the role action, and the key frame attitude information is used for describing the skeleton characteristics of the role action;
the skeleton animation file generation module is used for generating a skeleton animation file according to the key frame posture information by a preset algorithm, and the skeleton animation file is used for executing the preset action of the game role;
the action characteristic library acquisition module is used for extracting key frame actions from a preset skeleton animation file before acquiring key frame posture information of the role actions, converting the key frame actions into action characteristic data and storing the action characteristic data into an action characteristic library, wherein the action characteristic library comprises action characteristic data, an action file name and an action frame sequence number;
the skeleton animation file generation module comprises:
the skeleton characteristic structure acquisition unit is used for acquiring a skeleton characteristic structure of the key frame posture information, and the key frame posture information comprises action simplified strokes;
the system comprises a simple stroke feature data acquisition unit, a simple stroke feature data acquisition unit and a simple stroke feature data acquisition unit, wherein the simple stroke feature data acquisition unit is used for extracting two-dimensional key points from an action simple stroke and obtaining simple stroke feature data according to the two-dimensional key points;
the three-dimensional skeleton posture frame data acquisition unit is used for calculating the characteristic distance between the sketch feature data and the action feature data in the action feature library, selecting the action feature data with the characteristic distance value closest to a 0 value as optimal feature data, and acquiring three-dimensional skeleton posture frame data according to the action file name and the action frame number corresponding to the optimal feature data in the action feature library;
the gesture frame data set acquisition unit is used for starting or closing the stroke characteristic data acquisition unit and the three-dimensional skeleton gesture frame data acquisition unit according to the acquisition times of the action stroke to obtain a three-dimensional skeleton gesture frame data set;
and the interpolation operation execution unit is used for carrying out interpolation operation on the three-dimensional skeleton attitude frame data according to the execution time and the execution sequence of the preset three-dimensional skeleton attitude frame data in the skeleton animation file, and obtaining the skeleton animation file.
6. The character motion generation apparatus according to claim 5, wherein the motion feature library acquisition module includes:
the system comprises a key frame action extraction unit, a skeleton animation file extraction unit and a skeleton animation file extraction unit, wherein the key frame action extraction unit is used for extracting at least one key frame action from a preset skeleton animation file based on a limit posture algorithm, and the limit posture algorithm is used for judging whether the current role action is in a limit posture or not;
the normalization processing unit is used for extracting a plurality of key motion nodes for each key frame action and normalizing the key motion nodes according to the skeleton characteristic structure to obtain action characteristic data;
and the corresponding relation establishing unit is used for establishing a corresponding relation among the action characteristic data, the key frame action file name and the action frame sequence number corresponding to the key frame action and storing the corresponding relation into an action characteristic library.
7. The character action generating apparatus according to claim 5,
the calculation formula of the characteristic distance comprises the following steps:
dis = Δ(P_sketch, P_keyframe) = Σ(i = 1..n) w_i · ‖e_sketch_i - e_keyframe_i‖
where dis denotes the characteristic distance; P_sketch denotes the sketch stroke characteristic data; P_keyframe denotes the action characteristic data; n denotes the number of characteristic nodes in the skeleton characteristic structure; Δ(P_sketch, P_keyframe) is the characteristic distance between the sketch stroke characteristic data and the action characteristic data; e_sketch_i denotes the i-th characteristic node data in the sketch stroke characteristic data; e_keyframe_i denotes the i-th characteristic node data in the action characteristic data; and w_i denotes the weight of the i-th characteristic node data in the sketch stroke characteristic data.
8. The character action generating apparatus according to claim 5,
the skeletal features include at least 15 limb unit vectors.
CN201611091834.3A 2016-12-01 2016-12-01 Role action generation method and device Active CN106780681B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611091834.3A CN106780681B (en) 2016-12-01 2016-12-01 Role action generation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611091834.3A CN106780681B (en) 2016-12-01 2016-12-01 Role action generation method and device

Publications (2)

Publication Number Publication Date
CN106780681A CN106780681A (en) 2017-05-31
CN106780681B true CN106780681B (en) 2020-04-28

Family

ID=58915615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611091834.3A Active CN106780681B (en) 2016-12-01 2016-12-01 Role action generation method and device

Country Status (1)

Country Link
CN (1) CN106780681B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107330961A (en) * 2017-07-10 2017-11-07 湖北燿影科技有限公司 A kind of audio-visual conversion method of word and system
CN108109189A (en) * 2017-12-05 2018-06-01 北京像素软件科技股份有限公司 Act sharing method and device
CN108038160B (en) * 2017-12-06 2019-03-08 央视动画有限公司 Dynamic animation store method, dynamic animation call method and device
CN109671016B (en) * 2018-12-25 2019-12-17 网易(杭州)网络有限公司 face model generation method and device, storage medium and terminal
CN110781820B (en) * 2019-10-25 2022-08-05 网易(杭州)网络有限公司 Game character action generating method, game character action generating device, computer device and storage medium
CN110841296B (en) * 2019-11-12 2023-09-15 网易(杭州)网络有限公司 Game character skill generation method and device, electronic equipment and storage medium
CN111111199A (en) * 2019-11-19 2020-05-08 江苏名通信息科技有限公司 Role three-dimensional modeling system and method based on picture extraction
CN111127605A (en) * 2019-12-23 2020-05-08 北京像素软件科技股份有限公司 Game role visual angle direction control method and device and electronic equipment
CN111481928B (en) * 2020-04-13 2023-04-07 网易(杭州)网络有限公司 Animation loading method and device, server and storage medium
CN112891947B (en) * 2021-04-02 2024-02-23 网易(杭州)网络有限公司 Jump animation processing method, apparatus, electronic device and computer readable medium
CN113706666A (en) * 2021-08-11 2021-11-26 网易(杭州)网络有限公司 Animation data processing method, non-volatile storage medium, and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493954A (en) * 2009-02-26 2009-07-29 清华大学 Three-dimensional modelling approach based on framework sketch drafting
CN101958007A (en) * 2010-09-20 2011-01-26 南京大学 Three-dimensional animation posture modeling method by adopting sketch
CN102179048A (en) * 2011-02-28 2011-09-14 武汉市高德电气有限公司 Method for implementing realistic game based on movement decomposition and behavior analysis
CN102231209A (en) * 2011-04-19 2011-11-02 浙江大学 Two-dimensional character cartoon generating method based on isomerism feature dimensionality reduction
CN103065348A (en) * 2012-12-27 2013-04-24 江苏太奇通软件有限公司 Two-dimension animation automatic generating method based on skeleton motion database
CN103606178A (en) * 2013-10-23 2014-02-26 合肥工业大学 Interactive motion data acquisition method based on portable terminal
CN104021584A (en) * 2014-06-25 2014-09-03 无锡梵天信息技术股份有限公司 Implementation method of skinned skeletal animation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9734618B2 (en) * 2013-11-25 2017-08-15 Autodesk, Inc. Animating sketches via kinetic textures

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493954A (en) * 2009-02-26 2009-07-29 清华大学 Three-dimensional modelling approach based on framework sketch drafting
CN101958007A (en) * 2010-09-20 2011-01-26 南京大学 Three-dimensional animation posture modeling method by adopting sketch
CN102179048A (en) * 2011-02-28 2011-09-14 武汉市高德电气有限公司 Method for implementing realistic game based on movement decomposition and behavior analysis
CN102231209A (en) * 2011-04-19 2011-11-02 浙江大学 Two-dimensional character cartoon generating method based on isomerism feature dimensionality reduction
CN102231209B (en) * 2011-04-19 2014-04-16 浙江大学 Two-dimensional character cartoon generating method based on isomerism feature dimensionality reduction
CN103065348A (en) * 2012-12-27 2013-04-24 江苏太奇通软件有限公司 Two-dimension animation automatic generating method based on skeleton motion database
CN103606178A (en) * 2013-10-23 2014-02-26 合肥工业大学 Interactive motion data acquisition method based on portable terminal
CN104021584A (en) * 2014-06-25 2014-09-03 无锡梵天信息技术股份有限公司 Implementation method of skinned skeletal animation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Motion doodles: an interface for sketching character motion; Matthew Thorne, et al.; ACM Transactions on Graphics; 2004-08-31; Vol. 23, No. 3; pp. 424-431 *

Also Published As

Publication number Publication date
CN106780681A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
CN106780681B (en) Role action generation method and device
CN111260762B (en) Animation implementation method and device, electronic equipment and storage medium
CN111626218B (en) Image generation method, device, equipment and storage medium based on artificial intelligence
WO2022001652A1 (en) Virtual character control method and apparatus, computer device, and storage medium
Seol et al. Creature features: online motion puppetry for non-human characters
Chen et al. KinÊtre: animating the world with the human body
CN110675475B (en) Face model generation method, device, equipment and storage medium
US20100259538A1 (en) Apparatus and method for generating facial animation
KR102374307B1 (en) Modification of animated characters
JP2019175266A (en) Operation generation device, model generation device, operation generation method and program
US20160232698A1 (en) Apparatus and method for generating animation
US11816772B2 (en) System for customizing in-game character animations by players
CN109064548B (en) Video generation method, device, equipment and storage medium
KR20230148239A (en) Robust facial animation from video using neural networks
CN115908109A (en) Facial image stylized model training method, equipment and storage medium
CN114093021A (en) Dance video motion extraction method and device, computer equipment and storage medium
US11830121B1 (en) Neural animation layering for synthesizing martial arts movements
CN115115752A (en) Virtual garment deformation prediction method and device, storage medium and electronic equipment
CN114758108A (en) Virtual object driving method, device, storage medium and computer equipment
WO2023035725A1 (en) Virtual prop display method and apparatus
CN116248920A (en) Virtual character live broadcast processing method, device and system
US20230267668A1 (en) Joint twist generation for animation
CN114373033A (en) Image processing method, image processing apparatus, image processing device, storage medium, and computer program
US20220172431A1 (en) Simulated face generation for rendering 3-d models of people that do not exist
CN115526967A (en) Animation generation method and device for virtual model, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant