CN116385605A - Method and device for generating flight animation of target object and electronic equipment
- Publication number
- CN116385605A (application CN202211551588.0A)
- Authority
- CN
- China
- Prior art keywords
- target
- animation
- adjustment
- flight
- target object
- Legal status: Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
Abstract
The invention provides a method and device for generating a flight animation of a target object, and an electronic device. A flight control instruction corresponding to the target object and the current posture of the target object are acquired, the target object comprising a plurality of body parts; a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, are determined based on the flight control instruction; animation blending is then performed on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object. The preset animation resources comprise animation frames representing the target object in a preset posture or a preset motion state. In this way, a more lifelike flight animation effect is produced while using fewer system resources, improving the player's visual experience.
Description
Technical Field
The invention relates to the technical field of game interaction, in particular to a method and a device for generating a flight animation of a target object and electronic equipment.
Background
In the related art, the flight animation of a player-controlled virtual character may be produced with motion matching (Motion Matching) technology or with blend-space animation technology. However, motion matching places high performance demands on the terminal device, so the device's load and resource consumption become excessive. Blend-space animation requires a large number of animation assets to be authored, and the resulting turning animations look poor, degrading the player's visual experience.
Disclosure of Invention
Accordingly, an object of the invention is to provide a method and device for generating a flight animation of a target object, and an electronic device, so as to generate a more lifelike flight animation while using fewer system resources and thereby improve the player's gaming experience.
In a first aspect, an embodiment of the invention provides a method for generating a flight animation of a target object. The method includes: acquiring a flight control instruction corresponding to the target object and the current posture of the target object, the target object comprising a plurality of body parts; determining, based on the flight control instruction, a target adjustment part among the plurality of body parts and adjustment parameters of the target adjustment part; and performing animation blending on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object. The preset animation resources comprise animation frames representing the target object in a preset posture or a preset motion state.

In a second aspect, an embodiment of the invention provides a device for generating a flight animation of a target object, comprising: a flight control instruction acquisition module, configured to acquire a flight control instruction corresponding to the target object and the current posture of the target object, the target object comprising a plurality of body parts; an adjustment parameter determination module, configured to determine, based on the flight control instruction, a target adjustment part among the plurality of body parts and adjustment parameters of the target adjustment part; and a flight animation generation module, configured to perform animation blending on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object. The preset animation resources comprise animation frames representing the target object in a preset posture or a preset motion state.

In a third aspect, an embodiment of the invention provides an electronic device comprising a processor and a memory, the memory storing machine-executable instructions executable by the processor; the processor executes the machine-executable instructions to implement the above method for generating a flight animation of a target object.

In a fourth aspect, an embodiment of the invention provides a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the above method for generating a flight animation of a target object.
The embodiments of the invention provide the following beneficial effects:

According to the method and device for generating a flight animation of a target object and the electronic device described above, a flight control instruction corresponding to the target object and the current posture of the target object are acquired, the target object comprising a plurality of body parts; a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, are determined based on the flight control instruction; and animation blending is performed on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object, the preset animation resources comprising animation frames representing the target object in a preset posture or a preset motion state. In this way, a more lifelike flight animation effect is produced while using fewer system resources, improving the player's visual experience.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are some embodiments of the invention and that other drawings may be obtained from these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a method for generating a flight animation of a target object according to an embodiment of the present invention;

FIG. 2 is a schematic diagram of a target object in a basic flying state according to an embodiment of the present invention;

FIG. 3 is a schematic diagram of the hand posture of a target object in a turning flight state according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of the leg posture of another target object in a turning flight state according to an embodiment of the present invention;

FIG. 5 is a schematic diagram of the torso posture of a target object turning to the left according to an embodiment of the present invention;

FIG. 6 is a schematic diagram of the leg posture of a target object turning to the left according to an embodiment of the present invention;

FIG. 7 is a schematic structural diagram of a device for generating a flight animation of a target object according to an embodiment of the present invention;

FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
At present, character flight animations in games pay little attention to the details of motion change during turns, so character motion appears stiff and unsmooth when the player performs a turning operation, and the flight controls in the game feel poor.

In the related art, the motion matching (Motion Matching) facility in an engine is generally used to obtain highly fluid character flight animation.
Motion Matching is a relatively new animation selection technique. It pre-samples the animation data to obtain information such as poses (also referred to as "Pose") and motion trajectories. At run time, the game takes the player's input and the current motion as target data, matches them against the pre-sampled animation feature database, selects the best animation sample once matching completes, and then blends (Blend) from the current pose to the target pose to continue playback.

In plain terms, the existing motion data is abstracted into trajectory, pose, and similar information, and at run time the most suitable animation is found in the abstracted data and played according to the current context. The pipeline is roughly: (1) sampling; (2) building runtime data; (3) finding the most suitable animation; (4) blending (also called "Blend") to the target pose for playback. Owing to the nature of Motion Matching, it is theoretically possible to switch arbitrarily between any pair of sampling points.
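As a rough illustration of step (3) above, the following C++ sketch searches a pre-sampled feature database for the entry closest to the current query features. The structures, names, and feature layout are hypothetical illustrations, not taken from the patent or from any engine:

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// One pre-sampled entry: a feature vector describing the pose and
// trajectory at a given frame of a given clip (hypothetical layout).
struct AnimFeature {
    int clipId;
    float time;                   // sample time within the clip
    std::vector<float> features;  // e.g. joint positions, velocities, future trajectory
};

// Squared Euclidean distance between two equal-length feature vectors.
static float FeatureDistanceSq(const std::vector<float>& a, const std::vector<float>& b) {
    float d = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i) {
        float diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

// Find the database entry that best matches the current query features;
// the runtime would then blend from the current pose to this sample.
const AnimFeature* FindBestMatch(const std::vector<AnimFeature>& database,
                                 const std::vector<float>& query) {
    const AnimFeature* best = nullptr;
    float bestDist = std::numeric_limits<float>::max();
    for (const AnimFeature& entry : database) {
        float dist = FeatureDistanceSq(entry.features, query);
        if (dist < bestDist) { bestDist = dist; best = &entry; }
    }
    return best;
}
```

Scanning a large feature database on every frame in this fashion is part of what makes the technique heavy for mobile-class processors, as noted below.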
In other words, this method synthesizes a new animation by comparing the current animation with the animation clips in a database in real time and searching the database for the target animation frame or clip that best matches the current animation. However, it places a heavy load on the processor of terminal devices such as mobile phones and tablet computers.
Conventional animation generation techniques also include blend-space animation. A blend space blends animation assets based on specific properties or conditions, reducing the need to create a single hard-coded node for each blend. It lets an animator or programmer specify the inputs, the source animations, and how the inputs drive blending between the animations; virtually any type of blending can be performed with this generalized blend space. The blend space is entirely generic, and the factors that drive blending differ for each individual blend space. Each blend space has inputs that receive values; these values can be computed (for example via an Event Graph) during the update of the animation blueprint, driven by gameplay code, or supplied by other means. These features make blend spaces extremely flexible, putting the initiative in the hands of the animation-graph creator to blend animations in whatever way meets their needs.

This approach makes low demands on handset performance, but the turning animation it produces in flight is stiff. In addition, because many animations must be authored, production is costly, so a trade-off between effect and performance is necessary to achieve a smooth flight animation.
Based on the above, the method and device for generating a flight animation of a target object and the electronic device provided by the embodiments of the invention can be applied to the flight of virtual objects in various virtual scenes.

The method for generating a flight animation of a target object in one embodiment of the invention may run on a local terminal device or on a server. When the method runs on a server, it may be implemented and executed based on a cloud interaction system, which includes a server and a client device.

In an alternative embodiment, various cloud applications may run under the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In this mode, the body that runs the game program is separated from the body that presents the game pictures: the storage and execution of the method for generating the flight animation of the target object are completed on the cloud game server, while the client device receives and sends data and presents the game pictures. The client device may be, for example, a display device with data transmission capability close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the information processing is performed by the cloud game server in the cloud. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses data such as the game pictures, and returns them over the network to the client device, which decodes the data and outputs the game pictures.

In an alternative embodiment, taking a game as an example, the local terminal device stores the game program and presents the game pictures. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, it may be rendered on the display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game pictures, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
In a possible implementation manner, the embodiment of the invention provides a method for generating a flight animation of a target object, as shown in fig. 1, which includes the following steps:
step S102, acquiring a flight control instruction corresponding to a target object and the current gesture of the target object; the target object includes a plurality of body parts.
The target object may be a virtual human object, or may be another biological object, or may be a movable virtual object designed by a game designer. The target virtual object has a plurality of body parts, such as a head, torso, legs, hands, etc., or other body parts having similar functions to the body parts described above. In the process of flying the target object, when the target object is in different flying states, the gesture change trend of the body parts is usually different, and the change trend can be used for keeping the target object balanced in the process of flying the target object and can also be used for providing acceleration for the target object in the process of flying.
The above-mentioned flight control instruction may be input to the terminal device by the player who controls the target object through the man-machine interaction device, the method is executed by the terminal device, or the flight control instruction is transmitted to the game server by the terminal device, and the method is executed by the game server.
At the present moment, the target object may be in a flight state, or may be in a static state, a walking state, or a running state. The flight state can be divided into various types, such as uniform flight, diving flight, steering flight and the like. The above-mentioned flight control instructions are generally used to control the target object to enter a flight state from a current non-flight state, or to control the target object to be in a certain flight state.
Step S104: determining a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, based on the flight control instruction.

In the real world, the body parts of an organism undergo certain posture changes, whether to maintain balance or because of inertia, during take-off, accelerated flight, decelerated flight, or flight turns. For example, when an organism begins to accelerate in flight, its body may lean back slightly and its hands may lift to keep balance.

To simulate how a real organism's flight state changes, and to give the player a more realistic flight experience, real footage of biological flight can be analyzed in advance to learn which body parts adjust their posture in each flight state. Thus, for each flight state, the target adjustment parts whose postures need adjusting are determined in advance, together with parameters such as the adjustment mode (for example, adjusting a swing angle or a torsion angle), the start time, the end time, and the posture adjustment amplitude applied after the target object enters the corresponding flight state. For example, the hands begin to swing the moment the target object enters the accelerated flight state, first swinging up by a first angle within 2 s and then swinging down by a second angle within 2 s; the first and second angles are posture adjustment amplitudes, and their values bear a certain mathematical relationship to the acceleration of the accelerated flight.
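For concreteness, the per-part parameters just listed (adjustment mode, start and end time, amplitude) could be held in a structure like the following C++ sketch; the type and its fields are hypothetical, and the example values mirror the hand-swing example above:

```cpp
#include <string>
#include <vector>

// Hypothetical description of one posture adjustment applied to one body
// part after the target object enters a given flight state.
struct PartAdjustment {
    std::string partName;                   // e.g. "hand", "leg", "torso"
    enum class Mode { Swing, Twist } mode;  // adjust a swing angle or a torsion angle
    float startTimeSec;                     // start time, relative to entering the state
    float endTimeSec;                       // end time, relative to entering the state
    float amplitudeDeg;                     // posture adjustment amplitude, in degrees
};

// The hand-swing example from the text: swing up by a first angle within
// 2 s, then down by a second angle within the next 2 s.
std::vector<PartAdjustment> MakeAcceleratedFlightHandAdjustments(float firstAngleDeg,
                                                                 float secondAngleDeg) {
    return {
        {"hand", PartAdjustment::Mode::Swing, 0.0f, 2.0f, firstAngleDeg},
        {"hand", PartAdjustment::Mode::Swing, 2.0f, 4.0f, -secondAngleDeg},
    };
}
```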
In some flight states, the posture adjustment amplitude may be independent of the control parameter carried by the flight control instruction, in which case the amplitude is fixed once the target adjustment part is determined. For instance, if the flight state controlled by the instruction is the take-off state, every body part of the target object will end up flying in a set posture, so the posture difference between that set posture and the part's posture in the non-flight state is the posture adjustment amplitude.

When the flight control instruction is received, the target adjustment part and its adjustment parameters can be determined from the flight state the instruction controls. Specifically, the control parameters carried by instructions that control different flight states generally differ, so the controlled flight state can be identified from those control parameters. For example, the control parameters of an instruction controlling the accelerated flight state may include an acceleration parameter, a force parameter, and the like. The flight control instruction is parsed, and when its control parameters include one or more of these, the controlled flight state is determined to be the accelerated flight state; the corresponding body parts are then taken as target adjustment parts, and the start time, end time, and so on of their adjustment after the target object enters that flight state are determined.

The correspondence between the control parameter carried by the flight control instruction and the posture adjustment amplitude of a target adjustment part can be preset, for example as a linear or quadratic relationship, and can be represented by an interpolation table; for example, when the acceleration is 10 m/s², the preset lean-back angle of the body is 5 degrees. The posture adjustment amplitude of each target adjustment part can then be determined from the control parameter of the flight-state control operation.
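A minimal sketch of such an interpolation table follows, assuming piecewise-linear interpolation between preset points; the 10 m/s² to 5-degree entry follows the example above, while the other entries and all names are illustrative:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Piecewise-linear interpolation table: maps a control parameter (here,
// acceleration in m/s^2) to a posture adjustment amplitude (here, the
// body's lean-back angle in degrees).
float LookupAmplitude(const std::vector<std::pair<float, float>>& table, float x) {
    if (x <= table.front().first) return table.front().second;
    if (x >= table.back().first)  return table.back().second;
    for (std::size_t i = 1; i < table.size(); ++i) {
        if (x <= table[i].first) {
            float t = (x - table[i - 1].first) / (table[i].first - table[i - 1].first);
            return table[i - 1].second + t * (table[i].second - table[i - 1].second);
        }
    }
    return table.back().second;
}

// Example table: 10 m/s^2 of acceleration maps to a 5-degree lean-back
// angle, as in the text; the other entries are made-up illustrations.
const std::vector<std::pair<float, float>> kLeanBackTable = {
    {0.0f, 0.0f}, {10.0f, 5.0f}, {20.0f, 9.0f},
};
```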
Step S106: performing animation blending on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object; the preset animation resources comprise animation frames representing the target object in a preset posture or a preset motion state.

The preset animation resources may include animation frames representing the target object in a preset posture. The preset posture may be a flight posture of the target object in one of the flight states, such as a turning flight posture or an accelerating flight posture, or it may be a running posture, a jumping posture, and so on; in a preset posture, body parts of the target object may be twisted and/or swung. The preset animation resources may also include animation clips representing the target object in a preset motion state which, as above, may be one or more flight states, or a running state, a jumping state, and so on. Such animation clips likewise typically consist of multiple animation frames.

After the adjustment parameters of the target adjustment part, such as the adjustment mode, adjustment time, and posture adjustment amplitude, are determined, the processing parameters for the animation blending of the preset animation resources must be determined. The generated flight animation typically consists of multiple animation frames, which generally need to be generated in playback order. Specifically, the current posture adjustment parameter of the target adjustment part in each animation frame to be generated can be determined from the adjustment parameters of the target adjustment part and used as the processing parameter for the animation blending.

Once the processing parameters are obtained, animation blending can be performed on the preset animation resources of the target object and the animation resources of the target object's current posture, based on these parameters, to generate each frame of the flight animation. In a specific implementation, a blend space is typically used. In that case the skeletal model of the target object is also needed: the blending is performed on the portion of the skeletal model corresponding to the target adjustment part, with the position parameters of the target adjustment part within the skeletal model and the corresponding processing parameters passed into the blend space as inputs. This approach involves little computation and makes low performance demands on the terminal device. Other animation generation methods can also be applied to the preset animation resources, as required.
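An engine-agnostic C++ sketch of this step is shown below. All types and functions are hypothetical placeholders rather than the UE4 API, and the blend itself is left as a stub, since the text describes the blend inputs rather than the blend internals:

```cpp
#include <vector>

struct Pose { /* per-bone transforms, omitted for brevity */ };

// Hypothetical per-frame blend input: the bones of the target adjustment
// part within the skeletal model, plus this frame's processing parameters.
struct BlendInput {
    std::vector<int> boneIndices;  // portion of the skeleton to blend
    float paramX;                  // processing parameters for this frame,
    float paramY;                  // e.g. lean on two blend-space axes
};

// Stub: a real implementation would blend the preset keyframes into the
// current pose over input.boneIndices, weighted by paramX and paramY.
Pose EvaluateBlendSpace(const Pose& currentPose,
                        const std::vector<Pose>& /*presetKeyframes*/,
                        const BlendInput& /*input*/) {
    return currentPose;
}

// Generate the flight animation frame by frame from the per-frame inputs.
std::vector<Pose> GenerateFlightAnimation(const Pose& currentPose,
                                          const std::vector<Pose>& presetKeyframes,
                                          const std::vector<BlendInput>& perFrameInputs) {
    std::vector<Pose> frames;
    frames.reserve(perFrameInputs.size());
    for (const BlendInput& in : perFrameInputs) {
        frames.push_back(EvaluateBlendSpace(currentPose, presetKeyframes, in));
    }
    return frames;
}
```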
According to the above method for generating a flight animation of a target object, a flight control instruction corresponding to the target object and the current posture of the target object are acquired, the target object comprising a plurality of body parts; a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, are determined based on the flight control instruction; and animation blending is performed on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object, the preset animation resources comprising animation frames representing the target object in a preset posture or a preset motion state. In this way, a more lifelike flight animation effect is produced while using fewer system resources, improving the player's visual experience.
The following embodiments provide a specific way of determining the target adjustment part among the plurality of body parts, and the adjustment parameters of the target adjustment part, based on the flight control instruction.

The flight instruction may be generated by a flight control operation of the player. The flight control operation may be a touch operation on the target object or on a flight control displayed in the graphical user interface, or an operation on a human-computer interaction device, such as pressing a designated key on a keyboard. For example, when the player wants the target object to accelerate, the player may slide upward on the target object to generate a flight instruction controlling accelerated flight, with the acceleration parameter in the instruction proportional to the sliding distance.

During flight, the target object may be in any of several flight states, such as a uniform-speed flight state, a decelerating flight state, an accelerating flight state, or a turning flight state. Different flight states generally correspond to different flight postures of the target object, that is, different postures of its body parts. The posture of each body part in each flight state can be obtained by studying the postures of real organisms in flight. The flight control instruction generally indicates the target flight state it controls; for example, if parsing the instruction shows that it carries an acceleration parameter, the instruction can be taken to control the accelerated flight state.

After the target flight state controlled by the flight control instruction is determined, the target object is controlled, in response to the instruction, to be in the target flight state, and the body parts to be adjusted that correspond to the target flight state are taken as the target adjustment parts. Each target adjustment part corresponds to a preset posture adjustment mode, which describes the trend of the part's posture change for as long as the target flight state lasts. The trend may, for example, be that a swing angle increases at a preset speed within a set time and is then held. The preset mode may already fix the posture adjustment amplitude, or it may leave the amplitude open, in which case the adjustment parameters of the target adjustment part, including the posture adjustment amplitude, are subsequently determined from the flight control instruction and the part's preset posture adjustment mode.

In a specific implementation, the control time and the control strength corresponding to the flight control instruction can be determined by parsing the instruction. The control strength is generally expressed by a parameter that affects the posture adjustment amplitude of the target adjustment part, such as an acceleration parameter or a turning speed. Because the control time of the instruction is finite while the preset posture adjustment mode describes the trend over the whole duration of the target flight state, the current posture adjustment mode of the target adjustment part is determined from the control time and the preset mode, so that the current mode describes the trend of the part's posture within the control time.

Specifically, the portion of the preset posture adjustment mode corresponding to the control time can be intercepted and used as the current posture adjustment mode. For example, when the preset mode is a posture-change function with time as its argument, the current mode may be that same function with its argument constrained to the control time.
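A small sketch of this interception, assuming the preset posture adjustment mode is represented as a function of time since the flight state began (the representation and names are assumptions):

```cpp
#include <algorithm>
#include <functional>

// Preset posture adjustment mode: posture change as a function of time
// since the flight state began (assumed representation).
using PoseCurve = std::function<float(float)>;

// The current adjustment mode is the preset curve restricted to the window
// [0, controlTime] covered by the flight control instruction.
PoseCurve RestrictToControlTime(PoseCurve preset, float controlTimeSec) {
    return [preset, controlTimeSec](float t) {
        return preset(std::clamp(t, 0.0f, controlTimeSec));
    };
}
```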
After the current posture adjustment mode is determined, the posture adjustment amplitudes of the target adjustment parts can be determined from the control strength corresponding to the flight control instruction together with the current posture adjustment mode. The control strength is expressed by whichever parameter of the instruction affects the posture adjustment amplitude of the target adjustment part. For the accelerated flight state, for example, the acceleration parameter generally governs the amplitude of the corresponding target adjustment parts: usually, the larger the acceleration, the larger each part's posture adjustment amplitude must be to keep the target object's body balanced. Within the current posture adjustment mode, a parameter usually encodes the correspondence between the posture adjustment amplitude and the control strength; this correspondence can take the form of an interpolation function, a proportionality coefficient, and so on.

The following describes how the adjustment parameters of the target adjustment parts are determined, taking as an example the case in which the flight control instruction controls the turning flight state of the target object. The target adjustment parts corresponding to the turning state usually have preset posture adjustment modes. For example, the mode for the torso includes the start time and hold time of the torso's torsion adjustment, a body-orientation adjustment parameter, and a torsion adjustment coefficient related to the steering angle, while the mode for the legs includes a swing adjustment coefficient related to the steering angle and the legs' swing adjustment process.

For the turning flight state, parsing the flight control instruction generally yields the rotation duration and rotation direction of the target object under that instruction's control. To determine the adjustment parameters, first the adjustment times of the torso, legs, and hands within the rotation duration are determined from the rotation duration; then the steering angle of the instruction is computed from the rotation duration, the rotation direction, and a preset rotation angle per unit time; finally, the posture adjustment amplitude of each target adjustment part is determined from the steering angle and the steering-angle-related adjustment coefficient in that part's posture adjustment mode. For example, the torso's posture may be adjusted throughout the rotation period, while the posture adjustment time of the hands and legs may be the first 5 s of the rotation period; this can be set as required and is not limited here. When the adjustment parameters of each target adjustment part are determined: for the torso, the orientation adjustment amplitude and the torsion-angle amplitude are computed from the steering angle and the control time; for the legs and hands, the swing adjustment amplitudes are computed from the steering angle and the control time.
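Under this description, the steering-angle step can be sketched as follows; the signed direction convention and all names are assumptions:

```cpp
// Steering angle from the parsed instruction: rotation duration, a signed
// direction (assumed convention: +1 right, -1 left), and a preset rotation
// angle per unit time.
float SteeringAngleDeg(float rotationSeconds, int direction, float unitAngleDegPerSec) {
    return static_cast<float>(direction) * rotationSeconds * unitAngleDegPerSec;
}

// Posture adjustment amplitude of one target part: the steering angle scaled
// by that part's steering-angle-related adjustment coefficient.
float PartAmplitudeDeg(float steeringAngleDeg, float adjustmentCoefficient) {
    return steeringAngleDeg * adjustmentCoefficient;
}
```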
The following embodiments provide a specific way of performing animation blending on the preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object.

In a specific implementation, the flight animation generally includes a plurality of target animation frames arranged in a set order; played in that order, they form the flight animation. The following operations are performed for each target animation frame, in sequence order, to generate it: first, the blend-space parameters corresponding to the target animation frame are determined from the adjustment parameters of the target adjustment part and the frame's position in the sequence of target animation frames; then, animation blending is performed on the preset animation resources, based on the frame's blend-space parameters and the current posture of the target object, to generate the target animation frame.

To determine the blend-space parameters for a target animation frame, the current posture adjustment parameters of the target adjustment part in that frame are determined first, from the adjustment parameters of the target adjustment part and the frame's position in the sequence. The current posture adjustment mode among the adjustment parameters describes the trend of the part's posture within the control time of the flight control instruction; for example, the target adjustment part is controlled to swing upward at a first speed for a first duration and then swing upward at a second speed for a second duration. The whole flight animation corresponds to the control time of the flight control instruction, and different target animation frames correspond to different target moments within that control time.

Specifically, the target moment of the target animation frame within the control time is determined from the frame's position in the sequence of target animation frames; the current posture adjustment parameters of the target adjustment part in the frame are then determined from the target moment, the part's current posture adjustment mode, and its posture adjustment amplitude. In implementation, these parameters are computed according to the representation of the current posture adjustment mode. For example, when the mode is represented by a function of time, the target moment is substituted into the function to obtain the proportion of the posture adjustment amplitude reached in that frame, from which the current posture adjustment parameter is computed.
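One way to realize this per-frame evaluation, assuming the current posture adjustment mode is a time curve returning the proportion of the full amplitude reached at a given moment (names and representation are assumptions):

```cpp
#include <cstddef>
#include <functional>

// Map a frame's position in the sequence to its target moment within the
// control time, then evaluate the adjustment curve at that moment. The
// curve is assumed to return the proportion of the full amplitude reached,
// and frameCount is assumed to be at least 2.
float FramePostureParameter(std::size_t frameIndex,
                            std::size_t frameCount,
                            float controlTimeSec,
                            const std::function<float(float)>& proportionCurve,
                            float amplitudeDeg) {
    float targetTime = controlTimeSec * static_cast<float>(frameIndex)
                     / static_cast<float>(frameCount - 1);
    return amplitudeDeg * proportionCurve(targetTime);
}
```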
Animation blending techniques come in several forms, such as blend trees (Blend Tree), animation layers (Layer), and blend spaces (Blend Space). A blend tree blends several animation clips linearly according to position, speed, and angular speed. An animation layer controls only part of the animated body, with the remaining parts masked out. A blend space divides the animation frame to be generated into a grid and, from the blend-space parameters fed into it, computes per-cell weights for the animation keyframes being blended, thereby generating the corresponding frame. Whichever blending technique is used, the corresponding blending parameters must be determined according to that technique.

When blend-space blending is used, the blend-space parameters of a target animation frame must be determined from the current posture adjustment parameters of the target adjustment part. Since blend-space fusion also operates on the skeletal-model asset of the target object, the generated blend-space parameters usually also need to designate the portion of the skeletal model corresponding to the target adjustment part as the portion that participates in the blending. In addition, the weights of the different animation keyframes in the blend are determined from the current posture adjustment parameters of the target adjustment part and are fed into the blend space as blend-space parameters.

Because the animation keyframes in the preset animation resources generally correspond to preset flight states, after the posture adjustment parameters of a target animation frame are determined, the target object is first controlled, in response to the flight control instruction, to be in the target flight state, and the keyframes corresponding to the target flight state among the target object's animation keyframes are taken as target keyframes; blend-space blending is then performed, based on the frame's blend-space parameters, on the target keyframes and the animation frame of the target object's current posture, to generate the target animation frame. Animation blending through a blend space is a relatively mature animation processing technique, so its details are not repeated here.
An embodiment of the invention further provides another method for generating a flight animation of a target object, implemented on the basis of the method shown in fig. 1. It combines the strengths of the two schemes examined above, motion matching and blend-space animation, by optimizing the better-performing engine state-machine scheme until it approaches the visual quality of motion matching. The core idea is to imitate the body-control reactions of armored flight: the upper-body and lower-body limb motions of the character during a turn are disassembled separately, blend-space poses are produced for the turn-start and turn-end stages at a fine grain, and these are overlaid, transitioned, and fused into the blend spaces of the four flight directions through the UE4 engine's animation blueprint, so that the body animation stays natural and smooth throughout the turn.

In a specific implementation, flight can be divided into two states, take-off (fly start) and in-flight (fly). The take-off state handles the take-off presentation, including the take-off animation, special-effect bindings, and the like; the in-flight state mainly handles the animation shown during flight. On entering flight, the target object first jumps to the take-off state and then, according to the playing time of the take-off animation, transitions into the in-flight state. The in-flight state is further subdivided, by differences in flight performance, into decelerated flight (DecFly), normal flight (NormFly), accelerated flight (AccFly), start-of-accelerated flight (StartAccFly), and so on.
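A skeleton of this state split is sketched below; the state names follow the text, while the transition conditions shown are merely illustrative:

```cpp
// Flight state machine as described in the text: a take-off state first,
// then the in-flight states subdivided by flight performance.
enum class FlightState {
    FlyStart,     // take-off: take-off animation, special-effect bindings
    StartAccFly,  // start of accelerated flight
    AccFly,       // accelerated flight
    NormFly,      // normal flight
    DecFly,       // decelerated flight
};

// Illustrative transitions only: leave take-off when its animation has
// finished playing, then pick an in-flight state from the acceleration.
FlightState UpdateFlightState(FlightState s, float takeoffAnimTimeLeft, float acceleration) {
    if (s == FlightState::FlyStart) {
        return takeoffAnimTimeLeft <= 0.0f ? FlightState::NormFly : s;
    }
    if (acceleration > 0.0f) return FlightState::AccFly;
    if (acceleration < 0.0f) return FlightState::DecFly;
    return FlightState::NormFly;
}
```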
The flight states can transition among one another, and each state contains a basic flight animation for that state, a turning-torso blend-space overlay, a turn-start leg-animation blend-space overlay, a hand-animation blend-space overlay, and so on. Dividing the flight process into relatively independent state machines at different granularities makes it convenient to polish the animation within each state without the states interfering with one another; it also makes it easy for the program to drive the presentation logic parametrically, so the resulting effect is more controllable.

Each in-flight state can thus be divided internally into the basic flight animation, the turning-torso blend-space overlay, the turn-start leg-animation blend-space overlay, the hand-animation blend-space overlay, and so on.

First, the basic flight animation is played; at this point the animation contains only the basic flight pose, with no dynamic effect. The degree of torso lean is then parameterized on the yaw angle (Yaw) and the pitch angle (Pitch): the steering data input by the player is converted into the corresponding blend-space parameter values, and updating those values adjusts the lean of the torso, producing a leaning torso pose that is overlaid on the basic flight animation. The pose of the target object in the basic flight state is shown in fig. 2. After the torso overlay is obtained, to reproduce the natural way a body keeps its balance when turning, a reverse leg-lift overlay is added at the very start of the turn: when the target object begins to turn left, the left leg lifts naturally by a small amplitude and the right leg swings down naturally by a small amplitude, with the motion amplitude controlled through the blend-space parameters. In this initial stage the legs swing backward for balance, but as the turn completes, the motion must transition to leaning into the turning direction, as if resisting the centrifugal force and flying toward the center of the circle; the hand and leg motions are therefore overlaid into the animation according to the duration and degree of the turn. The hand motion is shown in fig. 3 and the leg motion in fig. 4. The swing motion of the legs is likewise overlaid into the animation.
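A sketch of the per-frame overlay parameters described above, i.e. the torso yaw/pitch lean from the player's steering input plus the fading counter-swing of the legs at turn start; the structure, the clamping, and the fade curve are illustrative assumptions, since the text describes the behavior only qualitatively:

```cpp
#include <algorithm>

// Per-frame parameters overlaid on the basic flight animation (hypothetical
// structure; in the described implementation these drive UE4 blend spaces).
struct TurnOverlayParams {
    float torsoYawLean;     // torso lean on the yaw axis
    float torsoPitchLean;   // torso lean on the pitch axis
    float legCounterSwing;  // opposite-direction leg lift at turn start
};

TurnOverlayParams ComputeTurnOverlay(float steerYawInput,    // from player steering input
                                     float steerPitchInput,
                                     float turnElapsedSec,   // time since the turn began
                                     float startPhaseSec) {  // assumed start-phase length
    TurnOverlayParams p;
    p.torsoYawLean   = std::clamp(steerYawInput,   -1.0f, 1.0f);
    p.torsoPitchLean = std::clamp(steerPitchInput, -1.0f, 1.0f);
    // The counter-swing applies only at the start of the turn and fades out
    // as the motion transitions to leaning into the turn direction.
    float fade = std::clamp(1.0f - turnElapsedSec / startPhaseSec, 0.0f, 1.0f);
    p.legCounterSwing = -p.torsoYawLean * fade;
    return p;
}
```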
Through the superposition of this series of detail motions, the state data during flight can be parameterized and mapped into the corresponding blend spaces, yielding a flight animation effect that is convenient to control through parameters.
When the target object turns during flight, the steering angle must be converted through the animation blend space to obtain the input parameters of the torso blend space, from which the torso animation is finally output. The calculation is set up as follows: let Y be the difference between the target heading (yaw) of the turn and the current heading of the target object, let P be the difference between the target pitch and the current pitch, and let Y_max and P_max be the maximum yaw and pitch differences of a flight turn. The blend inputs Fly_Turn_Acc_X (written T_yaw) and Fly_Turn_Acc_Y (written T_pitch) are then computed from Y, Y_max, P, and P_max.
When a flight turn begins and the target heading differs from the current heading by more than a certain angle, the turn-start leg animation must be overlaid. The amplitude of the leg motion is determined by the target yaw of the target object: the larger the target yaw value, the larger the leg swing. Fly_Turn_Start_Leg_Acc (written S_leg) is likewise obtained by converting the difference between the target heading of the flight turn and the current heading.
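The text does not preserve the conversion formulas themselves. A plausible reconstruction, assuming each blend input is simply the corresponding difference normalized by its maximum and clamped, is sketched below; this is an assumption, not the patent's exact formula:

```cpp
#include <algorithm>
#include <cmath>

// Assumed normalization of the yaw/pitch differences into blend inputs.
float FlyTurnAccX(float Y, float Ymax) {         // T_yaw
    return std::clamp(Y / Ymax, -1.0f, 1.0f);
}
float FlyTurnAccY(float P, float Pmax) {         // T_pitch
    return std::clamp(P / Pmax, -1.0f, 1.0f);
}
// Assumed turn-start leg amplitude: grows with the yaw difference, so a
// larger target yaw produces a larger leg swing, as the text describes.
float FlyTurnStartLegAcc(float Y, float Ymax) {  // S_leg
    return std::clamp(std::fabs(Y) / Ymax, 0.0f, 1.0f);
}
```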
The flight-turn process also adjusts the swing of the arms and legs according to how long the turn in a given direction has lasted. When the turn begins, its duration is recorded, and from the turning duration and the angle difference the program computes the arm-motion amplitude Fly_Hand_Leg_Acc_X (written $F_{\mathrm{hand}}$) and the leg-motion amplitude Fly_Hand_Leg_Acc_Y (written $F_{\mathrm{leg}}$). Let the duration of the current turn be $t_i$; the arm amplitude is updated by interpolation each frame:

$$F_{\mathrm{hand}}^{(i)} = F_{\mathrm{hand}}^{(i-1)} + f_{\mathrm{hand}}\left(C(t_i) - F_{\mathrm{hand}}^{(i-1)}\right)$$

where $F_{\mathrm{hand}}^{(i-1)}$ is the arm amplitude value of the previous frame, $C(t_i)$ is the current time-curve value, and $f_{\mathrm{hand}}$ is an interpolation factor.

The leg amplitude is updated analogously, with the curve sampled after a delay:

$$F_{\mathrm{leg}}^{(i)} = F_{\mathrm{leg}}^{(i-1)} + f_{\mathrm{leg}}\left(C'(t_i - f_d) - F_{\mathrm{leg}}^{(i-1)}\right)$$

where $F_{\mathrm{leg}}^{(i-1)}$ is the leg amplitude value of the previous frame, $C'(t_i)$ is the current time-curve value, $f_d$ is the leg-motion delay factor, and $f_{\mathrm{leg}}$ is the interpolation factor.
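A direct C++ transcription of the two update rules above; the interpolation form is reconstructed from the symbol descriptions, and how the delay factor enters the curve sample is an assumption:

```cpp
// Arm amplitude: advance the previous frame's value toward the current
// time-curve value C(t_i) by the interpolation factor f_hand.
float NextArmAmplitude(float prevAmplitude, float curveValue, float fHand) {
    return prevAmplitude + fHand * (curveValue - prevAmplitude);
}

// Leg amplitude: the same interpolation, with the curve sampled after the
// leg-motion delay (how f_d enters the sample time is an assumption).
float NextLegAmplitude(float prevAmplitude, float delayedCurveValue, float fLeg) {
    return prevAmplitude + fLeg * (delayedCurveValue - prevAmplitude);
}
```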
During the flight turn, the torso animation direction of the target object can also be adjusted according to the different target headings, making the turn more realistic; fig. 5 is a schematic diagram of the target object turning left.

When the arm animations are blended, a blend space can be built from the left-lean and right-lean animations of the upper-body arms, and the transition is overlaid onto the basic flight animation during the first half of the turn-start animation, so that the upper body shows a tendency to steer ahead of the turn. The upper body's steering response is comparatively immediate, and it needs no buffering motion at the end of the turn.

When the leg animations are blended, the lower body is responsible for keeping the character balanced during the turn. Following the laws of human motion, the turn is split into two stages: the turn-start stage and the turn-sustain stage. The two legs swing with matching amplitudes in the blend spaces of the two stages, the swings adding to and cancelling each other.

In the turn-start stage, the lower body first plays the start_leg blend space, i.e. the legs swing opposite to the turning direction to keep the body balanced; as shown in fig. 6, when the target object turns left, the right leg swings upward to maintain balance. In the turn-sustain stage, the legs swing toward the original turning direction to buffer the end of the turn. In a specific implementation, the offset timing can be tuned through the blending so that the lower body's leg-swing timing matches the rhythm of body control during the turn, making the motion smoother.

The method effectively solves the problem of character flight animation being stiff and insufficiently smooth. It replaces the traditional whole-animation production mode with a single-frame, slice-based production mode, shortening the production cycle, cutting cost while raising efficiency, and meeting the performance requirements of low-end devices.
Corresponding to the above method embodiment, fig. 7 shows a device for generating a flight animation of a target object. A graphical user interface is provided through a terminal device, and a scene picture of a virtual scene is displayed in the graphical user interface, the virtual scene including the target object. The device comprises:

the flight control instruction acquisition module 702, configured to acquire a flight control instruction corresponding to the target object and the current posture of the target object, the target object comprising a plurality of body parts;

the adjustment parameter determination module 704, configured to determine, based on the flight control instruction, a target adjustment part among the plurality of body parts and adjustment parameters of the target adjustment part;

the flight animation generation module 706, configured to perform animation blending on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object; the preset animation resources comprise animation frames representing the target object in a preset posture or a preset motion state.

The above device for generating a flight animation of a target object acquires the flight control instruction corresponding to the target object and the current posture of the target object, the target object comprising a plurality of body parts; determines, based on the flight control instruction, a target adjustment part among the plurality of body parts and adjustment parameters of the target adjustment part; and performs animation blending on preset animation resources, based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object, the preset animation resources comprising animation frames representing the target object in a preset posture or a preset motion state. In this way, a more lifelike flight animation effect is produced while using fewer system resources, improving the player's visual experience.
The target object corresponds to a plurality of flight states; the flight state corresponds to a preset body part to be adjusted; the adjustment parameter determination module is further configured to: responding to the flight control instruction to control the target object to be in a target flight state, and determining a body part to be adjusted corresponding to the target flight state as a target adjustment part; the target adjusting part corresponds to a preset posture adjusting mode; the preset gesture adjustment mode represents the change trend of the gesture of the target adjustment part in the continuous process of the target flight state; and determining the adjustment parameters of the target adjustment part based on the flight control instruction and the preset posture adjustment mode of the target adjustment part.
The adjustment parameters comprise a current gesture adjustment mode and a gesture adjustment amplitude; the adjustment parameter determination module is further configured to: determining a current posture adjustment mode of the target adjustment part based on the control time corresponding to the flight control instruction and a preset posture adjustment mode; the current gesture adjustment mode is used for representing the change trend of the gesture of the target adjustment part in the control time; and determining the gesture adjustment amplitude of the plurality of target adjustment positions based on the control force corresponding to the flight control instruction and the current gesture adjustment mode.
The flight animation comprises a plurality of target animation frames arranged in a set order. The flight animation generation module is further configured to perform, for each target animation frame in the arrangement order: determining a mixing space parameter corresponding to the target animation frame based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames; and performing animation mixing processing on the preset animation resource based on the mixing space parameter corresponding to the target animation frame and the current posture of the target object, to generate the target animation frame.
The flight animation generation module is further configured to: determine a current posture adjustment parameter corresponding to the target adjustment part in the target animation frame, based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames; and determine the mixing space parameter corresponding to the target animation frame based on the current posture adjustment parameter of the target adjustment part.
The adjustment parameters comprise a current posture adjustment mode and a posture adjustment amplitude, the current posture adjustment mode representing the change trend of the posture of the target adjustment part within the control time corresponding to the flight control instruction. The flight animation generation module is further configured to: determine, based on the arrangement position of the target animation frame among the plurality of target animation frames, a target moment corresponding to the target animation frame within the control time corresponding to the flight control instruction; and determine the current posture adjustment parameter corresponding to the target adjustment part in the target animation frame based on the target moment, the current posture adjustment mode corresponding to the target adjustment part, and the posture adjustment amplitude.
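Putting the last three paragraphs together, a sketch of the per-frame computation: each target animation frame's arrangement position is mapped to a target moment inside the control time, the current posture adjustment parameter is evaluated at that moment, and the result is normalized into a mixing space parameter. Here the mixing space parameter is a single blend weight, and the normalization is an invented choice:

```python
def mixing_space_parameters(instruction, current_mode, amplitude, frame_count):
    weights = []
    for i in range(frame_count):
        # Target moment: the frame's position mapped into the control time.
        t = instruction.control_time * i / max(frame_count - 1, 1)
        # Current posture adjustment parameter at that moment, in degrees.
        adjustment = current_mode(t) * amplitude
        # Mixing space parameter: normalize [-amplitude, amplitude] -> [0, 1].
        weight = 0.5 if amplitude == 0 else 0.5 + adjustment / (2.0 * amplitude)
        weights.append(weight)
    return weights
```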
The target object corresponds to a plurality of flight states; the preset animation resource comprises a plurality of animation key frames of the target object, each animation key frame corresponding to a preset flight state. The flight animation generation module is further configured to: control, in response to the flight control instruction, the target object to be in a target flight state, and determine the animation key frame corresponding to the target flight state among the plurality of animation key frames of the target object as a target key frame; and perform, based on the mixing space parameter corresponding to the target animation frame, animation mixing processing on the target key frame and the animation frame corresponding to the current posture of the target object, to generate the target animation frame.
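Finally, a sketch of the blend itself, with poses simplified to per-part angle dictionaries; a real engine would blend full bone transforms, so the per-part linear interpolation is an illustrative stand-in for the engine's animation mixing:

```python
def generate_target_frame(current_pose, keyframes, flight_state, parts, weight):
    # Target key frame: the key frame preset for the target flight state.
    key_pose = keyframes[flight_state]
    frame = dict(current_pose)  # parts not being adjusted keep their posture
    for part in parts:
        a, b = current_pose[part], key_pose[part]
        frame[part] = a + (b - a) * weight  # blend by the mixing space weight
    return frame

# Example: blend the wings halfway toward an invented "glide" key frame.
pose = {"left_wing": 0.0, "right_wing": 0.0, "tail": 5.0}
keys = {"glide": {"left_wing": 30.0, "right_wing": 30.0, "tail": 5.0}}
print(generate_target_frame(pose, keys, "glide", ["left_wing", "right_wing"], 0.5))
```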
The present embodiment also provides an electronic device including a processor and a memory, where the memory stores machine-executable instructions executable by the processor, and the processor executes the machine-executable instructions to implement the method for generating a flight animation of a target object described above, for example:
acquiring a flight control instruction corresponding to a target object and a current posture of the target object, the target object comprising a plurality of body parts; determining a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, based on the flight control instruction; and performing animation mixing processing on a preset animation resource based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate a flight animation of the target object, the preset animation resource comprising animation frames representing the target object in a preset posture or a preset motion state.
This method generates a more lifelike flight animation effect while using fewer system resources, improving the player's visual experience.
Optionally, the target object corresponds to a plurality of flight states, and each flight state corresponds to a preset body part to be adjusted. The step of determining a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, based on the flight control instruction comprises: controlling, in response to the flight control instruction, the target object to be in a target flight state, and determining the body part to be adjusted corresponding to the target flight state as the target adjustment part, where the target adjustment part corresponds to a preset posture adjustment mode, and the preset posture adjustment mode represents the change trend of the posture of the target adjustment part while the target flight state persists; and determining the adjustment parameters of the target adjustment part based on the flight control instruction and the preset posture adjustment mode of the target adjustment part.
Optionally, the adjustment parameters comprise a current posture adjustment mode and a posture adjustment amplitude. Determining the adjustment parameters of the target adjustment part based on the flight control instruction and the preset posture adjustment mode of the target adjustment part comprises: determining the current posture adjustment mode of the target adjustment part based on the control time corresponding to the flight control instruction and the preset posture adjustment mode, where the current posture adjustment mode represents the change trend of the posture of the target adjustment part within the control time; and determining the posture adjustment amplitude of the target adjustment part based on the control force corresponding to the flight control instruction and the current posture adjustment mode.
Optionally, the flight animation comprises a plurality of target animation frames arranged in a set order. Performing animation mixing processing on the preset animation resource based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate the flight animation of the target object, comprises performing, for each target animation frame in the arrangement order: determining a mixing space parameter corresponding to the target animation frame based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames; and performing animation mixing processing on the preset animation resource based on the mixing space parameter corresponding to the target animation frame and the current posture of the target object, to generate the target animation frame.
Optionally, the step of determining the mixing space parameter corresponding to the target animation frame based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames comprises: determining a current posture adjustment parameter corresponding to the target adjustment part in the target animation frame, based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames; and determining the mixing space parameter corresponding to the target animation frame based on the current posture adjustment parameter of the target adjustment part.
Optionally, the adjustment parameters comprise a current posture adjustment mode and a posture adjustment amplitude, the current posture adjustment mode representing the change trend of the posture of the target adjustment part within the control time corresponding to the flight control instruction. Determining the current posture adjustment parameter corresponding to the target adjustment part in the target animation frame, based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames, comprises: determining, based on the arrangement position of the target animation frame among the plurality of target animation frames, a target moment corresponding to the target animation frame within the control time corresponding to the flight control instruction; and determining the current posture adjustment parameter corresponding to the target adjustment part in the target animation frame based on the target moment, the current posture adjustment mode corresponding to the target adjustment part, and the posture adjustment amplitude.
Optionally, the target object corresponds to a plurality of flight states; the preset animation resource comprises a plurality of animation key frames of the target object, each animation key frame corresponding to a preset flight state. Performing animation mixing processing on the preset animation resource based on the mixing space parameter corresponding to the target animation frame and the current posture of the target object, to generate the target animation frame, comprises: controlling, in response to the flight control instruction, the target object to be in a target flight state, and determining the animation key frame corresponding to the target flight state among the plurality of animation key frames of the target object as a target key frame; and performing, based on the mixing space parameter corresponding to the target animation frame, animation mixing processing on the target key frame and the animation frame corresponding to the current posture of the target object, to generate the target animation frame.
Referring to fig. 8, the electronic device includes a processor 100 and a memory 101, the memory 101 storing machine-executable instructions that can be executed by the processor 100, the processor 100 executing the machine-executable instructions to implement the above-described target object flight animation generation method.
Further, the electronic device shown in fig. 8 further includes a bus 102 and a communication interface 103, and the processor 100, the communication interface 103, and the memory 101 are connected through the bus 102.
The memory 101 may include a high-speed random access memory (RAM, Random Access Memory), and may further include a non-volatile memory, such as at least one magnetic disk memory. The communication connection between the system network element and at least one other network element is implemented via at least one communication interface 103 (which may be wired or wireless), and may use the Internet, a wide area network, a local area network, a metropolitan area network, and the like. The bus 102 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be classified into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one bi-directional arrow is shown in fig. 8, but this does not mean that there is only one bus or one type of bus.
The processor 100 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 100 or by instructions in the form of software. The processor 100 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU for short), a network processor (Network Processor, NP for short), and the like; it may also be a digital signal processor (Digital Signal Processor, DSP for short), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC for short), a field-programmable gate array (Field-Programmable Gate Array, FPGA for short) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, and the like. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, a register, or another storage medium well known in the art. The storage medium is located in the memory 101, and the processor 100 reads the information in the memory 101 and, in combination with its hardware, performs the steps of the method of the foregoing embodiment.
The present embodiment also provides a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the method for generating a flight animation of a target object described above.
The method, the device and the electronic equipment for generating the flight animation of the target object provided by the embodiments of the present invention include a computer-readable storage medium storing program code, where the program code includes instructions for executing the method of the foregoing method embodiment, including each of the optional implementations listed above for the electronic device; the specific implementation is described in the method embodiment and is not repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
In addition, in the description of the embodiments of the present invention, unless explicitly stated and limited otherwise, the terms "mounted", "connected", and "coupled" should be construed broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific circumstances.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part thereof contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disk.
In the description of the present invention, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above embodiments are only specific implementations of the present invention, used to illustrate the technical solutions of the present invention rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the technical field may still modify the technical solutions described in the foregoing embodiments, or readily conceive of changes, or make equivalent substitutions of some of the technical features within the technical scope disclosed by the present invention; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A method for generating a flight animation of a target object, the method comprising:
acquiring a flight control instruction corresponding to a target object and a current posture of the target object; the target object comprises a plurality of body parts;
determining a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, based on the flight control instruction;
performing animation mixing processing on a preset animation resource based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate a flight animation of the target object; the preset animation resource comprises animation frames representing the target object in a preset posture or a preset motion state.
2. The method of claim 1, wherein the target object corresponds to a plurality of flight states; each flight state corresponds to a preset body part to be adjusted;
the step of determining a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, based on the flight control instruction comprises:
controlling, in response to the flight control instruction, the target object to be in a target flight state, and determining the body part to be adjusted corresponding to the target flight state as the target adjustment part; the target adjustment part corresponds to a preset posture adjustment mode; the preset posture adjustment mode represents the change trend of the posture of the target adjustment part while the target flight state persists;
and determining the adjustment parameters of the target adjustment part based on the flight control instruction and the preset posture adjustment mode of the target adjustment part.
3. The method of claim 2, wherein the adjustment parameters comprise a current posture adjustment mode and a posture adjustment amplitude;
the step of determining the adjustment parameters of the target adjustment part based on the flight control instruction and the preset posture adjustment mode of the target adjustment part comprises:
determining the current posture adjustment mode of the target adjustment part based on the control time corresponding to the flight control instruction and the preset posture adjustment mode; the current posture adjustment mode represents the change trend of the posture of the target adjustment part within the control time;
and determining the posture adjustment amplitude of the target adjustment part based on the control force corresponding to the flight control instruction and the current posture adjustment mode.
4. The method of claim 1, wherein the flight animation comprises a plurality of target animation frames arranged in a set order;
the step of performing animation mixing processing on a preset animation resource based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate a flight animation of the target object, comprises:
performing, for each target animation frame in the arrangement order of the plurality of target animation frames, the following operations:
determining a mixing space parameter corresponding to the target animation frame based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames;
and performing animation mixing processing on the preset animation resource based on the mixing space parameter corresponding to the target animation frame and the current posture of the target object, to generate the target animation frame.
5. The method of claim 4, wherein the step of determining the mixing space parameter corresponding to the target animation frame based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames comprises:
determining a current posture adjustment parameter corresponding to the target adjustment part in the target animation frame, based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames;
and determining the mixing space parameter corresponding to the target animation frame based on the current posture adjustment parameter of the target adjustment part.
6. The method of claim 5, wherein the adjustment parameters comprise a current posture adjustment mode and a posture adjustment amplitude; the current posture adjustment mode represents the change trend of the posture of the target adjustment part within the control time corresponding to the flight control instruction;
the step of determining the current posture adjustment parameter corresponding to the target adjustment part in the target animation frame, based on the adjustment parameters of the target adjustment part and the arrangement position of the target animation frame among the plurality of target animation frames, comprises:
determining, based on the arrangement position of the target animation frame among the plurality of target animation frames, a target moment corresponding to the target animation frame within the control time corresponding to the flight control instruction;
and determining the current posture adjustment parameter corresponding to the target adjustment part in the target animation frame based on the target moment, the current posture adjustment mode corresponding to the target adjustment part, and the posture adjustment amplitude.
7. The method of claim 4, wherein the target object corresponds to a plurality of flight states; the preset animation resource comprises a plurality of animation key frames of the target object; each animation key frame corresponds to a preset flight state;
the step of performing animation mixing processing on the preset animation resource based on the mixing space parameter corresponding to the target animation frame and the current posture of the target object, to generate the target animation frame, comprises:
controlling, in response to the flight control instruction, the target object to be in a target flight state, and determining the animation key frame corresponding to the target flight state among the plurality of animation key frames of the target object as a target key frame;
and performing, based on the mixing space parameter corresponding to the target animation frame, animation mixing processing on the target key frame and the animation frame corresponding to the current posture of the target object, to generate the target animation frame.
8. A flight animation generation device for a target object, the device comprising:
a flight control instruction acquisition module, configured to acquire a flight control instruction corresponding to a target object and a current posture of the target object; the target object comprises a plurality of body parts;
an adjustment parameter determination module, configured to determine a target adjustment part among the plurality of body parts, and adjustment parameters of the target adjustment part, based on the flight control instruction;
a flight animation generation module, configured to perform animation mixing processing on a preset animation resource based on the current posture of the target object and the adjustment parameters of the target adjustment part, to generate a flight animation of the target object; the preset animation resource comprises animation frames representing the target object in a preset posture or a preset motion state.
9. An electronic device comprising a processor and a memory, the memory storing machine-executable instructions executable by the processor, the processor executing the machine-executable instructions to implement the method for generating a flight animation of a target object according to any one of claims 1-7.
10. A machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the method for generating a flight animation of a target object according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211551588.0A CN116385605A (en) | 2022-12-05 | 2022-12-05 | Method and device for generating flight animation of target object and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211551588.0A CN116385605A (en) | 2022-12-05 | 2022-12-05 | Method and device for generating flight animation of target object and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116385605A true CN116385605A (en) | 2023-07-04 |
Family
ID=86971816
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211551588.0A Pending CN116385605A (en) | 2022-12-05 | 2022-12-05 | Method and device for generating flight animation of target object and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116385605A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN117788650A (en) * | 2024-02-27 | 2024-03-29 | 腾讯科技(深圳)有限公司 | Data processing method, device, electronic equipment, storage medium and program product |
CN117788650B (en) * | 2024-02-27 | 2024-06-07 | 腾讯科技(深圳)有限公司 | Data processing method, device, electronic equipment, storage medium and program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||