CN112802163A - Animation adjusting method and device in game and electronic terminal - Google Patents

Animation adjusting method and device in game and electronic terminal

Info

Publication number
CN112802163A
Authority
CN
China
Prior art keywords
attack
virtual object
animation
default
game
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110156974.9A
Other languages
Chinese (zh)
Other versions
CN112802163B (en)
Inventor
谭咏
杜志荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110156974.9A
Publication of CN112802163A
Application granted
Publication of CN112802163B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides an animation adjustment method and apparatus in a game, and an electronic terminal. It relates to the technical field of games and addresses the technical problem that the attack action of a virtual object lacks realism. The method comprises the following steps: in response to an attack instruction directed at the virtual object, determining a target attack position corresponding to the attack instruction in the game scene; determining transformation information of a bone connected to an attack end of the virtual object according to a default attack position of the attack end and the target attack position, where the default attack position is the position at which the attack end completes a default attack animation; and displaying, in the graphical user interface, an animation of the attack end hitting the target attack position based on the transformation information of the bone.

Description

Animation adjusting method and device in game and electronic terminal
Technical Field
The present application relates to the field of game technologies, and in particular, to a method and an apparatus for adjusting an animation in a game, and an electronic terminal.
Background
An animation montage (abbreviated as montage) is a versatile art tool with which various animation effects can be realized, including intelligent looping of animations and logic-based animation switching. In general, the actions of a virtual object in a game are realized by animations produced in advance.
At present, the animation of a virtual object's attack action is realized by applying a fixed animation template, so attack actions such as strikes can appear unrealistic during gameplay. For example, it often happens that the fist of the virtual object does not actually reach the attacked position of the target, yet the virtual object is still judged to have hit the target, so the attack action appears severely distorted.
Disclosure of Invention
The application aims to provide an animation adjustment method and apparatus in a game, and an electronic terminal, so as to alleviate the technical problem that the attack action of a virtual object lacks realism.
In a first aspect, an embodiment of the present application provides an animation adjustment method in a game, where a terminal provides a graphical user interface, and a game scene of the game includes a virtual object that executes an attack action; the method comprises the following steps:
in response to an attack instruction directed at the virtual object, determining a target attack position corresponding to the attack instruction in the game scene;
determining transformation information of a bone connected to an attack end of the virtual object according to a default attack position of the attack end and the target attack position, where the default attack position is the position at which the attack end completes a default attack animation;
displaying, in the graphical user interface, an animation of the attack end hitting the target attack position based on the transformation information of the bone.
In one possible implementation, the step of determining transformation information of a bone connected to an attack end of the virtual object according to a default attack position of the attack end of the virtual object and the target attack position includes:
inversely calculating, using inverse kinematics (IK), the transformation information of the bone connected to the attack end of the virtual object, based on the process of the attack end moving from the default attack position to the target attack position.
In one possible implementation, the IK includes any one or more of:
Two Bone IK (two-bone inverse kinematics), FABRIK (Forward And Backward Reaching Inverse Kinematics), CCDIK (Cyclic Coordinate Descent Inverse Kinematics), and Spline IK (spline inverse kinematics).
In one possible implementation, the step of inversely calculating, using IK, the transformation information of the bone connected to the attack end of the virtual object based on the process of the attack end moving from the default attack position to the target attack position includes:
determining an IK fusion weight between the IK and an original pose of the virtual object based on a specified IK fusion curve;
and inversely calculating, using the IK and according to the IK fusion weight, the transformation information of the bone connected to the attack end of the virtual object, based on the process of the attack end moving from the default attack position to the target attack position.
In one possible implementation, the method further comprises:
generating the specified IK fusion curve based on a plurality of IK fusion parameters in response to an editing operation on the IK fusion parameters.
In one possible implementation, the IK fusion parameters include any one or more of:
0, 1, and values between 0 and 1.
In one possible implementation, 1 indicates that the weight of fusion with the IK is 1, and 0 indicates that the weight of fusion with the IK is 0.
In one possible implementation, the attack end includes any one or more of:
hand, foot, leg joint, arm joint, head.
In one possible implementation, the bone in the virtual object connected to the attack end includes any one or more of:
arm bones, leg bones, neck bones, and torso bones.
In one possible implementation, when the IK fusion parameter is 1, the arms, legs, neck, and/or torso of the virtual object are in a fully straightened state.
In a second aspect, an animation adjusting device in a game is provided, wherein a terminal provides a graphical user interface, and a game scene of the game comprises a virtual object for executing an attack action; the device comprises:
the first determination module is used for responding to an attack instruction aiming at the virtual object and determining a target attack position corresponding to the attack instruction in the game scene;
a second determining module, configured to determine, according to a default attack position of an attack end of the virtual object and the target attack position, transformation information of a skeleton connected to the attack end in the virtual object, where the default attack position is a position where the attack end completes a default attack animation;
a display module, configured to display, in the graphical user interface, an animation of the attack end hitting the target attack position based on the transformation information of the bone.
In a third aspect, an embodiment of the present application further provides an electronic terminal, which includes a memory and a processor, where the memory stores a computer program that is executable on the processor, and the processor executes the computer program to implement the method in the first aspect.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to perform the method of the first aspect.
The embodiment of the application brings the following beneficial effects:
the method, the device and the electronic terminal for adjusting the animation in the game provided by the embodiment of the application can respond to the attack instruction aiming at the virtual object, determine the target attack position corresponding to the attack instruction in the game scene, then determine the transformation information of the skeleton connected with the attack end part in the virtual object according to the default attack position and the target attack position of the attack end part of the virtual object, wherein the default attack position is the position where the attack end part completes the default attack animation, and further display the animation of the attack end part hitting the target attack position in the graphical user interface based on the transformation information of the skeleton. In the scheme, the conversion information of the skeleton connected with the attack end part in the virtual object is determined according to the default attack position of the attack end part of the virtual object and the target attack position in the attack, so that the animation effect that the attack end part really hits the target attack position can be further displayed based on the skeleton conversion information, the accuracy of the attack judgment effect of the virtual object is improved, the more real attack sense is realized, and the technical problem that the attack action of the current virtual object lacks the sense of reality is solved.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the embodiments of the present application or the prior-art solutions more clearly, the drawings needed in their description are briefly introduced below. The drawings described below show some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an example of an electronic terminal according to an embodiment of the present application;
fig. 3 is a schematic view of a usage scenario of an electronic terminal according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a method for adjusting animation in a game according to an embodiment of the present application;
FIG. 5 is a schematic diagram of bone transformation information in an animation adjustment method in a game according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an edit window of a method for adjusting animation in a game according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another editing window of an animation adjusting method in a game according to an embodiment of the present application;
FIG. 8 is a diagram illustrating a designated IK fusion curve of an animation adjustment method in a game according to an embodiment of the present application;
FIG. 9 is a schematic diagram of another editing window of an animation adjusting method in a game according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an animation adjusting device in a game according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprising" and "having," and any variations thereof, as referred to in the embodiments of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
At present, the animation of a virtual object's attack action is realized by applying a fixed animation template, so the attack effect of actions such as strikes looks unrealistic when the game runs. For example, in many AAA action games the damage range of a virtual object's melee attack is bound to a specific attack location point in order to approximate a real hit judgment point. In an actual game, however, the damage range deviates slightly from the preset area, and it often happens that the virtual object's fist does not reach the attacked position of the target yet the hit is still registered, so the accuracy of the virtual object's hit judgment is low.
An existing way to mitigate this problem requires designers to produce additional animation templates, for example adding at least two variants (an upward action and a downward action) to the melee-attack template of the virtual object and then selecting which template to use according to the specific target attack position. However, when the target attack position varies continuously, ever more animation templates must be created, which greatly increases production cost; moreover, the animation of the attack action cannot be adjusted adaptively, so the realism of the virtual object's attack action remains low.
Based on this, embodiments of the present application provide an animation adjustment method and apparatus in a game, and an electronic terminal, which can alleviate the technical problem that the attack action of a virtual object lacks realism.
The animation adjusting method in the game in the embodiment of the application can be applied to the electronic terminal. Wherein the electronic terminal comprises a display for presenting a graphical user interface, an input device and a processor. The input device may be a keyboard, mouse, touch screen, or the like for receiving operations directed to the graphical user interface.
In practical application, the electronic terminal may be a computer device, or may also be a touch terminal such as a touch screen mobile phone and a tablet computer. As an example, the electronic terminal is a touch terminal, and the display and the input device thereof may be integrated into a touch screen for presenting and receiving operations for a graphical user interface.
In some embodiments, when the electronic terminal operates the graphical user interface, the graphical user interface may be used to operate content local to the electronic terminal, and may also be used to operate content of the peer server.
For example, as shown in fig. 1, fig. 1 is a schematic view of an application scenario provided in the embodiment of the present application. The application scenario may include an electronic terminal (e.g., a cell phone 102) and a server 101, and the electronic terminal may communicate with the server 101 through a wired network or a wireless network. The electronic terminal is used for running a virtual desktop, and can interact with the server 101 through the virtual desktop to operate the content in the server 101.
The electronic terminal of this embodiment is described by taking the mobile phone 102 as an example. The handset 102 includes Radio Frequency (RF) circuitry 110, memory 120, a touch screen 130, a processor 140, and the like. Those skilled in the art will appreciate that the handset configuration shown in fig. 2 is not limiting: the handset may include more or fewer components than shown, combine certain components, split certain components, or arrange components differently. Those skilled in the art will also appreciate that the touch screen 130 is part of the User Interface (UI), and that the handset 102 may include more or fewer user-interface components than illustrated.
The RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 120 may be used to store software programs and modules, and the processor 140 executes various functional applications and data processing of the handset 102 by executing the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the stored data area may store data created from use of the handset 102, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The touch screen 130 may be used to display a graphical user interface and receive user operations directed at it. Specifically, the touch screen 130 may include a display panel and a touch panel. The display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may collect contact or contactless operations of a user on or near it (for example, as shown in fig. 3, operations performed with any suitable object or accessory such as a finger 103 or a stylus) and generate preset operation instructions. The touch panel may include a touch detection device and a touch controller: the touch detection device detects the direction and gesture of the user's touch, detects the signals produced by the touch operation, and transmits them to the touch controller; the touch controller converts the touch information into information the processor can handle, sends it to the processor 140, and receives and executes commands sent back by the processor 140. The touch panel may be implemented using resistive, capacitive, infrared, surface-acoustic-wave, or other technologies, including technologies developed in the future. Further, the touch panel may cover the display panel: the user operates on or near the touch panel according to the graphical user interface displayed by the display panel, the touch panel detects the operation and transmits it to the processor 140 to determine the user input, and the processor 140 provides the corresponding visual output on the display panel in response.
In addition, the touch panel and the display panel can be realized as two independent components or can be integrated.
The processor 140 is the control center of the handset 102, connects various parts of the entire handset using various interfaces and lines, and performs various functions and processes of the handset 102 by running or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the handset.
Embodiments of the present application are further described below with reference to the accompanying drawings.
Fig. 4 is a schematic flowchart of an animation adjusting method in a game according to an embodiment of the present application. The method can be applied to a terminal capable of presenting a graphical user interface, the graphical user interface is provided through the terminal, and a game scene of the game comprises a virtual object for executing an attack action. As shown in fig. 4, the method includes:
step S410, responding to the attack instruction aiming at the virtual object, and determining the target attack position corresponding to the attack instruction in the game scene.
The virtual object in the embodiments of the present application may be any object that can perform an attack action in a game scene, for example a virtual character, a virtual animal, or a virtual monster. It should be noted that the target attack position is the correct position that should be hit when the virtual object performs the attack action. For example, attacked objects in the game scene may differ in size, and the virtual character may strike different positions depending on the attack instruction received: the virtual object may attack the head of the attacked object or its leg. Alternatively, the system may specify by default that the virtual object attacks the head of every attacked object; then, for attacked objects of different sizes, attacking the head of each one yields a different attack position, that is, a different target attack position.
Step S420, determining, according to the default attack position of the attack end of the virtual object and the target attack position, the transformation information of the bone connected to the attack end in the virtual object.
The default attack position is the position at which the attack end completes the default attack animation. For example, as shown in FIG. 5, there is a distance between the default attack position and the target attack position. When the game runs, a calculation can be performed based on the process of the attack end moving from the default attack position to the target attack position, and the transformation information of the bone connected to the attack end can then be derived inversely.
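The inverse derivation above can be illustrated with a minimal planar two-bone solver. This is an illustrative sketch only, not the patent's or any engine's implementation: the function name and the 2D setup are assumptions. Given the chain root (e.g. a shoulder), the two bone lengths, and the target attack position, it returns the two joint rotations that place the end effector (e.g. a fist) on the target:

```python
import math

def two_bone_ik(root, target, l1, l2):
    """Planar two-bone IK (e.g. upper arm + forearm).

    Returns (shoulder_angle, elbow_angle) in radians such that the end
    effector, the attack end (e.g. a fist), lands on `target` when the
    target is within reach; unreachable targets are clamped.
    """
    dx, dy = target[0] - root[0], target[1] - root[1]
    d = math.hypot(dx, dy)
    d = max(abs(l1 - l2), min(d, l1 + l2))       # clamp to reachable range
    # Law of cosines gives the elbow bend for the required reach distance.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(dy, dx) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Running the forward kinematics on the returned angles reproduces the target, which is exactly the "attack end reaches the target attack position" condition the method requires.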
Step S430, displaying an animation of the attack end hitting the target attack position in the graphical user interface based on the transformation information of the skeleton.
For example, as shown in fig. 6 and 7, data such as the bone transformation information at game runtime can be simulated in an editor, and the attack end can be made to track the target attack position as it moves. This produces an animation in which the attack end of the virtual object hits the target attack position more convincingly, improving the naturalness and realism of the attack action.
In the embodiments of the present application, the transformation information of the bone connected to the attack end of the virtual object is determined from the default attack position of the attack end and the target attack position of the attack, so an animation in which the attack end genuinely hits the target attack position can be displayed based on this bone transformation information, improving the accuracy of the virtual object's hit judgment and producing a more realistic sense of impact. Meanwhile, for attacked objects of different sizes, the virtual object is guaranteed to strike the target attack position that actually needs to be hit; for example, when the virtual object needs to attack the heads of different attacked objects, each attack can be made to land on the head position of that attacked object.
The above steps are described in detail below.
In some embodiments, the attack end may include different portions of the virtual object. As an example, the attack end includes any one or more of: hand, foot, leg joint, arm joint, head.
For example, as shown in fig. 5, when the attack end is a fist, the fist tracks the target attack position, producing an animation in which the virtual object's fist strikes the target attack position. Supporting different attack ends enhances the diversity and realism of the game content.
In some embodiments, inverse kinematics (IK) may be used to back-calculate the transformation information of the bone connected to the attack end. As an example, step S420 may include the following step:
step a), inversely calculating, using IK, the transformation information of the bone connected to the attack end of the virtual object, based on the process of the attack end moving from the default attack position to the target attack position.
For example, as shown in fig. 6 and fig. 7, in this embodiment an attack action is played back in Unreal Engine 4 (UE4). A Cyclic Coordinate Descent Inverse Kinematics (CCDIK) node is first dragged into the animation blueprint and its Details panel is inspected. With the CCDIK node selected, the target attack position can then be dragged arbitrarily in the editor window to update the virtual object's attack action in real time; alternatively, the CCDIK node's position can be dragged in the editor to simulate the attack end sitting at the target attack position at game runtime. Throughout this process, the transformation information of the bone connecting the virtual object to the attack end can be back-calculated using IK.
Back-calculating the transformation information of the bone connected to the attack end using IK allows the virtual object's attack action to be simulated accurately, improving the quality of the game animation.
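The CCDIK solver behind that node can be sketched in a few lines: cyclic coordinate descent visits each joint from the one nearest the attack end back to the root and rotates it so the end effector swings toward the target. This 2D sketch is illustrative only; UE4's CCDIK node additionally supports per-bone angular constraints, which are omitted here:

```python
import math

def ccd_ik(joints, target, iterations=50, tol=1e-4):
    """Cyclic coordinate descent over a 2D bone chain.

    joints: [x, y] positions from root to end effector (the attack end).
    Each sweep rotates every joint, nearest-to-effector first, so that
    the end effector swings toward `target`.
    """
    pts = [list(p) for p in joints]
    for _ in range(iterations):
        for i in range(len(pts) - 2, -1, -1):
            end = pts[-1]
            # Rotation that aligns joint_i -> effector with joint_i -> target.
            a_end = math.atan2(end[1] - pts[i][1], end[0] - pts[i][0])
            a_tgt = math.atan2(target[1] - pts[i][1], target[0] - pts[i][0])
            rot = a_tgt - a_end
            c, s = math.cos(rot), math.sin(rot)
            for j in range(i + 1, len(pts)):   # rotate the sub-chain rigidly
                x, y = pts[j][0] - pts[i][0], pts[j][1] - pts[i][1]
                pts[j] = [pts[i][0] + c * x - s * y,
                          pts[i][1] + s * x + c * y]
        if math.hypot(pts[-1][0] - target[0], pts[-1][1] - target[1]) < tol:
            break
    return pts
```

Because every update is a rigid rotation, the bone lengths are preserved while the attack end converges to the target attack position for any reachable target.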
In some embodiments, the IK may be of multiple types. As an example, the IK includes any one or more of:
Two Bone IK (two-bone inverse kinematics), FABRIK (Forward And Backward Reaching Inverse Kinematics), CCDIK (Cyclic Coordinate Descent Inverse Kinematics), and Spline IK (spline inverse kinematics).
Here, inverse kinematics (IK) derives the motion of a chain of bones (for example, from the virtual object's hand back to its shoulder) from the position of the end effector, and may include: Two Bone IK, FABRIK, CCDIK, and Spline IK.
It should be noted that FABRIK is an IK solver that can process a bone chain of arbitrary length (at least 2 segments). CCDIK is a lightweight IK solver (similar to FABRIK) typically used to drive a bone chain; it differs from FABRIK in that CCDIK can define angular constraints that limit the rotation of any bone in the solution. Spline IK generally treats a particular bone chain as control points (Control Points) and constrains those control points to a spline in the animation blueprint.
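For comparison, FABRIK can be sketched as alternating passes over the joint positions: a backward pass pins the end effector to the target, then a forward pass pins the root back in place, with each pass restoring the original bone lengths. A minimal 2D version, illustrative rather than the engine's implementation:

```python
import math

def fabrik(joints, target, iterations=30, tol=1e-5):
    """FABRIK over a 2D chain of arbitrary length (at least 2 segments)."""
    pts = [list(p) for p in joints]
    lengths = [math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]
    root = list(pts[0])
    if math.dist(root, target) > sum(lengths):
        # Unreachable: stretch the chain straight toward the target.
        for i in range(len(pts) - 1):
            t = lengths[i] / math.dist(pts[i], target)
            pts[i + 1] = [pts[i][0] + t * (target[0] - pts[i][0]),
                          pts[i][1] + t * (target[1] - pts[i][1])]
        return pts
    for _ in range(iterations):
        # Backward pass: pin the end effector to the target, walk to root.
        pts[-1] = list(target)
        for i in range(len(pts) - 2, -1, -1):
            t = lengths[i] / math.dist(pts[i + 1], pts[i])
            pts[i] = [pts[i + 1][0] + t * (pts[i][0] - pts[i + 1][0]),
                      pts[i + 1][1] + t * (pts[i][1] - pts[i + 1][1])]
        # Forward pass: pin the root back in place, walk to the end.
        pts[0] = list(root)
        for i in range(len(pts) - 1):
            t = lengths[i] / math.dist(pts[i], pts[i + 1])
            pts[i + 1] = [pts[i][0] + t * (pts[i + 1][0] - pts[i][0]),
                          pts[i][1] + t * (pts[i + 1][1] - pts[i][1])]
        if math.dist(pts[-1], target) < tol:
            break
    return pts
```

Unlike CCDIK, FABRIK works directly on joint positions instead of joint rotations, which is why it handles chains of any length cheaply.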
In some embodiments, an IK fusion weight may be determined using a specified IK fusion curve, and the transformation information of the bone connected to the attack end may be back-calculated according to the IK fusion weight. As an example, the step a) may include the following steps:
step b), determining an IK fusion weight between the IK and the original pose of the virtual object based on the specified IK fusion curve;
step c), inversely calculating, using the IK and according to the IK fusion weight, the transformation information of the bone connected to the attack end of the virtual object, based on the process of the attack end moving from the default attack position to the target attack position.
For example, as shown in fig. 8 and 9, the IK fusion weight introduced at each moment can be determined from the specified IK fusion curve, so that the blend between the virtual object's authored animation and the IK result looks more natural while the transformation information of the bone connected to the attack end is back-calculated, improving the animation quality of the overall action and the accuracy of the hit judgment.
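The fusion step amounts to interpolating each joint's transform between the authored animation pose and the IK-solved pose by the weight read from the curve. A minimal sketch, using scalar joint angles for simplicity (a real engine blends full rotations, e.g. quaternions, which is omitted here):

```python
def blend_pose(original, ik_pose, weight):
    """Blend per-joint angles between the authored pose and the IK pose.

    weight = 0 keeps the original animation unchanged;
    weight = 1 applies the full IK correction.
    """
    return [o + weight * (k - o) for o, k in zip(original, ik_pose)]
```

At weight 0 the default attack animation plays untouched, and at weight 1 the attack end is fully driven onto the target attack position; intermediate weights give the natural transition the fusion curve is designed to produce.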
Based on this, a specified IK fusion curve can be generated from the plurality of IK fusion parameters. As an example, the method further includes the steps of:
and d), responding to the editing operation aiming at the IK fusion parameters, and generating a specified IK fusion curve based on the plurality of IK fusion parameters.
For example, as shown in fig. 8, a plurality of IK fusion parameters may be defined in the attack animation and a specified IK fusion curve generated from them, so that the edited curve can be used in the animation blueprint. This allows the animation of the virtual object's attack action to be adjusted flexibly and improves its realism.
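A specified IK fusion curve built from such parameters can be modeled as sorted (time, weight) keys sampled with piecewise-linear interpolation. The key values below (no IK during wind-up, full IK held around the impact frame, eased back out) are an illustrative assumption, not values taken from the patent:

```python
import bisect

def eval_fusion_curve(keys, t):
    """Sample a piecewise-linear IK fusion curve at time t.

    keys: (time, weight) pairs sorted by time, weights in [0, 1].
    """
    times = [k for k, _ in keys]
    if t <= times[0]:
        return keys[0][1]
    if t >= times[-1]:
        return keys[-1][1]
    i = bisect.bisect_right(times, t)
    (t0, w0), (t1, w1) = keys[i - 1], keys[i]
    return w0 + (w1 - w0) * (t - t0) / (t1 - t0)

# Hypothetical curve: ramp to full IK at impact, then ease out.
impact_curve = [(0.0, 0.0), (0.2, 1.0), (0.35, 1.0), (0.5, 0.0)]
```

Sampling this curve each frame yields the IK fusion weight of step b), so the IK correction fades in and out instead of snapping on.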
In some embodiments, the fusion parameters are set within a range. As an example, the IK fusion parameters include any one or more of: 0, 1, and parameters between 0 and 1.
For example, as shown in fig. 8, the IK fusion parameters range over the interval from 0 to 1, inclusive. By adding IK fusion parameters, the animation effect of the virtual object's attack action can be made more realistic and natural, enriching the player's game experience.
In some embodiments, the IK fusion weight may be expressed as a numerical value. As an example, 1 is used to indicate that the IK result is fused with full weight, and 0 is used to indicate that the IK result is not fused at all.
For example, the fusion weight of the IK is 1 when the arm of the virtual object is fully straightened, and 0 at stages where inverse kinematics is not required. By editing the weight introduced by the IK in the attack animation, the attack action of the virtual object can be made more vivid and natural, improving the realism of the action effect.
In some embodiments, the bone connecting the virtual object to the attack end may be of multiple types. As an example, the bone in the virtual object connected to the attack end includes any one or more of: arm bones, leg bones, neck bones, and torso bones.
For example, as shown in fig. 5, the bone of the virtual object connected to the attack end is an arm bone, the attack end is a fist, the target attack position moves with the fist of the virtual object, and the fist of the virtual object pulls the arm bones as it moves. By supporting different types of connecting bones, the applicability of attack actions to different parts of the virtual object is enhanced, which can save animation production cost and reduce the workload of the art department.
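For the arm-bone example, a Two Bone IK solve can be computed analytically with the law of cosines. The 2D sketch below places the elbow and fist so the fist reaches the target, clamping unreachable targets to the arm's reach; the names and the 2D simplification are illustrative assumptions, not the patent's implementation:

```python
import math

def two_bone_ik(shoulder, l1, l2, target):
    """Analytic Two Bone IK in 2D: given the shoulder position, upper-arm
    length l1 and forearm length l2, return (elbow, fist) positions so that
    the fist reaches the target (clamped to the arm's reachable range)."""
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    d = math.hypot(dx, dy)
    # clamp the shoulder-to-target distance to the reachable interval
    d = min(max(d, abs(l1 - l2) + 1e-9), l1 + l2 - 1e-9)
    # law of cosines: angle at the shoulder between the reach line and l1
    cos_a = (l1 * l1 + d * d - l2 * l2) / (2 * l1 * d)
    a = math.acos(max(-1.0, min(1.0, cos_a)))
    base = math.atan2(dy, dx)
    elbow = (shoulder[0] + l1 * math.cos(base + a),
             shoulder[1] + l1 * math.sin(base + a))
    # the fist sits on the reach line at the (possibly clamped) distance d
    fist = (shoulder[0] + d * math.cos(base),
            shoulder[1] + d * math.sin(base))
    return elbow, fist
```

Because the solve is closed-form, it is cheap enough to run every frame as the target attack position moves with the opponent, with the fist pulling the elbow and shoulder rotations along.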
In some embodiments, the IK fusion parameters may represent specific states of different parts of the virtual object. As one example, when the IK fusion parameter is 1, the arm, leg, neck and/or torso of the virtual object is in a fully straightened state.
For example, as shown in fig. 8, when the IK fusion parameter is 1, the arm of the virtual object is fully straightened. The states of different parts of the virtual object can be represented by setting the IK fusion parameters of the specified IK fusion curve, improving the realism and detail of the virtual object's attack action.
Fig. 10 provides a schematic diagram of an animation adjusting device in a game. The terminal provides a graphical user interface, and the game scene of the game contains a virtual object for executing an attack action. As shown in fig. 10, the in-game animation adjusting apparatus 1000 includes:
a first determining module 1001, configured to determine, in response to an attack instruction for a virtual object, a target attack position corresponding to the attack instruction in a game scene;
a second determining module 1002, configured to determine, according to a default attack position and a target attack position of an attack end of a virtual object, transformation information of a skeleton connected to the attack end in the virtual object, where the default attack position is a position where the attack end completes a default attack animation;
a display module 1003 for displaying an animation of the attack end hitting the target attack position in the graphical user interface based on the transformation information of the skeleton.
In some embodiments, the second determining module 1002 is specifically configured to:
reversely calculating, by using inverse kinematics IK, the transformation information of the skeleton connected with the attack end in the virtual object, based on the process of the attack end moving from the default attack position to the target attack position.
In some embodiments, the IK comprises any one or more of:
double-bone inverse kinematics Two Bone IK, forward-and-backward-reaching inverse kinematics FABRIK, cyclic coordinate descent inverse kinematics CCDIK, and spline inverse kinematics Spline IK.
In some embodiments, the second determining module 1002 is further configured to:
determining an IK fusion weight between the IK and an original pose of the virtual object based on the specified IK fusion curve;
and reversely calculating the transformation information of the skeleton connected with the attack end in the virtual object by utilizing the IK according to the IK fusion weight based on the process of reaching the target attack position from the default attack position of the attack end of the virtual object.
In some embodiments, the apparatus further comprises:
a generation module to generate a specified IK fusion curve based on the plurality of IK fusion parameters in response to an editing operation for the IK fusion parameters.
In some embodiments, the IK fusion parameters include any one or more of: 0, 1, and parameters between 0 and 1.
In some embodiments, 1 is used to indicate that the IK result is fused with full weight, and 0 is used to indicate that the IK result is not fused at all.
In some embodiments, the attacking tip comprises any one or more of: hand, foot, leg joint, arm joint, head.
In some embodiments, the bone in the virtual object connected to the attack end includes any one or more of: arm bones, leg bones, neck bones, trunk and trunk bones.
In some embodiments, when the IK fusion parameter is 1, the arm, leg, neck and/or torso of the virtual object is in a fully straightened state.
The animation adjusting device in the game provided by the embodiment of the application has the same technical characteristics as the animation adjusting method in the game provided by the embodiment, so that the same technical problems can be solved, and the same technical effects can be achieved.
Corresponding to the animation adjusting method in the game, an embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to execute the steps of the animation adjusting method in the game.
The animation adjusting device in the game provided by the embodiments of the present application may be specific hardware on a device, or software or firmware installed on a device. The device provided by the embodiments of the present application has the same implementation principle and technical effect as the foregoing method embodiments; for the sake of brevity, for parts of the device embodiments not mentioned, reference may be made to the corresponding contents of the foregoing method embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the foregoing systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
For another example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments provided in the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the animation adjusting method in the game according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus once an item is defined in one figure, it need not be further defined and explained in subsequent figures, and moreover, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present application, used to illustrate the technical solutions of the present application rather than to limit them, and the protection scope of the present application is not limited thereto. Although the present application is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, or readily conceive of changes, or make equivalent substitutions for some of the technical features, within the technical scope disclosed in the present application; such modifications, changes or substitutions do not cause the corresponding technical solutions to depart from the scope of the embodiments of the present application, and shall all be covered within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. An animation adjustment method in a game, wherein a graphical user interface is provided through a terminal, a game scene of the game contains a virtual object for executing an attack action, and the method comprises the following steps:
responding to an attack instruction aiming at the virtual object, and determining a target attack position corresponding to the attack instruction in the game scene;
determining transformation information of a skeleton connected with an attack end in the virtual object according to a default attack position of the attack end of the virtual object and the target attack position, wherein the default attack position is a position where the attack end completes a default attack animation;
displaying an animation of the attacking tip hitting the target attack location in the graphical user interface based on the transformation information of the bone.
2. The method of claim 1, wherein the step of determining transformation information of a bone connected to the attack end of the virtual object according to the target attack position and a default attack position of the attack end of the virtual object comprises:
and reversely calculating the transformation information of the bone connected with the attack end in the virtual object by utilizing inverse kinematics IK based on the process of the attack end moving from the default attack position to the target attack position.
3. The method of claim 2, wherein the IK comprises any one or more of:
double-bone inverse kinematics Two Bone IK, forward-and-backward-reaching inverse kinematics FABRIK, cyclic coordinate descent inverse kinematics CCDIK, and spline inverse kinematics Spline IK.
4. A method according to claim 2 or 3, wherein said step of calculating back the transformation information of the bone connected to said attacking tip in said virtual object using a back motion IK based on the course of said attacking tip arriving from said default attacking position to said target attacking position, comprises:
determining an IK fusion weight between IK and an original pose of the virtual object based on a specified IK fusion curve;
and reversely calculating the transformation information of the bone connected with the attack end in the virtual object by utilizing the IK according to the IK fusion weight based on the process that the attack end arrives from the default attack position to the target attack position.
5. The method of claim 4, further comprising:
generating the specified IK fusion curve based on a plurality of IK fusion parameters in response to an editing operation on the IK fusion parameters.
6. The method according to claim 5, wherein the IK fusion parameters include any one or more of:
0, 1, and parameters between 0 and 1.
7. The method according to claim 6, wherein 1 is used to indicate that the IK is used for fusion with a weight of 1, and 0 is used to indicate that the IK is used for fusion with a weight of 0.
8. The method of claim 1, wherein the attacking tip comprises any one or more of:
hand, foot, leg joint, arm joint, head.
9. The method of claim 1, wherein the bone in the virtual object to which the attacking tip is connected comprises any one or more of:
arm bones, leg bones, neck bones, and torso bones.
10. The method according to claim 7, wherein when the IK fusion parameter is 1, the arm, leg, neck and/or torso of the virtual object is in a fully straightened state.
11. An animation adjusting device in a game is characterized in that a terminal provides a graphical user interface, and a game scene of the game comprises a virtual object for executing an attack action; the device comprises:
the first determination module is used for responding to an attack instruction aiming at the virtual object and determining a target attack position corresponding to the attack instruction in the game scene;
a second determining module, configured to determine, according to a default attack position of an attack end of the virtual object and the target attack position, transformation information of a skeleton connected to the attack end in the virtual object, where the default attack position is a position where the attack end completes a default attack animation;
a display module for displaying an animation of the attacking tip hitting the target attack location in the graphical user interface based on the transformation information of the skeleton.
12. An electronic terminal comprising a memory and a processor, the memory having stored thereon a computer program operable on the processor, wherein the processor, when executing the computer program, performs the steps of the method of any of claims 1 to 10.
13. A computer readable storage medium having stored thereon computer executable instructions which, when invoked and executed by a processor, cause the processor to execute the method of any of claims 1 to 10.