CN113209618A - Control method, device, equipment and medium for a virtual character


Info

Publication number
CN113209618A
Authority
CN
China
Prior art keywords
movement, virtual character, transition, state, transition animation
Prior art date
Legal status
Granted
Application number
CN202110610049.9A
Other languages
Chinese (zh)
Other versions
CN113209618B (en)
Inventor
晏嘉庆 (Yan Jiaqing)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110610049.9A
Publication of CN113209618A
Application granted
Publication of CN113209618B
Legal status: Active


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525 Changing parameters of virtual cameras
    • A63F 13/5252 Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • A63F 13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game

Abstract

The application discloses a control method, apparatus, device, and medium for a virtual character, applied to the field of virtual worlds. The method comprises the following steps: displaying a first virtual character located in a virtual environment; receiving a movement control operation for controlling the first virtual character to change its movement state in the virtual environment, the movement state comprising a standing state and a traveling state; determining, in response to the movement control operation, a first transition animation frame corresponding to a first movement distance in a transition animation, the first movement distance being the movement distance of the first virtual character in a first transition phase of the change in the movement state; and playing the first transition animation frame corresponding to the first movement distance to obtain the movement animation of the first virtual character in the first transition phase. The method makes changes in the virtual character's movement state appear natural while keeping the computational load on the server low.

Description

Control method, device, equipment and medium for a virtual character
Technical Field
The present application relates to the field of virtual worlds, and in particular, to a method, an apparatus, a device, and a medium for controlling a virtual character.
Background
A virtual character can perform a series of activities in a three-dimensional virtual world; for example, the virtual character starts moving from a static state in the virtual environment, or stops moving and returns to a static state.
In the related art, two animations are first created for the virtual character: a static animation and a moving animation. When a user controls a static virtual character to start moving, the static animation is blended into the moving animation programmatically, so that the virtual character appears to transition from static to moving.
This approach may make the movement of the virtual character look unnatural.
Disclosure of Invention
The embodiments of this application provide a method, an apparatus, a device, and a medium for controlling a virtual character. The method can display the process by which the movement state of a virtual character changes in a virtual environment with a smooth animation. The technical solution is as follows.
According to an aspect of the present application, there is provided a method for controlling a virtual character, the method including:
displaying a first avatar located in a virtual environment;
receiving a movement control operation for controlling the first virtual character to change a movement state in the virtual environment, wherein the movement state comprises a standing state and a traveling state;
determining, in response to the movement control operation, a first transition animation frame corresponding to a first movement distance in a first transition animation, the first movement distance being the movement distance of the first virtual character in a first transition phase of the change in the movement state;
and playing the first transition animation frame corresponding to the first moving distance to obtain the moving animation of the first virtual character in the first transition stage.
According to another aspect of the present application, there is provided an apparatus for controlling a virtual character, the apparatus including:
a display module for displaying a first virtual character located in a virtual environment;
a receiving module, configured to receive a movement control operation, where the movement control operation is used to control the first virtual character to change a movement state in the virtual environment, where the movement state includes a standing state and a traveling state;
a control module for determining, in response to the movement control operation, a first transition animation frame corresponding to a first movement distance in a first transition animation, the first movement distance being the movement distance of the first virtual character in a first transition phase of the change in the movement state;
the display module is further configured to play the first transition animation frame corresponding to the first moving distance to obtain a moving animation of the first virtual character in the first transition stage.
In an optional design of the present application, the control module is further configured to, in response to the movement control operation, obtain a reference position point of the first virtual character in a first transition phase of a movement state change; calculating the first movement distance between the current position of the first virtual character and the reference position point; determining the first transition animation frame corresponding to the first movement distance in the first transition animation.
In an optional design of the present application, the control module is further configured to determine a target timestamp from a first transition animation curve according to the first movement distance, where the first transition animation curve is used to represent a relationship between the movement distance and the timestamp in the first transition animation; determining the first transition animation frame in the first transition animation corresponding to the target timestamp.
In an optional design of the present application, the control module is further configured to determine a starting position point of the first avatar as the reference position point in a first transition phase in which the first avatar changes from the standing state to the traveling state in response to the movement control operation.
In an optional design of the present application, the control module is further configured to calculate, in response to the movement control operation, a predicted stopping point according to a travel parameter of the first avatar in a first transition phase of the first avatar changing from the travel state to the standing state, where the travel parameter includes at least one of a speed magnitude, a speed direction, a friction force, an air resistance, and an acceleration; determining the predicted stopping point as the reference position point.
In an optional design of the present application, the receiving module is further configured to receive a moving state change instruction sent by a server, where the moving state change instruction is used to control the second virtual character to change the moving state in the virtual environment;
the control module is further used for responding to the movement state change instruction, and determining a second transition animation frame corresponding to a second movement distance in a second transition animation, wherein the second movement distance is the movement distance of the second virtual character in a second transition stage of the movement state change;
the display module is further configured to play the second transition animation frame corresponding to the second moving distance to obtain a moving animation of the second virtual character in the second transition stage.
In an optional design of the present application, the control module is further configured to calculate, in response to the movement state change instruction, a second movement distance between the current position and the starting position point of the second virtual character in a second transition phase of the movement state change of the second virtual character; determining a second transition animation frame corresponding to the second moving distance in a second transition animation curve of the second transition animation, wherein the second transition animation curve is used for representing the relation between the moving distance and the time stamp in the second transition animation.
In an optional design of the present application, the control module is further configured to calculate, in response to the movement state change instruction, the second movement distance between the current position of the second virtual character and the predicted stop point of the second virtual character in a second transition phase of the movement state change of the second virtual character; determining the second transition animation frame corresponding to the second movement distance in the second transition animation curve of the second transition animation.
According to another aspect of the present application, there is provided a computer device including: a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, the at least one instruction, the at least one program, set of codes, or set of instructions being loaded and executed by the processor to implement the control method of the virtual character as described above.
According to another aspect of the present application, there is provided a computer storage medium having at least one program code stored therein, the program code being loaded and executed by a processor to implement the control method of a virtual character as described above.
According to another aspect of the application, a computer program product or a computer program is provided, comprising computer instructions, which are stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the control method of the virtual character as described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
when the virtual character changes its movement state, the corresponding transition animation frames are played according to the movement distance of the virtual character in the virtual environment, and playing these transition animation frames in sequence yields the complete movement animation. When the change in the virtual character's movement state is displayed, the animation and the displacement of the first virtual character match, the 'sliding' phenomenon is avoided, and the transition appears more natural and smooth, enhancing realism. In addition, when the server and the client exchange data, only a small instruction and the information of a position point need to be sent, which reduces the volume of data exchanged between the client and the server. Moreover, the computation on the character model of the virtual character is performed on the client instead of the server, which reduces the computational load on the server.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for controlling a virtual character according to an exemplary embodiment of the present application;
FIG. 3 is an interface diagram illustrating a method for controlling a virtual character according to an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of a camera model provided by an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a first transitional animation frame provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a first transitional animation frame provided by an exemplary embodiment of the present application;
FIG. 7 is a flowchart illustrating a method for controlling a virtual character according to an exemplary embodiment of the present application;
FIG. 8 is a diagram of a transition animation curve provided by an exemplary embodiment of the present application;
FIG. 9 is a diagram of a transition animation curve provided by an exemplary embodiment of the present application;
FIG. 10 is a flowchart illustrating a method for controlling a virtual character according to an exemplary embodiment of the present application;
FIG. 11 is a flowchart illustrating a method for controlling a virtual character according to an exemplary embodiment of the present application;
FIG. 12 is a flowchart illustrating a method for controlling a virtual character according to an exemplary embodiment of the present application;
FIG. 13 is a schematic diagram of a control device for a virtual character provided in an exemplary embodiment of the present application;
fig. 14 is a schematic structural diagram of a server according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are described:
Animation-displacement matching: the animation and the displacement of the three-dimensional model of the virtual character must match exactly to avoid the 'sliding' phenomenon. The movement of a virtual character is realized by its two feet stepping alternately; when the stepping animation of the virtual character does not exactly match its actual displacement, the feet of the virtual character appear to slide sideways, or the virtual character visibly moves faster than its feet. To make the animation and the displacement of the three-dimensional model match exactly, the model data in the animation is usually used to drive the actual displacement of the virtual character. A minimal sketch of this idea is given below.
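Illustratively, the following C++ sketch (an illustration added for clarity, not part of the original disclosure; all type and function names are hypothetical) drives the character's displacement from per-frame root-bone deltas authored into the animation, so the displacement can never outrun the step animation:

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Per-frame root displacement authored into the animation asset.
struct RootMotionTrack {
    std::vector<Vec3> frameDelta;  // root-bone displacement per animation frame
};

struct Character {
    Vec3 position{0.0f, 0.0f, 0.0f};
};

// Advance one animation frame: the animation data, not gameplay code, decides
// how far the character actually moves, so feet and displacement stay matched.
void stepCharacter(Character& c, const RootMotionTrack& track, std::size_t frame) {
    if (frame >= track.frameDelta.size()) return;
    const Vec3& d = track.frameDelta[frame];
    c.position.x += d.x;
    c.position.y += d.y;
    c.position.z += d.z;
}
```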
Animation frame: indicating a frame of image in the playing process of the animation. An animation frame is a still image. The image here includes at least one of a two-dimensional image, a 2.5-dimensional image, and a three-dimensional image.
Virtual environment: the virtual environment displayed (or provided) when an application runs on the terminal. The virtual environment may be a simulation of the real world, a semi-simulated semi-fictional three-dimensional world, or a purely fictional three-dimensional world. The virtual environment may be any one of a two-dimensional virtual world, a 2.5-dimensional virtual world, and a three-dimensional virtual world. Optionally, the virtual environment is also used for a battle between at least two virtual characters, in which virtual resources are available for use by the at least two virtual characters. Optionally, the virtual environment includes symmetric lower-left and upper-right regions; the virtual characters of two opposing camps each occupy one of the regions, with the goal of destroying a target building/site/base/crystal deep in the opposing region.
Virtual character: a movable object in a virtual world. The movable object may be at least one of a virtual human, a virtual animal, and an animation character. Optionally, when the virtual world is a three-dimensional virtual world, the virtual characters are three-dimensional models; each virtual character has its own shape and volume in the three-dimensional virtual world and occupies part of its space. Optionally, the virtual character is a three-dimensional character built on three-dimensional human-skeleton technology, and it takes on different appearances by wearing different skins. In some implementations, the virtual character can also be implemented with a 2.5-dimensional or 2-dimensional model, which is not limited in this application.
User Interface (UI) controls: refers to any visual control or element that is visible on the user interface of the application, such as controls for pictures, input boxes, text boxes, buttons, tabs, etc., some of which are responsive to user actions.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the present application is shown. The implementation environment may include: a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 has an application 111 supporting a virtual environment installed, and the application 111 may be a multiplayer online battle program. When the first terminal runs the application 111, a user interface of the application 111 is displayed on the screen of the first terminal 110. The application 111 may be any one of a first-person shooter game, a military simulation program, a MOBA (Multiplayer Online Battle Arena) game, a battle-royale shooting game, and a simulation strategy game (SLG). In the present embodiment, the application 111 is exemplified as a MOBA game. The first terminal 110 is the terminal used by the first user 112, who uses the first terminal 110 to control a first virtual character located in the virtual environment; the first virtual character may be referred to as the master virtual character of the first user 112. The activities of the first virtual character include, but are not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, flying, jumping, shooting, attacking, and releasing skills. Illustratively, the first virtual character is a virtual human, such as a simulated character or an animation character.
The second terminal 130 has an application 131 supporting a virtual environment installed, and the application 131 may be a multiplayer online battle program. When the second terminal 130 runs the application 131, a user interface of the application 131 is displayed on the screen of the second terminal 130. The application 131 may be any one of a first-person shooter game, a military simulation program, a MOBA game, a battle-royale shooting game, and an SLG game; in this embodiment, the application 131 is exemplified as a MOBA game. The second terminal 130 is the terminal used by the second user 132, who uses the second terminal 130 to control a second virtual character located in the virtual environment; the second virtual character may be referred to as the master virtual character of the second user 132. Illustratively, the second virtual character is a virtual human, such as a simulated character or an animation character.
Optionally, the first virtual character and the second virtual character are in the same virtual world. Optionally, the first virtual role and the second virtual role may belong to the same camp, the same team, the same organization, a friend relationship, or a temporary communication right. Alternatively, the first virtual character and the second virtual character may belong to different camps, different teams, different organizations, or have a hostile relationship.
Optionally, the applications installed on the first terminal 110 and the second terminal 130 are the same, or the applications on the two terminals are the same type of application on different operating system platforms (Android or iOS). The first terminal 110 may generally refer to one of a plurality of terminals, and the second terminal 130 may generally refer to another of the plurality of terminals; this embodiment is illustrated with only the first terminal 110 and the second terminal 130. The device types of the first terminal 110 and the second terminal 130 are the same or different and include at least one of a smartphone, a tablet computer, an e-book reader, a digital player, a laptop computer, and a desktop computer.
Only two terminals are shown in fig. 1, but there are a plurality of other terminals that may access the server 120 in different embodiments. Optionally, one or more terminals are terminals corresponding to the developer, a development and editing platform for supporting the application program in the virtual environment is installed on the terminal, the developer can edit and update the application program on the terminal and transmit the updated application program installation package to the server 120 through a wired or wireless network, and the first terminal 110 and the second terminal 130 can download the application program installation package from the server 120 to update the application program.
The first terminal 110, the second terminal 130, and other terminals are connected to the server 120 through a wireless network or a wired network.
The server 120 includes at least one of a server, a server cluster composed of a plurality of servers, a cloud computing platform, and a virtualization center. The server 120 is used to provide background services for applications that support a three-dimensional virtual environment. Optionally, the server 120 undertakes primary computational work and the terminals undertake secondary computational work; alternatively, the server 120 undertakes the secondary computing work and the terminal undertakes the primary computing work; alternatively, the server 120 and the terminal perform cooperative computing by using a distributed computing architecture.
In one illustrative example, the server 120 includes a memory 121, a processor 122, a user account database 123, a battle service module 124, and a user-oriented Input/Output Interface (I/O Interface) 125. The processor 122 is configured to load instructions stored in the server 120 and to process data in the user account database 123 and the battle service module 124; the user account database 123 is configured to store data of the user accounts used by the first terminal 110, the second terminal 130, and the other terminals, such as the avatar of the user account, the nickname of the user account, the rating of the user account, and the service area where the user account is located; the battle service module 124 is configured to provide a plurality of battle rooms for users to fight in, such as 1v1, 3v3, and 5v5 battles; the user-facing I/O interface 125 is configured to establish communication with the first terminal 110 and/or the second terminal 130 through a wireless network or a wired network to exchange data.
Fig. 2 is a flowchart illustrating a control method for a virtual character according to an exemplary embodiment of the present application. The method may be performed by the first terminal 110 or the second terminal 130 or a client on the first terminal 110 or a client on the second terminal 130 shown in fig. 1, and the method includes:
step 202: a first avatar located in a virtual environment is displayed.
The first virtual character is the character controlled on the local client. The first virtual character includes, but is not limited to, at least one of a virtual human, a virtual animal, and an animation character.
The virtual environment is a three-dimensional virtual environment. Optionally, the virtual environment is a two-dimensional virtual environment, or alternatively, the virtual environment is a 2.5-dimensional virtual environment.
Optionally, at least one of a movement control, a settings control, a dialog box, a volume adjustment control, a jump control, a squat control, a crosshair, a view-switching control, a recording control, and a microphone icon may also be displayed on the user interface of the client. The movement control is used to control the movement of the virtual character in the virtual world. The settings control is used to set basic parameters on the client, including but not limited to at least one of volume, brightness, contrast, picture definition, grayscale, and frame rate. The dialog box is an area where user dialogs are displayed. The volume adjustment control is used to adjust the volume of the client. The jump control is used to control the virtual character to jump. The squat control is used to control the virtual character to squat. The crosshair is used to assist the user in aiming at virtual objects in the virtual environment. The view-switching control is used to switch the perspective from which the user observes the virtual world, for example from a first-person perspective to a third-person perspective, or vice versa. The recording control is used to record the user's voice. The microphone icon is used to indicate whether the microphone is on.
Illustratively, as shown in fig. 3, other avatars 301, a move control 302, a set control 303, a volume adjustment control 304, a jump control 305, a squat control 306, a microphone icon 307 are displayed on the user interface.
Optionally, the user interface shows the view of the first virtual character as observed by the camera model in the virtual environment.
Optionally, the camera model automatically follows the virtual character in the virtual environment; that is, when the position of the virtual character in the virtual environment changes, the camera model moves with it, always remaining within a preset distance range of the virtual character in the virtual environment. Optionally, the relative positions of the camera model and the virtual character do not change during the automatic following.
The camera model is a three-dimensional model positioned around the virtual character in the virtual environment. When a first-person perspective is used, the camera model is located near or at the head of the virtual character. When a third-person perspective is used, the camera model can be located behind the virtual character and bound to it, or at any position a preset distance away from the virtual character, and the virtual character located in the virtual environment can be observed from different angles through the camera model. Optionally, when the third-person perspective is an over-the-shoulder perspective, the camera model is located behind the virtual character (for example, at the head and shoulders of the virtual character). Optionally, besides the first-person and third-person perspectives, other perspectives may be used, such as a top-down perspective; when a top-down perspective is used, the camera model may be located above the head of the virtual character, observing the virtual environment from the air. Optionally, the camera model is not actually displayed in the virtual environment, that is, the camera model does not appear in the virtual environment displayed by the user interface.
Taking as an example the case where the camera model is located at any position a preset distance away from the virtual character: optionally, one virtual character corresponds to one camera model, and the camera model can rotate with the virtual character as the rotation center. For example, the camera model rotates with any point of the virtual character as the rotation center; during the rotation, the camera model not only turns but also shifts in position, and the distance between the camera model and the rotation center remains constant. That is, the camera model moves on the surface of a sphere centered on the rotation center, where any point of the virtual character may be the head, the torso, or any point around the virtual character, which is not limited in the embodiments of the present application. Optionally, when the camera model observes the virtual character, the center of the camera model's view points from the point on the sphere where the camera model is located toward the sphere center.
Optionally, the camera model may also observe the virtual character at a preset angle in different directions of the virtual character.
Referring to fig. 4, schematically, a point in the virtual character 41 is determined as the rotation center 42, and the camera model rotates around the rotation center 42. Optionally, the camera model is configured with an initial position, which is a position above and behind the virtual character (for example, behind the head). Illustratively, as shown in fig. 4, the initial position is position 43; when the camera model rotates to position 44 or position 45, the direction of the camera model's view changes as the camera model rotates. A sketch of this spherical camera placement is given below.
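Illustratively, the following C++ sketch (added for illustration, not part of the original disclosure; names and parameters are hypothetical) computes the camera position on a sphere of constant radius around the rotation center:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// yaw/pitch are rotation angles in radians; radius is the constant distance
// kept between the camera model and the rotation center during rotation.
Vec3 cameraPosition(const Vec3& center, float yaw, float pitch, float radius) {
    return {
        center.x + radius * std::cos(pitch) * std::sin(yaw),
        center.y + radius * std::sin(pitch),
        center.z + radius * std::cos(pitch) * std::cos(yaw),
    };
}

// The view direction points from the camera position toward the rotation
// center, i.e. the (normalized) vector center - cameraPosition(...).
```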
Step 204: and receiving a movement control operation, wherein the movement control operation is used for controlling the first virtual character to change the movement state in the virtual environment, and the movement state comprises a standing state and a traveling state.
The movement control operation controls the first virtual character to change its movement state in the virtual environment, for example through the pressing of one or more preset physical keys. Alternatively, the user may perform the movement control operation through a signal generated by tapping, long-pressing, clicking, double-clicking, and/or sliding on the touch screen.
Alternatively, when the terminal is a VR (Virtual Reality) device, the movement control operation is a physical action of the user in the real environment. For example, if the user steps forward in the real environment, the virtual character steps forward correspondingly in the virtual environment.
Optionally, the movement state further comprises at least one of a jumping state, a turning state, a takeoff state, a landing state, a descending state, and an ascending state. Illustratively, the movement control operation is used to control the first virtual character to change from the standing state to the jumping state in the virtual environment.
The standing state means that the virtual character is stationary in the virtual environment, and the traveling state means that the virtual character travels at a constant speed. Illustratively, the traveling state refers to the virtual character traveling in the virtual environment at a speed of 3 m/s.
Step 206: in response to the movement control operation, a first transition animation frame corresponding to a first movement distance, which is a movement distance of the first virtual character in a first transition stage of the change in the movement state, is determined in the first transition animation.
The first transition phase refers to a process in which the first avatar switches the moving state. For example, in the present embodiment, the first transition phase refers to a process in which the first avatar changes from a standing state to a traveling state, or the first transition phase refers to a process in which the first avatar changes from a traveling state to a standing state.
Illustratively, when the speed of the first virtual character in the virtual environment is 0, the first virtual character is considered to be in the standing state, and the moment the movement control operation is received is taken as the start of the first transition phase. When the first virtual character accelerates from the standing state to a preset speed and then keeps moving uniformly at that speed, the moment it reaches the preset speed is taken as the end of the first transition phase; alternatively, after the first virtual character reaches the preset speed and has moved a preset distance at that speed, the moment it completes that preset distance is taken as the end of the first transition phase. A sketch of this phase bookkeeping follows.
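Illustratively, the following C++ sketch (added for illustration, not part of the original disclosure; the speed values are hypothetical) tracks the start and end of the first transition phase for the standing-to-traveling change:

```cpp
// Tracks whether the first transition phase is currently in progress.
struct FirstTransitionPhase {
    bool  active      = false;
    float presetSpeed = 3.0f;  // uniform traveling speed, e.g. 3 m/s

    // Called when the movement control operation is received.
    void onMovementControl(float currentSpeed) {
        if (currentSpeed == 0.0f) active = true;  // standing: phase starts now
    }

    // Called every tick with the character's current speed.
    void onTick(float currentSpeed) {
        if (active && currentSpeed >= presetSpeed) active = false;  // phase ends
    }
};
```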
The first transition animation is an animation that drives the three-dimensional model of the first virtual character using the skeletal model of the first virtual character. The first transition animation is a type of three-dimensional animation.
Optionally, the first transition animation is the animation displayed by the first virtual character in a transition phase of the change in the movement state. Illustratively, a first transition animation is displayed when the first virtual character changes from the standing state to the traveling state. Optionally, the movement state comprises at least one of a standing state, a traveling state, a jumping state, a turning state, a takeoff state, a landing state, a descending state, and an ascending state.
Optionally, the first transition animation is associated with the type of movement state change. Illustratively, transition animation 1 is displayed when the first virtual character changes from the standing state to the traveling state, and transition animation 2 is displayed when the first virtual character changes from the traveling state to the standing state.
Optionally, the first transition animation is designed by the technician at his or her discretion.
Optionally, the first transition animation is derived by capturing real-person motion through motion capture techniques.
Optionally, the first transition animation comprises an animation observed from a different perspective. Illustratively, viewing the first virtual character from a front perspective of the first virtual character results in a first transition animation, or viewing the first virtual character from a back perspective of the first virtual character results in a first transition animation.
Optionally, the perspective of the first transition animation is determined from the perspective of the local client.
A transition animation bridges two different animations. In this embodiment, each movement state of the virtual character has a corresponding animation; for example, the virtual character displays a standing animation when standing and a moving animation when moving. When two different animations are joined directly, the junction appears unnatural: for example, if the standing animation and the moving animation of the virtual character are joined directly, so that the moving animation plays right after the standing animation finishes, the virtual character's motion appears unnatural and problems such as jitter and dragging may occur. Adding a transition animation makes the junction of the two animations appear natural.
The first transition animation frame is one frame image corresponding to the first moving distance in the first transition animation.
Optionally, the first transitional animation frame is at least one of a two-dimensional image, a 2.5-dimensional image, and a three-dimensional image.
Step 208: and playing a first transition animation frame corresponding to the first moving distance to obtain the moving animation of the first virtual character in the first transition stage.
In the process of playing the first transition animation frame, since the first transition animation frame is determined according to the first moving distance of the first virtual character, when the moving animation is played, the whole playing process is smooth and continuous.
Illustratively, as shown in fig. 5, during the process in which the first virtual character 500 changes from the standing state to the traveling state, transition animation frame 501 shows the first virtual character 500 standing. When the first virtual character 500 starts traveling and has covered some distance, transition animation frame 502 is displayed, in which the body posture of the first virtual character 500 has changed. When the virtual character travels a further distance, transition animation frame 503 is displayed and the body posture of the first virtual character 500 changes again. After the first virtual character 500 enters the traveling state, transition animation frame 504 is displayed; in it, the posture of the first virtual character 500 indicates that it is traveling. (Fig. 5 only shows a few representative transition animation frames of the process in which the first virtual character 500 changes from the standing state to the traveling state; in an actual implementation, the number of transition animation frames is adjusted as needed.)
Optionally, the last m frames of the first transition animation correspond to the movement state after the change, where m is a positive integer. Illustratively, when the first transition animation has 50 frames in total and corresponds to the change from the standing state to the traveling state, the last 5 frames of the first transition animation show the first virtual character in the traveling state.
Optionally, the first n frames of the first transition animation correspond to the movement state before the change, where n is a positive integer. Illustratively, when the first transition animation has 50 frames in total and corresponds to the change from the standing state to the traveling state, the first 5 frames of the first transition animation show the first virtual character in the standing state.
Illustratively, as shown in fig. 6, during the process in which the first virtual character 600 changes from the traveling state to the standing state, transition animation frame 601 shows the first virtual character 600 traveling. When the first virtual character 600 begins to stop and has covered some distance, transition animation frame 602 is displayed, in which the body posture of the first virtual character 600 has changed. When the virtual character travels a further distance, transition animation frame 603 is displayed and the body posture of the first virtual character 600 changes again. After the first virtual character 600 enters the standing state, transition animation frame 604 is displayed; in it, the posture of the first virtual character 600 indicates that it is standing. (Fig. 6 only shows a few representative transition animation frames of the process in which the first virtual character 600 changes from the traveling state to the standing state; in an actual implementation, the number of transition animation frames is adjusted as needed.)
In summary, in this embodiment, when the virtual character changes its movement state, the corresponding transition animation frames are played according to the movement distance of the virtual character in the virtual environment, and playing these transition animation frames in sequence yields the complete movement animation. When the change in the virtual character's movement state is displayed, the animation and the displacement of the first virtual character match, the 'sliding' phenomenon is avoided, and the transition appears more natural and smooth, enhancing realism. In addition, when the server and the client exchange data, only a small instruction and the information of a position point need to be sent, which reduces the volume of data exchanged between the client and the server. Moreover, the computation on the character model of the virtual character is performed on the client instead of the server, which reduces the computational load on the server.
In the following optional embodiment, the movement distance of the virtual character is matched to a transition animation frame in the transition animation, so that during a movement state change the virtual character obtains a smoother transition animation that shows the change. In addition, the transition animation frame is obtained through the transition animation curve of the transition animation, so that a more accurate transition animation frame can be obtained.
Fig. 7 is a flowchart illustrating a control method for a virtual character according to an exemplary embodiment of the present application. The method may be performed by the first terminal 110 or the second terminal 130 or a client on the first terminal 110 or a client on the second terminal 130 shown in fig. 1, and the method includes:
step 701: a first avatar located in a virtual environment is displayed.
The movement control operation is used for controlling the first virtual character to change the movement state in the virtual environment, wherein the movement state comprises a standing state and a traveling state.
The first virtual character is the character controlled on the local client. The first virtual character includes, but is not limited to, at least one of a virtual human, a virtual animal, and an animation character.
The virtual environment is a three-dimensional virtual environment. Optionally, the virtual environment is a two-dimensional virtual environment, or alternatively, the virtual environment is a 2.5-dimensional virtual environment.
Illustratively, as shown in fig. 3, a virtual character 301, a movement control 302, a setting control 303, a volume adjustment control 304, a jump control 305, a squat control 306, a recording control 307, and a microphone icon 308 are displayed on the user interface.
Step 702: a movement control operation is received.
The movement control operation controls the first virtual character to change its movement state in the virtual environment, for example through the pressing of one or more preset physical keys. Alternatively, the user may perform the movement control operation through a signal generated by tapping, long-pressing, clicking, double-clicking, and/or sliding on the touch screen.
Alternatively, when the terminal is a VR (Virtual Reality) device, the movement control operation is a physical action of the user in the real environment. For example, if the user steps forward in the real environment, the virtual character steps forward correspondingly in the virtual environment.
Optionally, the movement state comprises at least one of a standing state, a traveling state, a jumping state, a turning state, a takeoff state, a landing state, a descending state, and an ascending state. Illustratively, the movement control operation is used to control the first virtual character to change from the standing state to the jumping state in the virtual environment.
Step 703: in response to the movement control operation, a reference position point of the first virtual character is acquired at a first transition stage of the movement state change.
The reference position point is determined according to how the movement state of the first virtual character in the virtual environment changes. Illustratively, when the first virtual character changes from the standing state to the traveling state, reference position point 1 is used; when the first virtual character changes from the traveling state to the standing state, reference position point 2 is used; reference position point 1 and reference position point 2 are two different points.
Illustratively, when the first avatar changes from the standing state to the traveling state, the reference location point is obtained by:
in response to the movement control operation, a start position point of the first avatar is determined as a reference position point at a first transition stage where the first avatar changes from the standing state to the traveling state.
Optionally, a coordinate system is established in the virtual environment, and the position point of the first virtual character is recorded through the coordinate system.
Optionally, the coordinates of the center point of the first virtual character model are used as the current position of the first virtual character.
Illustratively, if the first virtual character stands at position A in the virtual environment and is about to change from the standing state to the traveling state, position A may be taken both as the starting position of the first virtual character and as the reference position point.
Illustratively, when the first avatar changes from the traveling state to the standing state, the reference position point is obtained by:
1. in response to the movement control operation, a predicted stopping point is calculated from the traveling parameters of the first avatar at a first transition stage where the first avatar changes from the traveling state to the standing state.
The travel parameter includes at least one of speed magnitude, speed direction, friction, air resistance, acceleration.
Illustratively, the first virtual character travels in a first direction, which is taken as the positive direction of the x-axis. When the first virtual character begins changing from the traveling state to the standing state at x = 0, if the speed of the first virtual character is 4 m/s and its acceleration is -2 m/s², then the first virtual character will stop at x = 4 and enter the standing state.
The travel parameter affects a movement distance required for the change in the motion state of the first virtual character. For example, when the first avatar changes from a traveling state to a standing state, the first avatar performs a deceleration movement, the acceleration of the first avatar is affected by ground friction, and when the ground friction is large, the first avatar receives a large acceleration opposite to the speed direction, and the first avatar is more easily decelerated, so that the first avatar changes from the traveling state to the standing state within a short moving distance. Similarly, when the first avatar changes from the traveling state to the standing state, the air resistance also affects the moving distance required for the first avatar to change from the traveling state to the standing state.
2. The predicted stopping point is determined as a reference position point.
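Illustratively, the following C++ sketch (added for illustration, not part of the original disclosure; names are hypothetical, and friction and air resistance are assumed to be folded into a single deceleration value) computes the predicted stopping point from the traveling parameters using uniform deceleration, stopDistance = v^2 / (2|a|):

```cpp
struct Vec3 { float x, y, z; };

// pos: current position; velocityDir: unit vector of the speed direction;
// speed: current speed in m/s; deceleration: positive magnitude in m/s^2.
Vec3 predictedStopPoint(const Vec3& pos, const Vec3& velocityDir,
                        float speed, float deceleration) {
    const float stopDistance = (speed * speed) / (2.0f * deceleration);
    return { pos.x + velocityDir.x * stopDistance,
             pos.y + velocityDir.y * stopDistance,
             pos.z + velocityDir.z * stopDistance };
}

// With speed = 4 m/s and deceleration = 2 m/s² along +x from x = 0,
// stopDistance = 16 / 4 = 4, matching the worked example above.
```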
Step 704: a first movement distance between the current position of the first virtual character and the reference position point is calculated.
Optionally, a coordinate system is established in the virtual environment, and the current position and the reference position point are recorded through the coordinate system. Illustratively, the coordinates of the current position in the virtual environment are (0, 4), the coordinates of the reference position point in the virtual environment are (0, 7), and the first movement distance is 3.
Step 705: a target timestamp is determined from the first transition animation curve based on the first movement distance.
The transition animation curve is used to represent a relationship between a moving distance and a time stamp in the transition animation.
Since different transition animations are used for different movement state changes, the transition animation curve is determined according to the movement state change. Illustratively, fig. 8 shows the transition animation curve of the transition animation corresponding to the first virtual character changing from the standing state to the traveling state; in the graph, the horizontal axis represents the timestamp and the vertical axis represents the first movement distance. Fig. 9 shows the transition animation curve of the transition animation corresponding to the first virtual character changing from the traveling state to the standing state; likewise, the horizontal axis represents the timestamp and the vertical axis represents the first movement distance.
Illustratively, as shown in FIG. 8, the point 801 corresponding to the first movement distance is found on the curve, and from point 801 the point 802 is obtained on the transition animation curve; point 802 represents the target timestamp.
Illustratively, as shown in FIG. 9, the point 901 corresponding to the first movement distance is found on the curve, and from point 901 the point 902 is obtained on the transition animation curve; point 902 represents the target timestamp.
Step 706: a first transition animation frame corresponding to the target timestamp is determined in the first transition animation.
The first transition animation is played according to timestamps, and each frame in the first transition animation has a corresponding timestamp. A sketch of this lookup is given below.
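Illustratively, the following C++ sketch (added for illustration, not part of the original disclosure; the curve data and frame rate are hypothetical) inverts a distance-over-time transition animation curve by piecewise linear interpolation to obtain the target timestamp, then selects the frame for that timestamp:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// One key of the transition animation curve; distance is assumed to increase
// strictly with time, as in the curves of FIG. 8 and FIG. 9.
struct CurveKey { float time; float distance; };

float targetTimestamp(const std::vector<CurveKey>& curve, float moveDistance) {
    if (curve.empty()) return 0.0f;
    if (moveDistance <= curve.front().distance) return curve.front().time;
    for (std::size_t i = 1; i < curve.size(); ++i) {
        if (moveDistance <= curve[i].distance) {
            // Linear interpolation between the two surrounding keys.
            const float t = (moveDistance - curve[i - 1].distance) /
                            (curve[i].distance - curve[i - 1].distance);
            return curve[i - 1].time + t * (curve[i].time - curve[i - 1].time);
        }
    }
    return curve.back().time;
}

// Map the target timestamp to the index of the first transition animation frame.
std::size_t frameForTimestamp(float timestamp, float framesPerSecond) {
    return static_cast<std::size_t>(std::lround(timestamp * framesPerSecond));
}
```

Because the curve is monotonic, the distance-to-timestamp lookup is a single pass, consistent with the single matching calculation described below.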
Step 707: and playing a first transition animation frame corresponding to the first moving distance to obtain the moving animation of the first virtual character in the first transition stage.
Illustratively, as shown in fig. 5, during the process of the first virtual character 500 changing from a standing state to a traveling state, the transition animation frame 501 shows the first virtual character 500 in a standing state, and when the first virtual character 500 starts traveling and has traveled a distance, the transition animation frame 502 is displayed, and the body posture of the first virtual character 500 has changed. When the virtual character travels a further distance and the transitional animation frame 503 is displayed, the body posture of the first virtual character 500 changes again. After the first avatar 500 enters the traveling state, a transition animation frame 504 is displayed, in the transition animation frame 504, the posture of the first avatar 500 indicates that the first avatar 500 is traveling (fig. 5 only shows a few representative transition image frames in the process that the first avatar 500 changes from the standing state to the traveling state, and in the actual implementation process, the number of the transition image frames is adjusted according to actual needs).
Illustratively, as shown in fig. 6, during the process of the first virtual character 600 changing from the traveling state to the standing state, the transition animation frame 601 shows that the first virtual character 600 is in the traveling state, and when the first virtual character 600 starts to stop and a distance has elapsed, the transition animation frame 602 is displayed, and the body posture of the first virtual character 600 has changed. When the virtual character travels a further distance, the transitional animation frame 603 is displayed, and the body posture of the first virtual character 600 changes again. After the first avatar 600 enters the standing state, a transition animation frame 604 is displayed, in the transition animation frame 604, the posture of the first avatar 600 indicates that the first avatar 600 is standing (fig. 6 only shows a few representative transition image frames in the process of the first avatar 600 changing from the advancing state to the standing state, and in the actual implementation process, the number of the transition image frames is adjusted according to actual needs).
In summary, in this embodiment, when the virtual character changes its movement state, the corresponding transition animation frames are played according to the movement distance of the virtual character in the virtual environment, and playing these transition animation frames in sequence yields the complete movement animation. The displayed change in the virtual character's movement state is more natural and smooth, enhancing realism. In addition, when the server and the client exchange data, only a small instruction and the information of a position point need to be sent, which reduces the volume of data exchanged between the client and the server. Moreover, the computation on the character model of the virtual character is performed on the client instead of the server, which reduces the computational load on the server.
Moreover, matching the animation frame through the first virtual character's movement distance produces a relatively coherent picture; at the same time, the client only needs to perform a single matching calculation to determine the animation frame, which reduces the client's computational load and improves the timeliness of the transition animation.
In the following alternative embodiment, a second virtual character may also exist in the virtual environment. Unlike the first virtual character, which is controlled by the user on the local client, the second virtual character is controlled by a user on another client or by an AI (Artificial Intelligence). It is therefore necessary to describe how the second virtual character is represented on the local client.
Fig. 10 is a flowchart illustrating a control method for a virtual character according to an exemplary embodiment of the present application. The method may be performed by the first terminal 110 or the second terminal 130 or a client on the first terminal 110 or a client on the second terminal 130 shown in fig. 1, and the method includes:
step 1001: a second avatar located in the virtual environment is displayed.
The second virtual character is a character in the virtual environment different from the first virtual character. Optionally, the second virtual character is controlled by a user on another client, or the second virtual character is controlled by an AI.
Optionally, the second avatar includes, but is not limited to, at least one of a virtual character, a virtual animal, and an animation character.
The virtual environment is, for example, a three-dimensional virtual environment. Optionally, the virtual environment is a two-dimensional virtual environment or a 2.5-dimensional virtual environment.
Step 1002: and receiving a movement state change instruction sent by the server.
The movement state change instruction is used for controlling the second virtual character to change its movement state in the virtual environment. When the second virtual character is controlled by a user on another client, the movement state change instruction is generated when a movement control operation is received on that client.
Optionally, the moving state comprises a standing state and a traveling state.
Optionally, the moving state further comprises at least one of a jumping state, a turning state, a takeoff state, a landing state, a descending state, and an ascending state. For example, the change of the movement state may be a change from a standing state to a jumping state.
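For concreteness, such an instruction might be modeled as a small message like the sketch below. This is a hypothetical illustration only; the names MovementState, Vec3, MoveStateChangeMsg, and predictedStopPoint are assumptions, not part of this disclosure.

```typescript
// Hypothetical shape of a movement state change instruction (all names assumed).
type MovementState =
  | "standing" | "traveling"           // the two states used in this embodiment
  | "jumping" | "turning" | "landing"; // examples of the further optional states

interface Vec3 { x: number; y: number; z: number; }

interface MoveStateChangeMsg {
  characterId: string;       // identifies the second virtual character
  from: MovementState;       // movement state before the change
  to: MovementState;         // movement state after the change
  // Carried only for traveling -> standing changes, where the instruction
  // includes the predicted stopping point (see the sub-steps below).
  predictedStopPoint?: Vec3;
}
```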
Step 1003: in response to the movement state change instruction, a second transition animation frame corresponding to a second movement distance is determined in the second transition animation.
Optionally, when the change of the movement state comprises a change from the standing state to the traveling state, this step comprises the following sub-steps:
1. In response to the movement state change instruction, calculate the second movement distance between the current position of the second virtual character and the starting position point in the second transition phase of the movement state change of the second virtual character.
Optionally, a coordinate system is established in the virtual environment, and the current position and the starting position point of the second virtual character are recorded through the coordinate system.
Optionally, the coordinates of the center point of the second virtual character model are used as the current position of the second virtual character.
Illustratively, the second virtual character originally stands at position B in the virtual environment. When the second virtual character is about to change from the standing state to the traveling state, position B is recorded as its starting position point.
2. Determine the second transition animation frame from the second transition animation according to the second movement distance.
Further, a target timestamp is determined from the second transition animation curve according to the second movement distance, and the second transition animation frame is determined from the second transition animation according to the target timestamp.
The second transition animation curve is used to represent the relationship between the movement distance and the timestamp in the second transition animation; its horizontal axis represents the timestamp and its vertical axis represents the second movement distance.
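A minimal sketch of these two sub-steps follows, assuming the second transition animation curve is stored as samples sorted by strictly increasing distance and reusing the hypothetical Vec3 type from the sketch above; movedDistance and targetTimestamp are assumed helper names.

```typescript
// One sampled point of the transition animation curve.
interface CurveSample { timestamp: number; distance: number; }

// Sub-step 1: straight-line distance between the current position and the
// reference point (the starting position point in the standing -> traveling case).
function movedDistance(current: Vec3, reference: Vec3): number {
  return Math.hypot(
    current.x - reference.x,
    current.y - reference.y,
    current.z - reference.z,
  );
}

// Sub-step 2: map the movement distance to the target timestamp by linearly
// interpolating the curve (horizontal axis: timestamp, vertical axis: distance).
function targetTimestamp(curve: CurveSample[], distance: number): number {
  if (distance <= curve[0].distance) return curve[0].timestamp;
  for (let i = 1; i < curve.length; i++) {
    if (distance <= curve[i].distance) {
      const a = curve[i - 1];
      const b = curve[i];
      const t = (distance - a.distance) / (b.distance - a.distance);
      return a.timestamp + t * (b.timestamp - a.timestamp);
    }
  }
  return curve[curve.length - 1].timestamp; // clamp beyond the last sample
}
```

The second transition animation frame is then the frame of the second transition animation at the returned timestamp.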
Optionally, when the change of the movement state comprises a change from the traveling state to the standing state, the movement state change instruction comprises the predicted stopping point of the second virtual character, and this step comprises the following sub-steps:
1. In response to the movement state change instruction, calculate the second movement distance between the current position of the second virtual character and the predicted stopping point of the second virtual character in the second transition phase of the movement state change of the second virtual character.
Optionally, a coordinate system is established in the virtual environment, and the current position of the second virtual character and the predicted stopping point are recorded through the coordinate system.
Optionally, the coordinates of the center point of the second virtual character model are used as the current position of the second virtual character.
Illustratively, the second virtual character is traveling in the virtual environment and is about to change from the traveling state to the standing state; the predicted stopping point carried in the movement state change instruction, for example a position C, is then used together with the current position of the second virtual character to calculate the second movement distance.
2. Determine the second transition animation frame from the second transition animation according to the second movement distance.
Further, a target timestamp is determined from the second transition animation curve according to the second movement distance, and the second transition animation frame is determined from the second transition animation according to the target timestamp.
The second transition animation curve is used to represent the relationship between the movement distance and the timestamp in the second transition animation.
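The traveling-to-standing case can reuse the same helpers, with the predicted stopping point as the reference, so the measured distance shrinks toward zero as the character decelerates. A sketch under the same assumptions follows; frameAt is an assumed animation-sampling helper.

```typescript
// Hypothetical sampling of the standing transition: the curve is indexed by the
// remaining distance between the current position and the predicted stopping point.
function stopTransitionFrame<TFrame>(
  anim: { frameAt(timestamp: number): TFrame },
  stopCurve: CurveSample[],
  currentPosition: Vec3,
  predictedStopPoint: Vec3,
): TFrame {
  const remaining = movedDistance(currentPosition, predictedStopPoint);
  return anim.frameAt(targetTimestamp(stopCurve, remaining));
}
```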
Step 1004: and playing a second transition animation frame corresponding to the second moving distance to obtain the moving animation of the second virtual character in the second transition stage.
The second transition phase refers to the process in which the second virtual character switches its movement state. For example, in this embodiment, the second transition phase is the process in which the second virtual character changes from the standing state to the traveling state, or the process in which it changes from the traveling state to the standing state.
Illustratively, when the speed of the second virtual character in the virtual environment is 0, the second virtual character is considered to be in the standing state, and the moment when the movement state change instruction is received is considered the starting moment of the second transition phase. When the second virtual character accelerates from the standing state to a preset speed and then moves at that preset speed at a constant rate, the moment when the second virtual character reaches the preset speed is considered the ending moment of the second transition phase; alternatively, the moment when the second virtual character, having reached the preset speed, has moved at it for a preset distance is considered the ending moment of the second transition phase.
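As a sketch of how a client might test those phase boundaries (the preset values and function names are assumptions, not part of this disclosure):

```typescript
const PRESET_SPEED = 5;    // assumed constant travel speed, units per second
const PRESET_DISTANCE = 0; // assumed optional extra distance held at that speed

// The second transition phase starts when the standing character (speed 0)
// receives the movement state change instruction.
function transitionStarted(speed: number, instructionReceived: boolean): boolean {
  return speed === 0 && instructionReceived;
}

// The phase ends when the character reaches the preset speed, optionally after
// also moving a preset distance at that speed.
function transitionEnded(speed: number, distanceAtPresetSpeed: number): boolean {
  return speed >= PRESET_SPEED && distanceAtPresetSpeed >= PRESET_DISTANCE;
}
```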
In the process of playing the second transition animation frames, since each frame is determined according to the second movement distance of the second virtual character, the playing of the movement animation is smooth and continuous throughout.
In summary, in this embodiment, when the virtual character changes its movement state, the corresponding transition animation frame is played according to the movement distance of the virtual character in the virtual environment, and the transition animation frames played in succession form the complete movement animation. The change of the movement state of the virtual character is thus displayed more naturally and smoothly, which enhances the sense of reality. Determining the transition animation frame by the movement distance also reduces the volume of data exchanged between the client and the server and relieves the computing pressure on the server.
Furthermore, only the movement state change instruction needs to be transmitted, so the amount of transmitted data is small and the dependence on the network is low; the transition animation of the virtual character can be displayed smoothly even under a large network delay.
Fig. 11 and 12 are flowcharts illustrating a control method of a virtual character according to an exemplary embodiment of the present application. A change in the movement state of the first virtual character controlled by the user on the local client may involve displaying that change on the local client, the server, and other clients.
On the local client: in step 1101, the first virtual character starts to change from the standing state to the traveling state, and the local client sends a start-movement notification to the server. In step 1102, the local client records the starting position point. In step 1103, the local client calculates the movement distance between the current position of the first virtual character and the reference position point. In step 1104, the local client determines a traveling transition animation frame from the traveling transition animation according to the movement distance. In step 1105, the local client plays the traveling transition animation frame according to the movement distance to obtain the movement animation of the first virtual character in the traveling transition phase. In step 1106, the first virtual character on the local client travels at a uniform speed. In step 1107, the first virtual character on the local client starts to change from the traveling state to the standing state, and the local client sends a start-stop notification to the server. In step 1108, the local client calculates a predicted stopping point according to the traveling parameters of the first virtual character and sends the predicted stopping point to the server. In step 1109, the local client calculates the movement distance between the current position of the first virtual character and the predicted stopping point. In step 1110, the local client determines a standing transition animation frame from the standing transition animation according to the movement distance. In step 1111, the local client plays the standing transition animation frame according to the movement distance to obtain the movement animation of the first virtual character in the standing transition phase. In step 1112, the local client displays the first virtual character standing. (Steps 1103-1105 and 1109-1111 share a per-frame pattern, sketched below.)
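That shared per-frame pattern might look like the following sketch, reusing the hypothetical Vec3, CurveSample, movedDistance, and targetTimestamp helpers from the earlier sketches; frameAt and render are likewise assumed names.

```typescript
// Hypothetical per-frame update during a transition on the local client.
function updateTransitionFrame<TFrame>(
  characterPosition: Vec3,
  referencePoint: Vec3, // starting position point, or predicted stopping point
  transitionCurve: CurveSample[],
  anim: { frameAt(timestamp: number): TFrame },
  render: (frame: TFrame) => void,
): void {
  const distance = movedDistance(characterPosition, referencePoint);      // step 1103 / 1109
  const frame = anim.frameAt(targetTimestamp(transitionCurve, distance)); // step 1104 / 1110
  render(frame);                                                          // step 1105 / 1111
}
```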
On the server: in step 1113, upon receiving the start-movement notification sent by the local client, the first virtual character starts to change from the standing state to the traveling state. In step 1114, the first virtual character on the server travels at a uniform speed. In step 1115, upon receiving the start-stop notification sent by the local client, the first virtual character starts to change from the traveling state to the standing state. In step 1116, the server synchronizes the predicted stopping point to the other clients. In step 1117, the first virtual character on the server stands.
On the other client: in step 1118, upon receiving the start-movement instruction sent by the server, the first virtual character starts to change from the standing state to the traveling state. In step 1119, the other client records the starting position point. In step 1120, the other client calculates the movement distance between the current position of the first virtual character and the reference position point. In step 1121, the other client determines a traveling transition animation frame from the traveling transition animation according to the movement distance. In step 1122, the other client plays the traveling transition animation frame according to the movement distance to obtain the movement animation of the first virtual character in the traveling transition phase. In step 1123, the other client displays the first virtual character traveling at a constant speed. In step 1124, upon receiving the start-stop instruction sent by the server, the first virtual character starts to change from the traveling state to the standing state. In step 1125, after receiving the predicted stopping point, the other client calculates the movement distance between the current position of the first virtual character and the predicted stopping point. In step 1126, the other client determines a standing transition animation frame from the standing transition animation according to the movement distance. In step 1127, the other client plays the standing transition animation frame according to the movement distance to obtain the movement animation of the first virtual character in the standing transition phase. In step 1128, the other client displays the first virtual character standing.
In summary, in this embodiment, when the virtual character changes its movement state, the corresponding transition animation frame is played according to the movement distance of the virtual character in the virtual environment, and the transition animation frames played in succession form the complete movement animation. The change of the movement state of the virtual character is thus displayed more naturally and smoothly, which enhances the sense of reality. Determining the transition animation frame by the movement distance also reduces the volume of data exchanged between the client and the server and relieves the computing pressure on the server.
Furthermore, only the movement state change instruction needs to be transmitted, so the amount of transmitted data is small and the dependence on the network is low; the transition animation of the virtual character can be displayed smoothly even under a large network delay.
Fig. 13 is a schematic structural diagram illustrating a control apparatus for a virtual character according to an exemplary embodiment of the present application. The apparatus may be implemented as all or part of a device by software, hardware, or a combination of the two. The apparatus 120 comprises:
a display module 121, configured to display a first virtual character located in a virtual environment;
a receiving module 122, configured to receive a movement control operation, where the movement control operation is used to control the first avatar to perform a change of a movement state in the virtual environment, where the movement state includes a standing state and a traveling state;
a control module 123, configured to determine, in response to the movement control operation, a first transition animation frame corresponding to a first movement distance in a first transition animation, the first movement distance being the movement distance of the first virtual character in a first transition phase of the movement state change;
the display module 121 is further configured to play the first transition animation frame corresponding to the first movement distance to obtain the movement animation of the first virtual character in the first transition stage.
In an optional design of the present application, the control module 123 is further configured to, in response to the movement control operation, obtain a reference position point of the first virtual character in a first transition phase of a movement state change; calculating the first movement distance between the current position of the first virtual character and the reference position point; determining the first transition animation frame corresponding to the first movement distance in the first transition animation.
In an optional design of the present application, the control module 123 is further configured to determine a target timestamp from a first transition animation curve according to the first moving distance, where the first transition animation curve is used to represent a relationship between the moving distance and the timestamp in the first transition animation; determining the first transition animation frame in the first transition animation corresponding to the target timestamp.
In an optional design of the present application, the control module 123 is further configured to determine a starting position point of the first avatar as the reference position point in a first transition phase in which the first avatar changes from the standing state to the traveling state in response to the movement control operation.
In an optional design of the present application, the control module 123 is further configured to, in response to the movement control operation, calculate a predicted stopping point according to a traveling parameter of the first avatar in a first transition phase of the first avatar changing from the traveling state to the standing state, where the traveling parameter includes at least one of a speed magnitude, a speed direction, a friction force, an air resistance, and an acceleration; determining the predicted stopping point as the reference position point.
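For example, under a constant-deceleration model the stopping distance is v²/(2a), so the predicted stopping point might be estimated as in the sketch below. The function name is hypothetical, and folding friction and air resistance into a single deceleration value is a simplifying assumption.

```typescript
// Hypothetical estimate of the predicted stopping point, assuming straight-line
// travel and constant deceleration (friction and air resistance folded into it).
function predictStopPoint(position: Vec3, velocity: Vec3, deceleration: number): Vec3 {
  const speed = Math.hypot(velocity.x, velocity.y, velocity.z);
  if (speed === 0 || deceleration <= 0) return { ...position }; // already stopped
  const stopDistance = (speed * speed) / (2 * deceleration);    // v^2 / (2a)
  const scale = stopDistance / speed; // turns the velocity into a displacement
  return {
    x: position.x + velocity.x * scale,
    y: position.y + velocity.y * scale,
    z: position.z + velocity.z * scale,
  };
}
```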
In an optional design of the present application, the receiving module 122 is further configured to receive a moving state change instruction sent by a server, where the moving state change instruction is used to control the second virtual character to change the moving state in the virtual environment.
The control module 123 is further configured to determine, in response to the movement state change instruction, a second transition animation frame corresponding to a second movement distance in a second transition animation, where the second movement distance is a movement distance of the second virtual character in a second transition phase of the movement state change.
The display module 121 is further configured to play the second transition animation frame corresponding to the second moving distance, so as to obtain a moving animation of the second virtual character in the second transition stage.
In an optional design of the present application, the control module 123 is further configured to, in response to the moving state change instruction, calculate a second moving distance between the current position and the starting position point of the second virtual character in a second transition phase of the moving state change of the second virtual character; determining a second transition animation frame corresponding to the second moving distance in a second transition animation curve of the second transition animation, wherein the second transition animation curve is used for representing the relation between the moving distance and the time stamp in the second transition animation.
In an optional design of the present application, the control module 123 is further configured to calculate, in response to the moving state change instruction, the second moving distance between the current position of the second avatar and the predicted stop point of the second avatar in a second transition phase of the moving state change of the second avatar; determining the second transition animation frame corresponding to the second movement distance in the second transition animation curve of the second transition animation.
In summary, in this embodiment, when the virtual character changes in the moving state, the corresponding transition animation frame is played according to the moving distance of the virtual character in the virtual environment, and the transition animation frame is continuously played to obtain the complete moving animation. When the change of the movement state of the virtual character is displayed, the movement state is more natural and smooth, and the sense of reality is enhanced. And the transition animation frame of the virtual character is determined by moving the distance, so that the data interaction amount between the client and the server can be reduced, and the calculation pressure of the server is reduced.
Fig. 14 is a schematic structural diagram of a server according to an embodiment of the present application. Specifically, the server 1300 includes a Central Processing Unit (CPU) 1301, a system memory 1304 including a Random Access Memory (RAM) 1302 and a Read-Only Memory (ROM) 1303, and a system bus 1305 connecting the system memory 1304 and the CPU 1301. The server 1300 also includes a basic input/output system (I/O system) 1306, which facilitates the transfer of information between devices within the computer, and a mass storage device 1307 for storing an operating system 1313, application programs 1314, and other program modules 1315.
The basic input/output system 1306 includes a display 1308 for displaying information and an input device 1309, such as a mouse or a keyboard, through which a user inputs information. The display 1308 and the input device 1309 are both connected to the central processing unit 1301 through an input/output controller 1310 connected to the system bus 1305. The basic input/output system 1306 may also include the input/output controller 1310 for receiving and processing input from a number of other devices, such as a keyboard, a mouse, or an electronic stylus. Similarly, the input/output controller 1310 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 1307 is connected to the central processing unit 1301 through a mass storage controller (not shown) connected to the system bus 1305. The mass storage device 1307 and its associated computer-readable media provide non-volatile storage for the server 1300. That is, mass storage device 1307 may include a computer-readable medium (not shown) such as a hard disk or Compact Disc-Only Memory (CD-ROM) drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media include RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash Memory or other solid state Memory technology, CD-ROM, Digital Versatile Disks (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory 1304 and mass storage device 1307 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 1300 may also be run by a remote computer connected through a network, such as the Internet. That is, the server 1300 may be connected to the network 1312 through the network interface unit 1311 connected to the system bus 1305, or may be connected to another type of network or a remote computer system (not shown) using the network interface unit 1311.
According to another aspect of the present application, there is also provided a computer storage medium having at least one program code stored therein, the program code being loaded and executed by a processor to implement the control method of the virtual character as described above.
According to another aspect of the present application, there is also provided a computer program product or a computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to enable the computer device to execute the control method of the virtual character.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method for controlling a virtual character, the method comprising:
displaying a first avatar located in a virtual environment;
receiving a movement control operation for controlling the first virtual character to change a movement state in the virtual environment, wherein the movement state comprises a standing state and a traveling state;
determining, in response to the movement control operation, a first transition animation frame corresponding to a first movement distance in a first transition animation, the first movement distance being a movement distance of the first virtual character in a first transition phase of a change in a movement state;
and playing a first transition animation frame corresponding to the first moving distance to obtain the moving animation of the first virtual character in the first transition stage.
2. The method of claim 1, wherein determining a first transition animation frame in a first transition animation corresponding to a first movement distance in response to the movement control operation comprises:
responding to the movement control operation, and acquiring a reference position point of the first virtual character in a first transition stage of movement state change;
calculating the first movement distance between the current position of the first virtual character and the reference position point;
determining the first transition animation frame corresponding to the first movement distance in the first transition animation.
3. The method of claim 2, wherein said determining the first transition animation frame in the first transition animation that corresponds to the first movement distance comprises:
determining a target timestamp from a first transition animation curve according to the first moving distance, wherein the first transition animation curve is used for representing the relation between the moving distance and the timestamp in the first transition animation;
determining the first transition animation frame in the first transition animation corresponding to the target timestamp.
4. The method of claim 2, wherein the change in movement state comprises the first avatar changing from the standing state to the traveling state;
the acquiring the reference position point of the first virtual character in a first transition phase of the movement state change in response to the movement control operation includes:
in response to the movement control operation, determining a starting position point of the first avatar as the reference position point at a first transition stage in which the first avatar changes from the standing state to the traveling state.
5. The method of claim 2, wherein the change in movement state comprises the first avatar changing from the travel state to the stance state;
the acquiring a reference position point of the first virtual character at a first transition stage of the movement state change in response to the movement control operation includes:
in response to the movement control operation, calculating a predicted stopping point according to a travel parameter of the first avatar in a first transition phase in which the first avatar changes from the travel state to the standing state, the travel parameter including at least one of a speed magnitude, a speed direction, a friction force, an air resistance, and an acceleration;
determining the predicted stopping point as the reference position point.
6. The method of any one of claims 1 to 5, wherein a second virtual character is further included in the virtual environment;
the method further comprises the following steps:
receiving a movement state change instruction sent by a server, wherein the movement state change instruction is used for controlling the second virtual character to change the movement state in the virtual environment;
determining, in response to the movement state change instruction, a second transition animation frame corresponding to a second movement distance in a second transition animation, the second movement distance being a movement distance of the second virtual character in a second transition phase of the movement state change;
and playing the second transition animation frame corresponding to the second movement distance to obtain the movement animation of the second virtual character in the second transition stage.
7. The method of claim 6, wherein the change in movement state comprises changing from the standing state to the traveling state;
the determining, in response to the movement state change instruction, a second transition animation frame corresponding to a second movement distance in a second transition animation includes:
in response to the movement state change instruction, calculating a second movement distance between the current position and the starting position point of the second virtual character in a second transition phase of the movement state change of the second virtual character;
determining a second transition animation frame corresponding to the second moving distance in a second transition animation curve of the second transition animation, wherein the second transition animation curve is used for representing the relation between the moving distance and the time stamp in the second transition animation.
8. The method of claim 6, wherein the change in movement state comprises changing from the travel state to the stance state; the movement state change instruction comprises a predicted stopping point of the second virtual character;
the determining, in response to the movement state change instruction, a second transition animation frame corresponding to a second movement distance in a second transition animation includes:
in response to the movement state change instruction, calculating the second movement distance between the current position of the second virtual character and the predicted stop point of the second virtual character at a second transition stage of the movement state change of the second virtual character;
determining the second transition animation frame corresponding to the second movement distance in the second transition animation curve of the second transition animation.
9. An apparatus for controlling a virtual character, the apparatus comprising:
a display module for displaying a first virtual character located in a virtual environment;
a receiving module, configured to receive a movement control operation, where the movement control operation is used to control the first virtual character to change a movement state in the virtual environment, where the movement state includes a standing state and a traveling state;
a control module for determining, in response to the movement control operation, a first transition animation frame corresponding to a first movement distance in a first transition animation, the first movement distance being the movement distance of the first virtual character in a first transition phase of the change in the movement state;
the display module is further configured to play the first transition animation frame corresponding to the first moving distance to obtain a moving animation of the first virtual character in the first transition stage.
10. The apparatus of claim 9,
the control module is further configured to respond to the movement control operation, and acquire a reference position point of the first virtual character in a first transition stage of a movement state change; calculating the first movement distance between the current position of the first virtual character and the reference position point; determining the first transition animation frame corresponding to the first movement distance in the first transition animation.
11. The apparatus of claim 10,
the control module is further used for determining a target timestamp from a first transition animation curve according to the first moving distance, wherein the first transition animation curve is used for representing the relation between the moving distance and the timestamp in the first transition animation; determining the first transition animation frame in the first transition animation corresponding to the target timestamp.
12. The apparatus according to any one of claims 9 to 11,
the receiving module is further configured to receive a moving state change instruction sent by a server, where the moving state change instruction is used to control the second virtual character to change the moving state in the virtual environment;
the control module is further used for responding to the movement state change instruction, and determining a second transition animation frame corresponding to a second movement distance in a second transition animation, wherein the second movement distance is the movement distance of the second virtual character in a second transition stage of the movement state change;
the display module is further configured to play the second transition animation frame corresponding to the second moving distance to obtain a moving animation of the second virtual character in the second transition stage.
13. The apparatus of claim 12,
the control module is further configured to calculate, in response to the movement state change instruction, a second movement distance between the current position and the starting position point of the second virtual character at a second transition stage of the movement state change of the second virtual character; determining a second transition animation frame corresponding to the second moving distance in a second transition animation curve of the second transition animation, wherein the second transition animation curve is used for representing the relation between the moving distance and the time stamp in the second transition animation.
14. A computer device, characterized in that the computer device comprises: a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by the processor to implement the control method of a virtual character according to any one of claims 1 to 8.
15. A computer-readable storage medium, in which at least one program code is stored, the program code being loaded and executed by a processor to implement the control method of a virtual character according to any one of claims 1 to 8.
CN202110610049.9A 2021-06-01 2021-06-01 Virtual character control method, device, equipment and medium Active CN113209618B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110610049.9A CN113209618B (en) 2021-06-01 2021-06-01 Virtual character control method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN113209618A true CN113209618A (en) 2021-08-06
CN113209618B CN113209618B (en) 2023-04-28

Family

ID=77082417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110610049.9A Active CN113209618B (en) 2021-06-01 2021-06-01 Virtual character control method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN113209618B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113658300A (en) * 2021-08-18 2021-11-16 北京百度网讯科技有限公司 Animation playing method and device, electronic equipment and storage medium
CN114627215A (en) * 2022-05-16 2022-06-14 山东捷瑞数字科技股份有限公司 Method and device for camera shake animation production based on three-dimensional software
CN115779436A (en) * 2023-02-09 2023-03-14 腾讯科技(深圳)有限公司 Animation switching method, device, equipment and computer readable storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102811352A (en) * 2011-06-03 2012-12-05 卡西欧计算机株式会社 Moving image generating method and moving image generating apparatus

Also Published As

Publication number Publication date
CN113209618B (en) 2023-04-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40051659; Country of ref document: HK)
GR01 Patent grant