CN115999153A - Virtual character control method and device, storage medium and terminal equipment - Google Patents

Virtual character control method and device, storage medium and terminal equipment

Info

Publication number
CN115999153A
CN115999153A (application CN202211097766.7A)
Authority
CN
China
Prior art keywords
virtual
virtual character
touch operation
shelter
squat
Prior art date
Legal status
Pending
Application number
CN202211097766.7A
Other languages
Chinese (zh)
Inventor
苗浩琦 (Miao Haoqi)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211097766.7A (CN115999153A)
Priority to PCT/CN2023/079125 (WO2024051116A1)
Publication of CN115999153A
Legal status: Pending

Classifications

    • A - HUMAN NECESSITIES
        • A63 - SPORTS; GAMES; AMUSEMENTS
            • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
                • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
                    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
                        • A63F13/42 - Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
                    • A63F13/55 - Controlling game characters or game objects based on the game progress
                    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
                        • A63F13/822 - Strategy games; Role-playing games
    • G - PHYSICS
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06F - ELECTRIC DIGITAL DATA PROCESSING
                • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
                            • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                            • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

An embodiment of the present application discloses a control method and device for a virtual character, a storage medium, and a terminal device, the method comprising the following steps: in response to a first touch operation on the squat control, controlling the virtual character to squat behind a first surface of a virtual shelter; in response to a second touch operation on the movement joystick toward a target direction, controlling the virtual character to move in the target direction behind the first surface; and when the virtual character moves in the target direction to a boundary position of the first surface, in response to a third touch operation on the movement joystick, controlling the virtual character to switch from behind the first surface of the virtual shelter to squat behind a second surface of the virtual shelter, the first surface being adjacent to the second surface. In this way, a third touch operation on the movement joystick can control the virtual character to turn corners between surfaces of the virtual shelter while in the squat posture, without adding other controls, so that control of the virtual character is more flexible and the operation is simple and easy.

Description

Virtual character control method and device, storage medium and terminal equipment
Technical Field
The present invention relates to the field of computers, and in particular, to a method and apparatus for controlling a virtual character, a computer readable storage medium, and a terminal device.
Background
In recent years, with the development and popularization of terminal device technology, more and more applications provide three-dimensional virtual environments, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, role-playing games (RPG), and the like.
In the prior art, taking FPS games as an example, a user needs to control the movement of a virtual character so that the character takes cover behind a virtual shelter. However, in existing games built around cover-based shooting, control of the virtual character is not flexible enough and cannot meet users' varied demands.
Disclosure of Invention
An embodiment of the present application provides a control method and device for a virtual character, which can improve the success rate with which the virtual character takes cover at a virtual shelter.
In order to solve the technical problems, the embodiment of the application provides the following technical scheme:
a method of controlling a virtual character, providing a graphical user interface through a terminal device, the graphical user interface comprising at least a portion of a virtual scene, a mobile rocker and a crouch control, and a virtual character and a virtual shelter located in the virtual scene, the method comprising:
in response to a first touch operation on the squat control, controlling the virtual character to squat behind a first surface of the virtual shelter;
in response to a second touch operation on the movement joystick toward a target direction, controlling the virtual character to move in the target direction behind the first surface;
and when the virtual character moves in the target direction to a boundary position of the first surface, in response to a third touch operation on the movement joystick, controlling the virtual character to switch from behind the first surface of the virtual shelter to squat behind a second surface of the virtual shelter, wherein the first surface is adjacent to the second surface.
A control device for a virtual character, wherein a graphical user interface is provided by a terminal device, the graphical user interface comprising at least a portion of a virtual scene, a movement joystick and a squat control, as well as a virtual character and a virtual shelter located in the virtual scene, the device comprising:
a first control module, configured to control the virtual character to squat behind a first surface of the virtual shelter in response to a first touch operation on the squat control;
a second control module, configured to control the virtual character to move in a target direction behind the first surface in response to a second touch operation on the movement joystick toward the target direction;
and a third control module, configured to, when the virtual character moves in the target direction to a boundary position of the first surface, control the virtual character to switch from behind the first surface of the virtual shelter to squat behind a second surface of the virtual shelter in response to a third touch operation on the movement joystick, wherein the first surface is adjacent to the second surface.
A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps in the method of controlling a virtual character described above.
A terminal device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps in the control method of the virtual character as described above when executing the program.
In the embodiment of the present application, the virtual character is controlled to squat behind the first surface of the virtual shelter in response to a first touch operation on the squat control; the virtual character is controlled to move in a target direction behind the first surface in response to a second touch operation on the movement joystick toward the target direction; and when the virtual character moves in the target direction to the boundary position of the first surface, the virtual character is controlled to switch from behind the first surface of the virtual shelter to squat behind the second surface of the virtual shelter in response to a third touch operation on the movement joystick, the first surface being adjacent to the second surface. In this way, a third touch operation on the movement joystick can control the virtual character to turn corners between surfaces of the virtual shelter while in the squat posture, without adding other controls, so that control of the virtual character is more flexible and the operation is simple and easy.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1a is a schematic system diagram of a method for controlling a virtual character according to an embodiment of the present application.
Fig. 1b is a schematic flow chart of a method for controlling a virtual character according to an embodiment of the present application.
Fig. 1c is a schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1d is another schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1e is another schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1f is another schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1g is another schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a virtual character control device according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
An embodiment of the present application provides a virtual character control method and device, a storage medium, and a terminal device. Specifically, the virtual character control method of this embodiment may be executed by a terminal device, which may be a terminal or a server. The terminal may be a device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and may further run a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, big data, and artificial intelligence platforms.
For example, when the control method of the virtual character is run on the terminal, the terminal device stores a game application and presents a part of a game scene in the game through the display component. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
For example, when the virtual character control method runs on a server, the game may be a cloud game. Cloud gaming refers to a game mode based on cloud computing. In the cloud game running mode, the body that runs the game application program is separated from the body that presents the game screen: storage and execution of the virtual character control method are completed on the cloud game server, while game screen presentation is completed at a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game screens; for example, it may be a display device with data transmission functions near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, but the terminal device that executes the virtual character control method is the cloud game server in the cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game screens, and returns the data to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game screens.
Referring to fig. 1a, fig. 1a is a schematic system diagram of a virtual character control method according to an embodiment of the present application. The system may include at least one terminal device 1000, at least one server 2000, at least one database 3000, and a network 4000. A terminal 1000 held by a user may be connected to servers of different games through the network 4000. Terminal device 1000 can be any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, terminal device 1000 can have one or more multi-touch-sensitive screens for sensing and obtaining input from a user through touch or slide operations performed at multiple points of one or more touch-sensitive display screens. When the system includes a plurality of terminal devices 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different servers 2000 via different networks 4000. The network 4000 may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, or a 2G, 3G, 4G, or 5G network. In addition, different terminal devices 1000 may be connected to other terminals or to a server using their own Bluetooth network or hotspot network. For example, a plurality of users may be online through different terminal devices 1000 so as to be connected through an appropriate network and synchronized with each other to support a multi-user game. In addition, the system may include a plurality of databases 3000 coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play multi-user games online.
An embodiment of the present application provides a virtual character control method, which can be executed by a terminal or a server. This embodiment is described taking the case where the method is executed by the terminal as an example. The terminal comprises a display component and a processor; the display component is used to present a graphical user interface and to receive operation instructions generated by the user acting on it. When the user operates the graphical user interface through the display component, the graphical user interface can control local content of the terminal in response to the received operation instructions, and can also control content of the opposite-end server in response to them. For example, the user-generated operation instructions include an instruction to launch the game application, and the processor is configured to launch the game application after receiving it. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. The touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously at a plurality of points on the screen. The user performs touch operations on the graphical user interface with a finger, and when the graphical user interface detects a touch operation, it controls the corresponding virtual characters in the game's graphical user interface to perform actions corresponding to that operation. The game may be, for example, any of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, a first-person shooter (FPS) game, and the like.
The game may comprise a virtual scene drawn on the graphical user interface. The virtual scene may include one or more virtual characters controlled by a user (or by several users). In addition, the virtual scene may include one or more obstacles, such as rails, ravines, or walls, to limit movement of the virtual character, for example limiting movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene also includes one or more elements, such as skills, scores, character health status, or energy, to provide assistance to the user, provide virtual services, or increase scores related to the user's performance. The graphical user interface may also present one or more indicators to provide indication information to the user. For example, a game may include a user-controlled virtual character and one or more other virtual characters (such as enemy characters). In one embodiment, one or more other virtual characters are controlled by other users of the game. Alternatively, one or more other virtual characters may be computer-controlled, such as by a bot using an artificial intelligence (AI) algorithm, to implement a human-versus-machine mode. The virtual character possesses various skills or capabilities that the game user uses to achieve a goal; for example, it may possess one or more weapons, props, or tools that can be used to eliminate other objects in the game. Such skills or capabilities may be activated by the user through one of a plurality of preset touch operations on the terminal's touch display screen. The processor may be configured to present a corresponding game screen in response to the operation instruction generated by the user's touch operation.
It should be noted that, the system schematic diagram of the virtual character control system shown in fig. 1a is only an example, and the virtual character control system and the scenario described in the embodiments of the present application are for more clearly describing the technical solution of the embodiments of the present application, and do not constitute a limitation on the technical solution provided in the embodiments of the present application, and as one of ordinary skill in the art can know, along with the evolution of the virtual character control system and the appearance of the new service scenario, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
This embodiment is described from the perspective of a virtual character control device, which may be integrated in a terminal device that has a storage unit, is equipped with a microprocessor, and has computing capability.
Referring to fig. 1b, fig. 1b is a schematic flow chart of a control method of a virtual character according to an embodiment of the present application. The control method of the virtual character comprises the following steps:
In step 101, in response to a first touch operation on the squat control, the virtual character is controlled to squat behind a first surface of a virtual shelter.
A graphical user interface is provided on the terminal device, and can be opened by the user launching a game application installed on the terminal device. The three-dimensional virtual scene is a virtual scene provided when the application runs on the terminal; it may be a simulation of the real world, a semi-simulated and semi-fictional scene, or a purely fictional scene. The scene picture displayed on the graphical user interface is the picture presented when the virtual character observes the three-dimensional virtual scene. The user controls the virtual character in the game scene through the terminal, and the virtual character observes the three-dimensional virtual scene through a camera model. Taking an FPS game as an example, when the virtual character is in the first-person view, the camera model is positioned at the head or neck of the target virtual character, and only the arms of the virtual character are displayed in the graphical user interface; in the third-person view, the camera model is behind the target virtual character, and the upper body of the virtual character may be displayed in the graphical user interface. The graphical user interface is thus the scene picture presented by observing the three-dimensional virtual scene through the camera model at a certain viewing angle.
Specifically, referring to fig. 1c, fig. 1c is a first schematic diagram of a graphical user interface according to an embodiment of the present application. The graphical user interface is presented on the screen of the terminal device 1000 and includes a virtual shelter 10, which is made up of a plurality of shelter surfaces; a virtual character 20 squatting behind a first surface 11 of the virtual shelter 10; a movement control 30; and a squat control 40. Through a first touch operation on the squat control 40, the user may switch the virtual character 20 from another posture, such as a standing, shooting, or running posture, to a squat posture behind the nearest surface of the virtual shelter 10. The first touch operation may be a click operation. For FPS-type games, the virtual character 20 may also be equipped with a virtual firearm for shooting, so the graphical user interface further includes an attack control 50 for controlling the firing of the virtual firearm. The graphical user interface also includes a cursor control 60 that shows the user the current direction of the virtual character 20, a map control 70 that shows the user the location of the virtual character 20 in the three-dimensional virtual environment, and the like. An indication control 61 is further disposed in the cursor control 60 to indicate the direction of the virtual character 20. It will be appreciated that the graphical user interface may include other functional controls or identifiers as determined by the specific game content, which is not limited herein.
In particular, for FPS-type games, the virtual shelter is an important component: by hiding behind the shelter, the virtual character can avoid attacks from other virtual characters, or launch surprise attacks on them.
In some embodiments, the method further comprises:
when the virtual character is in the squat posture, controlling the virtual character to switch from the squat posture to the standing posture in response to a touch operation on the squat control.
That is, when the virtual character is in the squat posture and a touch operation on the squat control is received again, the posture of the virtual character is controlled to switch from the squat posture back to the standing posture.
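The toggling behavior described above can be sketched as follows. This is a minimal, hypothetical illustration and not the patent's implementation; all names are illustrative.

```python
# Hypothetical sketch: toggling a character's posture between squat and
# standing on successive taps of the squat control.
from enum import Enum

class Posture(Enum):
    STAND = "stand"
    SQUAT = "squat"

class VirtualCharacter:
    def __init__(self):
        self.posture = Posture.STAND

    def on_squat_control_tap(self):
        # A first tap makes the character squat; a second tap returns
        # it to the standing posture.
        if self.posture is Posture.SQUAT:
            self.posture = Posture.STAND
        else:
            self.posture = Posture.SQUAT

c = VirtualCharacter()
c.on_squat_control_tap()
print(c.posture.value)  # squat
c.on_squat_control_tap()
print(c.posture.value)  # stand
```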
In some embodiments, the method further comprises:
when the virtual character is in a standing posture, controlling the virtual character to move in the virtual scene in response to a sliding operation on the movement joystick;
and displaying the squat control when the positions of the virtual character and the virtual shelter satisfy a preset positional relationship.
When the virtual character is in a standing posture, the user can control the virtual character to move in the virtual scene through a sliding operation on the movement joystick.
Specifically, the preset positional relationship may be that the distance between the virtual character and the virtual shelter is within a preset distance value. When the user controls the virtual character to move in the virtual scene through a sliding operation on the movement joystick, the squat control 40 may be hidden while the distance between the virtual character and the virtual shelter exceeds the preset distance value, and displayed again once the distance satisfies the preset distance value.
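The proximity check that shows or hides the squat control can be sketched as below. The threshold value and function names are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch: show the squat control only while the character is
# within a preset distance of the shelter.
import math

PRESET_DISTANCE = 3.0  # assumed threshold in scene units

def squat_control_visible(character_pos, shelter_pos, threshold=PRESET_DISTANCE):
    """Return True when the character/shelter positions satisfy the preset
    positional relationship, i.e. the character is close enough to take cover."""
    return math.dist(character_pos, shelter_pos) <= threshold

print(squat_control_visible((0.0, 0.0), (2.0, 0.0)))  # True
print(squat_control_visible((0.0, 0.0), (9.0, 0.0)))  # False
```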
In step 102, the virtual character is controlled to move in the target direction behind the first surface in response to a second touch operation on the movement joystick toward the target direction.
As shown in fig. 1d, fig. 1d is a second schematic diagram of the graphical user interface provided in the embodiment of the present application. When the virtual character 20 squats behind a shelter surface of the virtual shelter 10 and a second touch operation on the movement joystick is received, it is determined that the virtual character 20 can only move along the preset route L of the virtual shelter 10. Because movement is constrained to the preset route L, the second touch operation on the movement joystick can only point in the two opposite target directions of the preset route. For example, a sliding operation on the movement control 30 toward the right controls the virtual character 20 to move to the right behind the first surface 11.
In some embodiments, the target direction is a direction parallel to the first surface.
The target direction is parallel to the first surface 11: the target direction is one of the two opposite directions along the preset route, and the preset route L is parallel to the first surface 11.
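One plausible way to realize this constraint, sketched below purely as an assumption (the patent does not specify the math), is to project the joystick input onto the route direction so only the component parallel to the surface remains.

```python
# Hypothetical sketch: while squatting behind a shelter surface, joystick
# input is projected onto the preset route L, which runs parallel to that
# surface, so the character can only move in the two opposite directions
# along the route.
def project_onto_route(joystick_dx, joystick_dy, route_dir):
    """route_dir is a unit vector parallel to the first surface; the scalar
    projection keeps only the input component along the route."""
    s = joystick_dx * route_dir[0] + joystick_dy * route_dir[1]
    return (s * route_dir[0], s * route_dir[1])

# Route parallel to the x-axis: a diagonal drag becomes pure horizontal movement.
print(project_onto_route(1.0, 1.0, (1.0, 0.0)))  # (1.0, 0.0)
```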
In step 103, when the virtual character moves in the target direction to the boundary position of the first surface, the virtual character is controlled to switch from behind the first surface of the virtual shelter to squat behind the second surface of the virtual shelter in response to a third touch operation on the movement joystick, wherein the first surface is adjacent to the second surface.
As shown in fig. 1e, fig. 1e is a third schematic diagram of a graphical user interface according to an embodiment of the present application. When the virtual character 20 moves in the target direction to the boundary position l of the first surface 11, if the user performs the third touch operation on the movement control 30, the virtual character 20 may be controlled to switch from the first surface 11 of the virtual shelter 10 to squat behind the second surface 12 of the virtual shelter, wherein the first surface 11 is adjacent to the second surface 12.
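The corner-switch decision can be sketched as a lookup over adjacent surfaces. This is a hypothetical illustration; the adjacency table and function names are assumptions, not the patent's data structures.

```python
# Hypothetical sketch: when the character reaches the boundary of the current
# surface and the joystick is still pushed outward, cover switches to the
# adjacent surface.
ADJACENT = {"first_surface": "second_surface", "second_surface": "first_surface"}

def next_cover_surface(current, at_boundary, joystick_held):
    """Return the surface the character should squat behind after this input."""
    if at_boundary and joystick_held and current in ADJACENT:
        return ADJACENT[current]
    return current

print(next_cover_surface("first_surface", at_boundary=True, joystick_held=True))
# second_surface
print(next_cover_surface("first_surface", at_boundary=False, joystick_held=True))
# first_surface
```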
In some embodiments, the method further comprises:
and when the virtual character moves to the boundary position of the first surface in the target direction, responding to a third touch operation on the moving rocker, and displaying a corner progress mark on the graphical user interface for the duration of the third touch operation, wherein the corner progress mark represents the preset time length to be waited for the virtual character to switch from the first surface to the second surface, and the time length that has already elapsed.
Fig. 1f is a fourth schematic diagram of a graphical user interface according to an embodiment of the present application, as shown in fig. 1 f. The third touch operation may be a continuous operation; during its duration, the corner action of the virtual character 20 between different shelter surfaces of the virtual shelter 10 is displayed in the graphical user interface, and during the corner a corner progress mark 90 is displayed on the graphical user interface, where the corner progress mark represents the preset time length to be waited for the virtual character to switch from the first surface to the second surface, and the time length that has already elapsed.
In some embodiments, before the controlling the virtual character to switch from the first surface of the virtual shelter to squat behind the second surface of the virtual shelter, the method further comprises:
determining that the duration of the third touch operation reaches a preset duration.
The precondition for triggering the virtual character to switch to squat behind the second surface of the virtual shelter, that is, the corner process, may also be that the duration of the user's third touch operation on the movement control reaches a preset duration, for example 3 s.
In some embodiments, the method further comprises:
and when it is determined that the duration of the third touch operation reaches the preset duration, the corner progress mark is in a full state.
Referring to fig. 1f, the corner progress mark 90 may be an arrow-shaped mark in which a progress indication with a filling pattern is displayed. When the corner progress mark 90 is completely filled, it is in the full state, which means that the duration of the third touch operation has reached the preset duration, so that the corner switch of the virtual character between different shelter surfaces of the virtual shelter can be triggered.
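The filling behaviour of the corner progress mark can be modelled as a simple hold timer that reports a fill ratio and whether the full state has been reached. This is only an illustrative sketch; the class and attribute names (`CornerProgress`, `preset`, `elapsed`) are chosen here and do not come from the embodiment.

```python
class CornerProgress:
    """Tracks elapsed hold time against the preset wait duration."""

    def __init__(self, preset_seconds):
        self.preset = preset_seconds   # e.g. the 3 s example above
        self.elapsed = 0.0

    def update(self, dt):
        """Advance the timer while the third touch operation continues."""
        self.elapsed = min(self.elapsed + dt, self.preset)
        return self.fill_ratio()

    def fill_ratio(self):
        """Fraction of the arrow-shaped mark to fill, in [0, 1]."""
        return self.elapsed / self.preset

    def is_full(self):
        """Full state: the corner switch may now be triggered."""
        return self.elapsed >= self.preset
```

The UI would redraw the filling pattern from `fill_ratio()` each frame and trigger the corner switch once `is_full()` returns true.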
In some embodiments, the corner progress mark is an arrowed curved bar for indicating a movement trajectory of the virtual character switching from the first surface of the virtual shelter to the rear of the second surface of the virtual shelter.
Referring to fig. 1f, the corner progress mark 90 is a curved bar with an arrow, the curved bar is used for indicating a moving track of the virtual character from the first surface of the virtual shelter to the rear of the second surface of the virtual shelter, and the arrow is used for indicating a moving direction of the virtual character when moving along the moving track.
Specifically, as shown in fig. 1g, fig. 1g is a fifth schematic diagram of a graphical user interface provided in an embodiment of the present application. After the corner switch, the virtual character 20 is behind the second surface 12 and in a squat posture.
In some embodiments, the method further comprises:
and before the duration of the third touch operation reaches the preset duration, hiding the corner progress mark in response to the end of the third touch operation.
If the user ends the third touch operation on the movement control before its duration reaches the preset duration, this indicates that the user no longer wants the virtual character to turn the corner; the corner progress mark is therefore hidden, informing the user that the corner switch has been cancelled.
In some embodiments, the second touch operation is a sliding operation, the third touch operation is a long press operation, and the third touch operation and the second touch operation are continuous operations.
The second touch operation, which controls the virtual character to move in the target direction behind the first surface, is an operation on the movement control, and movement of the virtual character is generally controlled by sliding the movement control, so the second touch operation may be a sliding operation. The third touch operation is also an operation on the movement control, so to improve the continuity and simplicity of the user's operation, the second touch operation and the third touch operation may be set as continuous operations: the user performs the second touch operation by sliding the movement control, and performs the third touch operation by long pressing it. For example, the user slides the movement control with a finger to control the virtual character to move behind a surface of the virtual shelter; if the virtual character moves to the boundary position of a shelter surface during the slide, the user stops sliding without lifting the finger from the movement control, which constitutes the long press of the third touch operation and thereby triggers the virtual character to switch from that surface of the virtual shelter to squat behind another surface.
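The continuity between the sliding second touch operation and the long-press third touch operation described above can be sketched as a small state machine. This is an illustrative assumption rather than the embodiment's implementation; the state names, the method names, and the 3-second preset are all hypothetical.

```python
class MoveControlState:
    """Sketch of one finger's lifetime on the movement control:
    slide behind the surface, hold at the boundary, then either
    trigger the corner switch or cancel by lifting the finger."""

    SLIDING, HOLDING, SWITCHED, IDLE = "sliding", "holding", "switched", "idle"

    def __init__(self, preset_seconds=3.0):
        self.state = self.IDLE
        self.preset = preset_seconds
        self.held = 0.0
        self.mark_visible = False

    def on_slide(self, at_boundary):
        # Finger keeps touching: sliding becomes a long press at the boundary.
        if at_boundary:
            self.state = self.HOLDING
            self.mark_visible = True   # corner progress mark appears
        else:
            self.state = self.SLIDING
            self.held = 0.0

    def on_hold(self, dt):
        if self.state != self.HOLDING:
            return
        self.held += dt
        if self.held >= self.preset:   # full state reached
            self.state = self.SWITCHED # squat behind the second surface
            self.mark_visible = False

    def on_release(self):
        # Lifting the finger before the preset duration cancels the corner.
        if self.state == self.HOLDING:
            self.mark_visible = False  # hide the corner progress mark
        if self.state != self.SWITCHED:
            self.state = self.IDLE
```

Because the finger never leaves the control between `on_slide` and `on_hold`, the second and third touch operations form one continuous gesture, matching the single-control design the embodiment describes.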
In some embodiments, the second touch operation and the third touch operation are both sliding operations.
The third touch operation may also be the same sliding operation as the second touch operation; that is, when the virtual character moves to the boundary position of a shelter surface, the user continues sliding, which constitutes the third touch operation and controls the virtual character to switch from one surface of the virtual shelter to squat behind another surface.
In some embodiments, controlling the virtual character to switch from behind the first surface of the virtual shelter to squat behind the second surface of the virtual shelter comprises:
the virtual character is controlled to switch from behind the first surface of the virtual shelter to behind the second surface of the virtual shelter in a squat posture.
During the corner from the first surface of the virtual shelter to the rear of the second surface, the virtual character may turn while remaining in the squat posture; alternatively, the virtual character may rise from the squat posture behind the first surface to a standing posture, turn the corner while standing, and return to the squat posture upon reaching the rear of the second surface. The pose of the virtual character during the corner is not limited herein.
As can be seen from the above, in the embodiments of the present application, the virtual character is controlled to squat behind the first surface of the virtual shelter in response to the first touch operation on the squat control; in response to a second touch operation on the moving rocker toward a target direction, the virtual character is controlled to move in the target direction behind the first surface; and when the virtual character moves to the boundary position of the first surface in the target direction, in response to a third touch operation on the moving rocker, the virtual character is controlled to switch from the first surface of the virtual shelter to squat behind the second surface of the virtual shelter, the first surface being adjacent to the second surface. Thus, the third touch operation on the moving rocker can control the virtual character to turn corners between the surfaces of the virtual shelter in the squat state without adding other controls, so that control of the virtual character is more flexible and the operation is simple and easy.
In order to facilitate better implementation of the virtual character control method provided by the embodiments of the present application, the embodiments of the present application also provide a device based on the virtual character control method. The terms have the same meaning as in the virtual character control method above; for specific implementation details, refer to the description of the method embodiments.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a virtual character control device according to an embodiment of the present application, where the virtual character control device may include a first control module 301, a second control module 302, a third control module 303, and so on.
A first control module 301, configured to control the virtual character to squat behind the first surface of the virtual shelter in response to a first touch operation for the squat control;
a second control module 302, configured to control the virtual character to move in the target direction behind the first surface in response to a second touch operation for the moving rocker to the target direction;
and a third control module 303, configured to, when the virtual character moves to the boundary position of the first surface in the target direction, control the virtual character, in response to a third touch operation on the moving rocker, to switch from the first surface of the virtual shelter to squat behind the second surface of the virtual shelter, where the first surface is adjacent to the second surface.
In some embodiments, the apparatus further comprises:
and the first display module is used for responding to a third touch operation aiming at the movable rocker when the virtual character moves to the boundary position of the first surface towards the target direction, and displaying a corner progress mark on the graphical user interface during the duration of the third touch operation, wherein the corner progress mark is used for representing the preset time length and the elapsed time length which need to be waited after the virtual character is switched from the first surface to the second surface.
In some embodiments, the apparatus further comprises:
the determining module is used for determining that the third touch operation action duration reaches a preset duration.
In some embodiments, the apparatus further comprises:
and the second display module is used for displaying the corner progress mark in a full state when the third touch operation action duration reaches the preset duration.
In some embodiments, the corner progress mark is an arrowed curved bar for indicating a movement trajectory of the virtual character switching from the first surface of the virtual shelter to the rear of the second surface of the virtual shelter.
In some embodiments, the apparatus further comprises:
and the hiding module is used for responding to the end of the third touch operation and hiding the corner progress mark before the action duration of the third touch operation reaches the preset duration.
In some embodiments, the target direction is a direction parallel to the first surface.
In some embodiments, the second touch operation is a sliding operation, the third touch operation is a long press operation, and the third touch operation and the second touch operation are continuous operations.
In some embodiments, the second touch operation and the third touch operation are both sliding operations.
In some embodiments, the third control module 303 includes:
and the control sub-module is used for controlling the virtual character to switch from the rear of the first surface of the virtual shelter to the rear of the second surface of the virtual shelter in a squat posture.
In some embodiments, the apparatus further comprises:
and the fourth control module is used for responding to the touch operation on the squat control when the virtual character is in the squat gesture, and controlling the virtual character to switch from the squat gesture to the standing gesture.
In some embodiments, the apparatus further comprises:
a fifth control module for controlling the virtual character to move in the virtual scene in response to a sliding operation for the moving rocker when the virtual character is in a standing posture;
and a third display module, configured to display the squat control when the positions of the virtual character and the virtual shelter satisfy a preset positional relationship.
As can be seen from the foregoing, in the embodiment of the present application, the first control module 301 controls the virtual character to squat behind the first surface of the virtual shelter in response to the first touch operation on the squat control; the second control module 302 controls the virtual character to move in the target direction behind the first surface in response to a second touch operation on the moving rocker toward the target direction; and the third control module 303, when the virtual character moves to the boundary position of the first surface in the target direction, controls the virtual character to switch from the first surface of the virtual shelter to squat behind the second surface of the virtual shelter in response to the third touch operation on the moving rocker, where the first surface is adjacent to the second surface. Thus, the third touch operation on the moving rocker can control the virtual character to turn corners between the surfaces of the virtual shelter in the squat state without adding other controls, so that control of the virtual character is more flexible and the operation is simple and easy.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Correspondingly, an embodiment of the present application also provides a terminal device, which may be a terminal or a server; the terminal may be a smart phone, a tablet computer, a notebook computer, a touch screen, a game machine, a personal computer (PC, Personal Computer), a personal digital assistant (Personal Digital Assistant, PDA), or the like. As shown in fig. 3, fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application. The terminal device 400 comprises a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. It will be appreciated by those skilled in the art that the terminal device structure shown in the figure does not constitute a limitation of the terminal device, which may include more or fewer components than illustrated, combine certain components, or have a different arrangement of components.
The processor 401 is the control center of the terminal device 400. It connects the respective parts of the entire terminal device 400 using various interfaces and lines, and performs the various functions of the terminal device 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the terminal device 400 as a whole.
In the embodiment of the present application, the processor 401 in the terminal device 400 loads instructions corresponding to the processes of one or more application programs into the memory 402 and executes the application programs stored in the memory 402, so as to implement the following functions:
responding to a first touch operation on the squat control, and controlling the virtual character to squat behind the first surface of the virtual shelter; responding to a second touch operation on the moving rocker toward the target direction, and controlling the virtual character to move in the target direction behind the first surface; and when the virtual character moves to the boundary position of the first surface in the target direction, responding to a third touch operation on the moving rocker, and controlling the virtual character to switch from the first surface of the virtual shelter to squat behind the second surface of the virtual shelter, wherein the first surface is adjacent to the second surface.
In some embodiments, the processor is further configured to perform: when the virtual character moves to the boundary position of the first surface in the target direction, responding to a third touch operation on the moving rocker, and displaying a corner progress mark on the graphical user interface for the duration of the third touch operation, wherein the corner progress mark represents the preset time length to be waited for the virtual character to switch from the first surface to the second surface, and the time length that has already elapsed.
In some embodiments, the processor is further configured to perform: and determining that the third touch operation action duration reaches a preset duration.
In some embodiments, the processor is further configured to perform: and when the third touch operation action duration is determined to reach the preset duration, the corner progress mark is in a full state.
In some embodiments, the processor is further configured to perform: the corner progress mark is an arrowed curved bar for indicating a movement trajectory of the virtual character from the first surface of the virtual shelter to the rear of the second surface of the virtual shelter.
In some embodiments, the processor is further configured to perform: and before the action duration of the third touch operation reaches the preset duration, responding to the end of the third touch operation, and hiding the corner progress mark.
In some embodiments, the processor is further configured to perform: the target direction is a direction parallel to the first surface.
In some embodiments, the processor is further configured to perform: the second touch operation is a sliding operation, the third touch operation is a long-press operation, and the third touch operation and the second touch operation are continuous operations.
In some embodiments, the processor is further configured to perform: the second touch operation and the third touch operation are both sliding operations.
In some embodiments, the processor is further configured to perform: the virtual character is controlled to switch from behind the first surface of the virtual shelter to behind the second surface of the virtual shelter in a squat position.
In some embodiments, the processor is further configured to perform: when the virtual character is in a standing posture, responding to a sliding operation aiming at the movable rocker, and controlling the virtual character to move in the virtual scene;
and displaying the squat control when the positions of the virtual character and the virtual shelter meet the preset position relation.
In some embodiments, the processor is further configured to perform: and when the virtual character is in the squat gesture, controlling the virtual character to switch from the squat gesture to the standing gesture in response to the touch operation on the squat control.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 3, the terminal device 400 further includes: a touch display 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407, respectively. It will be appreciated by those skilled in the art that the terminal device structure shown in fig. 3 is not limiting of the terminal device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
The touch display 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, and the various graphical user interfaces of the terminal device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), an organic light-emitting diode (OLED, Organic Light-Emitting Diode), or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which execute corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401; it can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; upon detecting a touch operation on or near it, the touch panel passes the operation to the processor 401 to determine the type of touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of touch event.
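As an illustrative sketch of the event flow just described (detection device, touch controller, processor), the conversion of a raw touch signal into touch point coordinates and its dispatch by event type might look as follows; the data shapes and function names are assumptions, not part of the embodiment.

```python
def touch_controller(raw_signal):
    """Convert a raw detection-device signal into touch point coordinates.

    raw_signal is assumed to carry a normalised position (nx, ny in
    [0, 1]), the panel dimensions in pixels, and the event type.
    """
    x = raw_signal["nx"] * raw_signal["width"]
    y = raw_signal["ny"] * raw_signal["height"]
    return {"type": raw_signal["type"], "x": x, "y": y}

def processor_dispatch(event, handlers):
    """Route a touch event to the handler registered for its type and
    return the visual output that the display panel should show."""
    handler = handlers.get(event["type"])
    return handler(event) if handler else None
```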
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 403 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions respectively. That is, the touch display 403 may also implement an input function as part of the input unit 406.
In the embodiment of the application, the processor 401 executes the game application program to generate a graphical user interface on the touch display screen 403, where the virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 404 may be configured to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another terminal device and exchange signals with it.
The audio circuit 405 may be used to provide an audio interface between the user and the terminal device through a speaker and a microphone. On one hand, the audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, where it is converted into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which the audio circuit 405 receives and converts into audio data; the audio data is processed by the processor 401 and then sent via the radio frequency circuit 404 to, for example, another terminal device, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral earphone and the terminal device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to supply power to the various components of the terminal device 400. Alternatively, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. The power supply 407 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 3, the terminal device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which will not be described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the terminal device provided in this embodiment controls the virtual character to squat behind the first surface of the virtual shelter in response to the first touch operation on the squat control; controls the virtual character to move in the target direction behind the first surface in response to a second touch operation on the moving rocker toward the target direction; and when the virtual character moves to the boundary position of the first surface in the target direction, controls the virtual character, in response to a third touch operation on the moving rocker, to switch from the first surface of the virtual shelter to squat behind the second surface of the virtual shelter, wherein the first surface is adjacent to the second surface. Thus, the third touch operation on the moving rocker can control the virtual character to turn corners between the surfaces of the virtual shelter in the squat state without adding other controls, so that control of the virtual character is more flexible and the operation is simple and easy.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium in which a plurality of computer programs are stored; the computer programs can be loaded by a processor to perform the steps in any of the virtual character control methods provided by the embodiments of the present application. For example, the computer program may perform the following steps:
responding to a first touch operation on the squat control, and controlling the virtual character to squat behind the first surface of the virtual shelter; responding to a second touch operation on the moving rocker toward the target direction, and controlling the virtual character to move in the target direction behind the first surface; and when the virtual character moves to the boundary position of the first surface in the target direction, responding to a third touch operation on the moving rocker, and controlling the virtual character to switch from the first surface of the virtual shelter to squat behind the second surface of the virtual shelter, wherein the first surface is adjacent to the second surface.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
The storage medium may include: a read-only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, an optical disk, or the like.
Since the computer program stored in the storage medium can execute the steps in any virtual character control method provided in the embodiments of the present application, it can achieve the beneficial effects of any virtual character control method provided in the embodiments of the present application, as detailed in the previous embodiments and not repeated here.
The virtual character control method, device, storage medium and terminal device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope in light of the ideas of the present application. In view of the above, the contents of this description should not be construed as limiting the present application.

Claims (15)

1. A method of controlling a virtual character, characterized in that a graphical user interface is provided by a terminal device, said graphical user interface comprising at least a part of a virtual scene, a mobile rocker and a crouch control, and a virtual character and a virtual shelter located in said virtual scene, said method comprising:
responding to a first touch operation aiming at the squat control, and controlling the virtual character to squat behind the first surface of the virtual shelter;
responding to a second touch operation aiming at the moving rocker towards a target direction, and controlling the virtual character to move towards the target direction behind the first surface;
and when the virtual character moves to the boundary position of the first surface towards the target direction, responding to a third touch operation for the moving rocker, and controlling the virtual character to switch from the first surface of the virtual shelter to the rear of the second surface of the virtual shelter to squat, wherein the first surface is adjacent to the second surface.
2. The control method according to claim 1, characterized in that the method further comprises:
and when the virtual character moves to the boundary position of the first surface in the target direction, responding to a third touch operation on the moving rocker, and displaying a corner progress mark on the graphical user interface for the duration of the third touch operation, wherein the corner progress mark represents the preset time length to be waited for the virtual character to switch from the first surface to the second surface, and the time length that has already elapsed.
3. The control method of claim 2, wherein before the controlling the virtual character to switch from the first surface of the virtual shelter to the rear of the second surface of the virtual shelter to squat, the method further comprises:
and determining that the third touch operation action duration reaches a preset duration.
4. A control method according to claim 3, characterized in that the method further comprises:
and when the third touch operation action duration is determined to reach the preset duration, the corner progress mark is in a full state.
5. The control method of claim 2, wherein the corner progress mark is an arrowed curved bar for indicating a movement trajectory of the virtual character switching from the first surface of the virtual shelter to the rear of the second surface of the virtual shelter.
6. The control method according to claim 2, characterized in that the method further comprises:
and before the action duration of the third touch operation reaches the preset duration, responding to the end of the third touch operation, and hiding the corner progress mark.
7. The control method according to claim 1, characterized in that the target direction is a direction parallel to the first surface.
8. The control method according to claim 1, wherein the second touch operation is a sliding operation, the third touch operation is a long-press operation, and the third touch operation and the second touch operation are continuous operations.
9. The control method according to claim 1, wherein the second touch operation and the third touch operation are both sliding operations.
10. The control method of claim 1, wherein the controlling the virtual character to switch from behind the first surface of the virtual shelter to squat behind the second surface of the virtual shelter comprises:
controlling the virtual character to switch from behind the first surface of the virtual shelter to behind the second surface of the virtual shelter while remaining in the squat posture.
11. The control method according to claim 1, characterized in that the method further comprises:
and when the virtual character is in the squat posture, in response to a touch operation on the squat control, controlling the virtual character to switch from the squat posture to a standing posture.
12. The control method according to claim 1, characterized in that the method further comprises:
when the virtual character is in a standing posture, in response to a sliding operation on the moving rocker, controlling the virtual character to move in the virtual scene;
and displaying the squat control when the positions of the virtual character and the virtual shelter satisfy a preset positional relation.
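The "preset positional relation" in claim 12 is not specified further, so the following sketch assumes the simplest plausible form, a distance threshold between the character and the shelter; the function name and threshold value are hypothetical.

```python
import math

def should_show_squat_control(character_pos, shelter_pos, threshold=2.0):
    """Show the squat control when the character is within an assumed
    distance threshold (in scene units) of the virtual shelter."""
    dx = character_pos[0] - shelter_pos[0]
    dy = character_pos[1] - shelter_pos[1]
    return math.hypot(dx, dy) <= threshold
```

In a real game the relation might instead involve the shelter's bounding volume or facing direction; the distance check is only the minimal stand-in.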
13. A control device for a virtual character, characterized in that a graphical user interface is provided through a terminal device, the graphical user interface comprising at least part of a virtual scene, a moving rocker and a squat control, as well as a virtual character and a virtual shelter located in the virtual scene, the device comprising:
a first control module, configured to control the virtual character to squat behind a first surface of the virtual shelter in response to a first touch operation on the squat control;
a second control module, configured to control the virtual character to move in a target direction behind the first surface in response to a second touch operation on the moving rocker towards the target direction;
and a third control module, configured to, when the virtual character moves in the target direction to a boundary position of the first surface, control the virtual character to switch from behind the first surface of the virtual shelter to squat behind a second surface of the virtual shelter in response to a third touch operation on the moving rocker, wherein the first surface is adjacent to the second surface.
14. A computer readable storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor to perform the steps in the method of controlling a virtual character according to any one of claims 1 to 12.
15. A terminal device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps in the method of controlling a virtual character according to any one of claims 1 to 12 when executing the program.
CN202211097766.7A 2022-09-08 2022-09-08 Virtual character control method and device, storage medium and terminal equipment Pending CN115999153A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211097766.7A CN115999153A (en) 2022-09-08 2022-09-08 Virtual character control method and device, storage medium and terminal equipment
PCT/CN2023/079125 WO2024051116A1 (en) 2022-09-08 2023-03-01 Control method and apparatus for virtual character, and storage medium and terminal device

Publications (1)

Publication Number Publication Date
CN115999153A true CN115999153A (en) 2023-04-25

Family

ID=86018013

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211097766.7A Pending CN115999153A (en) 2022-09-08 2022-09-08 Virtual character control method and device, storage medium and terminal equipment

Country Status (2)

Country Link
CN (1) CN115999153A (en)
WO (1) WO2024051116A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024051116A1 (en) * 2022-09-08 2024-03-14 网易(杭州)网络有限公司 Control method and apparatus for virtual character, and storage medium and terminal device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008044708A1 (en) * 2006-10-13 2008-04-17 Kabushiki Kaisha Sega Doing Business As Sega Corporation Electronic play device, control method for electronic play device and game program
CN114225416A (en) * 2021-12-16 2022-03-25 网易(杭州)网络有限公司 Game control method and device
CN114404986A (en) * 2022-01-20 2022-04-29 网易(杭州)网络有限公司 Method and device for controlling player character, electronic device and storage medium
CN115999153A (en) * 2022-09-08 2023-04-25 网易(杭州)网络有限公司 Virtual character control method and device, storage medium and terminal equipment

Also Published As

Publication number Publication date
WO2024051116A1 (en) 2024-03-14

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination