WO2021213070A1 - Control method, apparatus, device, and storage medium for virtual character - Google Patents
Control method, apparatus, device, and storage medium for virtual character
- Publication number
- WO2021213070A1 (PCT/CN2021/080690)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- release
- skill
- control
- virtual character
- virtual environment
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/65—Methods for processing data by generating or executing the game program for computing the condition of a game character
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6607—Methods for processing data by generating or executing the game program for rendering three dimensional images for animating game characters, e.g. skeleton kinematics
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/807—Role playing or strategy games
Definitions
- the embodiments of the present application relate to the field of virtual environments, and in particular to a method, apparatus, device, and storage medium for controlling a virtual character.
- Battle games are games where multiple user accounts compete in the same virtual scene.
- the battle game may be a Multiplayer Online Battle Arena (MOBA) game, in which the user may control a virtual object to release skills, thereby attacking hostile virtual objects.
- the skill release includes at least two release modes: quick release and aiming release.
- quick release refers to releasing the skill in the current facing direction of the virtual object in the virtual environment after the user triggers the release control of the skill.
- the embodiments of the present application provide a method, apparatus, device, and storage medium for controlling a virtual character, which can improve the user's human-computer interaction efficiency during the skill release process.
- the technical solution is as follows:
- a method for controlling a virtual character is provided, the method being executed by a computer device, and the method including:
- displaying a virtual environment interface, the virtual environment interface including a screen for observing the virtual environment, the screen including a master virtual character located in the virtual environment;
- receiving a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, and the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment; and
- in response to the skill release operation and the movement control operation, controlling the master virtual character to release the directional skill in the second direction in the virtual environment.
- a virtual character control device is provided, the device is applied to computer equipment, and the device includes:
- a display module configured to display a virtual environment interface, the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment;
- the receiving module is configured to receive a skill release operation and a movement control operation, where the skill release operation is used to control the master virtual character to release a directional skill in a first direction in the virtual environment, and the movement control operation is used to control the master virtual character to move in a second direction in the virtual environment;
- the release module is configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release operation and the movement control operation.
- the virtual environment interface further includes a mobile control, and the movement control operation is a drag operation received on the mobile control;
- the receiving module is further configured to receive a drag operation on the mobile control
- the device further includes:
- the obtaining module is configured to obtain the drag direction of the drag operation from the presentation layer, and determine, according to the drag direction, the second direction in which the master virtual character moves.
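As a rough illustration (not from the patent), the mapping from a drag operation on the movement control to the second direction can be sketched in Python; the function name, the screen-coordinate convention, and the representation of a direction as a yaw angle in degrees are all assumptions for illustration:

```python
import math

def drag_direction(start_xy, end_xy):
    """Return the drag direction of a drag on the movement control as a
    yaw angle in degrees, measured counterclockwise from the screen's
    +x axis; this direction is then taken as the second direction in
    which the master virtual character moves."""
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    if dx == 0 and dy == 0:
        return None  # no effective drag, so no movement direction
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

A drag straight up from the joystick center, for instance, yields a second direction of 90 degrees under this convention.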
- the device further includes:
- a sending module configured to send a skill release data packet to the server, where the skill release data packet includes the second direction;
- the receiving module is further configured to receive a skill release feedback package sent by the server;
- the release module is further configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release feedback package.
- the sending module is configured to send a movement control data packet to the server in response to the movement control operation, and the movement control data packet includes the second direction;
- the receiving module is further configured to receive a mobile control feedback packet sent by the server;
- the device further includes:
- the movement module is configured to control the master virtual character to move in the second direction in the virtual environment in response to the movement control feedback packet.
- the device further includes:
- the buffer module is configured to buffer the second direction in the logic layer as the facing direction of the master virtual character in response to the movement control feedback packet.
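A minimal sketch of this send/feedback/buffer cycle (hypothetical Python; representing a packet as a plain dict with a `direction` field is an assumption for illustration, not part of the patent):

```python
class Client:
    """Sketch of the movement control round trip: the client sends a
    movement control data packet carrying the second direction, and on
    receiving the server's feedback packet it buffers the second
    direction in the logic layer as the character's facing direction."""

    def __init__(self, facing=0.0):
        self.logic_layer_facing = facing  # facing direction buffered in the logic layer
        self.outbox = []                  # packets queued for the server

    def on_movement_control(self, second_direction):
        # Send a movement control data packet that includes the second direction.
        packet = {"type": "movement_control", "direction": second_direction}
        self.outbox.append(packet)
        return packet

    def on_movement_feedback(self, feedback_packet):
        # Buffer the second direction as the facing direction in the logic layer.
        self.logic_layer_facing = feedback_packet["direction"]
        return self.logic_layer_facing
```

Until the feedback packet arrives, `logic_layer_facing` still holds the old direction, which is exactly the staleness the method works around.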
- the acquisition module is further configured to, in response to the skill release operation being received while the movement control operation is not received, acquire the facing direction of the master virtual character from the logic layer as the first direction;
- the release module is further configured to control the master virtual character to release the directional skill in the first direction in the virtual environment.
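The release-direction selection across these two cases can be sketched as follows (hypothetical Python; the function and parameter names are illustrative, not from the patent):

```python
def select_release_direction(movement_direction, logic_layer_facing):
    """Select the direction for releasing the directional skill.

    When a movement control operation accompanies the skill release
    operation, its second direction is used; when no movement control
    operation is received, the facing direction buffered in the logic
    layer is used as the first direction."""
    if movement_direction is not None:
        return movement_direction
    return logic_layer_facing
```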
- the virtual environment interface further includes a skill release control
- the receiving module is configured to receive a first trigger operation in the first area of the skill release control as the skill release operation.
- the receiving module is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area of the skill release control other than the first area, and to determine a release direction corresponding to the second trigger operation;
- the release module is further configured to control the master virtual character to release the directional skill in the release direction in the virtual environment.
- in another aspect, a computer device is provided, including a processor and a memory;
- the memory stores at least one instruction, at least one program, a code set, or an instruction set;
- the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for controlling a virtual character described above.
- in another aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set;
- the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method for controlling a virtual character described above.
- a computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
- the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the method for controlling a virtual character described above.
- the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction instead of the automatically selected first direction;
- this ensures that the directional skill is released in the adjusted facing direction of the master virtual character and improves the accuracy of the directional skill release; it also avoids the situation in which, because the skill was released in the wrong direction, the user must wait for the directional skill to cool down (that is, to recover for a period after release and re-enter the releasable state) and then operate again to re-release it, which makes human-computer interaction inefficient. Human-computer interaction efficiency is thereby improved, the computer device handles fewer erroneous operations, and the overall performance of the computer device is improved.
- FIGS. 1A and 1B are schematic diagrams of interfaces of the skill release process provided by an exemplary embodiment of the present application.
- Figure 2 is a schematic diagram of a quick release timeline of skills in the related art;
- Fig. 3 is a schematic diagram of a quick release timeline of skills provided by an exemplary embodiment of the present application.
- Fig. 4 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
- Fig. 5 is a flowchart of a method for controlling a virtual character provided by an exemplary embodiment of the present application
- FIG. 6 is a schematic diagram of an interface of skill release and movement control provided based on the embodiment shown in FIG. 5;
- Fig. 7 is a flowchart of a method for controlling a virtual character provided by another exemplary embodiment of the present application.
- FIG. 8 is a flow chart of the release process based on the skills provided by the embodiment shown in FIG. 7;
- FIG. 9 is a schematic diagram of an interface of skill release and movement control provided based on the embodiment shown in FIG. 7;
- Fig. 10 is a flowchart of a method for controlling a virtual character provided by another exemplary embodiment of the present application.
- FIG. 11 is a flowchart of the release process of skills provided based on the embodiment shown in FIG. 10;
- FIG. 12 is an overall flowchart of the skill release process provided by an exemplary embodiment of the present application.
- FIG. 13 shows a schematic diagram of a virtual environment interface for rapid release of directional skills provided by an exemplary embodiment of the present application
- Fig. 14 is a structural block diagram of a virtual character control device provided by an exemplary embodiment of the present application.
- Fig. 15 is a structural block diagram of a virtual character control device provided by another exemplary embodiment of the present application.
- Fig. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
- the virtual environment is the virtual environment displayed (or provided) when the application program runs on the terminal.
- the virtual environment may be a simulation environment of the real world, a semi-simulation and semi-fictional environment, or a purely fictitious environment.
- the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application.
- the virtual environment is a three-dimensional virtual environment as an example.
- the virtual environment is used to provide a combat environment for at least two master virtual characters.
- the virtual environment includes a symmetrical lower-left corner area and an upper-right corner area;
- the master virtual characters belonging to two opposing camps each occupy one of the areas, with the goal of victory being to destroy the target building, stronghold, base, or crystal deep in the opponent's area.
- Virtual characters refer to movable objects in a virtual environment.
- the movable object may be a virtual character, a virtual animal, an animation character, etc., such as a character or animal displayed in a three-dimensional virtual environment.
- the virtual character is a three-dimensional model created based on skeletal animation technology.
- Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment.
- the virtual character is a master virtual character controlled by a user as an example.
- the master virtual character generally refers to one or more master virtual characters in a virtual environment.
- a multiplayer online tactical competitive game refers to a game in which, in a virtual environment, different virtual teams belonging to at least two rival camps occupy their respective map areas and compete with a certain victory condition as the goal.
- the victory conditions include, but are not limited to, at least one of: occupying a stronghold or destroying the stronghold of the enemy camp, killing virtual characters of the enemy camp, ensuring one's own survival in a specified scene and time, grabbing a certain resource, and surpassing the opponent's score within a specified time.
- Tactical competition can be carried out in units of rounds, and the map of each round of tactical competition can be the same or different.
- Each virtual team includes one or more virtual characters, such as 1, 2, 3, or 5. The duration of a MOBA game is from the moment the game starts to the moment the victory condition is fulfilled.
- the method provided in this application can be applied to virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, MOBA games, and the like.
- the following embodiments are illustrated with application in a game as an example.
- a game based on a virtual environment consists of one or more game worlds.
- the virtual environment in the game can simulate real world scenes.
- the user can control the master virtual character in the game to walk, run, jump, shoot, fight, drive, release skills, be attacked by other virtual characters, take damage in the virtual environment, attack other virtual characters, and perform other actions in the virtual environment; the interaction is strong, and multiple users can team up online for competitive play.
- the master virtual character when the master virtual character releases skills in the virtual environment, it includes at least one of the following skills release methods:
- quick release refers to releasing the skill in the direction the virtual object faces in the virtual environment by triggering the skill release control during the skill release process;
- the skill release control corresponds to a first area and a second area, wherein when a first trigger operation in the first area is received, the directional skill is released in the virtual environment in the first direction, where the first direction is the facing direction of the virtual object, or the first direction is the direction corresponding to the position of an attack target within the skill release range.
- that is, the skill is released in the direction the virtual object faces in the virtual environment.
- the first trigger operation in the first area includes a touch operation on the skill release control whose end position is located in the first area; or, the first trigger operation in the first area includes a touch operation acting on the first area that does not move out of the first area.
- aiming release refers to releasing the skill in the adjusted direction after the direction of the skill release is adjusted through the skill release control during the skill release process;
- the direction of the skill release is determined according to the second trigger operation, and when the trigger operation ends, the skill is released in the direction of the skill release.
- the second trigger operation in the second area includes a touch operation that starts in the first area and whose end position is located in the second area; or, the second trigger operation in the second area includes a touch operation acting on the second area that does not move out of the second area.
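Assuming a circular skill release control with an inner first area and a surrounding second area (a simplification for illustration; the patent does not specify the areas' geometry), classifying a trigger operation by its end position can be sketched as:

```python
import math

def classify_trigger(center, touch_end, first_area_radius):
    """Classify a trigger operation on the skill release control by the
    end position of the touch: ending inside the inner first area means
    quick release; ending in the surrounding second area means aiming
    release, with the release direction given by the vector from the
    control center to the touch end position (yaw angle in degrees)."""
    dx = touch_end[0] - center[0]
    dy = touch_end[1] - center[1]
    if math.hypot(dx, dy) <= first_area_radius:
        return ("quick_release", None)
    aim = math.degrees(math.atan2(dy, dx)) % 360.0
    return ("aiming_release", aim)
```

A touch ending two units from the center of a control with a first-area radius of five is thus a quick release, while one ending ten units away is an aiming release in the corresponding direction.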
- FIGS. 1A and 1B are interface diagrams of the skill release process provided by an exemplary embodiment of the present application.
- the virtual environment interface 100 includes a skill release control 110, and the skill release control 110 includes a first area 111 and a second area 112; the skill release process based on the skill release control 110 in FIG. 1A is shown in FIG. 1B.
- in response to receiving the skill release operation in the first area 111, a virtual object 120 releases the skill in its facing direction in the virtual environment; in response to receiving the skill release operation in the second area 112, the release direction corresponding to the skill release operation is determined, and the skill is released in that release direction.
- the virtual environment interface also includes a mobile joystick for controlling the facing direction of the virtual object and controlling the movement of the virtual object in the virtual environment.
- the mobile joystick receives the control operation to change the facing direction of the virtual object
- the mobile terminal obtains the facing direction of the virtual object from the logic layer, and quickly releases the skill in the facing direction.
- however, when the control operation received by the mobile joystick has been uploaded to the server but no feedback message has been received yet, the facing direction acquired from the logic layer is still the direction before the adjustment;
- the release direction when the skill is released therefore differs from the adjusted facing direction, so the accuracy of the skill release direction is low.
- the skill release process includes the following processes:
- Step 201: The virtual object faces the first direction.
- Step 202: The mobile joystick is moved to control the virtual object to face the second direction.
- Step 203: The skill is triggered on the skill trigger control, and the facing direction (the first direction) of the virtual object is acquired.
- Step 204: The client sends a movement control packet to the server.
- Step 205: The client sends a skill release packet to the server.
- Step 206: The client receives the movement control feedback packet fed back by the server, and controls the virtual object to face the second direction.
- Step 207: The client receives the skill release feedback packet fed back by the server, and controls the virtual object to release the skill in the first direction.
- Step 208: The virtual object faces the second direction.
- the skill release process includes the following processes:
- Step 301: The virtual object faces the first direction.
- Step 302: The mobile joystick is moved to control the virtual object to face the second direction.
- Step 303: The skill is triggered on the skill trigger control, and the control direction (the second direction) of the mobile joystick is acquired.
- Step 304: The client sends a movement control packet to the server.
- Step 305: The client sends a skill release packet to the server.
- Step 306: The client receives the movement control feedback packet fed back by the server, and controls the virtual object to face the second direction.
- Step 307: The client receives the skill release feedback packet fed back by the server, and controls the virtual object to release the skill in the second direction.
- Step 308: The virtual object faces the second direction.
- the acquired release direction is the control direction received on the mobile control, that is, the direction the virtual object finally faces, thereby improving the accuracy of the skill release direction.
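Steps 303 through 307 can be condensed into a hypothetical Python sketch (the packet dict layout and event tuples are illustrative assumptions, not the patent's wire format):

```python
def quick_release_sequence(joystick_direction):
    """Simulate steps 303-307: at trigger time the client reads the
    joystick's control direction (the second direction) from the
    presentation layer and places it in both the movement control
    packet and the skill release packet, so the server's feedback makes
    the character face and release the skill in the second direction,
    regardless of when the movement feedback arrives."""
    movement_packet = {"type": "movement_control", "direction": joystick_direction}
    skill_packet = {"type": "skill_release", "direction": joystick_direction}
    # The server echoes both packets back as feedback; the client then
    # turns the character and releases the skill in the fed-back directions.
    return [
        ("face", movement_packet["direction"]),
        ("release", skill_packet["direction"]),
    ]
```

Because both packets carry the same second direction, the face and release events agree, which is the contrast with the related-art timeline of steps 201 through 208.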
- Fig. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
- the computer system 400 includes: a first terminal 420, a server 440, and a second terminal 460.
- the first terminal 420 installs and runs an application program supporting the virtual environment.
- the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, multiplayer gun battle survival games, and battle royale shooting games.
- the first terminal 420 is a terminal used by the first user.
- the first user uses the first terminal 420 to control the first master virtual character in the virtual environment to perform activities, including but not limited to: adjusting body posture, walking, running, At least one of jumping, releasing skills, picking up, attacking, and avoiding attacks from other virtual characters.
- the first master virtual character is a first virtual character, such as a simulated character or an animation character.
- the first master virtual character releases the regional skill in the virtual environment, and the virtual environment screen moves from the position where the master virtual character is located to the target area selected by the regional skill indicator.
- the regional skill indicator is used to indicate the skill release area when the master virtual character releases the skill.
- the first terminal 420 is connected to the server 440 through a wireless network or a wired network.
- the server 440 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
- the server 440 includes a processor 444 and a memory 442.
- the memory 442 further includes a receiving module 4421, a control module 4422, and a sending module 4423.
- the receiving module 4421 is used to receive a request sent by a client, such as a team-up request; the control module 4422 is used to control the rendering of the virtual environment picture; the sending module 4423 is used to send a message notification to the client, such as a team-up success notification.
- the server 440 is used to provide background services for applications supporting the three-dimensional virtual environment.
- the server 440 is responsible for the main calculation work, and the first terminal 420 and the second terminal 460 are responsible for the secondary calculation work; or, the server 440 is responsible for the secondary calculation work, and the first terminal 420 and the second terminal 460 are responsible for the main calculation work; Alternatively, the server 440, the first terminal 420, and the second terminal 460 adopt a distributed computing architecture to perform collaborative computing.
- the second terminal 460 is connected to the server 440 through a wireless network or a wired network.
- the second terminal 460 installs and runs an application program supporting the virtual environment.
- the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, multiplayer gun battle survival games, and battle royale shooting games.
- the second terminal 460 is a terminal used by the second user.
- the second user uses the second terminal 460 to control the second master virtual character in the virtual environment to perform activities, including but not limited to: adjusting body posture, walking, running, At least one of jumping, releasing skills, picking up, attacking, and avoiding attacks from other master virtual characters.
- the second master virtual character is a second virtual character, such as a simulation character or an animation character.
- the first master virtual character and the second master virtual character are in the same virtual environment.
- the first master virtual character and the second master virtual character may belong to the same team or the same organization, have a friendship relationship, or have temporary communication permissions.
- the applications installed on the first terminal 420 and the second terminal 460 are the same, or the applications installed on the two terminals are the same type of application on different control system platforms.
- the first terminal 420 may generally refer to one of a plurality of terminals
- the second terminal 460 may generally refer to one of a plurality of terminals. This embodiment only uses the first terminal 420 and the second terminal 460 as examples.
- the device types of the first terminal 420 and the second terminal 460 are the same or different.
- the device types include at least one of: a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, and a desktop computer.
- the terminal includes a smart phone as an example.
- the number of the aforementioned terminals may be larger or smaller; for example, there may be only one terminal, or there may be dozens, hundreds, or more terminals.
- the embodiments of the present application do not limit the number of terminals and device types.
- FIG. 5 is a flowchart of a method for controlling a virtual character provided by an exemplary embodiment of the present application.
- the method may be executed by a computer device, and the computer device may be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in FIG. 4, or another terminal in the computer system 400. As shown in FIG. 5, the method includes:
- Step 501 Display a virtual environment interface.
- the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment.
- the virtual environment interface further includes a release control for controlling the master virtual character to release directional skills, and a movement control for controlling the master virtual character to move in the virtual environment.
- the screen includes the master virtual character located in the virtual environment.
- the terminal used by the user runs an application that supports the virtual environment.
- when the application is in use, the screen of the terminal displays the user interface corresponding to the application, that is, the virtual environment interface.
- the virtual environment shown in the displayed picture includes at least one of the following elements: mountains, flatlands, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles.
- the virtual environment is a virtual environment with an arbitrary boundary shape, for example, the virtual environment is a rhombus.
- the user can browse the full picture of the virtual environment by viewing the map corresponding to the virtual environment.
- a camera model is set in the virtual environment, and the camera model is used to observe the virtual environment from different perspectives, so as to obtain the virtual environment picture.
- the angle of view refers to the observation angle when the virtual environment is observed from the first-person or third-person perspective of the virtual character.
- a release control for controlling the master virtual character to release a directional skill is displayed in the virtual environment interface, where the directional skill corresponds to a skill release direction; that is, when the skill is released, the directional skill needs to be released in a specified direction.
- the skill release direction includes at least one of the following two situations:
- first, in the process of quickly releasing the directional skill, the master virtual character is controlled to release the directional skill in a first direction.
- for example, the facing direction of the virtual object in the virtual environment is used as the release direction of the directional skill;
- or, the direction corresponding to an attack target is used as the release direction of the directional skill;
- second, the release direction of the directional skill is adjusted through a release adjustment operation on the release control, and when the release is triggered, the directional skill is released in the adjusted release direction.
- the virtual environment interface also displays a movement control for controlling the movement of the master virtual character, where the movement control can also be used to adjust the movement direction of the master virtual character while it is being moved.
- the user can adjust the facing direction of the master virtual character through the movement control, and control the master virtual character to move in the facing direction in the virtual environment.
- the quick release method can improve the efficiency of skill release when a directional skill is released: generally, when releasing the skill, the user quickly adjusts the facing direction of the master virtual character through the movement control, and then quickly releases the directional skill in an accurate direction through the skill release operation of the directional skill.
- Step 502 Receive a skill release operation and a movement control operation.
- the skill release operation is used to control the master virtual character to release a directional skill in a first direction in the virtual environment; the movement control operation is used to control the master virtual character to move in a second direction in the virtual environment, and the first direction and the second direction are independent of each other.
- the skill release operation corresponds to the quick release of the directional skills described above, that is, the master virtual character is controlled through the skill release operation to release the directional skills in a quick release manner.
- the first direction is the direction automatically selected by the client when the directional skill is released. For example, when there is an attack target within a preset range around the master virtual character, the direction corresponding to the location of the attack target is taken as the first direction; when there is no attack target within the preset range around the master virtual character, the facing direction of the master virtual character is taken as the first direction.
- the embodiment of the present application takes the facing direction of the master virtual character as the first direction as an example for description.
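- as an illustration of the automatic selection described above, the following Python sketch (not part of the embodiment; the function name, the nearest-target tie-break, and the unit-vector representation are assumptions) picks the first direction from a preset-range check:

```python
import math

def select_first_direction(hero_pos, hero_facing, targets, preset_range):
    """Pick the automatic release direction for a quick-released directional skill.

    If an attack target is within the preset range around the master virtual
    character, the direction toward the nearest such target is used; otherwise
    the character's current facing direction is used.
    Directions are unit vectors (dx, dy).
    """
    in_range = []
    for t in targets:
        dx, dy = t[0] - hero_pos[0], t[1] - hero_pos[1]
        dist = math.hypot(dx, dy)
        if 0 < dist <= preset_range:
            in_range.append((dist, dx / dist, dy / dist))
    if in_range:
        _, dx, dy = min(in_range)  # nearest target wins (assumed tie-break)
        return (dx, dy)
    return hero_facing

# Example: the only target is out of range -> fall back to facing direction
print(select_first_direction((0, 0), (1.0, 0.0), [(100, 0)], preset_range=10))
```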
- the method for acquiring the facing direction includes directly acquiring, from the logic layer, the current orientation of the master virtual character in the virtual environment as the facing direction of the master virtual character; however, because direction adjustments reach the logic layer with a delay, the facing direction obtained from the logic layer may be inaccurate, resulting in an inaccurate skill release direction.
- the virtual environment interface 600 includes a master virtual character 610, a skill release control 620, and a movement control 630.
- the movement control 630 receives a drag operation toward the lower right and controls the master virtual character 610 to face the lower right and move in the corresponding direction; meanwhile, a quick release operation is received on the skill release control 620 while the master virtual character 610 still faces the lower left. Therefore, the skill release direction included in the skill release request sent by the terminal to the server corresponds to the lower left, and the skill is released toward the lower left, which differs from the direction controlled on the movement control 630.
- to avoid this, the second direction corresponding to the movement control operation is used instead of the first direction to release the directional skill; that is, the second direction corresponding to the movement control operation is obtained from the presentation layer as the facing direction of the master virtual character.
- the virtual environment interface also includes a movement control.
- the movement control operation is a drag operation received on the movement control. After the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the second direction in which the master virtual character moves is determined according to the drag direction.
- that is, the movement control operation is an operation triggered based on the movement control.
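- the conversion from a drag on the movement control into the second direction can be sketched as below; this is a hypothetical illustration, and the dead-zone threshold and the joystick-center representation are assumptions, not taken from the embodiment:

```python
import math

def drag_to_second_direction(joystick_center, touch_point, dead_zone=8.0):
    """Convert a drag on the movement control into the second direction.

    The presentation layer reads the raw touch point; the drag vector from the
    joystick center is normalized into a unit direction. Drags inside a small
    dead zone are ignored (return None) to filter accidental touches.
    """
    dx = touch_point[0] - joystick_center[0]
    dy = touch_point[1] - joystick_center[1]
    length = math.hypot(dx, dy)
    if length < dead_zone:
        return None
    return (dx / length, dy / length)

# A drag 40 px to the right of the joystick center
print(drag_to_second_direction((100, 100), (140, 100)))  # (1.0, 0.0)
```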
- the presentation layer is used to implement interface performance and receive interface operations.
- the presentation layer is used to display the screen corresponding to the virtual environment in the virtual environment interface and the controls used to control the main virtual character or the game process.
- the presentation layer is also used to receive touch operations on the virtual environment interface and report the touch operations to the logic layer through the server for logical processing.
- both the presentation layer and the logic layer exist in the game client, where the logic layer cannot directly access the data in the presentation layer, while the presentation layer can access the data in the logic layer but cannot execute the logic of the logic layer.
- therefore, it is necessary to perform logical processing in the logic layer, through the server, according to the received touch operation.
- the user performs a touch operation on the movement control in the virtual environment interface, so that the touch data is read by the presentation layer and a movement touch message is generated.
- the client sends the movement touch message to the server.
- the logic layer adjusts the facing direction of the master virtual character according to the movement feedback message.
- the presentation layer reads the adjusted facing direction from the logic layer and renders it, thereby realizing control of the master virtual character.
- the virtual environment interface also includes a skill release control.
- the skill release control is used to control the master virtual character to release directional skills.
- the skill release control corresponds to a first area and a second area, where the first area is used to trigger the quick release of the directional skill, and the second area is used to trigger the aimed release of the directional skill.
- in response to receiving a first trigger operation in the first area of the skill release control, it is determined that the skill release operation is received; that is, when the first trigger operation is received in the first area of the skill release control, the directional skill is quickly released.
- in response to receiving a second trigger operation in the second area of the skill release control, the release direction corresponding to the second trigger operation is determined, and the master virtual character is controlled to release the directional skill in the corresponding release direction in the virtual environment, where the second area is the area of the skill release control other than the first area.
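- a possible hit test for the two areas of the skill release control is sketched below; the concentric-circle layout (inner disc for quick release, outer ring for aimed release) is an assumption for illustration, as the embodiment does not specify the control's geometry:

```python
import math

def classify_release_touch(control_center, control_radius, inner_radius, touch):
    """Classify a touch on the skill release control.

    Assumed layout (illustrative only): the control is a circle; the inner
    disc is the first area (quick release) and the remaining ring is the
    second area (aimed release). Touches outside the control are ignored.
    """
    d = math.hypot(touch[0] - control_center[0], touch[1] - control_center[1])
    if d <= inner_radius:
        return "quick_release"   # first trigger operation, first area
    if d <= control_radius:
        return "aimed_release"   # second trigger operation, second area
    return "outside"

print(classify_release_touch((0, 0), 60, 25, (10, 10)))  # quick_release
```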
- Step 503 In response to the skill release operation and the movement control operation, control the master virtual character to release the directional skill in the second direction in the virtual environment.
- the second direction is the direction corresponding to the movement control operation, that is, the facing direction obtained after the user controls the master virtual character to adjust its facing direction, so that the release direction of the directional skill is consistent with the movement control direction.
- the skill release control is usually configured so that, after the skill is released once based on the skill release control, a specified period of time must elapse before the next release can be performed based on the skill release control; that is, after the skill is released, it can be released again only after the skill cools down. Therefore, the virtual character control method provided by the embodiments of the present application can reduce the skill release time wasted by releasing a directional skill in the wrong direction, thereby improving the efficiency of human-computer interaction.
- in summary, in the virtual character control method provided in this embodiment, when a directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction instead of the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character and improves the release accuracy of the directional skill. It also avoids the low human-computer interaction efficiency caused when, after a release in the wrong direction, the user must wait for the directional skill to cool down (that is, to recover for a period of time after release and become releasable again) and re-release it through a repeated operation. This improves the efficiency of human-computer interaction, reduces the erroneous operations that the computer device needs to handle, and thereby improves the overall performance of the computer device.
- FIG. 7 is a flowchart of a virtual character control method provided by another exemplary embodiment of the present application.
- the method may be executed by a computer device, and the computer device may be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in FIG. 4, or another terminal in the computer system 400. As shown in FIG. 7, the method includes:
- Step 701 Display a virtual environment interface.
- the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment.
- a release control for controlling the master virtual character to release directional skills and a movement control for controlling the master virtual character to move in the virtual environment are superimposed and displayed on the screen.
- when the directional skill is quickly released, the first direction is used as the release direction of the skill; that is, in the quick release process of the directional skill, the facing direction of the master virtual character in the virtual environment is acquired, and the directional skill is released in that facing direction.
- Step 702 Receive a skill release operation and a movement control operation.
- the skill release operation is used to control the master virtual character to release a directional skill in a first direction in the virtual environment; the movement control operation is used to control the master virtual character to move in a second direction in the virtual environment, and the first direction and the second direction are independent of each other.
- the skill release operation corresponds to the quick release of the directional skills described above, that is, the master virtual character is controlled through the skill release operation to release the directional skills in a quick release manner.
- the movement control operation is an operation triggered by a movement control.
- the second direction corresponding to the movement control operation is obtained from the presentation layer, and the presentation layer is used to implement interface performance and receive interface operations.
- the movement control operation is implemented by a drag operation on the movement control; that is, the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the second direction in which the master virtual character moves is determined according to the drag direction.
- Step 703 Send a skill release data packet to the server, where the skill release data packet includes the second direction.
- the presentation layer can access the data of the logic layer but cannot modify the logic of the logic layer, that is, it cannot control the logic layer to perform logical processing. Therefore, after obtaining the second direction, the presentation layer sends a skill release data packet to the server, where the skill release data packet includes the second direction as the release direction of the directional skill.
- Step 704 Receive a skill release feedback packet sent by the server.
- the skill release feedback packet sent by the server is received through the logic layer, and the logic layer performs logical processing according to the skill release feedback packet.
- Step 705 In response to the skill release feedback packet, control the master virtual character to release the directional skill in the second direction in the virtual environment.
- the logic layer controls the master virtual character to release the directional skill in the second direction in the virtual environment according to the control data in the skill release feedback packet.
- the second direction is the direction corresponding to the movement control operation received on the movement control, that is, the facing direction obtained after the user controls the master virtual character to adjust its facing direction, so that the release direction of the directional skill is consistent with the movement control direction.
- the skill release logic is described below in conjunction with the user, the presentation layer, the server, and the logic layer.
- the presentation layer 820 is triggered to send a skill release data packet to the server 830, where the skill release data packet includes the movement control direction.
- after the server 830 receives the skill release data packet, it sends a skill release feedback packet to the logic layer 840.
- the logic layer 840 performs logical processing based on the skill release feedback packet and sends the skill release status to the presentation layer 820 to instruct the presentation layer 820 to display the skill release process.
- the presentation layer 820 obtains the skill release status from the logic layer 840 and displays the skill release process.
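- the packet flow among the presentation layer 820, the server 830, and the logic layer 840 can be mimicked with the toy classes below; all class and field names are hypothetical, and the real client-server exchange is asynchronous over a network rather than a direct call:

```python
from dataclasses import dataclass

@dataclass
class SkillReleasePacket:        # presentation layer -> server
    skill_id: int
    release_direction: tuple     # the second direction read from the move joystick

@dataclass
class SkillReleaseFeedback:      # server -> logic layer
    skill_id: int
    release_direction: tuple
    approved: bool

class Server:
    """Toy server that validates and echoes the release request."""
    def handle(self, pkt: SkillReleasePacket) -> SkillReleaseFeedback:
        return SkillReleaseFeedback(pkt.skill_id, pkt.release_direction, True)

class LogicLayer:
    def __init__(self):
        self.skill_state = None
    def on_feedback(self, fb: SkillReleaseFeedback):
        if fb.approved:
            # logical processing: record that the skill is being released
            self.skill_state = ("releasing", fb.skill_id, fb.release_direction)

class PresentationLayer:
    def __init__(self, server, logic):
        self.server, self.logic = server, logic
    def trigger_skill(self, skill_id, move_direction):
        # the presentation layer owns the freshest movement direction
        fb = self.server.handle(SkillReleasePacket(skill_id, move_direction))
        self.logic.on_feedback(fb)
        return self.logic.skill_state  # read state back for display

logic = LogicLayer()
ui = PresentationLayer(Server(), logic)
print(ui.trigger_skill(3, (0.0, 1.0)))  # ('releasing', 3, (0.0, 1.0))
```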
- the virtual environment interface 900 includes a master virtual character 910, a movement joystick 920, and a directional skill trigger control 930, where the master virtual character 910 faces a first direction in the virtual environment; a movement control operation is received on the movement joystick 920, and the master virtual character 910 is controlled to face a second direction in the virtual environment.
- the client receives the trigger operation on the trigger control 930, reads the movement control operation on the movement joystick 920 from the presentation layer, and releases the directional skill in the second direction.
- in summary, in the method provided in this embodiment, when a directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction instead of the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character and improves the release accuracy of the directional skill. It also avoids the low human-computer interaction efficiency caused when, after a release in the wrong direction, the user must wait for the directional skill to cool down (that is, to recover for a period of time after release and become releasable again) and re-release it through a repeated operation. This improves the efficiency of human-computer interaction, reduces the erroneous operations that the computer device needs to handle, and thereby improves the overall performance of the computer device.
- in addition, the method provided in this embodiment sends a skill release data packet to the server through the presentation layer of the terminal, and the server feeds back a skill release feedback packet to the logic layer of the terminal, thereby realizing the release of the skill from the logic layer; at the same time, the second direction obtained by the presentation layer is used to control the master virtual character to release the directional skill, which improves the release accuracy of the directional skill.
- FIG. 10 is a flowchart of a method for controlling a virtual character provided by another exemplary embodiment of the present application.
- the method may be executed by a computer device, and the computer device may be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in FIG. 4, or another terminal in the computer system 400.
- as shown in FIG. 10, the method includes:
- Step 1001 Display a virtual environment interface.
- the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment.
- a release control for controlling the master virtual character to release directional skills and a movement control for controlling the master virtual character to move in the virtual environment are superimposed and displayed on the screen.
- when the directional skill is quickly released, the corresponding first direction is used as the release direction of the skill; that is, the facing direction of the master virtual character in the virtual environment is acquired, and the directional skill is released in that facing direction.
- Step 1002 Receive a skill release operation, and the skill release operation is used to control the master virtual character to release the directional skill in the first direction.
- the skill release operation corresponds to the quick release of the above-mentioned directional skill; that is, the master virtual character is controlled by the skill release operation to release the directional skill in the quick release mode, and the client releases the directional skill in the facing direction after acquiring the facing direction of the master virtual character.
- Step 1003 Receive a movement control operation, where the movement control operation is used to control the master virtual character to move in the second direction in the virtual environment.
- the movement control operation is implemented by a drag operation on the movement control; that is, the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the second direction corresponding to the drag direction is determined.
- in response to the movement control operation, a movement control data packet is sent to the server, where the movement control data packet includes the second direction; a movement control feedback packet sent by the server is then received, and in response to the movement control feedback packet, the master virtual character is controlled to move facing the second direction in the virtual environment.
- the second direction is cached in the logic layer as the facing direction of the master virtual character.
- step 1002 may be executed first and then step 1003, step 1003 may be executed first and then step 1002, or step 1002 and step 1003 may be executed at the same time; the execution order of step 1002 and step 1003 is not limited in this embodiment.
- Step 1004 Control the master virtual character to release directional skills in the second direction in the virtual environment.
- the presentation layer sends a skill release data packet to the server.
- the skill release data packet includes the second direction.
- the logic layer receives the skill release feedback packet fed back by the server, and controls the master virtual character to release the skill in the second direction in the virtual environment according to the skill release feedback packet.
- Step 1005 In response to the skill release operation when no movement control operation is received, obtain the facing direction of the master virtual character from the logic layer as the first direction.
- that is, when the facing direction of the master virtual character has not been adjusted, the current facing direction of the master virtual character is used as the first direction in which the skill is released.
- Step 1006 Control the master virtual character to release the directional skill in the first direction in the virtual environment.
- the logic layer receives the skill release feedback packet fed back by the server, and controls the master virtual character to release the skill in the first direction in the virtual environment according to the skill release feedback packet.
- Step 1101 Determine whether the presentation layer has received an input operation on the movement joystick; if so, perform step 1102; otherwise, perform step 1103.
- Step 1102 Use the operation direction of the movement joystick input received by the presentation layer as the skill release direction.
- Step 1103 Use the facing direction of the master virtual character cached in the logic layer as the skill release direction.
- in summary, the method provided in this embodiment determines, when the skill release operation is received, whether a movement control operation has been received on the movement control; when the movement control operation is received, the directional skill is released in the movement control direction held by the presentation layer, and when no movement control operation is received, the directional skill is released in the facing direction cached in the logic layer, so that an accurate release direction of the directional skill is determined during the release process and the release accuracy of the directional skill is improved.
- FIG. 12 shows an overall flowchart of the skill release process provided by an exemplary embodiment of the present application. As shown in FIG. 12, the process includes:
- Step 1201 Receive a skill release operation.
- the skill release operation is used to control the master virtual character to release the directional skills in the virtual environment.
- Step 1202 Determine whether there is a skill joystick orientation; if yes, perform step 1203; otherwise, perform step 1204.
- the orientation of the skill joystick is used to distinguish the release modes of the directional skills, where the release modes include quick release and aiming release.
- when the skill joystick orientation exists, it means that the current release mode of the directional skill is aimed release; when it does not exist, the current release mode of the directional skill is quick release.
- Step 1203 Use the skill joystick orientation as the skill release direction.
- Step 1204 Determine whether there is a movement joystick orientation; if yes, perform step 1205; otherwise, perform step 1206.
- when the skill joystick orientation does not exist, the current directional skill release mode is quick release, and it is then further determined whether the facing direction of the master virtual character needs to be adjusted through the movement joystick during the quick release process.
- Step 1205 Use the movement joystick orientation as the skill release direction.
- that is, when the movement joystick orientation exists, the direction of the movement joystick is used as the skill release direction.
- Step 1206 Use the character orientation as the skill release direction.
- Step 1207 Release the directional skill.
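- the decision flow of steps 1202 to 1206 reduces to a three-way priority when resolving the release direction, which can be sketched as follows (function and parameter names are illustrative, not from the embodiment):

```python
def resolve_release_direction(skill_joystick_dir, move_joystick_dir, facing_dir):
    """Resolve the skill release direction per the flow of FIG. 12.

    Priority: an aimed skill-joystick direction wins; during quick release the
    movement joystick direction (if any) wins; otherwise the cached character
    facing direction is used.
    """
    if skill_joystick_dir is not None:   # step 1203: aimed release
        return skill_joystick_dir
    if move_joystick_dir is not None:    # step 1205: quick release while moving
        return move_joystick_dir
    return facing_dir                    # step 1206: default to character facing

print(resolve_release_direction(None, (-1.0, 0.0), (1.0, 0.0)))  # (-1.0, 0.0)
```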
- FIG. 13 shows a schematic diagram of the virtual environment interface for the quick release of the directional skill provided by an exemplary embodiment of the present application.
- the virtual environment interface includes a master virtual character 1310, a movement joystick 1320, and a directional skill trigger control 1330.
- the master virtual character 1310 faces right in the virtual environment (the first direction), and the terminal receives the user's movement control operation on the movement joystick 1320 and controls the master virtual character 1310 to move to the left.
- the orientation of the master virtual character in the virtual environment is changed to face left (second direction).
- a trigger operation by the user on the directional skill trigger control 1330 is also received.
- the directional skill is expected to be released in the changed direction (the second direction); however, during the user's operation, it takes a certain time for the master virtual character to switch direction, and if the user releases the directional skill in the quick release mode, the skill may be released before the turn is completed. Therefore, the master virtual character ignores the current orientation and releases the directional skill directly in the direction corresponding to the user's movement control operation, so that when the directional skill is quickly released, interaction efficiency is ensured and the accuracy of the skill release is also improved.
- in summary, in the method provided in this embodiment, when a directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction instead of the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character and improves the release accuracy of the directional skill. It also avoids the low human-computer interaction efficiency caused when, after a release in the wrong direction, the user must wait for the directional skill to cool down (that is, to recover for a period of time after release and become releasable again) and re-release it through a repeated operation. This improves the efficiency of human-computer interaction, reduces the erroneous operations that the computer device needs to handle, and thereby improves the overall performance of the computer device.
- Fig. 14 is a structural block diagram of a virtual character control device provided by an exemplary embodiment of the present application. As shown in Fig. 14, the device includes:
- the display module 1410 is configured to display a virtual environment interface, the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment;
- the receiving module 1420 is configured to receive a skill release operation and a movement control operation.
- the skill release operation is used to control the master virtual character to release directional skills in the virtual environment in a first direction
- the movement control operation is used to control the master virtual character to move in the second direction in the virtual environment;
- the release module 1430 is configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release operation and the movement control operation.
- the virtual environment interface further includes a movement control, and the movement control operation is a drag operation received on the movement control;
- the receiving module 1420 is further configured to receive the drag operation on the movement control;
- the device further includes:
- the obtaining module 1440 is configured to obtain the drag direction of the drag operation from the presentation layer; and determine the corresponding second direction when the master virtual character moves according to the drag direction.
- the device further includes:
- a sending module 1450 configured to send a skill release data packet to the server, where the skill release data packet includes the second direction;
- the receiving module 1420 is further configured to receive a skill release feedback packet sent by the server;
- the release module 1430 is further configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release feedback packet.
- the sending module 1450 is configured to send a movement control data packet to the server in response to the movement control operation, the movement control data packet including the second direction;
- the receiving module 1420 is further configured to receive a movement control feedback packet sent by the server;
- the device further includes:
- the movement module is configured to control the master virtual character to move in the second direction in the virtual environment in response to the movement control feedback packet.
- the device further includes:
- the buffer module is configured to buffer the second direction in the logic layer as the facing direction of the master virtual character in response to the movement control feedback packet.
- the obtaining module 1440 is further configured to obtain, in response to the skill release operation when the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction;
- the release module 1430 is further configured to control the master virtual character to release the directional skill in the virtual environment in the first direction.
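The direction-selection behaviour described above — cache the second direction in the logic layer as the facing direction on movement feedback, and fall back to that cached facing direction when a skill is released with no movement control operation active — can be sketched like this. Class and method names are illustrative assumptions.

```python
class SkillDirectionResolver:
    """Logic-layer sketch: prefer the movement direction (second
    direction) while a movement control operation is active,
    otherwise release in the cached facing direction (first
    direction)."""

    def __init__(self, initial_facing=(1.0, 0.0)):
        self.facing = initial_facing   # cached facing direction
        self.move_direction = None     # set while the character moves

    def on_movement_feedback(self, direction):
        # Cache the second direction as the new facing direction.
        self.move_direction = direction
        self.facing = direction

    def on_movement_stopped(self):
        self.move_direction = None

    def resolve_release_direction(self):
        # Moving: release toward the movement (second) direction;
        # idle: release toward the cached facing (first) direction.
        if self.move_direction is not None:
            return self.move_direction
        return self.facing
```

Because the facing direction is updated on every movement feedback packet, a release issued just after the character stops still goes in the last direction the user steered toward.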
- the virtual environment interface further includes a skill release control
- the receiving module 1420 is further configured to receive a first trigger operation in the first area of the skill release control as the skill release operation.
- the receiving module 1420 is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area, and to determine the release direction corresponding to the second trigger operation;
- the release module 1430 is further configured to control the master virtual character to release the directional skill in the release direction in the virtual environment.
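The two-area skill release control above can be sketched as a hit test: a touch in the first area triggers the skill release operation with an automatically selected direction, while a touch in the second area aims the release toward the touch point. The circular layout and the radii are assumptions for illustration only.

```python
import math

def classify_skill_trigger(touch, center, inner_radius, outer_radius):
    """Classify a touch on the skill release control.

    First area (inner circle): quick release, direction selected
    automatically elsewhere. Second area (ring between inner and
    outer radius): aimed release toward the touch point.
    """
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist <= inner_radius:
        return ("quick_release", None)
    if dist <= outer_radius:
        return ("aimed_release", (dx / dist, dy / dist))
    return ("ignored", None)  # touch landed outside the control
```

The returned release direction would then be passed to the release module in place of the automatically selected first direction.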
- In summary, with the virtual character control device, if a movement control operation is received when a directional skill is released, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction. This ensures that the directional skill is released in the facing direction the master virtual character has adjusted to, improves the accuracy of the directional skill at release, and avoids the problem of low human-computer interaction efficiency that arises when the release direction is wrong: the user must wait for the directional skill to cool down (that is, re-enter the releasable state some time after release) and operate again to re-release it. Human-computer interaction efficiency is thereby improved, the computer equipment needs to handle fewer erroneous operations, and the overall performance of the computer equipment is improved.
- the present application also provides a terminal.
- the terminal includes a processor and a memory, and at least one instruction is stored in the memory. The at least one instruction is loaded and executed by the processor to implement the steps performed by the first terminal or the steps performed by the second terminal. It should be noted that the terminal may be the terminal shown in FIG. 16 below.
- FIG. 16 shows a structural block diagram of a terminal 1600 provided by an exemplary embodiment of the present application.
- the terminal 1600 may be: a smart phone, a tablet computer, an MP3 player, an MP4 player, a notebook computer, or a desktop computer.
- the terminal 1600 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
- the terminal 1600 includes a processor 1601 and a memory 1602.
- the processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
- the processor 1601 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array).
- the processor 1601 may also include a main processor and a coprocessor.
- the main processor is a processor configured to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor configured to process data in the standby state.
- the processor 1601 may be integrated with a GPU (Graphics Processing Unit), and the GPU is configured to render and draw content that needs to be displayed on the display screen.
- the processor 1601 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
- the memory 1602 may include one or more computer-readable storage media, which may be non-transitory.
- the memory 1602 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
- the non-transitory computer-readable storage medium in the memory 1602 is used to store at least one instruction, and the at least one instruction is executed by the processor 1601 to implement the virtual character control method provided in the method embodiments of the present application.
- the terminal 1600 optionally further includes: a peripheral device interface 1603 and at least one peripheral device.
- the processor 1601, the memory 1602, and the peripheral device interface 1603 may be connected by a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface 1603 through a bus, a signal line, or a circuit board.
- the peripheral device includes: at least one of a radio frequency circuit 1604, a display screen 1605, a camera component 1606, an audio circuit 1607, a positioning component 1608, and a power supply 1609.
- the terminal 1600 further includes one or more sensors 1610.
- the one or more sensors 1610 include, but are not limited to: an acceleration sensor 1611, a gyroscope sensor 1612, a pressure sensor 1613, a fingerprint sensor 1614, an optical sensor 1615, and a proximity sensor 1616.
- the structure shown in FIG. 16 does not constitute a limitation on the terminal 1600, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
- the memory also includes one or more programs, the one or more programs are stored in the memory, and the one or more programs include instructions for performing all or part of the steps of the virtual character control method provided in the embodiments of the present application.
- the present application provides a computer-readable storage medium that stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement all or part of the steps in the virtual character control method provided by the foregoing method embodiments.
- the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
- the processor of the computer device reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the computer device executes all or part of the steps in the virtual character control method provided by the foregoing method embodiments.
- the program can be stored in a computer-readable storage medium.
- the storage medium mentioned can be a read-only memory, a magnetic disk or an optical disk, etc.
Claims (18)
- 1. A virtual character control method, performed by a computer device, the method comprising: displaying a virtual environment interface, the virtual environment interface comprising a picture of observing a virtual environment, the picture comprising a master virtual character located in the virtual environment; receiving a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment, the first direction and the second direction being independent of each other; and controlling, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.
- 2. The method according to claim 1, wherein the virtual environment interface further comprises a movement control, and the movement control operation is a drag operation received on the movement control; and a manner of obtaining the second direction comprises: receiving the drag operation on the movement control; obtaining a drag direction of the drag operation from a presentation layer; and determining, according to the drag direction, the second direction corresponding to the movement of the master virtual character.
- 3. The method according to claim 1, wherein the controlling the master virtual character to release the directional skill in the second direction in the virtual environment comprises: sending a skill release data packet to a server, the skill release data packet comprising the second direction; receiving a skill release feedback packet sent by the server; and controlling, in response to the skill release feedback packet, the master virtual character to release the directional skill in the second direction in the virtual environment.
- 4. The method according to any one of claims 1 to 3, further comprising: sending, in response to the movement control operation, a movement control data packet to a server, the movement control data packet comprising the second direction; receiving a movement control feedback packet sent by the server; and controlling, in response to the movement control feedback packet, the master virtual character to move facing the second direction in the virtual environment.
- 5. The method according to claim 4, wherein after the receiving the movement control feedback packet sent by the server, the method further comprises: caching, in response to the movement control feedback packet, the second direction in a logic layer as a facing direction of the master virtual character.
- 6. The method according to claim 5, further comprising: obtaining, in response to the skill release operation when the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction; and controlling the master virtual character to release the directional skill in the first direction in the virtual environment.
- 7. The method according to any one of claims 1 to 3, wherein the virtual environment interface further comprises a skill release control; and the receiving a skill release operation comprises: receiving a first trigger operation in a first area of the skill release control as the skill release operation.
- 8. The method according to claim 7, further comprising: receiving a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area; determining a release direction corresponding to the second trigger operation; and controlling the master virtual character to release the directional skill in the release direction in the virtual environment.
- 9. A virtual character control apparatus, applied to a computer device, the apparatus comprising: a display module, configured to display a virtual environment interface, the virtual environment interface comprising a picture of observing a virtual environment, the picture comprising a master virtual character located in the virtual environment; a receiving module, configured to receive a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment; and a release module, configured to control, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.
- 10. The apparatus according to claim 9, wherein the virtual environment interface further comprises a movement control, and the movement control operation is a drag operation received on the movement control; the receiving module is further configured to receive the drag operation on the movement control; and the apparatus further comprises: an obtaining module, configured to obtain a drag direction of the drag operation from a presentation layer, and determine, according to the drag direction, the second direction corresponding to the movement of the master virtual character.
- 11. The apparatus according to claim 10, further comprising: a sending module, configured to send a skill release data packet to a server, the skill release data packet comprising the second direction; the receiving module is further configured to receive a skill release feedback packet sent by the server; and the release module is further configured to control, in response to the skill release feedback packet, the master virtual character to release the directional skill in the second direction in the virtual environment.
- 12. The apparatus according to any one of claims 9 to 11, wherein the sending module is configured to send, in response to the movement control operation, a movement control data packet to a server, the movement control data packet comprising the second direction; the receiving module is further configured to receive a movement control feedback packet sent by the server; and the apparatus further comprises: a movement module, configured to control, in response to the movement control feedback packet, the master virtual character to move facing the second direction in the virtual environment.
- 13. The apparatus according to claim 12, further comprising: a caching module, configured to cache, in response to the movement control feedback packet, the second direction in a logic layer as a facing direction of the master virtual character.
- 14. The apparatus according to claim 13, wherein the obtaining module is further configured to obtain, in response to the skill release operation when the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction; and the release module is further configured to control the master virtual character to release the directional skill in the first direction in the virtual environment.
- 15. The apparatus according to any one of claims 9 to 11, wherein the virtual environment interface further comprises a skill release control; and the receiving module is configured to receive a first trigger operation in a first area of the skill release control as the skill release operation.
- 16. The apparatus according to claim 15, wherein the receiving module is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area, and to determine a release direction corresponding to the second trigger operation; and the release module is further configured to control the master virtual character to release the directional skill in the release direction in the virtual environment.
- 17. A computer device, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the virtual character control method according to any one of claims 1 to 8.
- 18. A computer-readable storage medium, storing at least one computer program, the computer program being loaded and executed by a processor to implement the virtual character control method according to any one of claims 1 to 8.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217036000A KR20210150465A (ko) | 2020-04-23 | 2021-03-15 | 가상 캐릭터 제어 방법 및 장치, 디바이스, 및 저장 매체 |
EP21782629.6A EP3943172A4 (en) | 2020-04-23 | 2021-03-15 | METHOD AND APPARATUS FOR CONTROLLING VIRTUAL CHARACTERS, DEVICE AND STORAGE MEDIA |
CA3137791A CA3137791A1 (en) | 2020-04-23 | 2021-03-15 | Virtual character control method and apparatus, device, and storage medium |
AU2021254521A AU2021254521B2 (en) | 2020-04-23 | 2021-03-15 | Virtual character control method and apparatus, device, and storage medium |
JP2021564351A JP7451563B2 (ja) | 2020-04-23 | 2021-03-15 | 仮想キャラクタの制御方法並びにそのコンピュータ機器、コンピュータプログラム、及び仮想キャラクタの制御装置 |
SG11202112169UA SG11202112169UA (en) | 2020-04-23 | 2021-03-15 | Virtual character control method and apparatus, device, and storage medium |
US17/570,391 US20220126205A1 (en) | 2020-04-23 | 2022-01-07 | Virtual character control method and apparatus, device, and storage medium |
JP2024034076A JP2024063201A (ja) | 2020-04-23 | 2024-03-06 | 仮想キャラクタの制御方法、装置、機器及び記憶媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010328532.3A CN111589127B (zh) | 2020-04-23 | 2020-04-23 | 虚拟角色的控制方法、装置、设备及存储介质 |
CN202010328532.3 | 2020-04-23 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/570,391 Continuation US20220126205A1 (en) | 2020-04-23 | 2022-01-07 | Virtual character control method and apparatus, device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021213070A1 true WO2021213070A1 (zh) | 2021-10-28 |
Family
ID=72180363
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/080690 WO2021213070A1 (zh) | 2020-04-23 | 2021-03-15 | 虚拟角色的控制方法、装置、设备及存储介质 |
Country Status (9)
Country | Link |
---|---|
US (1) | US20220126205A1 (zh) |
EP (1) | EP3943172A4 (zh) |
JP (2) | JP7451563B2 (zh) |
KR (1) | KR20210150465A (zh) |
CN (1) | CN111589127B (zh) |
AU (1) | AU2021254521B2 (zh) |
CA (1) | CA3137791A1 (zh) |
SG (1) | SG11202112169UA (zh) |
WO (1) | WO2021213070A1 (zh) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111589127B (zh) * | 2020-04-23 | 2022-07-12 | 腾讯科技(深圳)有限公司 | 虚拟角色的控制方法、装置、设备及存储介质 |
CN112044071B (zh) | 2020-09-04 | 2021-10-15 | 腾讯科技(深圳)有限公司 | 虚拟物品的控制方法、装置、终端及存储介质 |
JP7270008B2 (ja) * | 2020-09-08 | 2023-05-09 | カムツス コーポレーション | ゲーム提供方法、コンピュータプログラム、コンピュータ読取可能な記録媒体、およびコンピュータ装置 |
CN112274927A (zh) * | 2020-11-18 | 2021-01-29 | 网易(杭州)网络有限公司 | 游戏交互方法、装置及电子设备 |
CN112843679B (zh) * | 2021-03-04 | 2022-11-08 | 腾讯科技(深圳)有限公司 | 虚拟对象的技能释放方法、装置、设备及介质 |
CN113476822B (zh) * | 2021-06-11 | 2022-06-10 | 荣耀终端有限公司 | 一种触控方法及设备 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170149241A1 (en) * | 2009-07-15 | 2017-05-25 | Yehuda Binder | Sequentially operated modules |
US20180104584A1 (en) * | 2016-10-19 | 2018-04-19 | Nintendo Co., Ltd. | Computer-readable non-transitory storage medium having stored therein game program, game processing method, game system, and game apparatus |
CN109865286A (zh) * | 2019-02-20 | 2019-06-11 | 网易(杭州)网络有限公司 | 游戏中的信息处理方法、装置及存储介质 |
CN110413171A (zh) * | 2019-08-08 | 2019-11-05 | 腾讯科技(深圳)有限公司 | 控制虚拟对象进行快捷操作的方法、装置、设备及介质 |
CN110694261A (zh) * | 2019-10-21 | 2020-01-17 | 腾讯科技(深圳)有限公司 | 控制虚拟对象进行攻击的方法、终端及存储介质 |
CN111589127A (zh) * | 2020-04-23 | 2020-08-28 | 腾讯科技(深圳)有限公司 | 虚拟角色的控制方法、装置、设备及存储介质 |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050071306A1 (en) * | 2003-02-05 | 2005-03-31 | Paul Kruszewski | Method and system for on-screen animation of digital objects or characters |
US7235012B2 (en) * | 2004-08-23 | 2007-06-26 | Brain Box Concepts, Inc. | Video game controller with side or quick look feature |
US7963833B2 (en) * | 2004-10-15 | 2011-06-21 | Microsoft Corporation | Games with targeting features |
US8043149B2 (en) * | 2005-03-03 | 2011-10-25 | Sony Computer Entertainment America Llc | In-game shot aiming indicator |
US8651964B2 (en) * | 2005-04-29 | 2014-02-18 | The United States Of America As Represented By The Secretary Of The Army | Advanced video controller system |
US20070117628A1 (en) * | 2005-11-19 | 2007-05-24 | Stanley Mark J | Method and apparatus for providing realistic gun motion input to a video game |
US9327191B2 (en) * | 2006-05-08 | 2016-05-03 | Nintendo Co., Ltd. | Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints |
US8834245B2 (en) * | 2007-08-17 | 2014-09-16 | Nintendo Co., Ltd. | System and method for lock on target tracking with free targeting capability |
US8142286B2 (en) * | 2007-08-17 | 2012-03-27 | Microsoft Corporation | Programmable movement of an orientation of a game character view of a game environment |
US8777708B2 (en) * | 2008-06-27 | 2014-07-15 | Microsoft Corporation | Targeting control in a simulated environment |
US8342926B2 (en) * | 2008-07-13 | 2013-01-01 | Sony Computer Entertainment America Llc | Game aim assist |
US9868062B2 (en) * | 2012-03-13 | 2018-01-16 | Sony Interactive Entertainment America Llc | System, method, and graphical user interface for controlling an application on a tablet |
JP5563633B2 (ja) * | 2012-08-31 | 2014-07-30 | 株式会社スクウェア・エニックス | ビデオゲーム処理装置、およびビデオゲーム処理プログラム |
US9770664B2 (en) * | 2013-04-05 | 2017-09-26 | Gree, Inc. | Method and apparatus for providing online shooting game |
US10549180B2 (en) * | 2013-09-30 | 2020-02-04 | Zynga Inc. | Swipe-direction gesture control for video games using glass input devices |
JP5711409B1 (ja) * | 2014-06-26 | 2015-04-30 | ガンホー・オンライン・エンターテイメント株式会社 | 端末装置 |
JP6598522B2 (ja) * | 2015-06-12 | 2019-10-30 | 任天堂株式会社 | 情報処理装置、情報処理システム、情報処理方法、及び情報処理プログラム |
CN105194871B (zh) * | 2015-09-14 | 2017-03-22 | 网易(杭州)网络有限公司 | 一种控制游戏角色的方法 |
US20180015375A1 (en) * | 2016-07-12 | 2018-01-18 | Paul Marino | Computer-implemented multiplayer combat video game method and apparatus |
JP6143934B1 (ja) * | 2016-11-10 | 2017-06-07 | 株式会社Cygames | 情報処理プログラム、情報処理方法、及び情報処理装置 |
CN107398071B (zh) * | 2017-07-19 | 2021-01-26 | 网易(杭州)网络有限公司 | 游戏目标选择方法及装置 |
CN107661630A (zh) * | 2017-08-28 | 2018-02-06 | 网易(杭州)网络有限公司 | 一种射击游戏的控制方法及装置、存储介质、处理器、终端 |
CN107678647B (zh) * | 2017-09-26 | 2023-04-28 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法、装置、电子设备及存储介质 |
CN107773987B (zh) * | 2017-10-24 | 2020-05-22 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法、装置、电子设备及存储介质 |
CN107913515B (zh) * | 2017-10-25 | 2019-01-08 | 网易(杭州)网络有限公司 | 信息处理方法及装置、存储介质、电子设备 |
CN108196765A (zh) * | 2017-12-13 | 2018-06-22 | 网易(杭州)网络有限公司 | 显示控制方法、电子设备及存储介质 |
JP6561163B1 (ja) | 2018-03-09 | 2019-08-14 | 株式会社 ディー・エヌ・エー | ゲーム装置及びゲームプログラム |
JP7381225B2 (ja) * | 2018-09-06 | 2023-11-15 | 株式会社Cygames | プログラム、電子装置、方法及びシステム |
CN109550240A (zh) * | 2018-09-20 | 2019-04-02 | 厦门吉比特网络技术股份有限公司 | 一种游戏的技能释放方法和装置 |
CN109550241B (zh) * | 2018-09-20 | 2023-04-07 | 厦门吉比特网络技术股份有限公司 | 一种单摇杆控制方法和系统 |
CN109806579A (zh) * | 2019-02-01 | 2019-05-28 | 网易(杭州)网络有限公司 | 游戏中虚拟对象的控制方法、装置、电子设备及存储介质 |
JP2019202128A (ja) | 2019-04-17 | 2019-11-28 | 株式会社セガゲームス | 情報処理装置及びプログラム |
JP6928709B1 (ja) * | 2020-12-28 | 2021-09-01 | プラチナゲームズ株式会社 | 情報処理プログラム、情報処理装置、および情報処理方法 |
2020
- 2020-04-23 CN CN202010328532.3A patent/CN111589127B/zh active Active

2021
- 2021-03-15 SG SG11202112169UA patent/SG11202112169UA/en unknown
- 2021-03-15 WO PCT/CN2021/080690 patent/WO2021213070A1/zh unknown
- 2021-03-15 EP EP21782629.6A patent/EP3943172A4/en active Pending
- 2021-03-15 JP JP2021564351A patent/JP7451563B2/ja active Active
- 2021-03-15 CA CA3137791A patent/CA3137791A1/en active Pending
- 2021-03-15 KR KR1020217036000A patent/KR20210150465A/ko not_active Application Discontinuation
- 2021-03-15 AU AU2021254521A patent/AU2021254521B2/en active Active

2022
- 2022-01-07 US US17/570,391 patent/US20220126205A1/en active Pending

2024
- 2024-03-06 JP JP2024034076A patent/JP2024063201A/ja active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170149241A1 (en) * | 2009-07-15 | 2017-05-25 | Yehuda Binder | Sequentially operated modules |
US20180104584A1 (en) * | 2016-10-19 | 2018-04-19 | Nintendo Co., Ltd. | Computer-readable non-transitory storage medium having stored therein game program, game processing method, game system, and game apparatus |
CN109865286A (zh) * | 2019-02-20 | 2019-06-11 | 网易(杭州)网络有限公司 | 游戏中的信息处理方法、装置及存储介质 |
CN110413171A (zh) * | 2019-08-08 | 2019-11-05 | 腾讯科技(深圳)有限公司 | 控制虚拟对象进行快捷操作的方法、装置、设备及介质 |
CN110694261A (zh) * | 2019-10-21 | 2020-01-17 | 腾讯科技(深圳)有限公司 | 控制虚拟对象进行攻击的方法、终端及存储介质 |
CN111589127A (zh) * | 2020-04-23 | 2020-08-28 | 腾讯科技(深圳)有限公司 | 虚拟角色的控制方法、装置、设备及存储介质 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3943172A4 * |
Also Published As
Publication number | Publication date |
---|---|
SG11202112169UA (en) | 2021-12-30 |
AU2021254521A1 (en) | 2021-11-11 |
CN111589127B (zh) | 2022-07-12 |
JP7451563B2 (ja) | 2024-03-18 |
KR20210150465A (ko) | 2021-12-10 |
US20220126205A1 (en) | 2022-04-28 |
CN111589127A (zh) | 2020-08-28 |
JP2022533919A (ja) | 2022-07-27 |
EP3943172A4 (en) | 2022-07-20 |
AU2021254521B2 (en) | 2023-02-02 |
CA3137791A1 (en) | 2021-10-28 |
JP2024063201A (ja) | 2024-05-10 |
EP3943172A1 (en) | 2022-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021213070A1 (zh) | 虚拟角色的控制方法、装置、设备及存储介质 | |
WO2021218516A1 (zh) | 虚拟对象控制方法、装置、设备及存储介质 | |
WO2021244324A1 (zh) | 虚拟对象的控制方法、装置、设备以及存储介质 | |
US20230033874A1 (en) | Virtual object control method and apparatus, terminal, and storage medium | |
CN113440846B (zh) | 游戏的显示控制方法、装置、存储介质及电子设备 | |
WO2021227870A1 (zh) | 虚拟对象的控制方法、装置、终端及存储介质 | |
CN111399639B (zh) | 虚拟环境中运动状态的控制方法、装置、设备及可读介质 | |
US11878242B2 (en) | Method and apparatus for displaying virtual environment picture, device, and storage medium | |
WO2021203904A1 (zh) | 虚拟环境画面的显示方法、装置、设备及存储介质 | |
TWI804032B (zh) | 虛擬場景中的資料處理方法、裝置、設備、儲存媒體及程式產品 | |
JP7406567B2 (ja) | 仮想環境の画面表示方法及び装置、並びにコンピュータ装置及びプログラム | |
WO2021159795A1 (zh) | 三维虚拟环境中的技能瞄准方法、装置、设备及存储介质 | |
WO2023010690A1 (zh) | 虚拟对象释放技能的方法、装置、设备、介质及程序产品 | |
JP2023164787A (ja) | 仮想環境の画面表示方法、装置、機器及びコンピュータプログラム | |
CN111752697B (zh) | 应用程序的运行方法、装置、设备及可读存储介质 | |
CN111589102B (zh) | 辅助工具检测方法、装置、设备及存储介质 | |
CN111589134A (zh) | 虚拟环境画面的显示方法、装置、设备及存储介质 | |
WO2023071808A1 (zh) | 基于虚拟场景的图形显示方法、装置、设备以及介质 | |
CN112843682B (zh) | 数据同步方法、装置、设备及存储介质 | |
CN117618885A (zh) | 虚拟对象的控制方法、装置、设备、存储介质及程序产品 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2021782629 Country of ref document: EP Effective date: 20211012 |
|
ENP | Entry into the national phase |
Ref document number: 2021564351 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20217036000 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2021254521 Country of ref document: AU Date of ref document: 20210315 Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21782629 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |