WO2021213070A1 - Virtual character control method, apparatus, device and storage medium - Google Patents

Virtual character control method, apparatus, device and storage medium

Info

Publication number
WO2021213070A1
Authority
WO
WIPO (PCT)
Prior art keywords
release
skill
control
virtual character
virtual environment
Prior art date
Application number
PCT/CN2021/080690
Other languages
English (en)
French (fr)
Inventor
万钰林
胡勋
翁建苗
粟山东
张勇
Original Assignee
Tencent Technology (Shenzhen) Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd.
Priority to KR1020217036000A priority Critical patent/KR20210150465A/ko
Priority to EP21782629.6A priority patent/EP3943172A4/en
Priority to CA3137791A priority patent/CA3137791A1/en
Priority to AU2021254521A priority patent/AU2021254521B2/en
Priority to JP2021564351A priority patent/JP7451563B2/ja
Priority to SG11202112169UA priority patent/SG11202112169UA/en
Publication of WO2021213070A1 publication Critical patent/WO2021213070A1/zh
Priority to US17/570,391 priority patent/US20220126205A1/en
Priority to JP2024034076A priority patent/JP2024063201A/ja

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/56Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/65Methods for processing data by generating or executing the game program for computing the condition of a game character
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6607Methods for processing data by generating or executing the game program for rendering three dimensional images for animating game characters, e.g. skeleton kinematics
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807Role playing or strategy games

Definitions

  • the embodiments of the present application relate to the field of virtual environments, and in particular to a method, apparatus, device, and storage medium for controlling a virtual character.
  • Battle games are games where multiple user accounts compete in the same virtual scene.
  • for example, the battle game may be a Multiplayer Online Battle Arena (MOBA) game, in which the user may control a virtual object to release skills and thereby attack hostile virtual objects.
  • the skill release includes at least two release modes: quick release and aiming release.
  • quick release means that, after the user triggers the skill's release control, the skill is released in the current facing direction of the virtual object in the virtual environment.
  • the embodiments of the present application provide a method, device, device, and storage medium for controlling a virtual character, which can improve the user's human-computer interaction efficiency during the skill release process.
  • the technical solution is as follows:
  • in one aspect, a method for controlling a virtual character is provided. The method is executed by a computer device and includes:
  • displaying a virtual environment interface, the virtual environment interface including a screen for observing the virtual environment, and the screen including a master virtual character located in the virtual environment;
  • receiving a skill release operation and a movement control operation, where the skill release operation is used to control the master virtual character to release a directional skill in a first direction in the virtual environment, and the movement control operation is used to control the master virtual character to move in a second direction in the virtual environment;
  • in response to the skill release operation and the movement control operation, controlling the master virtual character to release the directional skill in the second direction in the virtual environment.
  • in another aspect, a virtual character control device is provided. The device is applied to a computer device and includes:
  • a display module configured to display a virtual environment interface, the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment;
  • the receiving module is configured to receive a skill release operation and a movement control operation, where the skill release operation is used to control the master virtual character to release a directional skill in a first direction in the virtual environment, and the movement control operation is used to control the master virtual character to move in a second direction in the virtual environment;
  • the release module is configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release operation and the movement control operation.
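As a minimal sketch of the control flow implemented by these modules, the release direction could be resolved as follows: when a movement control operation is active, its second direction overrides the automatically selected first direction. The names and data model below are illustrative assumptions, not taken from the patent itself.

```python
# Illustrative sketch (assumed names, simplified model) of resolving the
# release direction of a directional skill from the two received operations.
from dataclasses import dataclass
from typing import Optional, Tuple

Direction = Tuple[float, float]  # unit vector in the virtual environment

@dataclass
class MasterCharacter:
    facing: Direction  # first direction: facing direction from the logic layer

def resolve_release_direction(character: MasterCharacter,
                              movement_direction: Optional[Direction]) -> Direction:
    """Return the direction in which the directional skill is released.

    If a movement control operation is active, its second direction takes
    priority over the automatically selected first direction.
    """
    if movement_direction is not None:
        return movement_direction   # second direction from the movement control
    return character.facing         # fall back to the first direction

# Usage: the character faces east, but the joystick is dragged north.
hero = MasterCharacter(facing=(1.0, 0.0))
assert resolve_release_direction(hero, (0.0, 1.0)) == (0.0, 1.0)
assert resolve_release_direction(hero, None) == (1.0, 0.0)
```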
  • the virtual environment interface further includes a movement control, and the movement control operation is a drag operation received on the movement control;
  • the receiving module is further configured to receive the drag operation on the movement control;
  • the device further includes:
  • the obtaining module is configured to obtain the drag direction of the drag operation from the presentation layer; and determine the corresponding second direction when the master virtual character moves according to the drag direction.
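A hedged sketch of how the drag direction obtained from the presentation layer might be turned into the second direction. The dead-zone threshold and function names are assumptions for illustration; the patent does not specify this arithmetic.

```python
# Illustrative: convert a joystick drag (screen pixels) into a unit direction
# vector, which serves as the second direction for movement and skill release.
import math
from typing import Optional, Tuple

def drag_to_direction(start: Tuple[float, float],
                      current: Tuple[float, float],
                      dead_zone: float = 4.0) -> Optional[Tuple[float, float]]:
    """Normalise the drag vector from `start` to `current` into a unit
    direction, or return None if the drag stays inside the dead zone."""
    dx, dy = current[0] - start[0], current[1] - start[1]
    length = math.hypot(dx, dy)
    if length < dead_zone:      # too small to count as a movement operation
        return None
    return (dx / length, dy / length)

# A 30-pixel drag straight to the right maps to the unit vector (1, 0).
assert drag_to_direction((100, 100), (130, 100)) == (1.0, 0.0)
assert drag_to_direction((100, 100), (101, 100)) is None
```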
  • the device further includes:
  • a sending module configured to send a skill release data packet to the server, where the skill release data packet includes the second direction;
  • the receiving module is further configured to receive a skill release feedback package sent by the server;
  • the release module is further configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release feedback package.
  • the sending module is configured to send a movement control data packet to the server in response to the movement control operation, and the movement control data packet includes the second direction;
  • the receiving module is further configured to receive a movement control feedback packet sent by the server;
  • the device further includes:
  • the movement module is configured to control the master virtual character to move in the second direction in the virtual environment in response to the movement control feedback packet.
  • the device further includes:
  • the buffer module is configured to buffer the second direction in the logic layer as the facing direction of the master virtual character in response to the movement control feedback packet.
  • the acquisition module is further configured to, in response to the skill release operation when no movement control operation is received, acquire the facing direction of the master virtual character from the logic layer as the first direction;
  • the release module is further configured to control the master virtual character to release the directional skill in the virtual environment in the first direction.
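The buffering and fallback behaviour described above can be sketched as follows. This is a simplified, assumed model; the patent does not specify the logic layer at this level of detail.

```python
# Illustrative: when a movement control feedback packet arrives, the second
# direction is buffered in the logic layer as the character's facing
# direction; a later skill release with no active movement operation reads
# this buffered facing direction as the first direction.
from typing import Tuple

Direction = Tuple[float, float]

class LogicLayer:
    def __init__(self, initial_facing: Direction):
        self._facing = initial_facing

    def on_movement_feedback(self, second_direction: Direction) -> None:
        # Buffer the server-confirmed movement direction as the facing direction.
        self._facing = second_direction

    def facing_direction(self) -> Direction:
        return self._facing

layer = LogicLayer(initial_facing=(1.0, 0.0))
layer.on_movement_feedback((0.0, -1.0))         # server confirms the move
assert layer.facing_direction() == (0.0, -1.0)  # later releases use this direction
```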
  • the virtual environment interface further includes a skill release control
  • the receiving module is configured to receive a first trigger operation in the first area of the skill release control as the skill release operation.
  • the receiving module is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area of the skill release control other than the first area; and determine a release direction corresponding to the second trigger operation;
  • the release module is further configured to control the master virtual character to release the directional skill in the release direction in the virtual environment.
  • in another aspect, a computer device is provided, including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for controlling a virtual character as described above.
  • in another aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the method for controlling a virtual character as described above.
  • a computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the computer device executes the method for controlling the virtual character as described above.
  • in the embodiments of the present application, when a skill release operation and a movement control operation are received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction.
  • this ensures that the directional skill is released in the adjusted facing direction of the master virtual character, improves the accuracy of directional skill release, and avoids the situation in which, after a release in the wrong direction, the user must wait for the skill to cool down (that is, to recover for a period after release and re-enter the releasable state) and then operate again to re-release it, which makes human-computer interaction inefficient. Human-computer interaction efficiency is thereby improved, the computer device handles fewer erroneous operations, and the overall performance of the computer device is improved.
  • FIGS. 1A and 1B are schematic diagrams of interfaces of the skill release process provided by an exemplary embodiment of the present application.
  • Fig. 2 is a schematic diagram of a quick-release timeline of skills in the related art;
  • Fig. 3 is a schematic diagram of a quick release timeline of skills provided by an exemplary embodiment of the present application.
  • Fig. 4 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • Fig. 5 is a flowchart of a method for controlling a virtual character provided by an exemplary embodiment of the present application
  • FIG. 6 is a schematic diagram of an interface of skill release and movement control provided based on the embodiment shown in FIG. 5;
  • Fig. 7 is a flowchart of a method for controlling a virtual character provided by another exemplary embodiment of the present application.
  • FIG. 8 is a flow chart of the release process based on the skills provided by the embodiment shown in FIG. 7;
  • FIG. 9 is a schematic diagram of an interface of skill release and movement control provided based on the embodiment shown in FIG. 7;
  • Fig. 10 is a flowchart of a method for controlling a virtual character provided by another exemplary embodiment of the present application.
  • FIG. 11 is a flowchart of the release process of skills provided based on the embodiment shown in FIG. 10;
  • FIG. 12 is an overall flowchart of the skill release process provided by an exemplary embodiment of the present application.
  • FIG. 13 shows a schematic diagram of a virtual environment interface for rapid release of directional skills provided by an exemplary embodiment of the present application
  • Fig. 14 is a structural block diagram of a virtual character control device provided by an exemplary embodiment of the present application.
  • Fig. 15 is a structural block diagram of a virtual character control device provided by another exemplary embodiment of the present application.
  • Fig. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • the virtual environment is the virtual environment displayed (or provided) when the application program runs on the terminal.
  • the virtual environment may be a simulation environment of the real world, a semi-simulation and semi-fictional environment, or a purely fictitious environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application.
  • in this application, description is provided by taking an example in which the virtual environment is a three-dimensional virtual environment.
  • the virtual environment is used to provide a combat environment for at least two master virtual characters.
  • the virtual environment includes a symmetrical lower-left corner area and an upper-right corner area; master virtual characters belonging to two opposing camps each occupy one of the areas, with the goal of destroying the target building, stronghold, base, or crystal deep in the opponent's area as the condition for victory.
  • Virtual characters refer to movable objects in a virtual environment.
  • the movable object may be a virtual character, a virtual animal, an animation character, etc., such as a character or animal displayed in a three-dimensional virtual environment.
  • the virtual character is a three-dimensional model created based on animation skeletal technology.
  • Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment.
  • in this application, description is provided by taking an example in which the virtual character is a master virtual character controlled by a user.
  • the master virtual character generally refers to one or more master virtual characters in a virtual environment.
  • a multiplayer online tactical competitive game is a game in which different virtual teams belonging to at least two rival camps occupy their respective map areas in a virtual environment and compete with a certain victory condition as the goal.
  • the victory conditions include, but are not limited to, at least one of the following: occupying a stronghold or destroying the enemy camp's stronghold, killing virtual characters of the enemy camp, ensuring one's own survival in a specified scene and time, seizing a certain resource, and surpassing the opponent's score within a specified time.
  • Tactical competition can be carried out in units of rounds, and the map of each round of tactical competition can be the same or different.
  • Each virtual team includes one or more virtual characters, such as 1, 2, 3, or 5. The duration of a MOBA game is from the moment the game starts to the moment the victory condition is fulfilled.
  • the method provided in this application can be applied to virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, MOBA games, and the like. The following embodiments are illustrated by taking application in a game as an example.
  • a game based on a virtual environment consists of one or more game worlds.
  • the virtual environment in the game can simulate real world scenes.
  • the user can control the master virtual character in the game to walk, run, jump, shoot, fight, drive, and release skills in the virtual environment, to be attacked by other virtual characters and injured in the virtual environment, to attack other virtual characters, and to perform other actions. The interactivity is strong, and multiple users can team up online for competitive play.
  • the master virtual character when the master virtual character releases skills in the virtual environment, it includes at least one of the following skills release methods:
  • quick release means releasing the skill in the facing direction of the virtual object in the virtual environment by triggering the skill release control during the skill release process;
  • the skill release control corresponds to a first area and a second area. When a first trigger operation in the first area is received, the directional skill is released in the virtual environment in a first direction, where the first direction is the facing direction of the virtual object, or the direction corresponding to the position of an attack target within the skill release range.
  • the skill is released in the facing direction of the virtual object in the virtual environment.
  • the first trigger operation in the first area includes a touch operation on the skill release control, and the end position of the touch operation is located in the first area; or, the first trigger operation in the first area includes a touch operation on the first area. Touch operation in the area, and the touch operation does not move out of the first area.
  • aiming release means adjusting the direction of the skill release through the skill release control during the skill release process and then releasing the skill in the adjusted direction;
  • the direction of the skill release is determined according to the second trigger operation, and when the trigger operation ends, the skill is released in the direction of the skill release.
  • the second trigger operation in the second area includes a touch operation that starts in the first area, and the end position of the touch operation is located in the second area; or, the second trigger operation in the second area includes a touch operation that acts on Touch operation in the second area, and the touch operation does not move out of the second area.
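One plausible way to implement the first-area/second-area distinction is to model the skill release control as two concentric circles, with the inner disc as the first area and the surrounding ring as the second area. This geometry, and the "cancel" case for touches ending outside the control, are illustrative assumptions rather than the patent's exact layout.

```python
# Illustrative: classify a touch on the skill release control as a quick
# release (first area) or aimed release (second area) from where it ends.
import math
from typing import Tuple

def classify_release(control_center: Tuple[float, float],
                     touch_end: Tuple[float, float],
                     first_radius: float,
                     outer_radius: float) -> str:
    """Return 'quick' if the touch ends in the first area, 'aimed' if it
    ends in the second area, and 'cancel' if it ends outside the control."""
    d = math.hypot(touch_end[0] - control_center[0],
                   touch_end[1] - control_center[1])
    if d <= first_radius:
        return "quick"
    if d <= outer_radius:
        return "aimed"
    return "cancel"

center = (300.0, 300.0)
assert classify_release(center, (305.0, 300.0), 20.0, 80.0) == "quick"
assert classify_release(center, (360.0, 300.0), 20.0, 80.0) == "aimed"
assert classify_release(center, (500.0, 300.0), 20.0, 80.0) == "cancel"
```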
  • FIGS. 1A and 1B are interface diagrams of the skill release process provided by an exemplary embodiment of the present application.
  • the virtual environment interface 100 includes a skill release control 110, and the skill release control 110 includes a first area 111 and a second area 112; the skill release process based on the skill release control 110 in FIG. 1A is shown in FIG. 1B.
  • in response to receiving a skill release operation in the first area 111, the virtual object 120 releases the skill in its facing direction in the virtual environment; in response to receiving a skill release operation in the second area 112, the release direction corresponding to the skill release operation is determined, and the skill is released in that release direction.
  • the virtual environment interface also includes a mobile joystick for controlling the facing direction of the virtual object and controlling the movement of the virtual object in the virtual environment.
  • in the related art, when the mobile joystick receives a control operation that changes the facing direction of the virtual object, and the control operation has been uploaded to the server but no feedback message has yet been received, the facing direction that the mobile terminal obtains from the logic layer is still the direction before the adjustment, and the skill is quickly released in that direction. The release direction of the skill therefore differs from the adjusted facing direction, so the accuracy of the skill release direction is low.
  • in the related art, the skill release process includes the following steps:
  • Step 201 The virtual object faces the first direction.
  • Step 202 Move the joystick to control the virtual object to face the second direction.
  • Step 203 Trigger the skill on the skill trigger control, and obtain the facing direction (first direction) of the virtual object.
  • Step 204 The client sends a mobile control packet to the server.
  • Step 205 The client sends a skill release package to the server.
  • Step 206 The client receives the movement control feedback packet fed back by the server, and controls the virtual object to face the second direction.
  • Step 207 The client receives the skill release feedback package fed back by the server, and controls the virtual object to release the skill in the first direction.
  • Step 208 The virtual object faces the second direction.
  • in the embodiments of the present application, the skill release process includes the following steps:
  • Step 301 The virtual object faces the first direction.
  • Step 302 Move the joystick to control the virtual object to face the second direction.
  • Step 303 Trigger the skill on the skill trigger control, and obtain the control direction (second direction) of the mobile joystick.
  • Step 304 The client sends a mobile control packet to the server.
  • Step 305 The client sends a skill release package to the server.
  • Step 306 The client receives the movement control feedback packet fed back by the server, and controls the virtual object to face the second direction.
  • Step 307 The client receives the skill release feedback package fed back by the server, and controls the virtual object to release the skill in the second direction.
  • Step 308 The virtual object faces the second direction.
  • in the embodiments of the present application, the release direction acquired when the skill is triggered is the control direction received on the movement control, that is, the direction the virtual object will finally face, thereby improving the accuracy of the skill release direction.
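The flow of steps 301 to 308 can be sketched as a toy client. The class, packet format, and method names below are illustrative assumptions; the key point is that at the moment the skill is triggered, the client reads the joystick's control direction (the second direction) rather than the character's not-yet-updated facing direction.

```python
# Illustrative client for steps 301-308: the skill release packet carries the
# joystick's control direction instead of the stale facing direction.
from typing import List, Optional, Tuple

Direction = Tuple[float, float]

class Client:
    def __init__(self, facing: Direction):
        self.facing = facing                     # step 301: first direction
        self.joystick_direction: Optional[Direction] = None
        self.sent_packets: List[dict] = []

    def on_joystick(self, direction: Direction) -> None:
        # Step 302/304: steer toward the second direction and tell the server.
        self.joystick_direction = direction
        self.sent_packets.append({"type": "move", "direction": direction})

    def on_skill_trigger(self) -> None:
        # Step 303/305: prefer the joystick's control direction over the
        # facing direction, which the movement feedback has not yet updated.
        direction = self.joystick_direction or self.facing
        self.sent_packets.append({"type": "skill", "direction": direction})

client = Client(facing=(1.0, 0.0))               # faces the first direction
client.on_joystick((0.0, 1.0))                   # steer to the second direction
client.on_skill_trigger()
assert client.sent_packets[-1] == {"type": "skill", "direction": (0.0, 1.0)}
```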
  • Fig. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 400 includes: a first terminal 420, a server 440, and a second terminal 460.
  • the first terminal 420 installs and runs an application program supporting the virtual environment.
  • the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, multiplayer gun battle survival games, and battle royale shooting games.
  • the first terminal 420 is a terminal used by the first user.
  • the first user uses the first terminal 420 to control the first master virtual character in the virtual environment to perform activities, including but not limited to: adjusting body posture, walking, running, At least one of jumping, releasing skills, picking up, attacking, and avoiding attacks from other virtual characters.
  • the first master virtual character is a first virtual character, such as a simulated character or an animation character.
  • the first master virtual character releases the regional skill in the virtual environment, and the virtual environment screen moves from the position where the master virtual character is located to the target area selected by the regional skill indicator.
  • the regional skill indicator is used to indicate the skill release area when the master virtual character releases the skill.
  • the first terminal 420 is connected to the server 440 through a wireless network or a wired network.
  • the server 440 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 440 includes a processor 444 and a memory 442.
  • the memory 442 further includes a receiving module 4421, a control module 4422, and a sending module 4423.
  • the receiving module 4421 is used to receive a request sent by a client, such as a team-up request; the control module 4422 is used to control rendering of the virtual environment picture; and the sending module 4423 is used to send a message notification to the client, such as a team-up success notification.
  • the server 440 is used to provide background services for applications supporting the three-dimensional virtual environment.
  • the server 440 is responsible for the main calculation work, and the first terminal 420 and the second terminal 460 are responsible for the secondary calculation work; or, the server 440 is responsible for the secondary calculation work, and the first terminal 420 and the second terminal 460 are responsible for the main calculation work; Alternatively, the server 440, the first terminal 420, and the second terminal 460 adopt a distributed computing architecture to perform collaborative computing.
  • the second terminal 460 is connected to the server 440 through a wireless network or a wired network.
  • the second terminal 460 installs and runs an application program supporting the virtual environment.
  • the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, multiplayer gun battle survival games, and battle royale shooting games.
  • the second terminal 460 is a terminal used by the second user.
  • the second user uses the second terminal 460 to control the second master virtual character in the virtual environment to perform activities, including but not limited to: adjusting body posture, walking, running, At least one of jumping, releasing skills, picking up, attacking, and avoiding attacks from other master virtual characters.
  • the second master virtual character is a second virtual character, such as a simulation character or an animation character.
  • the first master virtual character and the second master virtual character are in the same virtual environment.
  • the first master virtual character and the second master virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permissions.
  • the applications installed on the first terminal 420 and the second terminal 460 are the same, or the applications installed on the two terminals are the same type of application on different control system platforms.
  • the first terminal 420 may generally refer to one of a plurality of terminals.
  • the second terminal 460 may generally refer to one of a plurality of terminals. This embodiment only uses the first terminal 420 and the second terminal 460 as examples.
  • the device types of the first terminal 420 and the second terminal 460 are the same or different.
  • the device types include at least one of: smart phones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop portable computers, and desktop computers.
  • the following embodiments take a terminal including a smart phone as an example.
  • the number of the aforementioned terminals may be more or less. For example, there may be only one terminal, or there may be dozens or hundreds of terminals, or more.
  • the embodiments of the present application do not limit the number of terminals and device types.
  • FIG. 5 is a flowchart of a method for controlling a virtual character provided by an exemplary embodiment of the present application.
  • the method may be executed by a computer device, and the computer device may be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in FIG. 4, or as another terminal in the computer system 400. As shown in FIG. 5, the method includes:
  • Step 501 Display a virtual environment interface.
  • the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment.
  • the virtual environment interface further includes a release control for controlling the master virtual character to release directional skills, and a movement control for controlling the master virtual character to move in the virtual environment.
  • the terminal used by the user runs an application that supports the virtual environment.
  • the screen of the terminal displays the user interface corresponding to the application when the application is used, that is, the virtual environment interface is displayed.
  • the virtual environment displayed in the picture includes at least one element of: mountains, flatlands, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles.
  • the virtual environment is a virtual environment with an arbitrary boundary shape, for example, the virtual environment is a rhombus.
  • the user can browse the full picture of the virtual environment by viewing the map corresponding to the virtual environment.
  • a camera model is set in the virtual environment, and the camera model is used to observe the virtual environment from different perspectives, so as to obtain the virtual environment picture.
  • the angle of view refers to the angle of observation when the virtual character's first-person or third-person perspective is used to observe in the virtual environment.
  • a release control for controlling the master virtual character to release directional skills is displayed in the virtual environment interface, where the directional skill corresponds to a skill release direction, that is, when the skill is released, the directional skill needs to be released in a specified direction.
  • the skill release direction includes at least one of the following two situations:
  • in the process of quickly releasing the directional skill, the master virtual character is controlled to release the directional skill in the first direction.
  • the facing direction of the virtual object in the virtual environment is used as the release direction of the directional skill;
  • or, when there is an attack target, the direction corresponding to the attack target is used as the release direction of the directional skill;
  • the release direction of the directional skill is adjusted through the release adjustment operation of the release control, and when the release is triggered, the directional skill is released in the adjusted release direction.
  • the virtual environment interface also displays a movement control for controlling the movement of the master virtual character, where the movement control can also be used to control the master virtual character to adjust the direction of movement during the process of controlling the movement of the master virtual character.
  • the user can adjust the facing direction of the master virtual character through the movement control, and control the master virtual character to move in the facing direction in the virtual environment.
  • the quick release method can improve the efficiency of skill release. Generally, when releasing a directional skill, the user quickly adjusts the facing direction of the master virtual character through the movement control, and then quickly releases the directional skill in an accurate direction through the skill release operation of the directional skill.
  • Step 502 Receive a skill release operation and a movement control operation.
  • the skill release operation is used to control the master virtual character to release the directional skill in the first direction in the virtual environment; the movement control operation is used to control the master virtual character to move in the second direction in the virtual environment, and the first direction and the second direction are independent of each other.
  • the skill release operation corresponds to the quick release of the directional skills described above, that is, the master virtual character is controlled through the skill release operation to release the directional skills in a quick release manner.
  • the first direction is the direction automatically selected by the client during the release of the directional skill. For example, when there is an attack target within the preset range around the master virtual character, the direction corresponding to the location of the attack target is taken as the first direction; when there is no attack target within the preset range around the master virtual character, the facing direction of the master virtual character is taken as the first direction.
  • the embodiment of the present application takes the facing direction of the master virtual character as the first direction as an example for description.
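As a hedged sketch of the auto-selection rule above, the first direction could be computed as follows; the function name, the tuple representation of positions, and the preset range value are illustrative assumptions, not values taken from the embodiment:

```python
import math

ATTACK_RANGE = 8.0  # assumed preset range around the master virtual character

def pick_first_direction(hero_pos, facing, targets):
    """Auto-select the first direction: if an attack target lies within the
    preset range, point at the nearest such target; otherwise fall back to
    the master virtual character's facing direction."""
    in_range = [t for t in targets if math.dist(hero_pos, t) <= ATTACK_RANGE]
    if not in_range:
        return facing
    nearest = min(in_range, key=lambda t: math.dist(hero_pos, t))
    dx, dy = nearest[0] - hero_pos[0], nearest[1] - hero_pos[1]
    norm = math.hypot(dx, dy)
    return (dx / norm, dy / norm)  # unit vector toward the nearest target
```

For example, with no target in range the character's facing direction is returned unchanged, matching the case this embodiment goes on to describe.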
  • the method for acquiring the facing direction includes: directly acquiring, from the logic layer, the current orientation of the master virtual character in the virtual environment as the facing direction of the master virtual character; however, the facing direction obtained from the logic layer may be inaccurate due to the delay of the direction adjustment, resulting in an inaccurate skill release direction.
  • the virtual environment interface 600 includes a master virtual character 610, a skill release control 620, and a movement control 630.
  • the movement control 630 receives a drag operation toward the lower right and controls the master virtual character 610 to face the lower-right direction and move in that direction; meanwhile, a quick release operation is received on the skill release control 620 while the master virtual character 610 currently faces the lower left. Therefore, the skill release direction included in the skill release request sent by the terminal to the server corresponds to the lower left, and the skill is released toward the lower left, which differs from the direction controlled on the movement control 630.
  • the second direction corresponding to the movement control operation is used instead of the first direction to release the directional skill, that is, the second direction corresponding to the movement control operation is obtained from the presentation layer as the facing direction of the master virtual character.
  • the virtual environment interface also includes a mobile control.
  • the movement control operation is a drag operation received on the movement control. After the drag operation on the movement control is received, the dragging direction of the drag operation is obtained from the presentation layer, and the corresponding second direction in which the master virtual character moves is determined according to the dragging direction.
  • the mobile control operation is an operation triggered based on a mobile control
  • the presentation layer is used to implement interface performance and receive interface operations.
  • the presentation layer is used to display the screen corresponding to the virtual environment in the virtual environment interface and the controls used to control the main virtual character or the game process.
  • the presentation layer is also used to receive the touch operation on the virtual environment interface, and the touch operation is reported to the logic layer through the server for logic processing.
  • both the presentation layer and the logic layer exist in the game client. The logic layer cannot directly access the data in the presentation layer; the presentation layer can access the data in the logic layer, but cannot execute the logic of the logic layer.
  • therefore, logic processing needs to be performed in the logic layer through the server according to the received touch operation.
  • the user performs a touch operation on the movement control in the virtual environment interface, so that the touch data is read by the presentation layer and a movement touch message is generated;
  • the client sends the movement touch message to the server;
  • the logic layer adjusts the facing direction of the master virtual character according to the movement feedback message;
  • the presentation layer reads the adjusted facing direction from the logic layer and renders it, so as to realize the control of the master virtual character.
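A minimal sketch of this movement round trip is shown below; every class, method, and field name is an illustrative assumption (the actual client/server protocol is not specified by the embodiment). The key constraint it models is that the presentation layer may read, but never execute, logic-layer state:

```python
class LogicLayer:
    def __init__(self):
        self.facing = (1.0, 0.0)  # cached facing direction of the character

    def on_move_feedback(self, feedback):
        # logic processing: adjust the facing direction per the feedback message
        self.facing = feedback["direction"]

class Server:
    def handle_move_touch(self, message, logic):
        # relay the movement touch message back as a movement feedback message
        logic.on_move_feedback({"direction": message["direction"]})

class PresentationLayer:
    def __init__(self, server, logic):
        self.server, self.logic = server, logic

    def on_touch(self, drag_direction):
        # read touch data and send a movement touch message to the server
        self.server.handle_move_touch({"direction": drag_direction}, self.logic)

    def render_facing(self):
        # the presentation layer only reads logic-layer data for display
        return self.logic.facing
```

In this toy flow, a drag on the movement control ends up changing the cached facing direction only after the server-mediated round trip, mirroring the delay the embodiment describes.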
  • the virtual environment interface also includes a skill release control.
  • the skill release control is used to control the virtual object to release directional skills.
  • the skill release control corresponds to a first area and a second area, where the first area is used to trigger the quick release of the directional skill, and the second area is used to trigger the aiming release of the directional skill.
  • in response to receiving the first trigger operation in the first area of the skill release control, it is determined that the skill release operation is received; when the first trigger operation is received in the first area of the skill release control, the directional skill is quickly released.
  • in response to receiving the second trigger operation in the second area of the skill release control, the release direction corresponding to the second trigger operation is determined, and the master virtual character is controlled to release the directional skill in the corresponding release direction in the virtual environment, where the second area is the area other than the first area corresponding to the skill release control.
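One possible geometry for the two areas is an inner disc (the first area) surrounded by the rest of the control (the second area); the radii and function below are assumptions used only to illustrate the hit-test:

```python
import math

FIRST_AREA_RADIUS = 20.0  # assumed inner disc: quick release
CONTROL_RADIUS = 60.0     # assumed outer bound of the skill release control

def classify_trigger(touch, center):
    """Classify a touch on the skill release control as quick release
    (first area), aiming release (second area), or outside the control."""
    d = math.dist(touch, center)
    if d <= FIRST_AREA_RADIUS:
        return "quick_release"
    if d <= CONTROL_RADIUS:
        return "aiming_release"
    return "outside"
```

A touch dead-center triggers quick release, while dragging out into the ring switches to aiming release, matching the two release modes described above.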
  • Step 503 In response to the skill release operation and the movement control operation, control the master virtual character to release the directional skill in the second direction in the virtual environment.
  • the second direction is the direction corresponding to the movement control operation, that is, the adjusted facing direction when the user controls the master virtual character to adjust its facing direction, so that the release direction of the directional skill is consistent with the movement control direction.
  • the skill release control is usually set so that after a skill is released based on the skill release control, a specified period of time must elapse before the next skill release can be performed based on the skill release control; that is to say, after the skill is released, it can be released again only after the skill cools down. Therefore, based on the virtual character control method provided by the embodiment of the present application, the waste of skill release time caused by a wrong directional skill release direction can be reduced, thereby improving the efficiency of human-computer interaction.
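The cooldown gate described above can be sketched as follows; the time source and the cooldown duration used in the example are assumptions:

```python
class SkillCooldown:
    """Gate a skill so it can be released again only after cooling down."""

    def __init__(self, cooldown_seconds):
        self.cooldown = cooldown_seconds
        self.ready_at = 0.0  # timestamp at which the skill becomes releasable

    def try_release(self, now):
        """Return True and start the cooldown if the skill is releasable."""
        if now < self.ready_at:
            return False  # still cooling down after the previous release
        self.ready_at = now + self.cooldown
        return True
```

This is why a release in the wrong direction is costly: the attempt still consumes the whole cooldown window before the user can try again.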
  • in the virtual character control method, when the directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction instead of the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character and improves the accuracy of the directional skill, avoiding the low human-computer interaction efficiency caused by a wrong release direction, which requires waiting for the directional skill to cool down (that is, recovering for a period of time after release before entering the releasable state again) and re-releasing the directional skill based on the user's re-operation. This improves the efficiency of human-computer interaction and reduces the erroneous operations that the computer device needs to handle, thereby improving the overall performance of the computer device.
  • FIG. 7 is a flowchart of a virtual character control method provided by another exemplary embodiment of the present application.
  • the method can be executed by a computer device, and the computer device can be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in FIG. 4, or as another terminal in the computer system 400. As shown in FIG. 7, the method includes:
  • Step 701 Display a virtual environment interface.
  • the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment.
  • a release control for controlling the master virtual character to release directional skills and a movement control for controlling the master virtual character to move in the virtual environment are superimposed and displayed on the screen.
  • in the rapid release process of the directional skill, the first direction is used as the release direction of the skill; that is, the facing direction of the master virtual character in the virtual environment is acquired, and the directional skill is released in the facing direction of the master virtual character.
  • Step 702 Receive a skill release operation and a movement control operation.
  • the skill release operation is used to control the master virtual character to release the directional skill in the first direction in the virtual environment; the movement control operation is used to control the master virtual character to move in the second direction in the virtual environment, and the first direction and the second direction are independent of each other.
  • the skill release operation corresponds to the quick release of the directional skills described above, that is, the master virtual character is controlled through the skill release operation to release the directional skills in a quick release manner.
  • the movement control operation is an operation triggered by a movement control.
  • the second direction corresponding to the movement control operation is obtained from the presentation layer, and the presentation layer is used to implement interface performance and receive interface operations.
  • the movement control operation is realized by a drag operation on the movement control, that is, the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the corresponding second direction in which the master virtual character moves is determined according to the drag direction.
  • Step 703 Send a skill release data packet to the server, where the skill release data packet includes the second direction.
  • the presentation layer can access the data of the logic layer, but cannot modify the logic of the logic layer, that is, it cannot control the logic layer to perform logic processing. Therefore, after obtaining the second direction, the presentation layer sends the skill release data packet to the server, where the skill release data packet includes the second direction as the release direction of the directional skill.
  • Step 704 Receive a skill release feedback package sent by the server.
  • the skill release feedback package sent by the server is received through the logic layer, and the logic layer performs logical processing according to the skill release feedback package.
  • Step 705 In response to the skill release feedback package, control the master virtual character to release the directional skill in the second direction in the virtual environment.
  • the logic layer controls the master virtual character to release the directional skill in the second direction in the virtual environment according to the control data in the skill release feedback package.
  • the second direction is the direction corresponding to the movement control operation received on the movement control, that is, the adjusted facing direction when the user controls the master virtual character to adjust its facing direction, so that the release direction of the directional skill is consistent with the movement control direction.
  • the skill release logic is described in conjunction with the user, presentation layer, server, and logic layer.
  • when the skill release operation is triggered, the presentation layer 820 sends the skill release data packet to the server 830, where the skill release data packet includes the movement control direction.
  • after the server 830 receives the skill release data packet, it sends the skill release feedback packet to the logic layer 840.
  • the logic layer 840 performs logic processing based on the skill release feedback packet and sends the skill release status to the presentation layer 820 to instruct the presentation layer 820 to display the skill release process.
  • the presentation layer 820 obtains the skill release status from the logic layer 840 and displays the skill release process on the presentation layer 820.
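The FIG. 8 sequence can be sketched as the following toy message flow; every class, method, and field name here is an illustrative assumption rather than the embodiment's actual interfaces:

```python
class LogicLayer:
    def __init__(self):
        self.release_state = None  # skill release status read by presentation

    def on_skill_feedback(self, feedback):
        # logic processing: record the release direction as the skill state
        self.release_state = ("releasing", feedback["direction"])

class Server:
    def handle_skill_release(self, packet, logic):
        # forward the release direction to the logic layer as a feedback packet
        logic.on_skill_feedback({"direction": packet["direction"]})

class PresentationLayer:
    def __init__(self, server, logic):
        self.server, self.logic = server, logic

    def on_quick_release(self, move_direction):
        # the skill release data packet carries the movement control direction
        self.server.handle_skill_release({"direction": move_direction}, self.logic)

    def show_release(self):
        # read the skill release status from the logic layer and display it
        return self.logic.release_state
```

The point of the sketch is the ordering: the presentation layer originates the packet, the server mediates, the logic layer holds the authoritative state, and the presentation layer only reads it back for display.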
  • the virtual environment interface 900 includes a master virtual character 910, a mobile joystick 920, and a directional skill trigger control 930, where the master virtual character 910 faces the first direction in the virtual environment; the movement control operation is received on the mobile joystick 920, and the master virtual character 910 is controlled to face the second direction in the virtual environment.
  • the client receives the trigger operation on the trigger control 930, so that the client reads, from the presentation layer, the movement control operation on the mobile joystick 920, and releases the directional skill in the second direction.
  • in summary, when the directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction instead of the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character and improves the accuracy of the directional skill when it is released, avoiding the low human-computer interaction efficiency caused by a wrong release direction, which requires waiting for the directional skill to cool down (that is, recovering for a period of time after release before entering the releasable state again) and re-releasing the directional skill based on the user's re-operation. This improves the efficiency of human-computer interaction and reduces the erroneous operations that the computer device needs to handle, thereby improving the overall performance of the computer device.
  • the method provided in this embodiment sends the skill release data packet to the server through the presentation layer of the terminal, and the server feeds back the skill release feedback packet to the logic layer of the terminal, thereby realizing the release of the skill from the logic layer; at the same time, the second direction in the presentation layer is used to control the master virtual character to release the directional skill, which improves the accuracy of the release of the directional skill.
  • FIG. 10 is a flowchart of a method for controlling a virtual character provided by another exemplary embodiment of the present application.
  • the method can be executed by a computer device.
  • the computer device can be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in FIG. 4, or as another terminal in the computer system 400.
  • the method includes:
  • Step 1001 Display a virtual environment interface.
  • the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment.
  • a release control for controlling the master virtual character to release directional skills and a movement control for controlling the master virtual character to move in the virtual environment are superimposed and displayed on the screen.
  • in the rapid release process of the directional skill, the corresponding first direction is used as the release direction of the skill; that is, the facing direction of the master virtual character in the virtual environment is acquired, and the directional skill is released in the facing direction.
  • Step 1002 Receive a skill release operation, and the skill release operation is used to control the master virtual character to release the directional skill in the first direction.
  • the skill release operation corresponds to the quick release of the above-mentioned directional skill, that is, the master virtual character is controlled by the skill release operation to release the directional skill in the quick release mode; after acquiring the facing direction of the master virtual character, the client releases the directional skill in the facing direction.
  • Step 1003 Receive a movement control operation, where the movement control operation is used to control the master virtual character to move in the second direction in the virtual environment.
  • the movement control operation is realized by a drag operation on the movement control, that is, the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the second direction corresponding to the drag direction is determined.
  • in response to the movement control operation, a movement control data packet is sent to the server, where the movement control data packet includes the second direction; the movement control feedback packet sent by the server is received, and in response to the movement control feedback packet, the master virtual character is controlled to move facing the second direction in the virtual environment.
  • the second direction is cached in the logic layer as the facing direction of the master virtual character.
  • step 1002 can be executed first and then step 1003; step 1003 can be executed first and then step 1002; or step 1002 and step 1003 can be executed at the same time.
  • the execution order of step 1002 and step 1003 is not limited in the embodiments of the present application.
  • Step 1004 Control the master virtual character to release directional skills in the second direction in the virtual environment.
  • the presentation layer sends a skill release data package to the server.
  • the skill release data package includes the second direction.
  • the logic layer receives the skill release feedback packet fed back by the server, and controls the master virtual character to release the skill in the second direction in the virtual environment according to the skill release feedback packet.
  • Step 1005 In response to the skill release operation when no movement control operation is received, obtain the facing direction of the master virtual character from the logic layer as the first direction.
  • if the facing direction of the master virtual character is not adjusted, the current facing direction of the master virtual character is used as the first direction to release the skill.
  • Step 1006 Control the master virtual character in the first direction to release directional skills in the virtual environment.
  • the logic layer receives the skill release feedback packet fed back by the server, and controls the master virtual character to release the skill in the first direction in the virtual environment according to the skill release feedback packet.
  • Step 1101 Determine whether the presentation layer has received an input operation based on moving the joystick; if so, perform step 1102, otherwise, perform step 1103.
  • Step 1102 Use the operation direction of the input operation of moving the joystick received by the presentation layer as the skill release direction.
  • Step 1103 Use the facing direction of the master virtual character cached in the logic layer as the skill release direction.
  • the method provided in this embodiment determines, when the skill release operation is received, whether a movement control operation is received on the movement control; when the movement control operation is received, the movement control direction in the presentation layer is used to control the release of the directional skill, and when no movement control operation is received, the facing direction cached in the logic layer is used to control the release of the directional skill, so as to determine an accurate release direction of the directional skill during the release process and improve the accuracy of the directional skill.
  • FIG. 12 shows an overall flowchart of the skill release process provided by an exemplary embodiment of the present application. As shown in FIG. 12, the process includes:
  • Step 1201 Receive a skill release operation.
  • the skill release operation is used to control the master virtual character to release the directional skills in the virtual environment.
  • Step 1202 Determine whether there is a skill rocker orientation; if yes, perform step 1203; otherwise, perform step 1204.
  • the orientation of the skill joystick is used to distinguish the release modes of the directional skills, where the release modes include quick release and aiming release.
  • if the skill joystick orientation exists, it means that the current release mode of the directional skill is aiming release; otherwise, the current directional skill release mode is quick release.
  • Step 1203 Use the skill rocker orientation as the skill release orientation.
  • Step 1204 Judge whether there is a moving joystick orientation; if yes, go to step 1205; otherwise, go to step 1206.
  • if the direction of the skill joystick does not exist, it means that the current directional skill release method is quick release; it is then further determined whether the facing direction of the master virtual character needs to be adjusted through the moving joystick during the quick release process.
  • Step 1205 Use the moving joystick orientation as the skill release direction, that is, the direction of the moving joystick is used as the direction of skill release.
  • Step 1206 Use the character orientation as the skill release orientation.
  • Step 1207 Release the directional skill.
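The decision chain of steps 1202-1207 amounts to a simple priority rule, sketched here with assumed parameter names (`None` meaning "no joystick input"):

```python
def resolve_release_direction(skill_joystick, move_joystick, facing):
    """Pick the skill release direction with the priority
    skill joystick > moving joystick > character facing direction."""
    if skill_joystick is not None:
        return skill_joystick   # step 1203: aiming release
    if move_joystick is not None:
        return move_joystick    # step 1205: quick release while steering
    return facing               # step 1206: quick release, no movement input
```

Only when neither joystick provides a direction does the release fall back to the orientation cached for the character, which is the case where direction-adjustment delay could otherwise cause a wrong release.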
  • FIG. 13 shows a schematic diagram of the virtual environment interface for the quick release of the directional skill provided by an exemplary embodiment of the present application.
  • the virtual environment interface includes a master virtual character 1310, a mobile joystick 1320, and a directional skill trigger control 1330.
  • the master virtual character 1310 faces right in the virtual environment (the first direction); the terminal receives the user's movement control operation based on the mobile joystick 1320 and controls the master virtual character 1310 to move to the left.
  • the orientation of the master virtual character in the virtual environment is changed to face left (second direction).
  • the trigger operation by the user on the directional skill trigger control 1330 is also received.
  • the user expects to release the directional skill in the changed direction (the second direction), but it takes a certain time for the master virtual character to switch direction during the user's operation, so if the user releases the directional skill in a quick cast manner, the skill may be released before the direction switch is completed.
  • in the method provided by the embodiment of the present application, the master virtual character ignores the current orientation and directly releases the directional skill in the direction corresponding to the user's movement control operation, so that when the directional skill is quickly released, the interaction efficiency is ensured and the accuracy of the skill release is also improved.
  • in summary, when the directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction instead of the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character and improves the accuracy of the directional skill when it is released, avoiding the low human-computer interaction efficiency caused by a wrong release direction, which requires waiting for the directional skill to cool down (that is, recovering for a period of time after release before entering the releasable state again) and re-releasing the directional skill based on the user's re-operation. This improves the efficiency of human-computer interaction and reduces the erroneous operations that the computer device needs to handle, thereby improving the overall performance of the computer device.
  • Fig. 14 is a structural block diagram of a virtual character control device provided by an exemplary embodiment of the present application. As shown in Fig. 14, the device includes:
  • the display module 1410 is configured to display a virtual environment interface, the virtual environment interface includes a screen for observing the virtual environment, and the screen includes a master virtual character located in the virtual environment;
  • the receiving module 1420 is configured to receive a skill release operation and a movement control operation.
  • the skill release operation is used to control the master virtual character to release the directional skill in the first direction in the virtual environment, and the movement control operation is used to control the master virtual character to move in the second direction in the virtual environment;
  • the release module 1430 is configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release operation and the movement control operation.
  • the virtual environment interface further includes a mobile control, and the mobile control operation is a drag operation received on the mobile control;
  • the receiving module 1420 is further configured to receive a drag operation on the mobile control
  • the device further includes:
  • the obtaining module 1440 is configured to obtain the drag direction of the drag operation from the presentation layer; and determine the corresponding second direction when the master virtual character moves according to the drag direction.
  • the device further includes:
  • a sending module 1450 configured to send a skill release data packet to the server, where the skill release data packet includes the second direction;
  • the receiving module 1420 is further configured to receive a skill release feedback package sent by the server;
  • the release module 1430 is further configured to control the master virtual character to release the directional skill in the second direction in the virtual environment in response to the skill release feedback package.
  • the sending module 1450 is configured to send a movement control data packet to the server in response to the movement control operation, the movement control data packet including the second direction;
  • the receiving module 1420 is further configured to receive a mobility control feedback packet sent by the server;
  • the device further includes:
  • the movement module is configured to control the master virtual character to move in the second direction in the virtual environment in response to the movement control feedback packet.
  • the device further includes:
  • the buffer module is configured to buffer the second direction in the logic layer as the facing direction of the master virtual character in response to the movement control feedback packet.
  • the obtaining module 1440 is further configured to obtain, in response to the skill release operation and in a case that the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction;
  • the release module 1430 is further configured to control the master virtual character to release the directional skill in the virtual environment in the first direction.
  • the virtual environment interface further includes a skill release control
  • the receiving module 1420 is further configured to receive a first trigger operation in the first area of the skill release control as the skill release operation.
  • the receiving module 1420 is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area, and to determine the release direction corresponding to the second trigger operation;
  • the release module 1430 is also used to control the master virtual character to release the directional skill in the release direction in the virtual environment.
  • In summary, when the virtual character control apparatus releases the directional skill, if the movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character, improves the accuracy of directional skill release, and avoids the problem of low human-computer interaction efficiency caused by a wrong release direction, which would require waiting for the directional skill to cool down (that is, to recover for a period of time after release and re-enter the releasable state) and re-releasing it based on another user operation. Human-computer interaction efficiency is thereby improved, erroneous operations that the computer device needs to process are reduced, and the overall performance of the computer device is improved.
  • the present application also provides a terminal.
  • the terminal includes a processor and a memory, where at least one instruction is stored in the memory, and the at least one instruction is loaded and executed by the processor to implement the steps performed by the first terminal or the steps performed by the second terminal in the virtual character control methods provided by the foregoing method embodiments. It should be noted that the terminal may be the terminal provided in Figure 16 below.
  • FIG. 16 shows a structural block diagram of a terminal 1600 provided by an exemplary embodiment of the present application.
  • the terminal 1600 may be: a smart phone, a tablet computer, an MP3 player, an MP4 player, a notebook computer, or a desktop computer.
  • the terminal 1600 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
  • the terminal 1600 includes a processor 1601 and a memory 1602.
  • the processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 1601 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1601 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1601 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1602 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1602 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1602 is used to store at least one instruction, and the at least one instruction is executed by the processor 1601 to implement the virtual character control method provided in the method embodiments of the present application.
  • the terminal 1600 optionally further includes: a peripheral device interface 1603 and at least one peripheral device.
  • the processor 1601, the memory 1602, and the peripheral device interface 1603 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1603 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1604, a display screen 1605, a camera component 1606, an audio circuit 1607, a positioning component 1608, and a power supply 1609.
  • the terminal 1600 further includes one or more sensors 1610.
  • the one or more sensors 1610 include, but are not limited to: an acceleration sensor 1611, a gyroscope sensor 1612, a pressure sensor 1613, a fingerprint sensor 1614, an optical sensor 1615, and a proximity sensor 1616.
  • the structure shown in FIG. 16 does not constitute a limitation on the terminal 1600, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
  • the memory also includes one or more programs, the one or more programs are stored in the memory, and the one or more programs include instructions for performing all or part of the steps of the virtual character control methods provided in the embodiments of the present application.
  • the present application provides a computer-readable storage medium that stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement all or part of the steps of the virtual character control methods provided by the foregoing method embodiments.
  • the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the computer device executes all or part of the steps in the virtual character control method provided by the foregoing method embodiments.
  • A person of ordinary skill in the art may understand that all or part of the steps of the foregoing embodiments may be completed by hardware, or by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium.
  • the storage medium mentioned may be a read-only memory, a magnetic disk, an optical disc, or the like.


Abstract

A virtual character control method and apparatus, a device, and a storage medium. The method includes: displaying a virtual environment interface; receiving a skill release operation and a movement control operation, the skill release operation being used to release a directional skill in a first direction, the movement control operation being used to control a master virtual character to move in a second direction, the first direction and the second direction being independent of each other; and controlling the master virtual character to release the directional skill in the second direction. When the directional skill is released, the second direction is determined according to the received movement control operation, and the master virtual character is controlled to release the directional skill in the second direction. The present invention can ensure that the directional skill is released in the adjusted facing direction of the master virtual character, improving the accuracy of directional skill release.

Description

Virtual character control method and apparatus, device, and storage medium

This application claims priority to Chinese Patent Application No. 202010328532.3, entitled "Virtual character control method and apparatus, device, and storage medium", filed on April 23, 2020, the entire contents of which are incorporated herein by reference.

Technical Field

The embodiments of this application relate to the field of virtual environments, and in particular, to a virtual character control method and apparatus, a device, and a storage medium.

Background

A battle game is a game in which multiple user accounts compete in the same virtual scene. Optionally, the battle game may be a multiplayer online battle arena (MOBA) game, in which a user can control a virtual object to release skills and thereby attack hostile virtual objects.

In the related art, skill release includes at least two release modes: quick release and aimed release. Quick release means that, after the user triggers the release control of a skill, the skill is released according to the current facing direction of the virtual object in the virtual environment.

However, with the foregoing release method, when the user triggers quick release of a skill while adjusting the facing direction of the virtual object through a movement control, the facing direction of the virtual object has not yet been updated. Consequently, the direction in which the user expects the skill to be released differs from the direction in which the skill is actually released, the accuracy of skill release is low, and the user must wait for the skill to cool down before releasing it again, resulting in low human-computer interaction efficiency.
Summary

The embodiments of this application provide a virtual character control method and apparatus, a device, and a storage medium, which can improve human-computer interaction efficiency during skill release. The technical solutions are as follows:

In one aspect, a virtual character control method is provided, the method being performed by a computer device and including:

displaying a virtual environment interface, the virtual environment interface including a picture for observing the virtual environment, the picture including a master virtual character located in the virtual environment;

receiving a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment, the first direction and the second direction being independent of each other; and

controlling, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.
In another aspect, a virtual character control apparatus is provided, the apparatus being applied to a computer device and including:

a display module, configured to display a virtual environment interface, the virtual environment interface including a picture for observing the virtual environment, the picture including a master virtual character located in the virtual environment;

a receiving module, configured to receive a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment; and

a release module, configured to control, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.

In an optional embodiment, the virtual environment interface further includes a movement control, and the movement control operation is a drag operation received on the movement control;

the receiving module is further configured to receive the drag operation on the movement control; and

the apparatus further includes:

an obtaining module, configured to obtain a drag direction of the drag operation from a presentation layer, and determine, according to the drag direction, the second direction in which the master virtual character moves.

In an optional embodiment, the apparatus further includes:

a sending module, configured to send a skill release data packet to a server, the skill release data packet including the second direction;

the receiving module is further configured to receive a skill release feedback packet sent by the server; and

the release module is further configured to control, in response to the skill release feedback packet, the master virtual character to release the directional skill in the second direction in the virtual environment.

In an optional embodiment, the sending module is configured to send, in response to the movement control operation, a movement control data packet to the server, the movement control data packet including the second direction;

the receiving module is further configured to receive a movement control feedback packet sent by the server; and

the apparatus further includes:

a movement module, configured to control, in response to the movement control feedback packet, the master virtual character to move facing the second direction in the virtual environment.

In an optional embodiment, the apparatus further includes:

a caching module, configured to cache, in response to the movement control feedback packet, the second direction in a logic layer as the facing direction of the master virtual character.

In an optional embodiment, the obtaining module is further configured to obtain, in response to the skill release operation and in a case that the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction; and

the release module is further configured to control the master virtual character to release the directional skill in the first direction in the virtual environment.

In an optional embodiment, the virtual environment interface further includes a skill release control; and

the receiving module is configured to receive a first trigger operation in a first area of the skill release control as the skill release operation.

In an optional embodiment, the receiving module is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area, and determine a release direction corresponding to the second trigger operation; and

the release module is further configured to control the master virtual character to release the directional skill in the release direction in the virtual environment.
In another aspect, a computer device is provided, including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the virtual character control method described above.

In another aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the virtual character control method described above.

In another aspect, a computer program product or computer program is provided, including computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the virtual character control method described above.

The beneficial effects brought by the technical solutions provided in the embodiments of this application include at least the following:

When the directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character, improves the accuracy of directional skill release, and avoids the problem of low human-computer interaction efficiency caused by a wrong release direction, which would require waiting for the directional skill to cool down (that is, to recover for a period of time after release and re-enter the releasable state) and re-releasing it based on another user operation. Human-computer interaction efficiency is thereby improved, erroneous operations that the computer device needs to process are reduced, and the overall performance of the computer device is improved.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.

Fig. 1A and Fig. 1B are schematic interface diagrams of a skill release process provided by an exemplary embodiment of this application;

Fig. 2 is a schematic timeline diagram of quick skill release in the related art;

Fig. 3 is a schematic timeline diagram of quick skill release provided by an exemplary embodiment of this application;

Fig. 4 is a structural block diagram of a computer system provided by an exemplary embodiment of this application;

Fig. 5 is a flowchart of a virtual character control method provided by an exemplary embodiment of this application;

Fig. 6 is a schematic interface diagram of skill release and movement control based on the embodiment shown in Fig. 5;

Fig. 7 is a flowchart of a virtual character control method provided by another exemplary embodiment of this application;

Fig. 8 is a flowchart of a skill release process based on the embodiment shown in Fig. 7;

Fig. 9 is a schematic interface diagram of skill release and movement control based on the embodiment shown in Fig. 7;

Fig. 10 is a flowchart of a virtual character control method provided by another exemplary embodiment of this application;

Fig. 11 is a flowchart of a skill release process based on the embodiment shown in Fig. 10;

Fig. 12 is an overall flowchart of a skill release process provided by an exemplary embodiment of this application;

Fig. 13 is a schematic diagram of a virtual environment interface for quick release of a directional skill provided by an exemplary embodiment of this application;

Fig. 14 is a structural block diagram of a virtual character control apparatus provided by an exemplary embodiment of this application;

Fig. 15 is a structural block diagram of a virtual character control apparatus provided by another exemplary embodiment of this application; and

Fig. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of this application.
Detailed Description

To make the objectives, technical solutions, and advantages of this application clearer, the implementations of this application are further described in detail below with reference to the accompanying drawings.

First, terms involved in the embodiments of this application are briefly introduced:

1) Virtual environment

A virtual environment is the virtual environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are described with an example in which the virtual environment is a three-dimensional virtual environment. In some embodiments, the virtual environment is used to provide a battle environment for at least two master virtual characters. The virtual environment includes a symmetric lower-left region and upper-right region; master virtual characters belonging to two hostile camps each occupy one of the regions, and the victory objective is to destroy a target building, stronghold, base, or crystal deep in the opposing region.

2) Virtual character

A virtual character is a movable object in the virtual environment. The movable object may be a virtual person, a virtual animal, an anime character, or the like, such as a person or an animal displayed in a three-dimensional virtual environment. Optionally, the virtual character is a three-dimensional model created based on skeletal animation technology. Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in the three-dimensional virtual environment. The embodiments of this application take an example in which the virtual character is a master virtual character controlled by a user, and "master virtual character" generally refers to one or more master virtual characters in the virtual environment.

3) Multiplayer online battle arena (MOBA) game

A multiplayer online battle arena game is a game in which, in a virtual environment, different virtual teams belonging to at least two hostile camps each occupy their own map region and compete toward a certain victory condition. The victory condition includes, but is not limited to, at least one of: capturing strongholds or destroying the strongholds of the hostile camp, killing virtual characters of the hostile camp, ensuring survival in a specified scene and period, seizing a certain resource, or outscoring the opponent within a specified time. The tactical competition may be conducted in rounds, and the maps of different rounds may be the same or different. Each virtual team includes one or more virtual characters, for example 1, 2, 3, or 5. A MOBA game round lasts from the moment the game starts until the moment the victory condition is met.
The method provided in this application may be applied to a virtual reality application, a three-dimensional map program, a military simulation program, a first-person shooting (FPS) game, a MOBA game, and the like. The following embodiments are described using an application in a game as an example.

A game based on a virtual environment consists of one or more game worlds. The virtual environment in the game may simulate real-world scenes. A user can control a master virtual character in the game to walk, run, jump, shoot, fight, drive, release skills, be attacked by other virtual characters, be harmed by the virtual environment, attack other virtual characters, and perform other actions in the virtual environment. The interactivity is strong, and multiple users can team up online for competitive play.

In some embodiments, when the master virtual character releases a skill in the virtual environment, at least one of the following skill release modes is included:

First, quick release, which means that during skill release, triggering the skill release control releases the skill in the facing direction of the virtual object in the virtual environment.

Optionally, the skill release control corresponds to a first area and a second area. When a first trigger operation in the first area is received, the directional skill is released in the virtual environment in a first direction, where the first direction is the facing direction of the virtual object, or the first direction is the direction corresponding to the position of an attack target within the skill release range. Optionally, when the first trigger operation is received and there is no attack target within a preset range around the virtual object, the skill is released in the facing direction of the virtual object in the virtual environment. The first trigger operation in the first area includes a touch operation acting on the skill release control whose end position is within the first area; or, the first trigger operation in the first area includes a touch operation acting within the first area that does not move out of the first area.

Second, aimed release, which means that during skill release, the skill release direction is adjusted through the skill release control, and the skill is released in the adjusted direction.

Optionally, when a second trigger operation in the second area is received, the skill release direction is determined according to the second trigger operation, and when the trigger operation ends, the skill is released in the skill release direction. The second trigger operation in the second area includes a touch operation starting in the first area whose end position is within the second area; or, the second trigger operation in the second area includes a touch operation acting within the second area that does not move out of the second area.

Schematically, Fig. 1A and Fig. 1B are schematic interface diagrams of a skill release process provided by an exemplary embodiment of this application. As shown in Fig. 1A, the virtual environment interface 100 includes a skill release control 110, and the skill release control 110 includes a first area 111 and a second area 112. The skill release process implemented based on the skill release control 110 in Fig. 1A is shown in Fig. 1B: in response to receiving a skill release operation in the first area 111, the skill is released in the facing direction of the virtual object 120 in the virtual environment; in response to receiving a skill release operation in the second area 112, the release direction corresponding to the skill release operation is determined and the skill is released in the release direction.
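The first-area and second-area trigger logic described above can be sketched as follows. This is a minimal illustration only; the circular control geometry, the center coordinates, the radii, and all names are assumptions for illustration, not part of this application:

```python
import math

# Hypothetical geometry: the skill release control is modeled as a circle;
# the inner disc is the first area (quick release) and the surrounding ring
# is the second area (aimed release).
CONTROL_CENTER = (900.0, 500.0)
FIRST_AREA_RADIUS = 40.0
SECOND_AREA_RADIUS = 120.0

def classify_trigger(touch_end):
    """Classify where a touch on the skill release control ended."""
    dx = touch_end[0] - CONTROL_CENTER[0]
    dy = touch_end[1] - CONTROL_CENTER[1]
    dist = math.hypot(dx, dy)
    if dist <= FIRST_AREA_RADIUS:
        return "quick_release"   # first trigger operation, first area
    if dist <= SECOND_AREA_RADIUS:
        return "aimed_release"   # second trigger operation, second area
    return "cancel"              # touch ended outside the control

print(classify_trigger((910.0, 505.0)))  # ends inside the first area
```

In this sketch, a touch ending in the ring between the two radii would begin the aimed-release flow in which the release direction is taken from the touch position rather than from the character's facing direction.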
The embodiments of this application describe the quick-release mode mentioned above:

In the related art, the virtual environment interface further includes a movement joystick used to control the facing direction of the virtual object and to control the virtual object to move in the virtual environment. When the movement joystick receives a control operation that changes the facing direction of the virtual object and, at the same time, a quick release operation is received on the skill release control, the mobile terminal obtains the facing direction of the virtual object from the logic layer and quickly releases the skill in that facing direction. In a poor network environment, however, after the control operation received by the movement joystick is uploaded to the server, no feedback message has yet been received; that is, the virtual object has not completed the facing-direction adjustment at the logic layer, and the facing direction obtained by the client from the logic layer is the direction before the adjustment. As a result, the release direction of the skill differs from the adjusted facing direction, and the accuracy of the skill release direction is low.

Schematically, referring to Fig. 2, on the timeline, the skill release process includes the following steps:

Step 201: The virtual object faces a first direction.

Step 202: The movement joystick controls the virtual object to face a second direction.

Step 203: The skill is triggered on the skill trigger control, and the facing direction of the virtual object (the first direction) is obtained.

Step 204: The client sends a movement control packet to the server.

Step 205: The client sends a skill release packet to the server.

Step 206: The client receives the movement control feedback packet from the server and controls the virtual object to face the second direction.

Step 207: The client receives the skill release feedback packet from the server and controls the virtual object to release the skill in the first direction.

Step 208: The virtual object faces the second direction.

That is, there is a time difference between controlling the virtual object to face the second direction and obtaining the direction from the logic layer for skill release, so the direction obtained from the logic layer is the first direction before the update, and the skill release direction is inaccurate.
In the embodiments of this application, by contrast, as shown in Fig. 3, on the timeline, the skill release process includes the following steps:

Step 301: The virtual object faces a first direction.

Step 302: The movement joystick controls the virtual object to face a second direction.

Step 303: The skill is triggered on the skill trigger control, and the control direction of the movement joystick (the second direction) is obtained.

Step 304: The client sends a movement control packet to the server.

Step 305: The client sends a skill release packet to the server.

Step 306: The client receives the movement control feedback packet from the server and controls the virtual object to face the second direction.

Step 307: The client receives the skill release feedback packet from the server and controls the virtual object to release the skill in the second direction.

Step 308: The virtual object faces the second direction.

That is, during skill release, the obtained release direction is the control direction received on the movement control, which is the direction the virtual object will ultimately face, thereby improving the accuracy of the skill release direction.
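The difference between the two timelines can be sketched as follows. This is a minimal illustration under assumed data structures: `facing` mimics the facing direction cached at the logic layer, and `joystick` mimics the live movement input at the presentation layer; none of these names come from this application:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec = Tuple[float, float]

@dataclass
class Client:
    facing: Vec                      # logic-layer cached facing direction
    joystick: Optional[Vec] = None   # presentation-layer move input, if any

    def release_direction_related_art(self) -> Vec:
        # Fig. 2 behavior: always read the (possibly stale) logic-layer facing.
        return self.facing

    def release_direction_this_application(self) -> Vec:
        # Fig. 3 behavior: prefer the live joystick direction when a movement
        # control operation is in progress.
        return self.joystick if self.joystick is not None else self.facing

# The player drags the joystick left while the cached facing is still right:
c = Client(facing=(1.0, 0.0), joystick=(-1.0, 0.0))
print(c.release_direction_related_art())       # stale first direction
print(c.release_direction_this_application())  # updated second direction
```

The stale read in the first method is exactly the error described for the related art: the skill packet is built before the movement feedback updates the logic layer.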
Fig. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of this application. The computer system 400 includes a first terminal 420, a server 440, and a second terminal 460.

An application supporting a virtual environment is installed and runs on the first terminal 420. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, a multiplayer gunfight survival game, or a battle-royale shooting game. The first terminal 420 is a terminal used by a first user, who uses the first terminal 420 to control a first master virtual character in the virtual environment to perform activities including, but not limited to, at least one of: adjusting body posture, walking, running, jumping, releasing skills, picking up items, attacking, and dodging attacks from other virtual characters. Schematically, the first master virtual character is a first virtual person, such as a simulated person character or an anime character. Schematically, the first master virtual character releases an area-type skill in the virtual environment, and the virtual environment picture moves from the position of the master virtual character to the target area selected by an area-type skill indicator. The area-type skill indicator is used to indicate the skill release area when the master virtual character releases a skill.

The first terminal 420 is connected to the server 440 through a wireless or wired network.

The server 440 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center. Schematically, the server 440 includes a processor 444 and a memory 442, and the memory 442 includes a receiving module 4421, a control module 4422, and a sending module 4423. The receiving module 4421 is configured to receive requests sent by clients, such as team-forming requests; the control module 4422 is configured to control rendering of the virtual environment picture; and the sending module 4423 is configured to send message notifications to clients, such as team-forming success notifications. The server 440 provides background services for applications supporting a three-dimensional virtual environment. Optionally, the server 440 performs the primary computing work while the first terminal 420 and the second terminal 460 perform the secondary computing work; or the server 440 performs the secondary computing work while the first terminal 420 and the second terminal 460 perform the primary computing work; or the server 440, the first terminal 420, and the second terminal 460 perform collaborative computing using a distributed computing architecture.

The second terminal 460 is connected to the server 440 through a wireless or wired network.

An application supporting a virtual environment is installed and runs on the second terminal 460. The application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, a multiplayer gunfight survival game, or a battle-royale shooting game. The second terminal 460 is a terminal used by a second user, who uses the second terminal 460 to control a second master virtual character in the virtual environment to perform activities including, but not limited to, at least one of: adjusting body posture, walking, running, jumping, releasing skills, picking up items, attacking, and dodging attacks from other master virtual characters. Schematically, the second master virtual character is a second virtual person, such as a simulated person character or an anime character.

Optionally, the first virtual person character and the second virtual person character are in the same virtual environment. Optionally, the first virtual person character and the second virtual person character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permissions.

Optionally, the applications installed on the first terminal 420 and the second terminal 460 are the same, or the applications installed on the two terminals are the same type of application on different operating-system platforms. The first terminal 420 may generally refer to one of multiple terminals, and the second terminal 460 may generally refer to one of multiple terminals; this embodiment is described using only the first terminal 420 and the second terminal 460 as examples. The device types of the first terminal 420 and the second terminal 460 are the same or different, and the device types include at least one of: a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, and a desktop computer. The following embodiments are described using a smartphone as an example of the terminal.

A person skilled in the art may understand that the number of the foregoing terminals may be more or fewer. For example, there may be only one terminal, or dozens, hundreds, or more. The embodiments of this application do not limit the number or device types of the terminals.
Fig. 5 is a flowchart of a virtual character control method provided by an exemplary embodiment of this application. The method may be performed by a computer device, which may be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in Fig. 4, or another terminal in the computer system 400. As shown in Fig. 5, the method includes:

Step 501: Display a virtual environment interface, the virtual environment interface including a picture for observing the virtual environment, the picture including a master virtual character located in the virtual environment.

Optionally, the virtual environment interface further includes a release control for controlling the master virtual character to release a directional skill, and a movement control for controlling the master virtual character to move in the virtual environment.

Optionally, the picture includes the master virtual character located in the virtual environment. An application supporting a virtual environment runs on the terminal used by the user. When the user runs the application, the display screen of the terminal displays the user interface of the application, that is, the virtual environment interface. The virtual environment interface displays a picture of the virtual environment observed from a target observation position, and the virtual environment displayed by the picture includes at least one of the following elements: mountains, plains, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles.

In some embodiments, the virtual environment has an arbitrary boundary shape; for example, the virtual environment is diamond-shaped. The user can browse the full view of the virtual environment by viewing the map corresponding to the virtual environment. A camera model is set in the virtual environment and is used to observe the virtual environment from different perspectives, thereby obtaining the virtual environment picture.

A perspective is the observation angle when observing the virtual environment from the first-person or third-person perspective of the master virtual character.

Optionally, the virtual environment interface displays a release control for controlling the master virtual character to release a directional skill, where the directional skill corresponds to a skill release direction; that is, when the skill is released, the directional skill needs to be released in a specified direction. The skill release direction includes at least one of the following two cases:

First, during quick release of the directional skill, the master virtual character is controlled to release the directional skill in a first direction. Schematically, when there is no attack target within a preset range around the virtual object, the facing direction of the virtual object in the virtual environment is used as the release direction of the directional skill; when there is an attack target within the preset range around the virtual object, the direction corresponding to the attack target is used as the release direction of the directional skill.

Second, during aimed release of the directional skill, the release direction of the directional skill is adjusted through a release adjustment operation on the release control, and when release is triggered, the directional skill is released in the adjusted release direction.

The embodiments of this application describe the quick release process of the directional skill described above, that is, the skill release mode in which, during release of the directional skill, the facing direction of the master virtual character in the virtual environment is obtained and the directional skill is released in the facing direction.

Optionally, the virtual environment interface further displays a movement control for controlling the master virtual character to move. While controlling the master virtual character to move, the movement control can also be used to control the master virtual character to adjust its movement direction; that is, the user can adjust the facing direction of the master virtual character through the movement control and control the master virtual character to move in the facing direction in the virtual environment.

Optionally, because quick release improves skill release efficiency when releasing a directional skill, the user typically quickly adjusts the facing direction of the master virtual character through the movement control and then, through a skill release operation for the directional skill, quickly releases the directional skill in an accurate direction.
Step 502: Receive a skill release operation and a movement control operation.

Optionally, the skill release operation is used to control the master virtual character to release a directional skill in a first direction in the virtual environment; the movement control operation is used to control the master virtual character to move in a second direction in the virtual environment; and the first direction and the second direction are independent of each other.

Optionally, the skill release operation corresponds to quick release of the directional skill; that is, the skill release operation controls the master virtual character to release the directional skill in the quick-release mode. The first direction is the direction automatically selected by the client during release of the directional skill. For example, when there is an attack target within a preset range around the master virtual character, the direction corresponding to the position of the attack target is used as the first direction; when there is no attack target within the preset range around the master virtual character, the facing direction of the master virtual character is used as the first direction.

The embodiments of this application are described using the facing direction of the master virtual character as the first direction. Optionally, the facing direction may be obtained by directly obtaining the current orientation of the master virtual character in the virtual environment from the logic layer as the facing direction of the master virtual character. However, the facing direction obtained from the logic layer may be inaccurate due to the delay of the direction adjustment, which leads to an inaccurate skill release direction. As shown in Fig. 6, the virtual environment interface 600 includes a master virtual character 610, a skill release control 620, and a movement control 630. A drag operation toward the lower right is received on the movement control 630, controlling the master virtual character 610 to face the direction corresponding to the lower right and move in that direction, while a quick release operation is received on the skill release control 620. The master virtual character 610 currently faces the direction corresponding to the lower left; therefore, the skill release request sent by the terminal to the server includes a skill release direction corresponding to the lower left, and the skill is released toward the lower left, which differs from the direction being controlled on the movement control 630.

In the embodiments of this application, the second direction corresponding to the movement control operation replaces the first direction for releasing the directional skill; that is, the second direction corresponding to the movement control operation is obtained from the presentation layer as the facing direction of the master virtual character. Optionally, the virtual environment interface further includes a movement control, and the movement control operation is a drag operation received on the movement control; after the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the second direction in which the master virtual character moves is determined according to the drag direction.

Optionally, the movement control operation is an operation triggered based on the movement control, and the presentation layer is used to implement interface presentation and receive interface operations. Schematically, the presentation layer displays the picture corresponding to the virtual environment in the virtual environment interface and the controls used to control the master virtual character or the game process. Optionally, the presentation layer also receives touch operations on the virtual environment interface and reports the touch operations via the server to the logic layer for logical processing.

Optionally, both the presentation layer and the logic layer exist in the game client. The logic layer cannot directly access data in the presentation layer; the presentation layer can access data in the logic layer but cannot modify the logic of the logic layer, and logical processing in the logic layer must be performed via the server according to the received touch operations.

Optionally, during the movement control operation, the user performs a touch operation on the movement control in the virtual environment interface, so the presentation layer reads the touch data and generates a movement touch message; for example, the movement touch message includes the adjusted facing direction of the master virtual character. The client sends the movement touch message to the server; after the server sends a movement feedback message to the logic layer of the client, the logic layer adjusts the facing direction of the master virtual character according to the movement feedback message, and the presentation layer reads the adjusted facing direction from the logic layer and presents it, thereby implementing control of the master virtual character.

Optionally, the virtual environment interface further includes a skill release control, which is used to control the virtual object to release the directional skill. Optionally, the skill release control corresponds to a first area and a second area, where the first area is used to trigger quick release of the directional skill and the second area is used to trigger aimed release of the directional skill. Optionally, for the first area, in response to receiving a first trigger operation in the first area of the skill release control, it is determined that the skill release operation is received; and when the first trigger operation in the first area of the skill release control is received, the directional skill is quick-released. Optionally, when a second trigger operation in the second area of the skill release control is received, the release direction corresponding to the second trigger operation is determined, and the master virtual character is controlled to release the directional skill in the corresponding release direction in the virtual environment, where the second area is the area corresponding to the skill release control other than the first area.
Step 503: Control, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.

Optionally, the second direction is the direction corresponding to the movement control operation, that is, the adjusted facing direction when the user controls the master virtual character to adjust its facing direction, so that the release direction of the directional skill is consistent with the movement control direction.

Since a skill release control is usually configured such that, after one skill release based on the control, a specified period must elapse before the next skill release can be performed based on the control (that is, after a skill is released, it must cool down before it can be released again), the virtual character control method provided in the embodiments of this application can reduce the waste of skill release time caused by a wrong release direction of the directional skill, thereby improving human-computer interaction efficiency.

In summary, in the virtual character control method provided in the embodiments of this application, when the directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character, improves the accuracy of directional skill release, and avoids the problem of low human-computer interaction efficiency caused by a wrong release direction, which would require waiting for the directional skill to cool down (that is, to recover for a period of time after release and re-enter the releasable state) and re-releasing it based on another user operation. Human-computer interaction efficiency is thereby improved, erroneous operations that the computer device needs to process are reduced, and the overall performance of the computer device is improved.
In an optional embodiment, when the master virtual character is controlled to release the directional skill, the skill release process at the logic layer needs to be implemented through the server. Fig. 7 is a flowchart of a virtual character control method provided by another exemplary embodiment of this application. The method may be performed by a computer device, which may be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in Fig. 4, or another terminal in the computer system 400. As shown in Fig. 7, the method includes:

Step 701: Display a virtual environment interface, the virtual environment interface including a picture for observing the virtual environment, the picture including a master virtual character located in the virtual environment.

Optionally, a release control for controlling the master virtual character to release a directional skill and a movement control for controlling the master virtual character to move in the virtual environment are superimposed on the picture.

Optionally, different from the quick release process in which the first direction is used as the skill release direction, in this embodiment of this application, during quick release of the directional skill, the facing direction of the master virtual character in the virtual environment is obtained, and the directional skill is released in the facing direction of the master virtual character.

Step 702: Receive a skill release operation and a movement control operation.

Optionally, the skill release operation is used to control the master virtual character to release a directional skill in a first direction in the virtual environment; the movement control operation is used to control the master virtual character to move in a second direction in the virtual environment; and the first direction and the second direction are independent of each other.

Optionally, the skill release operation corresponds to quick release of the directional skill; that is, the skill release operation controls the master virtual character to release the directional skill in the quick-release mode.

Optionally, the movement control operation is an operation triggered through the movement control. According to the movement control operation, the second direction corresponding to the movement control operation is obtained from the presentation layer, and the presentation layer is used to implement interface presentation and receive interface operations.

Optionally, the movement control operation is implemented by a drag operation on the movement control; that is, the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the second direction in which the master virtual character moves is determined according to the drag direction.

Step 703: Send a skill release data packet to the server, the skill release data packet including the second direction.

Optionally, since the presentation layer can access the data of the logic layer but cannot modify the logic of the logic layer, that is, cannot control the logic layer to perform logical processing, after obtaining the second direction, the presentation layer sends a skill release data packet to the server, where the skill release data packet includes the second direction as the release direction of the directional skill.

Step 704: Receive a skill release feedback packet sent by the server.

Optionally, the skill release feedback packet sent by the server is received through the logic layer, and the logic layer performs logical processing according to the skill release feedback packet.

Step 705: Control, in response to the skill release feedback packet, the master virtual character to release the directional skill in the second direction in the virtual environment.

Optionally, after receiving the skill release feedback packet, the logic layer controls the master virtual character to release the directional skill in the second direction in the virtual environment according to the control data in the skill release feedback packet.

Optionally, the second direction is the direction corresponding to the movement control operation received on the movement control, that is, the adjusted facing direction when the user controls the master virtual character to adjust its facing direction, so that the release direction of the directional skill is consistent with the movement control direction.

Schematically, the skill release logic is described with reference to the user, the presentation layer, the server, and the logic layer. As shown in Fig. 8, after the user 810 performs the skill release operation based on the skill release control, the presentation layer 820 is triggered to send a skill release data packet to the server 830, where the skill release data packet includes the movement control direction. After receiving the skill release data packet, the server 830 sends a skill release feedback packet to the logic layer 840; the logic layer 840 performs logical processing based on the skill release feedback packet and sends the skill release state to the presentation layer 820 to instruct the presentation layer 820 to present the skill release process. Correspondingly, the presentation layer 820 obtains the skill release state from the logic layer 840 and presents the skill release process.
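The skill release data packet described above carries the second direction from the presentation layer to the server. A minimal serialization sketch is given below; the field names, the JSON encoding, and the vector representation are all illustrative assumptions, as this application does not specify a wire format:

```python
import json

def build_skill_release_packet(skill_id, second_direction):
    """Serialize a hypothetical skill release data packet sent by the
    presentation layer to the server, carrying the second direction."""
    return json.dumps({
        "type": "skill_release",
        "skill_id": skill_id,
        "direction": {"x": second_direction[0], "y": second_direction[1]},
    }, sort_keys=True)

# The joystick drag points left, so the packet carries the second direction:
packet = build_skill_release_packet(3, (-1.0, 0.0))
print(packet)
```

On receipt, the server would echo the direction back in the skill release feedback packet, and the logic layer would use it to drive the release animation and state change.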
Schematically, referring to Fig. 9, the virtual environment interface 900 includes a master virtual character 910, a movement joystick 920, and a trigger control 930 for the directional skill. The master virtual character 910 faces a first direction in the virtual environment. A movement control operation on the movement joystick 920 is received, controlling the master virtual character 910 to face a second direction in the virtual environment; at the same time, the client receives a trigger operation on the trigger control 930, so the client reads the movement control operation on the movement joystick 920 from the presentation layer and releases the directional skill in the second direction.

In summary, in the virtual character control method provided in this embodiment of this application, when the directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character, improves the accuracy of directional skill release, and avoids the problem of low human-computer interaction efficiency caused by a wrong release direction, which would require waiting for the directional skill to cool down (that is, to recover for a period of time after release and re-enter the releasable state) and re-releasing it based on another user operation. Human-computer interaction efficiency is thereby improved, erroneous operations that the computer device needs to process are reduced, and the overall performance of the computer device is improved.

In the method provided in this embodiment, the presentation layer of the terminal sends the skill release data packet to the server, and the server feeds back the skill release feedback packet to the logic layer of the terminal, so that skill release is implemented from the logic layer while the presentation layer controls the master virtual character to release the directional skill in the second direction, improving the accuracy of directional skill release.
In an optional embodiment, when no movement control operation is received on the movement control, the facing direction of the master virtual character is obtained directly from the logic layer for skill release. Fig. 10 is a flowchart of a virtual character control method provided by another exemplary embodiment of this application. The method may be performed by a computer device, which may be implemented as the first terminal 420 or the second terminal 460 in the computer system 400 shown in Fig. 4, or another terminal in the computer system 400. As shown in Fig. 10, the method includes:

Step 1001: Display a virtual environment interface, the virtual environment interface including a picture for observing the virtual environment, the picture including a master virtual character located in the virtual environment.

Optionally, a release control for controlling the master virtual character to release a directional skill and a movement control for controlling the master virtual character to move in the virtual environment are superimposed on the picture.

Optionally, different from the quick release process in which a first direction is used as the skill release direction, in this embodiment of this application, during quick release of the directional skill, the facing direction of the master virtual character in the virtual environment is obtained, and the directional skill is released in the facing direction.

Step 1002: Receive a skill release operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction.

Optionally, the skill release operation corresponds to quick release of the directional skill; that is, the skill release operation controls the master virtual character to release the directional skill in the quick-release mode, and after obtaining the facing direction of the master virtual character, the client releases the directional skill in the facing direction.

Step 1003: Receive a movement control operation, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment.

Optionally, the movement control operation is implemented by a drag operation on the movement control; that is, the drag operation on the movement control is received, the drag direction of the drag operation is obtained from the presentation layer, and the second direction corresponding to the drag direction is determined.

Optionally, after the movement control operation is received, a movement control data packet is sent to the server, the movement control data packet including the second direction; a movement control feedback packet sent by the server is received; and in response to the movement control feedback packet, the master virtual character is controlled to move facing the second direction in the virtual environment.

Optionally, in response to the movement control feedback packet, the second direction is cached in the logic layer as the facing direction of the master virtual character.

It should be noted that step 1002 may be performed before step 1003, step 1003 may be performed before step 1002, or steps 1002 and 1003 may be performed simultaneously; the execution order of steps 1002 and 1003 is not limited in this embodiment of this application.

Step 1004: Control the master virtual character to release the directional skill in the second direction in the virtual environment.

Optionally, the presentation layer sends a skill release data packet including the second direction to the server, and the logic layer receives the skill release feedback packet from the server and, according to the skill release feedback packet, controls the master virtual character to release the skill in the second direction in the virtual environment.

Step 1005: In response to the skill release operation, and in a case that the movement control operation is not received, obtain the facing direction of the master virtual character from the logic layer as the first direction.

Optionally, when the skill release operation is received and no movement control operation is triggered on the movement control, that is, the facing direction of the master virtual character is not adjusted when the directional skill is released, the skill is released with the current facing direction of the master virtual character as the first direction.

Step 1006: Control the master virtual character to release the directional skill in the first direction in the virtual environment.

Optionally, a skill release data packet including the first direction is sent to the server, and the logic layer receives the skill release feedback packet from the server and, according to the skill release feedback packet, controls the master virtual character to release the skill in the first direction in the virtual environment.

Schematically, referring to Fig. 11, when the user triggers quick release of the directional skill, the following process may be implemented:

Step 1101: Determine whether the presentation layer has received an input operation based on the movement joystick; if yes, perform step 1102; otherwise, perform step 1103.

Step 1102: Use the operation direction of the movement joystick input received by the presentation layer as the skill release direction.

Step 1103: Use the facing direction of the master virtual character cached in the logic layer as the skill release direction.

In summary, in the method provided in this embodiment, when a skill release operation is received, it is determined whether a movement control operation is received on the movement control. When the movement control operation is received, the release of the directional skill is controlled with the movement control direction from the presentation layer; when no movement control operation is received, the release is controlled with the facing direction from the logic layer. This determines an accurate release direction for the directional skill during the release process and improves the accuracy of directional skill release.
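The quick-release branch in Fig. 11 reduces to a single fallback, which can be sketched as follows (the function and parameter names are illustrative, not taken from this application):

```python
def skill_release_direction(presentation_joystick, logic_facing):
    """Fig. 11 sketch: if the presentation layer has received a movement
    joystick input (step 1101), use its direction (step 1102); otherwise
    fall back to the facing direction cached in the logic layer (step 1103)."""
    if presentation_joystick is not None:
        return presentation_joystick
    return logic_facing

print(skill_release_direction((-1.0, 0.0), (1.0, 0.0)))  # joystick wins
print(skill_release_direction(None, (1.0, 0.0)))         # fall back to facing
```

Reading the joystick direction from the presentation layer avoids the round-trip delay before the logic-layer facing direction is updated, which is the root cause of the stale direction in the related art.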
Schematically, referring to Fig. 12, which shows an overall flowchart of a skill release process provided by an exemplary embodiment of this application, as shown in Fig. 12, the process includes:

Step 1201: Receive a skill release operation.

Optionally, the skill release operation is used to control the master virtual character to release the directional skill in the virtual environment.

Step 1202: Determine whether there is a skill joystick orientation; if yes, perform step 1203; otherwise, perform step 1204.

Optionally, the skill joystick orientation is used to distinguish the release mode of the directional skill, where the release modes include quick release and aimed release. When there is a skill joystick orientation, the current release mode of the directional skill is aimed release; when there is no skill joystick orientation, the current release mode of the directional skill is quick release.

Step 1203: Use the skill joystick orientation as the skill release orientation.

Optionally, when there is a skill joystick orientation, indicating that the current release mode of the directional skill is aimed release, the skill joystick orientation is used as the skill release direction.

Step 1204: Determine whether there is a movement joystick orientation; if yes, perform step 1205; otherwise, perform step 1206.

Optionally, when there is no skill joystick orientation, indicating that the current release mode of the directional skill is quick release, it is further determined whether, during quick release, the facing direction of the master virtual character needs to be adjusted through the movement joystick orientation.

Step 1205: Use the movement joystick orientation as the skill release orientation.

When there is a movement joystick orientation, that is, the facing direction of the master virtual character needs to be adjusted, the movement joystick orientation is used as the skill release direction.

Step 1206: Use the character orientation as the skill release orientation.

When there is no movement joystick orientation, that is, the facing direction of the master virtual character has not been adjusted, the current facing direction of the master virtual character is used as the skill release direction.

Step 1207: Release the directional skill.
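Steps 1201 to 1207 form a three-way priority chain, which can be sketched as follows (the names are illustrative assumptions; directions are simple 2D tuples):

```python
def release_orientation(skill_stick, move_stick, character_facing):
    """Fig. 12 sketch: a skill joystick orientation (aimed release) takes
    priority; under quick release, a movement joystick orientation takes
    priority over the character's current facing direction."""
    if skill_stick is not None:   # steps 1202-1203: aimed release
        return skill_stick
    if move_stick is not None:    # steps 1204-1205: quick release while moving
        return move_stick
    return character_facing       # step 1206: quick release while stationary

# Quick release while dragging the movement joystick left:
print(release_orientation(None, (-1.0, 0.0), (1.0, 0.0)))
```

The middle branch is the contribution of this application: under quick release, the live movement joystick orientation overrides the possibly stale character orientation.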
When the release mode of the directional skill is quick release, refer to Fig. 13, which shows a schematic diagram of a virtual environment interface for quick release of a directional skill provided by an exemplary embodiment of this application. As shown in Fig. 13, the virtual environment interface includes a master virtual character 1310, a movement joystick 1320, and a trigger control 1330 for the directional skill. In the initial state, the master virtual character 1310 faces right (the first direction) in the virtual environment. The terminal receives the user's movement control operation based on the movement joystick 1320 and controls the master virtual character 1310 to move left, while the orientation of the master virtual character in the virtual environment changes to facing left (the second direction). If, while receiving the movement control operation based on the movement joystick 1320, the terminal also receives the user's trigger operation on the trigger control 1330 of the directional skill, suppose the user intends to turn the facing direction of the master virtual character 1310 around and release the directional skill in the changed direction (the second direction) of the master virtual character 1310. During the user's operation, however, turning the master virtual character around takes a certain amount of time, and if the user uses quick casting to release the directional skill, either of the following two erroneous operation results may occur. One is that the user performs the quick release operation after observing that the master virtual character has finished turning; however, there is inevitably a delay between the user observing the completion of the turn and performing the quick release operation, resulting in low interaction efficiency. The other is that the user performs the quick cast operation before the master virtual character has finished turning, so the release direction of the directional skill remains the direction before the turn (the first direction), which does not match the user's operation intention and results in a misoperation. With this solution, after the user performs the quick release operation on the directional skill while controlling the master virtual character to move in a target direction, the second direction is determined as the release direction of the directional skill; that is, the master virtual character will release the directional skill in the movement direction. In terms of the actual control effect, the master virtual character ignores its current orientation and directly releases the directional skill in the direction corresponding to the user's movement control operation, so that when the directional skill is released through quick release, interaction efficiency is ensured while the accuracy of skill release is also improved.

In summary, in the virtual character control method provided in the embodiments of this application, when the directional skill is released, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character, improves the accuracy of directional skill release, and avoids the problem of low human-computer interaction efficiency caused by a wrong release direction, which would require waiting for the directional skill to cool down (that is, to recover for a period of time after release and re-enter the releasable state) and re-releasing it based on another user operation. Human-computer interaction efficiency is thereby improved, erroneous operations that the computer device needs to process are reduced, and the overall performance of the computer device is improved.
Fig. 14 is a structural block diagram of a virtual character control apparatus provided by an exemplary embodiment of this application. As shown in Fig. 14, the apparatus includes:

a display module 1410, configured to display a virtual environment interface, the virtual environment interface including a picture for observing the virtual environment, the picture including a master virtual character located in the virtual environment;

a receiving module 1420, configured to receive a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment; and

a release module 1430, configured to control, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.

In an optional embodiment, the virtual environment interface further includes a movement control, and the movement control operation is a drag operation received on the movement control;

the receiving module 1420 is further configured to receive the drag operation on the movement control.

In an optional embodiment, as shown in Fig. 15, the apparatus further includes:

an obtaining module 1440, configured to obtain the drag direction of the drag operation from the presentation layer, and determine, according to the drag direction, the second direction in which the master virtual character moves.

In an optional embodiment, the apparatus further includes:

a sending module 1450, configured to send a skill release data packet to the server, the skill release data packet including the second direction;

the receiving module 1420 is further configured to receive a skill release feedback packet sent by the server; and

the release module 1430 is further configured to control, in response to the skill release feedback packet, the master virtual character to release the directional skill in the second direction in the virtual environment.

In an optional embodiment, the sending module 1450 is configured to send, in response to the movement control operation, a movement control data packet to the server, the movement control data packet including the second direction;

the receiving module 1420 is further configured to receive a movement control feedback packet sent by the server; and

the apparatus further includes:

a movement module, configured to control, in response to the movement control feedback packet, the master virtual character to move facing the second direction in the virtual environment.

In an optional embodiment, the apparatus further includes:

a caching module, configured to cache, in response to the movement control feedback packet, the second direction in the logic layer as the facing direction of the master virtual character.

In an optional embodiment, the obtaining module 1440 is further configured to obtain, in response to the skill release operation and in a case that the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction; and

the release module 1430 is further configured to control the master virtual character to release the directional skill in the first direction in the virtual environment.

In an optional embodiment, the virtual environment interface further includes a skill release control; and

the receiving module 1420 is further configured to receive a first trigger operation in a first area of the skill release control as the skill release operation.

In an optional embodiment, the receiving module 1420 is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area, and determine the release direction corresponding to the second trigger operation; and

the release module 1430 is further configured to control the master virtual character to release the directional skill in the release direction in the virtual environment.

In summary, when the virtual character control apparatus provided in the embodiments of this application releases the directional skill, if a movement control operation is received, the second direction corresponding to the movement control operation is determined, and the master virtual character is controlled to release the directional skill in the second direction rather than in the automatically selected first direction. This ensures that the directional skill is released in the adjusted facing direction of the master virtual character, improves the accuracy of directional skill release, and avoids the problem of low human-computer interaction efficiency caused by a wrong release direction, which would require waiting for the directional skill to cool down (that is, to recover for a period of time after release and re-enter the releasable state) and re-releasing it based on another user operation. Human-computer interaction efficiency is thereby improved, erroneous operations that the computer device needs to process are reduced, and the overall performance of the computer device is improved.
This application further provides a terminal, including a processor and a memory, the memory storing at least one instruction, the at least one instruction being loaded and executed by the processor to implement the steps performed by the first terminal or the steps performed by the second terminal in the virtual character control methods provided by the foregoing method embodiments. It should be noted that the terminal may be the terminal provided in Fig. 16 below.

Fig. 16 shows a structural block diagram of a terminal 1600 provided by an exemplary embodiment of this application. The terminal 1600 may be a smartphone, a tablet computer, an MP3 player, an MP4 player, a notebook computer, or a desktop computer. The terminal 1600 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.

Generally, the terminal 1600 includes a processor 1601 and a memory 1602.

The processor 1601 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1601 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 1601 may also include a main processor and a coprocessor. The main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that needs to be displayed on the display screen. In some embodiments, the processor 1601 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.

The memory 1602 may include one or more computer-readable storage media, which may be non-transitory. The memory 1602 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1602 is used to store at least one instruction, the at least one instruction being executed by the processor 1601 to implement the virtual character control method provided in the method embodiments of this application.

In some embodiments, the terminal 1600 may optionally further include a peripheral device interface 1603 and at least one peripheral device. The processor 1601, the memory 1602, and the peripheral device interface 1603 may be connected by a bus or a signal line. Each peripheral device may be connected to the peripheral device interface 1603 by a bus, a signal line, or a circuit board. Specifically, the peripheral devices include at least one of a radio frequency circuit 1604, a display screen 1605, a camera component 1606, an audio circuit 1607, a positioning component 1608, and a power supply 1609.

In some embodiments, the terminal 1600 further includes one or more sensors 1610, including but not limited to an acceleration sensor 1611, a gyroscope sensor 1612, a pressure sensor 1613, a fingerprint sensor 1614, an optical sensor 1615, and a proximity sensor 1616.

A person skilled in the art may understand that the structure shown in Fig. 16 does not constitute a limitation on the terminal 1600, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.

The memory further includes one or more programs, the one or more programs being stored in the memory, and the one or more programs including instructions for performing all or part of the steps of the virtual character control methods provided in the embodiments of this application.

This application provides a computer-readable storage medium storing at least one instruction, the at least one instruction being loaded and executed by the processor to implement all or part of the steps of the virtual character control methods provided by the foregoing method embodiments.

This application further provides a computer program product or computer program, including computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs all or part of the steps of the virtual character control methods provided by the foregoing method embodiments.

The sequence numbers of the foregoing embodiments of this application are merely for description and do not represent the superiority or inferiority of the embodiments.

A person of ordinary skill in the art may understand that all or part of the steps of the foregoing embodiments may be completed by hardware, or by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium mentioned may be a read-only memory, a magnetic disk, an optical disc, or the like.

The foregoing descriptions are merely optional embodiments of this application and are not intended to limit this application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall fall within the protection scope of this application.

Claims (18)

  1. A virtual character control method, wherein the method is performed by a computer device, and the method comprises:
    displaying a virtual environment interface, the virtual environment interface comprising a picture for observing the virtual environment, the picture comprising a master virtual character located in the virtual environment;
    receiving a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment, the first direction and the second direction being independent of each other; and
    controlling, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.
  2. The method according to claim 1, wherein the virtual environment interface further comprises a movement control, and the movement control operation is a drag operation received on the movement control;
    the second direction is obtained by:
    receiving the drag operation on the movement control;
    obtaining a drag direction of the drag operation from a presentation layer; and
    determining, according to the drag direction, the second direction in which the master virtual character moves.
  3. The method according to claim 1, wherein the controlling the master virtual character to release the directional skill in the second direction in the virtual environment comprises:
    sending a skill release data packet to a server, the skill release data packet comprising the second direction;
    receiving a skill release feedback packet sent by the server; and
    controlling, in response to the skill release feedback packet, the master virtual character to release the directional skill in the second direction in the virtual environment.
  4. The method according to any one of claims 1 to 3, wherein the method further comprises:
    sending, in response to the movement control operation, a movement control data packet to a server, the movement control data packet comprising the second direction;
    receiving a movement control feedback packet sent by the server; and
    controlling, in response to the movement control feedback packet, the master virtual character to move facing the second direction in the virtual environment.
  5. The method according to claim 4, wherein after the receiving the movement control feedback packet sent by the server, the method further comprises:
    caching, in response to the movement control feedback packet, the second direction in a logic layer as a facing direction of the master virtual character.
  6. The method according to claim 5, wherein the method further comprises:
    obtaining, in response to the skill release operation and in a case that the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction; and
    controlling the master virtual character to release the directional skill in the first direction in the virtual environment.
  7. The method according to any one of claims 1 to 3, wherein the virtual environment interface further comprises a skill release control; and
    the receiving a skill release operation comprises:
    receiving a first trigger operation in a first area of the skill release control as the skill release operation.
  8. The method according to claim 7, wherein the method further comprises:
    receiving a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area;
    determining a release direction corresponding to the second trigger operation; and
    controlling the master virtual character to release the directional skill in the release direction in the virtual environment.
  9. A virtual character control apparatus, wherein the apparatus is applied to a computer device, and the apparatus comprises:
    a display module, configured to display a virtual environment interface, the virtual environment interface comprising a picture for observing the virtual environment, the picture comprising a master virtual character located in the virtual environment;
    a receiving module, configured to receive a skill release operation and a movement control operation, the skill release operation being used to control the master virtual character to release a directional skill in a first direction in the virtual environment, the movement control operation being used to control the master virtual character to move in a second direction in the virtual environment; and
    a release module, configured to control, in response to the skill release operation and the movement control operation, the master virtual character to release the directional skill in the second direction in the virtual environment.
  10. The apparatus according to claim 9, wherein the virtual environment interface further comprises a movement control, and the movement control operation is a drag operation received on the movement control;
    the receiving module is further configured to receive the drag operation on the movement control; and
    the apparatus further comprises:
    an obtaining module, configured to obtain a drag direction of the drag operation from a presentation layer, and determine, according to the drag direction, the second direction in which the master virtual character moves.
  11. The apparatus according to claim 10, wherein the apparatus further comprises:
    a sending module, configured to send a skill release data packet to a server, the skill release data packet comprising the second direction;
    the receiving module is further configured to receive a skill release feedback packet sent by the server; and
    the release module is further configured to control, in response to the skill release feedback packet, the master virtual character to release the directional skill in the second direction in the virtual environment.
  12. The apparatus according to any one of claims 9 to 11, wherein
    the sending module is configured to send, in response to the movement control operation, a movement control data packet to a server, the movement control data packet comprising the second direction;
    the receiving module is further configured to receive a movement control feedback packet sent by the server; and
    the apparatus further comprises:
    a movement module, configured to control, in response to the movement control feedback packet, the master virtual character to move facing the second direction in the virtual environment.
  13. The apparatus according to claim 12, wherein the apparatus further comprises:
    a caching module, configured to cache, in response to the movement control feedback packet, the second direction in a logic layer as a facing direction of the master virtual character.
  14. The apparatus according to claim 13, wherein
    the obtaining module is further configured to obtain, in response to the skill release operation and in a case that the movement control operation is not received, the facing direction of the master virtual character from the logic layer as the first direction; and
    the release module is further configured to control the master virtual character to release the directional skill in the first direction in the virtual environment.
  15. The apparatus according to any one of claims 9 to 11, wherein the virtual environment interface further comprises a skill release control; and
    the receiving module is configured to receive a first trigger operation in a first area of the skill release control as the skill release operation.
  16. The apparatus according to claim 15, wherein
    the receiving module is further configured to receive a second trigger operation in a second area of the skill release control, the second area being an area corresponding to the skill release control other than the first area, and determine a release direction corresponding to the second trigger operation; and
    the release module is further configured to control the master virtual character to release the directional skill in the release direction in the virtual environment.
  17. 一种计算机设备,其特征在于,所述计算机设备包括处理器和存储器,所述存储器中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如权利要求1至8任一所述的虚拟角色的控制方法。
  18. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有至少一条计算机程序,所述计算机程序由处理器加载并执行以实现如权利要求1至8任一所述的虚拟角色的控制方法。
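The direction-selection behavior recited in claims 4 to 8 can be sketched in code. This is a minimal illustration under stated assumptions, not the patented implementation: the class and method names (`SkillController`, `on_move_feedback`, and so on) are hypothetical, server round-trips are reduced to local callbacks, and directions are unit vectors in the ground plane.

```python
import math


class SkillController:
    """Minimal sketch of the claimed client-side direction selection.

    The "logic layer" cache of the facing direction (claim 5) and the
    movement direction taken from the "presentation layer" drag operation
    (claim 10) are modeled here as plain attributes.
    """

    def __init__(self):
        self.facing_direction = (0.0, 1.0)  # cached facing direction (logic layer)
        self.move_direction = None          # active drag direction, if any

    def on_move_feedback(self, direction):
        # Claims 4-5: once the server's movement control feedback packet
        # arrives, the character moves in the second direction, and that
        # direction is cached as its facing direction.
        self.move_direction = direction
        self.facing_direction = direction

    def on_move_stop(self):
        # The drag ends; the cached facing direction remains.
        self.move_direction = None

    def release_skill(self):
        # Claims 1 and 6: while a movement control operation is active, the
        # directional skill is released in the movement (second) direction;
        # otherwise the cached facing (first) direction is used.
        if self.move_direction is not None:
            return self.move_direction
        return self.facing_direction

    def on_skill_control_touch(self, offset, inner_radius):
        # Claims 7-8: a touch inside the first (inner) region of the skill
        # release control quick-casts via release_skill(); a touch in the
        # outer second region aims the skill along the touch offset.
        dist = math.hypot(offset[0], offset[1])
        if dist <= inner_radius:
            return self.release_skill()
        return (offset[0] / dist, offset[1] / dist)
```

As a usage example, dragging the movement control to the right and then tapping the inner region of the skill release control releases the skill to the right, even after the drag ends, because the direction was cached as the facing direction.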
PCT/CN2021/080690 2020-04-23 2021-03-15 Virtual character control method and apparatus, device, and storage medium WO2021213070A1 (zh)

Priority Applications (8)

Application Number Priority Date Filing Date Title
KR1020217036000A KR20210150465A (ko) 2020-04-23 2021-03-15 Virtual character control method and apparatus, device, and storage medium
EP21782629.6A EP3943172A4 (en) 2020-04-23 2021-03-15 METHOD AND APPARATUS FOR CONTROLLING VIRTUAL CHARACTERS, DEVICE AND STORAGE MEDIA
CA3137791A CA3137791A1 (en) 2020-04-23 2021-03-15 Virtual character control method and apparatus, device, and storage medium
AU2021254521A AU2021254521B2 (en) 2020-04-23 2021-03-15 Virtual character control method and apparatus, device, and storage medium
JP2021564351A JP7451563B2 (ja) 2020-04-23 2021-03-15 Virtual character control method, and computer device, computer program, and virtual character control apparatus therefor
SG11202112169UA SG11202112169UA (en) 2020-04-23 2021-03-15 Virtual character control method and apparatus, device, and storage medium
US17/570,391 US20220126205A1 (en) 2020-04-23 2022-01-07 Virtual character control method and apparatus, device, and storage medium
JP2024034076A JP2024063201A (ja) 2020-04-23 2024-03-06 Virtual character control method, apparatus, device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010328532.3A CN111589127B (zh) 2020-04-23 2020-04-23 Virtual character control method and apparatus, device, and storage medium
CN202010328532.3 2020-04-23

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/570,391 Continuation US20220126205A1 (en) 2020-04-23 2022-01-07 Virtual character control method and apparatus, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021213070A1 true WO2021213070A1 (zh) 2021-10-28

Family

ID=72180363

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/080690 WO2021213070A1 (zh) 2020-04-23 2021-03-15 虚拟角色的控制方法、装置、设备及存储介质

Country Status (9)

Country Link
US (1) US20220126205A1 (zh)
EP (1) EP3943172A4 (zh)
JP (2) JP7451563B2 (zh)
KR (1) KR20210150465A (zh)
CN (1) CN111589127B (zh)
AU (1) AU2021254521B2 (zh)
CA (1) CA3137791A1 (zh)
SG (1) SG11202112169UA (zh)
WO (1) WO2021213070A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111589127B (zh) * 2020-04-23 2022-07-12 腾讯科技(深圳)有限公司 Virtual character control method and apparatus, device, and storage medium
CN112044071B (zh) 2020-09-04 2021-10-15 腾讯科技(深圳)有限公司 Virtual item control method and apparatus, terminal, and storage medium
JP7270008B2 (ja) * 2020-09-08 2023-05-09 カムツス コーポレーション Game providing method, computer program, computer-readable recording medium, and computer device
CN112274927A (zh) * 2020-11-18 2021-01-29 网易(杭州)网络有限公司 Game interaction method and apparatus, and electronic device
CN112843679B (zh) * 2021-03-04 2022-11-08 腾讯科技(深圳)有限公司 Skill release method, apparatus, device, and medium for virtual object
CN113476822B (zh) * 2021-06-11 2022-06-10 荣耀终端有限公司 Touch control method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170149241A1 (en) * 2009-07-15 2017-05-25 Yehuda Binder Sequentially operated modules
US20180104584A1 (en) * 2016-10-19 2018-04-19 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having stored therein game program, game processing method, game system, and game apparatus
CN109865286A (zh) * 2019-02-20 2019-06-11 网易(杭州)网络有限公司 Information processing method and apparatus in game, and storage medium
CN110413171A (zh) * 2019-08-08 2019-11-05 腾讯科技(深圳)有限公司 Method, apparatus, device, and medium for controlling virtual object to perform quick operation
CN110694261A (zh) * 2019-10-21 2020-01-17 腾讯科技(深圳)有限公司 Method for controlling virtual object to attack, terminal, and storage medium
CN111589127A (zh) * 2020-04-23 2020-08-28 腾讯科技(深圳)有限公司 Virtual character control method and apparatus, device, and storage medium

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050071306A1 (en) * 2003-02-05 2005-03-31 Paul Kruszewski Method and system for on-screen animation of digital objects or characters
US7235012B2 (en) * 2004-08-23 2007-06-26 Brain Box Concepts, Inc. Video game controller with side or quick look feature
US7963833B2 (en) * 2004-10-15 2011-06-21 Microsoft Corporation Games with targeting features
US8043149B2 (en) * 2005-03-03 2011-10-25 Sony Computer Entertainment America Llc In-game shot aiming indicator
US8651964B2 (en) * 2005-04-29 2014-02-18 The United States Of America As Represented By The Secretary Of The Army Advanced video controller system
US20070117628A1 (en) * 2005-11-19 2007-05-24 Stanley Mark J Method and apparatus for providing realistic gun motion input to a video game
US9327191B2 (en) * 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
US8834245B2 (en) * 2007-08-17 2014-09-16 Nintendo Co., Ltd. System and method for lock on target tracking with free targeting capability
US8142286B2 (en) * 2007-08-17 2012-03-27 Microsoft Corporation Programmable movement of an orientation of a game character view of a game environment
US8777708B2 (en) * 2008-06-27 2014-07-15 Microsoft Corporation Targeting control in a simulated environment
US8342926B2 (en) * 2008-07-13 2013-01-01 Sony Computer Entertainment America Llc Game aim assist
US9868062B2 (en) * 2012-03-13 2018-01-16 Sony Interactive Entertainment America Llc System, method, and graphical user interface for controlling an application on a tablet
JP5563633B2 (ja) * 2012-08-31 2014-07-30 株式会社スクウェア・エニックス Video game processing apparatus and video game processing program
US9770664B2 (en) * 2013-04-05 2017-09-26 Gree, Inc. Method and apparatus for providing online shooting game
US10549180B2 (en) * 2013-09-30 2020-02-04 Zynga Inc. Swipe-direction gesture control for video games using glass input devices
JP5711409B1 (ja) * 2014-06-26 2015-04-30 ガンホー・オンライン・エンターテイメント株式会社 Terminal device
JP6598522B2 (ja) * 2015-06-12 2019-10-30 任天堂株式会社 Information processing apparatus, information processing system, information processing method, and information processing program
CN105194871B (zh) * 2015-09-14 2017-03-22 网易(杭州)网络有限公司 Method for controlling a game character
US20180015375A1 (en) * 2016-07-12 2018-01-18 Paul Marino Computer-implemented multiplayer combat video game method and apparatus
JP6143934B1 (ja) * 2016-11-10 2017-06-07 株式会社Cygames Information processing program, information processing method, and information processing apparatus
CN107398071B (zh) * 2017-07-19 2021-01-26 网易(杭州)网络有限公司 Game target selection method and apparatus
CN107661630A (zh) * 2017-08-28 2018-02-06 网易(杭州)网络有限公司 Shooting game control method and apparatus, storage medium, processor, and terminal
CN107678647B (zh) * 2017-09-26 2023-04-28 网易(杭州)网络有限公司 Virtual shooting subject control method and apparatus, electronic device, and storage medium
CN107773987B (zh) * 2017-10-24 2020-05-22 网易(杭州)网络有限公司 Virtual shooting subject control method and apparatus, electronic device, and storage medium
CN107913515B (zh) * 2017-10-25 2019-01-08 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device
CN108196765A (zh) * 2017-12-13 2018-06-22 网易(杭州)网络有限公司 Display control method, electronic device, and storage medium
JP6561163B1 (ja) 2018-03-09 2019-08-14 株式会社 ディー・エヌ・エー Game device and game program
JP7381225B2 (ja) * 2018-09-06 2023-11-15 株式会社Cygames Program, electronic device, method, and system
CN109550240A (zh) * 2018-09-20 2019-04-02 厦门吉比特网络技术股份有限公司 Skill release method and apparatus for a game
CN109550241B (zh) * 2018-09-20 2023-04-07 厦门吉比特网络技术股份有限公司 Single-joystick control method and system
CN109806579A (zh) * 2019-02-01 2019-05-28 网易(杭州)网络有限公司 Method and apparatus for controlling virtual object in game, electronic device, and storage medium
JP2019202128A (ja) 2019-04-17 2019-11-28 株式会社セガゲームス Information processing device and program
JP6928709B1 (ja) * 2020-12-28 2021-09-01 プラチナゲームズ株式会社 Information processing program, information processing apparatus, and information processing method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3943172A4 *

Also Published As

Publication number Publication date
SG11202112169UA (en) 2021-12-30
AU2021254521A1 (en) 2021-11-11
CN111589127B (zh) 2022-07-12
JP7451563B2 (ja) 2024-03-18
KR20210150465A (ko) 2021-12-10
US20220126205A1 (en) 2022-04-28
CN111589127A (zh) 2020-08-28
JP2022533919A (ja) 2022-07-27
EP3943172A4 (en) 2022-07-20
AU2021254521B2 (en) 2023-02-02
CA3137791A1 (en) 2021-10-28
JP2024063201A (ja) 2024-05-10
EP3943172A1 (en) 2022-01-26


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2021782629

Country of ref document: EP

Effective date: 20211012

ENP Entry into the national phase

Ref document number: 2021564351

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20217036000

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2021254521

Country of ref document: AU

Date of ref document: 20210315

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21782629

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE