CN117122906A - Virtual character control method, device and computer equipment - Google Patents


Info

Publication number
CN117122906A
CN117122906A (application CN202210558049.3A)
Authority
CN
China
Prior art keywords
virtual
virtual character
character
control
skill
Prior art date
Legal status
Pending
Application number
CN202210558049.3A
Other languages
Chinese (zh)
Inventor
关磊
魏翰林
胡戌涛
陈涛
胡昕彤
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210558049.3A priority Critical patent/CN117122906A/en
Publication of CN117122906A publication Critical patent/CN117122906A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525 Changing parameters of virtual cameras
    • A63F 13/5252 Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A63F 13/5255 Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/822 Strategy games; Role-playing games
    • A63F 13/833 Hand-to-hand fighting, e.g. martial arts competition
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068 Features with input arrangements being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/1075 Features with input arrangements detecting the point of contact of the player using a touch screen
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/6045 Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F 2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F 2300/6661 Methods for rendering three dimensional images for changing the position of the virtual camera
    • A63F 2300/6669 Methods for changing the position of the virtual camera using a plurality of virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes rooms
    • A63F 2300/6676 Methods for changing the position of the virtual camera by dedicated player input
    • A63F 2300/80 Features of games specially adapted for executing a specific type of game
    • A63F 2300/8029 Fighting without shooting
    • A63F 2300/807 Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a virtual character control method, a virtual character control device, and computer equipment, and belongs to the technical field of online games. The virtual character control method comprises the following steps: controlling the virtual character to climb to a side surface of a virtual object in the virtual scene; in response to a first touch operation on a movement control response area, controlling the virtual character to move in the virtual scene; and in response to a skill release instruction, determining a skill release direction according to the second camera orientation, adjusting the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and controlling the virtual character to move a preset distance toward the first character orientation. The application can thereby provide effective control assistance to players.

Description

Virtual character control method, device and computer equipment
Technical Field
The present application relates to the field of online game technologies, and in particular, to a virtual character control method, device, and computer device.
Background
In a massively multiplayer online role-playing game (Massively Multiplayer Online Role-Playing Game, abbreviated as MMORPG), a player can, through instructions issued via a computer device, control a virtual character in the game to perform a corresponding action or operation, and a corresponding animation can be displayed according to the action of the virtual character.
In the related art, a player often needs to control a virtual character to interact with other virtual characters based on a certain virtual object in the game. For example, where the virtual object is a virtual wall, the player can control the virtual character to climb on the virtual wall and fight other virtual characters; this interaction may be referred to as wall combat.
However, the means provided in related games for triggering or controlling wall combat are relatively limited, so it is difficult to provide effective control assistance to the player.
Disclosure of Invention
The application aims to provide a virtual character control method, a virtual character control device and computer equipment, which can achieve the effect of providing effective control assistance for players.
Embodiments of the present application are implemented as follows:
in a first aspect of the embodiments of the present application, a virtual character control method is provided, in which a graphical user interface of a game is provided through a first terminal device, the content displayed on the graphical user interface includes a virtual scene, and the virtual scene includes a virtual character; the virtual character control method comprises the following steps:
controlling the virtual character to climb to a side surface of a virtual object in the virtual scene;
in response to a first touch operation on a movement control response area, controlling the virtual character to move in the virtual scene;
in response to a second touch operation on a skill response area: while the second touch operation is a continuous touch, in response to a third touch operation on a skill release direction adjustment response area, and while the virtual character is controlled to remain at its position in the virtual scene, adjusting the shooting orientation of the virtual camera from a first camera orientation to a second camera orientation according to the third touch operation, and adjusting a first game view picture to a second game view picture, wherein the second game view picture is a picture determined by the virtual camera shooting the virtual scene according to the second camera orientation;
and responding to a skill release instruction, determining a skill release direction according to the second camera orientation, adjusting the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and controlling the virtual character to move a preset distance towards the first character orientation, wherein the preset character orientation is different from the first character orientation.
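The claimed release step ties the skill release direction to the camera's current (second) orientation and then lunges the character a preset distance. The following is a minimal illustrative sketch of that step under a 2D yaw model; all function and variable names are assumptions for illustration, not the patent's actual implementation.

```python
import math

def release_skill(char_pos, camera_yaw_deg, preset_distance):
    """Derive the skill release direction from the second camera orientation,
    turn the character to that (first) orientation, and advance it the
    preset distance along that heading. Returns (new_yaw, new_position)."""
    # Skill release direction follows the camera's horizontal yaw.
    release_yaw = camera_yaw_deg % 360.0
    # Move the preset distance along the release direction.
    dx = math.cos(math.radians(release_yaw)) * preset_distance
    dy = math.sin(math.radians(release_yaw)) * preset_distance
    new_pos = (char_pos[0] + dx, char_pos[1] + dy)
    return release_yaw, new_pos
```

For example, with the camera facing 90 degrees and a preset distance of 5, the character turns to yaw 90 and ends up 5 units along that heading.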
Optionally, the method further comprises:
in response to the virtual character encountering a hostile virtual character on the path along which the virtual character moves the preset distance, controlling the virtual character to attack the hostile virtual character.
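The optional step above attacks any hostile character encountered along the preset-distance path. A hedged geometric sketch of one way to detect such encounters is below; the segment-distance test and the hit radius are illustrative assumptions, not the patent's stated method.

```python
import math

def hostiles_hit(start, direction_deg, preset_distance, hostiles, radius):
    """Return the hostiles lying within `radius` of the lunge segment that
    runs `preset_distance` from `start` along `direction_deg`."""
    dx = math.cos(math.radians(direction_deg))
    dy = math.sin(math.radians(direction_deg))
    hit = []
    for h in hostiles:
        # Project the hostile onto the movement segment, clamped to its ends.
        t = max(0.0, min(preset_distance,
                         (h[0] - start[0]) * dx + (h[1] - start[1]) * dy))
        cx, cy = start[0] + t * dx, start[1] + t * dy
        if math.hypot(h[0] - cx, h[1] - cy) <= radius:
            hit.append(h)
    return hit
```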
Optionally, the movement control response area is a response area corresponding to a movement control located in the graphical user interface.
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
Optionally, the virtual character control method further includes:
in response to a second touch operation on the skill response area, displaying a functional control on the graphical user interface.
Optionally, the skill release direction adjustment response area is a response area corresponding to the functional control located on the graphical user interface;
and the second touch operation is an operation lasting at least a first preset duration.
Optionally, the virtual character control method further includes:
in response to the duration of the second touch operation on the skill response area reaching the first preset duration, controlling generation of a skill release direction adjustment response area.
Optionally, the controlling generation of the skill release direction adjustment response area includes:
controlling display of the functional control on the graphical user interface, wherein the skill release direction adjustment response area is the response area corresponding to the functional control.
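The long-press behavior above (the adjustment response area appears only once the press on the skill response area lasts a first preset duration) can be sketched as follows. The names and the 0.5-second threshold are illustrative assumptions.

```python
FIRST_PRESET_DURATION = 0.5  # seconds; assumed value of the first preset duration

def update_skill_press(press_elapsed, adjust_area_visible):
    """Return whether the skill release direction adjustment response area
    (the functional control) should be shown, given how long the second
    touch operation has been held."""
    if press_elapsed >= FIRST_PRESET_DURATION and not adjust_area_visible:
        return True  # generate / display the adjustment response area
    return adjust_area_visible
```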
Optionally, the movement control response area and the skill release direction adjustment response area are located on the same side of the graphical user interface, and the movement control response area and the skill response area are located on opposite sides of the graphical user interface, respectively.
Optionally, the graphical user interface includes a view adjustment control, and the virtual character control method further includes:
in response to a fourth touch operation on the view adjustment control, adjusting the shooting orientation of the virtual camera from a third camera orientation to a fourth camera orientation according to the fourth touch operation, and adjusting a third game view picture to a fourth game view picture, wherein the third game view picture is a picture determined by the virtual camera shooting the virtual scene according to the third camera orientation, and the fourth game view picture is a picture determined by the virtual camera shooting the virtual scene according to the fourth camera orientation.
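A minimal sketch of this view-adjustment step: a drag on the view adjustment control rotates the camera yaw in proportion to the drag distance, which in turn changes the rendered game view picture. The sensitivity constant and names are assumptions for illustration.

```python
SENSITIVITY = 0.2  # assumed degrees of camera yaw per pixel of horizontal drag

def adjust_camera(camera_yaw_deg, drag_dx_px):
    """Rotate the virtual camera from its current (third) orientation to a
    new (fourth) orientation according to the drag of the fourth touch
    operation, wrapping yaw into [0, 360)."""
    return (camera_yaw_deg + drag_dx_px * SENSITIVITY) % 360.0
```

For example, a 100-pixel drag turns the camera 20 degrees.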
Optionally, the skill response area is a response area corresponding to a skill control located in the graphical user interface, and the virtual character control method further includes:
in response to a second touch operation on the skill response area, controlling display of a skill wheel control, wherein the skill wheel control includes at least one sub-control, and the sub-control is configured to, in response to a fifth touch operation continuous with the second touch operation, control the virtual character to execute the action corresponding to the sub-control.
Optionally, the controlling, in response to the first touch operation on the movement control response area, the virtual character to move in the virtual scene includes:
adjusting the virtual character from a first character orientation to a second character orientation, controlling the virtual character to move from a first character position according to the second character orientation, and adjusting the first game view picture to a third game view picture, wherein the third game view picture is a picture determined by capturing the virtual scene according to the first camera orientation and the current character position to which the virtual character has moved.
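The movement step above (turn the character toward the input direction, advance it, and let the camera keep its orientation while tracking the new position) can be sketched with a joystick-style input model. All names, and the treatment of the input as a 2D direction vector, are assumptions for illustration.

```python
import math

def move_character(pos, input_dx, input_dy, speed, dt):
    """Turn the character from its first orientation to a second orientation
    given by the movement input, and advance it from its current position.
    Returns (second_orientation_deg, new_position); the camera would then
    re-capture the scene at the same camera orientation but new position."""
    second_orientation = math.degrees(math.atan2(input_dy, input_dx)) % 360.0
    new_pos = (pos[0] + input_dx * speed * dt,
               pos[1] + input_dy * speed * dt)
    return second_orientation, new_pos
```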
Optionally, the virtual character control method further includes:
in response to a skill release instruction, determining a third character orientation of the virtual character according to the second camera orientation, and controlling the virtual character to adjust from its current character orientation to the third character orientation.
Optionally, the virtual character control method further includes:
detecting a current weapon equipment state of the virtual character, the weapon equipment state including unarmed, equipped with a melee weapon, and equipped with a ranged weapon;
and if the current weapon equipment state of the virtual character is unarmed or equipped with a ranged weapon, determining not to control the virtual character to move the preset distance toward the first character orientation.
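The weapon-state gate above can be sketched as a simple predicate. The state names ("unarmed", "melee", "ranged") are illustrative assumptions standing in for the patent's unequipped-weapon, melee-weapon and ranged-weapon states; only a melee-equipped character performs the preset-distance move.

```python
def should_lunge(weapon_state):
    """Gate the preset-distance move on the current weapon equipment state."""
    return weapon_state == "melee"

def resolve_skill_movement(weapon_state, preset_distance):
    """Distance the character actually moves when the skill is released:
    the full preset distance when a melee weapon is equipped, else none."""
    return preset_distance if should_lunge(weapon_state) else 0.0
```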
In a second aspect of the embodiment of the present application, there is provided a virtual character control apparatus including:
a first control module for controlling the virtual character to climb to a side surface of a virtual object in the virtual scene;
the mobile control module is used for responding to a first touch operation aiming at a mobile control response area and controlling the virtual character to move in the virtual scene;
a second control module, configured to: in response to a second touch operation on a skill response area, while the second touch operation is a continuous touch, in response to a third touch operation on a skill release direction adjustment response area, and while the virtual character is controlled to remain at its position in the virtual scene, adjust the shooting orientation of the virtual camera from a first camera orientation to a second camera orientation according to the third touch operation, and adjust a first game view picture to a second game view picture, wherein the second game view picture is a picture determined by the virtual camera shooting the virtual scene according to the second camera orientation;
and the skill release control module is used for responding to a skill release instruction, determining a skill release direction according to the second camera orientation, adjusting the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and controlling the virtual character to move a preset distance towards the first character orientation, wherein the preset character orientation is different from the first character orientation.
In a third aspect of the embodiments of the present application, there is provided a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor, the computer program, when executed by the processor, implementing the virtual character control method according to the first aspect.
In a fourth aspect of the embodiments of the present application, there is provided a computer readable storage medium storing a computer program which, when executed by a processor, implements the virtual character control method described in the first aspect.
The beneficial effects of the embodiment of the application include:
according to the virtual character control method provided by the embodiment of the application, the virtual character is controlled to climb to the side surface of a virtual object in the virtual scene by controlling the virtual character to respond to a first touch operation aiming at a movement control response area, the virtual character is controlled to move in the virtual scene, responding to a second touch operation aiming at a skill response area, under the condition that the second touch operation is in continuous touch, responding to a third touch operation aiming at a skill release direction adjustment response area, under the condition that the virtual character is controlled to maintain the position of the virtual character in the virtual scene, the shooting direction of the virtual camera is adjusted from a first camera direction to a second camera direction according to the third touch operation, the first game view is adjusted to a second game view, responding to a skill release instruction, the virtual character is adjusted to a first character direction according to the skill release direction, the virtual character is controlled to move a preset distance towards the first character, and under the condition that the virtual character moves a preset distance towards the skill release direction, and under the condition that the virtual character encounters a path crossing the preset distance, the virtual character is controlled to attack the virtual character.
Where the virtual character has climbed to a side surface of a virtual object in the virtual scene, it may be determined that the virtual character is in a target state, and in the target state the virtual character may be controlled to trigger a special action or special skill, such as triggering a wall combat operation.
In the case where the virtual character has climbed to the side surface of the virtual object in the virtual scene, the virtual character can be controlled, in response to the first touch operation on the movement control response area, to move in the virtual scene. For example, moving may change the height at which the virtual character climbs on the side surface of the virtual object, or the virtual character may be controlled to climb to any reachable position on the side surface of the virtual object. In this way, effective control assistance can be provided to the player.
In response to a second touch operation on the skill response area and a third touch operation on the skill release direction adjustment response area, the shooting orientation of the virtual camera can be adjusted from the first camera orientation to the second camera orientation, and the first game view picture can be adjusted to the second game view picture, according to the third touch operation. Adjusting the first game view picture to the second game view picture lets the user conveniently observe the virtual scene from different angles. In this way, effective control assistance can be provided to the player.
Controlling the virtual character to move the preset distance toward the first character orientation specifically means controlling the virtual character to move, from its current position, along the direction indicated by the first character orientation over a path of the preset distance in length. In addition, while the virtual character is moving, the skill action or skill animation corresponding to the skill control can be executed at the same time. The skill release direction, the character orientation, and/or the movement direction of the virtual character when it releases the corresponding skill can thus all be controlled, providing effective control assistance to the player.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a first virtual character control method according to an embodiment of the present application;
FIG. 2 is a diagram of a first graphical user interface provided in an embodiment of the present application;
FIG. 3 is a diagram of a second graphical user interface provided in an embodiment of the present application;
FIG. 4 is a diagram of a third graphical user interface provided by an embodiment of the present application;
FIG. 5 is a diagram of a fourth graphical user interface provided by an embodiment of the present application;
FIG. 6 is a diagram of a fifth graphical user interface provided by an embodiment of the present application;
FIG. 7 is a diagram of a sixth graphical user interface provided by an embodiment of the present application;
FIG. 8 is a diagram of a seventh graphical user interface provided by an embodiment of the present application;
fig. 9 is a schematic structural diagram of a first virtual character control device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a second virtual character control device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a third virtual character control device according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a fourth virtual character control device according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the related art, a player often needs to control a virtual character to interact with other virtual characters based on a certain virtual object in the game. For example, where the virtual object is a virtual wall, the player can control the virtual character to climb on the virtual wall and fight other virtual characters; this interaction may be referred to as wall combat. However, the means provided in related games for triggering or controlling wall combat are relatively limited, so it is difficult to provide effective control assistance to the player.
To this end, an embodiment of the present application provides a virtual character control method: the virtual character is controlled to climb to a side surface of a virtual object in the virtual scene; in response to a first touch operation on a movement control response area, the virtual character is controlled to move in the virtual scene; in response to a second touch operation on a skill response area, and, while the second touch operation is a continuous touch, in response to a third touch operation on a skill release direction adjustment response area, the shooting orientation of the virtual camera is adjusted from a first camera orientation to a second camera orientation according to the third touch operation while the virtual character is controlled to remain at its position in the virtual scene, and the first game view picture is adjusted to the second game view picture; and in response to a skill release instruction, a skill release direction is determined according to the second camera orientation, the virtual character is adjusted from a preset character orientation to a first character orientation, and the virtual character is controlled to move a preset distance toward the first character orientation. An effect of providing effective control assistance to the player can thus be achieved.
The virtual character control method in one embodiment of the present application may be run on a terminal device or a server. The terminal device may be a local terminal device. When the virtual character control method runs on a server, the virtual character control method can be realized and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and client equipment.
In an alternative embodiment, various cloud applications, for example cloud games, may be run under the cloud interaction system. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the cloud game operation mode, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the game display method are completed on the cloud game server, while the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palm computer; the terminal device that performs the information processing, however, is the cloud game server in the cloud. When playing the game, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game picture.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, it may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In a possible implementation manner, the embodiment of the present application provides a virtual character control method, and a graphical user interface is provided through a first terminal device, where the first terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
The embodiment of the present application is described by taking a virtual character control method applied to a terminal game as an example. However, this does not mean that the embodiment of the present application can be applied only to virtual character control in terminal games.
The virtual character control method provided by the embodiment of the application is explained in detail below.
Fig. 1 is a flowchart of a virtual character control method provided by the present application. The content displayed by the graphical user interface includes a virtual scene including a virtual character therein. Referring to fig. 1, an embodiment of the present application provides a virtual character control method, including:
step 1001: the virtual character is controlled to climb to a side surface of a virtual object in the virtual scene.
Alternatively, the virtual character may be one of the virtual characters in the terminal game manipulated by the player.
Optionally, the virtual scene is a scene provided in the terminal game for the virtual character to move, fight, and perform other activities. That is, the virtual scene includes a virtual character therein.
The virtual object may be any object in the virtual scene. In general, the virtual object may be a virtual tree, a virtual wall, a virtual house, a virtual cliff, a virtual stone, a virtual mountain wall, or the like in the virtual scene. The embodiment of the present application is not limited thereto.
The side surface of the virtual object may refer to a surface on which the virtual character cannot stand, squat or lie.
The virtual character climbing to a side surface of a virtual object in the virtual scene may mean that the virtual character climbs or hangs on the side surface of the virtual object.
In addition, the virtual character can be controlled to perform actions such as moving, jumping, dodging, climbing, swimming and the like through various controls on a graphical user interface provided by the computer device. The embodiment of the present application is not limited thereto.
In the case that the virtual character climbs to the side surface of the virtual object in the virtual scene, the virtual character can be determined to be in a target state, and the virtual character can then be controlled to trigger special actions or special skills in the target state, for example, to trigger a wall-strike operation. In addition, the release direction of the special actions or special skills of the virtual character can be controlled or adjusted through corresponding controls.
In an alternative embodiment, where the virtual character climbs to a side surface of a virtual object in the virtual scene, the virtual character may be determined to be in a target state, and the virtual character may be controlled to trigger a special action or special skill in that target state. In this embodiment, the target state may optionally include a wall climbing state or a tree climbing state.
For example, the virtual character being in a wall climbing state may mean that the virtual character climbs or hangs on a virtual wall, a virtual mountain wall, a virtual cliff, a virtual stone, or a virtual house. The virtual character being in a tree climbing state may mean that the virtual character climbs or hangs on a virtual tree.
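The determination of the target state described above can be illustrated with a short sketch. The patent describes no implementation, so this is a hypothetical Python example; the surface-kind names and the function itself are assumptions for illustration only.

```python
# Hypothetical sketch: decide whether the virtual character is in a target
# state (wall climbing or tree climbing) from the kind of side surface it
# climbs or hangs on. All names are illustrative assumptions.

WALL_LIKE = {"wall", "mountain_wall", "cliff", "stone", "house"}
TREE_LIKE = {"tree"}

def target_state(surface_kind):
    """Return the target state for a climbable side surface, or None."""
    if surface_kind in WALL_LIKE:
        return "wall_climbing"
    if surface_kind in TREE_LIKE:
        return "tree_climbing"
    return None  # other surfaces do not put the character in a target state
```

Only a character in one of these target states would then be permitted to trigger the special action or special skill, such as the wall strike.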
Step 1002: and controlling the virtual character to move in the virtual scene in response to the first touch operation aiming at the movement control response area.
Optionally, the movement control response area is a response area corresponding to a movement control located in the graphical user interface. The response area may be larger than, or the same as, the graphical user interface area occupied by the movement control.
In the case that no first touch operation or other touch operation is performed on the movement control response area, the movement control response area may not be displayed on the graphical user interface, or its transparency may be adjusted to a preset transparency, which may be fully transparent or semi-transparent. The adjustment can be made according to actual needs, and the embodiment of the present application is not limited thereto.
The movement control may be a virtual joystick, virtual direction keys, a virtual steering wheel, or another virtual control, and may be set according to actual needs.
Optionally, the movement control may be used to control the direction and speed of movement of the virtual character in the virtual scene.
The first touch operation may be a sliding operation, a dragging operation, a click operation, or a long-press operation performed by a user or a related technician, and the embodiment of the present application is not limited thereto.
In the case where the virtual character climbs to the side surface of the virtual object in the virtual scene, the virtual character can be controlled to move in the virtual scene in response to the first touch operation on the movement control response area. For example, the height at which the virtual character climbs on the side surface of the virtual object can be changed after the movement, and the virtual character can also be controlled to climb to any reachable position on the side surface of the virtual object, thereby providing effective control assistance to the player. In this embodiment, when the virtual character is controlled to move in the virtual scene in response to the first touch operation, the character orientation of the virtual character is maintained at a preset orientation, where the preset orientation is a direction that is not determined by the operation parameters of the first touch operation. For example, when the virtual character is on a wall, its limbs may face the wall or be held at a preset inclination angle; or, while the virtual character climbs on the wall, its body orientation may be dynamically adjusted according to preset rules of the program, but that adjustment is not determined by the operation parameters of the first touch operation. In other words, the preset orientation is not controlled by the player: for example, to make the animation of the virtual character more realistic while crawling, the character's head may be made to rotate in an uncontrolled manner, or the limbs may perform a preset animation, so that the character orientation changes without player input.
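The behavior above, in which the first touch operation moves the climbing character while the character orientation stays at the preset orientation, can be sketched as follows. This is a minimal hypothetical Python example; the 2D coordinate model, the names, and the speed parameter are assumptions, not part of the embodiment.

```python
# Minimal sketch: while climbing, the first touch operation on the movement
# control response area changes the character's position on the side surface,
# but the character orientation remains the preset orientation and is never
# derived from the touch parameters. Names and units are assumptions.

def move_while_climbing(position, preset_orientation, touch_delta, speed=1.0):
    """Return (new_position, orientation); orientation is unchanged by design."""
    x, y = position          # position on the climbable side surface
    dx, dy = touch_delta     # operation parameters of the first touch operation
    new_position = (x + dx * speed, y + dy * speed)
    return new_position, preset_orientation
```

Note that `preset_orientation` passes through untouched: any head turns or limb animations would be driven by preset program rules elsewhere, never by `touch_delta`.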
Further, when the virtual character is not climbing on a side surface of a virtual object in the virtual scene, for example, when the virtual character stands on a surface of the virtual scene, in an alternative embodiment, in response to a touch operation on the movement control response area, the virtual character is adjusted from its current character orientation to a target character orientation, the virtual character is controlled to move from its current character position according to the target character orientation, and the virtual scene is captured according to a preset camera orientation and the character position to which the virtual character moves, so as to update the game view picture.
In the embodiment of the present application, the virtual character is adjusted from the current character orientation to the target character orientation; specifically, the determination of the target character orientation is related to the operation direction and the operation position of the player's touch operation in the movement control response area.
For example, the player moves the movement control rightward through the sliding operation, and accordingly, the direction of the virtual character is also adjusted rightward accordingly. Note that the player may change the orientation of the virtual character when performing the first touch operation in the movement control response area, but the change of the shooting orientation of the virtual camera is not controlled by performing the touch operation in the movement control response area at this time.
In an alternative embodiment, the virtual character is controlled to move from the current character position according to a preset orientation of the virtual character. The preset orientation may be a direction determined by another operation, for example, an operation acting on the visual field orientation control area to determine the orientation of the virtual character.
For example, a visual field orientation control area may be provided on a different side of the graphical user interface from the movement control; for instance, the movement control may be placed at the lower left corner of the graphical user interface and the visual field orientation control area at the upper left corner, so that the user or player can, while controlling the movement of the virtual character, simultaneously change the shooting orientation of the virtual camera and the orientation of the virtual character by performing a touch operation on the visual field orientation control area. In addition, the visual field orientation control area may include a corresponding control.
Step 1003: in response to a second touch operation on the skill response area and, while the second touch operation is held, a third touch operation on the skill release direction adjustment response area, the virtual character is controlled to maintain its position in the virtual scene, and, according to the third touch operation, the shooting orientation of the virtual camera is adjusted from a first camera orientation to a second camera orientation and the first game view picture is adjusted to a second game view picture.
In an alternative embodiment, when the second touch operation is held, that is, when the player presses and holds the skill control, the position of the virtual character in the game scene cannot be adjusted. For example, while the second touch operation is held, the virtual character enters a power storage (charging) state, and its position in the game scene cannot be adjusted. In this embodiment, while the second touch operation is held, a sliding operation continuous with the second touch operation does not control or adjust the character orientation and/or the virtual camera orientation.
In an alternative embodiment, when the second touch operation is held, that is, when the player presses and holds the skill control, neither the position nor the orientation of the virtual character in the game scene can be adjusted.
In an alternative embodiment, the position and/or orientation of the virtual character in the game scene being unable to be adjusted means that the virtual character is controlled not to respond to touch operations directed at moving the virtual character and/or adjusting its orientation. For example, when no continuous second touch operation is applied to the skill response area, the virtual character can be controlled to move through a touch operation applied to the movement control; however, while the second touch operation on the skill response area is held, the position of the virtual character in the virtual scene remains unchanged, that is, the player cannot control the virtual character to move through the movement control at that moment.
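The gating of movement input while the second touch operation is held can be sketched as follows. This is a hypothetical Python example; the class and method names are assumptions for illustration only.

```python
# Illustrative sketch of input gating: while the second touch operation on the
# skill response area is held (a "charging" state), touch operations that would
# move the character are ignored. All names are assumptions.

class CharacterInput:
    def __init__(self):
        self.charging = False
        self.position = (0.0, 0.0)

    def on_skill_press(self):
        self.charging = True    # second touch operation begins and is held

    def on_skill_release(self):
        self.charging = False   # skill release instruction is issued

    def on_move(self, dx, dy):
        """Movement control input; returns whether the character actually moved."""
        if self.charging:
            return False        # position cannot be adjusted while charging
        x, y = self.position
        self.position = (x + dx, y + dy)
        return True
```

In this sketch the skill release direction adjustment response area would remain responsive during charging; only the movement path is gated.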
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
The skill controls may include common attack controls, special move controls, special ability controls, other controls that may provide a gain for the virtual character or for a friendly virtual character in the same camp as the virtual character, and other controls that may apply a negative effect to a hostile virtual character in a camp hostile to the virtual character. The embodiment of the present application is not limited thereto.
The number of skill controls may be set by the relevant technician or user as desired. For example, only one skill control may be provided, or a plurality of skill controls may be provided.
In general, at least one common attack control needs to be provided in a competitive or combat game, and if the virtual character also has a special move or special ability, a corresponding special move control or special ability control needs to be provided.
In addition, two common attack controls can be set. In the event that the two common attack controls are triggered, the virtual character may perform the same action, or the virtual character may perform a different action. The embodiment of the present application is not limited thereto.
The second touch operation may be a click operation or a long-press operation.
The virtual character may be controlled to enter a state of preparing to attack or preparing to release skills by performing the second touch operation on the skill response area.
The second touch operation being held specifically may mean that the second touch operation is a long-press operation that continuously triggers the skill response area.
The skill release direction adjustment response area may refer to a corresponding area for adjusting the skill release direction. Generally, for convenience of user or player operation, the skill release direction adjustment response area may be disposed around the skill response area; the skill response area may also be enclosed by the skill release direction adjustment response area. The skill release direction adjustment response area can be set on the same side of the graphical user interface as the movement control, so that when the virtual character enters the power storage state, that is, when its position in the game scene cannot be adjusted, touch operations can conveniently be performed on the skill release direction adjustment response area, improving human-computer interaction efficiency. The skill release direction adjustment response area may also be located at any other possible location on the graphical user interface, which is not limited by the embodiments of the present application.
In addition, the skill release direction adjustment response area may display a corresponding control for adjusting the skill release direction, or may display no control, in which case the corresponding touch operation is performed directly on the skill release direction adjustment response area to adjust the skill release direction. The embodiment of the present application is not limited thereto.
Alternatively, without a second touch operation on the skill response area, the skill release direction adjustment response area may be hidden or may not be triggerable by any operation. That is, the skill release direction adjustment response area may be triggered only when the second touch operation is performed on the skill response area.
Alternatively, the skill release direction adjustment response area may be an area that is resident in the graphical user interface, or may be an area that is only displayed in the graphical user interface if triggered.
The skill response area may also be an area that resides in the graphical user interface or that is only displayed in the graphical user interface if triggered. The embodiment of the present application is not limited thereto.
The third touch operation may be a click operation, a sliding operation, or a drag operation, which may be set according to the actual situation; the embodiment of the present application is not limited thereto.
The shooting direction of the virtual camera can be adjusted through the third touch operation.
The shooting direction of the virtual camera may refer to a direction in which the virtual camera captures a picture in the virtual scene.
Optionally, the first camera orientation may refer to a default shooting orientation of the virtual camera in the virtual scene, or may refer to a current shooting orientation of the virtual camera, or may refer to a shooting orientation of the virtual camera in a case where the second touch operation is not performed on the skill response area and/or the third touch operation is not performed on the skill release direction adjustment response area. The embodiment of the present application is not limited thereto.
The second camera orientation may refer to an orientation of the virtual camera that the associated technician or user indicated by the third touch operation needs to adjust.
Moreover, the shooting orientation of the virtual camera may also serve as a default movement direction, attack direction, or skill release direction of the virtual character. The embodiment of the present application is not limited thereto.
The first game view picture is a picture determined by the virtual camera shooting the virtual scene according to the first camera orientation.
In general, the first game view picture may be the current game picture. That is, the first game view picture may be the game picture displayed on the graphical user interface when no second touch operation is performed on the skill response area and/or no third touch operation is performed on the skill release direction adjustment response area. The embodiment of the present application is not limited thereto.
Optionally, the second game view picture is a picture determined by the virtual camera shooting the virtual scene according to the second camera orientation.
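The adjustment of the shooting orientation from the first camera orientation to the second camera orientation according to the third touch operation can be sketched as follows. This is a hypothetical Python example; the yaw/pitch model, the sensitivity value, and the pitch clamping range are assumptions, not details of the embodiment.

```python
# Hedged sketch: a sliding third touch operation adjusts the virtual camera's
# yaw and pitch; the second game view picture is whatever the camera renders
# at the resulting second camera orientation. Sensitivity and clamping range
# are illustrative assumptions.

def adjust_camera(orientation, touch_delta, sensitivity=0.1):
    """orientation: (yaw_deg, pitch_deg); touch_delta: (dx, dy) in pixels."""
    yaw, pitch = orientation
    dx, dy = touch_delta
    yaw = (yaw + dx * sensitivity) % 360.0
    pitch = max(-89.0, min(89.0, pitch + dy * sensitivity))  # avoid flipping over
    return (yaw, pitch)
```

Clamping the pitch short of straight up or down is a common design choice to keep the view picture stable while the player drags across the skill release direction adjustment response area.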
It should be noted that, in order for the player to operate more conveniently in the skill release direction adjustment response area, the skill release direction adjustment response area may be provided on the same side as the movement control response area, so that the player can control, with one hand, the skill release direction or the movement direction of the virtual character in different states; the skill response area may be provided on the other side of the skill release direction adjustment response area, so that the other hand can trigger the skill control of the virtual character regardless of whether the player is adjusting the skill release direction or the movement direction. In this way, human-computer interaction efficiency can be improved.
It should be noted that, in response to the second touch operation on the skill response area and the third touch operation on the skill release direction adjustment response area, the shooting orientation of the virtual camera may be adjusted from the first camera orientation to the second camera orientation and the first game view picture may be adjusted to the second game view picture according to the third touch operation. By adjusting the first game view picture to the second game view picture, the user can conveniently observe the virtual scene from different angles. In this way, effective control assistance can be provided to the player.
Step 1004: in response to the skill release instruction, a skill release direction is determined according to the second camera orientation, the virtual character is adjusted from a preset character orientation to a first character orientation according to the skill release direction, and the virtual character is controlled to move a preset distance toward the first character orientation.
Alternatively, the skill release instruction may be a release of the second touch operation described above. Generally, if the second touch operation is a click operation, the skill release instruction may refer to the lifting of that click; if the second touch operation is a long-press operation, the skill release instruction may refer to the release of that long press. The embodiment of the present application is not limited thereto.
Optionally, the predetermined character orientation is different from the first character orientation.
The preset character orientation and the first character orientation may refer to a body orientation and/or a face orientation of the virtual character.
The preset character orientation may refer to a default character orientation of the virtual character in the virtual scene.
The first character orientation may be the same as the skill release direction, the first character orientation may be opposite to the skill release direction, and the first character orientation may also have an angle with the skill release direction. The embodiment of the present application is not limited thereto.
Determining the skill release direction according to the second camera orientation may specifically mean taking the direction in which the second camera orientation points as the skill release direction, or taking the direction from the position of the virtual character toward the center of the second game view picture as the skill release direction.
Alternatively, the preset distance may be a distance set in advance. The preset distance may refer to a movement distance of an action that the virtual character is required to perform, corresponding to the triggered skill control.
It should be noted that controlling the virtual character to move a preset distance toward the first character orientation specifically means controlling the virtual character to move, from its current position, along the direction indicated by the first character orientation over a path of the preset distance length. In addition, while the virtual character moves, the skill action or skill animation corresponding to the skill control may be executed at the same time. The embodiment of the present application is not limited thereto.
Thus, the skill releasing direction, the character direction and/or the movement direction of the virtual character when the virtual character releases the corresponding skill can be controlled, and the effect of providing effective control assistance for the player can be achieved.
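Step 1004 can be illustrated with a short sketch, under the assumptions that the skill release direction is taken directly from the second camera orientation's yaw, that the first character orientation equals the release direction, and that the dash destination lies a preset distance along it. This is hypothetical Python; none of these names or simplifications come from the embodiment itself.

```python
import math

# Hypothetical sketch of step 1004: on the skill release instruction, derive
# the skill release direction from the second camera orientation, turn the
# character from the preset orientation to the first character orientation,
# and compute the destination of the preset-distance move. Assumes a flat 2D
# plane and degrees; both are illustrative simplifications.

def release_skill(char_position, camera_yaw_deg, preset_distance):
    """Return (first_character_orientation_deg, destination) for the move."""
    release_dir = camera_yaw_deg        # skill release direction from the camera
    first_orientation = release_dir     # character turns to the release direction
    rad = math.radians(release_dir)
    x, y = char_position
    destination = (x + preset_distance * math.cos(rad),
                   y + preset_distance * math.sin(rad))
    return first_orientation, destination
```

The first character orientation could equally be opposite to, or at an angle with, the release direction, as the embodiment notes; equating the two here is only one of those options.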
In the embodiment of the present application, the virtual character is controlled to climb to the side surface of a virtual object in the virtual scene; in response to a first touch operation on the movement control response area, the virtual character is controlled to move in the virtual scene; in response to a second touch operation on the skill response area and, while the second touch operation is held, a third touch operation on the skill release direction adjustment response area, the virtual character maintains its position while the shooting orientation of the virtual camera and the game view picture are adjusted according to the third touch operation; in response to the skill release instruction, the skill release direction is determined according to the second camera orientation, the virtual character is adjusted from the preset character orientation to the first character orientation, and the virtual character is controlled to move the preset distance toward the first character orientation; and, in response to encountering a hostile virtual character on the path traversed by the preset distance, the virtual character is controlled to attack the hostile virtual character.
Wherein in case the virtual character climbs to a side surface of a virtual object in the virtual scene, it may be determined that the virtual character is in a target state, and the virtual character may be controlled to trigger a special action or a special skill in the target state, such as causing the virtual character to trigger a wall-striking operation.
In the case where the virtual character climbs to the side surface of the virtual object in the virtual scene, the virtual character can be controlled to move in the virtual scene in response to the first touch operation on the movement control response area. For example, the height at which the virtual character climbs on the side surface of the virtual object can be changed after the movement, and the virtual character can also be controlled to climb to any reachable position on the side surface of the virtual object, thereby providing effective control assistance to the player.
In response to the second touch operation on the skill response area and the third touch operation on the skill release direction adjustment response area, the shooting orientation of the virtual camera can be adjusted from the first camera orientation to the second camera orientation and the first game view picture can be adjusted to the second game view picture according to the third touch operation. By adjusting the first game view picture to the second game view picture, the user can conveniently observe the virtual scene from different angles. In this way, effective control assistance can be provided to the player.
Controlling the virtual character to move a preset distance toward the first character orientation specifically means controlling the virtual character to move, from its current position, along the direction indicated by the first character orientation over a path of the preset distance length. In addition, while the virtual character moves, the skill action or skill animation corresponding to the skill control may be executed at the same time. Thus, the skill release direction, the character orientation, and/or the movement direction of the virtual character when releasing the corresponding skill can all be controlled, providing effective control assistance to the player.
In one possible manner, the method further comprises:
controlling the virtual character to attack a hostile virtual character in response to the hostile virtual character being encountered on the path along which the virtual character moves the preset distance.
Optionally, the hostile virtual character may be any virtual character that includes other player characters and/or non-player characters that are not in the same camp as the virtual character, as well as other virtual characters that may cause harm and/or a reduction in benefit to the virtual character.
Encountering a hostile virtual character on the path traversed by the virtual character moving the preset distance may mean that, when the virtual character performs the wall-strike action in response to the skill release instruction, it moves from its current position toward the direction indicated by the first character orientation along a path of the preset distance length; if a virtual character hostile to the virtual character is present on that path, the virtual character may be controlled to attack that hostile virtual character.
Optionally, before the virtual character encounters the hostile virtual character on the path traversed by moving the preset distance, the virtual character may keep the corresponding assault action unchanged during the movement, or may perform a corresponding, continuously changing assault action. The embodiment of the present application is not limited thereto.
In addition, if the virtual character does not encounter a hostile virtual character on the path traversed by the preset distance, or if no hostile virtual character exists on that path, the virtual character may simply keep the corresponding assault action unchanged or may perform a corresponding, continuously changing assault action. The embodiment of the present application is not limited thereto.
It should be noted that, in the embodiment of the present application, the virtual character may be controlled to attack the hostile virtual character through the assault action itself or through an attack action continuous with the assault action. In addition, when the virtual character attacks the hostile virtual character, it may cause a certain amount of injury or influence to the hostile virtual character, for example, reducing the armor value, health value, and/or any other possible attribute value of the hostile virtual character, which is not limited in the embodiment of the present application.
Thus, in the case that a hostile virtual character is encountered on the path along which the virtual character moves the preset distance, the virtual character is controlled to attack the hostile virtual character. In this way, effective control assistance can be provided to the player.
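Detecting a hostile virtual character on the path traversed by the preset distance can be sketched as a point-to-segment distance test. This is a hypothetical Python example; the 2D geometry, the hit radius, and the function names are assumptions for illustration only.

```python
# Illustrative sketch: during the preset-distance move, the hostile virtual
# characters within a hit radius of the path segment are the ones attacked.
# The geometry helper and the hit radius are assumptions.

def point_segment_distance(p, a, b):
    """Distance from point p to segment a-b (all 2D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby     # closest point on the segment
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def enemies_hit_on_path(start, end, enemies, hit_radius=1.0):
    """Return the enemy positions within hit_radius of the move path."""
    return [e for e in enemies if point_segment_distance(e, start, end) <= hit_radius]
```

In the fig. 2 scenario, where the distance between hostile virtual character D and virtual character C is smaller than the preset distance, D would fall within such a path test and be attacked.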
Referring to fig. 2, fig. 2 provides a schematic view of a graphical user interface, and in fig. 2, a virtual object B1, a virtual character C, a hostile virtual character D of the virtual character C, a movement control response area Y, a skill response area J, and a skill release direction adjustment response area T are displayed in the graphical user interface A1.
The movement control response area Y further includes a movement control P. Skill response area J also includes a skill control G1 and a skill control G2. In addition, the distance between the position of the hostile virtual character D and the position of the virtual character C is smaller than the preset distance.
It should be noted that, as can be seen from fig. 2, the virtual object B1 is a virtual wall, the movement control P is a virtual joystick, and the skill control G1 and the skill control G2 are both common attack controls; specifically, the skill control G1 is a horizontal-slash control and the skill control G2 is a vertical-slash control. The controls shown in the movement control response area Y and the skill response area J in fig. 2 are merely examples and do not limit which controls these areas may include.
As shown in fig. 2, the first touch operation for the movement control response area is a sliding operation to the left or a dragging operation to the left. Then, in the case that the player or the user triggers the first touch operation, the virtual character C may be controlled to move leftwards in the virtual scene, and in this case, in response to the player-triggered operation, the above steps are performed, resulting in the graphical user interface A2. Referring to fig. 3, the graphical user interface A2 is a graphical user interface displayed by the computer device after the virtual character is controlled to move in the virtual scene in response to the first touch operation for the movement control response area in the graphical user interface A1.
As can be seen from fig. 2 and 3, the position of the virtual character C on the virtual object B1 has changed; specifically, compared with fig. 2, the virtual character C in fig. 3 has moved leftward along the virtual object B1. It can be seen that the purpose of controlling the movement of the virtual character C in the virtual scene in response to the first touch operation for the movement control response area Y is achieved.
With continued reference to fig. 3, it can be seen that the player also performs a second touch operation on the skill control G1 in the skill response area J, where the second touch operation is a click operation or a long press operation.
If the second touch operation is a long press operation, a third touch operation may be performed on the skill release direction adjustment response area T, so that the shooting direction of the virtual camera can be adjusted and the game view picture displayed on the graphical user interface A2 can be changed, while the virtual character C remains stationary.
For example, suppose the second camera orientation is adjusted to point toward the hostile virtual character D. If the player or the user triggers the skill release instruction, which may be a release operation ending the second touch operation, the second camera orientation may be taken as the skill release direction. In this case, the virtual character C may be adjusted from the preset character orientation to a first character orientation pointing toward the hostile virtual character D according to the skill release direction, and the virtual character C may be controlled to move the preset distance toward the first character orientation. Then, when the player or the user triggers the skill release instruction, the virtual character C may be controlled to move the preset distance toward the hostile virtual character D in the virtual scene; in this case, in response to the player-triggered operation, the above steps are performed, so as to obtain the graphical user interface A3. Referring to fig. 4, the graphical user interface A3 is the graphical user interface displayed by the above-mentioned computer device after the virtual character C is controlled, in response to the skill release instruction in the graphical user interface A2, to move the preset distance toward the hostile virtual character D.
As can be seen from fig. 4, the virtual character C leaves the virtual object B1 and moves toward the position of the hostile virtual character D. Since the distance between the position of the hostile virtual character D and the position of the virtual character C is smaller than the preset distance, the hostile virtual character D is located on the path traversed when the virtual character C moves the preset distance. Thus, the virtual character C attacks the hostile virtual character D. As shown in fig. 4, the virtual weapon held by the virtual character C has struck the hostile virtual character D.
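The example of figs. 2 to 4 can be condensed into a small sketch: on a skill release instruction, the second camera orientation is taken as the skill release direction, the first character orientation is derived from it, and the dash end point is computed. Representing orientations as yaw angles is an assumption for illustration:

```python
import math

def release_skill(char_pos, camera_yaw_deg, preset_distance):
    """On a skill release instruction, take the camera orientation as the
    skill release direction, turn the character to face it (the first
    character orientation), and return (facing, dash end point)."""
    yaw = math.radians(camera_yaw_deg)
    facing = (math.cos(yaw), math.sin(yaw))   # first character orientation
    end = (char_pos[0] + facing[0] * preset_distance,
           char_pos[1] + facing[1] * preset_distance)
    return facing, end
```

If a hostile character such as D lies on the segment from `char_pos` to `end`, the dash brings the character into attack range, matching the behavior shown in fig. 4.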
In one possible implementation, referring to fig. 5, fig. 5 provides a graphical user interface A4. The virtual character control method further comprises the following steps:
and responding to a second touch operation aiming at the skill response area, and displaying a function control on the graphical user interface.
Optionally, the skill release direction adjustment area is a response area corresponding to the functionality control located in the graphical user interface.
Optionally, the second touch operation is an operation that satisfies the first preset duration. Specifically, the triggering time period of the second touch operation may be greater than or equal to the first preset time period.
The first preset duration may be set by a related technician or user, which is not limited in the embodiment of the present application.
The functionality control may be located in the skill response area, or may be located around the skill control described above, or may also be located within the skill release direction adjustment area.
The function of the functional control may be to control the virtual character to perform operations such as vibration, jump, and evasion, or to control the shooting direction of the virtual camera to be changed, or to control the game view displayed on the current graphical user interface to be changed.
The functions of the functional controls can be set according to actual needs, that is, after the functional controls are triggered, the virtual character C can execute different actions according to the functions of the functional controls. The number of functionality controls may be any possible positive integer. The embodiment of the present application is not limited thereto.
For example, suppose the second touch operation for the skill response area is a long press operation on the skill control G1 in the skill response area J, that is, the triggering duration of the second touch operation is greater than or equal to the first preset duration. As shown in fig. 5, the functionality controls G3, G4 and G5 may then be displayed in the graphical user interface A4. In fig. 5, each functionality control is located in the skill response area, so that the user or player can trigger each functionality control quickly and conveniently while holding the long press operation on the skill control G1.
In the example of fig. 5, the functions of the functional controls G3, G4 and G5 are respectively knife vibration, jump and evasion.
The graphical user interface A4 provided in fig. 5 is only an example, and is not limited to the number of the functional controls being only 3, and is not limited to the functions of the functional controls being only the functions of the functional controls G3, G4 and G5.
In this way, more functional controls can be displayed in the graphical user interface to control the actions or operations of the virtual characters under the condition that the second touch operation aiming at the skill response area is triggered, and therefore the effect of providing effective control assistance for the player can be achieved.
In a possible implementation manner, the virtual character control method further includes:
and responding to the duration of the second touch operation aiming at the skill response area to meet the first preset duration, and controlling to generate a skill release direction adjustment response area.
It is noted that, if the duration of the second touch operation for the skill response area is greater than or equal to the first preset duration, the second touch operation may be regarded as a long press operation; otherwise, the second touch operation may be determined to be a click operation. When the second touch operation is a long press operation, the skill release direction adjustment response area may be generated. When the second touch operation is a click operation, the skill may be released directly toward the current shooting direction of the virtual camera, the center of the first game view picture, or the preset character orientation of the virtual character. In this way, the skill release direction adjustment response area is generated only when the second touch operation is a long press operation, so that the player or the user can adjust the skill release direction, and is neither displayed nor generated when the second touch operation is a click operation. The flexibility and practicality of displaying or generating the skill release direction adjustment response area can thereby be improved.
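The long-press/click distinction above can be sketched as a simple duration check; the 0.5 s threshold is an illustrative assumption, since the application leaves the first preset duration to the technician or user:

```python
FIRST_PRESET_DURATION = 0.5  # assumed threshold in seconds

def classify_touch(press_time, release_time, threshold=FIRST_PRESET_DURATION):
    """Classify a second touch operation: a hold at or beyond the threshold
    is a long press (generate the skill release direction adjustment
    response area); anything shorter is a click (release the skill at once)."""
    if release_time - press_time >= threshold:
        return "long_press"
    return "click"
```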
In a possible implementation, the control generates a skill release direction adjustment response area, including:
and controlling the display of the functional control on the graphical user interface, wherein the skill release direction adjustment response area is a response area corresponding to the functional control.
Therefore, whether each functional control displayed on the graphical user interface is triggered or not can be controlled in the skill release direction adjustment response area, and a user or a player can conveniently trigger each functional control under the condition of triggering the second touch operation.
In a possible implementation manner, the movement control response area and the skill release direction adjustment response area are located on the same side of the graphical user interface, and the movement control response area and the skill response area are respectively located on opposite sides of the graphical user interface.
For example, the movement control response area and the skill release direction adjustment response area are typically located on the left side of the graphical user interface and the skill response area is located on the right side of the graphical user interface. For the movement control response area and the skill release direction adjustment response area which are also positioned on the left side of the graphical user interface, the movement control response area may be positioned at a lower left position of the graphical user interface and the skill release direction adjustment response area may be positioned at an upper left position of the graphical user interface.
For example, referring to fig. 2 to 5, in the graphical user interface A1, the graphical user interface A2, the graphical user interface A3 and the graphical user interface A4, the movement control response area Y and the skill release direction adjustment response area T are each located on the left side of the graphical user interface, and the skill response area J is located on the right side of the graphical user interface.
In this way, when the second touch operation is triggered in the skill response area, the user or the player can conveniently trigger the third touch operation on the skill release direction adjustment response area. Moreover, because the movement control response area and the skill response area are located on opposite sides of the graphical user interface, the first touch operation on the movement control response area and the third touch operation on the skill release direction adjustment response area can be triggered with the left hand and the right hand respectively, so that the movement direction and the skill release direction of the virtual character can be adjusted simultaneously. Flexibility in controlling the virtual character is thereby ensured.
In one possible implementation, the graphical user interface includes a field of view adjustment control.
The virtual character control method further comprises the following steps:
And responding to a fourth touch operation aiming at the visual field adjusting control, adjusting the shooting direction of the virtual camera from a third camera direction to a fourth camera direction according to the fourth touch operation, and adjusting a third game visual field picture to a fourth game visual field picture.
The fourth touch operation may be a click operation, a slide operation, or a drag operation.
In the case where the fourth touch operation is a click operation, the angular difference between the third camera orientation and the fourth camera orientation may be fixed. That is, in this case, the shooting direction of the virtual camera can be rotated by a fixed angle in a fixed direction every time the view adjustment control is clicked. The embodiment of the present application is not limited thereto.
In the case where the fourth touch operation is a sliding operation or a drag operation, the angle difference between the third camera orientation and the fourth camera orientation may be determined according to the length between the operation start position and the operation end position of the sliding or drag operation. In general, the longer this length, the larger the angle difference between the third camera orientation and the fourth camera orientation may be; conversely, the shorter the length, the smaller the angle difference may be.
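The drag-length-to-angle mapping can be sketched as a linear scaling with a clamp; the scale factor and maximum rotation below are illustrative assumptions:

```python
def camera_rotation_for_drag(start, end, degrees_per_unit=0.25,
                             max_delta=180.0):
    """Map the drag length on the view adjustment control to the angle
    difference between the third and fourth camera orientations: the
    longer the drag, the larger the rotation, clamped to max_delta."""
    length = ((end[0] - start[0]) ** 2 + (end[1] - start[1]) ** 2) ** 0.5
    return min(length * degrees_per_unit, max_delta)
```

For a click operation, by contrast, a fixed rotation angle would be applied per tap, with no dependence on drag length.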
Optionally, the third camera orientation may refer to a default shooting orientation of the virtual camera in the virtual scene, may refer to a current shooting orientation of the virtual camera, and may refer to a shooting orientation of the virtual camera without performing a fourth touch operation on the view adjustment control.
The fourth camera orientation may refer to the orientation to which the fourth touch operation indicates the virtual camera should be adjusted.
Optionally, the third game view picture is a picture obtained by the virtual camera shooting the virtual scene according to the third camera orientation.
The fourth game view picture is a picture obtained by the virtual camera shooting the virtual scene according to the fourth camera orientation.
In another possible embodiment, the view adjustment control is located on a different side of the graphical user interface than the movement control area.
In particular, the view adjustment control may be a control displayed in the graphical user interface or may be a control hidden in the graphical user interface.
In another possible implementation manner, the direction of the virtual character can be adjusted simultaneously when the direction of the virtual camera is adjusted through the touch operation of the view adjustment control.
The virtual character control method further comprises the following steps:
and in response to target touch operation aiming at the visual field adjusting control, adjusting the virtual character from a preset character direction to a target character direction, controlling the virtual character to move from the preset character position according to the target character direction, and simultaneously adjusting the direction of the virtual camera from a fifth camera direction to a sixth camera direction and adjusting a fifth game visual field picture to a sixth game visual field picture.
Optionally, the fifth game view picture is a picture obtained by the virtual camera shooting the virtual scene according to the fifth camera orientation.
The sixth game view picture is a picture obtained by the virtual camera shooting the virtual scene according to the sixth camera orientation and the current character position to which the virtual character has moved.
It is noted that, when the second touch operation is in continuous touch and the virtual character is controlled to be maintained at the position of the virtual character in the virtual scene, the view adjustment control can adjust the orientation of the virtual camera, but can no longer control the virtual character to move in the game scene or change the position of the virtual character in the virtual scene.
Notably, when the player adjusts the field of view of the virtual character through the view adjustment control, the skill release direction follows the existing snap principle, i.e., it points in the direction the virtual character currently faces, rather than being controlled by a touch operation on the skill release direction adjustment response area.
For example, referring to fig. 6, in fig. 6, a view adjustment control S is also displayed in the graphical user interface A1, and the shooting direction of the virtual camera can be adjusted from the third camera direction to the fourth camera direction by clicking the view adjustment control S or dragging the view adjustment control S or sliding the view adjustment control S, so as to adjust the third game view screen to the fourth game view screen.
In one possible implementation, the functional control displayed when the second touch operation is in continuous operation may be the same control as the view adjustment control S, that is, the view adjustment function of the view adjustment control S is multiplexed, so that the view adjustment control S may change the camera orientation of the virtual camera even when the virtual character is in the power accumulating state.
In another possible implementation manner, the functionality control is a control added in addition to the controls displayed in the current graphical user interface A1. Although both the functionality control and the view adjustment control S can adjust the camera orientation of the virtual camera, they differ as follows: while the second touch operation is held, the functionality control responds to an operation by adjusting the camera orientation of the virtual camera and further determines the final skill release direction, whereas the view adjustment control S responds to an operation by adjusting the camera orientation of the virtual camera without affecting the skill release direction. When the camera orientation of the virtual camera is adjusted through the functionality control, the virtual character cannot respond to the player's operation on the movement control in the movement control area, that is, the movement position and character orientation of the virtual character cannot be adjusted and the virtual character cannot be controlled to move in the game scene. When the camera orientation of the virtual camera is adjusted through the view adjustment control S, the movement position and movement direction of the virtual character in the power accumulating state can still be adjusted through the movement control in the movement control area, that is, the virtual character can be controlled to move in the game scene.
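The division of labor between the functionality control and the view adjustment control S can be summarized in a small dispatch sketch; the source and flag names here are assumptions for illustration, not identifiers from the application:

```python
def handle_camera_adjust(source, charging):
    """Both controls rotate the camera, but only the functionality control
    also updates the skill release direction (while the character is
    charging), and only the view adjustment control S leaves movement
    control available during the power accumulating state."""
    effects = {"rotate_camera": True}
    if source == "functionality_control":
        effects["update_skill_release_direction"] = charging
        effects["movement_enabled"] = False
    elif source == "view_adjust_control":
        effects["update_skill_release_direction"] = False
        effects["movement_enabled"] = True
    return effects
```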
In one possible implementation, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
The virtual character control method further comprises the following steps:
and responding to the second touch operation aiming at the skill response area, and controlling and displaying the skill wheel disc control.
Optionally, the skills wheel control includes at least one child control.
Optionally, the sub-control may be configured to control the virtual character to execute the action corresponding to the sub-control in response to the fifth touch operation.
Optionally, the fifth touch operation is a continuous operation with the second touch operation.
In an embodiment of the present application, the fifth touch operation may be a click operation. When a click operation is received on a skill control, a plurality of sub-skill controls associated with the skill control can be displayed, and the virtual character is controlled to execute different game operations through the sub-skill controls. The click here may be a single click operation that is lifted within a preset time, or a press that is held without lifting beyond the preset time. In this embodiment, the fifth touch operation is continuous with the second touch operation; for example, the fifth touch operation may be a click operation continuous with a sliding operation.
In one possible implementation manner, when the duration of the fifth touch operation performed on the skill response area satisfies the first preset duration, the virtual character is controlled to be in a state in which it cannot respond to movement control and/or orientation control. For example, when the virtual character is in the power accumulating state and a skill release instruction has not been received, if the fifth touch operation is detected, the power accumulating state of the virtual character is interrupted and the virtual character is controlled to perform the game operation corresponding to the sub-skill control.
Under this control mechanism, a continuous operation cannot be performed on the skill response area to adjust the skill release direction during an attack. To solve this problem, the skill release direction adjustment response area provided in the embodiment of the present application adjusts the camera orientation of the virtual camera through a touch operation on that area, thereby changing the game view picture in the graphical user interface so that the position of the attack target can be better located during picture switching. The skill release direction is then determined according to the camera orientation of the virtual camera, so that the skill release direction can be adjusted during the attack, the accuracy of skill release is ensured, and the man-machine interaction efficiency is improved.
Referring to fig. 7, fig. 7 provides a graphical user interface A5. Suppose the second touch operation for the skill response area is a long press operation on the skill control G1 in the skill response area J, and the triggering duration of the second touch operation is greater than the first preset duration. As shown in fig. 7, the sub-controls N1, N2, N3 and N4 may be displayed in the graphical user interface A5. It can be seen that each sub-control is located around the skill control G1. In the example of fig. 7, the functions of the sub-controls N1, N2, N3 and N4 are jump longitudinal impact, evasion, squat transverse impact and knife vibration respectively.
The graphical user interface A5 provided in fig. 7 is only an example; the number of sub-controls is not limited to 4, and the functions of the sub-controls are not limited to those of the sub-control N1, the sub-control N2, the sub-control N3 and the sub-control N4.
In this way, more child controls can be displayed in the graphical user interface to control the actions or operations of the virtual characters under the condition that the second touch operation aiming at the skill response area is triggered, and therefore the effect of providing effective control assistance for the player can be achieved.
In one possible implementation manner, in response to a first touch operation for a movement control response area, controlling the virtual character to move in the virtual scene includes:
And adjusting the virtual character from the first character direction to the second character direction, controlling the virtual character to move from the first character position according to the second character direction, and adjusting the first game view field picture to a third game view field picture.
Alternatively, the first character position may refer to a position in the virtual scene where the virtual character is located before the movement occurs.
Optionally, the third game view picture is a picture obtained by the virtual camera shooting the virtual scene according to the first camera orientation and the current character position to which the virtual character has moved.
Thus, the movement of the virtual character in the virtual scene can be accurately controlled, and the game view picture can be adjusted in real time in the process of the movement of the virtual character.
Referring to fig. 8, fig. 8 provides a graphical user interface A6. As shown in fig. 8, a virtual object B1, a virtual character C, a hostile virtual character D of the virtual character C, a movement control response area Y, a skill response area J, a skill release direction adjustment response area T, a movement control P, a function control G1, a function control G2, a function control G3, a function control G4, a function control G5, a child control N1, a child control N2, a child control N3, a child control N4, and a view adjustment control S are displayed in the graphical user interface A6.
As shown in fig. 8, the skill controls G1 and G2, the functionality controls G3, G4 and G5, and the sub-controls N1, N2, N3 and N4 may all be located in the skill response area J.
The view adjustment control S and the skill response area J are both located on the right side of the graphical user interface A6, that is, the view adjustment control S is located on the same side of the graphical user interface A6 as the skill response area J. In contrast, the skill release direction adjustment response area T and the movement control response area Y are located on the left side of the graphical user interface A6, that is, on the side opposite to the view adjustment control S and the skill response area J.
As can be seen from fig. 8, the player can adjust the shooting direction of the virtual camera through the view adjustment control S with the right hand, can control display of each sub-control in the skill wheel control through a right-hand trigger operation on the skill control G1 in the skill response area J, and can adjust the skill release direction of the skill of the virtual character C through a left-hand trigger operation on the skill release direction adjustment response area T. Therefore, the view adjustment control S, the skill response area J and the skill release direction adjustment response area T can be triggered at the same time, further improving the man-machine interaction efficiency.
In a possible implementation manner, the virtual character control method further includes:
In response to a skill release instruction, a third character orientation of the virtual character is determined according to the second camera orientation, and the virtual character is controlled to be adjusted from the current character orientation to the third character orientation.
In a possible implementation manner, the virtual character control method further includes:
the current weapon equipment status of the virtual character is detected.
The weapon equipment statuses include: no weapon equipped, a melee weapon equipped, and a ranged weapon equipped.
In the event that the current weapon equipment status of the virtual character is no weapon equipped or a ranged weapon equipped, it is determined that the virtual character is not controlled to move the preset distance toward the first character orientation.
In addition, in the case that the current weapon equipment status of the virtual character is no weapon equipped or a ranged weapon equipped, it can be determined that the virtual character is not in the target state; the virtual character is then not controlled to trigger the special action or special skill of the target state, and in particular is not caused to trigger the wall-striking operation. In this way, the wall-striking operation is performed only when the virtual character is equipped with a melee weapon, and effective control assistance can thus be provided to the player.
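The weapon-state gate can be sketched as a simple predicate; the three state names are illustrative assumptions corresponding to the three statuses listed above:

```python
# Assumed labels for the three weapon equipment statuses.
UNEQUIPPED, MELEE, RANGED = "unequipped", "melee", "ranged"

def may_dash(weapon_state):
    """The wall-striking dash toward the first character orientation is
    performed only when a melee weapon is equipped; with no weapon or a
    ranged weapon the character is not in the target state."""
    return weapon_state == MELEE
```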
In one possible approach, if the target state is a wall climbing state, the virtual character may be controlled to perform a wall striking action in response to the skill release instruction.
If the target state is a climbing tree state, the virtual character can be controlled to execute wall striking action or top punching action in response to the skill releasing instruction.
In one possible approach, the wall-striking action may include a flick wall-striking action and a power-accumulating wall-striking action.
The virtual character control method may further include:
and determining the action attribute according to the duration of the second touch operation.
The action attributes include: action intensity and/or action type.
The action attribute may also indicate the value of the above-mentioned preset distance. That is, the distances moved by the flick wall-striking action and the power-accumulating wall-striking action may be the same or different. The embodiment of the present application is not limited thereto.
The action intensity and/or the action type may indicate whether the wall-striking action is a flick wall-striking action or a power-accumulating wall-striking action.
Generally, if the duration of the second touch operation is greater than a preset time threshold, the wall-striking action may be determined as a power-accumulating wall-striking action; otherwise, it may be determined as a flick wall-striking action. The embodiment of the present application is not limited thereto.
And responding to the skill release instruction, controlling the virtual character to execute the action which corresponds to the target state and has the action attribute in the skill release direction.
In addition, in the process in which the virtual character performs the corresponding top-punching action, flick wall-striking action or power-accumulating wall-striking action, if any hostile virtual character is encountered or contacted, the virtual character may be controlled to attack the hostile virtual character with the action which corresponds to the target state and has the action attribute.
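The duration-based determination of the action attribute can be sketched as follows; the charge threshold and the two preset distances are illustrative assumptions, since the application does not fix these values:

```python
CHARGE_THRESHOLD = 1.0  # assumed threshold in seconds

def wall_strike_attribute(hold_seconds, threshold=CHARGE_THRESHOLD):
    """Determine the action attribute of the wall-striking action from the
    duration of the second touch operation: past the threshold it is a
    power-accumulating strike (here given a longer dash distance);
    otherwise it is a flick strike."""
    if hold_seconds > threshold:
        return {"type": "power_accumulating", "preset_distance": 8.0}
    return {"type": "flick", "preset_distance": 4.0}
```

The returned attribute can then drive both the animation selected for the skill release and the preset distance used for the dash.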
The following describes a device, equipment, a computer readable storage medium, etc. for executing the virtual character control method provided by the present application, and specific implementation processes and technical effects thereof are referred to above, and are not described in detail below.
Fig. 9 is a schematic structural diagram of a virtual character control device according to an embodiment of the present application, referring to fig. 9, the device includes:
a first control module 201 for controlling the virtual character to climb to a side surface of a virtual object in the virtual scene.
The movement control module 202 is configured to control the virtual character to move in the virtual scene in response to a first touch operation for a movement control response area.
The second control module 203 is configured to respond to a second touch operation for the skill response area, respond to a third touch operation for adjusting the skill release direction in a case where the second touch operation is in continuous touch, and adjust a shooting direction of the virtual camera from a first camera direction to a second camera direction and adjust the first game view screen to a second game view screen according to the third touch operation in a case where the virtual character is controlled to be maintained at a position of the virtual character in the virtual scene.
The second game view screen is a screen determined by the virtual camera shooting the virtual scene according to the second camera orientation.
The skill release control module 204 is configured to determine a skill release direction according to the second camera orientation in response to a skill release instruction, adjust the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and control the virtual character to move a preset distance towards the first character orientation, wherein the preset character orientation is different from the first character orientation.
In one possible implementation, referring to fig. 10, the apparatus further includes an attack control module 205. The attack control module 205 is configured to control the virtual character to attack a hostile virtual character in response to the hostile virtual character being encountered on the path along which the virtual character moves the preset distance.
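The skill release step can be illustrated as: derive the release direction from the second camera orientation, turn the character to face it, then displace the character the preset distance. The yaw-angle representation and all names here are assumptions for illustration only.

```python
import math

def release_skill(char_pos, camera_yaw_deg, preset_distance):
    """Sketch of the skill release control module's behavior.

    The skill release direction is taken from the camera's yaw (in degrees),
    the character's orientation is set to that direction, and the character
    is moved `preset_distance` along it. 2-D top-down simplification.
    """
    yaw = math.radians(camera_yaw_deg)
    direction = (math.cos(yaw), math.sin(yaw))   # skill release direction
    new_pos = (char_pos[0] + direction[0] * preset_distance,
               char_pos[1] + direction[1] * preset_distance)
    new_char_yaw_deg = camera_yaw_deg            # first character orientation
    return new_pos, new_char_yaw_deg
```

In an actual engine the displacement would be animated over several frames rather than applied instantly; the function only captures the geometry.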
Optionally, the movement control response area is a response area corresponding to a movement control located in the graphical user interface.
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
Optionally, the second control module 203 is further configured to display a functionality control on the graphical user interface in response to a second touch operation for the skill response area.
Optionally, the skill release direction adjustment area is a response area corresponding to the functionality control located in the graphical user interface.
The second touch operation is an operation whose duration meets a first preset duration.
Optionally, the second control module 203 is further configured to control generation of a skill release direction adjustment response area in response to the duration of the second touch operation for the skill response area meeting the first preset duration.
Optionally, the second control module 203 is further configured to control display of a functionality control on the graphical user interface, where the skill release direction adjustment response area is a response area corresponding to the functionality control.
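The long-press gate that triggers generation of the adjustment area reduces to a simple duration comparison. The 500 ms threshold is an assumed value; the patent only requires "a first preset duration".

```python
def direction_adjust_area_ready(press_start_ms, now_ms, first_preset_ms=500):
    """Return True once the second touch operation has been held long enough
    that the skill release direction adjustment response area (the
    functionality control) should be generated. Threshold is illustrative."""
    return now_ms - press_start_ms >= first_preset_ms
```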
Optionally, the movement control response area and the skill release direction adjustment response area are located on the same side of the graphical user interface, and the movement control response area and the skill response area are located on opposite sides of the graphical user interface, respectively.
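Routing a touch point to one of these response areas is a rectangle hit test. The layout below (movement and direction-adjust areas on the left, skill area on the opposite side of a 1280x720 interface) and all rectangle sizes are made-up illustrative values consistent with the arrangement just described.

```python
def hit_test(point, response_areas):
    """Return the name of the first response area containing `point`.

    `response_areas` maps a control name to an (x, y, w, h) rectangle;
    this is an illustrative data model, not the patent's actual one.
    """
    x, y = point
    for name, (rx, ry, rw, rh) in response_areas.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return name
    return None

# Assumed layout: movement and skill-direction-adjust areas share the left
# side; the skill area sits on the opposite (right) side of the interface.
AREAS = {
    "movement": (40, 400, 220, 220),
    "skill_direction_adjust": (40, 120, 220, 220),
    "skill": (1020, 400, 220, 220),
}
```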
In one possible approach, referring to fig. 11, the apparatus further includes a field of view adjustment module 206.
The view adjustment module 206 is configured to adjust, in response to a fourth touch operation for the view adjustment control, a shooting direction of the virtual camera from a third camera direction to a fourth camera direction according to the fourth touch operation, and adjust a third game view screen to a fourth game view screen.
Optionally, the third game view screen is a screen determined by the virtual camera shooting the virtual scene according to the third camera orientation.
Optionally, the fourth game view screen is a screen determined by the virtual camera shooting the virtual scene according to the fourth camera orientation.
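The fourth touch operation's effect can be sketched as mapping a drag delta to a new camera orientation. The sensitivity factor and pitch clamp range are illustrative assumptions; the patent only states that the shooting orientation changes according to the touch operation.

```python
def adjust_camera(yaw_deg, pitch_deg, drag_dx, drag_dy, sensitivity=0.2,
                  pitch_limits=(-80.0, 80.0)):
    """Map a touch drag (in pixels) to a new camera orientation in degrees.

    Horizontal drag rotates yaw (wrapped to [0, 360)); vertical drag tilts
    pitch, clamped so the camera cannot flip over. All constants assumed.
    """
    new_yaw = (yaw_deg + drag_dx * sensitivity) % 360.0
    new_pitch = max(pitch_limits[0],
                    min(pitch_limits[1], pitch_deg - drag_dy * sensitivity))
    return new_yaw, new_pitch
```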
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
Optionally, the second control module 203 is further configured to control displaying the skill wheel control in response to a second touch operation for the skill response area.
The skill wheel control includes at least one child control. The child control is configured to respond to the fifth touch operation and control the virtual character to execute the action corresponding to the child control.
The fifth touch operation is continuous with the second touch operation.
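Selecting a sub-control on such a wheel is commonly done by the angle of the finger at release, with a central dead zone to cancel. The even angular division, the dead-zone radius, and the sub-control names below are all illustrative assumptions.

```python
import math

def select_wheel_action(center, release_point, actions, dead_zone=30.0):
    """Pick the skill-wheel sub-control under the finger when the continuous
    touch (the fifth touch operation) is released.

    The wheel divides 360 degrees evenly among `actions`; releasing within
    `dead_zone` pixels of the wheel center selects nothing.
    """
    dx = release_point[0] - center[0]
    dy = release_point[1] - center[1]
    if math.hypot(dx, dy) < dead_zone:
        return None                      # released inside the dead zone
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    sector = 360.0 / len(actions)
    return actions[int(angle // sector)]
```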
Optionally, the movement control module 202 is further configured to adjust the virtual character from a first character orientation to a second character orientation, control the virtual character to move from the first character position according to the second character orientation, and adjust the first game view screen to a third game view screen, where the third game view screen is a screen determined by the virtual camera shooting the virtual scene according to the first camera orientation and the current character position to which the virtual character has moved.
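A view screen that keeps the camera orientation but tracks the character's current position is a standard follow camera. The offset distances below are illustrative assumptions used only to show how the camera position is derived from the two inputs the passage names.

```python
import math

def follow_camera_position(char_pos, camera_yaw_deg, distance=6.0, height=3.0):
    """Place the virtual camera behind the character for a follow view.

    Inputs mirror the text above: the (kept) first camera orientation and
    the character's current position. Offset values are assumptions.
    """
    yaw = math.radians(camera_yaw_deg)
    # Sit `distance` units behind the character along the camera's yaw.
    cam_x = char_pos[0] - distance * math.cos(yaw)
    cam_y = char_pos[1] - distance * math.sin(yaw)
    return (cam_x, cam_y, height)
```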
Optionally, the second control module 203 is further configured to respond to a second touch operation for the skill response area, respond to a third touch operation for adjusting the skill release direction in a case where the second touch operation is in continuous touch, and adjust a shooting orientation of the virtual camera from the first camera orientation to the second camera orientation according to the third touch operation in a case where the virtual character is controlled to maintain a current character orientation of the virtual character in the virtual scene.
Optionally, the skill release control module 204 is further configured to determine a third character orientation of the virtual character according to the second camera orientation in response to the skill release instruction, and control the virtual character to be adjusted from the current character orientation to the third character orientation.
In one possible manner, referring to fig. 12, the apparatus further comprises a detection module 207.
The detection module 207 is configured to detect the current weapon equipment state of the virtual character, the weapon equipment state including unarmed, equipped with a near-combat weapon, and equipped with a remote weapon.
In the event that the current weapon equipment state of the virtual character is unarmed or equipped with a remote weapon, it is determined that the virtual character is not controlled to move the preset distance towards the first character orientation.
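The weapon-state gate amounts to an enumeration plus one predicate: only a character holding a near-combat (melee) weapon performs the preset-distance move. The enum and function names are illustrative.

```python
from enum import Enum, auto

class WeaponState(Enum):
    UNARMED = auto()   # no weapon equipped
    MELEE = auto()     # near-combat weapon equipped
    RANGED = auto()    # remote weapon equipped

def may_dash_with_skill(state: WeaponState) -> bool:
    """Gate described above: the preset-distance move is suppressed when the
    character is unarmed or holding a remote (ranged) weapon."""
    return state is WeaponState.MELEE
```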
The foregoing apparatus is used for executing the method provided in the foregoing embodiment, and its implementation principle and technical effects are similar, and are not described herein again.
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (Application Specific Integrated Circuit, ASIC), one or more microprocessors, or one or more field programmable gate arrays (Field Programmable Gate Array, FPGA), etc. For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (Central Processing Unit, CPU) or another processor capable of invoking the program code. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 13 is a schematic structural diagram of a computer device according to an embodiment of the present application. Referring to fig. 13, the computer device includes a memory 301 and a processor 302; the memory 301 stores a computer program executable on the processor 302, and the processor 302 implements the steps of any of the method embodiments described above when executing the computer program. The computer device may be the aforementioned terminal device or server, which is not limited in the embodiments of the present application.
Optionally, the processor 302 may be used to control the virtual character to climb to a side surface of a virtual object in the virtual scene.
Optionally, the processor 302 may be configured to control the virtual character to move in the virtual scene in response to a first touch operation for a movement control response area.
Optionally, the processor 302 may be configured to: respond to a second touch operation for the skill response area; while the second touch operation remains in continuous touch, respond to a third touch operation for adjusting the skill release direction; and, while the virtual character is controlled to stay at its position in the virtual scene, adjust the shooting orientation of the virtual camera from the first camera orientation to the second camera orientation according to the third touch operation and adjust the first game view screen to the second game view screen.
The second game view screen is a screen determined by the virtual camera shooting the virtual scene according to the second camera orientation.
Optionally, the processor 302 may be configured to determine a skill release direction according to the second camera orientation in response to the skill release instruction, adjust the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and control the virtual character to move a preset distance towards the first character orientation, wherein the preset character orientation is different from the first character orientation.
Alternatively, the processor 302 may be configured to control the virtual character to attack the hostile virtual character in response to encountering the hostile virtual character on a path traversed by the virtual character to move the preset distance.
Optionally, the movement control response area is a response area corresponding to a movement control located in the graphical user interface.
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
Optionally, the processor 302 may be configured to display a functionality control on the graphical user interface in response to a second touch operation for the skill response area.
Optionally, the skill release direction adjustment area is a response area corresponding to the functionality control located in the graphical user interface.
The second touch operation is an operation meeting a first preset duration.
Optionally, the processor 302 may be configured to control generation of the skill release direction adjustment response area in response to the duration of the second touch operation for the skill response area meeting the first preset duration.
Optionally, the processor 302 may be configured to control displaying the functionality control on the graphical user interface, where the skill release direction adjustment response area is a response area corresponding to the functionality control.
Optionally, the movement control response area and the skill release direction adjustment response area are located on the same side of the graphical user interface, and the movement control response area and the skill response area are located on opposite sides of the graphical user interface, respectively.
Optionally, the processor 302 may be configured to adjust, in response to a fourth touch operation for the view adjustment control, the shooting orientation of the virtual camera from the third camera orientation to the fourth camera orientation according to the fourth touch operation, and adjust the third game view screen to the fourth game view screen.
Optionally, the third game view screen is a screen determined by the virtual camera shooting the virtual scene according to the third camera orientation.
Optionally, the fourth game view screen is a screen determined by the virtual camera shooting the virtual scene according to the fourth camera orientation.
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
Optionally, the processor 302 may be configured to control displaying the skill wheel control in response to a second touch operation for the skill response area.
The skill wheel control includes at least one child control. The child control is configured to respond to the fifth touch operation and control the virtual character to execute the action corresponding to the child control.
The fifth touch operation is continuous with the second touch operation.
Optionally, the processor 302 may be configured to adjust the virtual character from a first character orientation to a second character orientation, control the virtual character to move from the first character position according to the second character orientation, and adjust the first game view screen to a third game view screen, where the third game view screen is a screen determined by the virtual camera shooting the virtual scene according to the first camera orientation and the current character position to which the virtual character has moved.
Optionally, the processor 302 may be configured to respond to a second touch operation for the skill response area, respond to a third touch operation for adjusting the skill release direction in a case where the second touch operation is in continuous touch, and adjust a shooting orientation of the virtual camera from the first camera orientation to the second camera orientation according to the third touch operation in a case where the virtual character is controlled to maintain a current character orientation of the virtual character in the virtual scene.
Optionally, the processor 302 may be configured to determine a third character orientation of the virtual character according to the second camera orientation in response to the skill release instruction, and control the virtual character to be adjusted from the current character orientation to the third character orientation.
Optionally, the processor 302 may be configured to detect the current weapon equipment state of the virtual character, the weapon equipment state including unarmed, equipped with a near-combat weapon, and equipped with a remote weapon.
In the event that the current weapon equipment state of the virtual character is unarmed or equipped with a remote weapon, it is determined that the virtual character is not controlled to move the preset distance towards the first character orientation.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, performs the steps of the respective method embodiments described above.
Optionally, the processor may be configured to control the virtual character to climb to a side surface of a virtual object in the virtual scene.
Optionally, the processor may be configured to control the virtual character to move in the virtual scene in response to a first touch operation for a movement control response area.
Optionally, the processor may be configured to: respond to a second touch operation for a skill response area; while the second touch operation remains in continuous touch, respond to a third touch operation for adjusting the skill release direction; and, while the virtual character is controlled to stay at its position in the virtual scene, adjust the shooting orientation of the virtual camera from a first camera orientation to a second camera orientation according to the third touch operation and adjust the first game view screen to a second game view screen.
The second game view screen is a screen determined by the virtual camera shooting the virtual scene according to the second camera orientation.
Optionally, the processor may be configured to determine a skill release direction according to the second camera orientation in response to a skill release instruction, adjust the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and control the virtual character to move a preset distance towards the first character orientation, wherein the preset character orientation is different from the first character orientation.
Optionally, the processor may be configured to control the virtual character to attack the hostile virtual character in response to encountering the hostile virtual character on a path traversed by the virtual character moving the preset distance.
Optionally, the movement control response area is a response area corresponding to a movement control located in the graphical user interface.
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
Optionally, the processor may be configured to display a functionality control on the graphical user interface in response to a second touch operation for the skill response area.
Optionally, the skill release direction adjustment area is a response area corresponding to the functionality control located in the graphical user interface.
The second touch operation is an operation meeting a first preset duration.
Optionally, the processor may be configured to control generation of the skill release direction adjustment response area in response to a duration of the second touch operation for the skill response area meeting a first preset duration.
Optionally, the processor may be configured to control displaying a functionality control on the graphical user interface, and the skill release direction adjustment response area is a response area corresponding to the functionality control.
Optionally, the movement control response area and the skill release direction adjustment response area are located on the same side of the graphical user interface, and the movement control response area and the skill response area are located on opposite sides of the graphical user interface, respectively.
Optionally, the processor may be configured to adjust a shooting orientation of the virtual camera from a third camera orientation to a fourth camera orientation according to a fourth touch operation for the view adjustment control, and adjust a third game view to a fourth game view.
Optionally, the third game view screen is a screen determined by the virtual camera shooting the virtual scene according to the third camera orientation.
Optionally, the fourth game view screen is a screen determined by the virtual camera shooting the virtual scene according to the fourth camera orientation.
Optionally, the skill response area is a response area corresponding to a skill control located on the graphical user interface.
Optionally, the processor may be configured to control the display of the skill wheel control in response to a second touch operation for the skill response area.
The skill wheel control includes at least one child control. The child control is configured to respond to the fifth touch operation and control the virtual character to execute the action corresponding to the child control.
The fifth touch operation is continuous with the second touch operation.
Optionally, the processor may be configured to adjust the virtual character from a first character orientation to a second character orientation, control the virtual character to move from the first character position according to the second character orientation, and adjust the first game view screen to a third game view screen, where the third game view screen is a screen determined by the virtual camera shooting the virtual scene according to the first camera orientation and the current character position to which the virtual character has moved.
Optionally, the processor may be configured to respond to a second touch operation for the skill response area, respond to a third touch operation for adjusting the skill release direction in a case where the second touch operation is in continuous touch, and adjust a shooting orientation of the virtual camera from a first camera orientation to a second camera orientation according to the third touch operation in a case where the virtual character is controlled to maintain a current character orientation of the virtual character in the virtual scene.
Optionally, the processor may be configured to determine a third character orientation of the virtual character according to the second camera orientation in response to the skill release instruction, and control the virtual character to be adjusted from the current character orientation to the third character orientation.
Optionally, the processor may be configured to detect the current weapon equipment state of the virtual character, the weapon equipment state including unarmed, equipped with a near-combat weapon, and equipped with a remote weapon.
In the event that the current weapon equipment state of the virtual character is unarmed or equipped with a remote weapon, it is determined that the virtual character is not controlled to move the preset distance towards the first character orientation.
Optionally, the present application also provides a program product, such as a computer readable storage medium, comprising a program for performing any of the virtual character control method embodiments described above when executed by a processor.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units as described above may be stored in a computer-readable storage medium. The software functional units are stored in a storage medium and include several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, etc.
The foregoing is merely illustrative of specific embodiments of the present application, and the present application is not limited thereto. Any changes or substitutions that can readily be conceived by those skilled in the art within the technical scope disclosed by the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, but various modifications and variations can be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (17)

1. A virtual character control method is characterized in that a graphical user interface of a game is provided through a first terminal device, wherein the content displayed by the graphical user interface comprises a virtual scene, and the virtual scene comprises a virtual character; the virtual character control method comprises the following steps:
controlling the virtual character to climb to a side surface of a virtual object in the virtual scene;
responding to a first touch operation aiming at a movement control response area, and controlling the virtual character to move in the virtual scene;
responding to a second touch operation aiming at a skill response area, responding to a third touch operation aiming at a skill release direction adjustment response area when the second touch operation is in continuous touch, and, when the virtual character is controlled to be maintained at the position of the virtual character in the virtual scene, adjusting the shooting direction of a virtual camera from a first camera direction to a second camera direction according to the third touch operation and adjusting a first game view screen to a second game view screen, wherein the second game view screen is a screen determined by the virtual camera shooting the virtual scene according to the second camera direction;
and responding to a skill release instruction, determining a skill release direction according to the second camera orientation, adjusting the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and controlling the virtual character to move a preset distance towards the first character orientation, wherein the preset character orientation is different from the first character orientation.
2. The virtual character control method as claimed in claim 1, wherein the method further comprises:
and controlling, in response to a hostile virtual character being encountered on the path along which the virtual character moves the preset distance, the virtual character to attack the hostile virtual character.
3. The virtual character control method according to claim 1, wherein the movement control response area is a response area corresponding to a movement control located on the graphical user interface.
4. The virtual character control method according to claim 1, wherein the skill response area is a response area corresponding to a skill control located on the graphical user interface.
5. The virtual character control method according to claim 1, wherein the virtual character control method further comprises:
and responding to a second touch operation aiming at the skill response area, and displaying a function control on the graphical user interface.
6. The virtual character control method according to claim 5, wherein the skill release direction adjustment area is a response area corresponding to the function control located in the graphical user interface;
the second touch operation is an operation meeting a first preset duration.
7. The virtual character control method as claimed in claim 5, wherein the virtual character control method further comprises:
And responding to the duration of the second touch operation aiming at the skill response area to meet the first preset duration, and controlling to generate a skill release direction adjustment response area.
8. The virtual character control method according to claim 7, wherein the control generating a skill release direction adjustment response area includes:
and controlling the display of the functional control on the graphical user interface, wherein the skill release direction adjustment response area is a response area corresponding to the functional control.
9. The virtual character control method according to claim 1, wherein the movement control response area and the skill release direction adjustment response area are located on the same side of the graphical user interface, and the movement control response area and the skill response area are located on opposite sides of the graphical user interface, respectively.
10. The virtual character control method according to claim 1, wherein the graphical user interface includes a view adjustment control;
the virtual character control method further comprises the steps of:
and in response to a fourth touch operation for the visual field adjustment control, adjusting the shooting direction of the virtual camera from a third camera direction to a fourth camera direction according to the fourth touch operation, and adjusting a third game visual field picture to a fourth game visual field picture, wherein the third game visual field picture is a picture determined by the virtual camera shooting the virtual scene according to the third camera direction, and the fourth game visual field picture is a picture determined by the virtual camera shooting the virtual scene according to the fourth camera direction.
11. The virtual character control method according to claim 1, wherein the skill response area is a response area corresponding to a skill control located on the graphical user interface;
the virtual character control method further comprises the steps of:
and responding to a second touch operation aiming at the skill response area, controlling and displaying a skill wheel control, wherein the skill wheel control comprises at least one sub-control, and the sub-control is configured to respond to a fifth touch operation to control the virtual character to execute actions corresponding to the sub-control, wherein the fifth touch operation is continuous with the second touch operation.
12. The virtual character control method according to claim 1, wherein the controlling the virtual character to move in the virtual scene in response to the first touch operation for the movement control response area comprises:
and adjusting the virtual character from a first character orientation to a second character orientation, controlling the virtual character to move from a first character position according to the second character orientation, and adjusting the first game view field picture to a third game view field picture, wherein the third game view field picture is a picture determined by the virtual camera shooting the virtual scene according to the first camera orientation and the current character position to which the virtual character has moved.
13. The virtual character control method according to claim 1, wherein the virtual character control method further comprises:
and in response to a skill release instruction, determining a third angular orientation of the virtual character according to the second camera orientation, and controlling the virtual character to be adjusted from the current character orientation to the third angular orientation.
14. The virtual character control method according to any one of claims 1 to 13, wherein the virtual character control method further comprises:
detecting a current weapon equipment state of the virtual character, the weapon equipment state including unarmed, equipped with a near-combat weapon, and equipped with a remote weapon;
and if the current weapon equipment state of the virtual character is unarmed or equipped with a remote weapon, determining not to control the virtual character to move the preset distance towards the first character orientation.
15. A virtual character control apparatus, the apparatus comprising:
a first control module, configured to control the virtual character to climb onto a side surface of a virtual object in the virtual scene;
a movement control module, configured to control, in response to a first touch operation on a movement control response area, the virtual character to move in the virtual scene;
a second control module, configured to: in response to a second touch operation on a skill response area, and while the second touch operation remains a continuous touch, respond to a third touch operation for adjusting a skill release direction; and, while keeping the virtual character at its current position in the virtual scene, adjust the shooting orientation of the virtual camera from a first camera orientation to a second camera orientation according to the third touch operation, and adjust a first game view picture to a second game view picture, the second game view picture being a picture obtained by the virtual camera shooting the virtual scene according to the second camera orientation;
and a skill release control module, configured to: in response to a skill release instruction, determine a skill release direction according to the second camera orientation, adjust the virtual character from a preset character orientation to a first character orientation according to the skill release direction, and control the virtual character to move a preset distance toward the first character orientation, wherein the preset character orientation is different from the first character orientation.
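The skill release control module ties the preceding steps together: the release direction follows the second camera orientation, the character turns from its preset orientation to that direction, and then moves the preset distance along it. A hypothetical sketch of that sequence (function and parameter names are assumptions, not from the patent; y-up, yaw measured from +z):

```python
import math

def release_skill(character_pos, preset_yaw, camera_yaw, preset_distance):
    """Sketch of the skill release control module.

    The skill release direction (first_yaw) is taken from the second
    camera orientation; the character is turned from preset_yaw to
    first_yaw, then moved preset_distance along that direction.
    Returns the new facing and the new position.
    """
    first_yaw = camera_yaw % 360.0  # skill release direction
    rad = math.radians(first_yaw)
    # Unit step on the ground plane for this yaw (y-up convention).
    dx, dz = math.sin(rad), math.cos(rad)
    x, y, z = character_pos
    new_pos = (x + dx * preset_distance, y, z + dz * preset_distance)
    return first_yaw, new_pos

# Character faces away from the camera (preset_yaw=180), camera looks
# along +z (yaw 0): the character snaps to yaw 0 and steps 2 units forward.
yaw, pos = release_skill((0.0, 0.0, 0.0), preset_yaw=180.0,
                         camera_yaw=0.0, preset_distance=2.0)
print(yaw)  # 0.0
print(pos)  # (0.0, 0.0, 2.0)
```

Note that the camera orientation, not the character's preset orientation, determines the move direction, which matches the claim's requirement that the preset character orientation differs from the first character orientation.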
16. A computer device, comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor, when executing the computer program, implements the steps of the virtual character control method according to any one of claims 1 to 14.
17. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the virtual character control method according to any one of claims 1 to 14.
CN202210558049.3A — Virtual character control method, device and computer equipment (filed 2022-05-19, status Pending)

Priority Applications (1)

Application Number: CN202210558049.3A; Priority Date / Filing Date: 2022-05-19; Title: Virtual character control method, device and computer equipment

Publications (1)

Publication Number: CN117122906A; Publication Date: 2023-11-28

Family ID: 88851463

Country Status (1)

Country: CN; Link: CN117122906A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination