CN113546403A - Role control method, role control device, terminal and computer readable storage medium - Google Patents

Role control method, role control device, terminal and computer readable storage medium

Info

Publication number
CN113546403A
CN113546403A (application number CN202110872984.2A)
Authority
CN
China
Prior art keywords
behavior
area
type
game
basic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110872984.2A
Other languages
Chinese (zh)
Inventor
陈哲
戴一鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110872984.2A priority Critical patent/CN113546403A/en
Publication of CN113546403A publication Critical patent/CN113546403A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a role control method, a role control device, a terminal and a computer readable storage medium. The embodiment of the application provides a behavior control, which comprises a first type basic behavior area, a second type basic behavior area, and a target behavior area overlapping part of the first type basic behavior area and part of the second type basic behavior area; the first type basic behavior area and the second type basic behavior area correspond to different game behaviors, and the game behavior corresponding to the target behavior area is the same as that of the first type basic behavior area. A touch position can be determined in response to a touch operation on the behavior control; when the touch position is in the area where the target behavior area overlaps the second type basic behavior area, the game object is controlled to execute the game behavior of the first type basic behavior area; and when the touch position is in the first type basic behavior area, the game object is controlled to execute the game behavior corresponding to the first type basic behavior area. The scheme can improve the accuracy of role control.

Description

Role control method, role control device, terminal and computer readable storage medium
Technical Field
The application relates to the technical field of games, in particular to a role control method, a role control device, a role control terminal and a computer readable storage medium.
Background
In electronic games (especially mobile games), game play is complex and diversified, so a user who wants to operate a game character to perform certain game behaviors needs to carry out fairly detailed operations. For example, peek (a game term) means that, in a shooting game, a player hides behind a shelter and quickly leans a part of the character's body out and back again to observe the enemy in front of the shelter, so that the player can attack enemies while reducing the probability of being killed, since only part of the body is exposed outside the shelter.
Because such complex and delicate operations require the player to control the virtual joystick both accurately and quickly, current character control methods make it difficult for the player to carry them out accurately.
Disclosure of Invention
The embodiment of the application provides a role control method, a role control device, a terminal and a computer readable storage medium, which can improve the accuracy of role control.
The embodiment of the application provides a role control method, a terminal provides a graphical user interface, content displayed by the graphical user interface at least partially comprises a game scene and a game object therein, the graphical user interface provides a behavior control, the behavior control comprises a first type basic behavior region, a second type basic behavior region and a target behavior region overlapped with part of the first type basic behavior region and the second type basic behavior region, the first type basic behavior region and the second type basic behavior region correspond to different game behaviors, and the game behavior corresponding to the target behavior region is the same as that of the first type basic behavior region, the method comprises the following steps:
responding to the touch operation of the behavior control, and determining the touch position of the touch operation;
when the touch position is in the area where the target behavior area is overlapped with the second type of basic behavior area, controlling the game object to execute the game behavior corresponding to the target behavior area and identical to the game behavior of the first type of basic behavior area;
and when the touch position is in the first type basic behavior area, controlling the game object to execute the game behavior corresponding to the first type basic behavior area.
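For illustration only, the following is a minimal Python sketch of how the dispatch just described could be organized; the Region class, its contains() test and the behavior values are assumptions made for this sketch, not the patent's implementation.

```python
# A minimal sketch, for illustration only, of the dispatch described above.
# The Region class, its contains() test and the behavior values are
# assumptions made for this sketch, not the patent's implementation.

class Region:
    """A behavior-control region that can test whether a point lies in it."""

    def __init__(self, contains_fn, behavior=None):
        self._contains_fn = contains_fn
        self.behavior = behavior  # game behavior bound to this region

    def contains(self, point):
        return self._contains_fn(point)


def dispatch_touch(point, first_type_region, second_type_region, target_region):
    """Return the game behavior to execute for a touch at `point`."""
    # The target region overrides the underlying second-type region: even if
    # the touch also falls in the second-type region, the first-type behavior
    # is executed.
    if target_region.contains(point) and second_type_region.contains(point):
        return first_type_region.behavior
    if first_type_region.contains(point):
        return first_type_region.behavior
    if second_type_region.contains(point):
        return second_type_region.behavior
    return None  # touch outside all behavior regions
```

In this sketch, a touch in the overlap of the target region and the second type basic behavior region therefore yields the first-type behavior, which is the priority described in the steps above.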
An embodiment of the present application further provides a role control apparatus, where a terminal provides a graphical user interface, where content displayed on the graphical user interface at least partially includes a game scene and a game object therein, the graphical user interface provides a behavior control, the behavior control includes a first type basic behavior region, a second type basic behavior region, and a target behavior region overlapping with a part of the first type basic behavior region and the second type basic behavior region, the first type basic behavior region and the second type basic behavior region correspond to different game behaviors, and a game behavior corresponding to the target behavior region is the same as a game behavior of the first type basic behavior region, the apparatus includes:
the touch control unit is used for responding to the touch control operation of the behavior control and determining the touch control position of the touch control operation;
the target unit is used for controlling the game object to execute the game behavior which corresponds to the target behavior area and is the same as the game behavior of the first type basic behavior area when the touch position is in the area where the target behavior area is overlapped with the second type basic behavior area;
and the first unit is used for controlling the game object to execute the game behavior corresponding to the first type basic behavior area when the touch position is in the first type basic behavior area.
In some embodiments, the first type basic behavior region includes a first type basic behavior first sub-region and a first type basic behavior second sub-region, the game behaviors include a first behavior and a second behavior, the game behaviors corresponding to the first type basic behavior first sub-region include the first behavior, and the game behaviors corresponding to the first type basic behavior second sub-region include the second behavior.
In some embodiments, the target behavior region includes a first target behavior sub-region and a second target behavior sub-region, and when the touch position is in a region where the target behavior region overlaps with the second type of basic behavior region, the target unit includes:
the target first subunit is used for controlling the game object to execute the first behavior when the touch position is in the area where the first sub-region of the target behavior overlaps the second type basic behavior area;
and the target second subunit is used for controlling the game object to execute the second behavior when the touch position is in the region where the second sub-region of the target behavior is overlapped with the second type of basic behavior region.
In some embodiments, the graphical user interface further includes an adjacent region formed by radiating outward from a preset position in the behavior control, where the adjacent region is adjacent to the first type of basic behavior region, and the apparatus further includes:
and the adjacent unit is used for controlling the game object to execute the game behavior corresponding to the first type of basic behavior area when the touch position is in the adjacent area.
In some embodiments, the preset position is a center position of the behavior control.
In some embodiments, the neighboring region includes a first neighboring sub-region and a second neighboring sub-region, the first neighboring sub-region is adjacent to a first sub-region of the first type of base behavior in the behavior control, the second neighboring sub-region is adjacent to a second sub-region of the first type of base behavior in the behavior control, and the neighboring unit is configured to:
when the touch position is in a first adjacent sub-area, controlling the game object to execute a first action;
and when the touch position is in a second adjacent subarea, controlling the game object to execute a second action.
In some embodiments, the game behavior includes a behavior of moving in a moving direction, the moving direction includes a first type moving direction and a second type moving direction, the game behavior corresponding to the first type basic behavior first sub-region includes moving in the first type moving direction, and the game behavior corresponding to the first type basic behavior second sub-region includes moving in the second type moving direction.
In some embodiments, the first type of movement direction comprises a left direction and the second type of movement direction comprises a right direction.
In some embodiments, the apparatus further comprises:
and the second unit is used for controlling the game object to execute the game behavior corresponding to the second type basic behavior area when the touch position is in the area of the second type basic behavior area which is not overlapped with the target behavior area.
In some embodiments, the second type basic behavior region includes a plurality of second type basic behavior sub-regions, and the game behavior corresponding to each second type basic behavior sub-region is different.
In some embodiments, the second type basic behavior region includes a second type basic behavior first sub-region, a second type basic behavior second sub-region, a second type basic behavior third sub-region, a second type basic behavior fourth sub-region, a second type basic behavior fifth sub-region and a second type basic behavior sixth sub-region;
the game behavior corresponding to the first sub-region includes moving forward, the game behavior corresponding to the second sub-region includes moving forward to the left, the game behavior corresponding to the third sub-region includes moving forward to the right, the game behavior corresponding to the fourth sub-region includes moving backward, the game behavior corresponding to the fifth sub-region includes moving backward to the left, and the game behavior corresponding to the sixth sub-region includes moving backward to the right.
In some embodiments, the behavior control further includes a non-response region, and the non-response region corresponds to the stop-response behavior, and the apparatus further includes:
and the non-response unit is used for controlling the game object to execute a response stopping action when the touch position is in the non-response area.
In some embodiments, the non-responsive unit is configured to control the game object to stop moving in the game scene.
The embodiment of the application also provides a terminal, which comprises a memory and a processor, wherein the memory stores a plurality of instructions; the processor loads instructions from the memory to execute the steps of any character control method provided by the embodiment of the application.
The embodiment of the present application further provides a computer-readable storage medium, where a plurality of instructions are stored in the computer-readable storage medium, and the instructions are suitable for being loaded by a processor to perform any of the steps in the role control method provided in the embodiment of the present application.
In the embodiments of the present application, the terminal can provide a graphical user interface, content displayed on the graphical user interface at least partially comprises a game scene and a game object in the game scene, the graphical user interface provides a behavior control, the behavior control comprises a first type basic behavior area, a second type basic behavior area and a target behavior area overlapped with part of the first type basic behavior area and the second type basic behavior area, the first type basic behavior area and the second type basic behavior area correspond to different game behaviors, and the game behavior corresponding to the target behavior area is the same as that of the first type basic behavior area. The embodiments can respond to a touch operation on the behavior control and determine the touch position of the touch operation; when the touch position is in the area where the target behavior area overlaps the second type basic behavior area, the game object is controlled to execute the game behavior corresponding to the target behavior area, which is identical to the game behavior of the first type basic behavior area; and when the touch position is in the first type basic behavior area, the game object is controlled to execute the game behavior corresponding to the first type basic behavior area.
For example, in some embodiments, a behavior control may include 8 basic behavior regions: a left-movement region, a right-movement region, a forward-movement region, a backward-movement region, a left-forward-movement region, a left-backward-movement region, a right-forward-movement region and a right-backward-movement region, together with a left-movement range and a right-movement range that partially cover these 8 regions. When the player touches different areas, the game object can be controlled to move in different directions. The left-movement region and the right-movement region belong to the first type basic behavior area; the forward, backward, left-forward, left-backward, right-forward and right-backward movement regions belong to the second type basic behavior area; and the left-movement range and the right-movement range belong to the target behavior area. Therefore, in this embodiment, if the player wants to control the game object to peek from behind a shelter, even if the player mistakenly touches the second type basic behavior area, the game object is still controlled to move to the left as long as the touch operation falls within the left-movement range, and to move to the right as long as the touch operation falls within the right-movement range. Therefore, the scheme can prevent the game object from leaving the shelter due to a mistaken touch while peeking from behind the shelter, which would expose it to the fire of the enemy.
According to the behavior control provided by the embodiment of the application, because the behavior control comprises the target behavior area partially covering the basic behavior area, as long as the touch operation of the player falls into the target behavior area, the game object controlled by the player can execute the game behavior corresponding to the first basic behavior area, even if the player mistakenly touches the second basic behavior area. Therefore, the embodiment of the application can effectively prevent the game object from executing wrong game behaviors caused by the mistaken touch operation of the player, so the scheme can improve the accuracy of character control.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1a is a schematic flowchart of a role control method provided in an embodiment of the present application;
fig. 1b is a schematic structural diagram of a behavior control of a role control method provided in an embodiment of the present application;
fig. 1c is a schematic structural diagram of a behavior control of a role control method provided in an embodiment of the present application;
fig. 1d is a schematic diagram of an adjacent area of a role control method provided in an embodiment of the present application;
fig. 1e is a schematic structural diagram of a behavior control of a role control method provided in an embodiment of the present application;
fig. 1f is a schematic diagram of a behavior control and an adjacent area of a role control method provided in the embodiment of the present application;
FIG. 2a is a schematic diagram of a graphical user interface in a role control method provided in an embodiment of the present application;
fig. 2b is a schematic structural diagram of a mobile control in a role control method provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a character control apparatus according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a role control method, a role control device, a terminal and a computer readable storage medium.
The role control device may be specifically integrated in an electronic device, and the electronic device may be a terminal, a server, or other devices. The terminal can be a mobile phone, a tablet Computer, an intelligent bluetooth device, a notebook Computer, or a Personal Computer (PC), and the like; the server may be a single server or a server cluster composed of a plurality of servers.
In some embodiments, the role control apparatus may also be integrated in a plurality of electronic devices, for example, the role control apparatus may be integrated in a plurality of servers, and the role control method of the present application is implemented by the plurality of servers.
A role control method in one embodiment of the present disclosure may be executed in a terminal device or a server. The terminal device may be a local terminal device. When the role control method is operated on a server, the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an optional embodiment, various cloud applications may be run under the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the cloud game operation mode, the main body that runs the game program and the main body that presents the game picture are separated; the storage and operation of the role control method are completed on the cloud game server, and the client device is used for receiving and sending data and presenting the game picture. For example, the client device can be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, a palm computer and the like; however, the terminal device performing role control is the cloud game server in the cloud. When a game is played, the user operates the client device to send an operation instruction, such as an operation instruction of a touch operation, to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game picture.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores a game program and is used for presenting a game screen. The local terminal device is used for interacting with a user through a graphical user interface, namely, a game program is downloaded and installed and operated through the electronic device conventionally. The manner in which the local terminal device provides the graphical user interface to the user may include a variety of ways, for example, it may be rendered for display on a display screen of the terminal or provided to the user by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game screen and a processor for running the game, generating the graphical user interface, and controlling display of the graphical user interface on the display screen.
A game scene (or referred to as a virtual scene) is a virtual scene that an application program displays (or provides) when running on a terminal or a server. Optionally, the virtual scene is a simulated environment of the real world, or a semi-simulated semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene is any one of a two-dimensional virtual scene and a three-dimensional virtual scene, and the virtual environment can be sky, land, sea and the like, wherein the land comprises environmental elements such as deserts, cities and the like. For example, in a sandbox type 3D shooting game, the virtual scene is a 3D game world for the user to control the virtual object to play against, and an exemplary virtual scene may include: at least one element selected from the group consisting of mountains, flat ground, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles.
The game interface is an interface corresponding to an application program provided or displayed through a graphical user interface, the interface comprises a graphical user interface and a game picture for interaction of a user, and the game picture is a picture of a game scene.
In alternative embodiments, game controls (e.g., skill controls, behavior controls, functionality controls, etc.), indicators (e.g., direction indicators, character indicators, etc.), information presentation areas (e.g., number of clicks, game play time, etc.), or game setting controls (e.g., system settings, stores, coins, etc.) may be included in the UI interface.
For example, in some embodiments, a behavior control may be included in the graphical user interface.
In an optional embodiment, the game screen is a display screen corresponding to a virtual scene displayed by the terminal device, and the game screen may include virtual objects such as a game object, an NPC character, and an AI character, which execute a game logic in the virtual scene.
For example, in some embodiments, the content displayed in the graphical user interface at least partially comprises a game scene, wherein the game scene comprises at least one game object.
In some embodiments, the game objects in the game scene comprise virtual objects manipulated by the player.
A game object (or referred to as a virtual object) refers to a dynamic object that can be controlled in a virtual scene. Alternatively, the dynamic object may be a virtual character, a virtual animal, an animation character, or the like. The virtual object is a character controlled by a user through an input device, or an Artificial Intelligence (AI) set in the virtual environment battle through training, or a Non-Player Character (NPC) set in the virtual scene battle.
Optionally, the virtual object is a virtual character playing a game in a virtual scene. Optionally, the number of virtual objects in the virtual scene match is preset, or dynamically determined according to the number of clients participating in the match, which is not limited in the embodiment of the present application.
In one possible implementation, the user can control the virtual object to play the game behavior in the virtual scene, and the game behavior can include moving, releasing skills, using props, dialog, and the like, for example, controlling the virtual object to run, jump, crawl, and the like, and can also control the virtual object to fight with other virtual objects using the skills, virtual props, and the like provided by the application program.
The virtual camera is a necessary component for presenting game scene pictures. One game scene corresponds to at least one virtual camera, and two or more virtual cameras can be used as game rendering windows according to actual needs; the rendering windows capture and present the picture content of the game world to the user. By setting parameters of the virtual camera, the viewing angle with which the user views the game world, such as a first-person viewing angle or a third-person viewing angle, can be adjusted.
In an optional implementation manner, an embodiment of the present invention provides a role control method, where a graphical user interface is provided by a terminal device, where the terminal device may be the aforementioned local terminal device, or the aforementioned client device in a cloud interaction system.
The following are detailed below. The numbers in the following examples are not intended to limit the order of preference of the examples.
The disclosed embodiments provide a character control method in which the concept of a target behavior region is added to the behavior control, so that as long as the touch operation of the player falls within the target behavior region, the game object controlled by the player executes the game behavior corresponding to the first type basic behavior region, even if the player mistakenly touches the second type basic behavior region. Therefore, the embodiment of the application can effectively prevent the game object from executing wrong game behaviors caused by the mistaken touch operation of the player, so the scheme can improve the accuracy of character control and further improve the controllability of the game.
In the embodiments of the present application, the terminal provides a graphical user interface, content displayed on the graphical user interface at least partially comprises a game scene and a game object in the game scene, the graphical user interface provides a behavior control, the behavior control comprises a first type basic behavior area, a second type basic behavior area and a target behavior area overlapped with part of the first type basic behavior area and the second type basic behavior area, the first type basic behavior area and the second type basic behavior area correspond to different game behaviors, and the game behavior corresponding to the target behavior area is the same as that of the first type basic behavior area.
Fig. 1a is a flowchart of a role control method according to an embodiment of the present application, and as shown in fig. 1a, the method in this embodiment includes the following steps:
110. and responding to the touch operation of the behavior control, and determining the touch position of the touch operation.
The touch operation may include operations such as touch, drag, swipe, long press, double click, and click.
The behavior control is used for controlling the game object to execute a specific game behavior in the game scene. The behavior control can comprise a plurality of basic behavior areas and a target behavior area partially overlapped with the basic behavior areas, and each basic behavior area corresponds to a different game behavior. The basic behavior areas may include, but are not limited to, a first type basic behavior area and a second type basic behavior area.
The partial overlap refers to that the target behavior region overlaps with a part of the basic behavior region.
Alternatively, the target behavior region may overlap with a portion of the region in each behavior region; alternatively, the target behavior region may overlap with a part of some behavior regions.
The shapes of the behavior control, the basic behavior regions and the target behavior region can be designed according to actual requirements. The behavior control can include N basic behavior regions, where N is a positive integer, and the basic behavior regions can have the same shape or different shapes. For example, the behavior control may be a circle containing 8 sector-shaped basic behavior regions of the same shape that divide the circular behavior control into 8 equal parts; the target behavior region may be a rectangle whose side length is smaller than the radius of the behavior control, and the center of the target behavior region may coincide with the center of the behavior control, so that the target behavior region partially overlaps each of the 8 behavior regions.
For example, fig. 1b shows a behavior control, and as shown in fig. 1b, the behavior control includes 8 basic behavior regions with the same shape and size, which are basic behavior regions 20 to 27, respectively; the center of the behavior control is covered with a rectangular target behavior area 10, and the target behavior area 10 covers part of the basic behavior areas 20-27.
The basic behavior region may be formed by a plurality of basic behavior sub-regions, each of which may correspond to a different game behavior, for example, the basic behavior sub-region 26 may correspond to a behavior "move to the left", the basic behavior sub-region 22 may correspond to a behavior "move to the right", and so on.
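As a hedged illustration of the layout just described (a circular control divided into 8 equal sectors with a rectangular target region centred on it), the following sketch shows how a touch point could be assigned to a sector and tested against the target rectangle; the coordinate system, radius and rectangle sizes are assumed values, not values given by the application.

```python
import math

# Illustrative geometry for a layout like fig. 1b: a circular control split
# into 8 equal sectors with a rectangular target region centred on it. The
# coordinate system, radius and rectangle sizes are assumed values.

CONTROL_CENTER = (0.0, 0.0)
CONTROL_RADIUS = 100.0
TARGET_HALF_W, TARGET_HALF_H = 60.0, 25.0  # target rectangle half-extents


def sector_index(point):
    """Return which of the 8 equal sectors (0..7) the point falls in, or None."""
    dx, dy = point[0] - CONTROL_CENTER[0], point[1] - CONTROL_CENTER[1]
    if math.hypot(dx, dy) > CONTROL_RADIUS:
        return None  # outside the behavior control
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    return int(angle // 45)  # 360 degrees / 8 sectors = 45 degrees each


def in_target_rect(point):
    """True if the point lies in the rectangular target behavior region."""
    dx, dy = point[0] - CONTROL_CENTER[0], point[1] - CONTROL_CENTER[1]
    return abs(dx) <= TARGET_HALF_W and abs(dy) <= TARGET_HALF_H
```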
In some embodiments, the behavior control may further include a virtual joystick, and the player may drag the virtual joystick to move in the behavior control to implement a touch operation on the behavior control, where a touch position of the touch operation is a position of the virtual joystick when the virtual joystick is moved. That is, in some embodiments, step 110 may include the steps of:
responding to the dragging operation of the virtual rocker, and controlling the virtual rocker to move in the behavior control;
and taking the position of the virtual rocker as the touch position when the virtual rocker is moved.
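A minimal sketch of this joystick variant is given below; the handler structure and method names are assumptions and do not correspond to any particular game engine API.

```python
class VirtualJoystick:
    """Toy drag handler: the joystick position is reported as the touch position."""

    def __init__(self, center):
        self.center = center
        self.position = center  # the joystick starts at the control centre

    def on_drag(self, drag_point):
        # Move the joystick with the drag; the joystick's current position is
        # then used as the touch position for the region tests above.
        self.position = drag_point
        return self.position
```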
120. And when the touch position is in the area where the target behavior area is overlapped with the second type of basic behavior area, controlling the game object to execute the game behavior corresponding to the target behavior area and identical to the game behavior of the first type of basic behavior area.
130. And when the touch position is in the first type basic behavior area, controlling the game object to execute the game behavior corresponding to the first type basic behavior area.
According to the steps 120-130, as long as the touch position is in the first type basic behavior area or the target behavior area, the game object can be controlled to execute the game behavior corresponding to the first type basic behavior area.
For example, referring to fig. 1b, the behavior control in fig. 1b includes the target behavior area 10 and the first type basic behavior areas 22 and 26, and when the touch position is in any one of the target behavior area 10 and the first type basic behavior areas 22 and 26, the game object is controlled to execute the game behavior corresponding to the first type basic behavior area.
Optionally, the first type basic behavior region may include a plurality of sub-regions, and the sub-regions may correspond to the same game behavior or to different game behaviors; the correspondence relationship may be set according to the actual scene. For example, in some embodiments, the first type basic behavior region includes a first sub-region of the first type basic behaviors and a second sub-region of the first type basic behaviors, the game behaviors include a first behavior and a second behavior, the game behaviors corresponding to the first sub-region of the first type basic behaviors include the first behavior, and the game behaviors corresponding to the second sub-region of the first type basic behaviors include the second behavior.
In some embodiments, the game behavior includes a behavior of moving in a moving direction, the moving direction includes a first type moving direction and a second type moving direction, the game behavior corresponding to the first sub-region of the first type basic behaviors includes moving in the first type moving direction, and the game behavior corresponding to the second sub-region of the first type basic behaviors includes moving in the second type moving direction.
In some embodiments, the first type of movement direction comprises a left direction and the second type of movement direction comprises a right direction, i.e. the first action may be a left movement and the second action may be a right movement.
Optionally, the target behavior region may include a plurality of target behavior sub-regions. For example, in some embodiments, the target behavior region may include a target behavior first sub-region and a target behavior second sub-region, and step 120 may include the steps of:
when the touch position is in the area where the first sub-area of the target behavior overlaps the second type basic behavior area, controlling the game object to execute the first behavior;
and when the touch position is in the area where the second sub-area of the target behavior is overlapped with the second type basic behavior area, controlling the game object to execute a second behavior.
For example, referring to fig. 1c, the behavior control in fig. 1c includes a plurality of target behavior sub-regions, 11, 12, 22, and 26, respectively, and assuming that 26 corresponds to a first behavior and 22 corresponds to a second behavior, then:
when the touch position is within 11, controlling the game object to execute a first action; controlling the game object to perform a first action when the touch position is within 26; when the touch position is within 12, controlling the game object to execute a second action; and when the touch position is within 22, controlling the game object to execute a second action.
In some embodiments, the game behavior may include a behavior moving to a movement direction corresponding to the behavior region, such as moving forward, moving backward, moving left, moving right, moving forward left, moving backward left, moving forward right, moving backward right, and so on, that is, steps 120 to 130 may refer to fig. 1c, where the behavior control in fig. 1c has a first sub-region of the target behavior 11, a second sub-region of the target behavior 12, a first sub-region of the first type of basic behavior 26, a second sub-region of the first type of basic behavior 22, and if 26 corresponds to moving to the left and 22 corresponds to moving to the right, then:
when the touch position is in a first sub-area of the first type of basic behaviors, controlling the game object to move to the left; when the touch position is in the first sub-area of the target behavior, controlling the game object to move to the left; when the touch position is in a second sub-area of the first type basic behaviors, controlling the game object to move to the right; and when the touch position is in the second sub-area of the target behavior, controlling the game object to move towards the right.
The shapes, colors, effects and the like of the first sub-region of the target behaviors, the second sub-region of the target behaviors, the first sub-region of the first type of basic behaviors and the second sub-region of the first type of basic behaviors can be designed according to requirements.
In some embodiments, the method may further comprise step 140, step 140 comprising:
and when the touch position is in an area of the second type basic behavior area which is not overlapped with the target behavior area, controlling the game object to execute the game behavior corresponding to the second type basic behavior area.
In some embodiments, the second type basic behavior region includes a plurality of second type basic behavior sub-regions, and the game behavior corresponding to each second type basic behavior sub-region is different.
For example, referring to FIG. 1c, a first type of base behavior region in a behavior control includes 22, 26, and a second type of base behavior region outside of it includes 20, 21, 23, 24, 25, 27. Wherein the second type base behavior region may include a plurality of second type base behavior sub-regions, e.g., the second type base behavior sub-regions in the behavior control include 20, 21, 23, 24, 25, 27.
In some embodiments, the game behavior may include a behavior of moving in the moving direction corresponding to the behavior region, and therefore, step 140 may include the steps of:
and when the touch position is in an area of the second type basic behavior area which is not overlapped with the target behavior area, controlling the game object to move in the moving direction corresponding to the second type basic behavior area.
Alternatively, each of the second type base behaviour sub-regions may correspond to the same or different gaming behaviour. For example, 20 corresponds to "forward movement", 21 corresponds to "forward movement to the right", 23 corresponds to "backward movement to the right", 24 corresponds to "backward movement", 25 corresponds to "backward movement to the left", and 27 corresponds to "forward movement to the left".
In some embodiments, the second type basic behavior region includes a second type basic behavior first sub-region, a second type basic behavior second sub-region, a second type basic behavior third sub-region, a second type basic behavior fourth sub-region, a second type basic behavior fifth sub-region and a second type basic behavior sixth sub-region;
the game behavior corresponding to the first sub-region includes moving forward, the game behavior corresponding to the second sub-region includes moving forward to the left, the game behavior corresponding to the third sub-region includes moving forward to the right, the game behavior corresponding to the fourth sub-region includes moving backward, the game behavior corresponding to the fifth sub-region includes moving backward to the left, and the game behavior corresponding to the sixth sub-region includes moving backward to the right.
For example, referring to fig. 1c, when the touch position is in the area of 20 that does not overlap with the target behavior area, the game object is controlled to move forward; when the touch position is in the area which is not overlapped with the target behavior area in the 21, controlling the game object to move to the right front; when the touch position is in the area which is not overlapped with the target behavior area in 23, controlling the game object to move to the right rear direction; when the touch position is in the region which is not overlapped with the target behavior region in the 24, controlling the game object to move backwards; when the touch position is in the area which is not overlapped with the target behavior area in 25, controlling the game object to move towards the left rear direction; when the touch position is in the area of 27 that does not overlap the target behavior area, the game object is controlled to move to the left front.
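For example, a possible mapping from the second type basic behavior sub-regions of fig. 1c to movement directions is sketched below; the sub-region identifiers follow the figure, while the direction vectors and function names are assumptions made for illustration.

```python
# Possible mapping from the second type basic behavior sub-regions of fig. 1c
# to movement directions. The sub-region identifiers follow the figure; the
# unit vectors (x to the right, y forward) and the function name are assumptions.

SECOND_TYPE_DIRECTIONS = {
    20: (0.0, 1.0),    # move forward
    21: (0.7, 0.7),    # move forward to the right
    23: (0.7, -0.7),   # move backward to the right
    24: (0.0, -1.0),   # move backward
    25: (-0.7, -0.7),  # move backward to the left
    27: (-0.7, 0.7),   # move forward to the left
}


def second_type_behavior(sub_region_id, in_target_region):
    """Return the movement vector, or None if the target region takes priority."""
    if in_target_region:
        return None  # the overlap is handled as a first-type behavior instead
    return SECOND_TYPE_DIRECTIONS.get(sub_region_id)
```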
In some embodiments, the graphical user interface further includes an adjacent region formed by radiating outward from a preset position in the behavior control, where the adjacent region is adjacent to the first type of basic behavior region, and in some embodiments, the preset position is a center position of the behavior control, and the method may further include step 150, where step 150 includes:
and when the touch position is in the adjacent area, controlling the game object to execute the game behavior corresponding to the first type of basic behavior area.
The adjacent region provided in the graphical user interface is outside the behavior control, and the adjacent region is used for expanding the trigger range of the game behavior corresponding to the first type of basic behavior region, namely, when a player touches any one of the first type of basic behavior region, the target behavior region and the adjacent region, the game object can be controlled to execute the game behavior corresponding to the first type of basic behavior region.
In some embodiments, the neighboring region includes a first neighboring sub-region adjacent to the first sub-region of the first type of base behavior in the behavior control and a second neighboring sub-region adjacent to the second sub-region of the first type of base behavior in the behavior control, step 150 includes:
when the touch position is in a first adjacent sub-area, controlling the game object to execute a first action;
and when the touch position is in a second adjacent subarea, controlling the game object to execute a second action.
Optionally, the shape, size, color, visual effect, etc. of the adjacent regions may be set according to actual requirements.
Referring to FIG. 1d, the adjacent region may include a first adjacent sub-region 36 and a second adjacent sub-region 32, the first adjacent sub-region 36 being adjacent to the first sub-region 26 of the first type of basic behavior in the behavior control, the second adjacent sub-region 32 being adjacent to the second sub-region 22 of the first type of basic behavior in the behavior control; since the first sub-region 26 of the first type of basic behavior corresponds to the first behavior and the second sub-region 22 of the first type of basic behavior corresponds to the second behavior, when the touch position is in the first adjacent sub-region 36, the game object is controlled to execute the first behavior; and when the touch position is in the second adjacent sub-region 32, the game object is controlled to execute the second behavior.
For example, referring to fig. 1d, when the touch position is within any one of the target behavior first sub-region 11, the first type basic behavior first sub-region 26, or the first adjacent sub-region 36, the game object is controlled to move to the left; when the touch position is within any one of the target behavior second sub-region 12, the first type basic behavior second sub-region 22, or the second adjacent sub-region 32, the game object is controlled to move to the right.
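A sketch of one possible implementation of the adjacent regions is given below, assuming they are wedges that radiate outward beyond the control radius around the leftward and rightward directions; the wedge angles, radius and return values are assumptions, not values stated by the application.

```python
import math

# Sketch of the adjacent regions of fig. 1d, assuming they are wedges that
# radiate outward beyond the control radius around the leftward (180 degrees)
# and rightward (0 degrees) directions. The wedge angles, radius and return
# values are assumptions for illustration.

CONTROL_RADIUS = 100.0
LEFT_WEDGE = (157.5, 202.5)   # degrees spanned by the first adjacent sub-region (36)
RIGHT_WEDGE = (337.5, 22.5)   # degrees spanned by the second adjacent sub-region (32)


def adjacent_region_behavior(point, center=(0.0, 0.0)):
    """Return 'move_left' / 'move_right' if the touch lies in an adjacent region."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    if math.hypot(dx, dy) <= CONTROL_RADIUS:
        return None  # inside the behavior control, so not in an adjacent region
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    if LEFT_WEDGE[0] <= angle <= LEFT_WEDGE[1]:
        return "move_left"
    if angle >= RIGHT_WEDGE[0] or angle <= RIGHT_WEDGE[1]:
        return "move_right"
    return None
```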
In some embodiments, the behavior control further includes a non-response area, and the non-response area corresponds to the response stopping behavior, and the method further includes step 160, where the step 160 includes:
and when the touch position is in the non-response area, controlling the game object to execute a response stopping action.
In some embodiments, the stop response activity includes stopping movement, and controlling the game object to perform the stop response activity includes:
and controlling the game object to stop moving in the game scene.
The shape, size, color, visual effect and the like of the non-response area can be set according to actual requirements.
For example, referring to FIG. 1e, the unresponsive region 40 is a visible circle centered on the behavior control.
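A minimal sketch of the non-response area check follows, assuming a circular dead zone centred on the behavior control; the radius value and the stop_moving() method of the game object are hypothetical and used only for illustration.

```python
import math

# Minimal sketch of the non-response area check, assuming a circular dead zone
# centred on the behavior control. The radius value and the stop_moving()
# method of the game object are assumptions for illustration.

NON_RESPONSE_RADIUS = 8.0


def handle_non_response_area(point, game_object, center=(0.0, 0.0)):
    """Stop the game object if the touch falls inside the non-response area."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    if math.hypot(dx, dy) <= NON_RESPONSE_RADIUS:
        game_object.stop_moving()  # hypothetical method: stop-response behavior
        return True
    return False
```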
For example, in some embodiments, referring to fig. 1f, the first type basic behavior area includes 26, the target behavior area includes 11, and the adjacent area includes 36, which lies outside the behavior control; in response to the player touching any one of 26, 11 or 36, the game object is controlled to move to the left. Likewise, the first type basic behavior area includes 22, the target behavior area includes 12, and the adjacent area includes 32 outside the behavior control; in response to the player touching any one of 22, 12 or 32, the game object is controlled to move to the right.
Through this embodiment, the following problem can be avoided: when the user performs a peek operation through frequent left-right transverse movement, a mistaken touch operation makes the virtual object move forward or backward, so that it leaves the shelter, cannot return behind it, and is exposed to the fire of an enemy. In addition, while solving the problem that the virtual object does not move left and right as the player expects, the visible non-response area proposed by the scheme also informs the player which areas in the behavior control are non-movement areas.
As can be seen from the above, the embodiment of the application can determine the touch position in response to the touch operation on the behavior control; when the touch position is in the area where the target behavior area overlaps the second type basic behavior area, the game object is controlled to execute the game behavior corresponding to the target behavior area, which is identical to the game behavior of the first type basic behavior area; and when the touch position is in the first type basic behavior area, the game object is controlled to execute the game behavior corresponding to the first type basic behavior area. Therefore, the method and the device can improve the accuracy of role control.
The method described in the above embodiments is further described in detail below.
In this embodiment, the method of the embodiment of the present application will be described in detail by taking a shooting-type mobile game as an example.
As shown in fig. 2a, the graphical user interface of the shooting-type mobile game provides a movement control 0 (i.e., a behavior control), and the movement control 0 includes a visible non-response area 40 and a virtual joystick 50. The player can drag the virtual joystick 50 to move within the movement control 0, so as to control the game character to move left and right behind the shelter.
Optionally, a conventional movement control can be used for controlling the game character to move when the player is far away from the shelter, and the conventional movement control can be replaced by the movement control proposed by the scheme only when the player is close to the shelter, so that the game character can make a standard peek operation behind the shelter.
Referring to fig. 2b, fig. 2b is a region structure of behavior control 0 in fig. 2a, wherein none of the dotted line portions is visible, and only non-responsive region 40 and virtual rocker 50 in behavior control 0 are visible.
The game character is controlled to move leftwards when the player drags the virtual rocker 50 to the dotted line area 6, is controlled to move rightwards when the player drags the virtual rocker 50 to the dotted line area 2, and is controlled to stop moving when the player drags the virtual rocker 50 to the non-response area 40. The game character is controlled to move in other directions, for example, left-front, right-front, left-rear, right-rear, etc., in response to the player dragging the virtual stick 50 to other areas.
The movement control comprises a response area which is divided equally in eight directions, and a non-response area exists in the center. The radius of the central non-response area may be 8 pixels. Optionally, the angle of each sector area in an oblique direction may be determined by the angles of the sector areas in the up, down, left and right directions, i.e.:
sector angle in the oblique direction = (360° - sum of the sector angles in the up, down, left and right directions) / 4.
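As a worked example of this relation (with assumed angles, not values given by the application): if the up, down, left and right sectors are each 40 degrees, every oblique sector spans (360 - 4 * 40) / 4 = 50 degrees, as the short sketch below checks.

```python
# Worked example of the relation above, with assumed angles: if the up, down,
# left and right sectors are each 40 degrees, every oblique sector spans
# (360 - 4 * 40) / 4 = 50 degrees.

def oblique_sector_angle(up, down, left, right):
    return (360.0 - (up + down + left + right)) / 4.0


assert oblique_sector_angle(40, 40, 40, 40) == 50.0
```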
Therefore, the scheme refines the response to left and right joystick movement and solves the problem that the player triggers movement in other directions while trying to move left or right; meanwhile, the visible non-response area also informs the player which areas are non-movement areas.
Therefore, the scheme can improve the precision of the player's movement operations, improve the player's operation feel and the competitiveness of the game, and prevent the character from falsely moving forward and hitting the wall when the player, standing against a wall, performs a left-right transverse peek operation in actual play.
In addition, because a straight line is the shortest path between two points, moving transversely left and right in an open area is also the most efficient way for the player to strafe and avoid an enemy's aim.
In order to better implement the method, an embodiment of the present application further provides a role control device, where the role control device may be specifically integrated in an electronic device, and the electronic device may be a terminal, a server, or other devices. The terminal can be a mobile phone, a tablet computer, an intelligent Bluetooth device, a notebook computer, a personal computer and other devices; the server may be a single server or a server cluster composed of a plurality of servers.
For example, in the present embodiment, the method of the present embodiment will be described in detail by taking an example in which the role control device is specifically integrated in a mobile terminal.
The character control device provides a graphical user interface through a terminal, content displayed on the graphical user interface at least partially comprises a game scene and a game object in the game scene, the graphical user interface provides a behavior control, the behavior control comprises a first type basic behavior area, a second type basic behavior area and a target behavior area overlapped with part of the first type basic behavior area and the second type basic behavior area, the first type basic behavior area and the second type basic behavior area correspond to different game behaviors, and the game behaviors corresponding to the target behavior area are the same as those of the first type basic behavior area.
As shown in fig. 3, the character control apparatus includes a touch unit 310, a target unit 320, and a first unit 330, as follows:
(1) the touch unit 310 is configured to determine a touch position of a touch operation in response to the touch operation on the behavior control.
(2) And the target unit 320 is configured to, when the touch position is in an area where the target behavior area overlaps with the second type basic behavior area, control the game object to execute a game behavior corresponding to the target behavior area, which is the same as the game behavior of the first type basic behavior area.
(3) The first unit 330 is configured to control the game object to execute the game behavior corresponding to the first type basic behavior area when the touch position is in the first type basic behavior area.
In some embodiments, the first type basic behavior region includes a first type basic behavior first sub-region and a first type basic behavior second sub-region, the game behaviors include a first behavior and a second behavior, the game behaviors corresponding to the first type basic behavior first sub-region include the first behavior, and the game behaviors corresponding to the first type basic behavior second sub-region include the second behavior.
In some embodiments, the target behavior region includes a target behavior first sub-region and a target behavior second sub-region, and when the touch position is in a region where the target behavior region overlaps with the second type of basic behavior region, the target unit 320 includes:
the target first subunit is used for controlling the game object to execute the first behavior when the touch position is in the area where the first sub-region of the target behavior overlaps the second type basic behavior area;
and the target second subunit is used for controlling the game object to execute the second behavior when the touch position is in the region where the second sub-region of the target behavior is overlapped with the second type of basic behavior region.
In some embodiments, the graphical user interface further includes an adjacent region formed by radiating outward from a preset position in the behavior control, the adjacent region being adjacent to the first type of basic behavior region, and the apparatus further includes:
and the adjacent unit is used for controlling the game object to execute the game behavior corresponding to the first type of basic behavior area when the touch position is in the adjacent area.
In some embodiments, the preset position is a center position of the behavior control.
In some embodiments, the neighboring region includes a first neighboring sub-region and a second neighboring sub-region, the first neighboring sub-region is adjacent to a first sub-region of the first type of base behavior in the behavior control, the second neighboring sub-region is adjacent to a second sub-region of the first type of base behavior in the behavior control, and the neighboring unit is configured to:
when the touch position is in the first adjacent sub-area, control the game object to execute the first behavior;
and when the touch position is in the second adjacent sub-area, control the game object to execute the second behavior.
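A minimal sketch of such an adjacent area, assuming the preset position is the control center and that the adjacent area simply widens the left/right sectors by a few degrees so that near-miss touches still trigger lateral movement; the widening value is illustrative and not taken from the disclosure.

```python
def adjacent_area_behavior(layout: ControlLayout, x: float, y: float):
    """Hypothetical adjacent area: a wedge radiating outward from the control
    center toward the left/right sub-areas of the first type basic behavior area."""
    dx, dy = x - layout.center[0], y - layout.center[1]
    dist = math.hypot(dx, dy)
    if dist <= layout.dead_zone or dist > layout.radius:
        return None
    angle = math.degrees(math.atan2(dy, dx)) % 360
    widened = layout.side_sector_deg + 10.0   # assumed widening of 10 degrees
    if abs(angle - 180) <= widened:
        return "first behavior (move left)"   # first adjacent sub-area
    if min(angle, 360 - angle) <= widened:
        return "second behavior (move right)" # second adjacent sub-area
    return None
```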
In some embodiments, the game behavior includes a behavior of moving in a moving direction, the moving direction includes a first type moving direction and a second type moving direction, and the game behaviors corresponding to the first type basic behavior region include moving in the first type moving direction and moving in the second type moving direction.
In some embodiments, the first type of movement direction comprises a left direction and the second type of movement direction comprises a right direction.
In some embodiments, the apparatus further comprises:
and the second unit is configured to control the game object to execute the game behavior corresponding to the second type basic behavior area when the touch position is in the part of the second type basic target behavior area that does not overlap with the target behavior area.
In some embodiments, the second type basic target behavior region includes a plurality of second type basic target behavior sub-regions, and the game behavior corresponding to each of the second type basic target behavior sub-regions is different.
In some embodiments, the second type of basic target behavior region includes a second type of basic target behavior first sub-region, a second type of basic target behavior second sub-region, a second type of basic target behavior third sub-region, a second type of basic target behavior fourth sub-region, a second type of basic target behavior fifth sub-region, and a second type of basic target behavior sixth sub-region;
the game behaviors corresponding to the first sub-area of the second type of basic target behaviors comprise forward movement, the game behaviors corresponding to the second sub-area of the second type of basic target behaviors comprise leftward forward movement, the game behaviors corresponding to the third sub-area of the second type of basic target behaviors comprise rightward forward movement, the game behaviors corresponding to the fourth sub-area of the second type of basic target behaviors comprise backward movement, the game behaviors corresponding to the fifth sub-area of the second type of basic target behaviors comprise backward movement, the game behaviors corresponding to the fourth sub-area of the second type of basic target behaviors comprise leftward backward movement, and the game behaviors corresponding to the fourth sub-area of the second type of basic target behaviors comprise rightward backward movement.
In some embodiments, the behavior control further includes a non-response region, and the non-response region corresponds to the stop-response behavior, and the apparatus further includes:
and the non-response unit is used for controlling the game object to execute a response stopping action when the touch position is in the non-response area.
In some embodiments, the non-responsive unit is configured to control the game object to stop moving in the game scene.
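Under the same assumed layout, the non-response area check reduces to a small disc around the control center; the helper below is a sketch, not the claimed implementation.

```python
def handle_non_response(layout: ControlLayout, x: float, y: float,
                        game_object: GameObject) -> bool:
    """If the touch lies in the non-response area, stop the game object and
    report that the touch was consumed; otherwise leave it to other handlers."""
    dx, dy = x - layout.center[0], y - layout.center[1]
    if math.hypot(dx, dy) <= layout.dead_zone:
        game_object.perform("stop moving in the game scene")
        return True
    return False
```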
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
As can be seen from the above, in the role control device of the embodiment, the touch unit responds to the touch operation on the behavior control, and determines the touch position of the touch operation; when the touch position is in the area where the target behavior area is overlapped with the second type of basic behavior area, the target unit controls the game object to execute the game behavior corresponding to the target behavior area and identical to the game behavior of the first type of basic behavior area; and when the touch position is in the first type basic target behavior area, the first unit controls the game object to execute the game behavior corresponding to the first type basic behavior area. Therefore, the method and the device can improve the accuracy of role control.
Correspondingly, an embodiment of the present application further provides a computer device, where the computer device may be a terminal or a server, and the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game console, a personal computer, or a Personal Digital Assistant (PDA).
As shown in fig. 4, fig. 4 is a schematic structural diagram of a computer device 400 according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored in the memory 402 and executable on the processor 401. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the structure shown in the figure does not constitute a limitation on the computer device, which may include more or fewer components than those shown, combine some components, or use a different arrangement of components.
The processor 401 is a control center of the computer device 400, connects the respective parts of the entire computer device 400 using various interfaces and lines, performs various functions of the computer device 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the computer device 400 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 400 loads instructions corresponding to processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
responding to the touch operation of the behavior control, and determining the touch position of the touch operation;
when the touch position is in the first type basic behavior area or the target behavior area, controlling the game object to execute the game behavior corresponding to the first type basic behavior area;
when the touch position is in the second type basic behavior area and is not in the target behavior area, controlling the game object to execute the game behavior corresponding to the second type basic behavior area;
and the second type basic behavior area is the behavior area other than the first type basic behavior area.
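Put together, the priority order of these steps might be expressed as a single dispatch function; the following is a sketch over the hypothetical helpers introduced earlier, not the actual program loaded by the processor 401.

```python
def dispatch(layout: ControlLayout, x: float, y: float):
    """Mirror of the steps above: the target area extends the first type
    behavior into the second type area, and the second type behavior only
    fires where the target area does not cover it."""
    regions = layout.classify(x, y)
    if Region.NON_RESPONSE in regions:
        return "stop responding"
    if Region.FIRST_TYPE in regions or Region.TARGET in regions:
        return "game behavior of the first type basic behavior area"
    if Region.SECOND_TYPE in regions and Region.TARGET not in regions:
        return "game behavior of the second type basic behavior area"
    return None  # touch outside the behavior control

# Example usage with assumed coordinates:
# layout = ControlLayout(center=(200.0, 600.0))
# print(dispatch(layout, 120.0, 600.0))  # a touch far to the left of center
```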
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 4, the computer device 400 further includes: a touch display screen 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407. Those skilled in the art will appreciate that the computer device structure shown in FIG. 4 does not constitute a limitation on the computer device, which may include more or fewer components than those shown, combine some components, or use a different arrangement of components.
The touch display screen 403 may be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface.
For example, the touch display screen 403 may display a graphical user interface providing a behavior control, where the content displayed in the graphical user interface at least partially includes a game scene, and the game scene includes at least one game object.
The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions that drive the corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 401, and can receive and execute commands sent by the processor 401. The touch panel may overlay the display panel, and when the touch panel detects a touch operation on or near it, it passes the operation to the processor 401 to determine the type of touch event, after which the processor 401 provides a corresponding visual output on the display panel according to the type of touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to implement the input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 403 may also be used as a part of the input unit 406 to implement an input function.
In the embodiment of the present application, a game application is executed by the processor 401 to generate a graphical user interface on the touch display screen 403, where a virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display screen 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 404 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or another computer device, and to exchange signals with the network device or the other computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 405 may transmit an electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 405 and converted into audio data. The audio data is then processed by the processor 401 and sent, for example, to another computer device via the radio frequency circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral headset and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to supply power to the various components of the computer device 400. Optionally, the power supply 407 may be logically connected to the processor 401 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 407 may also include one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, or any other such component.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
From the above, the computer device provided by the embodiment can improve the accuracy of character control.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any of the role control methods provided by the embodiments of the present application. For example, the computer program may perform the steps of:
responding to the touch operation of the behavior control, and determining the touch position of the touch operation;
when the touch position is in the area where the target behavior area is overlapped with the second type of basic behavior area, controlling the game object to execute the game behavior corresponding to the target behavior area and identical to the game behavior of the first type of basic behavior area;
and when the touch position is in the first type basic target behavior area, controlling the game object to execute the game behavior corresponding to the first type basic behavior area.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer program stored in the storage medium can execute the steps in any of the character control methods provided in the embodiments of the present application, it can achieve the beneficial effects that can be achieved by any of those methods; for details, reference may be made to the foregoing embodiments, which are not repeated here.
The role control method, apparatus, storage medium, and computer device provided in the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and its core ideas. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the ideas of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (16)

1. A character control method is characterized in that a terminal provides a graphical user interface, the content displayed on the graphical user interface at least partially comprises a game scene and a game object therein, the graphical user interface provides a behavior control, the behavior control comprises a first type basic behavior area, a second type basic behavior area and a target behavior area which is overlapped with part of the first type basic behavior area and the second type basic behavior area, the first type basic behavior area and the second type basic behavior area correspond to different game behaviors, and the game behavior corresponding to the target behavior area is the same as the game behavior of the first type basic behavior area, and the method comprises the following steps:
responding to the touch operation of the behavior control, and determining the touch position of the touch operation;
when the touch position is in the area where the target behavior area is overlapped with the second type of basic behavior area, controlling the game object to execute the game behavior corresponding to the target behavior area, which is the same as the game behavior of the first type of basic behavior area;
and when the touch position is in the first type basic target behavior area, controlling the game object to execute the game behavior corresponding to the first type basic behavior area.
2. The character control method of claim 1, wherein the first type of basic behavior region comprises a first sub-region of first type basic behaviors and a second sub-region of first type basic behaviors, the game behaviors comprise a first behavior and a second behavior, the game behaviors corresponding to the first sub-region of the first type basic behaviors comprise a first behavior, and the game behaviors corresponding to the second sub-region of the first type basic behaviors comprise a second behavior.
3. The character control method according to claim 2, wherein the target behavior area includes a first target behavior sub-area and a second target behavior sub-area, and when the touch position is in an area where the target behavior area overlaps with the second type of basic behavior area, the game object is controlled to execute the same game behavior as the game behavior of the first type of basic behavior area corresponding to the target behavior area, including:
when the touch position is in an area where the first sub-area of the target behavior and the second type basic behavior area are overlapped, controlling the game object to execute the first behavior;
and when the touch position is in an area where the second sub-area of the target behavior is overlapped with the second type of basic behavior area, controlling the game object to execute the second behavior.
4. The character control method of claim 2, wherein the graphical user interface further includes an adjacent region formed by radiating outward from a preset position in the behavior control, the adjacent region being adjacent to the first type of basic behavior region, the method further comprising:
and when the touch position is in the adjacent area, controlling the game object to execute the game behavior corresponding to the first type of basic behavior area.
5. The character control method of claim 4, wherein the adjacent area comprises a first adjacent sub-area and a second adjacent sub-area, the first adjacent sub-area is adjacent to a first sub-area of the first type of basic behavior in the behavior control, the second adjacent sub-area is adjacent to a second sub-area of the first type of basic behavior in the behavior control, and when the touch position is in the adjacent area, the controlling the game object to execute the game behavior corresponding to the first type of basic behavior area comprises:
when the touch position is in the first adjacent subregion, controlling the game object to execute the first behavior;
when the touch position is in the second adjacent sub-area, controlling the game object to execute the second action.
6. The character control method of claim 4, wherein the preset position is a center position of the behavior control.
7. The character control method of any one of claims 1-6, wherein the game behavior comprises a behavior of moving in a moving direction, the moving direction comprises a first type moving direction and a second type moving direction, and the game behaviors corresponding to the first type basic behavior region comprise moving in the first type moving direction and moving in the second type moving direction.
8. The character control method of claim 7, wherein the first type of movement direction comprises a left direction and the second type of movement direction comprises a right direction.
9. The character control method of claim 1, wherein the method further comprises:
and when the touch position is in an area which is not overlapped with the target behavior area in the second type basic target behavior area, controlling the game object to execute the game behavior corresponding to the second type basic behavior area.
10. The character control method of claim 9, wherein the second type basic target behavior region comprises a plurality of second type basic target behavior sub-regions, and the game behavior corresponding to each of the second type basic target behavior sub-regions is different.
11. The character control method of claim 10, wherein the second type of basic target behavior region comprises a second type of basic target behavior first sub-region, a second type of basic target behavior second sub-region, a second type of basic target behavior third sub-region, a second type of basic target behavior fourth sub-region, a second type of basic target behavior fifth sub-region, and a second type of basic target behavior sixth sub-region;
the game behaviors corresponding to the first sub-area of the second type of basic target behaviors comprise forward movement, the game behaviors corresponding to the second sub-area of the second type of basic target behaviors comprise leftward forward movement, the game behaviors corresponding to the third sub-area of the second type of basic target behaviors comprise rightward forward movement, the game behaviors corresponding to the fourth sub-area of the second type of basic target behaviors comprise backward movement, the game behaviors corresponding to the fifth sub-area of the second type of basic target behaviors comprise leftward backward movement, and the game behaviors corresponding to the sixth sub-area of the second type of basic target behaviors comprise rightward backward movement.
12. The character control method of claim 1, wherein the behavior control further comprises a no-response region corresponding to a stop-response behavior, the method further comprising:
and when the touch position is in the non-response area, controlling the game object to execute the response stopping behavior.
13. The character control method of claim 12, wherein the stop response behavior includes stop movement, and the controlling the game object to perform the stop response behavior includes:
and controlling the game object to stop moving in the game scene.
14. A character control apparatus, wherein a terminal provides a graphical user interface, content displayed on the graphical user interface at least partially includes a game scene and a game object therein, the graphical user interface provides a behavior control, the behavior control includes a first type basic behavior region, a second type basic behavior region and a target behavior region overlapping with a part of the first type basic behavior region and the second type basic behavior region, the first type basic behavior region and the second type basic behavior region correspond to different game behaviors, and a game behavior corresponding to the target behavior region is the same as a game behavior of the first type basic behavior region, the apparatus includes:
the touch control unit is used for responding to the touch control operation of the behavior control and determining the touch control position of the touch control operation;
the target unit is used for controlling the game object to execute the game behavior corresponding to the target behavior area and the same as the game behavior of the first type basic behavior area when the touch position is in the area where the target behavior area is overlapped with the second type basic behavior area;
and the first unit is used for controlling the game object to execute the game behavior corresponding to the first type basic behavior area when the touch position is in the first type basic target behavior area.
15. A terminal comprising a processor and a memory, said memory storing a plurality of instructions; the processor loads instructions from the memory to perform the steps of the character control method of any of claims 1-13.
16. A computer readable storage medium storing instructions adapted to be loaded by a processor to perform the steps of the character control method of any of claims 1-13.
CN202110872984.2A 2021-07-30 2021-07-30 Role control method, role control device, terminal and computer readable storage medium Pending CN113546403A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110872984.2A CN113546403A (en) 2021-07-30 2021-07-30 Role control method, role control device, terminal and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110872984.2A CN113546403A (en) 2021-07-30 2021-07-30 Role control method, role control device, terminal and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN113546403A true CN113546403A (en) 2021-10-26

Family

ID=78133425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110872984.2A Pending CN113546403A (en) 2021-07-30 2021-07-30 Role control method, role control device, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113546403A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108379839A (en) * 2018-03-23 2018-08-10 网易(杭州)网络有限公司 Response method, device and the terminal of control
CN111773677A (en) * 2020-07-23 2020-10-16 网易(杭州)网络有限公司 Game control method and device, computer storage medium and electronic equipment
WO2021036581A1 (en) * 2019-08-30 2021-03-04 腾讯科技(深圳)有限公司 Method for controlling virtual object, and related apparatus
CN112807673A (en) * 2021-02-01 2021-05-18 网易(杭州)网络有限公司 Game role control method and device, electronic equipment and storage medium
US20210146248A1 (en) * 2018-11-22 2021-05-20 Netease (Hangzhou) Network Co.,Ltd. Virtual character processing method, virtual character processing device, electronic apparatus and storage medium
CN112835498A (en) * 2021-01-25 2021-05-25 北京字跳网络技术有限公司 Control method, control device and computer storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108379839A (en) * 2018-03-23 2018-08-10 网易(杭州)网络有限公司 Response method, device and the terminal of control
US20210146248A1 (en) * 2018-11-22 2021-05-20 Netease (Hangzhou) Network Co.,Ltd. Virtual character processing method, virtual character processing device, electronic apparatus and storage medium
WO2021036581A1 (en) * 2019-08-30 2021-03-04 腾讯科技(深圳)有限公司 Method for controlling virtual object, and related apparatus
CN111773677A (en) * 2020-07-23 2020-10-16 网易(杭州)网络有限公司 Game control method and device, computer storage medium and electronic equipment
CN112835498A (en) * 2021-01-25 2021-05-25 北京字跳网络技术有限公司 Control method, control device and computer storage medium
CN112807673A (en) * 2021-02-01 2021-05-18 网易(杭州)网络有限公司 Game role control method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
KR102625233B1 (en) Method for controlling virtual objects, and related devices
JP7331124B2 (en) Virtual object control method, device, terminal and storage medium
CN113440846B (en) Game display control method and device, storage medium and electronic equipment
US20220297004A1 (en) Method and apparatus for controlling virtual object, device, storage medium, and program product
EP3950080A1 (en) Method and apparatus for selecting virtual object, and device and storage medium
CN113546417A (en) Information processing method and device, electronic equipment and storage medium
CN113350793A (en) Interface element setting method and device, electronic equipment and storage medium
JP7384521B2 (en) Virtual object control method, device, computer equipment and computer program
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
US20220032188A1 (en) Method for selecting virtual objects, apparatus, terminal and storage medium
CN114159785A (en) Virtual item discarding method and device, electronic equipment and storage medium
CN114159788A (en) Information processing method, system, mobile terminal and storage medium in game
CN113546403A (en) Role control method, role control device, terminal and computer readable storage medium
JP7419400B2 (en) Virtual object control method, device, terminal and computer program
CN113509729B (en) Virtual prop control method and device, computer equipment and storage medium
CN114146411A (en) Game object control method, device, electronic equipment and storage medium
CN115430151A (en) Game role control method and device, electronic equipment and readable storage medium
CN115040868A (en) Prompt information generation method, area adjustment method and device
CN115569380A (en) Game role control method, device, computer equipment and storage medium
CN115518380A (en) Method and device for controlling game role, computer equipment and storage medium
CN116262176A (en) Information interaction method, device, electronic equipment and storage medium
CN117205555A (en) Game interface display method, game interface display device, electronic equipment and readable storage medium
CN116059639A (en) Virtual object control method, device, electronic equipment and storage medium
CN116832438A (en) Virtual object control method, device, terminal and storage medium
CN116585707A (en) Game interaction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination