CN115382201A - Game control method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN115382201A
CN115382201A (application CN202211049847.XA)
Authority
CN
China
Prior art keywords
virtual object
target virtual
game
action
game scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211049847.XA
Other languages
Chinese (zh)
Inventor
苗浩琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211049847.XA priority Critical patent/CN115382201A/en
Publication of CN115382201A publication Critical patent/CN115382201A/en
Priority to PCT/CN2023/079122 priority patent/WO2024045528A1/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/822 Strategy games; Role-playing games
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games specially adapted for executing a specific type of game
    • A63F 2300/807 Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a game control method and device, a computer device, and a storage medium. In this scheme, multiple action instructions are bound to the same action control on the game interface, and different player operations on the control trigger different action instructions. When a specified operation on the action control is detected, an action execution area is determined according to the position, in the game scene, of the target virtual object controlled by the player. When a shelter exists in that area, the target virtual object can be controlled to complete the action corresponding to the action control and then squat behind the shelter, improving the operational convenience of the game player and thus the player's game experience.

Description

Game control method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a game control method and apparatus, a computer device, and a storage medium.
Background
With the development of the internet, a large number of games of different types have emerged to meet users' daily entertainment needs. In games built around cover-based shooting, squatting behind a bunker to shoot is the most basic combat pattern. Virtual characters interact with shelters frequently and in diverse ways, for example by squatting into a shelter. In such games, both rolling and entering a shelter are the most effective ways to evade enemy attacks.
In the related art, the game interface of a bunker shooting game provides a roll control; after the player taps the control, the virtual character completes a rolling action and then remains standing. Entering a shelter is achieved by displaying a squat button on the main interface when the character approaches the shelter; after the player taps it, the character squats behind the shelter. However, when the virtual character is far from the shelter, the player must first control the character to move to the area near the shelter before it can squat behind it; the operation is cumbersome and the player experience is poor.
Disclosure of Invention
The embodiment of the application provides a game control method and device, a computer device and a storage medium, which can improve game experience of game players.
The embodiment of the application provides a game control method, which comprises the following steps:
providing an action control on the graphical user interface;
in response to a first operation on the action control, determining an executable region of a first action of the target virtual object in the game scene based on a position of the target virtual object in the game scene;
when a shelter exists in the executable area, displaying a squat-able indication icon corresponding to the shelter on the graphical user interface;
in response to the first operation ending, controlling the target virtual object to perform the first action toward the bunker and squat to enter the bunker.
Correspondingly, the embodiment of the present application further provides a game control device, including:
the providing unit is used for providing an action control on the graphical user interface;
a determination unit, configured to determine, in response to a first operation on the action control, an executable region of a first action of the target virtual object in the game scene based on a position of the target virtual object in the game scene;
a display unit, configured to display, on the graphical user interface, a squat-able indication icon corresponding to a shelter when the shelter is present in the executable area;
a first control unit, configured to, in response to the first operation ending, control the target virtual object to perform the first action towards the bunker and squat to enter the bunker.
In some embodiments, the determining unit comprises:
the first determining subunit is configured to determine a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance, so as to obtain the executable area.
In some embodiments, the determining unit comprises:
the second determining subunit is used for determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
the first acquisition subunit is used for acquiring the current view angle direction of the target virtual object in the game scene;
and the first selecting subunit is used for selecting a sector area from the circular area based on the view angle direction and a preset angle to obtain the executable area.
In some embodiments, the determining unit comprises:
the third determining subunit is used for determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
a fourth determining subunit, configured to determine, if multiple shelters exist in the circular area, a target shelter closest to the target virtual object from the multiple shelters;
and the second selecting subunit is configured to select, based on the position relationship between the target shelter and the target virtual object in the game scene, a region where the target shelter is located from the circular region, and obtain the executable region.
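As a hedged illustration (not the patent's implementation), the nearest-shelter selection performed by the fourth determining subunit can be sketched as a distance comparison in 2D; the function name, coordinate tuples, and use of Euclidean distance are assumptions of this sketch:

```python
import math

def nearest_shelter(player_pos, shelters):
    """From the shelters found inside the circular area, pick the target
    shelter closest to the target virtual object; None if there are none."""
    if not shelters:
        return None
    return min(shelters,
               key=lambda s: math.hypot(s[0] - player_pos[0],
                                        s[1] - player_pos[1]))
```

The executable area would then be narrowed to the region around the returned shelter, as the second selecting subunit describes.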
In some embodiments, the determining unit comprises:
a fifth determining subunit, configured to determine a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
the second acquiring subunit is used for acquiring the target direction of the other virtual objects in the game scene relative to the target virtual object;
and the third determining subunit is configured to select a sector area from the circular area based on the target direction and a preset angle, so as to obtain the executable area.
In some embodiments, the first control unit comprises:
a first control subunit, configured to, in response to the end of the first operation, control the target virtual object to perform the first action toward the bunker closest to the target virtual object and squat to enter that bunker.
In some embodiments, the first control unit comprises:
a fourth determining subunit, configured to determine an occlusion region of the bunker from the game scene;
a third control subunit, configured to control the target virtual object to squat within the sheltered area of the shelter after performing the first action toward the shelter.
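The occlusion-region step above can be sketched as placing the squatting character on the side of the shelter facing away from a threat. The `threat_pos` input and the fixed `offset` are assumptions of this illustration; the patent only states that an occlusion region of the bunker is determined from the game scene:

```python
import math

def occluded_squat_position(shelter_pos, threat_pos, offset=1.0):
    """Return a point just behind the shelter relative to the threat,
    i.e. a point inside the shelter's sheltered (occluded) area."""
    dx = shelter_pos[0] - threat_pos[0]
    dy = shelter_pos[1] - threat_pos[1]
    norm = math.hypot(dx, dy) or 1.0  # avoid division by zero
    return (shelter_pos[0] + dx / norm * offset,
            shelter_pos[1] + dy / norm * offset)

# Threat at the origin, shelter 2 units away: squat 1 unit behind it.
print(occluded_squat_position((2.0, 0.0), (0.0, 0.0)))  # → (3.0, 0.0)
```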
In some embodiments, the apparatus further comprises:
a second control unit, configured to, in response to a sliding operation continuous with the first operation, control the target virtual object to perform the first action in the sliding direction of the sliding operation.
In some embodiments, the apparatus further comprises:
a third control unit, configured to control, in response to a second operation on the action control, the target virtual object to be in a standing state after executing the first action, where the first operation and the second operation are different.
Accordingly, embodiments of the present application further provide a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the game control method provided in any of the embodiments of the present application.
Correspondingly, the embodiment of the application also provides a storage medium, wherein the storage medium stores a plurality of instructions, and the instructions are suitable for being loaded by the processor to execute the game control method.
According to the method and device of the present application, multiple action instructions are bound to the same action control on the game interface, and different player operations on the control trigger different action instructions. When a specified operation on the action control is detected, an action execution area is determined according to the position, in the game scene, of the target virtual object controlled by the player. When a shelter exists in that area, the target virtual object can be controlled to complete the action corresponding to the action control and then squat behind the shelter, improving the operational convenience of the game player and thus the player's game experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic view of a scene of a game control system according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a game control method according to an embodiment of the present application.
Fig. 3 is a schematic view of an application scenario of a game control method according to an embodiment of the present application.
Fig. 4 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 5 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 6 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 7 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 8 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 9 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 10 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 11 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 12 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 13 is a schematic flowchart of another game control method according to an embodiment of the present application.
Fig. 14 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 15 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 16 is a schematic application scenario diagram of another game control method according to an embodiment of the present application.
Fig. 17 is a schematic application scenario diagram of another game control method according to the embodiment of the present application.
Fig. 18 is a block diagram of a game control device according to an embodiment of the present application.
Fig. 19 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a game control method and device, a storage medium, and computer equipment. Specifically, the game control method of the embodiment of the present application may be executed by a computer device, where the computer device may be a terminal, a server, or another device. The terminal may be a device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game machine, a Personal Computer (PC), or a Personal Digital Assistant (PDA), and may also include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms.
For example, when the game control method is operated on a terminal, the terminal device stores a game application and is used for presenting a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the game control method is executed on a server, the game may be a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In the cloud game running mode, the entity that runs the game application is separated from the entity that presents the game screen: the storage and execution of the game control method are completed on the cloud game server, while the game screen is presented at a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting the game screen; it may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, while the device that actually processes the game data is the cloud game server at the cloud end. During play, the user operates the cloud game client to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses data such as game screens, and returns them to the client over the network; finally, the client decodes the data and outputs the game screens.
Referring to fig. 1, fig. 1 is a schematic view of a game control system according to an embodiment of the present disclosure. The system may include at least one terminal, at least one server, at least one database, and a network. The terminal held by the user can be connected to servers of different games through a network. A terminal is any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, the terminal has one or more multi-touch sensitive screens for sensing and obtaining input of a user through a touch or slide operation performed at a plurality of points of one or more touch display screens. In addition, when the system includes a plurality of terminals, a plurality of servers, and a plurality of networks, different terminals may be connected to each other through different networks and through different servers. The network may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, different terminals may be connected to other terminals or to a server using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals to connect and synchronize with each other over a suitable network to support multiplayer gaming. Additionally, the system may include a plurality of databases coupled to different servers and in which information relating to the gaming environment may be stored continuously as different users play the multiplayer game online.
The embodiment of the application provides a game control method, which can be executed by a terminal or a server. The embodiment of the present application is described with an example in which the game control method is executed by a terminal. The terminal comprises a touch display screen and a processor; the touch display screen is used for presenting a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the touch display screen, the graphical user interface can control local content of the terminal in response to received operation instructions, and can also control content of an opposite-end server in response to received operation instructions. For example, the operation instructions generated by the user acting on the graphical user interface include an instruction for starting the game application, and the processor is configured to start the game application after receiving that instruction. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing touch or slide operations performed at multiple points on the screen at the same time. The user performs touch operations on the graphical user interface with fingers, and when the graphical user interface detects a touch operation, different virtual objects in the game are controlled to perform actions corresponding to the touch operation. The game may be any of a casual game, an action game, a role-playing game, a strategy game, a sports game, a game of chance, and the like, and may comprise a virtual scene of the game.
Further, the virtual scene of the game may include one or more virtual objects controlled by the user (or player), such as virtual characters. Additionally, one or more obstacles, such as railings, ravines, and walls, may also be included in the virtual scene to limit movement of the virtual objects, e.g., to limit movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene also includes one or more elements, such as skills, points, character health, and energy, to provide assistance to the player, provide virtual services, increase points related to player performance, and so on. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy objects). In one embodiment, one or more other virtual objects are controlled by other players of the game; alternatively, one or more other virtual objects may be computer controlled, such as robots using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. The virtual objects possess various skills or capabilities that the game player uses to achieve goals, for example one or more weapons, props, or tools that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations on the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of the user.
It should be noted that the scene schematic diagram of the game control system shown in fig. 1 is only an example. The game control system and the scene described in the embodiments of the present application are intended to illustrate the technical solution of the embodiments more clearly and do not limit it; as those of ordinary skill in the art know, with the evolution of the game control system and the emergence of new business scenarios, the technical solution provided in the embodiments of the present application is equally applicable to similar technical problems.
Based on the foregoing problems, embodiments of the present application provide a game control method, an apparatus, a computer device, and a storage medium, which can improve game experience of a game player. The following are detailed below. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments.
The embodiments of the present application provide a game control method, which may be executed by a terminal or a server, and the embodiments of the present application describe a game control method executed by a terminal as an example.
Referring to fig. 2, fig. 2 is a schematic flow chart of a game control method according to an embodiment of the present application. The specific flow of the game control method can be as follows:
101. an action control is provided on the graphical user interface.
In the embodiment of the application, a graphical user interface is provided through a terminal device, the graphical user interface at least comprises a part of a game scene of a target game and a target virtual object in the game scene, and the target virtual object is controlled by a current player. The graphical user interface is further provided with an action control, and the action control is used for triggering the target virtual object to execute a corresponding action in the game scene.
The action control may be of multiple types, and different action controls may be used to trigger different actions. For example, the action control may be a roll control, which may be used to trigger the target virtual object to perform a rolling action in the game scene.
For example, please refer to fig. 3, which is a schematic view of an application scenario of a game control method according to an embodiment of the present application. The graphical user interface shown in fig. 3 displays a game scene of the target game and the target virtual object controlled by the current player in that scene, and also provides an action control as well as other operation controls.
102. In response to a first operation on the action control, an executable region of a first action of the target virtual object in the game scene is determined based on the position of the target virtual object in the game scene.
The first operation is an operation performed by the current player on the action control; it may take various forms, such as a click, a press, or a slide.
The first action refers to an action corresponding to the action control, and the executable area refers to a range area in which the target virtual object can execute the first action in the game scene.
In some embodiments, the step of "determining an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene" may comprise the operations of:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance, to obtain the executable area.
The preset distance refers to the movement distance, pre-designed in the game, that the virtual object covers after executing the first action. For example, if the first action is a roll, the preset distance is the preset rolling distance, i.e., the distance the virtual object moves in one rolling action.
In this embodiment of the application, the first action performed by the target virtual object in the game scene may face any direction, so the range within which the target virtual object can perform the first action may be a circular area. The circular area is determined by taking the position of the target virtual object in the game scene as the center and the preset distance as the radius, yielding the executable area.
For example, referring to fig. 4, fig. 4 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 4, the position of the target virtual object in the game scene is a target position, and the preset distance of the first action is L; a circular area is determined in the game scene with the target position as the center and L as the radius, yielding the executable area.
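The circular-area determination above can be sketched as follows. This is a minimal Python illustration, not the patent's implementation; the function name, 2-D tuple positions, and membership-test framing are assumptions added for clarity.

```python
import math

def circular_executable_area_contains(obj_pos, preset_distance, point):
    """Check whether `point` lies inside the executable area: a circle
    centered at the target virtual object's position, with the preset
    movement distance of the first action as its radius."""
    dx = point[0] - obj_pos[0]
    dy = point[1] - obj_pos[1]
    return math.hypot(dx, dy) <= preset_distance

# Target position as the center, preset distance L = 5.0 as the radius.
print(circular_executable_area_contains((0.0, 0.0), 5.0, (3.0, 4.0)))  # True (distance 5)
print(circular_executable_area_contains((0.0, 0.0), 5.0, (4.0, 4.0)))  # False
```

Any direction around the object is treated equally here, matching the "first action may face any direction" rationale for using a circle.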
In some embodiments, to select a more accurate executable area, the step "determining an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene" may include the following operations:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
acquiring the current view direction of the target virtual object in the game scene;
selecting a sector area from the circular area based on the view direction and a preset angle, to obtain the executable area.
The view direction refers to the direction the target virtual object is facing in the game scene. The preset angle may be the view angle of the target virtual object in the game scene.
For example, referring to fig. 5, fig. 5 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 5, the position of the target virtual object in the game scene is a target position, and the preset distance of the first action is L; a circular area is determined in the game scene with the target position as the center and L as the radius. The current view direction of the target virtual object in the game scene is a first direction, and the view angle is a; a sector area is divided from the circular area according to the first direction and the view angle a, yielding the executable area.
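The sector selection above (cut a sector from the circular area, centered on the view direction and spanning the view angle) can be sketched as a membership test. This is an illustrative Python sketch with hypothetical names, working in 2-D with angles in degrees; the angle-wrapping convention is an assumption, not specified by the patent.

```python
import math

def in_sector(obj_pos, view_dir_deg, view_angle_deg, radius, point):
    """Check whether `point` lies in the executable sector: inside the circle
    of the given radius, and within half the view angle on either side of
    the view direction."""
    dx, dy = point[0] - obj_pos[0], point[1] - obj_pos[1]
    if math.hypot(dx, dy) > radius:
        return False
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    # Smallest signed angular difference between the bearing and the view direction.
    diff = abs((bearing - view_dir_deg + 180.0) % 360.0 - 180.0)
    return diff <= view_angle_deg / 2.0

# View direction: +x axis (0 deg), view angle a = 90 deg, radius L = 10.
print(in_sector((0.0, 0.0), 0.0, 90.0, 10.0, (5.0, 2.0)))   # True
print(in_sector((0.0, 0.0), 0.0, 90.0, 10.0, (0.0, 5.0)))   # False (off to the side)
print(in_sector((0.0, 0.0), 0.0, 90.0, 10.0, (20.0, 0.0)))  # False (out of range)
```

Restricting the circle to a forward-facing sector is what narrows the candidate shelters to those the player can actually see.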
In some embodiments, in order to control the virtual object to quickly perform the first action, the step "determining an executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene" may include the following operations:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
if a plurality of shelters exist in the circular area, determining a target shelter closest to the target virtual object from the plurality of shelters;
selecting, from the circular area, the area where the target shelter is located based on the positional relationship between the target shelter and the target virtual object in the game scene, to obtain the executable area.
A shelter refers to a virtual object in the game scene that the target virtual object can use to avoid attacks or injuries from other objects; for example, the shelter may be a building in the virtual scene.
Specifically, the positional relationship between the target shelter and the target virtual object includes their relative direction in the game scene and the relative distance between them; a partial area is then determined within the circular area according to the relative direction and a preset angle, yielding the executable area.
For example, referring to fig. 6, fig. 6 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 6, the position of the target virtual object in the game scene is a target position; a circular area is determined in the game scene with the target position as the center and the distance L as the radius. A plurality of shelters are detected in the circular area: a shelter A and a shelter B, of which the shelter closest to the target virtual object is shelter B.
Further, the relative direction between the target virtual object and the target shelter is determined to be a second direction; a sector area is divided from the circular area according to the second direction and a preset angle, yielding the executable area. In this way, the player can conveniently control the target virtual object to enter the shelter quickly so as to avoid attacks from other objects.
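The nearest-shelter step described above can be sketched as follows: filter the shelters to those inside the circular area, then take the minimum by distance. A hedged Python illustration; the dict-based shelter records and function name are assumptions for demonstration only.

```python
import math

def nearest_shelter(obj_pos, shelters, radius):
    """From the shelters inside the circular area, pick the one closest to
    the target virtual object; return None if none is in range."""
    def dist(shelter):
        sx, sy = shelter["pos"]
        return math.hypot(sx - obj_pos[0], sy - obj_pos[1])
    in_range = [s for s in shelters if dist(s) <= radius]
    return min(in_range, key=dist) if in_range else None

# Shelter A at distance 4, shelter B at distance ~1.41: B is the target shelter.
shelters = [{"name": "A", "pos": (4.0, 0.0)}, {"name": "B", "pos": (1.0, 1.0)}]
print(nearest_shelter((0.0, 0.0), shelters, 5.0)["name"])  # B
```

Choosing the closest shelter keeps the roll short, which is why this variant helps the player take cover quickly.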
In some embodiments, the graphical user interface may further include other virtual objects in the game scene, and the other virtual objects may belong to a different camp from the target virtual object; that is, the other virtual objects may be virtual objects controlled by other game players, or virtual objects set in the game scene, that can attack the target virtual object. Then, in order to select a more accurate executable area, the step "determining an executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene" may include the following operations:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
acquiring the target direction of the other virtual objects in the game scene relative to the target virtual object;
selecting a sector area from the circular area based on the target direction and a preset angle, to obtain the executable area.
The target direction refers to the relative direction between the other virtual objects and the target virtual object. The preset angle may be the view angle of the target virtual object in the game scene.
For example, referring to fig. 7, fig. 7 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 7, the position of the target virtual object in the game scene is a target position; a circular area is determined in the game scene with the target position as the center and the distance L as the radius. The position of another virtual object in the game scene is a position S, the relative direction between that virtual object and the target virtual object is a third direction, and the view angle of the target virtual object in the game scene is a; a sector area is divided from the circular area according to the third direction and the view angle a, yielding the executable area.
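This variant is the same sector cut as before, except the sector's axis is the direction from the target virtual object toward the hostile object rather than the view direction. A hedged Python sketch; the names are hypothetical, and orienting the sector toward (rather than away from) the hostile object is one possible reading of the target-direction description.

```python
import math

def threat_sector_contains(obj_pos, enemy_pos, preset_angle_deg, radius, point):
    """Sector membership test where the sector is centered on the direction
    from the target virtual object to another (hostile) virtual object."""
    target_dir = math.degrees(math.atan2(enemy_pos[1] - obj_pos[1],
                                         enemy_pos[0] - obj_pos[0]))
    dx, dy = point[0] - obj_pos[0], point[1] - obj_pos[1]
    if math.hypot(dx, dy) > radius:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    # Wrap-safe angular difference from the threat direction.
    diff = abs((bearing - target_dir + 180.0) % 360.0 - 180.0)
    return diff <= preset_angle_deg / 2.0

# Hostile object at position S = (10, 0): the sector opens toward +x.
print(threat_sector_contains((0.0, 0.0), (10.0, 0.0), 90.0, 5.0, (3.0, 1.0)))   # True
print(threat_sector_contains((0.0, 0.0), (10.0, 0.0), 90.0, 5.0, (-3.0, 0.0)))  # False
```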
103. When a shelter is present in the executable area, a squatable indicator icon corresponding to the shelter is displayed on the graphical user interface.
The squat-able indication icon is used to prompt the player that a shelter, behind which the target virtual object can squat, exists in the executable area of the current game scene.
For example, referring to fig. 8, fig. 8 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 8, the executable area is first determined according to the position of the target virtual object in the game scene; when a shelter is detected in the executable area, a squat-able indication icon is displayed near the shelter in the graphical user interface, prompting the current game player that the target virtual object can be controlled to squat within the shelter.
In some embodiments, when a plurality of shelters are present in the executable area, a squat-able indication icon may be displayed on each shelter in the graphical user interface.
For example, referring to fig. 9, fig. 9 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 9, a first shelter and a second shelter are present in the executable area; squat-able indication icons can be displayed on the first shelter and the second shelter respectively, prompting the current game player that the target virtual object can be controlled to squat in either the first shelter or the second shelter.
104. In response to the first operation ending, the target virtual object is controlled to perform the first action toward the shelter and then squat to enter the shelter.
In some embodiments, the first operation may be a pressing operation. When the end of the pressing operation is detected, if a shelter exists in the executable area, the target virtual object may be controlled to perform the first action and then squat in the shelter. In this way, a single operation can trigger a plurality of instructions, controlling the target virtual object to complete the first action and crouch in the shelter, which improves operational convenience for game players.
For example, referring to fig. 10, fig. 10 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 10, when the end of the first operation on the action control is detected, the target virtual object is controlled to perform the first action toward the shelter and then squat into the shelter.
In some embodiments, there may be a plurality of shelters in the executable area. In order to control the target virtual object to squat quickly, the step "in response to the first operation ending, controlling the target virtual object to perform the first action toward the shelter and then squat to enter the shelter" may include the following operations:
in response to the first operation ending, controlling the target virtual object to perform the first action toward the shelter closest to the target virtual object and then squat to enter that shelter.
Specifically, when the end of the first operation is detected, the shelter closest to the target virtual object is selected from the plurality of shelters in the executable area to obtain a target shelter, and the target virtual object is then controlled to execute the first action toward the target shelter and squat into it.
For example, referring to fig. 11, fig. 11 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 11, the executable area includes a first shelter and a second shelter, of which the one closest to the target virtual object may be the first shelter. When the end of the first operation on the action control is detected, the target virtual object is controlled to perform the first action toward the first shelter and then squat into the first shelter.
In some embodiments, to ensure that the target virtual object squats in the right place, the step "controlling the target virtual object to perform the first action toward the shelter and then squat to enter the shelter" may include the following operations:
determining an occlusion area of a shelter from a game scene;
controlling the target virtual object to perform the first action toward the shelter and then remain in a squatting state within the occlusion area of the shelter.
In the embodiment of the application, a shelter in a game scene may comprise a plurality of side areas; the occlusion area refers to the side area, among them, in which the target virtual object can reliably avoid an attack.
Specifically, the position of an attack object that can attack the target virtual object in the current game scene may be obtained; a side area where the target virtual object can avoid the attack is then determined, from the side areas of the shelter, according to the position of the attack object, thereby determining the occlusion area.
For example, referring to fig. 12, fig. 12 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the graphical user interface shown in fig. 12, a target virtual object and an attack object exist in the game scene, where the attack object can attack the target virtual object. The shelter in the executable area of the game scene may include a plurality of side areas: area (1), area (2), area (3), and area (4). The position information of the attack object is acquired; according to this information, the attack object is determined to be located in front of the target virtual object, so area (1) is selected from the side areas of the shelter as the area where the target virtual object can avoid the attack, and area (1) is taken as the occlusion area. The target virtual object is then controlled to perform the first action toward the shelter and remain in a squatting state in area (1) of the shelter.
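One way to pick the occlusion side is to give each side area an outward-facing normal and choose the side most opposed to the attacker's direction. This is a hedged Python sketch, not the patent's method: the outward-normal representation, the dot-product criterion, and all names are assumptions.

```python
def occlusion_side(shelter_pos, attacker_pos, sides):
    """Pick the side area of the shelter that faces away from the attacker:
    the side whose outward normal is most opposed to the direction from the
    shelter to the attacker (smallest dot product)."""
    ax = attacker_pos[0] - shelter_pos[0]
    ay = attacker_pos[1] - shelter_pos[1]
    def facing(side):
        nx, ny = side["normal"]
        return ax * nx + ay * ny  # dot product with the attacker direction
    return min(sides, key=facing)

# Four side areas of a rectangular shelter, labeled as in fig. 12.
sides = [
    {"name": "(1)", "normal": (0.0, -1.0)},  # south-facing side
    {"name": "(2)", "normal": (0.0, 1.0)},   # north-facing side
    {"name": "(3)", "normal": (-1.0, 0.0)},  # west-facing side
    {"name": "(4)", "normal": (1.0, 0.0)},   # east-facing side
]
# Attacker due north of the shelter: the south side (1) provides cover.
print(occlusion_side((0.0, 0.0), (0.0, 10.0), sides)["name"])  # (1)
```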
In some embodiments, if there is no shelter in the executable area, the method may further include the following step:
in response to a sliding operation continuous with the first operation, controlling the target virtual object to perform the first action in the sliding direction of the sliding operation.
The sliding operation continuous with the first operation refers to a sliding operation performed at the end of the first operation.
Specifically, when the end of the current player's first operation is detected, a sliding operation on the graphical user interface may control the target virtual object to execute the first action in the direction of the slide. The target virtual object is thus controlled to execute the first action in a direction selected by the player, avoiding attacks from other virtual objects.
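The slide-direction fallback reduces to computing a unit vector from the slide's start point to its end point. A minimal Python sketch with hypothetical names; how screen coordinates map to world directions (e.g. whether y points down) is an engine-specific assumption left out here.

```python
import math

def roll_direction_from_swipe(start, end):
    """When no shelter is in the executable area, the first action's direction
    follows the slide: a unit vector from the slide's start to its end point.
    Returns None for a zero-length slide (keep the current facing)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return None
    return (dx / length, dy / length)

# A slide 60 px straight along +y yields the unit direction (0, 1).
print(roll_direction_from_swipe((100.0, 100.0), (100.0, 160.0)))  # (0.0, 1.0)
```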
In some embodiments, to improve the interface space utilization, the method may further include the steps of:
in response to a second operation on the action control, controlling the target virtual object to be in a standing state after executing the first action.
The first operation and the second operation are different operations, and each may be used to trigger a different instruction on the action control.
Specifically, through a first operation on the action control, the target virtual object can be controlled to execute, in the game scene, the first action corresponding to the action control and then crouch in a shelter within the executable area; through a second operation on the action control, the target virtual object can be controlled to be in a standing state after executing that first action. Integrating different instructions into one action control reduces the number of controls on the graphical user interface and thereby improves the utilization of interface space.
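The one-control, multiple-instructions idea can be sketched as a small dispatcher. This is a hedged Python illustration: the string-based operation and instruction names are placeholders, and the concrete gestures (press-and-release vs. tap) are examples of what the first and second operations could be.

```python
def on_action_control(operation, shelter_in_area):
    """Dispatch different instruction sequences from one action control:
    the first operation (e.g. long press then release) rolls and squats
    behind a shelter if one is in the executable area; the second
    operation (e.g. a tap) rolls and returns to standing."""
    if operation == "first":
        if shelter_in_area:
            return ["roll_toward_shelter", "squat_in_shelter"]
        return ["roll"]  # no shelter: roll only (direction may follow a slide)
    if operation == "second":
        return ["roll", "stand"]
    return []

print(on_action_control("first", True))    # ['roll_toward_shelter', 'squat_in_shelter']
print(on_action_control("second", False))  # ['roll', 'stand']
```

Folding both behaviors into one control is what saves interface space relative to separate roll and crouch buttons.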
The embodiment of the application discloses a game control method, which comprises the following steps: providing an action control on a graphical user interface; in response to a first operation on the action control, determining an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene; when a shelter exists in the executable area, displaying a squat-able indication icon corresponding to the shelter on the graphical user interface; and in response to the first operation ending, controlling the target virtual object to perform the first action toward the shelter and then squat to enter the shelter. In this way, the game experience of the game player in the target game can be improved.
Based on the above description, the game control method of the present application is further illustrated below by way of example. Referring to fig. 13, fig. 13 is a schematic flow chart of another game control method provided in an embodiment of the present application. Taking the case where the game control method is applied to a terminal, the specific flow may be as follows:
201. The terminal displays a game interface of the target game.
In the embodiment of the application, a game interface is provided through a terminal device. The game interface includes part of the game scene of the target game and a target virtual object in the game scene, where the target virtual object may be a virtual object controlled by the current game player; the game interface is further provided with a rolling action control and other operation controls.
The rolling action control can be used to trigger the target virtual object to perform a rolling action in the game scene.
For example, referring to fig. 14, fig. 14 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. The game interface shown in fig. 14 displays part of the game scene of the target game and a target virtual object in the game scene, which may be a virtual object controlled by the current game player; a rolling action control and other operation controls are also provided in the game interface.
202. When the terminal detects that the current player presses the rolling action control, a rollable area is displayed on the game interface; if a shelter exists in the rollable area, a squat-able marker is displayed on the game interface.
The rollable area refers to the area within which the target virtual object can perform a rolling action in the game scene.
Specifically, a circular area is first determined in the game scene with the current position of the target virtual object as the center and the preset rolling distance as the radius. Meanwhile, to avoid being unable to determine the squat target when a plurality of shelters exist within rolling range of the target virtual object, a sector area may be determined within the circular area according to the view direction and view angle of the target virtual object, yielding the rollable area.
For example, referring to fig. 15, fig. 15 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the game interface shown in fig. 15, when the current player's pressing operation on the rolling action control is detected, a sector area is determined according to the position, view direction, and view angle of the target virtual object, yielding a rollable area that is displayed on the graphical user interface; by displaying the rollable area, the current player can observe whether a shelter that provides cover exists.
Meanwhile, if a shelter is detected in the rollable area, the squat-able marker may be displayed near the shelter's display area on the game interface.
For example, referring to fig. 16, fig. 16 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the game interface shown in fig. 16, a shelter is detected in the rollable area, and a squat-able marker is displayed near the shelter, indicating that the current player can control the target virtual object to squat in the shelter.
In some embodiments, if there is no shelter in the current rollable area and the current player wants to control the target virtual object to squat, the target virtual object may be controlled to move in the game scene by sliding a movement control on the game interface, with the rollable area updated as the target virtual object moves, until a shelter appears in the rollable area.
203. When the terminal detects that the current player's pressing operation has ended, the target virtual object is controlled to perform the rolling action toward the shelter and squat to enter the shelter.
For example, referring to fig. 17, fig. 17 is a schematic view of an application scenario of another game control method according to an embodiment of the present application. In the game interface shown in fig. 17, when the end of the current player's pressing operation is detected, the target virtual object may be controlled to perform a rolling action toward the shelter and then squat into it, completing the roll and taking cover in the shelter.
The embodiment of the application discloses a game control method, which comprises the following steps: the terminal displays a game interface of a target game; when the current player's pressing operation on the rolling action control is detected, a rollable area is displayed on the game interface; if a shelter exists in the rollable area, a squat-able marker is displayed on the game interface; and when the end of the current player's pressing operation is detected, the target virtual object is controlled to perform the rolling action toward the shelter and then squat to enter it, so that the game experience of the game player can be improved.
In order to better implement the game control method provided by the embodiment of the present application, the embodiment of the present application further provides a game control device based on the game control method. The terms have the same meanings as in the game control method above, and for implementation details, reference may be made to the description in the method embodiment.
Referring to fig. 18, fig. 18 is a block diagram of a game control device according to an embodiment of the present disclosure, where the device includes:
a providing unit 301, configured to provide an action control on the graphical user interface;
a determining unit 302, configured to determine, in response to a first operation on the action control, an executable region of a first action of the target virtual object in the game scene based on a position of the target virtual object in the game scene;
a display unit 303, configured to display, on the graphical user interface, a squat-able indication icon corresponding to a shelter when the shelter is present in the executable area;
a first control unit 304, configured to, in response to the first operation ending, control the target virtual object to perform the first action toward the shelter and then squat to enter the shelter.
In some embodiments, the determining unit comprises:
the first determining subunit is configured to determine a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance, so as to obtain the executable area.
In some embodiments, the determining unit 302 may include:
the second determining subunit is used for determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
the first acquisition subunit is used for acquiring the current view angle direction of the target virtual object in the game scene;
and the first selecting subunit is used for selecting a sector area from the circular area based on the view angle direction and a preset angle to obtain the executable area.
In some embodiments, the determining unit 302 may include:
the third determining subunit is used for determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
a fourth determining subunit, configured to determine, if multiple shelters exist in the circular area, a target shelter closest to the target virtual object from the multiple shelters;
and the second selecting subunit is configured to select, based on the position relationship between the target shelter and the target virtual object in the game scene, a region where the target shelter is located from the circular region, and obtain the executable region.
In some embodiments, the determining unit 302 may include:
a fifth determining subunit, configured to determine a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
the second acquiring subunit is configured to acquire a target direction of the other virtual objects in the game scene relative to the target virtual object;
and the third determining subunit is configured to select a sector area from the circular area based on the target direction and a preset angle, so as to obtain the executable area.
In some embodiments, the first control unit 304 may include:
a first control subunit, configured to, in response to the end of the first operation, control the target virtual object to perform the first action toward the shelter closest to the target virtual object and then squat to enter that shelter.
In some embodiments, the first control unit 304 may include:
a fourth determining subunit, configured to determine an occlusion area of the shelter from the game scene;
a third control subunit, configured to control the target virtual object to perform the first action toward the shelter and then remain in a squatting state within the occlusion area of the shelter.
In some embodiments, the apparatus may further comprise:
a second control unit configured to control the target virtual object to perform the first action in a sliding direction of the sliding operation in response to the sliding operation that is continuous with the first operation.
In some embodiments, the apparatus may further comprise:
and a third control unit, configured to control, in response to a second operation on the motion control, that the target virtual object is in a standing state after executing the first motion, where the first operation and the second operation are different.
The embodiment of the application discloses a game control device: the providing unit 301 provides an action control on a graphical user interface; the determining unit 302, in response to a first operation on the action control, determines an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene; when a shelter exists in the executable area, the display unit 303 displays a squat-able indication icon corresponding to the shelter on the graphical user interface; and the first control unit 304, in response to the first operation ending, controls the target virtual object to perform the first action toward the shelter and then squat to enter it. In this way, the game experience of the game player can be improved.
Correspondingly, the embodiment of the application also provides a computer device, which may be a terminal. As shown in fig. 19, fig. 19 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 600 includes a processor 601 having one or more processing cores, a memory 602 having one or more computer-readable storage media, and a computer program stored on the memory 602 and operable on the processor. The processor 601 is electrically connected to the memory 602. Those skilled in the art will appreciate that the configuration illustrated in the figure does not limit the computer device, which may include more or fewer components than those illustrated, combine certain components, or arrange the components differently.
The processor 601 is a control center of the computer apparatus 600, connects various parts of the entire computer apparatus 600 using various interfaces and lines, performs various functions of the computer apparatus 600 and processes data by running or loading software programs and/or modules stored in the memory 602, and calling data stored in the memory 602, thereby monitoring the computer apparatus 600 as a whole.
In the embodiment of the present application, the processor 601 in the computer device 600 loads instructions corresponding to processes of one or more applications into the memory 602, and the processor 601 executes the applications stored in the memory 602 according to the following steps, so as to implement various functions:
providing an action control on a graphical user interface;
in response to a first operation on the action control, determining an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene;
when the shelter exists in the executable area, displaying a squat indicator icon corresponding to the shelter on the graphical user interface;
in response to the first operation ending, controlling the target virtual object to perform the first action toward the shelter and then squat to enter the shelter.
In some embodiments, determining an executable region of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance, to obtain the executable area.
In some embodiments, determining an executable region of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
acquiring the current view direction of the target virtual object in the game scene;
selecting a sector area from the circular area based on the view direction and a preset angle, to obtain the executable area.
In some embodiments, determining an executable region of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
if a plurality of shelters exist in the circular area, determining a target shelter closest to the target virtual object from the plurality of shelters;
selecting, from the circular area, the area where the target shelter is located based on the positional relationship between the target shelter and the target virtual object in the game scene, to obtain the executable area.
In some embodiments, the graphical user interface includes other virtual objects in the game scene, the other virtual objects belonging to a different camp from the target virtual object;
determining an executable area of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene, comprising:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
acquiring the target direction of the other virtual objects in the game scene relative to the target virtual object;
selecting a sector area from the circular area based on the target direction and a preset angle, to obtain the executable area.
In some embodiments, a plurality of shelters exist in the executable area;
the controlling, in response to the first operation ending, the target virtual object to squat to enter the shelter after performing the first action toward the shelter includes:
in response to the first operation ending, controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter closest to the target virtual object.
In some embodiments, controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter includes:
determining an occlusion area of the shelter from the game scene;
and controlling the target virtual object to be located in the occlusion area of the shelter in a squatting state after performing the first action toward the shelter.
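One rough way to model the occlusion area, purely as an illustrative assumption (a real scene would use the engine's geometry and line-of-sight queries), is to treat the shelter as a circle and test whether it blocks the line of sight from an opposing object to a candidate position:

```python
import math

def is_occluded(point, enemy_pos, shelter_center, shelter_radius):
    """Return True if `point` lies in the shelter's occlusion area relative to
    `enemy_pos`, i.e. the line of sight from the enemy to the point passes
    through the shelter (modelled as a circle)."""
    px, py = point
    ex, ey = enemy_pos
    cx, cy = shelter_center
    dx, dy = ex - px, ey - py
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0:
        return math.hypot(cx - px, cy - py) <= shelter_radius
    # Closest point on the segment point->enemy to the shelter centre.
    t = max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / seg_len_sq))
    nearest = (px + t * dx, py + t * dy)
    return math.hypot(cx - nearest[0], cy - nearest[1]) <= shelter_radius
```

Positions for which this returns True would form the occlusion area in which the target virtual object squats.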
In some embodiments, no shelter exists in the executable area;
the method further includes:
in response to a sliding operation continuous with the first operation, controlling the target virtual object to perform the first action in the sliding direction of the sliding operation.
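Converting the sliding operation into an action direction can be sketched as below; the dead-zone threshold is an assumption not found in the patent, added only so that a jitter of a few pixels does not count as a slide:

```python
import math

def slide_to_direction(start, end, dead_zone=10.0):
    """Map a slide from screen point `start` to `end` (continuous with the
    first operation on the action control) to a unit direction in which the
    first action is performed. Returns None for slides shorter than
    `dead_zone` pixels."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    if length < dead_zone:
        return None  # too short to count as a sliding operation
    return (dx / length, dy / length)
```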
In some embodiments, the method further comprises:
and in response to a second operation on the action control, controlling the target virtual object to be in a standing state after performing the first action, wherein the first operation and the second operation are different operations.
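The patent leaves the concrete gesture pair open. As one hypothetical mapping, a long press could serve as the first operation (ending in a squat behind a shelter) and a quick tap as the second (ending standing); the 0.3 s threshold is likewise an assumption:

```python
def classify_operation(press_duration, long_press_threshold=0.3):
    """Map a touch on the action control to one of the two action
    instructions, by press duration (illustrative gesture pair only)."""
    if press_duration >= long_press_threshold:
        return "first_operation"   # perform first action, then squat into a shelter
    return "second_operation"      # perform first action, then remain standing
```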
In the present application, a plurality of action instructions are designed for the same action control of the game interface, and different action instructions are triggered by different player operations on that control. When a specified operation on the action control is detected, an action execution area is determined according to the position, in the game scene, of the target virtual object controlled by the player; when a shelter exists in the action execution area, the target virtual object can be controlled to squat into the shelter after completing the action corresponding to the action control. This improves operation convenience for game players and thereby improves the player's game experience.
For specific implementations of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Optionally, as shown in fig. 19, the computer device 600 further includes: a touch display screen 603, a radio frequency circuit 604, an audio circuit 605, an input unit 606, and a power supply 607. The processor 601 is electrically connected to the touch display screen 603, the radio frequency circuit 604, the audio circuit 605, the input unit 606, and the power supply 607. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 19 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The touch display screen 603 may be used to display a graphical user interface and to receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 603 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations performed by the user on or near it (for example, operations performed on or near the touch panel with a finger, a stylus, or any other suitable object or accessory), to generate corresponding operation instructions, and to cause the corresponding programs to be executed. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, and sends the coordinates to the processor 601, and it can also receive and execute commands sent by the processor 601. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 601 to determine the type of the touch event, and the processor 601 then provides a corresponding visual output on the display panel according to the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 603 to implement input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 603 can also serve as a part of the input unit 606 to implement an input function.
The radio frequency circuit 604 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or another computer device, and to exchange signals with the network device or the other computer device.
The audio circuit 605 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 605 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 605 and converted into audio data; the audio data is then output to the processor 601 for processing and subsequently transmitted, for example, to another computer device via the radio frequency circuit 604, or output to the memory 602 for further processing. The audio circuit 605 may also include an earphone jack to provide communication between peripheral earphones and the computer device.
The input unit 606 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 607 is used to supply power to the various components of the computer device 600. Optionally, the power supply 607 may be logically connected to the processor 601 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system. The power supply 607 may also include one or more direct-current or alternating-current power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
Although not shown in FIG. 19, the computer device 600 may further include a camera, a sensor, a wireless fidelity (Wi-Fi) module, a Bluetooth module, and the like, which are not described in detail here.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment can provide an action control on the graphical user interface; in response to a first operation on the action control, determine an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene; when a shelter exists in the executable area, display a squattable indicator icon corresponding to the shelter on the graphical user interface; and, in response to the first operation ending, control the target virtual object to squat to enter the shelter after performing the first action toward the shelter.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be completed by instructions, or by related hardware controlled by instructions; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium in which a computer program is stored; the computer program can be loaded by a processor to execute the steps in any of the game control methods provided in the present application. For example, the computer program may perform the following steps:
providing an action control on a graphical user interface;
in response to a first operation on the action control, determining an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene;
when a shelter exists in the executable area, displaying a squattable indicator icon corresponding to the shelter on the graphical user interface;
in response to the first operation ending, controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter.
In some embodiments, determining an executable area of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene comprises:
and determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance to obtain an executable area.
In some embodiments, determining an executable region of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
acquiring a current viewing-angle direction of the target virtual object in the game scene;
and selecting a sector area from the circular area based on the viewing-angle direction and a preset angle to obtain an executable area.
In some embodiments, determining an executable region of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
if a plurality of shelters exist in the circular area, determining a target shelter closest to the target virtual object from the plurality of shelters;
and selecting the area where the target shelter is located from the circular area based on the position relation between the target shelter and the target virtual object in the game scene to obtain an executable area.
In some embodiments, the graphical user interface includes other virtual objects in the game scene, the other virtual objects being in a different camp from the target virtual object;
determining an executable area of a first action of a target virtual object in a game scene based on a position of the target virtual object in the game scene, comprising:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
acquiring the target direction of the other virtual objects in the game scene relative to the target virtual object;
and selecting a sector area from the circular area based on the target direction and the preset angle to obtain an executable area.
In some embodiments, a plurality of shelters exist in the executable area;
the controlling, in response to the first operation ending, the target virtual object to squat to enter the shelter after performing the first action toward the shelter includes:
in response to the first operation ending, controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter closest to the target virtual object.
In some embodiments, controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter includes:
determining an occlusion area of the shelter from the game scene;
and controlling the target virtual object to be located in the occlusion area of the shelter in a squatting state after performing the first action toward the shelter.
In some embodiments, no shelter exists in the executable area;
the method further includes:
in response to a sliding operation continuous with the first operation, controlling the target virtual object to perform the first action in the sliding direction of the sliding operation.
In some embodiments, the method further comprises:
and in response to a second operation on the action control, controlling the target virtual object to be in a standing state after performing the first action, wherein the first operation and the second operation are different operations.
In the present application, a plurality of action instructions are designed for the same action control of the game interface, and different action instructions are triggered by different player operations on that control. When a specified operation on the action control is detected, an action execution area is determined according to the position, in the game scene, of the target virtual object controlled by the player; when a shelter exists in the action execution area, the target virtual object can be controlled to squat into the shelter after completing the action corresponding to the action control. This improves operation convenience for game players and thereby improves the player's game experience.
For specific implementations of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
Since the computer program stored in the storage medium can execute the steps in any of the game control methods provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any of those methods. For details, reference may be made to the foregoing embodiments; they are not repeated here.
The game control method and apparatus, storage medium, and computer device provided in the embodiments of the present application are described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application; the description of the above embodiments is only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the ideas of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (12)

1. A game control method, wherein a graphical user interface is provided through a terminal device, the graphical user interface includes at least a part of a game scene, and a target virtual object in the game scene, the target virtual object being controlled by a current player, the method comprising:
providing an action control on the graphical user interface;
in response to a first operation on the action control, determining an executable region of a first action of the target virtual object in the game scene based on a position of the target virtual object in the game scene;
when a shelter exists in the executable area, displaying a squattable indicator icon corresponding to the shelter on the graphical user interface;
in response to the first operation ending, controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter.
2. The method of claim 1, wherein determining an executable area for a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene comprises:
and determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance to obtain the executable area.
3. The method of claim 1, wherein determining an executable area for a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
acquiring a current viewing-angle direction of the target virtual object in the game scene;
and selecting a sector area from the circular area based on the viewing-angle direction and a preset angle to obtain the executable area.
4. The method of claim 1, wherein determining an executable area for a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
if a plurality of shelters exist in the circular area, determining a target shelter closest to the target virtual object from the plurality of shelters;
and selecting the area where the target shelter is located from the circular area based on the position relation between the target shelter and the target virtual object in the game scene to obtain the executable area.
5. The method of claim 1, wherein the graphical user interface comprises other virtual objects in the game scene, the other virtual objects being in a different camp from the target virtual object;
the determining an executable area of a first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene comprises:
determining a circular area in the game scene according to the position of the target virtual object in the game scene and a preset distance;
acquiring the target direction of the other virtual objects in the game scene relative to the target virtual object;
and selecting a sector area from the circular area based on the target direction and a preset angle to obtain the executable area.
6. The method of claim 1, wherein a plurality of shelters exist in the executable area;
the controlling, in response to the first operation ending, the target virtual object to squat to enter the shelter after performing the first action toward the shelter comprises:
in response to the first operation ending, controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter closest to the target virtual object.
7. The method of claim 1, wherein the controlling the target virtual object to squat to enter the shelter after performing the first action toward the shelter comprises:
determining an occlusion area of the shelter from the game scene;
and controlling the target virtual object to be located in the occlusion area of the shelter in a squatting state after performing the first action toward the shelter.
8. The method of claim 1, wherein no shelter exists in the executable area;
the method further comprises the following steps:
in response to a sliding operation continuous with the first operation, controlling the target virtual object to perform the first action in a sliding direction of the sliding operation.
9. The method of claim 1, further comprising:
and in response to a second operation on the action control, controlling the target virtual object to be in a standing state after performing the first action, wherein the first operation and the second operation are different operations.
10. A game control apparatus for providing a graphical user interface through a terminal device, the graphical user interface including at least a portion of a game scene and a target virtual object in the game scene, the target virtual object being controlled by a current player, the apparatus comprising:
the providing unit is used for providing an action control on the graphical user interface;
a determination unit, configured to determine, in response to a first operation on the action control, an executable region of a first action of the target virtual object in the game scene based on a position of the target virtual object in the game scene;
a display unit, configured to display, on the graphical user interface, a squattable indicator icon corresponding to a shelter when the shelter is present in the executable area;
a first control unit, configured to, in response to the first operation ending, control the target virtual object to squat to enter the shelter after performing the first action toward the shelter.
11. A computer device, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor implements the game control method of any one of claims 1 to 9 when executing the program.
12. A storage medium storing a plurality of instructions adapted to be loaded by a processor to perform a game control method according to any one of claims 1 to 9.
CN202211049847.XA 2022-08-30 2022-08-30 Game control method and device, computer equipment and storage medium Pending CN115382201A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211049847.XA CN115382201A (en) 2022-08-30 2022-08-30 Game control method and device, computer equipment and storage medium
PCT/CN2023/079122 WO2024045528A1 (en) 2022-08-30 2023-03-01 Game control method and apparatus, and computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211049847.XA CN115382201A (en) 2022-08-30 2022-08-30 Game control method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115382201A 2022-11-25

Family

ID=84124636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211049847.XA Pending CN115382201A (en) 2022-08-30 2022-08-30 Game control method and device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN115382201A (en)
WO (1) WO2024045528A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024045528A1 (en) * 2022-08-30 2024-03-07 网易(杭州)网络有限公司 Game control method and apparatus, and computer device and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021194463A (en) * 2020-06-18 2021-12-27 任天堂株式会社 Game program, game device, game processing control method, and game system
CN112263833A (en) * 2020-11-19 2021-01-26 网易(杭州)网络有限公司 Game control method and device
CN112822397B (en) * 2020-12-31 2022-07-05 上海米哈游天命科技有限公司 Game picture shooting method, device, equipment and storage medium
CN112774201B (en) * 2021-01-22 2023-04-07 北京字跳网络技术有限公司 Virtual character masking method and device, computer equipment and storage medium
CN114225416A (en) * 2021-12-16 2022-03-25 网易(杭州)网络有限公司 Game control method and device
CN114404944A (en) * 2022-01-20 2022-04-29 网易(杭州)网络有限公司 Method and device for controlling player character, electronic device and storage medium
CN115382201A (en) * 2022-08-30 2022-11-25 网易(杭州)网络有限公司 Game control method and device, computer equipment and storage medium


Also Published As

Publication number Publication date
WO2024045528A1 (en) 2024-03-07

Similar Documents

Publication Publication Date Title
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113426124A (en) Display control method and device in game, storage medium and computer equipment
WO2024011894A1 (en) Virtual-object control method and apparatus, and storage medium and computer device
CN113398590A (en) Sound processing method, sound processing device, computer equipment and storage medium
CN113082707A (en) Virtual object prompting method and device, storage medium and computer equipment
CN115040873A (en) Game grouping processing method and device, computer equipment and storage medium
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
WO2024045528A1 (en) Game control method and apparatus, and computer device and storage medium
WO2024103623A1 (en) Method and apparatus for marking virtual item, and computer device and storage medium
CN115501581A (en) Game control method and device, computer equipment and storage medium
CN115382202A (en) Game control method and device, computer equipment and storage medium
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN115193043A (en) Game information sending method and device, computer equipment and storage medium
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
CN115430150A (en) Game skill release method and device, computer equipment and storage medium
CN115193046A (en) Game display control method and device, computer equipment and storage medium
CN115501582A (en) Game interaction control method and device, computer equipment and storage medium
CN115040867A (en) Game card control method and device, computer equipment and storage medium
CN115337641A (en) Switching method and device of game props, computer equipment and storage medium
CN115068943A (en) Game card control method and device, computer equipment and storage medium
CN116999835A (en) Game control method, game control device, computer equipment and storage medium
CN115430151A (en) Game role control method and device, electronic equipment and readable storage medium
CN115212567A (en) Information processing method, information processing device, computer equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination