WO2024045528A1 - A game control method, apparatus, computer device, and storage medium


Info

Publication number
WO2024045528A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
target virtual
bunker
game scene
action
Prior art date
Application number
PCT/CN2023/079122
Other languages
English (en)
French (fr)
Inventor
苗浩琦
Original Assignee
网易(杭州)网络有限公司
Priority date
Filing date
Publication date
Application filed by 网易(杭州)网络有限公司
Publication of WO2024045528A1


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/822: Strategy games; Role-playing games
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, specially adapted for executing a specific type of game
    • A63F2300/807: Role playing or strategy games

Definitions

  • the present disclosure relates to the field of computer technology, and specifically to a game control method, device, computer equipment and storage medium.
  • in the related art, the game interface provides a roll control. When the player taps the control, the virtual character is triggered to complete a roll action and then remain standing. As for entering a bunker, when the character is close to the bunker, a crouch button is displayed on the main interface; after the player taps it, the character crouches behind the bunker.
  • Embodiments of the present disclosure provide a game control method, device, computer equipment and storage medium, which can solve the problem of cumbersome operation and poor experience when players control a virtual character to enter behind a bunker to crouch during the game.
  • an embodiment of the present disclosure provides a game control method, including:
  • providing an action control on a graphical user interface; in response to a first operation on the action control, determining, based on the position of the target virtual object in the game scene, the executable area of the first action of the target virtual object in the game scene; when there is a bunker in the executable area, displaying a crouching indication icon corresponding to the bunker on the graphical user interface; and, in response to the end of the first operation, controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker.
  • embodiments of the present disclosure also provide a game control device, including:
  • a providing unit, configured to provide an action control on the graphical user interface;
  • a determining unit, configured to determine, in response to the first operation on the action control, the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene;
  • a display unit configured to display a crouching indication icon corresponding to the bunker on the graphical user interface when there is a bunker in the executable area
  • a first control unit configured to, in response to the end of the first operation, control the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker.
  • embodiments of the present disclosure also provide a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the game control method of any embodiment of the present disclosure.
  • embodiments of the present disclosure also provide a storage medium that stores multiple instructions, and the instructions are suitable for being loaded by a processor to execute the above game control method.
  • multiple action instructions are designed for the same action control on the game interface, and different action instructions are triggered by the player performing different operations on the action control.
  • an action execution area is determined based on the position, in the game scene, of the target virtual object controlled by the player. When there is a bunker in the action execution area, the target virtual object can be controlled to complete the action corresponding to the action control and then crouch behind the bunker, improving the operational convenience of the game player and thereby the player's gaming experience, while reducing the operating cost and running time of the game in the terminal device and saving the power of the terminal device.
  • Figure 1 is a schematic scene diagram of a game control system provided by an embodiment of the present disclosure.
  • Figure 2 is a schematic flowchart of a game control method provided by an embodiment of the present disclosure.
  • Figure 3 is a schematic diagram of an application scenario of a game control method provided by an embodiment of the present disclosure.
  • Figure 4 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 5 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 6 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 7 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 8 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 9 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 10 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 11 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 12 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 13 is a schematic flowchart of another game control method provided by an embodiment of the present disclosure.
  • Figure 14 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 15 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 16 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 17 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • Figure 18 is a structural block diagram of a game control device provided by an embodiment of the present disclosure.
  • Figure 19 is a schematic structural diagram of a computer device provided by an embodiment of the present disclosure.
  • Embodiments of the present disclosure provide a game control method, device, storage medium and computer equipment.
  • the game control method of the embodiment of the present disclosure can be executed by a computer device, where the computer device can be a terminal or a server.
  • the terminal can be a terminal device such as a smartphone, tablet computer, notebook computer, touch screen, game console, personal computer (PC, Personal Computer), personal digital assistant (Personal Digital Assistant, PDA), etc.
  • the terminal can also include a client, which may be a game application client, a browser client carrying a game program, or an instant messaging client.
  • the server can be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware and software services, domain name services, security services, CDN, and big data and artificial intelligence platforms.
  • when the game control method is run on a terminal, the terminal device stores a game application program and is used to present the virtual scene in the game screen.
  • the terminal device is used to interact with the user through a graphical user interface, such as downloading, installing and running game applications through the terminal device.
  • the terminal device may provide the graphical user interface to the user in a variety of ways, for example, it may be rendered and displayed on the display screen of the terminal device, or the graphical user interface may be presented through holographic projection.
  • the terminal device may include a touch display screen and a processor
  • the touch display screen is used to present a graphical user interface and receive operating instructions generated by the user acting on the graphical user interface
  • the graphical user interface includes game screens
  • the processor is used to run the game, generate the graphical user interface, respond to the operating instructions, and control the display of the graphical user interface on the touch display screen.
  • when the game control method is run on a server, the game can be a cloud game.
  • Cloud gaming refers to a gaming method based on cloud computing.
  • the running body of the game application and the game screen presentation body are separated, and the storage and operation of the game control method are completed on the cloud game server.
  • the game screen presentation is completed on the cloud game client.
  • the cloud game client is mainly used for receiving and sending game data and presenting the game screen.
  • the cloud game client can be a display device that is close to the user side and has data transmission functions, such as a mobile terminal, a television, a computer, or a personal digital assistant (PDA), while the terminal device that processes the game data is the cloud game server in the cloud.
  • when playing a game, the user operates the cloud game client to send operating instructions to the cloud game server.
  • the cloud game server runs the game according to the operating instructions, encodes and compresses the game screen and other data, and returns them to the cloud game client through the network; finally, the cloud game client decodes and outputs the game screen.
  • FIG. 1 is a schematic scene diagram of a game control system provided by an embodiment of the present disclosure.
  • the system may include at least one terminal, at least one server, at least one database, and a network.
  • the terminal held by the user can connect to the servers of different games through the network.
  • a terminal is any device with computing hardware capable of supporting and executing a software product corresponding to a game.
  • the terminal has one or more multi-touch-sensitive screens for sensing and obtaining user input through touch or sliding operations performed at multiple points of the one or more touch-sensitive display screens.
  • the system includes multiple terminals, multiple servers, and multiple networks, different terminals can be connected to each other through different networks and different servers.
  • the network may be a wireless network or a wired network.
  • the wireless network may be a wireless local area network (WLAN), a local area network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc.
  • different terminals can also use their own Bluetooth network or hotspot network to connect to other terminals or connect to servers, etc.
  • multiple users can be online via different terminals connected via appropriate networks and synchronized with each other to support multi-player gaming.
  • the system may include multiple databases coupled to different servers, and information related to the game environment may be continuously stored in the databases while different users are playing multi-player games online.
  • Embodiments of the present disclosure provide a game control method, which can be executed by a terminal or a server.
  • the embodiment of the present disclosure takes the game control method executed by the terminal as an example to illustrate.
  • the terminal includes a touch display screen and a processor, and the touch display screen is used to present a graphical user interface and receive operating instructions generated by a user acting on the graphical user interface.
  • the graphical user interface can control the local content of the terminal in response to the received operation instructions, and can also control the content of the peer server in response to the received operation instructions.
  • the operation instructions generated by the user acting on the graphical user interface include instructions for starting the game application, and the processor is configured to start the game application after receiving the instruction provided by the user to start the game application. Additionally, the processor is configured to render and draw a graphical user interface associated with the game on the touch display.
  • a touch display is a multi-touch screen that can sense simultaneous touches or swipes at multiple points on the screen. The user uses fingers to perform touch operations on the graphical user interface. When the graphical user interface detects the touch operation, it controls different virtual objects in the game to perform actions corresponding to the touch operation.
  • the game may be any of a casual game, an action game, a role-playing game, a strategy game, a sports game, a puzzle game, and the like.
  • the game may include a virtual scene of the game.
  • the virtual scene of the game may include one or more virtual objects controlled by the user (or player), such as a virtual character.
  • the virtual scene of the game may also include one or more obstacles, such as railings, ravines, walls, etc., to limit the movement of virtual objects, for example, to limit the movement of one or more objects to a specific area within the virtual scene.
  • the virtual scene of the game also includes one or more elements, such as skills, points, object health status, energy, etc., to provide help to players, provide virtual services, increase points related to player performance, etc.
  • the graphical user interface may present one or more indicators to provide instructional information to the player.
  • a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy objects).
  • one or more other virtual objects are controlled by other players of the game.
  • one or more other virtual objects can be controlled by a computer, such as a robot using artificial intelligence (AI) algorithms, to achieve a human-machine battle mode.
  • virtual objects possess various skills or abilities that gamers use to achieve goals.
  • a virtual object possesses one or more weapons, props, tools, etc. that can be used to eliminate other objects from the game.
  • skills or abilities may be activated by a player of the game using one of a plurality of preset touch operations with the terminal's touch display.
  • the processor may be configured to present a corresponding game screen in response to an operation instruction generated by a user's touch operation.
  • embodiments of the present disclosure provide a first game control method, device, computer equipment and storage medium, which can improve the game experience of game players.
  • the embodiment of the present disclosure provides a game control method, which can be executed by a terminal or a server.
  • the embodiment of the present disclosure takes the game control method being executed by a terminal as an example for description.
  • FIG. 2 is a schematic flowchart of a game control method provided by an embodiment of the present disclosure.
  • the specific process of the game control method can be as follows:
  • a graphical user interface is provided through the terminal device, and the graphical user interface includes at least part of the game scene of the target game, and the target virtual object in the game scene, and the target virtual object is controlled by the current player.
  • the graphical user interface also provides action controls, which are used to trigger target virtual objects to perform corresponding actions in the game scene.
  • the action controls may include multiple types, and different action controls may be used to trigger the execution of different actions.
  • the action control can be a roll control, which can be used to trigger the target virtual object to perform a roll action in the game scene.
  • FIG. 3 is a schematic diagram of an application scenario of a game control method provided by an embodiment of the present disclosure.
  • the game scene of the target game is displayed, as well as the target virtual object controlled by the current player in the game scene, and action controls and other operation controls are also provided.
  • the first operation is the operation performed by the current player on the action control.
  • the first operation can be a variety of operations, such as clicking, pressing, sliding and other operations.
  • the first action refers to the action corresponding to the action control
  • the executable area refers to the range area in which the target virtual object can perform the first action in the game scene.
  • the step "determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene” may include the following operations:
  • according to the position of the target virtual object in the game scene and the preset distance, a circular area is determined in the game scene to obtain the executable area.
  • the preset distance refers to the pre-designed distance that the virtual object moves after performing the first action in the game.
  • for example, when the first action is a roll action, the preset distance refers to the preset rolling distance, that is, the distance the virtual object travels in one roll action.
  • the target virtual object can face any direction when performing the first action in the game scene, so the range in which the target virtual object can perform the first action can be a circular area: the position of the target virtual object in the game scene is taken as the center of the circle and the preset distance as the radius, and the executable area is determined accordingly.
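  • The circular-area check described above can be expressed in a few lines of code. The snippet below is a minimal, illustrative sketch (not part of the disclosure) assuming a 2D top-down coordinate system; names such as Vec2, ROLL_DISTANCE, and bunkers_in_executable_area are invented for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Vec2:
    x: float
    y: float

# Assumed preset distance L: how far one first action (e.g. a roll) moves the object.
ROLL_DISTANCE = 5.0

def in_circular_executable_area(target_pos: Vec2, point: Vec2,
                                radius: float = ROLL_DISTANCE) -> bool:
    """True if `point` lies inside the circle centered on the target virtual
    object's position, with the preset distance as the radius."""
    return math.hypot(point.x - target_pos.x, point.y - target_pos.y) <= radius

def bunkers_in_executable_area(target_pos: Vec2, bunkers: list[Vec2],
                               radius: float = ROLL_DISTANCE) -> list[Vec2]:
    """Filter the scene's bunkers down to those inside the executable area."""
    return [b for b in bunkers
            if in_circular_executable_area(target_pos, b, radius)]
```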
  • FIG. 4 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • the position of the target virtual object in the game scene is: target position.
  • the preset distance of the first action can be: L.
  • the target position can be taken as the center of the circle in the game scene and the distance L as the radius to determine a circular area, and the executable area is obtained.
  • the step "determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene” may include the following operations:
  • based on the current viewing angle direction of the target virtual object and a preset angle, a sector-shaped area is selected from the circular area to obtain the executable area.
  • the viewing angle direction refers to the direction in which the target virtual object faces in the game scene.
  • the preset angle may be the field of view angle of the target virtual object in the game scene.
  • FIG. 5 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • the position of the target virtual object in the game scene is: target position.
  • the preset distance of the first action can be: L.
  • the target position can be taken as the center of the circle in the game scene and the distance L as the radius to define a circular area.
  • the current viewing angle direction of the target virtual object in the game scene can be obtained as: the first direction, and the viewing angle can be a.
  • according to the first direction and the viewing angle a, a sector-shaped area is divided from the circular area to obtain the executable area.
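  • A point-in-sector test corresponding to this fan-shaped executable area could look like the sketch below. It is illustrative only, reuses the Vec2 and ROLL_DISTANCE assumptions from the previous snippet, works in radians, and treats the preset angle as the full field-of-view angle a.

```python
import math

def in_sector_executable_area(target_pos: Vec2, point: Vec2,
                              view_dir_rad: float, fov_rad: float,
                              radius: float = ROLL_DISTANCE) -> bool:
    """True if `point` lies inside the sector cut from the circular area by
    the current view direction and the preset angle."""
    dx, dy = point.x - target_pos.x, point.y - target_pos.y
    if math.hypot(dx, dy) > radius:            # outside the circular area
        return False
    angle_to_point = math.atan2(dy, dx)
    # Smallest signed difference between the two angles, mapped to [-pi, pi].
    diff = (angle_to_point - view_dir_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= fov_rad / 2            # within half the view angle on either side
```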
  • the step "determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene” may include the following operations :
  • based on the positional relationship between the target bunker and the target virtual object in the game scene, the area where the target bunker is located is selected from the circular area to obtain the executable area.
  • the bunker refers to the virtual object in the game scene that can be used for the target virtual object to avoid attacks or injuries from other objects.
  • the bunker can be a building in the virtual scene, etc.
  • the positional relationship between the target bunker and the target virtual object includes the relative direction of the target bunker and the target virtual object in the game scene, and the relative distance between the target bunker and the target virtual object. A partial area is then determined in the circular area according to the relative direction and the preset angle to obtain the executable area.
  • FIG. 6 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • the position of the target virtual object in the game scene is: the target position.
  • a circular area is determined with the target position as the center and the distance L as the radius. It is detected that there are multiple bunkers in the circular area, including bunker A and bunker B, and the bunker closest to the target virtual object, bunker B, is selected from the multiple bunkers as the target bunker.
  • the relative direction between the target virtual object and the target bunker is determined, for example, as the second direction; a sector-shaped area is then divided from the circular area according to this relative direction and the preset angle to obtain the executable area. This makes it easier for players to control the target virtual object to quickly enter the bunker and avoid attacks from other objects.
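  • Selecting the nearest bunker as the target bunker and aiming the sector toward it could be sketched as follows. This is illustrative only and reuses the helpers assumed in the previous snippets.

```python
import math

def nearest_bunker(target_pos: Vec2, bunkers: list[Vec2]) -> Vec2 | None:
    """Return the bunker closest to the target virtual object, or None."""
    if not bunkers:
        return None
    return min(bunkers,
               key=lambda b: math.hypot(b.x - target_pos.x, b.y - target_pos.y))

def sector_direction_toward_target_bunker(target_pos: Vec2, bunkers: list[Vec2],
                                          radius: float = ROLL_DISTANCE) -> float | None:
    """Direction (radians) in which the executable sector is aimed: toward the
    bunker closest to the target virtual object inside the circular area."""
    candidates = bunkers_in_executable_area(target_pos, bunkers, radius)
    target = nearest_bunker(target_pos, candidates)
    if target is None:
        return None
    return math.atan2(target.y - target_pos.y, target.x - target_pos.x)
```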
  • the graphical user interface may also include other virtual objects in the game scene.
  • the other virtual objects may be in different camps from the target virtual object; that is, the other virtual objects may be virtual objects controlled by other game players or by the game itself.
  • the step "determine the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene" may include the following operations:
  • based on the target direction and the preset angle, a sector-shaped area is selected from the circular area to obtain the executable area.
  • the target direction refers to the relative direction of other virtual objects and the target virtual object.
  • the preset angle may be the field of view angle of the target virtual object in the game scene.
  • FIG. 7 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • the position of the target virtual object in the game scene is: the target position.
  • a circular area is determined with the target position as the center and distance L as the radius.
  • the position of other virtual objects in the game scene can be: position S.
  • the relative direction between the other virtual object and the target virtual object can be obtained as: the third direction.
  • the field of view angle of the target virtual object in the game scene can be a; then, according to the third direction and the viewing angle a, a sector-shaped area is divided from the circular area to obtain the executable area.
  • the crouching indicator icon corresponding to the bunker is displayed on the graphical user interface.
  • the crouching indicator icon is used to remind the player that there is a bunker in which the target virtual object can crouch in the executable area in the current game scene.
  • FIG. 8 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • the executable area is first determined based on the position of the target virtual object in the game scene.
  • when a bunker is detected in the executable area, the crouching indicator icon is displayed in the graphical user interface near the bunker.
  • the crouching indicator icon can be used to remind the current game player that the target virtual object can be controlled to crouch in the bunker.
  • a crouching indicator icon may be displayed on each bunker in the graphical user interface.
  • FIG. 9 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • as shown in Figure 9, the crouching indicator icons can be displayed on the first bunker and the second bunker respectively, prompting that the current game player can control the target virtual object to crouch in the first bunker or the second bunker.
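  • The indicator logic above can be summarized by a small update routine: every bunker inside the executable area gets a crouching indicator icon and every other bunker does not. The ui.show_icon and ui.hide_icon calls below are hypothetical stand-ins for whatever the real UI layer provides; this is a sketch, not the patented implementation.

```python
def update_crouch_indicators(ui, target_pos: Vec2, bunkers: list[Vec2],
                             radius: float = ROLL_DISTANCE) -> None:
    """Show a crouching indicator icon next to each bunker in the executable
    area and hide it for bunkers outside that area."""
    reachable = bunkers_in_executable_area(target_pos, bunkers, radius)
    for bunker in bunkers:
        if bunker in reachable:
            ui.show_icon("crouch_indicator", anchor=bunker)   # displayed near the bunker
        else:
            ui.hide_icon("crouch_indicator", anchor=bunker)
```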
  • in response to the end of the first operation, the target virtual object is controlled to perform the first action toward the bunker and then crouch to enter the bunker.
  • the first operation may be a pressing operation.
  • when the end of the pressing operation is detected, the target virtual object may be controlled to perform the first action and then crouch in the bunker. In this way, multiple instructions can be triggered through a single operation to control the target virtual object to complete the first action and crouch in the bunker, improving the operational convenience of game players.
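  • Put together, the end-of-press handling might look like the sketch below: one release of the action control triggers both the first action toward the nearest bunker and the subsequent crouch, so no separate crouch button is needed. character.roll_toward, character.crouch_at, and character.roll_forward are hypothetical engine calls used only for illustration.

```python
def on_action_control_released(character, bunkers: list[Vec2],
                               radius: float = ROLL_DISTANCE) -> None:
    """Handle the end of the first operation on the action control."""
    reachable = bunkers_in_executable_area(character.position, bunkers, radius)
    target = nearest_bunker(character.position, reachable)
    if target is not None:
        character.roll_toward(target)   # perform the first action toward the bunker
        character.crouch_at(target)     # then crouch to enter the bunker
    else:
        character.roll_forward()        # no cover in range: plain roll, end standing
```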
  • FIG. 10 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • in the graphical user interface shown in Figure 10, it is detected that the first operation on the action control has ended, and the target virtual object is controlled to perform the first action toward the bunker and then crouch into the bunker.
  • in some embodiments, when there are multiple bunkers in the executable area, controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker can include the following operations:
  • in response to the end of the first operation, the target virtual object is controlled to perform the first action toward the bunker closest to the target virtual object and then crouch to enter the bunker.
  • FIG. 11 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • the executable area includes a first bunker and a second bunker, wherein the bunker closest to the target virtual object may be the first bunker. It is detected that the first operation on the action control has ended, and the target virtual object is controlled to perform the first action toward the first bunker and then crouch into the first bunker.
  • the step "controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker" may include the following operations:
  • the target virtual object is controlled to perform the first action toward the bunker and then remain in the occlusion area of the bunker in a crouching state.
  • the bunker in the game scene may include a multi-side area, where the occlusion area refers to a side area of the multi-side area of the bunker that can be used for the target virtual object to accurately avoid attacks.
  • the position of the attack object that can attack the target virtual object in the current game scene can be obtained, and then based on the position of the attack object, the one side area where the target virtual object can avoid the attack is determined from the multi-side areas of the bunker, thereby determining the occlusion area.
  • FIG. 12 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • the bunker in the executable area in the game scene can include multiple side areas: area 1, area 2, area 3, area 4.
  • the location information of the attack object is obtained, and it is determined based on this location information that the attack object is in front of the target virtual object. The side area of the bunker that can be used by the target virtual object to avoid the attack object's attack is then selected, for example area 1, and area 1 is used as the occlusion area. The target virtual object is then controlled to perform the first action toward the bunker and stay in area 1 of the bunker in a crouching state.
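  • One way to pick the occlusion area from the bunker's side areas is to compare each side's outward direction with the direction toward the attack object, as in the sketch below. The BunkerSide layout and the outward normals are assumptions made for illustration; the disclosure only requires that the chosen side keeps the bunker between the target virtual object and the attacker.

```python
import math
from dataclasses import dataclass

@dataclass
class BunkerSide:
    name: str      # e.g. "area 1"
    normal: Vec2   # unit vector pointing outward from this side of the bunker

def occlusion_side(bunker_center: Vec2, sides: list[BunkerSide],
                   attacker_pos: Vec2) -> BunkerSide:
    """Choose the side whose outward normal points most nearly away from the
    attack object, i.e. the side shielded by the bunker itself."""
    ax, ay = attacker_pos.x - bunker_center.x, attacker_pos.y - bunker_center.y
    length = math.hypot(ax, ay) or 1.0
    ax, ay = ax / length, ay / length
    # The best hiding side has the most negative dot product with the attacker direction.
    return min(sides, key=lambda s: s.normal.x * ax + s.normal.y * ay)
```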
  • the method may include the following steps:
  • in response to the sliding operation that is continuous with the first operation, the target virtual object is controlled to perform the first action along the sliding direction of the sliding operation.
  • the sliding operation that is continuous with the first operation refers to the sliding operation at the end of the first operation.
  • the target virtual object can be controlled to perform the first action along the sliding direction of the sliding operation, so that the target virtual object performs the first action in the direction selected by the player to avoid attacks from other virtual objects.
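  • A sliding operation that continues from the press can simply override the roll direction, as sketched below. screen_to_world_dir and character.roll_in_direction are hypothetical helpers; how a 2D swipe maps onto a direction in the game scene depends on the camera and is not specified by the disclosure.

```python
def on_slide_after_press(character, slide_start: Vec2, slide_end: Vec2,
                         screen_to_world_dir) -> None:
    """Perform the first action along the sliding direction of the sliding
    operation that is continuous with the first operation."""
    dx, dy = slide_end.x - slide_start.x, slide_end.y - slide_start.y
    if dx == 0 and dy == 0:
        return                                   # no usable slide direction
    direction = screen_to_world_dir(dx, dy)      # map screen-space swipe to scene direction
    character.roll_in_direction(direction)
```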
  • the method may also include the following steps:
  • in response to the second operation on the action control, the target virtual object is controlled to be in a standing state after performing the first action.
  • the first operation and the second operation are different operations, and the first operation and the second operation on the action control can be used to trigger different instructions respectively.
  • through the first operation on the action control, the target virtual object can be controlled to crouch in the bunker in the executable area after performing the first action corresponding to the action control in the game scene; through the second operation on the action control, the target virtual object can be controlled to be in a standing state after performing the first action corresponding to the action control in the game scene.
  • integrating different instructions into one action control reduces the number of controls set in the graphical user interface, thereby improving interface space utilization.
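  • The two operations on the same action control can be dispatched by the kind of input received, for example a press-and-release versus a short tap. The threshold below is purely an assumption (the disclosure only requires the two operations to differ), character.roll_forward and character.stand are hypothetical engine calls, and the sketch reuses the release handler shown earlier.

```python
LONG_PRESS_SECONDS = 0.3   # assumed threshold separating the two operations

def on_action_control_input(character, bunkers: list[Vec2],
                            press_duration: float) -> None:
    """Dispatch the first or second operation of the same action control."""
    if press_duration >= LONG_PRESS_SECONDS:
        # First operation: perform the first action toward the bunker, then crouch.
        on_action_control_released(character, bunkers)
    else:
        # Second operation: perform the first action and end in a standing state.
        character.roll_forward()
        character.stand()
```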
  • An embodiment of the present disclosure discloses a game control method.
  • the method includes: providing an action control on a graphical user interface; in response to a first operation on the action control, determining, based on the position of the target virtual object in the game scene, the executable area of the first action of the target virtual object in the game scene; when there is a bunker in the executable area, displaying the crouching indicator icon corresponding to the bunker on the graphical user interface; and, in response to the end of the first operation, controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker. In this way, the gaming experience of game players in the target game can be improved, which further reduces the operating cost and running time of the game in the terminal device and saves the power of the terminal device.
  • Figure 13 is a schematic flowchart of another game control method provided by an embodiment of the present disclosure. Taking the game control method specifically applied to a terminal as an example, the specific process can be as follows:
  • the terminal displays the game interface of the target game.
  • the game interface is provided through the terminal device.
  • the game interface includes part of the game scene of the target game and the target virtual object in the game scene.
  • the target virtual object can be a virtual object controlled by the current game player.
  • the game interface also provides a roll action control and other operation controls.
  • the rolling action control can be used to trigger the target virtual object to perform a rolling action in the game scene.
  • FIG. 14 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • in the game interface shown in Figure 14, part of the game scene of the target game and the target virtual object in the game scene are displayed.
  • the target virtual object can be a virtual object controlled by the current game player.
  • the game interface also provides a roll action control and other operation controls.
  • when the current player's pressing operation on the roll action control is detected, the rollable area is displayed on the game interface; if there is a bunker in the rollable area, a crouching mark is displayed on the game interface.
  • the rollable area refers to the area in which the target virtual object can perform the rolling action in the game scene.
  • the rollable area can be determined by taking the position of the current target virtual object in the game scene as the center of the circle and the preset rolling distance as the radius, so that a circular area is determined in the game scene.
  • a fan-shaped area can be determined from the circular area based on the field of view direction and field of view angle of the target virtual object to obtain the rollable area.
  • FIG. 15 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • as shown in Figure 15, the fan-shaped area is determined according to the position, field of view direction, and field of view angle of the target virtual object, the rollable area is obtained, and the graphical user interface displays the rollable area. By displaying the rollable area, the current player can observe whether there is a bunker that can provide cover.
  • the crouching mark can be displayed near the bunker display area on the game interface.
  • FIG. 16 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • when a bunker is detected in the rollable area, a crouching mark is displayed near the bunker, prompting the current player that the target virtual object can be controlled to crouch in the bunker.
  • the target virtual object can also be controlled to move in the game scene by sliding the movement control on the game interface, and the rollable area is updated according to the movement of the target virtual object until a bunker is present in the rollable area.
  • when the terminal detects that the current player's pressing operation ends, it controls the target virtual object to perform a rolling action toward the bunker and then crouch into the bunker.
  • FIG. 17 is a schematic diagram of an application scenario of another game control method provided by an embodiment of the present disclosure.
  • as shown in Figure 17, the target virtual object can be controlled to perform a rolling action toward the bunker and then crouch into the bunker, completing the rolling action and crouching behind the bunker.
  • the embodiment of the present disclosure discloses a game control method.
  • the method includes: the terminal displays the game interface of the target game; when the current player's pressing operation on the roll action control is detected, a rollable area is displayed on the game interface; and if there is a bunker in the rollable area, a crouching mark is displayed on the game interface.
  • when the end of the pressing operation is detected, the target virtual object is controlled to perform a rolling action toward the bunker and then crouch into the bunker. This can improve the game player's gaming experience, which further reduces the operating cost and running time of the game in the terminal device and saves the power of the terminal device.
  • the embodiment of the present disclosure also provides a game control device based on the above game control method.
  • the meanings of the terms are the same as those in the above game control method.
  • Figure 18 is a structural block diagram of a game control device provided by an embodiment of the present disclosure.
  • the device includes:
  • Providing unit 301 configured to provide action controls on the graphical user interface
  • a determining unit 302, configured to determine, in response to the first operation on the action control, the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene;
  • the display unit 303 is configured to display the crouching indication icon corresponding to the bunker on the graphical user interface when there is a bunker in the executable area;
  • the first control unit 304 is configured to, in response to the end of the first operation, control the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker.
  • the determining unit includes:
  • the first determination subunit is used to determine a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance to obtain the executable area.
  • the determining unit 302 may include:
  • a second determination subunit configured to determine a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance
  • the first acquisition subunit is used to acquire the current viewing angle direction of the target virtual object in the game scene
  • the first selection subunit is used to select a sector-shaped area from the circular area based on the viewing angle direction and the preset angle to obtain the executable area.
  • the determining unit 302 may include:
  • the third determination subunit is used to determine a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
  • the fourth determination subunit is used to determine the target bunker closest to the target virtual object from the multiple bunkers if there are multiple bunkers in the circular area;
  • the second selection subunit is used to select the area where the target bunker is located from the circular area based on the positional relationship between the target bunker and the target virtual object in the game scene to obtain the executable area.
  • the determining unit 302 may include:
  • the fifth determination subunit is used to determine a circular area in the game scene according to the position of the target virtual object in the game scene and the preset distance;
  • the second acquisition subunit is used to acquire the target direction of the other virtual object relative to the target virtual object in the game scene;
  • the third determination subunit is used to select a sector-shaped area from the circular area based on the target direction and the preset angle to obtain the executable area.
  • the first control unit 304 may include:
  • a first control subunit configured to, in response to the end of the first operation, control the target virtual object to perform the first action toward the bunker closest to the target virtual object and then crouch to enter the bunker.
  • the first control unit 304 may include:
  • the fourth determination subunit is used to determine the occlusion area of the bunker from the game scene
  • the third control subunit is used to control the target virtual object to perform the first action toward the bunker and then remain in the occlusion area of the bunker in a crouching state.
  • the device may further include:
  • a second control unit configured to control the target virtual object to perform the first action along the sliding direction of the sliding operation in response to a sliding operation that is continuous with the first operation.
  • the device may further include:
  • a third control unit configured to control the target virtual object to be in a standing state after performing the first action in response to a second operation on the action control, wherein the first operation and the second operation are different.
  • the embodiment of the present disclosure discloses a game control device. The providing unit 301 provides an action control on the graphical user interface; in response to the first operation on the action control, the determining unit 302 determines the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene; when there is a bunker in the executable area, the display unit 303 displays a crouching indication icon corresponding to the bunker on the graphical user interface; and, in response to the end of the first operation, the first control unit 304 controls the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker.
  • the gaming experience of gamers can be improved. This further reduces the operating cost and running time of the game in the terminal device, and saves the power of the terminal device.
  • FIG. 19 is a schematic structural diagram of a computer device provided by an embodiment of the present disclosure.
  • the computer device 600 includes a processor 601 having one or more processing cores, a memory 602 having one or more computer-readable storage media, and a computer program stored on the memory 602 and executable on the processor.
  • the processor 601 is electrically connected to the memory 602.
  • the structure of the computer equipment shown in the figures does not constitute a limitation on the computer equipment, and may include more or fewer components than shown in the figures, or combine certain components, or arrange different components.
  • the processor 601 is the control center of the computer device 600. It uses various interfaces and lines to connect the various parts of the entire computer device 600, and performs various functions of the computer device 600 and processes data by running or loading the software programs and/or modules stored in the memory 602 and calling the data stored in the memory 602, thereby monitoring the computer device 600 as a whole.
  • the processor 601 in the computer device 600 loads instructions corresponding to the processes of one or more application programs into the memory 602, and the processor 601 runs the application programs stored in the memory 602 to implement the following functions:
  • providing an action control on the graphical user interface; in response to a first operation on the action control, determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene; when there is a bunker in the executable area, displaying the crouching indicator icon corresponding to the bunker on the graphical user interface;
  • in response to the end of the first operation, controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker.
  • determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene includes:
  • according to the position of the target virtual object in the game scene and the preset distance, a circular area is determined in the game scene to obtain the executable area.
  • determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene includes:
  • based on the current viewing angle direction of the target virtual object and a preset angle, a sector-shaped area is selected from the circular area to obtain the executable area.
  • determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene includes:
  • based on the positional relationship between the target bunker and the target virtual object in the game scene, the area where the target bunker is located is selected from the circular area to obtain the executable area.
  • the graphical user interface includes other virtual objects in the game scene, and the other virtual objects are in different camps from the target virtual object;
  • Determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene including:
  • based on the target direction and the preset angle, a sector-shaped area is selected from the circular area to obtain the executable area.
  • in some embodiments, there are multiple bunkers in the executable area, and controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker includes:
  • in response to the end of the first operation, controlling the target virtual object to perform the first action toward the bunker closest to the target virtual object and then crouch to enter the bunker.
  • controlling the target virtual object to perform a first action toward the bunker and then crouch to enter the bunker includes:
  • the target virtual object is controlled to perform the first action toward the bunker and then remain in the occlusion area of the bunker in a crouching state.
  • the method also includes:
  • in response to the sliding operation that is continuous with the first operation, controlling the target virtual object to perform the first action along the sliding direction of the sliding operation.
  • the method further includes:
  • in response to the second operation on the action control, controlling the target virtual object to be in a standing state after performing the first action, wherein the first operation and the second operation are different.
  • multiple action instructions are designed for the same action control on the game interface, and different action instructions are triggered by the player performing different operations on the action control.
  • an action execution area is determined based on the position, in the game scene, of the target virtual object controlled by the player. When there is a bunker in the action execution area, the target virtual object can be controlled to complete the action corresponding to the action control and then crouch behind the bunker, improving the operational convenience of the game player and thereby the player's gaming experience. This further reduces the operating cost and running time of the game in the terminal device, and saves the power of the terminal device.
  • the computer device 600 also includes: a touch display screen 603, a radio frequency circuit 604, an audio circuit 605, an input unit 606 and a power supply 607.
  • the processor 601 is electrically connected to the touch display screen 603, the radio frequency circuit 604, the audio circuit 605, the input unit 606 and the power supply 607 respectively.
  • the structure of the computer equipment shown in Figure 19 does not constitute a limitation on the computer equipment, and may include more or fewer components than shown, or combine certain components, or arrange different components.
  • the touch display screen 603 can be used to display a graphical user interface and receive operation instructions generated by the user acting on the graphical user interface.
  • the touch display screen 603 may include a display panel and a touch panel.
  • the display panel can be used to display information input by the user or information provided to the user as well as various graphical user interfaces of the computer device. These graphical user interfaces can be composed of graphics, text, icons, videos, and any combination thereof.
  • the display panel can be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), organic light-emitting diode (OLED, Organic Light-Emitting Diode), etc.
  • the touch panel can be used to collect the user's touch operations on or near it (such as the user's operations on or near the touch panel using a finger, stylus, or any suitable object or accessory) and generate corresponding operation instructions, and the operation instructions trigger execution of the corresponding program.
  • the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact point coordinates, and sends them to the processor 601; it can also receive commands sent by the processor 601 and execute them.
  • the touch panel can cover the display panel.
  • when the touch panel detects a touch operation on or near it, the operation is sent to the processor 601 to determine the type of the touch event, and the processor 601 then provides corresponding visual output on the display panel according to the type of the touch event.
  • the touch panel and the display panel can be integrated into the touch display screen 603 to implement input and output functions.
  • in some embodiments, the touch panel and the display panel can be used as two independent components to implement the input and output functions respectively; that is, the touch display screen 603 can also be used as a part of the input unit 606 to implement the input function.
  • the radio frequency circuit 604 can be used to send and receive radio frequency signals to establish wireless communication with network equipment or other computer equipment through wireless communication, and to send and receive signals with network equipment or other computer equipment.
  • Audio circuit 605 may be used to provide an audio interface between the user and the computer device through speakers and microphones.
  • on the one hand, the audio circuit 605 can transmit the electrical signal converted from the received audio data to the speaker, which converts it into a sound signal and outputs it; on the other hand, the microphone converts the collected sound signal into an electrical signal, which is received by the audio circuit 605 and converted into audio data.
  • the audio data is processed by the audio data output processor 601 and then sent, for example, to another computer device via the radio frequency circuit 604, or output to the memory 602 for further processing.
  • Audio circuitry 605 may also include an earphone jack to provide communication of peripheral headphones to the computer device.
  • the input unit 606 can be used to receive input numbers, character information or user characteristic information (such as fingerprints, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical or trackball signal input related to user settings and function control.
  • Power supply 607 is used to power various components of computer device 600.
  • the power supply 607 can be logically connected to the processor 601 through a power management system, so that functions such as charging, discharging, and power consumption management can be implemented through the power management system.
  • Power supply 607 may also include one or more DC or AC power supplies, recharging systems, power failure detection circuits, power converters or inverters, power status indicators, and other arbitrary components.
  • the computer device 600 may also include a camera, a sensor, a wireless fidelity module, a Bluetooth module, etc., which will not be described again here.
  • the computer device can provide an action control on the graphical user interface; in response to the first operation on the action control, the executable area of the first action of the target virtual object in the game scene is determined based on the position of the target virtual object in the game scene; when there is a bunker in the executable area, the crouching indicator icon corresponding to the bunker is displayed on the graphical user interface; and, in response to the end of the first operation, the target virtual object is controlled to perform the first action toward the bunker and then crouch to enter cover.
  • embodiments of the present disclosure provide a computer-readable storage medium in which multiple computer programs are stored.
  • the computer programs can be loaded by a processor to execute any of the game control methods provided by the embodiments of the disclosure.
  • the computer program can perform the following steps: providing an action control on the graphical user interface; in response to a first operation on the action control, determining, based on the position of the target virtual object in the game scene, the executable area of the first action of the target virtual object in the game scene;
  • when there is a bunker in the executable area, displaying the crouch indicator icon corresponding to the bunker on the graphical user interface;
  • in response to the end of the first operation, controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker.
  • determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene includes:
  • a circular area is determined in the game scene according to the position of the target virtual object in the game scene and a preset distance, to obtain the executable area.
  • determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene includes:
  • a circular area is determined in the game scene according to the position of the target virtual object and a preset distance; the current view direction of the target virtual object in the game scene is obtained; and a sector-shaped area is selected from the circular area based on the view direction and a preset angle, to obtain the executable area.
  • determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene includes:
  • a circular area is determined in the game scene according to the position of the target virtual object and a preset distance; if there are multiple bunkers in the circular area, the target bunker closest to the target virtual object is determined from the multiple bunkers; and the area where the target bunker is located is selected from the circular area based on the positional relationship between the target bunker and the target virtual object in the game scene, to obtain the executable area.
  • the graphical user interface includes other virtual objects located in the game scene, and the other virtual objects are in a different camp from the target virtual object;
  • determining the executable area of the first action of the target virtual object in the game scene based on the position of the target virtual object in the game scene includes:
  • a circular area is determined in the game scene according to the position of the target virtual object and a preset distance; the target direction of the other virtual objects relative to the target virtual object in the game scene is obtained; and a sector-shaped area is selected from the circular area based on the target direction and a preset angle, to obtain the executable area (a code sketch of these area variants follows this list).
  • there are multiple bunkers in the executable area;
  • in response to the end of the first operation, controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker includes:
  • in response to the end of the first operation, the target virtual object is controlled to perform the first action toward the bunker closest to the target virtual object and then crouch to enter the bunker.
  • controlling the target virtual object to perform the first action toward the bunker and then crouch to enter the bunker includes:
  • the occlusion area of the bunker is determined from the game scene, and the target virtual object is controlled to perform the first action toward the bunker and then remain in the occlusion area of the bunker in a crouching state (see the cover-entry sketch after this list).
  • when there is no bunker in the executable area, the method further includes:
  • in response to a sliding operation that is continuous with the first operation, the target virtual object is controlled to perform the first action along the sliding direction of the sliding operation (see the fallback sketch after this list).
  • the method further includes:
  • in response to a second operation on the action control, the target virtual object is controlled to be in a standing state after performing the first action, wherein the first operation and the second operation are different.
  • in the embodiments of the present disclosure, multiple action instructions are designed for the same action control on the game interface, and different action instructions are triggered through different operations of the action control by the player;
  • when a specified operation on the action control is detected, an action execution area is determined according to the position, in the game scene, of the target virtual object controlled by the player; when there is a bunker in the action execution area, the target virtual object can be controlled to complete the action corresponding to the action control and then crouch behind the bunker, which improves the operational convenience for the player and thereby improves the gaming experience, and in turn reduces the operating cost and running time of the game on the terminal device and saves power of the terminal device.
  • the storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
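
By way of illustration only, the following minimal Python sketch shows how the three executable-area variants above (a circle of preset radius, a sector around a reference direction, and a region keyed to the nearest bunker) could be expressed. All names here (Vec2, roll_distance, half_angle, and so on) are hypothetical and are not taken from the disclosure.

```python
import math
from dataclasses import dataclass


@dataclass
class Vec2:
    x: float
    y: float

    def dist(self, other: "Vec2") -> float:
        return math.hypot(self.x - other.x, self.y - other.y)


def angle_to(src: Vec2, dst: Vec2) -> float:
    """Absolute angle (radians) of the vector pointing from src to dst."""
    return math.atan2(dst.y - src.y, dst.x - src.x)


def in_circle(player: Vec2, point: Vec2, roll_distance: float) -> bool:
    """Circular executable area: within the preset roll distance of the player."""
    return player.dist(point) <= roll_distance


def in_sector(player: Vec2, point: Vec2, roll_distance: float,
              facing: float, half_angle: float) -> bool:
    """Sector-shaped executable area: inside the circle and within a preset
    angle around a reference direction (view direction, direction toward an
    enemy object, or direction toward the nearest bunker)."""
    if not in_circle(player, point, roll_distance):
        return False
    delta = (angle_to(player, point) - facing + math.pi) % (2 * math.pi) - math.pi
    return abs(delta) <= half_angle


def nearest_bunker(player: Vec2, bunkers: list[Vec2],
                   roll_distance: float) -> Vec2 | None:
    """Among the bunkers inside the circular area, pick the closest one."""
    candidates = [b for b in bunkers if in_circle(player, b, roll_distance)]
    return min(candidates, key=player.dist, default=None)
```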
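
Continuing the previous sketch, the end-of-operation behavior (roll toward the nearest bunker, then crouch in the occlusion area on the side of the bunker facing away from the attacking object) could be wired up as follows; game.play_roll, game.set_pose, and the attacker argument are placeholder engine hooks, not the disclosed implementation.

```python
def occlusion_point(bunker: Vec2, attacker: Vec2, offset: float = 1.0) -> Vec2:
    """Pick a point on the side of the bunker facing away from the attacker,
    so the crouching character is shielded from the attacker's line of fire.
    The offset stands in for the bunker's half-width (assumed value)."""
    away = angle_to(attacker, bunker)  # direction from attacker through bunker
    return Vec2(bunker.x + offset * math.cos(away),
                bunker.y + offset * math.sin(away))


def on_first_operation_released(game, player: Vec2, bunkers: list[Vec2],
                                attacker: Vec2, roll_distance: float) -> None:
    """End of the first operation: roll toward the nearest bunker, then crouch
    in its occlusion area. game.play_roll and game.set_pose are assumed hooks."""
    target = nearest_bunker(player, bunkers, roll_distance)
    if target is None:
        return  # no bunker in the executable area; see the fallback sketch below
    game.play_roll(toward=target)                                  # first action
    game.set_pose("crouch", at=occlusion_point(target, attacker))  # enter cover
```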
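
Finally, a hedged sketch of the fallback behaviors of the same action control: when no bunker lies in the executable area, a swipe continuous with the first operation rolls the object along the swipe direction, and a distinct second operation performs the first action and leaves the object standing. The gesture fields and game calls are again assumed names.

```python
def on_action_control(game, gesture, has_bunker_in_area: bool) -> None:
    """Single-control dispatch; gesture fields and game.* calls are assumptions."""
    if gesture.kind == "first_operation_end":
        if not has_bunker_in_area and gesture.swipe_direction is not None:
            # A swipe continuous with the first operation: roll along the
            # swipe direction instead of toward a bunker.
            game.play_roll(direction=gesture.swipe_direction)
    elif gesture.kind == "second_operation":
        # A different operation on the same control: perform the first action,
        # then leave the character standing rather than crouching.
        game.play_roll(direction=game.facing())
        game.set_pose("stand")
```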

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A game control method, in which multiple action instructions are designed for a single action control and different action instructions are triggered by different operations on the action control. When a specified operation on the action control is detected, an action execution area is determined according to the position of a target virtual object; when a bunker exists in the action execution area, the target virtual object can be controlled to complete the action corresponding to the action control and then crouch in the bunker.

Description

一种游戏控制方法、装置、计算机设备及存储介质
本公开要求于2022年08月30日提交中国专利局、申请号为202211049847.X、发明名称为“一种游戏控制方法、装置、计算机设备及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本公开中。
技术领域
本公开涉及计算机技术领域,具体涉及一种游戏控制方法、装置、计算机设备及存储介质。
背景技术
随着互联网的发展,涌现出大量不同类型的游戏,以满足用户的日常娱乐需求。在以掩体射击为核心的游戏中,虚拟角色蹲伏于掩体内进行射击是最基本的作战方式。虚拟角色与掩体具有频繁且多样的交互,比如,虚拟角色蹲伏进入掩体。在游戏中,翻滚和进入掩体都是躲避敌人攻击最有效的方式。
相关技术中,在掩体射击游戏中,游戏界面提供翻滚控键,玩家点击控件后,触发虚拟角色完成一次翻滚动作然后保持站立状态。至于进入掩体的操作则是通过当角色靠近掩体时,主界面显示蹲伏键,点击后角色蹲伏于掩体后方。
技术问题
本公开实施例提供一种游戏控制方法、装置、计算机设备及存储介质,可以解决玩家游戏过程中控制虚拟角色进入掩体后方进行蹲伏,操作繁琐,体验差的问题。
技术解决方案
第一方面,本公开实施例提供了一种游戏控制方法,包括:
在所述图形用户界面上提供动作控件;
响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;
当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;
响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。
第二方面,本公开实施例还提供了一种游戏控制装置,包括:
提供单元,用于在所述图形用户界面上提供动作控件;
确定单元,用于响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;
显示单元,用于当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;
第一控制单元,用于响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。
第三方面,本公开实施例还提供了一种计算机设备,包括存储器,处理器及存储在储存器上并可在处理器上运行的计算机程序,其中,处理器执行本公开实施例任一提供的游戏控制方法。
第四方面,本公开实施例还提供了一种存储介质,存储介质存储有多条指令,指令适于处理器进行加载,以执行如上的游戏控制方法。
有益效果
本公开实施例通过对游戏界面的同一动作控件设计多个动作指令,通过玩家对该动作控件的不同操作触发不同的动作指令,当检测到玩家对动作控件的指定操作时,根据玩家控制的目标虚拟对象在游戏场景中的位置确定一动作执行区域,然后当动作执行区域中存在游戏掩体时,可以控制目标虚拟对象完成动作控件对应的动作后蹲伏于掩体中,提高游戏玩家的操作便捷性,从而提高玩家游戏体验,进而降低了终端设备中游戏的操作成本及运行时长,节省了终端设备的电量。
附图说明
为了更清楚地说明本公开实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本公开的一些实施例,对于本领域技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本公开实施例提供的一种游戏控制系统的场景示意图。
图2为本公开实施例提供的一种游戏控制方法的流程示意图。
图3为本公开实施例提供的一种游戏控制方法的应用场景示意图。
图4为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图5为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图6为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图7为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图8为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图9为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图10为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图11为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图12为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图13为本公开实施例提供的另一种游戏控制方法的流程示意图。
图14为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图15为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图16为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图17为本公开实施例提供的另一种游戏控制方法的应用场景示意图。
图18为本公开实施例提供的一种游戏控制装置的结构框图。
图19为本公开实施例提供的计算机设备的结构示意图。
本公开的实施方式
下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本公开的一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本公开保护的范围。
本公开实施例提供一种游戏控制方法、装置、存储介质及计算机设备。具体地,本公开实施例的游戏控制方法可以由计算机设备执行,其中,该计算机设备可以为终端或者服务器等设备。该终端可以为智能手机、平板电脑、笔记本电脑、触控屏幕、游戏机、个人计算机(PC,Personal Computer)、个人数字助理(Personal Digital Assistant,PDA)等终端设备,终端还可以包括客户端,该客户端可以是游戏应用客户端、携带有游戏程序的浏览器客户端或即时通信客户端等。服务器可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、CDN、以及大数据和人工智能平台等基础云计算服务的云服务器。
例如,当该游戏控制方法运行于终端时,终端设备存储有游戏应用程序并用于呈现游戏画面中的虚拟场景。终端设备用于通过图形用户界面与用户进行交互,例如通过终端设备下载安装游戏应用程序并运行。该终端设备将图形用户界面提供给用户的方式可以包括多种,例如,可以渲染显示在终端设备的显示屏上,或者,通过全息投影呈现图形用户界面。例如,终端设备可以包括触控显示屏和处理器,该触控显示屏用于呈现图形用户界面以及接收用户作用于图形用户界面产生的操作指令,该图形用户界面包括游戏画面,该处理器用于运行该游戏、生成图形用户界面、响应操作指令以及控制图形用户界面在触控显示屏上的显示。
例如,当该游戏控制方法运行于服务器时,可以为云游戏。云游戏是指以云计算为基础的游戏方式。在云游戏的运行模式下,游戏应用程序的运行主体和游戏画面呈现主体是分离的,游戏控制方法的储存与运行是在云游戏服务器上完成的。而游戏画面呈现是在云游戏的客户端完成的,云游戏客户端主要用于游戏数据的接收、发送以及游戏画面的呈现,例如,云游戏客户端可以是靠近用户侧的具有数据传输功能的显示设备,如,移动终端、电视机、计算机、掌上电脑、个人数字助理等,但是进行游戏数据处理的终端设备为云端的云游戏服务器。在进行游戏时,用户操作云游戏客户端向云游戏服务器发送操作指令,云游戏服务器根据操作指令运行游戏,将游戏画面等数据进行编码压缩,通过网络返回云游戏客户端,最后,通过云游戏客户端进行解码并输出游戏画面。
请参阅图1,图1为本公开实施例提供的一种游戏控制系统的场景示意图。该系统可以包括至少一个终端,至少一个服务器,至少一个数据库,以及网络。用户持有的终端可以通过网络连接到不同游戏的服务器。终端是具有计算硬件的任何设备,该计算硬件能够支持和执行与游戏对应的软件产品。另外,终端具有用于感测和获得用户通过在一个或者多个触控显示屏的多个点执行的触摸或者滑动操作的输入的一个或者多个多触敏屏幕。另外,当系统包括多个终端、多个服务器、多个网络时,不同的终端可以通过不同的网络、通过不同的服务器相互连接。网络可以是无线网络或者有线网络,比如无线网络为无线局域网(WLAN)、局域网(LAN)、蜂窝网络、2G网络、3G网络、4G网络、5G网络等。另外,不同的终端之间也可以使用自身的蓝牙网络或者热点网络连接到其他终端或者连接到服务器等。例如,多个用户可以通过不同的终端在线从而通过适当网络连接并且相互同步,以支持多玩家游戏。另外,该系统可以包括多个数据库,多个数据库耦合到不同的服务器,并且可以将与游戏环境有关的信息在不同用户在线进行多玩家游戏时连续地存储于数据库中。
本公开实施例提供了一种游戏控制方法,该方法可以由终端或服务器执行。本公开实施例以游戏控制方法由终端执行为例来进行说明。其中,该终端包括触控显示屏和处理器,该触控显示屏用于呈现图形用户界面以及接收用户作用于图形用户界面产生的操作指令。用户通过触控显示屏对图形用户界面进行操作时,该图形用户界面可以通过响应于接收到的操作指令控制终端本地的内容,也可以通过响应于接收到的操作指令控制对端服务器的内容。例如,用户作用于图形用户界面产生的操作指令包括用于启动游戏应用程序的指令,处理器被配置为在接收到用户提供的启动游戏应用程序的指令之后启动游戏应用程序。此外,处理器被配置为在触控显示屏上渲染和绘制与游戏相关联的图形用户界面。触控显示屏是能够感测屏幕上的多个点同时执行的触摸或者滑动操作的多触敏屏幕。用户在使用手指在图形用户界面上执行触控操作,图形用户界面在检测到触控操作时,控制游戏中的不同虚拟对象执行与触控操作对应的动作。例如,该游戏可以为休闲游戏、动作游戏、对象扮演游戏、策略游戏、体育游戏、益智游戏等游戏中的任一种。其中,游戏可以包括游戏的虚拟场景。此外,游戏的虚拟场景中可以包括由用户(或玩家)控制的一个或多个虚拟对象,诸如虚拟对象。另外,游戏的虚拟场景中还可以包括一个或多个障碍物,诸如栏杆、沟壑、墙壁等,以限制虚拟对象的移动,例如将一个或多个对象的移动限制到虚拟场景内的特定区域。可选地,游戏的虚拟场景还包括一个或多个元素,诸如技能、分值、对象健康状态、能量等,以向玩家提供帮助、提供虚拟服务、增加与玩家表现相关的分值等。此外,图形用户界面还可以呈现一个或多个指示器,以向玩家提供指示信息。例如,游戏可以包括玩家控制的虚拟对象和一个或多个其他虚拟对象(诸如敌方对象)。在一个实施例中,一个或多个其他虚拟对象由游戏的其他玩家控制。例如,一个或多个其他虚拟对象可以由计算机控制,诸如使用人工智能(AI)算法的机器人,实现人机对战模式。例如,虚拟对象拥有游戏玩家用来实现目标的各种技能或能力。例如虚拟对象拥有可用于从游戏中消除其他对象的一种或多种武器、道具、工具等。这样的技能或能力可由游戏的玩家使用与终端的触控显示屏的多个预设触控操作之一来激活。处理器可以被配置为响应于用户的触控操作产生的操作指令来呈现对应的游戏画面。
需要说明的是,图1所示的游戏控制系统的场景示意图仅仅是一个示例,本公开实施例描述的图像处理系统以及场景是为了更加清楚的说明本公开实施例的技术方案,并不构成对于本公开实施例提供的技术方案的限定,本领域普通技术人员可知,随着游戏控制系统的演变和新业务场景的出现,本公开实施例提供的技术方案对于类似的技术问题,同样适用。
基于上述问题,本公开实施例提供第一种游戏控制方法、装置、计算机设备及存储介质,可以提高游戏玩家的游戏体验。以下分别进行详细说明。需说明的是,以下实施例的描述顺序不作为对实施例优选顺序的限定。
本公开实施例提供一种游戏控制方法,该方法可以由终端或服务器执行,本公开实施例以游戏控制方法由终端执行为例来进行说明。
请参阅图2,图2为本公开实施例提供的一种游戏控制方法的流程示意图。该游戏控制方法的具体流程可以如下:
101、在图形用户界面上提供动作控件。
在本公开实施例中,通过终端设备提供图形用户界面,图形用户界面至少包括目标游戏的部分游戏场景,以及处于游戏场景中的目标虚拟对象,目标虚拟对象由当前玩家控制。图形用户界面还提供有动作控件,动作控件用于触发目标虚拟对象在游戏场景中执行相应的动作。
其中,动作控件可以包括多种,不同动作控件可以用于触发执行不同的动作。比如,动作控件可以为翻滚控件,则可以用于触发目标虚拟对象在游戏场景中执行翻滚动作。
例如,请参阅图3,图3为本公开实施例提供的一种游戏控制方法的应用场景示意图。在图3所示的图形用户界面中,显示有目标游戏的游戏场景,以及处于游戏场景中的由当前玩家控制的目标虚拟对象,还提供有动作控件,以及其他操作控件。
102、响应于对动作控件的第一操作,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域。
其中,第一操作也即当前玩家对动作控件执行的操作,第一操作可以为多种,比如,点击,按压,滑动等操作。
其中,第一动作指的是动作控件对应的动作,可执行区域指的是目标虚拟对象在游戏场景中可以执行第一动作的范围区域。
在一些实施例中,步骤“基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域”,可以包括以下操作:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域,得到可执行区域。
其中,预设距离指的是在游戏中预先设计虚拟对象执行第一动作后的移动距离,比如,若第一动作为翻滚,则预设距离指的是预设的翻滚距离,也即虚拟对象执行一次翻滚动作移动的距离。
在本公开实施例中,目标虚拟对象在游戏场景中执行第一动作可以朝向任意方向,则目标虚拟对象执行第一动作的范围区域可以为一圆形区域,该圆形区域以目标虚拟对象在游戏场景中的位置为圆心点,以预设距离为半径进行确定,从而得到可执行区域。
例如,请参阅图4,图4为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图4所示的图形用户界面中,目标虚拟对象在游戏场景中的位置为:目标位置,第一动作的预设距离可以为:L,则可以在游戏场景中以目标位置为圆心,以距离L为半径确定一圆形区域,得到可执行区域。
在一些实施例中,为了进一步选取准确的可执行区域,步骤“基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域”,可以包括以下操作:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
获取目标虚拟对象当前在游戏场景中的视角方向;
基于视角方向以及预设角度从圆形区域中选取一扇形区域,得到可执行区域。
其中,视角方向指的是目标虚拟对象在游戏场景中正面朝向的方向。预设角度可以为目标虚拟对象在游戏场景中的视野角度。
例如,请参阅图5,图5为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图5所示的图形用户界面中,目标虚拟对象在游戏场景中的位置为:目标位置,第一动作的预设距离可以为:L,则可以在游戏场景中以目标位置为圆心,以距离L为半径确定一圆形区域。获取目标虚拟对象当前在游戏场景中的视角方向可以为:第一方向,视野角度可以为a,则根据第一方向和视野角度a从圆形区域中划分一扇形区域,得到可执行区域。
在一些实施例中,为了控制虚拟对象快速执行第一动作,步骤“基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域”,可以包括以下操作:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
若圆形区域中存在多个掩体,则从多个掩体中确定与目标虚拟对象距离最近的目标掩体;
基于游戏场景中目标掩体与目标虚拟对象的位置关系,从圆形区域中选取目标掩体所处的区域,得到可执行区域。
其中,掩体指的是游戏场景中可以用于目标虚拟对象躲避其他对象攻击或者伤害的虚拟物体,比如,掩体可以为虚拟场景中的建筑物等。
具体的,目标掩体与目标虚拟对象的位置关系包括目标掩体与目标虚拟对象在游戏场景中的相对方向,以及目标掩体与目标虚拟对象之间的相对距离,然后,在圆形区域中根据相对方向,以及预设角度确定部分区域,得到可执行区域。
例如,请参阅图6,图6为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图6所示的图形用户界面中,目标虚拟对象在游戏场景中的位置为:目标位置,在游戏场景中以目标位置为圆心,以距离L为半径确定一圆形区域。检测到圆形区域中存在多个掩体,包括:掩体A和掩体B,从多个掩体中选取与目标虚拟对象距离最近的掩体为:掩体B。
进一步的,确定目标虚拟对象与目标掩体的相对方向可以为:第二方向,根据相对方向和预设角度从圆形区域中划分一扇形区域,得到可执行区域。从而可以方便玩家控制目标虚拟对象快速进入掩体,以躲避其他对象的攻击。
在一些实施例中,图形用户界面还可以包括处于游戏场景中的其他虚拟对象,其他虚拟对象可以与目标虚拟对象处于不同阵营,也即,其他虚拟对象可以为其他游戏玩家控制的虚拟对象或者游戏场景中设置的可以攻击目标虚拟对象的虚拟对象。则为了进一步选取准确的可执行区域,步骤“基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域”,可以包括以下操作:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
获取其他虚拟对象在游戏场景中相对于目标虚拟对象的目标方向;
基于目标方向以及预设角度从圆形区域中选取一扇形区域,得到可执行区域。
其中,目标方向指的是其他虚拟对象与目标虚拟对象的相对方向。预设角度可以为目标虚拟对象在游戏场景中的视野角度。
例如,请参阅图7,图7为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图7所示的图形用户界面中,目标虚拟对象在游戏场景中的位置为:目标位置,在游戏场景中以目标位置为圆心,以距离L为半径确定一圆形区域。其他虚拟对象在游戏场景中的位置可以为:位置S,获取其他虚拟对象与目标虚拟对象的相对方向可以为:第三方向,目标虚拟对象在游戏场景中的视野角度可以为:a,则根据第三方向和视野角度a从圆形区域中划分一扇形区域,得到可执行区域。
103、当可执行区域中存在掩体时,在图形用户界面显示掩体对应的可蹲伏指示图标。
其中,可蹲伏指示图标用于提示玩家当前游戏场景中的可执行区域中存在目标虚拟对象可以进行蹲伏的掩体。
例如,请参阅图8,图8为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图8所示的图形用户界面中,根据目标虚拟对象在游戏场景中的位置首先确定可执行区域,当检测到可执行区域中存在掩体时,在图形用户界面中该掩体附近区域显示可蹲伏指示标记,可以用于提示当前游戏玩家:可以控制目标虚拟对象蹲伏在该掩体中。
在一些实施例中,当可执行区域中存在多个掩体时,可以在图形用户界面显示中各掩体上显示可蹲伏指示图标。
例如,请参阅图9,图9为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图9所示的图形用户界面中,检测到可执行区域中存在:第一掩体和第二掩体,则可以分别在第一掩体以及第二掩体上显示可蹲伏指示图标,可以用于提示当前游戏玩家:可以控制目标虚拟对象蹲伏在第一掩体或者第二掩体中。
104、响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体。
在一些实施例中,第一操作可以为按压操作,当检测到按压操作结束,若可执行区域中存在掩体,则可以控制目标虚拟对象执行第一动作后蹲伏于该掩体中。以此,可以通过单个操作触发多个指令,控制目标虚拟对象完成第一动作以及蹲伏于掩体中,提高游戏玩家的操作便捷性。
例如,请参阅图10,图10为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图10所示的图形用户界面中,检测到对动作控件的第一操作结束,控制第一虚拟对象朝向掩体执行第一动作后蹲伏进入掩体。
在一些实施例中,可执行区域中可以存在多个掩体,为了控制目标虚拟对象快速进行蹲伏,步骤“响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体”,可以包括以下操作:
响应于第一操作结束,控制目标虚拟对象朝向与目标虚拟对象距离最近的掩体执行第一动作后蹲伏以进入掩体。
具体的,检测到第一操作结束,从处于可执行区域中的多个掩体中选取距离目标虚拟对象距离最近的掩体,得到目标掩体,然后控制目标虚拟对象朝向目标掩体执行第一动作后蹲伏进入目标掩体。
例如,请参阅图11,图11为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图11所示的图形用户界面中,可执行区域中包括:第一掩体和第二掩体,其中,与目标虚拟对象距离最近的掩体可以为第一掩体。检测到对动作控件的第一操作结束,控制第一虚拟对象朝向第一掩体执行第一动作后蹲伏进入第一掩体。
在一些实施例中,为了保证目标虚拟对象的蹲伏准确性,步骤“控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体”,可以包括以下操作:
从游戏场景中确定掩体的遮挡区域;
控制目标虚拟对象朝向掩体执行第一动作后以蹲伏状态处于掩体的遮挡区域。
在本公开实施例中,游戏场景中的掩体可以包括多侧区域,其中,遮挡区域指的是掩体的多侧区域中可以用于目标虚拟对象准确躲避攻击的一侧区域。
具体的,可以获取当前游戏场景中可以对目标虚拟对象进行攻击的攻击对象的位置,然后根据攻击对象的位置从掩体的多侧区域中确定目标虚拟对象可以躲避攻击的一侧区域,从而确定遮挡区域。
例如,请参阅图12,图12为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图12所示的图形用户界面中,游戏场景中存在目标虚拟对象与攻击对象,其中,攻击对象可以攻击目标虚拟对象。游戏场景中处于可执行区域的掩体可以包括多侧区域:区域①,区域②,区域③,区域④。获取攻击对象的位置信息,根据位置信息确定攻击对象处于目标虚拟对象的前方,则可以从掩体的多侧区域中选取可以用于目标虚拟对象躲避攻击对象的攻击的区域为区域①,则可以将区域①作为遮挡区域。然后控制目标虚拟对象朝向掩体执行第一动作后以蹲伏状态处于掩体的区域①中。
在一些实施例中,可执行区域中若不存在掩体,则该方法可以包括以下步骤:
响应于与第一操作连续的滑动操作,控制目标虚拟对象沿滑动操作的滑动方向执行第一动作。
其中,与第一操作连续的滑动操作指的是在第一操作结束时的滑动操作。
具体的,当检测到当前玩家结束第一操作时在图形用户界面的滑动操作,可以根据滑动操作的滑动方向控制目标虚拟对象朝向滑动方向执行第一动作,以此,控制目标虚拟对象朝向玩家选择的方向执行第一动作,避免其他虚拟对象的攻击。
在一些实施例中,为了提高界面空间利用率,该方法还可以包括以下步骤:
响应于对动作控件的第二操作,控制目标虚拟对象执行第一动作后处于站立状态。
其中,第一操作和第二操作为不同的操作,通过对动作控件的第一操作和第二操作分别可以用于触发不同的指令。
具体的,通过对动作控件的第一操作,可以控制目标虚拟对象在游戏场景中执行动作控件对应的第一动作后蹲伏于可执行区域的掩体中;通过对动作控件的第二操作可以控制目标虚拟对象在游戏场景中执行动作控件对应的第一动作后处于站立状态。通过一个动作控件集成不同的指令,减少图形用户界面设置的控件数量,从而可以提高界面空间利用率。
本公开实施例公开了一种游戏控制方法,该方法包括:在图形用户界面上提供动作控件;响应于对动作控件的第一操作,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域;当可执行区域中存在掩体时,在图形用户界面显示掩体对应的可蹲伏指示图标;响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体。以此,可以提高游戏玩家在目标游戏中的游戏体验。进而降低了终端设备中游戏的操作成本及运行时长,节省了终端设备的电量。
根据上述介绍的内容,下面将举例来进一步说明本公开的游戏控制方法。请参阅图13,图13为本公开实施例提供的另一种游戏控制方法的流程示意图,以该游戏控制方法具体应用于终端为例,具体流程可以如下:
201、终端显示目标游戏的游戏界面。
在本公开实施例中,通过终端设备提供游戏界面,游戏界面包括目标游戏的部分游戏场景,处于游戏场景中的目标虚拟对象,目标虚拟对象可以为当前游戏玩家控制的虚拟对象,游戏界面还提供有翻滚动作控件以及其他操作控件等。
其中,翻滚动作控件可以用于触发目标虚拟对象在游戏场景中执行翻滚动作。
例如,请参阅图14,图14为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图14所示的游戏界面中,显示有目标游戏的部分游戏场景,处于游戏场景中的目标虚拟对象,目标虚拟对象可以为当前游戏玩家控制的虚拟对象,游戏界面还提供有翻滚动作控件以及其他操作控件。
202、终端检测到当前玩家对翻滚动作控件的按压操作时,在游戏界面显示可翻滚区域,若可翻滚区域存在掩体,则在游戏界面显示蹲伏标记。
其中,可以翻滚区域指的是目标虚拟对象在游戏场景中可以执行翻滚动作的区域。
具体的,可翻滚区域可以根据当前目标虚拟对象在游戏场景中的位置为圆心,预设翻滚距离为半径,首先在游戏场景中确定一个圆形区域,同时,为了避免目标虚拟对象角色的可翻滚区域中存在多个掩体时无法确定蹲伏目标,则可以从圆形区域中根据目标虚拟对象的视野方向以及视野角度确定一扇形区域,得到可翻滚区域。
例如,请参阅图15,图15为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图15所示的游戏界面中,检测到当前玩家对翻滚动作控件的按压操作时,根据目标虚拟对象的位置,目标虚拟对象的视野方向和视野角度确定扇形区域,得到可翻滚区域,并在图形用户界面显示可翻滚区域,通过显示可翻滚区域,使得当前玩家可以观察是否存在可以遮挡的掩体。
同时,若检测到可翻滚区域中存在掩体,则可以在游戏界面的掩体显示区域附近显示该蹲伏标记。
例如,请参阅图16,图16为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图16所示的游戏界面中,检测到可翻滚区域中存在掩体,则在掩体附近显示可蹲伏标记,提示当前玩家可以控制目标虚拟对象蹲伏在掩体中。
在一些实施例中,若当前可翻滚区域中不存在掩体,而当前玩家想要控制目标虚拟对象进行蹲伏,则可以通过滑动游戏界面移动控件控制目标虚拟对象在游戏场景中移动,同时根据目标虚拟对象的移动更新可翻滚区域,直至可翻滚区域中存在掩体。
203、终端检测到当前玩家的按压操作结束时,控制目标虚拟对象朝向掩体执行翻滚动作后蹲伏进入掩体。
例如,请参阅图17,图17为本公开实施例提供的另一种游戏控制方法的应用场景示意图。在图17所示的游戏界面中,检测到当前玩家结束按压操作,则可以控制目标虚拟对象朝向掩体执行翻滚动作后蹲伏进入掩体,完成翻滚动作以及蹲伏于掩体。
本公开实施例公开了一种游戏控制方法,该方法包括:终端显示目标游戏的游戏界面,检测到当前玩家对翻滚动作控件的按压操作时,在游戏界面显示可翻滚区域,若可翻滚区域存在掩体,则在游戏界面显示蹲伏标记,检测到当前玩家的按压操作结束时,控制目标虚拟对象朝向掩体执行翻滚动作后蹲伏进入掩体,以此,可以提高游戏玩家的游戏体验。进而降低了终端设备中游戏的操作成本及运行时长,节省了终端设备的电量。
为便于更好的实施本公开实施例提供的游戏控制方法,本公开实施例还提供一种基于上述游戏控制方法的游戏控制装置。其中名词的含义与上述游戏控制方法中相同,具体实现细节可以参考方法实施例中的说明。
请参阅图18,图18为本公开实施例提供的一种游戏控制装置的结构框图,该装置包括:
提供单元301,用于在所述图形用户界面上提供动作控件;
确定单元302,用于响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;
显示单元303,用于当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;
第一控制单元304,用于响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。
在一些实施例中,确定单元包括:
第一确定子单元,用于根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域,得到所述可执行区域。
在一些实施例中,确定单元302可以包括:
第二确定子单元,用于根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
第一获取子单元,用于获取所述目标虚拟对象当前在所述游戏场景中的视角方向;
第一选取子单元,用于基于所述视角方向以及预设角度从所述圆形区域中选取一扇形区域,得到所述可执行区域。
在一些实施例中,确定单元302可以包括:
第三确定子单元,用于根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
第四确定子单元,用于若所述圆形区域中存在多个掩体,则从所述多个掩体中确定与所述目标虚拟对象距离最近的目标掩体;
第二选取子单元,用于基于所述游戏场景中所述目标掩体与所述目标虚拟对象的位置关系,从所述圆形区域中选取所述目标掩体所处的区域,得到所述可执行区域。
在一些实施例中,确定单元302可以包括:
第五确定子单元,根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
第二获取子单元,用于获取所述其他虚拟对象在所述游戏场景中相对于所述目标虚拟对象的目标方向;
第三确定子单元,用于基于所述目标方向以及预设角度从所述圆形区域中选取一扇形区域,得到所述可执行区域。
在一些实施例中,第一控制单元304可以包括:
第一控制子单元,用于响应于所述第一操作结束,控制所述目标虚拟对象朝向与所述目标虚拟对象距离最近的掩体执行所述第一动作后蹲伏以进入所述掩体。
在一些实施例中,第一控制单元304可以包括:
第四确定子单元,用于从所述游戏场景中确定所述掩体的遮挡区域;
第三控制子单元,用于控制所述目标虚拟对象朝向所述掩体执行所述第一动作后以蹲伏状态处于所述掩体的遮挡区域。
在一些实施例中,该装置还可以包括:
第二控制单元,用于响应于与所述第一操作连续的滑动操作,控制所述目标虚拟对象沿所述滑动操作的滑动方向执行所述第一动作。
在一些实施例中,该装置还可以包括:
第三控制单元,用于响应于对所述动作控件的第二操作,控制所述目标虚拟对象执行所述第一动作后处于站立状态,其中,所述第一操作和所述第二操作不同。
本公开实施例公开了一种游戏控制装置,通过提供单元301在所述图形用户界面上提供动作控件;确定单元302响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;显示单元303当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;第一控制单元304响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。以此,可以提高游戏玩家的游戏体验。进而降低了终端设备中游戏的操作成本及运行时长,节省了终端设备的电量。
相应的,本公开实施例还提供一种计算机设备,该计算机设备可以为终端。如图19所示,图19为本公开实施例提供的计算机设备的结构示意图。该计算机设备600包括有一个或者一个以上处理核心的处理器601、有一个或一个以上计算机可读存储介质的存储器602及存储在存储器602上并可在处理器上运行的计算机程序。其中,处理器601与存储器602电性连接。本领域技术人员可以理解,图中示出的计算机设备结构并不构成对计算机设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
处理器601是计算机设备600的控制中心,利用各种接口和线路连接整个计算机设备600的各个部分,通过运行或加载存储在存储器602内的软件程序和/或模块,以及调用存储在存储器602内的数据,执行计算机设备600的各种功能和处理数据,从而对计算机设备600进行整体监控。
在本公开实施例中,计算机设备600中的处理器601会按照如下的步骤,将一个或一个以上的应用程序的进程对应的指令加载到存储器602中,并由处理器601来运行存储在存储器602中的应用程序,从而实现各种功能:
在图形用户界面上提供动作控件;
响应于对动作控件的第一操作,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域;
当可执行区域中存在掩体时,在图形用户界面显示掩体对应的可蹲伏指示图标;
响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体。
在一些实施例中,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域,得到可执行区域。
在一些实施例中,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
获取目标虚拟对象当前在游戏场景中的视角方向;
基于视角方向以及预设角度从圆形区域中选取一扇形区域,得到可执行区域。
在一些实施例中,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
若圆形区域中存在多个掩体,则从多个掩体中确定与目标虚拟对象距离最近的目标掩体;
基于游戏场景中目标掩体与目标虚拟对象的位置关系,从圆形区域中选取目标掩体所处的区域,得到可执行区域。
在一些实施例中,图形用户界面包括处于游戏场景的其他虚拟对象,其他虚拟对象与目标虚拟对象处于不同阵营;
基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
获取其他虚拟对象在游戏场景中相对于目标虚拟对象的目标方向;
基于目标方向以及预设角度从圆形区域中选取一扇形区域,得到可执行区域。
在一些实施例中,可执行区域中存在多个掩体;
响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体,包括:
响应于第一操作结束,控制目标虚拟对象朝向与目标虚拟对象距离最近的掩体执行第一动作后蹲伏以进入掩体。
在一些实施例中,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体,包括:
从游戏场景中确定掩体的遮挡区域;
控制目标虚拟对象朝向掩体执行第一动作后以蹲伏状态处于掩体的遮挡区域。
在一些实施例中,可执行区域中不存在掩体;
该方法还包括:
响应于与第一操作连续的滑动操作,控制目标虚拟对象沿滑动操作的滑动方向执行第一动作。
在一些实施例中,该方法还包括:
响应于对动作控件的第二操作,控制目标虚拟对象执行第一动作后处于站立状态,其中,第一操作和第二操作不同。
本公开实施例通过对游戏界面的同一动作控件设计多个动作指令,通过玩家对该动作控件的不同操作触发不同的动作指令,当检测到玩家对动作控件的指定操作时,根据玩家控制的目标虚拟对象在游戏场景中的位置确定一动作执行区域,然后当动作执行区域中存在游戏掩体时,可以控制目标虚拟对象完成动作控件对应的动作后蹲伏于掩体中,提高游戏玩家的操作便捷性,从而提高玩家游戏体验。进而降低了终端设备中游戏的操作成本及运行时长,节省了终端设备的电量。
以上各个操作的具体实施可参见前面的实施例,在此不再赘述。
可选的,如图19所示,计算机设备600还包括:触控显示屏603、射频电路604、音频电路605、输入单元606以及电源607。其中,处理器601分别与触控显示屏603、射频电路604、音频电路605、输入单元606以及电源607电性连接。本领域技术人员可以理解,图19中示出的计算机设备结构并不构成对计算机设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
触控显示屏603可用于显示图形用户界面以及接收用户作用于图形用户界面产生的操作指令。触控显示屏603可以包括显示面板和触控面板。其中,显示面板可用于显示由用户输入的信息或提供给用户的信息以及计算机设备的各种图形用户接口,这些图形用户接口可以由图形、文本、图标、视频和其任意组合来构成。可选的,可以采用液晶显示器(LCD,Liquid Crystal Display)、有机发光二极管(OLED,Organic Light-Emitting Diode)等形式来配置显示面板。触控面板可用于收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板上或在触控面板附近的操作),并生成相应的操作指令,且操作指令执行对应程序。可选的,触控面板可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器601,并能接收处理器601发来的命令并加以执行。触控面板可覆盖显示面板,当触控面板检测到在其上或附近的触摸操作后,传送给处理器601以确定触摸事件的类型,随后处理器601根据触摸事件的类型在显示面板上提供相应的视觉输出。在本公开实施例中,可以将触控面板与显示面板集成到触控显示屏603而实现输入和输出功能。但是在某些实施例中,触控面板与触控面板可以作为两个独立的部件来实现输入和输出功能。即触控显示屏603也可以作为输入单元606的一部分实现输入功能。
射频电路604可用于收发射频信号,以通过无线通信与网络设备或其他计算机设备建立无线通讯,与网络设备或其他计算机设备之间收发信号。
音频电路605可以用于通过扬声器、传声器提供用户与计算机设备之间的音频接口。音频电路605可将接收到的音频数据转换后的电信号,传输到扬声器,由扬声器转换为声音信号输出;另一方面,传声器将收集的声音信号转换为电信号,由音频电路605接收后转换为音频数据,再将音频数据输出处理器601处理后,经射频电路604以发送给比如另一计算机设备,或者将音频数据输出至存储器602以便进一步处理。音频电路605还可能包括耳塞插孔,以提供外设耳机与计算机设备的通信。
输入单元606可用于接收输入的数字、字符信息或用户特征信息(例如指纹、虹膜、面部信息等),以及产生与用户设置以及功能控制有关的键盘、鼠标、操作杆、光学或者轨迹球信号输入。
电源607用于给计算机设备600的各个部件供电。可选的,电源607可以通过电源管理系统与处理器601逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。电源607还可以包括一个或一个以上的直流或交流电源、再充电系统、电源故障检测电路、电源转换器或者逆变器、电源状态指示器等任意组件。
尽管图19中未示出,计算机设备600还可以包括摄像头、传感器、无线保真模块、蓝牙模块等,在此不再赘述。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
由上可知,本实施例提供的计算机设备,可以在图形用户界面上提供动作控件;响应于对动作控件的第一操作,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域;当可执行区域中存在掩体时,在图形用户界面显示掩体对应的可蹲伏指示图标;响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体。
本领域普通技术人员可以理解,上述实施例的各种方法中的全部或部分步骤可以通过指令来完成,或通过指令控制相关的硬件来完成,该指令可以存储于一计算机可读存储介质中,并由处理器进行加载和执行。
为此,本公开实施例提供一种计算机可读存储介质,其中存储有多条计算机程序,该计算机程序能够被处理器进行加载,以执行本公开实施例所提供的任一种游戏控制方法中的步骤。例如,该计算机程序可以执行如下步骤:
在图形用户界面上提供动作控件;
响应于对动作控件的第一操作,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域;
当可执行区域中存在掩体时,在图形用户界面显示掩体对应的可蹲伏指示图标;
响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体。
在一些实施例中,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域,得到可执行区域。
在一些实施例中,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
获取目标虚拟对象当前在游戏场景中的视角方向;
基于视角方向以及预设角度从圆形区域中选取一扇形区域,得到可执行区域。
在一些实施例中,基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
若圆形区域中存在多个掩体,则从多个掩体中确定与目标虚拟对象距离最近的目标掩体;
基于游戏场景中目标掩体与目标虚拟对象的位置关系,从圆形区域中选取目标掩体所处的区域,得到可执行区域。
在一些实施例中,图形用户界面包括处于游戏场景的其他虚拟对象,其他虚拟对象与目标虚拟对象处于不同阵营;
基于目标虚拟对象在游戏场景中的位置确定目标虚拟对象在游戏场景中的第一动作的可执行区域,包括:
根据目标虚拟对象在游戏场景中的位置,以及预设距离在游戏场景中确定一圆形区域;
获取其他虚拟对象在游戏场景中相对于目标虚拟对象的目标方向;
基于目标方向以及预设角度从圆形区域中选取一扇形区域,得到可执行区域。
在一些实施例中,可执行区域中存在多个掩体;
响应于第一操作结束,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体,包括:
响应于第一操作结束,控制目标虚拟对象朝向与目标虚拟对象距离最近的掩体执行第一动作后蹲伏以进入掩体。
在一些实施例中,控制目标虚拟对象朝向掩体执行第一动作后蹲伏以进入掩体,包括:
从游戏场景中确定掩体的遮挡区域;
控制目标虚拟对象朝向掩体执行第一动作后以蹲伏状态处于掩体的遮挡区域。
在一些实施例中,可执行区域中不存在掩体;
该方法还包括:
响应于与第一操作连续的滑动操作,控制目标虚拟对象沿滑动操作的滑动方向执行第一动作。
在一些实施例中,该方法还包括:
响应于对动作控件的第二操作,控制目标虚拟对象执行第一动作后处于站立状态,其中,第一操作和第二操作不同。
本公开实施例通过对游戏界面的同一动作控件设计多个动作指令,通过玩家对该动作控件的不同操作触发不同的动作指令,当检测到玩家对动作控件的指定操作时,根据玩家控制的目标虚拟对象在游戏场景中的位置确定一动作执行区域,然后当动作执行区域中存在游戏掩体时,可以控制目标虚拟对象完成动作控件对应的动作后蹲伏于掩体中,提高游戏玩家的操作便捷性,从而提高玩家游戏体验。进而降低了终端设备中游戏的操作成本及运行时长,节省了终端设备的电量。
以上各个操作的具体实施可参见前面的实施例,在此不再赘述。
其中,该存储介质可以包括:只读存储器(ROM,Read Only Memory)、随机存取记忆体(RAM,Random Access Memory)、磁盘或光盘等。
由于该存储介质中所存储的计算机程序,可以执行本公开实施例所提供的任一种游戏控制方法中的步骤,因此,可以实现本公开实施例所提供的任一种游戏控制方法所能实现的有益效果,详见前面的实施例,在此不再赘述。
以上对本公开实施例所提供的一种游戏控制方法、装置、存储介质及计算机设备进行了详细介绍,本文中应用了具体个例对本公开的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本公开的方法及其核心思想;同时,对于本领域的技术人员,依据本公开的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本公开的限制。

Claims (20)

  1. 一种游戏控制方法,通过终端设备提供图形用户界面,所述图形用户界面至少包括部分游戏场景,以及处于所述游戏场景中的目标虚拟对象,所述目标虚拟对象由当前玩家控制,所述方法包括:
    在所述图形用户界面上提供动作控件;
    响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;
    当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;
    响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。
  2. 根据权利要求1所述的方法,其中,所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域,得到所述可执行区域。
  3. 根据权利要求1所述的方法,其中,所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
    获取所述目标虚拟对象当前在所述游戏场景中的视角方向;
    基于所述视角方向以及预设角度从所述圆形区域中选取一扇形区域,得到所述可执行区域。
  4. 根据权利要求1所述的方法,其中,所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
    若所述圆形区域中存在多个掩体,则从所述多个掩体中确定与所述目标虚拟对象距离最近的目标掩体;
    基于所述游戏场景中所述目标掩体与所述目标虚拟对象的位置关系,从所述圆形区域中选取所述目标掩体所处的区域,得到所述可执行区域。
  5. 根据权利要求1所述的方法,其中,所述图形用户界面包括处于所述游戏场景的其他虚拟对象,所述其他虚拟对象与所述目标虚拟对象处于不同阵营;
    所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
    获取所述其他虚拟对象在所述游戏场景中相对于所述目标虚拟对象的目标方向;
    基于所述目标方向以及预设角度从所述圆形区域中选取一扇形区域,得到所述可执行区域。
  6. 根据权利要求1所述的方法,其中,所述可执行区域中存在多个掩体;
    所述响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体,包括:
    响应于所述第一操作结束,控制所述目标虚拟对象朝向与所述目标虚拟对象距离最近的掩体执行所述第一动作后蹲伏以进入所述掩体。
  7. 根据权利要求1所述的方法,其中,所述控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体,包括:
    从所述游戏场景中确定所述掩体的遮挡区域;
    控制所述目标虚拟对象朝向所述掩体执行所述第一动作后以蹲伏状态处于所述掩体的遮挡区域。
  8. 根据权利要求1所述的方法,其中,所述可执行区域中不存在掩体;
    所述方法还包括:
    响应于与所述第一操作连续的滑动操作,控制所述目标虚拟对象沿所述滑动操作的滑动方向执行所述第一动作。
  9. 根据权利要求1所述的方法,其中,所述方法还包括:
    响应于对所述动作控件的第二操作,控制所述目标虚拟对象执行所述第一动作后处于站立状态,其中,所述第一操作和所述第二操作不同。
  10. 一种游戏控制装置,通过终端设备提供图形用户界面,所述图形用户界面至少包括部分游戏场景,以及处于所述游戏场景中的目标虚拟对象,所述目标虚拟对象由当前玩家控制,所述装置包括:
    提供单元,用于在所述图形用户界面上提供动作控件;
    确定单元,用于响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;
    显示单元,用于当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;
    第一控制单元,用于响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。
  11. 一种计算机设备,包括存储器,处理器及存储在存储器上并在处理器上运行的计算机程序,所述处理器执行所述程序时实现如下步骤:
    在所述图形用户界面上提供动作控件;
    响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;
    当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;
    响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。
  12. 一种存储介质,所述存储介质存储有多条指令,所述指令适于处理器进行加载,以执行如下步骤:
    在所述图形用户界面上提供动作控件;
    响应于对所述动作控件的第一操作,基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域;
    当所述可执行区域中存在掩体时,在所述图形用户界面显示所述掩体对应的可蹲伏指示图标;
    响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体。
  13. 根据权利要求12所述的存储介质,其中,所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域,得到所述可执行区域。
  14. 根据权利要求12所述的存储介质,其中,所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
    获取所述目标虚拟对象当前在所述游戏场景中的视角方向;
    基于所述视角方向以及预设角度从所述圆形区域中选取一扇形区域,得到所述可执行区域。
  15. 根据权利要求12所述的存储介质,其中,所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
    若所述圆形区域中存在多个掩体,则从所述多个掩体中确定与所述目标虚拟对象距离最近的目标掩体;
    基于所述游戏场景中所述目标掩体与所述目标虚拟对象的位置关系,从所述圆形区域中选取所述目标掩体所处的区域,得到所述可执行区域。
  16. 根据权利要求12所述的存储介质,其中,所述图形用户界面包括处于所述游戏场景的其他虚拟对象,所述其他虚拟对象与所述目标虚拟对象处于不同阵营;
    所述基于所述目标虚拟对象在所述游戏场景中的位置确定所述目标虚拟对象在所述游戏场景中的第一动作的可执行区域,包括:
    根据所述目标虚拟对象在所述游戏场景中的位置,以及预设距离在所述游戏场景中确定一圆形区域;
    获取所述其他虚拟对象在所述游戏场景中相对于所述目标虚拟对象的目标方向;
    基于所述目标方向以及预设角度从所述圆形区域中选取一扇形区域,得到所述可执行区域。
  17. 根据权利要求12所述的存储介质,其中,所述可执行区域中存在多个掩体;
    所述响应于所述第一操作结束,控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体,包括:
    响应于所述第一操作结束,控制所述目标虚拟对象朝向与所述目标虚拟对象距离最近的掩体执行所述第一动作后蹲伏以进入所述掩体。
  18. 根据权利要求12所述的存储介质,其中,所述控制所述目标虚拟对象朝向所述掩体执行所述第一动作后蹲伏以进入所述掩体,包括:
    从所述游戏场景中确定所述掩体的遮挡区域;
    控制所述目标虚拟对象朝向所述掩体执行所述第一动作后以蹲伏状态处于所述掩体的遮挡区域。
  19. 根据权利要求12所述的存储介质,其中,所述可执行区域中不存在掩体;
    还包括:
    响应于与所述第一操作连续的滑动操作,控制所述目标虚拟对象沿所述滑动操作的滑动方向执行所述第一动作。
  20. 根据权利要求12所述的存储介质,其还包括:
    响应于对所述动作控件的第二操作,控制所述目标虚拟对象执行所述第一动作后处于站立状态,其中,所述第一操作和所述第二操作不同。
PCT/CN2023/079122 2022-08-30 2023-03-01 一种游戏控制方法、装置、计算机设备及存储介质 WO2024045528A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211049847.XA CN115382201A (zh) 2022-08-30 2022-08-30 一种游戏控制方法、装置、计算机设备及存储介质
CN202211049847.X 2022-08-30

Publications (1)

Publication Number Publication Date
WO2024045528A1 true WO2024045528A1 (zh) 2024-03-07

Family

ID=84124636

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/079122 WO2024045528A1 (zh) 2022-08-30 2023-03-01 一种游戏控制方法、装置、计算机设备及存储介质

Country Status (2)

Country Link
CN (1) CN115382201A (zh)
WO (1) WO2024045528A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115382201A (zh) * 2022-08-30 2022-11-25 网易(杭州)网络有限公司 一种游戏控制方法、装置、计算机设备及存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210394058A1 (en) * 2020-06-18 2021-12-23 Nintendo Co., Ltd. Non-transitory computer-readable storage medium having game program stored therein, game apparatus, game process method, and game system
CN112263833A (zh) * 2020-11-19 2021-01-26 网易(杭州)网络有限公司 游戏控制方法及装置
CN112822397A (zh) * 2020-12-31 2021-05-18 上海米哈游天命科技有限公司 游戏画面的拍摄方法、装置、设备及存储介质
CN112774201A (zh) * 2021-01-22 2021-05-11 北京字跳网络技术有限公司 一种虚拟角色掩蔽方法、装置、计算机设备及存储介质
CN114225416A (zh) * 2021-12-16 2022-03-25 网易(杭州)网络有限公司 游戏控制方法及装置
CN114404944A (zh) * 2022-01-20 2022-04-29 网易(杭州)网络有限公司 玩家角色的控制方法、装置、电子设备及存储介质
CN115382201A (zh) * 2022-08-30 2022-11-25 网易(杭州)网络有限公司 一种游戏控制方法、装置、计算机设备及存储介质

Also Published As

Publication number Publication date
CN115382201A (zh) 2022-11-25

Similar Documents

Publication Publication Date Title
CN113101652A (zh) 信息展示方法、装置、计算机设备及存储介质
CN113398590B (zh) 声音处理方法、装置、计算机设备及存储介质
WO2024011894A1 (zh) 虚拟对象的控制方法、装置、存储介质及计算机设备
CN113426124A (zh) 游戏中的显示控制方法、装置、存储介质及计算机设备
CN113398566A (zh) 游戏的显示控制方法、装置、存储介质及计算机设备
CN112870718A (zh) 道具的使用方法、装置、存储介质及计算机设备
WO2024045528A1 (zh) 一种游戏控制方法、装置、计算机设备及存储介质
CN113546419A (zh) 游戏地图显示方法、装置、终端及存储介质
CN115040873A (zh) 一种游戏分组处理方法、装置、计算机设备及存储介质
WO2024051116A1 (zh) 虚拟角色的控制方法、装置、存储介质及终端设备
WO2024087786A1 (zh) 游戏元素的显示方法、装置、计算机设备及存储介质
CN113332721A (zh) 一种游戏控制方法、装置、计算机设备及存储介质
WO2024103623A1 (zh) 一种虚拟物品的标记方法、装置、计算机设备及存储介质
WO2024031942A1 (zh) 游戏道具的控制方法、装置、计算机设备及存储介质
WO2023246166A1 (zh) 一种视频进度的调节方法、装置、计算机设备及存储介质
WO2024007606A1 (zh) 虚拟物品的展示方法、装置、计算机设备及存储介质
CN115501581A (zh) 一种游戏控制方法、装置、计算机设备及存储介质
CN112245914B (zh) 一种视角调整方法、装置、存储介质及计算机设备
CN114225412A (zh) 信息处理方法、装置、计算机设备及存储介质
CN113867873A (zh) 页面显示方法、装置、计算机设备及存储介质
WO2024124814A1 (zh) 游戏信号反馈方法、装置、电子设备和可读存储介质
CN115212567A (zh) 信息处理方法、装置、计算机设备及计算机可读存储介质
CN115317893A (zh) 虚拟资源处理方法、装置、计算机设备及存储介质
CN115430150A (zh) 一种游戏技能释放方法、装置、计算机设备及存储介质
CN116999835A (zh) 一种游戏控制方法、装置、计算机设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23858571

Country of ref document: EP

Kind code of ref document: A1