CN117942567A - Game interaction method and device, computer equipment and storage medium - Google Patents

Game interaction method and device, computer equipment and storage medium

Info

Publication number
CN117942567A
CN117942567A (application CN202410160492.4A)
Authority
CN
China
Prior art keywords
interaction
game
control
interactive
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410160492.4A
Other languages
Chinese (zh)
Inventor
许展昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202410160492.4A priority Critical patent/CN117942567A/en
Publication of CN117942567A publication Critical patent/CN117942567A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application disclose a game interaction method, a game interaction device, a computer device, and a computer-readable storage medium. In this scheme, an interactive interface is displayed overlaid on a graphical user interface in response to an interaction trigger event, where the interactive interface includes at least one interaction item corresponding to the interaction trigger event, and the interaction item is configured to control a first virtual object to perform a corresponding game behavior; in response to a first trigger operation applied to a touch area on the graphical user interface, a target interaction item is determined from the at least one interaction item according to the first trigger operation; and while the first trigger operation is maintained, in response to an interaction termination event, the first virtual object is controlled to perform a game action in the game scene according to the current first touch parameter of the first trigger operation. This allows the game player to switch seamlessly between different operation controls and improves the player's game operation experience.

Description

Game interaction method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a game interaction method, a game interaction device, a computer device, and a computer readable storage medium.
Background
With the development of the internet, a large number of games of different types have emerged to meet users' daily entertainment needs. Some game scenes provide NPCs (non-player characters), and the player may interact with an NPC.
In the related art, when a player controls a virtual character to move into an area near an NPC, a communication panel for interacting with the NPC is displayed on the game interface. However, the existing communication panel overlays the game screen and obscures some of the operation controls, which interferes with the player's game operations.
Disclosure of Invention
The embodiment of the application provides a game interaction method, a game interaction device, computer equipment and a computer readable storage medium, which can improve game operation experience of players.
An embodiment of the application provides a game interaction method. A graphical user interface is provided through a terminal device, and the content displayed on the graphical user interface includes at least part of a game scene and a first virtual object that is located in the game scene and controlled through the terminal device. The method includes the following steps:
displaying an interactive interface overlaid on the graphical user interface in response to an interaction trigger event, where the interactive interface includes at least one interaction item corresponding to the interaction trigger event, and the interaction item is configured to control the first virtual object to perform a corresponding game behavior;
in response to a first trigger operation applied to a touch area on the graphical user interface, determining a target interaction item from the at least one interaction item according to the first trigger operation; and
while the first trigger operation is maintained, in response to an interaction termination event, controlling the first virtual object to perform a game action in the game scene according to the current first touch parameter of the first trigger operation.
Correspondingly, an embodiment of the application further provides a game interaction device. A graphical user interface is provided through a terminal device, and the content displayed on the graphical user interface includes at least part of a game scene and a first virtual object that is located in the game scene and controlled through the terminal device. The device includes:
a first display unit, configured to display an interactive interface overlaid on the graphical user interface in response to an interaction trigger event, where the interactive interface includes at least one interaction item corresponding to the interaction trigger event, and the interaction item is configured to control the first virtual object to perform a corresponding game behavior;
a first determining unit, configured to determine, in response to a first trigger operation applied to the touch area on the graphical user interface, a target interaction item from the at least one interaction item according to the first trigger operation; and
a first control unit, configured to control, while the first trigger operation is maintained and in response to an interaction termination event, the first virtual object to perform a game action in the game scene according to the current first touch parameter of the first trigger operation.
Correspondingly, an embodiment of the application further provides a computer device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, where the processor, when executing the computer program, performs the game interaction method provided by any embodiment of the application.
Correspondingly, the embodiment of the application also provides a computer readable storage medium, wherein the computer readable storage medium stores a plurality of instructions, and the instructions are suitable for being loaded by a processor to execute the game interaction method.
According to this scheme, an interactive interface is displayed overlaid on the graphical user interface in response to an interaction trigger event, where the interactive interface includes at least one interaction item corresponding to the interaction trigger event, and the interaction item is configured to control the first virtual object to perform a corresponding game behavior; in response to a first trigger operation applied to a touch area on the graphical user interface, a target interaction item is determined from the at least one interaction item according to the first trigger operation; and while the first trigger operation is maintained, in response to an interaction termination event, the first virtual object is controlled to perform a game action in the game scene according to the current first touch parameter of the first trigger operation. This allows the game player to switch seamlessly between different operation controls and improves the player's game operation experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of a game interaction system according to an embodiment of the present application.
Fig. 2 is a flowchart of a game interaction method according to an embodiment of the present application.
Fig. 3 is a schematic diagram of an application scenario of a game interaction method according to an embodiment of the present application.
Fig. 4 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 5 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 6 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 7 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 8 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 9 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 10 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 11 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 12 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 13 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 14 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 15 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 16 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 17 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 18 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application.
Fig. 19 is a block diagram of a game interaction device according to an embodiment of the present application.
Fig. 20 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
The embodiments of the application provide a game interaction method and device, a computer-readable storage medium, and a computer device. Specifically, the game interaction method of the embodiments of the application may be executed by a computer device, where the computer device may be a terminal or a server. The terminal may be a device such as a smartphone, a tablet computer, a notebook computer, a touch screen device, a game console, a personal computer (PC), or a personal digital assistant (PDA), and may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDNs), big data, and artificial intelligence platforms.
For example, when the game interaction method runs on the terminal, the terminal device stores a game application program and is used to present a virtual scene in a game screen. The terminal device interacts with the user through a graphical user interface; for example, it downloads, installs, and runs the game application program. The terminal device may present the graphical user interface to the user in a variety of ways; for example, the graphical user interface may be rendered on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen and a processor, where the touch display screen is used to present the graphical user interface, including the game screen, and to receive operation instructions generated by the user acting on the graphical user interface, and the processor is used to run the game, generate the graphical user interface, respond to the operation instructions, and control display of the graphical user interface on the touch display screen.
For example, when the game interaction method runs on a server, the game may be a cloud game. Cloud gaming is a gaming mode based on cloud computing. In the cloud-gaming mode of operation, the entity that runs the game application is separated from the entity that presents the game screen: storage and execution of the game interaction method are completed on a cloud game server, while the game screen is presented at a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game screens; for example, it may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, but the device that processes the game data is the cloud game server in the cloud. When playing, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game screens, and returns the data to the cloud game client through the network; finally the cloud game client decodes the data and outputs the game screen.
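The round trip just described can be sketched as a single function. Every callable below is a placeholder for a real networking, codec, or rendering component; none of these names come from the patent:

```python
def cloud_game_frame(client_input, run_game, encode, send, decode, display):
    """One round trip of the cloud-gaming loop (illustrative sketch):
    the client sends an operation instruction, the cloud server runs the
    game and encodes the resulting frame, the network returns the payload,
    and the client decodes and displays the frame."""
    game_frame = run_game(client_input)   # server side: advance game state
    payload = send(encode(game_frame))    # server side: compress + transmit
    display(decode(payload))              # client side: decode + render
```

In a real deployment the `send`/`decode` pair hides latency-sensitive video streaming; the sketch only shows where each responsibility sits relative to the server/client split described above.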
Referring to fig. 1, fig. 1 is a schematic view of a game interaction system according to an embodiment of the present application. The system may include at least one terminal, at least one server, at least one database, and a network. The terminal held by the user can be connected to the server of different games through the network. A terminal is any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, the terminal has one or more multi-touch-sensitive screens for sensing and obtaining inputs of a user through touch or slide operations performed at a plurality of points of the one or more touch-sensitive display screens. In addition, when the system includes a plurality of terminals, a plurality of servers, and a plurality of networks, different terminals may be connected to each other through different networks, through different servers. The network may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different terminals may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals to connect and synchronize with each other through an appropriate network to support multiplayer games. In addition, the system may include multiple databases coupled to different servers and information related to the gaming environment may be continuously stored in the databases as different users play multiplayer games online.
The embodiment of the application provides a game interaction method which can be executed by a terminal or a server. The embodiment of the application is illustrated by taking the game interaction method executed by a terminal as an example. The terminal includes a touch display screen and a processor, where the touch display screen is used to present a graphical user interface and to receive operation instructions generated by the user acting on the graphical user interface. When the user operates the graphical user interface through the touch display screen, the terminal may control its own local content in response to the received operation instructions, and may also control content on a peer server in response to them. For example, the operation instructions generated by the user for the graphical user interface include an instruction to launch the game application, and the processor is configured to launch the game application after receiving that instruction. Further, the processor is configured to render and draw the game-related graphical user interface on the touch display screen. The touch display screen is a multi-touch screen capable of sensing touch or slide operations performed simultaneously at multiple points on the screen. The user performs a touch operation on the graphical user interface with a finger, and when the touch operation is detected, different virtual objects in the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, and the like. The game may include a virtual scene.
Further, the virtual scene of the game may include one or more virtual objects, such as virtual characters, controlled by a user (or player). The virtual scene may also include one or more obstacles, such as rails, ravines, and walls, to limit the movement of virtual objects, for example, to limit the movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene also includes one or more elements, such as skills, scores, character health status, and energy, to assist the player, provide virtual services, increase scores related to the player's performance, and so on. The graphical user interface may also present one or more indicators to provide indication information to the player. For example, the game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, the one or more other virtual objects are controlled by other players of the game. Alternatively, the one or more other virtual objects may be computer controlled, for example by a robot using an artificial intelligence (AI) algorithm, implementing a human-machine combat mode. For example, virtual objects possess various skills or capabilities that the game player uses to achieve goals; a virtual object may possess one or more weapons, props, tools, and the like that can be used to eliminate other objects in the game. Such skills or capabilities may be activated by the player using one of a plurality of preset touch operations on the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to operation instructions generated by the user's touch operations.
It should be noted that the schematic view of the game interaction system shown in fig. 1 is only an example. The game interaction system and the scenario described in the embodiments of the present application are intended to describe the technical solutions of the embodiments more clearly and do not constitute a limitation on those technical solutions; those skilled in the art will appreciate that, as game interaction systems evolve and new service scenarios appear, the technical solutions provided by the embodiments of the present application are equally applicable to similar technical problems.
Based on the above problems, embodiments of the present application provide a game interaction method, a game interaction device, a computer device, and a computer-readable storage medium, which can improve the game operation experience of a player. These are described in detail below. The order of description of the following embodiments is not intended to limit the preferred order of the embodiments.
The embodiment of the application provides a game interaction method which can be executed by a terminal or a server, and the embodiment of the application is described by taking the game interaction method executed by the terminal as an example.
Referring to fig. 2, fig. 2 is a flow chart of a game interaction method according to an embodiment of the application. The specific flow of the game interaction method can be as follows:
101. In response to an interaction trigger event, display an interactive interface overlaid on the graphical user interface.
In the embodiment of the application, a graphical user interface is provided through the terminal device, and the content displayed on the graphical user interface includes at least part of the game scene of the target game and a first virtual object in the game scene. The first virtual object may be controlled by the current game player through the terminal device. The target game may be a multiplayer online network game; for example, it may be an FPS (first-person shooter) game or the like.
Referring to fig. 3, fig. 3 is a schematic diagram of an application scenario of a game interaction method according to an embodiment of the present application. The graphical user interface 10 shown in fig. 3 displays part of the game scene 11 of the target game, with a first virtual object 111 in the game scene 11 that is controlled by the current game player.
A plurality of controls are provided in the graphical user interface 10 and may include, for example: a move control 12, a squat control 13, a prone control 14, a jump control 15, a first shooting control 16, a second shooting control 17, an earpiece control 19, a voice control 20, and a field-of-view control 21.
The move control 12 may be used to control the movement of the first virtual object 111 in the game scene 11; the squat control 13 may be used to control the first virtual object 111 to perform a squat action; the prone control 14 may be used to control the first virtual object 111 to perform a prone action; the jump control 15 may be used to control the first virtual object 111 to perform a jump action; the first shooting control 16 and the second shooting control 17 may be used to control the first virtual object 111 to shoot at enemy objects in the game scene 11 (e.g., virtual objects controlled by other players); the earpiece control 19 may be used to turn on the earpiece function within a game session and receive voice messages from teammate players; the voice control 20 may be used to turn on the voice function within a game session, send voice messages to teammate players, and so on; and the field-of-view control 21 may be used to switch the viewing angle of the first virtual object 111 in the game scene 11.
The graphical user interface may further display a thumbnail map 18, where the thumbnail map 18 includes a reduced map of the game scene 11 and the position of the first virtual object 111 in the game scene 11, so that the current game player can conveniently observe the real-time position of the first virtual object 111 in the game scene 11. The thumbnail map 18 displays the whole game scene at a reduced scale and shows the position of the first virtual object 111 controlled by the current game player within it, so that the player can learn the terrain outline and specific details of the whole game scene and/or the positions of virtual objects controlled by other players, and thereby better determine the next direction of movement and the like.
The graphical user interface may also display game play information 22. The game play information 22 may include the achievements and life value of the first virtual object 111 controlled by the current game player, as well as the number of players remaining in the session. The life value may be a blood bar of the first virtual object 111 controlled by the current game player; when the first virtual object 111 is hit by a virtual object controlled by another player, the color of the blood bar may change, so that the current game player can know in real time how badly the controlled virtual object has been damaged and take countermeasures to better protect it.
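A blood-bar color rule of the kind described above might look like the following sketch; the thresholds and color names are invented for illustration and are not specified by the patent:

```python
def blood_bar_color(hp: float, max_hp: float) -> str:
    """Map remaining health to a blood-bar color so the player can read
    damage at a glance. Thresholds and colors are illustrative assumptions."""
    ratio = hp / max_hp
    if ratio > 0.5:
        return "green"   # lightly damaged
    if ratio > 0.2:
        return "yellow"  # moderately damaged
    return "red"         # critically damaged
```

A real game would typically interpolate colors smoothly rather than use hard cutoffs, but the discrete version keeps the idea readable.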
The interaction trigger event is used to trigger display of the interactive interface on the graphical user interface, and it may be triggered in various ways. For example, the interaction trigger event may include: the first virtual object moving to a designated scene position in the game scene, or the first virtual object moving near a second virtual object in the game scene; other trigger modes may also be set according to actual requirements, and the present application is not limited in this respect.
The interactive interface may include at least one interaction item corresponding to the interaction trigger event, where the interaction item is configured to control the first virtual object to perform the corresponding game behavior.
In some embodiments, the interaction trigger event may be triggered as follows: if the distance between the first virtual object and the second virtual object in the game scene meets a preset condition, the step of displaying the interactive interface overlaid on the graphical user interface in response to the interaction trigger event may include the following operations:
in response to the distance between the first virtual object and the second virtual object in the game scene being less than a preset distance, displaying an interactive control on the graphical user interface; and
in response to a touch operation on the interactive control, displaying the interactive interface overlaid on the graphical user interface.
The second virtual object is a virtual object in the game scene that the first virtual object can interact with; it is not controlled by the current game player and may be a designated scene object in the game scene (such as a building) or an NPC object in the game scene.
In the embodiment of the application, the current game player can control the first virtual object to move in the game scene through the movement control of the graphical user interface.
When the first virtual object is controlled to move to a position whose distance from the second virtual object in the game scene is smaller than the preset distance, display of the interactive control on the graphical user interface is triggered. The interactive control indicates that interaction with the second virtual object is possible.
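A rough sketch of this proximity check follows; the Euclidean distance metric, the 2D coordinates, and the function name are all assumptions, since the patent does not specify how the distance is measured:

```python
import math

def should_show_interact_control(first_pos, second_pos, preset_distance):
    """Return True when the first virtual object is within the preset
    distance of the second virtual object, i.e. when the interactive
    control should be displayed (illustrative sketch)."""
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    return math.hypot(dx, dy) < preset_distance
```

In practice the engine would run this check each frame (or on movement events) and show or hide the interactive control as the result changes.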
For example, referring to fig. 4, fig. 4 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown in the upper part of fig. 4, the first virtual object 111 is controlled to move in the game scene 11 according to the current game player's operation of the move control 12; when the first virtual object 111 moves to a position where its distance from the second virtual object 112 is smaller than the preset distance, display of the interactive control 23 on the graphical user interface 10 is triggered, resulting in the graphical user interface 10 shown in the lower part of fig. 4, in which the interactive control 23 is displayed.
Further, the current game player can trigger the display of the interactive interface on the graphical user interface through the operation of the interactive control, so that the interactive item can be selected on the interactive interface.
In the embodiment of the application, by displaying the interactive control 23 on the graphical user interface, compared with the prior-art approach of directly displaying an interactive communication panel on the graphical user interface, occlusion of the game scene picture or of the operation controls can be avoided; meanwhile, the player can further select an interaction item through an operation on the interactive control, thereby meeting different requirements of game players.
The touch operation for the interactive control may include a clicking operation, a sliding operation, a pressing operation, and the like.
For example, referring to fig. 5, fig. 5 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown in the upper side of fig. 5, when the clicking operation of the current game player on the interactive control 23 is detected, the interactive interface can be triggered to be displayed on the graphical user interface 10 in a superimposed manner, so as to obtain the graphical user interface 10 shown in the lower side of fig. 5.
The graphical user interface 10 shown in the lower side of fig. 5 is superimposed with an interactive interface 30, where the interactive interface 30 may include a plurality of interaction items, respectively: "get power", "get agility", "get supplies", and "leave dialog".
Wherein "get power" may be configured to enhance the damage value of the virtual weapon of the first virtual object, which may be used to attack enemy objects in the game scene; "get supplies" may be configured so that the first virtual object obtains virtual assets (virtual items or virtual resources used to accomplish various tasks in the game); "get agility" may be configured to increase the speed, hit rate, and the like of the virtual weapon of the first virtual object; and "leave dialog" may be configured to control the first virtual object to exit the interaction.
The interactive interface 30 may also include an interactive prompt: "Warrior, welcome to the sword field! You can choose one ability to resonate with; afterwards you will gain an increase in the specified attribute! (Slide the joystick to a designated area to obtain the gain effect.)" The interactive prompt information may be used to remind the current game player that an interaction can be performed by selecting an interaction item in the interactive interface 30.
102. In response to a first trigger operation triggered in a touch area on the graphical user interface, determining a target interaction item from the at least one interaction item according to the first trigger operation.
The touch area refers to an area on the graphical user interface that can be operated by the current game player; for example, the touch area may be an operation control provided on the graphical user interface.
In the embodiment of the application, the current game player can select the target interaction item from the at least one interaction item of the interactive interface by performing the first trigger operation on the touch area.
In some embodiments, the touch area may display an interactive selection control provided on the graphical user interface, at which point the touch area may be used to implement the functionality of the interactive selection control. The step of "determining a target interaction item from at least one interaction item according to a first trigger operation in response to a first trigger operation triggered by a touch area on the graphical user interface" may include the following operation:
In response to a touch operation for the interactive selection control, a target interactive entry is determined from the at least one interactive entry.
The touch operation may include a click operation, a press operation, a slide operation, and the like.
For example, referring to fig. 6, fig. 6 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface shown in fig. 6, an interactive selection control 32 is provided, and a plurality of interaction items may be displayed around it. When a sliding operation of the current game player on the interactive selection control 32 is detected, the target interaction item may be determined from the plurality of interaction items according to the sliding direction of the sliding operation; for example, if the sliding direction is rightward, the target interaction item may be determined to be "get supplies".
In some embodiments, the interactive selection control may be divided into a plurality of operational sub-regions, each corresponding to an interactive entry.
For example, with continued reference to fig. 6, the interactive selection control 32 may be divided into four operation sub-regions: up, down, left, and right. The upper operation sub-region in the interactive selection control 32 may correspond to the interaction item "get power"; the lower operation sub-region may correspond to "leave dialog"; the left operation sub-region may correspond to "get agility"; and the right operation sub-region may correspond to "get supplies".
Further, the current game player can trigger selection of "get power" as the target interaction item through an operation on the upper operation sub-region of the interactive selection control 32; trigger selection of "leave dialog" through an operation on the lower operation sub-region; trigger selection of "get agility" through an operation on the left operation sub-region; and trigger selection of "get supplies" through an operation on the right operation sub-region.
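The sub-region mapping described above can be sketched as follows. This is a hypothetical illustration following fig. 6; the dictionary keys, function name, and the convention that positive dy means upward are all assumptions.

```python
# Hypothetical mapping of the four operation sub-regions of the interactive
# selection control to interaction items, as described for fig. 6.
SUB_REGION_ITEMS = {
    "up": "get power",
    "down": "leave dialog",
    "left": "get agility",
    "right": "get supplies",
}

def target_item_from_slide(dx, dy):
    """Resolve the dominant slide direction into the corresponding
    interaction item (screen-up is positive dy here, an assumption)."""
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "up" if dy > 0 else "down"
    return SUB_REGION_ITEMS[direction]
```

For instance, a mostly rightward slide would resolve to "get supplies", matching the right operation sub-region.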
In some embodiments, in order to improve the operation experience of the game player, a plurality of operation controls may be integrated in the touch area, and the operation controls displayed in the touch area at different moments are determined according to different game events. Before the interaction trigger event occurs, the touch area may be displayed as a target function control, at which point the touch area implements the function of the target function control. The method may further include the following step:
In response to the interaction trigger event, switching the target function control displayed in the touch area to the interactive selection control, and hiding the operation controls on the graphical user interface that overlap with the display position of the interactive interface.
Wherein the target function control may be used to control the first virtual object to perform a game action in the game scene. For example, the target function control may be a movement control, which may be used to control the first virtual object to move in the game scene; or the target function control may be a shooting control, which may be used to control the first virtual object to shoot in the game scene.
In some embodiments, the target function control may be a mobile control. For example, referring to fig. 7, fig. 7 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown in the upper side of fig. 7, when a clicking operation of the current game player on the interactive control 23 is detected, the mobile control 12 can be triggered to switch to the interactive selection control 32; meanwhile, the interactive interface 30 is displayed on the graphical user interface 10, and the operation controls overlapping with the display position of the interactive interface 30 are hidden, so as to obtain the graphical user interface 10 shown in the lower side of fig. 7.
In the graphical user interface 10 shown in the lower side of fig. 7, the interactive interface 30 may be displayed in the lower area of the graphical user interface 10. The operation controls on the graphical user interface 10 overlapping with the display position of the interactive interface 30 may include: the mobile control 12, the squat control 13, the groveling control 14, the jump control 15, the first shooting control 16, the second shooting control 17, and the interactive control 23. These operation controls are hidden, and at the position of the mobile control 12 the display is switched to the interactive selection control 32.
In the embodiment of the present application, the mobile control 12 and the interactive selection control 32 are different operation controls, but they are located at the same position (i.e., the touch area) of the game interface and have click response areas of the same size; their display styles may differ.
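The switch-and-hide behavior described above can be sketched as a small state holder. This is a minimal illustration; the class name, control names, and method names are assumptions, and only the switching/hiding/restoring logic is shown.

```python
class TouchArea:
    """Sketch of the single touch area that hosts either the mobile control
    or the interactive selection control."""

    def __init__(self):
        self.active_control = "mobile control"
        self.hidden_controls = []

    def on_interaction_trigger(self, overlapping_controls):
        # Switch the target function control shown in the touch area and
        # hide every operation control overlapping the interactive interface.
        self.active_control = "interactive selection control"
        self.hidden_controls = list(overlapping_controls)

    def on_interaction_close(self):
        # Switch back and resume display of the previously hidden controls.
        self.active_control = "mobile control"
        restored, self.hidden_controls = self.hidden_controls, []
        return restored
```

On the close path this also covers the later-described restore step: the controls hidden at trigger time are exactly the ones whose display resumes.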
The first triggering operation may be a touch operation on the touch area when the touch area is displayed as the interactive selection control.
In some embodiments, when the interactive interface is displayed according to the interaction trigger event, the step of "switching the target function control displayed in the touch area to the interaction selection control" may include the following operations:
If the target functional control is overlapped with the display position of the interactive interface, hiding the target functional control on the graphical user interface, and displaying the interactive selection control at the position corresponding to the target functional control on the interactive interface.
The overlapping of the target function control with the display position of the interactive interface may include: the display position of the target function control on the graphical user interface overlaps with the display position of the interactive interface on the graphical user interface. By switching the target function control displayed in the touch area to the interactive selection control, the interactive selection control can be displayed on the interactive interface when the interactive interface is displayed, which is convenient for the game player to select an interaction item for interaction through the interactive selection control.
In some embodiments, to facilitate operation by a game player, the method may further comprise the steps of:
In response to a closing operation on the interactive interface, closing the interactive interface, switching the interactive selection control displayed in the touch area back to the target function control, and displaying the operation controls on the graphical user interface that overlap with the display position of the interactive interface.
The closing operation on the interactive interface may include: a trigger operation, in the interactive selection control, on the interaction item that indicates the end of the interaction.
Wherein displaying, on the graphical user interface, the operation controls overlapping with the display position of the interactive interface includes: resuming the display of the previously hidden operation controls.
In some embodiments, the target function control may be a mobile control. Referring to fig. 8, fig. 8 is a schematic application scenario diagram of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 8, the interaction item "leave dialog" in the interactive interface 30 may indicate that the interaction is ended. When a sliding operation of the current game player on the area of the interactive selection control 32 corresponding to "leave dialog" is detected, closing of the interactive interface 30 may be triggered, the interactive selection control 32 is switched back to the mobile control 12, and the operation controls overlapping with the display position of the interactive interface 30 are displayed on the graphical user interface 10, i.e., the graphical user interface 10 shown in the lower side of fig. 8.
In the graphical user interface 10 shown in the lower side of fig. 8, the interactive interface 30 and the interactive selection control 32 are closed, and the previously hidden mobile control 12, squat control 13, groveling control 14, jump control 15, first shooting control 16, second shooting control 17, and interactive control 23 resume display.
In some embodiments, the interactive interface may include a close control, and the close operation for the interactive interface may include a trigger operation for the close control in the interactive interface.
For example, referring to fig. 9, fig. 9 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 9, the interactive interface 30 may include a closing control 33. When a click operation of the current game player on the closing control 33 is detected, closing of the interactive interface 30 may be triggered, the interactive selection control 32 is switched back to the mobile control 12, and the operation controls overlapping with the display position of the interactive interface 30 are displayed on the graphical user interface 10, i.e., the graphical user interface 10 shown in the lower side of fig. 9.
In the graphical user interface 10 shown in the lower side of fig. 9, the interactive interface 30 and the interactive selection control 32 are closed, and the previously hidden mobile control 12, squat control 13, groveling control 14, jump control 15, first shooting control 16, second shooting control 17, and interactive control 23 resume display.
In some embodiments, to facilitate the current game player controlling the first virtual object to avoid the attack, the method may further comprise the steps of:
In response to the interaction termination event, closing the interactive interface, and displaying, on the graphical user interface, the operation controls overlapping with the display position of the interactive interface.
Wherein the interaction termination event may include at least one of the following:
(1) The first virtual object is attacked in the game scene. For example, the first virtual object is hit by another virtual object in the game scene, where the other virtual object may include a virtual object controlled by another player, or a virtual object in the game scene having an attack-damage attribute, and so on, which are not exhaustively listed here;
(2) A threat to the first virtual object appears in the game scene. For example, footsteps, gunshots, and the like occur in the area where the first virtual object is located, i.e., the first virtual object has not yet actually been affected;
(3) The game behavior corresponding to the target interaction item executed by the first virtual object is terminated, for example, by a cancel operation of the current game player on the target interaction item;
(4) An operation satisfying a preset operation threshold is applied on the basis of the first trigger operation. For example, while continuously touching the interactive selection control, the user slides upward quickly, thereby directly exiting the interactive interface and then controlling the character to move according to the sliding direction.
For example, after the target function control displayed in the touch area has been switched to the interactive selection control, if the first virtual object is detected to be attacked by another virtual object in the game scene, the interactive selection control can be switched back to the target function control, so that the current game player can conveniently control, through the target function control, the first virtual object to avoid the attack of the other virtual object.
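Folding the four termination conditions above into a single check might look like the following sketch. The event-flag names are assumptions for illustration; in practice each flag would be driven by the corresponding game event.

```python
# Sketch of the interaction termination check, one clause per condition.
def interaction_terminated(events):
    return (
        events.get("attacked", False)                # (1) first object is hit
        or events.get("threat_nearby", False)        # (2) footsteps/gunshots near
        or events.get("behavior_terminated", False)  # (3) item behavior ends
        or events.get("threshold_slide", False)      # (4) slide past the preset threshold
    )
```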
For example, referring to fig. 10, fig. 10 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 10, a hit mark 34 is displayed, which indicates that the first virtual object 111 is attacked. Closing of the interactive interface 30 may then be triggered, the interactive selection control 32 is switched back to the mobile control 12, and the operation controls overlapping with the display position of the interactive interface 30 are displayed on the graphical user interface 10, i.e., the graphical user interface 10 shown in the lower side of fig. 10.
In the graphical user interface 10 shown in the lower side of fig. 10, the interactive interface 30 and the interactive selection control 32 are closed, and the previously hidden mobile control 12, squat control 13, groveling control 14, jump control 15, first shooting control 16, second shooting control 17, and interactive control 23 resume display.
In some embodiments, to prompt the current game player for an interactive entry made by the first virtual object, after the step of determining a target interactive entry from the at least one interactive entry according to the first trigger operation, the method may further include the steps of:
Switching the display style of the target interaction item on the interactive interface; and
in response to a cancel event of the target interaction item, switching the display style of the target interaction item back to the initial display style.
Switching the display style of the target interaction item may include: highlighting the target interaction item, magnifying the target interaction item, and various other styles that differ from the initial display style.
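The style switch and restore can be sketched as follows. The style fields (highlight flag, scale) and function names are assumptions; highlighting and enlarging are just the two example style changes mentioned above.

```python
# Initial display style shared by all interaction items.
INITIAL_STYLE = {"highlighted": False, "scale": 1.0}

def apply_selected_style(styles, item):
    # Switch the target item's display style: highlight and enlarge it.
    styles[item] = {"highlighted": True, "scale": 1.2}

def apply_cancel_event(styles, item):
    # On a cancel event, restore the initial display style.
    styles[item] = dict(INITIAL_STYLE)
```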
For example, referring to fig. 11, fig. 11 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown in the upper side of fig. 11, after detecting that the current game player selects "get power" through a touch operation on the interactive selection control 32, the first virtual object may be controlled to execute the game behavior of "get power"; specifically, the damage value of the virtual weapon of the first virtual object may be enhanced. Meanwhile, "get power" is highlighted, the area corresponding to "get power" in the interactive selection control 32 may also be highlighted, and the interactive prompt information in the interactive interface 30 is updated, so as to obtain the graphical user interface 10 shown in the lower side of fig. 11.
In the graphical user interface 10 shown in the lower side of fig. 11, the interactive prompt information in the interactive interface 30 is updated to the prompt information corresponding to the "get power" event, which may be: "Warrior, congratulations on resonating successfully! You have obtained the power gain (weapon output increased for 5 minutes)."
Wherein the cancel event of the target interaction item may include at least: closing the interactive interface, canceling the target interaction item, and the like.
For example, referring to fig. 12, fig. 12 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 12, the first virtual object 111 executes the game behavior of "get power"; at this time, "get power" and the area corresponding to it in the interactive selection control 32 are highlighted, and the interactive prompt information in the interactive interface 30 is updated to the prompt information corresponding to the "get power" item, which may be: "Warrior, congratulations on resonating successfully! You have obtained the power gain (weapon output increased for 5 minutes)." When a selection operation of the current game player on "leave dialog" in the interactive interface 30 is detected, cancellation of the "get power" game behavior can be triggered; meanwhile, the highlighting of "get power" and of its corresponding area in the interactive selection control 32 is turned off, and the interactive prompt information in the interactive interface 30 is updated.
In the graphical user interface 10 shown in the lower side of fig. 12, "get power" and its corresponding area in the interactive selection control 32 are no longer highlighted and are restored to the initial display style, and the interactive prompt information in the interactive interface 30 is updated to the prompt information corresponding to the "get power" cancel event, which may be: "Warrior, welcome to the sword field! You can choose one ability to resonate with; afterwards you will gain an increase in the specified attribute! (Slide the joystick to a designated area to obtain the gain effect.)" In this way, the current game player may be prompted that execution of the "get power" game event has been canceled, and the player may again select another interaction item from the interactive interface 30 to interact.
In some embodiments, in order to reduce the occlusion of the player's view by the display of the interactive interface, when an interaction trigger event is detected, the target function control displayed in the touch area may be triggered to switch to the interactive selection control, and at least one interaction item is correspondingly displayed around the interactive selection control.
For example, the target function control may be a mobile control. Referring to fig. 13, fig. 13 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 13, when a clicking operation of the current game player on the interactive control 23 is detected, switching of the mobile control 12 to the interactive selection control 32 may be triggered, i.e., the graphical user interface 10 shown in the lower side of fig. 13, wherein the interactive selection control 32 is displayed and four interaction items are displayed around it: "get power", "get agility", "get supplies", and "leave dialog".
103. During maintenance of the first trigger operation, in response to the interaction termination event, controlling the first virtual object to execute a game action in the game scene according to the current first touch parameter of the first trigger operation.
In the embodiment of the present application, the touch parameters may include, but are not limited to, a position of a touch point in the touch area, a relative positional relationship between the touch point outside the touch area and a preset position (for example, a center point) in the touch area, a sliding speed, a touch duration, and the like.
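The enumerated touch parameters could be grouped into one simple structure, for example as below. The field names and types are assumptions chosen for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TouchParams:
    """Sketch of the touch parameters enumerated above."""
    point: tuple        # position of the touch point in the touch area
    offset: tuple       # offset from a preset position, e.g. the centre point
    slide_speed: float  # sliding speed of the touch
    duration: float     # touch duration, in seconds
```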
The current first touch parameter may be a touch parameter of the first triggering operation when the interaction termination event occurs.
For example, the first trigger operation may include a sliding operation on the interactive selection control displayed in the touch area. During the first trigger operation, when an interaction termination event is detected, the touch parameters of the first trigger operation at that moment, including the operation position and sliding direction of the sliding operation on the interactive selection control, are acquired as the current first touch parameters.
In some embodiments, when the interaction termination event is detected, the interaction selection control displayed in the touch area may be switched to the target function control, and then the step of "controlling the first virtual object to execute the game action in the game scene according to the current first touch parameter of the first trigger operation" may include the following operations:
Controlling the first virtual object to execute, in the game scene, the game action corresponding to the target function control according to the current first touch parameter of the first trigger operation.
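As a sketch of this step, the current first touch parameter might be converted directly into a movement input for the restored control. The function and parameter names are assumptions; the touch point relative to the joystick centre stands in for the first touch parameter.

```python
import math

def movement_from_touch(touch_point, joystick_center):
    """Turn the maintained touch point into a normalised movement
    direction for the first virtual object."""
    dx = touch_point[0] - joystick_center[0]
    dy = touch_point[1] - joystick_center[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return (0.0, 0.0)  # finger resting on the centre: no movement
    return (dx / length, dy / length)
```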
For example, referring to fig. 14, fig. 14 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 14, during a sliding operation of the current game player on the interactive selection control 32, an interaction termination event is detected, which may trigger closing of the interactive interface 30, switching of the interactive selection control 32 to the mobile control 12, and display, on the graphical user interface 10, of the operation controls overlapping with the display position of the interactive interface 30. Meanwhile, the sliding operation is maintained on the mobile control 12, i.e., the graphical user interface 10 shown in the lower side of fig. 14.
In the graphical user interface 10 shown in the lower side of fig. 14, the interactive interface 30 and the interactive selection control 32 are closed, and the previously hidden mobile control 12, squat control 13, groveling control 14, jump control 15, first shooting control 16, second shooting control 17, and interactive control 23 resume display. The first virtual object 111 is controlled to move in the game scene 11 according to the sliding parameters of the sliding operation.
In some embodiments, the interaction termination event may include the first virtual object being attacked by another virtual object in the game scene. For example, referring to fig. 15, fig. 15 is a schematic diagram of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown in the upper side of fig. 15, when the current game player is detected performing a sliding operation on the operation area corresponding to "get power" in the interactive selection control 32, if the first virtual object 111 is detected to be attacked, the hit identifier 34 is displayed in the graphical user interface 10; closing of the interactive interface 30 may be triggered, the interactive selection control 32 is switched to the mobile control 12, and the operation controls overlapping with the display position of the interactive interface 30 are displayed in the graphical user interface 10. Meanwhile, the sliding operation is maintained on the mobile control 12, i.e., the graphical user interface 10 shown in the lower side of fig. 15.
In the graphical user interface 10 shown in the lower side of fig. 15, the interactive interface 30 and the interactive selection control 32 are closed, and the previously hidden mobile control 12, squat control 13, groveling control 14, jump control 15, first shooting control 16, second shooting control 17, and interactive control 23 resume display. The first virtual object 111 is controlled to move in the game scene 11 according to the sliding parameters of the sliding operation.
For example, referring to fig. 16, fig. 16 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In fig. 16, while the interactive selection control 32 is displayed (left side), the player performs a sliding operation on the area corresponding to the "get power" interaction item. During the sliding operation, if the first virtual object is detected to encounter an attack in the game scene, the interactive selection control 32 may be switched back to the mobile control 12 in the graphical user interface while the sliding operation is maintained on the mobile control 12. Throughout the control switch, the player does not need to lift the finger and operate again; by maintaining the sliding operation, the operations on the two controls are carried through continuously, so that the first virtual object can be quickly controlled to avoid the attack.
In some embodiments, prior to step "in response to triggering an interactive trigger event", the method may further comprise the steps of:
In response to a second trigger operation triggered in the touch area, controlling the first virtual object to execute a game action in the game scene according to the second trigger operation.
Before the interaction trigger event occurs, the touch area may be displayed as the target function control, and the second trigger operation may be an operation on the target function control. For example, the target function control may be a movement control; when an operation on the target function control is detected, the first virtual object may be controlled to move in the game scene.
In some embodiments, the first trigger operation may be continuous with the second trigger operation; that is, after performing the second trigger operation on the graphical user interface, the current game player does not leave the graphical user interface and proceeds directly to the first trigger operation.
In some embodiments, the step of "controlling the first virtual object to perform a game action in the game scene according to the second trigger operation" may include the following operations:
Controlling the first virtual object to perform a game action in the game scene according to the current second touch parameter of the second trigger operation.
The second trigger operation may be a touch operation on the touch area when the touch area is displayed as the target function control.
The current second touch parameter may be a touch parameter of the second triggering operation.
For example, the second trigger operation may include a sliding operation on the target function control displayed in the touch area; the operation parameters of the sliding operation are acquired as the current second touch parameters, and the first virtual object is controlled to execute, in the game scene, the game action corresponding to the target function control according to the current second touch parameters.
In some embodiments, the step of determining the target interaction entry from the at least one interaction entry according to the first trigger operation may include the following operations:
Determining a first initial touch parameter of the first trigger operation according to the current second touch parameter, and, in response to the first trigger operation, determining the target interaction item from the at least one interaction item on the basis of the first initial touch parameter.
The first trigger operation may be an operation continuous with the second trigger operation. When the second trigger operation transitions into the first trigger operation, the current second touch parameter of the second trigger operation may be used as the first initial touch parameter of the first trigger operation; for example, the current second touch parameter may include the target position of the touch point in the touch area, a rightward sliding direction, and the like, which can serve as the first initial touch parameters of the first trigger operation.
Further, a target interaction item may be determined from at least one interaction item of the interaction interface based on the first initial touch parameter.
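The hand-off described above can be sketched as follows: when the finger never leaves the screen, the second trigger operation's current touch parameters seed the first trigger operation. The function name and dictionary keys are assumptions for illustration.

```python
# Seed the first trigger operation from the second operation's current state.
def first_initial_params(second_current_params):
    return {
        "initial_point": second_current_params["current_point"],
        "initial_direction": second_current_params["slide_direction"],
    }

params = first_initial_params(
    {"current_point": (120, 80), "slide_direction": "right"})
```

The resulting initial direction can then be resolved to an interaction item in the same way as any other slide on the interactive selection control.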
For example, referring to fig. 17, fig. 17 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 17, the touch area may be displayed as the mobile control 12, and when a second trigger operation of the current game player on the mobile control 12 displayed in the touch area is detected, the first virtual object 111 is controlled to move in the game scene.
In the process of controlling the first virtual object 111 to move in the game scene according to the second triggering operation, when an interaction triggering event is detected, the interaction interface 30 may be displayed in an overlaid manner on the graphical user interface 10. If the finger of the current game player does not leave the graphical user interface 10, it may be determined that the current game player continues with the first triggering operation on the interactive selection control after performing the second triggering operation on the movement control 12, resulting in the graphical user interface 10 shown on the lower side of fig. 17.
The current second touch parameter of the second triggering operation may be, for example, a rightward slide in the touch area. This rightward slide can further serve as the initial touch parameter of the first triggering operation, according to which the material-obtaining entry is determined and selected as the target interaction item.
In this scheme, if an interaction triggering event is detected during the touch operation on the movement control 12, the movement control 12 may be switched to the interactive selection control 32 in the graphical user interface 10, while the touch operation on the interactive selection control 32 is maintained according to the operation parameters of the touch operation on the movement control 12. The player therefore does not need to lift a finger and operate again: by maintaining one touch operation across the two controls, the first virtual object 111 can quickly be controlled to execute the game behavior corresponding to the interaction item.
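The handoff just described can be sketched in a few lines of Python. This is a hedged illustration only: the `TouchArea` class, the control names, and the `TouchParams` fields are all assumptions made for this sketch, not structures disclosed by the application. The point it shows is that when the interaction triggering event fires mid-drag, the control shown in the touch area is swapped in place and the live second touch parameters seed the continued first operation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchParams:
    x: float        # touch-point position in the touch area (illustrative)
    y: float
    direction: str  # slide direction, e.g. "right"

class TouchArea:
    """Tracks which control the touch area currently displays."""
    def __init__(self) -> None:
        self.active_control = "movement_control"
        self.current_params: Optional[TouchParams] = None

    def on_drag(self, params: TouchParams) -> None:
        # Second trigger operation: the player drags the movement control.
        self.current_params = params

    def on_interaction_trigger_event(self) -> Optional[TouchParams]:
        # Swap the control in place; the current second touch parameters
        # become the first initial touch parameters of the continued operation,
        # so the finger never has to lift.
        initial = self.current_params
        self.active_control = "interaction_selection_control"
        return initial

area = TouchArea()
area.on_drag(TouchParams(x=0.8, y=0.5, direction="right"))
initial = area.on_interaction_trigger_event()
print(area.active_control)  # → interaction_selection_control
print(initial.direction)    # → right
```

In this sketch the returned `initial` parameters would then be used to pre-select a target interaction entry, as the embodiment describes.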
In some embodiments, after the step of controlling the first virtual object to perform the game action in the game scene according to the current first touch parameter of the first trigger operation, the method may further include the steps of:
And determining a third initial touch parameter of a third trigger operation according to the current first touch parameter, and controlling the first virtual object to execute game actions in the game scene on the basis of the third initial touch parameter according to the third trigger operation.
The third trigger operation is an operation continuous with the first trigger operation. That is, after performing the first trigger operation on the graphical user interface, the finger of the current game player does not leave the graphical user interface and continues to perform the third trigger operation.
The third triggering operation may be a touch operation on the touch area after the interactive selection control displayed in the touch area is switched to the target functional control.
Specifically, when the first triggering operation transitions into the third triggering operation, the current first touch parameter of the first triggering operation may be used as the third initial touch parameter of the third triggering operation. For example, the current first touch parameter may include the target position of the touch point in the touch area, the sliding direction, and the like, any of which can serve as the third initial touch parameter of the third triggering operation.
Furthermore, the first virtual object can be controlled to execute the game action corresponding to the target function control on the basis of the third initial touch parameter.
For example, referring to fig. 18, fig. 18 is a schematic view of an application scenario of another game interaction method according to an embodiment of the present application. In the graphical user interface 10 shown on the upper side of fig. 18, the touch area may display the interactive selection control 32. When a first trigger operation performed by the current game player on the interactive selection control 32 displayed in the touch area is detected, "interactive material" is determined to be the target interactive item, and the first virtual object 111 may be controlled to execute the game behavior corresponding to "interactive material".
In the process in which the first virtual object 111 executes the game behavior corresponding to "interactive material", when an interaction termination event is detected, the interactive selection control 32 displayed in the touch area may be switched back to the movement control 12. If the finger of the current game player does not leave the graphical user interface 10, it may be determined that the current game player continues with the third triggering operation on the movement control 12 after the first triggering operation on the interactive selection control 32, resulting in the graphical user interface 10 shown on the lower side of fig. 18.
The current first touch parameter of the first triggering operation may be, for example, a rightward slide in the touch area. This rightward slide can further serve as the initial touch parameter of the third triggering operation, according to which the first virtual object 111 is controlled to move rightward in the game scene.
In this scheme, if an interaction termination event is detected during the touch operation on the interactive selection control 32, the interactive selection control 32 may be switched to the movement control 12 in the graphical user interface 10, while the touch operation on the movement control 12 is maintained according to the operation parameters of the touch operation on the interactive selection control 32. The player therefore does not need to lift a finger and operate again: by maintaining one touch operation across the two controls, the first virtual object 111 can quickly be controlled to move in the game scene.
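The reverse handoff can likewise be sketched. Here, as an assumption for illustration only, the inherited first touch parameter is reduced to a slide direction, and the restored movement control immediately maps it to a movement vector (the direction table, screen-space convention, and function name are not taken from the application):

```python
# Screen-space convention assumed here: +x is right, +y is down.
DIRECTION_VECTORS = {
    "right": (1, 0),
    "left": (-1, 0),
    "up": (0, -1),
    "down": (0, 1),
}

def resume_movement(current_first_direction: str):
    # On an interaction termination event, the current first touch parameter
    # (here just the slide direction) seeds the third trigger operation on the
    # restored movement control, so the character keeps moving at once.
    return DIRECTION_VECTORS.get(current_first_direction, (0, 0))

print(resume_movement("right"))  # → (1, 0)
```

A rightward slide held over the control swap thus translates directly into rightward movement, matching the fig. 18 example.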
The embodiment of the application discloses a game interaction method, which includes: in response to an interaction triggering event, displaying an interaction interface in an overlaid manner on the graphical user interface, where the interaction interface includes at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to execute a corresponding game behavior; in response to a first triggering operation triggered in a touch area on the graphical user interface, determining a target interaction item from the at least one interaction item according to the first triggering operation; and, in the process of maintaining the first trigger operation, in response to an interaction termination event, controlling the first virtual object to execute the game action in the game scene according to the current first touch parameter of the first trigger operation, thereby improving the game operation experience of the player.
In order to facilitate better implementation of the game interaction method provided by the embodiment of the application, the embodiment of the application also provides a game interaction device based on the game interaction method. The terms used below have the same meanings as in the game interaction method above; for specific implementation details, refer to the description of the method embodiments.
Referring to fig. 19, fig. 19 is a block diagram of a game interaction device according to an embodiment of the present application, where the device includes:
A first display unit 301, configured to display, in response to an interaction triggering event, an interaction interface overlaid on the graphical user interface, where the interaction interface includes at least one interaction entry corresponding to the interaction triggering event, and the interaction entry is configured to control the first virtual object to execute a corresponding game behavior;
A first determining unit 302, configured to respond to a first triggering operation triggered by a touch area on the graphical user interface, and determine a target interaction item from the at least one interaction item according to the first triggering operation;
The first control unit 303 is configured to, in a process of maintaining the first trigger operation, respond to an interaction termination event, and control the first virtual object to execute a game action in the game scene according to a current first touch parameter of the first trigger operation.
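As a rough illustration only, the three units above could be wired together as follows. The class, method names, and the reduction of trigger operations to an index and a string are assumptions made for this sketch, not part of the device disclosed in the application:

```python
class GameInteractionDevice:
    """Toy wiring of the three units of fig. 19 (all names illustrative)."""
    def __init__(self) -> None:
        self.entries = []          # interaction entries shown on the interface
        self.target_entry = None   # entry picked by the first trigger operation

    def on_interaction_trigger(self, entries) -> None:
        # First display unit 301: show the interaction interface with its
        # interaction entries overlaid on the graphical user interface.
        self.entries = list(entries)

    def on_first_trigger(self, index: int) -> None:
        # First determining unit 302: determine the target interaction entry
        # from the first trigger operation (reduced here to an index).
        self.target_entry = self.entries[index]

    def on_interaction_termination(self, touch_param: str) -> str:
        # First control unit 303: on the interaction termination event, run the
        # game behavior using the current first touch parameter.
        return f"perform '{self.target_entry}' using {touch_param}"

dev = GameInteractionDevice()
dev.on_interaction_trigger(["interactive material", "chat"])
dev.on_first_trigger(0)
print(dev.on_interaction_termination("slide-right"))
```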
In some embodiments, the apparatus may further comprise:
And the second control unit is used for responding to a second trigger operation triggered by the touch control area and controlling the first virtual object to execute the game action in the game scene according to the second trigger operation.
In some embodiments, the second control unit may include:
And the first control subunit is used for controlling the first virtual object to perform game actions in the game scene according to the current second touch parameters of the second trigger operation.
In some embodiments, the first determining unit 302 may include:
The first determining subunit is configured to determine a first initial touch parameter of the first triggering operation according to the current second touch parameter, and determine a target interaction item from the at least one interaction item based on the first initial touch parameter in response to the first triggering operation.
In some embodiments, the apparatus may further comprise:
And the third control unit is used for determining a third initial touch parameter of a third trigger operation according to the current first touch parameter, and controlling the first virtual object to execute a game action in the game scene on the basis of the third initial touch parameter according to the third trigger operation, wherein the third trigger operation is continuous with the first trigger operation.
In some embodiments, the apparatus may further comprise:
And the hiding unit is used for responding to the interaction triggering event, switching the target functional control displayed in the touch area into an interaction selection control and hiding an operation control which is overlapped with the display position of the interaction interface on the graphical user interface.
In some embodiments, the hidden unit may include:
And the hiding subunit is used for hiding the target functional control on the graphical user interface and displaying the interactive selection control on the interactive interface at the position corresponding to the target functional control if the target functional control is overlapped with the display position of the interactive interface.
In some embodiments, the first determining unit 302 may include:
and the second determining subunit is used for determining a target interaction item from the at least one interaction item in response to the touch operation of the interaction selection control.
In some embodiments, the first control unit 303 may include:
And the second control subunit is used for controlling the first virtual object to execute the game action corresponding to the target functional control in the game scene according to the current first touch parameter of the first trigger operation.
In some embodiments, the apparatus may further comprise:
the first closing unit is used for responding to the closing operation of the interactive interface, closing the interactive interface, switching the interactive selection control displayed in the touch area into the target functional control, and displaying an operation control overlapped with the display position of the interactive interface on the graphical user interface.
In some embodiments, the apparatus may further comprise:
And the second closing unit is used for responding to an interaction termination event, closing the interaction interface and displaying an operation control overlapped with the display position of the interaction interface on the graphical user interface.
In some embodiments, the apparatus may further comprise:
The first switching unit is used for switching the display style of the target interaction item on the interaction interface.
And the second switching unit is used for responding to the cancellation event of the target interaction item and switching the display mode of the target interaction item into an initial display mode.
In some embodiments, the first display unit 301 may include:
The first display subunit is used for displaying an interaction control on the graphical user interface in response to the fact that the distance between the first virtual object and the second virtual object in the game scene is smaller than a preset distance;
and the second display subunit is used for responding to the touch operation for the interactive control and displaying the interactive interface in a superposition way on the graphical user interface.
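The distance condition used by the first display subunit might be implemented, under the assumption of a 2-D scene with a Euclidean metric, roughly as follows (the threshold value, coordinate representation, and function name are illustrative, not from the application):

```python
import math

PRESET_DISTANCE = 3.0  # illustrative threshold, in scene units

def should_show_interaction_control(first_pos, second_pos,
                                    preset=PRESET_DISTANCE):
    # Display the interaction control only when the first virtual object is
    # within the preset distance of the second virtual object.
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    return math.hypot(dx, dy) < preset

print(should_show_interaction_control((0, 0), (2, 2)))  # ~2.83 < 3.0 → True
print(should_show_interaction_control((0, 0), (5, 0)))  # 5.0 >= 3.0 → False
```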
The embodiment of the application discloses a game interaction device: the first display unit 301, in response to an interaction triggering event, displays an interaction interface overlaid on the graphical user interface, where the interaction interface includes at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to execute a corresponding game behavior; the first determining unit 302, in response to a first triggering operation triggered in the touch area on the graphical user interface, determines a target interaction item from the at least one interaction item according to the first triggering operation; and the first control unit 303, in the process of maintaining the first trigger operation and in response to an interaction termination event, controls the first virtual object to execute the game action in the game scene according to the current first touch parameter of the first trigger operation. In this way, the game operation experience of the player can be improved.
Correspondingly, the embodiment of the application also provides computer equipment, which can be a terminal. As shown in fig. 20, fig. 20 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 600 includes a processor 601 having one or more processing cores, a memory 602 having one or more computer-readable storage media, and a computer program stored on the memory 602 and executable on the processor. The processor 601 is electrically connected to the memory 602. It will be appreciated by those skilled in the art that the computer device structure shown in the figure does not limit the computer device, which may include more or fewer components than shown, combine certain components, or arrange components differently.
The processor 601 is a control center of the computer device 600, connects various parts of the entire computer device 600 using various interfaces and lines, and performs various functions of the computer device 600 and processes data by running or loading software programs and/or modules stored in the memory 602, and calling data stored in the memory 602, thereby performing overall monitoring of the computer device 600.
In an embodiment of the present application, the processor 601 in the computer device 600 loads instructions corresponding to the processes of one or more application programs into the memory 602, and executes the application programs stored in the memory 602, thereby implementing various functions according to the following steps:
In response to an interaction triggering event, displaying an interaction interface in an overlaid manner on the graphical user interface, where the interaction interface includes at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to execute a corresponding game behavior;
Responding to a first triggering operation triggered by a touch area on a graphical user interface, and determining a target interaction item from at least one interaction item according to the first triggering operation;
In the process of maintaining the first trigger operation, responding to the interaction termination event, and controlling the first virtual object to execute the game action in the game scene according to the current first touch parameter of the first trigger operation.
In some embodiments, prior to responding to the interaction triggering event, the method further comprises:
And responding to a second triggering operation triggered by the touch area, and controlling the first virtual object to execute the game action in the game scene according to the second triggering operation.
In some embodiments, the first trigger operation is a continuous operation with the second trigger operation.
In some embodiments, controlling the first virtual object to perform a game action in the game scene according to the second trigger operation includes:
Controlling the first virtual object to perform game actions in the game scene according to the current second touch parameters of the second triggering operation;
Determining a target interaction item from the at least one interaction item according to a first triggering operation, including:
And determining a first initial touch parameter of the first trigger operation according to the current second touch parameter, and determining a target interaction item from at least one interaction item on the basis of the first initial touch parameter in response to the first trigger operation.
In some embodiments, after controlling the first virtual object to perform the game action in the game scene according to the current first touch parameter of the first trigger operation, the method further includes:
And determining a third initial touch parameter of a third trigger operation according to the current first touch parameter, and controlling the first virtual object to execute a game action in the game scene on the basis of the third initial touch parameter according to the third trigger operation, wherein the third trigger operation is continuous with the first trigger operation.
In some embodiments, the touch area displays a target functionality control, and the method further includes:
And responding to the interaction triggering event, switching the target functional control displayed in the touch area into an interaction selection control, and hiding an operation control which is overlapped with the display position of the interaction interface on the graphical user interface.
In some embodiments, switching the target functionality control displayed in the touch area to the interactive selection control includes:
If the target functional control is overlapped with the display position of the interactive interface, hiding the target functional control on the graphical user interface, and displaying the interactive selection control at the position corresponding to the target functional control on the interactive interface.
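The overlap test implied here could look roughly like the following axis-aligned rectangle check. The `(x, y, w, h)` rectangle representation and the function names are assumptions made for this sketch; the application does not specify how overlap is computed:

```python
def rects_overlap(a, b):
    # a, b: (x, y, w, h) rectangles in screen coordinates (illustrative).
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_interaction_selection_control(func_control_rect, interface_rect):
    # If the target functional control overlaps the display position of the
    # interactive interface, hide the functional control and show the
    # interactive selection control at the functional control's position.
    if rects_overlap(func_control_rect, interface_rect):
        return {"hide_functional_control": True,
                "selection_control_rect": func_control_rect}
    return {"hide_functional_control": False,
            "selection_control_rect": None}

print(place_interaction_selection_control((10, 10, 50, 50),
                                          (40, 40, 200, 200)))
```

Keeping the selection control at the functional control's own position is what lets a held touch carry over without the finger moving.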
In some embodiments, in response to a first trigger operation triggered by a touch area on a graphical user interface, determining a target interaction entry from at least one interaction entry according to the first trigger operation includes:
In response to a touch operation for the interactive selection control, a target interactive entry is determined from the at least one interactive entry.
In some embodiments, controlling the first virtual object to perform a game action in the game scene according to the current first touch parameter of the first trigger operation includes:
and controlling the first virtual object to execute the game action corresponding to the target function control in the game scene according to the current first touch parameter of the first trigger operation.
In some embodiments, the method further comprises:
And closing the interactive interface in response to the closing operation of the interactive interface, switching the interactive selection control displayed in the touch area into a target functional control, and displaying an operation control overlapping with the display position of the interactive interface on the graphical user interface.
In some embodiments, the method further comprises:
And closing the interactive interface in response to the interaction termination event, and displaying an operation control overlapping with the display position of the interactive interface on the graphical user interface.
In some embodiments, after determining the target interaction entry from the at least one interaction entry according to the first triggering operation, the method further comprises:
and switching the display style of the target interaction item on the interaction interface.
And responding to a cancel event of the target interaction item, and switching the display mode of the target interaction item to an initial display mode.
In some embodiments, responsive to triggering an interaction trigger event, overlaying a display of an interaction interface on a graphical user interface includes:
Responsive to the distance between the first virtual object and the second virtual object in the game scene being less than a preset distance, displaying an interactive control on the graphical user interface;
and responding to touch operation for the interactive control, and displaying the interactive interface in a superposition way on the graphical user interface.
In the embodiment of the application, in response to an interaction triggering event, an interaction interface is displayed in an overlaid manner on a graphical user interface, where the interaction interface includes at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control a first virtual object to execute a corresponding game behavior; in response to a first triggering operation triggered in a touch area on the graphical user interface, a target interaction item is determined from the at least one interaction item according to the first triggering operation; and, in the process of maintaining the first trigger operation, in response to an interaction termination event, the first virtual object is controlled to execute the game action in the game scene according to the current first touch parameter of the first trigger operation. Therefore, the game player can seamlessly switch triggering among different operation controls, improving the game operation experience of the player.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 20, the computer device 600 further includes: a touch display 603, a radio frequency circuit 604, an audio circuit 605, an input unit 606, and a power supply 607. The processor 601 is electrically connected to the touch display 603, the radio frequency circuit 604, the audio circuit 605, the input unit 606, and the power supply 607, respectively. Those skilled in the art will appreciate that the computer device structure shown in fig. 20 does not limit the computer device, which may include more or fewer components than shown, combine certain components, or arrange components differently.
The touch display 603 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 603 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which in turn execute corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the touch-point coordinates to the processor 601, and can receive and execute commands sent by the processor 601. The touch panel may overlay the display panel; upon detecting a touch operation on or near it, the touch panel passes the operation to the processor 601 to determine the type of touch event, and the processor 601 then provides a corresponding visual output on the display panel based on the type of touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 603 to implement input and output functions.
In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display 603 may also implement an input function as part of the input unit 606.
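As a hedged illustration of the signal path described above (the component functions, the raw-signal format, and the dispatch table are all assumptions of this sketch, not taken from the application): the touch detection device senses a raw touch, the touch controller converts it into touch-point coordinates, and the processor maps the event type to a visual output.

```python
def touch_controller(raw_signal):
    # Convert a raw detection signal into touch-point coordinates
    # (raw pixel position normalized by an assumed dpi value).
    return {"x": raw_signal["px"] / raw_signal["dpi"],
            "y": raw_signal["py"] / raw_signal["dpi"],
            "type": raw_signal["type"]}

def processor_dispatch(touch_info):
    # The processor decides the visual output from the touch event type.
    outputs = {"tap": "highlight control", "drag": "move joystick knob"}
    return outputs.get(touch_info["type"], "no-op")

event = touch_controller({"px": 540, "py": 960, "dpi": 540, "type": "drag"})
print(processor_dispatch(event))  # → move joystick knob
```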
The radio frequency circuit 604 may be configured to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or other computer devices and exchange signals with them.
The audio circuit 605 may be used to provide an audio interface between the user and the computer device through a speaker, a microphone, and so on. On one hand, the audio circuit 605 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 605 and converted into audio data. The audio data is then output to the processor 601 for processing and, for example, sent to another computer device via the radio frequency circuit 604, or output to the memory 602 for further processing. The audio circuit 605 may also include an earbud jack to provide communication between peripheral headphones and the computer device.
The input unit 606 may be used to receive entered numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), as well as to generate keyboard, mouse, joystick, optical, or trackball signal inputs associated with user settings and function control.
The power supply 607 is used to power the various components of the computer device 600. Alternatively, the power supply 607 may be logically connected to the processor 601 through a power management system, so as to perform functions of managing charging, discharging, and power consumption management through the power management system. The power supply 607 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 20, the computer device 600 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which will not be described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment may, in response to an interaction triggering event, display an interaction interface overlaid on the graphical user interface, where the interaction interface includes at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to execute a corresponding game behavior; in response to a first triggering operation triggered in the touch area on the graphical user interface, determine a target interaction item from the at least one interaction item according to the first triggering operation; and, in the process of maintaining the first trigger operation and in response to an interaction termination event, control the first virtual object to execute the game action in the game scene according to the current first touch parameter of the first trigger operation. In this way, the game operation experience of the player can be improved.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer readable storage medium having stored therein a plurality of computer programs that can be loaded by a processor to perform steps in any of the game interaction methods provided by the embodiment of the present application. For example, the computer program may perform the steps of:
In response to an interaction triggering event, displaying an interaction interface in an overlaid manner on the graphical user interface, where the interaction interface includes at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to execute a corresponding game behavior;
Responding to a first triggering operation triggered by a touch area on a graphical user interface, and determining a target interaction item from at least one interaction item according to the first triggering operation;
In the process of maintaining the first trigger operation, responding to the interaction termination event, and controlling the first virtual object to execute the game action in the game scene according to the current first touch parameter of the first trigger operation.
In some embodiments, prior to responding to the interaction triggering event, the method further comprises:
And responding to a second triggering operation triggered by the touch area, and controlling the first virtual object to execute the game action in the game scene according to the second triggering operation.
In some embodiments, the first trigger operation is a continuous operation with the second trigger operation.
In some embodiments, controlling the first virtual object to perform a game action in the game scene according to the second trigger operation includes:
Controlling the first virtual object to perform game actions in the game scene according to the current second touch parameters of the second triggering operation;
Determining a target interaction item from the at least one interaction item according to a first triggering operation, including:
And determining a first initial touch parameter of the first trigger operation according to the current second touch parameter, and determining a target interaction item from at least one interaction item on the basis of the first initial touch parameter in response to the first trigger operation.
In some embodiments, after controlling the first virtual object to perform the game action in the game scene according to the current first touch parameter of the first trigger operation, the method further includes:
And determining a third initial touch parameter of a third trigger operation according to the current first touch parameter, and controlling the first virtual object to execute a game action in the game scene on the basis of the third initial touch parameter according to the third trigger operation, wherein the third trigger operation is continuous with the first trigger operation.
In some embodiments, a target functionality control is displayed in the touch area, and the method further includes:
in response to the interaction triggering event, switching the target functionality control displayed in the touch area to an interaction selection control, and hiding any operation control on the graphical user interface whose display position overlaps that of the interaction interface.
In some embodiments, switching the target functionality control displayed in the touch area to the interaction selection control includes:
if the display position of the target functionality control overlaps that of the interaction interface, hiding the target functionality control on the graphical user interface and displaying the interaction selection control on the interaction interface at the position corresponding to the target functionality control.
In some embodiments, in response to the first trigger operation triggered in the touch area on the graphical user interface, determining the target interaction item from the at least one interaction item according to the first trigger operation includes:
in response to a touch operation on the interaction selection control, determining the target interaction item from the at least one interaction item.
In some embodiments, controlling the first virtual object to perform the game action in the game scene according to the current first touch parameter of the first trigger operation includes:
controlling the first virtual object to perform, in the game scene, the game action corresponding to the target functionality control according to the current first touch parameter of the first trigger operation.
In some embodiments, the method further includes:
in response to a closing operation on the interaction interface, closing the interaction interface, switching the interaction selection control displayed in the touch area back to the target functionality control, and displaying the operation control whose display position overlaps that of the interaction interface on the graphical user interface.
In some embodiments, the method further includes:
in response to the interaction termination event, closing the interaction interface and displaying the operation control whose display position overlaps that of the interaction interface on the graphical user interface.
In some embodiments, after determining the target interaction item from the at least one interaction item according to the first trigger operation, the method further includes:
switching the display style of the target interaction item on the interaction interface;
and in response to a cancellation event for the target interaction item, switching the display style of the target interaction item back to its initial display style.
In some embodiments, displaying the interaction interface superimposed on the graphical user interface in response to the interaction triggering event includes:
in response to the distance between the first virtual object and a second virtual object in the game scene being less than a preset distance, displaying an interactive control on the graphical user interface;
and in response to a touch operation on the interactive control, displaying the interaction interface superimposed on the graphical user interface.
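The distance-gated display of the interactive control can be sketched as below. The threshold value and function name are assumptions, since the patent leaves the preset distance unspecified:

```python
import math

PRESET_DISTANCE = 3.0  # illustrative threshold; not specified by the patent

def should_show_interactive_control(first_pos, second_pos,
                                    threshold=PRESET_DISTANCE) -> bool:
    """Show the interactive control only when the first virtual object is
    within the preset distance of the second virtual object."""
    return math.dist(first_pos, second_pos) < threshold

# Within range: the control appears; out of range: it does not.
assert should_show_interactive_control((0, 0, 0), (1, 2, 0))
assert not should_show_interactive_control((0, 0, 0), (4, 0, 0))
```

A touch on the control shown by this check is what then triggers the superimposed interaction interface.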
In the embodiments of the present application, in response to an interaction triggering event, an interaction interface is displayed superimposed on the graphical user interface, where the interaction interface includes at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to perform a corresponding game behavior; in response to a first trigger operation triggered in a touch area on the graphical user interface, a target interaction item is determined from the at least one interaction item according to the first trigger operation; and while the first trigger operation is maintained, in response to an interaction termination event, the first virtual object is controlled to perform a game action in the game scene according to the current first touch parameter of the first trigger operation. This allows the player to switch seamlessly among different operation controls, improving the player's operating experience.
For the specific implementation of each of the above operations, reference may be made to the previous embodiments; details are not repeated here.
The computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
Because the computer program stored in the computer-readable storage medium can perform the steps of any game interaction method provided by the embodiments of the present application, it can achieve the beneficial effects of any such method; these are detailed in the previous embodiments and are not repeated here.
The game interaction method, device, computer-readable storage medium, and computer equipment provided by the embodiments of the present application have been described above in detail. Specific examples are used herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is intended only to help in understanding the method and its core ideas. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope in light of the ideas of the present application; therefore, the content of this specification should not be construed as limiting the present application.

Claims (16)

1. A game interaction method, characterized in that a graphical user interface is provided by a terminal device, content displayed on the graphical user interface comprising at least part of a game scene and a first virtual object controlled by the terminal device in the game scene, the method comprising:
in response to an interaction triggering event, displaying an interaction interface superimposed on the graphical user interface, wherein the interaction interface comprises at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to perform a corresponding game behavior;
in response to a first trigger operation triggered in a touch area on the graphical user interface, determining a target interaction item from the at least one interaction item according to the first trigger operation; and
while the first trigger operation is maintained, in response to an interaction termination event, controlling the first virtual object to perform a game action in the game scene according to a current first touch parameter of the first trigger operation.
2. The method of claim 1, wherein before responding to the interaction triggering event, the method further comprises:
in response to a second trigger operation triggered in the touch area, controlling the first virtual object to perform the game action in the game scene according to the second trigger operation.
3. The method of claim 2, wherein the first trigger operation is continuous with the second trigger operation.
4. The method of claim 3, wherein the controlling the first virtual object to perform the game action in the game scene according to the second trigger operation comprises:
controlling the first virtual object to perform the game action in the game scene according to a current second touch parameter of the second trigger operation; and
the determining a target interaction item from the at least one interaction item according to the first trigger operation comprises:
determining a first initial touch parameter of the first trigger operation according to the current second touch parameter, and, in response to the first trigger operation, determining the target interaction item from the at least one interaction item on the basis of the first initial touch parameter.
5. The method of claim 4, further comprising, after controlling the first virtual object to perform the game action in the game scene according to the current first touch parameter of the first trigger operation:
determining a third initial touch parameter of a third trigger operation according to the current first touch parameter, and controlling the first virtual object to perform a game action in the game scene according to the third trigger operation on the basis of the third initial touch parameter, wherein the third trigger operation is continuous with the first trigger operation.
6. The method of claim 1, wherein a target functionality control is displayed in the touch area, the method further comprising:
in response to the interaction triggering event, switching the target functionality control displayed in the touch area to an interaction selection control, and hiding an operation control on the graphical user interface whose display position overlaps that of the interaction interface.
7. The method of claim 6, wherein the switching the target functionality control displayed in the touch area to an interaction selection control comprises:
if the display position of the target functionality control overlaps that of the interaction interface, hiding the target functionality control on the graphical user interface and displaying the interaction selection control on the interaction interface at the position corresponding to the target functionality control.
8. The method of claim 6, wherein the determining a target interaction item from the at least one interaction item according to the first trigger operation in response to a first trigger operation triggered in a touch area on the graphical user interface comprises:
determining the target interaction item from the at least one interaction item in response to a touch operation on the interaction selection control.
9. The method of claim 6, wherein the controlling the first virtual object to perform a game action in the game scene according to the current first touch parameter of the first trigger operation comprises:
controlling the first virtual object to perform, in the game scene, the game action corresponding to the target functionality control according to the current first touch parameter of the first trigger operation.
10. The method of claim 6, further comprising:
in response to a closing operation on the interaction interface, closing the interaction interface, switching the interaction selection control displayed in the touch area back to the target functionality control, and displaying the operation control whose display position overlaps that of the interaction interface on the graphical user interface.
11. The method of claim 6, further comprising:
in response to an interaction termination event, closing the interaction interface and displaying the operation control whose display position overlaps that of the interaction interface on the graphical user interface.
12. The method of claim 1, wherein after the determining a target interaction item from the at least one interaction item according to the first trigger operation, the method further comprises:
switching the display style of the target interaction item on the interaction interface; and
in response to a cancellation event for the target interaction item, switching the display style of the target interaction item back to an initial display style.
13. The method of claim 1, wherein the displaying an interaction interface superimposed on the graphical user interface in response to an interaction triggering event comprises:
in response to a distance between the first virtual object and a second virtual object in the game scene being less than a preset distance, displaying an interactive control on the graphical user interface; and
in response to a touch operation on the interactive control, displaying the interaction interface superimposed on the graphical user interface.
14. A game interaction device, characterized in that a graphical user interface is provided by a terminal device, content displayed on the graphical user interface comprising at least part of a game scene and a first virtual object controlled by the terminal device in the game scene, the device comprising:
a first display unit, configured to display an interaction interface superimposed on the graphical user interface in response to an interaction triggering event, wherein the interaction interface comprises at least one interaction item corresponding to the interaction triggering event, and the interaction item is configured to control the first virtual object to perform a corresponding game behavior;
a first determining unit, configured to determine a target interaction item from the at least one interaction item according to a first trigger operation in response to the first trigger operation being triggered in a touch area on the graphical user interface; and
a first control unit, configured to control, while the first trigger operation is maintained and in response to an interaction termination event, the first virtual object to perform a game action in the game scene according to a current first touch parameter of the first trigger operation.
15. A computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the game interaction method of any one of claims 1 to 13 when executing the program.
16. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the game interaction method of any one of claims 1 to 13.
CN202410160492.4A 2024-02-04 2024-02-04 Game interaction method and device, computer equipment and storage medium Pending CN117942567A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410160492.4A CN117942567A (en) 2024-02-04 2024-02-04 Game interaction method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117942567A true CN117942567A (en) 2024-04-30

Family

ID=90794376


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination