CN117753001A - Interaction method and device in game, computer equipment and storage medium - Google Patents


Info

Publication number
CN117753001A
Authority
CN
China
Prior art keywords
game
interaction
drop point
graphical user interface
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410037172.XA
Other languages
Chinese (zh)
Inventor
杨丽君
周琪然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202410037172.XA
Publication of CN117753001A


Abstract

The application discloses an interaction method and apparatus in a game, a computer device, and a storage medium, comprising: displaying a first game scene, a displacement control, and an interaction control on a graphical user interface; in response to a first trigger operation acting on the displacement control, controlling a target virtual object to perform a displacement behavior in the first game scene, and displaying a drop point indicator on the graphical user interface; in response to a second trigger operation acting on the interaction control, displaying a drop point offset indicator on the graphical user interface according to the second trigger operation; and in response to the end of the second trigger operation, controlling the target virtual object to perform a target interaction behavior on a first interaction object. According to the embodiments of the application, the player can adjust the direction and strength applied to the interaction object according to the drop point indicator and the drop point offset indicator, which enriches game play, greatly improves the playability of the game content, effectively increases the richness of game interaction modes, and improves the player's game experience and immersion in the game.

Description

Interaction method and device in game, computer equipment and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to an interaction method and apparatus in a game, a computer device, and a storage medium.
Background
To satisfy people's pursuit of recreation, entertainment games that can run on terminals have been developed, for example role-playing games, competitive games, shooting games, or massively multiplayer online role-playing games built on a client or server architecture. In such a game, the player operates a virtual character on the screen and, from the first-person or third-person perspective of that character, can walk, run, jump, pick up props, fight, and perform other related operations in the game scene, so that the player experiences the visual impact of the game in an immersive manner, greatly enhancing the game's sense of participation and realism. For example, a player may play against virtual characters controlled by other players, or against non-player characters (NPCs) in the game.
Currently, in virtual ball games, a plurality of virtual players are present on a virtual court in the game scene, and a player can control a corresponding virtual player to perform shooting operations on the virtual court, thereby interacting with the game or with other virtual players. In existing games, however, when a player controls a virtual player to shoot, the ball simply flies off after a single touch operation. The interaction between the player and the virtual ball during the shooting stage is monotonous; repeating the same shooting operation on the virtual ball quickly becomes boring and tedious, the realism of the game is poor, and the player's game experience suffers.
Disclosure of Invention
According to the interaction method and apparatus in a game, computer device, and storage medium provided herein, a drop point indicator is displayed when a target interaction event occurs. When the player controls the target virtual object to adjust the virtual ball, a drop point offset indicator shows the offset effect that the player's adjustment operation has on the drop point of the virtual ball, guiding the player to adjust the direction and strength applied to the virtual ball according to the drop point indicator and the drop point offset indicator. This enriches game play, greatly improves the playability of the game content, effectively increases the richness of game interaction modes, and improves the player's game experience and immersion in the game.
An embodiment of the present application provides an interaction method in a game, in which a graphical user interface is provided through a terminal device, the method comprising:
displaying a first game scene, a displacement control, and an interaction control on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device;
in response to a first trigger operation acting on the displacement control, controlling the target virtual object to perform a displacement behavior in the first game scene, and displaying a drop point indicator on the graphical user interface, where the real-time display position of the drop point indicator is determined based on the real-time displacement behavior of the target virtual object and indicates the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object;
in response to a second trigger operation acting on the interaction control, displaying a drop point offset indicator on the graphical user interface according to the second trigger operation, where the drop point offset indicator indicates the offset effect of the second trigger operation on the real-time drop point position of the first interaction object;
and in response to the end of the second trigger operation, controlling the target virtual object to perform the target interaction behavior on the first interaction object.
Correspondingly, an embodiment of the present application further provides an interaction apparatus in a game, which provides a graphical user interface through a terminal device, the apparatus comprising:
a first display unit, configured to display a first game scene, a displacement control, and an interaction control on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device;
a first control unit, configured to, in response to a first trigger operation acting on the displacement control, control the target virtual object to perform a displacement behavior in the first game scene and display a drop point indicator on the graphical user interface, where the real-time display position of the drop point indicator is determined based on the real-time displacement behavior of the target virtual object and indicates the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object;
a second display unit, configured to, in response to a second trigger operation acting on the interaction control, display a drop point offset indicator on the graphical user interface according to the second trigger operation, where the drop point offset indicator indicates the offset effect of the second trigger operation on the real-time drop point position of the first interaction object;
and a second control unit, configured to, in response to the end of the second trigger operation, control the target virtual object to perform the target interaction behavior on the first interaction object.
Accordingly, an embodiment of the present application further provides a computer device including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of any of the above interaction methods in a game.
Accordingly, an embodiment of the present application further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the above interaction methods in a game.
Embodiments of the present application provide an interaction method and apparatus in a game, a computer device, and a storage medium. First, a first game scene, a displacement control, and an interaction control are displayed on a graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by a terminal device. Then, in response to a first trigger operation acting on the displacement control, the target virtual object is controlled to perform a displacement behavior in the first game scene, and a drop point indicator is displayed on the graphical user interface, where the real-time display position of the drop point indicator is determined based on the real-time displacement behavior of the target virtual object and indicates the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object. Next, in response to a second trigger operation acting on the interaction control, a drop point offset indicator is displayed on the graphical user interface according to the second trigger operation, where the drop point offset indicator indicates the offset effect of the second trigger operation on the real-time drop point position of the first interaction object. Finally, in response to the end of the second trigger operation, the target virtual object is controlled to perform the target interaction behavior on the first interaction object. In this way, the drop point indicator can be displayed when a target interaction event occurs; when the player controls the target virtual object to adjust the virtual ball, the drop point offset indicator shows the offset effect of the player's adjustment operation on the drop point of the virtual ball, guiding the player to adjust the direction and strength applied to the virtual ball according to the drop point indicator and the drop point offset indicator. This enriches game play, greatly improves the playability of the game content, effectively increases the richness of game interaction modes, and improves the player's game experience and immersion in the game.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of a scenario of an interactive system in a game provided in an embodiment of the present application.
Fig. 2 is a schematic flow chart of an interaction method in a game according to an embodiment of the present application.
Fig. 3 is a schematic view of a scenario of an interaction method in a game according to an embodiment of the present application.
Fig. 4 is another schematic view of an interaction method in a game according to an embodiment of the present application.
Fig. 5 is another schematic view of an interaction method in a game according to an embodiment of the present application.
Fig. 6 is another schematic view of an interaction method in a game according to an embodiment of the present application.
Fig. 7a is another schematic view of an interaction method in a game according to an embodiment of the present application.
Fig. 7b is another schematic view of an interaction method in a game according to an embodiment of the present application.
Fig. 8 is another schematic view of an interaction method in a game according to an embodiment of the present application.
Fig. 9 is another schematic view of an interaction method in a game according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of an interactive device in a game according to an embodiment of the present application.
Fig. 11 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
Embodiments of the present application provide an interaction method and apparatus in a game, a computer device, and a storage medium. Specifically, the interaction method in the game of the embodiments of the present application may be executed by a computer device, where the computer device may be a terminal or a server. The terminal may be a device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and the terminal may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, big data, and artificial intelligence platforms.
For example, when the interaction method in the game runs on the terminal, the terminal device stores a game application and is used to present a virtual scene in a game screen. The terminal device interacts with the user through a graphical user interface; for example, the terminal device downloads, installs, and runs a game application. The terminal device may present the graphical user interface to the user in a variety of ways; for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen and a processor, where the touch display screen is used to present the graphical user interface, which includes game screens, and to receive operation instructions generated by the user acting on the graphical user interface, and the processor is used to run the game, generate the graphical user interface, respond to the operation instructions, and control the display of the graphical user interface on the touch display screen.
For example, when the interaction method in the game runs on a server, the game may be a cloud game. Cloud gaming refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game application and the body that presents the game picture are separated: the storage and running of the interaction method in the game are completed on a cloud game server, while the game picture is presented at a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game pictures; for example, it may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, but the terminal device that processes the game data is the cloud game server in the cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, returns the data to the cloud game client through the network, and finally the cloud game client decodes the data and outputs the game pictures.
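The round trip just described can be summarized in a short sketch. This is a hedged illustration only: the patent specifies no API, so the CloudGameClient class, the decode and present callbacks, and the use of a WebSocket as the transport are all assumptions.

```typescript
interface OperationInstruction {
  control: "displacement" | "interaction"; // which on-screen control was touched
  phase: "start" | "move" | "end";
  x: number;
  y: number;
}

class CloudGameClient {
  constructor(
    private socket: WebSocket,
    private decode: (bytes: Uint8Array) => Promise<ImageBitmap>, // e.g. a video decoder
    private present: (frame: ImageBitmap) => void,               // draw on the local screen
  ) {
    socket.binaryType = "arraybuffer";
  }

  // The client only forwards input; the game itself runs on the cloud game server.
  sendOperation(op: OperationInstruction): void {
    this.socket.send(JSON.stringify(op));
  }

  // Receive an encoded, compressed frame, decode it locally, and output the picture.
  start(): void {
    this.socket.onmessage = async (ev: MessageEvent<ArrayBuffer>) => {
      const frame = await this.decode(new Uint8Array(ev.data));
      this.present(frame);
    };
  }
}
```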
Referring to fig. 1, fig. 1 is a schematic view of a scene of an interaction system in a game according to an embodiment of the present application. The system may include at least one terminal, at least one server, at least one database, and a network. A terminal held by a user can be connected to servers of different games through the network. A terminal is any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, when the system includes a plurality of terminals, a plurality of servers, and a plurality of networks, different terminals may be connected to each other through different networks and different servers. The network may be a wireless network or a wired network, such as a wireless local area network (WLAN), a local area network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, or a 5G network. In addition, different terminals may also be connected to other terminals or to servers using their own Bluetooth networks or hotspot networks. For example, multiple users may be online on different terminals, connected and synchronized with each other through an appropriate network, to support multiplayer gaming. In addition, the system may include a plurality of databases coupled to different servers, and information related to the game environment may be continuously stored in the databases while different users play the multiplayer game online.
The embodiments of the present application provide an interaction method in a game, which can be executed by a terminal or a server; the embodiments are described taking execution by the terminal as an example. The terminal may include a touch display screen and a processor (of course, the terminal may also use peripheral devices such as a mouse and a keyboard as input devices; the touch display screen is used here only as an example), where the touch display screen is used to present a graphical user interface and receive operation instructions generated by the user acting on the graphical user interface. When the user operates the graphical user interface through the touch display screen, the graphical user interface can control local content of the terminal in response to received operation instructions, and can also control content of a peer server in response to received operation instructions. For example, the operation instructions generated by the user for the graphical user interface include an instruction to launch the game application, and the processor is configured to launch the game application after receiving that instruction. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. The touch display screen is a multi-touch screen capable of sensing touch or slide operations performed simultaneously at a plurality of points on the screen. The user performs touch operations on the graphical user interface with a finger, and when the graphical user interface detects a touch operation, it controls different virtual objects in the graphical user interface of the game to perform actions corresponding to the touch operation. For example, the game may be any of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, and the like. The game may include a virtual scene drawn on the graphical user interface. Further, the virtual scene of the game may include one or more virtual objects, such as virtual characters, controlled by a user (or player). In addition, the virtual scene of the game may also include one or more obstacles, such as railings, ravines, and walls, to limit the movement of virtual objects, for example, to limit the movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, scores, character health status, and energy, to assist the player, provide virtual services, increase scores related to the player's performance, and so on. In addition, the graphical user interface may present one or more indicators to provide indication information to the player. For example, the game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, the one or more other virtual objects are controlled by other players of the game. For example, the one or more other virtual objects may also be computer-controlled, such as robots using an artificial intelligence (AI) algorithm, implementing a human-machine battle mode. For example, virtual objects possess various skills or capabilities that the game player uses to achieve goals.
For example, a virtual object may possess one or more weapons, props, or tools that can be used to eliminate other objects from the game. Such skills or capabilities may be activated by the player of the game using one of a plurality of preset touch operations on the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to operation instructions generated by the user's touch operations.
It should be noted that the schematic view of the interaction system shown in fig. 1 is only an example. The interaction system and scene described in the embodiments of the present application are intended to explain the technical solutions of the embodiments more clearly and do not limit them; as a person of ordinary skill in the art will appreciate, the technical solutions provided by the embodiments of the present application are equally applicable to similar technical problems as new service scenarios emerge.
Based on the above problems, the embodiments of the present application provide an interaction method and apparatus in a game, a computer device, and a storage medium, which can effectively enrich game play and greatly improve the playability of the game content, thereby effectively increasing the richness of game interaction modes, improving the game experience, and improving the player's immersion in the game. Detailed descriptions are given below. The order of description of the following embodiments is not intended to limit the preferred order of the embodiments.
The embodiments of the present application provide an interaction method in a game, which can be executed by a terminal or a server; the embodiments are described taking execution by the terminal as an example. A graphical user interface can be provided through a terminal device, and the content displayed on the graphical user interface at least partially includes a game scene, a first interaction object, a second interaction object, and a target virtual object, where the first interaction object may be a virtual ball, the second interaction object may be a virtual goal, and the target virtual object may be the controlled virtual player currently controlled by the player through the terminal device. The graphical user interface may also display other virtual objects appearing in the displayed part of the game scene; these may be friendly virtual objects (e.g., friendly virtual players) in the same camp as the controlled virtual player, hostile virtual objects in a camp opposed to the controlled virtual object (e.g., virtual goalkeepers), other neutral virtual objects belonging to no camp, or non-player characters (NPCs).
In the embodiments of the present application, the game scene is a virtual environment displayed (or provided) when the application runs on the terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment is used for a battle between at least two virtual objects, and virtual resources available to the at least two virtual objects are provided in the virtual environment. One or more virtual objects may be displayed in the game scene. The controlled virtual player referred to herein is a virtual object, where a virtual object (or hero) refers to a movable object in the virtual environment, that is, a virtual character in the game controlled by a user or player through a terminal. The controlled virtual player in the embodiments of the present application refers to the virtual object controlled by the current player through the terminal, that is, the virtual object controlled by the local player. A friendly virtual object or a hostile virtual object may be a virtual object controlled in the game by another player through a terminal, that is, a virtual object controlled by a player at another end; friendly and hostile virtual objects may also be virtual objects controlled by a computer device (e.g., AI).
Referring to fig. 2, fig. 2 is a schematic flow chart of an interaction method in a game according to an embodiment of the present application. A specific flow of the interaction method may be as follows:
101. Display a first game scene, a displacement control, and an interaction control on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device.
Specifically, a first game scene, a displacement control, and an interaction control may be displayed on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device. The first game scene may be a game scene preset for the penalty kick stage, the first interaction object may be a virtual ball, and the target virtual object may be the controlled virtual player currently controlled by the player through the terminal device.
In an embodiment, before the step of displaying the first game scene, the displacement control, and the interaction control on the graphical user interface, the method further comprises:
displaying a second game scene on the graphical user interface, where the second game scene includes a plurality of game camps and the first interaction object, each game camp includes a plurality of virtual characters, and the virtual characters of the game camps compete for control of the first interaction object in the second game scene.
For example, a second game scene may be displayed on the graphical user interface, where the second game scene includes a plurality of game camps and the first interaction object. The first interaction object may be a virtual ball; the plurality of game camps may include a first game camp and a second game camp, each game camp including a plurality of virtual players, and the first virtual players of the first game camp and the second virtual players of the second game camp compete for control of the virtual ball in the second game scene. The second game scene may be a virtual football field, which includes a first virtual goal corresponding to the first game camp and a second virtual goal corresponding to the second game camp. When a first virtual player of the first game camp kicks the virtual ball into the second virtual goal, the first game camp scores; when a second virtual player of the second game camp kicks the virtual ball into the first virtual goal, the second game camp scores; and so on, until the match duration ends and the game camp with the higher score is determined to be the winner according to the scores of the two game camps.
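A minimal sketch of the scoring rule just described, assuming a simple two-camp score record (the type and function names are illustrative, not from the patent):

```typescript
type CampId = "first" | "second";

interface MatchState {
  score: Record<CampId, number>;
  elapsedSeconds: number;
}

const NORMAL_MATCH_SECONDS = 90 * 60; // the 90-minute normal match duration

// Kicking the ball into the opposing camp's goal scores for the kicker's camp.
function onGoal(state: MatchState, scoringCamp: CampId): void {
  state.score[scoringCamp] += 1;
}

// When the match duration ends, the camp with the higher score wins;
// equal scores leave the camps tied.
function decideResult(state: MatchState): CampId | "tied" {
  if (state.elapsedSeconds < NORMAL_MATCH_SECONDS) throw new Error("match still running");
  if (state.score.first > state.score.second) return "first";
  if (state.score.second > state.score.first) return "second";
  return "tied";
}
```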
Further, the first game scene may be a part of the virtual football field, and may specifically include the target virtual object, the virtual ball, and the virtual goal corresponding to the game camp opposing the target virtual object. For example, when the target virtual object is a first virtual player of the first game camp, the first game scene may be the part of the virtual court that includes the second virtual goal and the second game camp's virtual goalkeeper standing in front of it. Likewise, when the target virtual object is a second virtual player of the second game camp, the first game scene may be the part of the virtual court that includes the first virtual goal and the first game camp's virtual goalkeeper standing in front of it.
In a specific embodiment, in a game match, the first game camp and the second game camp may each field 11 virtual objects on the virtual court. The first game camp and the second game camp may be controlled by different players; each player may control any one or more of the 11 virtual objects in their own game camp and may switch the currently operated virtual object at will during the match. Either of the first game camp and the second game camp may also be operated by a computer device (AI).
In one embodiment, the step of displaying the first game scene, the displacement control, and the interaction control on the graphical user interface may include:
in response to a target interaction event, switching the second game scene displayed on the graphical user interface to the first game scene, and displaying the displacement control and the interaction control on the graphical user interface.
The target interaction event may be a first target interaction event or a second target interaction event. The first target interaction event is: at least two game camps cannot decide a winner within the normal match time, that is, a penalty kick event is triggered when the game camps are tied. Alternatively, the second target interaction event may be: a specified foul by a virtual object of one game camp triggers a penalty kick event.
For example, a practical scenario of the first target interaction event may be: when the match between the first game camp and the second game camp reaches the 90 minutes of normal match time and the score between the two camps is 2:2, the first game camp and the second game camp are determined to be tied, and a penalty kick event is triggered.
For another example, a practical scenario of the second target interaction event may be: if a second virtual player of the second game camp commits a specified foul (such as a foul in the penalty area) within the 90 minutes of normal match time between the first game camp and the second game camp, a penalty kick event is triggered.
Goal shooting means shooting the virtual ball toward the opposing game camp's virtual goal within the opposing camp's area.
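The two penalty kick triggers above can be sketched as a pair of predicates. The shapes mirror the scoring sketch earlier; the FoulEvent type and the exact foul condition are assumptions for illustration:

```typescript
type CampId = "first" | "second";

interface MatchState {
  score: Record<CampId, number>;
  elapsedSeconds: number;
}

interface FoulEvent {
  offendingCamp: CampId;
  inPenaltyArea: boolean; // a "specified foul", e.g. inside the penalty area
}

const NORMAL_MATCH_SECONDS = 90 * 60;

// Trigger 1: the camps are tied at the end of normal match time.
function tiedAtFullTime(state: MatchState): boolean {
  return (
    state.elapsedSeconds >= NORMAL_MATCH_SECONDS &&
    state.score.first === state.score.second
  );
}

// Trigger 2: a specified foul during normal match time.
function foulTriggersPenalty(foul: FoulEvent): boolean {
  return foul.inPenaltyArea;
}

function shouldTriggerPenaltyKick(state: MatchState, foul?: FoulEvent): boolean {
  return tiedAtFullTime(state) || (foul !== undefined && foulTriggersPenalty(foul));
}
```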
In an embodiment, the game view angle corresponding to the first game scene is different from the game view angle corresponding to the second game scene.
Specifically, the game view angle corresponding to the first game scene is a 2.5D top-down view, and the game view angle corresponding to the second game scene is a first-person or third-person view.
In an embodiment, the first game scene is the game scene in which a target virtual object of the first game camp or the second game camp performs the penalty kick operation. To focus the player on taking the kick and to highlight the penalty kick stage, the game view angle corresponding to the first game scene is generally a 2.5D top-down view, the first game scene is generally a part of the virtual court, and it generally displays the target virtual object, the virtual ball, and the virtual goal and virtual goalkeeper of the game camp opposing the target virtual object. The 2.5D top-down view may be the view of a virtual camera looking down at a fixed angle. Under this view, virtual objects can still move along the x, y, and z axes, mimicking a 3D game environment.
In another embodiment, the second game scene is the game scene of normal match play between the virtual objects of the first game camp and the virtual objects of the second game camp. To let the player operate globally on the virtual court, the game view angle corresponding to the second game scene is a first-person or third-person view, and the player can switch between them according to their own needs. The second game scene usually displays the virtual court as a whole and can usually show the virtual objects of all game camps participating in the match, the virtual ball, and the virtual goals corresponding to all game camps.
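A hedged sketch of the two camera setups just described. The patent only names the view types; the concrete angles and fields below are illustrative:

```typescript
type SceneKind = "penaltyKick" | "normalMatch";

interface CameraConfig {
  mode: "2.5D-top-down" | "first-person" | "third-person";
  pitchDegrees: number;   // downward tilt; fixed for the 2.5D top-down view
  followsTarget: boolean; // person views follow the controlled virtual object
}

function cameraForScene(scene: SceneKind, firstPerson: boolean): CameraConfig {
  if (scene === "penaltyKick") {
    // A virtual camera looking down at a fixed angle; objects still move on
    // x, y, and z, so the scene mimics a 3D environment.
    return { mode: "2.5D-top-down", pitchDegrees: 55, followsTarget: false };
  }
  return {
    mode: firstPerson ? "first-person" : "third-person",
    pitchDegrees: 10,
    followsTarget: true,
  };
}
```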
For example, referring to fig. 3, the computer device displays a graphical user interface on a screen, and the content displayed by the graphical user interface at least partially includes a first game scene, a virtual ball located in the first game scene, a virtual goal, a virtual goalkeeper of the second game camp, and a target virtual object controlled by the terminal device, where the target virtual object may be a virtual player of the first game camp.
102. In response to a first trigger operation acting on the displacement control, control the target virtual object to perform a displacement behavior in the first game scene, and display a drop point indicator on the graphical user interface, where the real-time display position of the drop point indicator is determined based on the real-time displacement behavior of the target virtual object and indicates the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object.
In an embodiment, the first game scene further includes a second interaction object used together with the first interaction object. Specifically, the step of, in response to the first trigger operation acting on the displacement control, controlling the target virtual object to perform the displacement behavior in the first game scene and displaying a drop point indicator on the graphical user interface may include:
in response to a first trigger operation acting on the displacement control, controlling the target virtual object to perform a displacement behavior in the first game scene, and displaying a drop point indicator at the second interaction object;
where the real-time display position of the drop point indicator is determined based on the real-time displacement behavior of the target virtual object and indicates the real-time drop point position of the first interaction object at the second interaction object after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object.
In the embodiments of the present application, the first interaction object may be a virtual ball and the second interaction object may be a virtual goal. The player can perform a first trigger operation on the displacement control through the graphical user interface; in response, the computer device controls the target virtual object to perform a displacement behavior in the first game scene and displays a drop point indicator at the virtual goal. The real-time display position of the drop point indicator is determined based on the real-time displacement behavior of the target virtual object and indicates the real-time drop point position of the virtual ball at the virtual goal after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the virtual ball.
For example, referring to fig. 4, the target virtual object is a virtual object of the first game camp, the first interaction object may be a virtual ball, and the second interaction object may be the virtual goal of the second game camp, with the second game camp's virtual goalkeeper standing in front of it. The player can perform a first trigger operation on the displacement control through the graphical user interface; in response, the computer device controls the target virtual object to perform a displacement behavior in the first game scene and displays a drop point indicator at the virtual goal. Here the drop point indicator is a dashed circular mark; it should be noted that the drop point indicator may also be a solid circular mark or a mark of another shape, and its shape may be similar to, the same as, or different from that of the first interaction object. This is merely an example, and further examples are not described here.
The drop point indicator indicates the landing position of the virtual ball at the virtual goal, that is, the position at which the virtual ball will enter the virtual goal.
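As a hedged sketch, the indicator's real-time position could be derived from the shooter's displacement roughly as follows. The patent only states that the displacement determines the indicator position; the specific mapping below (run-up offset steering the aim, clamped to the goal width) is an assumption:

```typescript
interface Vec2 {
  x: number;
  y: number;
}

// Map the shooter's current lateral position relative to the ball onto the
// goal plane, clamped to the goal's width (goal-local coordinates).
function dropPointOnGoal(shooter: Vec2, ball: Vec2, goalWidth: number): Vec2 {
  const lateral = ball.x - shooter.x; // the run-up direction steers the aim
  const aimX = Math.min(goalWidth / 2, Math.max(-goalWidth / 2, lateral * 3));
  return { x: aimX, y: 1.2 }; // 1.2 is an illustrative height inside the goal
}

// Called every frame while the first trigger operation is held, so the
// dashed circular drop point indicator tracks the displacement in real time.
function updateDropPointIndicator(shooter: Vec2, ball: Vec2): Vec2 {
  return dropPointOnGoal(shooter, ball, 7.32);
}
```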
Further, to prompt the player with the movement trend of the first interaction object during the game, the step of, in response to the first trigger operation acting on the displacement control, controlling the target virtual object to perform the displacement behavior in the first game scene and displaying a drop point indicator on the graphical user interface may include:
in response to a first trigger operation acting on the displacement control, controlling the target virtual object to perform a displacement behavior in the first game scene, displaying a drop point indicator on the graphical user interface, and displaying a movement trend indicator near the drop point indicator;
where the real-time display position of the drop point indicator is determined based on the real-time displacement behavior of the target virtual object and indicates the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object, and the movement trend indicator indicates a first movement direction and a first movement force of the real-time drop point position of the first interaction object.
For example, referring to fig. 5, the target virtual object is a virtual object of the first game camp, the first interaction object may be a virtual ball, and the second interaction object may be the virtual goal of the second game camp, with the second game camp's virtual goalkeeper standing in front of it. The player can perform a first trigger operation on the displacement control through the graphical user interface; in response, the computer device controls the target virtual object to perform a displacement behavior in the first game scene, displays a drop point indicator on the graphical user interface, and displays a movement trend indicator near the drop point indicator. The drop point indicator is a dashed circular mark; the movement trend indicator displayed near it is a prompt mark composed of several first-color triangles, where the direction of the arrow formed by the first-color triangles is the first movement direction and the number of first-color triangles represents the first movement force borne by the virtual ball.
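A small sketch of how such a movement trend indicator could be represented, assuming the force is quantized into at most four triangles as in the figures (the quantization rule is an assumption):

```typescript
interface TrendIndicator {
  directionRadians: number; // the first movement direction (arrow direction)
  triangleCount: number;    // the first movement force (number of triangles)
  color: "firstColor";
}

function buildTrendIndicator(
  directionRadians: number,
  force: number,
  maxForce: number,
): TrendIndicator {
  // Quantize force into 1..4 triangles, matching the four-triangle figures.
  const count = Math.min(4, Math.max(1, Math.round((force / maxForce) * 4)));
  return { directionRadians, triangleCount: count, color: "firstColor" };
}
```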
103. In response to a second trigger operation acting on the interaction control, display a drop point offset indicator on the graphical user interface according to the second trigger operation, where the drop point offset indicator indicates the offset effect of the second trigger operation on the real-time drop point position of the first interaction object.
In an embodiment of the present application, the step of, in response to the second trigger operation acting on the interaction control, displaying a drop point offset indicator on the graphical user interface according to the second trigger operation may include:
in response to a second trigger operation acting on the interaction control, displaying a drop point offset indicator on the graphical user interface according to the second trigger operation, where the drop point offset indicator indicates a second movement direction and a second movement force by which the real-time drop point position of the first interaction object is offset;
and updating the movement trend indicator according to the second trigger operation, so as to display the updated movement trend indicator on the graphical user interface.
For example, referring to fig. 5 and fig. 6 together, the player may perform a first trigger operation on the displacement control through the graphical user interface, and the computer device, in response, controls the target virtual object to perform a displacement behavior in the first game scene and displays a drop point indicator on the graphical user interface with a movement trend indicator near it. The player may then perform a second trigger operation on the interaction control through the graphical user interface; in response, the computer device displays a drop point offset indicator on the graphical user interface according to the second trigger operation, where the drop point offset indicator indicates a second movement direction and a second movement force by which the real-time drop point position of the first interaction object is offset, and the computer device updates the movement trend indicator according to the second trigger operation so as to display the updated movement trend indicator. Specifically, the movement trend indicator is a prompt mark composed of four first-color triangles, and the updated movement trend indicator is a prompt mark composed of three first-color triangles; the drop point offset indicator is a prompt mark composed of second-color triangles, where the direction of the arrow formed by the second-color triangles is the second movement direction and the number of second-color triangles represents the second movement force borne by the virtual ball.
In this embodiment of the present application, a drop point offset identifier may be displayed on the gui according to the second trigger operation, and the movement trend identifier may be updated according to the second trigger operation, so as to display the updated movement trend identifier on the gui. Specific examples are as follows:
For example, as shown in fig. 7a and fig. 7b, the player may perform a first trigger operation on the displacement control through the graphical user interface, and the computer device, in response, controls the target virtual object to perform a displacement behavior in the first game scene and displays a drop point indicator on the graphical user interface with a movement trend indicator near it. The drop point indicator is a dashed circular mark; the movement trend indicator is a prompt mark composed of four first-color triangles, where the direction of the arrow formed by the triangles is the first movement direction and the triangle count represents the first movement force borne by the virtual ball. Then, in response to a second trigger operation acting on the interaction control, the computer device displays a drop point offset indicator according to the second trigger operation; here it is a prompt mark composed of one second-color triangle, where the arrow direction is the second movement direction and the triangle count represents the second movement force borne by the virtual ball. The computer device also updates the movement trend indicator according to the second trigger operation: the prompt mark composed of four first-color triangles is updated to one composed of three first-color triangles, and the updated movement trend indicator is displayed.
For another example, as shown in fig. 8, under the same first trigger operation and initial indicators as above, the computer device, in response to a second trigger operation acting on the interaction control, displays a drop point offset indicator composed of two second-color triangles, where the arrow direction is the second movement direction and the triangle count represents the second movement force borne by the virtual ball; the movement trend indicator is accordingly updated from four first-color triangles to two, and the updated movement trend indicator is displayed.
For another example, as shown in fig. 9, under the same first trigger operation and initial indicators, the computer device, in response to a second trigger operation acting on the interaction control, displays a drop point offset indicator composed of three second-color triangles, where the arrow direction is the second movement direction and the triangle count represents the second movement force borne by the virtual ball; the movement trend indicator is accordingly updated from four first-color triangles to two, and the updated movement trend indicator is displayed.
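One plausible rule consistent with the fig. 7 and fig. 8 examples is that the second trigger operation transfers triangles from the trend indicator to the offset indicator; the patent states no formula, so this sketch is an inference, not the specified behavior:

```typescript
interface ArrowIndicator {
  triangleCount: number;
  color: "firstColor" | "secondColor";
}

// The offset indicator gains K second-color triangles and the trend
// indicator loses the same number of first-color triangles.
function applySecondTrigger(
  trend: ArrowIndicator,
  offsetStrength: number, // derived from the second trigger operation
): { trend: ArrowIndicator; offset: ArrowIndicator } {
  const k = Math.min(trend.triangleCount, Math.max(1, Math.round(offsetStrength)));
  return {
    trend: { triangleCount: trend.triangleCount - k, color: "firstColor" },
    offset: { triangleCount: k, color: "secondColor" },
  };
}
```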
In the embodiments of the present application, an offset value is further provided. The offset value refers to the distance between the virtual ball's preset trajectory toward the target drop point and the trajectory produced by the actual force applied when the controlled virtual player shoots; the distance is a feedback value mapped from the virtual scene onto the screen. Specifically, the larger the offset value, the more the actual motion trajectory of the virtual ball deviates from the initial trend; the smaller the offset value, the closer the ball's motion under the applied force is to the expected position; when the two are nearly equal, the basic indicator displays a balanced state. During the motion of the virtual ball, factors such as gravity also take part in the ball's movement, restoring a realistic experience.
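A minimal sketch of a gravity-influenced flight and the resulting offset value, under the assumptions of a fixed-timestep Euler integration and a ground plane at z = 0 (neither is specified by the patent):

```typescript
interface Vec3 {
  x: number;
  y: number;
  z: number;
}

const GRAVITY = -9.8; // m/s^2, acting on the z (height) axis

// Step the ball forward under gravity until it returns to ground level.
function simulateLanding(start: Vec3, velocity: Vec3, dt = 1 / 60): Vec3 {
  const p = { ...start };
  const v = { ...velocity };
  do {
    p.x += v.x * dt;
    p.y += v.y * dt;
    p.z += v.z * dt;
    v.z += GRAVITY * dt; // gravity bends the trajectory
  } while (p.z > 0);
  return p;
}

// Offset value: distance between the preset target drop point and the
// simulated landing point. Larger values mean the actual trajectory
// diverges more from the initial trend.
function offsetValue(preset: Vec3, landing: Vec3): number {
  return Math.hypot(landing.x - preset.x, landing.y - preset.y);
}
```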
104. In response to the end of the second trigger operation, control the target virtual object to perform the target interaction behavior on the first interaction object.
In an embodiment of the present application, the step of, in response to the end of the second trigger operation, controlling the target virtual object to perform the target interaction behavior on the first interaction object may include:
in response to the end of the second trigger operation, controlling the target virtual object to perform the target interaction behavior on the first interaction object based on the first movement direction, the second movement direction, the first movement force, and the second movement force.
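As a sketch, the final kick could combine the two direction/force pairs as a vector sum; the patent only says the behavior is based on both pairs, so the straight sum is an assumption:

```typescript
interface Vec2 {
  x: number;
  y: number;
}

// Combine the displacement-derived (dir1, force1) pair with the
// interaction-control-derived (dir2, force2) pair into one launch vector.
function kickVector(
  dir1: number, force1: number, // first movement direction and force
  dir2: number, force2: number, // second movement direction and force
): Vec2 {
  return {
    x: force1 * Math.cos(dir1) + force2 * Math.cos(dir2),
    y: force1 * Math.sin(dir1) + force2 * Math.sin(dir2),
  };
}
```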
In order to prompt the player with the remaining time for operating the interaction control, after the step of, in response to the first trigger operation acting on the displacement control, controlling the target virtual object to perform the displacement behavior in the first game scene and displaying a drop point indicator on the graphical user interface, the method may include:
displaying a time prompt indicator on the graphical user interface, where the time prompt indicator prompts the player with the remaining duration for performing a trigger operation on the interaction control.
The time prompt indicator is updated in real time throughout the remaining duration after the drop point indicator is displayed, and may be updated according to the elapsed duration; it may be a count-up timer, a count-down timer, or a progress bar.
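A count-down sketch of such a time prompt indicator, assuming a fixed operation window in milliseconds (the window length is illustrative):

```typescript
interface TimePrompt {
  remainingMs: number;
  progress: number; // 1 = full bar when the indicator appears, 0 = expired
}

function timePrompt(shownAtMs: number, windowMs: number, nowMs: number): TimePrompt {
  const remainingMs = Math.max(0, shownAtMs + windowMs - nowMs);
  return { remainingMs, progress: remainingMs / windowMs };
}

// A count-up variant would instead report nowMs - shownAtMs; a progress bar
// simply renders the progress field.
```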
In light of the foregoing, the following example further illustrates the interaction method in the game provided in the present application. One specific embodiment of the in-game interaction method is as follows:
(1) A second game scene may be displayed on the graphical user interface, where the second game scene includes a plurality of game camps and the first interaction object; the first interaction object may be a virtual ball, and the second interaction object may be a virtual goal. The plurality of game camps may include a first game camp and a second game camp, each game camp including a plurality of virtual players, and the first virtual players of the first game camp and the second virtual players of the second game camp may compete for control of the virtual ball in the second game scene. The second game scene may be a virtual court including a first virtual goal corresponding to the first game camp and a second virtual goal corresponding to the second game camp.
(2) In response to a penalty kick trigger event in which the target virtual object is to take the penalty kick, the computer device switches the second game scene displayed on the graphical user interface to the first game scene, and displays a displacement control and an interaction control on the graphical user interface. Here, the target virtual object is a virtual object of the first game camp, and the first game scene is the game scene in which that target virtual object performs the shooting operation. To focus the player on taking the shot and to highlight the penalty kick stage, the first game scene typically displays the target virtual object, the virtual ball, and the second game camp's virtual goal and virtual goalkeeper.
(3) In this embodiment, if the computer device does not detect any trigger operation on the displacement control and/or the interaction control within a first designated duration after the penalty kick trigger event, then once the first designated duration is reached, the computer device automatically controls the target virtual object to perform the target interaction behavior on the virtual ball based on a preset AI algorithm.
In light of the foregoing, the following examples further illustrate the interaction method in the game provided in the present application. Another specific embodiment of the game interaction method is as follows:
(1) A second game scene may be displayed on the graphical user interface, where the second game scene includes a plurality of game camps and the first interaction object; the first interaction object may be a virtual sphere, and the second interaction object may be a virtual goal. The plurality of game camps may include a first game camp and a second game camp, each game camp including a plurality of virtual players; the first virtual players of the first game camp and the second virtual players of the second game camp may contend for control of the virtual sphere in the second game scene. The second game scene may be a virtual court that includes a first virtual goal corresponding to the first game camp and a second virtual goal corresponding to the second game camp.
(2) In response to a penalty kick trigger event in which the target virtual object is to take a penalty kick, the computer device switches the second game scene displayed on the graphical user interface to the first game scene, and displays a displacement control and an interaction control on the graphical user interface. In this case, the target virtual object is a virtual object in the first game camp, and the first game scene is the game scene in which the target virtual object of the first game camp performs the shot on goal; to keep the player focused on the shot and to highlight it, the first game scene typically displays only the target virtual object, the virtual sphere, the virtual goal of the second game camp, and the virtual goalkeeper.
(3) In this embodiment of the present application, the player may perform a first trigger operation on the displacement control through the graphical user interface; in response to the first trigger operation acting on the displacement control, the computer device controls the target virtual object to execute the displacement behavior in the first game scene and displays a drop point indication identifier at the virtual goal of the second game camp. The real-time display position of the drop point indication identifier is determined based on the real-time displacement behavior of the target virtual object and is used for indicating the real-time drop point position of the virtual sphere at the virtual goal after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the virtual sphere; here, the drop point indication identifier is a dashed circular identifier. (A minimal sketch of this position mapping follows this embodiment.)
(4) If the computer device does not detect the second trigger operation on the interaction control within a second designated duration after the drop point indication identifier is displayed, then once the second designated duration is reached, the computer device automatically controls the target virtual object to execute the target interaction behavior on the virtual sphere based on the real-time displacement behavior of the target virtual object.
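The sketch referenced in step (3): one simple way to keep the indicator tracking the kicker in real time is a linear map from the kicker's lateral position to a point on the goal mouth. The mapping, the parameter names, and the fixed indicator height are assumptions; the text only requires that the indicator follow the displacement behavior.

```python
def drop_point_on_goal(player_x, lane_min_x, lane_max_x,
                       goal_left, goal_right, goal_height=2.44):
    """Map the kicker's lateral position (within an assumed movement
    lane) to the dashed-circle indicator's position on the goal plane.
    Requires lane_max_x > lane_min_x; 2.44 m is the real crossbar
    height, used here only as a plausible default."""
    t = (player_x - lane_min_x) / (lane_max_x - lane_min_x)
    x = goal_left + t * (goal_right - goal_left)  # sweep across the goal mouth
    y = goal_height * 0.5                         # fixed mid height for the sketch
    return x, y
```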
In light of the foregoing, the following examples further illustrate the interaction method in the game provided in the present application. Another specific embodiment of the game interaction method is as follows:
(1) A second game scene may be displayed on the graphical user interface, where the second game scene includes a plurality of game camps and the first interaction object; the first interaction object may be a virtual sphere, and the second interaction object may be a virtual goal. The plurality of game camps may include a first game camp and a second game camp, each game camp including a plurality of virtual players; the first virtual players of the first game camp and the second virtual players of the second game camp may contend for control of the virtual sphere in the second game scene. The second game scene may be a virtual court that includes a first virtual goal corresponding to the first game camp and a second virtual goal corresponding to the second game camp.
(2) In response to a penalty kick trigger event in which the target virtual object is to take a penalty kick, the computer device switches the second game scene displayed on the graphical user interface to the first game scene, and displays a displacement control and an interaction control on the graphical user interface. In this case, the target virtual object is a virtual object in the first game camp, and the first game scene is the game scene in which the target virtual object of the first game camp performs the shot on goal; to keep the player focused on the shot and to highlight it, the first game scene typically displays only the target virtual object, the virtual sphere, the virtual goal of the second game camp, and the virtual goalkeeper.
(3) At this point the aiming stage begins. During the aiming stage, the player may perform a first trigger operation on the displacement control through the graphical user interface; in response to the first trigger operation acting on the displacement control, the computer device controls the target virtual object to execute the displacement behavior in the first game scene and displays a drop point indication identifier at the virtual goal of the second game camp. The real-time display position of the drop point indication identifier is determined based on the real-time displacement behavior of the target virtual object and is used for indicating the real-time drop point position of the virtual sphere at the virtual goal after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the virtual sphere; here, the drop point indication identifier is a dashed circular identifier.
(4) When the end of the aiming stage is detected and the run-up stage begins, a movement trend identifier may be displayed near the drop point indication identifier on the graphical user interface. The drop point indication identifier is a dashed circular identifier; the movement trend identifier displayed near it is a prompt identifier composed of several triangular marks in a first color, where the direction pointed to by the arrow formed by the first-color triangles is the first movement direction, and the number of first-color triangles represents the first movement force borne by the virtual sphere.
Further, the player may perform a second trigger operation on the interaction control through the graphical user interface. For example, with a virtual joystick as the interaction control, the player may determine the second movement direction by dragging the virtual joystick and the second movement force by pressing it. The computer device then responds to the second trigger operation acting on the interaction control and displays a drop point offset identifier on the graphical user interface according to the second trigger operation, where the drop point offset identifier is used for indicating the second movement direction and the second movement force by which the real-time drop point position of the first interaction object is offset; the computer device also updates the movement trend identifier according to the second trigger operation, so that the updated movement trend identifier is displayed on the graphical user interface. Specifically, if the movement trend identifier is a prompt identifier composed of four first-color triangular marks, the updated movement trend identifier may be a prompt identifier composed of three first-color triangular marks; the drop point offset identifier is a prompt identifier composed of triangular marks in a second color, where the direction pointed to by the arrow formed by the second-color triangles is the second movement direction, and the number of second-color triangles represents the second movement force borne by the virtual sphere.
Optionally, when the end of the aiming stage is detected and the run-up stage begins, a time prompt identifier may be displayed on the graphical user interface, where the time prompt identifier is used for prompting the remaining duration before the run-up stage ends. The time prompt identifier is updated in real time throughout the run-up stage according to the remaining duration of the stage; it may be a count-up timer, a countdown timer, or a progress bar.
(5) In response to the end of the second trigger operation, the computer device controls the target virtual object to execute the target interaction behavior on the virtual sphere. Specifically, the computer device determines the motion trajectory of the virtual sphere according to the drop point position of the drop point indication identifier, the first movement direction and first movement force associated with the movement trend identifier, and the second movement direction and second movement force of the drop point offset identifier; in response to the end of the second trigger operation, it controls the target virtual object to execute the penalty kick operation on the virtual sphere, so that the virtual sphere moves along the motion trajectory. More specifically, the first movement direction and the first movement force are adjusted according to the second movement direction and the second movement force to obtain a target movement direction and a target movement force, and the target virtual object is controlled to execute the penalty kick operation on the virtual sphere according to the target movement direction and the target movement force, thereby controlling the movement of the virtual sphere. (A sketch of this pipeline follows.)
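The pipeline referenced in step (5), gathered into one sketch: render the triangle arrows, read the joystick for the offset, and sample an arc toward the adjusted drop point. Degree-based angles, forces normalized to 0..1, the 1-4 triangle scale, and a parabolic arc standing in for real ball physics are all assumptions for illustration:

```python
import math

def trend_arrow(direction_deg, force, color="first"):
    """Trend/offset identifier: arrow angle encodes the direction,
    triangle count (1-4) encodes the force."""
    count = max(1, min(4, round(force * 4)))
    return {"color": color, "angle_deg": direction_deg, "triangles": count}

def read_offset(stick_dx, stick_dy, press_s, max_press_s=1.0):
    """Virtual joystick: drag vector -> second movement direction,
    press duration -> second movement force (normalized to 0..1)."""
    direction_deg = math.degrees(math.atan2(stick_dy, stick_dx))
    force = min(press_s, max_press_s) / max_press_s
    return direction_deg, force

def kick_trajectory(start, drop_point, apex_height=1.0, steps=24):
    """Sample a simple arc from the kick position to the adjusted drop
    point; a quadratic lift term replaces full physics for the sketch."""
    (x0, y0, z0), (x1, y1, z1) = start, drop_point
    points = []
    for i in range(steps + 1):
        t = i / steps
        points.append((x0 + t * (x1 - x0),
                       y0 + t * (y1 - y0),
                       z0 + t * (z1 - z0) + apex_height * 4 * t * (1 - t)))
    return points
```

With these pieces, updating the HUD after the offset input is simply re-rendering `trend_arrow` with the reduced triangle count, as step (4) describes.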
In summary, the embodiment of the present application provides an interaction method in a game: a first game scene, a displacement control, and an interaction control are displayed on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device; then, in response to a first trigger operation acting on the displacement control, the target virtual object is controlled to execute a displacement behavior in the first game scene, and a drop point indication identifier is displayed on the graphical user interface, where the real-time display position of the drop point indication identifier is determined based on the real-time displacement behavior of the target virtual object and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object; then, in response to a second trigger operation acting on the interaction control, a drop point offset identifier is displayed on the graphical user interface according to the second trigger operation, where the drop point offset identifier is used for indicating the offset effect of the second trigger operation on the real-time drop point position of the first interaction object; and finally, in response to the end of the second trigger operation, the target virtual object is controlled to execute the target interaction behavior on the first interaction object. In this way, the drop point indication identifier can be displayed when the target interaction event occurs, and when the player controls the target virtual object to adjust the virtual sphere, the drop point offset identifier indicates the offset effect of the player's adjustment operation on the drop point indication identifier of the virtual sphere, guiding the player to adjust the direction and force applied to the virtual sphere according to the drop point indication identifier and the drop point offset identifier. This enriches the gameplay, greatly improves the playability of the game content, effectively increases the richness of the game's interaction modes, improves the game experience, and enhances the player's immersion in the game.
In order to facilitate better implementation of the in-game interaction method provided by the embodiments of the present application, an embodiment of the present application also provides an in-game interaction device based on that method. The terms used below have the same meanings as in the in-game interaction method described above; for specific implementation details, refer to the description in the method embodiments.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an interactive device in a game according to an embodiment of the present application, where the interactive device in a game includes:
a first display unit 201, configured to display a first game scene, a displacement control, and an interaction control on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device;
a first control unit 202, configured to control, in response to a first trigger operation acting on the displacement control, the target virtual object to perform a displacement action in the first game scene, and display a drop point indication identifier on the graphical user interface, where a real-time display position of the drop point indication identifier is determined based on a real-time displacement action of the target virtual object, and is used to indicate a real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement action performs a target interaction action with the first interaction object;
A second display unit 203, configured to respond to a second trigger operation acting on the interaction control, and display a drop point offset identifier on the graphical user interface according to the second trigger operation, where the drop point offset identifier is used to indicate an offset effect of the second trigger operation on the real-time drop point position of the first interaction object;
and the second control unit 204 is configured to control the target virtual object to execute the target interaction behavior on the first interaction object in response to the ending of the second triggering operation.
In some embodiments, the interactive device in the game comprises:
the first control subunit is used for responding to a first triggering operation acted on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, and displaying a drop point indication mark at the second interaction object;
the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interactive object at the second interactive object after the target virtual object executing the real-time displacement behavior has the target interactive behavior with the first interactive object.
In some embodiments, the interactive device in the game comprises:
and the first display subunit is used for displaying a second game scene on the graphical user interface, wherein the second game scene comprises a plurality of game camps and the first interaction object, each game camp comprises a plurality of virtual characters, and the virtual characters of the game camps compete for the control right of the first interaction object in the second game scene.
In some embodiments, the interactive device in the game comprises:
and the response subunit is used for responding to the target interaction event, switching the second game scene displayed by the graphical user interface into a first game scene, and displaying a displacement control and an interaction control on the graphical user interface.
In some embodiments, the interactive device in the game comprises a setting subunit configured such that:
the game view angle corresponding to the first game scene is different from the game view angle corresponding to the second game scene.
In some embodiments, the interactive device in the game comprises a setting subunit configured such that:
the game view angle corresponding to the first game scene is a 2.5D top-down view angle, and the game view angle corresponding to the second game scene is a first-person view angle or a third-person view angle.
In some embodiments, the interactive device in the game comprises:
the second control subunit is used for responding to a first triggering operation acted on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, displaying a drop point indication mark on the graphical user interface and displaying a movement trend mark nearby the drop point indication mark;
the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement behavior has the target interaction behavior with the first interaction object; the movement trend identifier is used for indicating a first movement direction and a first movement force associated with the real-time drop point position of the first interaction object.
In some embodiments, the interactive device in the game comprises:
the second display subunit is used for responding to a second triggering operation acting on the interaction control, and displaying a drop point deviation mark on the graphical user interface according to the second triggering operation, wherein the drop point deviation mark is used for indicating a second movement direction and a second movement force of the real-time drop point position deviation of the first interaction object;
And the updating subunit is used for updating the movement trend identifier according to the second triggering operation so as to display the updated movement trend identifier on the graphical user interface.
In some embodiments, the interactive device in the game comprises:
and the third control subunit is used for responding to the end of the second triggering operation and controlling the target virtual object to execute the target interaction behavior on the first interaction object based on the first movement direction, the second movement direction, the first movement force and the second movement force.
In some embodiments, the interactive device in the game comprises:
and the third display subunit is used for displaying a time prompt identifier on the graphical user interface, wherein the time prompt identifier is used for prompting the player with the remaining duration for performing the trigger operation on the interaction control.
The embodiment of the present application discloses an interaction device in a game: the first display unit 201 displays a first game scene, a displacement control, and an interaction control on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device; the first control unit 202 controls, in response to a first trigger operation acting on the displacement control, the target virtual object to execute a displacement behavior in the first game scene, and displays a drop point indication identifier on the graphical user interface, where the real-time display position of the drop point indication identifier is determined based on the real-time displacement behavior of the target virtual object and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object; the second display unit 203 displays, in response to a second trigger operation acting on the interaction control, a drop point offset identifier on the graphical user interface according to the second trigger operation, where the drop point offset identifier is used for indicating the offset effect of the second trigger operation on the real-time drop point position of the first interaction object; and the second control unit 204 controls the target virtual object to execute the target interaction behavior on the first interaction object in response to the end of the second trigger operation. In this way, the drop point indication identifier can be displayed when the target interaction event occurs, and when the player controls the target virtual object to adjust the virtual sphere, the drop point offset identifier indicates the offset effect of the player's adjustment operation on the drop point indication identifier of the virtual sphere, guiding the player to adjust the direction and force applied to the virtual sphere according to the drop point indication identifier and the drop point offset identifier. This enriches the gameplay, greatly improves the playability of the game content, effectively increases the richness of the game's interaction modes, improves the game experience, and enhances the player's immersion in the game.
Correspondingly, an embodiment of the present application also provides a computer device, which may be a terminal or a server; the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game console, a personal computer (PC), or a personal digital assistant (PDA). As shown in fig. 11, fig. 11 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 300 includes a processor 301 with one or more processing cores, a memory 302 with one or more computer-readable storage media, and a computer program stored on the memory 302 and executable on the processor. The processor 301 is electrically connected to the memory 302. Those skilled in the art will appreciate that the structure shown in the figure does not limit the computer device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The processor 301 is the control center of the computer device 300; it uses various interfaces and lines to connect the various parts of the entire computer device 300, and performs the various functions of the computer device 300 and processes data by running or loading software programs and/or modules stored in the memory 302 and invoking data stored in the memory 302, thereby monitoring the computer device 300 as a whole.
In this embodiment of the present application, the processor 301 in the computer device 300 loads instructions corresponding to the processes of one or more application programs into the memory 302, and the processor 301 executes the application programs stored in the memory 302, thereby implementing the following functions:
displaying a first game scene, a displacement control and an interaction control on the graphical user interface, wherein the first game scene comprises a first interaction object and a target virtual object controlled by the terminal equipment;
responding to a first triggering operation acted on the displacement control, controlling the target virtual object to execute a displacement action in the first game scene, and displaying a drop point indication mark on the graphical user interface, wherein the real-time display position of the drop point indication mark is determined based on the real-time displacement action of the target virtual object and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement action has target interaction action with the first interaction object;
responding to a second trigger operation acting on the interaction control, and displaying a drop point offset mark on the graphical user interface according to the second trigger operation, wherein the drop point offset mark is used for indicating offset influence of the second trigger operation on the real-time drop point position of the first interaction object;
And in response to the end of the second trigger operation, controlling the target virtual object to execute the target interaction behavior on the first interaction object.
In an embodiment, the first game scene further includes a second interactive object used together with the first interactive object;
the response acts on the first triggering operation of the displacement control, controls the target virtual object to execute displacement behavior in the first game scene, displays a drop point indication mark on the graphical user interface, and comprises the following steps:
responding to a first triggering operation acting on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, and displaying a drop point indication mark at the second interaction object;
the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interactive object at the second interactive object after the target virtual object executing the real-time displacement behavior has the target interactive behavior with the first interactive object.
In an embodiment, before the graphical user interface displays the first game scene, the displacement control, and the interaction control, the method further comprises:
And displaying a second game scene on the graphical user interface, wherein the second game scene comprises a plurality of game camps and the first interaction object, each game camp comprises a plurality of virtual characters, and the virtual characters of the game camps compete for the control right of the first interaction object in the second game scene.
In an embodiment, the displaying the first game scene, the displacement control and the interaction control on the graphical user interface includes:
and responding to a target interaction event, switching the second game scene displayed by the graphical user interface into a first game scene, and displaying a displacement control and an interaction control on the graphical user interface.
In an embodiment, the game view angle corresponding to the first game scene is different from the game view angle corresponding to the second game scene.
In an embodiment, the game view angle corresponding to the first game scene is a 2.5D top-down view angle, and the game view angle corresponding to the second game scene is a first-person view angle or a third-person view angle.
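As a rough illustration of this view switch, the sketch below swaps a match camera for a 2.5D top-down penalty camera when the target interaction event fires; the `CameraState` type, the 55-degree tilt, and the event name are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class CameraState:
    mode: str         # e.g. "third_person", "first_person", "2.5d_topdown"
    pitch_deg: float  # downward tilt of the camera

def switch_view(event_type, current: CameraState) -> CameraState:
    """Swap the first-/third-person match view for the 2.5D top-down
    view when the target interaction event (here, a penalty) occurs."""
    if event_type == "penalty_kick":
        return CameraState(mode="2.5d_topdown", pitch_deg=55.0)
    return current
```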
In an embodiment, the controlling the target virtual object to execute the displacement behavior in the first game scene in response to the first trigger operation acting on the displacement control, and displaying a drop point indication identifier on the graphical user interface, includes:
Responding to a first triggering operation acting on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, displaying a drop point indication mark on the graphical user interface, and displaying a movement trend mark near the drop point indication mark;
the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement behavior has the target interaction behavior with the first interaction object; the movement trend identifier is used for indicating a first movement direction and a first movement force associated with the real-time drop point position of the first interaction object.
In an embodiment, the displaying a drop point offset identifier on the graphical user interface according to the second trigger operation, in response to the second trigger operation acting on the interaction control, includes:
responding to a second triggering operation acting on the interaction control, and displaying a drop point deviation mark on the graphical user interface according to the second triggering operation, wherein the drop point deviation mark is used for indicating a second movement direction and a second movement force of the real-time drop point position deviation of the first interaction object;
And updating the movement trend identifier according to the second triggering operation so as to display the updated movement trend identifier on the graphical user interface.
In an embodiment, the controlling the target virtual object to execute the target interaction behavior on the first interaction object in response to the end of the second trigger operation includes:
and in response to the end of the second trigger operation, controlling the target virtual object to execute the target interaction behavior on the first interaction object based on the first movement direction, the second movement direction, the first movement force, and the second movement force.
In an embodiment, after the controlling the target virtual object to execute the displacement behavior in the first game scene in response to the first trigger operation acting on the displacement control, and displaying a drop point indication identifier on the graphical user interface, the method further includes:
and displaying a time prompt identifier on the graphical user interface, wherein the time prompt identifier is used for prompting the player with the remaining duration for performing the trigger operation on the interaction control.
For the specific implementation of each of the above operations, refer to the previous embodiments; details are not repeated here.
Optionally, as shown in fig. 11, the computer device 300 further includes: a touch display 303, a radio frequency circuit 304, an audio circuit 305, an input unit 306, and a power supply 307. The processor 301 is electrically connected to the touch display 303, the radio frequency circuit 304, the audio circuit 305, the input unit 306, and the power supply 307, respectively. Those skilled in the art will appreciate that the computer device structure shown in fig. 11 does not limit the computer device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The touch display 303 may be used to display the graphical user interface and receive operation instructions generated by the user acting on the graphical user interface. The touch display 303 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations on or near it (such as operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions, which in turn execute the corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 301, and it can also receive and execute commands sent by the processor 301. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it passes the operation to the processor 301 to determine the type of touch event, and the processor 301 then provides a corresponding visual output on the display panel according to the type of touch event. In this embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 303 to implement the input and output functions; in some embodiments, however, the touch panel and the display panel may be implemented as two separate components to implement the input and output functions respectively. That is, the touch display 303 may also implement an input function as part of the input unit 306.
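A compact sketch of the touch pipeline just described, with its three stages modeled as assumed callables (none of these names come from the original text):

```python
def handle_touch(raw_signal, to_coordinates, classify_event, render):
    """Touch detection -> touch controller -> processor, as described
    above: convert the raw signal to touch-point coordinates, classify
    the touch event type, then provide the matching visual output."""
    x, y = to_coordinates(raw_signal)        # touch controller step
    event_type = classify_event(raw_signal)  # e.g. "down", "move", "up"
    render(event_type, x, y)                 # visual output on display panel
```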
In the present embodiment, a graphical user interface is generated on touch display 303 by processor 301 executing a gaming application. The touch display 303 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuitry 304 may be used to transceive radio frequency signals to establish wireless communications with a network device or other computer device via wireless communications.
The audio circuit 305 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 305 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which the audio circuit 305 receives and converts into audio data; the audio data is then output to the processor 301 for processing and sent, for example, to another computer device via the radio frequency circuit 304, or output to the memory 302 for further processing. The audio circuit 305 may also include an earphone jack to provide communication between peripheral earphones and the computer device.
The input unit 306 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 307 is used to supply power to the various components of the computer device 300. Optionally, the power supply 307 may be logically connected to the processor 301 through a power management system, so that charging, discharging, and power consumption management are handled by the power management system. The power supply 307 may further include one or more of a direct-current or alternating-current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other components.
Although not shown in fig. 11, the computer device 300 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which will not be described herein.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for the parts of an embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
As can be seen from the foregoing, the computer device provided in this embodiment displays a first game scene, a displacement control, and an interaction control on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device; then, in response to a first trigger operation acting on the displacement control, it controls the target virtual object to execute a displacement behavior in the first game scene and displays a drop point indication identifier on the graphical user interface, where the real-time display position of the drop point indication identifier is determined based on the real-time displacement behavior of the target virtual object and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object; then, in response to a second trigger operation acting on the interaction control, it displays a drop point offset identifier on the graphical user interface according to the second trigger operation, where the drop point offset identifier is used for indicating the offset effect of the second trigger operation on the real-time drop point position of the first interaction object; and finally, in response to the end of the second trigger operation, it controls the target virtual object to execute the target interaction behavior on the first interaction object. In this way, the drop point indication identifier can be displayed when the target interaction event occurs, and when the player controls the target virtual object to adjust the virtual sphere, the drop point offset identifier indicates the offset effect of the player's adjustment operation on the drop point indication identifier of the virtual sphere, guiding the player to adjust the direction and force applied to the virtual sphere according to the drop point indication identifier and the drop point offset identifier. This enriches the gameplay, greatly improves the playability of the game content, effectively increases the richness of the game's interaction modes, improves the game experience, and enhances the player's immersion in the game.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods of the above embodiments may be completed by instructions, or by instructions controlling associated hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer readable storage medium having stored therein a plurality of computer programs that can be loaded by a processor to perform steps in any of the interaction methods in a game provided by embodiments of the present application. For example, the computer program may perform the steps of:
displaying a first game scene, a displacement control and an interaction control on the graphical user interface, wherein the first game scene comprises a first interaction object and a target virtual object controlled by the terminal equipment;
responding to a first triggering operation acted on the displacement control, controlling the target virtual object to execute a displacement action in the first game scene, and displaying a drop point indication mark on the graphical user interface, wherein the real-time display position of the drop point indication mark is determined based on the real-time displacement action of the target virtual object and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement action has target interaction action with the first interaction object;
Responding to a second trigger operation acting on the interaction control, and displaying a drop point offset mark on the graphical user interface according to the second trigger operation, wherein the drop point offset mark is used for indicating offset influence of the second trigger operation on the real-time drop point position of the first interaction object;
and in response to the end of the second trigger operation, controlling the target virtual object to execute the target interaction behavior on the first interaction object.
In an embodiment, the first game scene further includes a second interactive object used together with the first interactive object;
the step of controlling the target virtual object to execute the displacement behavior in the first game scene in response to the first trigger operation acting on the displacement control, and displaying a drop point indication mark on the graphical user interface, includes:
responding to a first triggering operation acting on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, and displaying a drop point indication mark at the second interaction object;
the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interactive object at the second interactive object after the target virtual object executing the real-time displacement behavior has the target interactive behavior with the first interactive object.
In an embodiment, before the graphical user interface displays the first game scene, the displacement control, and the interaction control, the method further comprises:
and displaying a second game scene on the graphical user interface, wherein the second game scene comprises a plurality of game camps and the first interaction object, each game camp comprises a plurality of virtual characters, and the virtual characters of the game camps compete for the control right of the first interaction object in the second game scene.
In an embodiment, the displaying the first game scene, the displacement control and the interaction control on the graphical user interface includes:
and responding to a target interaction event, switching the second game scene displayed by the graphical user interface into a first game scene, and displaying a displacement control and an interaction control on the graphical user interface.
In an embodiment, the game view angle corresponding to the first game scene is different from the game view angle corresponding to the second game scene.
In an embodiment, the game view angle corresponding to the first game scene is a 2.5D top-down view angle, and the game view angle corresponding to the second game scene is a first-person view angle or a third-person view angle.
In an embodiment, the controlling the target virtual object to execute the displacement behavior in the first game scene in response to the first trigger operation acting on the displacement control, and displaying a drop point indication identifier on the graphical user interface, includes:
responding to a first triggering operation acting on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, displaying a drop point indication mark on the graphical user interface, and displaying a movement trend mark near the drop point indication mark;
the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement behavior has the target interaction behavior with the first interaction object; the movement trend identifier is used for indicating a first movement direction and a first movement force associated with the real-time drop point position of the first interaction object.
In an embodiment, the displaying a drop point offset identifier on the graphical user interface according to the second trigger operation, in response to the second trigger operation acting on the interaction control, includes:
Responding to a second triggering operation acting on the interaction control, and displaying a drop point deviation mark on the graphical user interface according to the second triggering operation, wherein the drop point deviation mark is used for indicating a second movement direction and a second movement force of the real-time drop point position deviation of the first interaction object;
and updating the movement trend identifier according to the second triggering operation so as to display the updated movement trend identifier on the graphical user interface.
In an embodiment, the controlling the target virtual object to execute the target interaction behavior on the first interaction object in response to the end of the second trigger operation includes:
and in response to the end of the second trigger operation, controlling the target virtual object to execute the target interaction behavior on the first interaction object based on the first movement direction, the second movement direction, the first movement force, and the second movement force.
In an embodiment, after the controlling the target virtual object to execute the displacement behavior in the first game scene in response to the first trigger operation acting on the displacement control, and displaying a drop point indication identifier on the graphical user interface, the method further includes:
And displaying a time prompt identifier on the graphical user interface, wherein the time prompt identifier is used for prompting the player with the remaining duration for performing the trigger operation on the interaction control.
For the specific implementation of each of the above operations, refer to the previous embodiments; details are not repeated here.
Wherein the storage medium may include: read Only Memory (ROM), random access Memory (RAM, random Access Memory), magnetic or optical disk, and the like.
Through the computer program stored in the storage medium, a first game scene, a displacement control, and an interaction control can be displayed on the graphical user interface, where the first game scene includes a first interaction object and a target virtual object controlled by the terminal device; then, in response to a first trigger operation acting on the displacement control, the target virtual object is controlled to execute a displacement behavior in the first game scene, and a drop point indication identifier is displayed on the graphical user interface, where the real-time display position of the drop point indication identifier is determined based on the real-time displacement behavior of the target virtual object and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object performing the real-time displacement behavior performs the target interaction behavior on the first interaction object; then, in response to a second trigger operation acting on the interaction control, a drop point offset identifier is displayed on the graphical user interface according to the second trigger operation, where the drop point offset identifier is used for indicating the offset effect of the second trigger operation on the real-time drop point position of the first interaction object; and finally, in response to the end of the second trigger operation, the target virtual object is controlled to execute the target interaction behavior on the first interaction object. In this way, the drop point indication identifier can be displayed when the target interaction event occurs, and when the player controls the target virtual object to adjust the virtual sphere, the drop point offset identifier indicates the offset effect of the player's adjustment operation on the drop point indication identifier of the virtual sphere, guiding the player to adjust the direction and force applied to the virtual sphere according to the drop point indication identifier and the drop point offset identifier. This enriches the gameplay, greatly improves the playability of the game content, effectively increases the richness of the game's interaction modes, improves the game experience, and enhances the player's immersion in the game.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for the parts of an embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The foregoing describes in detail the interaction method, apparatus, computer device, and storage medium in a game provided by the embodiments of the present application, and specific examples are used herein to illustrate the principles and implementations of the present application; the above description of the embodiments is only intended to help understand the technical solution and core ideas of the present application. Those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents, and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. An interactive method in a game, characterized in that a graphical user interface is provided by a terminal device, the method comprising:
displaying a first game scene, a displacement control and an interaction control on the graphical user interface, wherein the first game scene comprises a first interaction object and a target virtual object controlled by the terminal equipment;
Responding to a first triggering operation acted on the displacement control, controlling the target virtual object to execute a displacement action in the first game scene, and displaying a drop point indication mark on the graphical user interface, wherein the real-time display position of the drop point indication mark is determined based on the real-time displacement action of the target virtual object and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement action has target interaction action with the first interaction object;
responding to a second trigger operation acting on the interaction control, and displaying a drop point offset mark on the graphical user interface according to the second trigger operation, wherein the drop point offset mark is used for indicating offset influence of the second trigger operation on the real-time drop point position of the first interaction object;
and in response to the end of the second trigger operation, controlling the target virtual object to execute the target interaction behavior on the first interaction object.
2. The method of claim 1, wherein the first game scene further comprises a second interaction object used together with the first interaction object;
The step of controlling the target virtual object to execute the displacement behavior in the first game scene in response to the first trigger operation acting on the displacement control, and displaying a drop point indication mark on the graphical user interface, comprises:
responding to a first triggering operation acting on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, and displaying a drop point indication mark at the second interaction object;
the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interactive object at the second interactive object after the target virtual object executing the real-time displacement behavior has the target interactive behavior with the first interactive object.
3. The method of claim 2, wherein prior to the graphical user interface displaying the first game scene, the displacement control, and the interaction control, the method further comprises:
and displaying a second game scene on the graphical user interface, wherein the second game scene comprises a plurality of game camps and the first interaction object, each game camp comprises a plurality of virtual characters, and the virtual characters of the game camps compete for the control right of the first interaction object in the second game scene.
4. The method of claim 3, wherein displaying the first game scene, the displacement control, and the interaction control at the graphical user interface comprises:
and responding to a target interaction event, switching the second game scene displayed by the graphical user interface into a first game scene, and displaying a displacement control and an interaction control on the graphical user interface.
5. The method of claim 4, wherein the first game scene corresponds to a different game perspective than the second game scene.
6. The method of claim 4, wherein the game view angle corresponding to the first game scene is a 2.5D top-down view angle, and the game view angle corresponding to the second game scene is a first-person view angle or a third-person view angle.
7. The method of claim 1, wherein controlling the target virtual object to perform a displacement action in the first game scene in response to a first trigger operation on the displacement control and displaying a drop point indication identifier on the graphical user interface comprises:
responding to a first triggering operation acting on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, displaying a drop point indication mark on the graphical user interface, and displaying a movement trend mark near the drop point indication mark;
The real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement behavior has the target interaction behavior with the first interaction object; the movement trend identifier is used for indicating a first movement direction and a first movement force associated with the real-time drop point position of the first interaction object.
8. The method of claim 7, wherein displaying a drop point offset identifier on the graphical user interface according to the second trigger operation, in response to the second trigger operation acting on the interaction control, comprises:
responding to a second triggering operation acting on the interaction control, and displaying a drop point deviation mark on the graphical user interface according to the second triggering operation, wherein the drop point deviation mark is used for indicating a second movement direction and a second movement force of the real-time drop point position deviation of the first interaction object;
and updating the movement trend identifier according to the second triggering operation so as to display the updated movement trend identifier on the graphical user interface.
9. The method of claim 8, wherein controlling the target virtual object to execute the target interaction behavior on the first interaction object in response to the end of the second trigger operation comprises:
and in response to the end of the second trigger operation, controlling the target virtual object to execute the target interaction behavior on the first interaction object based on the first movement direction, the second movement direction, the first movement force, and the second movement force.
10. The method of claim 8, wherein after controlling the target virtual object to execute a displacement behavior in the first game scene in response to a first trigger operation acting on the displacement control and displaying a drop point indication identifier on the graphical user interface, the method further comprises:
and displaying a time prompt identifier on the graphical user interface, wherein the time prompt identifier is used for prompting the player with the remaining duration for performing the trigger operation on the interaction control.
11. An interaction apparatus in a game, characterized in that a graphical user interface is provided through a terminal device, the apparatus comprising:
the first display unit is used for displaying a first game scene, a displacement control and an interaction control on the graphical user interface, wherein the first game scene comprises a first interaction object and a target virtual object controlled by the terminal equipment;
The first control unit is used for responding to a first triggering operation acted on the displacement control, controlling the target virtual object to execute displacement behavior in the first game scene, and displaying a drop point indication mark on the graphical user interface, wherein the real-time display position of the drop point indication mark is determined based on the real-time displacement behavior of the target virtual object, and is used for indicating the real-time drop point position of the first interaction object in the first game scene after the target virtual object executing the real-time displacement behavior has target interaction behavior with the first interaction object;
the second display unit is used for responding to a second trigger operation acting on the interaction control, and displaying a drop point offset mark on the graphical user interface according to the second trigger operation, wherein the drop point offset mark is used for indicating offset influence of the second trigger operation on the real-time drop point position of the first interaction object;
and the second control unit is used for responding to the ending of the second triggering operation and controlling the target virtual object to execute the target interaction behavior on the first interaction object.
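Structurally, the apparatus of claim 11 maps naturally onto one method per claimed unit. The skeleton below is an assumed shape with docstrings restating each unit's role; the claim defines functions, not an API, so names and signatures are illustrative.

```python
# Structural sketch only: one method per claimed unit.
class InGameInteractionApparatus:
    def display_first_scene(self) -> None:
        """First display unit: show the first game scene together with
        the displacement control and the interaction control."""

    def on_first_trigger(self, op: object) -> None:
        """First control unit: drive the displacement behavior and keep
        the drop point indication identifier in sync with it."""

    def on_second_trigger(self, op: object) -> None:
        """Second display unit: show the drop point offset identifier
        derived from the second trigger operation."""

    def on_second_trigger_end(self) -> None:
        """Second control unit: perform the target interaction behavior
        on the first interaction object."""
```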
12. A computer device, comprising a processor and a memory, wherein the memory stores a plurality of instructions, and the processor loads the instructions from the memory to perform the steps in the interaction method in a game according to any one of claims 1 to 10.
13. A computer-readable storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the steps in the interaction method in a game according to any one of claims 1 to 10.
CN202410037172.XA 2024-01-10 2024-01-10 Interaction method and device in game, computer equipment and storage medium Pending CN117753001A (en)

Priority Applications (1)

Application Number: CN202410037172.XA (published as CN117753001A, en)
Priority Date: 2024-01-10
Filing Date: 2024-01-10
Title: Interaction method and device in game, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202410037172.XA (published as CN117753001A, en)
Priority Date: 2024-01-10
Filing Date: 2024-01-10
Title: Interaction method and device in game, computer equipment and storage medium

Publications (1)

Publication Number: CN117753001A
Publication Date: 2024-03-26

Family

ID=90310866

Family Applications (1)

Application Number: CN202410037172.XA (CN117753001A, en)
Status: Pending

Country Status (1)

Country: CN
Link: CN117753001A (en)

Similar Documents

Publication Publication Date Title
KR20200115213A (en) Automated player control takeover in a video game
US10918937B2 (en) Dynamic gameplay session management system
TW202222395A (en) Display method, device, apparatus and medium for pre-purchase items
JP2020168527A (en) Program, terminal, game system, and game management device
WO2022083451A1 (en) Skill selection method and apparatus for virtual object, and device, medium and program product
CN112221135B (en) Picture display method, device, equipment and storage medium
WO2023024880A1 (en) Method and apparatus for expression displaying in virtual scenario, and device and medium
CN115068947A (en) Game interaction method and device, computer equipment and computer-readable storage medium
CN114159789A (en) Game interaction method and device, computer equipment and storage medium
US11541312B2 (en) Respawn systems and methods in video games
CN115645912A (en) Game element display method and device, computer equipment and storage medium
CN117753001A (en) Interaction method and device in game, computer equipment and storage medium
CN112138392B (en) Virtual object control method, device, terminal and storage medium
CN116850595A (en) Game control method, game control device, computer equipment and storage medium
CN117643723A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN110743167A (en) Method and device for realizing interactive function
CN117018617A (en) Game control method, game control device, computer equipment and storage medium
CN116850594A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN116099199A (en) Game skill processing method, game skill processing device, computer equipment and storage medium
CN118001735A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN118001718A (en) Method and device for controlling movement in game, electronic equipment and medium
CN116328301A (en) Information prompting method, device, computer equipment and storage medium
CN117861213A (en) Game skill processing method, game skill processing device, computer equipment and storage medium
WO2024037399A1 (en) Catching information display method and apparatus based on virtual world, and device and medium
JP6761075B2 (en) Programs, terminals, game systems and game management devices

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination