CN117205555A - Game interface display method, game interface display device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN117205555A
CN117205555A (application CN202311175328.2A)
Authority
CN
China
Prior art keywords
throwing
target
game
character
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311175328.2A
Other languages
Chinese (zh)
Inventor
许展昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202311175328.2A priority Critical patent/CN117205555A/en
Publication of CN117205555A publication Critical patent/CN117205555A/en
Pending legal-status Critical Current

Abstract

The embodiments of the present application disclose a game interface display method, a game interface display device, an electronic device, and a computer-readable storage medium. A target throwing object to be thrown is determined in response to a throwing object selection operation on a graphical user interface, and a throwing preview panel is then displayed on the graphical user interface. The throwing preview panel includes the target game character and a throwing track; the throwing track includes a throwing start point and a throwing end point; and the panel further includes distance information indicating the distance from the throwing start point to the throwing end point. The embodiments of the present application can reduce the time a player spends when throwing a throwing object.

Description

Game interface display method, game interface display device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a game interface display method, a game interface display device, an electronic device, and a computer readable storage medium.
Background
Under the wave of the Internet, the continuous development of hardware and software technology has driven the emergence of intelligent devices and software. At the same time, a large number of games with different themes have appeared to meet the demands of users, and with the vigorous development of the game industry, players' expectations for game performance keep increasing.
Currently, in some throwing games, a player generally has to throw a throwing object multiple times before it lands at a position the player finds satisfactory. This trial-and-error cost is high, so the player spends considerable time when throwing a throwing object.
Disclosure of Invention
Embodiments of the present application provide a game interface display method, apparatus, electronic device, and computer-readable storage medium, which can reduce the time spent by a player throwing a throwing object.
In a first aspect, an embodiment of the present application provides a game interface display method, where a graphical user interface is provided by a terminal device, where content displayed in the graphical user interface includes at least a part of a game scene and a target game character that is located in the game scene and is controlled by a current player, and the method includes:
determining a target thrown object to be thrown in response to a thrown object selecting operation on the graphical user interface;
and controlling to display a throwing preview panel on the graphical user interface, wherein the throwing preview panel comprises the target game character and a throwing track, the throwing track comprises a throwing starting point and a throwing end point, and the throwing preview panel further comprises distance information indicating a distance from the throwing starting point to the throwing end point.
In a second aspect, an embodiment of the present application further provides a game interface display apparatus, which provides a graphical user interface through a terminal device, where content displayed in the graphical user interface includes at least a part of a game scene, and a target game character located in the game scene and operated by a current player, where the apparatus includes:
a projectile determining module for determining a target projectile to be thrown in response to a projectile selection operation on a graphical user interface;
and the throwing preview module is used for controlling the display of a throwing preview panel on the graphical user interface, wherein the throwing preview panel comprises the target game character and a throwing track, the throwing track comprises a throwing start point and a throwing end point, and the throwing preview panel further comprises distance information indicating the distance from the throwing start point to the throwing end point.
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory storing a plurality of instructions and a processor that loads the instructions from the memory to execute any game interface display method provided by the embodiments of the present application.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform any of the game interface display methods provided by the embodiments of the present application.
In the embodiments of the present application, a graphical user interface is provided through a terminal device, and the content displayed in the graphical user interface includes at least part of a game scene and a target game character that is located in the game scene and controlled by the current player. A target throwing object to be thrown is determined in response to a throwing object selection operation on the graphical user interface, and a throwing preview panel is displayed on the graphical user interface. The throwing preview panel includes the target game character and a throwing track; the throwing track includes a throwing start point and a throwing end point; and the panel further includes distance information indicating the distance from the throwing start point to the throwing end point. By displaying the throwing track on the throwing preview panel, the player can determine the throwing track and related information under the current throwing gesture, which reduces the time the player spends when throwing a throwing object.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a game interface display system according to an embodiment of the present application;
FIG. 2 is a flowchart of an embodiment of a game interface display method according to an embodiment of the present application;
FIG. 3a is a view of a throwing trace scene in a standing throwing pose provided in an embodiment of the present application;
FIG. 3b is a view of a throwing trace scene in a squat throwing gesture provided in an embodiment of the present application;
FIG. 3c is a view of a throwing trace scene in a jumping throwing gesture provided in an embodiment of the present application;
FIG. 3d is a schematic diagram of a scenario of gesture identification provided in an embodiment of the present application;
FIG. 4a is a view of a throwing trace scene using a high-throwing technique provided in an embodiment of the present application;
FIG. 4b is a view of a throwing trace scene using a low-throw technique provided in an embodiment of the present application;
FIG. 5a is a schematic view of one scenario of projectile selection provided in an embodiment of the application;
FIG. 5b is a schematic view of another scenario of projectile selection provided in an embodiment of the application;
FIG. 5c is a schematic view of yet another scenario of projectile selection provided in an embodiment of the application;
FIG. 6 is a schematic view of a scenario of coordinate axes provided in an embodiment of the present application;
FIG. 7a is a schematic diagram of a role identification scenario provided by an embodiment of the present application;
FIG. 7b is a schematic view of a range of waveforms provided by an embodiment of the present application;
FIG. 7c is a view of a throwing trace scene in a jumping throwing gesture provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of a game interface display device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
Before explaining the embodiments of the present application in detail, some terms related to the embodiments of the present application are explained.
Wherein in the description of embodiments of the present application, the terms "first," "second," and the like may be used herein to describe various concepts, but are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Wherein, the graphical user interface: the graphical user interface may be a display screen interface of the terminal device, obtained by executing a software application on a processor of a mobile terminal or other terminal and rendering it on a display. The graphical user interface may present all of the game scene or only a portion of it. The game scene includes a plurality of static virtual objects, such as ground, mountains, stones, vegetation, and buildings. When the game scene is large, only local content of the game scene is displayed on the graphical user interface of the terminal device during the game.
Wherein, the game scene: the scene that an application displays (or provides) when running on a terminal or server. Optionally, the game scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The game scene may be two-dimensional, 2.5-dimensional, or three-dimensional; the dimensionality of the game scene is not limited in the embodiments of the present application. For example, a game scene may include sky, land, and sea, where the land may include environmental elements such as desert and city, and where a user may control the target game character to move.
Wherein, the target game character: a dynamic object that can be controlled in the game scene. Optionally, the target game character may be a virtual person, a virtual animal, a cartoon character, or the like, and may be an avatar representing the user in the game scene. At least one target game character may be included in the game scene, each having its own shape and volume and occupying part of the space in the scene. In one possible implementation, the user can control the target game character to move in the game scene, for example to run, jump, or crawl, and can control it to fight other game characters using the skills, virtual props, and so on provided by the application.
In this embodiment, the target game character may be a virtual character that a user controls by an operation on the client. In some implementations, the target game character can be a virtual character that plays in a game scene. In some embodiments, the number of target game characters participating in the interaction in the game scene may be preset, or may be dynamically determined according to the number of clients joining the interaction.
The embodiments of the present application provide a game interface display method, a game interface display device, an electronic device, and a computer-readable storage medium. Specifically, the game interface display method of the embodiments of the present application may be executed by an electronic device, where the electronic device may be a terminal, a server, or another device. The terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and the terminal may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN), big data, and artificial intelligence platforms.
For example, when the game interface display method is run on the terminal, the terminal device stores a game application and is used to present a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
For example, when run on a server, the game interface display method may be implemented as a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In the running mode of a cloud game, the body that runs the game application program is separated from the body that presents the game pictures: the storage and running of the game interface display method are completed on a cloud game server, while the presentation of the game pictures is completed at a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game pictures; for example, it may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, while the terminal device that processes the game data is the cloud game server in the cloud. When playing, the user operates the cloud game client to send an operation instruction to the cloud game server; the server runs the game according to the instruction, encodes and compresses data such as game pictures, and returns them to the client over the network; finally, the client decodes the data and outputs the game pictures.
Referring to fig. 1, fig. 1 is a schematic diagram of a game interface display system according to an embodiment of the application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. Terminal 1000 held by a user may be connected to servers of different games through network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining input from a user through touch or slide operations performed at multiple points of one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000, through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different terminals 1000 may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals 1000 so as to be connected via an appropriate network and synchronized with each other to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 while different users play the multiplayer game online.
The following detailed description is provided with reference to the accompanying drawings. The following description of the embodiments is not intended to limit the preferred embodiments. Although a logical order is depicted in the flowchart, in some cases the steps shown or described may be performed in an order different than depicted in the figures.
This embodiment is described by taking a terminal as an example. A game interface display method is provided, where a graphical user interface is provided by a terminal device, and the content displayed in the graphical user interface includes at least part of a game scene and a target game character that is located in the game scene and controlled by the current player. As shown in fig. 2, the specific flow of the game interface display method may be as follows:
201. In response to a throwing object selection operation on the graphical user interface, a target throwing object to be thrown is determined.
202. Control display of a throwing preview panel on the graphical user interface, where the throwing preview panel includes the target game character and a throwing track, the throwing track includes a throwing start point and a throwing end point, and the throwing preview panel further includes distance information indicating the distance from the throwing start point to the throwing end point.
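The two steps above can be sketched in code. This is a minimal hypothetical illustration, not the patent's implementation; the names (`Trajectory`, `on_projectile_selected`) and the hard-coded coordinates are assumptions introduced only for the example.

```python
from dataclasses import dataclass
import math

@dataclass
class Trajectory:
    start: tuple  # (x, y, z) throwing start point
    end: tuple    # (x, y, z) throwing end point

    def distance(self) -> float:
        # Distance information shown on the throwing preview panel.
        return math.dist(self.start, self.end)

def on_projectile_selected(inventory: dict, selection: str) -> dict:
    # Step 201: determine the target throwing object from the selection operation.
    target = inventory[selection]
    # Step 202: build the throwing preview panel data to be displayed.
    # The trajectory endpoints here are placeholder values.
    trajectory = Trajectory(start=(0.0, 1.5, 0.0), end=(15.0, 0.0, 8.0))
    return {
        "projectile": target,
        "trajectory": trajectory,
        "distance_m": round(trajectory.distance(), 1),
    }

panel = on_projectile_selected({"grenade": "frag"}, "grenade")
# panel["distance_m"] → 17.1
```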
In some throwing games, a player needs to switch throwing gestures in order to throw a throwing object that meets expectations, so the time the player spends throwing is relatively long. In this embodiment, therefore, the terminal introduces a throwing preview panel that displays the throwing track produced when the target game character controlled by the current player throws the target throwing object; the throwing track is the movement track of the target throwing object in the air. The player can thus clearly determine the throwing track corresponding to at least one throwing gesture and intuitively select the throwing gesture required for the throw. This improves the perceptibility of throwing feedback, reduces the player's operation cost, and further reduces the time spent when throwing a throwing object. It also avoids the situation in which the player spends a large amount of time in actual combat switching the throwing gesture of the target game character, is defeated by an opposing game character, and has the game experience harmed as a result.
The display position of the throwing preview panel on the graphical user interface may be set as required, for example, in the top area, a side area, or the lower area of the graphical user interface.
The throwing track comprises a throwing starting point and a throwing end point, wherein the throwing starting point is a starting point when a player controls a target game role to throw a target throwing object, and can be understood as the position of the target throwing object on the target game role or the position of the target throwing object when the target throwing object leaves the target game role; the throwing end point is the falling point position of the target throwing object, namely the position when the target throwing object falls on the ground or a certain object.
Optionally, the throwing track may include the throwing tracks produced when the target game character throws the target throwing object in at least two throwing gestures, prompting the player to intuitively determine the throwing track corresponding to each throwing gesture. The player can then intuitively select the throwing gesture required for the throw, which improves the perceptibility of throwing feedback, reduces the player's operation cost, and reduces the time spent when throwing a throwing object.
The throwing start points of the throwing tracks corresponding to different throwing gestures may differ; for example, when the target game character is at the same character position, the throwing start point of the squat throwing gesture is lower than that of the jump throwing gesture.
In addition, to help the player clearly determine how far the throwing track deviates from expectation, the throwing preview panel further includes the distance information from the throwing start point to the throwing end point of the track, so that the player can adjust accordingly. For example, if that distance indicates a landing point short of the position the player expects, the player may control the target game character to switch the throwing method or to move, reducing the deviation between the throwing track and the expected position.
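The adjustment decision just described can be sketched as a comparison between the predicted distance and the distance the player expects. This is a hypothetical sketch; the function name, the advice strings, and the tolerance value are assumptions, not taken from the patent.

```python
def adjust_hint(predicted_m: float, desired_m: float, tolerance_m: float = 1.0) -> str:
    # If the predicted landing point falls short of the expected position,
    # suggest switching the throwing method or moving the character closer;
    # if it overshoots, suggest the opposite adjustment.
    if predicted_m < desired_m - tolerance_m:
        return "switch to high throw or move closer"
    if predicted_m > desired_m + tolerance_m:
        return "switch to low throw or step back"
    return "throw now"

# A predicted 17 m throw against a desired 22 m landing point is short,
# so the hint recommends adjusting.
print(adjust_hint(17.0, 22.0))
```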
It will be appreciated that, since the at least two throwing gestures are used to predict the gestures a player may adopt when throwing the target throwing object, such as a squat throwing gesture, a standing throwing gesture, or a jump throwing gesture, the at least two throwing gestures may include the current throwing gesture maintained by the player, so that the player can throw directly upon finding that the currently maintained gesture meets the expected requirement.
Illustratively, fig. 3a includes the distance information corresponding to the throwing track of each throwing gesture, such as Xa being 17m, Ya being 22m, and Za being 17m; the throwing gesture of the track corresponding to Ya is the player's current throwing gesture, and the game character in fig. 3a is the target game character. In fig. 3a, controls 301 to 308 are operational controls operable by the player and represent different functions.
Specifically, the terminal may design at least one throwing gesture in advance, so that when predicting throwing tracks it predicts the tracks for the preset throwing gestures and for the throwing gesture currently held by the player. Alternatively, the terminal may acquire the historical throwing gestures used by the target game character controlled by the current player during a historical period, and predict the tracks for those historical gestures and for the currently held gesture.
In some embodiments, since the throwing preview panel contains throwing tracks corresponding to at least two throwing gestures, the display of the throwing tracks may carry prompts for the player, for example by differentiating the display styles of the tracks or adding identifiers to them.
Specifically, since the at least two throwing gestures may include the current throwing gesture maintained by the player, the throwing track corresponding to the current throwing gesture of the target game character is displayed in a first track display style, and the throwing tracks corresponding to the predicted throwing gestures of the target game character are displayed in a second track display style. The first and second track display styles differ, for example, in color, line size, or line pattern.
Illustratively, fig. 3a shows the throwing tracks corresponding to three throwing gestures predicted when the target game character is in a standing throwing gesture; the track corresponding to Ya is shown as a solid line and the tracks corresponding to Xa and Za as broken lines. Fig. 3b shows the tracks for the three gestures predicted while the player is in a squat throwing gesture; the track corresponding to Xb is shown as a solid line and those corresponding to Yb and Zb as broken lines. Fig. 3c shows the tracks for the three gestures predicted while the player is in a jump throwing gesture; the track corresponding to Zc is shown as a solid line and those corresponding to Xc and Yc as broken lines. In fig. 3b and 3c, controls 301 to 308 are operational controls operable by the player and represent different functions.
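The solid-versus-dashed styling shown in figs. 3a to 3c can be sketched as a small style table keyed by gesture. This is a hypothetical sketch; the style dictionaries and gesture names are assumptions.

```python
# First track display style: the gesture the character currently holds.
FIRST_STYLE = {"line": "solid", "width": 2}
# Second track display style: the other, predicted gestures.
SECOND_STYLE = {"line": "dashed", "width": 1}

def trajectory_styles(current: str, predicted: list) -> dict:
    # Every predicted gesture gets the second style; the current gesture
    # is then overridden with the first style so it stands out.
    styles = {gesture: SECOND_STYLE for gesture in predicted}
    styles[current] = FIRST_STYLE
    return styles

# Standing throw: the standing track is solid, squat and jump are dashed,
# matching the solid/dashed distinction described for fig. 3a.
styles = trajectory_styles("standing", ["standing", "squat", "jump"])
```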
Specifically, the terminal may further introduce gesture identifiers: the throwing preview panel further includes, associated with the throwing track of each throwing gesture, a gesture identifier of the corresponding gesture, so that the player can determine which throwing gesture a given track belongs to by looking at the identifier. The gesture identifier may be a character, a word, a gesture image, or the like.
Illustratively, as shown in fig. 3d, the characters near the three throwing tracks are the gesture identifiers of those tracks: "current" corresponds to the track under the current throwing gesture, "squat" to the track under the squat throwing gesture, and "jump" to the track under the jump throwing gesture. When gesture identifiers are displayed, the distance information from the throwing start point to the throwing end point is still displayed; it is simply not drawn in this example to avoid clutter.
In some embodiments, displaying the throwing preview panel on the graphical user interface requires determining the throwing track to be displayed on it. The terminal may predict the movement track of the target throwing object in the air from information related to the throw by the target game character, which may include the throwing gesture, the throwing method, the viewing angle of the throw, the throwing distance of the target throwing object, and the like. Throwing methods include, but are not limited to, a high-throw method and a low-throw method; under different throwing methods, the distance from the throwing start point to the throwing end point of an otherwise identical throw differs.
Specifically, the terminal may acquire the throwing method of the target game character, that is, determine the throwing method currently set for the target game character. The terminal may then predict, based on that throwing method, the throwing track produced when the target game character throws the target throwing object in the current throwing gesture at the current viewing angle, and display the track corresponding to the current throwing gesture in the throwing preview panel on the graphical user interface.
The terminal takes the position of the target throwing object on the target game character as the throwing start point, and calculates the throwing distance along the viewing-angle direction corresponding to the target game character's current viewing angle to obtain the landing position of the target throwing object, that is, the throwing end point of the track. The throwing track can then be obtained from the throwing start point and the throwing end point.
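One way to compute such a landing distance is with a simple drag-free ballistic arc. This is an assumption for illustration only; the patent does not specify the physics model, and the function name and parameters are hypothetical.

```python
import math

GRAVITY = 9.8  # m/s^2, assumed

def predict_landing(start_height: float, speed: float, pitch_deg: float) -> float:
    # Horizontal distance from the throwing start point to the throwing end
    # point, for a projectile launched at the current view-angle pitch and
    # falling back to ground level (y = 0), ignoring air resistance.
    pitch = math.radians(pitch_deg)
    vx = speed * math.cos(pitch)  # horizontal launch speed
    vy = speed * math.sin(pitch)  # vertical launch speed
    # Flight time from the quadratic: start_height + vy*t - g*t^2/2 = 0.
    t = (vy + math.sqrt(vy * vy + 2.0 * GRAVITY * start_height)) / GRAVITY
    return vx * t
```

With `start_height = 0` this reduces to the textbook range formula `v^2 * sin(2*pitch) / g`; a higher throwing start point (for example, a jump throwing gesture) yields a longer predicted distance at the same speed and angle.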
It will be appreciated that if the throwing preview panel includes throwing tracks corresponding to at least two throwing gestures, each track may change with the viewing angle of the target game character and with changes of the throwing method; that is, the tracks for different throwing gestures displayed simultaneously in the throwing preview panel share the same viewing angle and the same throwing method.
Illustratively, as shown in fig. 4a and 4b, fig. 4a shows the throwing tracks corresponding to three throwing gestures predicted when the target game character adopts the high throwing method, and fig. 4b shows the throwing tracks corresponding to three throwing gestures predicted when the target game character adopts the low throwing method. The current viewing angle and the target throwing object of the target game character are the same in fig. 4a and 4b, so it can be seen from fig. 4a and 4b that the throwing tracks predicted under the high throwing method reach farther than those predicted under the low throwing method. Control 401, control 402, control 403, control 404, control 405, control 406, control 407, and control 408 in fig. 4a and 4b are operation controls operable by the player for representing different functions.
In some embodiments, since the target game character has multiple throwing methods, in order to respond accurately to user demand, the terminal in this embodiment introduces a method control; that is, the graphical user interface further includes a method control, so that the user controls the throwing method of the target game character by operating the method control, thereby switching among throwing tracks under different throwing methods.
Specifically, in response to a control operation on the method control, the terminal determines an updated target throwing method of the target game character; that is, the target throwing method is the throwing method to which the target game character is to be switched, namely the throwing method under which the player currently wants to view the generated throwing track. The terminal may then predict, based on the target throwing method, a target throwing track for the target game character throwing the target throwing object in the current throwing gesture at the current viewing angle, and update the throwing track in the throwing preview panel based on the target throwing track corresponding to the current throwing gesture; that is, the throwing track corresponding to the current throwing gesture under the previous throwing method is switched to the one under the target throwing method. If the throwing preview panel includes at least two throwing tracks, the throwing tracks corresponding to the predicted throwing gestures also need to be updated and displayed.
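The switching behaviour of the method control can be sketched as below; the two-state toggle and the method names are assumptions for illustration, the disclosure naming high and low throwing methods only as examples:

```python
# Hypothetical set of throwing methods (the disclosure mentions high and
# low throwing methods as examples).
METHODS = ("high", "low")

def on_method_control_operated(current_method):
    """Return the updated target throwing method after the player operates
    the method control (a simple two-state toggle is assumed here)."""
    return METHODS[1] if current_method == METHODS[0] else METHODS[0]
```

After the toggle, the terminal would re-predict the throwing tracks under the returned method and refresh the throwing preview panel.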
The control operation may include a clicking operation, an input operation, a selecting operation, and the like, and may be specifically determined based on an operable manipulation control on the graphical user interface.
For example, as shown in fig. 4a and 4b, fig. 4a shows the display style of the method control 409 under the high throwing method, and fig. 4b shows the display style of the method control 409 under the low throwing method; the player may switch the target game character between the high throwing method and the low throwing method by clicking the method control 409.
In some embodiments, the projectile selection operation may be a clicking operation, a selecting operation, an inputting operation, etc., and may be specifically determined based on an operable control on the graphical user interface regarding projectile selection.
Specifically, the graphical user interface may further include a projectile selection control, and determining the target projectile to be thrown in response to the projectile selection operation on the graphical user interface may include: the terminal, in response to a triggering operation on the projectile selection control, displays a projectile window containing at least one projectile held by the current player; the terminal then determines the selected projectile as the target projectile in response to a selection operation on a projectile. Alternatively, the terminal may directly display the projectile window on the graphical user interface, so that the player can directly select one of the at least one projectile in the window to determine the target projectile.
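The selection flow may be sketched as follows; the window contents mirror the projectiles E–J shown in fig. 5b, and the function names are hypothetical:

```python
def show_projectile_window(held_projectiles):
    """Display the projectile window listing the projectiles the current
    player holds (triggered by operating the projectile selection control)."""
    return list(held_projectiles)

def determine_target_projectile(window, selected_index):
    """The projectile selected in the window becomes the target projectile."""
    return window[selected_index]

window = show_projectile_window(["E", "F", "G", "H", "I", "J"])
target = determine_target_projectile(window, 5)  # player selects projectile J
```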
The triggering operation may include a clicking operation, a mouse touch operation, and the like.
Illustratively, as shown in fig. 5a, clicking the projectile selection control 510 in fig. 5a displays a projectile window including at least one projectile, as shown in fig. 5b; the window in fig. 5b includes projectile E, projectile F, projectile G, projectile H, projectile I, and projectile J. Then, in response to the player's selection operation, the target projectile may be determined, for example projectile J, and the terminal may display the projectile identifier corresponding to the target projectile on the graphical user interface, as shown in fig. 5c. In fig. 5a, fig. 5b, and fig. 5c, control 501, control 502, control 503, control 504, control 505, control 506, control 507, and control 508 are operation controls operable by the player for representing different functions, and control 509 is the method control.
Specifically, the graphical user interface may further include a projectile input control, and determining the target projectile to be thrown in response to the projectile selection operation on the graphical user interface may include: in response to an information input operation on the projectile input control, the terminal determines the projectile corresponding to the input information and takes it as the target projectile.
In some embodiments, the throwing preview panel may further include a coordinate axis, where the coordinate information on the axis serves as the distance information indicating the distance from the throwing start point to the throwing end point; that is, the origin of the coordinate axis represents the character position of the target game character, and the direction of the coordinate axis represents the viewing-angle direction of the target game character. The target game character throws the target throwing object from the throwing start point corresponding to the current throwing gesture at the origin of the coordinate axis, and the throwing end point corresponding to the current throwing gesture lies on the coordinate axis, so that the distance from the throwing start point to the throwing end point of the throwing track can be read off the coordinate axis. If the throwing preview panel includes at least two throwing tracks, the throwing tracks corresponding to the predicted throwing gestures also need to be displayed on the coordinate axis.
Illustratively, as shown in fig. 6, a coordinate axis is displayed below the throwing preview panel; it can be seen from fig. 6 that the origin of the coordinate axis is the position where the target game character is located, and the throwing end points of the throwing tracks corresponding to different throwing gestures lie at different positions on the coordinate axis. In fig. 6, the controls 601, 602, 603, 604, 605, 606, 607 and 608 are operation controls operable by the player for representing different functions, the control 609 is the method control, and the control 610 is the projectile selection control.
It will be appreciated that, by viewing the coordinate axis, the player can control the target game character precisely so that the throwing result better matches the player's expectation. For example, if the player finds that none of the throwing end points of the throwing tracks reaches the expected position, and the distance between the closest throwing end point and the expected position is 5 m, the player can control the target game character to move 5 m in the viewing-angle direction of the target game character.
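The 5 m example above amounts to the following calculation on the coordinate axis; the end-point values used here are hypothetical:

```python
def advance_needed(end_points_m, expected_m):
    """Distance the target game character must move along the viewing-angle
    direction so that the closest throwing end point reaches the expected
    position on the coordinate axis (all values in metres)."""
    closest = min(end_points_m, key=lambda p: abs(expected_m - p))
    return expected_m - closest

# Three throwing end points read off the coordinate axis, target at 30 m:
# the closest end point (25 m) is 5 m short, so the character moves 5 m.
move = advance_needed([12.0, 18.0, 25.0], 30.0)
```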
In some embodiments, in order to enable the player to intuitively perceive the relative positional relationship between the predicted throwing tracks and the enemy game character, the terminal in this embodiment may introduce character identifications; that is, character identifications of the target game character and of the enemy game character of the target game character are displayed on the throwing preview panel.
Specifically, the throwing preview panel further includes a target character identification of the target game character and an enemy character identification of the enemy game character, where the distance between the target character identification and the enemy character identification matches the distance between the target game character and the enemy game character in the viewing-angle direction of the target game character, so as to help the player make a throwing decision based on the relative positional relationship between the throwing tracks and the enemy game character, that is, decide according to which throwing track the target throwing object is to be thrown.
Illustratively, as shown in fig. 7a, the avatar with the white head portrait on the left side in fig. 7a is the target character identification of the target game character, and the avatar with the black head portrait on the right side is the enemy character identification of the enemy game character of the target game character, so that the player can intuitively see the relative positional relationship among the target character identification, the enemy character identification, and the throwing tracks corresponding to different throwing gestures. In fig. 7a, the controls 701, 702, 703, 704, 705, 706, 707 and 708 are operation controls operable by the player for representing different functions, the control 709 is the method control, and the control 710 is the projectile selection control.
It can be understood that the coordinate axis can also be displayed while the character identifications of the target game character and its enemy game character are displayed on the throwing preview panel, so that the player can clearly determine the relative positional relationship between the throwing tracks and the enemy game character, thereby throwing the target throwing object more accurately and improving throwing efficiency. Alternatively, instead of displaying the coordinate axis, the player may make a throwing decision according to distance information in another form that indicates the distance from the throwing start point to the throwing end point.
In some embodiments, since different kinds of throwing objects have different sweep ranges, in order for the player to clearly determine the sweep range corresponding to a throwing object, the terminal may display the sweep range on the throwing preview panel; that is, the throwing preview panel may further include the sweep range of the target throwing object. If the throwing track includes the throwing track of the target game character throwing the target throwing object in the current throwing gesture, the terminal may display the sweep range of that track, that is, the sweep range may be displayed centered on the throwing end point of the throwing track corresponding to the current throwing gesture of the target game character. If the throwing preview panel includes at least two throwing tracks, the sweep ranges of the throwing tracks corresponding to the predicted throwing gestures are further displayed.
The terminal can determine the sweep range of the target throwing object according to the type of the target throwing object.
The terminal may display, centered on the throwing end point of the throwing track corresponding to each throwing gesture of the target game character, the sweep range of that throwing track.
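Determining the sweep range from the throwing-object type and centering it on the throwing end point can be sketched as follows; the per-type radii and type names are illustrative assumptions:

```python
# Hypothetical sweep radii per throwing-object type (illustrative only).
SWEEP_RADIUS = {"grenade": 6.0, "incendiary": 4.0, "smoke": 8.0}

def sweep_range_for(projectile_type, throw_end_point):
    """Return the sweep range of a throwing track, displayed centered on
    the throwing end point of that track; the radius depends only on the
    type of the target throwing object."""
    return {"center": throw_end_point,
            "radius": SWEEP_RADIUS[projectile_type]}

rng = sweep_range_for("grenade", (25.0, 0.0))
```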
It can be appreciated that displaying the sweep range helps the player determine, more intuitively and clearly, the area affected by the throwing object under different throwing gestures, so that corresponding adjustments can be made based on the sweep range, for example adjusting the throwing gesture. This greatly reduces the player's perception cost and information blind spots and improves the player's throwing game experience.
Illustratively, fig. 7b shows the throwing tracks corresponding to three throwing gestures predicted when the target game character is in the standing throwing gesture, where the sweep range of the target throwing object is displayed at the throwing end point of the middle throwing track. In fig. 7b, the controls 701, 702, 703, 704, 705, 706, 707 and 708 are operation controls operable by the player for representing different functions, the control 709 is the method control, and the control 710 is the projectile selection control.
In some embodiments, in order to enable the player to perceive more intuitively the relative positional relationship between the sweep ranges of the predicted throwing tracks and the enemy game character, the terminal in this embodiment may display the enemy character identifications of enemy game characters differently; that is, an enemy character identification within the sweep range is displayed differently from one outside the sweep range.
Specifically, the throwing tracks further include the throwing track of the target game character throwing the target throwing object in at least one predicted throwing gesture, and the terminal may determine a target sweep range centered on the throwing end point corresponding to each throwing track of the target game character. Then, based on the target sweep range corresponding to each throwing track, the terminal displays, in the throwing preview panel, enemy character identifications located within any target sweep range in a first identification display style and enemy character identifications located outside the target sweep ranges in a second identification display style. The first identification display style and the second identification display style differ, for example, in identification color, identification style, or identification size.
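Choosing between the first and second identification display styles can be sketched as follows, assuming circular target sweep ranges in a top-down plane; the style names and range values are hypothetical:

```python
import math

def enemy_identifier_style(enemy_pos, target_sweep_ranges):
    """Display an enemy character identification in the first style when
    the enemy lies inside any target sweep range, otherwise in the second.

    target_sweep_ranges: iterable of (center, radius) pairs, one per
    throwing track, each centered on that track's throwing end point.
    """
    inside = any(math.dist(enemy_pos, center) <= radius
                 for center, radius in target_sweep_ranges)
    return "first_style" if inside else "second_style"

# Three throwing tracks, hence three target sweep ranges (hypothetical).
ranges = [((18.0, 0.0), 6.0), ((25.0, 0.0), 6.0), ((32.0, 0.0), 6.0)]
```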
Illustratively, as shown in fig. 7c, enemy character identifications A and B in fig. 7c lie within the target sweep range corresponding to at least one of the three throwing tracks, while enemy character identification C lies outside the target sweep ranges corresponding to all three throwing tracks; therefore, the identification color of enemy character identifications A and B differs from that of enemy character identification C. In fig. 7c, the controls 701, 702, 703, 704, 705, 706, 707, and 708 are operation controls operable by the player for representing different functions, the control 709 is the method control, and the control 710 is the projectile selection control.
From the above, a graphical user interface is provided through the terminal device, the content displayed in the graphical user interface including at least part of a game scene and a target game character located in the game scene and controlled by the current player. A target throwing object to be thrown is determined in response to a throwing object selection operation on the graphical user interface, and a throwing preview panel is controlled to be displayed on the graphical user interface, the panel including the target game character and a throwing track with a throwing start point and a throwing end point, as well as distance information indicating the distance from the throwing start point to the throwing end point. By displaying the throwing track on the throwing preview panel, the player can determine the throwing track and related information under the current throwing gesture, which reduces the time spent by the player when throwing a throwing object.
In order to better implement the above method, an embodiment of the present application further provides a game interface display device, which may be specifically integrated in an electronic device, for example a computer device; the computer device may be a terminal, a server, or another device.
The terminal can be a mobile phone, a tablet personal computer, an intelligent Bluetooth device, a notebook computer, a personal computer and other devices; the server may be a single server or a server cluster composed of a plurality of servers.
For example, in this embodiment, the method of the embodiment of the present application is described in detail taking the case where the game interface display device is integrated in a terminal. This embodiment provides a game interface display device, where a graphical user interface is provided through a terminal device, the content displayed in the graphical user interface including at least part of a game scene and a target game character located in the game scene and controlled by the current player. As shown in fig. 8, the game interface display device 80 may include:
a projectile determining module 801 for determining a target projectile to be thrown in response to a projectile selection operation on a graphical user interface;
a throwing preview module 802, configured to control displaying a throwing preview panel on the graphical user interface, where the throwing preview panel includes the target game character and a throwing track, the throwing track includes a throwing start point and a throwing end point, and the throwing preview panel further includes distance information indicating a distance from the throwing start point to the throwing end point.
In some embodiments, the throwing preview panel further includes a target character identifier of the target game character and an enemy character identifier of the enemy game character, and a distance between the target character identifier and the enemy character identifier matches a distance between the target game character and the enemy game character in a viewing angle direction of the target game character.
In some embodiments, the throwing track includes a throwing track when the target game character throws the target throwing object in the current throwing gesture, and the throwing preview panel further includes the sweep range of the target throwing object, where the sweep range is displayed centered on the throwing end point of the throwing track corresponding to the current throwing gesture of the target game character.
In some embodiments, the throwing trace further includes a throwing trace of the target game character when throwing the target throwing object in at least one predicted throwing gesture, and the game interface display device 80 further includes an identification display module, where the identification display module is specifically configured to:
respectively determining a target sweep range centering on a throwing end point corresponding to each throwing track of the target game character;
In the throwing preview panel, the enemy character identification located within the target sweep range is displayed in a first identification display style, and the enemy character identification located outside the target sweep range is displayed in a second identification display style.
In some embodiments, the throwing preview panel further includes a coordinate axis, an origin of the coordinate axis representing a character position of the target game character, and a direction of the coordinate axis representing a viewing angle direction of the target game character;
the target game character throws a target thrown object at a corresponding throwing start point in a current throwing gesture on an origin of the coordinate axis, and a throwing end point corresponding to the current throwing gesture is on the coordinate axis.
In some embodiments, the throwing trace includes a throwing trace when the target game character throws the target throwing object in at least two throwing poses.
In some embodiments, a throwing track corresponding to a current throwing gesture of the target game character is displayed in a first track display style, and a throwing track corresponding to a predicted throwing gesture of the target game character is displayed in a second track display style.
In some embodiments, in the throwing preview panel, the throwing track of each throwing gesture is further associated with a gesture identification of the corresponding throwing gesture.
In some embodiments, the starting points of the throwing trajectories corresponding to the different throwing poses are different.
In some embodiments, the throwing preview module 802 is specifically configured to:
acquiring a throwing method of the target game character, and predicting a throwing track of the target game character when throwing the target throwing object in a current throwing posture under a current view angle based on the throwing method;
and displaying a throwing track corresponding to the current throwing gesture in the throwing preview panel on the graphical user interface.
In some embodiments, the graphical user interface further includes a method control, and the game interface display device 80 further includes a method control module, where the method control module is specifically configured to:
determine, in response to a control operation on the method control, an updated target throwing method of the target game character;
predict, based on the target throwing method, a target throwing track of the target game character when throwing the target throwing object in the current throwing gesture at the current viewing angle;
and update the throwing track in the throwing preview panel based on the target throwing track corresponding to the current throwing gesture.
In some embodiments, the graphical user interface further includes a projectile selection control, and the projectile determining module 801 is specifically configured to:
responding to the triggering operation of the throwing object selection control, displaying a throwing object window, wherein the throwing object window comprises at least one throwing object held by the current player;
in response to a selection operation of the projectile, the selected projectile is determined as the target projectile.
As can be seen from the above, the game interface display device of this embodiment provides a graphical user interface through the terminal device, the content displayed in the graphical user interface including at least part of a game scene and a target game character located in the game scene and controlled by the current player. It determines a target throwing object to be thrown in response to a throwing object selection operation on the graphical user interface, and displays a throwing preview panel on the graphical user interface, the panel including the target game character and a throwing track with a throwing start point and a throwing end point, as well as distance information indicating the distance from the throwing start point to the throwing end point. By displaying the throwing track on the throwing preview panel, the player can determine the throwing track and related information under the current throwing gesture, which reduces the time spent by the player when throwing a throwing object.
Correspondingly, an embodiment of the present application further provides an electronic device, which may be a terminal such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game machine, a personal computer (PC, Personal Computer), or a personal digital assistant (Personal Digital Assistant, PDA). As shown in fig. 9, which is a schematic structural diagram of an electronic device according to an embodiment of the present application, the electronic device 900 includes a processor 901 having one or more processing cores, a memory 902 having one or more computer-readable storage media, and a computer program stored on the memory 902 and executable on the processor. The processor 901 is electrically connected to the memory 902. It will be appreciated by those skilled in the art that the electronic device structure shown in the figure does not limit the electronic device, which may include more or fewer components than shown, combine certain components, or arrange components differently.
Processor 901 is a control center of electronic device 900, connects various portions of the entire electronic device 900 using various interfaces and lines, and performs various functions of electronic device 900 and processes data by running or loading software programs and/or modules stored in memory 902, and invoking data stored in memory 902, thereby performing overall monitoring of electronic device 900.
In the embodiment of the present application, the processor 901 in the electronic device 900 loads instructions corresponding to the processes of one or more application programs into the memory 902, and runs the application programs stored in the memory 902, thereby implementing various functions as follows:
determining a target thrown object to be thrown in response to a thrown object selecting operation on the graphical user interface;
and controlling to display a throwing preview panel on the graphical user interface, wherein the throwing preview panel comprises the target game character and a throwing track, the throwing track comprises a throwing starting point and a throwing end point, and the throwing preview panel further comprises distance information indicating a distance from the throwing starting point to the throwing end point.
In some embodiments, the throwing preview panel further includes a target character identifier of the target game character and an enemy character identifier of the enemy game character, and a distance between the target character identifier and the enemy character identifier matches a distance between the target game character and the enemy game character in a viewing angle direction of the target game character.
In some embodiments, the throwing track includes a throwing track when the target game character throws the target throwing object in the current throwing gesture, and the throwing preview panel further includes the sweep range of the target throwing object, where the sweep range is displayed centered on the throwing end point of the throwing track corresponding to the current throwing gesture of the target game character.
In some embodiments, the throwing trace further includes a throwing trace of the target game character when throwing the target throwing object in at least one predicted throwing gesture, further including:
respectively determining a target sweep range centering on a throwing end point corresponding to each throwing track of the target game character;
in the throwing preview panel, the enemy character identification located within the target sweep range is displayed in a first identification display style, and the enemy character identification located outside the target sweep range is displayed in a second identification display style.
In some embodiments, the throwing preview panel further includes a coordinate axis, an origin of the coordinate axis representing a character position of the target game character, and a direction of the coordinate axis representing a viewing angle direction of the target game character;
the target game character throws a target thrown object at a corresponding throwing start point in a current throwing gesture on an origin of the coordinate axis, and a throwing end point corresponding to the current throwing gesture is on the coordinate axis.
In some embodiments, the throwing trace includes a throwing trace when the target game character throws the target throwing object in at least two throwing poses.
In some embodiments, a throwing track corresponding to a current throwing gesture of the target game character is displayed in a first track display style, and a throwing track corresponding to a predicted throwing gesture of the target game character is displayed in a second track display style.
In some embodiments, in the throwing preview panel, the throwing track of each throwing gesture is further associated with a gesture identification of the corresponding throwing gesture.
In some embodiments, the starting points of the throwing trajectories corresponding to the different throwing poses are different.
In some embodiments, displaying a throwing preview panel on the graphical user interface includes:
acquiring a throwing method of the target game character, and predicting a throwing track of the target game character when throwing the target throwing object in a current throwing posture under a current view angle based on the throwing method;
and displaying a throwing track corresponding to the current throwing gesture in the throwing preview panel on the graphical user interface.
In some embodiments, the graphical user interface further includes a method control, and the method further includes:
determining, in response to a control operation on the method control, an updated target throwing method of the target game character;
predicting, based on the target throwing method, a target throwing track of the target game character when throwing the target throwing object in the current throwing gesture at the current viewing angle;
and updating the throwing track in the throwing preview panel based on the target throwing track corresponding to the current throwing gesture.
In some embodiments, determining the target projectile to be thrown in response to the projectile selection operation on the graphical user interface includes:
responding to the triggering operation of the throwing object selection control, displaying a throwing object window, wherein the throwing object window comprises at least one throwing object held by the current player;
in response to a selection operation of the projectile, the selected projectile is determined as the target projectile.
Thus, the electronic device 900 provided in this embodiment may have the following technical effects: the time spent by the player throwing the throwing object is reduced.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 9, the electronic device 900 further includes: a touch display 903, a radio frequency circuit 904, an audio circuit 905, an input unit 906, and a power supply 907. The processor 901 is electrically connected to the touch display 903, the radio frequency circuit 904, the audio circuit 905, the input unit 906, and the power supply 907, respectively. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 9 is not limiting of the electronic device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
The touch display 903 may be used to display a graphical user interface and receive operation instructions generated by the user acting on the graphical user interface. The touch display 903 may include a display panel and a touch panel. The display panel may be used to display information entered by the user or provided to the user, as well as various graphical user interfaces of the electronic device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), an Organic Light-Emitting Diode (OLED), or the like. The touch panel may be used to collect the user's touch operations on or near it (such as operations performed on or near the touch panel by the user using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which then execute the corresponding programs. Alternatively, the touch panel may include two parts, a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, and sends the touch point coordinates to the processor 901, and it can also receive and execute commands sent from the processor 901. The touch panel may overlay the display panel; upon detecting a touch operation on or near it, the touch panel passes the operation to the processor 901 to determine the type of touch event, and the processor 901 then provides a corresponding visual output on the display panel based on the type of touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 903 to realize the input and output functions.
In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions respectively; that is, the touch display 903 may also implement an input function as part of the input unit 906.
The radio frequency circuit 904 may be configured to receive and transmit radio frequency signals, so as to establish wireless communication with a network device or another electronic device and to exchange signals with the network device or the other electronic device.
The audio circuit 905 may be used to provide an audio interface between the user and the electronic device through a speaker and a microphone. On the one hand, the audio circuit 905 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which the audio circuit 905 receives and converts into audio data. The audio data are then processed by the processor 901 and sent, for example via the radio frequency circuit 904, to another electronic device, or output to the memory 902 for further processing. The audio circuit 905 may also include an earphone jack to provide communication between peripheral headphones and the electronic device.
The input unit 906 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 907 is used to supply power to the various components of the electronic device 900. Optionally, the power supply 907 may be logically connected to the processor 901 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system. The power supply 907 may further include one or more of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other arbitrary components.
Although not shown in fig. 9, the electronic device 900 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described herein.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be completed by instructions, or by instructions controlling associated hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer readable storage medium in which a plurality of computer programs are stored, the computer programs being capable of being loaded by a processor to perform any of the game interface display methods provided by the embodiment of the present application. For example, the computer program may perform the steps of:
Determining a target projectile to be thrown in response to a projectile selection operation on the graphical user interface;
and controlling to display a throwing preview panel on the graphical user interface, wherein the throwing preview panel comprises the target game character and a throwing track, the throwing track comprises a throwing starting point and a throwing end point, and the throwing preview panel further comprises distance information indicating a distance from the throwing starting point to the throwing end point.
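As a concrete illustration of the distance information described above, the distance shown on the preview panel can be computed directly from the throwing start point and throwing end point. A minimal Python sketch (the function names and the metre-based label format are illustrative assumptions, not part of the claimed method):

```python
import math

def throw_distance(start, end):
    """Euclidean distance between the throwing start point and the throwing
    end point of a throwing track, both given as (x, y, z) scene coordinates."""
    return math.dist(start, end)

def format_distance_label(start, end):
    """Format the distance information shown on the throwing preview panel."""
    return f"{throw_distance(start, end):.1f} m"
```

For example, a track from (0, 0, 0) to (3, 4, 0) would be labelled "5.0 m".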
In some embodiments, the throwing preview panel further includes a target character identifier of the target game character and an enemy character identifier of the enemy game character, and a distance between the target character identifier and the enemy character identifier matches a distance between the target game character and the enemy game character in a viewing angle direction of the target game character.
In some embodiments, the throwing track includes a throwing track when the target game character throws the target projectile in a current throwing gesture, and the throwing preview panel further includes a sweep range of the target projectile, wherein the sweep range is displayed centered on the throwing end point of the throwing track corresponding to the current throwing gesture of the target game character.
In some embodiments, the throwing track further includes a throwing track of the target game character when throwing the target projectile in at least one predicted throwing gesture, and the method further includes:
respectively determining a target sweep range centering on a throwing end point corresponding to each throwing track of the target game character;
in the throwing preview panel, the enemy character identification located within the target sweep range is displayed in a first identification display style, and the enemy character identification located outside the target sweep range is displayed in a second identification display style.
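The two identification display styles described above can be assigned with a simple membership test against each sweep range. A hedged Python sketch (names such as `classify_enemy_marks` and the style labels are invented for illustration):

```python
import math

def classify_enemy_marks(throw_end_points, sweep_radius, enemy_positions):
    """For each enemy character identification, decide which display style to
    use: the first style if the enemy lies within the sweep range centered on
    any throwing end point, otherwise the second style."""
    styles = {}
    for name, pos in enemy_positions.items():
        inside = any(math.dist(pos, end) <= sweep_radius
                     for end in throw_end_points)
        styles[name] = "first_style" if inside else "second_style"
    return styles
```

With one end point at the origin and a sweep radius of 5, an enemy at (3, 4) falls inside the range while one at (6, 8) does not.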
In some embodiments, the throwing preview panel further includes a coordinate axis, an origin of the coordinate axis representing a character position of the target game character, and a direction of the coordinate axis representing a viewing angle direction of the target game character;
a throwing start point corresponding to the target game character throwing the target projectile in the current throwing gesture is at the origin of the coordinate axis, and a throwing end point corresponding to the current throwing gesture is on the coordinate axis.
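The coordinate-axis layout above amounts to projecting scene positions onto the character's view direction. A small Python sketch of that projection (all names are illustrative; the panel axis is assumed to be one-dimensional in the horizontal plane):

```python
import math

def axis_coordinate(char_pos, view_dir, point):
    """Scalar coordinate of `point` on the preview-panel axis whose origin is
    the character position and whose direction is the (normalised) view
    direction; the throwing end point of the current gesture lies on this axis."""
    vx, vy = view_dir
    norm = math.hypot(vx, vy)
    ux, uy = vx / norm, vy / norm            # unit view direction
    dx, dy = point[0] - char_pos[0], point[1] - char_pos[1]
    return dx * ux + dy * uy                 # projection onto the axis
```

A point 7 units ahead of the character along the view direction thus gets axis coordinate 7, regardless of any lateral offset.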
In some embodiments, the throwing track includes throwing tracks when the target game character throws the target projectile in at least two throwing gestures.
In some embodiments, a throwing track corresponding to a current throwing gesture of the target game character is displayed in a first track display style, and a throwing track corresponding to a predicted throwing gesture of the target game character is displayed in a second track display style.
In some embodiments, the throwing preview panel further includes, in association with the throwing track of each throwing gesture, a gesture identifier of the corresponding throwing gesture.
In some embodiments, the throwing start points of the throwing tracks corresponding to different throwing gestures are different.
In some embodiments, displaying a throwing preview panel on the graphical user interface includes:
acquiring a throwing skill of the target game character, and predicting, based on the throwing skill, a throwing track when the target game character throws the target projectile in the current throwing gesture at the current view angle;
and displaying the throwing track corresponding to the current throwing gesture in the throwing preview panel on the graphical user interface.
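The prediction step above is not specified in detail; one common way to realise it is a simple ballistic model whose initial speed and launch pitch come from the throwing skill and the current throwing gesture. A hedged Python sketch under those assumptions (the gravity constant, parameter names, and sampling scheme are all illustrative, not the application's actual physics):

```python
import math

GRAVITY = 9.8  # assumed scene gravity, illustrative only

def predict_track(start, view_yaw_deg, speed, pitch_deg, steps=20):
    """Sample a ballistic throwing track from the start point until the
    projectile returns to the start height. `speed` and `pitch_deg` would be
    derived from the character's throwing skill and current throwing gesture;
    `view_yaw_deg` is the horizontal view-angle direction."""
    yaw, pitch = math.radians(view_yaw_deg), math.radians(pitch_deg)
    vh, vz = speed * math.cos(pitch), speed * math.sin(pitch)
    flight = 2 * vz / GRAVITY                # time to return to start height
    track = []
    for i in range(steps + 1):
        t = flight * i / steps
        d = vh * t                           # horizontal distance travelled
        track.append((start[0] + d * math.cos(yaw),
                      start[1] + d * math.sin(yaw),
                      start[2] + vz * t - 0.5 * GRAVITY * t * t))
    return track  # track[0] is the throwing start point, track[-1] the end point
```

The first and last samples give the throwing start point and throwing end point, from which the panel's distance information and sweep range can be derived.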
In some embodiments, the graphical user interface further includes a skill control, and the method further includes:
in response to a control operation on the skill control, determining an updated target throwing skill of the target game character;
predicting, based on the target throwing skill, a target throwing track when the target game character throws the target projectile in the current throwing gesture at the current view angle;
and updating the throwing track in the throwing preview panel based on the target throwing track corresponding to the current throwing gesture.
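The update flow above can be sketched as a small panel object that re-runs the trajectory predictor whenever the skill control reports a new throwing skill. A Python illustration (the class and callback names are assumptions, not the application's actual API):

```python
class ThrowPreviewPanel:
    """Minimal sketch of updating the previewed throwing track when the
    player switches throwing skill via the skill control."""

    def __init__(self, predict_fn):
        self.predict_fn = predict_fn   # e.g. a ballistic trajectory predictor
        self.track = []                # throwing track currently displayed

    def on_skill_selected(self, skill, start, view_dir):
        # 1) the updated target throwing skill arrives from the skill control;
        # 2) re-predict the track for the current gesture and view angle;
        # 3) replace the track shown on the preview panel.
        self.track = self.predict_fn(skill, start, view_dir)
        return self.track
```

A trivial predictor stub is enough to exercise the flow: selecting a skill with a longer range immediately replaces the displayed track.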
In some embodiments, the graphical user interface further includes a projectile selection control, and the determining a target projectile to be thrown in response to a projectile selection operation on the graphical user interface includes:
in response to a trigger operation on the projectile selection control, displaying a projectile window, wherein the projectile window contains at least one projectile held by the current player;
and in response to a selection operation on a projectile, determining the selected projectile as the target projectile.
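The selection flow above reduces to opening a window over the player's held projectiles and committing the tapped one as the target. A minimal Python sketch (the function name and return convention are illustrative):

```python
def select_target_projectile(held_projectiles, chosen_index):
    """Sketch of the selection flow: triggering the projectile selection
    control opens a window listing the projectiles the current player holds;
    the one the player selects becomes the target projectile to be thrown."""
    window = list(held_projectiles)      # projectile window contents
    if not window:
        return None                      # player holds nothing to throw
    return window[chosen_index]          # selected -> target projectile
```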
It can be seen that the computer program can be loaded by the processor to execute any of the game interface display methods provided by the embodiments of the present application, thereby bringing about the following technical effect: the time spent by the player in throwing a projectile is reduced.
For the specific implementation of each operation above, reference may be made to the previous embodiments, which are not described herein again.
The computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, and the like.
Because the computer program stored in the computer-readable storage medium can execute any of the game interface display methods provided by the embodiments of the present application, it can achieve the beneficial effects achievable by any of those methods, which are detailed in the previous embodiments and are not described herein again.
The game interface display method, the game interface display device, the electronic device, and the computer-readable storage medium provided by the embodiments of the present application have been described above with specific examples to illustrate the principles and implementations of the present application; the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope in light of the ideas of the present application. In summary, the contents of this description should not be construed as limiting the present application.

Claims (15)

1. A game interface display method, characterized in that a graphical user interface is provided through a terminal device, content displayed in the graphical user interface including at least a part of a game scene and a target game character located in the game scene and operated by a current player, the method comprising:
Determining a target projectile to be thrown in response to a projectile selection operation on the graphical user interface;
controlling to display a throwing preview panel on the graphical user interface, wherein the throwing preview panel comprises the target game character and a throwing track, the throwing track comprises a throwing starting point and a throwing end point, and the throwing preview panel further comprises distance information indicating a distance from the throwing starting point to the throwing end point.
2. The game interface display method according to claim 1, wherein the throwing preview panel further includes a target character identification of the target game character and an enemy character identification of an enemy game character, a distance between the target character identification and the enemy character identification matching a distance between the target game character and the enemy game character in a viewing angle direction of the target game character.
3. The game interface display method according to claim 2, wherein the throwing track includes a throwing track when the target game character throws the target projectile in a current throwing gesture, and the throwing preview panel further includes a sweep range of the target projectile, the sweep range being displayed centered on the throwing end point of the throwing track corresponding to the current throwing gesture of the target game character.
4. The game interface display method of claim 3, wherein the throwing track further comprises a throwing track of the target game character when throwing the target projectile in at least one predicted throwing gesture, the method further comprising:
respectively determining a target sweep range centering on a throwing end point corresponding to each throwing track of the target game character;
in the throwing preview panel, the enemy character identification located within the target sweep range is displayed in a first identification display style, and the enemy character identification located outside the target sweep range is displayed in a second identification display style.
5. The game interface display method according to claim 1, wherein the throwing preview panel further includes a coordinate axis, an origin of the coordinate axis representing a character position of the target game character, a direction of the coordinate axis representing a viewing angle direction of the target game character;
a throwing start point corresponding to the target game character throwing the target projectile in the current throwing gesture is at the origin of the coordinate axis, and a throwing end point corresponding to the current throwing gesture is on the coordinate axis.
6. The game interface display method according to claim 1, wherein the throwing track includes throwing tracks when the target game character throws the target projectile in at least two throwing gestures.
7. The game interface display method according to claim 6, wherein a throwing track corresponding to a current throwing gesture of the target game character is displayed in a first track display style, and a throwing track corresponding to a predicted throwing gesture of the target game character is displayed in a second track display style.
8. The game interface display method of claim 6, wherein the throwing preview panel further includes, in association with the throwing track of each throwing gesture, a gesture identification of the corresponding throwing gesture.
9. The game interface display method according to claim 6, wherein the throwing start points of the throwing tracks corresponding to different throwing gestures are different.
10. The game interface display method according to claim 1, wherein the displaying a throwing preview panel on the graphical user interface includes:
acquiring a throwing skill of the target game character, and predicting, based on the throwing skill, a throwing track when the target game character throws the target projectile in the current throwing gesture at the current view angle;
and displaying the throwing track corresponding to the current throwing gesture in the throwing preview panel on the graphical user interface.
11. The game interface display method of claim 1, wherein the graphical user interface further comprises a skill control thereon, and the method further comprises:
in response to a control operation on the skill control, determining an updated target throwing skill of the target game character;
predicting, based on the target throwing skill, a target throwing track when the target game character throws the target projectile in the current throwing gesture at the current view angle;
and updating the throwing track in the throwing preview panel based on the target throwing track corresponding to the current throwing gesture.
12. The game interface display method of any one of claims 1 to 11, wherein the graphical user interface further comprises a projectile selection control thereon, and the determining a target projectile to be thrown in response to a projectile selection operation on the graphical user interface comprises:
in response to a trigger operation on the projectile selection control, displaying a projectile window, wherein the projectile window contains at least one projectile held by the current player;
and in response to a selection operation on a projectile, determining the selected projectile as the target projectile.
13. A game interface display apparatus, wherein a graphical user interface is provided through a terminal device, content displayed in the graphical user interface including at least a part of a game scene and a target game character located in the game scene and operated by a current player, the apparatus comprising:
a projectile determining module for determining a target projectile to be thrown in response to a projectile selection operation on a graphical user interface;
and the throwing preview module is used for controlling to display a throwing preview panel on the graphical user interface, wherein the throwing preview panel comprises the target game role and a throwing track, the throwing track comprises a throwing starting point and a throwing end point, and the throwing preview panel further comprises distance information indicating the distance from the throwing starting point to the throwing end point.
14. An electronic device, comprising a processor and a memory, the memory storing a plurality of instructions; the processor loads the instructions from the memory to perform the game interface display method of any one of claims 1 to 12.
15. A computer readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the game interface display method of any one of claims 1 to 12.
Application CN202311175328.2A, priority date 2023-09-12, filing date 2023-09-12: Game interface display method, game interface display device, electronic equipment and readable storage medium. Legal status: Pending. Publication number: CN117205555A.

Publications (1)

Publication Number: CN117205555A — Publication Date: 2023-12-12



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination