CN117942563A - Game interface display method, game interface display device, electronic equipment and readable storage medium - Google Patents

Game interface display method, game interface display device, electronic equipment and readable storage medium

Info

Publication number
CN117942563A
CN117942563A (application CN202410160530.6A)
Authority
CN
China
Prior art keywords
latency
game
game character
exposure
latent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410160530.6A
Other languages
Chinese (zh)
Inventor
许展昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202410160530.6A
Publication of CN117942563A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a game interface display method, a game interface display device, an electronic device and a computer readable storage medium. In response to a latency event of a game character in a virtual shelter, the latency exposure rate of the game character in at least one orientation centered on the game character is determined; a latent panel is then displayed on the graphical user interface based on the latency exposure rate in the at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latency exposure rate in that target orientation. The embodiment of the application can reduce the effort a player expends when controlling a game character to hide, as well as the perception cost of identifying whether the game character is concealed.

Description

Game interface display method, game interface display device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a game interface display method, a game interface display device, an electronic device, and a computer readable storage medium.
Background
Under the wave of the internet, the continuous development and evolution of hardware and software technology has promoted the advent of intelligent devices and software. At the same time, a large number of games with different themes have emerged to meet user demand, and with the vigorous development of various technologies in the game industry, people's expectations of game performance keep rising.
Currently, in some games, the player needs to control the game character to hide behind or inside some shelter to meet an expected demand; for example, the player may control the game character to hide in grass so that it can ambush, or remain unseen by, hostile game characters. However, because the player can only see the part of the game scene presented from the current view angle and cannot see the scene from other view angles, the player has to expend a great deal of perception cost, while controlling the game character to hide, to identify whether the game character is actually concealed.
Disclosure of Invention
The embodiment of the application provides a game interface display method, a game interface display device, an electronic device and a computer readable storage medium, which can reduce the effort a player expends when controlling a game character to hide, as well as the perception cost of identifying whether the game character is concealed.
In a first aspect, an embodiment of the present application provides a game interface display method, where a graphical user interface is provided by a terminal device, where content displayed in the graphical user interface includes at least a part of a game scene, where the game scene includes a game character and a virtual shelter, and the method includes:
Determining a latency exposure of the game character in at least one orientation centered on the game character in response to a latency event of the game character in the virtual shelter;
Displaying a latent panel on the graphical user interface based on the latent exposure rate in at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latent exposure rate in the target orientation.
In a second aspect, an embodiment of the present application further provides a game interface display apparatus, where a graphical user interface is provided by a terminal device, where content displayed in the graphical user interface includes at least a part of a game scene, where the game scene includes a game character and a virtual shelter, and the apparatus includes:
An exposure rate determination module for determining a latency exposure rate of the game character in at least one direction centered on the game character in response to a latency event of the game character in the virtual shelter;
And a panel display module for displaying a latent panel on the graphical user interface based on the latency exposure rate in at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latency exposure rate in the target orientation.
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory storing a plurality of instructions and a processor that loads the instructions from the memory to execute any of the game interface display methods provided by the embodiments of the present application.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform any of the game interface display methods provided by the embodiments of the present application.
In an embodiment of the present application, a graphical user interface is provided through a terminal device, the content displayed in the graphical user interface includes at least a portion of a game scene, the game scene includes a game character and a virtual shelter, and the latency exposure rate of the game character in at least one orientation centered on the game character is determined in response to a latency event of the game character in the virtual shelter. A latent panel is then displayed on the graphical user interface based on the latency exposure rate in the at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latency exposure rate in the target orientation. By providing the latent panel, the player can be prompted, through the information displayed on it, to learn in time the latent exposure condition of the game character the player controls, which greatly reduces the perception cost the player would otherwise expend, while controlling the game character to hide, on identifying whether the game character is concealed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a game interface display system according to an embodiment of the present application;
FIG. 2 is a flowchart of an embodiment of a game interface display method according to an embodiment of the present application;
FIG. 3a is a schematic illustration of an undetectable latent exposure provided in an embodiment of the present application;
FIG. 3b is a schematic illustration of a less-discoverable latent exposure provided in an embodiment of the application;
FIG. 3c is a schematic illustration of easily found latent exposure provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of a distance setting control provided in an embodiment of the present application;
FIG. 5 is a schematic view of a latent panel provided in an embodiment of the application;
FIG. 6a is a schematic diagram of a latent panel prior to gesture update provided in an embodiment of the present application;
FIG. 6b is a schematic diagram of a latent panel after gesture update provided in an embodiment of the present application;
FIG. 7a is a schematic diagram of a latent panel before distance update according to an embodiment of the present application;
FIG. 7b is a schematic diagram of a latent panel after distance update according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a game interface display device according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
Before explaining the embodiments of the present application in detail, some terms related to the embodiments of the present application are explained.
Wherein in the description of embodiments of the present application, the terms "first," "second," and the like may be used herein to describe various concepts, but are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Wherein the graphical user interface: the graphical user interface is a display screen interface obtained by executing a software application on a processor of a mobile terminal or other terminal device and rendering it on a display of that device. The graphical user interface may present all or only a portion of the game scene. The game scene includes a plurality of static virtual objects, specifically including ground, mountains, stones, vegetation, buildings and the like. When the game scene is large, only the local content of the game scene is displayed on the graphical user interface of the terminal device during the game.
Wherein, the game scene: is a game scene that an application displays (or provides) when running on a terminal or server. Optionally, the game scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The game scene can be any one of a two-dimensional game scene, a 2.5-dimensional game scene or a three-dimensional game scene, and the dimension of the game scene is not limited in the embodiments of the application. For example, a game scene may include sky, land, sea, etc., where the land may include environmental elements such as deserts and cities in which the user can control the game character to move.
Wherein, the game role: refers to dynamic objects that can be controlled in a game scene. Alternatively, the game character may be a virtual character, a virtual animal, a cartoon character, or the like. The game character may be an avatar in the game scene for representing a user. At least one game character may be included in the game scene, each having its own shape and volume in the game scene, occupying a portion of the space in the game scene. In one possible implementation, a user can control a game character to move in the game scene, e.g., to run, jump, crawl, etc., and can also control the game character to fight other game characters using skills, virtual props, etc., provided by the application.
In this embodiment, the game character may be a virtual character that a user controls by an operation on the client. In some implementations, the game character can be a virtual character that plays in a game scene. In some embodiments, the number of game characters participating in the interaction in the game scene may be preset, or may be dynamically determined according to the number of clients joining the interaction.
The embodiment of the application provides a game interface display method, a game interface display device, electronic equipment and a computer readable storage medium. Specifically, the game interface display method of the embodiment of the application can be executed by an electronic device, where the electronic device can be a terminal, a server or another device. The terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game console, a personal computer (PC) or a personal digital assistant (PDA), and the terminal may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (CDN), big data, and artificial intelligence platforms.
For example, when the game interface display method is run on the terminal, the terminal device stores a game application and is used to present a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
For example, when running on a server, the game interface display method may be implemented as a cloud game. Cloud gaming refers to a game mode based on cloud computing. In the cloud game running mode, the entity that runs the game application program is separated from the entity that presents the game picture; the storage and running of the game interface display method are completed on a cloud game server, while the game picture presentation is completed at a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game pictures; for example, the cloud game client may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer or a personal digital assistant, but the terminal device that processes the game data is the cloud game server in the cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game pictures.
Referring to fig. 1, fig. 1 is a schematic diagram of a game interface display system according to an embodiment of the application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. Terminal 1000 held by a user may be connected to servers of different games through network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining input from a user through touch or slide operations performed at multiple points of one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000, through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different terminals 1000 may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals 1000 so as to be connected via an appropriate network and synchronized with each other to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 while different users play the multiplayer game online.
The following detailed description is provided with reference to the accompanying drawings. The following description of the embodiments is not intended to limit the preferred embodiments. Although a logical order is depicted in the flowchart, in some cases the steps shown or described may be performed in an order different than depicted in the figures.
In this embodiment, description is made taking a terminal as an example. A game interface display method is provided, in which a graphical user interface is provided by a terminal device, the content displayed in the graphical user interface includes at least a part of a game scene, and the game scene includes a game character and a virtual shelter. As shown in fig. 2, a specific flow of the game interface display method may be as follows:
201. in response to a latency event of the game character in the virtual shelter, determining a latency exposure of the game character in at least one orientation centered on the game character.
The virtual shelter is a virtual object in the current game scene behind or inside which the target game character can hide, and the game character may hide in the virtual shelter to meet certain demands of the player, such as ambushing or dodging an enemy game character. Such virtual shelters include, but are not limited to, virtual objects in the game scene that a character can hide behind, such as grass, stones, walls and houses.
The latency event is used for indicating that the game character currently needs to hide. The latency event is set so that, after the terminal triggers it, the terminal can respond to the triggered latency event and perform corresponding processing.
The above-mentioned orientation indicates the direction from which the exposure condition of the game character is perceived, that is, a direction from which the game character can currently be observed in the game scene after hiding behind the virtual object. By determining the latency exposure rate in at least one orientation, the latent exposure condition of the hidden game character in that orientation can be clarified; the latency exposure rate is used for indicating the latent exposure condition of the game character.
The above-mentioned latent exposure conditions may be set as required. For example, at least two conditions may be set, such as three latent exposure conditions of different degrees: cannot be found, not easily found, and easily found. Under the cannot-be-found latent exposure condition, the game character does not expose any area outside the virtual shelter, or exposes only a small area, as shown in fig. 3 a. Under the not-easily-found latent exposure condition, the game character exposes a larger area outside the virtual shelter than in fig. 3a, as shown in fig. 3 b. Under the easily-found latent exposure condition, the game character exposes an even larger area than in fig. 3b, as shown in fig. 3 c. In addition, the controls in the left area of fig. 3a, 3b and 3c are controls that control the actions of the game character, while the control in the lower area is an identifier of the weapon held by the game character.
In this embodiment, the terminal determines the latency exposure rate of at least one direction when responding to the latency event of the game character in the game scene, so that the terminal can prompt the player to clearly determine the latency exposure condition of the game character controlled by the terminal through the determined latency exposure rate of at least one direction, thereby improving the information sensitivity of the player and reducing the perception cost of the player.
In some embodiments, the terminal may preset at least one latency trigger condition, so that a latency event is generated when some information in the game is detected to meet the preset latency trigger condition, and the terminal then responds to the latency event and performs corresponding processing. For example, the latency trigger condition may be that a latency trigger control is provided on the graphical user interface, so that a latency event is generated after the player triggers the latency trigger control; alternatively, the latency trigger condition may be that latency-related information between the player-controlled game character and the virtual shelter is detected to meet a preset latency trigger condition, upon which the latency event is generated.
Specifically, the latency-related information may be the occlusion area between the virtual shelter and the game character, and the determining a latency exposure rate of the game character in at least one orientation centered on the game character in response to a latency event of the game character in the virtual shelter may include: the terminal acquires the occlusion area between the virtual shelter and the game character and evaluates it against a preset latency occlusion condition; when the occlusion area meets the preset latency occlusion condition, the latency event is considered triggered, and the terminal therefore determines the latency exposure rate of the game character in at least one orientation centered on the game character.
The latency occlusion condition corresponding to the occlusion area can be that the occlusion area is larger than or equal to a preset threshold. The preset threshold can be set according to the requirements of the user, or according to the overall area of the game character, for example 0.1 times the overall area of the game character.
The occlusion area between the virtual shelter and the game character may be the contact area between the virtual shelter and the game character, or may be the sum of the overlapping areas of the virtual shelter and the game character as observed in at least one orientation; it can be set as needed and is not limited here.
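For ease of understanding, the following is a minimal illustrative sketch, not part of the claimed method, of checking such a latency occlusion condition; the function and parameter names and the default threshold ratio are assumptions made here for illustration only.

```python
# Illustrative sketch only: checks a preset latency occlusion condition.
# All names (occlusion_area, character_area, threshold_ratio) are assumptions,
# not identifiers from the application.

def latency_event_triggered(occlusion_area: float,
                            character_area: float,
                            threshold_ratio: float = 0.1) -> bool:
    """Return True when the occlusion area between the virtual shelter and
    the game character reaches the assumed latency occlusion condition
    'occlusion area >= threshold_ratio * overall character area'."""
    return occlusion_area >= threshold_ratio * character_area

# Example: a character with an overall area of 2.0 units hidden so that
# 0.5 units are occluded by the shelter would trigger the latency event.
print(latency_event_triggered(occlusion_area=0.5, character_area=2.0))  # True
```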
In some embodiments, the determining the latency exposure of the game character in at least one direction centered on the game character may include: the terminal may acquire latency-related information when the game character is in a state of being hidden in the virtual shelter, and determine a latency exposure rate of the game character in at least one direction centering on the game character based on the latency-related information.
The latency related information may be related information that may be used to calculate a latency exposure rate of the game character, and the latency related information includes, but is not limited to, a skin color of the game character, a shelter color of the virtual shelter, a latency distance parameter in at least one of the orientations, a shielding area between the virtual shelter and the game character, and the like.
It is to be understood that, in the case where the terminal determines the latency exposure rate of the game character in one orientation centered on the game character based on a plurality of pieces of latency-related information, the terminal may calculate, for that orientation, a plurality of sub-latency exposure rates based respectively on at least one of the pieces of latency-related information or on a combination of at least two of them. The terminal then assigns corresponding weights to the calculated sub-latency exposure rates and calculates the sum of the weighted sub-latency exposure rates, taking the calculated sum as the latency exposure rate of the game character in that orientation.
Specifically, the latency-related information may include the skin color of the game character and the shelter color of the virtual shelter, and determining the latency exposure rate of the game character in at least one orientation centered on the game character based on the latency-related information may include: the terminal may determine a color similarity between the game character and the virtual shelter based on the skin color and the shelter color, then determine a first exposure rate corresponding to the color similarity based on a preset similarity mapping rule, and use the first exposure rate as the latency exposure rate of the game character in at least one orientation centered on the game character. In this way, whether the game character is easily concealed in the virtual shelter can be determined from the colors of the game character and the virtual shelter.
The similarity mapping rule may be set according to requirements, for example, the terminal may set a similarity mapping curve between similarity and latent exposure rate, so as to select a first exposure rate corresponding to the color similarity from the similarity mapping curve; the terminal may further set a similarity mapping algorithm between the similarity and the latent exposure rate, so as to determine the first exposure rate corresponding to the color similarity by inputting the color similarity into the similarity mapping algorithm.
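By way of example only, the following sketch shows one possible form of such a similarity mapping rule; the RGB-distance similarity metric and the linear mapping are assumptions for illustration and not the specific rule of the application.

```python
# Illustrative sketch of a similarity mapping rule: the higher the color
# similarity between the game character's skin and the shelter, the lower
# the first exposure rate. The specific metric and curve are assumptions.

def color_similarity(skin_rgb, shelter_rgb) -> float:
    """Similarity in [0, 1] based on normalized RGB distance (assumed metric)."""
    dist = sum((a - b) ** 2 for a, b in zip(skin_rgb, shelter_rgb)) ** 0.5
    max_dist = (3 * 255 ** 2) ** 0.5
    return 1.0 - dist / max_dist

def first_exposure_rate(similarity: float) -> float:
    """Assumed similarity mapping rule: exposure falls linearly as similarity rises."""
    return max(0.0, min(1.0, 1.0 - similarity))

sim = color_similarity((34, 120, 40), (30, 130, 35))   # greenish character vs. grass
print(first_exposure_rate(sim))  # close to 0: similar colors, low first exposure rate
```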
Specifically, the latency-related information may include a latency distance parameter in at least one of the orientations, and determining the latency exposure rate of the game character in at least one orientation centered on the game character based on the latency-related information may include: the terminal determines a second exposure rate corresponding to the latency distance parameter in each orientation based on a preset distance mapping rule, and takes the second exposure rate in each orientation as the latency exposure rate in that orientation.
The distance mapping rule can be set according to requirements, for example, a terminal can set a distance mapping curve between a latency distance parameter and a latency exposure rate, so as to select a second exposure rate corresponding to the latency distance parameter from the distance mapping curve; the terminal may further set a distance mapping algorithm between the latency parameter and the latency exposure rate, so as to determine a second exposure rate corresponding to the latency parameter by inputting the latency parameter into the distance mapping algorithm.
It will be appreciated that the latency distance parameter indicates the distance from which the latent exposure of the game character is perceived, that is, how far away an observer is assumed to be when observing the hidden game character.
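As an informal illustration only, a distance mapping rule could take the following piecewise form; the breakpoints are assumptions chosen to be consistent with the 50 m to 100 m and 100 m to 160 m examples given with FIGS. 7a and 7b, not values specified by the application.

```python
# Illustrative sketch of a distance mapping rule: the farther the assumed
# observation distance, the lower the second exposure rate.

def second_exposure_rate(distance_m: float) -> float:
    """Assumed piecewise distance mapping rule."""
    if distance_m >= 160:
        return 0.0
    if distance_m >= 100:
        return 0.25
    if distance_m >= 50:
        return 0.5
    return 1.0

# For a latency distance parameter given as a range, one option is to use
# the nearest (worst-case) distance of the range:
print(second_exposure_rate(50))    # 0.5, matching the 50 m-100 m example
print(second_exposure_rate(100))   # 0.25, matching the 100 m-160 m example
```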
The graphical user interface further includes at least one distance setting control, and the obtaining latency related information when the game character is latency in the virtual shelter may include: and responding to the setting operation of the distance setting control, and taking the set distance range value as the latency distance parameter.
The distance setting control can include a distance bar, and the distance bar includes at least one distance slider. When there is only one distance slider on the distance bar, in response to a sliding operation of the distance slider on the distance bar, the target distance corresponding to the position at which the distance slider stops on the distance bar is determined, and that target distance is taken as the aforementioned latency distance parameter. When there are two distance sliders on the distance bar, as shown in fig. 4, in response to a sliding operation of the two distance sliders on the distance bar, a distance range value composed of the distances corresponding to the respective stop positions of the two distance sliders, such as 50 m to 100 m, is determined, and the distance range value is taken as the aforementioned latency distance parameter. Control 401 and control 402 in fig. 4 are the distance sliders described above, and control 401 and control 402 are on distance bar 403.
The distance setting control may alternatively include a distance information input control, and the terminal may respond to a distance information input operation on the distance information input control and use the input distance information as the latency distance parameter.
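Purely as an illustrative sketch of how a two-slider distance bar could be read into a latency distance parameter; the class, field names and the 0 m to 200 m bar span below are assumptions, not elements disclosed by the application.

```python
# Illustrative sketch of reading a two-slider distance bar into a latency
# distance range value; all names and the bar span are assumptions.

from dataclasses import dataclass

@dataclass
class DistanceBar:
    min_m: float          # distance at the left end of the bar
    max_m: float          # distance at the right end of the bar
    slider_a: float       # slider positions as fractions in [0, 1]
    slider_b: float

    def latency_distance_range(self) -> tuple[float, float]:
        """Map the two slider stop positions to a (near, far) range value."""
        d1 = self.min_m + self.slider_a * (self.max_m - self.min_m)
        d2 = self.min_m + self.slider_b * (self.max_m - self.min_m)
        return (min(d1, d2), max(d1, d2))

bar = DistanceBar(min_m=0, max_m=200, slider_a=0.25, slider_b=0.5)
print(bar.latency_distance_range())  # (50.0, 100.0), e.g. the 50 m-100 m range in FIG. 4
```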
Specifically, the latency-related information includes a blocking area between the virtual shelter and the game character, and determining a latency exposure of the game character in at least one direction centered on the game character based on the latency-related information may include: determining a third exposure rate corresponding to the shielding area under each azimuth based on a preset area mapping rule; the third exposure rate in each of the orientations is set as a latent exposure rate in each of the orientations.
The area mapping rule can be set according to requirements, for example, a terminal can set an area mapping curve between a shielding area and a latent exposure rate, so that a third exposure rate corresponding to the shielding area is selected from the area mapping curve; the terminal can also set an area mapping algorithm between the shielding area and the latent exposure rate, so as to determine a third exposure rate corresponding to the shielding area by inputting the shielding area into the area mapping algorithm.
It will be understood that, if the terminal determines the latency exposure rate of the game character in one orientation centered on the game character based on a plurality of pieces of latency-related information, the first exposure rate, the second exposure rate and the third exposure rate are the sub-latency exposure rates described above. The terminal assigns corresponding weights to the first exposure rate, the second exposure rate and the third exposure rate, calculates the weighted sum of the three, and takes the calculated sum as the latency exposure rate.
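The following is a minimal sketch of such a weighted combination; the particular weight values are assumptions and would in practice be chosen per game rather than being prescribed by the application.

```python
# Illustrative sketch of combining the sub-latency exposure rates with weights.

def latency_exposure_rate(first: float, second: float, third: float,
                          weights=(0.3, 0.3, 0.4)) -> float:
    """Weighted sum of the color-based, distance-based and occlusion-based
    sub-latency exposure rates, clamped to [0, 1]. Weights are assumed values."""
    w1, w2, w3 = weights
    total = w1 * first + w2 * second + w3 * third
    return max(0.0, min(1.0, total))

# Example: noticeable color contrast, medium observation distance, partial occlusion.
print(latency_exposure_rate(first=0.6, second=0.5, third=0.4))  # 0.49
```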
202. Displaying a latent panel on the graphical user interface based on the latent exposure rate in at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latent exposure rate in the target orientation.
The information displayed on the latent panel is used for indicating the latent exposure condition of the game character, namely the latent exposure rate, and the terminal can indicate the latent exposure condition of the game character by adopting different information presentation modes on the latent panel.
It can be understood that, because the player needs to expend a great deal of perception cost to identify whether the game character is concealed while controlling it to hide, in this embodiment the terminal displays a latent panel capable of indicating the latent exposure condition of the game character, so that the player can intuitively perceive that condition and be clear about how safe the player-controlled game character is after hiding. This reduces the player's perception cost and reduces the uncertainty the player faces when fulfilling an expected demand, so that the player can better engage in and experience specific play styles; for example, the player can make better use of the terrain to ambush enemy characters, improving the player's stealth combat experience.
In some embodiments, the orientation exposure rate indicator on the latent panel may include an exposure rate bar, on which orientation information indicating the target orientation is configured, and an exposure rate slider whose position is used for indicating the latency exposure rate in the target orientation.
In some embodiments, the plurality of azimuth exposure rate markers on the latent panel may form a ring, the ring is divided into a plurality of sub-rings, each sub-ring represents one of the azimuth exposure rate markers, a relative direction between the sub-ring and a center position of the ring is used for indicating the target azimuth, and the sub-ring is configured to indicate the latent exposure rate in the target azimuth through a displayed color.
It can be understood that the color displayed by a sub-ring indicates the severity of the latent exposure condition while the sub-ring's position indicates the above target orientation, so that by observing the colors of the sub-rings in different orientations of the ring, the player can intuitively perceive the latent exposure condition of the game character in those orientations. The latent exposure condition corresponding to a specific color can be set as required; for example, when there are three latent exposure conditions of cannot be found, not easily found and easily found, different colors can be assigned to the different degrees, such as green for the condition that cannot be found and red for the condition that can be found easily.
Illustratively, as shown in fig. 5, the left area of fig. 5 shows a latent panel on which there is a ring composed of four sub-rings. Each sub-ring is used to indicate a target orientation by its position relative to the center of the ring, and the color displayed by each sub-ring indicates the latency exposure rate in that target orientation. In total, three colors are shown in fig. 5, each representing one of the three levels of latent exposure condition (cannot be found, not easily found, and easily found); that is, the latent exposure conditions in different orientations in fig. 5 differ, and the sum of the latency exposure rates in fig. 5 may be 50%.
In some embodiments, the latency panel further includes a character indicator located at a center of the ring, as shown in fig. 5, which may be used to indicate the game character, the character indicator being configured to indicate a sum of latency exposure rates for all directions by a displayed color. In addition, the terminal may display the sum of the latent exposure rates directly on the latent panel.
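As a rough illustration of how such a ring-shaped latent panel could be driven, the sketch below classifies each orientation's latency exposure rate into one of three color levels and colors the central character indicator by the summed exposure; the threshold values and color names are assumptions, not values from the application.

```python
# Illustrative sketch: each sub-ring gets a color level, and the center
# character indicator shows the overall (summed) exposure.

LEVELS = [(0.05, "green"),    # cannot be found
          (0.35, "yellow"),   # not easily found
          (1.00, "red")]      # easily found

def level_color(rate: float) -> str:
    """Map a latency exposure rate to an assumed color level."""
    for threshold, color in LEVELS:
        if rate <= threshold:
            return color
    return LEVELS[-1][1]

def render_latent_panel(exposure_by_orientation: dict) -> dict:
    """Return the color of each sub-ring plus the center indicator color."""
    sub_rings = {o: level_color(r) for o, r in exposure_by_orientation.items()}
    total = min(1.0, sum(exposure_by_orientation.values()))
    return {"sub_rings": sub_rings, "character_indicator": level_color(total)}

print(render_latent_panel({"N": 0.0, "E": 0.2, "S": 0.3, "W": 0.0}))
```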
In some embodiments, the terminal may determine the latency exposure rate of the game character in an orientation centered on the game character based on at least one kind of latency-related information, and in the same game this latency-related information may change as the game progresses, so the terminal needs to update the information indicated on the latent panel in time. For example, the occlusion area between the virtual shelter and the game character changes with the posture of the game character; or the shelter color of the virtual shelter where the game character is located differs as the game character advances; or the latency distance parameter changes as the distance setting control is adjusted.
Specifically, the latency-related information may include a blocking area between the virtual shelter and the game character in at least one of the orientations, and may further include, after the latency panel is displayed on the graphical user interface: determining a latest latency exposure of the game character in at least one of the orientations based on a blocking area between the virtual shelter and the game character in at least one of the orientations in response to a gesture switching operation of the game character in the virtual shelter; updating at least one azimuth exposure rate identifier on the latent panel based on the latest latent exposure rate under at least one azimuth.
For example, as shown in fig. 6a, if the game character is currently in a prone posture and the virtual shelter can shield the game character well in this posture, the corresponding latency exposure rate is 0%, that is, the colors of the sub-rings corresponding to the respective orientations of the ring are consistent and are all the color corresponding to the cannot-be-found latent exposure condition.
When the game character switches from the prone posture to the squat posture, if the virtual shelter can no longer shield the game character well, the latest latency exposure rate corresponding to the occlusion area between the game character in the squat posture and the virtual shelter is calculated to be, for example, 50%, and as shown in fig. 6b, the colors of the sub-rings corresponding to the different orientations of the ring are no longer uniform.
Specifically, the latency-related information may include the distance setting control, and the player may control the distance from which the latent exposure of the game character is perceived by resetting the distance setting control, so as to obtain the latent exposure of the game character at that distance. After displaying the latent panel on the graphical user interface, the method may further include: in response to a setting operation on the distance setting control, determining an updated distance range value; the terminal may then determine, based on the preset distance mapping rule, the latest latency exposure rate corresponding to the updated distance range value in each orientation, and update the at least one orientation exposure rate indicator on the latent panel accordingly.
It can be appreciated that, if the player wants to consider other game characters within a certain distance range of the player-controlled game character, based on the situation of the player's own hiding position, and perceive the latent exposure condition of the player-controlled game character to them, the player can determine the latent exposure condition of the game character at the corresponding distance by setting the distance setting control. The farther the distance range value is from the player-controlled game character, the lower the corresponding latency exposure rate.
For example, as shown in fig. 7a, if the distance range value currently corresponding to the game character is 50 m to 100 m, and at this distance range the virtual shelter cannot shield the game character well because the distance is close, the corresponding latency exposure rate is 50%, that is, the colors of the sub-rings corresponding to different orientations of the ring are not consistent.
When the distance range value of the game character is reset and the updated distance range value is 100 m to 160 m, the virtual shelter can shield the game character better because the distance is farther than in fig. 7a, and accordingly the latency exposure rate becomes 25%, as shown in fig. 7 b.
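To tie the above together, the following is a minimal, self-contained sketch of the update flow: recomputing the per-orientation rates when the posture or distance setting changes and refreshing the panel. The class, helper names and the simple area mapping used in it are assumptions rather than the application's specific rules.

```python
# Illustrative sketch of updating the latent panel after it is displayed.

class LatentPanel:
    def __init__(self, orientations=("N", "E", "S", "W")):
        self.rates = {o: 0.0 for o in orientations}

    def update(self, latest_rates: dict) -> None:
        # Refresh each orientation exposure rate indicator with its latest value.
        self.rates.update(latest_rates)

def recompute_rates(occlusion_by_orientation: dict, character_area: float) -> dict:
    # Assumed area mapping rule: the less of the character the shelter occludes
    # in an orientation, the higher the latency exposure rate in that orientation.
    return {o: max(0.0, 1.0 - occ / character_area)
            for o, occ in occlusion_by_orientation.items()}

panel = LatentPanel()
# Posture switch from prone to squat reduces the occluded area, so rates rise.
panel.update(recompute_rates({"N": 2.0, "E": 1.0, "S": 2.0, "W": 1.0},
                             character_area=2.0))
print(panel.rates)  # {'N': 0.0, 'E': 0.5, 'S': 0.0, 'W': 0.5}
```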
From the above, it can be seen that a graphical user interface is provided by a terminal device, the content displayed in the graphical user interface includes at least part of a game scene, the game scene includes a game character and a virtual shelter, and the latency exposure rate of the game character in at least one orientation centered on the game character is determined in response to a latency event of the game character in the virtual shelter. A latent panel is then displayed on the graphical user interface based on the latency exposure rate in the at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latency exposure rate in the target orientation. By providing the latent panel, the player can be prompted, through the information displayed on it, to learn in time the latent exposure condition of the game character the player controls, which greatly reduces the perception cost the player would otherwise expend, while controlling the game character to hide, on identifying whether the game character is concealed.
In order to better implement the above method, the embodiment of the present application further provides a game interface display device, where the game interface display device may be specifically integrated in an electronic device, for example, a computer device, where the computer device may be a terminal, a server, or other devices.
The terminal can be a mobile phone, a tablet personal computer, an intelligent Bluetooth device, a notebook computer, a personal computer and other devices; the server may be a single server or a server cluster composed of a plurality of servers.
For example, in this embodiment, a specific integration of a game interface display device in a terminal will be taken as an example, and a method according to an embodiment of the present application is described in detail, where the game interface display device provides a graphical user interface through a terminal device, where content displayed in the graphical user interface includes at least a part of a game scene, where the game scene includes a game character and a virtual shelter, and as shown in fig. 8, the game interface display device may include:
an exposure rate determining module 801 for determining a latency exposure rate of the game character in at least one direction centered on the game character in response to a latency event of the game character in the virtual shelter;
a panel display module 802 for displaying a latent panel on the graphical user interface based on the latent exposure rate in at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and a latent exposure rate in the target orientation.
In some embodiments, the exposure rate determination module 801 is specifically configured to:
Acquiring latency related information of the game character when the game character is in the virtual shelter;
And determining a latency exposure rate of the game character in at least one direction centering on the game character based on the latency-related information.
In some embodiments, the latency-related information includes skin color of the game character and shelter color of the virtual shelter, and the exposure rate determination module 801 is specifically configured to:
Determining a color similarity between the game character and the virtual shelter based on the skin color and the shelter color;
Determining a first exposure rate corresponding to the color similarity based on a preset similarity mapping rule;
the first exposure rate is set as a latent exposure rate of the game character in at least one of the orientations centered on the game character.
In some embodiments, the latency related information includes at least one latency parameter in the azimuth, the latency parameter is used to indicate a distance for sensing a latency exposure of the game character, and the exposure rate determining module 801 is specifically configured to:
Determining a second exposure rate corresponding to the latency parameter under each azimuth based on a preset distance mapping rule;
the second exposure rate in each of the orientations is set as a latent exposure rate in each of the orientations.
In some embodiments, the graphical user interface further includes at least one distance setting control, and the exposure rate determining module 801 is specifically configured to:
And responding to the setting operation of the distance setting control, and taking the set distance range value as the latency distance parameter.
In some embodiments, the latency-related information includes an area of occlusion between the virtual shelter and the game character in at least one of the orientations, and the game interface display device further includes an update module, the update module being specifically configured to:
Determining a latest latency exposure of the game character in at least one of the orientations based on a blocking area between the virtual shelter and the game character in at least one of the orientations in response to a gesture switching operation of the game character in the virtual shelter;
Updating at least one azimuth exposure rate identifier on the latent panel based on the latest latent exposure rate under at least one azimuth.
In some embodiments, the exposure rate determination module 801 is specifically configured to:
acquiring a shielding area between the virtual shelter and the game character;
And determining the latent exposure rate of the game character in at least one direction centering on the game character in response to the shielding area meeting a preset latent shielding condition.
In some embodiments, the plurality of azimuth exposure rate marks on the latent panel form a ring, the ring is divided into a plurality of sub-rings, each sub-ring represents one azimuth exposure rate mark, a relative direction between the sub-ring and a center position of the ring is used for indicating the target azimuth, and the sub-ring is configured to indicate the latent exposure rate in the target azimuth through a displayed color.
In some embodiments, the latency panel further includes a character indicator located at a central location of the ring, the character indicator configured to indicate a sum of latency exposure rates for all orientations by a displayed color.
As can be seen from the above, a graphical user interface is provided by the terminal device, the content displayed in the graphical user interface includes at least a portion of a game scene, the game scene includes a game character and a virtual shelter, and the exposure rate determining module 801 determines, in response to a latency event of the game character in the virtual shelter, the latency exposure rate of the game character in at least one orientation centered on the game character. Then, based on the latency exposure rate in the at least one orientation, the panel display module 802 displays a latent panel on the graphical user interface, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latency exposure rate in the target orientation. By providing the latent panel, the player can be prompted, through the information displayed on it, to learn in time the latent exposure condition of the game character the player controls, which greatly reduces the perception cost the player would otherwise expend, while controlling the game character to hide, on identifying whether the game character is concealed.
Correspondingly, the embodiment of the application further provides an electronic device, which may be a terminal, where the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game console, a personal computer (PC) or a personal digital assistant (PDA). Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 9, the electronic device 900 includes a processor 901 having one or more processing cores, a memory 902 having one or more computer readable storage media, and a computer program stored on the memory 902 and executable on the processor. The processor 901 is electrically connected to the memory 902. It will be appreciated by those skilled in the art that the electronic device structure shown in the figures does not limit the electronic device, which may include more or fewer components than shown, or combine certain components, or have a different arrangement of components.
Processor 901 is a control center of electronic device 900, connects various portions of the entire electronic device 900 using various interfaces and lines, and performs various functions of electronic device 900 and processes data by running or loading software programs and/or modules stored in memory 902, and invoking data stored in memory 902, thereby performing overall monitoring of electronic device 900.
In the embodiment of the present application, the processor 901 in the electronic device 900 loads the instructions corresponding to the processes of one or more application programs into the memory 902 according to the following steps, and the processor 901 executes the application programs stored in the memory 902, so as to implement various functions:
Determining a latency exposure of the game character in at least one orientation centered on the game character in response to a latency event of the game character in the virtual shelter;
Displaying a latent panel on the graphical user interface based on the latent exposure rate in at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latent exposure rate in the target orientation.
In some embodiments, the determining the latency exposure of the game character in at least one orientation centered on the game character comprises:
Acquiring latency related information of the game character when the game character is in the virtual shelter;
And determining a latency exposure rate of the game character in at least one direction centering on the game character based on the latency-related information.
In some embodiments, the latency-related information includes a skin color of the game character and a shelter color of the virtual shelter, and the determining a latency exposure of the game character in at least one direction centered on the game character based on the latency-related information includes:
Determining a color similarity between the game character and the virtual shelter based on the skin color and the shelter color;
Determining a first exposure rate corresponding to the color similarity based on a preset similarity mapping rule;
the first exposure rate is set as a latent exposure rate of the game character in at least one of the orientations centered on the game character.
In some embodiments, the latency-related information includes at least one latency parameter in the orientation, the latency parameter indicating a distance to perceive a latency exposure of the game character, the determining a latency exposure of the game character in at least one orientation centered on the game character based on the latency-related information includes:
Determining a second exposure rate corresponding to the latency parameter under each azimuth based on a preset distance mapping rule;
the second exposure rate in each of the orientations is set as a latent exposure rate in each of the orientations.
In some embodiments, the graphical user interface further includes at least one distance setting control, and the obtaining latency-related information of the game character when it is in the virtual shelter includes:
And responding to the setting operation of the distance setting control, and taking the set distance range value as the latency distance parameter.
In some embodiments, the latency-related information includes an occlusion area between the virtual shelter and the game character in at least one of the orientations, and after displaying a latency panel on the graphical user interface, further comprising:
Determining a latest latency exposure of the game character in at least one of the orientations based on a blocking area between the virtual shelter and the game character in at least one of the orientations in response to a gesture switching operation of the game character in the virtual shelter;
Updating at least one azimuth exposure rate identifier on the latent panel based on the latest latent exposure rate under at least one azimuth.
In some embodiments, the determining the latency exposure of the game character in at least one orientation centered on the game character in response to a latency event of the game character in the virtual shelter comprises:
acquiring a shielding area between the virtual shelter and the game character;
And determining the latent exposure rate of the game character in at least one direction centering on the game character in response to the shielding area meeting a preset latent shielding condition.
In some embodiments, the plurality of azimuth exposure rate marks on the latent panel form a ring, the ring is divided into a plurality of sub-rings, each sub-ring represents one azimuth exposure rate mark, a relative direction between the sub-ring and a center position of the ring is used for indicating the target azimuth, and the sub-ring is configured to indicate the latent exposure rate in the target azimuth through a displayed color.
In some embodiments, the latency panel further includes a character indicator located at a central location of the ring, the character indicator configured to indicate a sum of latency exposure rates for all orientations by a displayed color.
Thus, the electronic device 900 provided in this embodiment may achieve the following technical effect: the perception cost that the player needs to expend, while controlling the game character to hide, to identify whether the game character is concealed is reduced.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 9, the electronic device 900 further includes: a touch display 903, a radio frequency circuit 904, an audio circuit 905, an input unit 906, and a power supply 907. The processor 901 is electrically connected to the touch display 903, the radio frequency circuit 904, the audio circuit 905, the input unit 906, and the power supply 907, respectively. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
The touch display 903 may be used to display a graphical user interface and to receive operation instructions generated by a user acting on the graphical user interface. The touch display 903 may include a display panel and a touch panel. The display panel may be used to display information entered by the user or provided to the user, as well as the various graphical user interfaces of the electronic device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. The touch panel may be used to collect touch operations performed by the user on or near it (such as operations performed on or near the touch panel by the user with a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions, which cause corresponding programs to be executed. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 901, and can also receive and execute commands sent by the processor 901. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the operation is passed to the processor 901 to determine the type of touch event, and the processor 901 then provides a corresponding visual output on the display panel according to the type of touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 903 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions respectively. That is, the touch display 903 may also implement an input function as part of the input unit 906.
The radio frequency circuit 904 may be used to transmit and receive radio frequency signals, so as to establish a wireless connection with a network device or other electronic devices through wireless communication and to exchange signals with the network device or other electronic devices.
The audio circuit 905 may be used to provide an audio interface between the user and the electronic device through a speaker and a microphone. On one hand, the audio circuit 905 may convert received audio data into an electrical signal and transmit it to the speaker, and the speaker converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which are received by the audio circuit 905 and converted into audio data; the audio data are then processed by the processor 901 and sent, for example, to another electronic device via the radio frequency circuit 904, or output to the memory 902 for further processing. The audio circuit 905 may also include an earphone jack to provide communication between a peripheral headset and the electronic device.
The input unit 906 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 907 is used to supply power to the various components of the electronic device 900. Optionally, the power supply 907 may be logically connected to the processor 901 through a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system. The power supply 907 may also include any one or more of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other components.
Although not shown in fig. 9, the electronic device 900 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described herein.
Each of the foregoing embodiments is described with its own emphasis; for parts that are not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be completed by instructions, or by instructions controlling relevant hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer readable storage medium in which a plurality of computer programs are stored, the computer programs being capable of being loaded by a processor to perform any of the game interface display methods provided by the embodiment of the present application. For example, the computer program may perform the steps of:
Determining a latency exposure of the game character in at least one orientation centered on the game character in response to a latency event of the game character in the virtual shelter;
Displaying a latent panel on the graphical user interface based on the latent exposure rate in at least one orientation, the latent panel including at least one orientation exposure rate indicator for indicating a target orientation and the latent exposure rate in the target orientation.
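Purely as an illustrative sketch, the overall flow that such a stored computer program performs might look like the following; every class, function, and constant here is a hypothetical placeholder used only to show the order of the two steps.

```python
# Illustrative sketch of the overall flow: in response to a latency event,
# determine the exposure rate in each orientation around the character and
# then display the latency panel on the graphical user interface.

ORIENTATIONS = ("N", "NE", "E", "SE", "S", "SW", "W", "NW")

def determine_latency_exposure(character, shelter):
    # Placeholder: combine whatever latency-related information is available
    # (color similarity, distance parameters, occlusion areas) per orientation.
    return {o: 0.25 for o in ORIENTATIONS}

def display_latency_panel(gui, exposure_by_orientation):
    for orientation, rate in exposure_by_orientation.items():
        gui.draw_orientation_indicator(orientation, rate)

class ConsoleGUI:
    def draw_orientation_indicator(self, orientation, rate):
        print(f"{orientation}: exposure {rate:.2f}")

def on_latency_event(character, shelter, gui):
    exposure = determine_latency_exposure(character, shelter)
    display_latency_panel(gui, exposure)

on_latency_event(character=object(), shelter=object(), gui=ConsoleGUI())
```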
In some embodiments, the determining the latency exposure of the game character in at least one orientation centered on the game character comprises:
Acquiring latency related information of the game character when the game character is in the virtual shelter;
And determining a latency exposure rate of the game character in at least one direction centering on the game character based on the latency-related information.
In some embodiments, the latency-related information includes a skin color of the game character and a shelter color of the virtual shelter, and the determining a latency exposure of the game character in at least one direction centered on the game character based on the latency-related information includes:
Determining a color similarity between the game character and the virtual shelter based on the skin color and the shelter color;
Determining a first exposure rate corresponding to the color similarity based on a preset similarity mapping rule;
the first exposure rate is set as a latent exposure rate of the game character in at least one of the orientations centered on the game character.
In some embodiments, the latency-related information includes a latency distance parameter in at least one of the orientations, the latency distance parameter indicating a distance within which the latency exposure of the game character can be perceived, and the determining a latency exposure rate of the game character in at least one orientation centered on the game character based on the latency-related information includes:
Determining a second exposure rate corresponding to the latency distance parameter in each orientation based on a preset distance mapping rule;
taking the second exposure rate in each orientation as the latency exposure rate in that orientation.
In some embodiments, the graphical user interface further includes at least one distance setting control, and the obtaining latency-related information of the game character when the game character is in the virtual shelter includes:
in response to a setting operation on the distance setting control, taking the set distance range value as the latency distance parameter.
In some embodiments, the latency-related information includes an occlusion area between the virtual shelter and the game character in at least one of the orientations, and after displaying the latency panel on the graphical user interface, the method further includes:
determining a latest latency exposure rate of the game character in at least one of the orientations based on the occlusion area between the virtual shelter and the game character in that orientation, in response to a posture switching operation of the game character in the virtual shelter;
updating at least one orientation exposure rate indicator on the latency panel based on the latest latency exposure rate in at least one orientation.
In some embodiments, the determining the latency exposure rate of the game character in at least one orientation centered on the game character in response to a latency event of the game character in the virtual shelter includes:
acquiring an occlusion area between the virtual shelter and the game character;
and determining the latency exposure rate of the game character in at least one orientation centered on the game character in response to the occlusion area satisfying a preset latency occlusion condition.
In some embodiments, a plurality of the orientation exposure rate indicators on the latency panel form a ring, the ring is divided into a plurality of sub-rings, each sub-ring represents one orientation exposure rate indicator, the relative direction between a sub-ring and the center position of the ring is used to indicate the target orientation, and each sub-ring is configured to indicate the latency exposure rate in its target orientation through its displayed color.
In some embodiments, the latency panel further includes a character indicator located at the center position of the ring, the character indicator being configured to indicate the sum of the latency exposure rates of all orientations through its displayed color.
It can be seen that the computer program can be loaded by the processor to execute any of the game interface display methods provided by the embodiments of the present application, so as to achieve the following technical effect: the perception cost that a player needs to expend, when controlling a game character to hide, to identify whether the game character is concealed is reduced.
For the specific implementation of each of the above operations, reference may be made to the foregoing embodiments, and details are not repeated here.
The computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or the like.
Since the computer program stored in the computer-readable storage medium can execute any of the game interface display methods provided by the embodiments of the present application, it can achieve the beneficial effects achievable by any of those methods; for details, refer to the foregoing embodiments, which are not repeated here.
The game interface display method, apparatus, electronic device, and computer-readable storage medium provided by the embodiments of the present application have been described in detail above with specific examples to explain the principles and implementations of the present application; the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (12)

1. A game interface display method, characterized in that a graphical user interface is provided through a terminal device, content displayed in the graphical user interface includes at least part of a game scene, the game scene includes a game character and a virtual shelter, and the method comprises:
Determining a latency exposure of the game character in at least one direction centered on the game character in response to a latency event of the game character in the virtual shelter;
A latency panel is displayed on the graphical user interface based on the latency exposure in at least one orientation, the latency panel including at least one orientation exposure indicator for indicating a target orientation and the latency exposure in the target orientation.
2. The game interface display method of claim 1, wherein the determining the latency exposure of the game character in at least one direction centered on the game character comprises:
Acquiring latency-related information of the game character when the game character is hiding in the virtual shelter;
and determining the latency exposure rate of the game character in at least one direction centered on the game character based on the latency-related information.
3. The game interface display method of claim 2, wherein the latency-related information includes a skin color of the game character and a shelter color of the virtual shelter, and wherein determining a latency exposure of the game character in at least one direction centered on the game character based on the latency-related information comprises:
Determining a color similarity between the game character and the virtual shelter based on the skin color and the shelter color;
Determining a first exposure rate corresponding to the color similarity based on a preset similarity mapping rule;
The first exposure rate is used as a latent exposure rate of the game character in at least one direction centering on the game character.
4. The game interface display method according to claim 2, wherein the latency-related information includes a latency distance parameter in at least one of the orientations, the latency distance parameter indicating a distance within which the latency exposure of the game character can be perceived, and the determining a latency exposure rate of the game character in at least one of the orientations centered on the game character based on the latency-related information comprises:
determining a second exposure rate corresponding to the latency parameter under each azimuth based on a preset distance mapping rule;
and taking the second exposure rate in each azimuth as the latent exposure rate in each azimuth.
5. The game interface display method according to claim 4, wherein the graphical user interface further includes at least one distance setting control, and the obtaining latency-related information of the game character when the game character is hiding in the virtual shelter comprises:
in response to a setting operation on the distance setting control, taking the set distance range value as the latency distance parameter.
6. The game interface display method according to claim 2, wherein the latency-related information includes an occlusion area between the virtual shelter and the game character in at least one of the orientations, and after displaying the latency panel on the graphical user interface, the method further comprises:
determining a latest latency exposure rate of the game character in at least one of the orientations based on the occlusion area between the virtual shelter and the game character in that orientation, in response to a posture switching operation of the game character in the virtual shelter;
updating at least one of the orientation exposure rate indicators on the latency panel based on the latest latency exposure rate in at least one orientation, respectively.
7. The game interface display method according to claim 1, wherein the determining the latency exposure rate of the game character in at least one orientation centered on the game character in response to the latency event of the game character in the virtual shelter comprises:
acquiring an occlusion area between the virtual shelter and the game character;
and determining the latency exposure rate of the game character in at least one orientation centered on the game character in response to the occlusion area satisfying a preset latency occlusion condition.
8. The game interface display method according to any one of claims 1 to 7, wherein a plurality of the orientation exposure rate indicators on the latency panel form a ring, the ring is divided into a plurality of sub-rings, each sub-ring represents one orientation exposure rate indicator, the relative direction between a sub-ring and the center position of the ring is used to indicate the target orientation, and the sub-ring is configured to indicate the latency exposure rate in the target orientation through its displayed color.
9. The game interface display method according to claim 8, wherein the latency panel further comprises a character indicator located at the center position of the ring, the character indicator being configured to indicate the sum of the latency exposure rates of all orientations through its displayed color.
10. A game interface display apparatus, wherein a graphical user interface is provided by a terminal device, wherein content displayed in the graphical user interface includes at least a portion of a game scene including a game character and a virtual shelter, the apparatus comprising:
an exposure rate determination module for determining a latency exposure rate of the game character in at least one direction centered on the game character in response to a latency event of the game character in the virtual shelter;
A panel display module for displaying a latent panel on the graphical user interface based on the latent exposure rate in at least one orientation, the latent panel comprising at least one orientation exposure rate indicator for indicating a target orientation and a latent exposure rate in the target orientation.
11. An electronic device comprising a processor and a memory, the memory storing a plurality of instructions; the processor loads instructions from the memory to perform the game interface display method according to any one of claims 1 to 9.
12. A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the game interface display method of any one of claims 1 to 9.
CN202410160530.6A 2024-02-04 2024-02-04 Game interface display method, game interface display device, electronic equipment and readable storage medium Pending CN117942563A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410160530.6A CN117942563A (en) 2024-02-04 2024-02-04 Game interface display method, game interface display device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410160530.6A CN117942563A (en) 2024-02-04 2024-02-04 Game interface display method, game interface display device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN117942563A true CN117942563A (en) 2024-04-30

Family

ID=90799952

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410160530.6A Pending CN117942563A (en) 2024-02-04 2024-02-04 Game interface display method, game interface display device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN117942563A (en)

Similar Documents

Publication Publication Date Title
CN111013142B (en) Interactive effect display method and device, computer equipment and storage medium
CN112494955B (en) Skill releasing method, device, terminal and storage medium for virtual object
CN111414080B (en) Method, device and equipment for displaying position of virtual object and storage medium
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN111603770B (en) Virtual environment picture display method, device, equipment and medium
CN111589140A (en) Virtual object control method, device, terminal and storage medium
CN113398590B (en) Sound processing method, device, computer equipment and storage medium
CN113426124B (en) Display control method and device in game, storage medium and computer equipment
CN110496392B (en) Virtual object control method, device, terminal and storage medium
CN112891931A (en) Virtual role selection method, device, equipment and storage medium
CN112704876B (en) Method, device and equipment for selecting virtual object interaction mode and storage medium
US20240131434A1 (en) Method and apparatus for controlling put of virtual resource, computer device, and storage medium
CN113082707A (en) Virtual object prompting method and device, storage medium and computer equipment
CN113577765A (en) User interface display method, device, equipment and storage medium
CN113559495A (en) Method, device, equipment and storage medium for releasing skill of virtual object
CN113599819A (en) Prompt message display method, device, equipment and storage medium
CN111530075A (en) Method, device, equipment and medium for displaying picture of virtual environment
WO2023071808A1 (en) Virtual scene-based graphic display method and apparatus, device, and medium
CN115382201A (en) Game control method and device, computer equipment and storage medium
CN112619131B (en) Method, device and equipment for switching states of virtual props and readable storage medium
CN117942563A (en) Game interface display method, game interface display device, electronic equipment and readable storage medium
CN113521724A (en) Method, device, equipment and storage medium for controlling virtual role
CN111921191A (en) Display method and device of status icon, terminal and storage medium
CN113398564B (en) Virtual character control method, device, storage medium and computer equipment
CN115430151A (en) Game role control method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination