CN116672724A - Game play showing method, device, equipment and storage medium - Google Patents
- Publication number
- CN116672724A CN116672724A CN202210161375.0A CN202210161375A CN116672724A CN 116672724 A CN116672724 A CN 116672724A CN 202210161375 A CN202210161375 A CN 202210161375A CN 116672724 A CN116672724 A CN 116672724A
- Authority
- CN
- China
- Prior art keywords
- target
- time
- interaction
- game
- role
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/85—Providing additional services to players
Abstract
The application provides a game play display method, device, equipment and storage medium, belonging to the technical field of game interfaces. The method comprises: providing a graphical user interface through a terminal device and displaying game global information in the graphical user interface, the game global information comprising a global time axis and at least one time node located on the global time axis; and, in response to a triggering operation on a target node among the time nodes, displaying in the graphical user interface game interaction detail information that corresponds to the target node and is established with each virtual character as a dimension. The application can make the content of the post-game review more comprehensive and intuitive.
Description
Technical Field
The application relates to the technical field of game interfaces, and in particular to a game play display method, device, equipment and storage medium.
Background
In battle royale games, players often need to review a match after it ends in order to determine the detailed causes of the win or loss.
In the prior art, events are usually listed on a time axis in chronological order, so that the player can find, from the axis, the event that occurred at the corresponding position; the time axis may record the time of each event and its specific content.
However, a battle royale game usually involves interactions among players in multiple factions. If only the time of an event and its content are available, the player cannot intuitively and accurately grasp the interactions between different players at that point in time, so the obtained event content is quite limited and, correspondingly, the player obtains little information when reviewing the match.
Disclosure of Invention
The application aims to provide a game play display method, device, equipment and storage medium that make the content of the post-game review more comprehensive and intuitive.
Embodiments of the present application are implemented as follows:
In one aspect of the embodiments of the present application, a game play display method is provided, in which a graphical user interface is provided through a terminal device. The method includes:
displaying game global information in the graphical user interface, the game global information comprising: a global time axis and at least one time node located on the global time axis;
and, in response to a triggering operation on a target node among the time nodes, displaying, in the graphical user interface, game interaction detail information that corresponds to the target node and is established with each virtual character as a dimension.
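As a rough sketch of the claimed flow (all names and structures here are hypothetical illustrations, not taken from the patent), the global time axis can be modelled as a list of time nodes, with detail records keyed per node and per virtual character; triggering a target node then returns the per-character detail information:

```python
from dataclasses import dataclass, field

@dataclass
class TimeNode:
    node_id: str
    start: float  # start of the combat period this node represents
    end: float

@dataclass
class GlobalTimeline:
    nodes: list = field(default_factory=list)
    # detail records keyed by (node_id, character_id)
    details: dict = field(default_factory=dict)

    def trigger(self, node_id: str) -> dict:
        """Return game-interaction details for the target node,
        organised per virtual character (one entry per character)."""
        return {cid: d for (nid, cid), d in self.details.items() if nid == node_id}

timeline = GlobalTimeline(nodes=[TimeNode("A", 0, 30), TimeNode("B", 30, 75)])
timeline.details[("A", "char_1")] = ["char_1 attacked char_2"]
timeline.details[("A", "char_2")] = ["char_2 healed by char_3"]
timeline.details[("B", "char_1")] = ["char_1 eliminated"]

print(timeline.trigger("A"))  # details for node A only, keyed by character
```

The per-character keying mirrors the claim that detail information is "established with each virtual character as a dimension".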
Optionally, the virtual characters include a target character and at least one related character; the game interaction detail information includes: a target-character time axis established with the target character as a dimension, at least one related-character time axis each established with a related character as a dimension, and interaction identification information;
the target-character time axis represents an interaction time period of the target character starting from the target time node;
the related-character time axis represents an interaction time period of the related character starting from the target time node;
the interaction identification information identifies an interaction behavior between at least two of the target character and the at least one related character at a target time point, where the target time point is any time point on the target-character time axis or on a related-character time axis.
Optionally, the game interaction detail information further includes attribute information of the virtual characters; the target-character time axis also represents how the attribute information of the target character changes from the target time node onward; the related-character time axis also represents how the attribute information of the related character changes from the target time node onward.
Optionally, the attribute information includes at least one of: a health (blood volume) attribute, an armor attribute, an attack attribute, a defense attribute and an energy attribute, where the attribute information is represented in the form of a bar chart;
the change of the attribute information includes: when the attribute value increases, the height of the corresponding bar increases; when the attribute value decreases, the height of the corresponding bar decreases.
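A minimal sketch of the claimed bar-chart behaviour (the scale factor below is an assumption for illustration, not from the patent): the bar drawn for an attribute grows when the attribute value increases and shrinks when it decreases.

```python
PIXELS_PER_POINT = 0.5  # assumed rendering scale, not specified by the patent

def bar_height(attribute_value: float) -> float:
    """Bar height tracks the attribute value: a larger value yields a
    taller bar, a smaller value a shorter bar, never below zero."""
    return max(0.0, attribute_value * PIXELS_PER_POINT)

hp_before, hp_after = 100, 64      # e.g. the character took 36 damage
print(bar_height(hp_before))       # 50.0
print(bar_height(hp_after))        # 32.0 (the bar shrinks with the HP)
```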
Optionally, the interaction identification information includes: at least one directional indicator;
the directional indicator indicates that the target character performed an interaction behavior on a related character at the target time point; or that a related character performed an interaction behavior on the target character at the target time point; or that a related character performed an interaction behavior on another related character at the target time point. The interaction behaviors include: attack interactions, assist interactions, healing interactions and communication interactions.
The interaction identification information further includes: a skill identification image;
the skill identification image indicates the type of skill triggered during the interaction behavior.
Optionally, displaying, in the graphical user interface, the game interaction detail information corresponding to the target node and established with each virtual character as a dimension includes:
displaying, in the graphical user interface, the target-character time axis and the at least one related-character time axis corresponding to the target node, each established with a virtual character as a dimension; displaying the directional indicator at the target time point on at least two of these time axes; and displaying the skill identification image within a specified range of the directional indicator.
Optionally, the method further comprises:
displaying, in response to a triggering operation on the skill identification image, the interaction value corresponding to the interaction behavior.
Optionally, the game interaction detail information further includes: a first selection control corresponding to the target-character time axis and/or a second selection control corresponding to a related-character time axis; the method further includes:
hiding or displaying the interaction identification information of the target character in response to a triggering operation on the first selection control;
and hiding or displaying the interaction identification information of the related character in response to a triggering operation on the second selection control.
Optionally, each time node has a corresponding node icon;
the size and/or color of the node icon is positively correlated with the interaction frequency of the game interaction period represented by the time node.
Optionally, each time node further has corresponding node association information;
the node association information includes at least one of: the distance moved by the target character during the time node, the time information of the time node, the number of surviving characters at the time node, and the number of fallen characters at the time node.
Optionally, the method further comprises:
and sending the game global information and/or the game interaction detail information to a target recipient.
In another aspect of the embodiments of the present application, a game play display apparatus is provided, in which a graphical user interface is provided through a terminal device. The apparatus includes: a first display module and a second display module;
the first display module is configured to display game global information in the graphical user interface, the game global information including: a global time axis and at least one time node located on the global time axis;
and the second display module is configured to display, in the graphical user interface and in response to a triggering operation on a target node among the time nodes, game interaction detail information that corresponds to the target node and is established with each virtual character as a dimension.
Optionally, in the apparatus, the virtual characters include a target character and at least one related character; the game interaction detail information includes: a target-character time axis established with the target character as a dimension, at least one related-character time axis each established with a related character as a dimension, and interaction identification information; the target-character time axis represents an interaction time period of the target character starting from the target time node; the related-character time axis represents an interaction time period of the related character starting from the target time node; the interaction identification information identifies an interaction behavior between at least two of the target character and the at least one related character at a target time point, where the target time point is any time point on the target-character time axis or on a related-character time axis.
Optionally, in the apparatus, the game interaction detail information further includes attribute information of the virtual characters; the target-character time axis also represents how the attribute information of the target character changes from the target time node onward; the related-character time axis also represents how the attribute information of the related character changes from the target time node onward.
Optionally, in the apparatus, the interaction identification information includes: at least one directional indicator;
the directional indicator indicates that the target character performed an interaction behavior on a related character at the target time point; or that a related character performed an interaction behavior on the target character at the target time point; or that a related character performed an interaction behavior on another related character at the target time point.
Optionally, in the apparatus, the interaction identification information further includes: a skill identification image; the skill identification image indicates the type of skill triggered during the interaction behavior.
Optionally, in the apparatus, the first display module is specifically configured to display, in the graphical user interface, the target-character time axis and the at least one related-character time axis corresponding to the target node, each established with a virtual character as a dimension; to display the directional indicator at the target time point on at least two of these time axes; and to display the skill identification image within a specified range of the directional indicator.
Optionally, the second display module is further configured to display, in response to a triggering operation on the skill identification image, the interaction value corresponding to the interaction behavior.
Optionally, in the apparatus, the game interaction detail information further includes: a first selection control corresponding to the target-character time axis and/or a second selection control corresponding to a related-character time axis; the second display module is further configured to hide or display the interaction identification information of the target character in response to a triggering operation on the first selection control, and to hide or display the interaction identification information of the related character in response to a triggering operation on the second selection control.
Optionally, in the apparatus, each time node has a corresponding node icon; the size and/or color of the node icon is positively correlated with the interaction frequency of the game interaction period represented by the time node.
Optionally, in the apparatus, each time node further has corresponding node association information; the node association information includes at least one of: the distance moved by the target character during the time node, the time information of the time node, the number of surviving characters at the time node, and the number of fallen characters at the time node.
Optionally, the second display module is further configured to send the game global information and/or the game interaction detail information to a target recipient.
In another aspect of the embodiments of the present application, a computer device is provided, including: a memory and a processor, where the memory stores a computer program executable on the processor, and the processor, when executing the computer program, implements the steps of the game play display method described above.
In another aspect of the embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the game play presentation method described above.
The beneficial effects of the embodiment of the application include:
In the game play display method, device, equipment and storage medium provided by the embodiments of the present application, a graphical user interface can be provided through a terminal device, and game global information can be displayed in the graphical user interface, the game global information including: a global time axis and at least one time node located on the global time axis; and, in response to a triggering operation on a target node among the time nodes, game interaction detail information that corresponds to the target node and is established with each virtual character as a dimension can be displayed in the graphical user interface. By selecting a time node, the review of the game content in the corresponding time period can be reached more quickly; by displaying the game interaction detail information, the content of the review is presented more clearly and intuitively, so that the user can obtain the desired review information more comprehensively, quickly and accurately, and can see more clearly from it the interactions between different virtual characters. This improves both the efficiency of the post-game review and the comprehensiveness and intuitiveness of its content.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should therefore not be regarded as limiting its scope; other related drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is an interface schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 2 is a schematic flow chart of a game play display method according to an embodiment of the present application;
FIG. 3 is a schematic interface diagram of game interaction detail information provided by an embodiment of the present application;
FIG. 4 is a second schematic interface diagram of game interaction detail information provided by an embodiment of the present application;
FIG. 5 is a third schematic interface diagram of game interaction detail information provided by an embodiment of the present application;
FIG. 6 is a schematic interface diagram of game global information provided by an embodiment of the present application;
FIG. 7 is a second schematic interface diagram of game global information provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a game play display device according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present application, it should be noted that the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
The game play display method in an embodiment of the present application may run on a terminal device or on a server. The terminal device may be a local terminal device. When the method runs on a server, it may be implemented and executed based on a cloud interaction system, which includes the server and a client device.
In an alternative embodiment, various cloud applications may run under the cloud interaction system, for example cloud gaming. Taking cloud gaming as an example, cloud gaming refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the game play display method are completed on a cloud game server, while the client device receives and sends data and presents the game picture. For example, the client device may be a display device close to the user side with a data transmission function, such as a mobile terminal, a television, a computer or a palmtop computer, while the terminal device that processes the game information is the cloud game server. When playing, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the instructions, encodes and compresses data such as the game pictures, and returns them to the client device through the network; finally, the client device decodes the data and outputs the game pictures.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, the interface may be rendered on the display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
In a possible implementation, an embodiment of the present application provides a game play display method in which a graphical user interface is provided through a first terminal device, where the first terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
The terminal device provided in the embodiment of the present application and the graphical user interface displayed on the terminal device are specifically explained below.
Fig. 1 is an interface schematic diagram of a graphical user interface provided in an embodiment of the present application, and referring to fig. 1, the interface may be displayed on a terminal device.
Alternatively, the terminal device may be specifically any type of computer device such as a computer, a tablet computer, a mobile phone, a game console, a dedicated electronic device, and the like.
Application software may be displayed on the terminal device, for example game software, chat software or office software. Taking game software as an example, the above graphical user interface may be displayed on the terminal device. The graphical user interface may be an interface composed of multiple elements such as icons, images, models, text and videos; interaction controls may be arranged in the interface, and a user (for example, a player) can interact with the graphical user interface through these controls.
In the game play display method provided by the embodiments of the present application, various types of in-game information, such as the game global information and the game interaction detail information, can be displayed in the graphical user interface. A player can trigger the interface by clicking (for example, single-click, double-click, long-press or drag) a preset related game control, so that the related information the player requires, such as the game global information and the game interaction detail information, is displayed in the graphical user interface.
In the embodiments of the present application, the state in which the game global information is displayed in the graphical user interface is entered when the user triggers the graphical user interface in its initial state; the state in which the corresponding game interaction detail information is displayed is entered when the user triggers a related node in the game global information.
Alternatively, the above game interaction detail information, game global information and the like may be moved within the graphical user interface through the user's drag operations in the game.
The following explains a specific implementation of the game play display method provided in the embodiments of the present application, taking the terminal device as the execution subject.
Fig. 2 is a flow chart of a game play showing method provided by an embodiment of the application, please refer to fig. 2, the method includes:
s210: game global information is displayed in a graphical user interface.
Wherein the game global information includes: a global time axis and at least one time node located on the global time axis. The global time axis represents the time progress of the game, and each time node represents a combat time period.
Alternatively, the game global information may be displayed in the graphical user interface when the player triggers the interface. For example, in a game interface, a player can review the match just played by clicking a "combat details" button preset in the graphical user interface; the terminal device displays the game global information in the graphical user interface in response to the player's click.
It should be noted that the above triggering manner is only an example; in practice, the triggering manner may be set according to factors such as the player's actual requirements, for example voice triggering or triggering by a preset condition, and is not limited here.
Alternatively, referring also to FIG. 1, a global time axis may be displayed in the game global information. The global time axis represents the progress of time from the start of the game in the form of a time axis, for example starting at 0 minutes 0 seconds. At least one time node may be arranged on the global time axis; different time nodes represent combat at different times, and each time node represents one combat time period.
The combat time period may be the period from the start of a fight to its end, and the fight may be combat occurring between the virtual characters operated by multiple players, for example gun fights or melee-weapon fights in the game.
S220: in response to a triggering operation on a target node among the time nodes, displaying, in the graphical user interface, game interaction detail information that corresponds to the target node and is established with each virtual character as a dimension.
The game interaction detail information represents the interaction information of the target character in the interaction time period corresponding to the target node, for example in-game interaction information related to combat damage, healing, skill releases, information exchanges and assists.
Optionally, while the game global information is displayed in the graphical user interface, the player's trigger on a target node among the time nodes can be responded to, so that the corresponding game interaction detail information is displayed.
The target node may be a node selected by the player, and may specifically be any one of the plurality of time nodes, which is not limited here. The triggering operation may specifically be a click on the target node by the player.
Optionally, each time node may correspond to one piece of game interaction detail information. For example, if the global time axis includes two time nodes A and B: when node A is the target node, the triggering operation displays, in the graphical user interface, the game interaction detail information corresponding to node A; when node B is the target node, the triggering operation displays the game interaction detail information corresponding to node B.
The game interaction detail information can display the interaction information of the target character in the target interaction time period corresponding to the target node. The target character may be the virtual character operated by the player. Taking combat interaction as an example, the combat interaction information may describe how information related to the target character changed during combat. For example, character A dealt X points of damage to character B, so that character B's blood volume or armor changed (for example, the blood volume decreased); or character C healed character B, so that character B's blood volume changed (for example, the blood volume increased).
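A toy sketch of how such combat-interaction records change a character's blood volume and armor (the armor-absorbs-damage-first rule and all field names are assumptions for illustration, not claims of the patent):

```python
def apply_event(state, event):
    """state: {character_id: {"hp": int, "armor": int}};
    event: (kind, source, target, amount) with kind "damage" or "heal"."""
    kind, _source, target, amount = event
    tgt = state[target]
    if kind == "damage":
        absorbed = min(tgt["armor"], amount)  # assumed: armor soaks damage first
        tgt["armor"] -= absorbed
        tgt["hp"] -= amount - absorbed
    elif kind == "heal":
        tgt["hp"] += amount
    return state

state = {"B": {"hp": 100, "armor": 20}}
apply_event(state, ("damage", "A", "B", 30))  # A injures B for 30 points
apply_event(state, ("heal", "C", "B", 15))    # C heals B for 15 points
print(state["B"])  # {'hp': 105, 'armor': 0}
```

Replaying such events over a node's interaction time period yields exactly the per-character attribute changes the detail timelines visualise.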
It should be noted that, at the same target node, different virtual characters may be used as dimensions to correspond to different game interaction details.
In the game play display method provided by the embodiment of the application, a graphical user interface can be provided through the terminal device, and game global information can be displayed in the graphical user interface, wherein the game global information includes: a global time axis and at least one time node located on the global time axis; in response to a trigger operation on a target node among the time nodes, game interaction detail information corresponding to the target node and respectively established with each virtual character as a dimension is displayed in the graphical user interface. By selecting a time node, the replay of the game content in the corresponding time period can be reached more quickly; by displaying the game interaction detail information, the replayed content can be presented more clearly and intuitively, so that the user can acquire the required replay information more comprehensively, quickly and accurately, and can learn the interaction situations among different virtual characters more clearly from the replay information; further, the efficiency of the replay can be improved, and the comprehensiveness and intuitiveness of the replayed content are improved.
The display interface of the game interaction detailed information provided in the embodiment of the application and the specific content displayed in the interface are specifically explained below.
FIG. 3 is a schematic diagram of an interface of game interaction details provided in an embodiment of the present application, referring to FIG. 3, a virtual character includes a target character and at least one related character; the game interaction details include: a target role timeline 310 established in a target role dimension, at least one related role timeline 320 established in a related role dimension, and interaction identification information 330.
Optionally, the target role timeline 310 is used to represent interaction time periods of the target roles with the target time node as a starting time; the related character time axis 320 is used for representing an interaction time period of the related character taking the target time node as a starting time; the interaction identification information 330 is used to identify the interaction behavior between the target character and at least two virtual characters in at least one related character at a target time point, where the target time point is the target character time axis or any time point on the related character time axis.
The related characters may be virtual characters corresponding to one or more other players/NPCs (non-player characters) actively selected by the user; that is, the user may select, according to requirements, which related character time axes are to be displayed, and the related character time axes corresponding to different related characters are different. The interaction behavior may include an attack interaction, a treatment interaction, an exchange interaction, an assist interaction, and the like.
The attack interaction may specifically be an injury caused by one character (the target character or a related character) to another character (a related character or the target character) by using an item or releasing a skill; the treatment interaction may be blood volume restored by one character to itself or to another character by using an item or releasing a skill; the exchange interaction may be a process in which one character converts a certain attribute value of itself into a certain attribute value of another character, where the attribute values may be the same attribute value or different attribute values, for example: a certain character reduces its attack attribute so that another character increases its defense attribute; the assist interaction may be a process in which, when character A attacks character B, character C assists character A in causing injury to character B.
It should be noted that, the target character may interact with any one or more other characters, and accordingly, multiple related characters may also interact with each other.
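The interaction types listed above (attack, treatment, exchange, assist) can be modeled roughly as follows; this is a hedged sketch with assumed attribute names (`blood`, `attack`, `defense`), not the application's actual data model:

```python
from enum import Enum

class InteractionType(Enum):
    ATTACK = "attack"      # one character causes injury to another
    TREAT = "treat"        # restores blood volume of self or another character
    EXCHANGE = "exchange"  # converts one character's attribute into another's
    ASSIST = "assist"      # character C helps character A injure character B

def apply_interaction(kind, source, target, amount):
    """Mutate the characters' attribute dicts according to the interaction type."""
    if kind is InteractionType.ATTACK:
        target["blood"] -= amount          # injury reduces blood volume
    elif kind is InteractionType.TREAT:
        target["blood"] += amount          # treatment restores blood volume
    elif kind is InteractionType.EXCHANGE:
        source["attack"] -= amount         # e.g. trade attack for the other's defense
        target["defense"] += amount
    return target
```

An assist would be recorded alongside the attack it supports rather than changing attributes directly, which is why it has no branch here.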
Optionally, the game interaction detail information further includes attribute information of the virtual character; the target role time axis 310 is further used for representing the change situation of attribute information of the target role taking the target time node as the starting moment; the related character timeline 320 is also used to represent the change of attribute information of the related character with the target time node as the starting time.
The attribute information may be, for example: blood volume attributes, armor attributes, attack attributes, defense attributes, and energy attributes.
Alternatively, in connection with fig. 3, the attribute information may be represented in the form of a histogram.
The change condition of the attribute information may include: when the attribute information increases, increasing the height of the histogram; when the attribute information decreases, the height of the histogram is reduced.
For example: when the attribute information is the attack attribute, if the attack of character A increases, the height of the corresponding histogram may be increased; accordingly, if the attack of character A decreases, the height of the corresponding histogram may be reduced.
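The histogram behavior just described — the bar rising and falling in step with the attribute value — can be expressed as a simple mapping; `base_height` and `scale` are assumed rendering parameters for illustration:

```python
def histogram_height(base_height: float, old_value: float, new_value: float,
                     scale: float = 1.0) -> float:
    """Raise the bar when the attribute increases, lower it when it decreases."""
    return base_height + (new_value - old_value) * scale
```

With `old_value=100` and `new_value=120` (attack increased by 20), a bar of base height 50 would grow to 70.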
Alternatively, when displaying the change of attribute information of the same character, attributes of the same type (such as the blood volume attribute and the armor attribute) may be represented on the same time axis; for attributes of different types, for example the attack attribute and the defense attribute, a plurality of time axes may be set respectively, each representing the change of one attribute value. Fig. 3 takes attributes of the same type as an example.
Optionally, taking the attribute information as the blood volume attribute and the armor attribute as an example, on the target character time axis 310 the change information of the blood volume attribute and the change information of the armor attribute may be distinguished by different identifiers. Taking fig. 3 as an example, the change information with a diagonal-line identifier may be the change information of the armor attribute, and the change information without a diagonal-line identifier may be the change information of the blood volume attribute. The time interval represented by the target character time axis may specifically be the duration of the time node corresponding to the game interaction detail information: the start time of the time node may be the start time of the target character time axis, and the end time of the time node may be the end time of the target character time axis.
For example: when character A is the target character, if the armor and blood volume attributes of character A change because character B injures character A, the change can be displayed on the target character time axis at the corresponding time.
Optionally, the related character timeline 320 is used to represent time-varying information of attributes of related characters in the interaction time period, where the related characters include characters that fight the interaction with the target character.
Alternatively, the combat interaction may be specifically a case where the properties of the characters (the blood volume properties, the armor properties, and the like) are changed by causing injury or performing treatment or the like between the characters.
For example: character A is the target character, character B injures character A, character C treats character A, and character D injures character B. Character B and character C thus perform combat interaction with character A and can serve as related characters; character D only has a combat interaction with character B and has no combat interaction with character A, so character D cannot serve as a related character.
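The selection rule in the example above (B and C qualify as related characters, D does not) can be sketched as a filter over (source, victim) interaction records; the tuple representation is an assumption made for illustration:

```python
def related_characters(target: str, interactions: list) -> set:
    """Characters that injured/treated the target, or were injured/treated by it."""
    related = set()
    for source, victim in interactions:
        if source == target:
            related.add(victim)
        elif victim == target:
            related.add(source)
    return related
```

Applying it to the example — B injures A, C treats A, D injures B — yields {B, C}, excluding D exactly as the paragraph states.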
Alternatively, the content displayed by the related character timeline 320 may be similar to the content displayed by the target character timeline 310, and will not be explained again.
Optionally, the interaction identification information 330 is used to identify the target character and the related characters, or combat interaction information between at least two related characters at a target time point, where the target time point is the target character time axis or any time point on the related character time axis.
Taking the interaction between the target character and a related character as an example (interaction between at least two related characters is the same), for example: character A is the target character and character B is the related character. At time point X, character B injures character A, so that the blood volume and armor of character A change; this can serve as a combat interaction behavior, corresponding combat interaction information can be generated based on the combat interaction behavior, and the combat interaction information is displayed in the interface in the form of interaction identification information, specifically showing that character B injures character A at time point X. The time point X is the target time point.
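An item of interaction identification information such as the one generated in this example can be represented as a small record linking two character timelines at a target time point; the field names here are hypothetical, chosen only to mirror the description:

```python
def make_interaction_mark(source: str, target: str, time_point: float,
                          kind: str) -> dict:
    """One identifier drawn between two character timelines at a target time point."""
    return {"from": source, "to": target, "t": time_point, "kind": kind}

# Character B injures character A at time point X = 12.5 (sample value).
mark = make_interaction_mark("B", "A", 12.5, "attack")
```

The renderer would place this mark at `t` on both the source's and the target's time axes and draw the connection between them.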
In the game play display method provided by the embodiment of the application, the game interaction detail information can be represented by a target character time axis, at least one related character time axis, and interaction identification information; the attribute change of each virtual character at different time points can be clearly displayed through the plurality of character time axes, and the game interaction situations among different virtual characters can be displayed more intuitively and rapidly through the interaction identification information, thereby improving the intuitiveness and rapidity of the replay.
Optionally, the interaction identification information includes: at least one directivity identification; the directivity mark is used for indicating the target role to execute interaction behavior on the related roles at the target time point; or, instruct the related role to execute the interactive action to the target role at the target time point; or, the related character is instructed to perform an interactive action on another related character at the target point in time.
Optionally, the interaction identification information may include a directivity identifier, and the directivity identifier may point away from the source of the injury or treatment, for example: when character A injures character B, the directivity identifier may point from character A to character B; when character C treats character B, the directivity identifier may point from character C to character B.
During a combat interaction, there may be multiple interactions between multiple virtual characters, for example: within the same time node, at a first target time point character A injures character B; at a second target time point character A injures character B again; and at a third target time point character C treats character B. All three of the above cases can be displayed by means of directivity identifiers at the corresponding target time points.
Optionally, if the interaction identification information is identification information causing death of the target character or the related character, the directivity identification is displayed in a preset display mode.
Alternatively, a character death may specifically be caused when the blood volume of the character reaches 0, for example: character A injures character B, and the injury reduces the blood volume of character B to 0, so that character A causes the death of character B; if character B is a related character, a related character death is caused; if character B is the target character, the death of the target character is caused.
Optionally, when the death of the character is caused, a preset display mode may be adopted to display the directivity identifier, for example: the directivity mark may be displayed in a preset color, size, thickness, shape, brightness, etc. to distinguish the directivity mark from other directivity marks.
Taking the example shown in fig. 3, at the last time point the related character causes the death of the target character, and the directivity identifier may be displayed in a preset display manner; fig. 3 shows the directivity identifier thickened, and in an actual display process it may be displayed according to actual requirements, which is not limited herein.
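The preset display mode for a killing blow might be selected as below; the concrete width and color values are illustrative assumptions, not values taken from the application:

```python
def mark_style(is_lethal: bool) -> dict:
    """Preset display mode: an identifier that causes a character death is drawn
    thicker (and in a distinct color) to stand out from ordinary identifiers."""
    if is_lethal:
        return {"width": 4, "color": "red"}   # assumed preset values
    return {"width": 1, "color": "gray"}
```

Any of the listed distinguishing properties (color, size, thickness, shape, brightness) could be varied in the same way.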
Optionally, the interaction identification information includes: a skill identification image; the skill identification image is used to indicate the type of skill triggered during the interactive behavior.
Optionally, in addition to the directivity identifier, a skill identification image may be set in the interaction identification information, where the skill identification image may be a preset icon, and each icon corresponds to a type of injury or treatment caused by a character skill or a related skill such as an item skill.
For example: when character A injures character B through a certain skill, the related icon of that skill can be displayed as the skill identification image; when character C injures character B through a certain item, the related icon of that item can be displayed as the skill identification image.
Optionally, displaying, in the graphical user interface, game interaction details corresponding to the target nodes, each of which is established with each virtual character as a dimension, including: displaying a target role time axis and at least one relevant role time axis which are respectively established by taking each virtual role as a dimension and correspond to the target node in a graphical user interface; displaying the directivity mark at the target time points of at least two time axes in the target role time axis and at least one relevant role time axis; and displaying the skill mark image on the directivity mark.
Alternatively, when the interaction identification information is displayed, the skill identification image may be displayed within a specified range of the directivity identifier, for example: the skill identification image may be displayed at the center position of the directivity identifier.
The display interface of the game interaction detail information provided in the embodiment of the present application, together with another specific content displayed in the interface, will be specifically explained below.
Fig. 4 is another interface schematic diagram of game interaction details provided in an embodiment of the present application, please refer to fig. 4, and the method further includes: and responding to the triggering operation of the skill identification image, and displaying the interaction numerical value corresponding to the interaction behavior.
Alternatively, the interaction value may specifically be a value of an injury or a value of a treatment, etc.
Optionally, if the terminal device is a computer, the triggering operation may be an operation of placing the mouse on the identification image by the player, or may also be a related operation of the keyboard by the player; if the terminal device is a mobile phone, a tablet computer or the like, the triggering operation may be an operation of long-time pressing of the identification image by the player.
Alternatively, the displayed value of the injury caused or of the treatment may be a specific numerical value, for example: character A causes 100 points of injury to character B, and the specific numerical value of the injury, namely the 100 points, can be displayed through the trigger operation; the treatment values are similar and are not limited herein. In order to distinguish different types of values, they may be differentiated by different fonts, colors, sizes, and the like.
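The value displayed on triggering the skill identification image could be formatted as in this sketch; the exact wording and the type distinction are assumptions made for illustration:

```python
def format_interaction_value(kind: str, amount: int) -> str:
    """Text shown when the player hovers over or long-presses
    the skill identification image."""
    verb = "injury" if kind == "attack" else "treatment"
    return f"{amount} points of {verb}"
```

For the example in the text, character A's 100-point injury to character B would surface as "100 points of injury"; a distinct font or color could then be applied per `kind`.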
The display interface of game interaction details provided in the embodiment of the present application and still another specific content displayed in the interface will be specifically explained below.
Fig. 5 is a schematic diagram of another interface of game interaction details provided in an embodiment of the present application, referring to fig. 5, the game interaction details further include: a first selection control corresponding to the target character time axis and/or a second selection control corresponding to the relevant character time axis; the method further comprises the steps of: hiding or displaying interaction identification information of the target role in response to triggering operation of the first selection control; and hiding or displaying the interaction identification information of the related roles in response to the triggering operation of the second selection control.
Alternatively, as shown in fig. 5, the first selection control and the second selection control may be triggered in the form of options, for example: after the player clicks the first selection control or the second selection control, the display state can be adjusted. For example, the default may be the display state; after the player clicks the first selection control or the second selection control, the display state can be changed to the hidden state, or changed from the hidden state back to the display state. In the display state, all interaction identification information on the time axis corresponding to the selection control can be displayed; in the hidden state, all interaction identification information on the time axis corresponding to the selection control is not displayed.
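The default-visible, click-to-toggle behavior of the selection controls can be captured by a small state holder; a minimal sketch, assuming a single boolean display state per timeline:

```python
class TimelineControl:
    """Selection control toggling all interaction identification information
    shown on one character time axis."""
    def __init__(self):
        self.visible = True   # default is the display state

    def on_click(self) -> bool:
        """Each click flips between the display state and the hidden state."""
        self.visible = not self.visible
        return self.visible
```

One such control instance would back the first selection control (target character timeline) and one each of the second selection controls (related character timelines).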
Illustratively, a hidden identifier is provided in FIG. 5, which may be highlighted when in a hidden state (i.e., the check number shown in the figure); the hidden identifier may not be highlighted when in the display state.
It should be noted that the above is only one possible example, and other types of identifiers may be set in the actual application process; the adjustment setting whether to hide or not can also be performed for each interaction identification information.
As another alternative example, a target character browsing identifier may be set, and after the player triggers the identifier, only the interaction identifier information related to the target character may be displayed, the interaction identifier information unrelated to the target character may be hidden, and so on.
The game global information provided in the embodiment of the present application and the specific contents of the relevant display contents in the game global information are specifically explained below.
Fig. 6 is an interface schematic diagram of game global information provided by the embodiment of the present application, referring to fig. 6, each time node includes a corresponding node icon 610; the size and/or color of the node icon 610 is positively correlated to the frequency of interactions of the game interaction time period represented by the time node.
Alternatively, the node icon 610 may be represented in a circular form, for example: the length of the entire node icon on the time axis may be set to the duration of the game interaction time period, the start position of the node icon on the time axis being the start time of the game interaction time period and the end position of the node icon on the time axis being the end time of the game interaction time period.
Optionally, the higher the interaction frequency of the combat time period, the larger the size of the node icon may be set (for example, the longer the game interaction duration and the higher the interaction frequency, the larger the size of the corresponding node icon).
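The positive correlation between node icon size and interaction frequency might be computed as below; the linear scaling is an illustrative choice and is not specified by the application:

```python
def node_icon_size(base: float, interaction_count: int, duration: float) -> float:
    """Icon grows with interaction frequency (interactions per unit time)."""
    frequency = interaction_count / duration if duration else 0.0
    return base * (1.0 + frequency)
```

A busier combat period (more interactions over the same duration) thus yields a larger icon; color intensity could be scaled from the same frequency value.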
In the game play display method provided by the embodiment of the application, by positively correlating the size of the node icon with the interaction frequency of the combat time period represented by the time node, the replay result can be shown more intuitively and clearly, and the comprehensiveness and intuitiveness of the replay are improved.
The game global information provided in the embodiment of the present application and another specific content of the relevant display content in the game global information are specifically explained below.
FIG. 7 is another interface schematic diagram of game global information provided by an embodiment of the present application. Referring to FIG. 7, each time node further includes corresponding node association information; the node association information includes at least one of: the moving distance of the target character on the time node, the time information of the time node, the number of surviving characters on the time node, and the number of characters killed in battle on the time node.
Alternatively, the node association information may be displayed in text form at the node icon or at the node icon side.
The moving distance of the target character on the time node may specifically be the virtual distance moved by the target character in the virtual scene during the time node, and the virtual distance may be a distance calculated according to the scale of the virtual scene; the time information of the time node may be a time period, or may be the start time of the time period, and may be a duration relative to the game start time; the number of surviving characters on the time node may specifically be the number of surviving characters among all characters playing the game; the number of characters killed in battle on the time node may specifically be the number of characters killed among the plurality of characters performing combat interaction with the target character in the time period, and may specifically be displayed as a number.
For example: in fig. 7, taking the first time node as an example, the moving distance of the target character on the time node may be 800 meters, the time information of the time node may be 2:50, the number of surviving characters on the time node may be 65, and the number of characters killed in battle on the time node may be 1.
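The node association information from this example (800 meters, 2:50, 65 surviving, 1 killed) can be assembled into a per-node annotation record; the keys and formatting here are hypothetical:

```python
def node_association_info(move_distance_m, start_time, alive_count, killed_count):
    """Assemble the annotation displayed at or beside one node icon."""
    return {
        "distance": f"{move_distance_m} m",  # movement in the virtual scene
        "time": start_time,                   # time information of the node
        "alive": alive_count,                 # surviving characters
        "killed": killed_count,               # characters killed in battle
    }

# Values from the first time node of the fig. 7 example.
info = node_association_info(800, "2:50", 65, 1)
```

Any subset of these fields could be rendered, matching the "at least one of" wording above; a target-character death flag could be added as a further field.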
In addition, if the target character dies during a certain combat interaction, this may also be displayed as one item of the node association information, and the specific display form is not limited to the above example.
In the game play display method provided by the embodiment of the application, a plurality of pieces of node association information can be displayed; by displaying multiple types of node association information, the replay can be presented more comprehensively, thereby improving the comprehensiveness and intuitiveness of the replay.
Optionally, the method further comprises: and sending the game global information and/or the game interaction detail information to the target object to be sent.
Optionally, a forwarding control can be additionally arranged in the interface, and the player can realize sending operation by triggering the forwarding control; and the terminal equipment responds to the sending operation and can send the game global information and/or the game interaction detail information to a target to be sent, wherein the target to be sent can be a friend of a target role in a game interface or interfaces (such as chat software and the like) in other related software in the terminal equipment.
The following describes the device, storage medium, and the like corresponding to the game play display method provided by the present application; for their specific implementation processes and technical effects, reference is made to the above, and details are not repeated below.
Fig. 8 is a schematic structural diagram of a game play display device according to an embodiment of the present application, please refer to fig. 8, wherein the device includes: a first display module 810, a second display module 820;
A first display module 810 for displaying game global information in a graphical user interface, the game global information comprising: a global time axis and at least one time node located on the global time axis;
and a second display module 820 for displaying, in response to a trigger operation of a target node among the time nodes, game interaction details corresponding to the target node, which are respectively established in the dimensions of each virtual character, in the graphical user interface.
Optionally, in the device, the virtual character includes a target character and at least one related character; the game interaction details include: the method comprises the steps of establishing a target role time axis by taking a target role as a dimension, at least one relevant role time axis by taking a relevant role as a dimension, and interaction identification information; the target role time axis is used for representing an interaction time period of the target role taking the target time node as a starting moment; the related role time axis is used for representing the interaction time period of the related roles taking the target time node as the starting moment; the interaction identification information is used for identifying interaction behaviors between at least two virtual roles in the target role and at least one relevant role at a target time point, wherein the target time point is a target role time axis or any time point on the relevant role time axis.
Optionally, in the device, the game interaction detail information further includes attribute information of the virtual character; the target role time axis is also used for representing the change condition of attribute information of the target role taking the target time node as the starting moment; the relevant character time axis is also used for representing the change situation of attribute information of the relevant character taking the target time node as the starting moment.
Optionally, in the apparatus, the interaction identifying information includes: at least one directivity identification;
the directivity mark is used for indicating the target role to execute interaction behavior on the related roles at the target time point; or, instruct the related role to execute the interactive action to the target role at the target time point; or, the related character is instructed to perform an interactive action on another related character at the target point in time.
Optionally, in the apparatus, the interaction identification information further includes: a skill identification image; the skill identification image is used to indicate the type of skill triggered during the interactive behavior.
Optionally, in the apparatus, the first display module 810 is specifically configured to display, in the graphical user interface, a target role timeline corresponding to the target node and at least one related role timeline respectively established with each virtual role as a dimension; displaying the directivity mark at the target time points of at least two time axes in the target role time axis and at least one relevant role time axis; and displaying the skill mark image within a specified range of the directivity mark.
Optionally, the second display module 820 is further configured to display an interaction numerical value corresponding to the interaction behavior in response to a triggering operation on the skill identification image.
Optionally, in the apparatus, the game interaction details further include: a first selection control corresponding to the target character time axis and/or a second selection control corresponding to the relevant character time axis; the second display module 820 is further configured to conceal or display interaction identification information of the target character in response to a triggering operation of the first selection control; and hiding or displaying the interaction identification information of the related roles in response to the triggering operation of the second selection control.
Optionally, in the device, each time node includes a corresponding node icon; the size and/or color of the node icon is positively correlated to the frequency of interaction for the game interaction period represented by the time node.
Optionally, in the device, each time node further includes corresponding node association information; the node association information includes at least one of: the moving distance of the target character on the time node, the time information of the time node, the number of surviving characters on the time node, and the number of characters killed in battle on the time node.
Optionally, the second display module 820 is further configured to send the game global information and/or the game interaction details information to the target object to be sent.
The foregoing apparatus is used for executing the method provided in the foregoing embodiment, and its implementation principle and technical effects are similar, and are not described herein again.
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (Application Specific Integrated Circuit, abbreviated as ASICs), or one or more microprocessors, or one or more field programmable gate arrays (Field Programmable Gate Array, abbreviated as FPGAs), etc. For another example, when a module above is implemented in the form of a processing element scheduler code, the processing element may be a general-purpose processor, such as a central processing unit (Central Processing Unit, CPU) or other processor that may invoke the program code. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
Fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present application, referring to fig. 9, the computer device includes: memory 910 and processor 920, wherein the memory 910 stores a computer program executable on the processor 920, and the processor 920 realizes the steps of the game play method when executing the computer program.
In another aspect of the embodiments of the present application, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the game play presentation method described above.
Alternatively, the above-mentioned computer device may be specifically the aforementioned terminal device.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units may be stored in a computer-readable storage medium. Such a software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform some of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is merely a specific implementation of the present application, and the protection scope of the present application is not limited thereto. Any change or substitution that can readily be conceived by those skilled in the art within the technical scope disclosed by the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application is subject to the protection scope of the claims.
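By way of illustration only, and not as part of the disclosed implementation, the core displaying steps of the method — a global time axis carrying time nodes, and per-character interaction details produced in response to a triggering operation on a target node — could be sketched as follows. All names, the data model, and the sizing constants are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TimeNode:
    time: float               # position on the global time axis, in seconds
    interaction_count: int    # interactions in the period this node represents

@dataclass
class Interaction:
    time: float    # absolute game time of the behavior
    source: str    # character performing the behavior
    target: str    # character the behavior is directed at
    kind: str      # e.g. "attack", "healing", "communication"

def node_icon_size(node: TimeNode, base: int = 16, step: int = 4, cap: int = 48) -> int:
    """Icon size positively correlated with interaction frequency (cf. claim 10),
    clamped to a maximum so busy nodes do not dominate the axis."""
    return min(base + step * node.interaction_count, cap)

def details_for_node(node: TimeNode, interactions: list[Interaction],
                     window: float) -> dict[str, list[tuple[float, str, str]]]:
    """On a trigger for a target node, collect the interactions falling in
    [node.time, node.time + window), grouped by acting character and re-based
    so each character's timeline starts at the target node (cf. claims 1-2)."""
    details: dict[str, list[tuple[float, str, str]]] = {}
    for it in interactions:
        if node.time <= it.time < node.time + window:
            details.setdefault(it.source, []).append((it.time - node.time, it.kind, it.target))
    return details
```

A renderer would then draw one timeline per key of the returned dictionary and place a directivity identifier at each recorded offset.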
Claims (15)
1. A game play presentation method, wherein a graphical user interface is provided through a terminal device, the method comprising:
displaying game global information in the graphical user interface, the game global information comprising: a global time axis and at least one time node located on the global time axis;
in response to a triggering operation on a target node among the time nodes, displaying, in the graphical user interface, game interaction detail information corresponding to the target node and established with each virtual character as a dimension.
2. The game play presentation method of claim 1, wherein the virtual characters comprise a target character and at least one related character, and the game interaction detail information comprises: a target character time axis established with the target character as a dimension, at least one related character time axis established with a related character as a dimension, and interaction identification information;
the target character time axis is used for representing an interaction time period of the target character, with the target node as the starting moment;
the related character time axis is used for representing an interaction time period of the related character, with the target node as the starting moment;
the interaction identification information is used for identifying an interaction behavior, at a target time point, between at least two virtual characters among the target character and the at least one related character, wherein the target time point is any time point on the target character time axis or the related character time axis.
3. The game play presentation method of claim 2, wherein the game interaction detail information further comprises attribute information of the virtual characters;
the target character time axis is further used for representing a change in the attribute information of the target character, with the target node as the starting moment;
the related character time axis is further used for representing a change in the attribute information of the related character, with the target node as the starting moment.
4. The game play presentation method of claim 3, wherein the attribute information comprises at least one of: a blood volume attribute, an armor attribute, an attack attribute, a defense attribute, and an energy attribute, and the attribute information is presented in the form of a histogram;
the change in the attribute information comprises: when the attribute information increases, increasing the height of the histogram; when the attribute information decreases, decreasing the height of the histogram.
5. The game play presentation method of claim 2, wherein the interaction identification information comprises at least one directivity identifier;
the directivity identifier is used for indicating that the target character performs an interaction behavior on a related character at the target time point; or that a related character performs an interaction behavior on the target character at the target time point; or that one related character performs an interaction behavior on another related character at the target time point, wherein the interaction behavior comprises: an attack interaction, a facilitation interaction, a healing interaction, and a communication interaction.
6. The game play presentation method of claim 5, wherein the interaction identification information further comprises a skill identification image;
the skill identification image is used for indicating the type of skill triggered during the interaction behavior.
7. The game play presentation method of claim 6, wherein displaying, in the graphical user interface, the game interaction detail information corresponding to the target node and established with each virtual character as a dimension comprises:
displaying, in the graphical user interface, the target character time axis and the at least one related character time axis corresponding to the target node, each established with a virtual character as a dimension; displaying the directivity identifier at the target time point on at least two of the target character time axis and the at least one related character time axis; and displaying the skill identification image within a specified range of the directivity identifier.
8. The game play presentation method of claim 6, wherein the method further comprises:
in response to a triggering operation on the skill identification image, displaying an interaction value corresponding to the interaction behavior.
9. The game play presentation method of any one of claims 2-8, wherein the game interaction detail information further comprises: a first selection control corresponding to the target character time axis and/or a second selection control corresponding to the related character time axis; the method further comprises:
in response to a triggering operation on the first selection control, hiding or displaying the interaction identification information of the target character; and
in response to a triggering operation on the second selection control, hiding or displaying the interaction identification information of the related character.
10. The game play presentation method of claim 1, wherein each of the time nodes comprises a corresponding node icon;
the size and/or color of the node icon is positively correlated with the interaction frequency of the game interaction time period represented by the time node.
11. The game play presentation method of claim 10, wherein each of the time nodes further comprises corresponding node association information;
the node association information comprises at least one of: a moving distance of the target character at the time node, time information of the time node, the number of surviving characters at the time node, and the number of fallen characters at the time node.
12. The game play presentation method of claim 1, wherein the method further comprises:
sending the game global information and/or the game interaction detail information to a target object.
13. A game play presentation apparatus, wherein a graphical user interface is provided through a terminal device, the apparatus comprising: a first display module and a second display module;
the first display module is configured to display game global information in the graphical user interface, wherein the game global information comprises: a global time axis and at least one time node located on the global time axis;
the second display module is configured to, in response to a triggering operation on a target node among the time nodes, display, in the graphical user interface, game interaction detail information corresponding to the target node and established with each virtual character as a dimension.
14. A computer device, comprising a memory and a processor, wherein the memory stores a computer program executable on the processor, and the processor, when executing the computer program, implements the steps of the game play presentation method of any one of claims 1 to 12.
15. A computer-readable storage medium, wherein the storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the game play presentation method of any one of claims 1 to 12.
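As a purely illustrative sketch of the attribute histogram of claims 3 and 4 (the function name, pixel scaling, and clamping below are assumptions, not taken from the disclosure), the bar heights can simply track the attribute's sampled values, so the bar rises when the attribute rises and falls when it falls:

```python
def histogram_heights(samples: list[int], max_value: int, max_height_px: int = 100) -> list[int]:
    """Map sampled attribute values (e.g. blood volume over the interaction
    period) to bar heights in pixels: the bar height increases when the
    attribute increases and decreases when it decreases (cf. claim 4).
    Values above max_value are clamped to the full bar height."""
    if max_value <= 0:
        raise ValueError("max_value must be positive")
    return [round(min(v, max_value) / max_value * max_height_px) for v in samples]
```

For a blood volume attribute capped at 100, `histogram_heights([100, 60, 80], 100)` yields bars of 100, 60, and 80 pixels, making the damage at the second sample and the healing at the third visually apparent on the character's time axis.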
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210161375.0A CN116672724A (en) | 2022-02-22 | 2022-02-22 | Game play showing method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210161375.0A CN116672724A (en) | 2022-02-22 | 2022-02-22 | Game play showing method, device, equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116672724A true CN116672724A (en) | 2023-09-01 |
Family
ID=87784202
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210161375.0A Pending CN116672724A (en) | 2022-02-22 | 2022-02-22 | Game play showing method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116672724A (en) |
- 2022-02-22: CN application CN202210161375.0A filed; published as CN116672724A; status: Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10661171B2 (en) | Information processing method, terminal, and computer storage medium | |
CN111589148B (en) | User interface display method, device, terminal and storage medium | |
US9959008B2 (en) | Computer peripheral display and communication device providing an adjunct 3D user interface | |
CN110548288B (en) | Virtual object hit prompting method and device, terminal and storage medium | |
CN112619167B (en) | Information processing method, device, computer equipment and medium | |
WO2017059683A1 (en) | Information processing method and terminal, and computer storage medium | |
WO2019105349A1 (en) | Information display method and device, storage medium, and electronic device | |
CN111298449B (en) | Control method and device in game, computer equipment and storage medium | |
CN111905363B (en) | Virtual object control method, device, terminal and storage medium | |
CN111265872B (en) | Virtual object control method, device, terminal and storage medium | |
WO2021244209A1 (en) | Virtual object control method and apparatus, and terminal and storage medium | |
CN111111191B (en) | Virtual skill activation method and device, storage medium and electronic device | |
CN110801629B (en) | Method, device, terminal and medium for displaying virtual object life value prompt graph | |
US20240278124A1 (en) | Virtual resource transfer method, apparatus, device, and storage medium | |
WO2021101661A1 (en) | Server-based generation of a help map in a video game | |
US20150040036A1 (en) | Dynamic player activity environment response | |
CN113975824B (en) | Reminding method for game sightseeing and related equipment | |
CN111729315B (en) | Method, system, electronic device and storage medium for obtaining game virtual pet | |
CN116672724A (en) | Game play showing method, device, equipment and storage medium | |
EP3984608A1 (en) | Method and apparatus for controlling virtual object, and terminal and storage medium | |
CN113893523B (en) | Mark display method and device, storage medium and electronic equipment | |
CN112057859B (en) | Virtual object control method, device, terminal and storage medium | |
CN114404961A (en) | Switching method and device of fighting view angle, electronic equipment and readable storage medium | |
CN115089968A (en) | Operation guiding method and device in game, electronic equipment and storage medium | |
CN114225401A (en) | Game matching control method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||