CN112023403A - Battle process display method and device based on image-text information - Google Patents

Battle process display method and device based on image-text information

Info

Publication number
CN112023403A
CN112023403A (application CN202010917073.2A)
Authority
CN
China
Prior art keywords
character
target
match
display area
record
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010917073.2A
Other languages
Chinese (zh)
Other versions
CN112023403B (en)
Inventor
杨家昇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010917073.2A priority Critical patent/CN112023403B/en
Publication of CN112023403A publication Critical patent/CN112023403A/en
Application granted granted Critical
Publication of CN112023403B publication Critical patent/CN112023403B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 Providing additional services to players
    • A63F 13/86 Watching games played by other players
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/303 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F 2300/57 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F 2300/577 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player for watching a game played by other players

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a battle process display method and apparatus based on image-text information, a computer device, and a computer-readable storage medium, belonging to the field of computer technology. The scheme reviews the battle process of virtual characters by combining text and graphics: based on the user's operation behavior on the text display area of the battle detail page, the part of the battle process the user wants to review is determined, and the graphic display area graphically presents the interaction of each virtual character during the battle, such as attribute changes and position changes. The user can thus intuitively understand the battle process through graphic pictures and see the influence of each text battle record on the battle as a whole, so the detailed battle process is easier to perceive, the efficiency of reviewing the battle process is improved, and the user experience is improved.

Description

Battle process display method and device based on image-text information
Technical Field
The present application relates to the field of computer technologies, and in particular, to a battle process display method and apparatus based on image-text information, a computer device, and a computer-readable storage medium.
Background
With the development of computer technology and the diversification of terminal functions, more and more online games are available. In some types of online games, after a user selects a virtual character to fight an in-game NPC or the virtual characters of other users, the game server automatically generates a battle result based on attribute information of the virtual characters fielded by both sides, such as attack power and defense power, and displays that result. In these games, the battle process is completed automatically by the server, and the user does not control the virtual characters during the battle. Therefore, after a round of battle ends, the battle process needs to be embodied in the form of a battle report, that is, the interaction of each round of the battle is represented as interaction information, so that the user can reconstruct and review the battle process and adjust the virtual characters for subsequent battles.
At present, in-game battle reports are usually pure text, but it is difficult for a user to reconstruct information such as the station position changes and attribute changes of each virtual character during the battle from text alone, and the user cannot intuitively grasp the specifics of the battle process.
Disclosure of Invention
The embodiments of the present application provide a battle process display method and apparatus based on image-text information, a computer device, and a computer-readable storage medium, so that a user can understand the battle process more intuitively through graphic pictures and review it more efficiently. The technical scheme is as follows:
In one aspect, a battle process display method based on image-text information is provided, the method comprising:
displaying, in a battle detail interface of a battle, a text display area including a plurality of text battle records, the text battle records indicating attribute changes and battle position changes of virtual characters participating in the battle;
determining, based on an operation behavior on the text display area, a target text battle record corresponding to the operation behavior among the plurality of text battle records displayed in the text display area; and
graphically displaying the text battle record in a graphic display area of the battle detail interface based on the target text battle record.
In one aspect, a battle process display apparatus based on image-text information is provided, the apparatus comprising:
a first display module, configured to display, in a battle detail interface of a battle, a text display area including a plurality of text battle records, the text battle records indicating attribute changes and battle position changes of virtual characters participating in the battle;
a record determining module, configured to determine, based on an operation behavior on the text display area, a target text battle record corresponding to the operation behavior among the plurality of text battle records displayed in the text display area; and
a second display module, configured to graphically display the text battle record in a graphic display area of the battle detail interface based on the target text battle record.
In one possible implementation, the apparatus further includes:
a speed determining module, configured to determine, based on the sliding speed of a sliding operation, the playback speed of a target picture corresponding to at least one target text battle record.
In one possible implementation, the battle detail interface displays a selection control for at least one round of the battle, where the selection control for any round is used to switch the display content of the text display area to the text battle records corresponding to that round.
In one possible implementation, the apparatus further includes:
a record acquisition module, configured to acquire, in response to selection of the selection control for a target round, at least one text battle record corresponding to the target round, and to perform the step of graphically displaying the text battle record in the graphic display area of the battle detail interface based on the at least one text battle record corresponding to the target round.
In one possible implementation, the first display module is further configured to:
in response to selection of the selection control for the target round, display the first text battle record among the at least one text battle record corresponding to the target round at a target position in the text display area.
In one possible implementation, the apparatus further includes:
a data acquisition module, configured to acquire battle data of the battle from a server; and
a generating module, configured to generate at least two text battle records based on the battle data.
In one aspect, a computer device is provided that includes one or more processors and one or more memories having at least one program code stored therein, the at least one program code being loaded and executed by the one or more processors to perform the operations performed by the battle process display method based on image-text information.
In one aspect, a computer-readable storage medium is provided, in which at least one program code is stored, the at least one program code being loaded and executed by a processor to implement the operations performed by the battle process display method based on image-text information.
In one aspect, a computer program product is provided that includes at least one program code stored in a computer-readable storage medium. A processor of a computer device reads the at least one program code from the computer-readable storage medium and executes it, so that the computer device implements the operations performed by the battle process display method based on image-text information.
According to the technical scheme provided by the embodiments of the present application, the battle process of virtual characters is reviewed by combining text and graphics. Based on the user's operation behavior on the text display area of the battle detail page, the part of the battle process the user wants to review is determined, and the graphic display area graphically presents the interaction of each virtual character during the battle, such as attribute changes and position changes. The user can thus intuitively understand the battle process through graphic pictures and see the influence of each text battle record on the battle as a whole, so the detailed battle process is easier to perceive, the efficiency of reviewing the battle process is improved, and the user experience is improved.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a battle process display method based on image-text information according to an embodiment of the present application;
Fig. 2 is a flowchart of a battle process display method based on image-text information according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a battle detail interface according to an embodiment of the present application;
Fig. 4 is a flowchart of a battle process display method based on image-text information according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a method for determining a target text battle record according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a target picture according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a target picture display method according to an embodiment of the present application;
Fig. 8 is a flowchart of a battle process display method according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a selection control according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of a battle process display apparatus based on image-text information according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application;
Fig. 12 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the following will describe embodiments of the present application in further detail with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in this application are used for distinguishing between similar items and items that have substantially the same function or similar functionality, and it should be understood that "first," "second," and "nth" do not have any logical or temporal dependency or limitation on the number or order of execution.
In order to facilitate understanding of the technical processes of the embodiments of the present application, some terms referred to in the embodiments of the present application are explained below:
Virtual character: may be a virtual figure, a virtual animal, an anime character, etc., or may be a virtual image representing the user. The number of virtual characters in a battle may be preset, or may be determined dynamically according to the number of clients joining the battle, which is not limited in the embodiments of the present application.
Fig. 1 is a schematic diagram of an implementation environment of a battle process display method based on image-text information according to an embodiment of the present application. Referring to fig. 1, the implementation environment may include a terminal 110 and a server 140.
The terminal 110 has installed and runs an application program supporting virtual character battles; the application program may be a strategy game, a multiplayer online tactical competition game, a role-playing game, etc., which is not limited in the embodiments of the present application. Illustratively, the terminal 110 is a terminal used by any user, and through the terminal 110 the user controls at least one virtual character to fight an NPC (Non-Player Character) configured in the application program or a virtual character controlled by another user. The terminal 110 may be a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, a desktop computer, or the like.
The server 140 includes at least one of a single server, multiple servers, a cloud computing platform, and a virtualization center. The server 140 provides background services for the application program, for example data computing services for the battle process of the virtual characters. Optionally, the server 140 undertakes the primary computing work and the terminal 110 the secondary computing work; or the server 140 undertakes the secondary computing work and the terminal 110 the primary computing work; or the server 140 and the terminal 110 compute cooperatively using a distributed computing architecture.
The terminal 110 is connected to the server 140 through a wireless or wired network. Those skilled in the art will appreciate that the number of terminals may be greater or fewer, for example tens or hundreds of terminals, or even more. The number of terminals and the device types are not limited in the embodiments of the present application.
The battle process display method provided by the embodiments of the present application can be combined with many types of application programs. For example, in some game applications, when virtual characters battle, the server directly generates and displays the battle result, and it is difficult for the user to learn how the battle unfolded, for example the attribute changes and station position changes of the virtual characters in each round. The battle process display method of this scheme combines graphic information with text information: the content of the text battle records is presented intuitively through graphic pictures, which makes it convenient for the user to reconstruct the battle process, perceive the influence of each text battle record on the battle, understand the battle rules, and choose a corresponding battle strategy, improving the user's game experience.
Fig. 2 is a flowchart of a battle process display method based on image-text information according to an embodiment of the present application. The method can be applied to the implementation environment above. In the embodiments of the present application, the method is introduced with the terminal as the execution subject. Referring to fig. 2, the embodiment may specifically include the following steps:
201. The terminal displays, in a battle detail interface of a battle, a text display area including a plurality of text battle records, the text battle records indicating attribute changes and battle position changes of virtual characters participating in the battle.
The battle detail interface is used to display the battle process. For example, in response to the end of a battle, the terminal switches to the battle detail interface, which displays the plurality of text battle records generated during the battle; the battle process, that is, the interaction among the virtual characters participating in the battle, is embodied through these text battle records. For example, a text battle record includes the character name, attribute changes, virtual prop effects, and the like of each virtual character. Fig. 3 is a schematic diagram of a battle detail interface provided in an embodiment of the present application. Referring to fig. 3, the battle detail interface includes a text display area 301 and a graphic display area 302; the text display area 301 displays text battle records 303, and the graphic display area 302 displays the virtual characters participating in the battle, their attribute information, and the like.
202. The terminal determines, based on an operation behavior on the text display area, a target text battle record corresponding to the operation behavior among the plurality of text battle records displayed in the text display area.
The operation behavior may be a sliding operation on the text display area, a selection operation on a text battle record, and the like, which is not limited in the embodiments of the present application.
In one possible implementation, the terminal determines, based on the user's operation behavior on the text display area, which part of the battle process the user wants to review, that is, determines a target text battle record so that the battle process indicated by that record can be reviewed. For example, when the terminal detects a sliding operation on the text display area, it determines the at least one text battle record that slides to a target position as the target text battle record; the target position is set by the developer, which is not limited in the embodiments of the present application. As another example, after detecting a selection operation on a text battle record, the terminal determines the at least one selected text battle record as the target text battle record.
203. The terminal graphically displays the text battle record in a graphic display area of the battle detail interface based on the target text battle record.
In one possible implementation, as shown in the battle detail interface of fig. 3, a card is displayed in the graphic display area for each virtual character participating in the battle, marked with the character name, character level, attribute values, and the like. Cards of different virtual characters are displayed in different positions to represent where the characters stand in the battle. In one possible implementation, the terminal updates the picture displayed in the graphic display area based on the information in the target text battle record. For example, if the target text battle record indicates that "Abu" loses 50 force, the terminal updates the data displayed on the card of the virtual character "Abu", reducing the displayed force value by 50. If the target text battle record is that "Abu" defeats "Kaiser", the terminal moves the card of "Abu" to the position of the card of "Kaiser" in the graphic display area and stops displaying the card of "Kaiser"; that is, the change of a virtual character's station position is reflected by the movement of its card. It should be noted that this description of the graphic display of text battle records is only an example of one possible implementation, and the embodiments of the present application do not limit which graphic display method is specifically adopted.
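The card-update logic just described can be sketched roughly as follows. This is an illustrative reconstruction, not code from the patent: the `CharacterCard` fields, the record dictionary layout, and the function names are all assumptions made for the example.

```python
# Hypothetical sketch of the graphic-display update in step 203.
# The data model below is assumed; the patent does not specify one.
from dataclasses import dataclass

@dataclass
class CharacterCard:
    name: str
    level: int
    force: int          # attribute value shown on the card
    position: int       # station position in the battle
    visible: bool = True

def apply_record(cards: dict[str, CharacterCard], record: dict) -> None:
    """Update the card display state from one target text battle record."""
    if record["type"] == "attribute_change":
        # e.g. "Abu loses 50 force": decrement the value shown on the card
        cards[record["target"]].force -= record["amount"]
    elif record["type"] == "defeat":
        # e.g. "Abu defeats Kaiser": move the winner's card to the loser's
        # station position and stop displaying the loser's card
        winner, loser = cards[record["winner"]], cards[record["loser"]]
        winner.position = loser.position
        loser.visible = False

cards = {
    "Abu": CharacterCard("Abu", 10, 200, position=1),
    "Kaiser": CharacterCard("Kaiser", 9, 40, position=5),
}
apply_record(cards, {"type": "attribute_change", "target": "Abu", "amount": 50})
apply_record(cards, {"type": "defeat", "winner": "Abu", "loser": "Kaiser"})
print(cards["Abu"].force, cards["Abu"].position, cards["Kaiser"].visible)
# 150 5 False
```

In a real client the position and visibility changes would drive card animations rather than plain field writes, but the state transition is the same.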
According to the technical scheme provided by the embodiments of the present application, the battle process of virtual characters is reviewed by combining text and graphics. Based on the user's operation behavior on the text display area of the battle detail page, the part of the battle process the user wants to review is determined, and the graphic display area graphically presents the interaction of each virtual character during the battle, such as attribute changes and position changes. The user can thus intuitively understand the battle process through graphic pictures and see the influence of each text battle record on the battle as a whole, so the detailed battle process is easier to perceive, the efficiency of reviewing the battle process is improved, and the user experience is improved.
The above embodiment is a brief introduction to the battle process display method based on image-text information provided by the present application; the method is now described in detail with reference to fig. 4. Fig. 4 is a flowchart of a battle process display method based on image-text information according to an embodiment of the present application. Referring to fig. 4, the method includes the following steps:
401. The terminal acquires battle data of the battle from a target server and generates at least two text battle records based on the battle data.
A text battle record is used to indicate the interaction of the virtual characters participating in the battle, for example attribute changes and battle position changes of the virtual characters, and may of course also include information on the virtual characters' use of virtual props. In the embodiments of the present application, the target server may be any server in a server cluster, where the server cluster provides background services for the application program supporting virtual character battles. In one possible implementation, in response to the start of a battle, the terminal sends battle-start information to the server cluster, and the server cluster allocates one server, that is, the target server, to this battle based on the load of each server; the target server then provides computing services for the battle process of this battle.
In one possible implementation, the target server formats the battle data generated during the battle into a unified data structure and stores it in a database. The database may be deployed in the target server, or in a server of the cluster dedicated to data storage, which is not limited in the embodiments of the present application; the description here takes the database deployed in the target server as an example. In response to the end of the battle, the terminal acquires the battle data from the database of the target server, parses it, and deserializes it, for example into text, to obtain the plurality of text battle records of this battle. The above description of the method for generating text battle records is only an example, and the embodiments of the present application do not limit which specific method is used.
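The deserialization step in 401 can be sketched as follows, assuming a JSON serialization and simple per-action templates. Both the field names and the templates are hypothetical; the patent does not fix a data format.

```python
# Hypothetical sketch of step 401: turning serialized battle data fetched
# from the target server into human-readable text battle records.
import json

# Assumed record templates, one per action type
TEMPLATES = {
    "attack": "{actor} attacks {target}, {target} loses {amount} force",
    "defeat": "{actor} defeats {target}",
}

def to_text_records(raw: str) -> list[str]:
    """Deserialize battle data (one action per entry) into text records."""
    actions = json.loads(raw)
    return [TEMPLATES[a["type"]].format(**a) for a in actions]

raw = json.dumps([
    {"type": "attack", "actor": "Abu", "target": "Kaiser", "amount": 50},
    {"type": "defeat", "actor": "Abu", "target": "Kaiser"},
])
for line in to_text_records(raw):
    print(line)
```

Keeping the stored data structured (rather than storing finished sentences) is what later lets the same records drive both the text display area and the graphic display area.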
402. The terminal displays, in a battle detail interface of the battle, a text display area including a plurality of text battle records.
In one possible implementation, in response to the end of a battle, the terminal switches to the battle detail interface and renders the acquired text battle records into the text display area of the battle detail interface. The user can slide the text display area up and down to view all text battle records generated during this battle, reconstructing the battle process from the text information.
403. The terminal determines, based on an operation behavior on the text display area, a target text battle record corresponding to the operation behavior among the plurality of text battle records displayed in the text display area.
In one possible implementation, the operation behavior is a sliding operation. In response to a sliding operation on the text display area, the terminal determines the text battle record that slides to the target position of the text display area as the target text battle record. In one possible implementation, the terminal may periodically obtain, through a timer with a reference period, the display position of each text battle record in the text display area, and determine a record whose display position slides to the target position as the target text battle record. For example, each text battle record corresponds to a text control, and the position of the text control in the text display area is the display position of the record. The reference period is set by the developer, which is not limited in the embodiments of the present application. Fig. 5 is a schematic diagram of a method for determining a target text battle record according to an embodiment of the present application. Referring to fig. 5, in one possible implementation, the battle detail interface displays a positioning identifier 501 indicating a target position 502. As shown in fig. 5 (a), while the user slides the text display area, the text battle record 503 slides to the target position 502, and the terminal determines the text battle record 503 as the target text battle record; as shown in fig. 5 (b), the user continues to slide the text display area, the text battle record 504 slides to the target position, and the terminal determines the text battle record 504 as the target text battle record.
In one possible implementation, the operation behavior is a selection operation. In response to a selection operation on a text battle record in the text display area, the terminal determines the selected text battle record as the target text battle record. The selection operation is a long-press operation, a click operation, or the like, which is not limited in the embodiments of the present application. For example, when the terminal detects that the user clicks a certain text battle record, the clicked record is determined as the target text battle record.
404. Based on the target character match record, the terminal graphically displays the character match record in a graphic display area of the match detail interface.
In one possible implementation, the terminal determines a target picture based on the target character match record. The target picture includes the virtual characters participating in the match, attribute information of the virtual characters, and the battle positions of the virtual characters in the match. In one possible implementation, the match process is divided into a plurality of rounds, each character match record corresponds to one round, and the server records the initial state information of each round during match data processing, that is, the attribute information, position information, and the like of each virtual character at the beginning of that round. When determining the target picture corresponding to the target character match record, the terminal determines the current attribute information, position information, and the like of each virtual character based on the information in the target character match record and the initial state information of the round to which the target character match record belongs, thereby determining the target picture. The terminal then plays the target picture corresponding to each target character match record in the graphic display area. Fig. 6 is a schematic diagram of a target picture according to an embodiment of the present application. As shown in fig. 6 (a), in one possible implementation, cards of the virtual characters are displayed in the target picture based on the battle positions of the virtual characters in the match; the display position of each virtual character's card corresponds to that character's battle position, and the card includes information such as the character name, character grade, and attribute values of the virtual character. As shown in fig. 6 (b), in one possible implementation, a virtual scene of the current game is displayed in the target picture, and each virtual character, attribute information of the virtual character, and the like are displayed in the virtual scene. The above description of the target picture is merely exemplary; the target picture may be presented in the form of virtual character cards, a battle scene, or a dynamic picture such as a battle animation.
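The state reconstruction described above — initial round state plus the deltas carried by the records up to the target record — can be sketched as follows. This is an illustrative model under assumed data shapes (`hp_delta`, `pos` keys are invented for the example), not the patent's actual record format.

```python
def state_for_record(initial_state, round_records, target_index):
    """Apply attribute/position changes from the round's records, up to and
    including the target record, to the round's initial state, yielding the
    state the target picture should depict."""
    # Copy so the stored initial state of the round is never mutated.
    state = {cid: dict(info) for cid, info in initial_state.items()}
    for rec in round_records[:target_index + 1]:
        ch = state[rec["char"]]
        ch["hp"] += rec.get("hp_delta", 0)
        if "pos" in rec:
            ch["pos"] = rec["pos"]
    return state

# Initial state of the round, as recorded by the server:
initial = {"A": {"hp": 500, "pos": (0, 0)},
           "B": {"hp": 450, "pos": (1, 0)}}
# Two character match records belonging to this round:
round_records = [{"char": "B", "hp_delta": -120},
                 {"char": "A", "pos": (0, 1)}]

# Target picture state for the second record of the round:
frame = state_for_record(initial, round_records, 1)
```

Because reconstruction always starts from the round's stored initial state, any record in the round can be jumped to directly, which is what makes random access via sliding or round selection possible.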
In a possible implementation, if the user operation on the character display area is a sliding operation, the terminal may determine a playing order of the target pictures corresponding to at least one target character match record based on the sliding direction of the sliding operation, and play the target pictures in the graphic display area based on that playing order. That is, the character display area acts as a progress bar for playing the target pictures, and the playing progress of the target pictures in the graphic display area is controlled by the sliding operation on the character display area. For example, in response to the sliding direction being a first direction, the terminal sequentially displays the target pictures corresponding to the respective target character match records based on the order in which those records are arranged in the character display area. The first direction may be the upward sliding direction. In response to the sliding direction being a second direction, the terminal determines the target character match record determined when the sliding operation starts as a first record and displays the target picture corresponding to the first record in the graphic display area; in response to the sliding operation ending, the terminal determines the target character match record determined when the sliding operation ends as a second record and displays the target picture corresponding to the second record in the graphic display area. Fig. 7 is a schematic diagram of a target picture display method according to an embodiment of the present application, and the target picture display process is described with reference to Fig. 7. As shown in fig. 7 (a), in response to the sliding direction of the sliding operation being the first direction 701, when the character match record 702 first reaches the target position during the sliding operation, the terminal displays the target picture corresponding to the character match record 702 in the graphic display area in real time; when the character match record 703 and the character match record 704 successively reach the target position, the terminal successively displays the target pictures corresponding to the character match record 703 and the character match record 704 in the graphic display area. As shown in fig. 7 (b), in response to the sliding direction of the sliding operation being the second direction 705, the character match record 706 first reaches the target position during the sliding operation, and the terminal displays the target picture corresponding to the character match record 706 in the graphic display area in real time; when the sliding operation stops and the character match record 707 has reached the target position, the picture displayed by the terminal in the graphic display area is switched to the target picture corresponding to the character match record 707. The above description of displaying the target picture based on the sliding direction is merely exemplary, and the specific method used to display the target picture is not limited in this embodiment.
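The direction-dependent playback described above can be condensed into a small sketch. The direction labels and list-of-records representation are assumptions for illustration; in the first direction every record that passes the target position gets its picture played in order, while in the second direction only the record at the start of the slide and the record at its end produce pictures.

```python
def playback_plan(hit_records, direction):
    """Return the sequence of records whose target pictures are played.

    direction == "first":  play every record's picture in the order the
                           records reached the target position (fig. 7 (a)).
    direction == "second": show the first record's picture during the slide,
                           then jump to the last record's picture when the
                           slide ends (fig. 7 (b))."""
    if direction == "first":
        return list(hit_records)
    # Second direction: first record at slide start, second record at slide end.
    return [hit_records[0], hit_records[-1]]
```

A real implementation would emit these records incrementally as touch events arrive, but the resulting picture sequence matches this plan.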
In a possible implementation, the terminal may further determine, based on the sliding speed of the sliding operation, the playing speed of the target picture corresponding to the at least one target character match record. For example, for a part of the match process that needs to be understood in detail, the user can slide the character display area more slowly, thereby reducing the picture switching speed of the graphic display area or reducing the playing speed of the battle animation in the graphic display area.
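One simple way to realize this coupling — a sketch under assumed parameters, since the patent does not fix a mapping — is to scale the animation playback rate with the measured slide speed and clamp it to a watchable range:

```python
def playback_speed(slide_speed_px_s, base_speed_px_s=600.0,
                   min_rate=0.25, max_rate=2.0):
    """Map the slide speed of the character display area to a playback rate
    for the battle animation in the graphic display area. The base speed and
    clamp bounds are illustrative tuning constants, not values from the patent."""
    rate = slide_speed_px_s / base_speed_px_s
    return max(min_rate, min(max_rate, rate))
```

Sliding at the base speed plays the animation at normal rate; sliding slowly slows the animation down toward the lower clamp, which is exactly the "review this part in detail" behavior described above.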
With the above technical solution, the match process of the virtual characters is reviewed in a combined graphic-and-text manner. Based on the user's operation behavior on the character display area of the match detail interface, the terminal determines which part of the match process the user needs to review, and presents, in the graphic display area, the interactions of the virtual characters during the match in a graphical manner, such as the attribute changes and position changes of each virtual character. The user can thus intuitively understand the match process through the graphic pictures and see the influence of each character match record on the match as a whole, so that the user can more easily perceive the detailed match process, which improves the efficiency of reviewing the match process and the user experience.
Fig. 8 is a flowchart of a battle process display method according to an embodiment of the present application. The battle process display method is described with reference to Fig. 8, taking a sliding operation on the character display area as an example. In a possible implementation, in response to detecting the user's sliding operation on the character display area, the terminal performs step 801 of acquiring the display position of each character match record and determining the target character match record, performs step 802 of reading the information in the target character match record, performs step 803 of updating data based on the read information, such as attribute value change information of a virtual character and interactive prop usage information, and then performs picture display in the graphic display area based on the data update, such as playing a battle animation and moving a virtual character's card position. The embodiment of the present application provides a combined text-and-graphics presentation of the battle report: sliding the character match records in the character display area is equivalent to controlling the progress bar of a video, and the playing of pictures in the graphic display area, such as the playing progress and playing speed, is controlled by the user's sliding operation. The user can thus understand the match process more clearly through the graphical pictures, globally recognize the influence of each character match record, and quickly browse the match process by sliding the character display area and switching the round selection controls.
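The three-step pipeline of Fig. 8 can be sketched end to end. The data shapes (`top`/`height`/`char`/`info` keys, a per-character HP map as the data model) are assumptions for illustration only; step numbers in the comments refer to the flowchart.

```python
def on_slide(records, target_y, state):
    """One pass of the Fig. 8 pipeline: locate the record at the target
    position, read its information, update the data model, and return what
    the graphic display area should now show."""
    hit = next((r for r in records
                if r["top"] <= target_y < r["top"] + r["height"]), None)  # step 801
    if hit is None:
        return state, None          # nothing at the target position yet
    info = hit["info"]                                                    # step 802
    new_state = {**state,                                                 # step 803
                 hit["char"]: state[hit["char"]] + info["hp_delta"]}
    # Picture display: the caller plays the returned animation and redraws
    # the graphic display area from new_state.
    return new_state, info.get("animation")

records = [{"top": 0, "height": 40, "char": "B",
            "info": {"hp_delta": -120, "animation": "slash"}}]
state, anim = on_slide(records, 10, {"B": 450})
```

Each touch event re-runs this pass, so the graphic display area stays in lockstep with whichever record currently sits at the target position.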
The above embodiment mainly introduces the process of graphically displaying character match records based on operation behaviors on the character match records. In one possible implementation, a match includes at least one round, and the match detail interface displays a selection control for at least one round of the match. The selection control of any round is used to switch the display content of the character display area to the character match records corresponding to that round. Fig. 9 is a schematic diagram of a selection control provided in an embodiment of the present application. Referring to Fig. 9, selection controls 902 for multiple rounds are displayed in a region 901 of the match detail interface. In one possible implementation, in response to the selection of the selection control of a target round, at least one character match record corresponding to the target round is obtained, and the step of graphically displaying the character match records in the graphic display area of the match detail interface is performed based on the at least one character match record corresponding to the target round. For example, the terminal determines at least one target picture based on the at least one character match record corresponding to the target round, and plays the at least one target picture in the graphic display area. Specifically, for the first character match record in the round, the terminal may determine the corresponding target picture based on the information in that record and the initial state information of the round. For any character match record other than the first one, the terminal determines the corresponding target picture based on the information in that record and the information of the virtual characters currently displayed in the graphic display area.
In a possible implementation, the terminal may further switch the display content of the character display area accordingly based on a selection operation on the selection control of any round. For example, in response to the selection of the selection control of the target round, the terminal displays the first character match record among the at least one character match record corresponding to the target round at the target position in the character display area. It should be noted that the above description of selecting a round via the selection control and correspondingly switching the content displayed on the match detail interface is merely exemplary, and the embodiment of the present application does not limit the specific form in which different content is displayed upon round switching. In the embodiment of the present application, by providing the selection controls and quickly switching between rounds through them, the user can quickly locate the part of the match process that needs attention, which improves human-computer interaction efficiency.
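The round-switching behavior described above can be sketched with a simple filter. The record layout (`round`/`text` keys) is an assumption for illustration; the key points are that only the chosen round's records populate the character display area, and the round's first record is the one placed at the target position to drive the first target picture.

```python
def select_round(all_records, target_round):
    """Return the character match records of the chosen round together with
    the first record of that round (to be shown at the target position)."""
    round_records = [r for r in all_records if r["round"] == target_round]
    first = round_records[0] if round_records else None
    return round_records, first

recs = [{"round": 1, "text": "a"},
        {"round": 2, "text": "b"},
        {"round": 2, "text": "c"}]
rounds, first = select_round(recs, 2)
```

After this switch, the sliding and playback logic from the earlier steps operates on `round_records` exactly as it did on the full list.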
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 10 is a schematic structural diagram of a battle process display device based on teletext information according to an embodiment of the present application, and referring to fig. 10, the device includes:
a first display module 1001 configured to display, in a battle detail interface of a battle, a character display area including a plurality of character battle records indicating attribute changes and battle position changes of virtual characters participating in the battle;
a record determining module 1002, configured to determine, based on an operation behavior for the text display area, a target text engagement record corresponding to the operation behavior in the text engagement records displayed in the text display area;
and the second display module 1003 is configured to graphically display the target character match record in a graphical display area of the match detail interface based on the target character match record.
In one possible implementation, the record determination module 1002 is configured to perform any one of:
responding to the sliding operation of the character display area, and determining the character fight record sliding to the target position of the character display area as a target character fight record;
and in response to the selection operation of the character match record in the character display area, determining the selected character match record as the target character match record.
In one possible implementation, the second display module 1003 includes:
the picture determining unit is used for determining a target picture based on the target character fight record, wherein the target picture comprises the virtual character, the attribute information of the virtual character and the fight position of the virtual character in the fight;
and the picture playing unit is used for playing the target picture corresponding to each target character fight record in the graphic display area.
In one possible implementation, the operation behavior is a sliding operation;
the picture playing unit is used for determining the playing order of the target pictures corresponding to at least one target character match record based on the sliding direction of the sliding operation, and playing the target pictures in the graphic display area based on the playing order.
In one possible implementation, the method further comprises:
and the speed determining module is used for determining the playing speed of a target picture corresponding to at least one target character fight record based on the sliding speed of the sliding operation.
In one possible implementation manner, the battle detail interface displays a selection control of at least one round in the battle, and the selection control of any round is used for switching the display content of the text display area to a text battle record corresponding to any round.
In one possible implementation, the method further comprises:
the record acquisition module is used for responding to the selection of the selection control of the target round and acquiring at least one character fight record corresponding to the target round; and executing the step of graphically displaying the character match records in a graphic display area of the match detail interface based on at least one character match record corresponding to the target round.
In one possible implementation, the first display module 1001 is further configured to:
and responding to the selection of the selection control of the target round, and displaying a first character match record in at least one character match record corresponding to the target round at a target position in the character display area.
In one possible implementation, the method further comprises:
the data acquisition module is used for acquiring the fight data of the fight from the server;
and the generating module is used for generating at least two character fight records based on the fight data.
With the device provided in the embodiment of the present application, the match process of the virtual characters is reviewed in a combined graphic-and-text manner. Based on the user's operation behavior on the character display area of the match detail interface, the device determines which part of the match process the user needs to review, and presents, in the graphic display area, the interactions of the virtual characters during the match in a graphical manner, such as the attribute changes and position changes of each virtual character, so that the user can intuitively understand the match process through the graphic pictures and see the influence of each character match record on the match as a whole. The user can thus more easily perceive the detailed match process, which improves the efficiency of reviewing the match process and the user experience.
It should be noted that: in the battle process display device based on the image-text information provided in the above embodiment, only the division of the above functional modules is used for illustration, and in practical applications, the above function distribution can be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the battle process display device based on the image-text information and the battle process display method based on the image-text information provided by the embodiment belong to the same concept, and the specific implementation process is described in the method embodiment and is not described herein again.
The computer device provided in the foregoing technical solution may be implemented as a terminal or a server. For example, fig. 11 is a schematic structural diagram of a terminal provided in an embodiment of the present application. The terminal 1100 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 1100 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so forth.
In general, terminal 1100 includes: one or more processors 1101 and one or more memories 1102.
Processor 1101 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1101 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1101 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1101 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1102 may include one or more computer-readable storage media, which may be non-transitory. Memory 1102 can also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1102 is used to store at least one program code for execution by the processor 1101 to implement the teletext information based engagement process presentation method provided by the method embodiments herein.
In some embodiments, the terminal 1100 may further include: a peripheral interface 1103 and at least one peripheral. The processor 1101, memory 1102 and peripheral interface 1103 may be connected by a bus or signal lines. Various peripheral devices may be connected to the peripheral interface 1103 by buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1104, display screen 1105, camera assembly 1106, audio circuitry 1107, positioning assembly 1108, and power supply 1109.
The peripheral interface 1103 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1101 and the memory 1102. In some embodiments, the processor 1101, memory 1102, and peripheral interface 1103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1101, the memory 1102 and the peripheral device interface 1103 may be implemented on separate chips or circuit boards, which is not limited by this embodiment.
The Radio Frequency circuit 1104 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1104 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1104 converts an electric signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 1104 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1104 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1104 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1105 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1105 is a touch display screen, the display screen 1105 also has the ability to capture touch signals on or above its surface. The touch signal may be input to the processor 1101 as a control signal for processing. In this case, the display screen 1105 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1105, providing the front panel of terminal 1100; in other embodiments, there may be at least two display screens 1105, respectively disposed on different surfaces of terminal 1100 or in a folded design; in some embodiments, the display screen 1105 may be a flexible display disposed on a curved or folded surface of terminal 1100. Further, the display screen 1105 may even be arranged in a non-rectangular irregular pattern, that is, an irregularly shaped screen. The display screen 1105 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
Camera assembly 1106 is used to capture images or video. Optionally, camera assembly 1106 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1106 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1107 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1101 for processing or inputting the electric signals to the radio frequency circuit 1104 to achieve voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1100. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1101 or the radio frequency circuit 1104 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1107 may also include a headphone jack.
Positioning component 1108 is used to locate the current geographic position of terminal 1100 for purposes of navigation or LBS (Location Based Service). The Positioning component 1108 may be a Positioning component based on the united states GPS (Global Positioning System), the chinese beidou System, the russian graves System, or the european union galileo System.
Power supply 1109 is configured to provide power to various components within terminal 1100. The power supply 1109 may be alternating current, direct current, disposable or rechargeable. When the power supply 1109 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1100 can also include one or more sensors 1110. The one or more sensors 1110 include, but are not limited to: acceleration sensor 1111, gyro sensor 1112, pressure sensor 1113, fingerprint sensor 1114, optical sensor 1115, and proximity sensor 1116.
Acceleration sensor 1111 may detect acceleration levels in three coordinate axes of a coordinate system established with terminal 1100. For example, the acceleration sensor 1111 may be configured to detect components of the gravitational acceleration in three coordinate axes. The processor 1101 may control the display screen 1105 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1111. The acceleration sensor 1111 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1112 may detect a body direction and a rotation angle of the terminal 1100, and the gyro sensor 1112 may cooperate with the acceleration sensor 1111 to acquire a 3D motion of the user with respect to the terminal 1100. From the data collected by gyroscope sensor 1112, processor 1101 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1113 may be disposed on a side bezel of terminal 1100 and/or underlying display screen 1105. When the pressure sensor 1113 is disposed on the side frame of the terminal 1100, the holding signal of the terminal 1100 from the user can be detected, and the processor 1101 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1113. When the pressure sensor 1113 is disposed at the lower layer of the display screen 1105, the processor 1101 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1105. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1114 is configured to collect a fingerprint of the user, and the processor 1101 identifies the user according to the fingerprint collected by the fingerprint sensor 1114, or the fingerprint sensor 1114 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the user is authorized by the processor 1101 to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for and changing settings, etc. Fingerprint sensor 1114 may be disposed on the front, back, or side of terminal 1100. When a physical button or vendor Logo is provided on the terminal 1100, the fingerprint sensor 1114 may be integrated with the physical button or vendor Logo.
Optical sensor 1115 is used to collect ambient light intensity. In one embodiment, the processor 1101 may control the display brightness of the display screen 1105 based on the ambient light intensity collected by the optical sensor 1115. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1105 is increased; when the ambient light intensity is low, the display brightness of the display screen 1105 is reduced. In another embodiment, processor 1101 may also dynamically adjust the shooting parameters of camera assembly 1106 based on the ambient light intensity collected by optical sensor 1115.
Proximity sensor 1116, also referred to as a distance sensor, is typically disposed on the front panel of terminal 1100. The proximity sensor 1116 is used to capture the distance between the user and the front face of terminal 1100. In one embodiment, when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal 1100 gradually decreases, the processor 1101 controls the display screen 1105 to switch from a screen-on state to a screen-off state; when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal 1100 gradually increases, the processor 1101 controls the display screen 1105 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 11 does not constitute a limitation of terminal 1100, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.
Fig. 12 is a schematic structural diagram of a server 1200 according to an embodiment of the present application. The server 1200 may vary considerably depending on its configuration or performance, and may include one or more processors (CPUs) 1201 and one or more memories 1202, where at least one program code is stored in the one or more memories 1202 and is loaded and executed by the one or more processors 1201 to implement the methods provided by the foregoing method embodiments. Certainly, the server 1200 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and the server 1200 may further include other components for implementing device functions, which are not described herein again.
In an exemplary embodiment, a computer-readable storage medium, such as a memory, including at least one program code, which is executable by a processor to perform the method for displaying a course of battle based on teletext information in the above embodiments is also provided. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided that includes at least one program code stored in a computer-readable storage medium. A processor of a computer device reads the at least one program code from the computer-readable storage medium and executes it, causing the computer device to perform the operations of the image-text information-based battle process display method.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by at least one program code that instructs associated hardware, the program code being stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disc.
The above description is merely an exemplary embodiment of the present application and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall fall within the protection scope of the present application.

Claims (15)

1. A battle process display method based on image-text information, characterized by comprising the following steps:
displaying, in a match detail interface of a match, a text display area comprising a plurality of text match records, wherein the text match records are used for representing attribute changes and match position changes of virtual characters participating in the match;
determining, based on an operation behavior on the text display area, a target text match record corresponding to the operation behavior among the plurality of text match records displayed in the text display area; and
graphically displaying the text match record in a graphical display area of the match detail interface based on the target text match record.
2. The method according to claim 1, wherein the determining, based on the operation behavior on the text display area, a target text match record corresponding to the operation behavior among the plurality of text match records displayed in the text display area comprises any one of:
in response to a sliding operation on the text display area, determining the text match record slid to a target position of the text display area as the target text match record; and
in response to a selection operation on a text match record in the text display area, determining the selected text match record as the target text match record.
3. The method according to claim 1, wherein the graphically displaying the text match record in a graphical display area of the match detail interface based on the target text match record comprises:
determining a target picture based on the target text match record, wherein the target picture comprises the virtual character, attribute information of the virtual character, and the match position of the virtual character in the match; and
playing, in the graphical display area, the target picture corresponding to each target text match record.
4. The method according to claim 3, wherein the operation behavior is a sliding operation;
the playing, in the graphical display area, the target picture corresponding to each target text match record comprises:
determining, based on a sliding direction of the sliding operation, a playing order of the target picture corresponding to at least one target text match record; and
playing the target picture in the graphical display area based on the playing order.
5. The method according to claim 4, wherein before the playing the target picture in the graphical display area based on the playing order, the method further comprises:
determining, based on a sliding speed of the sliding operation, a playing speed of the target picture corresponding to the at least one target text match record.
6. The method according to claim 1, wherein the match detail interface displays a selection control of at least one round of the match, and the selection control of any round is used for switching the display content of the text display area to the text match record corresponding to that round.
7. The method according to claim 6, wherein after the displaying, in a match detail interface of a match, a text display area comprising a plurality of text match records, the method further comprises:
in response to a selection control of a target round being selected, acquiring at least one text match record corresponding to the target round; and
performing, based on the at least one text match record corresponding to the target round, the step of graphically displaying the text match record in a graphical display area of the match detail interface.
8. The method according to claim 7, wherein after the acquiring, in response to the selection control of the target round being selected, at least one text match record corresponding to the target round, the method further comprises:
displaying, at a target position in the text display area, the first text match record among the at least one text match record corresponding to the target round.
9. The method according to claim 1, wherein before the displaying, in a match detail interface of a match, a text display area comprising a plurality of text match records, the method further comprises:
acquiring match data of the match from a server; and
generating at least two text match records based on the match data.
10. A battle process display apparatus based on image-text information, characterized in that the apparatus comprises:
a first display module, configured to display, in a match detail interface of a match, a text display area comprising a plurality of text match records, wherein the text match records are used for representing attribute changes and match position changes of virtual characters participating in the match;
a record determining module, configured to determine, based on an operation behavior on the text display area, a target text match record corresponding to the operation behavior among the plurality of text match records displayed in the text display area; and
a second display module, configured to graphically display the text match record in a graphical display area of the match detail interface based on the target text match record.
11. The apparatus according to claim 10, wherein the record determining module is configured to perform any one of:
in response to a sliding operation on the text display area, determining the text match record slid to a target position of the text display area as the target text match record; and
in response to a selection operation on a text match record in the text display area, determining the selected text match record as the target text match record.
12. The apparatus according to claim 10, wherein the second display module comprises:
a picture determining unit, configured to determine a target picture based on the target text match record, wherein the target picture comprises the virtual character, attribute information of the virtual character, and the match position of the virtual character in the match; and
a picture playing unit, configured to play, in the graphical display area, the target picture corresponding to each target text match record.
13. The apparatus according to claim 12, wherein the operation behavior is a sliding operation;
the picture playing unit is configured to determine, based on a sliding direction of the sliding operation, a playing order of the target picture corresponding to at least one target text match record, and to play the target picture in the graphical display area based on the playing order.
14. A computer device, characterized in that the computer device comprises one or more processors and one or more memories, wherein the one or more memories store at least one program code that is loaded and executed by the one or more processors to implement the operations performed by the image-text information-based battle process display method according to any one of claims 1 to 9.
15. A computer-readable storage medium, characterized in that the computer-readable storage medium stores at least one program code that is loaded and executed by a processor to implement the operations performed by the image-text information-based battle process display method according to any one of claims 1 to 9.
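The slide-driven playback of claims 2 to 5 can be sketched as a small planning function: the sliding direction selects the playing order of the target pictures and the sliding speed scales the playing speed. The function name, direction labels, and the linear speed formula below are illustrative assumptions, not part of the claimed method.

```python
# Hypothetical sketch of the playback planning described in claims 2-5.
def plan_playback(records, slide_direction, slide_speed_px_per_s,
                  base_fps=1.0, px_per_unit=100.0):
    """Return (ordered target records, playing speed in pictures per second).

    slide_direction "forward" keeps record order; any other direction
    reverses it. Playing speed grows linearly with the sliding speed;
    the constants base_fps and px_per_unit are illustrative.
    """
    if slide_direction == "forward":
        ordered = list(records)
    else:
        ordered = list(reversed(records))
    fps = base_fps * (1.0 + slide_speed_px_per_s / px_per_unit)
    return ordered, fps

order, fps = plan_playback(["r1", "r2", "r3"], "backward", 200.0)
# order == ["r3", "r2", "r1"]; fps == 3.0 with these illustrative constants
```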
CN202010917073.2A 2020-09-03 2020-09-03 Battle process display method and device based on image-text information Active CN112023403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010917073.2A CN112023403B (en) 2020-09-03 2020-09-03 Battle process display method and device based on image-text information


Publications (2)

Publication Number Publication Date
CN112023403A true CN112023403A (en) 2020-12-04
CN112023403B CN112023403B (en) 2022-04-26

Family

ID=73591910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010917073.2A Active CN112023403B (en) 2020-09-03 2020-09-03 Battle process display method and device based on image-text information

Country Status (1)

Country Link
CN (1) CN112023403B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150038226A1 (en) * 2013-07-31 2015-02-05 Gree, Inc. Non-transitory computer readable recording medium, game server, game control method, and game system
CN108970115A (en) * 2018-07-13 2018-12-11 腾讯科技(深圳)有限公司 Information display method, device, equipment and storage medium in battle game
CN110180176A (en) * 2019-06-06 2019-08-30 腾讯科技(深圳)有限公司 Display methods, device, equipment and the readable storage medium storing program for executing at war communique displaying interface
CN110433488A (en) * 2019-08-16 2019-11-12 腾讯科技(深圳)有限公司 Battle control method, device, equipment and medium based on virtual role
CN110711384A (en) * 2019-10-24 2020-01-21 网易(杭州)网络有限公司 Game history operation display method, device and equipment
CN110772795A (en) * 2019-10-24 2020-02-11 网易(杭州)网络有限公司 Game history operation display method, device, equipment and readable storage medium


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113144604A (en) * 2021-02-08 2021-07-23 网易(杭州)网络有限公司 Information processing method, device, equipment and storage medium for game role
CN113144604B (en) * 2021-02-08 2024-05-10 网易(杭州)网络有限公司 Information processing method, device, equipment and storage medium for game roles
CN113509720A (en) * 2021-05-21 2021-10-19 腾讯科技(深圳)有限公司 Playback method, device, terminal, server and storage medium for virtual battle
CN113509720B (en) * 2021-05-21 2023-10-20 腾讯科技(深圳)有限公司 Virtual fight playback method, device, terminal, server and storage medium

Also Published As

Publication number Publication date
CN112023403B (en) 2022-04-26

Similar Documents

Publication Publication Date Title
CN110841285B (en) Interface element display method and device, computer equipment and storage medium
CN109646944B (en) Control information processing method, control information processing device, electronic equipment and storage medium
CN111050189B (en) Live broadcast method, device, equipment and storage medium
CN112118477B (en) Virtual gift display method, device, equipment and storage medium
CN110740340B (en) Video live broadcast method and device and storage medium
CN110300274B (en) Video file recording method, device and storage medium
CN112044065B (en) Virtual resource display method, device, equipment and storage medium
CN112181572A (en) Interactive special effect display method and device, terminal and storage medium
CN114116053B (en) Resource display method, device, computer equipment and medium
CN110139143B (en) Virtual article display method, device, computer equipment and storage medium
CN112007362B (en) Display control method, device, storage medium and equipment in virtual world
CN113613028A (en) Live broadcast data processing method, device, terminal, server and storage medium
CN111368114A (en) Information display method, device, equipment and storage medium
CN112023403B (en) Battle process display method and device based on image-text information
CN111437600A (en) Plot showing method, plot showing device, plot showing equipment and storage medium
CN113204672B (en) Resource display method, device, computer equipment and medium
CN112367533B (en) Interactive service processing method, device, equipment and computer readable storage medium
CN113032590B (en) Special effect display method, device, computer equipment and computer readable storage medium
CN112306332A (en) Method, device and equipment for determining selected target and storage medium
CN110152309B (en) Voice communication method, device, electronic equipment and storage medium
CN112118482A (en) Audio file playing method and device, terminal and storage medium
CN112118353A (en) Information display method, device, terminal and computer readable storage medium
CN109107163B (en) Analog key detection method and device, computer equipment and storage medium
CN108228052B (en) Method and device for triggering operation of interface component, storage medium and terminal
CN112188268B (en) Virtual scene display method, virtual scene introduction video generation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40035403

Country of ref document: HK

GR01 Patent grant