CN116785700A - Game data processing method, device, equipment and storage medium - Google Patents

Game data processing method, device, equipment and storage medium

Info

Publication number
CN116785700A
Authority
CN
China
Prior art keywords
game
character
virtual character
keyword
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310858424.0A
Other languages
Chinese (zh)
Inventor
杨昕粲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202310858424.0A
Publication of CN116785700A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • A63F 13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding

Abstract

The application provides a game data processing method, apparatus, device, and storage medium, and relates to the technical field of games. The method comprises the following steps: based on a game end event, acquiring game data associated with a virtual character in the current game, wherein the game data comprises: the character identifier of the virtual character, the scene identifier of the current game, and the battle situation data of the virtual character in the current game; determining at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword library corresponding to game data; performing drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model, and generating a match image corresponding to the virtual character in the current game; and displaying the match image corresponding to the virtual character on the graphical user interface. By applying the embodiments of the application, the uniqueness of each match can be reflected and the game data can be presented visually.

Description

Game data processing method, device, equipment and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a game data processing method, apparatus, device, and storage medium.
Background
For games that are played in discrete matches, such as MOBA (Multiplayer Online Battle Arena) games and battle royale games, players typically view the game data of a match after that match ends; such data may include KDA (kills, deaths, assists) statistics.
At present, post-match game data is mainly displayed in the form of a data table. However, each match is unique because of factors such as the line-ups of the two sides and the game map or mode, and displaying the game data only as a table makes it difficult to reflect the uniqueness of each match and lacks visualization.
Disclosure of Invention
The present application aims to overcome the above shortcomings of the prior art by providing a game data processing method, apparatus, device, and storage medium that can reflect the uniqueness of each match and present the game data visually.
In order to achieve the above purpose, the technical solutions adopted by the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a game data processing method, where a graphical user interface is provided by a terminal device, the method includes:
based on a game end event, acquiring game data associated with a virtual character in the current game, wherein the game data comprises: the character identifier of the virtual character, the scene identifier of the current game, and the battle situation data of the virtual character in the current game;
determining at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword library corresponding to game data;
performing drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model, and generating a match image corresponding to the virtual character in the current game;
and displaying the match image corresponding to the virtual character on the graphical user interface.
In a second aspect, an embodiment of the present application further provides a game data processing apparatus, where a graphical user interface is provided through a terminal device, the apparatus including:
an acquisition module, configured to acquire, based on a game end event, game data associated with a virtual character in the current game, wherein the game data comprises: the character identifier of the virtual character, the scene identifier of the current game, and the battle situation data of the virtual character in the current game;
a determining module, configured to determine at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword library corresponding to game data;
a generating module, configured to perform drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model, and generate a match image corresponding to the virtual character in the current game;
and a display module, configured to display the match image corresponding to the virtual character on the graphical user interface.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a storage medium, and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of the game data processing method of the first aspect described above.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, which when executed by a processor performs the steps of the game data processing method of the first aspect described above.
The beneficial effects of the application are as follows:
The embodiments of the application provide a game data processing method, apparatus, device, and storage medium. After a game ends, the game data associated with a virtual character in that game, such as its character identifier, the scene identifier, and the battle situation data, can be acquired, and at least one target keyword corresponding to that game data, that is, at least one target keyword corresponding to the virtual character, can then be determined based on the pre-established keywords corresponding to game data. On this basis, the match image corresponding to the virtual character in the current game can be generated according to the target keywords corresponding to the virtual character and the pre-constructed character model of the virtual character; that is, the character model, the scene information, and the battle situation information of the virtual character in the current game can all be reflected in the match image, and the match image corresponding to the virtual character can finally be displayed on the graphical user interface. In this way, the match image reflects the uniqueness of each match and presents the game data visually.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of a game data processing method according to an embodiment of the present application;
FIG. 2 is a flowchart of another game data processing method according to an embodiment of the present application;
FIG. 3 is a flowchart of another game data processing method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a match review panel according to an embodiment of the present application;
FIG. 5 is a flowchart of another game data processing method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of another match review panel according to an embodiment of the present application;
FIG. 7 is a schematic diagram of yet another match review panel provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of yet another match review panel provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a game data processing device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The game data processing method in one embodiment of the application can be run on a local terminal device or a server. When the game data processing method is run on a server, the method can be realized and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an alternative embodiment, various cloud applications may be run under the cloud interaction system, for example cloud games. Taking cloud games as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program and the body that presents the game pictures are separated: the storage and running of the game data processing method are completed on the cloud game server, while the client device is used for receiving and sending data and presenting the game pictures. For example, the client device may be a display device with a data transmission function that is close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the information processing is performed by the cloud game server in the cloud. When playing a cloud game, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses data such as the game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game pictures.
In an alternative embodiment, taking a game as an example, the local terminal device stores the game program and is used to present the game pictures. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The manner in which the local terminal device provides the graphical user interface to the player may vary; for example, the interface may be rendered and displayed on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface that includes game pictures, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In a possible implementation manner, the embodiment of the application provides a game data processing method, and a graphical user interface is provided through terminal equipment, wherein the terminal equipment can be the aforementioned local terminal equipment or the aforementioned client equipment in the cloud interaction system.
The game data processing method of the present application is illustrated below with reference to the accompanying drawings. The execution subject of the method is the above-mentioned terminal device, on which a graphical user interface for a game is rendered. The game is specifically a game played in discrete matches, such as a multiplayer online competitive game, a battle royale game, or another type of game, which is not limited by the present application.
Fig. 1 is a flow chart of a game data processing method according to an embodiment of the present application. As shown in fig. 1, the method may include:
s101, based on the game ending event, game data related to the virtual characters in the game are obtained.
Wherein, the game data comprises: character identification of virtual character scene identification of the game and the battle situation data of the virtual character in the local game.
It will be appreciated that in a combat game played in discrete matches, the virtual characters may be divided into allied characters and enemy characters, which fight each other in the current game. When either side completes its task, for example a task of pushing forward and destroying the enemy base, the server generates a game end event after detecting that one side has destroyed the opposing base. Meanwhile, the server may generate battle situation data for each virtual character in the current game based on that character's performance; the battle situation data may be, for example, KDA data (number of defeated enemies, number of deaths, number of assists), which may be characterized by the ratio of the number of defeated enemies to the number of deaths. It should be noted that the battle situation data may also include other types of data, which is not limited by the present application.
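As a concrete illustration only, the following sketch (an assumption of this description rather than part of the claimed method; the threshold and labels are hypothetical) shows how per-character battle situation data might be characterized from the defeat and death counts and the match result, yielding the kind of "easy win" or "defeat" label that is matched against the keyword library later in this description.

    from dataclasses import dataclass

    @dataclass
    class BattleSituation:
        defeats: int   # number of defeated enemies
        deaths: int
        assists: int
        won: bool

    def characterize(s: BattleSituation) -> str:
        """Return a coarse battle situation label for later keyword lookup."""
        # The description only states that battle situation data may be
        # characterized by the ratio of defeated enemies to deaths; the
        # 2.0 threshold for an "easy win" is a hypothetical choice.
        ratio = s.defeats / max(s.deaths, 1)
        if s.won and ratio >= 2.0:
            return "easy win"
        return "victory" if s.won else "defeat"

    # Example: 8 defeated enemies, 2 deaths in a won match -> "easy win"
    print(characterize(BattleSituation(defeats=8, deaths=2, assists=5, won=True)))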
It should be noted that every virtual character in the game has a character identifier, such as a number, and different game scenes also have corresponding scene identifiers, so the character identifiers corresponding to the virtual characters and the scene identifiers corresponding to the game scenes can be stored in the relevant database in advance.
For example, when the current game is created, the character identifier of each virtual character in the game and the scene identifier of the game can be determined according to the player's selection operation (such as the selected virtual character) and the game scene of the current game, and the determined identifiers are stored in a database associated with the server. Meanwhile, the battle situation data of each virtual character in the current game can be written into the database in real time under the corresponding character identifier; that is, the game data associated with each virtual character in the current game is stored in the database. It should be noted that the game data may also include other data, such as equipment identifiers, which is not limited by the present application.
In one implementation, based on the game end event, the server may obtain from the database the character identifier of each virtual character in the current game, the scene identifier of the current game, and the battle situation data of each virtual character in the current game.
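A minimal sketch of this step follows, assuming a simple in-memory store keyed by character identifier; the field names and the on_game_end hook are illustrative assumptions, not the actual server implementation.

    from typing import Dict, List

    # Hypothetical per-match store that the server writes to during the match;
    # S101 reads it back when the game end event fires.
    GAME_DB: Dict[str, dict] = {
        "char_A": {"character_id": "char_A", "scene_id": "scene_A",
                   "battle_situation": {"defeats": 8, "deaths": 2, "won": True}},
        "char_B": {"character_id": "char_B", "scene_id": "scene_A",
                   "battle_situation": {"defeats": 1, "deaths": 7, "won": False}},
    }

    def on_game_end(character_ids: List[str]) -> List[dict]:
        """On the game end event, collect the game data of each virtual character."""
        return [GAME_DB[cid] for cid in character_ids]

    game_data = on_game_end(["char_A", "char_B"])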
S102, determine at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword library corresponding to game data.
The pre-established keyword library corresponding to game data includes keywords corresponding to character identifiers, keywords corresponding to scene identifiers, and keywords corresponding to battle situation data; of course, it may also include keywords corresponding to other types of game data, which is not limited by the present application.
For example, taking one virtual character as an example, after the game data associated with the virtual character is obtained, the character identifier, the scene identifier, and the battle situation data included in the game data may each be matched against the corresponding part of the keyword library, so as to determine at least one target keyword corresponding to the virtual character; the target keywords may be, for example, the keywords corresponding to the character identifier, the keywords corresponding to the scene identifier, and the keywords corresponding to the battle situation data included in the game data. In the same way, the target keywords corresponding to every virtual character in the current game can finally be determined.
S103, perform drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model, and generate a match image corresponding to the virtual character in the current game.
For example, the at least one target keyword corresponding to each virtual character may be combined to obtain a combined result corresponding to that virtual character, and drawing processing is performed according to the combined result and the pre-constructed character model corresponding to that virtual character, so as to generate the match image corresponding to each virtual character. The pre-constructed character model corresponding to each virtual character may be, for example, a LoRA (Low-Rank Adaptation of Large Language Models) model.
It can be seen that the match image generated from the target keywords and the character model corresponding to the virtual character reflects the character model, the scene information, and the battle situation information of the virtual character in the current game. For example, when the battle situation data is characterized as an easy win and the keywords corresponding to an easy win include "raised flag", the image information of a raised flag may be shown in the match image; when the battle situation data is characterized as a defeat and the keywords corresponding to a defeat include "fallen flag", the image information of a fallen flag may be shown in the match image; and when the keywords corresponding to the scene identifier include "maple leaf", the image information of a maple leaf may be shown in the match image.
That is, the match image can show the uniqueness of the current game, has the characteristic of visualization, reinforces the player's memory of the match, and adds to the fun of the game.
S104, display the match image corresponding to the virtual character on the graphical user interface.
For example, after the current game ends, a match review panel may be displayed directly on the graphical user interface. In another example, after the current game ends, the terminal device may display the match review panel on the graphical user interface in response to the user's trigger operation on a match review control.
Based on the above description, the server has generated the match image of each virtual character in the current game, and the match image of the virtual character controlled by each user may then be displayed in the first area of the match review panel shown by the terminal device corresponding to that user.
It should be understood that the match image corresponding to a virtual character in the current game may be generated through cloud interaction between the server and the terminal device, or may be generated by the server alone, which is not limited by the present application.
In summary, in the game data processing method provided by the application, after the current game ends, the game data associated with a virtual character in that game, such as its character identifier, the scene identifier, and the battle situation data, can be acquired, and at least one target keyword corresponding to that game data, that is, at least one target keyword corresponding to the virtual character, can be determined based on the pre-established keywords corresponding to game data. On this basis, the match image corresponding to the virtual character in the current game can be generated according to the target keywords corresponding to the virtual character and the pre-constructed character model of the virtual character; that is, the character model, the scene information, and the battle situation information of the virtual character in the current game can all be reflected in the match image, and the match image corresponding to the virtual character can finally be displayed on the graphical user interface. In this way, the uniqueness of each match is reflected and the game data is presented visually.
Fig. 2 is a flowchart of another game data processing method according to an embodiment of the present application. Optionally, as shown in fig. 2, determining at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and the pre-established keyword library corresponding to game data includes:
s201, determining character identification keywords corresponding to the virtual characters according to character identifications of the virtual characters included in the game data and keywords corresponding to the character identifications in the keyword word library.
S202, determining scene identification keywords corresponding to the virtual roles according to scene identifications of the game in the game data and keywords corresponding to the scene identifications in the keyword lexicon.
S203, determining the battle situation data keywords corresponding to the virtual characters according to the battle situation data of the virtual characters in the game of the game and the keywords corresponding to the battle situation data in the keyword library.
As can be seen from the above description, the pre-established keyword library includes keywords corresponding to character identifiers (such as character identifier A, character identifier B, and character identifier C), keywords corresponding to scene identifiers (such as scene identifier A, scene identifier B, and scene identifier C), and keywords corresponding to battle situation data (such as battle situation data A, battle situation data B, and battle situation data C). For example, the keyword corresponding to character identifier A in the keyword library is (ddd); the keywords corresponding to scene identifier A are (maple, mountain), that is, the scene information corresponding to scene identifier A includes a maple leaf and a mountain; and battle situation data A, which is characterized as an easy win, corresponds to the keywords (flag activated, multicolored), that is, battle situation data A corresponds to the battle situation information of a raised, multicolored flag. The present application does not limit the keywords corresponding to character identifiers, scene identifiers, or battle situation data.
By way of example, the character identifier of the virtual character in the current game may be matched against the character identifiers included in the keyword library to determine the corresponding keyword; if the character identifier of the virtual character is character identifier A, the character identifier keyword corresponding to the virtual character is (ddd). The scene identifier of the current game is matched against the scene identifiers in the keyword library to determine the corresponding keywords; if the scene identifier of the current game is scene identifier A, the scene identifier keywords corresponding to the virtual character are (maple, mountain). The battle situation data of the current game is matched against the battle situation data in the keyword library to determine the corresponding keywords; if the battle situation data of the current game is characterized as an easy win, the battle situation data keywords corresponding to the virtual character are (flag activated, multicolored). That is, the target keywords corresponding to the virtual character include (ddd), (maple, mountain), and (flag activated, multicolored).
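Continuing the worked example, the following sketch shows one possible layout of the keyword library and the lookup of the three keyword groups; the dictionary structure is an assumption, and the entries simply mirror the example keywords (ddd), (maple, mountain), and (flag activated, multicolored).

    from typing import Dict, List

    # Pre-established keyword library: one sub-table per kind of game data.
    KEYWORD_LIBRARY: Dict[str, Dict[str, List[str]]] = {
        "character": {"char_A": ["ddd"]},
        "scene":     {"scene_A": ["maple", "mountain"]},
        "battle":    {"easy win": ["flag activated", "multicolored"],
                      "defeat":   ["fallen flag"]},
    }

    def target_keywords(character_id: str, scene_id: str, battle_label: str) -> List[str]:
        """S201-S203: look up the keywords matching each piece of game data."""
        return (KEYWORD_LIBRARY["character"].get(character_id, [])
                + KEYWORD_LIBRARY["scene"].get(scene_id, [])
                + KEYWORD_LIBRARY["battle"].get(battle_label, []))

    keywords = target_keywords("char_A", "scene_A", "easy win")
    # -> ['ddd', 'maple', 'mountain', 'flag activated', 'multicolored']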
Fig. 3 is a flowchart of another game data processing method according to an embodiment of the present application. Optionally, as shown in fig. 3, performing drawing processing according to the at least one target keyword corresponding to the virtual character and the pre-constructed character model to generate the match image corresponding to the virtual character in the current game includes:
s301, splicing diagonal identification keywords, scene identification keywords and battle situation data keywords to generate keyword strings corresponding to virtual roles.
The virtual characters in the game play are taken as dimensions to describe, and according to the description, the character identification keywords, the scene identification keywords and the battle situation data keywords corresponding to the virtual characters in the game play can be obtained. And splicing the character identification keywords, the scene identification keywords and the battle situation data keywords corresponding to the virtual characters to generate keyword strings corresponding to the virtual characters. Continuing with the above explanation by way of example, the keyword string corresponding to the virtual character whose character is identified as a is "< ddd >, < maple, mountain >, < flag activated, multicolored >".
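A minimal sketch of the splicing step, using the keyword groups from the example above; the delimiter and the angle-bracket grouping are assumptions chosen to reproduce the example string.

    from typing import List

    def splice(character_kw: List[str], scene_kw: List[str], battle_kw: List[str]) -> str:
        """S301: concatenate the three keyword groups into a single keyword string."""
        groups = [character_kw, scene_kw, battle_kw]
        return ", ".join("<" + ", ".join(g) + ">" for g in groups)

    prompt = splice(["ddd"], ["maple", "mountain"], ["flag activated", "multicolored"])
    # -> '<ddd>, <maple, mountain>, <flag activated, multicolored>'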
S302, perform drawing processing according to the keyword string corresponding to the virtual character and the pre-constructed character model, and generate the match image corresponding to the virtual character in the current game.
For example, the keyword string corresponding to the virtual character may be input into a drawing engine for drawing processing; the character identifier keyword in the keyword string corresponding to the virtual character is matched against the pre-constructed character models, and the match image corresponding to the virtual character in the current game is generated according to the matching result.
Optionally, the character model corresponding to the character identifier keyword is retrieved according to the character identifier keyword included in the keyword string corresponding to the virtual character and the pre-constructed character models; drawing processing is then performed according to the character model corresponding to the character identifier keyword, the scene identifier keyword, and the battle situation data keyword included in the keyword string, and the match image corresponding to the virtual character in the current game is generated.
It can be understood that a mapping relationship exists between the pre-constructed character models and character identifier keywords. A cloud-deployed drawing server can match the character identifier keyword included in the keyword string against the pre-constructed character models to obtain and load the character model corresponding to the character identifier keyword, so that the drawing server can perform drawing processing based on that character model, the scene identifier keyword, and the battle situation data keyword included in the keyword string, and generate the match image corresponding to the virtual character in the current game.
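The sketch below illustrates this step under stated assumptions: the mapping from character identifier keyword to a LoRA weight file and the DrawingServer.generate interface are hypothetical stand-ins for the cloud-deployed drawing server, since the description does not fix a particular image-generation API.

    class DrawingServer:
        """Hypothetical facade over the cloud-deployed drawing server."""

        # Assumed mapping from character identifier keywords to LoRA weight files.
        CHARACTER_MODELS = {"ddd": "models/char_A_lora.safetensors"}

        def generate(self, prompt: str, model_path: str) -> bytes:
            # Placeholder: a real server would load the character's LoRA weights
            # into an image-generation model and return the rendered image bytes.
            return b""

    def draw_match_image(server: DrawingServer, keyword_string: str) -> bytes:
        """S302: pick the character model from the keyword string, then draw."""
        character_kw = keyword_string.split(",")[0].strip("<> ")   # e.g. "ddd"
        model_path = DrawingServer.CHARACTER_MODELS[character_kw]
        return server.generate(prompt=keyword_string, model_path=model_path)

    image_bytes = draw_match_image(DrawingServer(),
                                   "<ddd>, <maple, mountain>, <flag activated, multicolored>")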
Optionally, displaying the match image corresponding to the virtual character on the graphical user interface includes: displaying a match review panel on the graphical user interface, and, in response to a selection operation on a selectable area corresponding to the virtual character displayed in a second area of the match review panel, displaying the match image corresponding to the virtual character in a first area of the match review panel.
After the current game ends, a selectable area corresponding to each virtual character in the game can be displayed in the second area of the match review panel displayed on the graphical user interface, with the character icon and battle situation data of the virtual character (such as the kill participation rate and various damage figures) shown in that area. The user can perform a selection operation, such as a click or a double click, on the selectable area corresponding to any virtual character, and the match image of the virtual character corresponding to the selected area is then displayed in the first area of the match review panel.
Fig. 4 is a schematic diagram of a match review panel according to an embodiment of the present application. Assuming that the current game includes virtual character A, virtual character B, virtual character C, and virtual character D, the selectable areas corresponding to these virtual characters are displayed in the second area of the match review panel; if the selectable area corresponding to virtual character B is triggered, the match image corresponding to virtual character B in the current game can be displayed in the first area of the match review panel.
As can be seen from fig. 4, on the match review panel the user can view not only the match image corresponding to the virtual character he or she controls, but also the match images corresponding to teammate characters and enemy characters.
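As a rough illustration of the panel interaction (the panel class and event handling are assumptions, not an actual UI framework), a selection on a character's selectable area simply resolves to that character's already-generated match image:

    from typing import Dict, Optional

    class MatchReviewPanel:
        """Hypothetical panel: the second area lists characters, the first area shows the image."""

        def __init__(self, match_images: Dict[str, bytes]):
            self.match_images = match_images           # character_id -> generated match image
            self.first_area_image: Optional[bytes] = None

        def on_select(self, character_id: str) -> None:
            """Selection on the selectable area of character_id (own, ally, or enemy)."""
            self.first_area_image = self.match_images[character_id]

    panel = MatchReviewPanel({"char_B": b"<image bytes of virtual character B>"})
    panel.on_select("char_B")   # the first area now shows character B's match image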
Fig. 5 is a flowchart of another game data processing method according to an embodiment of the present application.
Optionally, as shown in fig. 5, the method may further include:
s501, acquiring a behavior image of the virtual character in the game of the game.
S502, displaying a behavior image of the virtual character in the game on a second area of the game review panel displayed on the graphical user interface.
For example, after the current game ends, an action trajectory image may be generated from the action coordinates of the virtual character in the current game; this trajectory may serve as the behavior image and be stored in the database. When the behavior image needs to be read, the behavior image corresponding to the virtual character can be obtained from the database and displayed in the second area of the match review panel.
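A minimal sketch of generating the behavior (trajectory) image from action coordinates, assuming the coordinates are already mapped into image space; Pillow is used here only as an illustrative choice of drawing library.

    from typing import List, Tuple
    from PIL import Image, ImageDraw

    def trajectory_image(coords: List[Tuple[int, int]], size=(256, 256)) -> Image.Image:
        """Draw the character's recorded action coordinates as a polyline."""
        img = Image.new("RGB", size, "white")
        if len(coords) >= 2:
            ImageDraw.Draw(img).line(coords, fill="red", width=2)
        return img

    # Example: a short path recorded during the match; the result can be stored
    # in the database as the character's behavior image.
    behavior_img = trajectory_image([(10, 240), (80, 180), (150, 90), (230, 30)])
    behavior_img.save("char_A_behavior.png")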
It should be noted that the behavior image may also be another type of image, such as a highlight-moment image, which is not limited by the present application.
Fig. 6 is a schematic diagram of another match review panel according to an embodiment of the present application. Taking virtual character A in the current game as an example, the behavior image of virtual character A is displayed in the second area of the match review panel, and the match image of virtual character A is displayed in the first area. As can be seen from fig. 6, through the behavior image displayed in the second area of the match review panel the user can intuitively obtain the action trajectory and highlight-moment information of the virtual character in the current game, and through the match image displayed in the first area the user can intuitively obtain the character model, the scene information, and the battle situation information of the virtual character in the current game, which increases the fun of the game.
Optionally, the method further comprises: displaying at least one sharing control corresponding to the sharing object on a first relevant area of the subtree image corresponding to the virtual character; and responding to the triggering operation of the sharing control corresponding to the sharing object, and sharing the corresponding block image of the virtual character to the sharing object.
Referring to fig. 7, fig. 7 is a schematic diagram of yet another match review panel according to an embodiment of the present application. As can be seen from fig. 7, the first relevant area mentioned above may specifically be the area below the displayed match image. For example, the sharing controls corresponding to sharing objects displayed in the first relevant area may include sharing control 1, sharing control 2, and sharing control 3; it should be noted that the present application is not limited thereto. If the user performs a trigger operation, such as a double click, on sharing control 1, the currently displayed match image may be shared with the sharing object corresponding to sharing control 1.
Optionally, the method may further comprise: displaying a save control in a second relevant area of the match image corresponding to the virtual character; and, in response to a trigger operation on the save control, saving the match image corresponding to the virtual character.
As can also be seen from fig. 7, the second relevant area may specifically be the area above the displayed match image, and the save control is displayed in that area. After the user performs a touch operation on the save control, the currently displayed match image can be saved according to the save path corresponding to the save control.
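The two controls above can be viewed as thin handlers over the already-generated match image. The sketch below is an assumption about how a client might wire them; the save path and the sharing call are placeholders.

    from pathlib import Path

    def on_save_clicked(match_image: bytes, save_dir: str = "saved_match_images") -> Path:
        """Save control: write the currently displayed match image to the save path."""
        path = Path(save_dir)
        path.mkdir(exist_ok=True)
        target = path / "match_image.png"
        target.write_bytes(match_image)
        return target

    def on_share_clicked(match_image: bytes, share_target: str) -> None:
        """Sharing control: hand the match image to the selected sharing object."""
        # Placeholder: a real client would call the platform's sharing interface here.
        print(f"sharing {len(match_image)} bytes with {share_target}")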
Optionally, the method may further comprise: generating match image information according to the at least one target keyword corresponding to the virtual character; and displaying the match image information in a third relevant area of the match image corresponding to the virtual character, so that an image generator can generate a match image based on the match image information.
For example, the third relevant area may be the area directly below the displayed match image. The match image information displayed in the third relevant area includes the target keywords associated with the virtual character corresponding to the match image; the player may copy the match image information from the third relevant area, paste it into an image generator, and have the image generator generate a match image from it.
Optionally, the method may further comprise: in response to a target time period selected on the match review panel displayed on the graphical user interface, displaying, on the match review panel in a preset order, the match images corresponding to the virtual characters controlled by the target user within the target time period.
For example, if the user wants to view the match images of the virtual characters he or she controlled within a target time period, the target time period may be selected on the match record interface (such as the match review panel) displayed on the graphical user interface. Based on the identification information of the target user (such as the account number) and the selected target time period, the match images corresponding to the virtual characters controlled by the target user within that time period are displayed on the match record interface in a preset order; the preset order may be, for example, to display the match image of the most recent game first, and so on.
Fig. 8 is a schematic diagram of yet another match review panel according to an embodiment of the present application. Assuming that, ordered from the most recent match to the earliest, the virtual characters controlled by the target user within the target time period are virtual character 1, virtual character 2, virtual character 3, and virtual character 4, the match image of each of these characters in its respective game can be generated in the manner described above, and match images 1, 2, 3, and 4, corresponding respectively to virtual characters 1, 2, 3, and 4, can then be displayed as in fig. 8. It can be seen that the game data of the target user within the target time period can be displayed on the match review panel in the form of a gallery.
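A sketch of the gallery query under assumed data structures: filter the target user's stored match images by the selected time period and order them with the most recent match first.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List

    @dataclass
    class MatchRecord:
        user_id: str
        character_id: str
        finished_at: datetime
        match_image: bytes

    def gallery(records: List[MatchRecord], user_id: str,
                start: datetime, end: datetime) -> List[MatchRecord]:
        """Return the user's match images within the period, most recent match first."""
        in_period = [r for r in records
                     if r.user_id == user_id and start <= r.finished_at <= end]
        return sorted(in_period, key=lambda r: r.finished_at, reverse=True)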
Fig. 9 is a schematic structural diagram of a game data processing device according to an embodiment of the present application. As shown in fig. 9, the apparatus includes:
an acquisition module 901, configured to acquire, based on a game end event, game data associated with a virtual character in the current game, where the game data includes: the character identifier of the virtual character, the scene identifier of the current game, and the battle situation data of the virtual character in the current game;
a determining module 902, configured to determine at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword library corresponding to game data;
a generating module 903, configured to perform drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model, and generate a match image corresponding to the virtual character in the current game;
and a display module 904, configured to display the match image corresponding to the virtual character on the graphical user interface.
Optionally, the determining module 902 is specifically configured to determine the character identifier keyword corresponding to the virtual character according to the character identifier of the virtual character included in the game data and the keywords corresponding to character identifiers in the keyword library; determine the scene identifier keyword corresponding to the virtual character according to the scene identifier of the current game included in the game data and the keywords corresponding to scene identifiers in the keyword library; and determine the battle situation data keyword corresponding to the virtual character according to the battle situation data of the virtual character in the current game and the keywords corresponding to battle situation data in the keyword library.
Optionally, the generating module 903 is specifically configured to splice the character identifier keyword, the scene identifier keyword, and the battle situation data keyword to generate a keyword string corresponding to the virtual character; and perform drawing processing according to the keyword string corresponding to the virtual character and the pre-constructed character model to generate the match image corresponding to the virtual character in the current game.
Optionally, the generating module 903 is further specifically configured to retrieve the character model corresponding to the character identifier keyword according to the character identifier keyword included in the keyword string corresponding to the virtual character and the pre-constructed character models; and perform drawing processing according to the character model corresponding to the character identifier keyword, the scene identifier keyword, and the battle situation data keyword included in the keyword string, to generate the match image corresponding to the virtual character in the current game.
Optionally, the display module 904 is specifically configured to display a match review panel on the graphical user interface, and, in response to a selection operation on the selectable area corresponding to the virtual character displayed in the second area of the match review panel, display the match image corresponding to the virtual character in the first area of the match review panel, where the character icon and battle situation data of the virtual character are displayed in the selectable area.
Optionally, the display module 904 is further configured to acquire a behavior image of the virtual character in the current game, and display the behavior image of the virtual character in the current game in the second area of the match review panel displayed on the graphical user interface.
Optionally, the display module 904 is further configured to display a sharing control corresponding to at least one sharing object in a first relevant area of the match image corresponding to the virtual character; and, in response to a trigger operation on the sharing control corresponding to a sharing object, share the match image corresponding to the virtual character with that sharing object.
Optionally, the display module 904 is further configured to display a save control in a second relevant area of the match image corresponding to the virtual character; and, in response to a trigger operation on the save control, save the match image corresponding to the virtual character.
Optionally, the display module 904 is further configured to, in response to a target time period selected on the match review panel displayed on the graphical user interface, display, on the match review panel in a preset order, the match images corresponding to the virtual characters controlled by the target user within the target time period.
The foregoing apparatus is used for executing the method provided in the foregoing embodiment, and its implementation principle and technical effects are similar, and are not described herein again.
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (Application Specific Integrated Circuit, ASIC), one or more digital signal processors (Digital Signal Processor, DSP), or one or more field-programmable gate arrays (Field Programmable Gate Array, FPGA), and so on. For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (Central Processing Unit, CPU), or another processor that can invoke the program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application, as shown in fig. 10, where the electronic device may include: processor 1001, storage medium 1002, and bus 1003, storage medium 1002 storing machine-readable instructions executable by processor 1001, processor 1001 and storage medium 1002 communicating over bus 1003 when the electronic device is operating, processor 1001 executing machine-readable instructions to perform the steps of:
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to acquire, based on the game end event, game data associated with a virtual character in the current game, where the game data includes: the character identifier of the virtual character, the scene identifier of the current game, and the battle situation data of the virtual character in the current game; determine at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword library corresponding to game data; perform drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model to generate a match image corresponding to the virtual character in the current game; and display the match image corresponding to the virtual character on the graphical user interface.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to determine the character identifier keyword corresponding to the virtual character according to the character identifier of the virtual character included in the game data and the keywords corresponding to character identifiers in the keyword library; determine the scene identifier keyword corresponding to the virtual character according to the scene identifier of the current game included in the game data and the keywords corresponding to scene identifiers in the keyword library; and determine the battle situation data keyword corresponding to the virtual character according to the battle situation data of the virtual character in the current game and the keywords corresponding to battle situation data in the keyword library.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to splice the character identifier keyword, the scene identifier keyword, and the battle situation data keyword to generate a keyword string corresponding to the virtual character; and perform drawing processing according to the keyword string corresponding to the virtual character and the pre-constructed character model to generate the match image corresponding to the virtual character in the current game.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to retrieve the character model corresponding to the character identifier keyword according to the character identifier keyword included in the keyword string corresponding to the virtual character and the pre-constructed character models; and perform drawing processing according to the character model corresponding to the character identifier keyword, the scene identifier keyword, and the battle situation data keyword included in the keyword string, to generate the match image corresponding to the virtual character in the current game.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to display a match review panel on the graphical user interface and, in response to a selection operation on the selectable area corresponding to the virtual character displayed in the second area of the match review panel, display the match image corresponding to the virtual character in the first area of the match review panel, where the character icon and battle situation data of the virtual character are displayed in the selectable area.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to acquire a behavior image of the virtual character in the current game, and display the behavior image of the virtual character in the current game in the second area of the match review panel displayed on the graphical user interface.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to display a sharing control corresponding to at least one sharing object in a first relevant area of the match image corresponding to the virtual character; and, in response to a trigger operation on the sharing control corresponding to a sharing object, share the match image corresponding to the virtual character with that sharing object.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to display a save control in a second relevant area of the match image corresponding to the virtual character; and, in response to a trigger operation on the save control, save the match image corresponding to the virtual character.
In a possible implementation, when executing the game data processing method, the processor 1001 is specifically configured to, in response to a target time period selected on the match review panel displayed on the graphical user interface, display, on the match review panel in a preset order, the match images corresponding to the virtual characters controlled by the target user within the target time period.
Optionally, the present application further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
In a possible implementation, when executing the game data processing method, the processor is specifically configured to acquire, based on the game end event, game data associated with a virtual character in the current game, where the game data includes: the character identifier of the virtual character, the scene identifier of the current game, and the battle situation data of the virtual character in the current game; determine at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword library corresponding to game data; perform drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model to generate a match image corresponding to the virtual character in the current game; and display the match image corresponding to the virtual character on the graphical user interface.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to determine the character identifier keyword corresponding to the virtual character according to the character identifier of the virtual character included in the game data and the keywords corresponding to character identifiers in the keyword library; determine the scene identifier keyword corresponding to the virtual character according to the scene identifier of the current game included in the game data and the keywords corresponding to scene identifiers in the keyword library; and determine the battle situation data keyword corresponding to the virtual character according to the battle situation data of the virtual character in the current game and the keywords corresponding to battle situation data in the keyword library.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to splice the character identifier keyword, the scene identifier keyword, and the battle situation data keyword to generate a keyword string corresponding to the virtual character; and perform drawing processing according to the keyword string corresponding to the virtual character and the pre-constructed character model to generate the match image corresponding to the virtual character in the current game.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to retrieve the character model corresponding to the character identifier keyword according to the character identifier keyword included in the keyword string corresponding to the virtual character and the pre-constructed character models; and perform drawing processing according to the character model corresponding to the character identifier keyword, the scene identifier keyword, and the battle situation data keyword included in the keyword string, to generate the match image corresponding to the virtual character in the current game.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to display a match review panel on the graphical user interface and, in response to a selection operation on the selectable area corresponding to the virtual character displayed in the second area of the match review panel, display the match image corresponding to the virtual character in the first area of the match review panel, where the character icon and battle situation data of the virtual character are displayed in the selectable area.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to acquire a behavior image of the virtual character in the current game, and display the behavior image of the virtual character in the current game in the second area of the match review panel displayed on the graphical user interface.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to display a sharing control corresponding to at least one sharing object in a first relevant area of the match image corresponding to the virtual character; and, in response to a trigger operation on the sharing control corresponding to a sharing object, share the match image corresponding to the virtual character with that sharing object.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to display a save control in a second relevant area of the match image corresponding to the virtual character; and, in response to a trigger operation on the save control, save the match image corresponding to the virtual character.
In a possible implementation, when executing the game data processing method, the processor is specifically configured to, in response to a target time period selected on the match review panel displayed on the graphical user interface, display, on the match review panel in a preset order, the match images corresponding to the virtual characters controlled by the target user within the target time period.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of the units is merely a logical function division, and other division manners may exist in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units may be stored in a computer-readable storage medium. The software functional units are stored in a storage medium and include several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
It should be noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article or apparatus that comprises the element.
It should be noted that like reference numerals and letters denote like items in the figures; once an item is defined in one figure, it need not be further defined or explained in subsequent figures. The above description is only of the preferred embodiments of the present application and is not intended to limit the present application; various modifications and variations can be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (12)

1. A game data processing method characterized by providing a graphical user interface through a terminal device, the method comprising:
acquiring, based on a game ending event, game data associated with a virtual character in a current game, wherein the game data comprises: a character identification of the virtual character, a scene identification of the current game, and battle situation data of the virtual character in the current game;
determining at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword lexicon corresponding to the game data;
performing drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model, to generate a match image corresponding to the virtual character in the current game; and
displaying the match image corresponding to the virtual character on the graphical user interface.
2. The method according to claim 1, wherein the determining at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword lexicon corresponding to the game data comprises:
determining a character identification keyword corresponding to the virtual character according to the character identification of the virtual character included in the game data and the keywords corresponding to character identifications in the keyword lexicon;
determining a scene identification keyword corresponding to the virtual character according to the scene identification of the current game included in the game data and the keywords corresponding to scene identifications in the keyword lexicon; and
determining a battle situation data keyword corresponding to the virtual character according to the battle situation data of the virtual character in the current game included in the game data and the keywords corresponding to battle situation data in the keyword lexicon.
3. The method according to claim 2, wherein the performing drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model to generate a match image corresponding to the virtual character in the current game comprises:
splicing the character identification keyword, the scene identification keyword and the battle situation data keyword to generate a keyword string corresponding to the virtual character; and
performing drawing processing according to the keyword string corresponding to the virtual character and the pre-constructed character model, to generate the match image corresponding to the virtual character in the current game.
4. The method according to claim 3, wherein the performing drawing processing according to the keyword string corresponding to the virtual character and the pre-constructed character model to generate the match image corresponding to the virtual character in the current game comprises:
calling a character model corresponding to the character identification keyword according to the character identification keyword included in the keyword string corresponding to the virtual character and pre-constructed character models; and
performing drawing processing according to the character model corresponding to the character identification keyword, the scene identification keyword and the battle situation data keyword included in the keyword string, to generate the match image corresponding to the virtual character in the current game.
5. The method according to claim 1, wherein the displaying the match image corresponding to the virtual character on the graphical user interface comprises:
displaying a game review panel on the graphical user interface; and
in response to a selection operation for a controllable area corresponding to the virtual character displayed in a second area of the game review panel, displaying the match image corresponding to the virtual character in a first area of the game review panel, wherein the controllable area displays a character icon of the virtual character and the battle situation data.
6. The method according to claim 1, wherein the method further comprises:
acquiring a behavior image of the virtual character in the current game; and
displaying the behavior image of the virtual character in the current game in a second area of a game review panel displayed on the graphical user interface.
7. The method according to claim 1, wherein the method further comprises:
displaying, in a first relevant area of the match image corresponding to the virtual character, a sharing control corresponding to each of at least one sharing object; and
in response to a triggering operation on the sharing control corresponding to a sharing object, sharing the match image corresponding to the virtual character with the sharing object.
8. The method according to claim 1, wherein the method further comprises:
displaying a save control in a second relevant area of the match image corresponding to the virtual character; and
in response to a triggering operation on the save control, saving the match image corresponding to the virtual character.
9. The method according to any one of claims 1-7, further comprising:
in response to a target time period selected on a game review panel displayed on the graphical user interface, displaying, on the game review panel in a preset order, the match images of the virtual character controlled by a target user within the target time period.
10. A game data processing apparatus, characterized in that a graphical user interface is provided by a terminal device, the apparatus comprising:
an acquisition module, configured to acquire, based on a game ending event, game data associated with a virtual character in a current game, wherein the game data comprises: a character identification of the virtual character, a scene identification of the current game, and battle situation data of the virtual character in the current game;
a determining module, configured to determine at least one target keyword corresponding to the virtual character according to the game data associated with the virtual character and a pre-established keyword lexicon corresponding to the game data;
a generation module, configured to perform drawing processing according to the at least one target keyword corresponding to the virtual character and a pre-constructed character model, to generate a match image corresponding to the virtual character in the current game; and
a display module, configured to display the match image corresponding to the virtual character on the graphical user interface.
11. An electronic device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of the game data processing method of any one of claims 1-9.
12. A computer-readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, performs the steps of the game data processing method according to any one of claims 1-9.
CN202310858424.0A 2023-07-12 2023-07-12 Game data processing method, device, equipment and storage medium Pending CN116785700A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310858424.0A CN116785700A (en) 2023-07-12 2023-07-12 Game data processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116785700A (en) 2023-09-22

Family

ID=88043642



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination