CN109621413A - Rendering indication method, device, terminal and the storage medium of game picture - Google Patents


Info

Publication number
CN109621413A
CN109621413A (application CN201811624322.8A)
Authority
CN
China
Prior art keywords
target virtual
rendering
game
virtual character
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811624322.8A
Other languages
Chinese (zh)
Inventor
夏富荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201811624322.8A
Publication of CN109621413A


Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 — Controlling the output signals based on the game progress
    • A63F 13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 — Methods for processing data by generating or executing the game program
    • A63F 2300/66 — Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F 2300/6615 — Methods for processing data by generating or executing the game program for rendering three dimensional images using models with different levels of detail [LOD]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

This application discloses a rendering display method and apparatus for a game picture, a terminal, and a storage medium. The method includes: in the preparation stage of a game match, rendering and displaying a first 3D game scene; in the landing point selection stage of the match, displaying a 2D map corresponding to a second 3D game scene and stopping rendering the first 3D game scene; displaying, in the 2D map, the character identifier of the currently controlled target virtual character; and when the target virtual character reaches the target landing point in the second 3D game scene, canceling display of the 2D map, and rendering and displaying the scene around the target virtual character in the second 3D game scene. While still allowing landing point selection, the embodiments of this application avoid rendering and displaying a 3D game scene during the landing point selection stage, reducing the computation required for picture rendering, thereby reducing or avoiding stuttering and improving picture quality.

Description

Rendering display method and device of game picture, terminal and storage medium
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to a rendering display method and device of a game picture, a terminal and a storage medium.
Background
At present, terminals such as mobile phones and tablet computers can install and run game applications, bringing users a richer entertainment experience.
In some game applications, a game match includes the following stages: a preparation stage, a landing point selection stage, and a battle stage. In the preparation stage, a certain amount of time is provided for the players participating in the match to get ready; for example, when multiple players participate, the players can assemble during the preparation stage. In the landing point selection stage, the game scene of the match is provided, and each player operates a virtual character to select a landing point in the game scene, where the landing point is the virtual character's initial position in the game scene. In the battle stage, the player controls the virtual character to fight in the game scene in order to win the match.
In the landing point selection stage of the related art, a method of selecting a landing point by simulating a parachute jump is provided. Specifically, the player operates the virtual character to descend from the air, in a manner simulating a parachute jump, onto a target landing point in the game scene, thereby completing the landing point selection.
In the related art, the 3D game scene must be rendered and displayed during the landing point selection stage, which is prone to stuttering.
Disclosure of Invention
The embodiments of the present application provide a rendering display method and apparatus for a game picture, a terminal, and a storage medium, which can solve the problem that the related art is prone to stuttering during the landing point selection stage. The technical solution is as follows:
in one aspect, an embodiment of the present application provides a rendering display method for a game screen, where the method includes:
in the preparation stage of a game match, rendering and displaying a first 3D game scene;
in the landing point selection stage of the game match, displaying a 2D map corresponding to a second 3D game scene, and stopping rendering the first 3D game scene;
displaying, in the 2D map, a character identifier of a currently controlled target virtual character, where the character identifier of the target virtual character is used to indicate the position of the target virtual character;
when the target virtual character reaches a target landing point in the second 3D game scene, canceling display of the 2D map, and rendering and displaying the scene around the target virtual character in the second 3D game scene.
In another aspect, an embodiment of the present application provides a rendering display apparatus for a game screen, the apparatus including:
a scene rendering module, configured to render and display a first 3D game scene in the preparation stage of a game match;
a map display module, configured to display, in the landing point selection stage of the game match, a 2D map corresponding to a second 3D game scene and to stop rendering the first 3D game scene;
an identifier display module, configured to display, in the 2D map, a character identifier of a currently controlled target virtual character, where the character identifier of the target virtual character is used to indicate the position of the target virtual character;
the scene rendering module being further configured to, when the target virtual character reaches a target landing point in the second 3D game scene, cancel display of the 2D map and render and display the scene around the target virtual character in the second 3D game scene.
In still another aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for rendering and displaying a game screen according to the foregoing aspect.
In yet another aspect, an embodiment of the present application provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the rendering display method of a game screen according to the above aspect.
In still another aspect, an embodiment of the present application provides a computer program product, which is configured to perform the method for rendering and displaying a game screen according to the above aspect when the computer program product is executed.
The technical scheme provided by the embodiment of the application at least comprises the following beneficial effects:
in the technical solution provided by the embodiments of the present application, rendering and display of the 3D game scene are stopped during the landing point selection stage of the game match; a 2D map is displayed instead, and the player selects the virtual character's landing point in the 2D map. While still allowing landing point selection, the 3D game scene need not be rendered and displayed during the landing point selection stage, which reduces the computation required for picture rendering, thereby reducing or avoiding stuttering and improving picture quality.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a rendering process provided in the present technical solution;
FIG. 2 is a flowchart of a rendering display method for a game screen according to an embodiment of the present application;
FIG. 3 is an interface diagram of the landing point selection stage;
FIG. 4 is a schematic diagram of interpolated rendering;
FIG. 5 is a block diagram of a rendering display apparatus for game screens according to an embodiment of the present application;
FIG. 6 is a block diagram of a rendering display apparatus for game screens according to another embodiment of the present application;
FIG. 7 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Before the technical solution of the present application is introduced and described, an application scenario and related terms related to the technical solution of the present application are first introduced and described.
The technical solution of the present application can be applied to both single-player and multiplayer games. For example, it can be applied to MOBA (Multiplayer Online Battle Arena) games. Of course, in some other embodiments, the technical solution of the present application can also be applied to shooting games, survival games, or other types of games, which is not limited in the embodiments of the present application.
A game match refers to one round of play carried out to achieve a game goal. The game goal may differ from game to game; for example, it may be passing a certain level, killing enemies, destroying the enemy's base, killing a certain number of monsters, becoming the last survivor, killing a specific monster, and so on, which is not limited in the embodiments of the present application.
A virtual character, also referred to as a game character, is the object that a player embodies and controls during the game. In one game match, there may be one or more virtual characters. When there is only one virtual character in the match, it is usually a single-player game, and that virtual character is the one controlled by the current client. When there are multiple virtual characters in the match, it may still be a single-player game, in which the current client corresponds to multiple different virtual characters and the player can switch the character being controlled during the match; or it may be a multiplayer game, in which the virtual characters correspond to multiple different clients, the player at each client embodies and controls one or more virtual characters, and the virtual characters controlled by different players may belong to the same camp or to different camps. During the match, a virtual character can move in the game scene, for example by walking, running, or jumping, and can adopt different postures. In addition, in a MOBA game, a virtual character can also perform operations in the game scene such as releasing skills and attacking enemy units.
The game scene is a virtual scene created for the game match in which the virtual characters compete, such as a virtual map, virtual island, or virtual house. A game scene may be 3D (three-dimensional) or 2D (two-dimensional); a 3D game scene is more immersive and lifelike and gives a better user experience. The game picture rendered and displayed by the client contains the game scene provided by the match and, optionally, one or more virtual characters located in that scene.
As mentioned above, a game match may include the following stages: a preparation stage, a landing point selection stage, and a battle stage. In the preparation stage, a certain amount of time is provided for the players participating in the match to get ready; for example, when multiple players participate, the players can assemble during this stage, so the "preparation stage" may also be called the "assembly stage". When the preparation stage is completed, for example when its remaining time reaches 0 or when all participants of the match are ready, the match switches from the preparation stage to the landing point selection stage. In the landing point selection stage, the game scene of the match is provided, and the player operates a virtual character to select a landing point in the game scene, where the landing point is the virtual character's initial position in the game scene. When the target virtual character controlled by the current client reaches the target landing point in the game scene, the match switches from the landing point selection stage to the battle stage. In the battle stage, the player controls the virtual character to fight in the game scene in order to win the match.
In the technical solution provided by the present application, as shown in FIG. 1, in the preparation stage of a game match the terminal renders and displays a first game picture 11, where the first game picture 11 includes a first 3D game scene. The first 3D game scene may include the target virtual character controlled by the current client and, optionally, other virtual characters controlled by other clients participating in the match. In the landing point selection stage of the match, the terminal displays a second game picture 12, where the second game picture 12 includes a 2D map corresponding to a second 3D game scene. That is, in the landing point selection stage, the terminal stops rendering and displaying the first 3D game scene and displays the 2D map corresponding to the second 3D game scene; the player selects a landing point in the 2D map. When the target virtual character reaches the target landing point in the second 3D game scene, the match switches from the landing point selection stage to the battle stage. In the battle stage, the terminal renders and displays a third game picture 13, where the third game picture 13 includes the second 3D game scene and the virtual characters fighting in it. In the embodiments of the present application, the first 3D game scene is the 3D game scene displayed in the preparation stage of the match, and the second 3D game scene is the 3D game scene displayed in the battle stage. The first and second 3D game scenes may be the same 3D game scene or two different 3D game scenes.
The technical solution provided by the present application takes the following into account: in the landing point selection stage, the positions of the virtual characters update quickly, and if many virtual characters participate in the match (for example, around 100), a large number of positions must be synchronized. If a 3D game scene were rendered and displayed during this stage, then because the 3D scene itself requires considerable computation, and frequent large-scale position updates are added on top of it, the terminal could easily fail to process the data in time, causing the game picture to stutter. In view of this, in the technical solution provided by the present application, rendering and display of the 3D game scene are stopped during the landing point selection stage; a 2D map is displayed instead, and the player selects the virtual character's landing point in the 2D map. While still allowing landing point selection, the 3D game scene need not be rendered and displayed during this stage, which reduces the computation required for picture rendering, thereby reducing or avoiding stuttering and improving picture quality.
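Purely as an illustration (not part of the patent text), the stage-driven switch between 3D rendering and the cheap 2D map described above can be sketched as follows; all names here (`Stage`, `GameClient`, and so on) are hypothetical:

```python
from enum import Enum, auto

class Stage(Enum):
    PREPARATION = auto()
    LANDING_POINT_SELECTION = auto()
    BATTLE = auto()

class GameClient:
    """Hypothetical client that renders either a 3D scene or a 2D map per stage."""

    def __init__(self):
        self.stage = Stage.PREPARATION
        self.rendering_3d = True  # the first 3D scene is rendered during preparation

    def enter_landing_point_selection(self):
        # Stop rendering the first 3D scene; only the lightweight 2D map is drawn.
        self.stage = Stage.LANDING_POINT_SELECTION
        self.rendering_3d = False

    def on_character_landed(self):
        # Target landing point reached: hide the 2D map, render the second 3D scene.
        self.stage = Stage.BATTLE
        self.rendering_3d = True

    def frame_to_render(self) -> str:
        if self.stage == Stage.PREPARATION:
            return "first 3D scene"
        if self.stage == Stage.LANDING_POINT_SELECTION:
            return "2D map"
        return "second 3D scene"

client = GameClient()
client.enter_landing_point_selection()
print(client.frame_to_render())  # 2D map
client.on_character_landed()
print(client.frame_to_render())  # second 3D scene
```

The point of the sketch is that during the landing point selection stage no 3D scene is rendered at all, which is where the computation saving comes from.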
In addition, in the method embodiments of the present application, the execution subject of each step may be a terminal; for example, the terminal may be an electronic device such as a mobile phone, tablet computer, game device, or PC (Personal Computer). Alternatively, the execution subject of each step may be the client of a game application installed and running on the terminal (referred to as the "game client" for short), such as the client of a MOBA game. For convenience of explanation, in the following method embodiments the execution subject of each step is described simply as the terminal, but this is not limiting.
The technical solution of the present application will be described below by means of several embodiments.
Referring to fig. 2, a flowchart of a method for rendering and displaying a game screen according to an embodiment of the present application is shown. The method comprises the following steps (201-204):
step 201, in the preparation stage of game match, rendering and displaying a first 3D game scene.
In the preparation stage of a game match, the terminal renders and displays a first 3D game scene, i.e., the 3D game scene displayed in the preparation stage. A 3D game scene is generally a 3D virtual scene constructed by game developers and designers rather than a real-world scene; it constitutes the environment in which the characters (i.e., virtual characters) and objects of the match are located. Different types of games may contain different elements in their 3D game scenes. For example, for a MOBA game, the 3D game scene can include elements such as virtual characters, the ground, turrets, monsters, equipment, houses, bushes, and rivers. For a shooting game, the 3D game scene can include elements such as virtual characters, equipment, ground, sky, water, mountains, flowers, grass, trees, stones, vehicles, and houses.
Optionally, in the preparation stage of the game match, the terminal renders and displays the first 3D game scene around the currently controlled target virtual character. The first 3D game scene may include the target virtual character and its surroundings, such as the room or ground where the character is located, and other virtual characters or objects around it.
When the preparation phase is completed, for example, when the remaining time of the preparation phase is 0, or when all participants of the game play are ready, the preparation phase is switched to the floor point selection phase.
Step 202, in the landing point selection stage of the game match, displaying the 2D map corresponding to the second 3D game scene, and stopping rendering the first 3D game scene.
The second 3D game scene is the 3D game scene displayed in the battle stage of the game match. The first and second 3D game scenes may be the same 3D game scene or two different ones. For example, when they are two different 3D game scenes, the first 3D game scene may be a virtual fortress where the virtual characters participating in the match assemble, and the second 3D game scene may be a virtual island where they fight.
The 2D map corresponding to the second 3D game scene is a planar map of that scene. The 2D map may include content such as the division of the second 3D game scene into areas, the name of each area, and the topographic features of each area.
Optionally, the terminal displays the 2D map corresponding to the second 3D game scene as an overlay on top of the first 3D game scene, so that the 2D map covers the first 3D game scene. Once the first 3D game scene is covered, the terminal can stop rendering it; the terminal then only needs to render and display the 2D map.
In addition, if the terminal renders and displays the first 3D game scene in a target area of the screen during the preparation stage, then during the landing point selection stage it renders and displays the 2D map corresponding to the second 3D game scene in that same target area. That is, if the first 3D game scene occupied the entire target area, the 2D map also occupies the entire target area, so that the 2D map completely covers the first 3D game scene. In this way, the terminal displays the 2D map instead of the 3D game scene during the landing point selection stage. The target area may be the terminal's entire screen area (i.e., the game picture is displayed full-screen) or a part of the screen area, which is not limited in the embodiments of the present application.
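The "same target area" condition above amounts to a simple coverage check: the map rectangle must fully contain the scene rectangle before the 3D render can safely stop. A minimal hypothetical sketch (the rectangle convention `(x, y, w, h)` is an assumption, not from the patent):

```python
def covers(area_a, area_b):
    """True if rectangle area_a = (x, y, width, height) fully covers area_b."""
    ax, ay, aw, ah = area_a
    bx, by, bw, bh = area_b
    return ax <= bx and ay <= by and ax + aw >= bx + bw and ay + ah >= by + bh

scene_area = (0, 0, 1920, 1080)  # target area used by the first 3D scene
map_area = scene_area            # the 2D map reuses the exact same target area

print(covers(map_area, scene_area))  # True: safe to stop rendering the 3D scene
```

If the map were given a smaller area, the check would fail and the 3D scene would still be partly visible underneath, so rendering could not stop.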
Step 203, displaying the character identifier of the currently controlled target virtual character in the 2D map.
In the landing point selection stage, the player can control and adjust the position of the currently controlled target virtual character to select a suitable target landing point. Optionally, the upper layer of the 2D map further includes a number of operation controls, such as buttons, joysticks, and sliders, through which the player can control and adjust the position of the target virtual character. In one example, the upper layer of the 2D map includes a direction adjustment control, i.e., an operation control for adjusting the moving direction of the target virtual character, which may be implemented as a joystick. Optionally, the upper layer of the 2D map further includes a speed adjustment control, i.e., an operation control for adjusting the landing speed of the target virtual character, which may be implemented as a button or a slider.
In addition, an operation control can be displayed on the upper layer of the 2D map, i.e., visible to the user; or it can be superimposed on the upper layer of the 2D map in hidden form, i.e., invisible to the user but still responsive to touch operations such as taps, slides, and presses.
The terminal displays the character identifier of the currently controlled target virtual character in the 2D map. The character identifier of the target virtual character indicates the position of the target virtual character, i.e., its position coordinates in the 2D map, which can be calculated from parameters such as the character's moving direction and moving speed. The character identifier identifies and represents the target virtual character; for example, it may be a small dot or an avatar, which is not limited in the embodiments of the present application. The display position of the character identifier in the 2D map matches the position of the target virtual character, and the terminal can dynamically adjust that display position according to the character's position.
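The position calculation described above (map coordinates derived from moving direction and speed) can be sketched as follows; the function name, units, and angle convention are illustrative assumptions, not taken from the patent:

```python
import math

def update_marker_position(pos, direction_deg, speed, dt):
    """Advance the character identifier's position in the 2D map.

    pos: (x, y) current map coordinates; direction_deg: moving direction in
    degrees (0 = east, counterclockwise); speed: map units per second;
    dt: elapsed time in seconds. All conventions here are hypothetical.
    """
    rad = math.radians(direction_deg)
    return (pos[0] + speed * math.cos(rad) * dt,
            pos[1] + speed * math.sin(rad) * dt)

pos = (100.0, 200.0)
pos = update_marker_position(pos, 0.0, 50.0, 0.5)  # move east for half a second
print(pos)  # (125.0, 200.0)
```

Each frame the terminal would recompute the marker position from the latest direction and speed (set via the joystick and the speed control) and redraw the dot or avatar at the new coordinates.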
Optionally, if the participants of the game match include at least one other virtual character in addition to the target virtual character, the terminal can also display the character identifiers of the other virtual characters in the 2D map, so that the player can view the positions of the virtual characters controlled by the other players participating in the match.
Optionally, the terminal can further display a time identifier and/or a direction identifier for the target virtual character in the 2D map. The time identifier indicates the remaining landing time of the target virtual character; for example, it can be represented by a progress bar whose length represents the remaining landing time. The direction identifier indicates the moving direction of the target virtual character; for example, it can be represented by an arrow whose direction is the character's moving direction. If the terminal displays the character identifiers of other virtual characters in the 2D map, it can also display their time identifiers and/or direction identifiers, so that the player can view the remaining landing time and moving direction of the virtual characters controlled by the other players participating in the match.
The remaining landing time of the target virtual character is the time still required for the character to reach the ground of the second 3D game scene. In this embodiment, the landing point is selected in a manner that simulates flight. Optionally, in the landing point selection stage, the terminal displays in the 2D map a virtual flying vehicle moving along a specified trajectory; the virtual flying vehicle carries the virtual characters participating in the match as it moves along that trajectory. When the terminal receives a departure instruction corresponding to the target virtual character, it controls the target virtual character to leave the virtual flying vehicle and descend, in simulated flight, to the target landing point of the second 3D game scene. The specified trajectory may be a straight line or a curve, and it may be displayed on the 2D map. In addition, if the player does not adjust the landing speed, the target virtual character lands a preset time (for example, 15 seconds) after leaving the virtual flying vehicle; if the player adjusts the landing speed, the landing time becomes correspondingly shorter or longer.
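The countdown logic above (a preset descent time, shortened when the player boosts the landing speed) can be sketched as follows; the 15-second default comes from the example in the text, while the `speed_factor` parameterization is a hypothetical simplification:

```python
DEFAULT_LANDING_TIME = 15.0  # seconds from leaving the flying vehicle (example value)

def landing_remaining_time(elapsed, speed_factor=1.0):
    """Remaining descent time after departing the virtual flying vehicle.

    speed_factor > 1.0 models the player boosting the landing speed, which
    shortens the total descent; < 1.0 would lengthen it. Illustrative only.
    """
    total = DEFAULT_LANDING_TIME / speed_factor
    return max(0.0, total - elapsed)

print(landing_remaining_time(5.0))                    # 10.0 (no speed adjustment)
print(landing_remaining_time(5.0, speed_factor=1.5))  # 5.0 (boosted descent)
```

The progress bar next to the character identifier would simply be drawn proportional to this remaining time.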
In one example, as shown in FIG. 3, in the preparation stage of the game match, the terminal displays the first 3D game scene 21, which includes the 3D character model 22 of the target virtual character and its surroundings, such as the room or ground where the character is located and other virtual characters or objects around it. In the landing point selection stage, the terminal displays the 2D map 31 corresponding to the second 3D game scene and shows on it a virtual flying vehicle 33 moving along a specified trajectory 32. The user taps the "depart" button 34; accordingly, the terminal receives a departure instruction corresponding to the target virtual character, and then controls the character to leave the virtual flying vehicle 33 and descend, in simulated flight, to the target landing point of the second 3D game scene. After the target virtual character leaves the virtual flying vehicle 33, the terminal displays in the 2D map 31 the character identifier 35 of the target virtual character, optionally together with the character identifiers 36 of the other virtual characters participating in the match. The player can control the moving direction of the target virtual character through the joystick 36 and adjust its landing speed through the "sprint" button 37. In FIG. 3, the character identifier 35 of the target virtual character is represented by a small dot labeled 1, and the character identifiers 36 of the other virtual characters are represented by small black dots.
In addition, a progress bar 38 and an arrow 39 may be displayed next to the character identifier 35 of the target virtual character, wherein the progress bar 38 is used for indicating the landing remaining time of the target virtual character, and the arrow 39 is used for indicating the moving direction of the target virtual character.
And step 204, when the target virtual character reaches the target landing point in the second 3D game scene, canceling display of the 2D map, and rendering and displaying the scene around the target virtual character in the second 3D game scene.
After a period of simulated flight, the target virtual character reaches the target landing point in the second 3D game scene. In the embodiment of the application, since the 2D map rather than the 3D game scene is displayed in the landing point selection stage, the simulated flight of the target virtual character is not shown in 3D form, and the player cannot watch the target virtual character fall from the air to the ground in the game picture. From the perspective of the background processing logic, however, the landing time of the target virtual character is still calculated according to its landing distance and landing speed.
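Although the fall is not rendered in 3D, the background logic can still derive the landing time from these two quantities. A minimal sketch, assuming a constant falling speed (function and parameter names are illustrative, not from the patent):

```python
def landing_time(landing_distance: float, landing_speed: float) -> float:
    """Remaining time for a character to fall `landing_distance`
    (e.g. meters) at a constant vertical `landing_speed` (e.g. m/s)."""
    if landing_speed <= 0:
        raise ValueError("landing speed must be positive")
    return landing_distance / landing_speed
```

Raising the landing speed with the "sprint" button then shortens the result proportionally.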
When the target virtual character reaches the target landing point in the second 3D game scene, the game play switches from the landing point selection stage to the fighting stage. In the fighting stage, the terminal cancels display of the 2D map, and renders and displays the second 3D game scene around the target virtual character.
As shown in fig. 3, when the target virtual character reaches the target landing point in the second 3D game scene, the terminal enters the fighting stage of the game play and displays the second 3D game scene 41, where the second 3D game scene 41 includes the 3D character model 22 of the target virtual character and its surrounding environment, such as the room or ground where the target virtual character is located, other virtual characters or objects around the target virtual character, and so on.
It should be noted that, in the landing point selection stage, two modes, namely a manual control mode and an automatic control mode, may be provided to control the movement of the target virtual character. When the target virtual character is in the manual control mode, the terminal controls the target virtual character to move to the target landing point according to the control operations on the operation controls introduced above. When the target virtual character is in the automatic control mode, the terminal determines a target landing point for the target virtual character and then automatically controls it to move there. For example, after the target landing point is determined, the terminal calculates the shortest path to it and controls the target virtual character to move along that path at the maximum landing speed, helping the player reach the target landing point as quickly as possible.
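On an open 2D map the shortest path to the target landing point is simply a straight line, so the automatic control mode can be sketched as stepping along that line at the maximum speed on each logical tick. A hedged sketch under that open-map assumption (names are illustrative):

```python
import math

def step_toward(pos, target, max_speed, dt):
    """Advance `pos` one tick of length `dt` toward `target` along the
    straight line between them, moving at most `max_speed` per second."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    step = max_speed * dt
    if dist <= step:
        return target  # the character arrives within this tick
    return (pos[0] + dx / dist * step, pos[1] + dy / dist * step)
```

Calling this once per logical frame until the returned position equals the target reproduces the "move along the shortest path at the maximum speed" behavior.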
Optionally, the manner in which the terminal determines the target landing point of the target virtual character includes, but is not limited to, any one of the following: 1. The terminal determines the target landing point of the target virtual character according to a landing point selection operation in the 2D map. The landing point selection operation is an operation performed by the player in the 2D map to select a target landing point; for example, the player clicks a position in the 2D map, and the terminal takes that position as the target landing point of the target virtual character. 2. The terminal randomly determines the target landing point of the target virtual character. 3. The terminal determines the target landing point of the target virtual character according to the moving directions of the other virtual characters in the game play. For example, the terminal may select the area with the greatest or smallest player density as the target landing point of the target virtual character according to the moving directions of the other virtual characters in the game play.
To sum up, in the technical solution provided in the embodiment of the present application, rendering and displaying of the 3D game scene is stopped in the landing point selection stage of a game play, a 2D map is displayed instead, and the player selects the landing point of the virtual character in the 2D map. On the premise of still supporting landing point selection, the 3D game scene does not need to be rendered and displayed in this stage, and the amount of computation required for picture rendering is reduced, thereby reducing or avoiding stuttering and improving picture quality.
In an alternative embodiment provided on the basis of the embodiment of fig. 2, step 203 may include the following sub-steps (203a to 203b):
step 203a, acquiring the position information of the target virtual character once every first time interval according to the logical frame rate;
and step 203b, rendering and displaying a rendering frame once every second time interval according to the rendering frame rate.
The position information of the target virtual character is used to indicate the position of the target virtual character. Optionally, the position information includes the following four parameters: moving direction, moving speed, landing speed, and whether the character has stopped. Taking the target virtual character as an example, the moving direction is its direction of movement on the plane shown in the 2D map; the moving speed is its transverse (lateral) speed on that plane; the landing speed is its longitudinal speed while falling from the air to the ground of the second 3D game scene; and the stop flag indicates whether it has stopped moving on the plane shown in the 2D map.
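The four parameters can be grouped into one small record, from which the lateral position is advanced each logical tick. A sketch, assuming the moving direction is expressed as an angle on the map plane (all field and method names are illustrative):

```python
import math
from dataclasses import dataclass

@dataclass
class PositionInfo:
    move_direction: float  # heading on the 2D map plane, in radians
    move_speed: float      # lateral speed on the map plane
    landing_speed: float   # vertical speed of the simulated fall
    stopped: bool          # whether lateral movement has stopped

    def advance(self, x: float, y: float, dt: float):
        """Advance the lateral (x, y) position by one tick of length dt."""
        if self.stopped:
            return x, y
        return (x + math.cos(self.move_direction) * self.move_speed * dt,
                y + math.sin(self.move_direction) * self.move_speed * dt)
```

The landing speed does not affect the lateral position; it only drives the remaining-time calculation described above.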
The rendering frame includes the 2D map and the character identifier of the target virtual character in the 2D map, and the display position of the character identifier in the 2D map is determined according to the position information of the target virtual character. For example, the terminal may calculate the real-time position of the target virtual character based on its position information, and then display the character identifier at that real-time position in the 2D map.
In the technical field of rendering and displaying game pictures, there are two concepts: logical frames and rendering frames. The terminal updates the game state once every first time interval, which includes acquiring the position information, attribute parameters (such as health, attack, and defense values), level parameters, kill-count parameters, and other information of each virtual character participating in the game play; this process of updating the game state is the process of updating a logical frame. The logical frame rate is the update frequency of logical frames, and updating a logical frame is generally not time-consuming. In addition, the terminal updates the game picture once every second time interval, that is, it updates a rendering frame. The rendering frame rate is the update frequency of rendering frames; updating a rendering frame is time-consuming, and the more complex the scene, the more time it takes. The values of the logical frame rate and the rendering frame rate may be preset by the game developer, which is not limited in the embodiment of the present application. The terminal may update a logical frame, then a rendering frame, then the next logical frame, then the next rendering frame, and so on; by cycling in this way, the game state and the game picture are continuously updated, giving the player a real-time game experience.
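The alternation described above (a cheap logical update followed by one or more expensive render updates) can be sketched as a loop; the callback interface and the fixed interleaving are illustrative simplifications, not the patent's implementation:

```python
def run_game_loop(num_logical_frames, renders_per_logical,
                  update_state, render_frame):
    """Interleave logical-frame updates with rendering-frame updates.
    `update_state(i)` refreshes the game state for logical frame i;
    `render_frame(i, j)` draws the j-th rendering frame based on it."""
    for i in range(num_logical_frames):
        update_state(i)              # cheap: positions, attributes, kills...
        for j in range(renders_per_logical):
            render_frame(i, j)       # expensive: draw the game picture
```

With `renders_per_logical` greater than 1, this loop already exhibits the "rendering frame rate is n times the logical frame rate" relationship discussed below.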
Optionally, if the participants of the game play include at least one other virtual character in addition to the target virtual character, step 203a may include the following sub-steps:
1. generating the position information of the target virtual character according to the control instruction corresponding to the target virtual character;
2. sending the position information of the target virtual character to a server;
3. receiving global position information sent by the server once every first time interval according to the logical frame rate, where the global position information includes the position information of the target virtual character and the position information of the other virtual characters.
The terminal first sends the position information of the target virtual character to the server, and the server updates the logical frames in a unified manner. After obtaining the position information of each virtual character participating in the game play, the server assembles the global position information and broadcasts it to each participating client, and each client updates its rendering frame according to the global position information, which includes displaying in the 2D map the character identifier of each virtual character participating in the game play. In this way, when multiple virtual characters participate in the same game play, updating the logical frames uniformly through the server ensures that the game state and game picture of every client stay synchronized, which improves the player's game experience.
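The server-side aggregation can be sketched as below; networking, tick scheduling, and serialization are omitted, and all class and method names are illustrative assumptions rather than the patent's design:

```python
class PositionServer:
    """Collects each client's reported position and merges them into
    the global position information broadcast back every logical tick."""

    def __init__(self):
        self._reports = {}

    def report(self, character_id, position_info):
        # Each client reports only its own character's latest position.
        self._reports[character_id] = position_info

    def snapshot(self):
        # The merged global snapshot sent to every participating client.
        return dict(self._reports)
```

Every client then renders from the same snapshot, which is what keeps the character identifiers on all 2D maps consistent.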
Optionally, the rendering frame rate is n times the logical frame rate, where n is an integer greater than 1. For example, the rendering frame rate may be 2 or 3 times the logical frame rate. Displaying the game picture at a higher rendering frame rate improves the display quality of the game picture.
In addition, when the rendering frame rate is n times the logical frame rate, step 203b may include the following sub-steps:
1. rendering and displaying the i-th rendering frame corresponding to the position information of the target virtual character acquired at the i-th time, according to that position information, where i is a positive integer;
2. sequentially rendering and displaying n-1 rendering frames by adopting an interpolation algorithm, according to the position information of the target virtual character acquired at the i-th time and the position information acquired at the (i+1)-th time;
3. rendering and displaying the (i+1)-th rendering frame corresponding to the position information of the target virtual character acquired at the (i+1)-th time, according to that position information.
Referring to fig. 4, n is 2, that is, the rendering frame rate is twice the logical frame rate. In fig. 4, the i-th logical frame contains the global position information of the i-th acquisition, the (i+1)-th logical frame contains the global position information of the (i+1)-th acquisition, and so on. The terminal adopts a lag (delayed) rendering mode: the real-time position of each virtual character displayed in the 2D map in the i-th rendering frame matches the global position information contained in the i-th logical frame, the real-time position displayed in the (i+1)-th rendering frame matches the global position information contained in the (i+1)-th logical frame, and so on. In addition, one rendering frame (referred to as an "interpolated rendering frame" in fig. 4) is inserted between each pair of adjacent rendering frames by using an interpolation algorithm. Taking the interpolated rendering frame inserted between the i-th and (i+1)-th rendering frames as an example, the real-time position of each virtual character displayed in it in the 2D map is calculated from the global position information contained in the i-th logical frame and the global position information contained in the (i+1)-th logical frame.
Taking the target virtual character as an example, assume that the real-time position indicated by its position information in the i-th logical frame is P_i, and the real-time position indicated by its position information in the (i+1)-th logical frame is P_{i+1}. Then the real-time position of the target virtual character in the i-th rendering frame is P_i, its real-time position in the interpolated rendering frame inserted between the i-th and (i+1)-th rendering frames is obtained by interpolating between P_i and P_{i+1}, and its real-time position in the (i+1)-th rendering frame is P_{i+1}. In this way, combining lag rendering with interpolation ensures smooth and accurate movement of the character identifier of the virtual character in the 2D map, avoids jitter, helps the player select a landing point accurately, and improves the game experience.
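The interpolated positions can be produced with ordinary linear interpolation between the two logical-frame positions; for n = 2 the single inserted frame would use the midpoint (t = 1/2). A sketch under that linear-interpolation assumption (names are illustrative):

```python
def interpolate(p_i, p_next, t):
    """Linearly interpolate between positions p_i and p_next.
    For the k-th of the n-1 inserted frames, t = k / n."""
    return tuple(a + (b - a) * t for a, b in zip(p_i, p_next))
```

A character standing still interpolates to the same point, so the scheme never introduces spurious motion between logical frames.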
To sum up, in the technical solution provided in the embodiment of the present application, when multiple virtual characters participate in the same game play, the server updates the logical frames in a unified manner, ensuring that the game state and game picture of every client stay synchronized, which helps improve the player's game experience.
In addition, combining lag rendering with interpolation ensures smooth and accurate movement of the character identifier of the virtual character in the 2D map, avoids jitter, and helps the player select a landing point accurately while improving the game experience.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 5, a block diagram of a rendering and displaying apparatus for a game picture according to an embodiment of the present application is shown. The apparatus has the function of implementing the above method examples; the function may be implemented by hardware, or by hardware executing corresponding software. The apparatus may be the terminal described above, or may be provided on the terminal. The apparatus 500 may include: a scene rendering module 510, a map display module 520, and an identifier display module 530.
And a scene rendering module 510, configured to render and display the first 3D game scene in a preparation stage of game play.
The map display module 520 is configured to display a 2D map corresponding to a second 3D game scene and stop rendering the first 3D game scene in the landing point selection stage of the game play.
An identifier display module 530, configured to display a character identifier of a currently controlled target virtual character in the 2D map, where the character identifier of the target virtual character is used to indicate the location of the target virtual character.
The scene rendering module 510 is further configured to cancel displaying the 2D map and render and display a scene around the target virtual character in the second 3D game scene when the target virtual character reaches the target landing point in the second 3D game scene.
To sum up, in the technical solution provided in the embodiment of the present application, rendering and displaying of the 3D game scene is stopped in the landing point selection stage of a game play, a 2D map is displayed instead, and the player selects the landing point of the virtual character in the 2D map. On the premise of still supporting landing point selection, the 3D game scene does not need to be rendered and displayed in this stage, and the amount of computation required for picture rendering is reduced, thereby reducing or avoiding stuttering and improving picture quality.
In an alternative embodiment provided based on the embodiment of fig. 5, as shown in fig. 6, the identifier display module 530 includes: a logical frame acquisition unit 531 and a render frame display unit 532.
A logical frame obtaining unit 531, configured to obtain the position information of the target virtual character once every first time interval according to the logical frame rate, where the position information of the target virtual character is used to indicate the position of the target virtual character.
A rendering frame display unit 532, configured to render and display a rendering frame once every second time interval according to the rendering frame rate; the rendering frame includes the 2D map and the character identifier of the target virtual character in the 2D map, and the display position of the character identifier in the 2D map is determined according to the position information of the target virtual character.
Optionally, the rendering frame rate is n times the logical frame rate, where n is an integer greater than 1.
Optionally, the render frame display unit 532 is configured to:
rendering and displaying the i-th rendering frame corresponding to the position information of the target virtual character acquired at the i-th time, according to that position information, where i is a positive integer;
sequentially rendering and displaying n-1 rendering frames by adopting an interpolation algorithm, according to the position information of the target virtual character acquired at the i-th time and the position information acquired at the (i+1)-th time;
and rendering and displaying the (i+1)-th rendering frame corresponding to the position information of the target virtual character acquired at the (i+1)-th time, according to that position information.
Optionally, the participants of the game play also include at least one other virtual character in addition to the target virtual character;
the logical frame obtaining unit 531 is configured to:
generating the position information of the target virtual character according to a control instruction corresponding to the target virtual character;
sending the position information of the target virtual character to a server;
receiving global position information sent by the server once every first time interval according to the logical frame rate, where the global position information includes the position information of the target virtual character and the position information of the other virtual characters.
In another optional embodiment provided based on the embodiment of fig. 5 or any one of the above optional embodiments, the identifier displaying module 530 is further configured to display a time identifier and/or a direction identifier of the target virtual character in the 2D map;
the time identifier is used for indicating the remaining landing time of the target virtual character, and the direction identifier is used for indicating the moving direction of the target virtual character.
In another optional embodiment provided on the basis of the embodiment of fig. 5 or any one of the optional embodiments described above, the apparatus 500 further includes: a landing point determination module 540 and a movement control module 550.
A landing point determining module 540, configured to determine the target landing point of the target avatar when the target avatar is in an automatic control mode.
A movement control module 550, configured to control the target avatar to move to the target landing point.
Optionally, the landing point determining module 540 is configured to: determining the target landing point of the target avatar according to a landing point selection operation in the 2D map; or, randomly determining the target landing point of the target virtual character; or determining the target landing point of the target virtual character according to the moving direction of other virtual characters in the game play.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to fig. 7, a block diagram of a terminal 700 according to an embodiment of the present application is shown. The terminal 700 may be an electronic device such as a mobile phone, a tablet computer, a game device, a PC, and the like.
In general, terminal 700 includes: a processor 701 and a memory 702.
The processor 701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 701 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 701 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 701 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 701 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 702 may include one or more computer-readable storage media, which may be non-transitory. Memory 702 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 702 is used to store at least one instruction for execution by processor 701 to implement a method for rendering and displaying a game screen provided by method embodiments herein.
In some embodiments, the terminal 700 may further optionally include: a peripheral interface 703 and at least one peripheral. The processor 701, the memory 702, and the peripheral interface 703 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 703 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 704, touch screen display 705, camera 706, audio circuitry 707, positioning components 708, and power source 709.
The peripheral interface 703 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 701 and the memory 702. In some embodiments, processor 701, memory 702, and peripheral interface 703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 701, the memory 702, and the peripheral interface 703 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 704 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 704 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 704 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 704 may communicate with other devices via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or Wi-Fi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 704 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 705 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 705 is a touch display screen, the display screen 705 also has the ability to capture touch signals on or over the surface of the display screen 705. The touch signal may be input to the processor 701 as a control signal for processing. At this point, the display 705 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 705 may be one, providing the front panel of the terminal 700; in other embodiments, the displays 705 may be at least two, respectively disposed on different surfaces of the terminal 700 or in a folded design; in still other embodiments, the display 705 may be a flexible display disposed on a curved surface or on a folded surface of the terminal 700. The display 705 may even be arranged in a non-rectangular irregular pattern, i.e. a shaped screen. The display 705 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 706 is used to capture images or video. Optionally, camera assembly 706 includes a front camera and a rear camera. Generally, a front camera is disposed on a front panel of a computer apparatus, and a rear camera is disposed on a rear surface of the computer apparatus. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 706 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 707 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 701 for processing or inputting the electric signals to the radio frequency circuit 704 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 700. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 701 or the radio frequency circuit 704 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 707 may also include a headphone jack.
The positioning component 708 is used to locate the current geographic position of the terminal 700 to implement navigation or LBS (Location Based Service). The positioning component 708 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
Power supply 709 is provided to supply power to various components of terminal 700. The power source 709 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 709 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 700 also includes one or more sensors 710. The one or more sensors 710 include, but are not limited to: acceleration sensor 711, gyro sensor 712, pressure sensor 713, fingerprint sensor 714, optical sensor 715, and proximity sensor 716.
The acceleration sensor 711 can detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the terminal 700. For example, the acceleration sensor 711 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 701 may control the touch screen 705 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 711. The acceleration sensor 711 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 712 may detect a body direction and a rotation angle of the terminal 700, and the gyro sensor 712 may cooperate with the acceleration sensor 711 to acquire a 3D motion of the terminal 700 by the user. From the data collected by the gyro sensor 712, the processor 701 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 713 may be disposed on a side bezel of terminal 700 and/or an underlying layer of touch display 705. When the pressure sensor 713 is disposed on a side frame of the terminal 700, a user's grip signal on the terminal 700 may be detected, and the processor 701 performs right-left hand recognition or shortcut operation according to the grip signal collected by the pressure sensor 713. When the pressure sensor 713 is disposed at a lower layer of the touch display 705, the processor 701 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 705. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 714 is used for collecting a fingerprint of a user, and the processor 701 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 714, or the fingerprint sensor 714 identifies the identity of the user according to the collected fingerprint. When the user identity is identified as a trusted identity, the processor 701 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 714 may be disposed on the front, back, or side of the terminal 700. When a physical button or a vendor Logo is provided on the terminal 700, the fingerprint sensor 714 may be integrated with the physical button or the vendor Logo.
The optical sensor 715 is used to collect the ambient light intensity. In one embodiment, the processor 701 may control the display brightness of the touch display 705 based on the ambient light intensity collected by the optical sensor 715: when the ambient light intensity is high, the display brightness of the touch display 705 is turned up; when the ambient light intensity is low, the display brightness of the touch display 705 is turned down. In another embodiment, the processor 701 may also dynamically adjust the shooting parameters of the camera assembly 706 based on the ambient light intensity collected by the optical sensor 715.
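By way of illustration only, the ambient-light-driven brightness control described above may be sketched as a clamped linear mapping from lux to a brightness level; the function name, lux range, and brightness levels below are assumptions for the sketch, not values from this application:

```python
def display_brightness(ambient_lux, min_level=30, max_level=255, max_lux=1000.0):
    """Map ambient light intensity (lux) to a display brightness level.

    Linear mapping clamped to [min_level, max_level]; all thresholds
    here are illustrative assumptions.
    """
    ratio = min(max(ambient_lux / max_lux, 0.0), 1.0)  # clamp to [0, 1]
    return round(min_level + ratio * (max_level - min_level))
```

Higher ambient light thus yields a higher brightness level, and very dark or very bright environments saturate at the configured bounds.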
The proximity sensor 716, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 700. The proximity sensor 716 is used to collect the distance between the user and the front surface of the terminal 700. In one embodiment, when the proximity sensor 716 detects that the distance between the user and the front surface of the terminal 700 gradually decreases, the processor 701 controls the touch display 705 to switch from the screen-on state to the screen-off state; when the proximity sensor 716 detects that the distance gradually increases, the processor 701 controls the touch display 705 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 7 is not intended to be limiting of terminal 700 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, there is also provided a terminal comprising a processor and a memory, the memory storing at least one instruction, at least one program, a set of codes, or a set of instructions. The at least one instruction, the at least one program, the set of codes, or the set of instructions is configured to be executed by one or more processors to implement the rendering display method of a game picture provided by the above-described embodiments.
In an exemplary embodiment, there is also provided a computer-readable storage medium storing at least one instruction, at least one program, a set of codes, or a set of instructions which, when executed by a processor of a computer device, implement the rendering display method of a game picture provided by the above-described embodiments. Optionally, the computer-readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, there is also provided a computer program product which, when executed, implements the rendering display method of a game picture provided by the above-described embodiments.
It should be understood that reference to "a plurality" herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone. The character "/" generally indicates an "or" relationship between the associated objects. In addition, the step numbers described herein merely show one possible execution order of the steps; in some other embodiments, the steps may be performed out of the numbered order, for example, two steps with different numbers may be performed simultaneously, or in an order opposite to that shown in the figure, which is not limited by the embodiments of the present application.
The above description is only exemplary of the present application and is not intended to limit the present application; any modifications, equivalents, improvements, and the like made within the spirit and principle of the present application shall fall within the protection scope of the present application.

Claims (15)

1. A rendering display method for a game picture, characterized by comprising:
in a preparation stage of a game match, rendering and displaying a first 3D game scene;
in a landing point selection stage of the game match, displaying a 2D map corresponding to a second 3D game scene, and stopping rendering the first 3D game scene;
displaying a character identifier of a currently controlled target virtual character in the 2D map, wherein the character identifier of the target virtual character is used for indicating the position of the target virtual character; and
when the target virtual character reaches a target landing point in the second 3D game scene, canceling display of the 2D map, and rendering and displaying the scene around the target virtual character in the second 3D game scene.
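By way of illustration only, the staged rendering flow of claim 1 may be sketched as a small state machine; the class and method names below are illustrative assumptions, not part of the claimed method:

```python
class MatchRenderer:
    """Toy model of claim 1: 3D rendering is suspended during the
    landing point selection stage, where only the 2D map is shown."""

    def __init__(self):
        # Preparation stage: the first 3D game scene is rendered.
        self.stage = "preparation"
        self.rendering_3d = True
        self.showing_2d_map = False

    def enter_landing_selection(self):
        # Stop rendering the 3D scene and display the 2D map instead.
        self.stage = "landing_selection"
        self.rendering_3d = False
        self.showing_2d_map = True

    def on_character_landed(self):
        # Landing point reached: cancel the map, resume 3D rendering
        # of the scene around the target virtual character.
        self.stage = "in_match"
        self.showing_2d_map = False
        self.rendering_3d = True
```

Because the 3D scene is not rendered while the map is displayed, the terminal avoids the cost of rendering a 3D scene that is not visible during landing point selection.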
2. The method of claim 1, wherein displaying the character identifier of the currently controlled target virtual character in the 2D map comprises:
acquiring position information of the target virtual character once every first time interval according to a logical frame rate, wherein the position information of the target virtual character is used for indicating the position of the target virtual character; and
rendering and displaying a rendering frame once every second time interval according to a rendering frame rate, wherein the rendering frame comprises the 2D map and the character identifier of the target virtual character in the 2D map, and the display position of the character identifier of the target virtual character in the 2D map is determined according to the position information of the target virtual character.
3. The method of claim 2, wherein the rendering frame rate is n times the logical frame rate, wherein n is an integer greater than 1.
4. The method of claim 3, wherein the rendering and displaying a rendering frame once every second time interval according to the rendering frame rate comprises:
rendering and displaying, according to the position information of the target virtual character acquired at the i-th time, an i-th rendering frame corresponding to the position information acquired at the i-th time, wherein i is a positive integer;
sequentially rendering and displaying n-1 rendering frames by using an interpolation algorithm according to the position information of the target virtual character acquired at the i-th time and the position information of the target virtual character acquired at the (i+1)-th time; and
rendering and displaying, according to the position information of the target virtual character acquired at the (i+1)-th time, an (i+1)-th rendering frame corresponding to the position information acquired at the (i+1)-th time.
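By way of illustration only, the scheme of claims 2 to 4 — one position sample per logic frame, n render frames per logic frame — may be sketched with linear interpolation; the claim does not specify a particular interpolation algorithm, so the linear form and the function name below are assumptions:

```python
def interpolated_positions(p_i, p_next, n):
    """Positions for the n render frames between the i-th and (i+1)-th
    logic-frame samples (n = rendering frame rate / logical frame rate).

    Frame k=0 uses the i-th sample directly; frames k=1..n-1 are the
    n-1 interpolated frames of claim 4, moving toward the (i+1)-th sample.
    """
    (x0, y0), (x1, y1) = p_i, p_next
    frames = []
    for k in range(n):
        t = k / n  # interpolation parameter in [0, 1)
        frames.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    return frames
```

With a logical frame rate of, say, 15 Hz and a rendering frame rate of 60 Hz (n = 4), the character identifier on the 2D map moves smoothly even though its position is only sampled once per logic frame.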
5. The method of claim 2, wherein the participants of the game match comprise at least one other virtual character in addition to the target virtual character; and
the acquiring position information of the target virtual character once every first time interval according to the logical frame rate comprises:
generating the position information of the target virtual character according to a control instruction corresponding to the target virtual character;
sending the position information of the target virtual character to a server; and
receiving global position information sent by the server once every first time interval according to the logical frame rate, wherein the global position information comprises the position information of the target virtual character and the position information of the other virtual characters.
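By way of illustration only, the client-server exchange of claim 5 may be sketched as follows; the message format and field names are assumptions for the sketch, and the claim does not prescribe any particular encoding:

```python
import json

def build_position_report(character_id, x, y):
    """Client side: message carrying the controlled character's position,
    generated from its control instruction (field names are illustrative)."""
    return json.dumps({"id": character_id, "x": x, "y": y})

def merge_global_positions(reports):
    """Server side: merge the per-client reports into the global position
    table that is broadcast back once per logic frame (claim 5)."""
    table = {}
    for raw in reports:
        msg = json.loads(raw)
        table[msg["id"]] = (msg["x"], msg["y"])
    return table
```

Each client thus learns the positions of all virtual characters at the logical frame rate, which is what allows the 2D map to display every participant's identifier, not just the locally controlled one.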
6. The method of claim 1, further comprising:
displaying a time identifier and/or a direction identifier of the target virtual character in the 2D map,
wherein the time identifier is used for indicating the remaining landing time of the target virtual character, and the direction identifier is used for indicating the moving direction of the target virtual character.
7. The method of claim 1, further comprising:
determining the target landing point of the target virtual character when the target virtual character is in an automatic control mode; and
controlling the target virtual character to move to the target landing point.
8. The method of claim 7, wherein the determining the target landing point of the target virtual character comprises:
determining the target landing point of the target virtual character according to a landing point selection operation in the 2D map;
or, randomly determining the target landing point of the target virtual character;
or, determining the target landing point of the target virtual character according to the moving directions of the other virtual characters in the game match.
9. A rendering display apparatus for a game picture, the apparatus comprising:
a scene rendering module, configured to render and display a first 3D game scene in a preparation stage of a game match;
a map display module, configured to display a 2D map corresponding to a second 3D game scene and stop rendering the first 3D game scene in a landing point selection stage of the game match; and
an identifier display module, configured to display a character identifier of a currently controlled target virtual character in the 2D map, wherein the character identifier of the target virtual character is used for indicating the position of the target virtual character;
wherein the scene rendering module is further configured to cancel display of the 2D map and render and display the scene around the target virtual character in the second 3D game scene when the target virtual character reaches a target landing point in the second 3D game scene.
10. The apparatus of claim 9, wherein the identifier display module comprises:
a logical frame acquiring unit, configured to acquire position information of the target virtual character once every first time interval according to a logical frame rate, wherein the position information of the target virtual character is used to indicate the position of the target virtual character; and
a rendering frame display unit, configured to render and display a rendering frame once every second time interval according to a rendering frame rate, wherein the rendering frame comprises the 2D map and the character identifier of the target virtual character in the 2D map, and the display position of the character identifier of the target virtual character in the 2D map is determined according to the position information of the target virtual character.
11. The apparatus of claim 10, wherein the rendering frame rate is n times the logical frame rate, and wherein n is an integer greater than 1.
12. The apparatus of claim 11, wherein the rendering frame display unit is configured to:
render and display, according to the position information of the target virtual character acquired at the i-th time, an i-th rendering frame corresponding to the position information acquired at the i-th time, wherein i is a positive integer;
sequentially render and display n-1 rendering frames by using an interpolation algorithm according to the position information of the target virtual character acquired at the i-th time and the position information of the target virtual character acquired at the (i+1)-th time; and
render and display, according to the position information of the target virtual character acquired at the (i+1)-th time, an (i+1)-th rendering frame corresponding to the position information acquired at the (i+1)-th time.
13. The apparatus of claim 10, wherein the participants of the game match comprise at least one other virtual character in addition to the target virtual character; and
the logical frame acquiring unit is configured to:
generate the position information of the target virtual character according to a control instruction corresponding to the target virtual character;
send the position information of the target virtual character to a server; and
receive global position information sent by the server once every first time interval according to the logical frame rate, wherein the global position information comprises the position information of the target virtual character and the position information of the other virtual characters.
14. A terminal, characterized in that it comprises a processor and a memory in which at least one instruction, at least one program, set of codes or set of instructions is stored, which is loaded and executed by the processor to implement the method according to any one of claims 1 to 8.
15. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method according to any one of claims 1 to 8.
CN201811624322.8A 2018-12-28 2018-12-28 Rendering indication method, device, terminal and the storage medium of game picture Pending CN109621413A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811624322.8A CN109621413A (en) 2018-12-28 2018-12-28 Rendering indication method, device, terminal and the storage medium of game picture


Publications (1)

Publication Number Publication Date
CN109621413A true CN109621413A (en) 2019-04-16

Family

ID=66078900

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811624322.8A Pending CN109621413A (en) 2018-12-28 2018-12-28 Rendering indication method, device, terminal and the storage medium of game picture

Country Status (1)

Country Link
CN (1) CN109621413A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1834679A1 (en) * 2006-03-15 2007-09-19 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) A video game processing apparatus, a method and a computer program product for processing a video game
CN101547308A (en) * 2008-03-25 2009-09-30 索尼株式会社 Image processing apparatus, image processing method, and program
CN103685860A (en) * 2012-09-11 2014-03-26 索尼公司 Image interpolation with improved definition performance
CN104133553A (en) * 2014-07-30 2014-11-05 小米科技有限责任公司 Method and device for showing webpage content
CN107067475A (en) * 2017-03-30 2017-08-18 北京乐动卓越科技有限公司 A kind of the scene management system and its implementation of 3D game
CN107789837A (en) * 2017-09-12 2018-03-13 网易(杭州)网络有限公司 Information processing method, device and computer-readable recording medium
CN108379832A (en) * 2018-01-29 2018-08-10 珠海金山网络游戏科技有限公司 A kind of game synchronization method and apparatus
CN108579087A (en) * 2018-04-10 2018-09-28 网易(杭州)网络有限公司 A kind of control method and device of game role


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
包子入侵SSR: "bilibili website", 28 May 2018 *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110300327A (en) * 2019-04-18 2019-10-01 深圳市腾讯网域计算机网络有限公司 A kind of game client method for analyzing performance, device, terminal and storage medium
CN110300327B (en) * 2019-04-18 2021-06-15 深圳市腾讯网域计算机网络有限公司 Game client performance analysis method, device, terminal and storage medium
CN112138377A (en) * 2019-06-28 2020-12-29 北京智明星通科技股份有限公司 Method and system for adjusting game APP rendering effect
CN110339564B (en) * 2019-08-16 2023-04-07 腾讯科技(深圳)有限公司 Virtual object display method, device, terminal and storage medium in virtual environment
CN110339564A (en) * 2019-08-16 2019-10-18 腾讯科技(深圳)有限公司 Virtual objects display methods, device, terminal and storage medium in virtual environment
US11861775B2 (en) 2019-10-17 2024-01-02 Huawei Technologies Co., Ltd. Picture rendering method, apparatus, electronic device, and storage medium
CN111068311A (en) * 2019-11-29 2020-04-28 珠海豹趣科技有限公司 Display control method and device for game scene
CN111068311B (en) * 2019-11-29 2023-06-23 珠海豹趣科技有限公司 Game scene display control method and device
CN111068309A (en) * 2019-12-04 2020-04-28 网易(杭州)网络有限公司 Display control method, device, equipment, system and medium for virtual reality game
CN111068309B (en) * 2019-12-04 2023-09-15 网易(杭州)网络有限公司 Display control method, device, equipment, system and medium for virtual reality game
CN110975285A (en) * 2019-12-06 2020-04-10 北京像素软件科技股份有限公司 Smooth knife light obtaining method and device
CN110975285B (en) * 2019-12-06 2024-03-22 北京像素软件科技股份有限公司 Smooth cutter light acquisition method and device
US11790607B2 (en) 2020-02-18 2023-10-17 Tencent Technology (Shenzhen) Company Limited Method and apparatus for displaying heat map, computer device, and readable storage medium
CN111325822A (en) * 2020-02-18 2020-06-23 腾讯科技(深圳)有限公司 Method, device and equipment for displaying hot spot diagram and readable storage medium
CN111330275B (en) * 2020-03-04 2024-03-19 网易(杭州)网络有限公司 Interaction method and device in game, storage medium and electronic equipment
CN111330275A (en) * 2020-03-04 2020-06-26 网易(杭州)网络有限公司 Interactive method and device in game, storage medium and electronic equipment
CN111589121A (en) * 2020-04-03 2020-08-28 北京冰封互娱科技有限公司 Information display method and device, storage medium and electronic device
CN112052097B (en) * 2020-10-15 2024-05-03 腾讯科技(深圳)有限公司 Virtual scene rendering resource processing method, device, equipment and storage medium
CN112052097A (en) * 2020-10-15 2020-12-08 腾讯科技(深圳)有限公司 Rendering resource processing method, device and equipment for virtual scene and storage medium
CN113379870A (en) * 2021-03-12 2021-09-10 广东虚拟现实科技有限公司 Display method and device based on virtual training scene training and storage medium
CN113192168A (en) * 2021-04-01 2021-07-30 广州三七互娱科技有限公司 Game scene rendering method and device and electronic equipment
CN113262497A (en) * 2021-05-14 2021-08-17 广州三七极耀网络科技有限公司 Virtual character rendering method, device, equipment and storage medium
CN113318441A (en) * 2021-05-26 2021-08-31 网易(杭州)网络有限公司 Game scene display control method and device, electronic equipment and storage medium
CN113244617B (en) * 2021-06-02 2023-11-28 网易(杭州)网络有限公司 Mobile data processing method and device in game and electronic equipment
CN113244617A (en) * 2021-06-02 2021-08-13 网易(杭州)网络有限公司 Mobile data processing method and device in game and electronic equipment
WO2023273131A1 (en) * 2021-06-30 2023-01-05 上海完美时空软件有限公司 Game scene generation method and apparatus, storage medium, and electronic apparatus
CN113413600A (en) * 2021-07-01 2021-09-21 网易(杭州)网络有限公司 Information processing method, information processing device, computer equipment and storage medium
WO2023065835A1 (en) * 2021-10-20 2023-04-27 北京思明启创科技有限公司 Code block interpretive execution method and apparatus, and electronic device and storage medium
CN114245177B (en) * 2021-12-17 2024-01-23 智道网联科技(北京)有限公司 Smooth display method and device of high-precision map, electronic equipment and storage medium
CN114245177A (en) * 2021-12-17 2022-03-25 智道网联科技(北京)有限公司 Smooth display method and device of high-precision map, electronic equipment and storage medium
CN114768250B (en) * 2022-04-06 2023-03-24 成都星奕网络科技有限公司 Virtual scene rendering color matching analysis management system based on image processing technology
CN114768250A (en) * 2022-04-06 2022-07-22 成都星奕网络科技有限公司 Virtual scene rendering color matching analysis management system based on image processing technology
CN114748872B (en) * 2022-06-13 2022-09-02 深圳市乐易网络股份有限公司 Game rendering updating method based on information fusion
CN114748872A (en) * 2022-06-13 2022-07-15 深圳市乐易网络股份有限公司 Game rendering updating method based on information fusion
WO2024037153A1 (en) * 2022-08-19 2024-02-22 腾讯科技(深圳)有限公司 Interface display method and information providing method based on turn-based combat, and system
WO2024164585A1 (en) * 2023-02-07 2024-08-15 腾讯科技(深圳)有限公司 Frame synchronization method, frame synchronization apparatus, electronic device and computer storage medium

Similar Documents

Publication Publication Date Title
CN109621413A (en) Rendering indication method, device, terminal and the storage medium of game picture
CN111589142B (en) Virtual object control method, device, equipment and medium
CN111282274B (en) Virtual object layout method, device, terminal and storage medium
CN111589140B (en) Virtual object control method, device, terminal and storage medium
CN110755845B (en) Virtual world picture display method, device, equipment and medium
CN111589127B (en) Control method, device and equipment of virtual role and storage medium
CN111462307A (en) Virtual image display method, device, equipment and storage medium of virtual object
CN112083848B (en) Method, device and equipment for adjusting position of control in application program and storage medium
CN111672104B (en) Virtual scene display method, device, terminal and storage medium
CN112569600B (en) Path information sending method in virtual scene, computer device and storage medium
CN112402962B (en) Signal display method, device, equipment and medium based on virtual environment
CN111672106B (en) Virtual scene display method and device, computer equipment and storage medium
CN111494937B (en) Virtual object control method, virtual object information synchronization device, virtual object information synchronization equipment and virtual object information synchronization medium
CN111672126A (en) Information display method, device, equipment and storage medium
CN112221142A (en) Control method and device of virtual prop, computer equipment and storage medium
CN112870705A (en) Method, device, equipment and medium for displaying game settlement interface
WO2023016089A1 (en) Method and apparatus for displaying prompt information, device, and storage medium
CN113101656A (en) Virtual object control method, device, terminal and storage medium
CN112691375B (en) Virtual object control method, device, terminal and storage medium
CN114404972A (en) Method, device and equipment for displaying visual field picture
CN112604274B (en) Virtual object display method, device, terminal and storage medium
CN113813606A (en) Virtual scene display method, device, terminal and storage medium
CN110585708B (en) Method, device and readable storage medium for landing from aircraft in virtual environment
CN112169321B (en) Mode determination method, device, equipment and readable storage medium
CN112156463B (en) Role display method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination