CN113821345A - Method and device for rendering movement track in game and electronic equipment - Google Patents

Method and device for rendering movement track in game and electronic equipment

Info

Publication number
CN113821345A
Authority
CN
China
Prior art keywords
track
position point
map
current frame
rendering
Prior art date
Legal status
Granted
Application number
CN202111120759.XA
Other languages
Chinese (zh)
Other versions
CN113821345B (en)
Inventor
梁小健
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111120759.XA priority Critical patent/CN113821345B/en
Publication of CN113821345A publication Critical patent/CN113821345A/en
Application granted granted Critical
Publication of CN113821345B publication Critical patent/CN113821345B/en
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention provides a method and a device for rendering a movement track in a game, and an electronic device. The method comprises the following steps: moving a first mesh into the view frustum of the game's virtual camera in response to a rendering event of the current frame; determining a newly added track corresponding to the current frame according to a second position point of the controlled virtual object in the current frame and a first position point in the previous frame; obtaining a second movement track of the controlled virtual object in the current frame according to the first movement track of the previous frame and the newly added track; caching the second movement track to a second map that is applied to a second mesh and pointed to by a second rendering object; rendering the current frame according to the second map; and replacing the first rendering object pointed to by the game's virtual camera with the second rendering object. The invention can reduce CPU memory occupation when rendering the movement track, improve the fluency of the game, and thereby improve the user's game experience.

Description

Method and device for rendering movement track in game and electronic equipment
Technical Field
The present invention relates to the field of game technologies, and in particular, to a method and an apparatus for rendering a movement trajectory in a game, and an electronic device.
Background
In some virtual scenes provided by games, it is sometimes necessary to present the movement track of a virtual character or of a player's touch point. Taking a shikigami's movement track as an example: the movement track is the path traced by the shikigami marble after it is launched in battle. The track usually persists for a certain duration, during which it can damage enemy units located on it.
In the prior art, the shikigami movement track is recorded in a dedicated Texture created solely for this purpose. The track is rendered after the ground texture of the battle scene and before the battle units, so the track texture must be composited into the battle scene in order to present the shikigami's movement track there. The presentation process may specifically include: reading all pixels of the scene Texture into CPU memory, converting the shikigami's position in the battle scene into a corresponding set of pixels, modifying the pixel values in that set, and writing the set back into the Texture. These operations must be repeated for the battle scene of every frame.
In this process, CPU occupation is high because a large number of pixels are written back each time, and because the virtual character's position updates frequently, the write-backs are frequent as well. This causes dropped frames over a period of time and degrades the fluency of the game.
Disclosure of Invention
The invention aims to provide a method and a device for rendering a movement track in a game, and an electronic device, so as to reduce CPU memory occupation when rendering the movement track, improve the fluency of the game, and thereby improve the user's game experience.
In a first aspect, an embodiment of the present invention provides a method for rendering a movement track in a game, where a terminal device provides a graphical user interface of the game, and the content displayed on the graphical user interface at least includes a game scene and a controlled virtual object in the game scene. The method includes: moving a first mesh into the view frustum of the game's virtual camera in response to a rendering event of a current frame, where the material of the first mesh uses a first map corresponding to the frame previous to the current frame, the first map caches a first movement track corresponding to the controlled virtual object at a first position point in the previous frame, and a first rendering object in the GPU of the terminal device points to the first map; determining a newly added track corresponding to the current frame according to a second position point of the controlled virtual object in the current frame and the first position point; obtaining a second movement track of the controlled virtual object in the current frame according to the first movement track and the newly added track; caching the second movement track to a second map pointed to by a second rendering object in the GPU, and rendering the current frame according to the second map, where the material of the second mesh uses the second map; and replacing the first rendering object pointed to by the virtual camera with the second rendering object.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation manner of the first aspect, where the step of determining the newly added track corresponding to the current frame according to the second position point of the controlled virtual object in the current frame and the first position point includes: inserting interpolation position points between the first position point and the second position point of the controlled virtual object in the current frame; and connecting the first position point, the interpolation position points, and the second position point to obtain the newly added track corresponding to the current frame.
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation manner of the first aspect, where the step of obtaining the second movement track of the controlled virtual object in the current frame according to the first movement track and the newly added track includes: caching each position point of the newly added track to a position point map; creating a second mesh for the game scene corresponding to the current frame, where the second mesh corresponds to the second rendering object in the GPU; creating, in the second mesh, a rectangular frame for each position point in the position point map, where the pixel points inside each rectangular frame area are the pixel points on the newly added track; and determining the second movement track of the controlled virtual object in the current frame according to the first movement track in the first map and the rectangular frames corresponding to the position points in the position point map.
With reference to the second implementation manner of the first aspect, an embodiment of the present invention provides a third possible implementation manner of the first aspect, where the step of creating a rectangular frame for each position point in the position point map in the second mesh includes: passing attribute parameters into a material through a shader, where the attribute parameters include the aspect ratio of the area where the second mesh is located, the coordinates of the position point in the map coordinate system, and the ratio of the width of the rectangular frame corresponding to the position point to the width of the second mesh; and creating the rectangular frame for the position point through the material based on the attribute parameters.
With reference to the second implementation manner of the first aspect, an embodiment of the present invention provides a fourth possible implementation manner of the first aspect, where the step of determining the second movement track of the controlled virtual object in the current frame according to the first movement track in the first map and the rectangular frames corresponding to the position points in the position point map includes: converting the coordinates of the pixel points in the position point map from coordinates in the map coordinate system to coordinates in the mesh coordinate system; converting the coordinates of those pixel points in the mesh coordinate system to coordinates within the rectangular frames, obtaining the map-conversion coordinates corresponding to the position point map; setting the pixel values of the pixel points in the position point map according to the map-conversion coordinates, obtaining the pixel values corresponding to the newly added track; and determining the pixel values of the second movement track of the controlled virtual object in the current frame according to the pixel values corresponding to the first movement track in the first map and the pixel values corresponding to the newly added track.
With reference to the fourth implementation manner of the first aspect, an embodiment of the present invention provides a fifth possible implementation manner of the first aspect, where the step of determining the pixel values of the second movement track of the controlled virtual object in the current frame according to the pixel values corresponding to the first movement track in the first map and the pixel values corresponding to the newly added track includes: modifying the pixel values of the pixel points corresponding to the first movement track in the first map based on a track gradual-change rule to obtain an optimized map; and superimposing the optimized map and the pixel values corresponding to the newly added track to obtain the pixel values of the second movement track of the controlled virtual object in the current frame.
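The track gradual-change rule above can be illustrated as a per-frame fade applied to the previous trail before the newly added track is superimposed. The sketch below is a minimal illustration; the decay factor and clearing threshold are assumptions, not values specified by the patent:

```python
def fade_trail(trail_map, decay=0.9, threshold=0.01):
    """Attenuate the previous frame's trail pixels to produce the
    'optimized map'; pixels that fall below `threshold` are cleared
    so expired portions of the track disappear."""
    for row in trail_map:
        for x in range(len(row)):
            value = row[x] * decay
            row[x] = value if value >= threshold else 0.0
    return trail_map
```

Applying this fade every frame makes older parts of the track dim gradually, matching the limited lifetime of the track described in the background section.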
With reference to the first aspect, an embodiment of the present invention provides a sixth possible implementation manner of the first aspect, where the size of the second mesh is the same as the size of the game scene of the current frame, and the step of caching the second movement track to the second map pointed to by the second rendering object in the GPU includes: obtaining, in the GPU, the second rendering object corresponding to the second mesh; and caching the second movement track to the second map pointed to by the second rendering object.
In a second aspect, an embodiment of the present invention further provides an apparatus for rendering a movement track in a game, where a terminal device provides a graphical user interface of the game, and the content displayed on the graphical user interface at least includes a game scene and a controlled virtual object in the game scene. The apparatus includes: a rendering trigger module, configured to move the first mesh into the view frustum of the game's virtual camera in response to a rendering event of the current frame, where the material of the first mesh uses a first map corresponding to the frame previous to the current frame, the first map caches a first movement track corresponding to the controlled virtual object at a first position point in the previous frame, and a first rendering object in the GPU of the terminal device points to the first map; a newly-added-track determining module, configured to determine the newly added track corresponding to the current frame according to the second position point of the controlled virtual object in the current frame and the first position point; a track obtaining module, configured to obtain the second movement track of the controlled virtual object in the current frame according to the first movement track and the newly added track; a track caching and rendering module, configured to cache the second movement track to a second map pointed to by a second rendering object in the GPU and render the current frame according to the second map, where the material of the second mesh uses the second map; and a replacement module, configured to replace the first rendering object pointed to by the virtual camera with the second rendering object.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a processor and a memory, where the memory stores computer-executable instructions that can be executed by the processor, and the processor executes the computer-executable instructions to implement the method for rendering a movement trajectory in a game.
In a fourth aspect, embodiments of the present invention also provide a computer-readable storage medium storing computer-executable instructions, which, when invoked and executed by a processor, cause the processor to implement the movement trajectory rendering method in the game described above.
The embodiments of the present invention provide a method, a device, and an electronic device for rendering a movement track in a game. By having a first rendering object point to a map that caches the movement track of the previous frame, the mesh to which that map is applied can be moved into the view frustum of the game's virtual camera when the current frame is rendered, so the virtual camera can obtain the data corresponding to the movement track in the map. A second rendering object in the GPU points to the map that caches the second movement track corresponding to the current frame, and after the current frame is rendered according to that second map, the virtual camera is switched to point to the second rendering object, so the second movement track of the current frame enters the view frustum of the virtual camera for the rendering of the next frame. This prevents the movement track of the previous frame from being cleared: when the current frame has a newly added track, the movement track of the current frame can be generated from the newly added track and the movement track of the previous frame, without regenerating a map for the already-existing track of the previous frame. The processing load of the CPU is thus reduced, the track rendering efficiency and presentation effect are improved, and the fluency of the game is ensured.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic diagram of a game scene corresponding to a graphical user interface according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a method for rendering a movement track in a game according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a game scene corresponding to another graphical user interface according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a game scene corresponding to another graphical user interface according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a game scene corresponding to another graphical user interface according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of a coordinate transformation according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of an exchange of rendering objects according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a movement track rendering device in a game according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the following embodiments, and it should be understood that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In some games, the movement track of a virtual object (such as a shikigami, a bullet, or a brush pen) sometimes needs to be presented in the game scene as frame animation, so the track must be drawn based on the virtual object's current position in each frame. To improve the drawing efficiency and the presentation effect of the movement track, embodiments of the present invention provide a method, an apparatus, and an electronic device for rendering a movement track in a game. They can be applied to computers, mobile phones, and other devices capable of human-computer interaction, and are particularly applicable to computer-game or mobile-game scenarios, reducing CPU memory occupation when rendering the virtual object's movement track, improving the fluency of the game, and thereby improving the user's game experience.
An embodiment of the present invention provides a method for rendering a movement track in a game, in which a graphical user interface of the game is provided through a terminal device, and the content displayed on the graphical user interface at least includes a game scene and a controlled virtual object in the game scene. Referring to Fig. 1, a schematic diagram of a game scene corresponding to a graphical user interface: the controlled virtual object is an object controlled by the player and may be set in the game scene. The player can control the controlled virtual object to perform a moving operation. For example, in Fig. 1 the controlled virtual object is located at position point A; the player may control it to move from position point A (the start position point) to position point D (the end position point) via position point B and position point C in order. The movement track of the controlled virtual object is represented by the dashed line segments connecting position points A, B, C, and D in order, and the dashed boxes represent the controlled virtual object moving to position point D via position points B and C in order.
When the controlled virtual object performs a moving operation, in order to visually display its movement track on the screen of the terminal device, the current movement track must be determined from the display position of the controlled virtual object in each frame. Referring to Fig. 2, a flowchart of a method for rendering a movement track in a game, the execution subject of the method is described taking the terminal device as an example. In the initialization stage, two rendering objects can be configured for the GPU of the terminal device: a first rendering object is configured to point to a first map, which stores the movement track corresponding to the previous frame, and the material of a first mesh uses the first map; a second rendering object is configured to point to a second map, which stores the movement track corresponding to the current frame, and the material of a second mesh uses the second map. In the initialization stage, both the first map and the second map are empty. On this basis, the method for rendering a movement track in a game provided by this embodiment may include the following steps:
step S202, responding to the rendering event of the current frame, moving the first grid into the visual cone range of the virtual camera of the game; the material of the first grid uses a first map corresponding to a previous frame of a current frame, the first map caches a first moving track corresponding to a first position point of the controlled virtual object in the previous frame, and a first rendering object in a GPU of the terminal device points to the first map.
The rendering event above may be understood as an event in which the previous frame has been presented and the current frame needs to be presented on the graphical user interface, or an event in which the virtual object moves to a new position. Continuing with the game scene of Fig. 1, and referring to Fig. 3, a schematic diagram of a game scene corresponding to another graphical user interface: position point B represents the first position point; the dashed line segment connecting position point A to position point B represents the first movement track; the dashed box represents the controlled virtual object having moved from position point A to position point B; the game-scene picture corresponding to the first movement track represents the previous frame; and a cut-out of that picture represents the first map. When the controlled virtual object moves from position point A to position point B, the GPU of the terminal device renders the first movement track in the game scene and caches it using the first map. At different times, the first movement track and the first map change correspondingly as the position point of the controlled virtual object changes.
Step S204: determine the newly added track corresponding to the current frame according to the second position point of the controlled virtual object in the current frame and the first position point.
The newly added track of the controlled virtual object in the current frame is the path from the first position point to the second position point. The path can be obtained by connecting the first position point to the second position point; the connection may be a straight line, an arc, a curve, or the like, and the specific connection mode can be set flexibly according to different game presentation requirements.
Step S206: obtain the second movement track of the controlled virtual object in the current frame according to the first movement track and the newly added track.
In order to distinguish it from other objects in the game scene, the movement track is usually represented by specific pixel values, and the track image corresponding to those pixel values is the map of the track.
Continuing with the example of Fig. 3: the controlled virtual object moves from position point B to position point C and then continues moving from position point C toward position point D (the end position point). Position point C represents the second position point; the dashed line segment connecting position point B to position point C represents the newly added track; and after the controlled virtual object moves from position point B to position point C, the track between position point A and position point C is the second movement track.
Fig. 3 is only an example; in practical applications, position point C may also be located between position point A and position point B, which the present invention does not limit.
Specifically, all pixel points of the newly added track can be directly superimposed on the first map by the GPU to obtain a superimposed map, and the values of the pixel points in the superimposed map are modified correspondingly, thereby obtaining the second movement track.
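The superimposition can be illustrated as a per-pixel blend of the new-segment pixels over the previous map. The per-pixel maximum used below is an assumption for illustration only, since the patent does not fix the blend rule:

```python
def composite(first_map, new_track_map):
    """Overlay newly added track pixels onto the previous trail map,
    yielding the pixel values of the second movement track."""
    return [
        [max(a, b) for a, b in zip(row_prev, row_new)]
        for row_prev, row_new in zip(first_map, new_track_map)
    ]
```

Because only the new segment's pixels carry fresh values, the previous track survives the blend untouched, which is what lets the scheme avoid regenerating the old track on the CPU.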
Step S208: cache the second movement track to a second map pointed to by a second rendering object in the GPU, and render the current frame according to the second map; the material of the second mesh uses the second map.
Before the second movement track is cached in the second map, the second map may be empty or may still cache a movement track corresponding to an earlier frame; in the latter case, the second movement track can be cached in the second map by overwriting the original content.
Step S210: replace the first rendering object pointed to by the virtual camera with the second rendering object.
Because the game's virtual camera does not move, modifying the rendering object pointed to by the virtual camera allows different meshes to be presented within its view frustum, and the maps corresponding to the meshes in the view frustum can then be obtained through the virtual camera.
After the rendering of the current frame is completed, the virtual camera points to the second rendering object, so the movement track of the current frame is retained in the GPU before the next frame is rendered. When the rendering event of the next frame arrives, the second mesh can be moved into the view frustum of the virtual camera, and the second movement track in the second map can be obtained for determining the movement track of the next frame.
Continuing with Fig. 3, the dashed line segments connecting position points A, B, and C in order represent the second movement track of the controlled virtual object, and the second map caches the second movement track of the game scene (which includes the first movement track together with the newly added track). In this embodiment, the first rendering object pointed to by the virtual camera is replaced with the second rendering object, so the second mesh can be moved into the view frustum of the virtual camera when the next frame is rendered. Since the material of the second mesh uses the second map, the third movement track corresponding to the next frame can be determined directly from the second movement track in the second map, without the CPU regenerating a map for the second movement track, which reduces the CPU's processing tasks.
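The per-frame flow of steps S202 to S210 amounts to a double-buffering (ping-pong) scheme over two rendering objects. The `RenderTarget` and `TrailRenderer` classes below are hypothetical names introduced for illustration; the patent itself only speaks of rendering objects, maps, and meshes:

```python
class RenderTarget:
    """Hypothetical GPU rendering object pointing at a trail map (texture)."""
    def __init__(self, width, height):
        # In the initialization stage both maps are empty (no trail pixels).
        self.texture = [[0.0] * width for _ in range(height)]

class TrailRenderer:
    """Two rendering objects: one map holds the previous frame's trail,
    the other receives the current frame's trail. After each frame the
    camera is switched to the freshly written one (step S210)."""
    def __init__(self, width, height):
        self.previous = RenderTarget(width, height)  # first rendering object
        self.current = RenderTarget(width, height)   # second rendering object
        self.camera_target = self.previous           # what the camera samples

    def end_frame(self):
        # Swap roles: the current frame's trail becomes the next frame's
        # "previous" trail, so it is never cleared or regenerated.
        self.previous, self.current = self.current, self.previous
        self.camera_target = self.previous
```

`end_frame` corresponds to replacing the rendering object the virtual camera points to; in the patent's scheme the trail data never leaves the GPU between frames.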
In the method of this embodiment of the present invention, a first rendering object points to a map that caches the movement track of the previous frame, so the mesh to which that map is applied can be moved into the view frustum of the game's virtual camera when the current frame is rendered, enabling the virtual camera to obtain the data corresponding to the movement track in the map. A second rendering object in the GPU points to the map that caches the second movement track corresponding to the current frame, and by switching the virtual camera to point to the second rendering object, the second movement track of the current frame enters the view frustum of the virtual camera when the next frame is rendered. This prevents the movement track of the previous frame from being cleared: when the current frame has a newly added track, the movement track of the current frame can be generated from the newly added track and the movement track of the previous frame, without regenerating a map for the already-existing track of the previous frame. The processing load of the CPU is thus reduced, the track rendering efficiency and presentation effect are improved, and the fluency of the game is ensured.
In order to reduce distortion of the map during rendering, the size of the second mesh is the same as the size of the game scene of the current frame, and caching the second movement track to the second map pointed to by the second rendering object in the GPU in step S208 may proceed as follows: (1) obtain, in the GPU, the second rendering object corresponding to the second mesh; (2) cache the second movement track to the second map pointed to by the second rendering object. As a possible implementation, the GPU of the terminal device may generate a Mesh (also referred to as a mesh patch) having the same size as the game scene, use this Mesh as the second mesh, configure its material with the second map pointed to by the second rendering object, and then cache the second movement track on the second map.
In order to further improve the rendering effect, step S204 (i.e. determining the newly added track corresponding to the current frame according to the second position point of the controlled virtual target in the current frame and the first position point) can be optimized as follows:
(1) insert one or more interpolation position points between the first position point and the second position point of the controlled virtual target in the current frame;
(2) connect the first position point, the interpolation position points, and the second position point to obtain the newly added track corresponding to the current frame.
Referring to fig. 4, another schematic diagram of a game scene corresponding to a graphical user interface is shown, where the first position point is represented by position point B, the second position point by position point C, position point E is generated between B and C by interpolation, and the newly added track is the dashed line segment formed by connecting position point B, position point E, and position point C in sequence. Of course, a plurality of position points may be inserted between the first position point and the second position point so that the moving track matches the expected visual effect.
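The interpolation step above can be sketched as follows. The linear scheme and the number of inserted points are assumptions for illustration; the patent only requires that interpolation points be generated between the two position points.

```python
def interpolate_track(first_pt, second_pt, n_insert=1):
    """Return [first_pt, interpolated points..., second_pt]."""
    (x0, y0), (x1, y1) = first_pt, second_pt
    pts = [first_pt]
    for i in range(1, n_insert + 1):
        t = i / (n_insert + 1)                        # even spacing along B→C
        pts.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    pts.append(second_pt)
    return pts

# Point E is generated midway between B and C, as in fig. 4.
B, C = (0.0, 0.0), (4.0, 2.0)
track = interpolate_track(B, C, n_insert=1)
```

Increasing `n_insert` inserts more position points between B and C, which is how the track can be made to match the expected visual effect.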
Specifically, when the moving track of the current frame is painted, all position points to be painted in the current frame (including the interpolation position points newly added between the current position point of the controlled virtual target and the track end point of the previous frame) are converted from coordinate values in the world coordinate system to uv values in the map coordinate system, giving the uv position point corresponding to each position point, and the uv position points are then colored in the pixel stage of the shader. In this way, only the position points on the newly added track undergo coordinate conversion when the current frame is painted; the CPU does not need to re-convert the position points of the existing track, which reduces the processing amount of the CPU.
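The world-to-uv conversion applied to the new position points can be sketched as below. The scene bounds used here are assumed values; the essential point is that each frame converts only the newly added points.

```python
def world_to_uv(pos, scene_min, scene_max):
    """Map a world-space (x, z) point into the [0, 1] x [0, 1] uv range."""
    u = (pos[0] - scene_min[0]) / (scene_max[0] - scene_min[0])
    v = (pos[1] - scene_min[1]) / (scene_max[1] - scene_min[1])
    return (u, v)

# Only the current frame's new track points are converted; the previous
# frame's track already lives in the cached map and is never re-converted.
new_points = [(25.0, 50.0), (30.0, 55.0)]
uv_points = [world_to_uv(p, (0.0, 0.0), (100.0, 100.0)) for p in new_points]
```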
In order to further improve the accuracy of the obtained track, the specific area of the track may be determined using a grid model. On this basis, the step of acquiring the second moving track of the controlled virtual target in the current frame according to the first moving track and the newly added track may adopt the following operations:
step 1, caching each position point in the newly added track to a position point mapping;
step 2, establishing a second grid for the game scene corresponding to the current frame; the second grid corresponds to a second render object in the GPU.
Step 3, creating a rectangular frame in the second grid for each position point in the position point map; the pixel points inside each rectangular frame area are the pixel points on the newly added track. For example: attribute parameters are passed into a material ball through the shader, the attribute parameters including the aspect ratio of the area where the second grid is located, the coordinates of the position point in the map coordinate system, and the ratio of the width of the rectangular frame corresponding to the position point to the width of the second grid; a rectangular frame is then created for the position point from these attribute parameters via the material ball.
Referring to fig. 5, a schematic diagram of a game scene corresponding to another graphical user interface is shown, in which position point B, position point E, and position point C form the position point map. In the grid corresponding to the game scene, a first rectangular frame centered on position point B, a second rectangular frame centered on position point E, and a third rectangular frame centered on position point C are created; the pixel points inside all rectangular frame areas are the pixel points on the newly added track.
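Deriving each point's rectangular frame from the two material-ball parameters named above (scaleW and aspect) can be sketched as follows. The specific values and the convention that the frame's height in uv space is scaleW scaled by the grid aspect ratio are assumptions for illustration.

```python
def rect_for_point(p, scaleW, aspect):
    """Return (u_min, v_min, u_max, v_max) of the frame centered on uv point p.

    scaleW: ratio of the frame's width to the grid's width (from the material ball)
    aspect: aspect ratio of the area where the grid is located
    """
    half_w = scaleW / 2.0
    half_h = (scaleW * aspect) / 2.0  # assumed: frame height scales with aspect
    return (p[0] - half_w, p[1] - half_h, p[0] + half_w, p[1] + half_h)

# Frames centered on points B, E, C as in fig. 5 (illustrative uv coordinates).
rects = [rect_for_point(p, scaleW=0.1, aspect=1.0)
         for p in [(0.2, 0.2), (0.3, 0.25), (0.4, 0.3)]]
```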
Step 4, determining the second moving track of the controlled virtual target in the current frame according to the first moving track in the first map and the rectangular frame corresponding to each position point in the position point map.
As a possible implementation, step 4 can be implemented by the following steps 41 to 44:
step 41, converting the pixel points in the position point mapping from the coordinates under the mapping coordinate system to the coordinates under the grid coordinate system;
42, converting the coordinates of the pixel points of the position point mapping in the grid coordinate system into coordinates in a rectangular frame to obtain mapping conversion coordinates corresponding to the position point mapping;
Based on steps 41 and 42 above, the following formulas may be adopted for the coordinate transformation. Suppose the uv coordinate of pixel point U in the position point map is (u1, v1), the uv coordinate of the drawing position point P is (p1, p2), the ratio of the width of the rectangular frame to the width of the grid is scaleW, and the aspect ratio of the grid of the game scene is aspect. The coordinates of the pixel point within the rectangular frame can then be obtained as follows:
The coordinates of pixel point U in the rectangular frame are (newU, newV), where newU = (u1 - p1) / scaleW + 0.5 and newV = (v1 - p2) / (scaleW × aspect) + 0.5.
Referring to the coordinate transformation schematic shown in fig. 6: fig. 6a shows the grid coordinate system corresponding to the game scene, where the grid containing the game scene is the rectangular region defined by the two points (0,1) and (1,0). Fig. 6b marks the drawing position point P, i.e. the uv coordinate values (p1, p2) of a position point on the newly added track. Fig. 6c schematically shows the uv coordinates (u1, v1) of pixel point U of the position point map in the grid coordinate system; the rectangular frame corresponding to drawing position point P is determined based on the following parameters passed in by the material ball: the aspect ratio aspect of the current grid, the uv coordinate values of drawing position point P, and the ratio scaleW of the width of the rectangular frame of P to the width of the grid.
In fig. 6c, the attribute values passed into the material ball can be represented by a vector3[2] attribute A (two (x, y, z) structure values) and a float attribute B defined by the shader. The first vector3 of attribute A carries the color value of the drawing position point (the red, green, and blue channels); the second vector3 carries the uv coordinate values of the drawing position point and the ratio of the width of the rectangular frame corresponding to that point to the width of the grid. Attribute B carries the aspect ratio aspect of the grid. These attributes can be passed in by setting uniform attributes on the material ball.
Fig. 6d illustrates transforming pixel point U from its coordinate values (u1, v1) in the grid uv coordinate system to the coordinate values (newU, newV) in the u'v' coordinate system corresponding to the rectangular frame of drawing position point P. Whether pixel point U lies inside the rectangular frame can then be judged from (newU, newV) and the coordinate values of the vertices of the rectangular frame, and the pixel value of pixel point U determined accordingly.
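The transform of steps 41-42 and the inside-the-frame test can be sketched directly from the scaleW and aspect parameters. The multiplication `scaleW * aspect` in the denominator is this sketch's reading of the height scaling; the input values are illustrative.

```python
def to_rect_coords(u1, v1, p1, p2, scaleW, aspect):
    """Map a pixel's grid uv coordinate into the local u'v' space of the
    rectangular frame centered on drawing position point P = (p1, p2)."""
    newU = (u1 - p1) / scaleW + 0.5
    newV = (v1 - p2) / (scaleW * aspect) + 0.5
    return newU, newV

def in_rect(newU, newV):
    # Inside the frame, the transformed coordinate lies in [0, 1] x [0, 1].
    return 0.0 <= newU <= 1.0 and 0.0 <= newV <= 1.0

# A pixel exactly at P maps to the frame's center (0.5, 0.5);
# a pixel far from P falls outside the unit square.
center = to_rect_coords(0.4, 0.6, 0.4, 0.6, scaleW=0.1, aspect=2.0)
far_pt = to_rect_coords(0.9, 0.6, 0.4, 0.6, scaleW=0.1, aspect=2.0)
```

This is the same check the pixel stage of the shader performs to decide whether the current pixel belongs to a drawing position point's rectangle.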
Step 43, setting the pixel values of the pixel points in the position point map according to the map transformation coordinates to obtain the pixel values corresponding to the newly added track.
The pixel values in the position point map are set by painting rectangular areas. Inside the rectangular frame of each position point, the painted shape can be controlled by a texture map supplied by the effect: each rectangular frame marks a position point that needs painting, the position point is rendered to a rectangular block of pixels on the screen, and the area of the block that actually receives color is controlled by a texture map with a transparency channel whose opaque part defines the shape. For example, a circular texture map may be used: if pixel point U lies inside the rectangular frame, the color value obtained by sampling corresponds to the non-edge area of the circular texture; if pixel point U lies outside, the sampled color corresponds to the edge area. The edge of the circular texture therefore keeps a transparent margin of a certain size, so that the color of the edge area does not affect the previously sampled color value, and the pixel value of the moving track is assigned only to pixel points inside the rectangular frame.
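The circular-brush behavior above can be sketched with a procedural circle standing in for the texture map. The margin width is an assumed value; the key property is that the transparent edge region never contributes color.

```python
import math

MARGIN = 0.1  # transparent border kept at the texture edge (assumed width)

def sample_circle_alpha(newU, newV):
    """Alpha of a circular brush texture sampled at frame-local (u', v')."""
    d = math.hypot(newU - 0.5, newV - 0.5)
    return 1.0 if d <= 0.5 - MARGIN else 0.0  # edge region stays transparent

def splat_pixel(newU, newV, track_color):
    # Only pixels under the opaque part of the circle receive the track
    # color; pixels in the transparent margin are left unchanged (None).
    alpha = sample_circle_alpha(newU, newV)
    return track_color if alpha > 0.0 else None
```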
Step 44, determining the pixel value of the second moving track of the controlled virtual target in the current frame according to the pixel value corresponding to the first moving track in the first map and the pixel value corresponding to the newly added track.
After the pixel values of the newly added track are obtained, the second moving track of the controlled virtual target in the current frame and its corresponding pixel values can be generated from the cached first map and the pixel values of the newly added track. Because the moving track changes over time, each pixel point on the track can fade out gradually to achieve a dynamic track display. On this basis, step 44 can be performed as follows: modify the pixel values of the pixel points corresponding to the first moving track in the first map according to a track gradual-change rule to obtain an optimized map; then superimpose the optimized map and the pixel values corresponding to the newly added track to obtain the pixel values of the second moving track of the controlled virtual target in the current frame.
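Step 44 with the gradual-change rule can be sketched as a fade-then-overlay on single-channel maps. The fade factor is an assumed constant; the patent only states that the track fades gradually.

```python
FADE = 0.9  # per-frame attenuation of the old track (assumption)

def second_track_map(first_map, new_track_map):
    # optimized map = faded first moving track; the result superimposes
    # the newly added track on it (brightest value wins per pixel).
    return [[max(a * FADE, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(first_map, new_track_map)]

first_map = [[0.0] * 4 for _ in range(4)]
first_map[1][1] = 1.0          # a pixel on the first moving track
new_track = [[0.0] * 4 for _ in range(4)]
new_track[2][2] = 1.0          # a pixel on the newly added track
out = second_track_map(first_map, new_track)
```

Repeating this every frame makes old track pixels decay toward zero while fresh pixels start at full intensity, producing the fading-trail effect.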
In a detailed implementation, the moving track can be drawn by a shader that uses two texture maps: one is the map painted in the previous frame, i.e. the first map, and the other is the map containing the newly added position points, i.e. the second map. In the uv coordinate system corresponding to the mesh of the game scene, besides the uv (map-coordinate) value of each vertex in the map, the position point currently being drawn is also converted to its corresponding uv value in the mesh and passed into the shader through the material ball's attribute settings. In the pixel processing stage, the shader first samples with the uv value output by the vertex stage (i.e. the uv value interpolated from the vertex uv values), then transforms that uv value again relative to the converted drawing position point and samples with the transformed value; the purpose of this second transform is to determine whether the pixel currently being processed lies inside the rectangular area of the drawing position point, from which the pixel value of the position point is determined.
This embodiment considers that repainting every previously recorded track position each time the moving-track map is updated would place a heavy burden on the CPU. Therefore, during scene rendering, the color values in the previous frame's map are not cleared; instead, whenever a new track appears, painting continues on top of the existing content. This is implemented with two RenderTargets (rendering objects): when the scene is rendered to RenderTarget2 (rendering object 2), the map of RenderTarget1 (rendering object 1) is used, assigned via the texture attribute defined in the material ball used by a grid of the same proportions as the scene, and the two RenderTargets are exchanged in the next rendering pass. For a better understanding of the exchange, refer to the rendering-object exchange schematic shown in fig. 7: the left side is the processing of the first frame and the right side the processing of the second frame, where the first frame's processing can be understood as the first painting and the second frame's as the second painting; the patches in the figure are also called meshes.
Before the controlled virtual target starts to move, the CPU of the terminal device generates mesh patch 1 (i.e. the first grid) and mesh patch 2 (i.e. the second grid) to be painted, and creates an empty first rendering object RenderTarget1 and an empty second rendering object RenderTarget2. The material of patch 1 uses the first map and the material of patch 2 uses the second map; when the map pixels used by a patch's material change, the display of the patch changes accordingly. While the controlled virtual target moves, when the GPU of the terminal device renders the controlled virtual target's first position point in the first frame, patch 1 is outside the camera's visible range (i.e. the camera view frustum), so the first map corresponding to patch 1 is not rendered and waits to be used in the next frame; patch 2 is within the camera's visible range, patch 2 corresponds to the second map, and the moving track cached in the second map is the cached track of the previous frame (i.e. the frame before the first frame). The GPU of the terminal device combines the cached track of the previous frame (an empty track in this case) with the newly added track of the first frame into a first track and caches it to the first map, which corresponds to patch 1. After the first frame is rendered, the rendering object pointed to by the camera (determined by the second rendering object passed to the scene's render function) is switched from the first rendering object to the second rendering object, and the positions of patch 1 and patch 2 are interchanged: patch 1 enters the camera's visible range and patch 2 moves out of it. When the GPU of the terminal device renders the second position point in the second frame, the first moving track cached in the first map can therefore be used in the rendering of the second frame, because patch 1 is within the camera's visible range. By analogy, the moving track of each frame is rendered on the basis of the previous frame's moving track, which simplifies the processing amount of the CPU.
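The two-RenderTarget exchange described above is a ping-pong scheme, which can be modeled minimally as follows: each frame reads the previous frame's map, writes old-plus-new track into the other map, then swaps, so the old track is never cleared or redrawn by the CPU. Track data is modeled as a set of points for illustration; the class and method names are assumptions.

```python
class TrackRenderer:
    def __init__(self):
        self.read_target = set()   # RenderTarget1: holds the previous frame's track
        self.write_target = set()  # RenderTarget2: receives the current frame's output

    def render_frame(self, new_track_points):
        # Write target = previous track (read target) + this frame's new points;
        # the previous track is reused, never repainted from scratch.
        self.write_target = self.read_target | set(new_track_points)
        # Exchange the two rendering objects: what was just written becomes
        # the next frame's input.
        self.read_target, self.write_target = self.write_target, self.read_target
        return self.read_target  # the track shown this frame

r = TrackRenderer()
frame1 = r.render_frame([(0, 0), (1, 0)])  # first painting
frame2 = r.render_frame([(2, 0)])          # second painting, keeps frame 1's track
```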
Based on the foregoing method embodiment, an embodiment of the present invention further provides a device for rendering a movement trajectory in a game, where the device is applied to the terminal device, and provides a graphical user interface of the game through the terminal device, where content displayed on the graphical user interface at least includes a game scene and a controlled virtual object in the game scene, as shown in fig. 8, the device includes:
a rendering triggering module 81, configured to move the first mesh into a view cone range of a virtual camera of the game in response to a rendering event of the current frame; the material of the first grid uses a first map corresponding to a previous frame of the current frame, the first map caches a first movement track corresponding to the controlled virtual target at a first position point in the previous frame, and a first rendering object in a GPU of the terminal device points to the first map.
And a newly added track determining module 82, configured to determine a newly added track corresponding to the current frame according to the second position point and the first position point of the controlled virtual target in the current frame.
And a track obtaining module 83, configured to obtain a second moving track of the controlled virtual target in the current frame according to the first moving track and the newly added track.
A track caching and rendering module 84, configured to cache the second moving track to a second map pointed by a second rendering object in the GPU, and render the current frame according to the second map; and the material of the second grid uses the second mapping.
A replacement module 85 for replacing the first rendered object pointed to by the virtual camera with the second rendered object.
In the in-game moving track rendering device provided by this embodiment of the invention, the first rendering object points to the map in which the previous frame's moving track is cached, so that when the current frame is rendered, the grid using that map can be moved into the view frustum of the game's virtual camera and the virtual camera can acquire the data corresponding to the moving track in the map. The second rendering object in the GPU points to the map in which the second moving track of the current frame is cached; by switching the virtual camera to point to the second rendering object, the second moving track of the current frame enters the view frustum of the virtual camera when the next frame is rendered. This avoids clearing the previous frame's moving track: when the current frame has a newly added track, the current frame's moving track can be generated from the newly added track and the previous frame's moving track, without regenerating the map of the existing track of the previous frame. The processing load of the CPU is thereby reduced, track rendering efficiency and visual quality are improved, and smooth game running is ensured.
The new track determining module 82 is further configured to: adding an interpolation position point to a second position point of the controlled virtual target in the current frame from the first position point; and connecting the first position point, the interpolation position point and the second position point to obtain a newly added track corresponding to the current frame.
The trajectory acquisition module 83 is further configured to: caching each position point in the newly added track to a position point mapping; creating a second grid for the game scene corresponding to the current frame; wherein the second mesh corresponds to a second render object in the GPU; respectively creating a rectangular frame for each position point in the position point mapping in the second grid; wherein, the pixel points in the rectangular frame area are the pixel points on the newly added track; and determining a second moving track of the controlled virtual target in the current frame according to the first moving track in the first map and a rectangular frame corresponding to the position point in the position point map.
The trajectory acquisition module 83 is further configured to: transmitting the attribute parameters into a material ball through a shader; the attribute parameters comprise the aspect ratio of the area where the second grid is located, the coordinates of the position points in the chartlet coordinate system, and the ratio of the width of the rectangular frame corresponding to the position points to the width of the second grid; and creating a rectangular frame for the position point based on the attribute parameters through the material ball.
The trajectory acquisition module 83 is further configured to: converting the coordinates of the pixel points in the position point mapping from the coordinates in the mapping coordinate system to the coordinates in the grid coordinate system; converting the coordinates of the pixel points of the position point mapping in the grid coordinate system into coordinates in a rectangular frame to obtain mapping conversion coordinates corresponding to the position point mapping; setting pixel values of pixel points in the position point mapping according to the mapping transformation coordinates to obtain pixel values corresponding to the newly added track; and determining the pixel value of a second moving track of the controlled virtual target in the current frame according to the pixel value corresponding to the first moving track in the first map and the pixel value corresponding to the newly added track.
The trajectory acquisition module 83 is further configured to: modifying the pixel value of a pixel point corresponding to the first moving track in the first map based on a track gradual change rule to obtain an optimized map; and overlapping the pixel values corresponding to the optimized mapping and the newly added track to obtain the pixel value of a second moving track of the controlled virtual target in the current frame.
The size of the second grid is the same as the size of the game scene of the current frame; accordingly, the trace caching and rendering module 84 is further configured to: acquiring a second rendering object corresponding to the second grid in the GPU; and caching the second moving track to a second map pointed by the second rendering object.
An embodiment of the present invention further provides an electronic device, as shown in fig. 9, which is a schematic structural diagram of the electronic device, where the electronic device 100 includes a processor 91 and a memory 90, the memory 90 stores computer-executable instructions capable of being executed by the processor 91, and the processor 91 executes the computer-executable instructions to implement the method for rendering a movement track in a game.
In the embodiment shown in fig. 9, the electronic device further comprises a bus 92 and a communication interface 93, wherein the processor 91, the communication interface 93 and the memory 90 are connected by the bus 92.
The memory 90 may include a high-speed Random Access Memory (RAM) and may also include non-volatile memory, such as at least one disk memory. The communication connection between this system's network element and at least one other network element is realized through at least one communication interface 93 (which may be wired or wireless), and may use the internet, a wide area network, a local area network, a metropolitan area network, and the like. The bus 92 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 92 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in fig. 9, but this does not mean there is only one bus or one type of bus.
The processor 91 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the in-game moving track rendering method may be completed by integrated logic circuits of hardware in the processor 91 or by instructions in the form of software. The processor 91 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the in-game moving track rendering method disclosed in the embodiments of the present invention may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory 90; the processor 91 reads the information in the memory 90 and completes the steps of the moving track rendering method of the foregoing embodiments in combination with its hardware.
The embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium stores computer-executable instructions, and when the computer-executable instructions are called and executed by a processor, the computer-executable instructions cause the processor to implement the method for rendering a movement track in a game, where specific implementation may refer to the foregoing method embodiment, and details are not repeated herein.
The method and the apparatus for rendering a movement trajectory in a game and the computer program product of an electronic device provided in the embodiments of the present invention include a computer-readable storage medium storing program codes, where instructions included in the program codes may be used to execute the method for rendering a movement trajectory in a game described in the foregoing method embodiments, and specific implementation may refer to the method embodiments and will not be described herein again.
Unless specifically stated otherwise, the relative steps, numerical expressions, and values of the components and steps set forth in these embodiments do not limit the scope of the present invention.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method for rendering the movement track in the game according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above embodiments are only specific embodiments of the present invention, used to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that anyone familiar with the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions of some of their technical features within the technical scope disclosed by the present invention; such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be covered by it. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A method for rendering a movement track in a game, wherein a terminal device provides a graphical user interface of the game, the content displayed by the graphical user interface at least comprises a game scene and a controlled virtual object in the game scene, and the method comprises the following steps:
moving the first mesh into a view frustum range of a virtual camera of the game in response to a rendering event of a current frame; the material of the first grid uses a first map corresponding to a previous frame of the current frame, the first map caches a first movement track corresponding to the controlled virtual target at a first position point in the previous frame, and a first rendering object in a GPU of the terminal device points to the first map;
determining a newly added track corresponding to the current frame according to a second position point and the first position point of the controlled virtual target in the current frame;
acquiring a second moving track of the controlled virtual target in the current frame according to the first moving track and the newly added track;
caching the second moving track to a second map pointed by a second rendering object in the GPU, and rendering the current frame according to the second map; wherein the material of the second grid uses the second map;
replacing the first rendered object pointed to by the virtual camera with the second rendered object.
2. The method of claim 1, wherein the step of determining the newly added trajectory corresponding to the current frame according to the second position point and the first position point of the controlled virtual target in the current frame comprises:
adding an interpolation position point to a second position point of the controlled virtual target in the current frame from the first position point;
and connecting the first position point, the interpolation position point and the second position point to obtain a newly added track corresponding to the current frame.
3. The method according to claim 1, wherein the step of obtaining a second moving trajectory of the controlled virtual target in the current frame according to the first moving trajectory and the newly added trajectory comprises:
caching each position point in the newly added track to a position point mapping;
creating a second grid for the game scene corresponding to the current frame; wherein the second mesh corresponds to a second render object in the GPU;
respectively creating a rectangular frame for each position point in the position point mapping in the second grid; wherein, the pixel points in the rectangular frame area are the pixel points on the newly added track;
and determining a second moving track of the controlled virtual target in the current frame according to the first moving track in the first map and a rectangular frame corresponding to the position point in the position point map.
4. The method of claim 3, wherein creating a rectangular frame in the second mesh for each position point in the position point map comprises:
passing attribute parameters into a material through a shader, wherein the attribute parameters comprise an aspect ratio of the region where the second mesh is located, coordinates of the position point in a map coordinate system, and a ratio of the width of the rectangular frame corresponding to the position point to the width of the second mesh; and
creating the rectangular frame for the position point based on the attribute parameters through the material.
5. The method of claim 3, wherein determining the second movement track of the controlled virtual object in the current frame according to the first movement track in the first map and the rectangular frames corresponding to the position points in the position point map comprises:
converting coordinates of pixel points in the position point map from coordinates in the map coordinate system to coordinates in a mesh coordinate system;
converting the coordinates of the pixel points of the position point map in the mesh coordinate system into coordinates within the rectangular frame to obtain map transformation coordinates corresponding to the position point map;
setting pixel values of the pixel points in the position point map according to the map transformation coordinates to obtain pixel values corresponding to the newly added track; and
determining pixel values of the second movement track of the controlled virtual object in the current frame according to pixel values corresponding to the first movement track in the first map and the pixel values corresponding to the newly added track.
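The chain of coordinate conversions in claim 5 (map space, then mesh space, then rectangle-local space) can be sketched as two simple changes of frame. The function names and the convention that map (UV) coordinates lie in [0, 1] are assumptions for illustration:

```python
def map_to_mesh(u, v, mesh_w, mesh_h):
    # Map (UV) coordinates in [0, 1] scaled into mesh space.
    return u * mesh_w, v * mesh_h

def mesh_to_rect(x, y, rect_x, rect_y, rect_w, rect_h):
    # Mesh-space coordinates expressed relative to a rectangle,
    # normalized so (0, 0)..(1, 1) spans the rectangle's interior.
    return (x - rect_x) / rect_w, (y - rect_y) / rect_h

# A pixel at UV (0.5, 0.5) on a 100x100 mesh, tested against a
# rectangle at (40, 40) of size 20x20: it lands at the centre,
# i.e. inside the rectangle, so it belongs to the newly added track.
x, y = map_to_mesh(0.5, 0.5, 100, 100)
print(mesh_to_rect(x, y, 40, 40, 20, 20))  # (0.5, 0.5)
```

A pixel whose rectangle-local coordinates fall inside [0, 1] x [0, 1] lies within some position point's rectangular frame, and its pixel value is then set as part of the newly added track.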
6. The method of claim 5, wherein determining the pixel values of the second movement track of the controlled virtual object in the current frame according to the pixel values corresponding to the first movement track in the first map and the pixel values corresponding to the newly added track comprises:
modifying pixel values of pixel points corresponding to the first movement track in the first map based on a track gradual-change rule to obtain an optimized map; and
superimposing the optimized map with the pixel values corresponding to the newly added track to obtain the pixel values of the second movement track of the controlled virtual object in the current frame.
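Claim 6's gradual-change rule can be sketched as a per-frame decay of the previous trail before the new segment is superimposed, so older parts of the track fade out over time. The multiplicative decay factor is an assumption; the claim only specifies "a track gradual-change rule":

```python
import numpy as np

FADE = 0.9  # hypothetical per-frame decay factor

def next_trail(first_map, new_track_map):
    # Fade the previous trail to obtain the optimized map, then
    # superimpose the newly added track, clamping to the pixel range.
    optimized = first_map * FADE
    return np.clip(optimized + new_track_map, 0.0, 1.0)

prev = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
new = np.array([[0.0, 0.0],
                [0.0, 1.0]])
print(next_trail(prev, new))
```

Here the old trail pixel decays from 1.0 to 0.9 while the newly added pixel enters at full intensity; repeating this every frame produces a trail that brightens toward the object's current position.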
7. The method of claim 1, wherein a size of the second mesh is the same as a size of the game scene of the current frame, and caching the second movement track to the second map pointed to by the second rendering object in the GPU comprises:
acquiring the second rendering object corresponding to the second mesh in the GPU; and
caching the second movement track to the second map pointed to by the second rendering object.
8. An apparatus for rendering a movement track in a game, wherein a graphical user interface of the game is provided through a terminal device, and content displayed by the graphical user interface comprises at least a game scene and a controlled virtual object in the game scene, the apparatus comprising:
a rendering triggering module, configured to, in response to a rendering event of a current frame, move a first mesh into the view frustum of a virtual camera of the game, wherein a material of the first mesh uses a first map corresponding to a frame previous to the current frame, the first map caches a first movement track of the controlled virtual object at a first position point in the previous frame, and a first rendering object in a GPU of the terminal device points to the first map;
a newly added track determining module, configured to determine a newly added track corresponding to the current frame according to a second position point of the controlled virtual object in the current frame and the first position point;
a track acquiring module, configured to acquire a second movement track of the controlled virtual object in the current frame according to the first movement track and the newly added track;
a track caching and rendering module, configured to cache the second movement track to a second map pointed to by a second rendering object in the GPU, and render the current frame according to the second map, wherein a material of a second mesh uses the second map; and
a replacing module, configured to replace the first rendering object pointed to by the virtual camera with the second rendering object.
9. An electronic device comprising a processor and a memory, the memory storing computer-executable instructions executable by the processor, the processor executing the computer-executable instructions to implement the method of any one of claims 1 to 7.
10. A computer-readable storage medium having computer-executable instructions stored thereon which, when invoked and executed by a processor, cause the processor to implement the method of any of claims 1 to 7.
CN202111120759.XA 2021-09-24 2021-09-24 Method and device for rendering moving track in game and electronic equipment Active CN113821345B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111120759.XA CN113821345B (en) 2021-09-24 2021-09-24 Method and device for rendering moving track in game and electronic equipment

Publications (2)

Publication Number Publication Date
CN113821345A true CN113821345A (en) 2021-12-21
CN113821345B CN113821345B (en) 2023-06-30

Family

ID=78921285

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111120759.XA Active CN113821345B (en) 2021-09-24 2021-09-24 Method and device for rendering moving track in game and electronic equipment

Country Status (1)

Country Link
CN (1) CN113821345B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040002380A1 (en) * 2002-06-27 2004-01-01 Igt Trajectory-based 3-D games of chance for video gaming machines
CN111054074A (en) * 2019-12-27 2020-04-24 网易(杭州)网络有限公司 Method and device for moving virtual object in game and electronic equipment
CN111437604A (en) * 2020-03-23 2020-07-24 网易(杭州)网络有限公司 Game display control method and device, electronic equipment and storage medium
CN111701238A (en) * 2020-06-24 2020-09-25 腾讯科技(深圳)有限公司 Virtual picture volume display method, device, equipment and storage medium
WO2020207202A1 (en) * 2019-04-11 2020-10-15 腾讯科技(深圳)有限公司 Shadow rendering method and apparatus, computer device and storage medium
CN112862935A (en) * 2021-03-16 2021-05-28 天津亚克互动科技有限公司 Game character motion processing method and device, storage medium and computer equipment
CN113064540A (en) * 2021-03-23 2021-07-02 网易(杭州)网络有限公司 Game-based drawing method, game-based drawing device, electronic device, and storage medium
CN113209626A (en) * 2021-05-21 2021-08-06 珠海金山网络游戏科技有限公司 Game picture rendering method and device

Similar Documents

Publication Publication Date Title
TWI636423B (en) Method for efficient construction of high resolution display buffers
CN108236783B (en) Method and device for simulating illumination in game scene, terminal equipment and storage medium
JP4542153B2 (en) Flexible anti-aliasing for embedded devices
JP4982498B2 (en) Antialiasing vector graphic image
CN112241993B (en) Game image processing method and device and electronic equipment
CN108830923B (en) Image rendering method and device and storage medium
CN109985384B (en) Method and device for dynamically adjusting map
US20230230311A1 (en) Rendering Method and Apparatus, and Device
CN114565708A (en) Method, device and equipment for selecting anti-aliasing algorithm and readable storage medium
US6903746B2 (en) Rendering processing method
CN105550973B (en) Graphics processing unit, graphics processing system and anti-aliasing processing method
CN113469883B (en) Rendering method and device of dynamic resolution, electronic equipment and readable storage medium
CN112669433A (en) Contour rendering method, apparatus, electronic device and computer-readable storage medium
CN113821345B (en) Method and device for rendering moving track in game and electronic equipment
EP4231243A1 (en) Data storage management method, object rendering method, and device
KR20180088876A (en) Image processing method and apparatus
JP2005107602A (en) Three-dimensional image drawing device and three-dimensional image drawing method
CN111462343B (en) Data processing method and device, electronic equipment and storage medium
CN114429513A (en) Method and device for determining visible element, storage medium and electronic equipment
CN110689606B (en) Method and terminal for calculating raindrop falling position in virtual scene
JPH11331700A (en) Image processing unit and image processing method
WO2020036214A1 (en) Image generation device, and image generation method and program
CN110384926B (en) Position determining method and device
WO2018175299A1 (en) System and method for rendering shadows for a virtual environment
JP7471512B2 (en) Drawing processing device, drawing processing system, drawing processing method, and drawing processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant