CN111127576B - Game picture rendering method and device and electronic equipment - Google Patents


Info

Publication number: CN111127576B
Application number: CN201911315201.XA
Authority: CN (China)
Prior art keywords: rendering, vegetation, preset, color, target
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN111127576A (en)
Inventor: 罗树权
Original and current assignee: Beijing Pixel Software Technology Co Ltd
Application filed by Beijing Pixel Software Technology Co Ltd; priority claimed to CN201911315201.XA

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/001: Texturing; Colouring; Generation of texture or colour
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene

Abstract

The invention provides a game picture rendering method and device and electronic equipment, relating to the technical field of games. The rendering method comprises the following steps: extracting the color of the vegetation texture in an initial game picture; mapping the color of the vegetation texture onto a preset background texture to obtain a target texture; determining a target rendering intensity based on a preset attenuation starting distance and a preset rendering distance; and interpolating in the initial game picture based on the target texture and the target rendering intensity to render a final game picture. The invention can effectively improve the transition between vegetation and terrain and enhance the sense of reality of the game picture.

Description

Game picture rendering method and device and electronic equipment
Technical Field
The present invention relates to the field of game technologies, and in particular, to a method and an apparatus for rendering a game screen, and an electronic device.
Background
In many games, a large amount of vegetation must be rendered to enhance the realism of the game picture. Because of the intricacy of plant leaves, they cannot be modeled in full detail: rendering them requires a large number of vertices, which brings a severe performance cost. As a result, a large-scale vegetation effect cannot be rendered across the whole scene; the display distance of vegetation has to be shortened so that vegetation only covers the area close to the camera. This, however, degrades the overall rendered picture: where the vegetation cuts off, bare ground is exposed and the transition looks unnatural.
Aiming at the problem of the unnatural transition where the rendered vegetation effect ends in the game picture, no effective solution has been proposed so far.
Disclosure of Invention
The invention aims to provide a game picture rendering method and device and electronic equipment that can effectively improve the transition between vegetation and terrain and enhance the sense of reality of the game picture.
In a first aspect, an embodiment provides a method for rendering a game screen, including: extracting the color of vegetation texture in an initial game picture; mapping the colors of the vegetation textures onto a preset background texture to obtain a target texture; determining a target rendering intensity based on a preset attenuation starting distance and a preset rendering distance; interpolation is carried out in the initial game picture based on the target texture and the target rendering strength, and the final game picture is rendered.
In an alternative embodiment, the step of extracting the color of the vegetation texture in the initial game picture includes: extracting the colors of the plant leaves in the initial vegetation texture in the initial game picture; and downsampling the extracted plant-leaf colors to obtain a first rendering color with a resolution of 1×1.
In an alternative embodiment, the step of mapping the color of the vegetation texture onto a preset background texture to obtain a target texture includes: determining a second rendering color of each plant in a preset background texture based on the first rendering color and a preset vegetation vertex color; the plant is at least one plant; determining a rendering position of each plant in a preset background texture; and filling the second rendering color of each plant into the corresponding rendering position to obtain the target texture.
In an alternative embodiment, the step of determining a rendering position of each plant in a preset background texture includes: acquiring an axis alignment bounding box of the vegetation template, and determining a projection radius based on a maximum coordinate and a minimum coordinate of the axis alignment bounding box projected to a horizontal plane; acquiring a first rendering position of each plant in a target texture; determining a rendering range of each plant in the target texture based on the projection radius and the scaling factor; and determining a target rendering position of each plant in a preset background texture based on the first rendering position and the rendering range.
In an alternative embodiment, the method further comprises: and carrying out preset blurring operation on the target texture.
In an alternative embodiment, the step of determining the target rendering intensity based on the preset attenuation starting distance and the preset rendering distance comprises: determining the distance between the rendering position in the final game picture and the camera as a rendering distance l; and determining the target rendering intensity f from the rendering distance l, the preset attenuation starting distance l_f, and the preset rendering distance l_r according to the formula f = clamp((l - l_f) / (l_r - l_f), 0, 1).
In an alternative embodiment, the method further comprises: judging whether the terrain has vegetation coverage; if so, setting the coverage coefficient to 1, and if not, setting it to 0. The step of interpolating in the initial game picture based on the target texture and the target rendering intensity and rendering the final game picture comprises: acquiring the terrain vertex color of the initial game picture between the preset attenuation starting distance l_f and the preset rendering distance l_r; interpolating according to a preset interpolation operation from the target texture, the terrain vertex color, the target rendering intensity f, and the coverage coefficient to obtain the vertex color of the vegetation rendering, wherein the preset interpolation operation is C = lerp(C_t, C_e, f×c), C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vertex color of the vegetation rendering; and rendering based on the vertex color of the vegetation rendering to obtain the final game picture.
In a second aspect, an embodiment provides a rendering apparatus of a game screen, including: the color extraction module is used for extracting the colors of vegetation textures in the initial game picture; the color mapping module is used for mapping the colors of the vegetation textures to a preset background texture to obtain a target texture; the rendering intensity determining module is used for determining target rendering intensity based on a preset attenuation starting distance and a preset rendering distance; and the rendering module is used for interpolating in the initial game picture based on the target texture and the target rendering strength, and rendering to obtain a final game picture.
In a third aspect, an embodiment provides an electronic device comprising a processor and a memory; a computer program is stored on a memory, which when run by a processor performs the method of any of the previous embodiments.
In a fourth aspect, the embodiments provide a computer readable storage medium storing computer software instructions for use with any of the methods of the previous embodiments.
According to the game picture rendering method and device and the electronic equipment provided by the invention, the color of the vegetation texture in an initial game picture is first extracted; that color is mapped onto a preset background texture to obtain a target texture; a target rendering intensity is determined based on a preset attenuation starting distance and a preset rendering distance; and interpolation is carried out in the initial game picture based on the target texture and the target rendering intensity to render the final game picture. Because the extracted vegetation color is mapped into a target texture and interpolated, at the target rendering intensity, between the preset attenuation starting distance and the preset rendering distance, a vegetation effect can still be seen far from the camera (i.e., between the preset attenuation starting distance and the preset rendering distance), giving vegetation and terrain a lifelike transition and improving the realism of the game picture. The embodiment of the invention can therefore effectively improve the transition between vegetation and terrain and enhance the sense of reality of the game picture.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a method for rendering a game screen according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a target texture according to an embodiment of the present invention;
FIG. 3 is a schematic diagram showing the results of another target texture according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a relationship between an attenuation start distance and a preset rendering distance according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a rendering device for a game screen according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present invention, it should be noted that the terms "first," "second," and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
Furthermore, the term "horizontal" merely means that its direction is more horizontal than "vertical" and does not mean that the structure must be perfectly horizontal, but may be slightly inclined.
Some embodiments of the present invention are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Existing game pictures look unnatural at the place where vegetation disappears. In real nature, as a stretch of vegetation-covered ground recedes into the distance, its details, such as the structure of the plants and the veins of the leaves, gradually become invisible, until only a patch of color covers the terrain. The invention therefore provides a game picture rendering method and device and electronic equipment that can effectively improve the transition between vegetation and terrain and enhance the sense of reality of the game picture.
For easy understanding, first, a detailed description will be given of a method for rendering a game screen according to an embodiment of the present invention, referring to a flowchart of a method for rendering a game screen shown in fig. 1, the method mainly includes steps S102 to S108 as follows:
step S102: and extracting the colors of vegetation textures in the initial game picture.
In one embodiment, the initial game picture may contain an already rendered game scene. Because parts of the scene far from the camera (i.e., where the vegetation disappears) show an unnatural transition, the picture far from the camera needs to be re-rendered to obtain a naturally connected, realistic result. First, the colors of the plant leaves in the initial vegetation texture of the initial game picture are extracted, and the extracted colors are downsampled, for example by halving the resolution with a box filter each time, until a first rendering color (also called the color of the texture) with a resolution of 1×1 is obtained, providing the initial rendering color for distant rendering.
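The repeated box-filter downsampling described above can be sketched as follows; `box_downsample_to_1x1` is an illustrative helper name (not from the patent), and a square power-of-two texture of (r, g, b) tuples is assumed:

```python
def box_downsample_to_1x1(texture):
    """Repeatedly halve a 2^n x 2^n texture with a 2x2 box filter.

    Each pass averages every 2x2 block of pixels; the final 1x1 result
    is therefore the mean texture colour (the "first rendering colour"
    in the patent's terms).
    """
    while len(texture) > 1:
        half = len(texture) // 2
        texture = [
            [tuple((texture[2 * y][2 * x][c] + texture[2 * y][2 * x + 1][c]
                    + texture[2 * y + 1][2 * x][c] + texture[2 * y + 1][2 * x + 1][c]) / 4
                   for c in range(3))
             for x in range(half)]
            for y in range(half)
        ]
    return texture[0][0]
```

A 2×2 checkerboard of black and white pixels, for example, averages to mid-grey.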
Step S104: mapping the colors of the vegetation textures to a preset background texture to obtain a target texture.
In one embodiment, the preset background texture may be a texture set according to the size of the vegetation texture, containing the three RGB channels. Its color data is first initialized to 0 to obtain the background texture; its unit is the pixel, and its size may be set to w. The extracted vegetation texture color is then processed and blurred to approximate how vegetation looks from far away in reality, and the processed color is mapped onto the preset background texture, filled according to the position and range of each plant, to obtain the target texture.
Step S106: and determining the target rendering intensity based on the preset attenuation starting distance and the preset rendering distance.
In one embodiment, the preset attenuation starting distance may be the distance from the camera at which the vegetation begins to disappear in the initial game picture, and the preset rendering distance may be the farthest distance that can be displayed in the picture. Since the effect rendered from the preset attenuation starting distance out to the preset rendering distance should be a gradually fading vegetation color, a target rendering intensity must be set: within the attenuation starting distance the vegetation textures are rendered at full color, and beyond it the intensity is chosen according to how far the rendered point lies between the attenuation starting distance and the rendering distance.
Step S108: interpolation is carried out in the initial game picture based on the target texture and the target rendering strength, and the final game picture is rendered.
The obtained target texture and target rendering intensity are then used to render, by interpolation, between the preset attenuation starting distance and the preset rendering distance, achieving a realistic vegetation transition in the game picture.
The invention provides a game picture rendering method in which the color of the vegetation texture in an initial game picture is first extracted; that color is mapped onto a preset background texture to obtain a target texture; a target rendering intensity is determined based on a preset attenuation starting distance and a preset rendering distance; and interpolation is carried out in the initial game picture based on the target texture and the target rendering intensity to render the final game picture. Because the extracted vegetation color is mapped into a target texture and interpolated, at the target rendering intensity, between the preset attenuation starting distance and the preset rendering distance, a vegetation effect can still be seen far from the camera (i.e., between the preset attenuation starting distance and the preset rendering distance), giving vegetation and terrain a lifelike transition and improving the realism of the game picture. The embodiment of the invention can therefore effectively improve the transition between vegetation and terrain and enhance the sense of reality of the game picture.
In one embodiment, the game scene may be divided into small squares of a fixed size, the vegetation data is organized in units of a square and saved in a file, and a piece of vegetation data represents vegetation covering a range of terrain. In order to facilitate understanding the above step S104, the present invention provides a specific embodiment for mapping the color of a vegetation texture onto a preset background texture to obtain a target texture, which mainly includes steps 1 to 3 as follows:
step 1, determining a second rendering color of each plant in a preset background texture based on the first rendering color and a preset vegetation vertex color. The plant is at least one plant; the preset vegetation apex color is typically drawn by an artist at the time of editing, such as by marking the first rendered color as C t The color of the top of the vegetation is marked as C v The color of each plant in the background texture (i.e., the second rendered color) may be C r =C v ×C t
Step 2, determining the rendering position of each plant in the preset background texture. In practice, the vegetation in a game picture is made up of many individual plants or clusters of plants, and for a realistic result each plant or cluster has a different color; so after the color needed to render each plant is determined, the rendering position of each plant is calculated to display a realistic rendering effect.
Step 3, filling the second rendering color of each plant into the corresponding rendering position to obtain the target texture; a result schematic diagram of one target texture is shown in fig. 2.
In addition, since the computed texture has relatively hard borders at the edges of the color-filled vegetation, a preset blurring operation needs to be performed on the target texture to obtain a smooth, natural result. In this embodiment the blurring may be performed by, for example, Gaussian blurring; the target texture after Gaussian blurring is shown in the result schematic diagram of another target texture in fig. 3.
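The patent only says the blur may be, e.g., Gaussian. The sketch below shows one horizontal pass of a separable 3-tap blur with assumed weights (0.25, 0.5, 0.25) on a single channel; a full implementation would run such passes over rows and then columns of each RGB channel:

```python
def gaussian_blur_1d(row, kernel=(0.25, 0.5, 0.25)):
    """One separable blur pass with a small 3-tap Gaussian-like kernel.

    `row` is a list of scalar samples; edge samples are clamped. The
    kernel weights are an illustrative choice, not from the patent.
    """
    n = len(row)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(kernel, start=-1):
            j = min(max(i + k, 0), n - 1)  # clamp at the borders
            acc += w * row[j]
        out.append(acc)
    return out
```

A single bright pixel spreads into its neighbours, which is exactly the softening of the hard vegetation borders described above.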
The step of determining the rendering position of each plant in the preset background texture may further include the following steps 2.1 to 2.4:
step 2.1, acquiring an axis alignment bounding box of the vegetation template, and determining a projection radius r based on a maximum coordinate max (x, z) and a minimum coordinate min (x, z) of the axis alignment bounding box projected to a horizontal plane, wherein the projection radius r can be expressed by a formulaThe representation is performed.
Step 2.2, acquiring a first rendering position of each plant in the target texture. Since the position of a plant is a local coordinate value relative to its block, it needs to be converted to pixel coordinates on the target texture, for example as t_x = floor(x / h × w) and t_y = floor(z / h × w), where h is the size of the vegetation texture, w is the size of the background texture, (t_x, t_y) are the pixel coordinates in the target texture, and floor is the rounding-down operation.
Step 2.3, determining a rendering range of each plant in the target texture based on the projection radius and a scaling factor; for example, the rendering range may be determined as r_v = floor(r × s / h × w), where s is the scaling factor.
Step 2.4, determining a target rendering position of each plant in the preset background texture based on the first rendering position and the rendering range, i.e., based on the pixel coordinates (t_x, t_y) and the rendering radius r_v.
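Steps 2.1 to 2.4 can be sketched together as follows. The patent's formula images did not survive extraction, so the expressions for r, t_x, t_y, and r_v below are inferences from the surrounding text, and `plant_footprint` is an illustrative name:

```python
import math

def plant_footprint(pos_xz, aabb_min_xz, aabb_max_xz, h, w, s):
    """Map one plant to its disc of pixels on the target texture.

    pos_xz:      plant position (x, z), local to its terrain block
    aabb_*_xz:   the template AABB's min/max projected onto y = 0
    h:           size of the vegetation block/texture in local units
    w:           size (in pixels) of the square background texture
    s:           plant scaling factor
    """
    # step 2.1: projection radius from the projected AABB extents
    r = 0.5 * math.hypot(aabb_max_xz[0] - aabb_min_xz[0],
                         aabb_max_xz[1] - aabb_min_xz[1])
    # step 2.2: local position -> pixel coordinates on the texture
    tx = math.floor(pos_xz[0] / h * w)
    ty = math.floor(pos_xz[1] / h * w)
    # step 2.3: rendering radius in pixels, scaled by s
    rv = math.floor(r * s / h * w)
    # step 2.4: the target rendering position is the centre plus radius
    return (tx, ty), rv
```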
In one embodiment, after the target texture of the vegetation is obtained, it needs to be applied to the ground of the game scene (i.e., the game picture) as the terrain vertex color. Real vegetation is rendered at positions close to the camera, and the effect of the target texture only starts to show farther away, where the vegetation disappears, so the target texture must be displayed at an intensity determined from the preset attenuation starting distance l_f and the preset rendering distance l_r. The relationship between l_f and l_r is shown in fig. 4. A specific embodiment of determining the target rendering intensity based on the preset attenuation starting distance and the preset rendering distance includes the following steps:
step A, determining the distance between the rendering position of the final game picture and the camera as a rendering distance l;
step B, based on the rendering distance l and a preset attenuation starting distance l f And a preset rendering distance l r According to the formulaA target rendering intensity f is determined.
In one embodiment, it is first judged whether the terrain has vegetation coverage; if so, the coverage coefficient is set to 1, otherwise to 0. The aim is to improve the realism of the game scene: no vegetation is rendered on parts that vegetation does not cover. The step of interpolating in the initial game picture based on the target texture and the target rendering intensity and rendering the final game picture includes the following steps (1) to (3):
step (1), obtaining an initial game picture at a preset attenuation starting distance l f And a preset rendering distance l r Terrain apex color in between. In one embodiment, since different terrain vertex colors are set in different seasons in order to achieve a realistic effect of the game screen, such as green in summer and yellow in autumn in the game scene, a more realistic effect of the game screen is obtained according to the different terrain vertex colors.
Step (2), interpolating according to a preset interpolation operation from the target texture, the terrain vertex color, the target rendering intensity f, and the coverage coefficient to obtain the vertex color of the vegetation rendering. The preset interpolation operation may adopt the formula C = lerp(C_t, C_e, f×c), where C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vertex color of the vegetation rendering.
Step (3), rendering based on the vertex color of the vegetation rendering to obtain the final game picture. The vertex color C obtained by the above interpolation operation is filled into each corresponding rendering position to obtain the final game picture.
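The final blend C = lerp(C_t, C_e, f×c) can be sketched per vertex as below; the function name is illustrative, and colors are assumed to be (r, g, b) triples:

```python
def vegetation_vertex_color(c_t, c_e, f, c):
    """C = lerp(C_t, C_e, f * c).

    c_t: terrain vertex colour, c_e: target-texture colour,
    f: target rendering intensity in [0, 1], c: coverage coefficient
    (1 where the terrain is vegetation-covered, 0 elsewhere, so bare
    ground keeps its terrain colour regardless of f).
    """
    t = f * c
    return tuple(a + (b - a) * t for a, b in zip(c_t, c_e))
```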
For the above-mentioned game screen rendering method, an embodiment of the present invention provides a game screen rendering device, referring to a schematic structural diagram of a game screen rendering device shown in fig. 5, the device mainly includes the following parts:
a color extraction module 502, configured to extract a color of a vegetation texture in an initial game frame;
the color mapping module 504 is configured to map the color of the vegetation texture onto a preset background texture to obtain a target texture;
a rendering intensity determining module 506, configured to determine a target rendering intensity based on a preset attenuation start distance and a preset rendering distance;
and the rendering module 508 is used for interpolating in the initial game picture based on the target texture and the target rendering strength, and rendering to obtain a final game picture.
The invention provides a game picture rendering device in which the color of the vegetation texture in an initial game picture is first extracted; that color is mapped onto a preset background texture to obtain a target texture; a target rendering intensity is determined based on a preset attenuation starting distance and a preset rendering distance; and interpolation is carried out in the initial game picture based on the target texture and the target rendering intensity to render the final game picture. Because the extracted vegetation color is mapped into a target texture and interpolated, at the target rendering intensity, between the preset attenuation starting distance and the preset rendering distance, a vegetation effect can still be seen far from the camera (i.e., between the preset attenuation starting distance and the preset rendering distance), giving vegetation and terrain a lifelike transition and improving the realism of the game picture. The embodiment of the invention can therefore effectively improve the transition between vegetation and terrain and enhance the sense of reality of the game picture.
In one embodiment, the color extraction module 502 is further configured to extract a color of a plant leaf in an initial vegetation texture in an initial game frame; and downsampling the color of the plant leaves in the extracted vegetation texture to obtain a first rendering color with resolution of 1 multiplied by 1.
In one embodiment, the color mapping module 504 is further configured to determine, based on the first rendered color and the preset vegetation apex color, a second rendered color of each plant in the preset background texture; the plant is at least one plant; determining a rendering position of each plant in a preset background texture; and filling the second rendering color of each plant into the corresponding rendering position to obtain the target texture.
In one embodiment, the apparatus further comprises: the rendering position determining module is used for acquiring an axis alignment bounding box of the vegetation template and determining a projection radius based on the maximum coordinate and the minimum coordinate of the axis alignment bounding box projected to the horizontal plane; acquiring a first rendering position of each plant in a target texture; determining a rendering range of each plant in the target texture based on the projection radius and the scaling factor; and determining a target rendering position of each plant in a preset background texture based on the first rendering position and the rendering range.
In one embodiment, the apparatus further includes a blurring operation module, configured to perform a preset blurring operation on the target texture.
In one embodiment, the rendering intensity determining module 506 is further configured to determine the distance between the rendering position in the final game picture and the camera as the rendering distance l, and to determine the target rendering intensity f from the rendering distance l, the preset attenuation starting distance l_f, and the preset rendering distance l_r according to the formula f = clamp((l - l_f) / (l_r - l_f), 0, 1).
In one embodiment, the apparatus further comprises a judging module for judging whether the terrain has vegetation coverage, setting the coverage coefficient to 1 if so and to 0 if not. The rendering module 508 is further configured to: acquire the terrain vertex color of the initial game picture between the preset attenuation starting distance l_f and the preset rendering distance l_r; interpolate according to a preset interpolation operation from the target texture, the terrain vertex color, the target rendering intensity f, and the coverage coefficient to obtain the vertex color of the vegetation rendering, wherein the preset interpolation operation is C = lerp(C_t, C_e, f×c), C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vertex color of the vegetation rendering; and render based on the vertex color of the vegetation rendering to obtain the final game picture.
The device provided by this embodiment of the present invention has the same implementation principle and technical effects as the foregoing method embodiments; for brevity, where the device embodiment is not described in detail, reference may be made to the corresponding content of the foregoing method embodiments.
The device may be an electronic device; specifically, the electronic device comprises a processor and a storage device, the storage device storing a computer program which, when executed by the processor, performs the method of any of the embodiments described above.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, where the electronic device 100 includes: a processor 60, a memory 61, a bus 62 and a communication interface 63, the processor 60, the communication interface 63 and the memory 61 being connected by the bus 62; the processor 60 is arranged to execute executable modules, such as computer programs, stored in the memory 61.
The memory 61 may include a high-speed random access memory (RAM, Random Access Memory) and may further include a non-volatile memory, such as at least one magnetic disk memory. The communication connection between the system network element and at least one other network element is implemented via at least one communication interface 63 (which may be wired or wireless), and may use the Internet, a wide area network, a local area network, a metropolitan area network, and the like.
The bus 62 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be classified as an address bus, a data bus, a control bus, etc. For ease of illustration, only one bidirectional arrow is shown in FIG. 6, but this does not mean that there is only one bus or one type of bus.
The memory 61 is configured to store a program; the processor 60 executes the program after receiving an execution instruction, and the method disclosed in any of the foregoing embodiments of the present invention may be applied to, or implemented by, the processor 60.
The processor 60 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits in hardware or by software instructions in the processor 60. The processor 60 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), and the like; it may also be a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA), or another programmable logic device, discrete gate or transistor logic device, or discrete hardware component, and can implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied as being performed directly by a hardware decoding processor, or by a combination of hardware and software modules within a decoding processor. The software modules may be located in a storage medium well known in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory 61, and the processor 60 reads the information in the memory 61 and completes the steps of the above method in combination with its hardware.
The embodiments of the present invention further provide a computer program product for the game picture rendering method, device, and electronic equipment described above, comprising a computer-readable storage medium storing non-volatile program code executable by a processor; when the computer program stored thereon is executed by the processor, the method described in the foregoing method embodiments is performed. For specific implementation, see the method embodiments, which are not repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working process of the system described above may refer to the corresponding process in the foregoing embodiment, which is not described in detail herein.
The computer program product of the readable storage medium provided by the embodiment of the present invention includes a computer readable storage medium storing a program code, where the program code includes instructions for executing the method described in the foregoing method embodiment, and specific implementation may refer to the method embodiment and will not be described herein.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (7)

1. A method of rendering a game screen, comprising:
extracting the color of vegetation textures in the initial game picture;
mapping the colors of the vegetation textures to a preset background texture to obtain a target texture;
determining a target rendering intensity based on a preset attenuation starting distance and a preset rendering distance;
interpolation is carried out in the initial game picture based on the target texture and the target rendering strength, and a final game picture is rendered;
the step of extracting the colors of vegetation textures in the initial game picture comprises the following steps:
extracting the colors of plant leaves in the initial vegetation textures in the initial game picture;
downsampling the extracted color of the plant leaves in the vegetation texture to obtain a first rendering color with a resolution of 1×1;
the step of mapping the colors of the vegetation textures to preset background textures to obtain target textures comprises the following steps: determining a second rendering color of each plant in the preset background texture based on the first rendering color and a preset vegetation vertex color; the plant is at least one plant; determining a rendering position of each plant in the preset background texture; filling the second rendering color of each plant into the corresponding rendering position to obtain a target texture;
the step of determining the rendering position of each plant in the preset background texture comprises the following steps: acquiring an axis alignment bounding box of a vegetation template, and determining a projection radius based on a maximum coordinate and a minimum coordinate of the axis alignment bounding box projected to a horizontal plane; acquiring a first rendering position of each plant in the target texture; determining a rendering range of each plant in the target texture based on the projection radius and a scaling factor; and determining a target rendering position of each plant in the preset background texture based on the first rendering position and the rendering range.
2. The method according to claim 1, wherein the method further comprises: and carrying out preset blurring operation on the target texture.
3. The method of claim 1, wherein the step of determining the target rendering intensity based on the preset decay start distance and the preset rendering distance comprises:
determining the distance between the rendering position of the final game picture and the camera as rendering distance l;
determining a target rendering intensity f based on the rendering distance l, the preset attenuation start distance l_f, and the preset rendering distance l_r, according to a preset formula.
4. A method according to claim 3, characterized in that the method further comprises: judging whether the terrain has vegetation coverage, if so, setting the coverage coefficient to be 1; if not, setting the coverage coefficient to 0;
the step of interpolating in the initial game picture and rendering to obtain a final game picture based on the target texture and the target rendering strength comprises the following steps:
acquiring the terrain vertex color of the initial game picture between the preset attenuation start distance l_f and the preset rendering distance l_r;
interpolating according to a preset interpolation operation using the target texture, the terrain vertex color, the target rendering intensity f, and the coverage coefficient, to obtain the vertex color of the vegetation rendering; the preset interpolation operation being C = lerp(C_t, C_e, f×c), where C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vertex color of the vegetation rendering;
rendering is carried out based on the vertex colors of the vegetation rendering, and the final game picture is obtained.
5. A game screen rendering apparatus, the apparatus comprising:
the color extraction module is used for extracting the color of vegetation textures in the initial game picture;
the color mapping module is used for mapping the colors of the vegetation textures to a preset background texture to obtain a target texture;
the rendering intensity determining module is used for determining target rendering intensity based on a preset attenuation starting distance and a preset rendering distance;
the rendering module is used for interpolating in the initial game picture based on the target texture and the target rendering intensity, and rendering to obtain a final game picture;
the color extraction module is further used for extracting the colors of plant leaves in the initial vegetation texture in the initial game picture, and downsampling the extracted plant leaf colors to obtain a first rendering color with a resolution of 1×1;
the color mapping module is further used for determining a second rendering color of each plant in the preset background texture based on the first rendering color and a preset vegetation vertex color; the plant is at least one plant; determining a rendering position of each plant in the preset background texture; filling the second rendering color of each plant into the corresponding rendering position to obtain a target texture;
the color mapping module is further used for acquiring an axis-aligned bounding box of the vegetation template, and determining a projection radius based on the maximum and minimum coordinates of the axis-aligned bounding box projected onto the horizontal plane; acquiring a first rendering position of each plant in the target texture; determining a rendering range of each plant in the target texture based on the projection radius and a scaling factor; and determining a target rendering position of each plant in the preset background texture based on the first rendering position and the rendering range.
6. An electronic device comprising a processor and a memory;
the memory has stored thereon a computer program which, when executed by the processor, performs the method of any of claims 1 to 4.
7. A computer readable storage medium storing computer software instructions for use with the method of any one of claims 1 to 4.
CN201911315201.XA 2019-12-18 2019-12-18 Game picture rendering method and device and electronic equipment Active CN111127576B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911315201.XA CN111127576B (en) 2019-12-18 2019-12-18 Game picture rendering method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN111127576A CN111127576A (en) 2020-05-08
CN111127576B true CN111127576B (en) 2023-11-17

Family

ID=70500101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911315201.XA Active CN111127576B (en) 2019-12-18 2019-12-18 Game picture rendering method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111127576B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111760290A (en) * 2020-06-11 2020-10-13 网易(杭州)网络有限公司 Information processing method and device, computer equipment and storage medium
CN111798554A (en) * 2020-07-24 2020-10-20 上海米哈游天命科技有限公司 Rendering parameter determination method, device, equipment and storage medium
CN112206528B (en) * 2020-10-12 2024-03-01 网易(杭州)网络有限公司 Vegetation model rendering method, device, equipment and storage medium
CN112215968A (en) * 2020-10-29 2021-01-12 网易(杭州)网络有限公司 Model paste processing method and device, storage medium and electronic equipment
CN112807685A (en) * 2021-01-22 2021-05-18 珠海天燕科技有限公司 Grassland rendering method, grassland rendering device and grassland rendering equipment based on game role track
CN113450443B (en) * 2021-07-08 2023-03-24 网易(杭州)网络有限公司 Rendering method and device of sea surface model

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107952241A (en) * 2017-12-05 2018-04-24 北京像素软件科技股份有限公司 Render control method, device and readable storage medium storing program for executing
WO2019015591A1 (en) * 2017-07-21 2019-01-24 腾讯科技(深圳)有限公司 Method for rendering game, and method, apparatus and device for generating game resource file
CN110115841A (en) * 2019-05-10 2019-08-13 网易(杭州)网络有限公司 The rendering method and device of vegetation object in a kind of scene of game

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6952207B1 (en) * 2002-03-11 2005-10-04 Microsoft Corporation Efficient scenery object rendering


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Liping; He Huojiao. Research Progress on Plant Leaf Rendering Methods. Journal of Agricultural Mechanization Research. 2013, No. 8, full text. *

Also Published As

Publication number Publication date
CN111127576A (en) 2020-05-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant