CN112132936B - Picture rendering method and device, computer equipment and storage medium

Picture rendering method and device, computer equipment and storage medium

Info

Publication number
CN112132936B
Authority
CN
China
Prior art keywords
wind
vertex
interactive
target
map
Prior art date
Legal status
Active
Application number
CN202011003955.4A
Other languages
Chinese (zh)
Other versions
CN112132936A
Inventor
周昊楠
Current Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202011003955.4A
Publication of CN112132936A
Application granted
Publication of CN112132936B

Classifications

    • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G06T 13/60 - 3D [Three Dimensional] animation of natural phenomena, e.g. rain, snow, water or plants
    • G06T 15/04 - Texture mapping
    • G06T 2210/24 - Fluid dynamics
    • G06T 2210/44 - Morphing
    • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 2300/663 - Methods for processing data by generating or executing the game program for rendering three dimensional images for simulating liquid objects, e.g. water, gas, fog, snow, clouds

Abstract

Embodiments of the invention disclose a picture rendering method, apparatus, device and medium. The method includes: acquiring an interactive wind map matched with a target picture to be rendered; according to the interactive wind map, performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions; and rendering and displaying each target model element after the deformation processing. According to this technical scheme, interactive wind switch states are recorded at different positions in the interactive wind map, so that which model elements in the target picture need interactive deformation processing can be determined from those switch states. This ensures the interactive display effect of the model elements, improves the realism of the displayed picture, minimizes the consumption of computer memory, and saves computer processing performance.

Description

Picture rendering method and device, computer equipment and storage medium
Technical Field
Embodiments of the invention relate to image processing technology, in particular to game image engine technology, and more particularly to a picture rendering method and apparatus, computer equipment and a storage medium.
Background
In game worlds, it is often necessary to lay a large number of small objects across a scene, or across a large area within a scene, to add detail, enrich the picture and characterize the area; for example, large numbers of flowers, plants and small vegetation are laid on a lawn.
In order for the laid small objects to blend seamlessly into the scene, the game picture generally needs to reproduce the interaction effect of the small objects being pushed aside when a dynamic object (for example, a character, a weapon held by the character, or a released skill) passes by, so as to simulate a scene in which the small objects deform during the interaction. However, the prior art provides no effective processing technique for such interactive deformation of small objects.
Disclosure of Invention
Embodiments of the invention provide a picture rendering method, apparatus, device and medium, which provide a way to display pictures in interactive scenes and improve the interactive display effect of model elements in a picture.
In a first aspect, an embodiment of the present invention provides a picture rendering method, including:
acquiring an interactive wind map matched with a target picture to be rendered, wherein the interactive wind map records interactive wind parameters and interactive wind switch states at different positions;
according to the interactive wind map, performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions; and
rendering and displaying each target model element after deformation processing.
In a second aspect, an embodiment of the present invention further provides a deformation processing apparatus for a model element, including:
the interactive wind map acquisition module is used for acquiring an interactive wind map matched with a target picture to be rendered, wherein the interactive wind map records interactive wind parameters and interactive wind switch states at different positions;
the deformation processing module is used for performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map;
and the rendering display module is used for rendering and displaying each target model element after the deformation processing.
In a third aspect, an embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor implements the method for rendering a picture according to any one of the embodiments of the present invention when executing the program.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium having stored thereon a computer program, which when executed by a processor, implements a picture rendering method according to any of the embodiments of the present invention.
According to the technical scheme of the embodiments of the invention, an interactive wind map matched with the target picture to be rendered is acquired; deformation processing is performed on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map; and each target model element after the deformation processing is rendered and displayed. Because interactive wind switch states are recorded at different positions in the interactive wind map, which model elements in the target picture need interactive deformation processing can be determined from those switch states, which ensures the interactive display effect of the model elements, improves the realism of the displayed picture, minimizes the consumption of computer memory, and saves computer processing performance.
Drawings
FIG. 1 is a flowchart of a method for rendering a frame according to a first embodiment of the present invention;
FIG. 2a is a flowchart of a method for rendering a frame according to a second embodiment of the present invention;
FIG. 2b is a schematic diagram illustrating a vector relationship between vertices and center points according to an embodiment of the present invention;
FIG. 2c is a schematic diagram illustrating another vector relationship between vertices and center points applicable to embodiments of the present invention;
fig. 3 is a block diagram of a picture rendering apparatus in a third embodiment of the present invention;
fig. 4 is a block diagram of a computer device in accordance with a fourth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit it. It should also be noted that, for convenience of description, the drawings show only the parts related to the present invention rather than the entire structures.
Before the exemplary embodiments are discussed in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or at the same time, and the order of the operations may be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines and the like.
Example 1
Fig. 1 is a flowchart of a picture rendering method according to the first embodiment of the present invention. This embodiment is applicable to rendering and displaying a picture that contains model elements after an interaction. The method may be performed by a picture rendering apparatus, which may be implemented in software and/or hardware and is generally integrated in a terminal or a game server. The method of this embodiment specifically includes the following steps:
s110, acquiring an interactive weather map matched with a target picture to be rendered; and the interactive wind map records interactive wind parameters at different positions and interactive wind switch states.
In this embodiment, to display a picture, various objects first need to be laid on a canvas with an object laying tool in a picture production environment (such as a game engine or digital content creation software) to form the picture.
The object laying tool may be a brush tool, which works like a stamp in the shape of the object to be laid: the object is laid wherever the stamp is applied. Specifically, when the mouse is inside the picture production environment and the object laying tool is activated, the picture creator can select the model of the object to be laid; the cursor is then displayed on the production interface in the shape of the selected model, and when the creator drags the mouse and clicks (or performs another selection operation) at a certain position in the target picture, the selected model is laid at that position. For example, a large amount of grass is laid on a lawn to finally form the lawn; for production efficiency, the grass is usually laid in units of grass-cluster models, one cluster consisting of a certain number of blades of grass.
When the picture is finished, or is previewed during production, it can be rendered and displayed. This process calls a shader, which reads the display parameters of the model elements in each laid object model, renders the picture according to the parameters read, and finally forms and displays the rendered picture.
In this embodiment, if the display effect of interaction with the interactive wind needs to be shown in the current picture to be rendered, that picture is determined to be the target picture to be rendered, and before the target picture is rendered and displayed by the shader, the interactive wind map corresponding to the target picture needs to be acquired first.
In this embodiment, the interactive wind map is a virtual map recording interactive wind parameters of each position in the target frame to be rendered, where the virtual map may be stored in the GPU or may be stored in the hard disk.
Optionally, the interactive wind map includes a dynamic-object interactive wind field. The interactive wind map can be generated dynamically from the interactive wind fields produced by one or more dynamic objects appearing in the target picture. It will be appreciated that the target picture may contain one or more dynamic objects, each of which can create an interactive wind field within a set range centered on itself (for example, a circular or rectangular range). For example, a character moving over the ground creates an interactive wind field within the set range centered on itself during the movement; likewise, a character releasing a forward-push skill creates an interactive wind field within the set range centered on itself while the push travels forward.
The interactive wind fields generated by different dynamic objects may be disjoint or may partially overlap; in the interactive wind map they can be superimposed to reflect the overall interactive wind condition of the target picture.
In an alternative implementation of this embodiment, the interactive wind map may be a render texture map (Render Texture); that is, the interactive wind map described in the embodiments of the present invention is instantiated as a render texture provided by the rendering tool.
In the render texture map, the interactive wind parameters and the interactive wind switch state at different positions can be recorded in the RGBA color channels of the pixel points. Optionally, the interactive wind parameters may include the wind force and the wind direction of the interactive wind.
Optionally, in the render texture map, the wind force values of the interactive wind in the X-axis direction and the Z-axis direction can be recorded in the R color channel and the B color channel respectively, and the interactive wind switch state can be recorded in the A color channel. The interactive wind switch state is either an on state or an off state: positions in the on state participate in the interactive wind deformation calculation, while positions in the off state do not.
In this embodiment, the RGBA color channels of the pixels in the render texture map therefore do not record color or brightness information of the pixels, but record the interactive wind parameters and the interactive wind switch states at different positions.
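The following is a minimal sketch, not taken from the patent itself, of how one pixel of such a wind map could be packed and read back. The channel layout follows the description above (R and B carry the X-axis and Z-axis wind force, A carries the switch state); the [-1, 1] to [0, 1] mapping of the signed wind forces and the use of the G channel for a wind height are assumptions made purely for illustration.

```python
from dataclasses import dataclass


@dataclass
class WindPixel:
    r: float  # X-axis wind force, stored in [0, 1]
    g: float  # assumed: wind height (Y-axis action height), stored in [0, 1]
    b: float  # Z-axis wind force, stored in [0, 1]
    a: float  # interactive wind switch state: 1.0 = on, 0.0 = off


def pack_wind(wind_x: float, wind_z: float, height: float, enabled: bool) -> WindPixel:
    """Map signed wind forces in [-1, 1] to unsigned channel values in [0, 1]."""
    def to_channel(v: float) -> float:
        return (v + 1.0) * 0.5
    return WindPixel(to_channel(wind_x), height, to_channel(wind_z), 1.0 if enabled else 0.0)


def unpack_wind(px: WindPixel) -> tuple[float, float, float, bool]:
    """Recover (wind_x, wind_z, height, switch_on) from one wind-map pixel."""
    def from_channel(c: float) -> float:
        return c * 2.0 - 1.0
    return from_channel(px.r), from_channel(px.b), px.g, px.a > 0.5
```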
Correspondingly, when one or more dynamic object interaction wind fields act on the target picture, the interaction wind parameters and the switching state of the interaction wind recorded in the interaction wind map can be dynamically adjusted.
In a specific example, when a character releases a certain skill, a certain position area in the picture may not need to participate in the interactive wind deformation calculation, and the A color channel of the pixel points in that area can accordingly be set to the interactive-wind-off state; when the character then releases a new skill, that area may need to participate in the interactive wind deformation calculation again, and the A color channel of the pixel points in that area is accordingly adjusted dynamically to the interactive-wind-on state.
Alternatively, the interactive wind parameters and the interactive wind switch state at a pixel point position may be recorded in the RGBA color channels of each pixel point, or the interactive wind parameters and switch state of a pixel area may be recorded in the RGBA color channels of one or more pixel points in that area, which is not limited in this embodiment.
In an optional implementation of this embodiment, each pixel point in the interactive wind map is associated with a pixel point in the target picture. That is, the interactive wind map may act at every position of the target picture.
In another optional implementation of this embodiment, each pixel point in the interactive wind map may be associated with a pixel point in a set sub-area of the target picture; that is, the interactive wind map may act only inside the set sub-area of the target picture.
It should be emphasized again that, in this embodiment, the inventors creatively propose using the interactive wind map to store the interactive wind switch states, thereby controlling whether the interactive deformation processing of the model elements in the target picture is switched on or off. Interactive deformation processing of a model element is performed only at positions where the interactive wind switch state is on. Besides providing flexible control over the deformation processing, this makes the interactive deformation processing of the model elements in the target picture selective, which minimizes the consumption of computer memory and saves computer processing performance.
S120, according to the interactive wind map, performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions.
Specifically, in this embodiment, the target model elements must first be obtained from all model elements included in the target picture: a target model element is a model element inside the action area of the interactive wind map whose corresponding A color channel records the interactive-wind-on state. Deformation processing is then performed on each target model element based on the interactive wind parameters recorded in the interactive wind map that match that element.
And S130, rendering and displaying each target model element after deformation processing.
In an optional implementation of this embodiment, the operation in S120 of performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map is executed by a first shader, and the operation in S130 of rendering and displaying each target model element after the deformation processing is executed by a second shader.
Through the cooperation of the two shaders, the interactive wind parameters can be read from the interactive wind map and the target model elements deformed while the deformed model elements are rendered and displayed, which improves the rendering and display speed of the target picture.
According to the technical scheme of this embodiment, an interactive wind map matched with the target picture to be rendered is acquired; deformation processing is performed on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map; and each target model element after the deformation processing is rendered and displayed. Because interactive wind switch states are recorded at different positions in the interactive wind map, which model elements in the target picture need interactive deformation processing can be determined from those switch states.
Example 2
Fig. 2a is a flowchart of a picture rendering method in the second embodiment of the present invention. In this embodiment, the operation of performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map is further refined. Correspondingly, the method of this embodiment specifically includes the following steps:
S210, acquiring an interactive wind map matched with the target picture to be rendered, wherein the interactive wind map records interactive wind parameters and interactive wind switch states at different positions.
S220, acquiring, in the target picture, at least one candidate model element matched with the action range of the interactive wind map.
As described above, only model elements located within the action region of the interactive wind map can undergo interactive deformation. Therefore, each model element located in that action region of the target picture must first be obtained according to the action region of the interactive wind map.
Meanwhile, the objects laid in each target picture to be rendered and displayed are known, and not all of them need interactive wind deformation processing. For example, when a picture includes trees, grass, stones and houses, the tree leaves and each blade of grass obviously should undergo interactive wind deformation processing, whereas the stones and houses need not.
Correspondingly, after the model elements in the action area are obtained, at least one candidate model element that needs interactive wind processing can be determined according to the type of each model element.
S230, sequentially acquiring one candidate model element as the current processing element.
S240, acquiring a matching pixel point in the interactive wind map according to the position of the current processing element in the target picture.
Optionally, the pixel position of the center point of the current processing element in the target picture may be obtained, and the matching pixel point may then be obtained in the interactive wind map according to that pixel position.
Specifically, the matching pixel point may be the pixel point corresponding one-to-one to that pixel position, or a pixel point matching the region in which the pixel position lies, which is not limited in this embodiment.
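As a minimal sketch, and under the assumption that the wind map covers a known rectangular region of the target picture, the lookup of the matching pixel from the center-point position could be performed as follows; all names and the clamping behaviour here are illustrative rather than prescribed by this embodiment.

```python
def matching_pixel(center_x: float, center_y: float,
                   region_origin: tuple[float, float],
                   region_size: tuple[float, float],
                   map_width: int, map_height: int) -> tuple[int, int]:
    """Map a center-point position inside the wind-map action region to a pixel index."""
    u = (center_x - region_origin[0]) / region_size[0]
    v = (center_y - region_origin[1]) / region_size[1]
    # Clamp so positions on the border of the action region still resolve to a pixel.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return int(u * (map_width - 1)), int(v * (map_height - 1))
```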
S250, judging whether the interactive wind switch state recorded in the A color channel of the matching pixel point is the on state: if yes, executing S260; otherwise, executing S290.
In this embodiment, if the interactive wind switch state recorded in the A color channel of the matching pixel point is the on state, the current processing element needs deformation processing, so that its deformation effect is displayed in the target picture; if the recorded state is the off state, the current processing element needs no deformation processing, and no deformation effect of it is displayed in the target picture.
Through the interactive wind switch state recorded in the A color channel, model elements whose lack of deformation processing would not noticeably affect the display effect of the target picture can be ignored, so that precious memory resources are saved for other display tasks.
And S260, determining the current processing element as a target model element, and executing S270.
S270, calculating target interaction wind parameters matched with the current processing element according to wind force values recorded in the R color channel and the B color channel in the matched pixel point, and executing S280.
Optionally, the target interactive wind parameters matched with the current processing element may be calculated from the wind force values recorded in the R and B color channels of the matching pixel point as follows:
mapping the wind force values recorded in the R color channel and the B color channel onto the X axis and the Z axis respectively to obtain an X-axis wind force vector and a Z-axis wind force vector;
and performing a vector operation on the X-axis wind force vector and the Z-axis wind force vector to obtain the target interactive wind parameters matched with the current processing element, wherein the target interactive wind parameters include wind force and wind direction.
Typically, the vector operation may be vector addition, and the wind force acting on the current processing element and the wind direction value may be obtained by vector addition of the X-axis wind force vector and the Z-axis wind force vector.
Alternatively, a wind height value (a Y-axis wind force value), that is, the action height of the interactive wind, may be recorded in the G color channel of the matching pixel point.
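The following is a minimal sketch of the vector operation described above: the R-channel and B-channel wind values are treated as the X-axis and Z-axis components, combined by vector addition, and expressed as a wind force (magnitude) and a wind direction (unit vector), with the optional G-channel wind height passed through unchanged. The variable names are assumptions made for illustration.

```python
import math


def target_wind_params(wind_x: float, wind_z: float, wind_height: float = 0.0):
    """Return (force, direction, height) for the combined interactive wind at one pixel."""
    force = math.hypot(wind_x, wind_z)  # magnitude of the summed X/Z wind vectors
    if force == 0.0:
        direction = (0.0, 0.0)          # no wind: direction is irrelevant
    else:
        direction = (wind_x / force, wind_z / force)
    return force, direction, wind_height
```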
S280, performing deformation processing on the current processing element according to the target interaction wind parameters, and executing S290.
S290, judging whether the processing of all the candidate model elements is completed or not: if yes, ending the flow; otherwise, execution returns to S230.
According to the technical scheme of this embodiment, an interactive wind map matched with the target picture to be rendered is acquired; deformation processing is performed on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map; and each target model element after the deformation processing is rendered and displayed. Because interactive wind switch states are recorded at different positions in the interactive wind map, which model elements in the target picture need interactive deformation processing can be determined from those switch states.
On the basis of the above embodiments, performing deformation processing on the current processing element according to the target interaction wind parameter may include:
extracting vertex data of a current processing element, wherein the vertex data comprises: vertex coordinates of each vertex and information of a center point of the current processing element; and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters and the information of the center point to obtain vertex deformation coordinates of each vertex.
In this embodiment, the current processing element (i.e., a model element) refers to a specific individual element from among a large number of small objects laid in a screen to be rendered and displayed. For example, the model element may be a flower in the flower sea, a grass in a cluster of grass on a lawn, or a leaf on a tree, etc.
In this embodiment, each model element has a preset number of vertices (generally a plurality of vertices), where vertex data corresponding to the model element needs to be pre-stored for each model element. The vertex data includes: vertex coordinates of each vertex and information of a center point of the model element.
And after the vertex coordinates of the vertexes are connected in sequence, the outline shape of the model element can be obtained.
In general, one model element may include one or more center points therein, which are typically located at the root of the model element proximate the ground. After the center point is set, when the interactive deformation processing is carried out on the model elements, the root is ensured not to generate interactive swing, so that the interactive deformation processing result is closer to the real situation.
The information of the center point is information indicating the position of the center point in the model element; for example, it may be the center point coordinates, or the vectors between each vertex coordinate and the corresponding center point coordinate.
In one specific example, it is assumed that a model of a cluster of grass includes a plurality of model elements, each model element being one blade of grass, and that a model element contains 5 vertices (A, B, C, D and E).
In an alternative implementation of this example, as shown in Fig. 2b, the point F is the only center point and lies at the root; in the model element, the vectors AF, BF, CF, DF and EF are the vectors from the vertices to the center point F.
In another alternative implementation of this example, in Fig. 2c, one blade of grass contains 3 center points, namely D, E and F. The vectors from the vertices to the center points are AF, BD, CE, DD and EE; that is, in Fig. 2c there is more than one center point, and each vertex may point to any one of the center points. In Figs. 2b and 2c, the solid lines with arrows represent the vectors from the vertices to the center points, and the broken lines illustrate the overall outline of the blade of grass. In addition, the small circles around D and E in Fig. 2c indicate that the vectors from D and E to their center points point to the vertices themselves (zero vectors).
The vertex data may be stored in UV data, vertex color data, an instancing array, a tangent, a normal or a bitangent, which is not limited in this embodiment. Since a vertex may carry multiple sets of UV data, the center point data may be stored in a specified set of UV data, i.e. UVn, where n denotes the n-th UV set. Vertex color data comprises the four channels R, G, B and A. An instancing array is an array indexed by the vertex ID or another vertex-related integer variable.
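As a minimal sketch of the vertex data described above, the per-element data could be held in a plain container like the one below, with the center-point information stored as a per-vertex vector to the chosen center point (one of the storage options listed). How an engine actually packs this into UV sets, vertex colors or instancing arrays is engine-specific and not prescribed here.

```python
from dataclasses import dataclass


@dataclass
class ElementVertex:
    position: tuple[float, float, float]   # vertex coordinates in model space
    to_center: tuple[float, float, float]  # vector from this vertex to its center point


@dataclass
class ModelElement:
    vertices: list[ElementVertex]          # e.g. the five vertices A..E of one blade of grass
```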
In this embodiment, the wind response of each vertex of the current processing element to the interactive wind is computed according to the information of the center point of the current processing element. The advantage of this arrangement is that, when the current processing element is deformed by the interactive wind, it bends around the center point instead of being scattered in all directions, which keeps the deformation of the current processing element realistic.
Optionally, according to the target interaction wind parameter and the information of the center point, the vertex coordinates of each vertex are adjusted, and the manner of obtaining the vertex deformation coordinates of each vertex may be:
For each vertex, obtaining a vector between each vertex coordinate and a corresponding center point coordinate, and generating a wind vector corresponding to each vertex coordinate according to the target interaction wind parameter; and calculating vertex deformation coordinates corresponding to each vertex according to vector operation (vector addition or vector subtraction) of the two vectors.
Alternatively, the bending angle and bending direction corresponding to each vertex may be calculated according to the wind force and wind direction in the target interactive wind parameters and a preset relation mapping table; the vectors between each vertex coordinate and the corresponding center point coordinate are then rotated by the bending angle along the bending direction, and the vertex deformation coordinates corresponding to each vertex are determined from the new vectors obtained after the rotation.
Alternatively, an interactive wind bending function may be preset, which takes the target interactive wind parameters and the information of the center point as independent variables and the vertex offset (in the X and Z directions) as the dependent variable. Based on the interactive wind bending function, the vertex offset corresponding to each vertex can be calculated, and the vertex deformation coordinates of each vertex can then be calculated from the vertex coordinates and the vertex offset.
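As a minimal sketch of the bending-function approach just described, the function below takes the target interactive wind parameters and the vertex's relation to its center point as inputs and yields the deformed vertex position. The specific falloff used here (an X/Z offset proportional to how far the vertex sits above its center point, so the root barely moves) is an illustrative assumption, not the patent's prescribed formula.

```python
def bend_vertex(vertex: tuple[float, float, float],
                center: tuple[float, float, float],
                wind_force: float,
                wind_dir: tuple[float, float]) -> tuple[float, float, float]:
    """Offset a vertex along the wind direction, scaled by the wind force and by
    the vertex's height above its center point."""
    height_above_center = max(vertex[1] - center[1], 0.0)
    offset = wind_force * height_above_center
    return (vertex[0] + wind_dir[0] * offset,
            vertex[1],
            vertex[2] + wind_dir[1] * offset)
```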
On the basis of the above embodiments, the vertex data may further include: a softness factor for each vertex, the softness factor being associated with the location of the vertex in the model element;
correspondingly, according to the target interaction wind parameter and the information of the center point, the vertex coordinates of each vertex are adjusted to obtain vertex deformation coordinates of each vertex, which may include:
and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters, the information of the center point and the softness factor of each vertex to obtain the vertex deformation coordinates of each vertex.
In this embodiment, different softness factors may further be stored for different vertices, so that different portions of a single model element can deform by different amounts. Typically, in a real interactive deformation, the tip of a blade of grass deforms the most, the deformation amplitude decreases toward the root, and the root generally does not deform at all. To simulate this real scene in the game picture, the inventors creatively added to the vertex data a softness factor (which may also be called a wind-affected factor) for each vertex.
The range of the softness factor can be set to [0, 1]. The larger the softness factor of a vertex, the larger the deformation of that vertex under the interactive wind; the smaller the softness factor, the smaller the deformation; and when the softness factor of a vertex is 0, the vertex does not deform at all under the interactive wind.
In a specific example, as shown in Fig. 2b, the softness factor of vertex A may be set to 0.9, the softness factors of vertices B and C to 0.5, and the softness factors of vertices D and E to 0. In this case, even if only the single center point F is set, the root vertices D and E do not sway (deform) at all under the interactive wind, because of the introduced softness factors.
The advantage of this arrangement is that, when a target model element in the picture to be rendered is acted on by the interactive wind, different positions of the element sway (deform) with different amplitudes, so that its interactive deformation effect is closer to the real situation.
In an alternative implementation of this embodiment, adjusting the vertex coordinates of each vertex according to the target interactive wind parameters, the information of the center point and the softness factor of each vertex to obtain the vertex deformation coordinates is done in a similar manner to the foregoing: for example, when calculating the bending angle or constructing the interactive wind bending function, the softness factor may be added as an additional input, so that the computed vertex deformation coordinates reflect the softness factors added to the calculation.
In the foregoing optional implementation manner of this embodiment, the vertex data may further include: individual characteristic data for distinguishing the model element from other model elements of the same type.
Correspondingly, according to the target interaction wind parameter and the information of the center point, the vertex coordinates of each vertex are adjusted to obtain vertex deformation coordinates of each vertex, which may include:
and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters, the information of the central point and the individual characteristic data to obtain vertex deformation coordinates of each vertex.
The individual characteristic data may be a randomly generated value, a value computed according to a certain rule, or a value set according to the characteristics of the object; the vertices of the same model element may be given the same or different individual characteristic data.
In an alternative implementation of this embodiment, different individual characteristic data are set for different model elements. When the picture to be rendered includes multiple target model elements of the same type (for example, a cluster of grass formed by many blades), the deformation of each target model element under the interactive wind is then not exactly the same. This avoids the stiff display effect in which every blade of grass in the picture bends by the same angle in the same direction under the interactive wind; instead, the interactive wind deformation of each target model element differs slightly, making the picture more natural and free of a repetitive feel.
In the above optional implementation of this embodiment, adjusting the vertex coordinates of each vertex according to the target interactive wind parameters, the information of the center point and the individual characteristic data is done in a similar manner to the foregoing: for example, when calculating the bending angle or constructing the interactive wind bending function, the individual characteristic data may be added as an additional input, so that the computed vertex deformation coordinates of different model elements differ slightly and the rendered picture better reflects the actual situation in a real scene.
In an optional implementation manner of this embodiment, the vertex data may further include: softness factors of each vertex and individual characteristic data;
correspondingly, according to the target interaction wind parameter and the information of the center point, the vertex coordinates of each vertex are adjusted to obtain vertex deformation coordinates of each vertex, which may include:
and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters, the information of the center point, the softness factor of each vertex and the individual characteristic data to obtain vertex deformation coordinates of each vertex.
In this optional implementation, the softness factor of each vertex and the individual characteristic data are used together to calculate the vertex deformation coordinates of each vertex of a target model element under the interactive wind. As a result, different parts of each target model element in the picture to be rendered deform by different amounts during the interactive deformation, and the vertex deformation coordinates of different target model elements also differ slightly. This further improves the display effect of the target model elements under the interactive wind, brings every model element in the picture closer to a real interactive deformation effect, and improves the realism and interactivity of the picture.
On the basis of the above embodiments, according to the target interaction wind parameter, the information of the center point, the softness factor of each vertex and the individual feature data, vertex coordinates of each vertex are adjusted to obtain vertex deformation coordinates of each vertex, including:
calculating to obtain deformation offset corresponding to each vertex by taking the target interaction wind parameter, the information of the center point, the softness factor of each vertex and the individual characteristic data as weights;
And obtaining vertex deformation coordinates of the vertexes according to the vertex coordinates of the vertexes and deformation offset corresponding to the vertexes respectively.
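The following is a minimal sketch of the weighted-offset computation described above: the wind force, the vertex's distance from its center point, its softness factor and an individual feature value all act as multiplicative weights on the X/Z offset that is finally added to the vertex coordinates. Treating the individual feature value as a small scale jitter around 1.0 is an assumption made for illustration.

```python
def deform_offset(wind_force: float,
                  wind_dir: tuple[float, float],
                  dist_to_center: float,
                  softness: float,            # in [0, 1]; 0 means the vertex never moves
                  individual_feature: float   # e.g. a per-element random value in [0, 1]
                  ) -> tuple[float, float]:
    """Compute the X/Z deformation offset for one vertex."""
    jitter = 0.8 + 0.4 * individual_feature   # assumed mapping to roughly [0.8, 1.2]
    scale = wind_force * dist_to_center * softness * jitter
    return wind_dir[0] * scale, wind_dir[1] * scale


def deformed_vertex(vertex: tuple[float, float, float],
                    offset_xz: tuple[float, float]) -> tuple[float, float, float]:
    """Add the X/Z offset to the vertex coordinates to obtain the deformed coordinates."""
    dx, dz = offset_xz
    return (vertex[0] + dx, vertex[1], vertex[2] + dz)
```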
On the basis of the above embodiments, adjusting the vertex coordinates of each vertex according to the target interaction wind parameter, the information of the center point, the softness factor of each vertex and the individual feature data to obtain vertex deformation coordinates of each vertex may further include:
calculating the bending angle and the bending direction of the target vertex relative to the center point according to the target interaction wind parameters, the softness factor of the currently processed target vertex in the model element and the individual characteristic data;
and acquiring a center point coordinate according to the information of the center point, rotating the vertex coordinate of the target vertex by the bending angle along the bending direction by taking the center point coordinate as a rotation center, and obtaining a vertex deformation coordinate corresponding to the target vertex.
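As a minimal sketch of the rotation-based variant described above, the vertex can be rotated about its center point by a bending angle, about a horizontal axis perpendicular to the bending direction, so that the element tips toward the wind while its root stays fixed. The way the bending angle is derived below from the wind force, softness factor and individual characteristic data is an illustrative assumption.

```python
import math


def bend_angle(wind_force: float, softness: float, individual_feature: float) -> float:
    """Assumed mapping: stronger wind, softer vertex and per-element jitter give a larger tilt."""
    return wind_force * softness * (0.8 + 0.4 * individual_feature) * math.pi / 4


def rotate_about_center(vertex: tuple[float, float, float],
                        center: tuple[float, float, float],
                        bend_dir_xz: tuple[float, float],
                        angle_rad: float) -> tuple[float, float, float]:
    """Rotate `vertex` around `center` by `angle_rad` toward the horizontal direction `bend_dir_xz`."""
    vx, vy, vz = (vertex[0] - center[0], vertex[1] - center[1], vertex[2] - center[2])
    dx, dz = bend_dir_xz
    # Split the horizontal offset into a component along the bend direction and a side component.
    along = vx * dx + vz * dz
    side_x, side_z = vx - along * dx, vz - along * dz   # unchanged by the rotation
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    new_along = along * cos_a + vy * sin_a              # the vertex leans toward the wind
    new_y = -along * sin_a + vy * cos_a                 # and comes down as it leans
    return (center[0] + side_x + new_along * dx,
            center[1] + new_y,
            center[2] + side_z + new_along * dz)
```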
Example 3
Fig. 3 is a block diagram of a picture rendering apparatus according to a third embodiment of the present invention. The device comprises: the system comprises an interactive wind map acquisition module 310, a deformation processing module 320 and a rendering display module 330. Wherein:
An interactive wind map obtaining module 310, configured to obtain an interactive wind map that matches a target frame to be rendered; wherein, the interactive wind map records interactive wind parameters of different positions and interactive wind switch states;
the deformation processing module 320 is configured to perform deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map;
and the rendering display module 330 is configured to render and display each of the object model elements after the deformation processing.
According to the technical scheme of this embodiment, an interactive wind map matched with the target picture to be rendered is acquired; deformation processing is performed on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map; and each target model element after the deformation processing is rendered and displayed. Because interactive wind switch states are recorded at different positions in the interactive wind map, which model elements in the target picture need interactive deformation processing can be determined from those switch states.
Based on the above embodiments, the interactive wind map may be a rendered texture map;
in the render texture map, the interactive wind parameters at different positions and the interactive wind switch states are recorded in the RGBA color channels of the pixel points.
On the basis of the above embodiments, in the rendering texture map, wind force values of the interactive wind in the X-axis direction and the Z-axis direction are recorded respectively through an R-color channel and a B-color channel, and the state of the interactive wind switch is recorded through an a-color channel;
the interactive wind switch state is either an on state or an off state: positions in the on state participate in the interactive wind deformation calculation, while positions in the off state do not.
Based on the above embodiments, the interactive wind map may include a dynamic-object interactive wind field.
On the basis of the above embodiments, each pixel point in the interactive wind map is associated with a pixel point in the target picture, or each pixel point in the interactive wind map is associated with a pixel point in a set sub-area of the target picture.
Based on the above embodiments, the deformation processing module 320 is executed by the first shader; rendering display module 330 is performed by a second shader.
Based on the above embodiments, the deformation processing module 320 may specifically include:
the candidate model element acquisition unit is used for acquiring at least one candidate model element matched with the action range of the interactive wind map in the target picture;
the current processing element acquisition unit is used for sequentially acquiring an alternative model element as a current processing element;
the matching pixel point acquisition unit is used for acquiring a matching pixel point in the interactive wind map according to the position of the current processing element in the target picture;
the target model element determining unit is used for determining the current processing element as a target model element if the interactive wind switch state recorded in the A color channel in the matched pixel point is an on state;
the target interaction wind parameter calculation unit is used for calculating target interaction wind parameters matched with the current processing elements according to wind force values recorded in the R color channel and the B color channel in the matched pixel points;
the deformation processing unit is used for performing deformation processing on the current processing element according to the target interaction wind parameter;
And the return execution unit is used for returning to execute the operation of sequentially acquiring one candidate model element as the current processing element until the processing of all the candidate model elements is completed.
The target interaction wind parameter calculation unit may be specifically configured to:
mapping the wind force values recorded in the R color channel and the B color channel onto the X axis and the Z axis respectively to obtain an X-axis wind force vector and a Z-axis wind force vector;
and performing a vector operation on the X-axis wind force vector and the Z-axis wind force vector to obtain the target interactive wind parameters matched with the current processing element, wherein the target interactive wind parameters include wind force and wind direction.
The deformation processing unit may specifically be configured to: extracting vertex data of a current processing element, wherein the vertex data comprises: vertex coordinates of each vertex and information of a center point of the current processing element;
and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters and the information of the center point to obtain vertex deformation coordinates of each vertex.
On the basis of the above embodiments, the vertex data further includes: a softness factor for each vertex, the softness factor being associated with a location of the vertex in the current processing element;
Accordingly, the deformation processing unit may be specifically configured to: and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters, the information of the center point and the softness factor of each vertex to obtain the vertex deformation coordinates of each vertex.
On the basis of the above embodiments, the vertex data further includes: individual feature data for distinguishing the current processing element from other types of model elements;
accordingly, the deformation processing unit may be specifically configured to: and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters, the information of the central point and the individual characteristic data to obtain vertex deformation coordinates of each vertex.
On the basis of the above embodiments, the vertex data further includes: softness factors of each vertex and individual characteristic data;
accordingly, the deformation processing unit may be specifically configured to: and adjusting the vertex coordinates of each vertex according to the target interaction wind parameters, the information of the center point, the softness factor of each vertex and the individual characteristic data to obtain vertex deformation coordinates of each vertex.
The picture rendering device provided by the embodiment of the invention can execute the picture rendering method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example 4
Fig. 4 is a schematic structural diagram of a computer device in a fourth embodiment of the present invention. Fig. 4 illustrates a block diagram of an exemplary computer device 412 suitable for use in implementing embodiments of the invention. The computer device 412 shown in fig. 4 is only an example and should not be construed as limiting the functionality and scope of use of embodiments of the invention.
As shown in FIG. 4, computer device 412 is in the form of a general purpose computing device. Components of computer device 412 may include, but are not limited to: one or more processors or processing units 416, a system memory 428, and a bus 418 that connects the various system components (including the system memory 428 and processing units 416).
Bus 418 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 412 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by computer device 412 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 428 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 430 and/or cache memory 432. The computer device 412 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 434 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard disk drive"). Although not shown in fig. 4, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 418 via one or more data medium interfaces. Memory 428 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 440 having a set (at least one) of program modules 442 may be stored in, for example, memory 428, such program modules 442 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 442 generally perform the functions and/or methodologies in the described embodiments of the invention.
The computer device 412 may also communicate with one or more external devices 414 (e.g., keyboard, pointing device, display 424, etc.), one or more devices that enable a user to interact with the computer device 412, and/or any devices (e.g., network card, modem, etc.) that enable the computer device 412 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 422. Moreover, computer device 412 may also communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through network adapter 420. As shown, network adapter 420 communicates with other modules of computer device 412 over bus 418. It should be appreciated that although not shown in fig. 4, other hardware and/or software modules may be used in connection with computer device 412, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 416 executes various functional applications and data processing by running programs stored in the system memory 428, for example, to implement a picture rendering method provided by an embodiment of the present invention, including:
acquiring an interactive wind map matched with a target picture to be rendered; wherein, the interactive wind map records interactive wind parameters of different positions and interactive wind switch states; according to the interactive wind map, performing deformation processing on each target model element in the target picture by using target interactive wind parameters of corresponding position relations; rendering and displaying each target model element after deformation processing.
Example 5
The fifth embodiment of the present invention further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a picture rendering method as provided by the embodiments of the present invention, including:
acquiring an interactive wind map matched with a target picture to be rendered; wherein, the interactive wind map records interactive wind parameters of different positions and interactive wind switch states; according to the interactive wind map, performing deformation processing on each target model element in the target picture by using target interactive wind parameters of corresponding position relations; rendering and displaying each target model element after deformation processing.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that the above are only preferred embodiments of the present invention and the technical principles applied thereto. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to those embodiments and may be embodied in many other equivalent forms without departing from the spirit of the invention, the scope of which is defined by the appended claims.

Claims (13)

1. A picture rendering method, comprising:
acquiring an interactive wind map matched with a target picture to be rendered, wherein the interactive wind map records interactive wind parameters and interactive wind switch states at different positions;
performing, according to the interactive wind map, deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions;
rendering and displaying each target model element after the deformation processing;
wherein the interactive wind map is a rendering texture map;
the interactive wind parameters at different positions and the interactive wind switch states are recorded through RGBA color channels of pixel points in the rendering texture map;
in the rendering texture map, wind force values of the interactive wind in the X-axis direction and the Z-axis direction are recorded through the R color channel and the B color channel respectively, and the switch state of the interactive wind is recorded through the A color channel;
the interactive wind switch state comprises an on state or an off state; a position in the on state participates in the interactive wind deformation calculation, and a position in the off state does not participate in the interactive wind deformation calculation.
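For illustration only (this is not part of the claim language), one plausible way to pack and unpack the per-pixel data described above, with the R and B channels carrying the X-axis and Z-axis wind force values and the A channel carrying the switch state, is sketched below. The signed [-1, 1] value range and the function names are assumptions:

```python
# Hypothetical packing of per-pixel interactive wind data into an RGBA texel.
def pack_wind_texel(wind_x, wind_z, switch_on):
    """Assumes signed wind force values in [-1, 1] mapped to [0, 1] channels."""
    r = (wind_x + 1.0) * 0.5        # X-axis wind force -> R channel
    g = 0.0                         # G channel unused in this sketch
    b = (wind_z + 1.0) * 0.5        # Z-axis wind force -> B channel
    a = 1.0 if switch_on else 0.0   # interactive wind switch state -> A channel
    return (r, g, b, a)

def unpack_wind_texel(texel):
    """Inverse of pack_wind_texel: returns (wind_x, wind_z, switch_on)."""
    r, _, b, a = texel
    return (r * 2.0 - 1.0, b * 2.0 - 1.0, a > 0.5)
```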
2. The method of claim 1, wherein the interactive wind map comprises: an interactive wind field of a dynamic object.
3. The method of claim 1, wherein each pixel point in the interactive wind map is associated with a corresponding pixel point in the target picture, or each pixel point in the interactive wind map is associated with a corresponding pixel point in a set sub-region of the target picture.
4. The method according to claim 1, wherein the operation of performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map is performed by a first shader;
and the operation of rendering and displaying each target model element after the deformation processing is performed by a second shader.
5. The method according to any one of claims 1 to 4, wherein performing deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map comprises:
acquiring, in the target picture, at least one candidate model element matched with the action range of the interactive wind map;
sequentially acquiring one candidate model element as a current processing element;
acquiring a matched pixel point in the interactive wind map according to the position of the current processing element in the target picture;
if the interactive wind switch state recorded in the A color channel of the matched pixel point is the on state, determining the current processing element as a target model element;
calculating a target interactive wind parameter matched with the current processing element according to the wind force values recorded in the R color channel and the B color channel of the matched pixel point, and performing deformation processing on the current processing element according to the target interactive wind parameter;
and returning to execute the operation of sequentially acquiring one candidate model element as the current processing element until all the candidate model elements have been processed.
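A minimal, self-contained sketch of this per-candidate loop is given below. The wind map is modeled as a plain dict keyed by integer pixel coordinates with (r, g, b, a) values in [0, 1], and element positions are normalized (u, v) pairs; both representations, and the [-1, 1] channel decoding, are assumptions made only to keep the example runnable:

```python
def process_candidates(candidates, wind_map, width, height):
    """candidates: list of (name, (u, v)); wind_map: dict[(px, py)] -> (r, g, b, a)."""
    targets = []
    for name, (u, v) in candidates:                    # sequentially take one candidate
        px = min(int(u * width), width - 1)            # matched pixel point in the wind map
        py = min(int(v * height), height - 1)
        r, _, b, a = wind_map.get((px, py), (0.5, 0.0, 0.5, 0.0))
        if a <= 0.5:                                   # A channel: switch state is off
            continue                                   # element is skipped, not deformed
        wind_x = r * 2.0 - 1.0                         # R channel -> X-axis wind force value
        wind_z = b * 2.0 - 1.0                         # B channel -> Z-axis wind force value
        targets.append((name, wind_x, wind_z))         # confirmed target model element
    return targets

# Usage sketch:
# process_candidates([("grass_01", (0.25, 0.75))], {(1, 3): (0.9, 0.0, 0.4, 1.0)}, 4, 4)
```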
6. The method of claim 5, wherein calculating the target interactive wind parameter matched with the current processing element according to the wind force values recorded in the R color channel and the B color channel of the matched pixel point comprises:
mapping the wind force values recorded in the R color channel and the B color channel onto the X axis and the Z axis respectively to obtain an X-axis wind force vector and a Z-axis wind force vector;
and performing a vector operation on the X-axis wind force vector and the Z-axis wind force vector to obtain the target interactive wind parameter matched with the current processing element, wherein the target interactive wind parameter comprises a wind force and a wind direction.
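One way to illustrate this vector operation is to treat the decoded X-axis and Z-axis wind force values as a 2D vector whose length is taken as the wind force and whose angle as the wind direction. The [-1, 1] decoding range in the sketch below is an assumption:

```python
import math

def wind_force_and_direction(r_value, b_value):
    wind_x = r_value * 2.0 - 1.0             # R channel -> X-axis wind force vector
    wind_z = b_value * 2.0 - 1.0             # B channel -> Z-axis wind force vector
    force = math.hypot(wind_x, wind_z)       # vector length -> wind force
    direction = math.atan2(wind_z, wind_x)   # vector angle -> wind direction (radians)
    return force, direction
```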
7. The method of claim 5, wherein performing deformation processing on the current processing element according to the target interactive wind parameter comprises:
extracting vertex data of the current processing element, wherein the vertex data comprises: vertex coordinates of each vertex and information of a center point of the current processing element;
and adjusting the vertex coordinates of each vertex according to the target interactive wind parameter and the information of the center point to obtain vertex deformation coordinates of each vertex.
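A possible form of this vertex adjustment is sketched below. The claim only states that the adjustment uses the target interactive wind parameter and the center point, so the specific height-above-center weighting is an assumption for illustration:

```python
def adjust_vertices(vertices, center, wind_x, wind_z):
    """vertices: list of (x, y, z); center: (x, y, z) of the current processing element."""
    deformed = []
    for x, y, z in vertices:
        weight = max(0.0, y - center[1])      # vertices above the center bend more (assumed)
        deformed.append((x + wind_x * weight, y, z + wind_z * weight))
    return deformed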
8. The method of claim 7, wherein the vertex data further comprises: a softness factor of each vertex, the softness factor being associated with the position of the vertex in the current processing element;
wherein adjusting the vertex coordinates of each vertex according to the target interactive wind parameter and the information of the center point to obtain the vertex deformation coordinates of each vertex comprises:
adjusting the vertex coordinates of each vertex according to the target interactive wind parameter, the information of the center point and the softness factor of each vertex to obtain the vertex deformation coordinates of each vertex.
9. The method of claim 7, wherein the vertex data further comprises: individual feature data for distinguishing the current processing element from other types of model elements;
wherein adjusting the vertex coordinates of each vertex according to the target interactive wind parameter and the information of the center point to obtain the vertex deformation coordinates of each vertex comprises:
adjusting the vertex coordinates of each vertex according to the target interactive wind parameter, the information of the center point and the individual feature data to obtain the vertex deformation coordinates of each vertex.
10. The method of claim 7, wherein the vertex data further comprises: a softness factor of each vertex and individual feature data;
wherein adjusting the vertex coordinates of each vertex according to the target interactive wind parameter and the information of the center point to obtain the vertex deformation coordinates of each vertex comprises:
adjusting the vertex coordinates of each vertex according to the target interactive wind parameter, the information of the center point, the softness factor of each vertex and the individual feature data to obtain the vertex deformation coordinates of each vertex.
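Extending the sketch after claim 7, the per-vertex softness factor of claim 8 and the individual feature data of claim 9 can both be folded into the same offset as extra scale factors, which is how claim 10 combines them. The multiplicative form below is an assumption for illustration:

```python
def adjust_vertices_full(vertices, softness, center, wind_x, wind_z, feature_scale=1.0):
    """softness: per-vertex factors; feature_scale: stands in for the individual feature data."""
    deformed = []
    for (x, y, z), soft in zip(vertices, softness):
        weight = max(0.0, y - center[1]) * soft * feature_scale   # combined scaling (assumed)
        deformed.append((x + wind_x * weight, y, z + wind_z * weight))
    return deformed
```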
11. A picture rendering apparatus, comprising:
an interactive wind map acquisition module, configured to acquire an interactive wind map matched with a target picture to be rendered, wherein the interactive wind map records interactive wind parameters and interactive wind switch states at different positions;
a deformation processing module, configured to perform deformation processing on each target model element in the target picture by using the target interactive wind parameters at the corresponding positions according to the interactive wind map;
and a rendering display module, configured to render and display each target model element after the deformation processing;
wherein the interactive wind map is a rendering texture map;
the interactive wind parameters at different positions and the interactive wind switch states are recorded through RGBA color channels of pixel points in the rendering texture map;
in the rendering texture map, wind force values of the interactive wind in the X-axis direction and the Z-axis direction are recorded through the R color channel and the B color channel respectively, and the switch state of the interactive wind is recorded through the A color channel;
the interactive wind switch state comprises an on state or an off state; a position in the on state participates in the interactive wind deformation calculation, and a position in the off state does not participate in the interactive wind deformation calculation.
12. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the picture rendering method according to any one of claims 1-10 when executing the program.
13. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements a picture rendering method as claimed in any one of claims 1-10.
CN202011003955.4A 2020-09-22 2020-09-22 Picture rendering method and device, computer equipment and storage medium Active CN112132936B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011003955.4A CN112132936B (en) 2020-09-22 2020-09-22 Picture rendering method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112132936A CN112132936A (en) 2020-12-25
CN112132936B true CN112132936B (en) 2024-03-29

Family

ID=73842525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011003955.4A Active CN112132936B (en) 2020-09-22 2020-09-22 Picture rendering method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112132936B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116964661A (en) * 2020-12-31 2023-10-27 华为技术有限公司 GPU, SPPU and task processing method
CN113362436B (en) * 2021-05-31 2023-09-12 上海米哈游璃月科技有限公司 Object rendering method, device, equipment and storage medium
CN117036560B (en) * 2023-10-10 2024-01-02 福州朱雀网络科技有限公司 Wind field simulation method, medium and equipment suitable for virtual scene

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9916760B1 (en) * 2016-06-30 2018-03-13 Amazon Technologies, Inc. Efficient traffic map rendering
CN109934897A (en) * 2019-03-06 2019-06-25 珠海金山网络游戏科技有限公司 A kind of swing effect simulation system, calculates equipment and storage medium at method
CN110838162A (en) * 2019-11-26 2020-02-25 网易(杭州)网络有限公司 Vegetation rendering method and device, storage medium and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107463398B (en) * 2017-07-21 2018-08-17 腾讯科技(深圳)有限公司 Game rendering intent, device, storage device and terminal

Also Published As

Publication number Publication date
CN112132936A (en) 2020-12-25

Similar Documents

Publication Publication Date Title
CN112132936B (en) Picture rendering method and device, computer equipment and storage medium
EP3882870B1 (en) Method and device for image display, storage medium and electronic device
CN112241993B (en) Game image processing method and device and electronic equipment
CN108564646A (en) Rendering intent and device, storage medium, the electronic device of object
US9183654B2 (en) Live editing and integrated control of image-based lighting of 3D models
CN110930486A (en) Rendering method and device of virtual grass in game and electronic equipment
CN112102492B (en) Game resource manufacturing method and device, storage medium and terminal
CN111882632A (en) Rendering method, device and equipment of ground surface details and storage medium
CN111583378B (en) Virtual asset processing method and device, electronic equipment and storage medium
CN110930484B (en) Animation configuration method and device, storage medium and electronic device
CN112206535A (en) Rendering display method and device of virtual object, terminal and storage medium
CN112132938B (en) Model element deformation processing and picture rendering method, device, equipment and medium
CN102306097B (en) Method for performing real-time image processing on scene images in MultiGen-Vega
US11100723B2 (en) System, method, and terminal device for controlling virtual image by selecting user interface element
CN111882640A (en) Rendering parameter determination method, device, equipment and storage medium
CN112132935A (en) Model element deformation processing method, model element deformation processing device, model element image rendering method, model element image rendering device and model element image rendering medium
CN112132934A (en) Model element deformation processing method, model element deformation processing device, model element image rendering method, model element image rendering device and model element image rendering medium
CN107274468B (en) Wind-catching processing method and system applied to three-dimensional game
CN115970275A (en) Projection processing method and device for virtual object, storage medium and electronic equipment
CN113192173B (en) Image processing method and device of three-dimensional scene and electronic equipment
CA2316611A1 (en) Method and apparatus for providing depth blur effects within a 3d videographics system
CN112473135A (en) Real-time illumination simulation method, device, equipment and storage medium for mobile game
CN111462343A (en) Data processing method and device, electronic equipment and storage medium
CN111882637B (en) Picture rendering method, device, equipment and medium
CN112132937A (en) Model element deformation processing method, model element deformation processing device, model element image rendering method, model element image rendering device and model element image rendering medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant