CN114037811B - 3D temperature field graph rendering method, apparatus, medium, and device based on directed distance field - Google Patents
- Publication number
- CN114037811B (application CN202111372182A)
- Authority
- CN
- China
- Prior art keywords
- rendering
- texture
- bvh
- data
- temperature field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T17/10—Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
- G06T15/04—Texture mapping
- G06T17/205—Re-meshing
- G06T3/4007—Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
- G06T7/90—Determination of colour characteristics
Abstract
The invention relates to a 3D temperature field map rendering method based on a directed distance field, which comprises the following steps: extracting triangle data from the triangular surface model and constructing a BVH acceleration structure; baking out directed distance field data based on the BVH acceleration structure and storing it to a 3D texture; rendering a 3D temperature field map on the GPU by using the 3D texture; and mixing the rendered 3D temperature field map with the traditional scene rendering result and outputting the mixed result to a display. The method of the invention is no longer limited to rendering only on the surface of the model, but can truly render accurate temperature information in 3D space. The invention also relates to a 3D temperature field map rendering apparatus, storage medium, and device based on the directed distance field.
Description
Technical Field
The invention relates to the technical field of computers, in particular to a 3D temperature field graph rendering method and device based on a directed distance field, a storage medium and equipment.
Background
Because most graphics engines render based on triangular surfaces, existing temperature field maps are represented on surfaces and cannot accurately reflect the real situation of a 3D space. For example, the surface temperature of a human body differs from its internal temperature; a traditional scheme can only draw the surface temperature and cannot represent this difference.
Therefore, there is a strong need in the art for a method capable of truly representing the temperature field inside a 3D space.
Disclosure of Invention
The invention aims to solve the above technical problems of the prior art, and provides a 3D temperature field map rendering method, apparatus, storage medium and device based on a directed distance field, which are mainly used for solving the technical problem that existing surface-based rendering schemes cannot accurately represent the temperature inside a 3D space.
The technical scheme for solving the technical problems is as follows:
a method of 3D temperature field map rendering based on a directed distance field, the method comprising:
extracting triangle data in the triangular surface model, and constructing a BVH acceleration structure;
baking out directional distance field data based on the BVH acceleration structure and storing to a 3D texture;
the GPU renders a 3D temperature field map by using the 3D texture;
and mixing the rendered 3D temperature field map with the traditional scene rendering result and outputting the mixed result to a display.
Preferably, the extracting triangle data in the triangular surface model and the constructing the BVH acceleration structure include:
acquiring a vertex buffer object VBO and a triangle sequence number buffer object IBO in a three-dimensional engine through a static grid;
extracting triangle data by using the vertex buffer object VBO and the triangle sequence number buffer object IBO, and storing the triangle data into a user-defined triangle data structure;
and constructing a BVH acceleration structure based on an SAH method.
Preferably, baking out the directional distance field data based on the BVH-accelerating structure and storing to the 3D texture comprises:
creating a 3D texture object based on the three-dimensional engine;
setting a filtering mode, and starting bilinear interpolation or trilinear interpolation to perform texture sampling filtering;
the 3D texture object is filled.
Preferably, the filling the 3D texture object comprises:
voxelizing the BVH accelerated structure space into the voxel number corresponding to the 3D texture;
and querying the directional distance of the current voxel relative to the triangular surface model by using the position of the voxel in the BVH acceleration structure.
Preferably, the GPU rendering the 3D temperature field map using the 3D texture comprises:
transmitting the 3D texture, the BVH bounding box data and the actual temperature sensor data in the 3D space to a shader;
and rendering the 3D temperature field diagram in the shader by a ray stepping method, and simultaneously detecting a rendering range and performing a depth test.
Preferably, rendering the 3D temperature field map in the shader by a ray stepping method, and simultaneously detecting a rendering range and performing a depth test includes:
performing 3D rendering in the shader by a ray stepping method;
detecting and optimizing a rendering range, and executing the shader code;
calculating sampling UVW of the 3D texture and the weight of a single sampling point;
the final color of the current pixel is calculated.
Preferably, the calculating the final color of the current pixel comprises:
the color of a single 3D sampling point is Csp, the coordinates of the sampling point are sp, and the final color is C, wherein the part with the directed distance smaller than 0 is filtered by Clamp, and then the colors Ctemp of all the sampling points on the optical path L are accumulated;
sp=ro+rd*S;
Ctemp=f(sp,hps,whp,color)=∫whp*dist(sp,hp)/rhp*color;
sp is the position of a sampling point in three-dimensional space, ro is the ray origin, rd is the ray direction, S is the accumulated ray step length, hps is the array of sampling points of all temperature sensors, whp is the weight of a single sensor with a default value of 1, rhp is the data coverage radius, color is the basic color sampled from a color band, and hp is the sampling point of a single temperature sensor;
Csp=f(UVW)=clamp(sample(UVW),0,1)*Ctemp;
wherein, L is the total advancing length of the light, namely the light path, theta is the light step length, and Wsp is the weight of a single sampling point.
The invention has the beneficial effects that: a 3D temperature field map rendering method based on a directed distance field is provided, which is no longer limited to rendering only on the surface of a model, but can truly render accurate temperature information in 3D space. Similarly, if information such as air quality sensor data, gravity sensor data, pressure sensor data, wind sensor data or personnel density sensor data is acquired through the shader, the method of the invention can likewise output a rendering effect diagram corresponding to the actual scene, and is not limited to the 3D temperature field map.
The invention also solves another technical scheme of the technical problems as follows:
an apparatus for 3D temperature field map rendering based on a directed distance field, the apparatus comprising:
the extraction module is used for extracting triangular data in the triangular surface model and constructing a BVH acceleration structure;
a baking module to bake out directional distance field data based on the BVH acceleration structure and store to the 3D texture;
a rendering module to render a 3D temperature field map using the 3D texture by the GPU;
and the output module is used for mixing the 3D temperature field graph rendered by the rendering module and a traditional scene rendering result and outputting the mixture to a display.
Furthermore, the present invention provides a computer readable storage medium storing one or more programs, which are executable by one or more processors, to implement the steps in the method for 3D temperature field map rendering based on directed distance fields according to any of the above technical solutions.
The present invention also provides a 3D temperature field map rendering apparatus based on a directed distance field, comprising: a processor and a memory; the memory has stored thereon a computer readable program executable by the processor; the processor, when executing the computer readable program, implements the steps in the method for 3D temperature field map rendering based on directed distance fields according to any of the above aspects.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments of the present invention or in the description of the prior art will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic flow chart of a 3D temperature field map rendering method based on a directed distance field according to an embodiment of the present invention;
fig. 2 is a block diagram of an apparatus for rendering a 3D temperature field map based on a directed distance field according to another embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, shall fall within the scope of protection of the present invention.
As shown in fig. 1, a method for rendering a 3D temperature field map based on a directed distance field includes the steps of:
110. extracting triangle data in the triangular surface model, and constructing a BVH acceleration structure;
120. baking out directional distance field data based on the BVH acceleration structure and storing to a 3D texture;
130. the GPU renders a 3D temperature field map by using the 3D texture;
140. and mixing the rendered 3D temperature field map with the traditional scene rendering result and outputting the mixed result to a display.
According to the method, an explicit model (a traditional triangular surface model) is baked to obtain an implicit model (a directed distance field), then a 3D temperature field graph is directly rendered on a GPU by using a volume rendering method, and the 3D temperature field graph and a traditional 3D scene rendering result are correctly mixed and output to a display, so that accurate temperature information can be truly rendered in a 3D space.
Further, the extracting triangle data in the triangle surface model in step 110, and constructing the BVH (hierarchical bounding box) acceleration structure specifically includes:
111. acquiring a vertex buffer object VBO and a triangle sequence number buffer object IBO in a three-dimensional engine through a static grid;
112. extracting triangle data by using the vertex buffer object VBO and the triangle sequence number buffer object IBO, and storing the triangle data into a user-defined triangle data structure FTriangle, wherein for example, a triangle is formed by three vertexes with sequence numbers (0, 1 and 2);
113. and constructing a BVH acceleration structure based on the SAH (Surface Area Heuristic) method.
The invention selects the SAH method to construct the BVH acceleration structure; when the number of subsequent queries is large, this method yields a greater benefit. The specific aim of constructing the BVH acceleration structure is to divide the triangles in space into different AABB bounding boxes according to a set maximum number.
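As an illustration of this step, the following C++ sketch shows one possible shape for the custom triangle structure and a sweep-based SAH build. Apart from the FTriangle name, which the text mentions, everything here (the node layout, leaf size, and the simple longest-axis sweep) is an assumption for illustration rather than the patent's actual implementation.

```cpp
#include <algorithm>
#include <cstddef>
#include <memory>
#include <vector>

struct FVec3 { float x = 0, y = 0, z = 0; };

// Custom triangle structure filled from the VBO/IBO data in step 112.
struct FTriangle {
    FVec3 v0, v1, v2;
    FVec3 Centroid() const {
        return { (v0.x + v1.x + v2.x) / 3.f, (v0.y + v1.y + v2.y) / 3.f, (v0.z + v1.z + v2.z) / 3.f };
    }
};

struct FAabb {
    FVec3 mn{ 1e30f, 1e30f, 1e30f }, mx{ -1e30f, -1e30f, -1e30f };
    void Grow(const FVec3& p) {
        mn = { std::min(mn.x, p.x), std::min(mn.y, p.y), std::min(mn.z, p.z) };
        mx = { std::max(mx.x, p.x), std::max(mx.y, p.y), std::max(mx.z, p.z) };
    }
    void Grow(const FTriangle& t) { Grow(t.v0); Grow(t.v1); Grow(t.v2); }
    float Area() const {
        float dx = mx.x - mn.x, dy = mx.y - mn.y, dz = mx.z - mn.z;
        return 2.f * (dx * dy + dy * dz + dz * dx);
    }
};

struct FBvhNode {
    FAabb bounds;
    std::unique_ptr<FBvhNode> left, right;  // both null for a leaf
    std::vector<FTriangle> tris;            // triangles kept in the leaf's AABB
};

// Recursive SAH build: sweep candidate splits along the longest axis and keep the split
// whose estimated cost SA(L)/SA(P)*N_L + SA(R)/SA(P)*N_R beats turning the node into a leaf.
std::unique_ptr<FBvhNode> BuildBvh(std::vector<FTriangle> tris, size_t maxLeafTris = 8) {
    auto node = std::make_unique<FBvhNode>();
    for (const auto& t : tris) node->bounds.Grow(t);
    if (tris.size() <= maxLeafTris) { node->tris = std::move(tris); return node; }

    FVec3 ext{ node->bounds.mx.x - node->bounds.mn.x,
               node->bounds.mx.y - node->bounds.mn.y,
               node->bounds.mx.z - node->bounds.mn.z };
    int axis = (ext.x > ext.y && ext.x > ext.z) ? 0 : (ext.y > ext.z ? 1 : 2);
    auto key = [axis](const FTriangle& t) {
        FVec3 c = t.Centroid();
        return axis == 0 ? c.x : (axis == 1 ? c.y : c.z);
    };
    std::sort(tris.begin(), tris.end(),
              [&](const FTriangle& a, const FTriangle& b) { return key(a) < key(b); });

    // Suffix bounds so every candidate split's right-hand area is available in O(1).
    std::vector<FAabb> rightBounds(tris.size());
    FAabb acc;
    for (size_t i = tris.size(); i-- > 1;) { acc.Grow(tris[i]); rightBounds[i] = acc; }

    float parentArea = node->bounds.Area();
    float bestCost = static_cast<float>(tris.size());  // cost of keeping this node as a leaf
    size_t bestSplit = 0;
    FAabb leftAcc;
    for (size_t i = 0; i + 1 < tris.size(); ++i) {
        leftAcc.Grow(tris[i]);
        float cost = (leftAcc.Area() * (i + 1) +
                      rightBounds[i + 1].Area() * (tris.size() - i - 1)) / parentArea;
        if (cost < bestCost) { bestCost = cost; bestSplit = i + 1; }
    }
    if (bestSplit == 0) { node->tris = std::move(tris); return node; }

    std::vector<FTriangle> leftTris(tris.begin(), tris.begin() + bestSplit);
    std::vector<FTriangle> rightTris(tris.begin() + bestSplit, tris.end());
    node->left  = BuildBvh(std::move(leftTris), maxLeafTris);
    node->right = BuildBvh(std::move(rightTris), maxLeafTris);
    return node;
}
```

A plain sweep is the simplest SAH variant; a binned SAH build is the usual choice when the triangle count is large, but in either case the pay-off comes from the many signed-distance queries performed during baking.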
Preferably, baking the directional distance field data based on the BVH-accelerating structure and storing to the 3D texture in step 120 specifically comprises:
121. creating a 3D texture object based on the three-dimensional engine;
122. a filtering mode is set, and bilinear interpolation or trilinear interpolation is enabled, which gives smoother and softer color transitions;
123. the 3D texture object is filled.
The 3D texture is used for storing the directed distance. The 3D texture should be kept above 256 units so that a better display effect can be obtained; the number of units only affects the generation time. It should be noted that, compared with a common 2D texture map, a 3D texture object can be understood as a plurality of stacked 2D textures.
Further, the filling the 3D texture object in step 123 specifically includes:
1231. voxelizing the BVH accelerated structure space into the voxel number corresponding to the 3D texture;
1232. and querying the directed distance of the current voxel relative to the triangular surface model by using the position of the voxel in the BVH acceleration structure. Here, "directed" can be understood as being inside or outside the model: the coefficient is +1 if the voxel is inside the model and -1 if it is outside.
The voxel is a unit cube in 3D space.
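A minimal sketch of the fill loop from steps 1231 and 1232 follows, assuming the signed-distance query against the BVH is available as a callable. The type and function names are illustrative, and the sign convention follows the text: positive inside the model, negative outside.

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Signed distance of a world-space point to the triangle model; assumed to be implemented
// as a nearest-triangle search over the BVH plus an inside/outside test (+ inside, - outside).
using FSignedDistanceQuery = std::function<float(float x, float y, float z)>;

struct FBakedDistanceField {
    int sizeX = 0, sizeY = 0, sizeZ = 0;
    std::vector<float> voxels;  // sizeX*sizeY*sizeZ values, later uploaded to the 3D texture
};

// Voxelize the BVH bounding box into the 3D texture resolution and fill each voxel with the
// signed distance of its centre to the triangular surface model.
FBakedDistanceField BakeDistanceField(const float bvhMin[3], const float bvhMax[3],
                                      int sizeX, int sizeY, int sizeZ,
                                      const FSignedDistanceQuery& query) {
    FBakedDistanceField field{ sizeX, sizeY, sizeZ };
    field.voxels.resize(static_cast<size_t>(sizeX) * sizeY * sizeZ);
    for (int z = 0; z < sizeZ; ++z)
        for (int y = 0; y < sizeY; ++y)
            for (int x = 0; x < sizeX; ++x) {
                // Voxel centre in world space, inside the BVH bounding box.
                float px = bvhMin[0] + (x + 0.5f) / sizeX * (bvhMax[0] - bvhMin[0]);
                float py = bvhMin[1] + (y + 0.5f) / sizeY * (bvhMax[1] - bvhMin[1]);
                float pz = bvhMin[2] + (z + 0.5f) / sizeZ * (bvhMax[2] - bvhMin[2]);
                size_t idx = (static_cast<size_t>(z) * sizeY + y) * sizeX + x;
                field.voxels[idx] = query(px, py, pz);
            }
    return field;
}
```

The resulting float volume is what would be uploaded into the engine's 3D texture object created in step 121, with bilinear or trilinear filtering enabled as in step 122.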
Preferably, the step 130 of rendering the 3D temperature field map by the GPU using the 3D texture specifically includes:
131. transmitting the 3D texture, the BVH bounding box data, the actual temperature sensor data in the 3D space and the like to a shader through a material instance UMaterialInstance in the three-dimensional engine;
132. and rendering the 3D temperature field diagram in the shader by a ray stepping method, and simultaneously detecting a rendering range and performing a depth test.
The temperature sensor data comprises information such as the sensor temperature value, the sensor acquisition range, and the sensor acquisition position. Moreover, the temperature sensor data in the invention can be replaced by other sensor data, such as air quality sensor data or personnel density sensor data, with a corresponding 3D rendering map generated accordingly, so the invention can also be applied to rendering other 3D maps that display real data and is not limited to the temperature field map.
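For illustration only, the per-sensor data listed above could be packed for upload roughly as follows; the struct and field names are assumptions rather than the patent's actual layout.

```cpp
// Per-sensor sample passed to the shader (illustrative names).
struct FTemperatureSensorSample {
    float position[3];  // hp: acquisition position of the sensor in 3D space
    float value;        // measured temperature value, mapped onto the color band
    float radius;       // rhp: data coverage radius of the sensor
    float weight;       // whp: per-sensor weight, default 1
};
```

An array of such samples corresponds to the hps array used in the color formulas below.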
Preferably, the rendering the 3D temperature field map in the shader in the step 132 by a ray stepping method, and the detecting the rendering range and performing the depth test by using the RayAABB (ray-axis aligned bounding box collision detection) specifically include:
1321. performing 3D rendering in the shader by a ray stepping (ray marching) method;
1322. detecting and optimizing the rendering range: the shader code is executed only if the ray emitted from the camera position intersects the AABB bounding box of the model and the depth of the entry point is less than the scene depth of the three-dimensional engine;
1323. the sampling UVW of the 3D texture is calculated, as well as the individual sampling point weights.
1324. The final color of the current pixel is calculated.
Specifically, when the ray intersects the AABB bounding box of the model, the point P where the ray enters the AABB and the point Q where it exits are calculated, the total travel length of the ray (the optical path) L is obtained, and the sampling UVW of the 3D texture can then be calculated according to the ray step length θ set by the ray-marching method. The ray origin is ro, the ray direction is rd, m1 is the minimum point of the AABB, m2 is the maximum point of the AABB, and the accumulated ray step length is S; the sampling UVW and the weight Wsp of a single sampling point are derived from these quantities.
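The patent's own expressions for UVW and Wsp are given as figures and are not reproduced in this text, so the sketch below shows only one plausible formulation consistent with the surrounding definitions: a slab-method RayAABB test, UVW obtained by remapping the sample position sp = ro + rd*S into the unit cube of the AABB, and Wsp taken as θ/L so the weights along one ray sum to one. These choices are assumptions for illustration.

```cpp
#include <algorithm>
#include <utility>

struct FRayHit { bool hit; float tNear, tFar; };

// Slab-method ray/AABB intersection (the RayAABB test of step 1322).
// ro: ray origin, rd: normalized ray direction, m1/m2: min/max corners of the AABB.
FRayHit IntersectRayAabb(const float ro[3], const float rd[3],
                         const float m1[3], const float m2[3]) {
    float tNear = -1e30f, tFar = 1e30f;
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / rd[a];  // IEEE infinities handle axis-parallel rays
        float t0 = (m1[a] - ro[a]) * inv, t1 = (m2[a] - ro[a]) * inv;
        if (t0 > t1) std::swap(t0, t1);
        tNear = std::max(tNear, t0);
        tFar  = std::min(tFar, t1);
    }
    return { tNear <= tFar && tFar >= 0.0f, std::max(tNear, 0.0f), tFar };
}

// Assumed mapping: UVW is sp remapped into [0,1]^3 inside the AABB, Wsp = theta / L.
void SampleUvwAndWeight(const float ro[3], const float rd[3], float S,
                        const float m1[3], const float m2[3],
                        float L, float theta, float uvw[3], float& wsp) {
    for (int a = 0; a < 3; ++a) {
        float sp = ro[a] + rd[a] * S;                 // sp = ro + rd * S
        uvw[a] = (sp - m1[a]) / (m2[a] - m1[a]);      // normalize into the bounding box
    }
    wsp = theta / L;
}
```

With P at distance tNear and Q at tFar along the ray, the optical path is L = tFar - tNear, and S advances from 0 to L in steps of θ.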
preferably, the calculating the final color of the current pixel in step 1324 specifically includes:
the color of a single 3D sampling point is Csp, the coordinates of the sampling point are sp, and the final color is C; the part with a directed distance less than 0 is filtered out by Clamp, because the temperature effect is rendered only inside the model, and the colors Ctemp of all sampling points on the optical path L are then accumulated;
sp=ro+rd*S;
Ctemp=f(sp,hps,whp,color)=∫whp*dist(sp,hp)/rhp*color;
wherein sp is the position of a sampling point in three-dimensional space, ro is the ray origin, rd is the ray direction, S is the accumulated ray step length, hps is the array of sampling points of all temperature sensors, whp is the weight of a single sensor with a default value of 1, rhp is the data coverage radius, color is the basic color sampled from a color band, and hp is the sampling point of a single temperature sensor;
Csp=f(UVW)=clamp(sample(UVW),0,1)*Ctemp;
After the 3D temperature field map is obtained by GPU rendering in step 130, the rendering result is mixed with the existing traditional scene rendering result and output to the display: a simple color blend is executed, and the result is finally displayed on the display by the three-dimensional engine. The specific output is:
Output=f(SceneColor,Csp)=SceneColor.rgb*(1-Csp.a)+Csp.rgb*Csp.a。
rgb represents the three primary colors of red, green and blue in the color, a represents the transparency, and SceneColor is the current frame buffer color.
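Putting the pieces together, the sketch below mirrors the shader loop on the CPU for illustration: it marches along the optical path, masks each sample with the clamped distance-field value so that only points inside the model contribute, accumulates the sensor colors, and applies the Output blend given above. The per-sensor falloff kernel and the per-sample weighting are assumptions; the printed formula for Ctemp and the figure giving the accumulation of C are not reproduced exactly here.

```cpp
#include <algorithm>
#include <cmath>
#include <functional>
#include <vector>

struct FColor { float r = 0, g = 0, b = 0, a = 0; };

// hp (position), rhp (radius), whp (weight) and the color sampled from the color band.
struct FSensor { float pos[3]; float radius; float weight; FColor color; };

// Trilinear sample of the baked 3D distance texture at normalized UVW (assumed helper).
using FDistanceSample = std::function<float(const float uvw[3])>;

FColor MarchTemperature(const float ro[3], const float rd[3],
                        const float m1[3], const float m2[3],
                        float L, float theta,
                        const std::vector<FSensor>& sensors,
                        const FDistanceSample& sampleSdf) {
    FColor C;
    for (float S = 0.0f; S < L; S += theta) {
        float sp[3], uvw[3];
        for (int a = 0; a < 3; ++a) {
            sp[a]  = ro[a] + rd[a] * S;
            uvw[a] = (sp[a] - m1[a]) / (m2[a] - m1[a]);
        }
        // Ctemp: contribution of every sensor whose coverage radius reaches this sample.
        FColor Ctemp;
        for (const FSensor& s : sensors) {
            float dx = sp[0] - s.pos[0], dy = sp[1] - s.pos[1], dz = sp[2] - s.pos[2];
            float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
            if (dist > s.radius) continue;
            float k = s.weight * (1.0f - dist / s.radius);  // assumed falloff kernel
            Ctemp.r += k * s.color.r; Ctemp.g += k * s.color.g;
            Ctemp.b += k * s.color.b; Ctemp.a += k * s.color.a;
        }
        // Csp = clamp(sample(UVW), 0, 1) * Ctemp; positive distance means inside the model.
        float mask = std::clamp(sampleSdf(uvw), 0.0f, 1.0f);
        float Wsp  = theta / L;  // assumed per-sample weight
        C.r += Wsp * mask * Ctemp.r; C.g += Wsp * mask * Ctemp.g;
        C.b += Wsp * mask * Ctemp.b; C.a += Wsp * mask * Ctemp.a;
    }
    return C;
}

// Output = SceneColor.rgb * (1 - Csp.a) + Csp.rgb * Csp.a
FColor BlendOverScene(const FColor& scene, const FColor& csp) {
    return { scene.r * (1 - csp.a) + csp.r * csp.a,
             scene.g * (1 - csp.a) + csp.g * csp.a,
             scene.b * (1 - csp.a) + csp.b * csp.a,
             1.0f };
}
```

In the actual invention this logic runs in the shader on the GPU; the CPU version is only meant to make the data flow of the formulas explicit.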
The 3D temperature field graph rendering method based on the directed distance field is not limited to rendering on the surface of a model any more, but can truly render accurate temperature information in a 3D space. Similarly, if information such as air quality sensor data, gravity sensor data, pressure sensor data, wind sensor data or personnel density sensor data is acquired through the shader, the rendering effect diagram corresponding to the actual scene can be output through the method of the invention, and the method is not limited to the 3D temperature field diagram.
The invention also solves another technical scheme of the technical problems as follows:
as shown in fig. 2, an apparatus for rendering a 3D temperature field map based on a directed distance field includes:
the extraction module is used for extracting triangular data in the triangular surface model and constructing a BVH acceleration structure;
a baking module to bake out directional distance field data based on the BVH acceleration structure and store to the 3D texture;
a rendering module to render a 3D temperature field map using 3D textures via a GPU;
and the output module is used for mixing the 3D temperature field graph rendered by the rendering module and a traditional scene rendering result and outputting the mixture to a display.
The 3D temperature field map rendering device based on the directed distance field can accurately display the temperature of the space inside the model, which is more accurate and more intuitive than the prior art, which can only describe the temperature of the model surface. The invention places no requirement on the triangular surface model: the model can be provided by a user or an existing triangular surface model can be used directly, and any triangular surface model can be processed directly to obtain the accurate temperature of the space inside the model.
Furthermore, the present invention provides a computer readable storage medium storing one or more programs, which are executable by one or more processors, to implement the steps in the method for 3D temperature field map rendering based on directed distance fields according to any of the above technical solutions.
The present invention also provides a 3D temperature field map rendering apparatus based on a directed distance field, comprising: a processor and a memory; the memory has stored thereon a computer readable program executable by the processor; the processor, when executing the computer readable program, implements the steps in the method for 3D temperature field map rendering based on directed distance fields according to any of the above aspects.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium.
Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer memory, Read-only memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, etc. It should be noted that the computer readable medium may contain other components which may be suitably increased or decreased as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media which may not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (8)
1. A method for 3D temperature field map rendering based on a directed distance field, the method comprising the steps of:
extracting triangle data in the triangular surface model, and constructing a BVH acceleration structure;
baking out directional distance field data based on the BVH acceleration structure and storing to a 3D texture;
the GPU renders a 3D temperature field map by utilizing the 3D texture, the BVH bounding box data and the actual temperature sensor data in the 3D space;
mixing the rendered 3D temperature field map with a traditional scene rendering result and outputting the result to a display;
wherein baking out directional distance field data based on the BVH accelerating structure and storing to a 3D texture comprises:
creating a 3D texture object based on the three-dimensional engine;
setting a filtering mode, and starting bilinear interpolation or trilinear interpolation to perform texture sampling filtering;
filling the 3D texture object;
the populating the 3D texture object includes:
voxelizing the BVH accelerated structure space into the voxel number corresponding to the 3D texture;
and querying the directional distance of the current voxel relative to the triangular surface model by using the position of the voxel in the BVH acceleration structure.
2. The method of claim 1, wherein extracting triangle data in the triangular surface model and constructing the BVH acceleration structure comprises:
acquiring a vertex buffer object VBO and a triangle sequence number buffer object IBO in a three-dimensional engine through a static grid;
extracting triangle data by using the vertex buffer object VBO and the triangle sequence number buffer object IBO, and storing the triangle data into a user-defined triangle data structure;
and constructing a BVH acceleration structure based on an SAH method.
3. The method of claim 1, wherein the GPU rendering the 3D temperature field map using 3D textures, BVH bounding box data, and actual temperature sensor data in 3D space comprises:
transmitting the 3D texture, the BVH bounding box data and the actual temperature sensor data in the 3D space to a shader;
and rendering the 3D temperature field diagram in the shader by a ray stepping method, and simultaneously detecting a rendering range and performing a depth test.
4. The method of claim 3, wherein the step of rendering the 3D temperature field map by ray stepping in the shader comprises:
performing 3D rendering in the shader by a ray stepping method;
detecting and optimizing a rendering range, and executing the shader code;
calculating sampling UVW of the 3D texture and the weight of a single sampling point;
the final color of the current pixel is calculated.
5. The method of claim 4, wherein computing the current pixel final color comprises:
the color of a single 3D sampling point is Csp, the coordinates of the sampling point are sp, and the final color is C, wherein the part with the directed distance smaller than 0 is filtered by Clamp, and then the colors Ctemp of all the sampling points on the optical path L are accumulated;
sp=ro+rd*S;
Ctemp=f(sp,hps,whp,color)=∫whp*dist(sp,hp)/rhp*color;
wherein sp is the position of a sampling point in a three-dimensional space, ro is a light starting point, rd is a light direction, S is an accumulated light step length, hps is an array of sampling points of all temperature sensors, whp is a single sensor weight default value of 1, rhp is a data coverage radius, color is a basic color, the sampling is carried out from a color band, and hp is a single sampling point of the temperature sensor;
Csp=f(UVW)=clamp(sample(UVW),0,1)*Ctemp;
wherein, L is the total advancing length of the light, namely the light path, theta is the light step length, and Wsp is the weight of a single sampling point.
6. An apparatus for 3D temperature field map rendering based on a directed distance field, the apparatus comprising:
the extraction module is used for extracting triangular data in the triangular surface model and constructing a BVH acceleration structure;
a baking module to bake out directional distance field data based on the BVH acceleration structure and store to the 3D texture;
the rendering module is used for rendering a 3D temperature field map by using the 3D texture, the BVH bounding box data and the actual temperature sensor data in the 3D space through the GPU;
the output module is used for mixing the 3D temperature field graph rendered by the rendering module and a traditional scene rendering result and outputting the mixture to a display;
wherein, the baking module specifically comprises:
creating a 3D texture object based on the three-dimensional engine;
setting a filtering mode, and starting bilinear interpolation or trilinear interpolation to perform texture sampling filtering;
filling the 3D texture object;
the populating the 3D texture object includes:
voxelizing the BVH accelerated structure space into the voxel number corresponding to the 3D texture;
and querying the directional distance of the current voxel relative to the triangular surface model by using the position of the voxel in the BVH acceleration structure.
7. A computer readable storage medium storing one or more programs thereon, the one or more programs being executable by one or more processors to perform the steps in the method for 3D temperature field map rendering based on directed distance fields according to any of claims 1-5.
8. A 3D temperature field map rendering device based on a directed distance field, comprising: a processor and a memory; the memory has stored thereon a computer readable program executable by the processor; the processor, when executing the computer readable program, implements the steps in the method for 3D temperature field map rendering based on a directed distance field of any of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111372182.1A CN114037811B (en) | 2021-11-18 | 2021-11-18 | 3D temperature field graph rendering method, apparatus, medium, and device based on directed distance field |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111372182.1A CN114037811B (en) | 2021-11-18 | 2021-11-18 | 3D temperature field graph rendering method, apparatus, medium, and device based on directed distance field |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114037811A CN114037811A (en) | 2022-02-11 |
CN114037811B (en) | 2022-05-27
Family
ID=80138153
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111372182.1A Active CN114037811B (en) | 2021-11-18 | 2021-11-18 | 3D temperature field graph rendering method, apparatus, medium, and device based on directed distance field |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114037811B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115601506B (en) * | 2022-11-07 | 2024-05-28 | 上海人工智能创新中心 | Reconstruction method of three-dimensional scene, electronic equipment and medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109903385A (en) * | 2019-04-29 | 2019-06-18 | 网易(杭州)网络有限公司 | Rendering method, device, processor and the terminal of threedimensional model |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190392102A1 (en) * | 2018-06-22 | 2019-12-26 | Xplicit Computing, Inc. | Unified geometries for dynamic high-performance computing |
CN109701273B (en) * | 2019-01-16 | 2022-04-19 | 腾讯科技(北京)有限公司 | Game data processing method and device, electronic equipment and readable storage medium |
CN111292405B (en) * | 2020-02-06 | 2022-04-08 | 腾讯科技(深圳)有限公司 | Image rendering method and related device |
CN111737835B (en) * | 2020-06-28 | 2021-10-15 | 大连理工大学 | Three-period minimum curved surface-based three-dimensional porous heat dissipation structure design and optimization method |
- 2021-11-18: CN202111372182.1A filed in China; granted as patent CN114037811B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN114037811A (en) | 2022-02-11 |
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant
- CP03: Change of name, title or address
  - Patentee after: Beijing Youhao Technology Co., Ltd., Room 105, first floor, building 82, No. 10, Jiuxianqiao Road, Chaoyang District, Beijing 100015, China
  - Patentee before: BEIJING YOUNUO TECHNOLOGY Co., Ltd., Room 105, first floor, building 82, No. 10, Jiuxianqiao Road, Chaoyang District, Beijing 100015, China