CN112991508A - WebGL-based 3D rendering system and method - Google Patents

WebGL-based 3D rendering system and method Download PDF

Info

Publication number
CN112991508A
CN112991508A (application CN202110340974.4A)
Authority
CN
China
Prior art keywords
module
library
texture
rendering
webgl
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110340974.4A
Other languages
Chinese (zh)
Inventor
彭碧梧
彭梦雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Serva Software Shanghai Co ltd
Original Assignee
Serva Software Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Serva Software Shanghai Co ltd filed Critical Serva Software Shanghai Co ltd
Priority to CN202110340974.4A priority Critical patent/CN112991508A/en
Publication of CN112991508A publication Critical patent/CN112991508A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/55 Radiosity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a WebGL-based 3D rendering system and method. The system comprises a scene object library, a special effect library, an animation frame library, a matrix calculation library, a graphical user interface library and an auxiliary module library, wherein the scene object library comprises a scene object module, a light object module, a camera object module, a material object module, a particle object module, a model object module and a texture object module; the animation frame library comprises a skinned animation module and a skeletal animation module; and the matrix calculation library comprises a vector calculation module and a model bounding box calculation module. The invention provides users with a middleware mechanism, so that they can rapidly develop interactive three-dimensional graphics applications without knowing the underlying details; GLSL 300 es is also supported and can be called directly without extra loading; users can develop without being familiar with the computer graphics knowledge system, which reduces the difficulty of use and shortens the development cycle; development efficiency is improved and development cost is reduced.

Description

WebGL-based 3D rendering system and method
Technical Field
The invention relates to 3D rendering systems, and in particular to a WebGL-based 3D rendering system and method.
Background
WebGL (Web Graphics Library) is a 3D drawing standard that binds JavaScript to OpenGL ES 2.0. By adding a JavaScript binding for OpenGL ES 2.0, WebGL provides hardware-accelerated 3D rendering for the HTML5 Canvas element, so that 3D scenes and models can be displayed smoothly in a browser. It allows interactive Web three-dimensional animation to be produced with HTML scripts alone, without any browser plug-in, and it performs graphics rendering through a unified, standard, cross-platform OpenGL-style interface that exploits the underlying graphics hardware acceleration.
The interface exposed by WebGL itself is low-level, so development requires familiarity with the computer graphics knowledge system; the barrier to entry is therefore high and development cycles are long. Existing WebGL-based three-dimensional engines offer only a low degree of encapsulation and cannot meet front-end developers' needs for building three-dimensional functionality. The prior art therefore suffers from high development cost, low efficiency and long development cycles.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the defects of the prior art and provide a 3D rendering system and method based on WebGL.
In order to solve the technical problems, the invention provides the following technical scheme:
the invention relates to a WebGL-based 3D rendering system, which comprises a scene object library, a special effect library, a moving picture frame library, a rectangular calculation library, a graphical user interface library and an auxiliary module library, wherein,
the scene object library comprises a scene object module, a light object module, a camera object module, a material object module, a particle object module, a model object module and a texture object module;
the special effect library comprises a screen space ambient light shielding module, a floodlight module, a high dynamic range imaging module, a tone mapping module, a frame selecting module, a light emitting selecting module, a light sweeping module, a 3D Tiles loading module and an oblique photography module;
the animation frame library comprises a skin animation module and a skeleton animation module;
the rectangle calculation library comprises a vector calculation module and a model bounding box calculation module.
As a preferred technical solution of the present invention, the graphical user interface library is used for adjusting object properties on a page.
As a preferred embodiment of the present invention, the auxiliary module library includes a position display module, a bounding box module, and a coordinate axis module.
As a preferred technical solution of the present invention, the scene object module of the scene object library is configured to display a three-dimensional model;
the light object module of the scene object library is used for light source emission and light reflection;
the camera object module of the scene object library is used for perspective display;
the material object module of the scene object library is used to assign material characteristics to a three-dimensional model, wherein,
the material object module comprises a default material, a PBR material, a custom drawing material, a shader material, a billboard material, a terrain material and a border line material;
the particle object module of the scene object library is used for rendering particles;
the model object module of the scene object library is used for loading models, wherein,
the loadable model formats include OBJ, FBX and glTF;
the texture object modules of the scene object library include a DDS texture format, a HDR texture format, a Basis texture format, and an Image texture format.
As a preferred technical solution of the present invention, the screen-space ambient occlusion module of the special effect library is configured to output an occlusion data texture;
the bloom module of the special effect library is used for realizing a light-overflow (glow) effect;
the high dynamic range imaging module of the special effect library is used for rendering both the brightest and the darkest details;
the tone mapping module of the special effect library is used for converting the output of the high dynamic range imaging module into LDR;
the selection outline module of the special effect library is used for displaying an outline (stroke) effect;
the light sweep module of the special effect library is used for realizing a building glow effect;
the 3D Tiles loading module of the special effect library is used for streaming transmission and rendering of geographic 3D data;
the oblique photography module of the special effect library comprises an OSGB processing module, an FBX processing module and an STL processing module.
As a preferred technical solution of the present invention, the animation frame library comprises a skinned animation module and a skeletal animation module.
As a preferred technical solution of the present invention, the matrix calculation library comprises two- and three-dimensional vector calculation modules and a quaternion calculation module.
The invention also comprises a 3D rendering method based on WebGL, which is applied to the 3D rendering system based on WebGL and comprises the following steps:
step S1: initializing a WebGL context;
step S2: initializing a shader;
step S3: initializing a data cache;
step S4: initializing a UV texture;
step S5: creating a texture;
step S6: binding data to the texture described in said step S5;
step S7: displaying the frame data;
step S8: and releasing the resources.
Preferably, in step S2, initializing the shader includes the following steps:
step S201: creating a shader;
step S202: attaching the source code to the shader;
step S203: compiling the program;
step S204: linking (merging) the program.
Preferably, in step S3, initializing the data buffers for the vertex coordinates and the texture coordinates includes the following steps:
step S301: creating a buffer area;
step S302: binding a buffer area;
step S303: passing the data to a buffer;
step S304: unbinding the buffer.
Preferably, in step S4, initializing the UV texture includes the following steps:
step S401: creating a U texture and a V texture;
step S402: acquiring storage positions of a y sampler, a u sampler and a v sampler;
step S403: the texture unit number is passed to the shader, where,
numbered 0,1,2, respectively.
Preferably, in step S5, the creating a texture includes the following steps:
step S501: creating a texture object;
step S502: binding the texture object to the target;
step S503: configuring the texture parameters, wherein,
the configured parameters include the texture size, the minification filter, and the horizontal and vertical wrap (fill) modes.
Preferably, in the step S6, the step of binding data to the texture in the step S5 includes the steps of:
step S601: uploading UV data to the texture;
step S602: activating a texture unit;
step S603: binding textures;
step S604: specifying the bound texture object.
Preferably, in step S7, the displaying the frame data includes the following steps:
step S701: associating the buffer object with the shader program;
step S702: acquiring the position of the attribute;
step S703: enabling the attribute;
step S704: assigning attributes;
step S705: drawing a needed object;
step S706: and displaying on a screen.
Preferably, in step S8, the step of releasing resources includes the following steps:
step S801: releasing the shader program;
step S802: releasing the buffer area;
step S803: and releasing the YUV texture.
Preferably, in step S204, the merging (program linking) procedure includes the following steps:
step S20401: creating a program object;
step S20402: attaching the shaders;
step S20403: linking the shaders;
step S20404: the program is used after attribute assignment.
Preferably, in step S701, associating the buffer object and the shader program includes the following steps:
step S70101: clearing the target color;
step S70102: clearing the color buffer area;
step S70103: setting the viewport;
step S70104: binding data to a texture;
step S70105: the program is used.
The invention also comprises another 3D rendering method based on WebGL, which is applied to the 3D rendering system based on WebGL and comprises the following steps:
step SS1: creating a scene object;
step SS 2: binding a rendering target;
step SS3: creating a shot;
step SS 4: adding lamplight;
step SS 5: adding a model;
step SS 6: and performing screen rendering display.
The invention also comprises another 3D rendering method based on WebGL, which is applied to the 3D rendering system based on WebGL and is characterized by comprising the following steps:
step SSS1: obtaining a scene object,
creating a canvas from a container supplied by the user, and creating a new scene object that records information such as the specified ambient light, the specified clear colour of the drawing area, whether fog is enabled and the fog parameters, whether animation is enabled, whether special effects are enabled, and the exposure;
step SSS2: obtaining a three-dimensional model object,
creating a model object from a URL or from binary data; the three-dimensional model generally needs data-format parsing, the common three-dimensional model data formats including glTF and OBJ, and the purpose of the parsing is to convert the model into a common format for display, after which the vertex information is computed and the model is added to the scene through the WebGL interface for presentation;
step SSS3: setting three-dimensional model object properties,
after the model is added to the scene, a custom material map, position and size can be set for it, and illumination attributes can also be set, such as whether it receives shadows, whether it casts shadows, and whether double-sided display is enabled;
step SSS4: adding camera (lens) rotation interaction,
events can be added to the camera, such as whether to enable an animation mode, rotation, translation and zooming;
step SSS5: rendering, namely generating the displayed picture based on the display position information and attributes of each model instance to be rendered and on the rendering result of each model instance;
in the step SSS5, in order to display the three-dimensional model more realistically, suitable light sources, materials and textures, as well as scene and model details, need to be added, including:
light sources, i.e. light sources used to simulate the characteristics of various natural light sources for rendering, such as directional light, point lights, spot lights and ambient reflected light;
materials, i.e. materials used to simulate how natural objects reflect light, such as specular reflection materials and diffuse reflection materials;
textures, i.e. textures applied to a material through pictures to make objects appear more realistic.
The invention has the following beneficial effects: the invention provides users with a middleware mechanism, so that they can rapidly develop interactive three-dimensional graphics applications without knowing the underlying details; both WebGL GLSL 100 and GLSL 300 es are supported and can be called directly without extra loading; users can develop without being familiar with the computer graphics knowledge system, which reduces the difficulty of use and shortens the development cycle; development efficiency is improved and development cost is reduced.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is one of the flow diagrams of the present invention;
FIG. 2 is a second schematic flow chart of the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
Examples
The invention provides a WebGL-based 3D rendering system, which comprises a scene object library, a special effect library, an animation frame library, a matrix calculation library, a graphical user interface library and an auxiliary module library, wherein,
the scene object library comprises a scene object module, a light object module, a camera object module, a material object module, a particle object module, a model object module and a texture object module;
the special effect library comprises a screen-space ambient occlusion (SSAO) module, a bloom module, a high dynamic range imaging module, a tone mapping module, a selection outline module, a selection glow module, a light sweep module, a 3D Tiles loading module and an oblique photography module;
the animation frame library comprises a skinned animation module and a skeletal animation module;
the matrix calculation library comprises a vector calculation module and a model bounding box calculation module.
Further, the graphical user interface library is used for adjusting object properties on the page.
The auxiliary module library comprises a position display module, a bounding box module and a coordinate axis module.
The scene object module of the scene object library is used for displaying a three-dimensional model;
the light object module of the scene object library is used for light-source emission and light reflection; the light sources include directional (parallel) light, point lights and ambient (indirect) light, and the reflections include diffuse reflection and ambient reflection;
the camera object module of the scene object library is used for perspective display;
the material object module of the scene object library is used to assign material characteristics to a three-dimensional model, wherein,
the material object module comprises default materials, PBR materials (physically based rendering, PBR, a concept in which realistic shading/illumination models are combined with measured surface parameter values to accurately represent real-world materials), custom drawing materials, shader materials, billboard materials, terrain materials and border line materials;
the particle object module of the scene object library is used for particle rendering; a particle system is a special kind of mesh element that has no definite surface and consists of a set of internal vertices, each of which displays a given map, and it can show effects such as smoke and flame in a 3D scene. In a purely GPU-computed scheme, only the time and a few other parameters need to be uploaded each frame, so performance is good, but control is difficult and draw batches are hard to manage. Alternatively, one container generates many particles at a time and a large particle container only needs to be drawn once, the dynamic image being realized by uploading the changed data to the GPU each frame; this scheme is very convenient to control on the CPU side and can easily realize a large variety of effects with a single draw call per container, but its drawbacks are that the GPU-side data must be updated dynamically and that most of the computation happens on the CPU, which may consume CPU performance;
the model object module of the scene object library is used for loading models, where the loadable formats include OBJ, FBX and glTF, and glTF extension plug-ins such as KHR_blend, KHR_draco_mesh_compression, KHR_lights_punctual, KHR_materials_common, KHR_materials_unlit, KHR_texture_transform, KHR_materials_pbrSpecularGlossiness and MSFT_lod are also supported;
the texture object modules of the scene object library include a DDS texture format, a HDR texture format, a Basis texture format, and an Image texture format,
the DDS texture format uses the texture compression format S3TC (S3 Texture Compression); compression allows more textures to be stored in the limited texture cache, and because DDS supports a 6:1 compression ratio, a 6 MB texture can be compressed to 1 MB in the texture cache, saving memory while improving display performance;
the HDR texture format synthesizes the LDR (Low Dynamic Range) images that capture the best detail at each exposure into a final HDR image; in plain terms, the same picture is exposed multiple times and the exposures are then combined into one image;
the Basis texture format, i.e. the Basis Universal texture format, occupies 6 to 8 times less GPU memory than the traditional JPEG format while requiring a similar amount of storage space, which makes it a good alternative to formats such as JPEG and PNG that are inefficient on the GPU and do not work across platforms; compressed files created with Basis Universal suit many common application scenarios such as games, VR and AR, maps, photos and short videos;
the Image texture format covers ordinary picture maps such as the common PNG and JPG formats.
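As an illustration of the DDS/S3TC case above, the following is a minimal JavaScript sketch of uploading an already-parsed S3TC (DXT5) payload through the standard WEBGL_compressed_texture_s3tc extension; the variables dxt5Data, width and height are assumed inputs produced by a DDS parser that is not shown:

const ext = gl.getExtension('WEBGL_compressed_texture_s3tc');
if (ext) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // Upload the pre-compressed blocks directly; the data stays compressed in GPU memory.
  gl.compressedTexImage2D(gl.TEXTURE_2D, 0, ext.COMPRESSED_RGBA_S3TC_DXT5_EXT,
                          width, height, 0, dxt5Data);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
}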
The screen-space ambient occlusion (SSAO) module of the special effect library is used for outputting a data texture. In a general illumination model, ambient light simulates secondary scattering of light, so that places that are not directly illuminated still have some brightness. However, a corner and an exposed surface should not receive the same amount of ambient light, whereas in the basic illumination model the ambient light is a constant. The SSAO algorithm is applied as a screen-space post-process, typically with deferred rendering; it is expensive because it must be run once per fragment, and it is impractical with plain forward rendering. The SSAO pass is inserted after the G-buffer pass and before the formal shading pass, and its output is also a data texture, recording how exposed each fragment is to ambient light. During formal shading, the ambient light term only needs to be multiplied by the value read from the SSAO texture;
the floodlight module of the special effect library is used for realizing a light overflow effect and simulating a light source lighting or heating technology, and the bright light sources are distinguished in a mode that the bright light sources emit light rays which are scattered to the periphery, so that an observer can generate a light source or the bright area is a strong light area. The principle is as follows: firstly, setting a floodlight brightness threshold, then screening according to a rendering image of a screen, filtering out all pixels (black pixels) smaller than the threshold, reserving other pixels, reserving only the pixels to be luminous, wherein the other pixels are all generated by a black floodlight effect due to a diffraction effect, and the floodlight effect seen in the real world is that the brightest place is actually diffused to a dark place, namely, the boundary is not obvious in the bright place, so that the floodlight part, namely, the image obtained by the last operation is required to be subjected to a fuzzy operation to achieve the effect of light overflow, and finally, the processed image and the original image are superposed to obtain the final effect.
The high dynamic range imaging module of the special effect library is used for rendering the brightest and darkest details;
the tone mapping module of the special effect library is used to convert the output of the high dynamic range imaging module into LDR. In a common 3D engine rendering pipeline, the R, G and B colour channels are usually kept in the range [0,1] (or [0,255]), where 0 represents no luminance and 1 the maximum luminance the display can show. Although this representation is straightforward, it does not reflect the brightness of light in the real world, where light intensity sometimes exceeds the maximum brightness a display can reproduce, and the human eye adapts to the illumination level when observing real objects. A more realistic rendering approach therefore allows colour values to exceed 1; this kind of lighting calculation or environment map is the HDR (High Dynamic Range) lighting or HDR environment map often seen in game engines. However, luminance values rendered with HDR may exceed what the display can show, so the illumination result must be converted from HDR to an LDR image that the display can present normally; this process is generally referred to as tone mapping;
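The conversion described above can be sketched with a fragment shader that applies the simple Reinhard operator; this is only one possible tone-mapping curve, and the uniform and varying names used here are illustrative:

const toneMapFS = `#version 300 es
precision highp float;
uniform sampler2D u_hdrColor;   // HDR rendering result sampled from a framebuffer texture
in vec2 v_uv;
out vec4 fragColor;
void main() {
  vec3 hdr = texture(u_hdrColor, v_uv).rgb;
  vec3 ldr = hdr / (hdr + vec3(1.0));   // Reinhard operator: maps [0, infinity) into [0, 1)
  fragColor = vec4(ldr, 1.0);
}`;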
the module is selected to the frame in special effects storehouse is used for showing the effect of tracing, lets the object surface have a layer of mask layer promptly, principle: firstly, the model is required to be slightly more fat, and the original size of the model can be wrapped; microscopically, if a surface is allowed to expand outward, i.e., the direction in which the normal of the surface points, then the surface is allowed to translate a little toward the normal; in another aspect, for a vertex, that is, what the vertex shader really wants to write, when the vertex is normally calculated, the vertex is transformed through MVP and finally transferred to the fragment shader, so that the vertex can be slightly translated along the normal direction at this step. Then, the 'fat' model is rendered firstly, and then the original model is rendered at the same position to shield the two repeated parts, thus showing the stroking effect.
The light sweep module of the special effect library is used for realizing a building glow effect: a centre point is defined, and for every vertex whose distance to the centre point falls within a given range, a custom colour is superimposed on the initial colour, i.e. the colour of the building;
the 3D Tiles loading module of the special effect library is used for streaming transmission and rendering of geographic 3D data; 3D Tiles is a format obtained by adding a hierarchical LOD structure on top of glTF, designed specifically for streaming and massive rendering of large amounts of geographic 3D data. In 3D Tiles, a tileset is a tree structure consisting of a series of tiles, and each tile may use one of the following formats: b3dm (large heterogeneous 3D models such as three-dimensional buildings and terrain), i3dm (instanced 3D models such as trees and wind turbines), pnts (point clouds) and cmpt (a composite that combines tiles of the above formats into one tile);
the oblique photography module of the special effect library comprises an OSGB processing module, an FBX processing module and an STL processing module; OSGB is the oblique-photography model format most commonly produced at present, and in particular, oblique-photography three-dimensional model data processed by Smart3D is generally organised as OSGB files with embedded linked texture data (.jpg). In this embodiment, the data is first converted into the 3D Tiles format, and the scene is then rendered according to the information in the tileset json file.
The animation frame library comprises a skinned animation module and a skeletal animation module. Morph (vertex) animation directly specifies the vertex positions of each frame of the animation: the positions of all the vertices of the mesh at the moment of a key frame are stored in the animation key. The model of a joint animation, by contrast, is not a single mesh but is divided into several parts (meshes); the separate meshes are organised through a parent-child hierarchy in which a parent mesh drives the motion of its child meshes, and the vertex coordinates of each mesh are defined in that mesh's own coordinate system, so each mesh moves as a whole. The animation frames store the transformation (mainly rotation, but also translation and scaling) of each child mesh relative to its parent; by accumulating the transformations from child to parent level by level (technically, by multiplying the matrices), the transformation of the mesh in the coordinate space of the whole animated model (referred to below as the world coordinate system) is obtained, which determines the position and orientation of each mesh in that coordinate system, and rendering is then performed mesh by mesh;
in skeletal animation, the vertices of the skinned mesh are computed dynamically by vertex blending under the control of the bones, and the motion of each bone is defined relative to its parent bone and driven by animation key-frame data. A skeletal animation generally includes skeleton hierarchy data, mesh data, mesh skinning data (skin info) and skeletal animation (key-frame) data.
The matrix calculation library comprises two- and three-dimensional vector calculation modules and a quaternion calculation module; a large number of vector and matrix types such as vec2, vec3, vec4, mat and quat are wrapped for calculation, and the user can perform the calculations directly through the API.
As shown in fig. 1, the present invention further provides a WebGL-based 3D rendering method, which is applied to the WebGL-based 3D rendering system, and includes the following steps:
step S1: initializing a WebGL context, which specifically means obtaining a context object from the canvas (a minimal sketch follows this list);
step S2: initializing a shader;
step S3: initializing a data cache;
step S4: initializing a UV texture;
step S5: creating a texture;
step S6: binding the data to the texture in step S5;
step S7: displaying the frame data;
step S8: and releasing the resources.
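A minimal sketch of step S1, assuming an existing canvas element; the element id used here is illustrative only:

const canvas = document.getElementById('render-canvas');
// Prefer a WebGL2 context (GLSL 300 es) and fall back to WebGL1 where unavailable.
const gl = canvas.getContext('webgl2') || canvas.getContext('webgl');
if (!gl) {
  throw new Error('WebGL is not supported by this browser');
}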
Further, in step S2, initializing the shader includes the following steps (a minimal sketch follows the list):
step S201: creating a shader;
step S202: attaching the source code to the shader;
step S203: compiling the program;
step S204: linking (merging) the program.
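A minimal JavaScript sketch of steps S201 to S203, assuming a valid context gl; the helper name compileShader is illustrative, and step S204 is expanded separately under steps S20401 to S20404:

function compileShader(gl, type, source) {
  const shader = gl.createShader(type);   // S201: create the shader (gl.VERTEX_SHADER or gl.FRAGMENT_SHADER)
  gl.shaderSource(shader, source);        // S202: attach the GLSL source code
  gl.compileShader(shader);               // S203: compile
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}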
Further, in step S3, initializing the data buffers for the vertex coordinates and the texture coordinates includes the following steps (a minimal sketch follows the list):
step S301: creating a buffer area;
step S302: binding a buffer area;
step S303: passing the data to a buffer;
step S304: unbinding the buffer.
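A minimal sketch of steps S301 to S304 for one coordinate buffer; the helper name and the assumption that vertices is a Float32Array are illustrative:

function createCoordinateBuffer(gl, vertices) {
  const buffer = gl.createBuffer();                          // S301: create the buffer
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);                    // S302: bind it
  gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);  // S303: pass the data
  gl.bindBuffer(gl.ARRAY_BUFFER, null);                      // S304: unbind
  return buffer;
}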
Further, in step S4, initializing the UV texture includes the following steps (a minimal sketch follows the list):
step S401: creating a U texture and a V texture;
step S402: acquiring storage positions of a y sampler, a u sampler and a v sampler;
step S403: the texture unit number is passed to the shader, where,
numbered 0,1,2, respectively.
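A minimal sketch of steps S401 to S403, creating one texture per Y/U/V plane (the original step names only the U and V textures) and pointing the sampler uniforms at texture units 0, 1 and 2; the uniform names are assumptions, not names prescribed by the invention, and program is the linked shader program:

const yTexture = gl.createTexture();                           // S401: plane textures
const uTexture = gl.createTexture();
const vTexture = gl.createTexture();

gl.useProgram(program);                                        // uniforms are set on the active program
const samplerY = gl.getUniformLocation(program, 'u_samplerY'); // S402: sampler storage locations
const samplerU = gl.getUniformLocation(program, 'u_samplerU');
const samplerV = gl.getUniformLocation(program, 'u_samplerV');

gl.uniform1i(samplerY, 0);                                     // S403: texture unit numbers 0, 1, 2
gl.uniform1i(samplerU, 1);
gl.uniform1i(samplerV, 2);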
Further, in step S5, creating a texture includes the following steps (a minimal sketch follows the list):
step S501: creating a texture object;
step S502: binding the texture object to the target;
step S503: configuring the texture parameters, wherein,
the configured parameters include the texture size, the minification filter, and the horizontal and vertical wrap (fill) modes.
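A minimal sketch of steps S501 to S503; the specific filter and wrap values are illustrative choices:

const texture = gl.createTexture();                                    // S501: create a texture object
gl.bindTexture(gl.TEXTURE_2D, texture);                                // S502: bind it to the target
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);     // S503: minification filter
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);  // horizontal wrap mode
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);  // vertical wrap mode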
Further, in step S6, binding the data to the texture created in step S5 includes the following steps (a minimal sketch follows the list):
step S601: uploading UV data to the texture;
step S602: activating a texture unit;
step S603: binding textures;
step S604: specifying the bound texture object.
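A minimal sketch of steps S601 to S604 for one plane of the frame, assuming plane is a Uint8Array and width/height are its dimensions (a WebGL1-style single-channel LUMINANCE upload; the U and V planes are handled the same way on units 1 and 2):

gl.activeTexture(gl.TEXTURE0);               // S602: activate a texture unit
gl.bindTexture(gl.TEXTURE_2D, yTexture);     // S603/S604: bind and thereby specify the texture object
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE, width, height, 0,
              gl.LUMINANCE, gl.UNSIGNED_BYTE, plane);   // S601: upload the plane data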
Further, in step S7, displaying the frame data includes the following steps (a minimal sketch follows the list):
step S701: associating the buffer object with the shader program;
step S702: acquiring the position of the attribute;
step S703: enabling the attribute;
step S704: assigning attributes;
step S705: drawing a needed object;
step S706: and displaying on a screen.
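A minimal sketch of steps S702 to S706, assuming a full-screen quad in vertexBuffer and an attribute named a_position (both names are illustrative):

const positionLoc = gl.getAttribLocation(program, 'a_position');  // S702: position of the attribute
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.enableVertexAttribArray(positionLoc);                           // S703: enable the attribute
gl.vertexAttribPointer(positionLoc, 2, gl.FLOAT, false, 0, 0);     // S704: assign the attribute layout
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);                            // S705/S706: draw; the result appears on screen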
Further, in step S8, releasing the resources includes the following steps (a minimal sketch follows the list):
step S801: releasing the shader program;
step S802: releasing the buffer area;
step S803: and releasing the YUV texture.
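A minimal sketch of steps S801 to S803, using the objects created in the earlier sketches:

gl.deleteProgram(program);        // S801: release the shader program
gl.deleteBuffer(vertexBuffer);    // S802: release the buffer
gl.deleteTexture(yTexture);       // S803: release the YUV textures
gl.deleteTexture(uTexture);
gl.deleteTexture(vTexture);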
Further, in step S204, the merging (program linking) procedure includes the following steps (a minimal sketch follows the list):
step S20401: creating a program object;
step S20402: attaching the shaders;
step S20403: linking the shaders;
step S20404: using the program after attribute assignment.
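A minimal sketch of steps S20401 to S20404; the helper name is illustrative:

function linkProgram(gl, vertexShader, fragmentShader) {
  const program = gl.createProgram();       // S20401: create the program object
  gl.attachShader(program, vertexShader);   // S20402: attach the shaders
  gl.attachShader(program, fragmentShader);
  gl.linkProgram(program);                  // S20403: link the shaders into the program
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error(gl.getProgramInfoLog(program));
  }
  gl.useProgram(program);                   // S20404: use the program once the attributes are assigned
  return program;
}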
Further, in step S701, associating the buffer object with the shader program includes the following steps (a minimal sketch follows the list):
step S70101: clearing the target color;
step S70102: clearing the color buffer area;
step S70103: setting the viewport;
step S70104: binding data to a texture;
step S70105: the program is used.
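A minimal sketch of steps S70101 to S70105:

gl.clearColor(0.0, 0.0, 0.0, 1.0);                      // S70101: clear colour for the target
gl.clear(gl.COLOR_BUFFER_BIT);                          // S70102: clear the colour buffer
gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);   // S70103: set the viewport
// S70104: bind the current frame's data to the textures, as in steps S601 to S604
gl.useProgram(program);                                 // S70105: use the program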
As shown in fig. 2, the present invention further provides another WebGL-based 3D rendering method, which is applied to the aforementioned WebGL-based 3D rendering system and includes the following steps (a purely illustrative sketch follows the list):
step SS1: creating a scene object;
step SS 2: binding a rendering target, namely a screen rendering target position;
step SS3: creating a shot, and binding the shot into a scene;
step SS 4: adding light, namely creating a light object, and adding the light object into a scene;
step SS 5: adding a model and displaying the model in a scene;
step SS 6: and performing screen rendering display.
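Purely for illustration, the SS1 to SS6 flow might look as follows in application code; every class and method name here is a hypothetical stand-in and is not the actual interface disclosed by the invention:

const scene = new Scene({ container: document.getElementById('app') });  // SS1: create a scene object
scene.bindRenderTarget(canvas);                                           // SS2: bind the render target
const camera = new PerspectiveCamera({ fov: 45 });                        // SS3: create a camera (shot)
scene.setCamera(camera);
scene.add(new DirectionalLight({ color: 0xffffff }));                     // SS4: add light
const model = await Model.load('model.gltf');                             // SS5: add a model (inside an async function)
scene.add(model);
scene.render();                                                           // SS6: render to the screen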
In this embodiment, a WebGL-based 3D rendering method is further provided, which is applied to the WebGL-based 3D rendering system, and is characterized by including the following steps:
step SSS1: obtaining a scene object,
creating a canvas from a container supplied by the user, and creating a new scene object that records information such as the specified ambient light, the specified clear colour of the drawing area, whether fog is enabled and the fog parameters, whether animation is enabled, whether special effects are enabled, and the exposure;
step SSS2: obtaining a three-dimensional model object,
creating a model object from a URL or from binary data; the three-dimensional model generally needs data-format parsing, the common three-dimensional model data formats including glTF and OBJ, and the purpose of the parsing is to convert the model into a common format for display, after which the vertex information is computed and the model is added to the scene through the WebGL interface for presentation;
step SSS3: setting three-dimensional model object properties,
after the model is added to the scene, a custom material map, position and size can be set for it, and illumination attributes can also be set, such as whether it receives shadows, whether it casts shadows, and whether double-sided display is enabled;
step SSS4: adding camera (lens) rotation interaction,
events can be added to the camera, such as whether to enable an animation mode, rotation, translation and zooming;
step SSS5: rendering, namely generating the displayed picture based on the display position information and attributes of each model instance to be rendered and on the rendering result of each model instance;
in the step SSS5, in order to display the three-dimensional model more realistically, suitable light sources, materials and textures, as well as scene and model details, need to be added, including:
light sources, i.e. light sources used to simulate the characteristics of various natural light sources for rendering, such as directional light, point lights, spot lights and ambient reflected light;
materials, i.e. materials used to simulate how natural objects reflect light, such as specular reflection materials and diffuse reflection materials;
textures, i.e. textures applied to a material through pictures to make objects appear more realistic.
Specifically, the graphical user interface library of the invention makes it easy to adjust object attributes directly on the page; the auxiliary module library contains auxiliary objects such as position markers, bounding boxes and coordinate axes, which facilitate feature development and debugging; the special effect library supports many types of special effects; the matrix calculation library allows vectors, matrices and the like to be computed directly through an API; and unlike other 3D engines, which redraw continuously, the scene is redrawn only when it changes, which greatly improves engine performance.
The invention provides users with a middleware mechanism so that they can rapidly develop interactive three-dimensional graphics applications without knowing the underlying details; besides WebGL GLSL 100, GLSL 300 es is also supported and can be called directly without extra loading; users can develop without being familiar with the computer graphics knowledge system, which reduces the difficulty of use and shortens the development cycle; development efficiency is improved and development cost is reduced. For example, in is used instead of attribute, in/out replace varying, texture replaces texture2D, and so on (an illustration follows).
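The following pair of vertex shaders illustrates those substitutions; the attribute and varying names are arbitrary:

// GLSL 100 (WebGL1)
const vs100 = `
attribute vec4 a_position;
varying vec2 v_uv;
void main() { v_uv = a_position.xy; gl_Position = a_position; }`;

// GLSL 300 es (WebGL2): 'in' replaces 'attribute', and 'in'/'out' replace 'varying'
const vs300 = `#version 300 es
in vec4 a_position;
out vec2 v_uv;
void main() { v_uv = a_position.xy; gl_Position = a_position; }`;

// In a GLSL 300 es fragment shader, texture() likewise replaces texture2D().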
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A WebGL-based 3D rendering system, characterized by comprising a scene object library, a special effect library, an animation frame library, a matrix calculation library, a graphical user interface library and an auxiliary module library, wherein,
the scene object library comprises a scene object module, a light object module, a camera object module, a material object module, a particle object module, a model object module and a texture object module;
the special effect library comprises a screen-space ambient occlusion (SSAO) module, a bloom module, a high dynamic range imaging module, a tone mapping module, a selection outline module, a selection glow module, a light sweep module, a 3D Tiles loading module and an oblique photography module;
the animation frame library comprises a skinned animation module and a skeletal animation module;
the matrix calculation library comprises a vector calculation module and a model bounding box calculation module.
2. The WebGL-based 3D rendering system of claim 1, wherein the GUI library is configured to adjust object properties on a page;
the auxiliary module library comprises a position display module, a bounding box module and a coordinate axis module;
the scene object module of the scene object library is used for displaying a three-dimensional model;
the light object module of the scene object library is used for light source emission and light reflection;
the camera object module of the scene object library is used for perspective display;
the material object module of the scene object library is used to assign material characteristics to a three-dimensional model, wherein,
the material object module comprises a default material, a PBR material, a custom drawing material, a shader material, a billboard material, a terrain material and a border line material;
the particle object module of the scene object library is used for rendering particles;
the model object module of the scene object library is used for loading models, wherein,
the loadable model formats include OBJ, FBX and glTF;
the texture object modules of the scene object library include a DDS texture format, a HDR texture format, a Basis texture format, and an Image texture format.
3. The WebGL-based 3D rendering system of claim 1 or 2, wherein the screen-space ambient occlusion module of the special effect library is configured to output an occlusion data texture;
the bloom module of the special effect library is used for realizing a light-overflow (glow) effect;
the high dynamic range imaging module of the special effect library is used for rendering both the brightest and the darkest details;
the tone mapping module of the special effect library is used for converting the output of the high dynamic range imaging module into LDR;
the selection outline module of the special effect library is used for displaying an outline (stroke) effect;
the light sweep module of the special effect library is used for realizing a building glow effect;
the 3D Tiles loading module of the special effect library is used for streaming transmission and rendering of geographic 3D data;
the oblique photography module of the special effect library comprises an OSGB processing module, an FBX processing module and an STL processing module;
the animation frame library comprises a skinned animation module and a skeletal animation module;
the matrix calculation library comprises two- and three-dimensional vector calculation modules and a quaternion calculation module.
4. A WebGL-based 3D rendering method applied to the WebGL-based 3D rendering system of any one of claims 1-3, comprising the steps of:
step S1: initializing a WebGL context;
step S2: initializing a shader;
step S3: initializing a data cache;
step S4: initializing a UV texture;
step S5: creating a texture;
step S6: binding data to the texture described in said step S5;
step S7: displaying the frame data;
step S8: and releasing the resources.
5. The WebGL-based 3D rendering method of claim 4,
in step S2, initializing the shader includes the following steps:
step S201: creating a shader;
step S202: attaching the source code to the shader;
step S203: compiling the program;
step S204: linking (merging) the program;
in step S3, initializing the data buffers for the vertex coordinates and the texture coordinates includes the following steps:
step S301: creating a buffer area;
step S302: binding a buffer area;
step S303: passing the data to a buffer;
step S304: unbinding the buffer.
6. The WebGL-based 3D rendering method of claim 4, wherein the initializing UV texture in the step S4 comprises the following steps:
step S401: creating a U texture and a V texture;
step S402: acquiring storage positions of a y sampler, a u sampler and a v sampler;
step S403: the texture unit number is passed to the shader, where,
respectively numbered 0,1 and 2;
in step S5, creating a texture includes the following steps:
step S501: creating a texture object;
step S502: binding the texture object to the target;
step S503: configuring the texture parameters, wherein,
the configured parameters include the texture size, the minification filter, and the horizontal and vertical wrap (fill) modes;
in the step S6, the step of binding data to the texture in the step S5 includes the steps of:
step S601: uploading UV data to the texture;
step S602: activating a texture unit;
step S603: binding textures;
step S604: specifying the bound texture object.
7. The WebGL-based 3D rendering method of claim 4, wherein in the step S7, displaying the frame data comprises the following steps:
step S701: associating the buffer object with the shader program;
step S702: acquiring the position of the attribute;
step S703: enabling the attribute;
step S704: assigning attributes;
step S705: drawing a needed object;
step S706: displaying on a screen;
in step S8, the step of releasing resources includes the following steps:
step S801: releasing the shader program;
step S802: releasing the buffer area;
step S803: and releasing the YUV texture.
8. The WebGL-based 3D rendering method of claim 4,
in step S204, the merging (program linking) procedure includes the following steps:
step S20401: creating a program object;
step S20402: attaching the shaders;
step S20403: linking the shaders;
step S20404: using the program after attribute assignment;
in step S701, associating the buffer object and the shader program includes the following steps:
step S70101: clearing the target color;
step S70102: clearing the color buffer area;
step S70103: setting the viewport;
step S70104: binding data to a texture;
step S70105: the program is used.
9. A WebGL-based 3D rendering method applied to the WebGL-based 3D rendering system of any one of claims 1-3, comprising the steps of:
step SS1: creating a scene object;
step SS 2: binding a rendering target;
step SS3: creating a shot;
step SS 4: adding lamplight;
step SS 5: adding a model;
step SS 6: and performing screen rendering display.
10. A WebGL-based 3D rendering method applied to the WebGL-based 3D rendering system of any one of claims 1-3, comprising the steps of:
step SSS1: obtaining a scene object,
creating a canvas from a container supplied by the user, and creating a new scene object that records information such as the specified ambient light, the specified clear colour of the drawing area, whether fog is enabled and the fog parameters, whether animation is enabled, whether special effects are enabled, and the exposure;
step SSS2: obtaining a three-dimensional model object,
creating a model object from a URL or from binary data; the three-dimensional model generally needs data-format parsing, the common three-dimensional model data formats including glTF and OBJ, and the purpose of the parsing is to convert the model into a common format for display, after which the vertex information is computed and the model is added to the scene through the WebGL interface for presentation;
step SSS3: setting three-dimensional model object properties,
after the model is added to the scene, a custom material map, position and size can be set for it, and illumination attributes can also be set, such as whether it receives shadows, whether it casts shadows, and whether double-sided display is enabled;
step SSS4: adding camera (lens) rotation interaction,
events can be added to the camera, such as whether to enable an animation mode, rotation, translation and zooming;
step SSS5: rendering, namely generating the displayed picture based on the display position information and attributes of each model instance to be rendered and on the rendering result of each model instance;
in the step SSS5, in order to display the three-dimensional model more realistically, suitable light sources, materials and textures, as well as scene and model details, need to be added, including:
light sources, i.e. light sources used to simulate the characteristics of various natural light sources for rendering, such as directional light, point lights, spot lights and ambient reflected light;
materials, i.e. materials used to simulate how natural objects reflect light, such as specular reflection materials and diffuse reflection materials;
textures, i.e. textures applied to a material through pictures to make objects appear more realistic.
CN202110340974.4A 2021-03-30 2021-03-30 WebGL-based 3D rendering system and method Pending CN112991508A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110340974.4A CN112991508A (en) 2021-03-30 2021-03-30 WebGL-based 3D rendering system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110340974.4A CN112991508A (en) 2021-03-30 2021-03-30 WebGL-based 3D rendering system and method

Publications (1)

Publication Number Publication Date
CN112991508A true CN112991508A (en) 2021-06-18

Family

ID=76339143

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110340974.4A Pending CN112991508A (en) 2021-03-30 2021-03-30 WebGL-based 3D rendering system and method

Country Status (1)

Country Link
CN (1) CN112991508A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113656918A (en) * 2021-08-30 2021-11-16 四川中烟工业有限责任公司 Four-rotor simulation test method applied to finished product elevated warehouse scene
CN114708367A (en) * 2022-03-28 2022-07-05 长沙千博信息技术有限公司 WebGL-based sign language digital human driving and real-time rendering system
CN114898032A (en) * 2022-05-10 2022-08-12 北京领为军融科技有限公司 Light spot rendering method based on shader storage cache object
CN115205433A (en) * 2022-09-14 2022-10-18 中山大学 Three-dimensional rendering fusion method based on glTF model and building contour expansion
CN116091684A (en) * 2023-04-06 2023-05-09 杭州片段网络科技有限公司 WebGL-based image rendering method, device, equipment and storage medium
CN116630486A (en) * 2023-07-19 2023-08-22 山东锋士信息技术有限公司 Semi-automatic animation production method based on Unity3D rendering

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106990961A (en) * 2017-03-28 2017-07-28 易网数通(北京)科技有限公司 A kind of method for building up of WebGL graphics rendering engines
US20180143757A1 (en) * 2016-11-18 2018-05-24 Zspace, Inc. 3D User Interface
CN108958724A (en) * 2018-06-26 2018-12-07 北京优锘科技有限公司 Three-dimensional visualization engine construction method, device, engine, browser, equipment and storage medium
KR102176837B1 (en) * 2019-08-19 2020-11-10 공간정보기술 주식회사 System and method for fast rendering and editing 3d images in web browser
WO2021036394A1 (en) * 2019-08-30 2021-03-04 杭州群核信息技术有限公司 Webgl-based replaceable model hybrid rendering display method and system, and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180143757A1 (en) * 2016-11-18 2018-05-24 Zspace, Inc. 3D User Interface
CN106990961A (en) * 2017-03-28 2017-07-28 易网数通(北京)科技有限公司 A kind of method for building up of WebGL graphics rendering engines
CN108958724A (en) * 2018-06-26 2018-12-07 北京优锘科技有限公司 Three-dimensional visualization engine construction method, device, engine, browser, equipment and storage medium
KR102176837B1 (en) * 2019-08-19 2020-11-10 공간정보기술 주식회사 System and method for fast rendering and editing 3d images in web browser
WO2021036394A1 (en) * 2019-08-30 2021-03-04 杭州群核信息技术有限公司 Webgl-based replaceable model hybrid rendering display method and system, and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DIRKSEN, J. (translated by Li Chengpeng): "Three.js Development Guide" (THREE.JS开发指南), China Machine Press, pages 1-12 *
WEB前端开发 (Web Front-End Development): "WebGL基础知识" (WebGL Fundamentals), pages 25-246, Retrieved from the Internet <URL:https://mp.weixin.qq.com/s/mFL5F-En77GZ5FigB7afWw> *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113656918A (en) * 2021-08-30 2021-11-16 四川中烟工业有限责任公司 Four-rotor simulation test method applied to finished product elevated warehouse scene
CN113656918B (en) * 2021-08-30 2024-04-16 四川中烟工业有限责任公司 Four-rotor simulation test method applied to finished product overhead warehouse scene
CN114708367A (en) * 2022-03-28 2022-07-05 长沙千博信息技术有限公司 WebGL-based sign language digital human driving and real-time rendering system
CN114898032A (en) * 2022-05-10 2022-08-12 北京领为军融科技有限公司 Light spot rendering method based on shader storage cache object
CN115205433A (en) * 2022-09-14 2022-10-18 中山大学 Three-dimensional rendering fusion method based on glTF model and building contour expansion
CN115205433B (en) * 2022-09-14 2022-12-06 中山大学 Three-dimensional rendering fusion method based on glTF model and building contour expansion
CN116091684A (en) * 2023-04-06 2023-05-09 杭州片段网络科技有限公司 WebGL-based image rendering method, device, equipment and storage medium
CN116630486A (en) * 2023-07-19 2023-08-22 山东锋士信息技术有限公司 Semi-automatic animation production method based on Unity3D rendering
CN116630486B (en) * 2023-07-19 2023-11-07 山东锋士信息技术有限公司 Semi-automatic animation production method based on Unity3D rendering

Similar Documents

Publication Publication Date Title
CN112991508A (en) WebGL-based 3D rendering system and method
US20100060640A1 (en) Interactive atmosphere - active environmental rendering
CN110969685A (en) Customizable rendering pipeline using rendering maps
US11954169B2 (en) Interactive path tracing on the web
US20100265250A1 (en) Method and system for fast rendering of a three dimensional scene
JP2012524327A (en) How to add shadows to objects in computer graphics
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
EP3989175A1 (en) Illumination probe generation method, apparatus, storage medium, and computer device
Ganovelli et al. Introduction to computer graphics: A practical learning approach
CN110634178A (en) Three-dimensional scene refinement reconstruction method for digital museum
EP4058162A1 (en) Programmatically configuring materials
JP2003168130A (en) System for previewing photorealistic rendering of synthetic scene in real-time
Döllner Geovisualization and real-time 3D computer graphics
Happa et al. Studying illumination and cultural heritage
Sellers et al. Rendering massive virtual worlds
Seng et al. Realistic real-time rendering of 3D terrain scenes based on OpenGL
Agus et al. Mobile graphics
Callieri et al. A realtime immersive application with realistic lighting: The Parthenon
Blythe et al. Lighting and shading techniques for interactive applications
Yang et al. Visual effects in computer games
WO2024027237A1 (en) Rendering optimization method, and electronic device and computer-readable storage medium
Nordahl Enhancing the hpc-lab snow simulator with more realistic terrains and other interactive features
KR102085701B1 (en) Method for rendering image
ZEHNER Landscape visualization in high resolution stereoscopic visualization environments
Xu Purposeful Clouds Shape Generation by Volume Rendering Toolkits

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination