LU504427B1 - BIM based dynamic visualization of noise field for prefabricated buildings adjacent to railways


Info

Publication number
LU504427B1
Authority
LU
Luxembourg
Prior art keywords
visualization
texture
rendering
dynamic
noise field
Prior art date
Application number
LU504427A
Other languages
German (de)
Inventor
Qingsheng Xie
Yong Liu
Yang Chen
Jun Tu
Zhiqiang Hua
Chao Tan
Menghong Chen
Fangping Zhang
Nieqiang Xie
Wei Zhou
Fang Jiang
Zhihong Wang
Jianbin Zhu
Original Assignee
China Constr 5Th Eng Division
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Constr 5Th Eng Division filed Critical China Constr 5Th Eng Division
Priority to LU504427A
Application granted
Publication of LU504427B1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/04 Architectural design, interior design

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Mechanical Vibrations Or Ultrasonic Waves (AREA)
  • Soundproofing, Sound Blocking, And Sound Damping (AREA)

Abstract

The invention provides a BIM-based dynamic visualization method for the noise field of prefabricated buildings adjacent to railways, comprising: constructing a visualization platform framework; based on the framework of the visualization platform, using the IBFV algorithm based on Cesium, render-to-texture technology and ping-pong technology for dynamic visualization of the noise field and acoustic visualization respectively; and performing dynamic visualization of the noise field based on the results of the dynamic visualization of the noise field and the acoustic visualization. In texture rendering, the invention adopts render-to-texture technology and ping-pong technology to solve the problem that the cache results of different frames cannot be saved, and adopts a viewpoint-based dynamic rendering method to improve data accuracy and make the sound field visualization effect more refined.

Description

DESCRIPTION
BIM BASED DYNAMIC VISUALIZATION OF NOISE FIELD FOR
PREFABRICATED BUILDINGS ADJACENT TO RAILWAYS
TECHNICAL FIELD
The invention belongs to the technical field of noise visualization, and in particular relates to a BIM-based method for dynamic visualization of the noise field of prefabricated buildings adjacent to railways.
BACKGROUND
Real-time noise estimation for prefabricated buildings adjacent to railways is a technical means of predicting and warning of the noise that may be caused when a train is about to pass a certain position. Before the train approaches that position, the position can be monitored and analyzed in real time through various monitoring means, so as to predict the impact of the noise generated by the passing train on the prefabricated buildings adjacent to the railway, and to provide relevant suggestions and measures.
In comprehensive simulation, visualization technology can effectively improve the intuitiveness and comprehensiveness with which information about the noise field of prefabricated buildings adjacent to railways is expressed. On the one hand, various visualization methods express the noise field intuitively and vividly, making it easy for users to understand; on the other hand, integrating visual information with multi-source information such as building information and monitoring information not only meets the needs of different users and reduces switching between different systems during decision-making, but also improves the level of comprehensive management and analysis of noise field big data, which helps with data mining and information extraction.
Most existing visualization technologies for the noise field of prefabricated buildings adjacent to railways are based on a client/server architecture and suffer from problems such as poor portability and extensibility and difficult cross-platform browsing.
SUMMARY
The purpose of the present invention is to provide a BIM-based method for dynamic visualization of the noise field of prefabricated buildings adjacent to railways, so as to solve the problems existing in the prior art.
In order to achieve the above object, the present invention provides a BIM-based method for dynamic visualization of the noise field of prefabricated buildings adjacent to railways, comprising: constructing a visualization platform framework; based on the framework of the visualization platform, using the IBFV algorithm based on Cesium, render-to-texture technology and ping-pong technology for dynamic visualization of the noise field and acoustic visualization respectively; and performing dynamic visualization of the noise field based on the results of the dynamic visualization of the noise field and the acoustic visualization.
Optionally, the visualization platform framework comprises a three-dimensional environment visualization module and a sound field visualization module, wherein the three-dimensional environment visualization module comprises vector data and a three-dimensional digital model; the sound field visualization module includes a geometry of basic graphic elements and a colored appearance.
Optionally, the vector data includes orthophoto map, digital elevation model and groyne, and the three-dimensional digital model includes UAV tilt photogrammetry modeling and BIM model.
Optionally, dynamic visualization of the noise field comprises: rendering the texture of basic graphic elements based on a shader program; saving the original coordinates of the basic graphic elements in the vertex shader;
updating several pre-generated textures and the corresponding serial numbers asynchronously in uniformMap based on the automatic updating mechanism of Cesium; drawing the visual texture sound field based on the grid motion, and mixing the noise texture with the previous frame image and following the grid motion, so as to continuously refresh and produce the effect of continuous texture motion; creating the frame cache object in the off-screen buffer based on off-screen rendering technology, setting the output target of rendering as the cache object, and rendering the content to the frame cache after binding the created texture to the cache object.
Optionally, acoustic visualization comprises: in the process of rendering the texture of basic graphic elements based on a shader program, creating several frame cache objects whose colors and textures reference each other, and taking the rendering result of the previous frame as the input of the next frame cache object, so that the rendered content may be transmitted across frames; selecting any frame of the rendered result of the cached object and outputting it to the window to visualize the rendered result.
Optionally, in the process of dynamic visualization of the noise field, a viewpoint-based dynamic rendering method is adopted, and the texture cache object binding is adjusted in real time according to the viewpoint position to render to the shooting area of the texture camera, so that the shooting area only covers the viewport area.
The invention has the technical effects that:
In the texture rendering technology, the invention adopts the rendering to texture technology and ping-pong technology to solve the problem that the cache results of different frames cannot be saved, and adopts the dynamic rendering method based on the viewpoint to improve the data accuracy and make the sound field visualization effect more refined.
BRIEF DESCRIPTION OF THE FIGURES LU504427
The accompanying drawings, which constitute a part of this application, are used to provide a further understanding of this application. The illustrative embodiments of this application and their descriptions are used to explain this application, and do not constitute an improper limitation of this application.
Fig. 1 is a flow chart of a BIM-based dynamic visualization for noise field of adjacent railway assembled buildings in an embodiment of the present invention.
DESCRIPTION OF THE INVENTION
As shown in Fig. 1, this embodiment provides a BIM-based method for dynamic visualization of the noise field of prefabricated buildings adjacent to railways. The overall framework of the platform is developed with separated front and back ends: the front end is developed in JavaScript, the graphical user interface and the three-dimensional virtual engine are developed with the Cesium framework, data interaction between the front and back ends is transmitted by HTTP requests, and the platform is published on a server.
The visualization module of the platform is mainly divided into two parts. The first part is the three-dimensional environment visualization module, which includes vector data such as the orthophoto map, digital elevation model and groyne, and three-dimensional digital models such as the UAV tilt photogrammetry model and the BIM model. The vector data are stored on the server with a file storage structure; the elevation and image data are processed by a spatial data slice publishing tool, published as slice data with a multi-level storage structure, and loaded by Cesium. The three-dimensional model data are published as tile models by data conversion software and loaded by Cesium.
The other part of the visualization module is sound field visualization. The professional data needed for visualization are obtained by querying the database at the back end after an HTTP request, processed at the back end, and returned to the front end. After a visualization scheme is selected, the corresponding grid data and calculation result data are obtained, and a visualization instance is created using the self-built basic graphic elements method, in which a basic graphic element represents the geometry in the scene, including its geometric shape and colored appearance. The appearance part adopts shader development technology to render the data results into the required texture style.
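The front-end data request described above can be sketched as follows. This is a minimal illustration only: the endpoint path `/api/soundfield`, the parameter names `scheme` and `frame`, and the response shape are assumptions, not details from the patent.

```javascript
// Build the query URL for a selected visualization scheme; the endpoint
// path and parameter names are hypothetical.
function buildSoundFieldQuery(baseUrl, scheme, frame) {
  const params = new URLSearchParams({ scheme, frame: String(frame) });
  return `${baseUrl}/api/soundfield?${params.toString()}`;
}

// Issue the HTTP request; the back end queries the database, processes the
// result, and returns the grid and calculation data to the front end.
// The { grid, results } response shape is likewise an assumption.
async function loadSoundField(baseUrl, scheme, frame) {
  const response = await fetch(buildSoundFieldQuery(baseUrl, scheme, frame));
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  return response.json();
}
```

The returned data would then be handed to the self-built basic graphic element that renders it in the Cesium scene.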
Dynamic noise field drawing
Dynamic visualization
In order to realize the visualization of the texture sound field based on Cesium and images, it is necessary to customize the rendering of the basic graphic elements. The texture rendering process is realized through the shader program in Cesium, and the pixel values in each channel are calculated and rendered in parallel using the raster processing mode of the shader program. Texture rendering follows the above-mentioned principle of the image-based sound field visualization algorithm: texture changes are controlled by grid movement deformation, the previous frame image is mixed with the noise texture and follows the grid movement, and continuous refreshing produces the effect of continuous texture motion. The implementation of the IBFV algorithm based on Cesium has the following difficulties: 1) The noise texture images need to change cyclically with a certain period. Using the automatic updating mechanism of uniformMap in Cesium, several textures generated in advance and the corresponding serial numbers are asynchronously updated in uniformMap. 2) For the texture reference part, because the normally rendered basic graphic elements and the camera settings for rendering to texture are the same, and the world coordinates of the two basic graphic elements are the same, the rendered-to-texture result can be converted into screen coordinates according to the world coordinates of the normally rendered basic graphic elements, and no additional texture coordinate calculation is needed. However, due to the distortion of the mesh in image-based sound field visualization, the coordinates of the real points are transformed, and the built-in variables of the Cesium shader follow this calculation. In order to obtain the texture color of the origin of the previous frame, the original coordinates need to be saved in the vertex shader. 3) Because it is necessary to keep the results of the previous frame and hand them over to the next frame, render-to-texture technology is used here.
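The per-pixel blending at the heart of the IBFV step above can be sketched with scalar values. This is a sketch only: the blend weight `alpha` is an assumed parameter, and in the real platform this mix runs in the fragment shader rather than in JavaScript.

```javascript
// Per-pixel sketch of the IBFV blend: the previous frame (already advected
// along the grid motion) is mixed with one of several pre-generated noise
// textures chosen cyclically by frame number, mirroring the serial-number
// update in Cesium's uniformMap. alpha is an assumed blend weight.
function ibfvBlend(prevPixel, noiseTextures, frame, alpha = 0.1) {
  const noise = noiseTextures[frame % noiseTextures.length]; // cyclic noise
  // Equivalent to GLSL mix(prev, noise, alpha).
  return (1 - alpha) * prevPixel + alpha * noise;
}
```

Repeating this blend frame after frame is what continuously refreshes the image and produces the effect of continuous texture motion.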
Render-to-texture technology in WebGL is mature and widely used, for example in the realization of shadow mapping and glow effects. Cesium also provides a frame buffer class that can be used for render-to-texture: the texture is rendered during the current rendering pass and read back during the next one. The technical process is to create a frame cache object in the off-screen buffer using off-screen rendering technology, set the output target of rendering to this frame cache object, bind the created texture to this frame cache object, and then render the content into the frame cache. Because the texture is bound as the color attachment of the frame cache object, the required texture result is finally obtained.
However, saving the results of the previous frame and handing them over to the next frame cannot be achieved with render-to-texture alone. If only one shader program is used to render the two kinds of results (the frame buffer target and the rendering output target), there is only one rendering process in the current frame, which does not conform to the description of the frame buffer class in Cesium for rendering to texture: the render-target texture of a frame buffer object cannot be used as the texture data input of the shader program within the same rendering process. On the other hand, the previous-frame texture used in the image-based sound field visualization algorithm changes constantly, so the uniformMap update mechanism is also needed. Even if the color attachment of the frame cache object is deep-copied in the event listener before rendering, the texture will still be requested before it is updated, re-triggering the rendering demand and eventually falling into an infinite loop.
Visualization method of sound
In order to solve this problem, ping-pong technology is introduced in combination with render-to-texture technology. Ping-pong technology is a technique for turning the rendered output into the input of the next operation; it is commonly used, for example, for particle computation on general-purpose graphics processors.
(1) A second frame cache object is created, and the colors and textures in the two frame cache objects reference each other, so as to carry the result of the previous drawing into the next drawing; (2) except for the empty texture referenced by the first drawing of the first frame, every subsequent rendering instruction can obtain the result of the previous drawing; (3) the first drawing of the next frame references the result of the last frame buffer object of the previous frame, so as to realize cross-frame transmission of the rendered content, while the other frame buffer object realizes the caching of the rendered result; (4) after the image-based sound field visualization algorithm process is realized in one of the rendered basic graphic elements, it can be saved and copied in another basic graphic element and output; (5) any frame buffer object is selected and output to the window, realizing the Cesium-based IBFV process and thus the visualization of sound.
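The swap logic behind steps (1) to (5) can be modeled with two plain objects standing in for the frame cache objects. This is a sketch of the ping-pong mechanism only; in the real platform the two buffers are Cesium framebuffers whose texture color attachments reference each other.

```javascript
// Two buffers alternate roles each pass: one is written while the other
// is read, so the previous result is always available as input.
class PingPong {
  constructor() {
    this.buffers = [{ id: 'A', data: null }, { id: 'B', data: null }];
    this.current = 0; // index of the buffer being written this pass
  }
  // drawFn receives the previous pass's result and returns the new one,
  // standing in for a shader pass that samples the other buffer's texture.
  render(drawFn) {
    const write = this.buffers[this.current];
    const read = this.buffers[1 - this.current];
    write.data = drawFn(read.data); // previous result feeds the next draw
    this.current = 1 - this.current; // swap roles for the next pass
    return write.data;
  }
}
```

On the very first pass the read buffer is empty, matching step (2); thereafter every pass sees the result of the previous one, and either buffer can be output to the window.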
The results of the frame buffer objects are updated through the rendering listener mechanism of Cesium. The rendering process first goes through off-screen rendering: the targets of the two rendering passes are designated as the two created frame buffer objects in the pre-render listener, and the two color attachments in the frame buffer objects are updated in turn in the frame-by-frame pre-render stage. In the screen rendering stage, one of the color attachments is referenced as a texture in the normal way, and the result is rendered to the screen.
The process of the IBFV algorithm based on Cesium is shown in Fig. 1.
The traditional render-to-texture technique uses orthographic projection to position the camera, thereby fixing the texture coordinates: a static texture only needs to be rendered once, and dynamic texture rendering is independent of the state of the screen rendering viewport, which reduces and stabilizes the resource cost of rendering to the texture. This method also makes it convenient to calculate the texture coordinates corresponding to any fixed mesh, simplifying the writing of shader programs. However, simulation experiments with the BIM model show that if the render-to-texture viewport is fixed, the accuracy of the texture coordinates is fixed as well, and the texture accuracy becomes insufficient after the screen rendering viewport is enlarged, resulting in overall blurring of the rendered image.
Because BIM offers good optimization, the project scheme and experimental model can be simulated and optimized according to the geometric information, physical model and rule information provided by BIM. Finally, this platform adopts viewpoint-based dynamic rendering: the binding of the texture cache object is adjusted in real time according to the viewpoint position so that the shooting area of the texture camera only covers the viewport area, thus maintaining high data accuracy within a limited texture size. As the viewpoint approaches, the area covered by the texture cache object shrinks synchronously and the data accuracy improves synchronously, making the visual effect of the displayed sound field more refined.
The main flow of the render-to-texture method is as follows: (1) monitoring changes of the viewport state and updating the transformation matrix; (2) updating the texture output rendered to the texture; (3) updating the texture attachments bound in the uniformMap.
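The viewpoint-based sizing performed in step (1) can be illustrated as follows. The plain rectangle representation and the fixed texture resolution are assumptions made for this sketch; Cesium would express the extent as a `Rectangle` in radians.

```javascript
// Clamp the texture camera's shooting area to the current viewport
// rectangle. Ground units per texel shrink as the viewpoint approaches,
// which is the accuracy gain described above. Coordinates here are
// abstract map units.
function textureCameraExtent(viewRect, textureSize) {
  const width = viewRect.east - viewRect.west;
  const height = viewRect.north - viewRect.south;
  return {
    rectangle: viewRect,                // cover only the viewport area
    texelWidth: width / textureSize,    // ground units per texel (x)
    texelHeight: height / textureSize,  // ground units per texel (y)
  };
}
```

Halving the viewport extent at a fixed texture size halves the ground distance per texel, which is why the sound field display becomes more refined as the viewpoint approaches.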
At present, common hardware configurations can generally bear the increased system overhead of the viewpoint-based dynamic rendering method; the frame rate fluctuation is small, and the effect of the texture optimization is remarkable.
The above is only the preferred embodiment of this application, but the protection scope of this application is not limited to this. Any change or replacement that can be easily thought of by a person familiar with this technical field within the technical scope disclosed in this application should be covered by this application. Therefore, the protection scope of this application should be based on the protection scope of the claims.

Claims (6)

CLAIMS
1. A BIM-based method for dynamic visualization of the noise field of prefabricated buildings adjacent to railways, comprising: constructing a framework of a visualization platform; based on the framework of the visualization platform, using the IBFV algorithm based on Cesium, render-to-texture technology and ping-pong technology for dynamic visualization of the noise field and acoustic visualization respectively; performing dynamic visualization of the noise field based on the results of the dynamic visualization of the noise field and the acoustic visualization.
2. The method according to claim 1, wherein the framework of the visualization platform comprises a three-dimensional environment visualization module and a sound field visualization module, wherein the three-dimensional environment visualization module comprises vector data and a three-dimensional digital model; the sound field visualization module includes a geometry of basic graphic elements and a colored appearance.
3. The method according to claim 2, wherein the vector data include orthophoto map, digital elevation model and groyne, and the three-dimensional digital model includes UAV tilt photogrammetry modeling and BIM model.
4. The method according to claim 1, wherein dynamic visualization of the noise field comprises: rendering the texture of basic graphic elements based on a shader program; saving the original coordinates of the basic graphic elements in the vertex shader; updating several pre-generated textures and corresponding serial numbers asynchronously in uniformMap based on the automatic updating mechanism of Cesium;
drawing the visual texture sound field based on the grid motion, and mixing the noise texture with the previous frame image and following the grid motion, so as to continuously refresh and produce the effect of continuous texture motion; creating the frame cache object in the off-screen buffer based on off-screen rendering technology, setting the output target of rendering as the cache object, and rendering the content to the frame cache after binding the created texture to the cache object.
5. The method according to claim 4, wherein acoustic visualization comprises: in the process of rendering texture of basic graphic elements based on shader program, creating several frames of cache objects with mutually referenced colors and textures, and taking the rendering result of the previous frame as the input of the next frame of cache object, so that the rendered content may be transmitted across frames; selecting any frame of the rendered result of the buffered object and outputting to the window to visualize the rendered result.
6. The method according to claim 4, characterized in that in the process of dynamic visualization of the noise field, a viewpoint-based dynamic rendering method is used, and the binding of texture and cache object is adjusted in real time according to the viewpoint position to render the shooting area of the texture camera, so that the shooting area only covers the viewport area.
LU504427A 2023-06-06 2023-06-06 Bim based dynamic visualization of noise field for prefabricated buildings adjacent to railways LU504427B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
LU504427A LU504427B1 (en) 2023-06-06 2023-06-06 Bim based dynamic visualization of noise field for prefabricated buildings adjacent to railways


Publications (1)

Publication Number Publication Date
LU504427B1 (en) 2023-12-06

Family

ID=89123317

Family Applications (1)

Application Number Title Priority Date Filing Date
LU504427A LU504427B1 (en) 2023-06-06 2023-06-06 Bim based dynamic visualization of noise field for prefabricated buildings adjacent to railways

Country Status (1)

Country Link
LU (1) LU504427B1 (en)


Legal Events

Date Code Title Description
FG Patent granted

Effective date: 20231206