CN112634393A - Web-based near space atmospheric wind field real-time self-adaptive visualization method - Google Patents


Info

Publication number
CN112634393A
CN112634393A (application CN202011630784.8A)
Authority
CN
China
Prior art keywords
wind field
atmospheric wind
time
web
data
Prior art date
Legal status
Granted
Application number
CN202011630784.8A
Other languages
Chinese (zh)
Other versions
CN112634393B (en)
Inventor
詹勤
范湘涛
邰志敏
杜小平
简洪登
Current Assignee
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date
Filing date
Publication date
Application filed by Aerospace Information Research Institute of CAS filed Critical Aerospace Information Research Institute of CAS
Priority to CN202011630784.8A priority Critical patent/CN112634393B/en
Publication of CN112634393A publication Critical patent/CN112634393A/en
Application granted granted Critical
Publication of CN112634393B publication Critical patent/CN112634393B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06T15/005 General purpose rendering architectures
    • G06T15/20 Perspective computation
    • G06T3/4007 Scaling of whole images or parts thereof, e.g. expanding or contracting, based on interpolation, e.g. bilinear interpolation
    • G06T2210/56 Particle system, point based geometry or rendering
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention provides a Web-based real-time adaptive visualization method for the near space atmospheric wind field. The number and distribution area of particles are adaptively adjusted in real time according to dynamic changes in the field of view, so that an appropriate particle count and density are obtained at different viewpoint heights and view ranges. Streamlines of suitable density are then generated by a particle-tracing method to express the motion characteristics of the near space atmospheric wind field in the visible region. The method fully expresses the overall trend of the atmospheric wind field, depicts its local detail features more finely, and achieves smooth dynamic visualization of the wind field at different resolutions.

Description

Web-based near space atmospheric wind field real-time self-adaptive visualization method
Technical Field
The invention relates to the intersection of 3D visualization and near space atmospheric wind field technology, and in particular to a Web-based real-time adaptive visualization method for the near space atmospheric wind field.
Background
The near space atmosphere, approximately 20-100 km above the ground, is an important component of the Earth's atmosphere. Its complex state changes and dynamic disturbances directly affect human activities such as the safety of near space aircraft, aerospace operations, and wireless information transmission, and it also has an important influence on climate change. The near space atmospheric wind field is one of the key parameters characterizing the near space atmospheric environment; intuitively revealing its spatiotemporal distribution and variation patterns through visualization is of great significance for improving understanding of the near space atmospheric environment.
Near space atmospheric wind field data contain both wind speed magnitude and wind direction information, making them vector field data. Commonly used vector field visualization methods fall into the following categories: (1) direct visualization, represented by point-icon methods; (2) geometric visualization using vector lines; (3) texture-based visualization; (4) feature-based visualization.
However, each of these methods has drawbacks. Point-icon and texture methods are better suited to two-dimensional vector visualization and suffer from icon ambiguity and texture occlusion in three-dimensional settings. Feature-based methods focus on feature extraction, so the visualization result depends heavily on the quality of the extraction algorithm. Geometry-based flow field visualization expresses the continuity and dynamic characteristics of vector field data through particle animation and is considered a better approach for three-dimensional vector fields. Streamline visualization is currently the most widely used geometric method: it describes vector field features by the trajectories that particles trace over time, matches the motion characteristics of flow fields, is easy to implement, and is mainly applied to oceanic and meteorological vector fields. However, generating streamlines of appropriate density remains a difficult and active research problem: streamlines that are too dense cause visual confusion, while streamlines that are too sparse lose flow field feature information. Moreover, there has been little discussion of real-time dynamic three-dimensional visualization of vertically layered near space atmospheric wind field data on a Web digital earth platform.
Guo Changshun et al. proposed a multi-level moving-streamline visualization method for near space vector field data based on a Web digital earth platform, realizing multi-level, multi-temporal, dynamic and interactive visualization of near space atmospheric wind field data. However, that method focuses on displaying global wind field characteristics: local wind field features are lost because the streamline distribution is too sparse, and its use of time-series animation to switch rapidly between wind field data at different heights easily causes visual confusion.
Therefore, a Web-based adaptive visualization system and method for the near space atmospheric wind field is urgently needed, one that fully expresses the overall trend of the atmospheric wind field, depicts its local detail features more carefully, and achieves vivid, smooth, dynamic visualization at different resolutions; such a method has very important practical value and significance.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a Web-based real-time adaptive visualization method for the near space atmospheric wind field, mainly addressing the problem of how to dynamically express the vector characteristics of the near space atmospheric wind field with streamlines of appropriate density on a three-dimensional Web digital earth.
In order to achieve the purpose, the invention provides a near space atmospheric wind field real-time self-adaptive visualization method based on Web, which specifically comprises the following steps:
s1: acquiring near space atmospheric wind field data and preprocessing the data to obtain atmospheric wind field wind speed PNG pictures;
s2: acquiring the viewpoint height based on the PNG picture and mouse events, determining the view range and the number of initialized particles, and adaptively and dynamically initializing and placing seed points;
s3: iteratively updating the particle positions from the adaptively and dynamically initialized seed points to generate dynamic streamlines;
s4: performing color mapping and transparency control on the seed points based on the dynamic streamlines;
s5: rendering output based on Cesium coordinate conversion and streamline visualization, i.e., converting the streamline coordinates and drawing the streamlines with the color mapping and transparency, finally obtaining the near space atmospheric wind field visualization on the three-dimensional virtual earth.
Preferably, the step S1 is specifically:
s11: acquiring 4D vertically layered near space atmospheric wind field data in real time by scanning an FTP data server;
s12: converting the 4D vertical hierarchical data into 2D grid data through dimensionality reduction, and obtaining regular two-dimensional grid data through a bilinear interpolation method;
s13: and making the regular two-dimensional grid data into an atmospheric wind field wind speed PNG picture, and transmitting the atmospheric wind field wind speed PNG picture to the front end through a network based on an interactive selection mode of the front end to a time layer and a height layer.
Preferably, the steps S12 to S13 are specifically:
converting the 4D vertically layered near space atmospheric wind field data into 3D wind field data at a series of times through time dimension reduction; converting the 3D wind field data into 2D wind field grid data at a series of heights through height dimension reduction; and then converting the 2D wind field grid data from scalar space to RGB color space to generate the atmospheric wind field wind speed PNG picture.
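As a minimal sketch of this two-step reduction (the array dimensions and synthetic values below are illustrative assumptions; the real input is a NetCDF variable):

```python
# Sketch of the time + height dimension reduction described above.
# Dimensions and data are illustrative assumptions; real input is NetCDF.
N_T, N_H, N_LAT, N_LON = 3, 5, 4, 6

# 4D field W(T, H, Y, X) stored as nested lists: W[t][h][y][x]
W = [[[[t + h + y + x for x in range(N_LON)]
       for y in range(N_LAT)]
      for h in range(N_H)]
     for t in range(N_T)]

def reduce_to_2d(W4d, t_index, h_index):
    """Time reduction (4D -> 3D) then height reduction (3D -> 2D)."""
    W3d = W4d[t_index]      # W_Tn(X, Y, H): 3D field at time T_n
    return W3d[h_index]     # W_Tn,Hm(X, Y): 2D grid at height H_m

grid = reduce_to_2d(W, t_index=1, h_index=2)
print(len(grid), len(grid[0]))  # 4 6
```

Each front-end request for a given time layer and height layer thus selects exactly one such 2D grid.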
Preferably, the step S2 is specifically:
s21: initializing the number of seed points in the PNG picture in the global view range;
s22: acquiring a current view range through the viewpoint height and the camera position, and calculating the number of seed points in the current view range in real time according to the viewpoint height change;
s23: and uniformly scattering the seed points in the current field of view through a random noise function, and assigning each seed point a life cycle.
Preferably, the step S3 is specifically:
s31: calculating the position of the current seed point by fourth-order Runge-Kutta numerical integration from the initialized position of the seed point and the velocity at that position, and increasing the life value of the seed point;
s32: continuously performing fourth-order Runge-Kutta numerical integration according to the speed of the current seed point position to obtain the position of the seed point at the next moment, and continuously increasing the life value of the seed point at the next moment;
s33: repeating the steps S31-S32 until the life cycle of the seed point is finished or the position of the seed point exceeds the view range, stopping the operation, and clearing the seed point which exceeds the view range;
s34: and all positions where the seed points pass are sequentially connected to form a seed point motion track curve, so that a dynamic streamline is generated.
Preferably, after the seed point's life cycle ends, whether a seed point is regenerated at the ending position is determined by a random variable.
Preferably, the step S4 is specifically:
mapping the velocity scalar values of the seed points to corresponding color values; and the transparency parameter of the seed point is adopted to control the transparency of the streamline of the atmospheric wind field.
Preferably, the expression for mapping the velocity scalar values of the seed points to corresponding color values is:
y(x) = (Σ_{θ ≤ x} f(θ)) / (Σ_θ f(θ))
where x is the intensity of the vector, f(θ) is the number of points with intensity θ, and y(x) is the intensity value corresponding to the abscissa of the color mapping table, with value range [0, 1].
Preferably, the calculation formula of the transparency parameter is as follows:
r = L_t / T
where r is the particle transparency at time t; L_t is the particle life value at time t, with L_t = L_{t-1} + Δt, where Δt is the interval between times t-1 and t; T is the life cycle of the particle, with 0 ≤ L_t ≤ T; the transparency r therefore ranges over 0 ≤ r ≤ 1.
Compared with the prior art, the invention has the beneficial effects that:
(1) The invention provides an adaptive dynamic visualization method for vertically layered near space atmospheric wind field data based on a Web virtual earth. The particle count and distribution area are adjusted dynamically and adaptively in real time according to changes in the field of view, so that an appropriate particle count and density are obtained at every viewpoint height and view range during interactive visualization, and streamlines of suitable density are generated to express the regional wind field's motion characteristics. The method thus fully expresses the overall trend of the atmospheric wind field, depicts its local detail features more finely, and achieves vivid, smooth, dynamic visualization at different resolutions.
(2) The invention automates the full pipeline of real-time acquisition, preprocessing, and dynamically loaded visualization of near space atmospheric wind field data in a Web environment, providing a new Web-virtual-earth-based three-dimensional visualization mode for real-time dynamic updating and publishing of the near space atmospheric wind field.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without inventive exercise.
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a diagram of a near space atmospheric wind field data dimension reduction process according to the present invention;
FIG. 3 is a comparison graph of the visualization effect of different global initialization particle counts according to the present invention; wherein, (a) is a visualization effect map with the total number of the initialized particles being 128 × 128, (b) is a visualization effect map with the total number of the initialized particles being 256 × 256, and (c) is a visualization effect map with the total number of the initialized particles being 512 × 512;
FIG. 4 is a comparison graph of global visualization effects before and after adaptation of the near space atmospheric wind field when the initial particle count is 200 × 200; wherein (a) is the global visualization effect before adaptation, and (b) is the global visualization effect after adaptation;
FIG. 5 is a comparison graph of local-region visualization effects before and after adaptation of the near space atmospheric wind field; wherein (a) is a region near a gulf channel before adaptation, (b) is the same region after adaptation, (c) is a region near the Bay of Bengal before adaptation, (d) is the same region after adaptation, (e) is a region near the Bengali Islands before adaptation, and (f) is the same region after adaptation.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Example 1
Referring to fig. 1, the invention provides a Web-based near space atmospheric wind field real-time adaptive visualization method, which specifically includes the following steps:
s1: acquiring near space atmospheric wind field data in real time, and preprocessing the data;
First, the invention obtains vertically layered near space atmospheric wind field data in real time by periodically scanning an FTP data server. The data are assimilation data published by the National Space Science Center, Chinese Academy of Sciences, in NetCDF format, covering the global 20-100 km range at 2-hour intervals, with a spatial resolution of 2.5° in longitude, 2° in latitude, and 1 km in height.
The NetCDF data are then parsed, and the 4D wind field data are converted by dimension reduction into series of 2D atmospheric wind field grid data for different height layers at different times. Bilinear interpolation is then applied to each single-layer atmospheric wind field in the longitude and latitude directions to generate regular two-dimensional grid data with 1° spacing in both longitude and latitude.
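The bilinear resampling step can be sketched as follows (a simplified illustration assuming a regular grid anchored at the origin with no longitude wrap-around; the actual preprocessing may differ in these details):

```python
def bilinear_resample(grid, lat_step, lon_step, dst_step=1.0):
    """Resample a regular lat/lon grid to dst_step spacing by bilinear
    interpolation. Sketch only: assumes grid[i][j] sits at
    (lat = i * lat_step, lon = j * lon_step) with no wrap-around."""
    n_lat, n_lon = len(grid), len(grid[0])
    lat_max = (n_lat - 1) * lat_step
    lon_max = (n_lon - 1) * lon_step
    out = []
    lat = 0.0
    while lat <= lat_max + 1e-9:
        i = min(int(lat / lat_step), n_lat - 2)   # lower cell row
        ty = lat / lat_step - i                   # fractional offset in lat
        row = []
        lon = 0.0
        while lon <= lon_max + 1e-9:
            j = min(int(lon / lon_step), n_lon - 2)
            tx = lon / lon_step - j
            # weighted average of the four surrounding grid values
            row.append((1 - ty) * (1 - tx) * grid[i][j]
                       + (1 - ty) * tx * grid[i][j + 1]
                       + ty * (1 - tx) * grid[i + 1][j]
                       + ty * tx * grid[i + 1][j + 1])
            lon += dst_step
        out.append(row)
        lat += dst_step
    return out
```

For example, a 2°-latitude by 2.5°-longitude coarse grid passed through this function yields a 1° × 1° regular grid; bilinear interpolation reproduces any linear field exactly.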
Finally, the regular grid data are converted from scalar space to RGB color space to produce the atmospheric wind field wind speed PNG pictures. Through the front end's interactive selection of time and height layers, the corresponding PNG picture is transmitted over the network to the front end for visual rendering of the atmospheric wind field.
The process of reducing the 4D near space atmospheric wind field data to 2D grid data is shown in FIG. 2. The near space atmospheric wind field is a 4D discrete function with longitude X, latitude Y, altitude H, and time T as arguments; W(X, Y, H, T) denotes the atmospheric wind field attribute values at (X, Y, H, T), usually expressed by the zonal wind speed W_u and the meridional wind speed W_v. Time dimension reduction converts the 4D wind field data into 3D wind field data at a series of times T_n, i.e., W_Tn(X, Y, H); height dimension reduction then converts the 3D wind field data into 2D wind field grid data at a series of heights H_m, i.e., W_Tn,Hm(X, Y).
After the 4D atmospheric wind field data are converted into series of regular 2D grid data, the 2D grid data must be converted from scalar space to RGB color space to generate the atmospheric wind field wind speed PNG picture. Let the RGB color value be (R, G, B), where R, G ∈ [0, 255] represent the zonal wind and the meridional wind respectively, and the B component is always assigned 0. The specific conversion formulas are as follows:
R = Min{[(W_u(X, Y) - W_u_min) / (W_u_max - W_u_min)] × 256, 255}  (1)
G = Min{[(W_v(X, Y) - W_v_min) / (W_v_max - W_v_min)] × 256, 255}  (2)
where W_u(X, Y) is the zonal wind speed at position (X, Y) in the 2D grid data of the current height layer, W_u_min and W_u_max are the minimum and maximum zonal wind speeds of the current height layer; W_v(X, Y) is the meridional wind speed at position (X, Y) in the current height layer's 2D grid data, and W_v_min and W_v_max are the minimum and maximum meridional wind speeds of the current height layer.
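Formulas (1) and (2) can be sketched directly (assuming the per-layer minima and maxima are supplied by the caller; writing the resulting pixels to an actual PNG file is omitted here):

```python
def encode_channel(grid, w_min, w_max):
    """Map wind speeds to a 0-255 channel per formulas (1)-(2):
    channel = Min{[(W - W_min) / (W_max - W_min)] * 256, 255}."""
    return [[min(int((w - w_min) / (w_max - w_min) * 256), 255)
             for w in row] for row in grid]

def encode_wind_to_rgb(u_grid, v_grid, u_min, u_max, v_min, v_max):
    """R encodes the zonal wind, G the meridional wind; B is always 0."""
    R = encode_channel(u_grid, u_min, u_max)
    G = encode_channel(v_grid, v_min, v_max)
    rows, cols = len(R), len(R[0])
    return [[(R[y][x], G[y][x], 0) for x in range(cols)] for y in range(rows)]
```

The front-end shader can invert this mapping (given the same per-layer minima and maxima) to recover the wind vector from each sampled pixel.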
S2: based on the number of the seed points in the vision field range, self-adaptive dynamic initialization placement is carried out;
The number and placement strategy of the seed points play a key role in the streamline distribution of the near space atmospheric wind field. To obtain streamlines of appropriate density in both global and local views, and to avoid visual confusion from overly dense streamlines or loss of local wind field features from overly sparse ones, seed points of appropriate density must be scattered within the field of view.
The method initializes the total number of particles (i.e., seed points) in the global view range, obtains the current field of view from the viewpoint height and camera position, recalculates the particle count for the current field of view in real time as the viewpoint height changes, and uses a random noise function to scatter seed points uniformly within the current field of view, assigning each particle a life cycle. Whenever the field of view changes, the particle count is automatically recomputed and the particles re-scattered to keep the seed points uniformly distributed within the field of view, and the initialized particle positions and life values are recorded in a particle-state two-dimensional texture.
The particle number calculation formula in the current view field range is as follows:
(Formula (3) is provided only as an image in the original publication.)
where n is the number of particles in the current field of view, k is the initial particle count in the global view (based on repeated experimental comparisons, the invention sets the global total k to 200 × 200), and h is the current viewpoint height (in meters).
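Since formula (3) appears only as an image in the original publication, the sketch below is a hypothetical stand-in, not the patented formula: it merely illustrates the stated behavior that the current-view particle count n is derived from the global total k and shrinks as the viewpoint height h decreases. The cutoff height h_global is an assumed value.

```python
def adaptive_particle_count(k, h, h_global=2.0e7):
    """Hypothetical stand-in for formula (3) (image-only in the original).
    Assumption: n scales with the viewpoint height h (metres) relative to an
    assumed full-globe viewpoint height h_global, never exceeding the global
    total k and never dropping below one particle."""
    fraction = min(1.0, h / h_global)
    return max(1, int(k * fraction))
```

With k = 200 × 200 = 40,000, a viewpoint at the assumed full-globe height keeps all particles, while lower viewpoints seed proportionally fewer particles over the smaller visible area.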
S3: generating a dynamic streamline;
Starting from each seed point's initialized position and the velocity at that position, the current particle position is computed by fourth-order Runge-Kutta numerical integration while the particle's life value is increased. Integration then continues using the velocity at the current position to obtain the particle position at the next time step, again increasing the life value, and so on: the particle position is iteratively updated and the life value accumulated until the particle's life cycle ends or the particle leaves the field of view, at which point the particle is removed. All positions the particle passes through are connected in order to form its motion trajectory, generating a dynamic streamline. To keep the particle count balanced, a random variable (with value 0 or 1) determines whether a particle is regenerated at its end position after its life cycle ends. This step stores each particle's position, life value, and velocity scalar at every time step in the corresponding particle-state texture.
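The advection loop of this step can be sketched as follows (a simplified CPU illustration assuming a 2D velocity-field callback; the actual implementation runs on the GPU via particle-state textures):

```python
def rk4_step(pos, velocity, dt):
    """One fourth-order Runge-Kutta step of particle advection.
    pos is (x, y); velocity(pos) returns the wind vector (u, v) there."""
    def add(p, v, s):
        return (p[0] + v[0] * s, p[1] + v[1] * s)
    k1 = velocity(pos)
    k2 = velocity(add(pos, k1, dt / 2))
    k3 = velocity(add(pos, k2, dt / 2))
    k4 = velocity(add(pos, k3, dt))
    return (pos[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            pos[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def trace_streamline(seed, velocity, dt, life_cycle, in_view):
    """Advance a seed until its life cycle ends or it leaves the view,
    collecting the positions it passes through (the dynamic streamline)."""
    path, pos, life = [seed], seed, 0.0
    while life < life_cycle and in_view(pos):
        pos = rk4_step(pos, velocity, dt)
        life += dt
        path.append(pos)
    return path
```

On a constant wind field the RK4 step reduces to simple forward motion, which makes the scheme easy to sanity-check before plugging in the PNG-decoded wind grid.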
S4: seed point color mapping and transparency control;
To express the intensity of the vector field intuitively and uniformly, the velocity scalar value of each seed point in the vector field data is mapped to a corresponding color value. Meanwhile, a particle transparency parameter tied to the particle life value dynamically adjusts the streamline transparency during streamline generation, so that transparency increases gradually from the head to the tail of each streamline, better expressing the direction of the atmospheric wind field.
The velocity-color mapping formula is
y(x) = (Σ_{θ ≤ x} f(θ)) / (Σ_θ f(θ))   (4)
where x represents the intensity of the vector, f(θ) represents the number of points with intensity θ, and y(x) represents the intensity value corresponding to the abscissa of the color mapping table, in the range [0, 1]. This color mapping helps counteract uneven intensity variation and excessive color concentration in the vector field.
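If the velocity-color mapping is read as a normalized cumulative histogram (an assumption; the formula itself is published only as an image), it can be sketched as:

```python
def equalized_mapping(intensities):
    """Speed-to-color-table mapping under the cumulative-histogram reading:
    y(x) is the normalized count of sample points with intensity <= x, so
    roughly equal numbers of points fall into each color band."""
    ordered = sorted(intensities)
    total = len(ordered)
    def y(x):
        count = sum(1 for v in ordered if v <= x)  # cumulative f(theta)
        return count / total                        # in [0, 1]
    return y
```

The returned y(x) is then used as the abscissa into the color mapping table, spreading colors evenly over the observed speed distribution rather than over the raw speed range.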
The transparency parameter calculation formula is as follows:
r = L_t / T   (5)
where r represents the particle transparency at time t; L_t is the particle life value at time t, with L_t = L_{t-1} + Δt, where Δt is the interval between times t-1 and t; T represents the life cycle of the particle, with 0 ≤ L_t ≤ T; the transparency r therefore ranges over 0 ≤ r ≤ 1. The transparency of a particle increases as its life value increases.
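Formula (5) is simple enough to state directly as code:

```python
def particle_transparency(life_value, life_cycle):
    """Transparency r = L_t / T from formula (5): r grows from 0 toward 1 as
    the particle ages, fading the streamline from head to tail."""
    if not 0.0 <= life_value <= life_cycle:
        raise ValueError("life value must satisfy 0 <= L_t <= T")
    return life_value / life_cycle
```

Each frame, the renderer evaluates this per particle from the life value stored in the particle-state texture and applies it as the alpha of the drawn streamline segment.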
S5: and outputting the visual wind field based on the adjacent space of the Web virtual earth.
Coordinate conversion and streamline visualization rendering output are implemented with Cesium, finally realizing the visualization of the near space atmospheric wind field on the three-dimensional virtual earth.
To verify the technical effect, the invention compares the rendering efficiency and visualization effect of the near space wind field under different global initialization particle totals; the rendering performance comparison for different totals is shown in Table 1:
TABLE 1
(Table 1 is provided only as an image in the original publication.)
Referring to FIG. 3, the following conclusions are drawn: with 128 × 128 (16,384) particles, the global visualization's streamlines are too sparse; with 256 × 256 (65,536) particles, a better visualization effect is obtained; with 512 × 512 (262,144) particles, the streamlines become dense and even turbulent, and rendering efficiency drops markedly. Based on this comparison, the total global initialization particle count is set to 200 × 200 (40,000), which preserves the global visualization effect while maintaining rendering efficiency.
With 200 × 200 as the global initialization particle total, the streamline rendering performance and visualization effect at different viewpoints before and after adaptation were analyzed qualitatively and quantitatively; the rendering performance and visualization effect comparison for the adaptive method (global initialization particle count 200 × 200) is shown in Table 2:
TABLE 2
(Table 2 is provided only as an image in the original publication.)
Referring to FIGS. 4-5 and Table 2, the adaptive method not only fully expresses the overall motion characteristics of the wind field as the viewpoint height changes, but also depicts local wind field detail with a more reasonable streamline layout, avoiding the loss of local vector field feature information while maintaining streamline rendering efficiency, and achieving vivid, smooth, dynamic interactive visualization of the near space atmospheric wind field.
In summary, the invention provides a Web-based real-time adaptive visualization method for the near space atmospheric wind field. The particle count and distribution area are adaptively adjusted in real time according to dynamic changes in the field of view, so that an appropriate particle count and density are obtained at different viewpoint heights and view ranges, and streamlines of suitable density are generated by a particle-tracing method to express the regional wind field's motion characteristics. The method fully expresses the overall trend of the atmospheric wind field, depicts its local detail features more finely, and achieves vivid, smooth, dynamic visualization at different resolutions.
The above-described embodiments merely illustrate preferred embodiments of the present invention and do not limit its scope; various modifications and improvements made by those skilled in the art to the technical solutions of the present invention without departing from its spirit shall fall within the protection scope defined by the claims.

Claims (9)

1. A Web-based near space atmospheric wind field real-time self-adaptive visualization method is characterized by specifically comprising the following steps:
s1: acquiring near space atmospheric wind field data, and preprocessing the data to obtain an atmospheric wind field wind speed PNG picture;
s2: acquiring viewpoint height based on the PNG picture and mouse event operation, determining a view range and the number of initialized particles, and adaptively and dynamically initializing and placing seed points;
s3: iteratively updating the positions of the particles according to the self-adaptive dynamic initialized seed points to generate a dynamic streamline;
s4: performing color mapping and transparency control on the seed points based on the dynamic streamline;
s5: rendering output based on Cesium coordinate conversion and streamline visualization, namely converting the coordinates of the streamline, drawing the streamline in combination with the color mapping and transparency, and finally obtaining the near space atmospheric wind field visualization on the three-dimensional virtual earth.
2. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 1, wherein the step S1 is specifically as follows:
s11: acquiring 4D vertically layered data of the near space atmospheric wind field in real time by scanning an FTP data server;
s12: converting the 4D vertical hierarchical data into 2D grid data through dimensionality reduction, and obtaining regular two-dimensional grid data through a bilinear interpolation method;
s13: and making the regular two-dimensional grid data into an atmospheric wind field wind speed PNG picture, and transmitting the atmospheric wind field wind speed PNG picture to the front end through a network based on an interactive selection mode of the front end to a time layer and a height layer.
3. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 2, wherein the steps S12-S13 are specifically as follows:
converting the 4D vertically layered data of the near space atmospheric wind field into 3D wind field data at a series of moments through time dimension reduction; converting the 3D wind field data into 2D wind field grid data at a series of heights through height dimension reduction; and then converting the 2D wind field grid data from scalar space to RGB color space to generate the atmospheric wind field wind speed PNG picture.
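The scalar-to-RGB step above can be sketched as follows. The claim only states that the 2D grid is converted from scalar space to RGB color space, so the channel layout (u in R, v in G, B unused) and the normalization ranges here are assumptions for illustration:

```python
import numpy as np

def encode_wind_to_rgb(u, v, u_range=(-100.0, 100.0), v_range=(-100.0, 100.0)):
    """Pack 2D wind-component grids into an 8-bit RGB array.

    Assumed layout (not specified in the claim): R holds the normalized
    zonal component u, G the meridional component v, B is unused.
    A PNG encoder (e.g. Pillow's Image.fromarray) can then write the
    array out as the wind-speed picture sent to the Web front end.
    """
    u_min, u_max = u_range
    v_min, v_max = v_range
    r = np.clip((u - u_min) / (u_max - u_min), 0.0, 1.0)
    g = np.clip((v - v_min) / (v_max - v_min), 0.0, 1.0)
    b = np.zeros_like(r)
    return np.rint(np.stack([r, g, b], axis=-1) * 255).astype(np.uint8)
```

The front end reverses this mapping when sampling the texture, recovering (u, v) from the pixel values before particle tracing.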
4. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 1, wherein the step S2 is specifically as follows:
s21: initializing the number of seed points in the PNG picture in the global view range;
s22: acquiring a current view range through the viewpoint height and the camera position, and calculating the number of seed points in the current view range in real time according to the viewpoint height change;
s23: and uniformly scattering the seed points in the current view range through a random noise function, and assigning each seed point a life cycle.
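A minimal sketch of the adaptive seeding in steps S21-S23. The claim only requires the seed count to track the view range and the seeds to be scattered by a random noise function; the linear area-proportional scaling and the uniform-random scattering used here are assumptions:

```python
import random

def seed_count(n_global, global_area, view_area):
    # Scale the global seed budget by the fraction of the field
    # that is currently visible (assumed proportional relationship).
    return max(1, round(n_global * view_area / global_area))

def scatter_seeds(n, lon_range, lat_range, life_cycle, rng=None):
    """Uniformly scatter n seed points in the current view range and
    assign each a randomized remaining life, so the resulting
    streamlines do not all expire on the same frame."""
    rng = rng or random.Random(0)
    return [(rng.uniform(*lon_range),
             rng.uniform(*lat_range),
             rng.randrange(life_cycle))
            for _ in range(n)]
```

As the camera zooms in, `view_area` shrinks and the seed count drops proportionally, which is what keeps the streamline density roughly constant on screen.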
5. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 1, wherein the step S3 is specifically as follows:
s31: calculating the position of the current seed point by fourth-order Runge-Kutta numerical integration from the initialized seed point position and the velocity at that position, and increasing the life value of the seed point;
s32: continuously performing fourth-order Runge-Kutta numerical integration according to the speed of the current seed point position to obtain the position of the seed point at the next moment, and continuously increasing the life value of the seed point at the next moment;
s33: repeating the steps S31-S32 until the life cycle of the seed point is finished or the position of the seed point exceeds the view range, stopping the operation, and clearing the seed point which exceeds the view range;
s34: and all positions where the seed points pass are sequentially connected to form a seed point motion track curve, so that a dynamic streamline is generated.
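Steps S31-S34 amount to a standard fourth-order Runge-Kutta particle trace; a sketch is given below. The step size and the velocity-sampling callback are assumptions, since the claim does not fix them:

```python
def rk4_step(pos, velocity, h):
    """Advance one particle by one fourth-order Runge-Kutta step.
    velocity(p) returns the (u, v) wind vector at position p."""
    x, y = pos
    k1 = velocity((x, y))
    k2 = velocity((x + 0.5 * h * k1[0], y + 0.5 * h * k1[1]))
    k3 = velocity((x + 0.5 * h * k2[0], y + 0.5 * h * k2[1]))
    k4 = velocity((x + h * k3[0], y + h * k3[1]))
    return (x + h * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0,
            y + h * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0)

def trace_streamline(seed, velocity, h, life_cycle, in_view):
    """Connect successive RK4 positions into a streamline, stopping when
    the seed's life cycle ends or it leaves the view range (S33)."""
    path = [seed]
    for _ in range(life_cycle):
        nxt = rk4_step(path[-1], velocity, h)
        if not in_view(nxt):
            break
        path.append(nxt)
    return path
```

In a uniform wind field the RK4 step is exact, so each step simply advances the particle by h times the wind vector.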
6. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 5,
and after the life cycle of the seed point is ended, determining whether the seed point is regenerated at the ending position through a random variable.
7. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 1, wherein the step S4 is specifically as follows:
mapping the velocity scalar value of each seed point to a corresponding color value, and using the transparency parameter of the seed point to control the transparency of the atmospheric wind field streamline.
8. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 7, wherein the expression for mapping the velocity scalar values of the seed points to corresponding color values is:
Figure FDA0002880010020000031
where x is the intensity of the vector, f(θ) is the number of points with intensity θ, and y(x) is the intensity value corresponding to the abscissa of the color mapping table, with value range [0,1].
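The expression itself survives only as an image placeholder in the source. One reading consistent with the variable definitions and the stated [0,1] range is a cumulative-histogram (equalization-style) mapping, sketched here purely as an assumption:

```python
def color_table_abscissa(hist, x):
    """Assumed reconstruction: y(x) = sum of f(theta) for theta <= x,
    divided by the total point count.

    hist[theta] is f(theta), the number of points with intensity theta.
    The cumulative form yields a monotonic value in [0, 1] that can
    index the abscissa of a color mapping table.
    """
    total = sum(hist)
    return sum(hist[:x + 1]) / total
```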
9. The Web-based near space atmospheric wind field real-time adaptive visualization method according to claim 6, wherein the calculation formula of the transparency parameter is as follows:
r = L_t / T
wherein r is the particle transparency at time t; L_t is the particle life value at time t, with L_t = L_{t-1} + Δt, where Δt is the time interval between times t-1 and t; T is the life cycle of the particle, with 0 ≤ L_t ≤ T; the transparency r accordingly ranges over 0 ≤ r ≤ 1.
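The transparency rule of claim 9 reduces to a single line; a sketch follows (the clamp is an addition for numerical safety, not part of the claim):

```python
def particle_transparency(life_value, life_cycle):
    """r = L_t / T: transparency grows linearly over the particle's
    life cycle, clamped so that 0 <= r <= 1."""
    return max(0.0, min(1.0, life_value / life_cycle))
```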
CN202011630784.8A 2020-12-31 2020-12-31 Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web Active CN112634393B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011630784.8A CN112634393B (en) 2020-12-31 2020-12-31 Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web


Publications (2)

Publication Number Publication Date
CN112634393A true CN112634393A (en) 2021-04-09
CN112634393B CN112634393B (en) 2023-09-05

Family

ID=75289764

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011630784.8A Active CN112634393B (en) 2020-12-31 2020-12-31 Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web

Country Status (1)

Country Link
CN (1) CN112634393B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991509A (en) * 2021-04-13 2021-06-18 深圳市万向信息科技有限公司 WebGL-based three-dimensional wind field inversion method, system, device and storage medium
CN113158106A (en) * 2021-04-27 2021-07-23 中国石油大学(华东) Visualization method based on NetCDF flooding data
CN115423917A (en) * 2022-08-16 2022-12-02 中国人民解放军海军指挥学院 Real-time intelligent drawing method and system for global three-dimensional wind field

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103606192A (en) * 2013-11-27 2014-02-26 国家电网公司 Wind field visual display method based on three-dimensional virtual globe
CN107102309A (en) * 2017-04-28 2017-08-29 北京怡孚和融科技有限公司 Wind field spatial distribution is converted into the method and aerosol spatial and temporal distributions of wind field time orientation and the stacking method of wind field spatial and temporal distributions
CN107170044A (en) * 2017-05-09 2017-09-15 福州大学 A kind of dynamic and visual method of the wind based on dimensional topography
CN109063279A (en) * 2018-07-16 2018-12-21 南京信息工程大学 Three-dimensional space wind field Dynamic Simulation Method based on particle flux trajectory track algorithm
CN111582547A (en) * 2020-04-09 2020-08-25 中国科学院国家空间科学中心 Method for acquiring wind field distribution at different places by using wind field data set
CN112100299A (en) * 2020-08-20 2020-12-18 四川大学 Visualization method for emergency early warning of sudden toxic gas leakage


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Qian, FAN Yin, WANG Jikui: "Wind field visualization method based on particle tracking", Journal of PLA University of Science and Technology (Natural Science Edition), no. 01 *



Similar Documents

Publication Publication Date Title
CN112634393A (en) Web-based near space atmospheric wind field real-time self-adaptive visualization method
US20210232733A1 (en) Systems and methods for computer simulation of detailed waves for large-scale water simulation
CN111223191A (en) Large-scale scene infrared imaging real-time simulation method for airborne enhanced synthetic vision system
CN106228594A (en) Typhoon model cloud cartoon display method based on surface subdivision
CN104157000B (en) The computational methods of model surface normal
CN109461197B (en) Cloud real-time drawing optimization method based on spherical UV and re-projection
Spick et al. Realistic and textured terrain generation using GANs
Wang et al. Capturing the dance of the earth: PolarGlobe: Real-time scientific visualization of vector field data to support climate science
WO2016076394A1 (en) Image processing apparatus, image processing method, and image processing program
CN115953551A (en) Sparse grid radiation field representation method based on point cloud initialization and depth supervision
CN110852952A (en) GPU-based large-scale terrain real-time rendering method
CN107977511A (en) A kind of industrial design material high-fidelity real-time emulation algorithm based on deep learning
KR20100138073A (en) System and method for rendering fluid flow
Ruzínoor et al. 3D terrain visualisation for GIS: A comparison of different techniques
CN115859755B (en) Visualization method, device, equipment and medium for steady flow field vector data
JP4656633B2 (en) Volume data rendering system and volume data rendering processing program
CN105894438A (en) GPU based multi-time-frame high-perception two-dimensional streamline organization algorithm
CN114373058A (en) Sea surface mesh dynamic division and height field generation method based on illusion engine
Justice et al. A process to create dynamic landscape paintings using barycentric shading with control paintings
CN115035231A (en) Shadow baking method, shadow baking device, electronic apparatus, and storage medium
Boorboor et al. Submerse: Visualizing Storm Surge Flooding Simulations in Immersive Display Ecologies
CN108961412B (en) Three-dimensional cloud simulation method based on self-adaptive far-field grid
CN114627258B (en) Method and system for isomorphic modeling of gravity field catamaran spheres
CN117593471B (en) Ocean three-dimensional situation visualization platform based on illusion engine
CN115222880A (en) Three-dimensional cloud scene programmed modeling method, device and equipment based on atmosphere layered model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant