CN112634393B - Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web - Google Patents


Info

Publication number
CN112634393B
Authority
CN
China
Prior art keywords: wind field, atmospheric wind, data, seed, space
Prior art date: 2020-12-31
Legal status
Active
Application number
CN202011630784.8A
Other languages
Chinese (zh)
Other versions
CN112634393A (en)
Inventor
詹勤
范湘涛
邰志敏
杜小平
简洪登
Current Assignee
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date: 2020-12-31
Filing date: 2020-12-31
Publication date: 2023-09-05
Application filed by Aerospace Information Research Institute of CAS
Priority to CN202011630784.8A
Publication of CN112634393A
Application granted
Publication of CN112634393B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/001: Texturing; Colouring; Generation of texture or colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213: Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/10: Geometric effects
    • G06T15/20: Perspective computation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformation in the plane of the image
    • G06T3/40: Scaling the whole image or part thereof
    • G06T3/4007: Interpolation-based scaling, e.g. bilinear interpolation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00: Indexing scheme for image generation or computer graphics
    • G06T2210/56: Particle system, point based geometry or rendering
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention provides a Web-based real-time self-adaptive visualization method for the near-space atmospheric wind field. The number and distribution area of particles are adaptively adjusted in real time according to dynamic changes of the view range, so that appropriate particle counts and densities are obtained at different viewpoint heights and view extents. Streamlines of suitable density are then generated by particle tracking to express the motion characteristics of the regional near-space atmospheric wind field. The method fully expresses the overall trend of the atmospheric wind field while depicting its local detail features more finely, realizing vivid and smooth dynamic visualization of the atmospheric wind field at different resolutions.

Description

Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web
Technical Field
The invention relates to the intersection of 3D visualization and near-space atmospheric wind field research, and in particular to a Web-based real-time self-adaptive visualization method for the near-space atmospheric wind field.
Background
The near-space atmosphere, roughly 20-100 km above the ground, is an important component of the Earth's atmosphere. Its complex state changes and dynamic disturbances directly affect the safety of near-space vehicles, aerospace activities, and human activities such as wireless information transmission, and have an important impact on climate change. The near-space atmospheric wind field is one of the key parameters characterizing the near-space atmospheric environment, so intuitively revealing its spatio-temporal distribution and variation patterns through visualization is of great significance for improving understanding of the near-space atmospheric environment.
The near-space atmospheric wind field data contain both wind speed and wind direction information and are therefore vector field data. Current mainstream vector field visualization methods fall into four categories: (1) direct visualization, represented by the point icon (glyph) method; (2) geometric visualization using vector lines; (3) texture-based visualization; (4) feature-based visualization.
However, each of these methods has drawbacks. The point icon method and the texture method are better suited to two-dimensional vector visualization; in three-dimensional expression they suffer from icon ambiguity and texture occlusion respectively. Feature-based methods focus on feature extraction, so the visual result depends largely on the quality of the feature extraction algorithm. Geometry-based flow field visualization expresses the continuity and dynamic characteristics of vector field data through particle animation and is considered a better three-dimensional vector field visualization approach. Streamline visualization is the most widely applied geometric method: it depicts vector field characteristics with the trajectories that particles trace over time, matches the motion characteristics of the flow field, is easy to implement, and is mainly used for ocean and meteorological vector fields. However, generating streamlines of appropriate density, neither so dense as to cause visual clutter nor so sparse as to lose flow field feature information, remains a difficult and active problem in streamline visualization. Moreover, there is little discussion of real-time dynamic three-dimensional visual expression of vertically layered near-space atmospheric wind field data on a Web digital earth platform. Guo Changshun et al. proposed a multi-level moving-streamline visualization method for near-space vector field data based on a Web digital earth platform, achieving multi-level, multi-temporal, dynamic, and interactive visualization of near-space atmospheric wind field data. That method, however, focuses on displaying global wind field characteristics; local wind field features suffer from an overly sparse streamline distribution, and rapid switching between wind field layers at different heights is done with time-series animation, which easily causes visual confusion.
Therefore, a Web-based adaptive visualization system and method for the near-space atmospheric wind field are needed that can fully express the overall trend of the atmospheric wind field, depict its local detail features more finely, and realize vivid and smooth dynamic visualization of the wind field at different resolutions; such a system and method have very important practical value and significance.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a Web-based real-time self-adaptive visualization method for the near-space atmospheric wind field, aimed mainly at the problem of how to dynamically express the vector features of the near-space atmospheric wind field with streamlines of appropriate density on a three-dimensional Web digital earth.
In order to achieve the above purpose, the invention provides a Web-based real-time self-adaptive visualization method for the near-space atmospheric wind field, comprising the following steps:
S1: acquiring near-space atmospheric wind field data and preprocessing the data to obtain an atmospheric wind field wind speed PNG picture;
S2: acquiring the viewpoint height based on the PNG picture and mouse event operations, determining the view range and the number of initialized particles, and adaptively and dynamically initializing seed point placement;
S3: iteratively updating particle positions according to the adaptively and dynamically initialized seed points to generate dynamic streamlines;
S4: performing color mapping and transparency control on the seed points based on the dynamic streamlines;
S5: performing coordinate conversion and visualization rendering output of the streamlines, and drawing the streamlines in combination with the color mapping and transparency, finally obtaining the near-space atmospheric wind field visualization on the three-dimensional virtual earth.
Preferably, the step S1 specifically includes:
S11: acquiring 4D vertically layered data of the near-space atmospheric wind field in real time by scanning an FTP data server;
S12: converting the 4D vertically layered data into 2D grid data through dimension reduction, and obtaining regular two-dimensional grid data by bilinear interpolation;
S13: producing the atmospheric wind field wind speed PNG picture from the regular two-dimensional grid data, and transmitting it to the front end over the network according to the front end's interactive selection of time and height layer.
Preferably, the steps S12 to S13 specifically include:
converting the 4D vertical layered data of the near-space atmospheric wind field into 3D wind field data at a series of moments through time dimension reduction; converting the 3D wind field data into 2D wind field grid data with a series of heights through height dimension reduction; and then converting the 2D wind field grid data from a scalar space to an RGB color space to generate an atmospheric wind field wind speed PNG picture.
Preferably, the step S2 specifically includes:
S21: initializing the number of seed points within the global view range of the PNG picture;
S22: obtaining the current view range from the viewpoint height and camera position, and calculating the number of seed points in the current view range in real time as the viewpoint height changes;
S23: sowing the seed points uniformly within the current view range using a random noise function, and assigning each seed point a life cycle.
Preferably, the step S3 specifically includes:
S31: calculating the current seed point position by fourth-order Runge-Kutta numerical integration from the seed point initialization position and the speed at that position, and increasing the seed point life value;
S32: continuing the fourth-order Runge-Kutta numerical integration with the speed at the current seed point position to obtain the seed point position at the next moment, and further increasing the seed point life value;
S33: repeating steps S31-S32 until the seed point's life cycle ends or its position exceeds the view range, and removing seed points that leave the view range;
S34: connecting all positions the seed point passes through in sequence to form the seed point motion trajectory curve, generating a dynamic streamline.
Preferably, after a seed point's life cycle ends, whether the seed point is regenerated at its end position is determined by a random variable.
Preferably, the step S4 specifically includes:
mapping the speed scalar value of each seed point to a corresponding color value; and controlling the transparency of the atmospheric wind field streamlines by means of the seed point transparency parameters.
Preferably, the expression for mapping the speed scalar value of the seed point to the corresponding color value is:
where x is the intensity of the vector, f(θ) is the number of points with intensity θ, and y(x) is the abscissa of the intensity value in the color mapping table, with value range [0,1].
Preferably, the transparency parameter is calculated by the formula:

r = L_t / T

where r is the transparency of the particle at time t; L_t is the particle life value at time t, with L_t = L_{t-1} + Δt, where Δt is the time interval between L_t and L_{t-1}; T is the particle life cycle, with 0 ≤ L_t ≤ T; and the transparency r lies in the range 0 ≤ r ≤ 1.
Compared with the prior art, the invention has the beneficial effects that:
(1) The invention provides an adaptive dynamic visualization method for vertically layered near-space atmospheric wind field data based on the Web virtual earth. The number and distribution area of particles are dynamically and adaptively adjusted in real time according to changes of the view range, so that appropriate particle counts and densities are obtained at each viewpoint height and view extent during interactive visualization. Streamlines of suitable density are thus generated to express the motion characteristics of the regional near-space atmospheric wind field, fully expressing the overall trend of the atmospheric wind field while depicting its local detail features more finely, and realizing vivid dynamic visualization of the atmospheric wind field at different resolutions.
(2) The invention realizes full-process automation of real-time acquisition, preprocessing, and dynamically loaded visualization of near-space atmospheric wind field data in the Web environment, and provides a new Web virtual earth based three-dimensional visualization mode for real-time dynamic updating and release of the near-space atmospheric wind field.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a diagram of the process of dimension reduction of the data of the atmospheric wind field in the near space according to the invention;
FIG. 3 compares the visual effects of different total numbers of global initialization particles according to the present invention; (a) shows the visual effect with a total of 128 × 128 initialized particles, (b) with 256 × 256, and (c) with 512 × 512;
FIG. 4 compares the global visual effects of the near-space atmospheric wind field before and after adaptation with an initial particle count of 200 × 200; (a) shows the global visual effect before adaptation and (b) after adaptation.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
Example 1
Referring to FIG. 1, the invention provides a Web-based real-time self-adaptive visualization method for the near-space atmospheric wind field, which specifically includes the following steps:
S1: acquiring near-space atmospheric wind field data in real time and preprocessing the data;
First, vertically layered data of the near-space atmospheric wind field are acquired in real time by periodically scanning an FTP data server. The data are assimilation data released by the National Space Science Center of the Chinese Academy of Sciences in NetCDF format at 2-hour intervals, covering 20-100 km worldwide, with a spatial resolution of 2.5° in longitude, 2° in latitude, and 1 km in altitude.
Then the NetCDF data are parsed, and the 4D wind field data are converted by dimension reduction into 2D atmospheric wind field grid data at different height layers and different times. Bilinear interpolation is then applied to each single-layer atmospheric wind field in the longitude and latitude directions, generating regular two-dimensional grid data with 1° spacing in both longitude and latitude.
Finally, the regular grid data are converted from scalar space to RGB color space to produce the atmospheric wind field wind speed PNG picture. According to the front end's interactive selection of time and height layer, the corresponding wind speed PNG picture is transmitted over the network to the front end for visual rendering of the atmospheric wind field.
The dimension-reduction process converting the 4D near-space atmospheric wind field data into 2D grid data is shown in FIG. 2. The near-space atmospheric wind field is represented by a 4D discrete function W(X, Y, H, T) whose independent variables are longitude X, latitude Y, altitude H, and time T; the wind field attribute values are usually expressed by the zonal wind speed W_u and the meridional wind speed W_v. Through time dimension reduction, the 4D wind field data are converted into 3D wind field data at a series of times T_n, i.e. W_Tn(X, Y, H); through altitude dimension reduction, the 3D wind field data are then converted into 2D wind field grid data at a series of altitudes H_m, i.e. W_Tn,Hm(X, Y).
After the 4D atmospheric wind field data are converted into the series of regular 2D grids, the 2D grid data must be converted from scalar space to RGB color space to generate the atmospheric wind field wind speed PNG picture. The RGB values are denoted (R, G, B), where R, G ∈ [0, 255] represent the zonal wind and the meridional wind respectively, and the B color component is always assigned 0. The specific conversion formulas are:

R = Min{[(W_u(X,Y) - W_u_min)/(W_u_max - W_u_min)] × 256, 255}   (1)

G = Min{[(W_v(X,Y) - W_v_min)/(W_v_max - W_v_min)] × 256, 255}   (2)

where W_u(X,Y) is the zonal wind speed at position (X, Y) in the 2D grid of the current height layer, W_u_min and W_u_max are the minimum and maximum zonal wind speeds of the current height layer; W_v(X,Y) is the meridional wind speed at position (X, Y) in the 2D grid of the current height layer, and W_v_min and W_v_max are the minimum and maximum meridional wind speeds of the current height layer.
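A matching sketch of the scalar-to-RGB conversion of formulas (1) and (2); the u and v arrays are assumed to hold the zonal and meridional components of one height layer in row-major order, and the actual PNG serialization (for example via an HTML canvas) is left out.

```typescript
// Find the minimum and maximum of one wind component.
function minMax(a: Float32Array): [number, number] {
  let lo = Infinity, hi = -Infinity;
  for (let i = 0; i < a.length; i++) {
    if (a[i] < lo) lo = a[i];
    if (a[i] > hi) hi = a[i];
  }
  return [lo, hi];
}

// Encode a wind layer as RGBA pixels per formulas (1)-(2):
// R carries the zonal wind, G the meridional wind, B is always 0.
function encodeWindToRGBA(u: Float32Array, v: Float32Array): Uint8ClampedArray {
  const [uMin, uMax] = minMax(u);
  const [vMin, vMax] = minMax(v);
  const rgba = new Uint8ClampedArray(u.length * 4);
  for (let p = 0; p < u.length; p++) {
    rgba[p * 4]     = Math.min(((u[p] - uMin) / (uMax - uMin)) * 256, 255); // R, formula (1)
    rgba[p * 4 + 1] = Math.min(((v[p] - vMin) / (vMax - vMin)) * 256, 255); // G, formula (2)
    rgba[p * 4 + 2] = 0;    // B component is always assigned 0
    rgba[p * 4 + 3] = 255;  // opaque alpha so the PNG stores all pixels
  }
  return rgba;
}
```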
S2: based on the number of the seed points in the visual field range, self-adaptive dynamic initialization placement is carried out;
the number of seed points and the placement strategy play a key role in streamline distribution of the atmospheric wind field in the near space. In order to obtain the streamline with proper density in the global view and the local view, the condition that the local characteristics of the atmospheric wind field in the nearby space are lost due to visual confusion caused by too dense streamline or too sparse streamline is avoided, and seed points with proper density need to be sowed in the view range.
The method comprises the steps of initializing the total number of particles (namely seed points) in a global view range, acquiring a current view range according to the height of a view point and the position of a camera, calculating the number of particles in the current view range in real time along with the change of the height of the view point, and uniformly sowing the seed points in the current view range by adopting a random noise function to endow the particles with life cycle. The number of particles is automatically calculated and the particles are re-sown every time the viewing area is changed, so that the seed points are uniformly distributed in the viewing area, and the initialized particle positions and life values are recorded in the two-dimensional texture of the particle state.
The calculation formula of the particle number in the current view range is as follows:
where n represents the number of particles in the current view range, k represents the initial number of particles in the global view (based on multiple experimental comparisons, the invention sets the global total k to 200 × 200), and h represents the current viewpoint height (in meters).
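The exact expression for n is not reproduced above (the original equation image is missing). The sketch below therefore substitutes one plausible stand-in, stated here as an assumption: the global budget k is scaled by the fraction of the globe inside the current view, which falls as the camera descends and matches the described behaviour of recomputing the count and re-sowing on every view change.

```typescript
// Current view extent in geographic degrees.
interface ViewRect { west: number; east: number; south: number; north: number; }

// Assumed stand-in for the patent's count formula: scale the global
// particle budget k (e.g. 200 * 200) by the visible fraction of the globe.
function particleCount(k: number, view: ViewRect): number {
  const areaFrac = ((view.east - view.west) * (view.north - view.south)) / (360 * 180);
  return Math.max(1, Math.round(k * Math.min(areaFrac, 1)));
}

// One particle (seed point) with its accumulated age and life span.
interface Particle { lon: number; lat: number; age: number; life: number; }

// Uniformly sow n seed points over the current view and give each a life
// cycle; Math.random stands in for the random noise function of the text.
function seedParticles(n: number, view: ViewRect, lifeSpan: number): Particle[] {
  const seeds: Particle[] = [];
  for (let s = 0; s < n; s++) {
    seeds.push({
      lon: view.west + Math.random() * (view.east - view.west),
      lat: view.south + Math.random() * (view.north - view.south),
      age: 0,
      life: lifeSpan,
    });
  }
  return seeds;
}
```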
S3: generating a dynamic streamline;
according to the initial position of the seed point and the speed of the seed point at the position, a fourth-order Longgar-Kutta numerical integration method is adopted to calculate the current particle position, meanwhile, the particle life value is increased, numerical integration is continued according to the speed of the current position, the particle position at the next moment is obtained, the particle life value is continuously increased, and the like, the particle position is continuously and iteratively updated, the particle life value is increased until the particle life period is ended or the particle position exceeds the vision range, and then the particle is cleared. All positions through which the particles pass are sequentially connected to form a particle motion track curve, so that a dynamic streamline is generated. To ensure a uniform particle count, a random variable (the random variable takes a value of 0 or 1) is used to determine whether to regenerate particles at the end position after the end of the particle life cycle. This step stores the position, life value and speed scalar value of the particle at each moment in the corresponding particle state texture.
S4: seed point color mapping and transparency control;
in order to intuitively and uniformly express the intensity of the vector field, mapping the speed scalar value of the seed point in the vector field data into a corresponding color value; meanwhile, in the streamline generating process, the transparency of the streamline of the atmospheric wind field is dynamically adjusted by adopting a particle transparency parameter related to a particle life value, so that the transparency of the streamline from the head to the tail is gradually increased, and the direction information of the atmospheric wind field is better expressed.
The speed-color mapping formula is
where x represents the intensity magnitude of the vector, f(θ) represents the number of points with intensity θ, and y(x) represents the abscissa of the intensity value in the color mapping table, with value range [0,1]. This color mapping method helps counteract the phenomenon that the vector field's intensity variation is uneven and the colors become overly concentrated.
The transparency parameter calculation formula is:

r = L_t / T   (5)

where r represents the transparency of the particle at time t; L_t is the particle life value at time t, with L_t = L_{t-1} + Δt, where Δt is the time interval between L_t and L_{t-1}; and T represents the particle life cycle, with 0 ≤ L_t ≤ T, so the transparency r lies in the range 0 ≤ r ≤ 1. Particle transparency increases as the particle's life value grows.
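Because the mapping equation image is missing, the color half of the sketch below assumes a cumulative-histogram (equalization-style) mapping consistent with the surrounding description, where f(θ) counts points of intensity θ and y(x) falls in [0,1]; the transparency half implements formula (5) directly. All identifiers are illustrative.

```typescript
// Build y(x): map a speed scalar to a [0,1] abscissa in the color table.
// Assumed form: normalized cumulative histogram of the intensity values,
// which spreads out colors when intensities are unevenly distributed.
function buildSpeedToTableX(speeds: Float32Array, bins = 256): (x: number) => number {
  let lo = Infinity, hi = -Infinity;
  for (let i = 0; i < speeds.length; i++) {
    if (speeds[i] < lo) lo = speeds[i];
    if (speeds[i] > hi) hi = speeds[i];
  }
  const span = hi - lo || 1;
  const hist = new Float32Array(bins);           // hist[b] plays the role of f(theta)
  for (let i = 0; i < speeds.length; i++) {
    hist[Math.min(bins - 1, Math.floor(((speeds[i] - lo) / span) * bins))]++;
  }
  const cdf = new Float32Array(bins);            // normalized cumulative distribution
  let acc = 0;
  for (let b = 0; b < bins; b++) { acc += hist[b]; cdf[b] = acc / speeds.length; }
  return (x: number) =>
    cdf[Math.min(bins - 1, Math.max(0, Math.floor(((x - lo) / span) * bins)))];
}

// Formula (5): transparency r = L_t / T, growing with the particle's life value.
const transparency = (lifeValue: number, lifeCycle: number): number =>
  lifeValue / lifeCycle;
```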
S5: and (5) a visual output of a near space wind field based on the Web virtual earth.
In this step, coordinate transformation and streamline visualization rendering output are performed on the processor, finally realizing the near-space atmospheric wind field visualization on the three-dimensional virtual earth.
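The patent does not spell out its coordinate transform; as an illustration, the sketch below uses the standard WGS84 geodetic-to-Cartesian (ECEF) conversion commonly used to place streamline vertices on a virtual globe.

```typescript
const WGS84_A = 6378137.0;          // semi-major axis (meters)
const WGS84_E2 = 6.69437999014e-3;  // first eccentricity squared

// Convert geodetic longitude/latitude (degrees) and height (meters)
// to Earth-centered, Earth-fixed Cartesian coordinates (meters).
function toECEF(lonDeg: number, latDeg: number, h: number): [number, number, number] {
  const lon = (lonDeg * Math.PI) / 180;
  const lat = (latDeg * Math.PI) / 180;
  // Prime-vertical radius of curvature at this latitude.
  const n = WGS84_A / Math.sqrt(1 - WGS84_E2 * Math.sin(lat) ** 2);
  return [
    (n + h) * Math.cos(lat) * Math.cos(lon),
    (n + h) * Math.cos(lat) * Math.sin(lon),
    (n * (1 - WGS84_E2) + h) * Math.sin(lat),
  ];
}
```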
To verify the technical effect, the drawing efficiency and visual effect of the near-space wind field were compared for different total numbers of global initialization particles; the drawing performance comparison is shown in Table 1:
TABLE 1
Referring to FIG. 3, the following conclusions are drawn: when the particle count is 128 × 128 (16384), streamlines are sparse in the global visualization; when the particle count is 256 × 256 (65536), a good visual effect is obtained; when the particle count is 512 × 512 (262144), streamlines become dense and even cluttered, and drawing efficiency drops significantly. Based on this comparative analysis, the total number of global initialization particles is set to 200 × 200 (40000), which ensures the global visual effect while maintaining drawing efficiency.
Taking 200 × 200 as the total number of global initialization particles, the rendering performance and streamline visual effect at different viewpoint heights were analyzed qualitatively and quantitatively before and after adaptation; the comparative analysis of the adaptive method's rendering performance and visual effect (with 200 × 200 global initialization particles) is shown in Table 2:
TABLE 2
As the figures and tables above show, the adaptive method not only fully expresses the overall motion characteristics of the wind field while the viewpoint height changes continuously, but also depicts local wind field details with a more reasonable streamline layout, avoiding the loss of local vector field feature information while preserving streamline drawing efficiency, and realizing vivid and smooth dynamic interactive visualization of the near-space atmospheric wind field.
In summary, the invention provides a Web-based real-time self-adaptive visualization method for the near-space atmospheric wind field. The particle count and distribution area are adaptively adjusted in real time according to dynamic changes of the view range, so that appropriate particle counts and densities are obtained at different viewpoint heights and view extents; streamlines of suitable density are generated by particle tracking to express the motion characteristics of the regional near-space atmospheric wind field, fully expressing the overall trend of the atmospheric wind field while depicting its local detail features more carefully, and realizing vivid and smooth dynamic visualization at different resolutions.
The above embodiments are only illustrative of the preferred embodiments of the present invention and are not intended to limit the scope of the present invention, and various modifications and improvements made by those skilled in the art to the technical solutions of the present invention should fall within the protection scope defined by the claims of the present invention without departing from the design spirit of the present invention.

Claims (5)

1. A Web-based real-time self-adaptive visualization method for a near-space atmospheric wind field, characterized by comprising the following steps:
S1: acquiring near-space atmospheric wind field data and preprocessing the data to obtain an atmospheric wind field wind speed PNG picture;
S2: acquiring the viewpoint height based on the PNG picture and mouse event operations, determining the view range and the number of initialized particles, and adaptively and dynamically initializing seed point placement;
S3: iteratively updating particle positions according to the adaptively and dynamically initialized seed points to generate dynamic streamlines;
S4: performing color mapping and transparency control on the seed points based on the dynamic streamlines;
S5: performing coordinate conversion and visualization rendering output of the streamlines, and drawing the streamlines in combination with the color mapping and transparency, finally obtaining the near-space atmospheric wind field visualization on the three-dimensional virtual earth;
the step S1 specifically comprises:
S11: acquiring 4D vertically layered data of the near-space atmospheric wind field in real time by scanning an FTP data server;
S12: converting the 4D vertically layered data into 2D grid data through dimension reduction, and obtaining regular two-dimensional grid data by bilinear interpolation;
S13: producing the atmospheric wind field wind speed PNG picture from the regular two-dimensional grid data, and transmitting it to the front end over the network according to the front end's interactive selection of time and height layer;
the steps S12 to S13 specifically comprise:
converting the 4D vertically layered near-space atmospheric wind field data into 3D wind field data at a series of times through time dimension reduction; converting the 3D wind field data into 2D wind field grid data at a series of heights through height dimension reduction; and then converting the 2D wind field grid data from scalar space to RGB color space to generate the atmospheric wind field wind speed PNG picture;
the step S2 specifically comprises:
S21: initializing the number of seed points within the global view range of the PNG picture;
S22: obtaining the current view range from the viewpoint height and camera position, and calculating the number of seed points in the current view range in real time as the viewpoint height changes;
S23: sowing the seed points uniformly within the current view range using a random noise function, and assigning each seed point a life cycle;
the step S3 specifically comprises:
S31: calculating the current seed point position by fourth-order Runge-Kutta numerical integration from the seed point initialization position and the speed at that position, and increasing the seed point life value;
S32: continuing the fourth-order Runge-Kutta numerical integration with the speed at the current seed point position to obtain the seed point position at the next moment, and further increasing the seed point life value;
S33: repeating steps S31-S32 until the seed point's life cycle ends or its position exceeds the view range, and removing seed points that leave the view range;
S34: connecting all positions the seed point passes through in sequence to form the seed point motion trajectory curve, generating dynamic streamlines.
2. The Web-based near-space atmospheric wind field real-time self-adaptive visualization method according to claim 1, wherein after a seed point's life cycle ends, whether the seed point is regenerated at its end position is determined by a random variable.
3. The Web-based near-space atmospheric wind field real-time self-adaptive visualization method according to claim 1, wherein the step S4 specifically comprises:
mapping the speed scalar value of each seed point to a corresponding color value; and controlling the transparency of the atmospheric wind field streamlines by means of the seed point transparency parameters.
4. The Web-based near-space atmospheric wind field real-time self-adaptive visualization method according to claim 3, wherein the expression mapping the speed scalar value of a seed point to the corresponding color value is:
where x is the intensity of the vector, f(θ) is the number of points with intensity θ, and y(x) is the abscissa of the intensity value in the color mapping table, with value range [0,1].
5. The Web-based near-space atmospheric wind field real-time self-adaptive visualization method according to claim 2, wherein the transparency parameter is calculated by the formula:
r = L_t / T
wherein r is the transparency of the particle at time t; L_t is the particle life value at time t, with L_t = L_{t-1} + Δt, where Δt is the time interval between L_t and L_{t-1}; T is the life cycle of the particle, with 0 ≤ L_t ≤ T; and the transparency r satisfies 0 ≤ r ≤ 1.
Application CN202011630784.8A, filed 2020-12-31: Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web. Status: Active. Granted as CN112634393B (en).

Priority Applications (1)

Application Number: CN202011630784.8A (filed 2020-12-31)
Title: Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web


Publications (2)

Publication Number Publication Date
CN112634393A CN112634393A (en) 2021-04-09
CN112634393B (en) 2023-09-05

Family

ID=75289764

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011630784.8A Active CN112634393B (en) 2020-12-31 2020-12-31 Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web

Country Status (1)

Country Link
CN (1) CN112634393B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991509A (en) * 2021-04-13 2021-06-18 深圳市万向信息科技有限公司 WebGL-based three-dimensional wind field inversion method, system, device and storage medium
CN113158106A (en) * 2021-04-27 2021-07-23 中国石油大学(华东) Visualization method based on NetCDF flooding data
CN115423917B (en) * 2022-08-16 2023-07-21 中国人民解放军海军指挥学院 Real-time drawing method and system for global three-dimensional wind field


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103606192A (en) * 2013-11-27 2014-02-26 国家电网公司 Wind field visual display method based on three-dimensional virtual globe
CN107102309A (en) * 2017-04-28 2017-08-29 北京怡孚和融科技有限公司 Wind field spatial distribution is converted into the method and aerosol spatial and temporal distributions of wind field time orientation and the stacking method of wind field spatial and temporal distributions
CN107170044A (en) * 2017-05-09 2017-09-15 福州大学 A kind of dynamic and visual method of the wind based on dimensional topography
CN109063279A (en) * 2018-07-16 2018-12-21 南京信息工程大学 Three-dimensional space wind field Dynamic Simulation Method based on particle flux trajectory track algorithm
CN111582547A (en) * 2020-04-09 2020-08-25 中国科学院国家空间科学中心 Method for acquiring wind field distribution at different places by using wind field data set
CN112100299A (en) * 2020-08-20 2020-12-18 四川大学 Visualization method for emergency early warning of sudden toxic gas leakage

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wind field visualization method based on particle tracking; Li Qian, Fan Yin, Wang Jikui; Journal of PLA University of Science and Technology (Natural Science Edition), No. 01; full text *

Also Published As

Publication number Publication date
CN112634393A (en) 2021-04-09

Similar Documents

Publication Publication Date Title
CN112634393B (en) Real-time self-adaptive visualization method for near-space atmospheric wind field based on Web
CN108921926B (en) End-to-end three-dimensional face reconstruction method based on single image
CN110163799B (en) Super-resolution point cloud generation method based on deep learning
CN109147025B (en) RGBD three-dimensional reconstruction-oriented texture generation method
CN105678846A (en) Three-dimensional visualization method and system for real-time meteorological networking radar data
CN104867181A (en) Fast displaying and drawing method of weather elements on three dimensional earth model
CN106228594A (en) Typhoon model cloud cartoon display method based on surface subdivision
KR20080055581A (en) Apparatus, method, application program and computer readable medium thereof capable of pre-storing data for generating self-shadow of a 3d object
CN103700134A (en) Three-dimensional vector model real-time shadow deferred shading method based on controllable texture baking
CN113436308A (en) Three-dimensional environment air quality dynamic rendering method
CN111223191A (en) Large-scale scene infrared imaging real-time simulation method for airborne enhanced synthetic vision system
CN109461197B (en) Cloud real-time drawing optimization method based on spherical UV and re-projection
CN111028335B (en) Point cloud data block surface patch reconstruction method based on deep learning
CN110070559A (en) A kind of wind power generation blade three-dimensional reconstruction method based on unmanned plane image
CN107704483B (en) A kind of loading method of threedimensional model
Ruzínoor et al. 3D terrain visualisation for GIS: A comparison of different techniques
US20220392121A1 (en) Method for Improved Handling of Texture Data For Texturing and Other Image Processing Tasks
CN115953551A (en) Sparse grid radiation field representation method based on point cloud initialization and depth supervision
CN107146208A (en) The restoration methods of the non-complete model local deformation optimized based on thin plate spline basic function
CN115035231A (en) Shadow baking method, shadow baking device, electronic apparatus, and storage medium
CN113139965A (en) Indoor real-time three-dimensional semantic segmentation method based on depth map
CN107688599B (en) A kind of method of quick-searching threedimensional model
CN115423917B (en) Real-time drawing method and system for global three-dimensional wind field
Hempe et al. Generation and rendering of interactive ground vegetation for real-time testing and validation of computer vision algorithms
CN114627258B (en) Method and system for isomorphic modeling of gravity field catamaran spheres

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant