CN107067458B - Visualization method for enhancing texture advection - Google Patents

Visualization method for enhancing texture advection

Info

Publication number
CN107067458B
CN107067458B (application CN201710026981.0A)
Authority
CN
China
Prior art keywords
texture
advection
noise
filtering
visualization
Prior art date
Legal status
Active
Application number
CN201710026981.0A
Other languages
Chinese (zh)
Other versions
CN107067458A (en)
Inventor
鲁大营
高仲合
倪建成
Current Assignee
Qufu Normal University
Original Assignee
Qufu Normal University
Priority date
Filing date
Publication date
Application filed by Qufu Normal University filed Critical Qufu Normal University
Priority to CN201710026981.0A priority Critical patent/CN107067458B/en
Publication of CN107067458A publication Critical patent/CN107067458A/en
Application granted
Publication of CN107067458B publication Critical patent/CN107067458B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a visualization method for enhancing texture advection, characterized by the following steps: designing a noise calculation model; designing a noise weight adjustment model; designing a filtering control model; and defining a texture evaluation function. Through the execution flow "vector field + white noise → noise gradient calculation → advection texture weight adjustment → filtering control → visual imaging", the three factors that determine the quality of the final output texture can be controlled: the texture value at each advection position, the contribution of each advection position to the final image, and the choice of filter kernel. The contrast is stretched using the maximum descent amplitude of the pseudo-gradient over neighboring noise points, the noise weights are adaptively adjusted according to the local variation at each advection position to highlight vector texture detail, and a suitable high-order filter kernel is selected to smooth the advection trajectories, yielding an enhanced visualization. With this method, the intensity contrast and smoothness between the texture advection trajectories of the vector field are enhanced, the rendering quality is improved, and the texture distribution of a three-dimensional vector field is displayed with high quality.

Description

Visualization method for enhancing texture advection
Technical Field
The invention relates to the fields of computer graphics, vector field visualization and computational fluid dynamics, and in particular to a visualization method for enhancing the contrast and smoothness of vector field texture advection trajectories.
Background
Texture-based methods use the gray-level correlation of textures to reflect the morphological changes of a vector field effectively. They help analyze and refine scientific data, understand its meaning and reveal the underlying laws, and play an important role in scientific computing and engineering analysis. They are widely applied in the automotive industry, aerodynamics, turbomachinery design, weather forecasting, meteorological simulation, geological exploration, medical visualization and other fields, and therefore have great practical significance and research value.
Texture-based visualization is a promising but rather challenging vector field visualization approach. It usually uses dense textures to depict the overall appearance and the detailed variation of the vector field, avoids the seed-point problems of geometric methods (such as occlusion or loss of detail), tracks all features of the fluid effectively, and is well suited to visualization on two-dimensional planes or curved surfaces. Serious difficulties arise, however, when the approach is applied to higher-dimensional texture spaces. First, texture particles advected in a high-dimensional space occlude one another severely, producing clutter and even artifacts, so a high-quality dense-texture representation is hard to obtain. Second, texture computation must be performed in every cell or voxel of the high-dimensional space, which greatly increases the amount of computation. Vector field visualization based on sparse textures avoids these problems to a large extent: it reduces the mutual occlusion between texture trajectories and improves efficiency through effective acceleration techniques. Its rendering quality, however, still needs further improvement, for example by enhancing the contrast of the texture advection trajectories and improving their smoothness.
Disclosure of Invention
The invention aims to provide a visualization method for texture advection that effectively enhances the intensity contrast and smoothness between vector field texture advection trajectories, improves the rendering quality, and displays the three-dimensional vector field texture distribution with high quality, making the data visible and the information intelligible.
In order to achieve the above object, the present invention provides a visualization method for enhancing texture advection, comprising the following steps:
Step 1: designing a noise calculation model;
calculating a noise pseudo-gradient to probe the variation trend within a local region of the vector field, and selecting the maximum descent amplitude of the pseudo-gradient to take part in texture advection; defining a spherical neighborhood around the current noise point, taking as neighboring points the grid points and the points where the sphere meets the grid lines within this range, obtaining the variation characteristics of the surrounding region from the positional relation between the current noise point and its neighboring points, and computing the noise pseudo-gradient;
Step 2: designing a noise weight adjustment model;
weighting the noise value at each advection position according to its contribution to the final output texture and accumulating the texture accordingly, so that advection tracking paths with large variation in a local region are presented clearly and the contrast between different texture veins is increased;
defining the curvature of the texture advection trajectory, taking into account the distance between advection positions and the degree of directional change in the local region, and appropriately increasing the weight of texture values at advection positions with larger variation;
Step 3: designing a filtering control model;
synthesizing a smoothing operator by recursively convolving box filters to approximate a Gaussian, smoothing rough texture advection trajectories so that the texture visualization results have spatio-temporal consistency;
defining a high-order filtering model: obtaining a high-order filter by iteratively convolving the box filter n times, and applying it, through a 4th-order filter kernel, to the construction of the texture volume in LIC;
Step 4: defining a texture evaluation function;
quantifying the intensity of the local variation at each position by the quotient of the standard deviation of the texel intensities and the mean intensity of the local region, and highlighting the visualization contrast by means of the texture value residual at the same position and the texture contour.
With this method, through the execution flow "vector field + white noise → noise gradient calculation → advection texture weight adjustment → filtering control → visual imaging", the three factors that determine the quality of the final output texture can be controlled: the texture value at each advection position, the contribution of each advection position to the final image, and the choice of filter kernel. The contrast is stretched using the maximum descent amplitude of the pseudo-gradient over neighboring noise points, the noise weights are adaptively adjusted according to the local variation at each advection position to highlight vector texture detail, and a suitable high-order filter kernel is selected to smooth the advection trajectories. As a result, the intensity contrast and smoothness between the vector field texture advection trajectories are enhanced, the rendering quality is improved, and the texture distribution of the three-dimensional vector field is displayed with high quality.
Drawings
FIG. 1 is a flow chart of the implementation of the method for enhancing texture advection;
FIG. 2 is a schematic diagram of determining the neighboring points within a spherical range centered on the current noise point;
FIG. 3 is a schematic diagram of calculating the noise pseudo-gradient from the positional relationship between the current noise point and its neighboring points.
Detailed Description
The following detailed description of the invention refers to the accompanying drawings.
The three-dimensional vector field is visualized in sparse texture form. The input white noise is sampled sparsely, the density of the texture representation is controlled through the sparse distribution of points by adjusting their number, and a uniform distribution of points in three-dimensional space is generated with the Halton low-discrepancy sequence of the quasi-Monte Carlo method. The noise points are rasterized into a three-dimensional sparse texture, which is treated as a set of massless texture seeds and advected through three-dimensional space using the vector field data and line integral convolution. Since texture convolution is essentially a low-pass filter, the contrast between texture advection trajectories is reduced and rough, unsmooth textures are easily produced. Existing methods select a Gaussian filter kernel, adjust the integration step length, or define a perception function based on the integration radius at the advection position to mitigate these perception problems, but they treat the symptoms rather than the root cause. The visualization method for enhancing texture advection of the invention comprises the following steps:
Step 1: designing a noise calculation model;
calculating a noise pseudo-gradient to probe the variation trend within a local region of the vector field, and selecting the maximum descent amplitude of the pseudo-gradient to take part in texture advection; defining a spherical neighborhood around the current noise point, taking as neighboring points the grid points and the points where the sphere meets the grid lines within this range, obtaining the variation characteristics of the surrounding region from the positional relation between the current noise point and its neighboring points, and computing the noise pseudo-gradient. Specifically: the input noise data used in the invention is a discrete set of values and the objective function cannot be predicted, which makes computing the noise gradient difficult. A new noise gradient calculation model based on the fastest descent of a pseudo-gradient is therefore defined, i.e., a discrete linear pseudo-gradient computed between neighboring points centered on the current point, as shown in Fig. 2. To compute the pseudo-gradient at an arbitrary noise point, the characteristics of the local variation around the point are used to determine the neighboring points (not necessarily grid points) related to the current position. Fig. 2 shows a spherical range centered on the current noise point and determined by a user-defined radius; both the grid points and the points where the sphere meets the grid lines within this range are regarded as neighboring points of the current noise point. Fig. 3 shows how the noise pseudo-gradient is computed from the positional relation between the current noise point and its neighboring points, so as to measure the variation trend of the flow field in that region.
Let e_i denote the unit vector along the i-th coordinate axis, let the current noise point be r, and let r_i,j denote the neighboring points of r along its i-th component. From these, the neighboring noise point in the positive direction, r_i+, and the neighboring noise point in the negative direction, r_i-, are defined with respect to e_i (the defining formulas appear only as images in the original publication). The discrete pseudo-gradient g_i is then computed along each dimension from the noise values at r, r_i+ and r_i-, and the maximum descent amplitude is taken as

g_m = max{ g_i },  i = 1, ..., n
From the formula above, the gradient value with the maximum descent amplitude along the lines connecting the current noise point with its neighboring points can be determined. In the three-dimensional vector field space, to reduce the computational complexity and the number of neighboring points in the search space, the noise pseudo-gradient is computed only from neighbors along the three coordinate dimensions. The neighbors in each dimension comprise the adjacent grid points of the unit cell and the points on the sphere centered at the current point with radius of curvature ρ; the point with the largest descent amplitude is selected from the pseudo-gradients based on these two kinds of neighbors. A constraint is imposed on ρ: if ρ is larger than 2d, where d is the spacing of the unit grid, the pseudo-gradient based on the adjacent grid points is selected directly. In this way the contrast between the textures is effectively stretched, the displayed texture detail is enhanced, and the approach can be extended to a space of any dimension.
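For illustration, the following sketch computes such a fastest-descent pseudo-gradient on a discrete noise volume. It is a minimal sketch rather than the exact patented formulas: the trilinear sampling, the concrete neighbor set (grid neighbors at spacing d plus the sphere points at radius ρ along each coordinate axis) and the default parameter values are assumptions made for the example.

```python
import numpy as np

def trilinear(noise, p):
    """Trilinear sample of a 3-D noise volume at position p (clamped to the bounds)."""
    p = np.clip(np.asarray(p, dtype=float), 0.0, np.array(noise.shape) - 1.001)
    i0 = np.floor(p).astype(int)
    f = p - i0
    val = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = (f[0] if dx else 1 - f[0]) * \
                    (f[1] if dy else 1 - f[1]) * \
                    (f[2] if dz else 1 - f[2])
                val += w * noise[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return val

def pseudo_gradient(noise, r, d=1.0, rho=0.5):
    """Fastest-descent pseudo-gradient g_m at noise point r: the largest descent
    amplitude of the noise toward the neighboring points along the three coordinate
    axes (grid neighbors at spacing d and, if rho <= 2*d, also the sphere points
    at radius rho around r)."""
    r = np.asarray(r, dtype=float)
    n_r = trilinear(noise, r)
    radii = [d] if rho > 2.0 * d else [d, rho]   # constraint on rho from the description
    g = []
    for i in range(3):                           # neighbors only along the 3 dimensions
        e = np.zeros(3)
        e[i] = 1.0
        for a in radii:
            for s in (+1.0, -1.0):
                g.append((n_r - trilinear(noise, r + s * a * e)) / a)
    return max(g)                                # g_m = max{g_i}

# toy usage on a random white-noise volume
rng = np.random.default_rng(0)
vol = rng.random((32, 32, 32))
print(pseudo_gradient(vol, (10.3, 12.7, 5.1)))
```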
Step 2: designing a noise weight adjustment model;
weighting the noise value at each advection position according to its contribution to the final output texture and accumulating the texture accordingly, so that advection tracking paths with large variation in a local region are presented clearly and the contrast between different texture veins is increased;
defining the curvature of the texture advection trajectory, taking into account the distance between advection positions and the degree of directional change in the local region, and appropriately increasing the weight of texture values at advection positions with larger variation. Specifically: the contribution of the noise value at each advection position to the final output texture also affects the quality of the visualization. If texture accumulation uses a short step length or uniform weights, the visualization contrast is easily reduced. A criterion is therefore needed to analyze the magnitude of the contribution of the noise at each advection position and to weight it for texture accumulation, with the aim of presenting clearly the advection tracking paths with large local variation and increasing the contrast between different texture veins. To this end, the weight of texture values at advection positions with larger variation is appropriately increased, and the noise weight w_i is adjusted adaptively according to the curvature of the texture advection trajectory (the defining formulas appear only as images in the original publication). Here Δs is the arc length between two adjacent positions on the advection trajectory, and c_i is the curvature of the advection trajectory at the current position, determined by the included angle of the change in the tangential direction of the trajectory and the corresponding arc length Δs. This weighting method takes into account both the distance between advection positions and the degree of directional change in the local region, and effectively increases the contrast between regions with different degrees of variation.
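The curvature-adaptive weighting of this step can be sketched as follows. Since the exact weight formula appears only as an image in the original publication, the concrete weighting function used here (1 + c_i, normalized along the trajectory) is an assumption for illustration.

```python
import numpy as np

def advection_weights(points, eps=1e-8):
    """Curvature-based weights for the noise values along one advection trajectory.
    'points' is an (m, 3) array of consecutive advection positions.  The curvature
    at an interior position is approximated by the turning angle between successive
    tangent directions divided by the local arc length, so strongly varying regions
    contribute more to the convolution."""
    points = np.asarray(points, dtype=float)
    t = np.diff(points, axis=0)                    # chord vectors between positions
    ds = np.linalg.norm(t, axis=1)                 # arc-length estimates
    t_unit = t / (ds[:, None] + eps)
    cos_a = np.clip(np.einsum('ij,ij->i', t_unit[:-1], t_unit[1:]), -1.0, 1.0)
    angle = np.arccos(cos_a)                       # turning angle between tangents
    curvature = angle / (0.5 * (ds[:-1] + ds[1:]) + eps)
    w = np.ones(len(points))
    w[1:-1] = 1.0 + curvature                      # heavier weight where the path bends
    return w / w.sum()                             # normalized weights for the convolution

# toy usage: a gently curving path
s = np.linspace(0, np.pi, 20)
path = np.stack([np.cos(s), np.sin(s), 0.1 * s], axis=1)
print(advection_weights(path))
```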
Step 3: designing a filtering control model;
synthesizing a smoothing operator by recursively convolving box filters to approximate a Gaussian, smoothing rough texture advection trajectories so that the texture visualization results have spatio-temporal consistency;
defining a high-order filtering model: a high-order filter is obtained by iteratively convolving the box filter n times and is applied, through a 4th-order filter kernel, to the construction of the texture volume in LIC. Specifically: texture advection trajectories drawn with traditional filter kernels (such as the box filter or the triangle filter) do not show good visibility and texture details often become rough and blurred, while Gaussian filtering tends to over-smooth and the size of the Gaussian window is difficult to control.
The box filter kernel is defined as

k_1(x) = ( ((x + L/2)_+)^0 - ((x - L/2)_+)^0 ) / L

which satisfies k_1(x) = 1/L on the interval [-L/2, L/2] and vanishes elsewhere. Here L is the filter kernel length and (x)_+ denotes the truncated power function, defined for any real x by (x)_+ = 0 when x < 0 and (x)_+ = x when x ≥ 0. Repeatedly convolving box filters yields higher-order filters; for example, convolving two box filters gives the triangle filter

k_2(x) = (k_1 * k_1)(x) = ( ((x + L)_+)^1 - 2((x)_+)^1 + ((x - L)_+)^1 ) / L^2
Compared with the box filter, the triangle filter is continuous, but its higher-order derivatives are still discontinuous. A higher-order filter kernel can therefore be used, obtained by convolving the box filter n times:

k_n = k_1 * k_1 * ... * k_1   (n factors)

Taking the Fourier transform of both sides gives

K_n(ω) = ( K_1(ω) )^n = ( sin(ωL/2) / (ωL/2) )^n

Since sin(x)/x ≈ 1 - x^2/6 ≈ exp(-x^2/6) for small x, it follows that

K_n(ω) ≈ exp( -n (ωL/2)^2 / 6 )

which is a Gaussian in ω. Therefore, as the order increases, the recursive high-order filter converges to a Gaussian function: it smooths the texture visualization better and suppresses the roughness of the advection trajectories well, and a good approximation is already obtained for small values of the order.
In summary, the three key factors that affect texture rendering quality are handled as follows: the noise pseudo-gradient is computed with the fastest-descent method, the texture advection path is computed from the vector field data, the noise weight is adjusted adaptively at each advection position according to the local region characteristics, a suitable high-order filter is selected, and the texture convolution is performed to generate the vector texture. The final texture output value I'(r) is computed as

I'(r) = I(r) + g_m(r)

Because texture advection is performed with sparse noise particles, a large amount of empty space does not take part in the construction and rendering of the whole texture volume; since the convolution computation of three-dimensional textures is inherently complex, this greatly improves the computational efficiency compared with computing and rendering the texture volume by a traditional full spatial-domain filtering method, and ensures interactive texture visualization.
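Putting the steps together, the following sketch computes one output texel along the execution flow "vector field + white noise → noise gradient calculation → advection texture weight adjustment → filtering control → visual imaging". It reuses trilinear(), pseudo_gradient() and advection_weights() from the sketches above; the Euler integration, the step size and the streamline length are assumptions for illustration, not the integration scheme prescribed by the invention.

```python
import numpy as np

def streamline(velocity, r0, steps, h):
    """Euler-integrate forward and backward from r0 through the vector field."""
    fwd, bwd = [], []
    for sign, out in ((+1.0, fwd), (-1.0, bwd)):
        p = np.asarray(r0, dtype=float)
        for _ in range(steps):
            v = np.asarray(velocity(p), dtype=float)
            speed = np.linalg.norm(v)
            if speed < 1e-12:
                break
            p = p + sign * h * v / speed
            out.append(p.copy())
    return np.array(bwd[::-1] + [np.asarray(r0, dtype=float)] + fwd)

def enhanced_lic_value(noise, velocity, r0, steps=16, h=0.5):
    """One output texel of the enhanced texture advection: weighted convolution of
    the noise along the advection trajectory plus the additive contrast stretch
    I'(r) = I(r) + g_m(r).  Reuses trilinear(), pseudo_gradient() and
    advection_weights() from the earlier sketches."""
    pts = streamline(velocity, r0, steps, h)
    w = advection_weights(pts)                             # curvature-adaptive weights
    samples = np.array([trilinear(noise, p) for p in pts])
    conv = float(np.dot(w, samples))                       # weighted convolution
    return conv + pseudo_gradient(noise, r0)               # contrast stretch

# toy usage: a rotational field over a random noise volume
rng = np.random.default_rng(1)
vol = rng.random((32, 32, 32))
swirl = lambda p: np.array([-(p[1] - 16.0), p[0] - 16.0, 0.3])
print(enhanced_lic_value(vol, swirl, (16.0, 10.0, 16.0)))
```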
Step 4: defining a texture evaluation function;
quantifying the intensity of the local variation at each position by the quotient of the standard deviation of the texel intensities and the mean intensity of the local region, and highlighting the visualization contrast by means of the texture value residual at the same position and the texture contour. Specifically: in order to measure the visualization contrast of each method quantitatively, the residual E of the texture value at the same position between the image produced with a given gradient-based method and the original image is defined as

E(r_i) = | N'(r_i) - N_0(r_i) |,  i = 1, ..., n

where N_0(r_i) is the texture value at r_i of the visualization without any enhancement, N'(r_i) is the texture value at r_i of the method that highlights the contrast, and n is the total number of voxels. In addition, the visualization contrast is highlighted with the texture contour, and a statistical analysis of the voxel texture intensity in a region of interest is carried out along a user-defined path. The intensity of the variation at each position is quantified by the quotient of the standard deviation of the texel intensity and the mean intensity of the local region:

γ(r_i) = σ(r_i) / Ī(r_i)

where γ(r_i) indicates the degree of fluctuation of the texture intensity, σ(r_i) is the standard deviation of the texel intensities in the neighborhood of position r_i, I(r_i) is the texture intensity value at r_i, Ī(r_i) is the mean texture intensity in the neighborhood of r_i, and n is the total number of voxels. The more drastically the texture intensity changes, the greater the contrast of the visual image.
With this method, through the execution flow "vector field + white noise → noise gradient calculation → advection texture weight adjustment → filtering control → visual imaging", the three factors that determine the quality of the final output texture can be controlled: the texture value at each advection position, the contribution of each advection position to the final image, and the choice of filter kernel. The contrast is stretched using the maximum descent amplitude of the pseudo-gradient over neighboring noise points, the noise weights are adaptively adjusted according to the local variation at each advection position to highlight vector texture detail, and a suitable high-order filter kernel is selected to smooth the advection trajectories. As a result, the intensity contrast and smoothness between the vector field texture advection trajectories are enhanced, the rendering quality is improved, and the texture distribution of the three-dimensional vector field is displayed with high quality.

Claims (1)

1. A visualization method for enhancing texture advection, characterized by comprising the following steps:
Step 1: designing a noise calculation model;
calculating a noise pseudo-gradient to probe the variation trend within a local region of the vector field, and selecting the maximum descent amplitude of the pseudo-gradient to take part in texture advection; defining a spherical neighborhood around the current noise point, taking as neighboring points the grid points and the points where the sphere meets the grid lines within this range, obtaining the variation characteristics of the surrounding region from the positional relation between the current noise point and its neighboring points, and computing the noise pseudo-gradient;
Step 2: designing a noise weight adjustment model;
weighting the noise value at each advection position according to its contribution to the final output texture and accumulating the texture accordingly, so that advection tracking paths with large variation in a local region are presented clearly and the contrast between different texture veins is increased;
defining the curvature of the texture advection trajectory, taking into account the distance between advection positions and the degree of directional change in the local region, and appropriately increasing the weight of texture values at advection positions with larger variation;
Step 3: designing a filtering control model;
synthesizing a smoothing operator by recursively convolving box filters to approximate a Gaussian, smoothing rough texture advection trajectories so that the texture visualization results have spatio-temporal consistency;
defining a high-order filtering model: obtaining a high-order filter by iteratively convolving the box filter n times, and applying it, through a 4th-order filter kernel, to the construction of the texture volume in LIC;
Step 4: defining a texture evaluation function;
quantifying the intensity of the local variation at each position by the quotient of the standard deviation of the texel intensities and the mean intensity of the local region, and highlighting the visualization contrast by means of the texture value residual at the same position and the texture contour.
CN201710026981.0A 2017-01-15 2017-01-15 Visualization method for enhancing texture advection Active CN107067458B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710026981.0A CN107067458B (en) 2017-01-15 2017-01-15 Visualization method for enhancing texture advection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710026981.0A CN107067458B (en) 2017-01-15 2017-01-15 Visualization method for enhancing texture advection

Publications (2)

Publication Number Publication Date
CN107067458A CN107067458A (en) 2017-08-18
CN107067458B true CN107067458B (en) 2020-07-21

Family

ID=59599313

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710026981.0A Active CN107067458B (en) 2017-01-15 2017-01-15 Visualization method for enhancing texture advection

Country Status (1)

Country Link
CN (1) CN107067458B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108288287B (en) * 2018-01-16 2021-08-06 合肥工业大学 Power graph-based tile texture generation method
CN110517356A (en) * 2019-08-21 2019-11-29 佳都新太科技股份有限公司 Realize system, the method and apparatus of the three-dimensional enhanced reality of multi-channel video fusion

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8768024B1 (en) * 2010-06-01 2014-07-01 Given Imaging Ltd. System and method for real time detection of villi texture in an image stream of the gastrointestinal tract
CN103914862A (en) * 2014-03-10 2014-07-09 上海大学 Pencil sketch simulating method based on edge tangent stream

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8768024B1 (en) * 2010-06-01 2014-07-01 Given Imaging Ltd. System and method for real time detection of villi texture in an image stream of the gastrointestinal tract
CN103914862A (en) * 2014-03-10 2014-07-09 上海大学 Pencil sketch simulating method based on edge tangent stream

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Fast LIC with Piecewise Polynomial Filter Kernels; Hege H. C., Stalling D.; 2013-04-05; section 4 *
Improved Volume LIC rendering technique for 3D vector fields based on GPU acceleration; Wang Song et al.; Journal of Computer-Aided Design & Computer Graphics; 2016-05-31; pp. 723-732 *
A diversified complex method based on maximum pseudo-gradient search and its application in slope stability analysis; Li Liang et al.; Journal of Hydroelectric Engineering; 2005-12-31; pp. 20-24 *
GPU-accelerated Volume LIC rendering technique based on sparse noise; Wang Haiyang et al.; Journal of Southwest University of Science and Technology; 2016-06-30; pp. 72-80 *

Also Published As

Publication number Publication date
CN107067458A (en) 2017-08-18

Similar Documents

Publication Publication Date Title
CA2721008A1 (en) Visulation of geologic features using data representations thereof
Strzodka et al. Image registration by a regularized gradient flow. A streaming implementation in DX9 graphics hardware
US11782183B2 (en) Magnetotelluric inversion method based on fully convolutional neural network
CN105405100B (en) A kind of sparse driving SAR image rebuilds regularization parameter automatic selecting method
Ullah et al. Three-dimensional visualization and quantitative characterization of grains in polycrystalline iron
CN107067458B (en) Visualization method for enhancing texture advection
CN111951292B (en) Object surface reflection attribute extraction method, device, equipment and storage medium
Šedivý et al. Description of the 3D morphology of grain boundaries in aluminum alloys using tessellation models generated by ellipsoids
Gu et al. Supershape recovery from electrical impedance tomography data
CN116127834A (en) PINN neural network-based speed field measurement method
Vyatkin Method of binary search for image elements of functionally defined objects using graphics processing units
Weiskopf Iterative twofold line integral convolution for texture-based vector field visualization
Liu et al. Parallel unsteady flow line integral convolution for high-performance dense visualization
Liktor Ray tracing implicit surfaces on the GPU
Linnér et al. Anti-aliased Euclidean distance transform on 3D sampling lattices
CN113658077A (en) Curvature-based regional bilateral mass point cloud noise reduction method
Szirmay-Kalos Filtering and gradient estimation for distance fields by quadratic regression
Liu et al. Research on influence model of fractional calculus derivative Fourier transform on aerobics view image reconstruction
Shafii et al. The topological effects of smoothing
Venkatesh et al. A comparative study of various deep learning techniques for spatio-temporal super-resolution reconstruction of forced isotropic turbulent flows
Matvienko et al. Explicit frequency control for high-quality texture-based flow visualization
Preßler et al. Virtual DSA visualization of simulated blood flow data in cerebral aneurysms
CN116416409B (en) Fluid simulation particle self-adaptive resolution surface reconstruction method and system
CN108573080B (en) Flow field visual view quantification method based on effective information
Schindler et al. Marching correctors–fast and precise polygonal isosurfaces of SPH data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant