CN101763649A - Method for drawing enhanced model contour surface point - Google Patents

Method for drawing enhanced model contour surface point

Info

Publication number
CN101763649A
Authority
CN
China
Prior art keywords
point
splat
sampled point
coordinate
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN200910243262A
Other languages
Chinese (zh)
Other versions
CN101763649B (en)
Inventor
梁晓辉
段薇
何志莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN200910243262XA priority Critical patent/CN101763649B/en
Publication of CN101763649A publication Critical patent/CN101763649A/en
Application granted granted Critical
Publication of CN101763649B publication Critical patent/CN101763649B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention relates to a method for drawing enhanced model contour surface points. In the preprocessing stage, the spatial coordinates, normal vectors, radii and color information of the sampled points are read and stored, splat templates are generated for a given number of layers, the sampled points are spatially partitioned according to a user-specified threshold, the k nearest neighbors of each sampled point are found, and finally, using these k neighbors, the curvature of each point is computed by a coordinate-transformation method. In the run-time stage, each sampled point is classified as a contour point or not according to its normal vector and the viewing direction. If it is a contour point, the vertex coordinates of a curved splat centered on that point are calculated. If it is not a contour point, it is rendered with the Surface Splatting algorithm: the Jacobian matrix of the sampled point's mapping to 2D screen space is computed; the visibility of the splats projected onto the screen is determined and the splats are blended; the light intensity at each covered screen pixel is calculated with an illumination model and shading; finally, normalization is performed. The drawing speed is high, real-time requirements can be met without hardware acceleration, and the drawing quality is high.

Description

A surface point drawing method for enhancing model silhouettes
Technical field
The invention belongs to the field of computer virtual reality and computer graphics, and in particular relates to point-based rendering methods for photorealistic rendering in computer graphics.
Background technology
Computer graphics is the discipline that has gradually formed around the display and rendering of images by computers. Since its birth it has played an increasingly important role in many fields. Photorealistic rendering is an important component of computer graphics (see Wang Jian, Point-based rendering of photorealistic graphics, Jilin University, master's thesis, 2006). It combines knowledge from mathematics, physics, computer science and other disciplines to generate photograph-like realistic images on computer graphics devices.
According to the rendering primitive used, photorealistic rendering can be divided into geometry-based rendering (GBR), image-based rendering (IBR) and point-based rendering (PBR) (see Zhang Wu, Research and application of point-based rendering, Zhejiang University, master's thesis, 2006). GBR is the traditional graphics pipeline: a scene model is usually decomposed into triangle meshes, which can produce high-fidelity photorealistic images, but as model complexity keeps increasing, the number of triangles needed to describe a model grows sharply, causing bandwidth bottlenecks and excessive floating-point computation. IBR transforms, interpolates and warps pre-generated images taken near the current viewpoint or viewing direction, so that the picture at the current viewpoint is obtained quickly and independently of scene complexity; its drawbacks are long preprocessing time, large storage requirements and visible artifacts when the viewpoint moves. PBR performs modeling, rendering and other graphics operations directly on points. It abandons the traditional triangle-mesh representation and reconstructs the final image directly from the information of a large number of points. As research and applications deepen and models and scenes become ever more complex, the point, as the simplest and most fundamental primitive, needs no stored or maintained topology, is easy to resample, and has the advantages of little redundancy, small storage space and fast rendering speed.
Point-based rendering uses a dense cloud of discrete three-dimensional points to describe the model surface; the goal is to use point primitives to reconstruct from the dense point cloud a continuous, visually equivalent model surface. It is well suited to models whose surface geometry is highly complex or whose surface illumination details are intricate. Among the mature point-rendering techniques, methods oriented towards rendering speed include QSplat (Szymon Rusinkiewicz, Marc Levoy, QSplat: A Multiresolution Point Rendering System for Large Meshes, Siggraph 2000, New Orleans, LA, USA), which uses a hierarchy of bounding spheres for fast view-frustum and back-face culling and level-of-detail rendering.
Dachsbacher et al. proposed serializing the hierarchical structure into a linear array (see C. Dachsbacher, C. Vogelgsang, and M. Stamminger, Sequential point trees, Siggraph 2003 Conference Proceedings); on this basis the rendering algorithm is converted from a hierarchical tree traversal into a traversal of a linear array, thereby realizing GPU hardware acceleration of QSplat. Among methods oriented towards rendering quality, Pfister et al. were the first to propose representing sampled points with surfels (surface elements) (see C. Dachsbacher, C. Vogelgsang, and M. Stamminger, Sequential point trees, Siggraph 2003 Conference Proceedings): each point is represented as a disk lying in the point's tangent plane, and the mutually overlapping disks form a closed object surface. Zwicker et al. applied signal-processing ideas to point rendering and proposed the elliptical weighted average (EWA) algorithm, obtaining high-quality rendering results (see Matthias Zwicker, Pfister, Surface Splatting, ACM Siggraph 2001, Los Angeles, CA, USA).
In summary, the speed-oriented methods do not consider transparency and anti-aliasing, and their rendering quality is only moderate; the quality-oriented EWA algorithm is implemented in software and, although it achieves high quality, it is not real-time.
Summary of the invention
The technical problem solved by the present invention: aiming at the problems of aliasing and rendering efficiency in conventional point rendering, a surface point drawing method that enhances model silhouettes is provided. The method can reconstruct a continuous surface that preserves the high-frequency signal of the model edges while satisfying the requirements of real-time rendering.
To achieve the object of the invention, the technical solution adopted by the present invention is: in the preprocessing stage, first read and store the spatial coordinates, normal vectors, radii and color information of the sampled points; generate the projections of the surface elements onto screen space, i.e. the splat templates, according to a given number of layers; then spatially partition the rectangular bounding box formed by the sampled-point coordinates according to a user-specified threshold; solve for the k nearest neighbors of each sampled point using the small cube cells obtained by the spatial partition in the previous step; and finally, using the k neighbors, compute the curvature of each sampled point by a coordinate-transformation method, covering both contour and non-contour sampled points.
In the run-time stage, whenever the viewpoint changes, first judge from the angle between the sampled point's normal vector and the line of sight whether the point is a contour point. If it is, the curvature of that contour point obtained in preprocessing is used to calculate the vertex coordinates of the curved splat (the projection of the surface element onto screen space) centered on the sampled point. If it is not a contour point, it is rendered with the Surface Splatting algorithm: solve the Jacobian matrix of the sampled point's mapping to 2D screen space; perform visibility tests on each splat projected onto the screen and blend the colors, normal vectors and spatial coordinates obtained in the first preprocessing step; use an illumination model and shading to calculate the light intensity at each screen pixel; and finally perform normalization to eliminate the influence of Gaussian truncation and non-uniform sample distribution on the image.
Compared with existing methods, the point-rendering method of the present invention has the following advantages: the rendering speed is high and real-time requirements can be met without hardware acceleration; the rendering quality is high, and the high-frequency signal of the model contour edges is preserved while anti-aliasing. First, the main time cost of the Surface Splatting algorithm lies in computing the shape and size of the splats projected onto the screen in real time; generating splat templates during preprocessing and selecting templates at run time greatly improves rendering speed. Second, the Surface Splatting algorithm uses the idea of elliptical weighted averaging, which yields high-quality splat blending. Finally, curved splats are used to draw the edges; their curvature better preserves the detail features of the model edges and improves the aliasing caused by approximating the local region with flat tangent-plane splats.
Description of drawings
Fig. 1 is the overall flow chart of the present invention;
Fig. 2 shows the relationship between the parameters of a splat in space and on the screen;
Fig. 3 shows the shape of a curved splat of the present invention;
Fig. 4a and Fig. 4b show the models used to compare the interpolated illumination calculation of the present invention;
Fig. 5 compares the rendering effect of the present invention with the traditional algorithm; (a) is the rendered model, the circled region is the part compared between the traditional algorithm and the method of the invention, and (b) is the traditional algorithm;
Fig. 6 shows the spatial coordinates, normal vector, radius and color information of the surface element centered on a sampled point;
Fig. 7 is a schematic diagram of the 3x3x3 grid of cells surrounding the cell containing a sampled point;
Fig. 8 is a schematic diagram of the depth-value calculation of the present invention;
Fig. 9 is a schematic diagram of the visibility test of the present invention.
Embodiment
As shown in Fig. 1, the implementation of the invention comprises two parts: a preprocessing stage and a run-time stage.
Stage one: preprocessing. It comprises five steps: reading the sampled points, template generation, spatial partitioning, finding the k nearest neighbors, and computing curvature.
Step 1: read the sampled points.
Read and store the spatial coordinates, normal vectors, radii and color information of the sampled points. The spatial coordinates and normal vector of a sampled point in the object coordinate system give the position and orientation of that point on the model surface; the sampled-point radius ensures that the splats (projections of the surface elements onto the two-dimensional screen) overlap without gaps; and the color information better expresses the details of the model.
Fig. 6 shows the spatial coordinates, normal vector, radius and color information of the surface element (surfel, i.e. surface element) centered on a sampled point; a sufficiently large radius guarantees that the splats projected onto the screen overlap each other without holes.
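As a minimal illustration (not part of the patent text), the per-sample record described above could be held in a structure such as the following Python sketch; the field and function names are assumptions.

```python
# Illustrative sketch: a per-sample "surfel" record holding the spatial
# coordinates, normal vector, radius and colour read in step 1.
from dataclasses import dataclass
import numpy as np

@dataclass
class Surfel:
    position: np.ndarray  # (x, y, z) in object space
    normal: np.ndarray    # unit normal of the tangent plane
    radius: float         # chosen large enough that projected splats overlap
    color: np.ndarray     # (r, g, b)

def read_samples(positions, normals, radii, colors):
    """Store every sampled point together with its attributes."""
    return [Surfel(np.asarray(p, float), np.asarray(n, float), float(r), np.asarray(c, float))
            for p, n, r, c in zip(positions, normals, radii, colors)]
```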
Step 2: template generation.
In order to eliminate the holes between sampled points, each sampled point is represented in object space as a surface element (reconstruction kernel); this step computes the shape with which a splat in space projects onto the screen. Fig. 2 shows the splat shape projected onto the screen by the tangent plane. According to the number of layers specified by the user, the radii of the sampled points read and stored in step 1 are divided into n1 intervals; the larger n1 is, the more splat templates are generated and the better the rendering effect. The radius of the i-th layer is r_i = (r_max − r_min)/n1 · i, where r_max and r_min are the maximum and minimum radii of the sampled points read in. The angle θ between the tangent plane of a sampled point and the screen, 0 ≤ θ < 90, is divided into n2 intervals, and the angle ω between the ellipse and the screen-space x axis, 0 ≤ ω < 180, is divided into n3 intervals; n1·n2·n3 splat templates can then be generated. The template-generation process is as follows:
(1) Traverse the n1 radius intervals; for the i-th radius interval, perform the operation of step (2);
(2) Traverse the n2 and n3 intervals with step size deltang; for the j-th interval of n2 and the k-th interval of n3, perform the operation of step (3) to generate a template;
(3) For a pixel (x, y) in the matrix whose lower-left and upper-right corners are (−r_i, −r_i) and (r_i, r_i), the pixel is covered by the splat if it satisfies (x·cos ω_k − y·sin ω_k)² / (r_i·cos θ_j)² + (y·cos ω_k + x·sin ω_k)² / r_i² ≤ 1; traversing all pixels of the matrix yields the shape of the ellipse. The angle between the ellipse and the screen-space x axis is ω = arctan(n_px / n_py), where n_px and n_py are the components along the screen-space x and y axes of the projection of the normal vector of sampled point P;
(4) Repeat the operations of steps (1), (2) and (3) until all intervals have been traversed once; all splat templates are then generated.
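A sketch of this template-generation loop, assuming a small fixed raster resolution per template and the interval counts n1, n2, n3 described above (all names are illustrative assumptions), might look as follows:

```python
import numpy as np

def build_splat_templates(r_min, r_max, n1, n2, n3, resolution=32):
    """Illustrative sketch: for every (radius, theta, omega) bucket, rasterise
    the ellipse (x cos w - y sin w)^2/(r cos t)^2 + (y cos w + x sin w)^2/r^2 <= 1
    on a small pixel grid and store the coverage mask under a linear index."""
    templates = {}
    for i in range(n1):                       # radius intervals
        r = (r_max - r_min) / n1 * (i + 1)
        for j in range(n2):                   # tangent-plane / screen angle theta
            theta = np.radians(90.0 * j / n2)
            for k in range(n3):               # ellipse / screen-x angle omega
                omega = np.radians(180.0 * k / n3)
                xs = np.linspace(-r, r, resolution)
                x, y = np.meshgrid(xs, xs)
                u = x * np.cos(omega) - y * np.sin(omega)
                v = y * np.cos(omega) + x * np.sin(omega)
                covered = (u / (r * np.cos(theta) + 1e-9)) ** 2 + (v / r) ** 2 <= 1.0
                templates[i * n2 * n3 + j * n3 + k] = covered
    return templates
```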
Step 3: spatial partitioning.
The rectangular bounding box of the sampled points is obtained from the spatial coordinates read in step 1. According to the edge length a specified by the user, the bounding box is divided into m·n·l small cubes, where m, n and l are the numbers of cubes along the x, y and z axes respectively. A hash table HashList[m·n·l] that resolves collisions by chaining stores the points of each cube. For each sampled point, its cell indices n_x, n_y and n_z along the x, y and z directions are computed, where n_x, n_y and n_z are the indices of the cube containing the point along the x, y and z directions, with 0 ≤ n_x < m, 0 ≤ n_y < n, 0 ≤ n_z < l; the point is then stored in the linked list of the corresponding hash-table entry.
Step 4: find the k nearest neighbors.
Within the 3x3x3 block of cells around the cell containing a sampled point, all neighbor points are collected and insertion-sorted in ascending order of distance to the sampled point; the first k points are stored in an adjacency list. Here k is set by the user and is used in a later step to compute the curvature of the local region; it is generally set to 20-30.
As shown in Fig. 7, the cube grid is the 3x3x3 block of cells surrounding the cell containing the sampled point; the central cell is the one containing the sampled point. Knowing the index of this cell, the indices of all cells in the 3x3x3 block are known, so the spatial coordinates of all neighbor points in those cells can be looked up in the hash table and the distances between the sampled point and its neighbors can be computed.
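Steps 3 and 4 together amount to a uniform-grid spatial hash followed by a 3x3x3 neighborhood search; a compact sketch under these assumptions (cell keys and a Python dictionary standing in for the chained hash table) is:

```python
import numpy as np
from collections import defaultdict

def build_grid(points, a):
    """Hash every point into the grid of cubes with edge length a (step 3)."""
    mins = points.min(axis=0)
    grid = defaultdict(list)                  # chained hashing of cell contents
    for idx, p in enumerate(points):
        cell = tuple(((p - mins) // a).astype(int))   # (n_x, n_y, n_z)
        grid[cell].append(idx)
    return grid, mins

def k_neighbours(points, grid, mins, a, idx, k=25):
    """Collect candidates from the surrounding 3x3x3 block of cells and keep the
    k nearest (the description suggests k in the range 20-30)."""
    cx, cy, cz = ((points[idx] - mins) // a).astype(int)
    cand = [j for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
            for j in grid.get((cx + dx, cy + dy, cz + dz), []) if j != idx]
    cand.sort(key=lambda j: np.linalg.norm(points[j] - points[idx]))
    return cand[:k]
```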
Step 5: compute curvature.
The procedure for computing curvature by the coordinate-transformation method is as follows:
(1) Project the k neighbor points onto the tangent plane of the sampled point; choose the unit vector from the sampled point towards the farthest projected point as the u axis, and compute the v axis as the cross product of u and the normal vector; then compute the u and v values of each neighbor point and its coordinate h along the normal direction;
(2) Using the u, v and h coordinates, fit a paraboloid over the local neighborhood of the sampled point by least squares. The fitted surface is s(u, v) = a·u² + b·u·v + c·v², and the coefficients a, b and c are obtained by least squares;
(3) The mean curvature of the fitted paraboloid at the origin is taken as the curvature of the sampled point: mean curvature = a + c.
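A sketch of this coordinate-transformation curvature estimate, assuming the neighbors are given as a NumPy array, is:

```python
import numpy as np

def curvature(p, n, neighbours):
    """Illustrative sketch: least-squares fit of s(u,v) = a*u^2 + b*u*v + c*v^2
    in the local tangent frame; the mean curvature at the origin is a + c."""
    n = n / np.linalg.norm(n)
    d = neighbours - p
    proj = d - np.outer(d @ n, n)             # projection onto the tangent plane
    u_axis = proj[np.argmax(np.linalg.norm(proj, axis=1))]
    u_axis /= np.linalg.norm(u_axis)          # farthest projected neighbour -> u axis
    v_axis = np.cross(n, u_axis)              # v axis from the cross product
    u, v, h = d @ u_axis, d @ v_axis, d @ n   # local (u, v, h) coordinates
    A = np.column_stack([u * u, u * v, v * v])
    a, b, c = np.linalg.lstsq(A, h, rcond=None)[0]
    return a + c                              # mean curvature estimate
```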
Stage two: run time. First judge whether a sampled point is a contour point: if the angle between its normal vector and the line of sight lies within [ε, 90], the point is a contour point, otherwise it is not; the two cases are handled separately. The stage is therefore divided into two sub-processes: curved splat rendering and Surface Splatting rendering.
First sub: curved surface splat draws.Fig. 3 has provided the shape of curved surface splat.
Step 1: compute the vertices of the curved splat.
Certain summit for curved surface splat
Figure G200910243262XD00051
, it is i for this vertex representation sLayer, k sSummit on the bar limit, i herein s, k sBe that the user is set, general i s, k sBig more, the complexity of expression curved surface splat is high more, and the drafting effect is good more.The curvature c that calculates during according to pre-service, local coordinate is
Figure G200910243262XD00052
, wherein
n sRepresent total limit number of curved surface splat,
Figure G200910243262XD00053
Represent i sThe radius of layer, φ ( rs i s ) = e - ( rs is c ) 2
Step 2: texture coordinates.
The texture coordinates of vertex P are (1/2 + i_s·cos(2k_sπ/n_s)/2, 1/2 + i_s·sin(2k_sπ/n_s)/2), where n_s is the total number of edges of the curved splat.
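A sketch of the vertex computation for a curved splat follows; the per-layer radius rs_i and the normalization of the layer index in the texture coordinates are assumptions not fixed by the text above:

```python
import numpy as np

def curved_splat_vertex(i_s, k_s, n_layers, n_edges, radius, c):
    """Illustrative sketch of one curved-splat vertex: local coordinates
    (rs*cos(2k pi/n_s), rs*sin(2k pi/n_s), phi(rs)) with phi(rs) = exp(-(rs*c)^2).
    The layer radius rs = radius * i_s / n_layers and the normalized layer index
    used in the texture coordinates are assumptions."""
    rs = radius * i_s / n_layers
    ang = 2.0 * np.pi * k_s / n_edges
    phi = np.exp(-(rs * c) ** 2)              # height term along the normal
    local = (rs * np.cos(ang), rs * np.sin(ang), phi)
    tex = (0.5 + (i_s / n_layers) * np.cos(ang) / 2.0,
           0.5 + (i_s / n_layers) * np.sin(ang) / 2.0)
    return local, tex
```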
Sub-process 2: Surface Splatting rendering (the method was proposed by Zwicker and is a generic term in this field, so it is left untranslated). It comprises four steps: determining the resampling kernel, splatting, illumination calculation and normalization.
Step 1: determine the resampling kernel.
The resampling kernel is the convolution of the reconstruction kernel projected onto the screen and a low-pass filter. Because elliptical Gaussians are closed under affine transformation and convolution, both are expressed as elliptical Gaussians, so the resampling kernel can also be written as an elliptical Gaussian; by choosing a suitable covariance matrix a good anti-aliasing effect is achieved.
The elliptical Gaussian is G_V(x) = (1 / (2π·|V|^{1/2})) · e^{−(1/2)·x^T·V^{−1}·x}.
By the properties of elliptical Gaussians, the resampling kernel is:
ρ(x) = (r′ ⊗ h)(x − m(u)) = (1/|J^{−1}|)·(G_{JVJ^T} ⊗ G_I)(x − m(u)) = (1/|J^{−1}|)·G_{JVJ^T + I}(x − m(u))
In the above formula, J is the Jacobian matrix of the mapping from object space to screen space, V is the covariance matrix of the sampled point, generally V = [[r², 0], [0, r²]] with r the radius of the sampled point, I is the identity matrix, x is a screen pixel covered by the splat, m(u) is the coordinate of the sampled point projected to screen space, r′ denotes the reconstruction kernel transformed by the projection, and h denotes the screen-space low-pass filter; the elliptical Gaussian approximates a two-dimensional normal distribution, and r² is used in all covariance matrices.
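Evaluating this resampling kernel at a screen pixel, under the covariance V = diag(r², r²) and identity low-pass filter stated above, can be sketched as:

```python
import numpy as np

def resampling_kernel(x, m_u, J, r):
    """Illustrative sketch: value of the elliptical-Gaussian resampling kernel at
    screen pixel x, with covariance J V J^T + I (warped reconstruction kernel
    convolved with a unit screen-space low-pass filter)."""
    V = np.diag([r * r, r * r])               # covariance of the object-space kernel
    M = J @ V @ J.T + np.eye(2)               # warped covariance plus screen filter
    d = np.asarray(x, float) - np.asarray(m_u, float)
    g = np.exp(-0.5 * d @ np.linalg.solve(M, d)) / (2.0 * np.pi * np.sqrt(np.linalg.det(M)))
    return abs(np.linalg.det(J)) * g          # the 1/|J^{-1}| factor
```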
Step 2: splatting (the process of projecting the surface elements in space onto the screen to form splats).
This step performs the visibility test and blending of every splat projected onto a screen pixel. The procedure is as follows:
(1) Template selection. The shape and size of a splat projected onto the screen do not need to be computed in real time; instead a template generated in preprocessing is selected. From the radius, the angle θ between the normal vector and the line of sight, and the angle ω = arctan(n_px / n_py) given by the arctangent of the components of the object-space reconstruction-kernel normal projected onto the screen-space x and y axes (n_px and n_py being the components along the screen-space x and y axes of the projected normal of sampled point P), the intervals i, j and k within n1, n2 and n3 to which these values belong are computed at run time; looking up the table with index i·n2·n3 + j·n3 + k yields the pixels covered by this splat and the weight vector. Here n1, n2 and n3 are the numbers of intervals of the radius, of the angle θ between the tangent plane of the sampled point and the screen (0 ≤ θ < 90), and of the angle ω between the ellipse and the screen-space x axis (0 ≤ ω < 180), respectively. n1 is generally set by the user; n2 and n3 are determined by the user-set traversal step deltang: n2 = 90/deltang, n3 = 180/deltang.
(2) Visibility test with the z-threshold method: compute the z depth along the viewing direction of the pixels covered by a splat projected onto the screen. According to the perspective projection model, back-projecting a screen pixel onto the surfel represented by sampled point P_k gives a point Q_p; the distance between Q_p and the viewpoint C is the required depth value, as shown in Fig. 8. If this depth is within a certain threshold of the z depth stored for the pixel, the contribution is blended; if it is smaller than the stored pixel depth, the splat is regarded as occluding, and its color and weight replace the pixel's current color and opacity (alpha) value; otherwise nothing is done. The visibility-test procedure is shown in Fig. 9.
(3) merge.The texture of pixel, normal vector, volume coordinate are the weighted mean that projects to each splat corresponding information of certain pixel.
g(x) = Σ_{k∈N} ω_k·ρ_k(x) / Σ_{j∈N} ρ_j(x)    (1)
where ρ_k(x) is the resampling kernel and serves as the weight, and ω_k stands in turn for the texture color, normal vector and spatial coordinates blended at pixel x.
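A sketch of the per-pixel accumulation implied by formula (1), combined with the z-threshold test of step (2) (the buffer layout and the threshold value are assumptions), is:

```python
import numpy as np

def accumulate_splat(buffers, px, py, weight, color, normal, position, depth, z_eps=0.1):
    """Illustrative sketch of per-pixel fusion with a z-threshold visibility test:
    contributions within z_eps of the stored depth are blended, nearer ones
    replace the pixel, farther ones are discarded."""
    stored = buffers['depth'][py, px]
    if depth < stored - z_eps:                         # clearly in front: replace
        buffers['depth'][py, px] = depth
        buffers['wsum'][py, px] = weight
        buffers['color'][py, px] = weight * color
        buffers['normal'][py, px] = weight * normal
        buffers['pos'][py, px] = weight * position
    elif abs(depth - stored) <= z_eps:                 # same surface: blend
        buffers['wsum'][py, px] += weight
        buffers['color'][py, px] += weight * color
        buffers['normal'][py, px] += weight * normal
        buffers['pos'][py, px] += weight * position
    # otherwise the contribution is occluded and ignored
```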
Step 3: illumination calculation.
According to the Phong illumination model, the light intensity at a pixel can be calculated from the per-pixel normal vector and spatial coordinates obtained during splatting. Replacing the traditional per-splat illumination calculation with per-pixel illumination not only makes the surface shading smoother and the rendering effect better, but this deferred-shading technique also avoids processing points on invisible splats, improving rendering efficiency.
The Phong illumination model is I = I_a·K_a + I_p·K_d·(L·N) + I_p·K_s·(R·V)^n, where I_a·K_a is the ambient term, I_p·K_d·(L·N) is the diffuse term, I_p·K_s·(R·V)^n is the specular term, and n is the shininess exponent (generally 50-100); the larger this number, the smoother the object surface, and it can be used to define the material of the object. The filtered per-pixel normal vector obtained during splatting is used for N, and V is computed from the filtered spatial coordinates (replacing ω_k in formula (1) by the normal vector and the spatial coordinates of each covering splat, respectively, yields the filtered normal vector and spatial coordinates); the light intensity I at the pixel is then computed from the Phong model. The parameters I_a and I_p are the ambient and incident light intensities, and K_a, K_d and K_s are the ambient, diffuse and specular reflection coefficients, which can be set as needed for the simulated environment (in the experiments of the present invention, K_a = 0.2 and K_d = K_s = 0.5).
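A per-pixel evaluation of this Phong model, with the light and viewpoint positions as assumed inputs and the coefficient values taken from the experiment described above, can be sketched as:

```python
import numpy as np

def phong(pos, normal, light_pos, eye_pos, Ia=0.2, Ip=1.0, Ka=0.2, Kd=0.5, Ks=0.5, n=64):
    """Illustrative sketch: I = Ia*Ka + Ip*Kd*(L.N) + Ip*Ks*(R.V)^n using the
    filtered per-pixel normal and position; light/eye positions and intensities
    are assumed inputs (the description uses Ka=0.2, Kd=Ks=0.5)."""
    N = normal / np.linalg.norm(normal)
    L = light_pos - pos; L /= np.linalg.norm(L)        # direction to the light
    V = eye_pos - pos;   V /= np.linalg.norm(V)        # direction to the viewpoint
    R = 2.0 * np.dot(N, L) * N - L                     # reflected light direction
    diffuse = max(np.dot(L, N), 0.0)
    specular = max(np.dot(R, V), 0.0) ** n
    return Ia * Ka + Ip * Kd * diffuse + Ip * Ks * specular
```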
Step 4: normalization.
To eliminate the influence of Gaussian truncation and non-uniform sample distribution on the image, a normalization is performed at the end of the algorithm. The final texture value of screen pixel x is g(x) = Σ_{k∈N} ω_k·ρ_k(x) / Σ_{j∈N} ρ_j(x), where ω_k is the texture information of splat k (generally its color), ρ_k(x) is the weight at pixel x of the space splat of point k after projection onto the screen, and the denominator Σ_{j∈N} ρ_j(x) is the sum of the weights of all splats covering pixel x.
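The final normalization pass then simply divides the accumulated attributes by the summed weights; a sketch consistent with the accumulation buffers assumed earlier:

```python
import numpy as np

def normalise(buffers):
    """Illustrative sketch of the final pass: divide the accumulated colour of
    every pixel by the summed splat weights, removing the bias from Gaussian
    truncation and non-uniform sampling."""
    w = np.maximum(buffers['wsum'], 1e-9)              # avoid division by zero
    return buffers['color'] / w[..., None]
```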
Aiming at the problems of aliasing and rendering efficiency in conventional point rendering, the present invention, based on the idea of templates, combines Surface Splatting, which has a high-quality rendering effect, with curved splats, which approximate the local region better than flat surfels, to provide a real-time surface point drawing method that enhances model silhouettes. By generating templates in preprocessing, rendering surface sample points with Surface Splatting at run time and rendering contour sample points with curved splats, the method can reconstruct a continuous surface that preserves the high-frequency signal of the model edges while satisfying the requirements of real-time rendering.
In Fig. 4, (a) is the traditional algorithm without illumination calculation and (b) is the method of the present invention. The comparison shows that, for a model without color, the traditional algorithm without illumination produces only a region of uniform color, whereas the method of the present invention expresses well the illumination and shading variation of the model surface, increasing the realism and detail of the surface.
In Fig. 5, (a) is the rendered model, the circled region is the part compared between the traditional algorithm and the method of the present invention, and (b) is the traditional algorithm. The comparison shows that the present invention exploits the advantage of curved splats, eliminating aliasing and enhancing the model silhouette features.
Contents not elaborated in the present invention belong to the common knowledge of those skilled in the art.
The above is only a preferred embodiment of the present invention. It should be pointed out that those skilled in the art can make several improvements and modifications without departing from the principle of the invention, and these improvements and modifications should also be regarded as falling within the scope of protection of the present invention.

Claims (7)

1. A surface point drawing method for enhancing model silhouettes, characterized in that it is divided into a preprocessing stage and a run-time stage, wherein:
The steps of the preprocessing stage are:
(1) first read and store the spatial coordinates, normal vectors, radii and color information of the sampled points;
(2) generate the projections of the surface elements onto screen space, i.e. the splat templates, according to a given number of layers;
(3) spatially partition the rectangular bounding box formed by the spatial coordinates of the sampled points according to a user-specified threshold;
(4) solve for the k nearest neighbors of each sampled point using the cube grid obtained by the spatial partition;
(5) using the k neighbors, compute the curvature of each sampled point by a coordinate-transformation method, covering both contour and non-contour sampled points;
The steps of the run-time stage are:
(1) when the viewpoint changes, judge from the angle between the sampled point's normal vector and the line of sight whether the point is a contour point; if so, perform step (2) of the run-time stage, otherwise perform step (3) of the run-time stage;
(2) using the curvature of the contour sampled point obtained in the preprocessing stage, calculate the vertex coordinates of the curved splat centered on this sampled point;
(3) render with the Surface Splatting algorithm: solve the Jacobian matrix of the sampled point's mapping to 2D screen space, then perform a visibility test on each splat projected onto the screen and blend the colors, normal vectors and spatial coordinates obtained in step (1) of the preprocessing stage; use an illumination model and shading to calculate the light intensity at each screen pixel, and finally perform normalization to eliminate the influence of Gaussian truncation and non-uniform sample distribution on the image.
2. The surface point drawing method for enhancing model silhouettes according to claim 1, characterized in that the process of generating the splat templates according to a given number of layers in step (2) of the preprocessing stage is as follows:
a. according to the radius information from step (1) of the preprocessing stage and the number of layers given by the user, the radii are divided into n1 intervals; the angle θ between the tangent plane of a sampled point and the screen, 0 ≤ θ < 90, is divided into n2 intervals; the angle ω between the ellipse and the screen-space x axis, 0 ≤ ω < 180, is divided into n3 intervals; n1·n2·n3 splat templates can then be generated;
b. for the i-th interval of n1, the j-th interval of n2 and the k-th interval of n3, the splat template is generated as follows: by the projection from object space to the screen, the major axis of the ellipse is the radius r_i of the space splat, the angle between the tangent plane and the screen is θ_j, and the angle between the ellipse and the screen-space x axis is ω_k;
c. a pixel (x, y) in the matrix whose lower-left and upper-right corners are (−r_i, −r_i) and (r_i, r_i) is covered if it satisfies (x·cos ω_k − y·sin ω_k)² / (r_i·cos θ_j)² + (y·cos ω_k + x·sin ω_k)² / r_i² ≤ 1; traversing all pixels of the matrix yields the shape of the ellipse;
d. the splat templates for the other intervals are generated in the same way.
3. The surface point drawing method for enhancing model silhouettes according to claim 1, characterized in that the process of spatially partitioning the sampled points according to the user-specified edge-length threshold in step (3) of the preprocessing stage is as follows:
a. obtain the rectangular bounding box of the sampled points from the spatial coordinates read in;
b. according to the edge length a specified by the user, divide the bounding box into m·n·l small cubes, where m, n and l are the numbers of cubes along the x, y and z axes respectively;
c. store the points of each small cube in a hash table HashList[m·n·l] that resolves collisions by chaining; compute for each sampled point its indices n_x, n_y, n_z along the x, y and z directions and store the point in the linked list of the corresponding hash-table entry; when the sampled points of every small cube have been stored in their corresponding linked lists, the spatial partition is complete.
4. The surface point drawing method for enhancing model silhouettes according to claim 1, characterized in that the process of computing the curvature of a sampled point by the coordinate-transformation method in step (5) of the preprocessing stage is as follows:
a. project the k neighbor points onto the tangent plane of the sampled point; choose the unit vector from the sampled point towards the farthest projected point as the u coordinate axis, and compute the v coordinate axis as the cross product of the u vector and the normal vector; then compute the components of each neighbor point along the u and v axes, i.e. its u and v coordinates, and its coordinate h along the normal direction;
b. using the u, v and h coordinates of the k neighbor points obtained above, fit a paraboloid by least squares;
c. the mean curvature of the fitted paraboloid at the origin is the curvature of this sampled point; the curvatures of the other sampled points are obtained likewise.
5. The surface point drawing method for enhancing model silhouettes according to claim 2, characterized in that the process of calculating the vertex coordinates of the curved splat centered on the sampled point in step (2) of the run-time stage is as follows:
a. for a vertex P_ik of the curved splat, on the k-th edge of the i-th layer, using the curvature c computed in the preprocessing stage, its local coordinates are (rs_i·cos(2kπ/n_s), rs_i·sin(2kπ/n_s), φ(rs_i)), where n_s is the total number of edges of the curved splat, rs_i is the radius of the i-th layer, and φ(rs_i) = e^{−(rs_i·c)²};
b. the texture coordinates of vertex P_ik are (1/2 + i·cos(2kπ/n_s)/2, 1/2 + i·sin(2kπ/n_s)/2), where n_s is the total number of edges of the curved splat.
6. The surface point drawing method for enhancing model silhouettes according to claim 1, characterized in that the process of the visibility test and blending in step (3) of the run-time stage is:
a. select among the splat templates generated in step (2) of the preprocessing stage of claim 1: from the radius, the angle θ between the normal vector and the line of sight, and the angle ω = arctan(n_px / n_py) given by the arctangent of the components of the object-space reconstruction-kernel normal projected onto the screen-space x and y axes, where n_px and n_py are the components along the screen-space x and y axes of the projected normal of sampled point P, compute at run time the intervals within n1, n2 and n3 to which these values belong, where n1 is the number of radius intervals, n2 is the number of intervals of the angle between the tangent plane of the sampled point and the screen, and n3 is the number of intervals of the angle between the ellipse and the screen-space x axis; look up the table to obtain the pixels covered by this splat and their weights;
b. perform the visibility test with the z-threshold method: compute the z depth along the viewing direction of the pixels covered by a splat projected onto the screen; if the depth lies within a certain threshold of the pixel's stored z depth, blend the contribution; if it is smaller than the stored pixel depth, replace the pixel's color and opacity (alpha value) with the splat's color and weight; otherwise do nothing;
c. blending: the color, normal vector and spatial coordinates of a pixel are the weighted averages of the corresponding attributes of every splat projected onto that pixel, where the color, normal vector and spatial coordinates of each splat are those obtained in step (1) of the preprocessing stage of claim 1.
7. The surface point drawing method for enhancing model silhouettes according to claim 1, characterized in that the process of calculating the light intensity at a screen pixel with the illumination model and shading in step (3) of the run-time stage is: back-project the screen pixel onto the splat to obtain a space point p, and compute the light intensity at p according to the Phong illumination model I = I_a·K_a + I_p·K_d·(L·N) + I_p·K_s·(R·V)^n, where I_a·K_a is the ambient term; I_p·K_d·(L·N) is the diffuse term, L is the vector from point p towards the light source, and N is the normal vector at p, computed from the filtering of the splatting process; I_p·K_s·(R·V)^n is the specular term, R is the reflection vector of the light at p, V is the vector from point p towards the viewpoint, computed from the filtered spatial coordinates, and n is the shininess exponent (the larger this number, the smoother the object surface; it can be used to define the material of the object); I_a and I_p are the ambient and incident light intensities, and K_a, K_d and K_s are the ambient, diffuse and specular reflection coefficients; the light intensity I at the pixel is finally computed according to the Phong illumination model.
CN200910243262XA 2009-12-30 2009-12-30 Method for drawing enhanced model contour surface point Expired - Fee Related CN101763649B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910243262XA CN101763649B (en) 2009-12-30 2009-12-30 Method for drawing enhanced model contour surface point

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200910243262XA CN101763649B (en) 2009-12-30 2009-12-30 Method for drawing enhanced model contour surface point

Publications (2)

Publication Number Publication Date
CN101763649A true CN101763649A (en) 2010-06-30
CN101763649B CN101763649B (en) 2012-07-25

Family

ID=42494802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910243262XA Expired - Fee Related CN101763649B (en) 2009-12-30 2009-12-30 Method for drawing enhanced model contour surface point

Country Status (1)

Country Link
CN (1) CN101763649B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096943A (en) * 2011-01-14 2011-06-15 天津大学 Method for extracting and rendering characteristic lines in three-dimensional (3D) real-time landscape painting
CN102214366A (en) * 2011-07-20 2011-10-12 浙江万里学院 High-performance rendering method of three dimensional (3D) point sampling data
CN103065362A (en) * 2012-12-25 2013-04-24 上海交通大学 Grid-free surface drawing method with sharp feature
CN105247574A (en) * 2013-05-29 2016-01-13 五大洋有限公司 Electronic device, control method of electronic device and computer readable recording medium
CN106780530A (en) * 2016-12-15 2017-05-31 广州视源电子科技股份有限公司 A kind of build Forecasting Methodology and equipment
CN107038741A (en) * 2016-11-21 2017-08-11 上海咔咖文化传播有限公司 The method of three-dimensional rendering two dimension shadow
CN107169933A (en) * 2017-04-14 2017-09-15 杭州光珀智能科技有限公司 A kind of edge reflections pixel correction method based on TOF depth cameras
CN107918948A (en) * 2017-11-02 2018-04-17 深圳市自由视像科技有限公司 4D Video Rendering methods
CN113129402A (en) * 2021-04-19 2021-07-16 中国航发沈阳发动机研究所 Cross section data cloud picture drawing method
CN114299240A (en) * 2021-12-20 2022-04-08 重庆市勘测院 Parallel point cloud rarefying method based on distance threshold

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2215326C2 (en) * 2001-06-29 2003-10-27 Самсунг Электроникс Ко., Лтд. Image-based hierarchic presentation of motionless and animated three-dimensional object, method and device for using this presentation to visualize the object

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096943A (en) * 2011-01-14 2011-06-15 天津大学 Method for extracting and rendering characteristic lines in three-dimensional (3D) real-time landscape painting
CN102096943B (en) * 2011-01-14 2012-07-18 天津大学 Method for extracting and rendering characteristic lines in three-dimensional (3D) real-time landscape painting
CN102214366A (en) * 2011-07-20 2011-10-12 浙江万里学院 High-performance rendering method of three dimensional (3D) point sampling data
CN102214366B (en) * 2011-07-20 2013-01-16 浙江万里学院 High-performance rendering method of three dimensional (3D) point sampling data
CN103065362A (en) * 2012-12-25 2013-04-24 上海交通大学 Grid-free surface drawing method with sharp feature
CN103065362B (en) * 2012-12-25 2015-07-08 上海交通大学 Grid-free surface drawing method with sharp feature
CN105247574A (en) * 2013-05-29 2016-01-13 五大洋有限公司 Electronic device, control method of electronic device and computer readable recording medium
CN105247574B (en) * 2013-05-29 2018-09-21 五大洋有限公司 Electronic device, the control method of electronic device and computer-readable recording medium
CN107038741A (en) * 2016-11-21 2017-08-11 上海咔咖文化传播有限公司 The method of three-dimensional rendering two dimension shadow
CN107038741B (en) * 2016-11-21 2020-08-11 上海咔咖文化传播有限公司 Method for three-dimensionally rendering two-dimensional shadow
CN106780530A (en) * 2016-12-15 2017-05-31 广州视源电子科技股份有限公司 A kind of build Forecasting Methodology and equipment
CN106780530B (en) * 2016-12-15 2019-06-14 广州视源电子科技股份有限公司 A kind of figure prediction technique and equipment
CN107169933A (en) * 2017-04-14 2017-09-15 杭州光珀智能科技有限公司 A kind of edge reflections pixel correction method based on TOF depth cameras
CN107169933B (en) * 2017-04-14 2020-08-18 浙江光珀智能科技有限公司 Edge reflection pixel correction method based on TOF depth camera
CN107918948A (en) * 2017-11-02 2018-04-17 深圳市自由视像科技有限公司 4D Video Rendering methods
CN113129402A (en) * 2021-04-19 2021-07-16 中国航发沈阳发动机研究所 Cross section data cloud picture drawing method
CN113129402B (en) * 2021-04-19 2024-01-30 中国航发沈阳发动机研究所 Cross section data cloud picture drawing method
CN114299240A (en) * 2021-12-20 2022-04-08 重庆市勘测院 Parallel point cloud rarefying method based on distance threshold

Also Published As

Publication number Publication date
CN101763649B (en) 2012-07-25

Similar Documents

Publication Publication Date Title
CN101763649B (en) Method for drawing enhanced model contour surface point
Woo et al. A survey of shadow algorithms
CN105006021B (en) A kind of Color Mapping Approach and device being applicable to quickly put cloud three-dimensional reconstruction
Lu et al. Illustrative interactive stipple rendering
JPH06223198A (en) Device and method for image preparation by light beam tracking
CN103530907A (en) Complicated three-dimensional model drawing method based on images
CN105205861A (en) Tree three-dimensional visualization model realization method based on Sphere-Board
Linsen et al. Splat-based ray tracing of point clouds
CN103903296A (en) Method for shadow rendering in virtual home decoration indoor scene design
CN103645463B (en) The method of synthetic aperture radar image-forming data three-dimensional display
CN101441774A (en) Dynamic scene real time double face refraction drafting method based on image mapping space
CN103617593B (en) The implementation method of three-dimensional fluid physic animation engine and device
Bao et al. Realistic real-time rendering for large-scale forest scenes
CN113255251B (en) Realistic ice type rendering method
CN113034657B (en) Rendering method, device and equipment for illumination information in game scene
Boudon et al. Survey on computer representations of trees for realistic and efficient rendering
US8948498B1 (en) Systems and methods to transform a colored point cloud to a 3D textured mesh
Haglund et al. Snow accumulation in real-time
Décoret et al. Billboard clouds
CN113902887A (en) Three-dimensional visual edge generation method, system, computer and readable storage medium
Nöll et al. High quality and memory efficient representation for image based 3d reconstructions
Liu et al. Shape from silhouettes based on a centripetal pentahedron model
Wang et al. Multi-resolution texture synthesis from turntable image sequences
Guo et al. Research on Real-Time Rendering of Reflection Caustics in Water Scenes
Schertler et al. Visualization of Scanned Cave Data with Global Illumination.

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120725

Termination date: 20141230

EXPY Termination of patent right or utility model