CN101593357B - Interactive volume cutting method based on three-dimensional plane control


Info

Publication number
CN101593357B
CN101593357B (application number CN200810113295.8A)
Authority
CN
China
Prior art keywords
volume data
dimensional
volume
dimensional planar
control
Prior art date
Legal status
Active
Application number
CN200810113295.8A
Other languages
Chinese (zh)
Other versions
CN101593357A (en)
Inventor
田捷
戴亚康
代晓倩
杨鑫
Current Assignee
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN200810113295.8A
Publication of CN101593357A
Application granted
Publication of CN101593357B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention provides an interactive volume cutting method based on three-dimensional planar controls, aiming to solve the technical problem of how to perform interactive volume cutting in a volume rendering scene based on either software or graphics hardware. The method comprises the following steps: (1) an integrated rendering environment for the volume data and the three-dimensional planar controls is built; (2) the volume data and the three-dimensional planar controls are rendered in a blended manner; (3) the interactive cutting of the volume data is completed under mouse control. The invention integrates volume data, three-dimensional planar controls, volume rendering, blended rendering and mouse control into the same environment, thereby realizing interactive and immersive volume cutting. The volume rendering can be software-based, which enables high-quality rendering of large-scale volume data, or graphics-hardware-based, which enables fast rendering of small and medium-scale volume data, so the invention has important application value in the field of volume data exploration and visualization.

Description

Interactive volume cutting method based on three-dimensional planar controls
Technical field
The present invention relates to the field of three-dimensional visualization in computer graphics, and in particular to interactive volume cutting realized with three-dimensional planar controls and volume rendering methods.
Background technology
Volume rendering (Volume Rendering) emerged in the 1980s and is one of the most important methods in three-dimensional visualization. Because volume rendering can display not only the surface information of three-dimensional regular data (also referred to as volume data) but also its internal information, it can produce realistic representations of volume data and facilitates a complete understanding and analysis of the information it contains. It therefore plays a very important role in computer-aided diagnosis, industrial CT defect detection and similar applications.
During volume rendering, the displayed tissue information of the volume data can be controlled by interactively adjusting transfer functions such as the gray-value–opacity mapping. For volume data with complicated structures, however, displaying several tissues simultaneously requires a great deal of adjustment time, and the resulting image does not necessarily reveal relationships such as the relative positions of the tissues clearly. Volume cutting (Volume Clipping) introduces clipping geometry into volume rendering: the parts that do not need to be shown are cut away and only the parts of interest are retained. This overcomes the above drawback and can be used to explore volume data with complicated structures.
Researchers at home and abroad have done a great deal of work on volume cutting and achieved substantial results. In 1998 Westermann and Ertl used a clipping-geometry method based on stencil buffer tests to cut volume data; in 1999 Sommer et al. used the three-dimensional texture operations of graphics hardware to achieve arbitrary plane cutting of volume data; and in 2003 Weiskopf et al. achieved cutting with complicated clipping geometry using depth-based and volume-based clipping methods. Although these methods produce good volume cutting results, practical applications call for more intuitive interactive manipulation of the cutting parameters, instead of adjusting them in an out-of-scene manner (for example with interface controls such as sliders on a control panel).
To address the above problem, several researchers have proposed methods that perform interactive volume cutting directly in the rendering scene, so as to achieve immersive interaction. In 2003 McGuffin et al. used three-dimensional widgets to browse volume data, and in 2006 Huff et al. used clipping, eraser and digging tools to cut volume data interactively. To date, however, almost all reported interactive, immersive volume cutting methods rely on graphics-hardware-based volume rendering to draw the volume data; no researcher has yet realized interactive, immersive volume cutting with software-based volume rendering. In graphics-hardware-based volume rendering the volume data is normally first reconstructed as a series of textured polygons in three-dimensional space and then rendered. In software-based volume rendering, by contrast, the volume data is not reconstructed into an intermediate model carrying depth information but is rendered directly into a two-dimensional projection image. Because the depth information is missing, interactive volume cutting in a software-based volume rendering scene remains a technical challenge, and devising an interactive, immersive volume cutting method that works both with software rendering (software-based volume rendering) and with hardware rendering (graphics-hardware-based volume rendering) is even more challenging.
Summary of the invention
In order to solve the above technical problem, the object of the invention is to overcome the limitation that existing interactive, immersive volume cutting techniques are confined to graphics-hardware-based volume rendering. To this end, an interactive, immersive volume cutting method based on three-dimensional planar controls is provided that is applicable both to software rendering and to hardware rendering.
To achieve this object, the technical solution of the interactive volume cutting method based on three-dimensional planar controls according to the present invention is as follows:
Step S1: establish a complete set of coordinate systems, add the volume data to these coordinate systems, and then add three-dimensional planar controls, thereby building an integrated rendering environment;
Step S2: in the integrated rendering environment, first render the volume data using software-based or graphics-hardware-based volume rendering, applying level-of-detail rendering according to the mouse state and two-dimensional texture space mapping; then traverse and draw all three-dimensional planar controls: for each three-dimensional planar control, check whether its cut surface needs to be shown; if yes, sample the volume to obtain a two-dimensional slice image and compute the dot product of the current gaze direction and the normal of the control's cutting plane, drawing the three-dimensional planar control together with the two-dimensional slice image if the dot product is greater than zero and only the three-dimensional planar control otherwise; if no, draw only the three-dimensional planar control; this completes the blended rendering of the volume data and the three-dimensional planar controls;
Step S3: use the mouse to interactively manipulate the volume data and the three-dimensional planar controls, and after each mouse action return to step S2 to update the blended rendering, thereby achieving interactive cutting of the volume data; the mouse manipulations of the volume data are translation, rotation and zooming; the center of the volume data always remains in the mid-plane; and when the volume data is manipulated with the mouse, the three-dimensional planar controls always remain stationary relative to the volume data.
According to an embodiment of the invention, each three-dimensional planar control consists of four vertices, four edges and a transparent rectangular face and encapsulates a cutting plane; the origin and normal of the cutting plane are computed from the three-dimensional planar control.
According to an embodiment of the invention, the coordinate systems comprise: a voxel coordinate system, a model coordinate system, a world coordinate system, a view coordinate system and a screen coordinate system.
According to an embodiment of the invention, the volume data is added to the world coordinate system, the transformation relations between the coordinate systems are adjusted according to the size of the volume data, and the center point of the volume data is placed in the mid-plane of the viewport clipping body.
According to an embodiment of the invention, one or more three-dimensional planar controls are added to the world coordinate system, and the state of each three-dimensional planar control is initialized according to the size and position of the volume data.
According to an embodiment of the invention, rendering the volume data comprises the following steps:
Step S21: extract the cutting plane information encapsulated by the three-dimensional planar controls, and use volume rendering to compute the two-dimensional projection image of the volume data on the near plane of the viewport clipping body;
Step S22: cast rays from the origin of the view coordinate system through the four corners of the two-dimensional projection image, compute the intersection of each ray with the mid-plane of the viewport clipping body to obtain the projection rectangle of the two-dimensional projection image in the mid-plane, map the two-dimensional projection image onto the projection rectangle as a texture, and draw the mapped image.
According to an embodiment of the invention, the state of a three-dimensional planar control in the world coordinate system is changed through the following mouse manipulations: dragging a vertex with the left mouse button rotates the control arbitrarily; dragging an edge with the left mouse button stretches the control; dragging the transparent rectangular face with the left mouse button translates the control within its cutting plane; dragging a vertex with the right mouse button translates the control along the normal of its cutting plane; dragging an edge with the right mouse button rotates the control about a symmetry axis of the transparent rectangular face; dragging a vertex with the middle mouse button scales the control within its cutting plane.
Beneficial effects of the invention: the present invention integrates volume data, three-dimensional planar controls, volume rendering, blended rendering and mouse manipulation into the same environment, thereby achieving interactive, immersive volume cutting. The volume rendering may be software-based, which enables high-quality rendering of large-scale volume data, or graphics-hardware-based, which enables fast rendering of small and medium-scale volume data. A suitable volume rendering method can therefore be chosen according to the data scale and the hardware environment, improving the efficiency of interactive volume cutting; the method thus has important application value in volume data exploration and visualization.
Brief description of the drawings
Fig. 1 is the flow chart of the interactive volume cutting method based on three-dimensional planar controls;
Fig. 2 shows the integrated rendering environment;
Fig. 3 shows the geometry of a three-dimensional planar control;
Fig. 4 illustrates the computation of the two-dimensional projection image by ray casting;
Fig. 5 shows the spatial relationship between the mapped image and a three-dimensional planar control;
Fig. 6 shows the interactive manipulation modes of a three-dimensional planar control;
Fig. 7 shows an application example of interactive volume cutting with a three-dimensional planar control.
Detailed description of the embodiments
The details of the technical solution of the present invention are described below with reference to the accompanying drawings. It should be noted that the described embodiments are intended only to facilitate the understanding of the invention and do not limit it in any way.
The present invention realizes interactive volume cutting on the basis of three-dimensional planar controls and the two-dimensional projection image obtained by volume rendering. Each three-dimensional planar control encapsulates a cutting plane, and the state of the control can be changed by interactive mouse manipulation. First the cutting plane information encapsulated by the three-dimensional planar controls is extracted, and the two-dimensional projection image of the volume data is obtained with software-based or graphics-hardware-based volume rendering. Then the two-dimensional projection image is mapped back into three-dimensional space as a texture, and the mapped image and the three-dimensional planar controls are drawn, yielding the blended rendering image of the volume data and the three-dimensional planar controls. By manipulating the volume data and the three-dimensional planar controls interactively with the mouse and updating the blended rendering, interactive and immersive volume cutting is achieved.
The interactive volume cutting method based on three-dimensional planar controls proposed by the present invention is described in detail below with reference to the accompanying drawings. The flow chart of one specific implementation is shown in Fig. 1 and mainly comprises three steps: building the integrated rendering environment, blended rendering, and mouse manipulation. The detailed steps are as follows:
Step S1: building the integrated rendering environment
To realize immersive and accurate volume cutting, two requirements must be met: first, interactive operations on the three-dimensional planar controls must be visually correct; second, the cutting parameters adjusted through the three-dimensional planar controls must be accurate, and the rendering of the volume data must be accurate as well. The volume data and the three-dimensional planar controls must therefore be integrated into one unified rendering environment. This step comprises: establishing a complete set of coordinate systems, adding the volume data to these coordinate systems and adjusting the transformation relations between them, then adding one or more three-dimensional planar controls and initializing the state of each control.
Five coordinate systems are established in the rendering environment, as shown in Fig. 2: the voxel coordinate system G, the model coordinate system M, the world coordinate system W, the view coordinate system V and the screen coordinate system S. The three-dimensional planar control 3 is located in the world coordinate system. The viewport clipping body 2 is located in the view coordinate system V and determines the visible region of the view coordinate system V. Ray 1 is a directed line cast from the origin of the view coordinate system V towards the near plane of the viewport clipping body. The transformation from the voxel coordinate system G to the screen coordinate system S can be expressed as:
X_S = T_{V→S} · T_{W→V} · T_{M→W} · T_{G→M} · X_G        (1)
where X_G is a coordinate in the voxel coordinate system G and X_S is the transformed coordinate in the screen coordinate system S; both can be written in the homogeneous form [x y z 1]^T. T_{G→M}, T_{M→W}, T_{W→V} and T_{V→S} can all be written in the unified form T_{I→J}, the 4 × 4 coordinate transformation matrix from coordinate system I to coordinate system J. There is only a scaling relation between the voxel coordinate system G and the model coordinate system M, with the scaling parameters given by the voxel spacing of the volume data, so the origins of the voxel coordinate system G and the model coordinate system M always coincide.
The geometry of a three-dimensional planar control is shown in Fig. 3. It consists of a transparent rectangular face, four vertices 31 and four edges 32, and encapsulates a cutting plane. The four vertices are labeled "upper left", "lower left", "lower right" and "upper right". The origin 34 of the cutting plane is the center of these four vertices, i.e. the origin and normal of the cutting plane are computed from the three-dimensional planar control, and the normal 33 of the cutting plane is the cross product of the two diagonal vectors. The four vertices are rendered as spheres and the four edges as cylinders; other representations may also be chosen for the vertices and edges, which is not elaborated here.
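A minimal C++ sketch of such a control is given below, assuming a simple vector type; it only shows how the cutting-plane origin (the center of the four vertices) and normal (the cross product of the two diagonals) can be derived from the control's geometry.

```cpp
#include <array>

// Minimal 3-D vector type (illustrative; any vector math library would do).
struct Vec3 {
    double x = 0, y = 0, z = 0;
    Vec3 operator+(const Vec3& v) const { return {x + v.x, y + v.y, z + v.z}; }
    Vec3 operator-(const Vec3& v) const { return {x - v.x, y - v.y, z - v.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};

inline Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

// Three-dimensional planar control: four vertices (upper-left, lower-left,
// lower-right, upper-right), four edges between consecutive vertices, and a
// transparent rectangular face; it encapsulates a cutting plane.
struct PlanarControl {
    std::array<Vec3, 4> v;  // order: upper-left, lower-left, lower-right, upper-right

    // Cutting-plane origin: the center of the four vertices.
    Vec3 planeOrigin() const {
        return (v[0] + v[1] + v[2] + v[3]) * 0.25;
    }

    // Cutting-plane normal: cross product of the two diagonal vectors
    // (upper-left -> lower-right) x (lower-left -> upper-right).
    Vec3 planeNormal() const {
        return cross(v[2] - v[0], v[3] - v[1]);
    }
};
```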
The volume data is added to the world coordinate system, the transformation relations between the coordinate systems are adjusted according to the size of the volume data, and the center point of the volume data is placed in the mid-plane of the viewport clipping body. One or more three-dimensional planar controls are added to the world coordinate system, and the state of each control is initialized according to the size and position of the volume data. In the initial state the center point of the volume data is placed at the origin of the world coordinate system W, with no rotation or scaling between the model coordinate system M and the world coordinate system W. Let L be the diagonal length of the volume data in the world coordinate system W. The origin of the view coordinate system V is placed on the positive z-axis of the world coordinate system W at a distance 3L from the volume data center, and the near and far planes of the viewport clipping body are placed at distances L and 5L from the origin of the view coordinate system V respectively, so that the center point of the volume data lies in the mid-plane of the viewport clipping body. The initial length and width of each three-dimensional planar control are set to the length and width of the volume data, its initial position lies in the xoy plane of the world coordinate system W, and its center is placed at the origin of the world coordinate system W. In the subsequent interactive volume cutting process, the state of the volume data and of the three-dimensional planar controls can be changed by interactive mouse manipulation.
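For illustration, the initial view setup described above could be written with fixed-function OpenGL/GLU roughly as follows; this is a sketch under the stated assumptions (diagonal length L, viewpoint at 3L along +z, near and far planes at L and 5L), and the field-of-view value is an assumed parameter not specified in the description.

```cpp
#include <GL/gl.h>
#include <GL/glu.h>
#include <cmath>

// Initial scene setup: the volume center sits at the world origin, the eye
// lies on the world +z axis at distance 3L, and the near/far planes of the
// viewport clipping body are at distances L and 5L from the eye.
void setupInitialView(double dimX, double dimY, double dimZ, double aspect) {
    const double L = std::sqrt(dimX * dimX + dimY * dimY + dimZ * dimZ);

    // Viewport clipping body: near plane at L, far plane at 5L, so the
    // volume center (at distance 3L) lies exactly in the mid-plane.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0 /* assumed field of view */, aspect, L, 5.0 * L);

    // View coordinate system V: eye on the +z axis of world space,
    // looking at the volume center placed at the world origin.
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 0.0, 3.0 * L,   // eye position
              0.0, 0.0, 0.0,       // look-at point: volume center / world origin
              0.0, 1.0, 0.0);      // up vector
}
```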
Step S2: blended rendering
In the integrated rendering environment, the two-dimensional projection image of the volume data is first obtained by volume rendering. The two-dimensional projection image is then mapped back into three-dimensional space as a texture and drawn as a mapped image; finally all three-dimensional planar controls are traversed and drawn, yielding the blended rendering image of the volume data and the three-dimensional planar controls. The blended rendering comprises the following steps:
Step S21: extract the cutting plane information encapsulated by the three-dimensional planar controls, and use volume rendering to compute the two-dimensional projection image of the volume data on the near plane of the viewport clipping body;
Step S22: cast rays 1 from the origin of the view coordinate system through the four corners of the two-dimensional projection image, compute the intersection of each ray 1 with the mid-plane of the viewport clipping body to obtain the projection rectangle of the two-dimensional projection image in the mid-plane, map the two-dimensional projection image onto the projection rectangle as a texture, and draw the mapped image;
Step S23: for each three-dimensional planar control, check whether its cut surface needs to be shown; if yes, sample the volume to obtain a two-dimensional slice image and draw the three-dimensional planar control together with the two-dimensional slice image; if no, draw only the three-dimensional planar control.
Steps S21 and S22 above render the volume data on the basis of volume rendering and two-dimensional texture space mapping, and step S23 traverses and draws all three-dimensional planar controls. The volume rendering method used to obtain the two-dimensional projection image of the volume data may be software-based or graphics-hardware-based, and the volume data is drawn with level-of-detail rendering according to the mouse state; the main difference between volume rendering methods lies in how the cutting plane information encapsulated by the three-dimensional planar controls is extracted for the rendering of the volume data. The rendering of the volume data is described below using ray casting (Ray Casting), a classic volume rendering method.
The computation of the two-dimensional projection image of the volume data by ray casting is illustrated in Fig. 4. First the eight vertices of the volume data are projected onto the screen, the projected bounding box 4 of the volume data on the screen is computed, and the accumulated color and opacity of every pixel inside the projected bounding box 4 are initialized to zero. Then all pixels inside the projected bounding box 4 are traversed, and the final accumulated color and opacity of each pixel are computed with the following steps:
1. Cast a ray from the origin of the view coordinate system V through the pixel, compute the intersection points s and e of this ray 1 with the near and far planes of the viewport clipping body 2, and initialize S = s, E = e;
2. Compute the intersection points c and d of the ray with the outer surface of the volume data, and set S = c, E = d;
3. Compute the intersection points a and b of the ray with the cutting planes; for each cutting plane, the ray segment lying in the half-space on the side of the normal 33 is retained and the remaining ray segment is clipped away, giving S = a, E = b. 4. Finally, step along the ray segment, compute the color and opacity at each sampling point, and blend them recursively according to formula (2) to obtain the final accumulated color and opacity of the pixel:
c = C_s · α_s · (1 − α) + c
α = α_s · (1 − α) + α        (2)
where C_s and α_s are the color and opacity of the current sampling point, and c and α are the accumulated color and opacity of the pixel.
After all pixels have been processed, the two-dimensional projection image of the volume data is obtained.
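As a sketch (not the full ray-casting implementation), the per-pixel compositing of formula (2) might look like the following C++; the sampling helpers and the precomputed, clipped ray interval [S, E] (after intersecting the viewport clipping body, the volume surface and the cutting planes) are assumptions for the example.

```cpp
struct Color3 { double r, g, b; };
struct RGBA   { double r, g, b, a; };

// Hypothetical sampling helpers: classified color and opacity of the volume
// at parameter t along the ray (after transfer-function mapping).
Color3 sampleColor(double t);
double sampleOpacity(double t);

// Front-to-back compositing along the clipped ray segment [S, E], following
// formula (2): c = C_s * a_s * (1 - a) + c,  a = a_s * (1 - a) + a.
RGBA compositeRay(double S, double E, double step) {
    RGBA acc{0.0, 0.0, 0.0, 0.0};               // accumulated color and opacity
    for (double t = S; t <= E; t += step) {
        const Color3 cs = sampleColor(t);       // C_s at the sampling point
        const double as = sampleOpacity(t);     // alpha_s at the sampling point
        const double w  = as * (1.0 - acc.a);   // weight (1 - accumulated opacity)
        acc.r += cs.r * w;
        acc.g += cs.g * w;
        acc.b += cs.b * w;
        acc.a += w;
    }
    return acc;
}
```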
Let A, B, C and D be the four corners of the two-dimensional projection image, as shown in Fig. 5. O is the origin of the view coordinate system; rays are cast from O through A, B, C and D, and E, F, G and H are the intersection points of these rays 1 with the mid-plane 5 of the viewport clipping body 2. First glTexImage2D is used to create a texture from the two-dimensional projection image, and glBegin(GL_POLYGON) is used to map this texture onto the rectangle EFGH. Then all three-dimensional planar controls are traversed. For each control that does not need to show its cut surface, the control is drawn with gluSphere, gluCylinder and glBegin(GL_QUADS). For each control that does need to show its cut surface, the volume data is resampled on the transparent rectangular face to obtain a two-dimensional slice image, glTexImage2D is used to create a texture from the slice image, and the dot product of the current gaze direction and the normal 33 of the control's cutting plane is computed. If the dot product is greater than zero, the vertices of the control are drawn with gluSphere, its edges with gluCylinder, and the slice texture is mapped onto the transparent rectangular face of the control with glBegin(GL_QUADS); if the dot product is less than zero, the control is drawn with gluSphere, gluCylinder and glBegin(GL_QUADS). This completes the texture mapping of the two-dimensional projection image and the drawing of the mapped image and the three-dimensional planar controls.
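The texture-mapping step for the projection image could look roughly like the following fixed-function OpenGL sketch; the corner-to-texture-coordinate assignment and the texture parameters are assumptions for the example, and the blending/depth state of the full blended rendering is omitted.

```cpp
#include <GL/gl.h>

// Map the two-dimensional projection image onto the rectangle EFGH lying in
// the mid-plane of the viewport clipping body (cf. Fig. 5). 'pixels' holds
// the RGBA projection image; E, F, G, H are the precomputed corner points
// in world coordinates (here plain double[3] arrays).
void drawMappedProjection(const unsigned char* pixels, int width, int height,
                          const double E[3], const double F[3],
                          const double G[3], const double H[3]) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Create the texture of the two-dimensional projection image.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    glEnable(GL_TEXTURE_2D);
    // Map the texture onto the projection rectangle EFGH.
    glBegin(GL_POLYGON);
    glTexCoord2f(0.0f, 0.0f); glVertex3dv(E);
    glTexCoord2f(1.0f, 0.0f); glVertex3dv(F);
    glTexCoord2f(1.0f, 1.0f); glVertex3dv(G);
    glTexCoord2f(0.0f, 1.0f); glVertex3dv(H);
    glEnd();
    glDisable(GL_TEXTURE_2D);

    glDeleteTextures(1, &tex);
}
```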
Step S3: mouse manipulation
In the process of interactively cutting the volume data, the mouse is used to manipulate the volume data and the three-dimensional planar controls. If a three-dimensional planar control is being manipulated, its state in the world coordinate system is changed; otherwise the volume data is translated, rotated or zoomed, with its center always kept in the mid-plane of the viewport clipping body, and the three-dimensional planar controls are transformed simultaneously so that they always remain stationary relative to the volume data. After each mouse action, the method returns to step S2 to update the blended rendering, thereby achieving interactive cutting of the volume data.
Interactive manipulation of a three-dimensional planar control is implemented on the basis of the OpenGL selection mechanism. As shown in Fig. 6, the manipulation modes are: (a) dragging a vertex with the left mouse button rotates the control arbitrarily; (b) dragging an edge with the left mouse button stretches the control; (c) dragging the transparent rectangular face with the left mouse button translates the control within its cutting plane; (d) dragging a vertex with the right mouse button translates the control along the normal of its cutting plane; (e) dragging an edge with the right mouse button rotates the control about a symmetry axis of the transparent rectangular face; (f) dragging a vertex with the middle mouse button scales the control within its cutting plane. Through these six interactive manipulation modes, a cutting plane with arbitrary position and orientation can be supplied to the volume rendering.
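For illustration only, the six modes could be dispatched as in the sketch below; the enums, the picking result and the function name are hypothetical (the actual implementation relies on OpenGL picking to determine which element of the control is hit).

```cpp
// Picked element of a three-dimensional planar control and mouse button pressed.
enum class PickedPart { Vertex, Edge, Face };
enum class MouseButton { Left, Middle, Right };

// Manipulation modes (a)-(f) of Fig. 6.
enum class ControlOp {
    Rotate,            // (a) left button on a vertex: arbitrary rotation
    Stretch,           // (b) left button on an edge: stretching
    TranslateInPlane,  // (c) left button on the face: translation in the cutting plane
    TranslateNormal,   // (d) right button on a vertex: translation along the plane normal
    RotateAboutAxis,   // (e) right button on an edge: rotation about a face symmetry axis
    ScaleInPlane,      // (f) middle button on a vertex: scaling in the cutting plane
    None
};

ControlOp dispatch(MouseButton b, PickedPart p) {
    if (b == MouseButton::Left   && p == PickedPart::Vertex) return ControlOp::Rotate;
    if (b == MouseButton::Left   && p == PickedPart::Edge)   return ControlOp::Stretch;
    if (b == MouseButton::Left   && p == PickedPart::Face)   return ControlOp::TranslateInPlane;
    if (b == MouseButton::Right  && p == PickedPart::Vertex) return ControlOp::TranslateNormal;
    if (b == MouseButton::Right  && p == PickedPart::Edge)   return ControlOp::RotateAboutAxis;
    if (b == MouseButton::Middle && p == PickedPart::Vertex) return ControlOp::ScaleInPlane;
    return ControlOp::None;
}
```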
The implementation of interactive manipulation is described below for mode (c). Suppose the transparent rectangular face is picked with the left mouse button and the mouse is moved, and denote the coordinates of the previous and current mouse pointer positions in the screen coordinate system S by [x_mold y_mold 0 1]^T and [x_mnew y_mnew 0 1]^T respectively. Then
X_W = T_{V→W} · T_{S→V} · X_S        (3)
is used to compute the two corresponding points X_W,mold and X_W,mnew in the world coordinate system W. The mouse motion vector in world coordinates is then ΔV_W = X_W,mnew − X_W,mold, and
ΔV_P = ΔV_W − (ΔV_W · N_P) · N_P        (4)
gives the projection of the mouse movement onto the cutting plane, where ΔV_P is the projection vector, N_P is the normal of the cutting plane, and "·" between vectors denotes the dot product. Finally ΔV_P is added to every vertex coordinate of the three-dimensional planar control, which yields the new position of the control after the mouse movement.
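A minimal C++ sketch of mode (c) follows, reusing the Vec3 and PlanarControl types from the earlier sketch; the screenToWorld helper standing in for equation (3) is hypothetical, and the plane normal is normalized explicitly so that formula (4) applies as written.

```cpp
#include <cmath>

inline double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

inline Vec3 normalize(const Vec3& v) {
    const double len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Hypothetical helper standing in for equation (3): transform a screen-space
// mouse position [x y 0 1]^T into world coordinates.
Vec3 screenToWorld(double xScreen, double yScreen);

// Mode (c): translate the control within its cutting plane (equation (4)).
void translateInPlane(PlanarControl& ctrl,
                      double xOld, double yOld,    // previous mouse position (screen space)
                      double xNew, double yNew) {  // current mouse position (screen space)
    const Vec3 wOld = screenToWorld(xOld, yOld);
    const Vec3 wNew = screenToWorld(xNew, yNew);
    const Vec3 dVw  = wNew - wOld;                    // mouse motion vector in world space
    const Vec3 n    = normalize(ctrl.planeNormal());  // unit normal of the cutting plane
    const Vec3 dVp  = dVw - n * dot(dVw, n);          // projection onto the cutting plane
    for (auto& vertex : ctrl.v)                       // move every vertex by the projection
        vertex = vertex + dVp;
}
```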
During mouse manipulation, if no three-dimensional planar control is selected, the mouse movement is instead projected onto translation, rotation and zooming amounts of the volume data; the implementation is similar to the above. The left mouse button rotates the volume data about its center, the middle mouse button translates the volume data in the world coordinate system, and the right mouse button zooms the volume data relative to its center. During rotation, translation and zooming, the center of the volume data is always kept in the mid-plane of the viewport clipping body, and the three-dimensional planar controls are transformed according to the state of the volume data so that they always remain stationary relative to the volume data.
During mouse manipulation, the volume data is rendered with level-of-detail (Level of Detail) rendering according to the mouse state. The steps are: before rendering, check whether there is pending mouse input; if there is, use coarse rendering with a larger sampling interval, which reduces the amount of computation and thus increases the rendering frame rate; otherwise use fine rendering with a smaller sampling interval, which increases the computational accuracy and thus improves the rendering quality.
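A minimal sketch of this level-of-detail switch is given below; the sampling-interval constants and the renderVolume entry point are assumptions for illustration.

```cpp
// Level-of-detail rendering: coarse while the mouse is being used,
// fine once interaction stops. The constants are illustrative only.
constexpr double kCoarseStep = 4.0;  // larger sampling interval: fewer samples per ray
constexpr double kFineStep   = 1.0;  // smaller sampling interval: full quality

void renderVolume(double samplingStep);  // assumed volume rendering entry point

void renderWithLevelOfDetail(bool mouseActive) {
    // Coarse rendering keeps the frame rate high during interaction;
    // fine rendering restores image quality after the mouse stops.
    renderVolume(mouseActive ? kCoarseStep : kFineStep);
}
```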
Experimental results
The proposed method was integrated into a three-dimensional image processing toolkit developed by the inventors. The toolkit is a cross-platform three-dimensional image processing package that follows object-oriented design and software engineering standards and is implemented in C++. Besides obtaining the two-dimensional projection image by ray casting, the toolkit also renders volume data by splatting (voxel projection) and shear-warp (Shear-Warp) volume rendering. To verify the interactive immersiveness and efficiency of the proposed method, extensive experiments were carried out on a computer using this toolkit. The hardware and software configuration of the computer was: Intel Core 2 1.86 GHz processor, 1 GB physical memory, ATI Radeon X300 graphics card, Windows XP operating system. The experimental data was a brain CT data set of size 208 × 256 × 225 × 8 bit from http://www.psychology.nottingham.ac.uk/staff/crl/ct.zip.
Fig. 7 shows an application example of interactive volume cutting with a three-dimensional planar control, where (a) is based on ray-casting volume rendering, (b) on splatting (voxel projection), and (c) on shear-warp volume rendering. During interactive volume cutting, the volume data and the three-dimensional planar control can be manipulated very intuitively, as if operating directly inside the three-dimensional rendering scene. This qualitative experiment demonstrates the interactive immersiveness of the proposed method.
A further experiment was carried out as follows. For the sake of reproducibility, the parameters relevant to volume cutting efficiency are specified first:
1. the window size of the screen coordinate system is 948 × 618;
2. the size of the two-dimensional projection image is 302 × 294;
3. no illumination is applied during volume rendering;
4. the density value of each voxel is mapped to (r, g, b, a): voxels with density values between 0 and 20 are mapped to (0.0, 0.0, 0.0, 0.0), voxels with density value 255 are mapped to (0.6, 0.6, 0.2, 0.25), and the mappings of the other density values are obtained by interpolation;
5. one three-dimensional planar control is added to the rendering environment, its cut surface is shown, and the control is adjusted to lie on the symmetry plane of the brain. The volume data and the three-dimensional planar control are then manipulated intermittently and slightly with the mouse, which triggers both coarse and fine rendering while keeping the state of the volume data and the three-dimensional planar control essentially stable. The rendering frame rates obtained in the different situations are listed in Table 1. Thanks to the level-of-detail rendering, a higher frame rate is obtained during interactive manipulation and a higher-quality image is obtained after the manipulation stops. This quantitative experiment demonstrates the efficiency of the proposed method.
Table 1. Rendering efficiency of volume cutting
Volume rendering method        Manipulating volume data    Manipulating planar control    Fine rendering
Ray casting                    0.032 s / 31.72 fps         0.057 s / 17.63 fps            0.70 s / 1.42 fps
Splatting (voxel projection)   0.089 s / 11.14 fps         0.11 s / 8.94 fps              1.69 s / 0.59 fps
Shear-Warp                     0.029 s / 33.60 fps         0.054 s / 18.62 fps            0.41 s / 2.44 fps
The above is only an embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can, within the technical scope disclosed by the present invention, readily conceive of variations or substitutions, all of which shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (6)

1. An interactive volume cutting method based on three-dimensional planar controls, characterized in that it comprises the following steps:
Step S1: establish a complete set of coordinate systems, add the volume data to these coordinate systems, and then add three-dimensional planar controls, thereby building an integrated rendering environment;
Step S2: in the integrated rendering environment, first render the volume data using software-based or graphics-hardware-based volume rendering, applying level-of-detail rendering according to the mouse state and two-dimensional texture space mapping; then traverse and draw all three-dimensional planar controls: for each three-dimensional planar control, check whether its cut surface needs to be shown; if yes, sample the volume to obtain a two-dimensional slice image and compute the dot product of the current gaze direction and the normal of the control's cutting plane, drawing the three-dimensional planar control together with the two-dimensional slice image if the dot product is greater than zero and only the three-dimensional planar control otherwise; if no, draw only the three-dimensional planar control; this completes the blended rendering of the volume data and the three-dimensional planar controls;
Rendering the volume data comprises the following steps: step S21: extract the cutting plane information encapsulated by the three-dimensional planar controls, and use volume rendering to compute the two-dimensional projection image of the volume data on the near plane of the viewport clipping body; step S22: cast rays from the origin of the view coordinate system through the four corners of the two-dimensional projection image, compute the intersection of each ray with the mid-plane of the viewport clipping body to obtain the projection rectangle of the two-dimensional projection image in the mid-plane, map the two-dimensional projection image onto the projection rectangle as a texture, and draw the mapped image;
Step S3: use the mouse to interactively manipulate the volume data and the three-dimensional planar controls, and after each mouse action return to step S2 to update the blended rendering, thereby achieving interactive cutting of the volume data; the mouse manipulations of the volume data are translation, rotation and zooming; the center of the volume data always remains in the mid-plane; and when the volume data is manipulated with the mouse, the three-dimensional planar controls always remain stationary relative to the volume data.
2. The interactive volume cutting method according to claim 1, characterized in that: each three-dimensional planar control consists of four vertices, four edges and a transparent rectangular face and encapsulates a cutting plane; the origin and normal of the cutting plane are computed from the three-dimensional planar control.
3. The interactive volume cutting method according to claim 1, characterized in that the coordinate systems comprise:
a voxel coordinate system, a model coordinate system, a world coordinate system, a view coordinate system and a screen coordinate system.
4. The interactive volume cutting method according to claim 3, characterized in that the volume data is added to the world coordinate system, the transformation relations between the coordinate systems are adjusted according to the size of the volume data, and the center point of the volume data is placed in the mid-plane of the viewport clipping body.
5. The interactive volume cutting method according to claim 3, characterized in that one or more three-dimensional planar controls are added to the world coordinate system, and the state of each three-dimensional planar control is initialized according to the size and position of the volume data.
6. The interactive volume cutting method according to claim 1, characterized in that the state of a three-dimensional planar control in the world coordinate system is changed through the following mouse manipulations: dragging a vertex with the left mouse button rotates the control arbitrarily; dragging an edge with the left mouse button stretches the control; dragging the transparent rectangular face with the left mouse button translates the control within its cutting plane; dragging a vertex with the right mouse button translates the control along the normal of its cutting plane; dragging an edge with the right mouse button rotates the control about a symmetry axis of the transparent rectangular face; dragging a vertex with the middle mouse button scales the control within its cutting plane.
CN200810113295.8A 2008-05-28 2008-05-28 Interactive volume cutting method based on three-dimensional plane control Active CN101593357B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810113295.8A CN101593357B (en) 2008-05-28 2008-05-28 Interactive volume cutting method based on three-dimensional plane control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810113295.8A CN101593357B (en) 2008-05-28 2008-05-28 Interactive volume cutting method based on three-dimensional plane control

Publications (2)

Publication Number Publication Date
CN101593357A CN101593357A (en) 2009-12-02
CN101593357B true CN101593357B (en) 2015-06-24

Family

ID=41407998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810113295.8A Active CN101593357B (en) 2008-05-28 2008-05-28 Interactive volume cutting method based on three-dimensional plane control

Country Status (1)

Country Link
CN (1) CN101593357B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622236A (en) * 2012-01-13 2012-08-01 深圳市妙趣工场信息科技有限公司 Geart three-dimensional (3D) game engine
EP2665042A1 (en) * 2012-05-14 2013-11-20 Crytek GmbH Visual processing based on interactive rendering
CN102968791B (en) * 2012-10-26 2016-12-21 深圳市旭东数字医学影像技术有限公司 Exchange method that 3 d medical images figure shows and system thereof
CN103049266A (en) * 2012-12-17 2013-04-17 天津大学 Mouse operation method of Delta 3D (Three-Dimensional) scene navigation
DE102013216858A1 (en) * 2013-08-23 2015-02-26 Siemens Aktiengesellschaft A method for displaying an object imaged in a volume data set on a screen
CN105224288B (en) * 2014-06-27 2018-01-23 北京大学深圳研究生院 Binocular three-dimensional method for rendering graph and related system
CN104731653B (en) * 2015-03-31 2018-09-25 上海盈方微电子有限公司 A kind of Software on Drawing and hardware drafting dynamic switching method of Android display systems
CN104794758B (en) * 2015-04-17 2017-10-03 青岛海信医疗设备股份有限公司 A kind of method of cutting out of 3-D view

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Joe Kniss et al. Multidimensional Transfer Functions for Interactive Volume Rendering. IEEE Transactions on Visualization and Computer Graphics, 2002, 8(3): 270-285. *
田捷 (Tian Jie) et al. Integrated Medical Image Algorithm Platform: Theory and Practice. Tsinghua University Press, 2004, 63-125. *

Also Published As

Publication number Publication date
CN101593357A (en) 2009-12-02

Similar Documents

Publication Publication Date Title
CN101593357B (en) Interactive volume cutting method based on three-dimensional plane control
Wang et al. Generalized displacement maps
Mueller et al. High-quality splatting on rectilinear grids with efficient culling of occluded voxels
Samet et al. Hierarchical data structures and algorithms for computer graphics. II. Applications
CN101982838B (en) 3D virtual set ray tracking method for accelerating back light source irradiation
CN103106685B (en) A kind of abdominal organs three-dimensional visualization method based on GPU
CN100429676C (en) Interactive controlling method for selecting 3-D image body reconstructive partial body
JP2006502508A (en) 3D modeling system
CN102915559A (en) Real-time transparent object GPU (graphic processing unit) parallel generating method based on three-dimensional point cloud
CN101604453A (en) Large-scale data field volume rendering method based on partition strategy
WO2009016511A2 (en) Shape preserving mappings to a surface
Noguera et al. Volume rendering strategies on mobile devices
Yang et al. An efficient rendering method for large vector data on large terrain models
US9401044B1 (en) Method for conformal visualization
CN103345774A (en) Method for building three-dimensional multi-scale vectorization model
Qu et al. Ray tracing height fields
CN108874932A (en) A kind of ocean underwater sound field three-dimensional visualization method based on improved light projecting algorithm
Kaufman et al. A survey of architectures for volume rendering
Dai et al. Volume-rendering-based interactive 3D measurement for quantitative analysis of 3D medical images
CN114140602A (en) Data mixed drawing method and visualization system of three-dimensional nuclear radiation dose field
Zhao et al. High-performance and real-time volume rendering in CUDA
Çalışkan et al. Overview of Computer Graphics and algorithms
Yagel Classification and survey of algorithms for volume viewing
Wang Research on three dimensional visualization technologies
Yagel Volume viewing algorithms: Survey

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant