GB2237714A - Displaying 3-D data

Displaying 3-D data

Info

Publication number
GB2237714A
GB2237714A
Authority
GB
United Kingdom
Prior art keywords
data
image plane
voxel
pixel
volume
Prior art date
Legal status
Withdrawn
Application number
GB9023313A
Other versions
GB9023313D0 (en)
Inventor
Harvey Ellis Cline
Siegwalt Ludke
Charles Lucian Dumoulin
Steven Peter Souza
Current Assignee
General Electric Co
Original Assignee
General Electric Co
Priority date
Filing date
Publication date
Application filed by General Electric Co filed Critical General Electric Co
Publication of GB9023313D0 publication Critical patent/GB9023313D0/en
Publication of GB2237714A publication Critical patent/GB2237714A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

A method for providing a volumetrically-rendered projection image using reverse ray casting, uses the steps of: acquiring, from an object volume of interest 11, a set of data sampled from each volume element (voxel) OV therein responsive to a selected characteristic of that object volume; storing the data for each object voxel in a corresponding data volume element DV; scanning sequentially through each data voxel within the data volume 12 corresponding to the object volume of interest 11; projecting each scanned data voxel to an image plane 14, at a solid angle determined from the solid angle at which the object volume is viewed; storing a value for each image plane pixel 16, responsive to a selected criterion, from the values 18 of all projected data voxel values impingent upon that image plane pixel; and then scaling the dimensions of each image plane pixel responsive to the dimensions of the corresponding object volume element shape, and the involved projection solid angle, to correct for anisotropy (see Fig 2). The method is particularly useful in magnetic resonance imaging systems, e.g. N.M.R.

Description

METHOD AND APPARATUS FOR DISPLAYING 3-D DATA

The present
invention relates to the information display arts and, more particularly, to a method and apparatus for volumetric projection rendering from any angle of viewable three-dimensional (3D) data.
In the medical imaging arts, it is well known to construct synthetic X-ray-like images of the interior of a sample volume (e.g. a portion of a patient) by projecting 3D data into a series of views at different angles. A cine loop of rotating projections can be viewed to enhance the perception of depth. In some cases, improved visualization can be accomplished by segmentation of the viewable volume using a model of opaqueness. However, surface rendering, utilizing methods such as those described and claimed in U.S. Patents 4,710,876 and 4,719,585, merely creates a shaded image resembling a photograph of the object. A volumetrically-rendered image is often preferable to a surface model, for providing images of a volume-of-interest in which vessel morphology can be viewed with detail from a magnetic resonance (MR) examination, comparable to X-ray techniques. It is desirable to provide a maximum pixel projective display, wherein the maximum intensity of each pixel along the line-of-sight is presented, as such projective display is in a form more natural than the surface display, to the angiographer. Unfortunately, the volumetric display requires more processing and has hitherto been too slow for clinical use in an MR scanner apparatus. It is therefore highly desirable to provide a method, and companion apparatus, for providing a rapidly-processible volumetric medical imaging display.
In accordance with one example of the invention, a method for providing a volumetrically-rendered projection image using reverse ray casting comprises the steps of: acquiring, from an object volume of interest, a set of data sampled from each volume element (voxel) therein responsive to a selected characteristic of that object volume; storing the data for each object voxel in a corresponding data volume element; scanning sequentially through each data voxel within the data volume corresponding to the object volume of interest; projecting each scanned data voxel to an image plane, at a spherical angle with parameters (α,β) determined from the spherical angle (θ,φ) at which the object volume is to be viewed; storing a value for each image plane pixel, responsive to a selected criterion, from the values of all projected data voxel values impingent upon that image plane pixel; and then scaling the dimensions of each image plane pixel, responsive to the dimensions of the corresponding object volume shape, and the involved spherical projection angle, to correct for anisotropy. The resulting stored image values can then be displayed if desired.
Illustrative apparatus utilizing this method includes: memory means for storing the object volume (3D) data; means for sequentially addressing all object volume elements within the selected volume of interest, to obtain the data stored for the voxel then addressed; means for modifying the spherical angle parameters (θ,φ) at which the object is to be viewed to obtain new spherical angle parameters (α,β) of a ray which projects the data value in each data voxel onto the projection plane; memory means for storing the data of each projected ray impinging upon each picture element (pixel) of the image plane; and means for determining if the projected data value is to be placed in the image memory for that pixel.
Thus, no recalculation or interpolation of the object volume into the intermediate data space, prior to projection to the image plane, is required, so that greater speed and efficiency result.
In a presently preferred embodiment, the method may use either maximum pixel intensity storage or data intensity averaging.
A better understanding of the present invention will become apparent upon reading of the following illustrative description of the invention, when considered in conjunction with the associated drawings, in which:
Figure 1 is a schematic of the sampled object volume-of-interest, an associated data volume and an image projection plane involved in volumetrically-rendering a reversed ray-cast projection in accordance with the principles of the present invention;

Figure 2 is a pair of geometric 2D configurations corresponding to like views of object and data spaces, and useful in defining necessary scaling constants;

Figure 3 is a schematic block diagram of means for carrying out the method of the present invention, with respect to providing a maximum intensity projection; and

Figure 4 is a schematic block diagram of a portion of the apparatus of Figure 3, reconfigured for providing a sum, or average, intensity projection.
In an image display system, such as for displaying volumetrically-rendered projection images of a sample 10 from any arbitrary viewing angle, e.g. a spherical projection angle denoted by angle parameters (θ,φ), where θ is the angle that an extension 15' of a viewing ray 15 makes upon the X-Y plane, and φ is the angle of ray 15 with respect to extension 15', an object volume 11 is analyzed by at least one desired modality, such as a nuclear magnetic resonance (NMR) scanner and the like. Sample volume 11 is scanned in such a manner as to create a series of stacked, contiguous slices or sheets OS1, OS2, ..., OSk, ..., each of which contains the same number of object volume elements (voxels) OV. Each voxel has a rectangular profile in the sheet plane (say, the X-Y plane); while the complementary sides S may be of equal length, so that this profile may be square, the sheet thickness T is generally greater than the length of either side. Thus, the first object slice OS1 contains a first multiplicity of object voxels OVi,j,1, where i and j are the respective x-axis and y-axis positions of the voxel. Similarly, the second object slice OS2 contains object voxels OVi,j,2. An arbitrary object slice OSk contains voxels OVi,j,k, where k is the z-axis position of that voxel.
Each object voxel OVi,j,k is analyzed and the data value thereof is placed in a corresponding data voxel DVi,j,k of a data volume 12. Data volume 12 is a simple cubic i,j,k lattice, even though the thickness of each object slice OSk and each object voxel face size (the size of the voxel in the x-y plane) will not generally be the same. That is, not only may the object volume have different x, y and z dimensions for each voxel, but also the total number of voxels in any dimension need not be the same. For example, a typical MR 3D scan may provide each slice with a 256x256 matrix of voxels, and may involve 128 slices; each slice thickness T may be on the order of 3 mm, while the sides S of each of the voxels in that slice may be 1 mm, and the like. It should be noted that resampling of the 3D data is not necessary, in converting the value of each voxel of the multi-sliced object volume 11 into the data value stored therefor in the corresponding voxel of data volume 12, even though the object voxels will not have correct dimensions when placed in the data space 12.
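The slice-to-lattice transfer described above can be illustrated with a short sketch. This is a minimal illustration only, not part of the patent disclosure; the array names, the use of NumPy, and the 256x256x128 / 3 mm / 1 mm geometry are assumptions taken from the example figures quoted above.

```python
import numpy as np

# Hypothetical object-volume geometry matching the example in the text:
# 128 slices of 256x256 voxels, slice thickness T = 3 mm, voxel side S = 1 mm.
n_rows, n_cols, n_slices = 256, 256, 128
T_mm, S_mm = 3.0, 1.0
aspect_ratio = T_mm / S_mm          # A = T / S, used later for anisotropy correction

# Simulated acquisition: one 2D intensity array per object slice OSk.
object_slices = [np.random.rand(n_rows, n_cols) for _ in range(n_slices)]

# The data volume DV is a simple cubic i,j,k lattice: each object voxel value is
# copied straight into the corresponding data voxel -- no resampling or
# interpolation -- even though the physical object voxels are not cubes.
data_volume = np.empty((n_rows, n_cols, n_slices), dtype=np.float32)
for k, object_slice in enumerate(object_slices):
    data_volume[:, :, k] = object_slice
```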
In accordance with one aspect of the invention, an image of object 10 is projected upon a projection plane 14 by ray casting toward the image plane 14 from a lattice point in each data voxel DVi,j,k. For convenience, the lattice point may, for example, be the data voxel vertex closest to the data volume origin. The cast ray 17 leaves the data space 12 at a projection angle with spherical angular parameters (α,β) transformed from the spherical angular parameters (θ,φ) at which the object space 11 is viewed.
These two angles are not the same, due to the geometric distortion caused by use of a cubic data volume 12 with a non-cubic object volume 11. However, the projected ray 17 does have a plane extension 17' which makes an angle α with respect to the X axis of the data space, and ray 17 makes an angle β with the Z axis. Thus, angles α and β are determined by a rotation process (to be discussed hereinbelow) to correspond to viewing the object space 11 at the desired viewing angle θ,φ (assuming operation in spherical coordinates). Hitherto, a pixel 16 in the image plane 14 has been cast towards the data volume, causing much computational time to be required to determine which of the data volume lattice point/voxel values are to be processed for a ray 17 cast from that pixel 16; complex algorithms were required to determine how close to the cast ray a lattice point/data volume voxel had to be for inclusion for each pixel 16. In the present method, each of the rays 17 is cast in the opposite direction, from the data volume voxel lattice point toward the image plane.
While all rays 17 impinge upon some portion of the image plane, only those rays falling within the image plane pixel 16a under consideration are allowed to contribute to the data for that image plane pixel. Thus, having chosen a portion of the object volume 11 to view and a viewing angle θ,φ at which to view this selected object volume, the data value in each voxel of the corresponding portion of the data volume is cast at some angle α,β (corresponding to viewing the distorted data space with respect to the object space) toward the image plane. The data value in a first voxel (say, voxel DVi,1,k) is thus back-projected along ray 17a, in accordance with the θ and φ values chosen. This ray 17a impinges upon image plane 14 at a position 18a within pixel 16a, and, as this is the first ray to impinge upon this pixel, the intensity value of the incident data is attributed to (stored in) the desired pixel 16a. The next voxel in the data volume (say voxel DVi,2,k) has its associated ray 17b projected at the same angular (α,β) configuration from the voxel lattice point, and its position 18b upon image plane 14 is noted. Assuming that impingement position 18b is within desired pixel 16a, the second projected value is (for a maximum pixel projection) compared with the now-stored first value and the larger (more intense) value is placed in storage for pixel 16a. It will be understood that, for an averaged-intensity projection, the value of a current projected data voxel is added to the sum already stored for the image plane pixel upon which that projection ray impinges, and the sum is eventually divided by a counted number of such impinging rays, for that pixel. As each voxel in the selected data volume is sequentially entered and projected toward image plane 14, a data volume voxel (say, voxel DVi,3,k) is eventually projected along its associated ray 17p and does not impinge within the desired pixel 16a, so that its intensity data is not compared to the intensity data presently stored for pixel 16a; the maximum intensity data for pixel 16a is now established, for that projection of the data at the particular θ,φ three-dimensional angle of view. However, the ray 17p does, in fact, have an impingement point 18p which falls within another image plane pixel (say, pixel 16b); its intensity is compared to the intensity data stored therein and the larger value is, after the comparison, returned to storage for that pixel. It will be understood that all intensity values are reset to zero when a new projection is to be taken. Thus, each of the image plane pixels is reset at the start of an image projection procedure, and all of the data volume voxels (in the entire space or in the selected portion, as set by the portion of the object volume 11 selected) are individually and sequentially scanned; the intensity data value in each data voxel DV is projected through an associated ray 17 to impinge upon image plane 14 in one pixel 16 thereof, with the value stored in each pixel being compared with the present value of the ray-cast data volume voxel to determine the larger thereof, which larger value is then stored as part of the maximum intensity image. In practice, for a maximum pixel projection, the stored maximum intensity value will be changed only if the newly-cast data voxel value is greater than the data value already stored for the image plane pixel upon which the newly-cast ray impinges.
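The back-projection and maximum-pixel bookkeeping described above can be summarized in a short sketch. This is a schematic reading of the passage rather than the claimed implementation; the function project_voxel stands in for the data-space-to-image-plane rotation (a possible form is sketched after the rotation-matrix passage below), and the function names, default image size and array layout are assumptions of the example.

```python
import numpy as np

def maximum_intensity_projection(data_volume, project_voxel, image_shape=(256, 256)):
    """Reverse ray casting with maximum-pixel storage.

    Every data voxel is visited exactly once; its value is projected along the
    fixed ray direction onto the image plane, and the target pixel keeps the
    larger of its stored value and the newly arriving value.
    """
    # All image-plane pixel values are reset to zero at the start of a projection.
    image = np.zeros(image_shape, dtype=data_volume.dtype)

    ni, nj, nk = data_volume.shape
    for i in range(ni):
        for j in range(nj):
            for k in range(nk):
                # project_voxel returns the integer pixel address (x', y')
                # hit by the ray cast from the lattice point of voxel DV(i,j,k).
                xp, yp = project_voxel(i, j, k)
                if 0 <= xp < image_shape[0] and 0 <= yp < image_shape[1]:
                    value = data_volume[i, j, k]
                    # Replace only if the newly cast value exceeds the stored one.
                    if value > image[xp, yp]:
                        image[xp, yp] = value
    return image
```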
In accordance with another aspect of the present invention, the data projection is scaled and any anisotropy between the object space and the image plane is removed by only a single set of calculations, after back-projection is complete. Referring now to Figure 2, it will be seen that, because object space 11 is a real volume while data space 12 is an abstract concept, it is necessary to determine the amount of distortion of the data projection due to the presentation of the cubic data volume lattice 12 at a different angle γ, in a first plane, than the angle ψ at which an arbitrary viewing direction 19 will be positioned with respect to both the object space 11 and data space 12.
It will be seen that the apparent dimensions of each voxel are going to change as the effective elevation angles ψ and γ change. It will also be seen that, if the aspect ratio A (defined as the ratio of the actual slice thickness T in object volume 11 to the actual pixel size S in the same object volume 11) is not unity (i.e. is greater than unity, as the object voxel is not a cubic voxel, as will be encountered in data space 12), then the angles ψ and γ of elevation will not be the same: the effective elevation angle ψ in the data space will differ from the actual elevation angle γ in the object space. Rotation of the data is in accordance with an object elevation angle obtained by ψ = tan⁻¹((1/A)·tan γ).
Thereafter, the projected data can be scaled to have the correct height in the object space, by multiplication of all projected data heights by the elevation scale factor. The old projected image height H can be corrected with an effective scale factor Es, where Es = sqrt((A·cos γ)² + sin² γ), and the new height H' = H·Es.
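A numerical check of these two relationships, written in terms of the object-space elevation angle γ as in claims 6 and 8, is sketched below; the function names and the 3 mm / 1 mm example geometry are assumptions of the illustration.

```python
import math

def data_space_elevation(gamma, aspect_ratio):
    """psi = arctan((1/A) * tan(gamma)): effective elevation angle in the cubic data space."""
    return math.atan(math.tan(gamma) / aspect_ratio)

def elevation_scale_factor(gamma, aspect_ratio):
    """Es = sqrt((A*cos(gamma))**2 + sin(gamma)**2): corrects projected heights."""
    return math.sqrt((aspect_ratio * math.cos(gamma)) ** 2 + math.sin(gamma) ** 2)

# Example with A = T/S = 3 mm / 1 mm and a 30-degree object-space elevation.
A = 3.0
gamma = math.radians(30.0)
psi = data_space_elevation(gamma, A)       # about 10.9 degrees in the data space
Es = elevation_scale_factor(gamma, A)      # about 2.65
H = 100.0                                  # projected height in image-plane units
H_corrected = H * Es                       # H' = H * Es
```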
Utilizing the above relationship, the rotation of data space angles (α,β) becomes angles (θ,ψ), because the distortion is only along one axis, so that angle θ equals angle α. The elements of the three-by-three rotational matrix [M] can be determined, and, given the two involved rotational angles, these relationships can be used to determine the data space-to-image plane transformations:
X' = M1·X + M2·Y + M3·Z + X0 and Y' = M4·X + M5·Y + M6·Z + Y0, where M1-M6 are the first two rows of the rotational matrix (i.e. M1 = sinθ, M2 = cosθ, M3 = 0, M4 = -cosθ·sinψ, M5 = sinθ·sinψ and M6 = cosψ), X' and Y' are the locations on the image plane of the projected point, and X0 and Y0 are image plane X and Y offsets (respectively referenced to the X and Y lowest value points) at which the selected portion of the image plane begins. After the data is projected onto image plane 14, the image is scaled to correct for the effect of the anisotropic object voxels. It will be seen that factors M1-M6 can be precalculated, at the beginning of a projection (given θ and φ), and these precalculated numbers used for all rotation calculations.
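One possible rendering of the precalculation of M1-M6 and the per-voxel transform is sketched below. It follows the matrix terms as recited in claim 5; the rounding of X',Y' to an integer pixel address, and all function and variable names, are assumptions of the sketch. The returned function can serve as the project_voxel placeholder used in the maximum-intensity sketch above.

```python
import math

def rotation_terms(theta, psi):
    """Precompute M1..M6 once per projection (claim 5 form)."""
    m1 = math.sin(theta)
    m2 = math.cos(theta)
    m3 = 0.0
    m4 = -math.cos(theta) * math.sin(psi)
    m5 = math.sin(theta) * math.sin(psi)
    m6 = math.cos(psi)
    return (m1, m2, m3, m4, m5, m6)

def make_project_voxel(theta, psi, x0=0.0, y0=0.0):
    """Return a project_voxel(x, y, z) -> (x', y') function for the projection sketches."""
    m1, m2, m3, m4, m5, m6 = rotation_terms(theta, psi)

    def project_voxel(x, y, z):
        # X' = M1*X + M2*Y + M3*Z + X0 ; Y' = M4*X + M5*Y + M6*Z + Y0
        xp = m1 * x + m2 * y + m3 * z + x0
        yp = m4 * x + m5 * y + m6 * z + y0
        # Round to the nearest image-plane pixel address (an assumption of this sketch).
        return int(round(xp)), int(round(yp))

    return project_voxel
```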
Referring now to Figure 3, apparatus for use of this method may comprise a subassembly 20 for inclusion in an imaging system. The subassembly includes a 3D data memory means 22 for storing slice data as received at a data input 22a from the modality apparatus sampling the object-to-be-investigated. The data associated with each object voxel is stored at the address of that voxel, responsive to voxel address input information received at a voxel address input 22b from the modality apparatus (e.g. from the modality control data processing unit (CPU) 27, and the like). Once the data memory means is filled (corresponding to the transfer of all required data from object volume 11 to data volume 12), the object volume portion of interest is selected and data establishing its starting corner and extent in the X, Y and Z directions is sent from CPU 27 to an input 25a of an address generator means 25. Means 25 sequentially provides, at an address output 25b, the X,Y,Z address of each voxel within the object volume selected. This sequential succession of voxel X,Y,Z addresses at output 25b is provided to an output-data-address input 22c of the data memory means 22, causing the stored intensity data for that one voxel then addressed to be output from data memory means output 22d. The sequence of voxel X,Y,Z addresses is also provided to a first input 30a of a rotational parameter calculation means 30, which receives angle α,β information via the system computer (CPU 27 or the like) as the calculated matrix element M1-M6 values, to provide at an output 30c the address X',Y' of the image plane pixel corresponding to that object X,Y,Z voxel when viewed at the selected viewing angle θ,φ. The viewing angle θ,φ information is entered into the system, is processed by CPU 27 and the results entered into inputs 35b and 35c of a viewing matrix means 35, to provide matrix elements M1-M6 at its output 35a and thence to rotate means 30. The image plane pixel address X',Y' appears at an address input 40a of a frame buffer acting as an image plane memory means 40. Simultaneously, the intensity data, projected from the data space to the projection plane, appears at the image plane memory means new data input 40b, from 3D data memory means output 22d. This data also appears at the new data input 45a of a data comparator means 45.
Intensity data previously saved in the image plane memory means 40 for that address, at input 40a, appears at an old data output 40c, and thence at an old data input 45b of the comparator means. The old and new data at inputs 45b/45a, respectively, are compared in means 45 and an output 45c thereof is enabled to a selected logic condition (e.g. a high logic level) if the new data at input 45a is of greater amplitude than the old data at input 45b. Output 45c is connected to a substitute-control input 40d of the image plane memory means, to cause the data stored at the address controlled by input 40a to be changed to accept the new data at input 40b, if the substitute-data control input 40d is at the selected logic level. Thus, the stored data is initially reset, as by a signal through a data/control port 40e (from CPU 27), and the data of greatest intensity is stored for each image plane pixel location X',Y' responsive to a comparison indicating that the new data exceeds the value of the previously-stored old data. After all of the selected addresses are sequentially scanned by address generator 25, the data stored in image plane memory means 40 is scaled in CPU 27, and the scaled image plane data can be withdrawn from memory means 40 for display, permanent storage or the like purposes.
In accordance with another aspect of the present invention, maximum pixel projections need not be utilized; Figure 4 illustrates a modified apparatus 20' portion in which the data comparator means 45 is replaced by a data adder means 50. The average intensity of each image plane pixel is found by the following methodology: for each object volume voxel address X,Y,Z, new data is applied to a first input 50a of adder means 50; simultaneously therewith, the rotated address appears as the corresponding image plane address X',Y' at image plane memory means address input 40a. The appearance of new address data is sensed by a trigger means 52, to provide a substitute-data signal at input 40d. This signal occurs with a sufficient delay so that the previously stored data from memory means 40, available at a data output 40c thereof, has been applied to adder means second input 50b, and the sum of the new and stored data is now available at the data input 40b of the image plane memory means. The summed data is stored in means 40, for that pixel of the image plane, until all of the object volume voxels in the volume of interest have been scanned, and have made their contribution to the associated pixels of the image plane. The image plane pixel sum values are now operated upon by the CPU, via data port 40e of the memory means, in accordance with the number of additions for each image plane pixel (which is a number obtained by counting activations of the input 40d for each pixel and is also stored in image plane memory means 40) to derive the average intensity for each image plane pixel, for display, storage and the like.
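The averaging variant of Figure 4 can be expressed in the same style as the maximum-intensity sketch above; again this is only an illustrative reading with invented names, in which a running sum and a count of impinging rays are kept per pixel and divided at the end.

```python
import numpy as np

def average_intensity_projection(data_volume, project_voxel, image_shape=(256, 256)):
    """Sum/average-intensity variant: adder plus per-pixel hit counter (cf. Figure 4)."""
    sums = np.zeros(image_shape, dtype=np.float64)   # running sums (image plane memory)
    hits = np.zeros(image_shape, dtype=np.int64)     # number of rays landing in each pixel

    ni, nj, nk = data_volume.shape
    for i in range(ni):
        for j in range(nj):
            for k in range(nk):
                xp, yp = project_voxel(i, j, k)
                if 0 <= xp < image_shape[0] and 0 <= yp < image_shape[1]:
                    sums[xp, yp] += data_volume[i, j, k]
                    hits[xp, yp] += 1

    # Divide each pixel sum by its hit count; pixels never hit stay at zero.
    return np.divide(sums, hits, out=np.zeros_like(sums), where=hits > 0)
```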
While the present invention has been described with respect to one presently preferred embodiment thereof, many modifications and variations will now become apparent to those skilled in the art.

Claims (21)

CLAIMS:
1. A method for providing data displayable as a volumetrically-rendered projection image of a sample object, comprising the steps of:
(a) acquiring, from each voxel of a volume-of-interest of the object, data responsive to a selected characteristic of that object volume;
(b) transforming each object voxel location to a location of a pixel in a chosen projection plane;
(c) reverse casting a ray from each voxel in a data volume, corresponding to the object volume-of-interest, to the projection plane, at a projection angle corresponding to a projection angle selected for viewing of the object volume;
(d) storing a projection data value for each image plane pixel, responsive to a selected criterion, from the values of all projected data voxels impingent upon a particular image plane pixel; and
(e) then scaling the dimensions of each image plane pixel, responsive to the dimensions of the corresponding object voxel and the involved projection angle, to correct for anisotropy.
2. The method of claim 1, wherein step (b) includes the steps of: selecting a viewing angle at which to view the object volume; transforming the viewing angle to a set of angles at which each data voxel ray is back-projected onto the image plane; and projecting each data voxel location onto the image plane using the projected angles.
3. The method of claim 2, wherein each projected object voxel (X,Y,Z) impinges upon the image plane at an x-axis location X' and a y-axis location Y' given by: X' = M1·X + M2·Y + M3·Z + X0 and Y' = M4·X + M5·Y + M6·Z + Y0, where M1-M6 are terms of the first two rows of a 3x3 rotational matrix describing the relationship between the object volume and the image plane, and X0 and Y0 are offsets from the image plane origin.
4. The method of claim 3, wherein the viewing angle is specified by spherical angle parameters (θ,φ).
5. The method of claim 4, wherein matrix terms M1-M6 are M1 = sinθ, M2 = cosθ, M3 = 0, M4 = -cosθ·sinψ, M5 = sinθ·sinψ and M6 = cosψ, where ψ is an angle relating an elevation of the object volume, with respect to the viewing angle.
6. The method of claim 5, wherein the angle ψ = tan⁻¹((1/A)·tanγ), γ is an angle of elevation in the object volume and A is an aspect ratio of an actual object slice thickness T to an actual object voxel side S.
7. The method of claim 1, wherein step (e) includes the step of multiplying each object voxel height H by an effective scale factor Es to obtain a scaled height H' of an associated pixel in the image plane.
8. The method of claim 7, wherein Es = sqrt((A·cosγ)² + sin²γ), γ is an angle of elevation in the object volume and A is an aspect ratio of an actual object slice thickness T to an actual object voxel side S.
9. The method of claim 1, wherein step (d) includes the step of obtaining maximum intensity data for storage in each image plane pixel.
10. The method of claim 1, wherein step (d) includes the step of obtaining average intensity data for storage in each image plane pixel.
11. The method of claim 1, wherein step (a) includes the step of acquiring data by use of a nuclear magnetic resonance technique.
12. Apparatus for providing data displayable as a volumetrically-rendered projection image of a sample object, comprising:
first means for providing a sequential series of addresses within a selected object volume of interest;
3D data memory means for storing, at each of a multiplicity of addresses each corresponding to one voxel of a plurality of slices of the sample object in an object volume, data characterizing that voxel as taken by a selected modality, and also for providing the data for each addressed voxel responsive to receipt of the address thereof from said first means;
means for rotating a set of input viewing angle parameters and the address then being output from said first means to obtain the address of a corresponding pixel location at which data from the then-addressed object voxel will impinge upon a projection image plane;
image plane memory means for storing, at each of a multiplicity of addresses each corresponding to one pixel of the image plane, image data input thereto; and
means for processing the data from the 3D memory means in accordance with a preselected algorithm, prior to storage of the processed data in the image plane memory means; said processing means providing an indication of displayable data, after all processed data is in storage in said image plane memory means.
13. The apparatus of claim 12, wherein the image plane memory means is reset prior to the providing of a sequence of addresses by the first means.
14. The apparatus of claim 13, wherein said processing means includes means for retaining in the image plane memory means the larger of (a) a value of each datum newly-provided from said 3D memory means and (b) a data value already stored in said image plane memory means for a specific pixel, for any voxel projected to impinge upon that image plane pixel.
15. The apparatus of claim 14, wherein said retaining means includes a data comparator, receiving the data value already stored in the image plane memory means and the data value then being output from the 3D memory means, and causing the image plane memory means to store the larger of the two input data values.
16. The apparatus of claim 15, wherein the image plane memory means replaces the already stored data only if the data value then being output from the 3D memory means is larger than the stored data value.
17. The apparatus of claim 13, wherein said processing means includes means for summing up, and then storing in the image plane memory means, a value of each datum newly-provided from said 3D memory means and a data value already stored in said image plane memory means for a specific pixel, and also for storing the number of summations carried out for each pixel; the summation data stored for each image plane pixel being divided by the number of summations for that pixel, after all addresses in a sequence have been provided by said address-providing means, to obtain an average data value for that pixel.
18. The apparatus of claim 11, further comprising CPU means for scaling the image plane memory means data representing each object voxel height H by an effective scale factor Es to obtain a scaled height H' of an associated pixel in the image plane.
19. The apparatus of claim 18, wherein the effective scale factor Es = sqrt((A·cosγ)² + sin²γ), γ is an angle of elevation in the object volume and A is an aspect ratio of an actual object slice thickness T to an actual object voxel side S.
20. A method substantially as herein described with reference to the accompanying drawings.
21. Apparatus substantially as herein described with reference to the accompanying drawings.
Published 1991 at The Patent Office, State House, 66/71 High Holborn, London WC1R 4TP. Further copies may be obtained from Sales Branch, Unit 6, Nine Mile Point, Cwmfelinfach, Cross Keys, Newport NP1 7HZ. Printed by Multiplex techniques ltd, St Mary Cray, Kent.
GB9023313A 1989-10-30 1990-10-25 Displaying 3-D data Withdrawn GB2237714A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US42911189A 1989-10-30 1989-10-30

Publications (2)

Publication Number Publication Date
GB9023313D0 GB9023313D0 (en) 1990-12-05
GB2237714A true GB2237714A (en) 1991-05-08

Family

ID=23701851

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9023313A Withdrawn GB2237714A (en) 1989-10-30 1990-10-25 Displaying 3-D data

Country Status (4)

Country Link
JP (1) JPH0743776B2 (en)
DE (1) DE4034086A1 (en)
FR (1) FR2653919A1 (en)
GB (1) GB2237714A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2325835A (en) * 1997-05-30 1998-12-02 Hewlett Packard Co Volumetric pre-clipping method in a volumetric rendering system

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0549182A2 (en) * 1991-12-23 1993-06-30 General Electric Company Apparatus and method for displaying surgical cuts in three-dimensional models
EP0549183B1 (en) * 1991-12-23 2003-05-14 General Electric Company System for displaying solid cuts for surfaces of solid models
EP0549185A2 (en) * 1991-12-23 1993-06-30 General Electric Company System for 3D scan conversion of a polygonal model into a point and normal format, displayed utilizing an accelerator circuit
US6102858A (en) * 1998-04-23 2000-08-15 General Electric Company Method and apparatus for three-dimensional ultrasound imaging using contrast agents and harmonic echoes
US6219060B1 (en) * 1998-10-15 2001-04-17 General Electric Company Rendering of surfaces from volumetric data employing both dividing and stretching cubes
CN112068791B (en) * 2020-09-04 2024-01-23 京东方科技集团股份有限公司 Storage method, addressing method and equipment for display data of rotary display equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0204225B1 (en) * 1985-06-05 1994-12-14 General Electric Company System and method for the display of surface structures contained within the interior region of a solid body
US4719585A (en) * 1985-08-28 1988-01-12 General Electric Company Dividing cubes system and method for the display of surface structures contained within the interior region of a solid body
CA1258923A (en) * 1986-04-14 1989-08-29 Robert A. Drebin Methods and apparatus for imaging volume data
US4827413A (en) * 1987-06-16 1989-05-02 Kabushiki Kaisha Toshiba Modified back-to-front three dimensional reconstruction algorithm
DE3903838A1 (en) * 1988-02-09 1989-08-17 Toshiba Kawasaki Kk Method and device for representing three-dimensional images

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2325835A (en) * 1997-05-30 1998-12-02 Hewlett Packard Co Volumetric pre-clipping method in a volumetric rendering system
US6072497A (en) * 1997-05-30 2000-06-06 Hewlett-Packard Company Volumetric pre-clipping method that guarantees minimal number of sample points through a volume
GB2325835B (en) * 1997-05-30 2002-01-09 Hewlett Packard Co Volumetric pre-clipping method that guarantees minimal number of sample points through a volume

Also Published As

Publication number Publication date
JPH0743776B2 (en) 1995-05-15
GB9023313D0 (en) 1990-12-05
FR2653919A1 (en) 1991-05-03
JPH03154179A (en) 1991-07-02
DE4034086A1 (en) 1991-05-02

Similar Documents

Publication Publication Date Title
US5226113A (en) Method and apparatus for volumetric projection rendering using reverse ray casting
US4985834A (en) System and method employing pipelined parallel circuit architecture for displaying surface structures of the interior region of a solid body
US4821210A (en) Fast display of three-dimensional images
Greene et al. Creating raster omnimax images from multiple perspective views using the elliptical weighted average filter
US5079699A (en) Quick three-dimensional display
US5170347A (en) System to reformat images for three-dimensional display using unique spatial encoding and non-planar bisectioning
US5283837A (en) Accurate estimation of surface normals in 3-D data sets
US5175806A (en) Method and apparatus for fast surface detail application to an image
US5954653A (en) Method and apparatus for automatically enhancing contrast in projected ultrasound image
EP0791894A2 (en) System and method for displaying oblique cut planes within the interior region of a solid object
US4987554A (en) Method of converting continuous three-dimensional geometrical representations of polygonal objects into discrete three-dimensional voxel-based representations thereof within a three-dimensional voxel-based system
US5779641A (en) Method and apparatus for three-dimensional ultrasound imaging by projecting filtered pixel data
US6155978A (en) Three-dimensional imaging by projecting morphologically filtered pixel data
CA1315902C (en) Minimization of directed points generated in three-dimensional dividing cubes method
EP0447222A2 (en) Gradient calculation for texture mapping
EP0318176A2 (en) Imaging methods and apparatus
US20010020948A1 (en) Method and apparatus for effective level of detail selection
Jense Voxel-based methods for CAD
GB2237714A (en) Displaying 3-D data
US6542154B1 (en) Architectural extensions to 3D texturing units for accelerated volume rendering
US6115048A (en) Fast method of creating 3D surfaces by `stretching cubes`
EP0549182A2 (en) Apparatus and method for displaying surgical cuts in three-dimensional models
EP0994443B1 (en) Rendering of surfaces from volumetric data employing both dividing and stretching cubes
US5821942A (en) Ray tracing through an ordered array
EP0318291B1 (en) Apparatus and method for generating images from tomographic data

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)