US20030132936A1 - Display of two-dimensional and three-dimensional views during virtual examination - Google Patents


Info

Publication number
US20030132936A1
US20030132936A1 (application US10/301,034)
Authority
US
United States
Prior art keywords
dimensional, path, organ, volume, representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/301,034
Inventor
Kevin Kreeger
Bin Li
Frank Dachille
Jeff Meade
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
VIATRONIX
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US10/301,034
Assigned to VIATRONIX. Assignment of assignors' interest (see document for details). Assignors: DACHILLE, FRANK C., IX; KREEGER, KEVIN; MEADE, JEFF; LI, BIN
Publication of US20030132936A1
Assigned to BOND, WILLIAM, AS COLLATERAL AGENT. Security agreement. Assignor: VIATRONIX, INC.
Legal status: Abandoned

Classifications

    • G06T7/0012 — Biomedical image inspection
    • G06T15/08 — Volume rendering
    • G06T7/33 — Image registration using feature-based methods
    • G06T7/38 — Registration of image sequences
    • G06T7/60 — Analysis of geometric attributes
    • G06T2200/04 — Indexing scheme involving 3D image data
    • G06T2207/10081 — Computed x-ray tomography [CT]
    • G06T2207/10088 — Magnetic resonance imaging [MRI]
    • G06T2207/20044 — Skeletonization; medial axis transform
    • G06T2207/30004 — Biomedical image processing
    • G06T2207/30028 — Colon; small intestine
    • Y10S128/92 — Computer assisted medical diagnostics

Definitions

  • The present disclosure relates to a system and method for performing a volume-based three-dimensional virtual examination. More particularly, the disclosure relates to a virtual examination system and method providing enhanced visualization and navigational properties.
  • Two-dimensional (“2D”) visualization of human organs using medical imaging devices has been widely used for patient diagnosis.
  • Such medical imaging devices include computed tomography (“CT”) and magnetic resonance imaging (“MRI”), for example.
  • Three-dimensional (“3D”) images can be formed by stacking and interpolating between two-dimensional pictures produced from the scanning machines. Imaging an organ and visualizing its volume in three-dimensional space would be beneficial due to the lack of physical intrusion and the ease of data manipulation. However, the exploration of the three-dimensional volume image must be properly performed in order to fully exploit the advantages of virtually viewing an organ from the inside.
  • When viewing the 3D volume virtual image of an environment, a functional model must be used to explore the virtual space.
  • One possible model is a virtual “camera” that can be used as a point of reference for the viewer to explore the virtual space.
  • Camera control in the context of navigation within a general 3D virtual environment has been previously studied.
  • Complete control of a camera in a large domain would be tedious and tiring, and an operator might not view all the important features between the starting and finishing points of the exploration.
  • The second technique of camera control is a planned navigational method, which assigns the camera a predetermined path that cannot be accidentally changed by the operator. This is akin to having an engaged “autopilot”. It allows the operator to concentrate on the virtual space being viewed without having to worry about steering into the walls of the environment being examined. However, this second technique does not give the viewer the flexibility to alter the course or investigate an interesting area viewed along the flight path.
  • Radiologists and other specialists have historically been trained to analyze scan data consisting of two-dimensional slices. However, while stacks of such slices may be useful for analysis, they do not provide an efficient or intuitive means to navigate through a virtual organ, especially one as tortuous and complex as the colon. There remains a need for a virtual examination system providing data in a conventional format for analysis while, in addition, allowing an operator to easily navigate a virtual organ.
  • A preferred embodiment of the present disclosure generates a three-dimensional visualization image of an object such as a human organ using volume visualization techniques and explores the virtual image using a guided navigation system. The guided navigation allows the operator to travel along a predefined flight path and to adjust both the position and viewing angle toward a particular portion of interest in the image, away from the predefined path, in order to identify polyps, cysts or other abnormal features in the organ.
  • An aspect of the present disclosure relates to a method for performing a three-dimensional internal virtual examination of at least one organ.
  • The organ is scanned with a radiological scanning device to produce scan data representative of the organ, which is then used to create a three-dimensional volume representation of the organ that includes volume elements.
  • The scan data includes a sequence of axial images.
  • A defined flight path is generated and guided navigation through the three-dimensional representation is performed.
  • One of the series of axial images is displayed, wherein that image corresponds to the current location along the defined path.
  • Another aspect of the present disclosure relates to an operator interface for a three-dimensional virtual examination system of an object, wherein the virtual examination includes guided navigation along a defined path within a three-dimensional volume representation of the object created by scanning data comprising a sequence of two-dimensional axial images of the object and then generating volume elements of the representation based on those axial images.
  • The operator interface includes a display screen having a plurality of simultaneously visible sub-windows. Within a first of these sub-windows, volume elements responsive to the defined path and to the operator's input during guided navigation are displayed in real time. In a second sub-window, one of the two-dimensional images corresponding to the current location along the defined path is displayed.
  • The operator interface can also be embodied as instructions stored on a computer-readable medium that, upon execution, cause a processor to provide the operator interface.
  • System and method embodiments are provided for generating a three-dimensional visualization image of an object such as an organ using volume visualization techniques and exploring the image using a guided navigation system, which allows the operator to travel along a flight path and to adjust the view toward a particular portion of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ.
  • One or more series of two-dimensional renditions of the organ, correlated to the flight-path location, can be provided to an operator to assist in analyzing the organ. The three-dimensional representation, a display of the flight path, and the two-dimensional slices are displayed to the operator simultaneously.
  • FIG. 1 shows a flow chart of the steps for performing a virtual examination of an object, specifically a colon, in accordance with the disclosure.
  • FIG. 2 shows an illustration of a “submarine” camera model which performs guided navigation in the virtual organ.
  • FIG. 3 shows a diagram illustrating a two-dimensional cross-section of a volumetric colon which contains the flight path.
  • FIG. 4 shows a diagram of a system used to perform a virtual examination of a human organ in accordance with the disclosure.
  • FIG. 5 shows an exemplary representation of a colon and accompanying flight path generated according to an embodiment of the present disclosure.
  • FIG. 6 shows an exemplary display of a two-dimensional slice of scan data according to an embodiment of the present disclosure.
  • FIG. 7 shows the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight path.
  • FIG. 8 shows an exemplary operator interface screen according to embodiments of the present disclosure.
  • FIG. 9 shows a block diagram of a system embodiment based on a personal computer bus architecture.
  • The preferred embodiment to be described is the examination of an organ in the human body, specifically the colon.
  • The colon is long and twisted, which makes it especially suited to virtual examination, saving the patient monetary expense as well as the discomfort and increased hazard of a physical probe.
  • Other organs that can be examined include the lungs, the stomach and portions of the gastrointestinal system, and the heart and blood vessels.
  • A method for performing a virtual examination of an object such as a colon is indicated generally by the reference numeral 100.
  • The method 100 illustrates the steps necessary to perform a virtual colonoscopy using volume visualization techniques.
  • Step 101 prepares the colon for scanning, if such preparation is required by either the doctor or the particular scanning instrument.
  • This preparation could include cleansing the colon with a “cocktail” or liquid, which enters the colon after being orally ingested and passed through the stomach.
  • The cocktail forces the patient to expel waste material present in the colon.
  • One substance used is Golytely.
  • Air or carbon dioxide can be forced into the colon in order to expand it, making the colon easier to scan and examine.
  • Step 101 does not need to be performed in all examinations as indicated by the dashed line in FIG. 1.
  • Step 103 scans the organ that is to be examined.
  • The scanner can be an apparatus well known in the art, such as a spiral CT scanner for scanning a colon, or a Zenith MRI machine for scanning a lung labeled with xenon gas, for example.
  • The scanner must be able to take multiple images from different positions around the body during suspended respiration, in order to produce the data necessary for the volume visualization.
  • Data can be acquired using a GE/CTI spiral-mode scanner operating in a helical mode of 5 mm, 1.5-2.0:1 pitch, reconstructed in 1 mm slices, where the pitch is adjusted based upon the patient's height in a known manner.
  • A routine imaging protocol of 120 kVp and 200-280 mA can be utilized for this operation.
  • The data can be acquired and reconstructed as 1 mm thick slice images having an array size of 512×512 pixels in the field of view, which varies from 34 to 40 cm depending on the patient's size.
  • The number of such slices generally varies under these conditions from 300 to 450, depending on the patient's height.
  • The image data set is converted to volume elements, or voxels.
  • An example of a single CT image would use a 5 mm wide X-ray beam, 1:1 to 2:1 pitch, and a 40 cm field of view, with the scan performed from the top of the splenic flexure of the colon to the rectum.
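As a quick sanity check on the acquisition geometry quoted above, the in-plane pixel size follows directly from the field of view and the 512×512 reconstruction matrix. This is a short illustrative calculation, not part of the patent:

```python
def in_plane_pixel_mm(fov_cm: float, matrix: int = 512) -> float:
    """In-plane pixel size in mm for a square reconstruction matrix."""
    return fov_cm * 10.0 / matrix

# A 34 cm field of view gives ~0.66 mm pixels; 40 cm gives ~0.78 mm.
# With 1 mm reconstructed slices, the voxels are therefore close to
# the "approximately 1 cubic mm" figure cited in the text.
```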
  • Discrete data representations of the object can also be produced by methods other than scanning.
  • Voxel data representing an object can be derived from a geometric model by techniques described in U.S. Pat. No. 5,038,302, entitled “Method of Converting Continuous Three-Dimensional Geometrical Representations into Discrete Three-Dimensional Voxel-Based Representations Within a Three-Dimensional Voxel-Based System” by Kaufman, issued Aug. 8, 1991, filed Jul. 26, 1988, which is hereby incorporated by reference in its entirety. Additionally, data can be produced by a computer model of an image, which can be converted to three-dimensional voxels and explored in accordance with this disclosure.
  • Step 104 converts the scanned images into three-dimensional volume elements (“voxels”).
  • The scan data is reformatted into 5 mm thick slices at increments of 1 mm or 2.5 mm and reconstructed in 1 mm slices, each represented as a matrix of 512 by 512 pixels. This produces voxels of approximately 1 cubic mm. A large number of 2D slices is thus generated, the number depending on the length of the scan, and the set of 2D slices is then reconstructed into 3D voxels.
  • The conversion of 2D images from the scanner into 3D voxels can be performed either by the scanning machine itself or by a separate machine such as a computer implementing techniques that are well known in the art (see, e.g., U.S. Pat. No. 4,985,856, entitled “Method and Apparatus for Storing, Accessing, and Processing Voxel-based Data” by Kaufman et al., issued Jan. 15, 1991, filed Nov. 11, 1988, which is hereby incorporated by reference in its entirety).
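The stack-and-interpolate conversion of 2D slices into a 3D voxel volume can be sketched as follows. This is an illustrative linear-interpolation stand-in, not the patented conversion method; the function name and spacing parameters are assumptions:

```python
import numpy as np

def stack_slices_to_volume(slices, slice_spacing_mm, pixel_spacing_mm):
    """Stack 2D axial slices into a 3D volume and linearly interpolate
    along z so the voxels come out approximately isotropic (a sketch of
    the slice-to-voxel conversion described in the text)."""
    vol = np.stack(slices, axis=0).astype(np.float32)  # (z, y, x)
    # Choose an output slice count so z spacing ~ in-plane pixel spacing.
    n_out = int(round((vol.shape[0] - 1) * slice_spacing_mm / pixel_spacing_mm)) + 1
    z_src = np.linspace(0.0, vol.shape[0] - 1, n_out)
    z0 = np.floor(z_src).astype(int)
    z1 = np.minimum(z0 + 1, vol.shape[0] - 1)
    t = (z_src - z0)[:, None, None]
    # Linear blend between the two neighbouring acquired slices.
    return (1.0 - t) * vol[z0] + t * vol[z1]
```

Real pipelines would also honor the scanner's reported pixel spacing per slice and may use higher-order interpolation.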
  • Step 105 allows the operator to define the portion of the selected organ to be examined.
  • A physician may be interested in a particular section of the colon likely to develop polyps, for example.
  • The physician can view a two-dimensional slice overview map to indicate the section to be examined.
  • A starting point and a finishing point of the path to be viewed can be indicated by the physician/operator.
  • A conventional computer and computer interface (e.g., keyboard, mouse or spaceball) can be used to designate the points.
  • A grid system with coordinates can be used for keyboard entry, or the physician/operator can “click” on the desired points.
  • The entire image of the colon can also be viewed if desired.
  • Step 107 performs the planned or guided navigation operation of the virtual organ being examined.
  • Performing a guided navigation operation is defined as navigating through an environment along a predefined or automatically predetermined flight path, which can be manually adjusted by an operator at any time.
  • The virtual examination is modeled on having a tiny viewpoint or “camera” traveling through the virtual space, with a view direction or “lens” pointing toward the finishing point.
  • The guided navigation technique provides a level of interaction with the camera, so that the camera can navigate through the virtual environment automatically when there is no operator interaction while still allowing the operator to manipulate the camera when necessary.
  • The preferred embodiment for achieving guided navigation is to use a physically based camera model that employs potential fields to control the movement of the camera, as is further detailed with respect to FIG. 2.
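A potential-field camera of the kind described above can be sketched as one attractive term pulling toward the finishing point plus one repulsive term pushing away from nearby walls. This is a minimal illustration; the field shape, gains, and the `wall_distance` callable are assumptions, not taken from the patent:

```python
import numpy as np

def camera_step(pos, goal, wall_distance, step=0.5, k_goal=1.0, k_wall=2.0):
    """One update of a potential-field "submarine" camera: an attractive
    term pulls the camera toward the finishing point, and a repulsive
    term pushes it away from nearby colon walls."""
    d, grad_d = wall_distance(pos)            # distance to wall and its gradient
    to_goal = goal - pos
    attract = to_goal / (np.linalg.norm(to_goal) + 1e-9)
    repel = k_wall * grad_d / (d * d + 1e-9)  # strong only near the wall
    direction = k_goal * attract + repel
    return pos + step * direction / (np.linalg.norm(direction) + 1e-9)
```

With no operator input, repeated calls advance the camera along the path; operator input can be folded in as an extra force term before normalization.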
  • Step 109, which can be performed concurrently with step 107, displays the inside of the organ from the viewpoint of the camera model along the selected pathway of the guided navigation operation.
  • Three-dimensional displays can be generated using techniques well known in the art such as the marching cubes technique, for example.
  • A technique is used that reduces the vast number of data computations necessary for the display of the virtual organ.
  • The method described in FIG. 1 can also be applied to scanning multiple organs in a body at the same time.
  • For example, a patient may be examined for cancerous growths in both the colon and the lungs.
  • The method of FIG. 1 would be modified to scan all the areas of interest in step 103 and to select the current organ to be examined in step 105.
  • The physician/operator may initially select the colon to explore virtually, and later explore the lung.
  • Alternatively, two different doctors with different specialties may virtually explore different scanned organs relating to their respective specialties.
  • The next organ to be examined is then selected, and its portion is defined and explored. This continues until all organs that need examination have been processed.
  • Referring to FIG. 2, a “submarine camera” model that performs guided navigation in a virtual organ is indicated generally by the reference numeral 200.
  • The model 200 depicts a viewpoint-control model that performs the guided navigation technique of step 107.
  • The default navigation is similar to planned navigation, which automatically directs the camera along a flight path from one selected end of the colon to the other.
  • The camera stays at the center of the colon to obtain better views of the colonic surface.
  • The operator of the virtual camera using guided navigation can interactively bring the camera close to a specific region and direct the motion and angle of the camera to study an interesting area in detail, without unwillingly colliding with the walls of the colon.
  • The operator can control the camera with a standard interface device such as a keyboard or mouse, or a nonstandard device such as a spaceball.
  • Six degrees of freedom are required for the camera.
  • The camera must be able to move in the horizontal, vertical, and depth (Z) directions (axes 217), as well as rotate about another three degrees of freedom (axes 219), to allow the camera to move to and scan all sides and angles of the virtual environment.
  • Referring to FIG. 3, a two-dimensional cross-section of a volumetric colon containing a flight path is indicated generally by the reference numeral 300.
  • The cross-section 300 includes the final flight path for the camera model down the center of the colon, indicated by “x”s, and at least one starting location 301 or 303 near one end of the colon.
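A flight path down the center of the colon, like the one marked in FIG. 3, is typically derived from the segmented lumen; the classification codes for this application cite skeletonization and medial-axis methods. The per-slice centroid below is a deliberately simplified illustration of the idea, not the patent's actual path algorithm:

```python
import numpy as np

def centerline_from_masks(lumen_masks):
    """Approximate a centerline flight path as the centroid of the
    segmented colon lumen in each axial slice.  `lumen_masks` is a
    (z, y, x) boolean array; the result is a list of (x, y, z) points."""
    path = []
    for z in range(lumen_masks.shape[0]):
        ys, xs = np.nonzero(lumen_masks[z])
        if xs.size:                      # skip slices with no lumen
            path.append((float(xs.mean()), float(ys.mean()), float(z)))
    return path
```

A centroid path can leave the lumen where the colon doubles back within a slice, which is why production systems use true 3D skeletonization.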
  • Referring to FIG. 4, a system used to perform a virtual examination of a human organ in accordance with the disclosure is indicated generally by the reference numeral 400.
  • The system 400 performs the virtual examination of an object such as a human organ using the techniques described herein.
  • A patient 401 lies on a platform 402 while a scanning device 405 scans the area containing the organ or organs to be examined.
  • The scanning device 405 contains a scanning portion 403, which takes images of the patient, and an electronics portion 406.
  • The electronics portion 406 includes an interface 407, a central processing unit 409, a memory 411 for temporarily storing the scanning data, and a second interface 413 for sending data to a virtual navigation platform or terminal 416.
  • The interfaces 407 and 413 may be included in a single interface component or may be the same component.
  • The components in the portion 406 are connected together with conventional connectors.
  • The data provided from the scanning portion 403 of the device 405 is transferred to unit 409 for processing and is stored in memory 411.
  • The central processing unit 409 converts the scanned 2D data to 3D voxel data and stores the results in another portion of the memory 411.
  • Alternatively, the converted data may be directly sent to the interface unit 413 to be transferred to the virtual navigation terminal 416.
  • The conversion of the 2D data could also take place at the virtual navigation terminal 416 after the data is transmitted from the interface 413.
  • The converted data is transmitted over a carrier 414 to the virtual navigation terminal 416 in order for an operator to perform the virtual examination.
  • The data may also be transported in other conventional ways, such as by storing the data on a storage medium and physically transporting it to the terminal 416, or by using satellite transmission, for example.
  • The scanned data need not be converted to its 3D representation until the visualization rendering engine requires it to be in 3D form. This saves computational steps and memory storage space.
  • The virtual navigation terminal 416 includes a screen 417 for viewing the virtual organ or other scanned image, an electronics portion 415 and an interface control 419 such as a keyboard, mouse or spaceball.
  • The electronics portion 415 includes an interface port 421, a central processing unit 423, optional components 427 for running the terminal, and a memory 425.
  • The components in the terminal 416 are connected together with conventional connectors.
  • The converted voxel data is received at the interface port 421 and stored in the memory 425.
  • The central processing unit 423 then assembles the 3D voxels into a virtual representation and runs the submarine camera model, as described for FIG. 2, to perform the virtual examination.
  • A visibility technique is used to compute only those areas that are visible from the virtual camera and display them on the screen 417.
  • A graphics accelerator can also be used in generating the representations.
  • The operator can use the interface device 419 to indicate which portion of the scanned body is to be explored.
  • The interface device 419 can further be used to control and move the submarine camera as desired, as detailed for FIG. 2.
  • The terminal portion 415 can be, for example, the Cube-4 dedicated system box, generally available from the Department of Computer Science at the State University of New York at Stony Brook.
  • Alternatively, the scanning device 405 and terminal 416 can be part of the same unit.
  • In that case, a single platform would be used to receive the scan image data, convert it to 3D voxels if necessary, and perform the guided navigation.
  • An important feature in system 400 is that the virtual organ can be examined at a later time without the presence of the patient. Additionally, the virtual examination could take place while the patient is being scanned.
  • The scan data can also be sent to multiple terminals, which would allow more than one doctor to view the inside of the organ simultaneously. Thus, a doctor in New York could look at the same portion of a patient's organ at the same time as a doctor in California while discussing the case. Alternatively, the data can be viewed at different times; two or more doctors could each perform their own examination of the same data in a difficult case. Multiple virtual navigation terminals could be used to view the same scan data.
  • An improved electronic colon cleansing technique employs modified bowel preparation operations followed by image segmentation operations, such that fluid and stool remaining in the colon during a computed tomographic (“CT”) or magnetic resonance imaging (“MRI”) scan can be detected and removed from the virtual colonoscopy images.
  • Volume-rendering techniques may be used in connection with virtual colonoscopy procedures to further enhance the fidelity of the resulting image.
  • Methods for volume rendering are well known to those of ordinary skill in the pertinent art.
  • Referring to FIG. 5, an exemplary representation of a colon and accompanying flight path generated according to an embodiment of the present disclosure is indicated generally by the reference numeral 500.
  • The representation 500 depicts a human colon 502 showing a centerline flight path 504. As the operator travels through the virtual organ along this flight path, two-dimensional images of the current position are displayed.
  • Referring to FIG. 6, an exemplary display of a two-dimensional slice of scan data is indicated generally by the reference numeral 600.
  • The slice 600 is shown while advancing along the flight path; the operator interface displays the virtual organ along with the slice for the current “z” coordinate and pans the image of that slice so that the current “x, y” position is at the center of the image.
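The scroll-and-pan behaviour just described amounts to a simple mapping from the camera's flight-path coordinate to a slice index and a pan offset. This is an illustrative sketch; the names and the assumption that z directly indexes the axial slices are ours, not the patent's:

```python
def slice_and_pan(flight_pos, slice_shape=(512, 512)):
    """Map the current flight-path position (x, y, z) to the axial slice
    index to display and the (dx, dy) pan that centers (x, y) in the
    viewport."""
    x, y, z = flight_pos
    slice_index = int(round(z))
    pan_dx = slice_shape[1] // 2 - int(round(x))   # shift so x sits at center
    pan_dy = slice_shape[0] // 2 - int(round(y))   # shift so y sits at center
    return slice_index, (pan_dx, pan_dy)
```

In practice the z coordinate would first be converted from millimeters to a slice index using the reconstruction spacing.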
  • The two-dimensional slices are axial slices, where convention has the z-axis pointing toward the head.
  • Two-dimensional slices oriented in other planes can be generated and viewed as well.
  • For example, the two-dimensional images displayed to the operator can be oriented in the sagittal plane, the coronal plane, or perpendicular to the flight path.
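For a volume stored as a (z, y, x) array, extracting the axial, coronal, and sagittal views mentioned above is a matter of indexing along the appropriate axis. A short sketch assuming the usual radiological axis convention (an arbitrary flight-path-perpendicular plane would instead require resampling the volume):

```python
import numpy as np

def orthogonal_slices(vol, x, y, z):
    """Extract the three orthogonal slices through voxel (x, y, z) of a
    volume indexed as vol[z, y, x]."""
    axial = vol[z, :, :]      # plane perpendicular to the head-foot axis
    coronal = vol[:, y, :]    # front-back plane
    sagittal = vol[:, :, x]   # left-right plane
    return axial, coronal, sagittal
```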
  • Referring to FIG. 7, the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight path is indicated generally by the reference numeral 700.
  • The intersected colon 700 depicts a plane 704 indicating a two-dimensional image perpendicular to the flight path 702 through the colon 703.
  • Referring to FIG. 8, an exemplary operator interface screen is indicated generally by the reference numeral 2100.
  • The screen 2100 includes a number of sub-windows that simultaneously provide an operator with graphical information from a number of different perspectives.
  • The center sub-window 2104 displays the view inside the virtual organ.
  • An arrow or marker 2105 helps orient the operator along the projected flight path.
  • A complete view of this flight path, along with the entire organ, is depicted in sub-window 2102.
  • Operator controls 2108 near the bottom of the screen 2100 are used to control travel through the virtual organ. The rendering of the virtual organ, as well as the control of flight through the organ, have been described earlier and are not repeated here.
  • Each of these windows can include a marker, for example 2115, 2113 and 2111, to help orient the operator along the flight path.
  • In addition, each window 2106, 2114, 2112, and 2110 has a respective control for scrolling through two-dimensional images, such as scroll bar 2115. Accordingly, the operator can traverse the flight path manually, in either direction, using this scroll bar.
  • The screen 2100 is exemplary in nature, and a skilled artisan would recognize many equivalent alternatives within the scope of the present disclosure. For example, not all of the sub-windows 2106, 2114, 2112 and 2110 need to be displayed.
  • Referring to FIG. 9, the system 900 is an alternate hardware embodiment suitable for deployment on a personal computer (“PC”), as illustrated.
  • The system 900 includes a processor 910 that preferably takes the form of a high-speed, multitasking processor such as, for example, a Pentium III processor operating at a clock speed in excess of 400 MHz.
  • The processor 910 is coupled to a conventional bus structure 920 that provides for high-speed parallel data transfer.
  • Also coupled to the bus structure 920 are a main memory 930 , a graphics board 940 , and a volume rendering board 950 .
  • The graphics board 940 is preferably one that can perform texture mapping, such as, for example, a Diamond Viper v770 Ultra board manufactured by Diamond Multimedia Systems.
  • The volume rendering board 950 can take the form of the VolumePro board from Mitsubishi Electric, for example, which is based on U.S. Pat. Nos. 5,760,781 and 5,847,711, which are hereby incorporated by reference in their entirety.
  • A display device 945, such as a conventional SVGA or RGB monitor, is operably coupled to the graphics board 940 for displaying the image data.
  • A scanner interface board 960 is also provided for receiving data from an imaging scanner, such as an MRI or CT scanner, and transmitting such data to the bus structure 920.
  • The scanner interface board 960 may be an application-specific interface product for a selected imaging scanner, or it can take the form of a general-purpose input/output card.
  • The PC-based system 900 will generally include an I/O interface 970 for coupling I/O devices 980, such as a keyboard, digital pointer or mouse, to the processor 910.
  • The I/O interface can be coupled to the processor 910 via the bus 920.
  • Embodiments of the present disclosure provide a user interface displaying both two-dimensional and three-dimensional data. Organs within the body are, by nature, three-dimensional. Conventional medical imaging devices, however, as explained herein, create stacks of two-dimensional images when acquiring scan data. Radiologists and other specialists, therefore, have historically been trained to review and analyze these two-dimensional images. As a result, most doctors are comfortable viewing two-dimensional images even if three-dimensional reconstructions or virtualizations are available.
  • Three-dimensional flight paths are intuitive, efficient tools to virtually travel through volumetric renderings of human organs either automatically or manually.
  • Each point along the flight path is represented by a coordinate (x, y, z).
  • These coordinates are used to automatically scroll and pan the series of two-dimensional images that doctors are used to analyzing.
  • The operator does not have to manually navigate through an organ in two dimensions but, instead, can let the present virtualization system advance along the organ while the operator concentrates on analyzing each two-dimensional image.
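The scroll-and-pan behavior described above can be sketched as follows; the function name, viewport size, and coordinate conventions are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch: map a flight-path point (x, y, z) in volume
# coordinates to the axial slice to display and to the pan offsets that
# center (x, y) in a fixed-size viewport. All names are hypothetical.

def slice_and_pan(path_point, viewport=(256, 256)):
    """Return (slice_index, pan_x, pan_y) for a flight-path point."""
    x, y, z = path_point
    slice_index = int(round(z))               # axial slice for this depth
    pan_x = int(round(x)) - viewport[0] // 2  # shift so x lands mid-viewport
    pan_y = int(round(y)) - viewport[1] // 2  # shift so y lands mid-viewport
    return slice_index, pan_x, pan_y

# As the camera advances along the path, each point drives the 2D view:
path = [(250.0, 260.0, 10.2), (252.0, 258.0, 11.7)]
views = [slice_and_pan(p) for p in path]      # e.g. views[0] == (10, 122, 132)
```

In this sketch the operator never scrolls manually: each new flight-path sample selects the slice and recenters it automatically.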
  • The methods and systems described herein could be applied to virtually examine an animal, fish or inanimate object.
  • The technique could be applied, for example, to detect the contents of sealed objects that cannot be opened.
  • The technique could also be used inside an architectural structure such as a building or cavern, enabling the operator to navigate through the structure.
  • The teachings of the present disclosure are implemented as a combination of hardware and software.
  • The software is preferably implemented as an application program tangibly embodied on a program storage unit.
  • The application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
  • The machine is implemented on a computer platform having hardware such as one or more central processing units (“CPU”), a random access memory (“RAM”), and input/output (“I/O”) interfaces.
  • The computer platform may also include an operating system and microinstruction code.
  • The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU.
  • Various other peripheral units may be connected to the computer platform, such as an additional data storage unit and a printing unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Processing Or Creating Images (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A system and method are provided for generating a three-dimensional visualization image of an object, such as an organ, using volume visualization techniques, and for exploring the image using guided navigation that allows the operator to travel along a flight path and to adjust the view to a particular portion of the image of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ. One or more series of two-dimensional renditions of the organ, correlated to the flight path location, are provided to an operator to assist in analyzing the organ, and the three-dimensional representation, a display of the flight path, and the two-dimensional slices are simultaneously displayed to the operator.

Description

    CROSS-REFERENCE
  • This application claims the benefit of U.S. Provisional Patent Application Serial No. 60/331,712, entitled “New Features For Virtual Colonoscopy” and filed Nov. 21, 2001, which is incorporated herein by reference in its entirety.[0001]
  • BACKGROUND
  • The present disclosure relates to a system and method for performing a volume based three-dimensional virtual examination. More particularly, the disclosure relates to a virtual examination system and method providing enhanced visualization and navigational properties. [0002]
  • Two-dimensional (“2D”) visualization of human organs using medical imaging devices has been widely used for patient diagnosis. Currently available medical imaging devices include computed tomography (“CT”) and magnetic resonance imaging (“MRI”), for example. Three-dimensional (“3D”) images can be formed by stacking and interpolating between two-dimensional pictures produced from the scanning machines. Imaging an organ and visualizing its volume in three-dimensional space would be beneficial due to the lack of physical intrusion and the ease of data manipulation. However, the exploration of the three-dimensional volume image must be properly performed in order to fully exploit the advantages of virtually viewing an organ from the inside. [0003]
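As a minimal illustration of the stacking-and-interpolation idea described above, slices can be stacked into a volume with interpolated slices blended in between; the array sizes and the simple linear blend are assumptions for this sketch, not part of the disclosure.

```python
# Sketch of forming a 3D volume by stacking 2D slices and linearly
# interpolating between adjacent slices to even out slice spacing.
# Real scan slices would be e.g. 512x512 pixels; 4x4 is for brevity.
import numpy as np

def stack_slices(slices, interp_factor=2):
    """Stack 2D slices into a 3D array, inserting (interp_factor - 1)
    linearly interpolated slices between each adjacent pair."""
    volume = [slices[0]]
    for a, b in zip(slices, slices[1:]):
        for k in range(1, interp_factor):
            t = k / interp_factor
            volume.append((1 - t) * a + t * b)  # linear blend of neighbors
        volume.append(b)
    return np.stack(volume)

slices = [np.zeros((4, 4)), np.full((4, 4), 100.0)]
vol = stack_slices(slices)   # shape (3, 4, 4); interpolated middle slice
```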
  • When viewing the 3D volume virtual image of an environment, a functional model must be used to explore the virtual space. One possible model is a virtual “camera” that can be used as a point of reference for the viewer to explore the virtual space. Camera control in the context of navigation within a general 3D virtual environment has been previously studied. There are two conventional types of camera control offered for navigation of virtual space. The first gives the operator full control of the camera, which allows the operator to manipulate the camera in different positions and orientations to achieve the desired view. The operator will in effect pilot the camera. This allows the operator to explore a particular section of interest while ignoring other sections. However, complete control of a camera in a large domain would be tedious and tiring, and an operator might not view all the important features between the start and finishing point of the exploration. [0004]
  • The second technique of camera control is a planned navigational method, which assigns the camera a predetermined path to take and which cannot be accidentally changed by the operator. This is akin to having an engaged “autopilot”. This allows the operator to concentrate on the virtual space being viewed, and not have to worry about steering into walls of the environment being examined. However, this second technique does not give the viewer the flexibility to alter the course or investigate an interesting area viewed along the flight path. [0005]
  • It would be desirable to use a combination of the two navigation techniques described above to realize the advantages of both techniques while minimizing their respective drawbacks. It would be desirable to apply a flexible navigation technique to the examination of human or animal organs that are represented in virtual 3D space in order to perform a non-intrusive, painless and thorough examination. The desired navigational technique would further allow for a complete examination of a virtual organ in 3D space by an operator, allowing flexibility while ensuring a smooth path and complete examination through and around the organ. It would be additionally desirable to be able to display the exploration of the organ in a real-time setting by using a technique that minimizes the computations necessary for viewing the organ. The desired technique should also be equally applicable to exploring any virtual object. [0006]
  • Radiologists and other specialists have historically been trained to analyze scan data consisting of two-dimensional slices. However, while stacks of such slices may be useful for analysis, they do not provide an efficient or intuitive means to navigate through a virtual organ, especially one as tortuous and complex as the colon. There remains a need for a virtual examination system providing data in a conventional format for analysis while, in addition, allowing an operator to easily navigate a virtual organ. [0007]
  • SUMMARY
  • A preferred embodiment of the present disclosure generates a three-dimensional visualization image of an object such as a human organ using volume visualization techniques and explores the virtual image using a guided navigation system, which allows the operator to travel along a predefined flight path and to adjust both the position and viewing angle to a particular portion of interest in the image away from the predefined path in order to identify polyps, cysts or other abnormal features in the organ. [0008]
  • An aspect of the present disclosure relates to a method for performing a three-dimensional internal virtual examination of at least one organ. According to this aspect, the organ is scanned with a radiological scanning device to produce scan data representative of the organ, which is then used to create a three-dimensional volume representation of the organ that includes volume elements. The scan data includes a sequence of axial images. Using the three-dimensional representation, a defined flight path is generated and guided navigation through the three-dimensional representation is performed. Simultaneously with the display of the guided navigation, one of the series of axial images is displayed, wherein the one image corresponds to the current location along the defined path. [0009]
  • Another aspect of the present disclosure relates to an operator interface for a three-dimensional virtual examination system of an object, wherein the virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of the object created from scanning data comprising a sequence of two-dimensional axial images of the object and then generating volume elements of the representation based on these axial images. According to this aspect, the operator interface includes a display screen having a plurality of sub-windows simultaneously visible. Within a first of these sub-windows, volume elements responsive to the defined path and an operator's input during the guided navigation are displayed in real-time. In a second of these sub-windows, one of the two-dimensional images corresponding to a current location along the defined path is displayed. This operator interface can also be embodied as instructions stored on a computer-readable medium that, upon execution, cause a processor to provide the operator interface. [0010]
  • Accordingly, system and method embodiments are provided for generating a three-dimensional visualization image of an object such as an organ using volume visualization techniques and exploring the image using a guided navigation system, which allows the operator to travel along a flight path and to adjust the view to a particular portion of the image of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ. One or more series of two-dimensional renditions of the organ, correlated to the flight path location, can be provided to an operator to assist in analyzing the organ. The three-dimensional representation, a display of the flight path, and the two-dimensional slices are simultaneously displayed to the operator.[0011]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Further objects, features and advantages of the present disclosure will become apparent from the following detailed description taken in conjunction with the accompanying figures showing preferred embodiments of the disclosure, in which: [0012]
  • FIG. 1 shows a flow chart of the steps for performing a virtual examination of an object, specifically a colon, in accordance with the disclosure; [0013]
  • FIG. 2 shows an illustration of a “submarine” camera model which performs guided navigation in the virtual organ; [0014]
  • FIG. 3 shows a diagram illustrating a two dimensional cross-section of a volumetric colon which contains the flight path; [0015]
  • FIG. 4 shows a diagram of a system used to perform a virtual examination of a human organ in accordance with the disclosure; [0016]
  • FIG. 5 shows an exemplary representation of a colon and accompanying flight-path generated according to an embodiment of the present disclosure; [0017]
  • FIG. 6 shows an exemplary display of a two-dimensional slice of scan data according to an embodiment of the present disclosure; [0018]
  • FIG. 7 shows the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight-path; [0019]
  • FIG. 8 shows an exemplary operator interface screen according to embodiments of the present disclosure; and [0020]
  • FIG. 9 shows a block diagram of a system embodiment based on a personal computer bus architecture.[0021]
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • While the methods and systems described herein may be applied to any object to be examined, the preferred embodiment to be described is the examination of an organ in the human body, specifically the colon. The colon is long and twisted, which makes it especially suited for a virtual examination saving the patient monetary expense as well as the discomfort and increased hazard of a physical probe. Other examples of organs that can be examined include the lungs, stomach and portions of the gastrointestinal system, the heart and blood vessels. [0022]
  • As shown in FIG. 1, a method for performing a virtual examination of an object such as a colon is indicated generally by the [0023] reference numeral 100. The method 100 illustrates the steps necessary to perform a virtual colonoscopy using volume visualization techniques. Step 101 prepares the colon to be scanned in order to be viewed for examination if required by either the doctor or the particular scanning instrument. This preparation could include cleansing the colon with a “cocktail” or liquid, which enters the colon after being orally ingested and passed through the stomach. The cocktail forces the patient to expel waste material that is present in the colon. One example of a substance used is Golytely. Additionally, in the case of the colon, air or carbon dioxide can be forced into the colon in order to expand it to make the colon easier to scan and examine. This is accomplished with a small tube placed in the rectum with approximately 1,000 cc of air pumped into the colon to distend the colon. Depending upon the type of scanner used, it may be necessary for the patient to drink a contrast substance such as barium to coat any unexpunged stool in order to distinguish the waste in the colon from the colon walls themselves. Alternatively, the method for virtually examining the colon can remove the virtual waste prior to or during the virtual examination as explained later in this specification. Step 101 does not need to be performed in all examinations as indicated by the dashed line in FIG. 1.
  • [0024] Step 103 scans the organ that is to be examined. The scanner can be an apparatus well known in the art, such as a spiral CT-scanner for scanning a colon or a Zenith MRI machine for scanning a lung labeled with xenon gas, for example. The scanner must be able to take multiple images from different positions around the body during suspended respiration, in order to produce the data necessary for the volume visualization. For example, data can be acquired using a GE/CTI spiral mode scanner operating in a helical mode of 5 mm, 1.5-2.0:1 pitch, reconstructed in 1 mm slices, where the pitch is adjusted based upon the patient's height in a known manner. A routine imaging protocol of 120 kVp and 200-280 mA can be utilized for this operation. The data can be acquired and reconstructed as 1 mm thick slice images having an array size of 512×512 pixels in the field of view, which varies from 34 to 40 cm depending on the patient's size. The number of such slices generally varies under these conditions from 300 to 450, depending on the patient's height. The image data set is converted to volume elements or voxels.
  • An example of a single CT-image would use an X-ray beam of 5 mm width, 1:1 to 2:1 pitch, with a 40 cm field-of-view being performed from the top of the splenic flexure of the colon to the rectum. [0025]
  • Discrete data representations of the object can be produced by other methods besides scanning. Voxel data representing an object can be derived from a geometric model by techniques described in U.S. Pat. No. 5,038,302 entitled “Method of Converting Continuous Three-Dimensional Geometrical Representations into Discrete Three-Dimensional Voxel-Based Representations Within a Three-Dimensional Voxel-Based System” by Kaufman, issued Aug. 8, 1991, filed Jul. 26, 1988, which is hereby incorporated by reference in its entirety. Additionally, data can be produced by a computer model of an image, which can be converted to three-dimensional voxels and explored in accordance with this disclosure. [0026]
  • [0027] Step 104 converts the scanned images into three-dimensional volume elements (“voxels”). In a preferred embodiment for examining a colon, the scan data is reformatted into 5 mm thick slices at increments of 1 mm or 2.5 mm and reconstructed in 1 mm slices, with each slice represented as a matrix of 512 by 512 pixels. By doing this, voxels of approximately 1 cubic mm are created. Thus a large number of 2D slices are generated depending upon the length of the scan. The set of 2D slices is then reconstructed to 3D voxels. The conversion process of 2D images from the scanner into 3D voxels can either be performed by the scanning machine itself or by a separate machine such as a computer implementing techniques that are well known in the art (see, e.g., U.S. Pat. No. 4,985,856 entitled “Method and Apparatus for Storing, Accessing, and Processing Voxel-based Data” by Kaufman et al.; issued Jan. 15, 1991, filed Nov. 11, 1988; which is hereby incorporated by reference in its entirety).
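A rough sketch of this 2D-to-3D conversion follows, assuming a square field of view and uniform slice thickness; the function name and the 360 mm field of view are illustrative assumptions, and real converters account for non-uniform spacing and scanner-specific geometry.

```python
# Hypothetical sketch of step 104: derive the in-plane pixel spacing
# from the field of view, then stack the 512x512 slices into a voxel
# grid of roughly 1 cubic mm voxels.
import numpy as np

def make_voxel_grid(slices, field_of_view_mm=360.0, slice_thickness_mm=1.0):
    """Stack square 2D slices and report voxel dimensions (z, y, x) in mm."""
    matrix = slices[0].shape[0]              # e.g. 512 pixels per row
    pixel_mm = field_of_view_mm / matrix     # in-plane spacing per pixel
    volume = np.stack(slices)                # (z, y, x) voxel array
    return volume, (slice_thickness_mm, pixel_mm, pixel_mm)

slices = [np.zeros((512, 512), dtype=np.int16) for _ in range(3)]
vol, spacing = make_voxel_grid(slices)   # spacing close to 1 mm cubes
```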
  • [0028] Step 105 allows the operator to define the portion of the selected organ to be examined. A physician may be interested in a particular section of the colon likely to develop polyps. The physician can view a two dimensional slice overview map to indicate the section to be examined. A starting point and finishing point of a path to be viewed can be indicated by the physician/operator. A conventional computer and computer interface (e.g., keyboard, mouse or spaceball) can be used to designate the portion of the colon that is to be inspected. A grid system with coordinates can be used for keyboard entry or the physician/operator can “click” on the desired points. The entire image of the colon can also be viewed if desired.
  • [0029] Step 107 performs the planned or guided navigation operation of the virtual organ being examined. Performing a guided navigation operation is defined as navigating through an environment along a predefined or automatically predetermined flight path, which can be manually adjusted by an operator at any time. After the scan data has been converted to 3D voxels, the inside of the organ is traversed from the selected start to the selected finishing point. The virtual examination is modeled on having a tiny viewpoint or “camera” traveling through the virtual space with a view direction or “lens” pointing towards the finishing point. The guided navigation technique provides a level of interaction with the camera, so that the camera can navigate through a virtual environment automatically in the case of no operator interaction, and at the same time, allow the operator to manipulate the camera when necessary. The preferred embodiment of achieving guided navigation is to use a physically based camera model that employs potential fields to control the movement of the camera, as is further detailed with respect to FIG. 2.
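A toy version of a potential-field camera step might look like the following; the force fields and step size here are invented purely for illustration, and the patent's physically based camera model is considerably more elaborate.

```python
# Minimal sketch of a potential-field camera update: an attraction
# toward the path goal is combined with a repulsion (the negative
# potential gradient) that keeps the camera off the organ walls.
import numpy as np

def camera_step(position, potential_grad, attraction, step=1.0):
    """Advance the camera one step along the combined force direction."""
    force = attraction(position) - potential_grad(position)
    norm = np.linalg.norm(force)
    if norm == 0:
        return position                      # balanced forces: stay put
    return position + step * force / norm    # unit-length step

# Toy fields: pull toward +z, repel away from the wall near x = 0.
attract = lambda p: np.array([0.0, 0.0, 1.0])
grad = lambda p: np.array([-1.0, 0.0, 0.0]) if p[0] < 5 else np.zeros(3)
pos = camera_step(np.array([2.0, 0.0, 0.0]), grad, attract)
```

With no operator input, repeated calls drift the camera along the path; operator input would simply add another force term to the sum.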
  • [0030] Step 109, which can be performed concurrently with step 107, displays the inside of the organ from the viewpoint of the camera model along the selected pathway of the guided navigation operation. Three-dimensional displays can be generated using techniques well known in the art such as the marching cubes technique, for example. In order to produce a real time display of the colon, a technique is used that reduces the vast number of data computations necessary for the display of the virtual organ.
  • The method described in FIG. 1 can also be applied to scanning multiple organs in a body at the same time. For example, a patient may be examined for cancerous growths in both the colon and lungs. The method of FIG. 1 would be modified to scan all the areas of interest in [0031] step 103 and to select the current organ to be examined in step 105. For example the physician/operator may initially select the colon to virtually explore and later explore the lung. Alternatively, two different doctors with different specialties may virtually explore different scanned organs relating to their respective specialties. Following step 109, the next organ to be examined is selected and its portion will be defined and explored. This continues until all organs that need examination have been processed.
  • The steps described in conjunction with FIG. 1 can also be applied to the exploration of any object that can be represented by volume elements. For example, an architectural structure or inanimate object can be represented and explored in the same manner. [0032]
  • Turning to FIG. 2, a “submarine camera” model that performs guided navigation in a virtual organ is indicated generally by the reference numeral [0033] 200. The model 200 depicts a viewpoint control model that performs the guided navigation technique of step 107. When there is no operator control during guided navigation, the default navigation is similar to that of planned navigation that automatically directs the camera along a flight path from one selected end of the colon to another. During the planned navigation phase, the camera stays at the center of the colon for obtaining better views of the colonic surface. When an interesting region is encountered, the operator of the virtual camera using guided navigation can interactively bring the camera close to a specific region and direct the motion and angle of the camera to study the interesting area in detail, without unwillingly colliding with the walls of the colon. The operator can control the camera with a standard interface device such as a keyboard, mouse or nonstandard device such as a spaceball. In order to fully operate a camera in a virtual environment, six degrees of freedom for the camera are required. The camera must be able to move in the horizontal, vertical, and depth or Z direction (axes 217), as well as being able to rotate in another three degrees of freedom (axes 219) to allow the camera to move and scan all sides and angles of a virtual environment.
  • Methods for computing a centerline inside the area of interest are well known in the art (see, e.g., U.S. Pat. No. 5,971,767 entitled “SYSTEM AND METHOD FOR PERFORMING A THREE-DIMENSIONAL VIRTUAL EXAMINATION” by Kaufman et al.; issued Oct. 26, 1999 and incorporated by reference herein in its entirety). [0034]
  • Referring to FIG. 3, a two dimensional cross-section of a volumetric colon containing a flight path is indicated generally by the [0035] reference numeral 300. The cross-section 300 includes the final flight path for the camera model down the center of the colon, as indicated by “x”s, and at least one starting location 301 or 303 near one end of the colon.
  • Turning now to FIG. 4, a system used to perform a virtual examination of a human organ in accordance with the disclosure is indicated generally by the [0036] reference numeral 400. The system 400 is for performing the virtual examination of an object such as a human organ using the techniques described herein. A patient 401 lies on a platform 402, while a scanning device 405 scans the area that contains the organ or organs to be examined. The scanning device 405 contains a scanning portion 403 that takes images of the patient and an electronics portion 406. The electronics portion 406 includes an interface 407, a central processing unit 409, a memory 411 for temporarily storing the scanning data, and a second interface 413 for sending data to a virtual navigation platform or terminal 416. The interfaces 407 and 413 may be included in a single interface component or may be the same component. The components in the portion 406 are connected together with conventional connectors.
  • In the [0037] system 400, the data provided from the scanning portion 403 of the device 405 is transferred to unit 409 for processing and is stored in memory 411. The central processing unit 409 converts the scanned 2D data to 3D voxel data and stores the results in another portion of the memory 411. Alternatively, the converted data may be directly sent to the interface unit 413 to be transferred to the virtual navigation terminal 416. The conversion of the 2D data could also take place at the virtual navigation terminal 416 after being transmitted from the interface 413. In the preferred embodiment, the converted data is transmitted over a carrier 414 to the virtual navigation terminal 416 in order for an operator to perform the virtual examination. The data may also be transported in other conventional ways, such as storing the data on a storage medium and physically transporting it to terminal 416 or by using satellite transmissions, for example.
  • The scanned data need not be converted to its 3D representation until the visualization-rendering engine requires it to be in 3D form. This saves computational steps and memory storage space. [0038]
  • The [0039] virtual navigation terminal 416 includes a screen for viewing the virtual organ or other scanned image, an electronics portion 415 and an interface control 419 such as a keyboard, mouse or spaceball. The electronics portion 415 includes an interface port 421, a central processing unit 423, optional components 427 for running the terminal and a memory 425. The components in the terminal 416 are connected together with conventional connectors. The converted voxel data is received in the interface port 421 and stored in the memory 425. The central processing unit 423 then assembles the 3D voxels into a virtual representation and runs the submarine camera model as described for FIG. 2 to perform the virtual examination. As the submarine camera travels through the virtual organ, the visibility technique is used to compute only those areas that are visible from the virtual camera, and display them on the screen 417. A graphics accelerator can also be used in generating the representations. The operator can use the interface device 419 to indicate which portion of the scanned body is desired to be explored. The interface device 419 can further be used to control and move the submarine camera as desired as detailed for FIG. 2. The terminal portion 415 can be, for example, the Cube-4 dedicated system box, generally available from the Department of Computer Science at the State University of New York at Stony Brook.
  • The [0040] scanning device 405 and terminal 416, or parts thereof, can be part of the same unit. A single platform would be used to receive the scan image data, convert it to 3D voxels if necessary and perform the guided navigation.
  • An important feature in [0041] system 400 is that the virtual organ can be examined at a later time without the presence of the patient. Additionally, the virtual examination could take place while the patient is being scanned. The scan data can also be sent to multiple terminals, which would allow more than one doctor to view the inside of the organ simultaneously. Thus a doctor in New York could be looking at the same portion of a patient's organ at the same time as a doctor in California while discussing the case. Alternatively, the data can be viewed at different times. Two or more doctors could perform their own examination of the same data in a difficult case. Multiple virtual navigation terminals could be used to view the same scan data. By reproducing the organ as a virtual organ with a discrete set of data, there are a multitude of benefits in areas such as accuracy, cost and possible data manipulations.
  • Some of the applicable techniques may be further enhanced in virtual colonoscopy applications through the use of a number of additional techniques that are described in U.S. Pat. No. 6,343,936 entitled “SYSTEM AND METHOD FOR PERFORMING A THREE-DIMENSIONAL VIRTUAL EXAMINATION, NAVIGATION AND VISUALIZATION” by Kaufman et al.; issued Feb. 7, 2002, which is incorporated herein by reference in its entirety. These improvements, described briefly below, include improved colon cleansing, volume rendering, additional fly-path determination techniques, and alternative hardware embodiments. [0042]
  • An improved electronic colon cleansing technique employs modified bowel preparation operations followed by image segmentation operations, such that fluid and stool remaining in the colon during a computed tomographic (“CT”) or magnetic resonance imaging (“MRI”) scan can be detected and removed from the virtual colonoscopy images. Through the use of such techniques, conventional physical washing of the colon, and its associated inconvenience and discomfort, is minimized or completely avoided. [0043]
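The segmentation idea behind electronic cleansing can be hinted at with a simple intensity-threshold sketch; the threshold values and function name are hypothetical, and the cleansing technique referenced above involves substantially more sophisticated segmentation than a single threshold pass.

```python
# Hedged sketch of electronic cleansing: contrast-tagged fluid and
# stool occupy a distinctive intensity range, so voxels in that range
# are relabeled as air. The Hounsfield-style values are illustrative.
import numpy as np

AIR, FLUID_LO, FLUID_HI = -1000, 200, 2000   # hypothetical intensity bounds

def remove_tagged_fluid(volume):
    """Return a copy of the volume with tagged-fluid voxels set to air."""
    cleaned = volume.copy()
    mask = (volume >= FLUID_LO) & (volume <= FLUID_HI)  # tagged material
    cleaned[mask] = AIR
    return cleaned

scan = np.array([[-1000, 50], [500, 1500]])  # tiny 2x2 stand-in slice
clean = remove_tagged_fluid(scan)            # tagged voxels become air
```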
  • In addition to image segmentation and texture mapping, volume-rendering techniques may be used in connection with virtual colonoscopy procedures to further enhance the fidelity of the resulting image. Methods for volume rendering are well known to those of ordinary skill in the pertinent art. [0044]
  • Referring to FIG. 5, an exemplary representation of a colon and accompanying flight-path generated according to an embodiment of the present disclosure is indicated generally by the reference numeral [0045] 500. The representation 500 depicts a human colon 502 showing a centerline flight path 504. As the operator travels through the virtual organ along this flight path, two-dimensional images of the current position are displayed.
  • As shown in FIG. 6, an exemplary display of a two-dimensional slice of scan data according to an embodiment of the present disclosure is indicated generally by the reference numeral [0046] 600. The slice 600 is shown while advancing along the flight path, and the operator interface displays the virtual organ along with the slice for the current “z” coordinate and pans the image of that slice so that the current “x, y” position is in the center of the image. Thus, in this arrangement, the two-dimensional slices are axial slices, where convention has the z-axis pointing towards the head. However, once the scan data has been converted into a three-dimensional volume, two-dimensional slices oriented on other planes can be generated and viewed as well. For example, the two-dimensional images displayed to the operator can be oriented on the sagittal plane, the coronal plane, or perpendicular to the flight path.
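Extracting the three orthogonal views from a voxel volume at the current flight-path position can be sketched as follows, assuming a (z, y, x) array layout; the layout and function name are assumptions for this sketch (a slice perpendicular to the flight path would additionally require resampling along an arbitrary plane).

```python
# Sketch: pull the axial, coronal, and sagittal 2D slices through a
# flight-path point from a 3D voxel array laid out as (z, y, x).
import numpy as np

def orthogonal_slices(volume, position):
    """Return (axial, coronal, sagittal) slices through an (x, y, z) point."""
    x, y, z = (int(round(c)) for c in position)
    axial = volume[z, :, :]      # constant-z plane (z points to the head)
    coronal = volume[:, y, :]    # constant-y plane
    sagittal = volume[:, :, x]   # constant-x plane
    return axial, coronal, sagittal

vol = np.arange(2 * 3 * 4).reshape(2, 3, 4)      # tiny stand-in volume
ax, co, sa = orthogonal_slices(vol, (1.0, 2.0, 0.0))
```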
  • Turning to FIG. 7, the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight-path is indicated generally by the [0047] reference numeral 700. The intersected colon 700, for example, depicts a plane 704 indicating a two-dimensional image perpendicular to the flight path 702 through the colon 703.
  • Referring to FIG. 8, an exemplary operator interface screen according to embodiments of the present disclosure is indicated generally by the [0048] reference numeral 2100. The screen 2100 includes a number of sub-windows that simultaneously provide an operator with graphical information from a number of different perspectives. The center sub-window 2104 displays the inside of the virtual organ. An arrow or marker 2105 helps orient the operator along the projected flight path. A complete view of this flight path, along with the entire organ, is depicted in sub-window 2102. Operator controls 2108 are near the bottom of the screen 2100 and are useful to control the travel through the virtual organ. The rendering of the virtual organ, as well as the control of flight through the organ, have been described earlier and are not repeated here. Four other sub-windows, 2106, 2114, 2112 and 2110, are shown that provide two-dimensional images along the perpendicular plane, axial plane, sagittal plane and coronal plane, respectively. The displayed two-dimensional image is based on the current position along the flight path through the virtual organ. Each of these windows can include a marker, for example 2115, 2113 and 2111, to help orient the operator along the flight path.
  • Furthermore, each [0049] window 2106, 2114, 2112, and 2110 has a respective control, such as scroll bar 2115, for scrolling through two-dimensional images. Accordingly, the operator can traverse the flight path manually, in either direction, using this scroll bar.
  • The [0050] screen 2100 is exemplary in nature and a skilled artisan would recognize many equivalent alternatives within the scope of the present disclosure. For example, not all sub-windows 2106, 2114, 2112 and 2110 need to be displayed.
  • Turning to FIG. 9, a system embodiment based on a personal computer bus architecture is indicated generally by the [0051] reference numeral 900. The system 900 includes an alternate hardware embodiment suitable for deployment on a personal computer (“PC”), as illustrated. The system 900 includes a processor 910 that preferably takes the form of a high-speed, multitasking processor, such as, for example, a Pentium III processor operating at a clock speed in excess of 400 MHz. The processor 910 is coupled to a conventional bus structure 920 that provides for high-speed parallel data transfer. Also coupled to the bus structure 920 are a main memory 930, a graphics board 940, and a volume rendering board 950. The graphics board 940 is preferably one that can perform texture mapping, such as, for example, a Diamond Viper v770 Ultra board manufactured by Diamond Multimedia Systems. The volume rendering board 950 can take the form of the VolumePro board from Mitsubishi Electric, for example, which is based on U.S. Pat. Nos. 5,760,781 and 5,847,711, which are hereby incorporated by reference in their entirety. A display device 945, such as a conventional SVGA or RGB monitor, is operably coupled to the graphics board 940 for displaying the image data. A scanner interface board 960 is also provided for receiving data from an imaging scanner, such as an MRI or CT scanner, for example, and transmitting such data to the bus structure 920. The scanner interface board 960 may be an application-specific interface product for a selected imaging scanner or can take the form of a general-purpose input/output card. The PC-based system 900 will generally include an I/O interface 970 for coupling I/O devices 980, such as a keyboard, digital pointer or mouse, and the like, to the processor 910. Alternatively, the I/O interface can be coupled to the processor 910 via the bus 920.
  • Embodiments of the present disclosure provide a user interface displaying both two-dimensional and three-dimensional data. Organs within the body are, by nature, three-dimensional. Conventional medical imaging devices, however, as explained herein, create stacks of two-dimensional images when acquiring scan data. Radiologists and other specialists, therefore, have historically been trained to review and analyze these two-dimensional images. As a result, most doctors are comfortable viewing two-dimensional images even if three-dimensional reconstructions or virtualizations are available. [0052]
  • However, many organs are not simple convex objects but, instead, can be tortuous or have many branches. While a doctor may be comfortable analyzing two-dimensional images, performing navigation through complex organs is very difficult using merely two-dimensional images. Navigating using the two-dimensional images would include manually scrolling through the images to move in the “z” direction (along the major axis of the body) and panning the images to move in the “x” and “y” directions. In this manner an operator can traverse the organ looking for areas of interest. [0053]
  • On the other hand, three-dimensional flight paths, as described herein, are intuitive, efficient tools to virtually travel through volumetric renderings of human organs either automatically or manually. During a flight path tour, each point along the flight path is represented by a coordinate (x, y, z). According to embodiments of the present disclosure, these coordinates are used to automatically scroll and pan the series of two-dimensional images that doctors are used to analyzing. Thus, the operator does not have to manually navigate through an organ in two dimensions but, instead, can let the present virtualization system advance along the organ while the operator concentrates on analyzing each two-dimensional image. [0054]
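The coordinate-driven synchronization described in this paragraph can be sketched as a simple mapping from the current flight-path location to the state of each two-dimensional sub-window. The dictionary layout and window keys below are illustrative assumptions; the patent itself only specifies that axial, sagittal, and coronal views track the current (x, y, z) location.

```python
def synced_views(path, i):
    """Map the current flight-path location path[i] = (x, y, z) to the
    slice index and pan center of each two-dimensional sub-window, so
    the 2-D views scroll and pan automatically as the flight advances."""
    x, y, z = path[i]
    return {
        "axial":    {"slice": z, "pan_center": (x, y)},  # slices stack along z
        "sagittal": {"slice": x, "pan_center": (y, z)},  # slices stack along x
        "coronal":  {"slice": y, "pan_center": (x, z)},  # slices stack along y
    }
```

Advancing `i` along the path, automatically or via a scroll bar, updates all three 2-D views in lockstep with the 3-D rendering.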
  • The foregoing merely illustrates the principles of the disclosure. It will thus be appreciated that those skilled in the art will be able to devise numerous systems, apparatus and methods which, although not explicitly shown or described herein, embody the principles of the disclosure and are thus within the spirit and scope of the disclosure as defined by its claims. [0055]
  • For example, the methods and systems described herein could be applied to virtually examine an animal, fish or inanimate object. Besides the stated uses in the medical field, applications of the technique could be used to detect the contents of sealed objects that cannot be opened. The technique could also be used inside an architectural structure such as a building or cavern and enable the operator to navigate through the structure. [0056]
  • These and other features and advantages of the present disclosure may be readily ascertained by one of ordinary skill in the pertinent art based on the teachings herein. It is to be understood that the teachings of the present disclosure may be implemented in various forms of hardware, software, firmware, special purpose processors, or combinations thereof. [0057]
  • Most preferably, the teachings of the present disclosure are implemented as a combination of hardware and software. Moreover, the software is preferably implemented as an application program tangibly embodied on a program storage unit. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPU”), a random access memory (“RAM”), and input/output (“I/O”) interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. [0058]
  • It is to be further understood that, because some of the constituent system components and methods depicted in the accompanying drawings are preferably implemented in software, the actual connections between the system components or the process function blocks may differ depending upon the manner in which embodiments of the present disclosure are programmed. Given the teachings herein, one of ordinary skill in the pertinent art will be able to contemplate these and similar implementations or configurations of the present invention. [0059]
  • Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the present invention is not limited to those precise embodiments, and that various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present disclosure. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims. [0060]

Claims (38)

What is claimed is:
1. A method for performing a three-dimensional virtual examination of at least one object, the method comprising:
scanning said object with a scanning device and producing scan data representative of said object, said scan data comprising a sequence of two-dimensional images of said object;
creating a three-dimensional volume representation of said object comprising volume elements from said scan data;
selecting a start position from said three-dimensional volume representation;
generating a defined path from said start position and extending within said three-dimensional volume representation;
performing a guided navigation of said three-dimensional representation along said path; and
displaying in real time volume elements responsive to said path during said guided navigation and simultaneously displaying at least one of the sequence of two-dimensional images based on a current location along the defined path.
2. A method as defined in claim 1 wherein displaying in real time is further responsive to an operator's input.
3. A method as defined in claim 1 wherein said start position comprises a sub-voxel position.
4. A method as defined in claim 1, further comprising:
moving along the defined path in response to the current location; and
changing the at least one displayed image to correspond to the current location.
5. A method as defined in claim 1 wherein the object is an organ within a body.
6. A method as defined in claim 5 wherein said sequence of two-dimensional images comprises axial images of said organ.
7. A method as defined in claim 5 wherein the organ is a colon.
8. A method as defined in claim 5 wherein the organ is a lung.
9. A method as defined in claim 5 wherein the organ is a heart.
10. A method as defined in claim 1, further comprising displaying a view of the defined path simultaneously with the display of the volume elements.
11. A method as defined in claim 1 wherein the current location includes x, y and z coordinates and the at least one displayed image corresponds to the z coordinate and is displayed centered around the x and y coordinates.
12. A method as defined in claim 1 wherein the current location includes x, y and z coordinates and the at least one displayed image corresponds to the y coordinate and is displayed centered around the x and z coordinates.
13. A method as defined in claim 1 wherein the current location includes x, y and z coordinates and the at least one displayed image corresponds to the x coordinate and is displayed centered around the y and z coordinates.
14. A method as defined in claim 1, further comprising:
generating from the three-dimensional volume representation a sequence of two-dimensional images along the defined flight path, the images aligned with a second axis; and
displaying a particular image aligned with the second axis corresponding to the current location along the defined path simultaneously with the display of the volume elements.
15. A method as defined in claim 14 wherein the second axis is one of a coronal axis, a sagittal axis, and an axis perpendicular to the defined path.
16. A method for performing a three-dimensional internal virtual examination of at least one organ, the method comprising:
scanning said organ with a scanning device and producing scan data representative of said organ, said scan data comprising a sequence of two-dimensional axial images of said organ;
creating a three-dimensional volume representation of said organ comprising volume elements from said scan data;
selecting a start position from said three-dimensional volume representation;
generating a defined path from said start position and extending within said three-dimensional volume representation;
performing a guided navigation of said three-dimensional representation along said path; and
displaying in real time volume elements responsive to said path during said guided navigation and simultaneously displaying one of the sequence of axial images based on a current location along the defined path.
17. A method as defined in claim 16 wherein displaying in real time is further responsive to an operator's input.
18. A method as defined in claim 16 wherein said start position comprises a sub-voxel position.
19. A method as defined in claim 18, further comprising changing the one displayed image to correspond to the current location in response to the current location moving along the defined path.
20. A method as defined in claim 18 wherein the organ is a colon.
21. A method as defined in claim 18 wherein the organ is a lung.
22. A method as defined in claim 18 wherein the organ is a heart.
23. A method as defined in claim 18, further comprising displaying a view of the defined path simultaneously with the display of the volume elements.
24. A method as defined in claim 18 wherein the current location includes x, y and z coordinates and the one displayed image corresponds to the z coordinate and is displayed centered around the x and y coordinates.
25. A method as defined in claim 18, further comprising:
generating from the three-dimensional volume representation a sequence of two-dimensional images along the defined flight path aligned with a second axis; and
displaying a particular two-dimensional image from the sequence of images aligned with the second axis simultaneously with the display of the volume elements, said particular two-dimensional image corresponding to the current location along the defined path.
26. A method as defined in claim 25 wherein the second axis is one of a coronal axis, a sagittal axis, and an axis perpendicular to the defined path.
27. An operator interface for a three-dimensional virtual examination system of an object wherein said virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of said object created from scanning data comprising a sequence of two-dimensional axial images of said object and then generating volume elements of the representation based on the axial images, the operator interface comprising:
a display screen having a plurality of sub-windows simultaneously visible;
a first of said sub-windows configured to display in real time volume elements responsive to said defined path and to an operator's input during the guided navigation; and
a second of said sub-windows configured to display one of the two-dimensional images corresponding to a current location along the defined path.
28. An interface as defined in claim 27 wherein the object is an organ within a body.
29. An interface as defined in claim 28 wherein the organ is a colon.
30. An interface as defined in claim 28 wherein the organ is a lung.
31. An interface as defined in claim 27, further comprising a third of said sub-windows configured to display the defined path.
32. An interface as defined in claim 27, further comprising a second sequence of two-dimensional images of said object, said second sequence of images generated from the volume elements and oriented along a second axis different from the axial images.
33. An interface as defined in claim 32 wherein the second axis is one of a coronal axis, a sagittal axis, and an axis perpendicular to the defined path.
34. An interface as defined in claim 32, further comprising a third of said sub-windows configured to display a particular image from the second sequence of images based on the current position along the defined path.
35. A computer-readable medium bearing instructions for an operator interface for a three-dimensional virtual examination system of an object wherein said virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of said object created from scanning data comprising a sequence of two-dimensional axial images of said object and then generating volume elements of the representation based on the axial images, said instructions arranged, when executed by one or more processors, to cause the one or more processors to:
provide a display screen having a plurality of sub-windows simultaneously visible;
display in a first of said sub-windows in real time volume elements responsive to said defined path and an operator's input during the guided navigation; and
display in a second of said sub-windows one of the two-dimensional images corresponding to a current location along the defined path.
36. A program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform method steps for performing a three-dimensional virtual examination of at least one object, the method steps comprising:
scanning with a scanning device and producing scan data representative of said object, said scan data comprising a sequence of two-dimensional images of said object;
creating a three-dimensional volume representation of said object comprising volume elements from said scan data;
selecting a start volume element and a finish volume element from said three-dimensional volume representation;
generating a defined path between said start and finish volume elements;
performing a guided navigation of said three-dimensional representation along said path; and
displaying in real time said volume elements responsive to said path and to an operator's input during said guided navigation and simultaneously displaying at least one of the sequence of two-dimensional images based on a current location along the defined path.
37. An apparatus for performing a three-dimensional virtual examination of at least one object, the apparatus comprising:
scanning means for scanning with a scanning device and producing scan data representative of said object, said scan data comprising a sequence of two-dimensional images of said object;
volume-rendering means for creating a three-dimensional volume representation of said object comprising volume elements from said scan data;
selection means for selecting a start volume element and a finish volume element from said three-dimensional volume representation;
flight-path means for generating a defined path between said start and finish volume elements;
navigational means for performing a guided navigation of said three-dimensional representation along said path; and
display means for displaying in real time said volume elements responsive to said path and to an operator's input during said guided navigation and simultaneously displaying at least one of the sequence of two-dimensional images based on a current location along the defined path.
38. An apparatus for performing a three-dimensional virtual examination of at least one object, the apparatus comprising:
a scanning device for receiving a plurality of two-dimensional image slices of at least one object;
a rendering device in signal communication with the scanning device for rendering a three-dimensional volume representation of the plurality of two-dimensional image slices;
a processing device in signal communication with the rendering device for locating a first set of features along a centerline within the rendered three-dimensional volume representation;
an indexing device in signal communication with the processing device for matching at least one feature in the rendered three-dimensional volume representation with a corresponding two-dimensional image slice; and
a display device in signal communication with the indexing device for displaying both of the rendered three-dimensional volume representation and the matched two-dimensional image slice.
US10/301,034 2001-11-21 2002-11-21 Display of two-dimensional and three-dimensional views during virtual examination Abandoned US20030132936A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US33171201P 2001-11-21 2001-11-21
US10/301,034 US20030132936A1 (en) 2001-11-21 2002-11-21 Display of two-dimensional and three-dimensional views during virtual examination

Publications (1)

Publication Number Publication Date
US20030132936A1 true US20030132936A1 (en) 2003-07-17

Family

ID=23295049

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/496,430 Abandoned US20050169507A1 (en) 2001-11-21 2002-11-21 Registration of scanning data acquired from different patient positions
US10/301,034 Abandoned US20030132936A1 (en) 2001-11-21 2002-11-21 Display of two-dimensional and three-dimensional views during virtual examination
US11/273,430 Expired - Fee Related US7372988B2 (en) 2001-11-21 2005-11-14 Registration of scanning data acquired from different patient positions

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/496,430 Abandoned US20050169507A1 (en) 2001-11-21 2002-11-21 Registration of scanning data acquired from different patient positions

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/273,430 Expired - Fee Related US7372988B2 (en) 2001-11-21 2005-11-14 Registration of scanning data acquired from different patient positions

Country Status (5)

Country Link
US (3) US20050169507A1 (en)
EP (1) EP1456805A1 (en)
AU (1) AU2002365560A1 (en)
CA (1) CA2467646A1 (en)
WO (1) WO2003046811A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050048456A1 (en) * 2003-08-14 2005-03-03 Christophe Chefd'hotel Method and apparatus for registration of virtual endoscopic images
US20050116957A1 (en) * 2003-11-03 2005-06-02 Bracco Imaging, S.P.A. Dynamic crop box determination for optimized display of a tube-like structure in endoscopic view ("crop box")
US20060103678A1 (en) * 2004-11-18 2006-05-18 Pascal Cathier Method and system for interactive visualization of locally oriented structures
US20070116332A1 (en) * 2003-11-26 2007-05-24 Viatronix Incorporated Vessel segmentation using vesselness and edgeness
US20090040221A1 (en) * 2003-05-14 2009-02-12 Bernhard Geiger Method and apparatus for fast automatic centerline extraction for virtual endoscopy
US20100215226A1 (en) * 2005-06-22 2010-08-26 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
US20100259542A1 (en) * 2007-11-02 2010-10-14 Koninklijke Philips Electronics N.V. Automatic movie fly-path calculation
US20100283781A1 (en) * 2008-01-04 2010-11-11 Kriveshko Ilya A Navigating among images of an object in 3d space
US20110122068A1 (en) * 2009-11-24 2011-05-26 General Electric Company Virtual colonoscopy navigation methods using a mobile device
US20160350979A1 (en) * 2015-05-28 2016-12-01 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
CN108210066A (en) * 2016-12-22 2018-06-29 韦伯斯特生物官能(以色列)有限公司 Two-dimensional pulmonary vein display
CN108701492A (en) * 2016-03-03 2018-10-23 皇家飞利浦有限公司 Medical image navigation system
US10517690B2 (en) * 2014-10-31 2019-12-31 Scopis Gmbh Instrument guidance system for sinus surgery
US11191423B1 (en) * 2020-07-16 2021-12-07 DOCBOT, Inc. Endoscopic system and methods having real-time medical imaging
US11423318B2 (en) * 2019-07-16 2022-08-23 DOCBOT, Inc. System and methods for aggregating features in video frames to improve accuracy of AI detection algorithms
US11684241B2 (en) 2020-11-02 2023-06-27 Satisfai Health Inc. Autonomous and continuously self-improving learning system
US11694114B2 (en) 2019-07-16 2023-07-04 Satisfai Health Inc. Real-time deployment of machine learning systems
US12062427B2 (en) * 2017-06-30 2024-08-13 Shanghai United Imaging Healthcare Co., Ltd. Method and system for tissue density analysis

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050169507A1 (en) * 2001-11-21 2005-08-04 Kevin Kreeger Registration of scanning data acquired from different patient positions
US20060071932A1 (en) * 2002-11-21 2006-04-06 Koninklijke Philips Electronics N.V. Method and apparatus for visualizing a sequece of volume images
US20050152588A1 (en) * 2003-10-28 2005-07-14 University Of Chicago Method for virtual endoscopic visualization of the colon by shape-scale signatures, centerlining, and computerized detection of masses
US7574032B2 (en) * 2003-10-31 2009-08-11 General Electric Company Method and apparatus for virtual subtraction of stool from registration and shape based analysis of prone and supine scans of the colon
US7274811B2 (en) * 2003-10-31 2007-09-25 Ge Medical Systems Global Technology Company, Llc Method and apparatus for synchronizing corresponding landmarks among a plurality of images
US20050256400A1 (en) * 2003-12-03 2005-11-17 Bhargav Raman Method to identify arterial and venous vessels
US20060041181A1 (en) * 2004-06-04 2006-02-23 Viswanathan Raju R User interface for remote control of medical devices
ATE484811T1 (en) * 2004-06-23 2010-10-15 Koninkl Philips Electronics Nv VIRTUAL ENDOSCOPY
US20060034513A1 (en) * 2004-07-23 2006-02-16 Siemens Medical Solutions Usa, Inc. View assistance in three-dimensional ultrasound imaging
BRPI0419168B1 (en) * 2004-09-24 2017-05-16 Nokia Corp electronic device comprising detecting a user's input during an idle operating mode
US20090226065A1 (en) * 2004-10-09 2009-09-10 Dongqing Chen Sampling medical images for virtual histology
WO2006095309A2 (en) * 2005-03-07 2006-09-14 Koninklijke Philips Electronics N.V. Apparatus and method for correlating first and second 3d images of tubular object
US8144987B2 (en) * 2005-04-13 2012-03-27 Koninklijke Philips Electronics N.V. Method, a system and a computer program for segmenting a surface in a multi-dimensional dataset
US20080219531A1 (en) * 2005-08-01 2008-09-11 Koninklijke Philips Electronics, N.V. Method and Apparatus For Metching First and Second Image Data of an Object
US7379062B2 (en) * 2005-08-01 2008-05-27 Barco Nv Method for determining a path along a biological object with a lumen
WO2007023423A2 (en) * 2005-08-24 2007-03-01 Koninklijke Philips Electronics N.V. Apparatus and method for labeling anatomical image data
WO2007023450A2 (en) * 2005-08-24 2007-03-01 Koninklijke Philips Electronics N.V. Apparatus and method for identifying sections of an anatomical object
US20080285822A1 (en) 2005-11-09 2008-11-20 Koninklijke Philips Electronics N. V. Automated Stool Removal Method For Medical Imaging
US20070109299A1 (en) * 2005-11-15 2007-05-17 Vital Images, Inc. Surface-based characteristic path generation
US7570986B2 (en) * 2006-05-17 2009-08-04 The United States Of America As Represented By The Secretary Of Health And Human Services Teniae coli guided navigation and registration for virtual colonoscopy
US8023703B2 (en) * 2006-07-06 2011-09-20 The United States of America as represented by the Secretary of the Department of Health and Human Services, National Institues of Health Hybrid segmentation of anatomical structure
US7853058B2 (en) 2006-11-22 2010-12-14 Toshiba Medical Visualization Systems Europe, Limited Determining a viewpoint for navigating a virtual camera through a biological object with a lumen
EP2229642A1 (en) * 2007-12-07 2010-09-22 Koninklijke Philips Electronics N.V. Navigation guide
US8144964B1 (en) * 2008-05-30 2012-03-27 Ellis Amalgamated LLC Image feature analysis
EP2409280A1 (en) * 2009-03-20 2012-01-25 Koninklijke Philips Electronics N.V. Visualizing a view of a scene
US8213700B2 (en) * 2009-03-31 2012-07-03 Icad, Inc. Systems and methods for identifying suspicious anomalies using information from a plurality of images of an anatomical colon under study
DE102009035441B4 (en) * 2009-07-31 2016-11-24 Siemens Healthcare Gmbh Method and image processing system for generating a volume view image from inside a body
JP5551955B2 (en) * 2010-03-31 2014-07-16 富士フイルム株式会社 Projection image generation apparatus, method, and program
US10679365B1 (en) * 2010-11-24 2020-06-09 Fonar Corporation Method of correlating a slice profile
US8379955B2 (en) 2010-11-27 2013-02-19 Intrinsic Medical Imaging, LLC Visualizing a 3D volume dataset of an image at any position or orientation from within or outside
US10373375B2 (en) * 2011-04-08 2019-08-06 Koninklijke Philips N.V. Image processing system and method using device rotation
US9060672B2 (en) * 2013-02-11 2015-06-23 Definiens Ag Coregistering images of needle biopsies using multiple weighted landmarks
DE102014203113B4 (en) * 2014-02-20 2019-01-10 Siemens Healthcare Gmbh Generation of image data of an examination object by means of a magnetic resonance tomograph
JP6560745B2 (en) * 2014-09-24 2019-08-14 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Visualizing volumetric images of anatomy
WO2016168328A1 (en) * 2015-04-13 2016-10-20 Accumetra, Llc Automated scan quality monitoring system

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5839440A (en) * 1994-06-17 1998-11-24 Siemens Corporate Research, Inc. Three-dimensional image registration method for spiral CT angiography
US5920319A (en) * 1994-10-27 1999-07-06 Wake Forest University Automatic analysis in virtual endoscopy
US6161211A (en) * 1996-10-28 2000-12-12 Altera Corporation Method and apparatus for automated circuit design
US6181320B1 (en) * 1998-08-19 2001-01-30 International Business Machines Corporation Method for converting timing diagram into timing graph and vice versa
US6331116B1 (en) * 1996-09-16 2001-12-18 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual segmentation and examination
US20020015517A1 (en) * 2000-03-29 2002-02-07 Hwang Scott N. Subvoxel processing: a method for reducing partial volume blurring
US6448970B1 (en) * 1997-07-25 2002-09-10 Namco Ltd. Image generation apparatus for causing movement of moving body based on flow data for a fluid set on a course, and information storage medium
US20030038798A1 (en) * 2001-02-28 2003-02-27 Paul Besl Method and system for processing, compressing, streaming, and interactive rendering of 3D color image data
US20030052875A1 (en) * 2001-01-05 2003-03-20 Salomie Ioan Alexandru System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
US6597359B1 (en) * 2000-05-17 2003-07-22 Raychip, Inc. Hierarchical space subdivision hardware for ray tracing
US20040109603A1 (en) * 2000-10-02 2004-06-10 Ingmar Bitter Centerline and tree branch skeleton determination for virtual objects
US6928314B1 (en) * 1998-01-23 2005-08-09 Mayo Foundation For Medical Education And Research System for two-dimensional and three-dimensional imaging of tubular structures in the human body

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07120621B2 (en) * 1989-05-08 1995-12-20 キヤノン株式会社 Alignment method
WO1997041532A1 (en) * 1996-04-29 1997-11-06 The Government Of The United States Of America, Represented By The Secretary, Department Of Health And Human Services Iterative image registration process using closest corresponding voxels
US5971767A (en) * 1996-09-16 1999-10-26 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination
US20050169507A1 (en) * 2001-11-21 2005-08-04 Kevin Kreeger Registration of scanning data acquired from different patient positions

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5839440A (en) * 1994-06-17 1998-11-24 Siemens Corporate Research, Inc. Three-dimensional image registration method for spiral CT angiography
US5920319A (en) * 1994-10-27 1999-07-06 Wake Forest University Automatic analysis in virtual endoscopy
US6331116B1 (en) * 1996-09-16 2001-12-18 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual segmentation and examination
US6161211A (en) * 1996-10-28 2000-12-12 Altera Corporation Method and apparatus for automated circuit design
US6448970B1 (en) * 1997-07-25 2002-09-10 Namco Ltd. Image generation apparatus for causing movement of moving body based on flow data for a fluid set on a course, and information storage medium
US6928314B1 (en) * 1998-01-23 2005-08-09 Mayo Foundation For Medical Education And Research System for two-dimensional and three-dimensional imaging of tubular structures in the human body
US6181320B1 (en) * 1998-08-19 2001-01-30 International Business Machines Corporation Method for converting timing diagram into timing graph and vice versa
US20020015517A1 (en) * 2000-03-29 2002-02-07 Hwang Scott N. Subvoxel processing: a method for reducing partial volume blurring
US6597359B1 (en) * 2000-05-17 2003-07-22 Raychip, Inc. Hierarchical space subdivision hardware for ray tracing
US20040109603A1 (en) * 2000-10-02 2004-06-10 Ingmar Bitter Centerline and tree branch skeleton determination for virtual objects
US20030052875A1 (en) * 2001-01-05 2003-03-20 Salomie Ioan Alexandru System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
US20030038798A1 (en) * 2001-02-28 2003-02-27 Paul Besl Method and system for processing, compressing, streaming, and interactive rendering of 3D color image data

Cited By (29)

Publication number Priority date Publication date Assignee Title
US20090040221A1 (en) * 2003-05-14 2009-02-12 Bernhard Geiger Method and apparatus for fast automatic centerline extraction for virtual endoscopy
US8059877B2 (en) * 2003-05-14 2011-11-15 Siemens Corporation Method and apparatus for fast automatic centerline extraction for virtual endoscopy
US7300398B2 (en) * 2003-08-14 2007-11-27 Siemens Medical Solutions Usa, Inc. Method and apparatus for registration of virtual endoscopic images
US20050048456A1 (en) * 2003-08-14 2005-03-03 Christophe Chefd'hotel Method and apparatus for registration of virtual endoscopic images
US20050116957A1 (en) * 2003-11-03 2005-06-02 Bracco Imaging, S.P.A. Dynamic crop box determination for optimized display of a tube-like structure in endoscopic view ("crop box")
US20070116332A1 (en) * 2003-11-26 2007-05-24 Viatronix Incorporated Vessel segmentation using vesselness and edgeness
US20060103678A1 (en) * 2004-11-18 2006-05-18 Pascal Cathier Method and system for interactive visualization of locally oriented structures
US8600125B2 (en) * 2005-06-22 2013-12-03 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
US20100215226A1 (en) * 2005-06-22 2010-08-26 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
US20100259542A1 (en) * 2007-11-02 2010-10-14 Koninklijke Philips Electronics N.V. Automatic movie fly-path calculation
US10217282B2 (en) 2007-11-02 2019-02-26 Koninklijke Philips N.V. Automatic movie fly-path calculation
US20100283781A1 (en) * 2008-01-04 2010-11-11 Kriveshko Ilya A Navigating among images of an object in 3d space
US11163976B2 (en) 2008-01-04 2021-11-02 Midmark Corporation Navigating among images of an object in 3D space
US9937022B2 (en) * 2008-01-04 2018-04-10 3M Innovative Properties Company Navigating among images of an object in 3D space
US10503962B2 (en) * 2008-01-04 2019-12-10 Midmark Corporation Navigating among images of an object in 3D space
US20180196995A1 (en) * 2008-01-04 2018-07-12 3M Innovative Properties Company Navigating among images of an object in 3d space
US8692774B2 (en) * 2009-11-24 2014-04-08 General Electric Company Virtual colonoscopy navigation methods using a mobile device
US20110122068A1 (en) * 2009-11-24 2011-05-26 General Electric Company Virtual colonoscopy navigation methods using a mobile device
US10517690B2 (en) * 2014-10-31 2019-12-31 Scopis Gmbh Instrument guidance system for sinus surgery
US11324566B2 (en) 2014-10-31 2022-05-10 Stryker European Operations Limited Instrument guidance system for sinus surgery
US9892506B2 (en) * 2015-05-28 2018-02-13 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
US20160350979A1 (en) * 2015-05-28 2016-12-01 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
CN108701492A (en) * 2016-03-03 2018-10-23 皇家飞利浦有限公司 Medical image navigation system
CN108210066A (en) * 2016-12-22 2018-06-29 韦伯斯特生物官能(以色列)有限公司 Two-dimensional pulmonary vein display
US12062427B2 (en) * 2017-06-30 2024-08-13 Shanghai United Imaging Healthcare Co., Ltd. Method and system for tissue density analysis
US11423318B2 (en) * 2019-07-16 2022-08-23 DOCBOT, Inc. System and methods for aggregating features in video frames to improve accuracy of AI detection algorithms
US11694114B2 (en) 2019-07-16 2023-07-04 Satisfai Health Inc. Real-time deployment of machine learning systems
US11191423B1 (en) * 2020-07-16 2021-12-07 DOCBOT, Inc. Endoscopic system and methods having real-time medical imaging
US11684241B2 (en) 2020-11-02 2023-06-27 Satisfai Health Inc. Autonomous and continuously self-improving learning system

Also Published As

Publication number Publication date
WO2003046811A1 (en) 2003-06-05
US7372988B2 (en) 2008-05-13
US20060062450A1 (en) 2006-03-23
AU2002365560A1 (en) 2003-06-10
US20050169507A1 (en) 2005-08-04
CA2467646A1 (en) 2003-06-05
EP1456805A1 (en) 2004-09-15

Similar Documents

Publication Publication Date Title
US20030132936A1 (en) Display of two-dimensional and three-dimensional views during virtual examination
US7012603B2 (en) Motion artifact detection and correction
EP1012812B1 (en) System and method for performing a three-dimensional virtual examination
US6343936B1 (en) System and method for performing a three-dimensional virtual examination, navigation and visualization
US7148887B2 (en) System and method for performing a three-dimensional virtual segmentation and examination with optical texture mapping
US7194117B2 (en) System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US7477768B2 (en) System and method for performing a three-dimensional virtual examination of objects, such as internal organs
WO2006042077A2 (en) Sampling medical images for virtual histology
US20050197558A1 (en) System and method for performing a virtual endoscopy in a branching structure
IL178768A (en) System and method for mapping optical texture properties from at least one optical image to an acquired monochrome data set
MXPA01009387A (en) System and method for performing a three-dimensional virtual examination, navigation and visualization

Legal Events

Date Code Title Description
AS Assignment

Owner name: VIATRONIX, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KREEGER, KEVIN;LI, BIN;DACHILLE, FRANK C. IX;AND OTHERS;REEL/FRAME:013858/0007;SIGNING DATES FROM 20030305 TO 20030312

AS Assignment

Owner name: BOND, WILLIAM, AS COLLATERAL AGENT, FLORIDA

Free format text: SECURITY AGREEMENT;ASSIGNOR:VIATRONIX, INC.;REEL/FRAME:018515/0169

Effective date: 20060721

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION