US6331116B1 - System and method for performing a three-dimensional virtual segmentation and examination - Google Patents

System and method for performing a three-dimensional virtual segmentation and examination

Info

Publication number
US6331116B1
Authority
US
United States
Prior art keywords
colon
volume elements
image data
virtual
volume
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US09/343,012
Other languages
English (en)
Inventor
Arie E. Kaufman
Zhengrong Liang
Mark R. Wax
Ming Wan
Dongquing Chen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Foundation of State University of New York
Original Assignee
Research Foundation of State University of New York
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/714,697 external-priority patent/US5971767A/en
Application filed by Research Foundation of State University of New York filed Critical Research Foundation of State University of New York
Priority to US09/343,012 priority Critical patent/US6331116B1/en
Assigned to RESEARCH FOUNDATION OF STATE UNIVERSITY OF NEW YORK, THE reassignment RESEARCH FOUNDATION OF STATE UNIVERSITY OF NEW YORK, THE ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHEN, DONGQING, KAUFMAN, ARIE E., LIANG, ZHENGRONG, WAN, MING, WAX, MARK R.
Priority to US09/493,559 priority patent/US6343936B1/en
Priority to KR1020017011900A priority patent/KR100701234B1/ko
Priority to PCT/US2000/007351 priority patent/WO2000055812A1/en
Priority to CA002368058A priority patent/CA2368058A1/en
Priority to KR1020067021647A priority patent/KR100790536B1/ko
Priority to CNB008076375A priority patent/CN1277241C/zh
Priority to EP00918154.6A priority patent/EP1173830B1/en
Priority to JP2000605971A priority patent/JP4435430B2/ja
Priority to PCT/US2000/007352 priority patent/WO2000055814A2/en
Priority to EP00918153A priority patent/EP1161741A1/en
Priority to KR1020017011901A priority patent/KR100701235B1/ko
Priority to BR0009098-0A priority patent/BR0009098A/pt
Priority to KR1020067021648A priority patent/KR20060116872A/ko
Priority to IL14551500A priority patent/IL145515A0/xx
Priority to CNB008076383A priority patent/CN1248167C/zh
Priority to JP2000605969A priority patent/JP2002538915A/ja
Priority to AU39018/00A priority patent/AU3901800A/en
Priority to AU39017/00A priority patent/AU3901700A/en
Priority to IL14551600A priority patent/IL145516A0/xx
Priority to CA2368390A priority patent/CA2368390C/en
Priority to BR0009099-9A priority patent/BR0009099A/pt
Priority to US09/777,120 priority patent/US7194117B2/en
Priority to IS6078A priority patent/IS6078A/is
Priority to IS6079A priority patent/IS6079A/is
Priority to IL145516A priority patent/IL145516A/en
Priority to US09/974,548 priority patent/US7148887B2/en
Priority to US09/974,569 priority patent/US6514082B2/en
Publication of US6331116B1 publication Critical patent/US6331116B1/en
Application granted granted Critical
Priority to IL178769A priority patent/IL178769A/en
Priority to IL178768A priority patent/IL178768A/en
Priority to US11/613,297 priority patent/US7486811B2/en
Priority to US11/613,306 priority patent/US7477768B2/en
Priority to US11/613,283 priority patent/US7474776B2/en
Assigned to NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES (DHHS), U.S. GOVERNMENT reassignment NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES (DHHS), U.S. GOVERNMENT EXECUTIVE ORDER 9424, CONFIRMATORY LICENSE Assignors: STATE UNIVERSITY NEW YORK STONY BROOK
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 15/08 Volume rendering
    • G06T 15/04 Texture mapping
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/11 Region-based segmentation
    • G06V 10/147 Details of sensors, e.g. sensor lenses
    • G06V 10/28 Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G06T 2207/20156 Automatic seed setting
    • G06T 2207/30028 Colon; Small intestine
    • G06T 2207/30032 Colon polyp
    • G06T 2210/28 Force feedback
    • G06T 2210/41 Medical
    • G06T 2219/2021 Shape modification
    • G06V 2201/03 Recognition of patterns in medical or anatomical images

Definitions

  • the present invention relates to a system and method for performing a volume based three-dimensional virtual examination using planned and guided navigation techniques.
  • One such application is performing a virtual endoscopy.
  • Colon cancer continues to be a major cause of death throughout the world. Early detection of cancerous growths, which in the human colon initially manifest themselves as polyps, can greatly improve a patient's chance of recovery.
  • the first method is a colonoscopy procedure, which uses a flexible fiber-optic tube called a colonoscope to visually examine the colon by way of physical rectal entry with the scope. The doctor can manipulate the tube to search for any abnormal growths in the colon.
  • the colonoscopy, although reliable, is relatively costly in both money and time, and is an invasive, uncomfortable and painful procedure for the patient.
  • the second detection technique is the use of a barium enema and two-dimensional X-ray imaging of the colon.
  • the barium enema is used to coat the colon with barium, and a two-dimensional X-ray image is taken to capture an image of the colon.
  • barium enemas may not always provide a view of the entire colon, require extensive pretreatment and patient manipulation, are often operator-dependent, expose the patient to excessive radiation and can be less sensitive than a colonoscopy. Due to deficiencies in the conventional practices described above, a more reliable, less intrusive and less expensive way to check the colon for polyps is desirable. A method to examine other human organs, such as the lungs, for masses in a reliable, cost effective way and with less patient discomfort is also desirable.
  • Three-dimensional images can be formed by stacking and interpolating between two-dimensional pictures produced from the scanning machines. Imaging an organ and visualizing its volume in three-dimensional space would be beneficial due to its lack of physical intrusion and the ease of data manipulation. However, the exploration of the three-dimensional volume image must be properly performed in order to fully exploit the advantages of virtually viewing an organ from the inside.
  • When viewing the three dimensional ("3D") volume virtual image of an environment, a functional model must be used to explore the virtual space.
  • One possible model is a virtual camera which can be used as a point of reference for the viewer to explore the virtual space.
  • Camera control in the context of navigation within a general 3D virtual environment has been previously studied.
  • complete control of a camera in a large domain would be tedious and tiring, and an operator might not view all the important features between the start and finishing point of the exploration.
  • the camera could also easily get “lost” in remote areas or be “crashed” into one of the walls by an inattentive operator or by numerous unexpected obstacles.
  • the second technique of camera control is a planned navigation method, which assigns the camera a predetermined path to take and which cannot be changed by the operator. This is akin to having an engaged “autopilot”. This allows the operator to concentrate on the virtual space being viewed, and not have to worry about steering into walls of the environment being examined. However, this second technique does not give the viewer the flexibility to alter the course or investigate an interesting area viewed along the flight path.
  • the invention generates a three-dimensional visualization image of an object such as a human organ using volume visualization techniques and explores the virtual image using a guided navigation system which allows the operator to travel along a predefined flight path and to adjust both the position and viewing angle to a particular portion of interest in the image away from the predefined path in order to identify polyps, cysts or other abnormal features in the organ.
  • the inventive technique for three-dimensional virtual examination of an object includes producing a discrete representation of the object in volume elements, defining the portion of the object which is to be examined, performing a navigation operation in the virtual object and displaying the virtual object in real time during the navigation.
  • the inventive technique for a three-dimensional virtual examination as applied to an organ of a patient includes preparing the organ for scanning, if necessary, scanning the organ and converting the data into volume elements, defining the portion of the organ which is to be examined, performing a guided navigation operation in the virtual organ and displaying the virtual organ in real time during the guided navigation.
  • a method for electronically cleansing an image can be performed by converting the image data to a plurality of volume elements with each volume element having an intensity value.
  • a classifying operation is performed to classify the volume elements into a plurality of clusters in accordance with the intensity values. Once classified, at least one cluster of volume elements can then be removed from the image data.
  • the classifying operation can be performed by evaluating a plurality of volume elements of the image data with respect to a plurality of neighboring volume elements to determine a neighborhood similarity value for the volume element.
  • the clusters can be further classified by applying a mixture probability function to the clusters to classify voxels whose intensity value results from inclusion of more than one material type.
  • An alternative classifying operation includes the steps of performing feature vector analysis on at least one of the clusters which include image data for a material of interest followed by performing high level feature extraction to remove volume elements from the image which are not substantially indicative of the material of interest.
  • the method for electronically cleansing an image is well suited for applications where the image data represents a region of the human body including at least a portion of the colon and the material of interest is tissue of the colon.
  • the removing operation can remove volume elements representing intracolonic fluid, residual stool within the colon, bone, and non-colonic tissue.
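As a rough illustration of the cleansing step summarized above (this is not the patented implementation; the 2200 intensity threshold and the replacement "air" value are assumptions borrowed from the exemplary cluster values given later in this document), a cluster of high-intensity voxels can be removed from a voxel volume as follows:

```python
import numpy as np

def electronic_cleanse(volume, fluid_threshold=2200, air_value=0):
    """Sketch: classify voxels by intensity and remove the enhanced-fluid
    cluster by resetting it to an air-like intensity.  Threshold values are
    illustrative only and depend on the scanner and bowel preparation."""
    volume = volume.copy()
    fluid_mask = volume >= fluid_threshold   # cluster of enhanced stool/fluid
    volume[fluid_mask] = air_value           # "remove" the cluster from the image
    return volume, fluid_mask

# Example with synthetic data
vol = np.random.randint(0, 2600, size=(8, 8, 8))
cleansed, removed = electronic_cleanse(vol)
```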
  • the colon can be scanned and visualized in real-time or the stored data can be visualized at a later time.
  • a surface of an object (such as an organ) can be rendered transparent or translucent in order to view further objects within or behind the object wall.
  • the object can also be sliced in order to examine a particular cross-section of the object.
  • a section of the object can also be composited using the opacity coefficients.
  • FIG. 1 is a flow chart of the steps for performing a virtual examination of an object, specifically a colon, in accordance with the invention
  • FIG. 2 is an illustration of a “submarine” camera model which performs guided navigation in the virtual organ
  • FIG. 3 is an illustration of a pendulum used to model pitch and roll of the “submarine” camera
  • FIG. 4 is a diagram illustrating a two dimensional cross-section of a volumetric colon which identifies two blocking walls
  • FIG. 5 is a diagram illustrating a two dimensional cross-section of a volumetric colon upon which start and finish volume elements are selected
  • FIG. 6 is a diagram illustrating a two dimensional cross-section of a volumetric colon which shows a discrete sub-volume enclosed by the blocking walls and the colon surface;
  • FIG. 7 is a diagram illustrating a two dimensional cross-section of a volumetric colon which has multiple layers peeled away;
  • FIG. 8 is a diagram illustrating a two dimensional cross-section of a volumetric colon which contains the remaining flight path
  • FIG. 9 is a flow chart of the steps of generating a volume visualization of the scanned organ.
  • FIG. 10 is an illustration of a virtual colon which has been sub-divided into cells
  • FIG. 11A is a graphical depiction of an organ which is being virtually examined
  • FIG. 11B is a graphical depiction of a stab tree generated when depicting the organ in FIG. 11A;
  • FIG. 11C is a further graphical depiction of a stab tree generated while depicting the organ in FIG. 11A;
  • FIG. 12A is a graphical depiction of a scene to be rendered with objects within certain cells of the scene
  • FIG. 12B is a graphical depiction of a stab tree generated while depicting the scene in FIG. 12A;
  • FIGS. 12C-12E are further graphical depictions of stab trees generated while depicting the image in FIG. 12A;
  • FIG. 13 is a two dimensional representation of a virtual colon containing a polyp whose layers can be removed;
  • FIG. 14 is a diagram of a system used to perform a virtual examination of a human organ in accordance with the invention.
  • FIG. 15 is a flow chart depicting an improved image segmentation method
  • FIG. 16 is a graph of voxel intensity versus frequency of a typical abdominal CT data set
  • FIG. 17 is a perspective view diagram of an intensity vector structure including a voxel of interest and its selected neighbors;
  • FIG. 18A is an exemplary image slice from a CT scan of a human abdominal region, primarily illustrating a region including the lungs;
  • FIG. 18B is a pictorial diagram illustrating the identification of the lung region in the image slice of FIG. 18A;
  • FIG. 18C is a pictorial diagram illustrating the removal of the lung volume identified in FIG. 18B;
  • FIG. 19A is an exemplary image slice from a CT scan of a human abdominal region, primarily illustrating a region including a portion of the colon and bone;
  • FIG. 19B is a pictorial diagram illustrating the identification of the colon and bone region from the image slice of FIG. 19A;
  • FIG. 19C is a pictorial diagram illustrating the image scan of FIG. 19A with the regions of bone removed.
  • FIG. 20 is a flowchart illustrating a method for applying texture to monochrome image data.
  • the preferred embodiment which will be described is the examination of an organ in the human body, specifically the colon.
  • the colon is long and twisted, which makes it especially suited for a virtual examination, saving the patient both money and the discomfort and danger of a physical probe.
  • organs which can be examined include the lungs, stomach and portions of the gastro-intestinal system, the heart and blood vessels.
  • FIG. 1 illustrates the steps necessary to perform a virtual colonoscopy using volume visualization techniques.
  • Step 101 prepares the colon to be scanned in order to be viewed for examination if required by either the doctor or the particular scanning instrument.
  • This preparation could include cleansing the colon with a “cocktail” or liquid which enters the colon after being orally ingested and passed through the stomach.
  • the cocktail forces the patient to expel waste material that is present in the colon.
  • a substance used is Golytely.
  • air or CO2 can be forced into the colon in order to expand it to make the colon easier to scan and examine. This is accomplished with a small tube placed in the rectum with approximately 1,000 cc of air pumped into the colon to distend the colon.
  • Step 101 does not need to be performed in all examinations as indicated by the dashed line in FIG. 1 .
  • Step 103 scans the organ which is to be examined.
  • the scanner can be an apparatus well known in the art, such as a spiral CT-scanner for scanning a colon or a Zenita MRI machine for scanning a lung labeled for example with xenon gas.
  • the scanner must be able to take multiple images from different positions around the body during suspended respiration, in order to produce the data necessary for the volume visualization.
  • An example of a single CT-image would use an X-ray beam of 5 mm width, 1:1 to 2:1 pitch, with a 40 cm field-of-view being performed from the top of the splenic flexure of the colon to the rectum.
  • Voxel data representing an object can be derived from a geometric model by techniques described in U.S. Pat. No. 5,038,302 entitled “Method of Converting Continuous Three-Dimensional Geometrical Representations into Discrete Three-Dimensional Voxel-Based Representations Within a Three-Dimensional Voxel-Based System” by Kaufman, issued Aug. 8, 1991, filed Jul. 26, 1988, which is hereby incorporated by reference. Additionally, data can be produced by a computer model of an image which can be converted to three-dimension voxels and explored in accordance with this invention.
  • An example of this type of data is a computer simulation of the turbulence surrounding a space shuttle craft.
  • Step 104 converts the scanned images into three-dimensional volume elements (Voxels).
  • the scan data is reformatted into 5 mm thick slices at increments of 1 mm or 2.5 mm, with each slice represented as a matrix of 512 by 512 pixels.
  • the set of 2D slices is then reconstructed to 3D voxels.
  • the conversion process of 2D images from the scanner into 3D voxels can either be performed by the scanning machine itself or by a separate machine such as a computer with techniques which are well known in the art (for example, see U.S. Pat. No. 4,985,856 entitled “Method and Apparatus for Storing, Accessing, and Processing Voxel-based Data” by Kaufman et al.; issued Jan. 15, 1991, filed Nov. 11, 1988; which is hereby incorporated by reference).
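A minimal sketch of this reconstruction step, assuming the 2D slices have already been read into equally sized arrays (the spacing values and function name are illustrative only, not the patent's procedure):

```python
import numpy as np

def slices_to_volume(slices, slice_thickness_mm=5.0, pixel_spacing_mm=0.7):
    """Stack a list of 512x512 slice arrays into a 3D voxel volume and
    record the anisotropic voxel size for any later interpolation."""
    volume = np.stack(slices, axis=0)          # shape: (num_slices, 512, 512)
    voxel_size = (slice_thickness_mm, pixel_spacing_mm, pixel_spacing_mm)
    return volume, voxel_size

# Example with empty synthetic slices
slices = [np.zeros((512, 512), dtype=np.int16) for _ in range(10)]
volume, voxel_size = slices_to_volume(slices)
```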
  • Step 105 allows the operator to define the portion of the selected organ to be examined.
  • a physician may be interested in a particular section of the colon likely to develop polyps.
  • the physician can view a two dimensional slice overview map to indicate the section to be examined.
  • a starting point and finishing point of a path to be viewed can be indicated by the physician/operator.
  • a conventional computer and computer interface (e.g., keyboard, mouse or spaceball) can be used to designate the start and finish points.
  • a grid system with coordinates can be used for keyboard entry or the physician/operator can “click” on the desired points.
  • the entire image of the colon can also be viewed if desired.
  • Step 107 performs the planned or guided navigation operation of the virtual organ being examined.
  • Performing a guided navigation operation is defined as navigating through an environment along a predefined or automatically predetermined flight path which can be manually adjusted by an operator at any time.
  • the virtual examination is modeled on having a tiny camera traveling through the virtual space with a lens pointing towards the finishing point.
  • the guided navigation technique provides a level of interaction with the camera, so that the camera can navigate through a virtual environment automatically in the case of no operator interaction, and at the same time, allow the operator to manipulate the camera when necessary.
  • the preferred embodiment of achieving guided navigation is to use a physically based camera model which employs potential fields to control the movement of the camera, as described in detail in conjunction with FIGS. 2 and 3.
  • Step 109 which can be performed concurrently with step 107 , displays the inside of the organ from the viewpoint of the camera model along the selected pathway of the guided navigation operation.
  • Three-dimensional displays can be generated using techniques well known in the art such as the marching cubes technique.
  • a technique is required which reduces the vast number of computations of data necessary for the display of the virtual organ.
  • FIG. 9 describes this display step in more detail.
  • the method described in FIG. 1 can also be applied to scanning multiple organs in a body at the same time.
  • a patient may be examined for cancerous growths in both the colon and lungs.
  • the method of FIG. 1 would be modified to scan all the areas of interest in step 103 and to select the current organ to be examined in step 105 .
  • the physician/operator may initially select the colon to virtually explore and later explore the lung.
  • two different doctors with different specialties may virtually explore different scanned organs relating to their respective specialties.
  • the next organ to be examined is selected and its portion will be defined and explored. This continues until all organs which need examination have been processed.
  • FIG. 2 depicts a “submarine” camera control model which performs the guided navigation technique in step 107 .
  • the default navigation is similar to that of planned navigation which automatically directs the camera along a flight path from one selected end of the colon to another.
  • the camera stays at the center of the colon for obtaining better views of the colonic surface.
  • the operator of the virtual camera using guided navigation can interactively bring the camera close to a specific region and direct the motion and angle of the camera to study the interesting area in detail, without unwillingly colliding with the walls of the colon.
  • the operator can control the camera with a standard interface device such as a keyboard, mouse or non-standard device such as a spaceball.
  • the camera model for guided navigation includes an inextensible, weightless rod 201 connecting two particles x 1 203 and x 2 205 , both particles being subjected to a potential field 215 .
  • the potential field is defined to be highest at the walls of the organ in order to push the camera away from the walls.
  • the positions of the particles are given by x 1 and x 2 , and they are assumed to have the same mass m.
  • a camera is attached at the head of the submarine x1 203, and its viewing direction coincides with the direction of the vector from x2 to x1.
  • the submarine can perform translation and rotation around the center of mass x of the model as the two particles are affected by the forces from the potential field V(x) which is defined below, any friction forces, and any simulated external force.
  • the terms F_j are called the generalized forces.
  • the control of the submarine is performed by applying a simulated external force to x 1 ,
  • ẋ and ẍ denote the first and the second derivative of x, respectively, and (∂V(x)/∂x, ∂V(x)/∂y, ∂V(x)/∂z) is the gradient of the potential V at the point x.
  • From the first three formulas of Equation (6), it is known that the submarine cannot be propelled by the external force against the potential field if the following condition is satisfied: |∇V(x1) + ∇V(x2)| > |F_ext| / m.
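The following sketch illustrates the kind of physically based update such a camera model implies. It is a simplified, assumption-laden rendering (a single friction coefficient k, explicit Euler integration, and the inextensible-rod constraint omitted), not the patent's exact equations of motion:

```python
import numpy as np

def submarine_step(x1, x2, v1, v2, grad_V, F_ext, m=1.0, k=0.5, h=0.01):
    """One explicit Euler step of a two-particle 'submarine' model: each
    particle feels the negative potential gradient and a friction force
    proportional to its velocity; the operator's simulated external force
    is applied at the head particle x1 only."""
    a1 = (-grad_V(x1) - k * v1 + F_ext) / m
    a2 = (-grad_V(x2) - k * v2) / m
    v1, v2 = v1 + h * a1, v2 + h * a2
    x1, x2 = x1 + h * v1, x2 + h * v2
    return x1, x2, v1, v2

# Example: a quadratic potential that pushes the camera toward the origin
# (standing in for the centre-line of the colon).
grad_V = lambda x: 2.0 * x
x1, x2 = np.array([1.0, 0.0, 0.0]), np.array([1.5, 0.0, 0.0])
v1 = v2 = np.zeros(3)
for _ in range(100):
    x1, x2, v1, v2 = submarine_step(x1, x2, v1, v2, grad_V, F_ext=np.zeros(3))
```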
  • the roll angle φ of the camera system needs to be considered.
  • One possible option allows the operator full control of the angle φ.
  • Although the operator can rotate the camera freely around the rod of the model, he or she can easily become disoriented.
  • the preferred technique assumes that the upper direction of the camera is connected to a pendulum with mass m 2 301 , which rotates freely around the rod of the submarine, as shown in FIG. 3 .
  • the direction of the pendulum, r 2 is expressed as:
  • r2 = r2 (cos α cos β sin φ + sin β cos φ, cos α sin β sin φ − cos β cos φ, −sin α sin φ), where α and β are the orientation angles of the rod and φ is the pendulum (roll) angle.
  • φ̈ ≈ (1/r2) { gx (cos α cos β cos φ − sin β sin φ) + gy (cos α sin β cos φ + cos β sin φ) + gz (−sin α cos φ) } − (k2/m2) φ̇.   (7)
  • the time step h is selected as a compromise: as small as possible to smooth the motion, yet as large as necessary to reduce computation cost.
  • the potential field in the submarine model in FIG. 2 defines the boundaries (walls or other matter) in the virtual organ by assigning a high potential to the boundary in order to ensure that the submarine camera does not collide with the walls or other boundary. If the operator attempts to move the camera model into a high potential area, the camera model will be restrained from doing so unless the operator wishes to examine the organ behind the boundary or inside a polyp, for example.
  • a potential field value is assigned to each piece of volumetric colon data (volume element).
  • a potential value is assigned to every voxel x of the selected volume based on the following three distance values: the distance from the finishing point dt(x), the distance from the colon surface ds(x) and the distance from the center-line of the colon space dc(x).
  • dt(x) is calculated by using a conventional growing strategy.
  • the distance from the colon surface, ds(x) is computed using a conventional technique of growing from the surface voxels inwards.
  • To determine dc(x), the center-line of the colon is first extracted, and then dc(x) is computed using the conventional growing strategy from the center-line of the colon.
  • the maximum value of ds(x) is located and denoted dmax. Then for each voxel inside the area of interest, a cost value of dmax − ds(x) is assigned.
  • the voxels which are close to the colon surface have high cost values and the voxels close to the center line have relatively low cost values.
  • the single-source shortest path technique which is well known in the art is applied to efficiently compute a minimum cost path from the source point to the finish point. This low cost line indicates the center-line or skeleton of the colon section which is desired to be explored. This technique for determining the center-line is the preferred technique of the invention.
  • V(x) = C1 dt(x)^μ + C2 ( ds(x) / (dc(x) + ds(x)) )^(−ν),   (8)
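A compact sketch of the preferred center-line extraction described above: each voxel in the region of interest receives a cost of dmax − ds(x), and a single-source shortest-path (Dijkstra) search connects the start and finish voxels. The 6-connectivity, array layout and function names are assumptions for illustration only:

```python
import heapq
import numpy as np

def centerline(colon_mask, ds, start, finish):
    """Sketch of centre-line extraction: cost = dmax - ds(x) is low near the
    centre of the lumen and high near the wall; Dijkstra's single-source
    shortest-path search then yields a minimum-cost path (the skeleton).
    Assumes the finish voxel is reachable inside colon_mask."""
    cost = ds.max() - ds
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    offsets = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    while heap:
        d, v = heapq.heappop(heap)
        if v == finish:
            break
        if d > dist.get(v, np.inf):
            continue                       # stale heap entry
        for dz, dy, dx in offsets:
            n = (v[0]+dz, v[1]+dy, v[2]+dx)
            if not all(0 <= n[i] < colon_mask.shape[i] for i in range(3)):
                continue
            if not colon_mask[n]:
                continue
            nd = d + cost[n]
            if nd < dist.get(n, np.inf):
                dist[n], prev[n] = nd, v
                heapq.heappush(heap, (nd, n))
    path, v = [], finish                   # walk the predecessor chain back
    while v != start:
        path.append(v)
        v = prev[v]
    path.append(start)
    return path[::-1]
```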
  • Another technique to determine the center-line of the path in the colon is called the "peel-layer" technique and is shown in FIG. 4 through FIG. 8.
  • FIG. 4 shows a 2D cross-section of the volumetric colon, with the two side walls 401 and 403 of the colon being shown.
  • Two blocking walls are selected by the operator in order to define the section of the colon which is of interest to examine. Nothing can be viewed beyond the blocking walls. This helps reduce the number of computations when displaying the virtual representation.
  • the blocking walls together with side walls identify a contained volumetric shape of the colon which is to be explored.
  • FIG. 5 shows two end points of the flight path of the virtual examination, the start volume element 501 and the finish volume element 503 .
  • the start and finish points are selected by the operator in step 105 of FIG. 1 .
  • the voxels between the start and finish points and the colon sides are identified and marked, as indicated by the area designated with "x"s in FIG. 6.
  • the voxels are three-dimensional representations of the picture element.
  • the peel-layer technique is then applied to the identified and marked voxels in FIG. 6 .
  • the outermost layer of all the voxels (closest to the colon walls) is peeled off step-by-step, until there is only one inner layer of voxels remaining. Stated differently, each voxel furthest away from a center point is removed if the removal does not lead to a disconnection of the path between the start voxel and the finish voxel.
  • FIG. 7 shows the intermediate result after a number of iterations of peeling the voxels in the virtual colon. The voxels closest to the walls of the colon have been removed.
  • FIG. 8 shows the final flight path for the camera model down the center of the colon after all the peeling iterations are complete. This produces essentially a skeleton at the center of the colon and becomes the desired flight path for the camera model.
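The peel-layer technique can be sketched as an iterative erosion that never disconnects the start and finish voxels. This simplified version peels boundary voxels in arbitrary order rather than strictly "furthest from a center point first", so it is an approximation of the described procedure, not the patented algorithm:

```python
import numpy as np
from collections import deque

OFFSETS = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]

def _connected(mask, a, b):
    """Breadth-first test of 6-connectivity between voxels a and b inside mask."""
    if not (mask[a] and mask[b]):
        return False
    seen, queue = {a}, deque([a])
    while queue:
        v = queue.popleft()
        if v == b:
            return True
        for dz, dy, dx in OFFSETS:
            n = (v[0]+dz, v[1]+dy, v[2]+dx)
            if all(0 <= n[i] < mask.shape[i] for i in range(3)) and mask[n] and n not in seen:
                seen.add(n)
                queue.append(n)
    return False

def _on_boundary(mask, v):
    """A voxel lies on the current outermost layer if any 6-neighbour is
    background or outside the volume."""
    for dz, dy, dx in OFFSETS:
        n = (v[0]+dz, v[1]+dy, v[2]+dx)
        if not all(0 <= n[i] < mask.shape[i] for i in range(3)) or not mask[n]:
            return True
    return False

def peel_layers(mask, start, finish):
    """Peel the outermost voxels layer by layer, never removing the start or
    finish voxel and never removing a voxel whose removal would disconnect
    them; the surviving voxels approximate the central flight path."""
    mask = mask.copy()
    changed = True
    while changed:
        changed = False
        layer = [tuple(v) for v in np.argwhere(mask) if _on_boundary(mask, tuple(v))]
        for v in layer:
            if v in (start, finish):
                continue
            mask[v] = False
            if _connected(mask, start, finish):
                changed = True
            else:
                mask[v] = True   # removal would break the path, so keep the voxel
    return mask
```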
  • FIG. 9 describes a real time visibility technique for displaying the virtual images seen by the camera model in the virtual three-dimensional volume representation of an organ.
  • FIG. 9 shows a display technique using a modified Z buffer which corresponds to step 109 in FIG. 1 .
  • the number of voxels which could be possibly viewed from the camera model is extremely large. Unless the total number of elements (or polygons) which must be computed and visualized is reduced from an entire set of voxels in the scanned environment, the overall number of computations will make the visualization display process exceedingly slow for a large internal area. However, in the present invention only those images which are visible on the colon surface need to be computed for display.
  • the scanned environment can be subdivided into smaller sections, or cells.
  • the Z buffer technique then renders only a portion of the cells which are visible from the camera.
  • the Z buffer technique is also used for three-dimensional voxel representations.
  • the use of a modified Z buffer reduces the number of visible voxels to be computed and allows for the real time examination of the virtual colon by a physician or medical technician.
  • the area of interest from which the center-line has been calculated in step 107 is subdivided into cells before the display technique is applied.
  • Cells are collective groups of voxels which become a visibility unit. The voxels in each cell will be displayed as a group. Each cell contains a number of portals through which the other cells can be viewed.
  • the colon is subdivided by beginning at the selected start point and moving along the center-line 1001 towards the finish point. The colon is then partitioned into cells (for example, cells 1003 , 1005 and 1007 in FIG. 10) when a predefined threshold distance along the center-path is reached. The threshold distance is based upon the specifications of the platform upon which the visualization technique is performed and its capabilities of storage and processing.
  • the cell size is directly related to the number of voxels which can be stored and processed by the platform.
  • One example of a threshold distance is 5 cm, although the distance can greatly vary.
  • Each cell has two cross-sections as portals for viewing outside of the cell as shown in FIG. 10 .
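A minimal sketch of this cell subdivision, assuming the center-line is available as an ordered list of points in millimeters and using the exemplary 5 cm threshold (the function name and data layout are assumptions):

```python
import numpy as np

def partition_centerline(centerline_points, threshold_mm=50.0):
    """Split an ordered list of centre-line points (coordinates in mm) into
    cells: a new cell begins whenever the accumulated path length along the
    centre-line exceeds the threshold distance (5 cm here)."""
    cells, current, travelled = [], [centerline_points[0]], 0.0
    for prev, point in zip(centerline_points, centerline_points[1:]):
        travelled += float(np.linalg.norm(np.asarray(point) - np.asarray(prev)))
        current.append(point)
        if travelled >= threshold_mm:
            cells.append(current)
            current, travelled = [point], 0.0   # the shared point acts as the portal
    if len(current) > 1:
        cells.append(current)
    return cells
```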
  • Step 901 in FIG. 9 identifies the cell within the selected organ which currently contains the camera.
  • the current cell will be displayed as well as all other cells which are visible given the orientation of the camera.
  • Step 903 builds a stab tree (tree diagram) of hierarchical data of potentially visible cells from the camera (through defined portals), as will be described in further detail hereinbelow.
  • the stab tree contains a node for every cell which may be visible to the camera. Some of the cells may be transparent without any blocking bodies present so that more than one cell will be visible in a single direction.
  • Step 905 takes the subset of voxels from a cell which includes the intersection of adjoining cell edges and stores it at the outside edge of the stab tree in order to more efficiently determine which cells are visible.
  • Step 907 checks if any loop nodes are present in the stab tree.
  • a loop node occurs when two or more edges of a single cell both border on the same nearby cell. This may occur when a single cell is surrounded by another cell. If a loop node is identified in the stab tree, the method continues with step 909 . If there is no loop node, the process goes to step 911 .
  • Step 909 collapses the two cells making up the loop node into one large node.
  • the stab tree is then corrected accordingly. This eliminates the problem of viewing the same cell twice because of a loop node.
  • the step is performed on all identified loop nodes. The process then continues with step 911 .
  • Step 911 then initiates the Z-buffer with the largest Z value.
  • the Z value defines the distance away from the camera along the skeleton path.
  • the tree is then traversed to first check the intersection values at each node. If a node intersection is covered, meaning that the current portal sequence is occluded (which is determined by the Z buffer test), then the traversal of the current branch in the tree is stopped.
  • Step 913 traverses each of the branches to check if the nodes are covered and displays them if they are not.
  • Step 915 then constructs the image to be displayed on the operator's screen from the volume elements within the visible cells identified in step 913 using one of a variety of techniques known in the art, such as volume rendering by compositing.
  • the only cells shown are those which are identified as potentially visible. This technique limits the number of cells which require calculations in order to achieve a real time display and correspondingly increases the speed of the display for better performance. This technique is an improvement over prior techniques which calculate all the possible visible data points whether or not they are actually viewed.
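The stab-tree traversal with the modified Z-buffer can be sketched as a pruned depth-first walk: a branch is abandoned as soon as its portal fails the occlusion test. The node structure and the is_occluded callback are assumptions standing in for the portal and Z-buffer machinery described above:

```python
from dataclasses import dataclass, field

@dataclass
class StabNode:
    """A cell in the stab tree.  'portal' stands in for the stored boundary
    voxels that are tested against the modified Z-buffer."""
    name: str
    portal: object = None
    children: list = field(default_factory=list)

def visible_cells(root, is_occluded):
    """Depth-first traversal of the stab tree: a branch is abandoned as soon
    as its portal is reported occluded; the root (camera cell) is always shown."""
    visible, stack = [], [root]
    while stack:
        node = stack.pop()
        if node is not root and is_occluded(node.portal):
            continue                      # occluded portal: prune the whole branch
        visible.append(node.name)
        stack.extend(node.children)
    return visible

# Example mirroring FIGS. 12A-12E: camera in cell I, looking toward F,
# with the portals of B and E already covered.
tree = StabNode("I", portal="I", children=[
    StabNode("F", portal="F", children=[
        StabNode("B", portal="B", children=[StabNode("A", portal="A")]),
        StabNode("E", portal="E")])])
occluded_portals = {"B", "E"}
print(visible_cells(tree, is_occluded=lambda p: p in occluded_portals))  # ['I', 'F']
```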
  • FIG. 11A is a two dimensional pictorial representation of an organ which is being explored by guided navigation and needs to be displayed to an operator.
  • Organ 1101 shows two side walls 1102 and an object 1105 in the center of the pathway.
  • the organ has been divided into four cells A 1151 , B 1153 , C 1155 and D 1157 .
  • the camera 1103 is facing towards cell D 1157 and has a field of vision defined by vision vectors 1107 , 1108 which can identify a cone-shaped field.
  • the cells which can be potentially viewed are cells B 1153 , C 1155 and D 1157 .
  • Cell C 1155 is completely surrounded by Cell B and thus constitutes a node loop.
  • FIG. 11B is a representation of a stab tree built from the cells in FIG. 11 A.
  • Node A 1109 which contains the camera is at the root of the tree.
  • a sight line or sight cone, which is a visible path that is not blocked, is drawn to node B 1110.
  • Node B has direct visible sight lines to both node C 1112 and node D 1114, as shown by the connecting arrows.
  • the sight line of node C 1112 in the direction of the viewing camera combines with node B 1110 .
  • Node C 1112 and node B 1110 will thus be collapsed into one large node B′ 1122 as shown in FIG. 11 C.
  • FIG. 11C shows node A 1109 containing the camera adjacent to node B′ 1122 (containing both nodes B and node C) and node D 1114 .
  • the nodes A, B′ and D will be displayed at least partially to the operator.
  • FIGS. 12A-12E illustrate the use of the modified Z buffer with cells that contain objects which obstruct the views.
  • An object could be some waste material in a portion of the virtual colon.
  • FIG. 12A shows a virtual space with 10 potential cells: A 1251 , B 1253 , C 1255 , D 1257 , E 1259 , F 1261 , G 1263 , H 1265 , I 1267 and J 1269 . Some of the cells contain objects. If the camera 1201 is positioned in cell I 1267 and is facing toward cell F 1261 as indicated by the vision vectors 1203 , then a stab tree is generated in accordance with the technique illustrated by the flow diagram in FIG. 9 .
  • FIG. 12B shows the stab tree generated with the intersection nodes showing for the virtual representation as shown in FIG. 12 A.
  • FIG. 12B shows cell I 1267 as the root node of the tree because it contains the camera 1201 .
  • Node I 1211 is pointing to node F 1213 (as indicated with an arrow), because cell F is directly connected to the sight line of the camera.
  • Node F 1213 is pointing to both node B 1215 and node E 1219 .
  • Node B 1215 is pointing to node A 1217 .
  • Node C 1202 is completely blocked from the line of sight by camera 1201 so is not included in the stab tree.
  • FIG. 12C shows the stab tree after node I 1211 is rendered on the display for the operator. Node I 1211 is then removed from the stab tree because it has already been displayed and node F 1213 becomes the root.
  • FIG. 12D shows that node F 1213 is now rendered to join node I 1211 .
  • the next nodes in the tree connected by arrows are then checked to see if they are already covered (already processed). In this example, all of the intersected nodes from the camera positioned in cell I 1267 have been covered, so that node B 1215 (and therefore its dependent node A 1217) does not need to be rendered on the display.
  • FIG. 12E shows node E 1219 being checked to determine if its intersection has been covered. Since it has, the only rendered nodes in this example of FIGS. 12A-12E are nodes I and F, while nodes A, B and E are not visible and do not need to have their cells prepared for display.
  • the modified Z buffer technique described in FIG. 9 allows for fewer computations and can be applied to an object which has been represented by voxels or other data elements, such as polygons.
  • FIG. 13 shows a two dimensional virtual view of a colon with a large polyp present along one of its walls.
  • FIG. 13 shows a selected section of a patient's colon which is to be examined further. The view shows two colon walls 1301 and 1303 with the growth indicated as 1305 . Layers 1307 , 1309 , and 1311 show inner layers of the growth. It is desirable for a physician to be able to peel the layers of the polyp or tumor away to look inside of the mass for any cancerous or other harmful material. This process would in effect perform a virtual biopsy of the mass without actually cutting into the mass. Once the colon is represented virtually by voxels, the process of peeling away layers of an object is easily performed in a similar manner as described in conjunction with FIGS. 4 through 8.
  • the mass can also be sliced so that a particular cross-section can be examined.
  • a planar cut 1313 can be made so that a particular portion of the growth can be examined.
  • a user-defined slice 1319 can be made in any manner in the growth.
  • the voxels 1319 can either be peeled away or modified as explained below.
  • a transfer function can be applied to each voxel in the area of interest which can make the object transparent, semi-transparent or opaque by altering coefficients representing the translucency of each voxel.
  • An opacity coefficient is assigned to each voxel based on its density.
  • a mapping function then transforms the density value to a coefficient representing its translucency.
  • a high density scanned voxel will indicate either a wall or other dense matter besides simply open space.
  • An operator or program routine could then change the opacity coefficient of a voxel or group of voxels to make them appear transparent or semi-transparent to the submarine camera model. For example, an operator may view a tumor within or outside of an entire growth.
  • a composite of a section of the object can be created using a weighted average of the opacity coefficients of the voxels in that section.
  • If a physician desires to view the various layers of a polyp to look for cancerous areas, this can be performed by removing the outer layer of polyp 1305, yielding first inner layer 1307. The first inner layer 1307 can then be stripped back to view second inner layer 1309, the second inner layer can be stripped back to view third inner layer 1311, and so on. The physician could also slice the polyp 1305 and view only those voxels within a desired section. The slicing area can be completely user-defined.
  • Adding an opacity coefficient can also be used in other ways to aid in the exploration of a virtual system. If waste material is present and has a density and other properties within a certain known range, the waste can be made transparent to the virtual camera by changing its opacity coefficient during the examination. This will allow the patient to avoid ingesting a bowel cleansing agent before the procedure and make the examination faster and easier. Other objects can be similarly made to disappear depending upon the actual application. Additionally, some objects like polyps could be enhanced electronically by a contrast agent followed by a use of an appropriate transfer function.
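As an illustration of the opacity-coefficient idea, the sketch below maps voxel density to an opacity value, makes an assumed "waste" density range fully transparent, and composites the samples along one ray. The mapping, the waste range and the grey-scale compositing are all assumptions for illustration, not values from the patent:

```python
import numpy as np

def opacity_from_density(density, waste_range=(1800, 2200)):
    """Map voxel density to an opacity coefficient in [0, 1]; densities in
    the (assumed) waste range are made fully transparent so that residual
    material disappears from the rendered view."""
    opacity = np.clip(density / 3000.0, 0.0, 1.0)
    lo, hi = waste_range
    opacity[(density >= lo) & (density <= hi)] = 0.0
    return opacity

def composite_ray(densities, opacities):
    """Front-to-back compositing of the samples along one ray, weighting
    each sample by its opacity coefficient."""
    color, transmitted = 0.0, 1.0
    for d, a in zip(densities, opacities):
        color += transmitted * a * d
        transmitted *= (1.0 - a)
        if transmitted < 1e-3:            # early ray termination
            break
    return color

ray = np.array([100.0, 950.0, 2000.0, 1050.0])
print(composite_ray(ray, opacity_from_density(ray)))
```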
  • FIG. 14 shows a system for performing the virtual examination of an object such as a human organ using the techniques described in this specification.
  • Patient 1401 lies down on a platform 1402 while scanning device 1405 scans the area that contains the organ or organs which are to be examined.
  • the scanning device 1405 contains a scanning portion 1403 which actually takes images of the patient and an electronics portion 1406 .
  • Electronics portion 1406 comprises an interface 1407 , a central processing unit 1409 , a memory 1411 for temporarily storing the scanning data, and a second interface 1413 for sending data to the virtual navigation platform.
  • Interfaces 1407 and 1413 could be included in a single interface component or could be the same component.
  • the components in portion 1406 are connected together with conventional connectors.
  • the data provided by scanning portion 1403 is transferred to electronics portion 1406 for processing and is stored in memory 1411.
  • Central processing unit 1409 converts the scanned 2D data to 3D voxel data and stores the results in another portion of memory 1411 .
  • the converted data could be directly sent to interface unit 1413 to be transferred to the virtual navigation terminal 1416 .
  • the conversion of the 2D data could also take place at the virtual navigation terminal 1416 after being transmitted from interface 1413 .
  • the converted data is transmitted over carrier 1414 to the virtual navigation terminal 1416 in order for an operator to perform the virtual examination.
  • the data could also be transported in other conventional ways such as storing the data on a storage medium and physically transporting it to terminal 1416 or by using satellite transmissions.
  • the scanned data may not be converted to its 3D representation until the visualization rendering engine requires it to be in 3D form. This saves computational steps and memory storage space.
  • Virtual navigation terminal 1416 includes a screen for viewing the virtual organ or other scanned image, an electronics portion 1415 and interface control 1419 such as a keyboard, mouse or spaceball.
  • Electronics portion 1415 comprises an interface port 1421, a central processing unit 1423, other components 1427 necessary to run the terminal and a memory 1425.
  • the components in terminal 1416 are connected together with conventional connectors.
  • the converted voxel data is received in interface port 1421 and stored in memory 1425 .
  • the central processor unit 1423 then assembles the 3D voxels into a virtual representation and runs the submarine camera model as described in FIGS. 2 and 3 to perform the virtual examination. As the submarine camera travels through the virtual organ, the visibility technique described in FIG. 9 is used to compute only those areas which are visible to the camera and to display them in real time.
  • Terminal portion 1415 can be the Cube-4 dedicated system box, generally available from the Department of Computer Science at the State University of New York at Stony Brook.
  • Scanning device 1405 and terminal 1416 can be part of the same unit.
  • a single platform would be used to receive the scan image data, convert it to 3D voxels if necessary and perform the guided navigation.
  • An important feature in system 1400 is that the virtual organ can be examined at a later time without the presence of the patient. Additionally, the virtual examination could take place while the patient is being scanned.
  • the scan data can also be sent to multiple terminals which would allow more than one doctor to view the inside of the organ simultaneously. Thus a doctor in New York could be looking at the same portion of a patient's organ at the same time with a doctor in California while discussing the case. Alternatively, the data can be viewed at different times. Two or more doctors could perform their own examination of the same data in a difficult case. Multiple virtual navigation terminals could be used to view the same scan data.
  • the above described techniques can be further enhanced in virtual colonoscopy applications through the use of an improved electronic colon cleansing technique which employs modified bowel preparation operations followed by image segmentation operations, such that fluid and stool remaining in the colon during a computed tomographic (CT) or magnetic resonance imaging (MRI) scan can be detected and removed from the virtual colonoscopy images.
  • the first step in electronic colon cleansing is bowel preparation (step 1510 ), which takes place prior to conducting the CT or magnetic resonance imaging (MRI) scan and is intended to create a condition where residual stool and fluid remaining in the colon present significantly different image properties from that of the gas-filled colon interior and colon wall.
  • An exemplary bowel preparation operation includes ingesting three 250 cc doses of Barium Sulfate suspension of 2.1% W/V, such as manufactured by E-Z-EM, Inc., of Westbury, N.Y., during the day prior to the CT or MRI scan. The three doses should be spread out over the course of the day and can be ingested along with three meals, respectively.
  • the Barium Sulfate serves to enhance the images of any stool which remains in the colon.
  • fluid intake is preferably increased during the day prior to the CT or MRI scan.
  • Cranberry juice is known to provide increased bowel fluids and is preferred, although water can also be ingested.
  • 60 ml of a Diatrizoate Meglumine and Diatrizoate Sodium solution, which is commercially available as MD-Gastroview, manufactured by Mallinckrodt, Inc. of St. Louis, Mo., can be consumed to enhance the image properties of the colonic fluid.
  • Sodium phosphate can also be added to the solution to liquefy the stool in the colon, which provides for more uniform enhancement of the colonic fluid and residual stool.
  • the above described exemplary preliminary bowel preparation operation can obviate the need for conventional colonic washing protocols, which can call for the ingestion of a gallon of Golytely solution prior to a CT scan.
  • Glucagon, manufactured by Eli Lilly and Company, of Indianapolis, Ind., can also be administered.
  • the colon can be inflated using approximately 1000 cc of compressed gas, such as CO2, or room air, which can be introduced through a rectum tube.
  • a conventional CT scan is performed to acquire data from the region of the colon (step 1520 ).
  • data can be acquired using a GE/CTI spiral mode scanner operating in a helical mode of 5 mm, 1.5-2.0:1 pitch, where the pitch is adjusted based upon the patient's height in a known manner.
  • a routine imaging protocol of 120 kVp and 200-280 mA can be utilized for this operation.
  • the data can be acquired and reconstructed as 1 mm thick slice images having an array size of 512×512 pixels in the field of view, which varies from 34 to 40 cm depending on the patient's size. The number of such slices generally varies under these conditions from 300 to 450, depending on the patient's height.
  • the image data set is converted to volume elements or voxels (step 1530 ).
  • Image segmentation can be performed in a number of ways.
  • a local neighbor technique is used to classify voxels of the image data in accordance with similar intensity values.
  • each voxel of an acquired image is evaluated with respect to a group of neighbor voxels.
  • the voxel of interest is referred to as the central voxel and has an associated intensity value.
  • a classification indicator for each voxel is established by comparing the value of the central voxel to each of its neighbors. If the neighbor has the same value as the central voxel, the value of the classification indicator is incremented.
  • the classification indicator for the central voxel is decremented.
  • the central voxel is then classified to that category which has the maximum indicator value, which indicates the most uniform neighborhood among the local neighbors.
  • Each classification is indicative of a particular intensity range, which in turn is representative of one or more material types being imaged.
  • the method can be further enhanced by applying a mixture probability function to the similarity classifications so derived.
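A sketch of the local-neighbor classification described above, assuming face-connected (6-voxel) neighborhoods and illustrative intensity bins similar to the exemplary cluster boundaries given later in this section; the real method's indicator bookkeeping may differ:

```python
import numpy as np

def local_neighbor_classify(volume, thresholds=(140, 900, 1080, 2200)):
    """Sketch of the local-neighbour similarity classification: intensities
    are first binned into rough categories, then each voxel is re-labelled
    with the category that is most common among itself and its six
    face-connected neighbours (the most 'uniform' local class)."""
    categories = np.digitize(volume, thresholds)
    padded = np.pad(categories, 1, mode="edge")
    votes = np.zeros(volume.shape + (len(thresholds) + 1,), dtype=np.int32)
    offsets = [(0,0,0), (1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    for dz, dy, dx in offsets:
        neigh = padded[1+dz:1+dz+volume.shape[0],
                       1+dy:1+dy+volume.shape[1],
                       1+dx:1+dx+volume.shape[2]]
        for c in range(votes.shape[-1]):
            votes[..., c] += (neigh == c)    # tally neighbourhood agreement per category
    return np.argmax(votes, axis=-1)
```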
  • An alternate process of image segmentation is performed as two major operations: low level processing and high level feature extraction.
  • During low level processing, regions outside the body contour are eliminated from further processing, and voxels within the body contour are roughly categorized in accordance with well defined classes of intensity characteristics.
  • a CT scan of the abdominal region generates a data set which tends to exhibit a well defined intensity distribution.
  • the graph of FIG. 16 illustrates such an intensity distribution as an exemplary histogram having four, well defined peaks, 1602 , 1604 , 1606 , 1608 , which can be classified according to intensity thresholds.
  • the voxels of the abdominal CT data set are roughly classified as four clusters by intensity thresholds (step 1540 ).
  • Cluster 1 can include voxels whose intensities are below 140. This cluster generally corresponds to the lowest density regions within the interior of the gas filled colon.
  • Cluster 2 can include voxels which have intensity values in excess of 2200. These intensity values correspond to the enhanced stool and fluid within the colon as well as bone.
  • Cluster 3 can include voxels with intensities in the range of about 900 to about 1080. This intensity range generally represents soft tissues, such as fat and muscle, which are unlikely to be associated with the colon.
  • the remaining voxels can then be grouped together as cluster 4 , which are likely to be associated with the colon wall (including mucosa and partial volume mixtures around the colon wall) as well as lung tissue and soft bones.
  • Clusters 1 and 3 are not particularly valuable in identifying the colon wall and, therefore are not subject to substantial processing during image segmentation procedures for virtual colonoscopy.
  • the voxels associated with cluster 2 are important for segregating stool and fluid from the colon wall and are processed further during the high-level feature extraction operations. Low level processing is concentrated on the fourth cluster, which has the highest likelihood of corresponding to colon tissue (step 1550 ).
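The rough four-cluster classification can be written directly from the exemplary intensity values above (these values are scanner- and protocol-dependent and are used here only for illustration):

```python
import numpy as np

def threshold_clusters(volume):
    """Rough four-cluster classification of an abdominal CT volume using the
    exemplary intensity thresholds from the text."""
    clusters = np.full(volume.shape, 4, dtype=np.uint8)   # default: cluster 4 (colon-wall-like)
    clusters[volume < 140] = 1                            # gas-filled colon interior
    clusters[volume > 2200] = 2                           # enhanced stool/fluid and bone
    clusters[(volume >= 900) & (volume <= 1080)] = 3      # fat, muscle and other soft tissue
    return clusters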
  • For each voxel in the fourth cluster, an intensity vector is generated using the voxel itself and its neighbors.
  • the intensity vector provides an indication of the change in intensity in the neighborhood proximate a given voxel.
  • the number of neighbor voxels which are used to establish the intensity vector is not critical, but involves a tradeoff between processing overhead and accuracy.
  • a simple voxel intensity vector can be established with seven (7) voxels, which includes the voxel of interest, its front and back neighbors, its left and right neighbors and its top and bottom neighbors, all surrounding the voxel of interest on three mutually perpendicular axes.
  • FIG. 17 is a perspective view illustrating an exemplary intensity vector in the form of a 25 voxel intensity vector model, which includes the selected voxel 1702 as well as its first, second and third order neighbors.
  • the selected voxel 1702 is the central point of this model and is referred to as the fixed voxel.
  • a planar slice of voxels, which includes 12 neighbors on the same plane as the fixed voxel, is referred to as the fixed slice 1704 .
  • On adjacent planes to the fixed slice are two nearest slices 1706 , having five voxels each.
  • Adjacent to the first nearest slices 1706 are two second nearest slices 1708 , each having a single voxel.
  • the collection of intensity vectors for each voxel in the fourth cluster is referred to as a local vector series.
  • the data set for an abdominal image generally includes more than 300 slice images, each with a 512×512 voxel array, and each voxel having an associated 25 voxel local vector.
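A sketch of building one 25-voxel local intensity vector. The text fixes the counts per slice (13 on the fixed slice, 5 on each nearest slice, 1 on each second-nearest slice) but not the exact neighbor pattern, so the diamond-shaped pattern used here is an assumption:

```python
import numpy as np

def intensity_vector(volume, z, y, x):
    """Build a 25-element local intensity vector for the voxel at (z, y, x):
    13 voxels from the fixed slice, 5 from each of the two nearest slices and
    1 from each of the two second-nearest slices (Manhattan radius 2, 1 and 0
    per slice, an illustrative choice of neighbours)."""
    offsets = []
    for dz, radius in [(0, 2), (1, 1), (-1, 1), (2, 0), (-2, 0)]:
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                if abs(dy) + abs(dx) <= radius:
                    offsets.append((dz, dy, dx))
    return np.array([volume[z+dz, y+dy, x+dx] for dz, dy, dx in offsets])

vol = np.random.randint(0, 2600, size=(16, 16, 16))
vec = intensity_vector(vol, 8, 8, 8)
assert vec.size == 25
```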
  • One suitable form of feature analysis is principal component analysis (PCA), which can be applied to the local vector series to determine the dimension of a feature vector series and an orthogonal transformation matrix for the voxels of cluster 4.
  • an orthogonal transformation matrix can be established which is a predetermined matrix determined by using several sets of training data acquired using the same scanner under similar conditions. From this data, a transformation matrix, such as a Karhunen-Loève (K-L) transformation, can be generated in a known manner. The transformation matrix is applied to the local vector series to generate feature vector series. Once in the feature-vector space domain, vector quantization techniques can be used to classify the feature vector series.
  • An analytical, self-adaptive algorithm can be used for the classification of the feature vectors.
  • a representative element is generated by the algorithm for each class.
  • dist(x, y) is the Euclidean distance between vectors x and y, and arg min dj gives the integer j which realizes the minimum value of dj (a classification sketch based on this rule follows this list).
  • T is the vector similarity threshold.
  • each voxel within the selected cluster is assigned to a class (step 1570).
  • In the exemplary case of virtual colonoscopy, there are several classes within cluster 4.
  • the next task is to determine which of the several classes in cluster 4 corresponds to the colon wall.
  • the first coordinate of the feature vector, which is the coordinate exhibiting the highest variance, reflects the information of the average of the 3D local voxel intensities.
  • the remaining coordinates of the feature vector contain the information of directional intensity change within the local neighbors.
  • a threshold interval can be determined from data samples selected from typical colon wall intensities of a typical CT data set to roughly distinguish colon wall voxel candidates. The particular threshold value is selected for each particular imaging protocol and device. This threshold interval can then be applied to all CT data sets acquired from the same machine using the same imaging protocol. If the first coordinate of the representative element is located in the threshold interval, the corresponding class is regarded as the colon wall class and all voxels in that class are labeled as colon wall-like voxels (a short class-selection sketch follows this list).
  • Each colon wall-like voxel is a candidate to be a colon wall voxel; however, three cases arise in which such candidates do not belong to the actual colon wall.
  • the first case relates to voxels which are close to the stool/liquid inside the colon.
  • the second case occurs when voxels are in the lung tissue regions.
  • the third case represents mucosa voxels.
  • low level classification carries a degree of classification uncertainty.
  • the causes of the low-level classification uncertainty vary. For example, a partial-volume effect resulting from voxels containing more than one material type (i.e., fluid and colon wall) leads to the first case of uncertainty.
  • the second and the third cases of uncertainty are due to both the partial volume effect as well as the low contrast of CT images.
  • a high-level feature extraction procedure is used in the present method to further distinguish candidates for the colon wall from other colon wall-like voxels, based on a priori anatomical knowledge of the CT images (step 1580).
  • FIG. 18A is an exemplary slice image clearly illustrating the lung region 1802 .
  • the lung region 1802 is identifiable as a generally contiguous three dimensional volume enclosed by colon wall-like voxels, as illustrated in FIG. 18B. Given this characteristic, the lung region can be identified using a region growing strategy (a region-growing sketch follows this list).
  • the first step in this technique is to find a seed voxel within the region of growing.
  • the operator performing the CT imaging scan sets the imaging range such that the topmost slice of the CT scan does not contain any colon voxels.
  • the seed is provided by the low-level classification simply by selecting an air voxel.
  • a next step in performing high-level feature extraction can be to separate the bone voxels from enhanced stool/fluid voxels in cluster 2 .
  • the bone tissue voxels 1902 are generally relatively far away from the colon wall and reside outside the colon volume. In contrast, the residual stool 1906 and fluid 1904 are enclosed inside the colon volume.
  • a rough colon wall volume is generated. Any voxel separated by more than a predetermined number (e.g., 3) of voxel units from the colon wall, and outside the colon volume, will be labeled as bone and then removed from the image (a distance-based sketch of this split follows this list).
  • the remaining voxels in cluster 2 can be assumed to represent stool and fluid within the colon volume (see FIGS. 19A-C).
  • the voxels within the colon volume identified as stool 1906 and fluid 1904 can be removed from the image to generate a clean colon lumen and colon wall image.
  • One region type is small residual areas of stool 1906 attached to the colon wall.
  • the other region type is large volumes of fluid 1904, which collect in basin-like colonic folds (see FIGS. 19A-C).
  • the attached residual stool regions 1906 can be identified and removed because they are inside the rough colon volume generated during the low-level classification process.
  • the fluid 1904 in the basin-like colon folds usually has a horizontal surface 1908 due to the effect of gravity. Above the surface is always a gas region, which exhibits a very high contrast to the fluid intensity. Thus, the surface interface of the fluid regions can be easily marked.
  • the contour of the attached stool regions 1906 can be outlined, and the part which is away from the colon wall volume can be removed.
  • the contour of the fluid regions 1904 can also be outlined. After eliminating the horizontal surfaces 1908 , the colon wall contour is revealed and the clean colon wall is obtained.
  • the inner surface, the outer surface and the wall itself of the colon can be extracted and viewed as a virtual object.
  • This provides a distinct advantage over conventional optical colonoscopy in that the exterior wall of the colon can be examined as well as the interior wall. Furthermore, the colon wall and the colon lumen can be obtained separately from the segmentation.
  • Because the colon is substantially evacuated prior to imaging, a commonly encountered problem is that the colon lumen collapses in spots. While inflation of the colon with compressed gas, such as air or CO2, reduces the frequency of collapsed regions, such areas still occur.
  • the first step is to detect a collapsed region.
  • an entropy analysis can be used to detect areas of colon collapse.
  • the degree of change in greyscale value can be expressed and measured by an entropy value.
  • voxels on the outer surface of the colon wall are selected. Such points are identified from the above described image segmentation techniques.
  • a 5×5×5 cubic window can be applied to the pixels, centered on the pixel of interest.
  • a smaller (3×3×3) window can be applied to the pixels of interest in order to filter out noise from the image data.
  • the calculated entropy values for each window are then compared against a predetermined threshold value. For regions of air, the entropy values will be fairly low when compared to regions of tissue. Therefore, along the centerline of the colon lumen, when the entropy values increase and exceed the predetermined threshold value, a collapsed region is indicated (an entropy-window sketch follows this list).
  • the exact value of the threshold is not critical and will depend in part on the imaging protocol and particulars of the imaging device.
  • the previously determined centerline flight path can be extended through the region by piercing through the center of the collapse with a one voxel wide navigation line.
  • the region of colon collapse can be virtually opened using a physical modeling technique to recover some of the properties of the collapsed region.
  • a model of the physical properties of the colon wall is developed. From this model, parameters of motion, mass density, damping density, and stretching and bending coefficients are estimated for a Lagrange equation. Then, an expanding force model (i.e., gas or fluid, such as air, pumped into the colon) is formulated and applied in accordance with the elastic properties of the colon, as defined by the Lagrange equation, such that the collapsed region of the colon image is restored to its natural shape (a simplified spring-and-damping sketch follows this list).
  • a finite-element model can be applied to the collapsed or obstructed regions of the colon lumen. This can be performed by sampling the elements in a regular grid, such as an 8-voxel brick, and then applying traditional volume rendering techniques. Alternatively, an irregular volume representation approach, such as tetrahedra, can be applied to the collapsed regions.
  • the magnitude of the external force is first determined to properly separate the collapsed colon wall regions.
  • a three dimensional growing model can be used to trace the internal and external colon wall surfaces in a parallel manner. The respective surfaces are marked from a starting point at the collapsed region to a growing source point, and the force model is applied to expand the surfaces in a like and natural manner.
  • the regions between the internal and external surfaces, i.e., the colon wall, are classified as sharing regions.
  • the external repulsive force model is applied to these sharing regions to separate and expand the collapsed colon wall segments in a natural manner.
  • FIG. 20 is a flow chart depicting a present method for generating virtual objects having a texture component.
  • the purpose of this method is to map textures obtained by optical colonoscopy images in the red-green-blue (RGB) color space, as for example from the Visible Human, onto the gray scale monochrome CT image data used to generate virtual objects.
  • the optical colonoscopy images are acquired by conventional digital image acquisition techniques, such as by a digital “frame grabber” 1429 which receives analog optical images from a camera, such as a video camera, and converts the image to digital data which can be provided to CPU 1423 via interface port 1431 (FIG. 14).
  • the first step in this process is to segment the CT image data (step 2010).
  • the above described image segmentation techniques can be applied to choose intensity thresholds in the grey scale image to classify the CT image data into various tissue types, such as bone, colon wall tissue, air, and the like.
  • the texture features of the optical image need to be extracted from the optical image data (step 2020).
  • a Gaussian filter can be applied to the optical image data, followed by subsampling, to decompose the data into a multiresolution pyramid.
  • a Laplacian filter and steerable filters can also be applied to the multiresolution pyramid to obtain oriented and non-oriented features of the data (a pyramid sketch follows this list). While this method is effective at extracting and capturing the texture features, the implementation of this approach requires a large amount of memory and processing power.
  • An alternative approach to extracting the texture features from the optical image is to utilize a wavelet transform.
  • While wavelet transformations are generally computationally efficient, conventional wavelet transforms are limited in that they only capture features with orientations parallel to the axes and cannot be applied directly to a region of interest.
  • a non-separable filter can be employed.
  • a lifting scheme can be employed to build filter banks for wavelet transforms in any dimension, using a two-step prediction and updating approach.
  • Such filter banks can be synthesized by the Boor-Rom algorithm for multidimensional polynomial interpolation.
  • models must be generated to describe these features (step 2030). This can be performed, for example, by using a non-parametric multi-scale statistical model which is based on estimating and manipulating the entropy of non-Gaussian distributions attributable to the natural textures.
  • texture matching must be performed to correlate these models to the segmented CT image data (step 2050).
  • In regions of the CT image data where the texture is continuous, corresponding classes of texture are easily matched.
  • In boundary regions between two or more texture regions, the process is more complex. Segmentation of the CT data around a boundary region often leads to data which is fuzzy, i.e., the results reflect a percentage of texture from each material or tissue and vary depending on the weighting of each. The weighting percentage can be used to set the importance of the matching criteria.
  • the cross entropy or a Kullback-Leibler divergence algorithm can be used to measure the distribution of different textures in a boundary region (a histogram-based sketch follows this list).
  • texture synthesis is performed on the CT image data (step 2050). This is done by fusing the textures from the optical image data into the CT image data.
  • texture can be sampled directly from the optical data to the segmented CT image data.
  • For anisotropic texture regions, such as colon mucosa, a multiresolution sampling procedure is preferred. In this process, selective resampling for homogeneous and heterogeneous regions is employed.
  • volume rendering techniques employ a defined transfer function to map different ranges of sample values of the original volume data to different colors and opacities (a piecewise-linear example follows this list).
  • the present system and methods can be extended to perform automated polyp detection.
  • Polyps which occur, for example, within the colon are generally small convex hill-like structures extending from the colon wall. This geometry is distinct from the fold of the colon wall.
  • a differential geometry model can be used to detect such polyps on the colon wall.
  • the surface of the colon lumen can be represented using a C-2 smoothness surface model.
  • each voxel on the surface has an associated geometrical feature, its Gaussian curvature; the collection of these values is referred to as the Gaussian curvature fields.
  • A convex hill on the surface, which may be indicative of a polyp, possesses a unique local feature in the Gaussian curvature fields. Accordingly, by searching the Gaussian curvature fields for specific local features, polyps can be detected (a height-field curvature sketch follows this list).
  • the methods and systems described herein could be applied to virtually examine an animal, fish or inanimate object.
  • applications of the technique could be used to detect the contents of sealed objects which cannot be opened.
  • the technique could also be used inside an architectural structure such as a building or cavern and enable the operator to navigate through the structure.
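The sketches below are illustrative only: they are minimal Python/NumPy renderings of several steps described in the list above, not code from the patent, and every function, parameter and variable name is an assumption introduced for illustration. First, the local intensity vector for a voxel of the fourth cluster, in the simple 7-voxel form (the voxel of interest plus its six face neighbors on three mutually perpendicular axes); the CT volume is assumed to be a 3-D array indexed as (z, y, x).

```python
import numpy as np

def intensity_vector_7(volume, z, y, x):
    """Simple 7-voxel intensity vector: the selected voxel plus its six
    face neighbours along the three mutually perpendicular axes.
    The voxel (z, y, x) is assumed not to lie on the volume border."""
    return np.array([
        volume[z, y, x],      # the voxel of interest
        volume[z, y, x - 1],  # left neighbour
        volume[z, y, x + 1],  # right neighbour
        volume[z, y - 1, x],  # front neighbour
        volume[z, y + 1, x],  # back neighbour
        volume[z - 1, y, x],  # bottom neighbour
        volume[z + 1, y, x],  # top neighbour
    ], dtype=np.float64)
```

Collecting one such vector per cluster-4 voxel (or the 25-voxel variant of FIG. 17) gives the local vector series used by the feature analysis.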
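Next, a sketch of the principal component / K-L step: estimating an orthogonal transformation matrix from training local vectors and projecting the local vector series into feature-vector space. The two-function split and all names are assumptions.

```python
import numpy as np

def kl_transform(training_vectors, n_components):
    """Estimate a K-L (PCA) transformation from training local intensity
    vectors of shape (N, 25): returns the mean and an orthogonal matrix
    whose columns are the principal directions, highest variance first."""
    X = np.asarray(training_vectors, dtype=float)
    mean = X.mean(axis=0)
    cov = np.cov(X - mean, rowvar=False)          # 25 x 25 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]
    return mean, eigvecs[:, order]

def to_feature_vectors(local_vector_series, mean, kl_matrix):
    """Project every local intensity vector into feature-vector space; the
    first coordinate then carries the highest variance (the local average
    intensity information referred to above)."""
    return (np.asarray(local_vector_series, dtype=float) - mean) @ kl_matrix
```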
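A sketch of a self-adaptive vector-quantization pass over the feature vectors, using the Euclidean-distance / arg-min rule and the similarity threshold T quoted above. The running-mean update of each representative element and the max_classes cap are assumptions.

```python
import numpy as np

def self_adaptive_classify(feature_vectors, T, max_classes=64):
    """Assign each feature vector to the class of its nearest representative
    element (arg min over the Euclidean distances d_j); if the nearest
    representative is farther than the similarity threshold T, the vector
    seeds a new class."""
    reps, counts, labels = [], [], []
    for v in feature_vectors:
        v = np.asarray(v, dtype=float)
        if reps:
            d = np.linalg.norm(np.asarray(reps) - v, axis=1)
            j = int(np.argmin(d))                     # arg min_j d_j
        if not reps or (d[j] > T and len(reps) < max_classes):
            reps.append(v.copy())                     # new representative element
            counts.append(1)
            labels.append(len(reps) - 1)
        else:
            counts[j] += 1                            # running-mean update
            reps[j] += (v - reps[j]) / counts[j]
            labels.append(j)
    return np.asarray(labels), np.asarray(reps)
```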
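A small sketch of selecting the colon-wall class: any class whose representative element has its first feature coordinate inside the protocol-dependent threshold interval is taken as a colon wall class, and its voxels are marked colon wall-like. The interval end points lo and hi are assumptions standing in for the data-derived threshold interval.

```python
import numpy as np

def label_colon_wall_like(labels, reps, lo, hi):
    """Return a boolean mask over the classified voxels that is True for
    members of any class whose representative element's first feature
    coordinate lies in the threshold interval [lo, hi]."""
    wall_classes = [j for j, r in enumerate(reps) if lo <= r[0] <= hi]
    return np.isin(labels, wall_classes)
```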
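A sketch of the region-growing strategy used to identify the lung region: a 6-connected flood fill from a seed air voxel through a boolean `growable` mask (for example, air-intensity voxels not labeled colon wall-like). The mask construction and the seed choice in the topmost slice are left to the caller.

```python
from collections import deque
import numpy as np

def region_grow(growable, seed):
    """6-connected region growing from `seed` (z, y, x) through the boolean
    volume `growable`; returns the grown region as a boolean mask."""
    region = np.zeros(growable.shape, dtype=bool)
    if not growable[seed]:
        return region
    region[seed] = True
    queue = deque([seed])
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in steps:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[k] < growable.shape[k] for k in range(3)) \
                    and growable[n] and not region[n]:
                region[n] = True
                queue.append(n)
    return region
```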
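A sketch of the cluster-2 split between bone and enhanced stool/fluid using the anatomical rule quoted above: voxels more than a few voxel units from the colon wall and outside the rough colon volume are labeled bone. The boolean-mask interface and the use of a Euclidean distance transform are assumptions.

```python
import numpy as np
from scipy import ndimage

def split_cluster2(cluster2_mask, colon_wall_mask, colon_volume_mask, max_dist=3):
    """Label cluster-2 voxels as bone if they lie farther than `max_dist`
    voxel units from the colon wall and outside the rough colon volume;
    the remaining cluster-2 voxels are kept as intra-colonic stool/fluid."""
    # distance (in voxel units) from every voxel to the nearest colon-wall voxel
    dist_to_wall = ndimage.distance_transform_edt(~colon_wall_mask)
    bone = cluster2_mask & (dist_to_wall > max_dist) & ~colon_volume_mask
    stool_fluid = cluster2_mask & ~bone
    return bone, stool_fluid
```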
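A sketch of the entropy test for collapsed regions: the Shannon entropy of the grey-scale values inside a 5×5×5 window is computed at points along the previously determined centerline and compared with a protocol-dependent threshold. The histogram bin count and the assumption that the window stays inside the volume are illustrative choices.

```python
import numpy as np

def window_entropy(volume, center, half=2, bins=32):
    """Shannon entropy of the grey values in a (2*half+1)^3 cubic window
    centred on `center` (z, y, x); the window is assumed to lie fully
    inside the volume."""
    z, y, x = center
    w = volume[z - half:z + half + 1, y - half:y + half + 1, x - half:x + half + 1]
    hist, _ = np.histogram(w, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def detect_collapse_points(volume, centerline_points, threshold):
    """Return the centerline points whose local window entropy exceeds the
    predetermined threshold, i.e. candidate collapsed regions."""
    return [pt for pt in centerline_points
            if window_entropy(volume, pt) > threshold]
```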
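A deliberately small stand-in for the expanding-force idea: surface vertices joined by springs (playing the role of the stretching/bending terms), with a damping term and an outward force along the vertex normals, integrated with explicit Euler steps. This is not the patent's Lagrange formulation or finite-element model; every coefficient and name is an assumption.

```python
import numpy as np

def expand_surface(positions, edges, normals, steps=200, dt=0.01,
                   mass=1.0, damping=2.0, stiffness=50.0, force=5.0):
    """Push surface vertices (N, 3) outward along `normals` while springs
    along `edges` (pairs of vertex indices, rest length taken from the
    initial geometry) and a damping term resist the motion."""
    x = np.array(positions, dtype=float)
    v = np.zeros_like(x)
    i, j = np.asarray(edges).T
    rest = np.linalg.norm(x[i] - x[j], axis=1)
    for _ in range(steps):
        f = force * np.asarray(normals, dtype=float)   # expanding (gas) force
        d = x[i] - x[j]
        length = np.linalg.norm(d, axis=1, keepdims=True)
        spring = stiffness * (length - rest[:, None]) * d / np.maximum(length, 1e-9)
        np.add.at(f, i, -spring)                       # elastic restoring forces
        np.add.at(f, j, spring)
        f -= damping * v                               # damping term
        v += dt * f / mass                             # explicit Euler update
        x += dt * v
    return x
```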
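A sketch of the multiresolution texture-feature step for a single-channel (grey-converted) optical image: Gaussian filtering followed by 2× subsampling builds the pyramid, and a simple difference-of-Gaussians band-pass stands in for the non-oriented Laplacian features (steerable, oriented filters are omitted here).

```python
import numpy as np
from scipy import ndimage

def gaussian_pyramid(image, levels=4, sigma=1.0):
    """Gaussian low-pass filtering followed by 2x subsampling of a 2-D
    single-channel image, repeated to form a multiresolution pyramid."""
    pyramid = [np.asarray(image, dtype=float)]
    for _ in range(levels - 1):
        blurred = ndimage.gaussian_filter(pyramid[-1], sigma=sigma)
        pyramid.append(blurred[::2, ::2])
    return pyramid

def bandpass_features(pyramid, sigma=1.0):
    """Non-oriented band-pass detail at every pyramid level, as a stand-in
    for Laplacian-type texture features."""
    return [level - ndimage.gaussian_filter(level, sigma=sigma)
            for level in pyramid]
```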
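A sketch of comparing texture distributions across a boundary region with the Kullback-Leibler divergence between intensity (or feature) histograms; the bin count and the small epsilon regularizer are assumptions.

```python
import numpy as np

def kl_divergence(p_samples, q_samples, bins=64, eps=1e-12):
    """D(P || Q) between the normalised histograms of two sample sets,
    computed on a shared set of bin edges."""
    p_hist, edges = np.histogram(p_samples, bins=bins)
    q_hist, _ = np.histogram(q_samples, bins=edges)
    p = p_hist / max(p_hist.sum(), 1) + eps
    q = q_hist / max(q_hist.sum(), 1) + eps
    return float(np.sum(p * np.log(p / q)))
```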
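A sketch of a piecewise-linear transfer function of the kind used by the volume rendering step: sorted (value, RGBA) control points are interpolated per channel over the whole volume. The control-point interface is an assumption.

```python
import numpy as np

def apply_transfer_function(volume, control_points):
    """Map each sample value to an (R, G, B, opacity) tuple by linear
    interpolation between sorted (value, (r, g, b, a)) control points;
    values outside the range are clamped to the end points."""
    vals = np.array([v for v, _ in control_points], dtype=float)
    rgba = np.array([c for _, c in control_points], dtype=float)   # (K, 4)
    vol = np.asarray(volume, dtype=float)
    flat = vol.ravel()
    out = np.empty((flat.size, 4))
    for ch in range(4):
        out[:, ch] = np.interp(flat, vals, rgba[:, ch])
    return out.reshape(vol.shape + (4,))
```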
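Finally, a simplified curvature sketch: Gaussian curvature of a colon-wall surface patch represented as a local height field h(x, y), with positive curvature above a small threshold flagged as a convex, hill-like polyp candidate. This height-field (Monge patch) form is a stand-in for the C-2 surface model; in practice the sign of the mean curvature would also be checked to separate hills from pits.

```python
import numpy as np

def gaussian_curvature(h):
    """Gaussian curvature field of a height field h(x, y):
    K = (h_xx * h_yy - h_xy * h_yx) / (1 + h_x^2 + h_y^2)^2."""
    hy, hx = np.gradient(np.asarray(h, dtype=float))
    hyy, hyx = np.gradient(hy)
    hxy, hxx = np.gradient(hx)
    return (hxx * hyy - hxy * hyx) / (1.0 + hx ** 2 + hy ** 2) ** 2

def polyp_candidates(h, k_min=0.05):
    """Boolean map of patch locations whose Gaussian curvature exceeds
    `k_min`, i.e. candidate convex hill-like structures."""
    return gaussian_curvature(h) > k_min
```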

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Architecture (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Vascular Medicine (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
US09/343,012 1996-09-16 1999-06-29 System and method for performing a three-dimensional virtual segmentation and examination Expired - Lifetime US6331116B1 (en)

Priority Applications (33)

Application Number Priority Date Filing Date Title
US09/343,012 US6331116B1 (en) 1996-09-16 1999-06-29 System and method for performing a three-dimensional virtual segmentation and examination
US09/493,559 US6343936B1 (en) 1996-09-16 2000-01-28 System and method for performing a three-dimensional virtual examination, navigation and visualization
BR0009099-9A BR0009099A (pt) 1999-03-18 2000-03-17 Sistema e processo para efetuar um exame, uma navegação e uma visualização tridimensionais virtuais
AU39018/00A AU3901800A (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual segmentation and examination
IL14551600A IL145516A0 (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual segmentation and examination
CA002368058A CA2368058A1 (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual examination, navigation and visualization
KR1020067021647A KR100790536B1 (ko) 1999-03-18 2000-03-17 가상 결장 내강을 통한 이동 경로를 생성하는 방법
CNB008076375A CN1277241C (zh) 1999-03-18 2000-03-17 实行三维虚拟检查、导引行进和可视化的系统和方法
EP00918154.6A EP1173830B1 (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual segmentation and examination
JP2000605971A JP4435430B2 (ja) 1999-03-18 2000-03-17 3次元仮想細分化および検査を実施するシステムおよび方法
PCT/US2000/007352 WO2000055814A2 (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual segmentation and examination
EP00918153A EP1161741A1 (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual examination, navigation and visualization
KR1020017011901A KR100701235B1 (ko) 1999-03-18 2000-03-17 삼차원 가상 분할 및 검사 시스템 및 방법
BR0009098-0A BR0009098A (pt) 1999-03-18 2000-03-17 Sistema e processo para efetuar uma segmentação e exame tridimensionais virtuais
KR1020067021648A KR20060116872A (ko) 1999-03-18 2000-03-17 가상 결장 내강의 검사 방법
IL14551500A IL145515A0 (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual examination, navigation and visualization
CNB008076383A CN1248167C (zh) 1999-03-18 2000-03-17 实行三维虚拟分割和检查的系统和方法
JP2000605969A JP2002538915A (ja) 1999-03-18 2000-03-17 3次元仮想検査、ナビゲーション及び視覚化を実行するシステム及び方法
KR1020017011900A KR100701234B1 (ko) 1999-03-18 2000-03-17 3차원 가상 검사, 네비게이션 및 가시화 시스템 및 방법
AU39017/00A AU3901700A (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual examination, navigation and visualization
PCT/US2000/007351 WO2000055812A1 (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual examination, navigation and visualization
CA2368390A CA2368390C (en) 1999-03-18 2000-03-17 System and method for performing a three-dimensional virtual segmentation and examination
US09/777,120 US7194117B2 (en) 1999-06-29 2001-02-05 System and method for performing a three-dimensional virtual examination of objects, such as internal organs
IS6079A IS6079A (is) 1999-03-18 2001-09-18 Kerfi og aðferð til að framkvæma í þrívídd sýndarhlutun og -rannsókn
IS6078A IS6078A (is) 1999-03-18 2001-09-18 Kerfi og aðferð til að framkvæma í þrívídd sýndarskoðun, -ráp og sjóngervingu
IL145516A IL145516A (en) 1999-03-18 2001-09-20 System and method for segmentation and simulated three-dimensional tests
US09/974,548 US7148887B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional virtual segmentation and examination with optical texture mapping
US09/974,569 US6514082B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional examination with collapse correction
IL178769A IL178769A (en) 1999-03-18 2006-10-19 System and method for performing a simulated colonoscopy
IL178768A IL178768A (en) 1999-03-18 2006-10-19 System and method for optical mapping of texture properties from at least one optical image to a monochromatic array of information
US11/613,283 US7474776B2 (en) 1996-09-16 2006-12-20 System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US11/613,297 US7486811B2 (en) 1996-09-16 2006-12-20 System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US11/613,306 US7477768B2 (en) 1999-06-29 2006-12-20 System and method for performing a three-dimensional virtual examination of objects, such as internal organs

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US08/714,697 US5971767A (en) 1996-09-16 1996-09-16 System and method for performing a three-dimensional virtual examination
US12504199P 1999-03-18 1999-03-18
US09/343,012 US6331116B1 (en) 1996-09-16 1999-06-29 System and method for performing a three-dimensional virtual segmentation and examination

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US08/714,697 Continuation-In-Part US5971767A (en) 1996-09-16 1996-09-16 System and method for performing a three-dimensional virtual examination

Related Child Applications (6)

Application Number Title Priority Date Filing Date
US09/493,559 Continuation-In-Part US6343936B1 (en) 1996-09-16 2000-01-28 System and method for performing a three-dimensional virtual examination, navigation and visualization
US09/777,120 Continuation-In-Part US7194117B2 (en) 1996-09-16 2001-02-05 System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US09/974,569 Division US6514082B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional examination with collapse correction
US09/974,548 Division US7148887B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional virtual segmentation and examination with optical texture mapping
US11/613,297 Continuation-In-Part US7486811B2 (en) 1996-09-16 2006-12-20 System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US11/613,283 Continuation-In-Part US7474776B2 (en) 1996-09-16 2006-12-20 System and method for performing a three-dimensional virtual examination of objects, such as internal organs

Publications (1)

Publication Number Publication Date
US6331116B1 true US6331116B1 (en) 2001-12-18

Family

ID=26823212

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/343,012 Expired - Lifetime US6331116B1 (en) 1996-09-16 1999-06-29 System and method for performing a three-dimensional virtual segmentation and examination
US09/974,569 Expired - Lifetime US6514082B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional examination with collapse correction
US09/974,548 Expired - Lifetime US7148887B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional virtual segmentation and examination with optical texture mapping

Family Applications After (2)

Application Number Title Priority Date Filing Date
US09/974,569 Expired - Lifetime US6514082B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional examination with collapse correction
US09/974,548 Expired - Lifetime US7148887B2 (en) 1996-09-16 2001-10-10 System and method for performing a three-dimensional virtual segmentation and examination with optical texture mapping

Country Status (11)

Country Link
US (3) US6331116B1 (xx)
EP (1) EP1173830B1 (xx)
JP (1) JP4435430B2 (xx)
KR (1) KR100701235B1 (xx)
CN (1) CN1248167C (xx)
AU (1) AU3901800A (xx)
BR (1) BR0009098A (xx)
CA (1) CA2368390C (xx)
IL (1) IL145516A0 (xx)
IS (1) IS6079A (xx)
WO (1) WO2000055814A2 (xx)

Cited By (123)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002029764A1 (en) * 2000-10-03 2002-04-11 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US20020097320A1 (en) * 2000-04-07 2002-07-25 Zalis Michael E. System for digital bowel subtraction and polyp detection and related techniques
US6477401B1 (en) 2000-03-10 2002-11-05 Mayo Foundation For Medical Education And Research Colonography of an unprepared colon
US6476803B1 (en) * 2000-01-06 2002-11-05 Microsoft Corporation Object modeling system and process employing noise elimination and robust surface extraction techniques
US20020164060A1 (en) * 2001-05-04 2002-11-07 Paik David S. Method for characterizing shapes in medical images
US20020165689A1 (en) * 2001-04-18 2002-11-07 Callegari Andres C. Volume body renderer
US20020164061A1 (en) * 2001-05-04 2002-11-07 Paik David S. Method for detecting shapes in medical images
US20030108853A1 (en) * 2000-05-19 2003-06-12 Edna Chosack Endoscopic tutorial system for the pancreatic system
US20030132936A1 (en) * 2001-11-21 2003-07-17 Kevin Kreeger Display of two-dimensional and three-dimensional views during virtual examination
WO2003083781A1 (en) * 2002-03-29 2003-10-09 Koninklijke Philips Electronics N.V. Method, system and computer program for stereoscopic viewing of 3d medical images
US6643533B2 (en) * 2000-11-28 2003-11-04 Ge Medical Systems Global Technology Company, Llc Method and apparatus for displaying images of tubular structures
US20040064029A1 (en) * 2002-09-30 2004-04-01 The Government Of The Usa As Represented By The Secretary Of The Dept. Of Health & Human Services Computer-aided classification of anomalies in anatomical structures
US6718193B2 (en) * 2000-11-28 2004-04-06 Ge Medical Systems Global Technology Company, Llc Method and apparatus for analyzing vessels displayed as unfolded structures
US20040136584A1 (en) * 2002-09-27 2004-07-15 Burak Acar Method for matching and registering medical image data
US6765566B1 (en) * 1998-12-22 2004-07-20 Che-Chih Tsao Method and apparatus for displaying volumetric 3D images
US20040141638A1 (en) * 2002-09-30 2004-07-22 Burak Acar Method for detecting and classifying a structure of interest in medical images
US20040147830A1 (en) * 2003-01-29 2004-07-29 Virtualscopics Method and system for use of biomarkers in diagnostic imaging
US20040151668A1 (en) * 2000-03-07 2004-08-05 Kevin Tait Stool marker
US20040220466A1 (en) * 2003-04-02 2004-11-04 Kazuhiko Matsumoto Medical image processing apparatus, and medical image processing method
US20040259065A1 (en) * 2003-05-08 2004-12-23 Siemens Corporate Research Inc. Method and apparatus for automatic setting of rendering parameter for virtual endoscopy
US20050015004A1 (en) * 2003-07-17 2005-01-20 Hertel Sarah Rose Systems and methods for combining an anatomic structure and metabolic activity for an object
US20050024724A1 (en) * 2002-01-09 2005-02-03 Bo-Hyoung Kim Apparatus and method for displaying virtual endoscopy display
US20050058351A1 (en) * 2003-06-30 2005-03-17 Ione Fine Surface segmentation from luminance and color differences
US20050074150A1 (en) * 2003-10-03 2005-04-07 Andrew Bruss Systems and methods for emulating an angiogram using three-dimensional image data
US20050078858A1 (en) * 2003-10-10 2005-04-14 The Government Of The United States Of America Determination of feature boundaries in a digital representation of an anatomical structure
US20050113696A1 (en) * 2003-11-25 2005-05-26 Miller Steven C. Methods and systems for motion adaptive spatial compounding
WO2005048198A1 (en) * 2003-11-14 2005-05-26 Philips Intellectual Property & Standards Gmbh Method and apparatus for visualisation of a tubular structure
US20050114831A1 (en) * 2001-04-18 2005-05-26 Andres Callegari Volume body renderer
US20050110791A1 (en) * 2003-11-26 2005-05-26 Prabhu Krishnamoorthy Systems and methods for segmenting and displaying tubular vessels in volumetric imaging data
US20050143654A1 (en) * 2003-11-29 2005-06-30 Karel Zuiderveld Systems and methods for segmented volume rendering using a programmable graphics pipeline
WO2005061008A1 (en) * 2003-12-01 2005-07-07 The United States Of America As Represented By The Secretary Of The Navy Bowel preparation for virtual colonscopy
US20050148848A1 (en) * 2003-11-03 2005-07-07 Bracco Imaging, S.P.A. Stereo display of tube-like structures and improved techniques therefor ("stereo display")
US20050175542A1 (en) * 2002-04-06 2005-08-11 Philippe Lefere System, formulation, kit and method for tagging colonic residue in an individual
US20050281481A1 (en) * 2004-06-07 2005-12-22 Lutz Guendel Method for medical 3D image display and processing, computed tomograph, workstation and computer program product
US20060024236A1 (en) * 2004-07-27 2006-02-02 Pickhardt Perry J Bowel preparation for virtual colonoscopy
US20060073454A1 (en) * 2001-01-24 2006-04-06 Anders Hyltander Method and system for simulation of surgical procedures
US20060079746A1 (en) * 2004-10-11 2006-04-13 Perret Florence M Apparatus and method for analysis of tissue classes along tubular structures
US7039723B2 (en) 2001-08-31 2006-05-02 Hinnovation, Inc. On-line image processing and communication system
US20060093196A1 (en) * 2004-10-28 2006-05-04 Odry Benjamin L System and method for automatic detection and localization of 3D bumps in medical images
US20060099557A1 (en) * 2002-09-30 2006-05-11 Anders Hyltander Device and method for generating a virtual anatomic environment
US20060157069A1 (en) * 2005-01-19 2006-07-20 Ziosoft, Inc. Identification method and computer readable medium
US20060184003A1 (en) * 2005-02-03 2006-08-17 Lewin Jonathan S Intra-procedurally determining the position of an internal anatomical target location using an externally measurable parameter
US20060187221A1 (en) * 2005-02-22 2006-08-24 Sarang Lakare System and method for identifying and removing virtual objects for visualization and computer aided detection
US20060215896A1 (en) * 2003-10-31 2006-09-28 General Electric Company Method and apparatus for virtual subtraction of stool from registration and shape based analysis of prone and supine scans of the colon
US20060228003A1 (en) * 2005-04-06 2006-10-12 Silverstein D A Method and apparatus for detection of optical elements
US20060293792A1 (en) * 2005-06-17 2006-12-28 Honda Motor Co., Ltd. Path generator for mobile object
US20070003122A1 (en) * 2005-06-29 2007-01-04 General Electric Company Method for quantifying an object in a larger structure using a reconstructed image
WO2007002146A2 (en) * 2005-06-22 2007-01-04 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
US20070014463A1 (en) * 2004-10-15 2007-01-18 Brigham And Women's Hospital Factor Analysis in Medical Imaging
US20070073114A1 (en) * 2005-09-28 2007-03-29 Lutz Gundel Method and apparatus for post-processing of a 3D image data record, in particular for virtual colonography
US20070109299A1 (en) * 2005-11-15 2007-05-17 Vital Images, Inc. Surface-based characteristic path generation
US20070116346A1 (en) * 2005-11-23 2007-05-24 Peterson Samuel W Characteristic path-based colon segmentation
US20070120845A1 (en) * 2005-11-25 2007-05-31 Kazuhiko Matsumoto Image processing method and computer readable medium for image processing
US20070127804A1 (en) * 2005-11-30 2007-06-07 The General Hospital Corporation Adaptive density mapping in computed tomographic images
DE102006001655A1 (de) * 2006-01-12 2007-08-02 Siemens Ag Verfahren und Vorrichtung zur virtuellen Darmreinigung
US20070232896A1 (en) * 1998-09-24 2007-10-04 Super Dimension Ltd. System and method of recording and displaying in context of an image a location of at least one point-of-interest in a body during an intra-body medical procedure
US20080009760A1 (en) * 2006-06-30 2008-01-10 Broncus Technologies, Inc. Airway bypass site selection and treatment planning
KR100802137B1 (ko) 2006-07-21 2008-02-12 한국과학기술원 대장모델 생성 방법, 충돌 검사 방법 및 이를 이용한내시경 시뮬레이션 방법
US20080055308A1 (en) * 2004-06-23 2008-03-06 Koninklijke Philips Electronics N.V. Virtual Endoscopy
US7346209B2 (en) 2002-09-30 2008-03-18 The Board Of Trustees Of The Leland Stanford Junior University Three-dimensional pattern recognition method to detect shapes in medical images
US20080118131A1 (en) * 2006-11-22 2008-05-22 General Electric Company Method and system for automatically identifying and displaying vessel plaque views
US20080118133A1 (en) * 2006-11-22 2008-05-22 General Electric Company Methods and apparatus for suppressing tagging material in prepless CT colonography
US20080118111A1 (en) * 2006-11-22 2008-05-22 Saad Ahmed Sirohey Method and apparatus for synchronizing corresponding landmarks among a plurality of images
US20080118127A1 (en) * 2006-11-22 2008-05-22 General Electric Company Methods and apparatus for detecting aneurysm in vasculatures
US20080160489A1 (en) * 2005-02-23 2008-07-03 Koninklijke Philips Electronics, N.V. Method For the Prediction of the Course of a Catheter
EP1955263A2 (en) * 2005-10-17 2008-08-13 The General Hospital Corporation Structure-analysis system, method, software arrangement and computer-accessible medium for digital cleansing of computed tomography colonography images
US20080194946A1 (en) * 2007-02-12 2008-08-14 The Government Of The U.S.A. As Represented By The Secretary Of The Dept. Of Health & Human Services Virtual colonoscopy via wavelets
US20080228067A1 (en) * 2005-10-21 2008-09-18 Koninklijke Philips Electronics N.V. Rendering Method and Apparatus
US7440601B1 (en) 2003-10-10 2008-10-21 The United States Of America As Represented By The Department Of Health And Human Services Automated identification of ileocecal valve
US20080260274A1 (en) * 2007-04-23 2008-10-23 Microsoft Corporation Local image descriptors
US20080273781A1 (en) * 2005-02-14 2008-11-06 Mayo Foundation For Medical Education And Research Electronic Stool Subtraction in Ct Colonography
US20080304616A1 (en) * 2007-06-05 2008-12-11 The Government Of The U.S.A. As Represented By The Secretary Of The Dept. Of Health & Human Services Segmenting colon wall via level set techniques
US20090028405A1 (en) * 2004-04-08 2009-01-29 Hadassa Degani Three Time Point Lung Cancer Detection, Diagnosis and Assessment of Prognosis
US20090055137A1 (en) * 2007-08-22 2009-02-26 Imed Gargouri Method for obtaining geometric properties of an anatomic part
US20090080747A1 (en) * 2007-09-21 2009-03-26 Le Lu User interface for polyp annotation, segmentation, and measurement in 3D computed tomography colonography
DE102007056800A1 (de) 2007-11-23 2009-06-04 Siemens Ag Verfahren zur tomographischen Darstellung eines mit einer Substanz versehenen Hohlorgans und Tomographiegerät
US20090244060A1 (en) * 2008-04-01 2009-10-01 Michael Suhling Method and apparatus for visualizing an image data record of an organ enclosing a cavity, in particular a CT image data record of a colon
US20090297010A1 (en) * 2008-05-28 2009-12-03 Dominik Fritz Method and apparatus for visualizing tubular anatomical structures, in particular vessel structures, in medical 3D image records
WO2009144290A1 (en) * 2008-05-28 2009-12-03 Dublin City University Electronic cleansing of digital data sets
US20090309961A1 (en) * 2008-06-16 2009-12-17 Olympus Corporation Image processing apparatus, image processing method and image processing program
US20100021026A1 (en) * 2008-07-25 2010-01-28 Collins Michael J Computer-aided detection and display of colonic residue in medical imagery of the colon
US20100067753A1 (en) * 2005-06-21 2010-03-18 Koninklijke Philips Electronics, N.V. Method and device for imaging a blood vessel
US20100208956A1 (en) * 2005-11-30 2010-08-19 The Research Foundation Of State University Of New York Electronic colon cleansing method for virtual colonoscopy
CN101084840B (zh) * 2006-06-08 2011-01-26 西门子公司 利用x射线透视配准功能磁共振图像数据的方法
US20110206253A1 (en) * 2010-02-01 2011-08-25 Superdimension, Ltd. Region-Growing Algorithm
US20110206250A1 (en) * 2010-02-24 2011-08-25 Icad, Inc. Systems, computer-readable media, and methods for the classification of anomalies in virtual colonography medical image processing
USRE42952E1 (en) 1999-11-05 2011-11-22 Vital Images, Inc. Teleradiology systems for rendering and visualizing remotely-located volume data sets
US20110285695A1 (en) * 2008-03-07 2011-11-24 Georg-Friedemann Rust Pictorial Representation in Virtual Endoscopy
US20120011457A1 (en) * 2009-03-20 2012-01-12 Koninklijke Philips Electronics N.V. Visualizing a view of a scene
US8184888B2 (en) 2007-09-19 2012-05-22 Siemens Medical Solutions Usa, Inc. Method and system for polyp segmentation for 3D computed tomography colonography
US20120128218A1 (en) * 2008-09-25 2012-05-24 Cae Healthcare Inc. Simulation of Medical Imaging
US20120169735A1 (en) * 2009-09-11 2012-07-05 Koninklijke Philips Electronics N.V. Improvements to curved planar reformation
US20120278711A1 (en) * 2003-09-16 2012-11-01 Labtest International, Inc. D/B/A Intertek Consumer Goods North America Haptic response system and method of use
US20130121549A1 (en) * 2010-07-30 2013-05-16 Koninklijke Philips Electronics N.V. Organ-specific enhancement filter for robust segmentation of medical images
US8452068B2 (en) 2008-06-06 2013-05-28 Covidien Lp Hybrid registration method
US8473032B2 (en) 2008-06-03 2013-06-25 Superdimension, Ltd. Feature-based registration method
US8608724B2 (en) 2004-07-19 2013-12-17 Broncus Medical Inc. Devices for delivering substances through an extra-anatomic opening created in an airway
US20140010430A1 (en) * 2011-02-24 2014-01-09 Dog Microsystems Inc. Method and apparatus for isolating a potential anomaly in imaging data and its application to medical imagery
US8709034B2 (en) 2011-05-13 2014-04-29 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US20140193789A1 (en) * 2011-07-28 2014-07-10 Ryoichi Imanaka Cutting simulation device and cutting simulation program
US9053563B2 (en) 2011-02-11 2015-06-09 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US20150169826A1 (en) * 2012-06-14 2015-06-18 Sony Corporation Information processing apparatus, information processing method, and information processing program
US20160019694A1 (en) * 2013-03-29 2016-01-21 Fujifilm Corporation Region extraction apparatus, method, and program
US9345532B2 (en) 2011-05-13 2016-05-24 Broncus Medical Inc. Methods and devices for ablation of tissue
US20160350979A1 (en) * 2015-05-28 2016-12-01 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
US9533128B2 (en) 2003-07-18 2017-01-03 Broncus Medical Inc. Devices for maintaining patency of surgically created channels in tissue
US9575140B2 (en) 2008-04-03 2017-02-21 Covidien Lp Magnetic interference detection system and method
US20170227620A1 (en) * 2016-02-09 2017-08-10 Toshiba Medical Systems Corporation Image processing device and mri apparatus
US9996918B2 (en) 2013-04-10 2018-06-12 The Asan Foundation Method for distinguishing pulmonary artery and pulmonary vein, and method for quantifying blood vessels using same
US10025950B1 (en) * 2017-09-17 2018-07-17 Everalbum, Inc Systems and methods for image recognition
US10242488B1 (en) * 2015-03-02 2019-03-26 Kentucky Imaging Technologies, LLC One-sided transparency: a novel visualization for tubular objects
US10272260B2 (en) 2011-11-23 2019-04-30 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10426424B2 (en) 2017-11-21 2019-10-01 General Electric Company System and method for generating and performing imaging protocol simulations
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
WO2020254845A1 (en) 2019-04-02 2020-12-24 Autoid Polska S.A. System for analysis and compression of video results of an endoscopic examination
US12089902B2 (en) 2019-07-30 2024-09-17 Coviden Lp Cone beam and 3D fluoroscope lung navigation

Families Citing this family (134)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6331116B1 (en) 1996-09-16 2001-12-18 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual segmentation and examination
US7486811B2 (en) 1996-09-16 2009-02-03 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination of objects, such as internal organs
DE19835215C2 (de) * 1998-08-05 2000-07-27 Mannesmann Vdo Ag Kombinationsinstrument
US6692258B1 (en) * 2000-06-26 2004-02-17 Medical Learning Company, Inc. Patient simulator
US6674880B1 (en) * 1999-11-24 2004-01-06 Confirma, Inc. Convolution filtering of similarity data for visual display of enhanced image
JP3254451B2 (ja) * 2000-03-06 2002-02-04 経済産業省産業技術総合研究所長 多チャンネルmri画像処理によるカラー化方法及び装置
US6690816B2 (en) 2000-04-07 2004-02-10 The University Of North Carolina At Chapel Hill Systems and methods for tubular object processing
US7356367B2 (en) 2000-06-06 2008-04-08 The Research Foundation Of State University Of New York Computer aided treatment planning and visualization with image registration and fusion
US6775405B1 (en) * 2000-09-29 2004-08-10 Koninklijke Philips Electronics, N.V. Image registration system and method using cross-entropy optimization
JP2004522464A (ja) 2000-10-02 2004-07-29 ザ リサーチ ファウンデーション オブ ステイト ユニヴァーシティ オブ ニューヨーク 向上した視覚化、ナビゲーション及び検査
US6917710B2 (en) * 2001-02-05 2005-07-12 National Instruments Corporation System and method for scanning a region using a low discrepancy curve
US6909801B2 (en) * 2001-02-05 2005-06-21 National Instruments Corporation System and method for generating a low discrepancy curve on an abstract surface
US7034831B2 (en) * 2001-02-05 2006-04-25 National Instruments Corporation System and method for generating a low discrepancy curve in a region
US6950552B2 (en) * 2001-02-05 2005-09-27 National Instruments Corporation System and method for precise location of a point of interest
US7630750B2 (en) 2001-02-05 2009-12-08 The Research Foundation For The State University Of New York Computer aided treatment planning
US6959104B2 (en) * 2001-02-05 2005-10-25 National Instruments Corporation System and method for scanning a region using a low discrepancy sequence
US6792131B2 (en) * 2001-02-06 2004-09-14 Microsoft Corporation System and method for performing sparse transformed template matching using 3D rasterization
US7127100B2 (en) * 2001-06-25 2006-10-24 National Instruments Corporation System and method for analyzing an image
US7324104B1 (en) 2001-09-14 2008-01-29 The Research Foundation Of State University Of New York Method of centerline generation in virtual objects
US7596256B1 (en) 2001-09-14 2009-09-29 The Research Foundation For The State University Of New York Computer assisted detection of lesions in volumetric medical images
US20030086595A1 (en) * 2001-11-07 2003-05-08 Hui Hu Display parameter-dependent pre-transmission processing of image data
ATE518553T1 (de) 2001-11-21 2011-08-15 Bracco Diagnostics Inc Vorrichtung und system zur sammlung von ausfluss von einer person
JP2005511234A (ja) * 2001-12-14 2005-04-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 容積走査に基づいた対象の内臓管腔臓器壁の表面組織の視覚化に関する方法、システム及びコンピュータープログラム
US20030152897A1 (en) * 2001-12-20 2003-08-14 Bernhard Geiger Automatic navigation for virtual endoscopy
US6956373B1 (en) 2002-01-02 2005-10-18 Hugh Keith Brown Opposed orthogonal fusion system and method for generating color segmented MRI voxel matrices
US6658080B1 (en) * 2002-08-05 2003-12-02 Voxar Limited Displaying image data using automatic presets
TW558689B (en) * 2002-08-30 2003-10-21 Univ Taipei Medical Three-dimensional surgery simulation system and method
US6996500B2 (en) * 2002-10-30 2006-02-07 Hewlett-Packard Development Company, L.P. Method for communicating diagnostic data
ATE384314T1 (de) * 2002-11-21 2008-02-15 Koninkl Philips Electronics Nv Verfahren und vorrichtung zur visualisierung einer sequenz von volumenbildern
US7304644B2 (en) * 2003-03-12 2007-12-04 Siemens Medical Solutions Usa, Inc. System and method for performing a virtual endoscopy
US7457444B2 (en) 2003-05-14 2008-11-25 Siemens Medical Solutions Usa, Inc. Method and apparatus for fast automatic centerline extraction for virtual endoscopy
US7301538B2 (en) * 2003-08-18 2007-11-27 Fovia, Inc. Method and system for adaptive direct volume rendering
US7049319B2 (en) 2003-09-23 2006-05-23 Semaan Abboud Colon cleansing composition and method
US7868900B2 (en) * 2004-05-12 2011-01-11 General Electric Company Methods for suppression of items and areas of interest during visualization
WO2005055137A2 (en) * 2003-11-26 2005-06-16 Viatronix Incorporated Vessel segmentation using vesselness and edgeness
US7480412B2 (en) * 2003-12-16 2009-01-20 Siemens Medical Solutions Usa, Inc. Toboggan-based shape characterization
DE102004011155A1 (de) * 2004-03-08 2005-08-18 Siemens Ag Verfahren zur Visualisierung von mit einem bildgebenden, endoluminalen Instrument aufgezeichneten 2D-Bilddaten eines Hohlkanals
US7609910B2 (en) * 2004-04-09 2009-10-27 Siemens Medical Solutions Usa, Inc. System and method for creating a panoramic view of a volumetric image
US7590310B2 (en) 2004-05-05 2009-09-15 Facet Technology Corp. Methods and apparatus for automated true object-based image analysis and retrieval
WO2005111944A1 (ja) * 2004-05-19 2005-11-24 Kyoto University データ表示方法、データ表示装置、データ表示プログラム及びそれを記録したコンピュータ読み取り可能な記録媒体
WO2005119505A2 (en) * 2004-06-04 2005-12-15 Stereotaxis, Inc. User interface for remote control of medical devices
CN1301494C (zh) * 2004-06-07 2007-02-21 东软飞利浦医疗设备系统有限责任公司 一种医学图像的三维分割方法
US7724258B2 (en) * 2004-06-30 2010-05-25 Purdue Research Foundation Computer modeling and animation of natural phenomena
SE528068C2 (sv) 2004-08-19 2006-08-22 Jan Erik Solem Med Jsolutions Igenkänning av 3D föremål
US20060047227A1 (en) * 2004-08-24 2006-03-02 Anna Jerebko System and method for colon wall extraction in the presence of tagged fecal matter or collapsed colon regions
US20060149127A1 (en) * 2004-12-30 2006-07-06 Seddiqui Fred R Disposable multi-lumen catheter with reusable stylet
US8797392B2 (en) 2005-01-05 2014-08-05 Avantis Medical Sytems, Inc. Endoscope assembly with a polarizing filter
US8872906B2 (en) 2005-01-05 2014-10-28 Avantis Medical Systems, Inc. Endoscope assembly with a polarizing filter
US8289381B2 (en) 2005-01-05 2012-10-16 Avantis Medical Systems, Inc. Endoscope with an imaging catheter assembly and method of configuring an endoscope
US20080021274A1 (en) * 2005-01-05 2008-01-24 Avantis Medical Systems, Inc. Endoscopic medical device with locking mechanism and method
US20070293720A1 (en) * 2005-01-05 2007-12-20 Avantis Medical Systems, Inc. Endoscope assembly and method of viewing an area inside a cavity
US8182422B2 (en) * 2005-12-13 2012-05-22 Avantis Medical Systems, Inc. Endoscope having detachable imaging device and method of using
US7583831B2 (en) * 2005-02-10 2009-09-01 Siemens Medical Solutions Usa, Inc. System and method for using learned discriminative models to segment three dimensional colon image data
EP1849402B1 (en) * 2005-02-15 2018-05-16 Olympus Corporation Medical image processing device, lumen image processing device, lumen image processing method, and programs for them
KR100970295B1 (ko) * 2005-04-13 2010-07-15 올림푸스 메디칼 시스템즈 가부시키가이샤 화상 처리 장치 및 화상 처리 방법
JP4105176B2 (ja) * 2005-05-19 2008-06-25 ザイオソフト株式会社 画像処理方法および画像処理プログラム
US7889905B2 (en) 2005-05-23 2011-02-15 The Penn State Research Foundation Fast 3D-2D image registration method with application to continuously guided endoscopy
US7756563B2 (en) * 2005-05-23 2010-07-13 The Penn State Research Foundation Guidance method based on 3D-2D pose estimation and 3D-CT registration with application to live bronchoscopy
US20070015989A1 (en) * 2005-07-01 2007-01-18 Avantis Medical Systems, Inc. Endoscope Image Recognition System and Method
US7806850B2 (en) 2005-10-24 2010-10-05 Bracco Diagnostics Inc. Insufflating system, method, and computer program product for controlling the supply of a distending media to an endoscopic device
US7983462B2 (en) * 2005-11-22 2011-07-19 Purdue Research Foundation Methods and systems for improving quality of an image
EP2412300B1 (en) * 2005-12-28 2014-03-26 Olympus Medical Systems Corp. Image processing device and image processing method in image processing device
WO2007087421A2 (en) 2006-01-23 2007-08-02 Avantis Medical Systems, Inc. Endoscope
US7907772B2 (en) * 2006-03-30 2011-03-15 Accuray Incorporated Delineation on three-dimensional medical image
US8287446B2 (en) 2006-04-18 2012-10-16 Avantis Medical Systems, Inc. Vibratory device, endoscope having such a device, method for configuring an endoscope, and method of reducing looping of an endoscope
US7613539B2 (en) * 2006-05-09 2009-11-03 Inus Technology, Inc. System and method for mesh and body hybrid modeling using 3D scan data
JP2009537284A (ja) * 2006-05-19 2009-10-29 アヴァンティス メディカル システムズ インコーポレイテッド 画像を作成しかつ改善するためのシステムおよび方法
US20070279435A1 (en) * 2006-06-02 2007-12-06 Hern Ng Method and system for selective visualization and interaction with 3D image data
US20070279436A1 (en) * 2006-06-02 2007-12-06 Hern Ng Method and system for selective visualization and interaction with 3D image data, in a tunnel viewer
US8023703B2 (en) * 2006-07-06 2011-09-20 The United States of America as represented by the Secretary of the Department of Health and Human Services, National Institues of Health Hybrid segmentation of anatomical structure
US7927272B2 (en) * 2006-08-04 2011-04-19 Avantis Medical Systems, Inc. Surgical port with embedded imaging device
US20080117210A1 (en) * 2006-11-22 2008-05-22 Barco N.V. Virtual endoscopy
US7840051B2 (en) * 2006-11-22 2010-11-23 Toshiba Medical Visualization Systems Europe, Limited Medical image segmentation
KR100805387B1 (ko) * 2006-11-24 2008-02-25 충북대학교 산학협력단 내시경 영상의 변환 장치 및 방법
US20090156895A1 (en) * 2007-01-31 2009-06-18 The Penn State Research Foundation Precise endoscopic planning and visualization
US8672836B2 (en) * 2007-01-31 2014-03-18 The Penn State Research Foundation Method and apparatus for continuous guidance of endoscopy
US9037215B2 (en) 2007-01-31 2015-05-19 The Penn State Research Foundation Methods and apparatus for 3D route planning through hollow organs
US20090231419A1 (en) * 2007-02-06 2009-09-17 Avantis Medical Systems, Inc. Endoscope Assembly and Method of Performing a Medical Procedure
US8064666B2 (en) 2007-04-10 2011-11-22 Avantis Medical Systems, Inc. Method and device for examining or imaging an interior surface of a cavity
US8514218B2 (en) 2007-08-14 2013-08-20 Siemens Aktiengesellschaft Image-based path planning for automated virtual colonoscopy navigation
US20090213211A1 (en) * 2007-10-11 2009-08-27 Avantis Medical Systems, Inc. Method and Device for Reducing the Fixed Pattern Noise of a Digital Image
PT2848192T (pt) * 2007-10-15 2022-03-02 Univ Maryland Aparelho para utilização no estudo do cólon de um paciente
US8848995B2 (en) * 2007-12-28 2014-09-30 Im3D S.P.A. Method of classification of tagged material in a set of tomographic images of colorectal region
DE102008003878B3 (de) * 2008-01-10 2009-04-30 Siemens Aktiengesellschaft Verfahren und Vorrichtung zur Visualisierung von 3D-Bilddaten der tomographischen Bildgebung
US8200466B2 (en) 2008-07-21 2012-06-12 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US8131770B2 (en) 2009-01-30 2012-03-06 Nvidia Corporation System, method, and computer program product for importance sampling of partitioned domains
US9405886B2 (en) 2009-03-17 2016-08-02 The Board Of Trustees Of The Leland Stanford Junior University Method for determining cardiovascular information
US8392853B2 (en) * 2009-07-17 2013-03-05 Wxanalyst, Ltd. Transparent interface used to independently manipulate and interrogate N-dimensional focus objects in virtual and real visualization systems
US20110066406A1 (en) * 2009-09-15 2011-03-17 Chung Yuan Christian University Method for Generating Real-Time Haptic Response Information for a Haptic Simulating Device
US8611622B2 (en) * 2009-11-27 2013-12-17 Dog Microsystems Inc. Method for determining an estimation of a topological support of a tubular structure and use thereof in virtual endoscopy
JP5535725B2 (ja) * 2010-03-31 2014-07-02 富士フイルム株式会社 内視鏡観察支援システム、並びに、内視鏡観察支援装置、その作動方法およびプログラム
WO2011136336A1 (ja) 2010-04-30 2011-11-03 味の素株式会社 Ctコロノグラフィに用いられる経口投与用液剤及び消化管造影用組成物
US8315812B2 (en) 2010-08-12 2012-11-20 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
US8157742B2 (en) 2010-08-12 2012-04-17 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
DE102010040402B4 (de) * 2010-09-08 2012-12-06 Siemens Aktiengesellschaft Vereinfachte Definition von gekrümmten Anregungen in der parallelen MR-Bildgebung
CN103249447B (zh) 2010-11-24 2015-08-26 布莱可诊断公司 用于提供和控制用于ct结肠造影的膨胀介质的供应的系统、装置和方法
US9486142B2 (en) 2010-12-13 2016-11-08 The Trustees Of Columbia University In The City Of New York Medical imaging devices, methods, and systems
US9119655B2 (en) 2012-08-03 2015-09-01 Stryker Corporation Surgical manipulator capable of controlling a surgical instrument in multiple modes
US9921712B2 (en) 2010-12-29 2018-03-20 Mako Surgical Corp. System and method for providing substantially stable control of a surgical tool
EP2729066B1 (en) * 2011-07-07 2021-01-27 The Board of Trustees of the Leland Stanford Junior University Comprehensive cardiovascular analysis with volumetric phase-contrast mri
US20130072783A1 (en) 2011-09-16 2013-03-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Indicating proximity of a body-insertable device to a destination region of interest
WO2013142220A2 (en) 2012-03-22 2013-09-26 The Cleveland Clinic Foundation Augmented reconstruction for computed tomography
US8548778B1 (en) 2012-05-14 2013-10-01 Heartflow, Inc. Method and system for providing information from a patient-specific model of blood flow
US9226796B2 (en) 2012-08-03 2016-01-05 Stryker Corporation Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path
KR102235965B1 (ko) 2012-08-03 2021-04-06 스트리커 코포레이션 로봇 수술을 위한 시스템 및 방법
US9820818B2 (en) 2012-08-03 2017-11-21 Stryker Corporation System and method for controlling a surgical manipulator based on implant parameters
JP6080249B2 (ja) * 2012-09-13 2017-02-15 富士フイルム株式会社 3次元画像表示装置および方法並びにプログラム
JP6080248B2 (ja) * 2012-09-13 2017-02-15 富士フイルム株式会社 3次元画像表示装置および方法並びにプログラム
CN103903303B (zh) * 2012-12-27 2018-01-30 清华大学 三维模型创建方法和设备
KR101348680B1 (ko) * 2013-01-09 2014-01-09 국방과학연구소 영상추적기를 위한 표적포착방법 및 이를 이용한 표적포착장치
EP2996615B1 (en) 2013-03-13 2019-01-30 Stryker Corporation System for arranging objects in an operating room in preparation for surgical procedures
AU2014248758B2 (en) 2013-03-13 2018-04-12 Stryker Corporation System for establishing virtual constraint boundaries
KR101851221B1 (ko) 2013-07-05 2018-04-25 삼성전자주식회사 초음파 영상 장치 및 그 제어 방법
CN104778743B (zh) * 2013-10-30 2018-04-17 宏达国际电子股份有限公司 用于产生三维景象的装置及由电脑执行的产生三维景象的方法
JP5844438B2 (ja) * 2014-07-25 2016-01-20 富士設計株式会社 三次元測定対象物の形態調査方法
US9672747B2 (en) 2015-06-15 2017-06-06 WxOps, Inc. Common operating environment for aircraft operations
US20170084036A1 (en) * 2015-09-21 2017-03-23 Siemens Aktiengesellschaft Registration of video camera with medical imaging
CN113925610B (zh) 2015-12-31 2024-08-13 史赛克公司 用于在由虚拟对象限定的目标部位处对患者执行手术的系统和方法
EP3554414A1 (en) 2016-12-16 2019-10-23 MAKO Surgical Corp. Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site
KR101930905B1 (ko) 2017-02-28 2018-12-19 메디컬아이피 주식회사 의료영상의 영역 분리 방법 및 그 장치
DE102017203248B3 (de) * 2017-02-28 2018-03-22 Siemens Healthcare Gmbh Verfahren zum Bestimmen einer Biopsieposition, Verfahren zum Optimieren eines Positionsbestimmungsalgorithmus, Positionsbestimmungseinheit, bildgebende medizinische Vorrichtung, Computerprogrammprodukte und computerlesbare Speichermedien
TWI632893B (zh) * 2017-05-16 2018-08-21 國立陽明大學 偵測與分析消化道黏膜組織之方法及系統
CN114777682A (zh) 2017-10-06 2022-07-22 先进扫描仪公司 生成一个或多个亮度边缘以形成物体的三维模型
JP7084193B2 (ja) * 2018-04-10 2022-06-14 ザイオソフト株式会社 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム
JP7131080B2 (ja) * 2018-05-28 2022-09-06 大日本印刷株式会社 ボリュームレンダリング装置
AU2019325414A1 (en) * 2018-08-24 2021-03-25 David Byron Douglas A virtual tool kit for radiologists
US10813612B2 (en) 2019-01-25 2020-10-27 Cleerly, Inc. Systems and method of characterizing high risk plaques
US10849696B2 (en) 2019-03-01 2020-12-01 Biosense Webster (Israel) Ltd. Map of body cavity
US20220392065A1 (en) 2020-01-07 2022-12-08 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11501436B2 (en) 2020-01-07 2022-11-15 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11969280B2 (en) 2020-01-07 2024-04-30 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
CN111821021B (zh) * 2020-06-19 2021-10-26 Huzhou Central Hospital Artificial-intelligence-based method and system for computing an optimal colonoscopy path
EP4036865A1 (en) * 2021-01-28 2022-08-03 Siemens Healthcare GmbH Method, device and system for visualizing colon
US20230289963A1 (en) 2022-03-10 2023-09-14 Cleerly, Inc. Systems, devices, and methods for non-invasive image-based plaque analysis and risk determination

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK273280A (da) 1979-06-28 1980-12-29 Schering Ag Triiodinated 5-aminoisophthalic acid derivatives
US4391280A (en) 1980-04-04 1983-07-05 Miller Roscoe E Enema apparata improvements relating to double contrast studies
US4630203A (en) 1983-12-27 1986-12-16 Thomas Szirtes Contour radiography: a system for determining 3-dimensional contours of an object from its 2-dimensional images
US4737921A (en) 1985-06-03 1988-04-12 Dynamic Digital Displays, Inc. Three dimensional medical image display system
US4729098A (en) 1985-06-05 1988-03-01 General Electric Company System and method employing nonlinear interpolation for the display of surface structures contained within the interior region of a solid body
US4710876A (en) 1985-06-05 1987-12-01 General Electric Company System and method for the display of surface structures contained within the interior region of a solid body
US4719585A (en) 1985-08-28 1988-01-12 General Electric Company Dividing cubes system and method for the display of surface structures contained within the interior region of a solid body
DE3611018C1 (de) 1986-03-27 1987-06-19 Wiest Peter P Device for insufflating gas
US4791567A (en) 1986-09-15 1988-12-13 General Electric Company Three dimensional connectivity system employing an equivalence schema for determining connected substructures within a body
US4879668A (en) 1986-12-19 1989-11-07 General Electric Company Method of displaying internal surfaces of three-dimensional medical images
US4823129A (en) 1987-02-24 1989-04-18 Bison Instruments, Inc. Analog-to-digital converter
FR2614163B1 (fr) 1987-04-17 1989-06-09 Thomson Cgr Method for representing images of views of an object
US4831528A (en) 1987-11-09 1989-05-16 General Electric Company Apparatus and method for improvement of 3D images derived from tomographic data
US5170347A (en) 1987-11-27 1992-12-08 Picker International, Inc. System to reformat images for three-dimensional display using unique spatial encoding and non-planar bisectioning
US5023072A (en) 1988-08-10 1991-06-11 University Of New Mexico Paramagnetic/superparamagnetic/ferromagnetic sucrose sulfate compositions for magnetic resonance imaging of the gastrointestinal tract
US4993415A (en) 1988-08-19 1991-02-19 Alliance Pharmaceutical Corp. Magnetic resonance imaging with perfluorocarbon hydrides
FR2636752B1 (fr) 1988-09-16 1990-10-26 Gen Electric Cgr Method and system for correcting scanner image defects caused by movements of the scanner
US4984157A (en) 1988-09-21 1991-01-08 General Electric Company System and method for displaying oblique planar cross sections of a solid body using tri-linear interpolation to determine pixel position dataes
US4985834A (en) 1988-11-22 1991-01-15 General Electric Company System and method employing pipelined parallel circuit architecture for displaying surface structures of the interior region of a solid body
FR2648304B1 (fr) 1989-06-12 1991-08-30 Commissariat Energie Atomique Method for determining a space from a known discrete space for the reconstruction of two- or three-dimensional images, device for implementing it, and application of the method
JP2714164B2 (ja) 1989-07-31 1998-02-16 Toshiba Corporation Three-dimensional image display device
US5006109A (en) 1989-09-12 1991-04-09 Donald D. Douglas Method and device for controlling pressure, volumetric flow rate and temperature during gas insufflation procedures
US5187658A (en) 1990-01-17 1993-02-16 General Electric Company System and method for segmenting internal structures contained within the interior region of a solid object
US5086401A (en) 1990-05-11 1992-02-04 International Business Machines Corporation Image-directed robotic system for precise robotic surgery including redundant consistency checking
US5047772A (en) 1990-06-04 1991-09-10 General Electric Company Digital error correction system for subranging analog-to-digital converters
US5127037A (en) 1990-08-15 1992-06-30 Bynum David K Apparatus for forming a three-dimensional reproduction of an object from laminations
US5204625A (en) 1990-12-20 1993-04-20 General Electric Company Segmentation of stationary and vascular surfaces in magnetic resonance imaging
US5270926A (en) 1990-12-21 1993-12-14 General Electric Company Method and apparatus for reconstructing a three-dimensional computerized tomography (CT) image of an object from incomplete cone beam projection data
US5166876A (en) 1991-01-16 1992-11-24 General Electric Company System and method for detecting internal structures contained within the interior region of a solid object
US5345490A (en) 1991-06-28 1994-09-06 General Electric Company Method and apparatus for converting computed tomography (CT) data into finite element models
US5261404A (en) 1991-07-08 1993-11-16 Mick Peter R Three-dimensional mammal anatomy imaging system and method
US5283837A (en) 1991-08-27 1994-02-01 Picker International, Inc. Accurate estimation of surface normals in 3-D data sets
US5734384A (en) 1991-11-29 1998-03-31 Picker International, Inc. Cross-referenced sectioning and reprojection of diagnostic image volumes
US5371778A (en) 1991-11-29 1994-12-06 Picker International, Inc. Concurrent display and adjustment of 3D projection, coronal slice, sagittal slice, and transverse slice images
US5295488A (en) 1992-08-05 1994-03-22 General Electric Company Method and apparatus for projecting diagnostic images from volumed diagnostic data
US5322070A (en) 1992-08-21 1994-06-21 E-Z-Em, Inc. Barium enema insufflation system
US5365927A (en) 1993-11-02 1994-11-22 General Electric Company Magnetic resonance imaging system with pointing device
US6343936B1 (en) 1996-09-16 2002-02-05 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination, navigation and visualization
US5971767A (en) 1996-09-16 1999-10-26 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination
US6331116B1 (en) 1996-09-16 2001-12-18 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual segmentation and examination
US5986662A (en) 1996-10-16 1999-11-16 Vital Images, Inc. Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging
US6369812B1 (en) * 1997-11-26 2002-04-09 Philips Medical Systems, (Cleveland), Inc. Inter-active viewing system for generating virtual endoscopy studies of medical diagnostic data with a continuous sequence of spherical panoramic views and viewing the studies over networks
US6130671A (en) 1997-11-26 2000-10-10 Vital Images, Inc. Volume rendering lighting using dot product methodology
US6556210B1 (en) * 1998-05-29 2003-04-29 Canon Kabushiki Kaisha Image processing method and apparatus therefor
US6084407A (en) * 1998-09-10 2000-07-04 Pheno Imaging, Inc. System for measuring tissue size and marbling in an animal
US6771262B2 (en) * 1998-11-25 2004-08-03 Siemens Corporate Research, Inc. System and method for volume rendering-based segmentation
US6317137B1 (en) * 1998-12-01 2001-11-13 Silicon Graphics, Inc. Multi-threaded texture modulation for axis-aligned volume rendering
US6603877B1 (en) * 1999-06-01 2003-08-05 Beltronics, Inc. Method of and apparatus for optical imaging inspection of multi-material objects and the like

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4751643A (en) 1986-08-04 1988-06-14 General Electric Company Method and apparatus for determining connected substructures within a body
US5095521A (en) 1987-04-03 1992-03-10 General Electric Cgr S.A. Method for the computing and imaging of views of an object
US5038302A (en) 1988-07-26 1991-08-06 The Research Foundation Of State University Of New York Method of converting continuous three-dimensional geometrical representations into discrete three-dimensional voxel-based representations within a three-dimensional voxel-based system
US4987554A (en) 1988-08-24 1991-01-22 The Research Foundation Of State University Of New York Method of converting continuous three-dimensional geometrical representations of polygonal objects into discrete three-dimensional voxel-based representations thereof within a three-dimensional voxel-based system
US4985856A (en) 1988-11-10 1991-01-15 The Research Foundation Of State University Of New York Method and apparatus for storing, accessing, and processing voxel-based data
US5101475A (en) 1989-04-17 1992-03-31 The Research Foundation Of State University Of New York Method and apparatus for generating arbitrary projections of three-dimensional voxel-based data
US5623586A (en) 1991-05-25 1997-04-22 Hoehne; Karl-Heinz Method and device for knowledge based representation and display of three dimensional objects
US5442733A (en) 1992-03-20 1995-08-15 The Research Foundation Of State University Of New York Method and apparatus for generating realistic images using a discrete representation
US5361763A (en) 1993-03-02 1994-11-08 Wisconsin Alumni Research Foundation Method for segmenting features in an image
US5630034A (en) 1994-04-05 1997-05-13 Hitachi, Ltd. Three-dimensional image producing method and apparatus
US5458111A (en) 1994-09-06 1995-10-17 William C. Bond Computed tomographic colonoscopy
US5782762A (en) 1994-10-27 1998-07-21 Wake Forest University Method and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen
US5611025A (en) 1994-11-23 1997-03-11 General Electric Company Virtual internal cavity inspection system
US5699799A (en) 1996-03-26 1997-12-23 Siemens Corporate Research, Inc. Automatic determination of the curved axis of a 3-D tube-shaped object in image volume
WO1998037517A1 (en) 1997-02-25 1998-08-27 Wake Forest University Automatic analysis in virtual endoscopy

Non-Patent Citations (12)

* Cited by examiner, † Cited by third party
Title
Adam L. Penenberg, "From Stony Brook, a New Way to Examine Colons, Externally," The New York Times, p. 6 (1996).
Burgard et al.: "Active mobile robot localization by entropy minimization", Proceedings Second Euromicro Workshop on Advanced Mobile Robots, pp. 155-162, Oct. 1997.
David J. Vining, "Virtual Colonoscopy," Advance for Administrators in Radiology, pp. 50-52 (1998).
Holzapfel et al.: "Large strain analysis of soft biological membranes: formulation and finite element analysis," Computer Methods in Applied Mechanics and Engineering, May 15, 1996, pp. 45-61.
Hong et al., "3D Reconstruction and Visualization of the Inner Surface of the Colon from Spiral CT Data," IEEE, pp. 1506-1510 (1997).
Hong et al., "3D Virtual Colonoscopy," 1995 Biomedical Visualization Proceedings, pp. 26-32 and 83 (1995).
Kaufman, A.: "Disobstruction of Colon Wall Collapse," Project Description, online, Jan. 1999; www.cs.sunysb.edu/ari/523/collapse.html.
Kaye et al.: "A 3D virtual environment for modeling mechanical cardiopulmonary interactions," CVRMED-MRCAS '97, First Joint Conference, Computer Vision, Virtual Reality and Robotics in Medicine and Medical Robotics and Computer-Assisted Surgery Proceedings, pp. 389-398.
Liang Z. et al.: "Inclusion of a priori information in segmentation of colon lumen for 3D virtual colonoscopy," 1997 IEEE Nuclear Science Symposium Conference Record, vol. 2, pp. 1423-1427.
Shibolet et al.: "Coloring voxel-based objects for virtual endoscopy," IEEE Symposium on Volume Visualization (Cat. No. 989EX300), pp. 15-22 (1998).
Valev V. et al.: "Techniques of CT colonography (virtual colonoscopy)," Critical Reviews in Biomedical Engineering, vol. 27, pp. 1-25 (1999).
William E. Lorensen, "The Exploration of Cross-Sectional Data with a Virtual Endoscope," Interactive Technology and the New Health Paradigm, IOS Press, pp. 221-230 (1995).

Cited By (245)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070232896A1 (en) * 1998-09-24 2007-10-04 Super Dimension Ltd. System and method of recording and displaying in context of an image a location of at least one point-of-interest in a body during an intra-body medical procedure
US6765566B1 (en) * 1998-12-22 2004-07-20 Che-Chih Tsao Method and apparatus for displaying volumetric 3D images
USRE44336E1 (en) 1999-11-05 2013-07-02 Vital Images, Inc. Teleradiology systems for rendering and visualizing remotely-located volume data sets
USRE42952E1 (en) 1999-11-05 2011-11-22 Vital Images, Inc. Teleradiology systems for rendering and visualizing remotely-located volume data sets
US6476803B1 (en) * 2000-01-06 2002-11-05 Microsoft Corporation Object modeling system and process employing noise elimination and robust surface extraction techniques
US7591998B2 (en) * 2000-03-07 2009-09-22 Kevin Tait Stool marker
US20040151668A1 (en) * 2000-03-07 2004-08-05 Kevin Tait Stool marker
US6477401B1 (en) 2000-03-10 2002-11-05 Mayo Foundation For Medical Education And Research Colonography of an unprepared colon
US7620442B2 (en) 2000-03-10 2009-11-17 Mayo Foundation For Medical Education And Research Colonography on an unprepared colon
US7630529B2 (en) 2000-04-07 2009-12-08 The General Hospital Corporation Methods for digital bowel subtraction and polyp detection
US20050107691A1 (en) * 2000-04-07 2005-05-19 The General Hospital Corporation Methods for digital bowel subtraction and polyp detection
US6947784B2 (en) 2000-04-07 2005-09-20 The General Hospital Corporation System for digital bowel subtraction and polyp detection and related techniques
US20020097320A1 (en) * 2000-04-07 2002-07-25 Zalis Michael E. System for digital bowel subtraction and polyp detection and related techniques
US7261565B2 (en) * 2000-05-19 2007-08-28 Simbionix Ltd. Endoscopic tutorial system for the pancreatic system
US20030108853A1 (en) * 2000-05-19 2003-06-12 Edna Chosack Endoscopic tutorial system for the pancreatic system
WO2002029764A1 (en) * 2000-10-03 2002-04-11 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US6643533B2 (en) * 2000-11-28 2003-11-04 Ge Medical Systems Global Technology Company, Llc Method and apparatus for displaying images of tubular structures
US6718193B2 (en) * 2000-11-28 2004-04-06 Ge Medical Systems Global Technology Company, Llc Method and apparatus for analyzing vessels displayed as unfolded structures
US9153146B2 (en) * 2001-01-24 2015-10-06 Surgical Science Sweden Ab Method and system for simulation of surgical procedures
US20060073454A1 (en) * 2001-01-24 2006-04-06 Anders Hyltander Method and system for simulation of surgical procedures
US20050114831A1 (en) * 2001-04-18 2005-05-26 Andres Callegari Volume body renderer
US20020165689A1 (en) * 2001-04-18 2002-11-07 Callegari Andres C. Volume body renderer
US7412363B2 (en) * 2001-04-18 2008-08-12 Landmark Graphics Corporation Volume body renderer
US7991600B2 (en) 2001-04-18 2011-08-02 Landmark Graphics Corporation Volume body renderer
US20100286972A1 (en) * 2001-04-18 2010-11-11 Landmark Graphics Corporation, A Halliburton Company Volume Body Renderer
US20020164061A1 (en) * 2001-05-04 2002-11-07 Paik David S. Method for detecting shapes in medical images
US7043064B2 (en) 2001-05-04 2006-05-09 The Board Of Trustees Of The Leland Stanford Junior University Method for characterizing shapes in medical images
US20020164060A1 (en) * 2001-05-04 2002-11-07 Paik David S. Method for characterizing shapes in medical images
US7039723B2 (en) 2001-08-31 2006-05-02 Hinnovation, Inc. On-line image processing and communication system
US20030132936A1 (en) * 2001-11-21 2003-07-17 Kevin Kreeger Display of two-dimensional and three-dimensional views during virtual examination
US20050024724A1 (en) * 2002-01-09 2005-02-03 Bo-Hyoung Kim Apparatus and method for displaying virtual endoscopy display
US7102634B2 (en) * 2002-01-09 2006-09-05 Infinitt Co., Ltd Apparatus and method for displaying virtual endoscopy display
WO2003083781A1 (en) * 2002-03-29 2003-10-09 Koninklijke Philips Electronics N.V. Method, system and computer program for stereoscopic viewing of 3d medical images
US7684852B2 (en) * 2002-04-06 2010-03-23 Bracco Diagnostics Inc. System, formulation, kit and method for tagging colonic residue in an individual
US20050175542A1 (en) * 2002-04-06 2005-08-11 Philippe Lefere System, formulation, kit and method for tagging colonic residue in an individual
US20040136584A1 (en) * 2002-09-27 2004-07-15 Burak Acar Method for matching and registering medical image data
US7224827B2 (en) * 2002-09-27 2007-05-29 The Board Of Trustees Of The Leland Stanford Junior University Method for matching and registering medical image data
US7646904B2 (en) 2002-09-30 2010-01-12 The United States Of America As Represented By The Department Of Health And Human Services Computer-aided classification of anomalies in anatomical structures
US20100074491A1 (en) * 2002-09-30 2010-03-25 The Government of the United States of America as represented by the Secretary of Health and Human Computer-aided classification of anomalies in anatomical structures
US7272251B2 (en) 2002-09-30 2007-09-18 The Board Of Trustees Of The Leland Stanford Junior University Method for detecting and classifying a structure of interest in medical images
US20040064029A1 (en) * 2002-09-30 2004-04-01 The Government Of The Usa As Represented By The Secretary Of The Dept. Of Health & Human Services Computer-aided classification of anomalies in anatomical structures
US20060099557A1 (en) * 2002-09-30 2006-05-11 Anders Hyltander Device and method for generating a virtual anatomic environment
US7260250B2 (en) 2002-09-30 2007-08-21 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services Computer-aided classification of anomalies in anatomical structures
US9230452B2 (en) * 2002-09-30 2016-01-05 Surgical Science Sweden Ab Device and method for generating a virtual anatomic environment
US20080015419A1 (en) * 2002-09-30 2008-01-17 The Gov. Of The U.S.A. As Represented By The Secretary Of The Dept. Of Health And Human Service Computer-aided classification of anomalies in anatomical structures
US8189890B2 (en) 2002-09-30 2012-05-29 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services Computer-aided classification of anomalies in anatomical structures
US7346209B2 (en) 2002-09-30 2008-03-18 The Board Of Trustees Of The Leland Stanford Junior University Three-dimensional pattern recognition method to detect shapes in medical images
US20040141638A1 (en) * 2002-09-30 2004-07-22 Burak Acar Method for detecting and classifying a structure of interest in medical images
US20040147830A1 (en) * 2003-01-29 2004-07-29 Virtualscopics Method and system for use of biomarkers in diagnostic imaging
US20040220466A1 (en) * 2003-04-02 2004-11-04 Kazuhiko Matsumoto Medical image processing apparatus, and medical image processing method
US7639855B2 (en) * 2003-04-02 2009-12-29 Ziosoft, Inc. Medical image processing apparatus, and medical image processing method
US7417636B2 (en) * 2003-05-08 2008-08-26 Siemens Medical Solutions Usa, Inc. Method and apparatus for automatic setting of rendering parameter for virtual endoscopy
US20040259065A1 (en) * 2003-05-08 2004-12-23 Siemens Corporate Research Inc. Method and apparatus for automatic setting of rendering parameter for virtual endoscopy
US20050058351A1 (en) * 2003-06-30 2005-03-17 Ione Fine Surface segmentation from luminance and color differences
US7499572B2 (en) * 2003-06-30 2009-03-03 The Salk Institute For Biological Studies Surface segmentation from luminance and color differences
US20050015004A1 (en) * 2003-07-17 2005-01-20 Hertel Sarah Rose Systems and methods for combining an anatomic structure and metabolic activity for an object
US9533128B2 (en) 2003-07-18 2017-01-03 Broncus Medical Inc. Devices for maintaining patency of surgically created channels in tissue
US20120278711A1 (en) * 2003-09-16 2012-11-01 Labtest International, Inc. D/B/A Intertek Consumer Goods North America Haptic response system and method of use
US20050074150A1 (en) * 2003-10-03 2005-04-07 Andrew Bruss Systems and methods for emulating an angiogram using three-dimensional image data
US7454045B2 (en) 2003-10-10 2008-11-18 The United States Of America As Represented By The Department Of Health And Human Services Determination of feature boundaries in a digital representation of an anatomical structure
US7440601B1 (en) 2003-10-10 2008-10-21 The United States Of America As Represented By The Department Of Health And Human Services Automated identification of ileocecal valve
US20050078858A1 (en) * 2003-10-10 2005-04-14 The Government Of The United States Of America Determination of feature boundaries in a digital representation of an anatomical structure
US7574032B2 (en) 2003-10-31 2009-08-11 General Electric Company Method and apparatus for virtual subtraction of stool from registration and shape based analysis of prone and supine scans of the colon
US20090279759A1 (en) * 2003-10-31 2009-11-12 Saad Ahmed Sirohey Method and apparatus for virtual subtraction of stool from registration and shape based analysis of prone and supine scans of the colon
US8224054B2 (en) 2003-10-31 2012-07-17 General Electric Company Method and apparatus for virtual subtraction of stool from registration and shape based analysis of prone and supine scans of the colon
US20060215896A1 (en) * 2003-10-31 2006-09-28 General Electric Company Method and apparatus for virtual subtraction of stool from registration and shape based analysis of prone and supine scans of the colon
US20050148848A1 (en) * 2003-11-03 2005-07-07 Bracco Imaging, S.P.A. Stereo display of tube-like structures and improved techniques therefor ("stereo display")
WO2005048198A1 (en) * 2003-11-14 2005-05-26 Philips Intellectual Property & Standards Gmbh Method and apparatus for visualisation of a tubular structure
US20070133849A1 (en) * 2003-11-14 2007-06-14 Koninklijke Philips Electronics N.V. Method and apparatus for visualisation of a tubular structure
US20050113696A1 (en) * 2003-11-25 2005-05-26 Miller Steven C. Methods and systems for motion adaptive spatial compounding
US7101336B2 (en) 2003-11-25 2006-09-05 General Electric Company Methods and systems for motion adaptive spatial compounding
US20050110791A1 (en) * 2003-11-26 2005-05-26 Prabhu Krishnamoorthy Systems and methods for segmenting and displaying tubular vessels in volumetric imaging data
US20050143654A1 (en) * 2003-11-29 2005-06-30 Karel Zuiderveld Systems and methods for segmented volume rendering using a programmable graphics pipeline
WO2005061008A1 (en) * 2003-12-01 2005-07-07 The United States Of America As Represented By The Secretary Of The Navy Bowel preparation for virtual colonscopy
US20090028405A1 (en) * 2004-04-08 2009-01-29 Hadassa Degani Three Time Point Lung Cancer Detection, Diagnosis and Assessment of Prognosis
US7693320B2 (en) * 2004-04-08 2010-04-06 Yeda Research And Development Co. Ltd. Three time point lung cancer detection, diagnosis and assessment of prognosis
US20050281481A1 (en) * 2004-06-07 2005-12-22 Lutz Guendel Method for medical 3D image display and processing, computed tomograph, workstation and computer program product
US7839402B2 (en) 2004-06-23 2010-11-23 Koninklijke Philips Electronics N.V. Virtual endoscopy
US8009167B2 (en) 2004-06-23 2011-08-30 Koninklijke Philips Electronics N.V. Virtual endoscopy
US20110116692A1 (en) * 2004-06-23 2011-05-19 Koninklijke Philips Electronics N.V. Virtual endoscopy
US20080055308A1 (en) * 2004-06-23 2008-03-06 Koninklijke Philips Electronics N.V. Virtual Endoscopy
US10369339B2 (en) 2004-07-19 2019-08-06 Broncus Medical Inc. Devices for delivering substances through an extra-anatomic opening created in an airway
US11357960B2 (en) 2004-07-19 2022-06-14 Broncus Medical Inc. Devices for delivering substances through an extra-anatomic opening created in an airway
US8608724B2 (en) 2004-07-19 2013-12-17 Broncus Medical Inc. Devices for delivering substances through an extra-anatomic opening created in an airway
US8784400B2 (en) 2004-07-19 2014-07-22 Broncus Medical Inc. Devices for delivering substances through an extra-anatomic opening created in an airway
US20060024236A1 (en) * 2004-07-27 2006-02-02 Pickhardt Perry J Bowel preparation for virtual colonoscopy
US20060079746A1 (en) * 2004-10-11 2006-04-13 Perret Florence M Apparatus and method for analysis of tissue classes along tubular structures
US8175355B2 (en) 2004-10-15 2012-05-08 The Brigham And Women's Hospital, Inc. Factor analysis in medical imaging
US20070014463A1 (en) * 2004-10-15 2007-01-18 Brigham And Women's Hospital Factor Analysis in Medical Imaging
US7519211B2 (en) * 2004-10-15 2009-04-14 The Brigham And Women's Hospital, Inc. Factor analysis in medical imaging
US20060093196A1 (en) * 2004-10-28 2006-05-04 Odry Benjamin L System and method for automatic detection and localization of 3D bumps in medical images
US7590271B2 (en) * 2004-10-28 2009-09-15 Siemens Medical Solutions Usa, Inc. System and method for automatic detection and localization of 3D bumps in medical images
US20060157069A1 (en) * 2005-01-19 2006-07-20 Ziosoft, Inc. Identification method and computer readable medium
US20060184003A1 (en) * 2005-02-03 2006-08-17 Lewin Jonathan S Intra-procedurally determining the position of an internal anatomical target location using an externally measurable parameter
US20080273781A1 (en) * 2005-02-14 2008-11-06 Mayo Foundation For Medical Education And Research Electronic Stool Subtraction in Ct Colonography
US8031921B2 (en) 2005-02-14 2011-10-04 Mayo Foundation For Medical Education And Research Electronic stool subtraction in CT colonography
US20060187221A1 (en) * 2005-02-22 2006-08-24 Sarang Lakare System and method for identifying and removing virtual objects for visualization and computer aided detection
US20080160489A1 (en) * 2005-02-23 2008-07-03 Koninklijke Philips Electronics, N.V. Method For the Prediction of the Course of a Catheter
US20060228003A1 (en) * 2005-04-06 2006-10-12 Silverstein D A Method and apparatus for detection of optical elements
US20060293792A1 (en) * 2005-06-17 2006-12-28 Honda Motor Co., Ltd. Path generator for mobile object
US7519457B2 (en) * 2005-06-17 2009-04-14 Honda Motor Company, Ltd. Path generator for mobile object
US20100067753A1 (en) * 2005-06-21 2010-03-18 Koninklijke Philips Electronics, N.V. Method and device for imaging a blood vessel
WO2007002146A2 (en) * 2005-06-22 2007-01-04 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
WO2007002146A3 (en) * 2005-06-22 2007-05-03 Univ New York State Res Found System and method for computer aided polyp detection
US20070003122A1 (en) * 2005-06-29 2007-01-04 General Electric Company Method for quantifying an object in a larger structure using a reconstructed image
US7702141B2 (en) * 2005-06-29 2010-04-20 General Electric Company Method for quantifying an object in a larger structure using a reconstructed image
US20070073114A1 (en) * 2005-09-28 2007-03-29 Lutz Gundel Method and apparatus for post-processing of a 3D image data record, in particular for virtual colonography
DE102005046385A1 (de) * 2005-09-28 2007-04-05 Siemens Ag Method and device for post-processing a 3D image data set, in particular for virtual colonography
DE102005046385B4 (de) * 2005-09-28 2012-03-22 Siemens Ag Method and device for post-processing a 3D image data set, in particular for virtual colonography
US7680313B2 (en) * 2005-09-28 2010-03-16 Siemens Aktiengesellschaft Method and apparatus for post-processing of a 3D image data record, in particular for virtual colonography
EP1955263A4 (en) * 2005-10-17 2011-12-21 Gen Hospital Corp Structure-analysis system, method, software arrangement and computer-accessible medium for digital cleansing of computed tomography colonography images
US20090304248A1 (en) * 2005-10-17 2009-12-10 Michael Zalis Structure-analysis system, method, software arrangement and computer-accessible medium for digital cleansing of computed tomography colonography images
EP1955263A2 (en) * 2005-10-17 2008-08-13 The General Hospital Corporation Structure-analysis system, method, software arrangement and computer-accessible medium for digital cleansing of computed tomography colonography images
US9299156B2 (en) 2005-10-17 2016-03-29 The General Hospital Corporation Structure-analysis system, method, software arrangement and computer-accessible medium for digital cleansing of computed tomography colonography images
US20080228067A1 (en) * 2005-10-21 2008-09-18 Koninklijke Philips Electronics N.V. Rendering Method and Apparatus
US8548566B2 (en) 2005-10-21 2013-10-01 Koninklijke Philips N.V. Rendering method and apparatus
US20070109299A1 (en) * 2005-11-15 2007-05-17 Vital Images, Inc. Surface-based characteristic path generation
US7929748B2 (en) * 2005-11-23 2011-04-19 Vital Images, Inc. Characteristic path-based colon segmentation
US20090238431A1 (en) * 2005-11-23 2009-09-24 Vital Images, Inc. Characteristic path-based colon segmentation
US7574029B2 (en) * 2005-11-23 2009-08-11 Vital Images, Inc. Characteristic path-based colon segmentation
US20070116346A1 (en) * 2005-11-23 2007-05-24 Peterson Samuel W Characteristic path-based colon segmentation
US7825924B2 (en) * 2005-11-25 2010-11-02 Ziosoft, Inc. Image processing method and computer readable medium for image processing
US20070120845A1 (en) * 2005-11-25 2007-05-31 Kazuhiko Matsumoto Image processing method and computer readable medium for image processing
US20100208956A1 (en) * 2005-11-30 2010-08-19 The Research Foundation Of State University Of New York Electronic colon cleansing method for virtual colonoscopy
US8000550B2 (en) 2005-11-30 2011-08-16 The General Hospital Corporation Adaptive density correction in computed tomographic images
US7809177B2 (en) 2005-11-30 2010-10-05 The General Hospital Corporation Lumen tracking in computed tomographic images
US20070127804A1 (en) * 2005-11-30 2007-06-07 The General Hospital Corporation Adaptive density mapping in computed tomographic images
US20070127803A1 (en) * 2005-11-30 2007-06-07 The General Hospital Corporation Adaptive density correction in computed tomographic images
US20070165928A1 (en) * 2005-11-30 2007-07-19 The General Hospital Corporation Lumen tracking in computed tomographic images
US20100278403A1 (en) * 2005-11-30 2010-11-04 The General Hospital Corporation Lumen Tracking in Computed Tomographic Images
US7965880B2 (en) 2005-11-30 2011-06-21 The General Hospital Corporation Lumen tracking in computed tomographic images
US7961967B2 (en) 2005-11-30 2011-06-14 The General Hospital Corporation Adaptive density mapping in computed tomographic images
US8452061B2 (en) * 2005-11-30 2013-05-28 The Research Foundation Of State University Of New York Electronic colon cleansing method for virtual colonoscopy
US20070297662A1 (en) * 2006-01-12 2007-12-27 Walter Marzendorfer Method and apparatus for virtual bowel cleaning
DE102006001655A1 (de) * 2006-01-12 2007-08-02 Siemens Ag Method and device for virtual bowel cleansing
US7844095B2 (en) 2006-01-12 2010-11-30 Siemens Aktiengesellschaft Method and apparatus for virtual bowel cleaning
CN101084840B (zh) * 2006-06-08 2011-01-26 Siemens Ag Method for registering functional magnetic resonance image data using X-ray fluoroscopy
US8668652B2 (en) 2006-06-30 2014-03-11 Broncus Medical Inc. Airway bypass site selection and treatment planning
US20080009760A1 (en) * 2006-06-30 2008-01-10 Broncus Technologies, Inc. Airway bypass site selection and treatment planning
US7985187B2 (en) 2006-06-30 2011-07-26 Broncus Technologies, Inc. Airway bypass site selection and treatment planning
US7517320B2 (en) 2006-06-30 2009-04-14 Broncus Technologies, Inc. Airway bypass site selection and treatment planning
US20100305463A1 (en) * 2006-06-30 2010-12-02 Macklem Peter J Airway bypass site selection and treatment planning
KR100802137B1 (ko) 2006-07-21 2008-02-12 Korea Advanced Institute Of Science And Technology Method for generating a colon model, collision detection method, and endoscopy simulation method using the same
US9913969B2 (en) 2006-10-05 2018-03-13 Broncus Medical Inc. Devices for delivering substances through an extra-anatomic opening created in an airway
US8126238B2 (en) 2006-11-22 2012-02-28 General Electric Company Method and system for automatically identifying and displaying vessel plaque views
US8244015B2 (en) 2006-11-22 2012-08-14 General Electric Company Methods and apparatus for detecting aneurysm in vasculatures
US20080118131A1 (en) * 2006-11-22 2008-05-22 General Electric Company Method and system for automatically identifying and displaying vessel plaque views
US20080118133A1 (en) * 2006-11-22 2008-05-22 General Electric Company Methods and apparatus for suppressing tagging material in prepless CT colonography
US20080118111A1 (en) * 2006-11-22 2008-05-22 Saad Ahmed Sirohey Method and apparatus for synchronizing corresponding landmarks among a plurality of images
US7983463B2 (en) 2006-11-22 2011-07-19 General Electric Company Methods and apparatus for suppressing tagging material in prepless CT colonography
US20080118127A1 (en) * 2006-11-22 2008-05-22 General Electric Company Methods and apparatus for detecting aneurysm in vasculatures
US8160395B2 (en) 2006-11-22 2012-04-17 General Electric Company Method and apparatus for synchronizing corresponding landmarks among a plurality of images
US20080194946A1 (en) * 2007-02-12 2008-08-14 The Government Of The U.S.A. As Represented By The Secretary Of The Dept. Of Health & Human Services Virtual colonoscopy via wavelets
US8023710B2 (en) 2007-02-12 2011-09-20 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services Virtual colonoscopy via wavelets
US7970226B2 (en) * 2007-04-23 2011-06-28 Microsoft Corporation Local image descriptors
US20080260274A1 (en) * 2007-04-23 2008-10-23 Microsoft Corporation Local image descriptors
US20080304616A1 (en) * 2007-06-05 2008-12-11 The Government Of The U.S.A. As Represented By The Secretary Of The Dept. Of Health & Human Services Segmenting colon wall via level set techniques
US8175348B2 (en) 2007-06-05 2012-05-08 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services Segmenting colon wall via level set techniques
US20090055137A1 (en) * 2007-08-22 2009-02-26 Imed Gargouri Method for obtaining geometric properties of an anatomic part
US8184888B2 (en) 2007-09-19 2012-05-22 Siemens Medical Solutions Usa, Inc. Method and system for polyp segmentation for 3D computed tomography colonography
US8126244B2 (en) 2007-09-21 2012-02-28 Siemens Medical Solutions Usa, Inc. User interface for polyp annotation, segmentation, and measurement in 3D computed tomography colonography
US20090080747A1 (en) * 2007-09-21 2009-03-26 Le Lu User interface for polyp annotation, segmentation, and measurement in 3D computed tomography colonography
DE102007056800A1 (de) 2007-11-23 2009-06-04 Siemens Ag Method for the tomographic imaging of a hollow organ provided with a substance, and tomography apparatus
US20110285695A1 (en) * 2008-03-07 2011-11-24 Georg-Friedemann Rust Pictorial Representation in Virtual Endoscopy
US8259108B2 (en) 2008-04-01 2012-09-04 Siemens Aktiengesellschaft Method and apparatus for visualizing an image data record of an organ enclosing a cavity, in particular a CT image data record of a colon
US20090244060A1 (en) * 2008-04-01 2009-10-01 Michael Suhling Method and apparatus for visualizing an image data record of an organ enclosing a cavity, in particular a CT image data record of a colon
US9575140B2 (en) 2008-04-03 2017-02-21 Covidien Lp Magnetic interference detection system and method
US8472687B2 (en) 2008-05-28 2013-06-25 Dublin City University Electronic cleansing of digital data sets
US20110135180A1 (en) * 2008-05-28 2011-06-09 Dublin City University Electronic Cleansing of Digital Data Sets
US20090297010A1 (en) * 2008-05-28 2009-12-03 Dominik Fritz Method and apparatus for visualizing tubular anatomical structures, in particular vessel structures, in medical 3D image records
WO2009144290A1 (en) * 2008-05-28 2009-12-03 Dublin City University Electronic cleansing of digital data sets
US11074702B2 (en) 2008-06-03 2021-07-27 Covidien Lp Feature-based registration method
US8473032B2 (en) 2008-06-03 2013-06-25 Superdimension, Ltd. Feature-based registration method
US10096126B2 (en) 2008-06-03 2018-10-09 Covidien Lp Feature-based registration method
US9117258B2 (en) 2008-06-03 2015-08-25 Covidien Lp Feature-based registration method
US9659374B2 (en) 2008-06-03 2017-05-23 Covidien Lp Feature-based registration method
US11783498B2 (en) 2008-06-03 2023-10-10 Covidien Lp Feature-based registration method
US10674936B2 (en) 2008-06-06 2020-06-09 Covidien Lp Hybrid registration method
US10285623B2 (en) 2008-06-06 2019-05-14 Covidien Lp Hybrid registration method
US11931141B2 (en) 2008-06-06 2024-03-19 Covidien Lp Hybrid registration method
US8467589B2 (en) 2008-06-06 2013-06-18 Covidien Lp Hybrid registration method
US9271803B2 (en) 2008-06-06 2016-03-01 Covidien Lp Hybrid registration method
US10478092B2 (en) 2008-06-06 2019-11-19 Covidien Lp Hybrid registration method
US8452068B2 (en) 2008-06-06 2013-05-28 Covidien Lp Hybrid registration method
US9076078B2 (en) * 2008-06-16 2015-07-07 Olympus Corporation Image processing apparatus, method and program for determining arrangement of vectors on a distribution map
US20090309961A1 (en) * 2008-06-16 2009-12-17 Olympus Corporation Image processing apparatus, image processing method and image processing program
US20100021026A1 (en) * 2008-07-25 2010-01-28 Collins Michael J Computer-aided detection and display of colonic residue in medical imagery of the colon
US8131036B2 (en) 2008-07-25 2012-03-06 Icad, Inc. Computer-aided detection and display of colonic residue in medical imagery of the colon
US9020217B2 (en) * 2008-09-25 2015-04-28 Cae Healthcare Canada Inc. Simulation of medical imaging
US20120128218A1 (en) * 2008-09-25 2012-05-24 Cae Healthcare Inc. Simulation of Medical Imaging
US20120011457A1 (en) * 2009-03-20 2012-01-12 Koninklijke Philips Electronics N.V. Visualizing a view of a scene
US9508188B2 (en) * 2009-03-20 2016-11-29 Koninklijke Philips N.V. Re-centering a view of a scene of a three-dimensional image of anatomy of interest and an alignment object based on a new location of the alignment object in the three-dimensional image
US20120169735A1 (en) * 2009-09-11 2012-07-05 Koninklijke Philips Electronics N.V. Improvements to curved planar reformation
US9019272B2 (en) * 2009-09-11 2015-04-28 Koninklijke Philips N.V. Curved planar reformation
RU2563158C2 (ru) * 2009-09-11 2015-09-20 Koninklijke Philips Electronics N.V. Improvements to curved planar reformation
US10249045B2 (en) 2010-02-01 2019-04-02 Covidien Lp Region-growing algorithm
US8842898B2 (en) 2010-02-01 2014-09-23 Covidien Lp Region-growing algorithm
US8428328B2 (en) 2010-02-01 2013-04-23 Superdimension, Ltd Region-growing algorithm
US9042625B2 (en) 2010-02-01 2015-05-26 Covidien Lp Region-growing algorithm
US20110206253A1 (en) * 2010-02-01 2011-08-25 Superdimension, Ltd. Region-Growing Algorithm
US9595111B2 (en) 2010-02-01 2017-03-14 Covidien Lp Region-growing algorithm
US9836850B2 (en) 2010-02-01 2017-12-05 Covidien Lp Region-growing algorithm
US20110206250A1 (en) * 2010-02-24 2011-08-25 Icad, Inc. Systems, computer-readable media, and methods for the classification of anomalies in virtual colonography medical image processing
US20130121549A1 (en) * 2010-07-30 2013-05-16 Koninklijke Philips Electronics N.V. Organ-specific enhancement filter for robust segmentation of medical images
US9087259B2 (en) * 2010-07-30 2015-07-21 Koninklijke Philips N.V. Organ-specific enhancement filter for robust segmentation of medical images
US9053563B2 (en) 2011-02-11 2015-06-09 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US9672655B2 (en) 2011-02-11 2017-06-06 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US10223825B2 (en) 2011-02-11 2019-03-05 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
US9717414B2 (en) * 2011-02-24 2017-08-01 Dog Microsystems Inc. Method and apparatus for isolating a potential anomaly in imaging data and its application to medical imagery
US20140010430A1 (en) * 2011-02-24 2014-01-09 Dog Microsystems Inc. Method and apparatus for isolating a potential anomaly in imaging data and its application to medical imagery
US8709034B2 (en) 2011-05-13 2014-04-29 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US10631938B2 (en) 2011-05-13 2020-04-28 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US9993306B2 (en) 2011-05-13 2018-06-12 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US12016640B2 (en) 2011-05-13 2024-06-25 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US9421070B2 (en) 2011-05-13 2016-08-23 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US9345532B2 (en) 2011-05-13 2016-05-24 Broncus Medical Inc. Methods and devices for ablation of tissue
US9486229B2 (en) 2011-05-13 2016-11-08 Broncus Medical Inc. Methods and devices for excision of tissue
US8932316B2 (en) 2011-05-13 2015-01-13 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US20140193789A1 (en) * 2011-07-28 2014-07-10 Ryoichi Imanaka Cutting simulation device and cutting simulation program
US10272260B2 (en) 2011-11-23 2019-04-30 Broncus Medical Inc. Methods and devices for diagnosing, monitoring, or treating medical conditions through an opening through an airway wall
US10460075B2 (en) * 2012-06-14 2019-10-29 Sony Corporation Information processing apparatus and method to move a display area of a needle biopsy image
US20150169826A1 (en) * 2012-06-14 2015-06-18 Sony Corporation Information processing apparatus, information processing method, and information processing program
US20160019694A1 (en) * 2013-03-29 2016-01-21 Fujifilm Corporation Region extraction apparatus, method, and program
US9754368B2 (en) * 2013-03-29 2017-09-05 Fujifilm Corporation Region extraction apparatus, method, and program
US9996918B2 (en) 2013-04-10 2018-06-12 The Asan Foundation Method for distinguishing pulmonary artery and pulmonary vein, and method for quantifying blood vessels using same
US10242488B1 (en) * 2015-03-02 2019-03-26 Kentucky Imaging Technologies, LLC One-sided transparency: a novel visualization for tubular objects
US20160350979A1 (en) * 2015-05-28 2016-12-01 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
US9892506B2 (en) * 2015-05-28 2018-02-13 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
US10451699B2 (en) * 2016-02-09 2019-10-22 Canon Medical Systems Corporation Image processing device and MRI apparatus
US20170227620A1 (en) * 2016-02-09 2017-08-10 Toshiba Medical Systems Corporation Image processing device and mri apparatus
US10446931B2 (en) 2016-10-28 2019-10-15 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US11672604B2 (en) 2016-10-28 2023-06-13 Covidien Lp System and method for generating a map for electromagnetic navigation
US10722311B2 (en) 2016-10-28 2020-07-28 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US10751126B2 (en) 2016-10-28 2020-08-25 Covidien Lp System and method for generating a map for electromagnetic navigation
US10792106B2 (en) 2016-10-28 2020-10-06 Covidien Lp System for calibrating an electromagnetic navigation system
US10418705B2 (en) 2016-10-28 2019-09-17 Covidien Lp Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same
US10517505B2 (en) 2016-10-28 2019-12-31 Covidien Lp Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system
US10615500B2 (en) 2016-10-28 2020-04-07 Covidien Lp System and method for designing electromagnetic navigation antenna assemblies
US10638952B2 (en) 2016-10-28 2020-05-05 Covidien Lp Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system
US11759264B2 (en) 2016-10-28 2023-09-19 Covidien Lp System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map
US11786314B2 (en) 2016-10-28 2023-10-17 Covidien Lp System for calibrating an electromagnetic navigation system
US10025950B1 (en) * 2017-09-17 2018-07-17 Everalbum, Inc Systems and methods for image recognition
US10426424B2 (en) 2017-11-21 2019-10-01 General Electric Company System and method for generating and performing imaging protocol simulations
WO2020254845A1 (en) 2019-04-02 2020-12-24 Autoid Polska S.A. System for analysis and compression of video results of an endoscopic examination
US12089902B2 (en) 2019-07-30 2024-09-17 Covidien Lp Cone beam and 3D fluoroscope lung navigation

Also Published As

Publication number Publication date
US7148887B2 (en) 2006-12-12
KR100701235B1 (ko) 2007-03-29
CA2368390A1 (en) 2000-09-21
IL145516A0 (en) 2002-06-30
EP1173830A2 (en) 2002-01-23
US6514082B2 (en) 2003-02-04
CA2368390C (en) 2010-07-27
IS6079A (is) 2001-09-18
CN1352781A (zh) 2002-06-05
WO2000055814A3 (en) 2001-06-28
WO2000055814A2 (en) 2000-09-21
BR0009098A (pt) 2002-05-28
US20020039400A1 (en) 2002-04-04
CN1248167C (zh) 2006-03-29
AU3901800A (en) 2000-10-04
JP2002539568A (ja) 2002-11-19
US20020045153A1 (en) 2002-04-18
KR20010113840A (ko) 2001-12-28
JP4435430B2 (ja) 2010-03-17
EP1173830B1 (en) 2013-05-08

Similar Documents

Publication Publication Date Title
US6331116B1 (en) System and method for performing a three-dimensional virtual segmentation and examination
US6343936B1 (en) System and method for performing a three-dimensional virtual examination, navigation and visualization
US7194117B2 (en) System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US7477768B2 (en) System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US5971767A (en) System and method for performing a three-dimensional virtual examination
US7372988B2 (en) Registration of scanning data acquired from different patient positions
IL178768A (en) System and method for optical mapping of texture properties from at least one optical image to a monochromatic array of information
MXPA01009387A (en) System and method for performing a three-dimensional virtual examination, navigation and visualization
MXPA01009388A (en) System and method for performing a three-dimensional virtual segmentation and examination

Legal Events

Date Code Title Description
AS Assignment

Owner name: RESEARCH FOUNDATION OF STATE UNIVERSITY OF NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAUFMAN, ARIE E.;LIANG, ZHENGRONG;WAX, MARK R.;AND OTHERS;REEL/FRAME:010328/0302

Effective date: 19991013

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
CC Certificate of correction
FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES

Free format text: EXECUTIVE ORDER 9424, CONFIRMATORY LICENSE;ASSIGNOR:STATE UNIVERSITY NEW YORK STONY BROOK;REEL/FRAME:020951/0730

Effective date: 20071111

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

CC Certificate of correction