WO2005078666A1 - Real time user interaction with deformable surfaces of segmentation - Google Patents

Real time user interaction with deformable surfaces of segmentation

Info

Publication number
WO2005078666A1
Authority
WO
WIPO (PCT)
Prior art keywords
model
user
real
data plane
click point
Prior art date
Application number
PCT/IB2005/000083
Other languages
French (fr)
Inventor
Olivier Gerard
Jean-Michel Rouet
Maxim Fradkin
Antoine Collet-Billon
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to EP05702250A priority Critical patent/EP1709591A1/en
Priority to JP2006548471A priority patent/JP2007518484A/en
Publication of WO2005078666A1 publication Critical patent/WO2005078666A1/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/149 Segmentation; Edge detection involving deformable models, e.g. active contour models
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme involving graphical user interfaces [GUIs]
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G06T2207/10136 3D ultrasound image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/20101 Interactive definition of point of interest, landmark or seed
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical

Definitions

  • the invention relates to an image processing system having processing means for segmenting an object of interest in a three-dimensional image using deformable surfaces.
  • This system comprises means of automatically fitting a three-dimensional deformable Surface Model onto said three-dimensional object, and user-controlled means of adaptation of the resulting surface model.
  • the invention further relates to a medical imaging apparatus coupled to such an image processing system and to program products for processing medical three-dimensional images produced by this apparatus or system, for the segmentation of objects of interest that are body organs.
  • the invention finds a particular application in the field of medical imaging in order to study or detect organ pathologies.
  • Background of the Invention A technique of representation of a 3D object using a mesh model is already disclosed by H. DELINGETTE in the publication entitled "Simplex Meshes: a General Representation for 3-D shape Reconstruction", in Proceedings of the International Conference on Computer Vision and Pattern Recognition (CVPR'94), 20-24 June 1994, Seattle, USA.
  • For representing 3-D surfaces, Simplex Meshes called 2-Simplex Meshes, in which each vertex is connected to three neighboring vertices, are used.
  • the structure of a Simplex Mesh is dual to the structure of a triangulation, as illustrated by FIG.1 of the cited publication. It can represent all types of orientable surfaces.
  • the contour on a Simplex Mesh is defined as a closed polygonal chain consisting of neighboring vertices on the Simplex Mesh. The contour is restricted to not intersect itself, as far as possible. Contours are deformable models and are handled independently of the Simplex Mesh where they are embedded.
  • Four independent transformations are defined for achieving the whole range of possible mesh transformations. They consist in inserting or deleting edges in a face of the Mesh.
  • the description of the Simplex Mesh also comprises the definition of a Simplex Angle that generalizes the angle used in planar geometry, and the definition of metric parameters, which describe how a vertex is located with respect to its three neighbors.
  • the dynamics of each vertex is given by a Newtonian law of motion.
  • the deformation implies a force that constrains the shape to be smooth and a force that constrains the mesh to be close to the 3D object.
  • Internal forces determine the response of a physically based model to external constraints. The internal forces are expressed so that they are intrinsic, viewpoint-invariant and scale-dependent. Similar types of constraints hold for contours.
  • the cited publication provides a simple model for representing a given 3D object.
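The constant-connectivity property described above can be checked directly on a mesh's adjacency structure; a minimal sketch, in which the adjacency-dict representation is an illustrative assumption and not taken from the cited paper:

```python
def is_two_simplex(adjacency):
    """Return True if every vertex has exactly three neighbours,
    the constant-connectivity property of a 2-Simplex Mesh.
    `adjacency` maps each vertex id to its list of neighbour ids."""
    return all(len(nbrs) == 3 for nbrs in adjacency.values())

# The complete graph on four vertices (a tetrahedron's edge graph)
# has connectivity three everywhere:
tetra = {v: [u for u in range(4) if u != v] for v in range(4)}
```

A path graph, by contrast, fails the check, since its end vertices have fewer than three neighbours.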
  • the "Simplex Mesh technique” is a robust segmentation method. Summary of the Invention However, the "Simplex Mesh” technique that is proposed in the cited paper may not achieve a perfect segmentation in certain circumstances. For instance: in a circumstance when the three-dimensional image, which is an image of an organ, is very noisy or when the object of interest is partly blurred. In this circumstance, the automatic segmentation algorithm may yield a wrong location for the surface of the segmented object and the resulting three- dimensional surface may show one or several dissimilarities with the organ of interest.
  • the automatic segmentation algorithm may stop before the segmentation operation is completed; it may progress in a wrong direction, being misled towards a wrong but contrasted surface; or it may even regress due to a complicated surface shape, again being misled towards a wrong surface.
  • the invention has for an object to propose a 3D image processing system having means for segmenting an object of interest represented in a three-dimensional image using a deformable 3D surface model and further having real time interactive adaptation means for interactively modifying the 3D surface model in real time by a user.
  • the interactive real-time adaptation means comprises real-time user-actuated processing means, called attraction to point means, including means for:
    • user-selecting a plane of work, called Data Plane, intersecting the surface model;
    • user-actuating a click point in said Data Plane, through which the surface model should pass;
    • attaching a 3D correction surface to the click point;
    • attracting a 3D portion of the 3D surface model to said 3D correction surface;
    • user-sliding the click point in the Data Plane for the user to select the best adaptation of the 3D surface model to the object of interest, whereby the 3D surface model appears to be attracted to the click point;
    • optionally user-selecting a modification of the shape of the attached 3D correction surface, while sliding the click point;
    • repeating the above operations until the user-controlled real-time adaptation of the 3D surface model is completed.
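The user-controlled loop enumerated above can be sketched as a simple driver; all four callbacks are hypothetical placeholders standing for the plane-selection, click-tracking, attraction and acceptance steps:

```python
def interactive_adaptation(select_plane, clicks_in, apply_attraction, accepted):
    """Skeleton of the user-controlled adaptation loop: pick a Data
    Plane, track the click point as it slides inside that plane
    (each position is one attraction step), and repeat until the
    user accepts the segmentation.  Illustrative sketch only."""
    while not accepted():
        plane = select_plane()              # user chooses a Data Plane
        for click in clicks_in(plane):      # click point slides in the plane
            apply_attraction(plane, click)  # model attracted in real time
```

In a real interactive system the callbacks would be wired to the GUI event loop; here they can be plain functions, which makes the control flow easy to exercise.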
  • the interactive real-time adaptation means further comprises visualization means for the user to control the operation of the real-time user-actuated processing means in 3D images or in 2D images.
  • the 3D correction surface is defined by a shape parameter, which may be set as a function of the distance of the user-defined click point to the intersection curve of the surface model within the Data Plane, and as a function of the actual area of the 3D surface model to be modified.
  • This shape parameter is user-selected or user-modified and associated with the user-defined click point for defining the 3D correction surface, which is used for performing the attraction-to-point operation.
  • Said real-time interactive adaptation means permits the user to interfere locally, in real time, in a 2D view of the surface model, which is a relatively easy operation, instead of acting only on the 3D surface model, which is difficult.
  • Said real-time interactive adaptation means permits modifying, in real time, a chosen region of the 3D surface model around the user-defined click point; this modification is controlled by the shape parameter, in order to improve the fit of the 3D surface model of segmentation.
  • the invention also relates to a medical diagnostic imaging apparatus coupled to this system for 3D image processing.
  • the medical imaging apparatus may be an X-ray medical examination apparatus or any other 3D medical imaging apparatus, such as MRI.
  • the invention further relates to a program product or a program package to be used in the system. Brief description of the Drawings The invention is described hereafter in detail in reference to the following diagrammatic and schematic drawings, wherein: FIG.1A shows a diagrammatic representation of the means of the system of the invention; FIG.1B shows a diagrammatic representation of the real-time interactive adaptation means of the system of the invention; FIG.2 illustrates a Data Plane selection; FIG.3A schematically shows a mesh curve in the selected Data Plane with an Aberrant Curve;
  • FIG.3B shows a 3D Gaussian Surface calculated from the Click point position in the Data Plane
  • FIG.3C shows a corrected mesh curve in the Data Plane
  • FIG.3D shows a motion vector to move the point of reference of the mesh curve of the Data Plane
  • FIG.3E shows motion vectors to move neighbors of the reference point
  • FIG.4A is a 2D view of an object of interest in a medical image, with overlaid segmentation curve
  • FIG.4B illustrates the click point action on the same view
  • FIG.4C shows the same view with the modified overlaid segmentation curve after real-time interactive adaptation
  • FIG.5 illustrates a medical viewing system coupled to a medical examination apparatus.
  • the invention relates to an image processing system for segmenting an object of interest represented in a three-dimensional (3D) image, using a three-dimensional deformable surface model technique, whereby the deformable surface model of segmentation is fitted onto the surface of said three-dimensional object.
  • the deformable surface model is a mesh model.
  • the surface of segmentation is represented by mesh faces defined by edges and vertices, as illustrated by FIG.3D.
  • the present invention may be applied to deformable surface models other than mesh models by simply replacing the words "mesh model” by “deformable surface model", the words “mesh curve” by the words “surface curve” and the word “vertex” by the word "point”.
  • the 3D segmented object of interest is an organ represented in a 3D medical image.
  • Segmenting images using discrete deformable models like 2-Simplex meshes often requires corrections of the resulting segmented surface. This is especially true for medical images, where, due to image noise or poor data quality, some salient image features may be missing. As a result, some parts of the model might be attracted to wrong features, leading to a partially erroneous segmented shape. Therefore, the practitioner usually would like to use his/her experience in image interpretation in order to correct the segmentation result. Moreover, the practitioner may want to guide the further segmentation process by forcing the model to stick to user-imposed locations. Modifying the surface of segmentation, when it is not correct, is very difficult to achieve, particularly in real time.
  • the present invention proposes means to solve this problem in real time.
  • the present invention proposes an image processing system having interactive user- actuated processing means for modifying the 3D mesh model, in real time, by only using a user-drawn point, towards which the mesh model will be attracted, for the modified mesh model to pass through this user-drawn point.
  • said real-time interactive image processing means permits the user to control the segmentation operation and to intervene where and when necessary in order to modify, correct or adapt, in real time, the mesh surface of segmentation to better fit the actual surface of the object of interest.
  • FIG.1A diagrammatically represents the processing means of the system of the invention.
  • This system has initialization means 10 for setting parameters for the automatic segmentation means 11 to perform preliminary 3D image segmentation using the automatic mesh model technique.
  • the system has display means 60, as illustrated by FIG.5, for the user to examine the result of the preliminary automatic segmentation, which is the image of the mesh model substantially fitting the surface of the object of interest.
  • This 3D mesh model is first mapped at best onto the surface of the object of interest by the automatic segmentation means 11.
  • the system has control means 15, which may be set in operation by the user in order to control the real-time interactive adaptation means 20. If the user accepts the result of the automatic segmentation 11, the data are directed to STOP means 30 through the control means 15.
  • the STOP means 30 permits yielding the preliminary segmentation result directly as the final segmentation image data.
  • image data may be provided as an image by display means, or as data by memory means, storing means, or other means.
  • the user may want to continue the segmentation operation using the automatic segmentation means 11. Then, the resulting image data may be entered again into said automatic segmentation means 11 through 13.
  • the real-time interactive adaptation means 20 can be user-actuated through the control means 15.
  • the user-actuated real-time adaptation means 20 are provided for the user to enter data or information in order to interactively modify, correct or improve, in real time, the result of the preliminary automatic segmentation means 11.
  • the real-time interactive adaptation means 20 are actuated by the user through the control means 15 using actuation means such as a keyboard 72 or a mouse 71, as illustrated by FIG.5, or any other interactive actuation means known to those skilled in the art.
  • After having performed real-time interactive adaptation 20, the user further examines the segmentation results, for instance using the display means 60, and may operate the real-time interactive adaptation means until he/she accepts the result.
  • FIG.1B diagrammatically represents the means for carrying out the real-time interactive adaptation means 20 of the invention, called "attraction to point means", for real-time user-controlled adaptation.
  • this real-time interactive adaptation means 20 first comprises plane selection means 21, for the user to select an oriented Data Plane DP showing a section of the surface of segmentation of the object of interest.
  • the orientation of the Data Plane DP is defined within a volume of reference VOL in a three-dimensional referential OX, OY, OZ, as illustrated by FIG.2.
  • the Data Plane DP is a work plane for the user to perform actions using the real-time interactive adaptation means.
  • a 3D image is constructed by assembling a number of two-dimensional images of points parallel to one plane of the referential, each image plane representing a section of the volume of reference VOL in the referential.
  • the Data Plane DP is not necessarily parallel to a plane of construction of the 3D image.
  • the orientation is selected for said Data Plane to show an intersection, denoted by mesh curve MC, with the mesh model, where a defect of segmentation is best seen.
  • This interesting orientation can be any orientation with respect to the 3D referential.
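Because the Data Plane may take any orientation with respect to the referential, displaying it requires resampling the 3D image along that orientation. A minimal nearest-neighbour sketch, in which the function name and the (origin point, two in-plane unit vectors) parameterization are illustrative assumptions:

```python
import numpy as np

def sample_plane(volume, p0, u, v, shape):
    """Resample a 3D volume on an arbitrarily oriented plane through
    p0 spanned by unit vectors u and v, using nearest-neighbour
    lookup for brevity.  Out-of-volume samples are left at zero."""
    h, w = shape
    img = np.zeros(shape)
    for r in range(h):
        for c in range(w):
            # walk the plane around its centre pixel
            p = p0 + (r - h // 2) * u + (c - w // 2) * v
            i, j, k = np.round(p).astype(int)
            if (0 <= i < volume.shape[0] and 0 <= j < volume.shape[1]
                    and 0 <= k < volume.shape[2]):
                img[r, c] = volume[i, j, k]
    return img
```

For an axis-aligned choice of u and v this reduces to an ordinary slice of the volume; a production implementation would typically use trilinear interpolation instead.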
  • the viewing means 60 may advantageously provide several images, such as 3D images of the 3D object and of the 3D mesh model, and one or several 2D views showing calculated 2D mesh curves MC representing the 2D intersection curves of the 3D mesh model by Data Plane(s) in different directions of orientation. These 2D mesh curves may favorably be highlighted and overlaid on a 2D grey-level view of the object of interest in the DP, as illustrated by FIG.4A to FIG.4C.
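The 2D mesh curve MC overlaid in such a view can be obtained by intersecting the mesh edges with the Data Plane; a sketch under an assumed vertex-array plus edge-list mesh representation:

```python
import numpy as np

def mesh_plane_curve(verts, edges, p0, n):
    """Intersection points of mesh edges with the plane through p0
    with normal n, i.e. the points of the 'mesh curve' MC shown in
    the Data Plane.  `verts` is an (N, 3) array, `edges` a list of
    index pairs.  Vertices lying exactly on the plane are skipped."""
    n = n / np.linalg.norm(n)
    pts = []
    for i, j in edges:
        a, b = verts[i], verts[j]
        da, db = np.dot(a - p0, n), np.dot(b - p0, n)
        if da * db < 0:                      # edge strictly crosses the plane
            t = da / (da - db)               # linear interpolation parameter
            pts.append(a + t * (b - a))
    return np.array(pts)
```

Connecting the points edge-by-edge within each face would yield the closed polyline actually drawn over the grey-level view.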
  • the real-time interactive adaptation means 20 permits the user to interfere locally and in real time with the 2D mesh curve MC in the Data Plane DP, instead of acting directly on the 3D mesh model forming the segmented surface of the object of interest.
  • the system of the invention modifies directly, in real-time, the 3D mesh model, while the user acts on the 2D mesh curve MC in the Data Plane DP.
  • the real-time interactive adaptation means 20 provides the user with the 2D mesh curve MC, and permits selecting a portion of said 2D mesh curve to be modified, denoted by Aberrant Curve AC.
  • the Aberrant Curve AC is the portion of mesh curve MC where the user detects that the calculated mesh model does not correctly fit the surface of the object to be segmented or does not correspond to the way the object of interest is chosen to be segmented.
  • the user actuates in real-time the plane selection means 21 for selecting the best orientation of the Data Plane DP for visualizing said 2D mesh curve MC and the orientation of the Data Plane is varied until the user finds a view of the 2D mesh curve MC where an Aberrant Curve AC is particularly visible, and where the user regards a modification or a correction of the mesh model as particularly necessary.
  • the user may decide that this Aberrant Curve AC should be corrected by passing through a particular point of the Data Plane, denoted by User Point UP, as illustrated by FIG.4A, which is a view of an Object of Interest OI in a Data Plane, with overlaid representations of the mesh curve MC and Aberrant Curve AC.
  • the interactive adaptation means 20 comprises interactive drawing means 22 for the user to draw this point, further denoted by click point CP, for instance by action of a key of the keyboard 72 or by a click of a mouse 71 as shown on FIG.5 or by any other drawing means.
  • This Click Point CP may be at a distance from the Aberrant Curve AC in the Data Plane DP and in proximity to the User Point UP, as illustrated by FIG.4B, which is a view of the same Object of Interest OI in the same Data Plane as FIG.4A.
  • the system has calculation means to modify the shape of the Aberrant Curve, for instance based on the distance between the Click Point and the Aberrant Curve.
  • the real-time interactive adaptation system 20 has measure means 23 to estimate the geometrical distance, denoted by Reference Distance Dref, between the Click Point CP and the nearest vertex of the mesh model in the Data Plane, denoted by Reference Point Pref.
  • the actually nearest vertex of the mesh model with respect to the click point may be located in a plane other than the Data Plane.
  • the present system 20 has means to impose working in the Data Plane DP, so that the selected nearest vertex lies in the Data Plane DP.
  • the system favorably further estimates the distance between the Click Point CP and the Reference Point Pref as Reference Distance Dref.
  • the user may choose Dref according to other criteria.
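The default estimation of Pref and Dref described above, restricted to vertices lying in the Data Plane, can be sketched as follows (the function name and the in-plane tolerance are illustrative assumptions):

```python
import numpy as np

def reference_point(verts, click, p0, n, tol=1e-6):
    """Nearest mesh vertex to the click point, restricted to
    vertices lying in the Data Plane (|signed distance| <= tol),
    as the system imposes.  Returns (vertex index, Dref).
    Assumes at least one vertex lies in the plane."""
    n = n / np.linalg.norm(n)
    in_plane = np.abs((verts - p0) @ n) <= tol   # keep in-plane vertices only
    idx = np.flatnonzero(in_plane)
    d = np.linalg.norm(verts[idx] - click, axis=1)
    k = idx[int(np.argmin(d))]                   # reference point Pref
    return k, float(np.linalg.norm(verts[k] - click))
```

Note that the globally nearest vertex may lie outside the Data Plane; the in-plane mask is what implements the constraint stated in the text.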
  • the system has calculation means for constructing a 3D correction surface G based on said Reference Distance Dref.
  • the 3D correction surface G permits modifying the shape of the Aberrant Curve AC, while automatically modifying the shape of the 3D mesh model.
  • the 3D correction surface G attracts the reference point Pref and its neighbors towards the Click Point CP and the points of said correction surface G.
  • the modifications produced in 3D may be inspected in the data plane DP and in other cross-sections of the 3D mesh model represented in other 2D views, and displayed by the display means 60.
  • a favorable 3D correction surface is a 3D Gaussian Surface.
  • the user may define another type of 3D correction surface. In the case when a 3D Gaussian Surface is used, its parameter σ may be chosen as a function of the distance Dref.
  • the system 20 has parameter calculation means 24 to calculate the parameter σ for defining the 3D Gaussian Surface G. Then, the system has calculation means 25 to define the 3D Gaussian Surface G from parameter σ.
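With σ fixed, the Gaussian correction surface attached to the click point can be evaluated as a radial height field; in this sketch both the peak height Dref and the proportionality of σ to Dref are assumptions for illustration, not formulas quoted from the patent:

```python
import math

def sigma_from_dref(d_ref, k=1.5):
    """Shape parameter of the Gaussian correction surface, here
    simply taken proportional to Dref (factor k is an assumption)."""
    return k * d_ref

def gaussian_surface(r, d_ref, sigma):
    """Height of the correction surface at planar radius r from the
    click point: peak d_ref at r = 0, decaying over a few sigma."""
    return d_ref * math.exp(-r * r / (2.0 * sigma * sigma))
```

Larger σ widens the bump, so more of the mesh around Pref is affected, which matches the text's statement that the shape parameter controls the area of the model to be modified.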
  • motion vector calculation means 26 provides a motion vector v as a function of the distance Dref between the Click Point CP and the Reference Point Pref, for moving the Reference Point towards the Click Point.
  • the Reference Point Pref is pulled towards the Click Point CP using a pulling force F equal to the motion vector v: F = v
  • the modification of the Aberrant Curve AC using the Gaussian Surface G necessitates not only the definition of a pulling force F to move the Reference Point Pref towards the Click Point, but also the definition of pulling forces Fi to move the vertices of its neighbors in 3D towards the Gaussian Surface G.
  • FIG.3D represents two faces φ1 and φ2 of a mesh model, a point Pref located at a vertex of the face φ1, and the distance Dref between the Click Point CP and Pref.
  • Calculation means 27 calculates parameters, denoted by weights αi, for further calculating said forces Fi based on the respective distances Di assigned to the neighboring vertices Pi.
  • Calculation means 28 further gives the forces Fi = αi F for moving the respective vertices Pi.
  • the forces applied to each vertex of the mesh model are a function of the distance of the neighbor vertex to the Reference Point Pref of the mesh model to be modified.
  • Using the coefficient σ permits defining forces Fi that depend on the volume of the mesh model to modify. The larger the volume of the mesh model, the larger the variation of the forces to apply to the vertices.
  • the processing means 29 of the Real-time Interactive Adaptation Means 20 further moves the vertex Pref and its neighbor vertices defined by the distances Di towards the Gaussian Surface G in 3D, respectively using the pulling forces F and Fi previously defined.
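The displacement step just described, moving Pref fully onto the click point while its neighbours follow by weighted fractions of the same motion vector, can be sketched as follows; the Gaussian weight formula is an assumption consistent with the description, not a formula quoted from the patent:

```python
import numpy as np

def attract_step(verts, i_ref, click, sigma):
    """Move the reference vertex verts[i_ref] onto the click point
    and pull every other vertex by a fraction of the same motion
    vector, weighted by a Gaussian of its distance Di to Pref.
    Returns a new (N, 3) vertex array; illustrative sketch only."""
    v = click - verts[i_ref]                    # motion vector (force F)
    di = np.linalg.norm(verts - verts[i_ref], axis=1)
    alpha = np.exp(-di**2 / (2.0 * sigma**2))   # weights; alpha[i_ref] = 1
    return verts + alpha[:, None] * v           # forces Fi = alpha_i * F
```

Re-running this step as the click point slides gives the real-time "attraction" behaviour: the bump follows the mouse while distant parts of the mesh stay put.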
  • the user may slide the click point CP simply by sliding the mouse of the computing means, or by any other drawing means, and the Correction Surface G, which is attached to the sliding Click Point CP, follows said Click Point. This is very important because it permits the user to choose the best Click Point location for correcting the mesh model in real time.
  • At this stage, the real-time adaptation of the mesh model has been performed once.
  • the user has display means for estimating the result of the real-time Interactive Adaptation means 20.
  • Either the real-time Interactive Adaptation means yields directly a corrected mesh curve CC, or the user may cancel the last operations and select a new Data Plane and/or a new Click Point, which yields a new correction curve.
  • the operations may be carried out in several different 2D Data Planes of the volume that contain aberrant curves resulting from preliminarily segmented zones, until the mesh model fulfills fitting conditions chosen by the user.
  • This system permits refining in real time the segmented object in the 3D image, up to a level of fit of the mesh model onto the object that the user regards as a satisfying result.
  • the real-time interactive segmentation may be performed in any plane of the volume for providing such an improved segmented 3D object.
  • the processing means of the invention may be applied to as many Data Planes and Aberrant Curves as necessary for obtaining a segmentation of the object of interest that conforms to the needs of the user.
  • the user actuates the Stop means 30, as illustrated by FIG.1A. Then, the user has at his/her disposal a segmented image 2, or segmented image data.
  • Fig.5 shows the basic components of an embodiment of an image viewing system in accordance with the present invention, incorporated in a medical examination apparatus.
  • the medical examination apparatus 100 may include a bed 110 on which the patient lies or another element for localizing the patient relative to the imaging apparatus.
  • the medical imaging apparatus 100 may be a CT scanner or another medical imaging apparatus, such as an X-ray or ultrasound apparatus.
  • the image data produced by the apparatus 100 is fed to data processing means 70, such as a general-purpose computer, that comprises computation means and user control means appropriate to form the interactive adaptation means of the invention.
  • the data processing means 70 is typically associated with a visualization device, such as a monitor 60, and with input devices, such as a keyboard 72 and a mouse 71 or other pointing device, operable by the user so that he/she can interact with the system.
  • the data processing device 70 is programmed to implement the processing means for processing medical image data according to the invention.
  • the data processing device 70 has computing means and memory means necessary to perform the operations described in relation to FIG.1 and FIG.4.
  • a computer program product having pre-programmed instructions to carry out these operations can also be implemented.
  • While the present invention has been described in terms of generating image data for display, it is intended to cover substantially any form of visualization of the image data including, but not limited to, display on a display device and printing. Any reference sign in a claim should not be construed as limiting the claim.

Abstract

An image processing system comprising 3D processing means (10, 11) of automatic segmentation of an object using a 3-D deformable Surface Model, further comprising means of real-time interactive adaptation (20) of the Surface Model including user-controlled means of selection (21) of a 2D Data Plane (DP) that intersects the 3-D Surface Model along a 2-D Model Curve (MC); user-actuated means for actuating a Click Point (CP) in said 2-D Data Plane and real-time calculation means for yielding a 3D Correction Surface (G) attached to said Click Point; attract-to-points processing means for pulling the points of the 3D Surface Model, in a 3D neighborhood of a 2D portion of the Model Curve to be modified, towards the 3D Correction Surface; and visualization means for visualizing user-controlled actions. The user may slide the Click Point in the Data Plane, whereby the 3D Surface Model appears to be attracted to the Click Point and the Correction Surface. Optionally, the user may modify the shape of the attached 3D Correction Surface, while sliding the Click Point.

Description

"REAL TIME USER INTERACTION WITH DEFORMABLE SURFACES OF SEGMENTATION"
Description:
Field of the Invention The invention relates to an image processing system having processing means for segmenting an object of interest in a three-dimensional image using deformable surfaces. This system comprises means of automatically fitting a three-dimensional deformable
Surface Model onto said three-dimensional object and user-controlled means of adaptation of the resulting surface model. The invention further relates to a medical imaging apparatus coupled to such an image processing system and to program products for processing medical three-dimensional images produced by this apparatus or system, for the segmentation of objects of interest that are body organs. The invention finds a particular application in the field of medical imaging in order to study or detect organ pathologies. Background of the Invention A technique of representation of a 3D object using a mesh model is already disclosed by H. DELINGETTE in the publication entitled "Simplex Meshes: a General Representation for 3-D shape Reconstruction" in "Processing of the International Conference on Computer Vision and Pattern Recognition (CVPR'94), 20-24 June 1994, Seattle, USA". In this paper, a physically based approach for recovering three-dimensional objects is presented. This approach is based on the geometry of "Simplex Meshes". Elastic behavior of the meshes is modeled by local stabilizing functions controlling the mean curvature through the simplex angle extracted at each vertex (node of the mesh). Those functions are viewpoint-invariant, intrinsic and scale-sensitive. Unlike deformable surfaces defined on regular grids, Simplex Meshes are very adaptive structures. A refinement process for increasing the mesh resolution at highly curved or inaccurate parts is also disclosed. Operations for connecting Simplex Meshes in order to recover complex models may be performed using parts having simpler shapes. A Simplex Mesh has constant vertex connectivity. For representing 3-D surfaces, Simplex Meshes, which are called 2- Simplex Meshes, where each vertex is connected to three neighboring vertices, are used. The structure of a Simplex Mesh is dual to the structure of a triangulation as illustrated by the FIG.1 of the cited publication. 
It can represent all types of orientable surface. The contour on a Simplex Mesh is defined as a closed polygonal chain consisting of neighboring vertices on the Simplex Mesh. The contour is restricted to not intersect itself, as far as possible. Contours are deformable models and are handled independently of the Simplex Mesh where they are embedded. Four independent transformations are defined for achieving the whole range of possible mesh transformations. They consist in inserting or deleting edges in a face of the Mesh. The description of the Simplex Mesh also comprises the definition of a Simplex Angle that generalized the angle used in planar geometry; and the definition of metric parameters, which describe how the vertex is located with respect to its three neighbors. The dynamic of each vertex is given by a Newtonian law of motion. The deformation implies a force that constrains the shape to be smooth and a force that constrains the mesh to be close to the 3D object. Internal forces determine the response of a physically based model to external constraints. The internal forces are expressed so that they are intrinsic viewpoint invariant and scale dependant. Similar types of constraints hold for contours. Hence, the cited publication provides a simple model for representing a given 3D object. It defines the forces to be applied in order to reshape and adjust the model onto the 3D object of interest. The "Simplex Mesh technique" is a robust segmentation method. Summary of the Invention However, the "Simplex Mesh" technique that is proposed in the cited paper may not achieve a perfect segmentation in certain circumstances. For instance: in a circumstance when the three-dimensional image, which is an image of an organ, is very noisy or when the object of interest is partly blurred. 
In this circumstance, the automatic segmentation algorithm may yield a wrong location for the surface of the segmented object and the resulting three-dimensional surface may show one or several dissimilarities with the organ of interest. For example, the automatic segmentation algorithm may stop before the segmentation operation is completed; it may progress in a wrong direction, being misled towards a wrong but contrasted surface; or it may even regress due to the complicated surface shape, again being misled towards a wrong surface. It is an object of the invention to propose a 3D image processing system having means for segmenting an object of interest represented in a three-dimensional image using a deformable 3D surface model and further having real-time interactive adaptation means for interactively modifying the 3D surface model in real time by a user. The interactive real-time adaptation means comprises real-time user-actuated processing means, called attraction-to-point means, including means for: user-selecting a plane of work, called Data Plane, intersecting the surface model; user-actuating a click point in said Data Plane, through which the surface model should pass; attaching a 3D correction surface to the click point; attracting a 3D portion of the 3D surface model to said 3D correction surface; user-sliding the click point in the Data Plane for the user to select the best adaptation of the 3D surface model to the object of interest, whereby the 3D surface model appears to be attracted to the click point; optionally user-selecting a modification of the shape of the attached 3D correction surface, while sliding the click point; and repeating the above operations until the user-controlled real-time adaptation of the 3D surface model is completed. The interactive real-time adaptation means further comprises visualization means for the user to control the operation of the real-time user-actuated processing means in 3D images or in 2D images.
According to the invention, the 3D correction surface is defined by a shape parameter, which may be set as a function of the distance of the user-defined click point to the intersection curve of the surface model within the Data Plane and as a function of the actual area of the 3D surface model to be modified. This shape parameter is user-selected or user-modified and associated with the user-defined click point for defining the 3D correction surface, which is used for performing the attraction-to-point operation. Said real-time interactive adaptation means permits the user to interfere locally and in real time in a 2D view of the surface model, which is a relatively easy operation, instead of only acting on a 3D surface model, which is difficult. When handling medical images, it is important for the user to deal in real time with the problems that occur while imaging the organs, such as problems of image segmentation. Said real-time interactive adaptation means permits modifying in real time a chosen region of the 3D surface model around the user-defined click point, which modification is controlled by the shape parameter, in order to improve the fitness of the 3D surface model of segmentation. The invention also relates to a medical diagnostic imaging apparatus coupled to this system for 3D image processing. The medical imaging apparatus may be an X-ray medical examination apparatus or any other 3D medical imaging apparatus, such as MRI. The invention further relates to a program product or a program package to be used in the system. Brief Description of the Drawings The invention is described hereafter in detail with reference to the following diagrammatic and schematic drawings, wherein: FIG.1A shows a diagrammatic representation of the means of the system of the invention; and FIG.
1B shows a diagrammatic representation of the real-time interactive adaptation means of the system of the invention; FIG.2 illustrates a Data Plane selection; FIG.3A schematically shows a mesh curve in the selected Data Plane with an
Aberrant Curve and a Click Point; FIG.3B shows a 3D Gaussian Surface calculated from the Click Point position in the Data Plane; FIG.3C shows a corrected mesh curve in the Data Plane; FIG.3D shows a motion vector to move the point of reference of the mesh curve of the Data Plane; and FIG.3E shows motion vectors to move neighbors of the reference point; FIG.4A is a 2D view of an object of interest in a medical image, with an overlaid segmentation curve; FIG.4B illustrates the click point action on the same view; and FIG.4C shows the same view with the modified overlaid segmentation curve after real-time interactive adaptation; FIG.5 illustrates a medical viewing system coupled to a medical examination apparatus. Detailed Description of Embodiments The invention relates to an image processing system for segmenting an object of interest represented in a three-dimensional (3D) image, using a three-dimensional deformable surface model technique, whereby the deformable surface model of segmentation is fitted onto the surface of said three-dimensional object. In the example described below, the deformable surface model is a mesh model. Hence, the surface of segmentation is represented by mesh faces defined by edges and vertices, as illustrated by FIG.3D. The present invention may be applied to deformable surface models other than mesh models by simply replacing the words "mesh model" by "deformable surface model", the words "mesh curve" by the words "surface curve" and the word "vertex" by the word "point". In this example, the 3D segmented object of interest is an organ represented in a 3D medical image. Segmenting images using discrete deformable models, like 2-Simplex meshes, often requires corrections of the resulting segmented surface. This is especially true for medical images, where, due to image noise or poor data quality, some salient image features may be missing.
As a result, some parts of the model might be attracted to wrong features, leading to a partially erroneous segmented shape. Therefore, the practitioner usually would like to use his/her experience in image interpretation in order to correct the segmentation result. Moreover, the practitioner may want to guide the further segmentation process by forcing the model to stick to user-imposed locations. Modifying the surface of segmentation, when it is not correct, is very difficult to achieve, particularly in real time. The present invention proposes means to solve this problem in real time. The present invention proposes an image processing system having interactive user-actuated processing means for modifying the 3D mesh model, in real time, by only using a user-drawn point, towards which the mesh model will be attracted, so that the modified mesh model passes through this user-drawn point. According to the invention, said real-time interactive image processing means permits the user to control the segmentation operation and to interfere where and when it is necessary in order to modify, correct or adapt the mesh surface of segmentation in real time so as to better fit the actual surface of the object of interest. These real-time interactive processing means, controlled by the user, permit the real-time adaptation of the mapping of the 3-D mesh model onto the three-dimensional surface of the object of interest until a satisfying level of fitness chosen by the user is reached. The system may be applied to processing a three-dimensional grey-level image. FIG.1A diagrammatically represents the processing means of the system of the invention. This system has initialization means 10 for setting parameters for the automatic segmentation means 11 to perform a preliminary 3D image segmentation using the automatic mesh model technique.
The system has display means 60, as illustrated by FIG.5, for the user to examine the result of the preliminary automatic segmentation, which is the image of the mesh model substantially fitting the surface of the object of interest. This 3D mesh model is first mapped at best onto the surface of the object of interest by the automatic segmentation means 11. Either the user accepts the result of this preliminary segmentation or the user does not accept this result. The system has control means 15, which may be set in operation by the user in order to control the real-time interactive adaptation means 20. If the user accepts the result of the automatic segmentation 11, the data are directed to STOP means 30 through the control means 15. The STOP means 30 permits yielding directly the preliminary segmentation result as the final segmentation image data. These image data may be provided as an image by display means, or as data by memory means, storing means, or other means. Alternatively, the user may want to continue the segmentation operation using the automatic segmentation means 11. Then, the resulting signal data may be entered again into said automatic segmentation means 11 through 13. If the user does not accept this preliminary segmentation result 12, according to the invention, the real-time interactive adaptation means 20 can be user-actuated through the control means 15. The user-actuated real-time adaptation means 20 are provided for the user to enter data or information in order to interactively modify, correct or improve in real time the result of the preliminary automatic segmentation means 11. The real-time interactive adaptation means 20 are actuated by the user through the control means 15 using actuation means such as a keyboard 72 or a mouse 71, as illustrated by FIG.5, or any other interactive actuation means known to those skilled in the art.
After having performed the real-time interactive adaptation 20, the user further examines the segmentation results, for instance using the display means 60. The user may operate the real-time interactive adaptation means until he/she accepts the result. When the user accepts the result of the real-time interactively adapted segmentation, the data are directed to STOP 30 and the adapted segmentation result yields final segmentation image data 2 to provide to the display means 60, or memory means, storing means, or other means of FIG.5. Alternatively, the user may undo the actions that he/she has performed with this real-time interactive adaptation means. He/she may perform new actions with this real-time interactive adaptation means until he/she is satisfied with the results. FIG.1B diagrammatically represents the means for carrying out the real-time interactive adaptation means 20 of the invention, called "attraction-to-point means" for real-time user-controlled adaptation. Referring to FIG.1B, this real-time interactive adaptation means 20 first comprises plane selection means 21, for the user to select an oriented Data Plane DP showing a section of the surface of segmentation of the object of interest. The orientation of the Data Plane DP is defined within a volume of reference VOL in a three-dimensional referential OX, OY, OZ, as illustrated by FIG.2. The Data Plane DP is a work plane in which the user performs actions using the real-time interactive adaptation means. Usually, a 3D image is constructed from the assembling of a certain number of two-dimensional images of points parallel to one plane of the referential, each image plane representing a section of the volume of reference VOL in the referential. It is then very difficult and tedious for the user to identify and correct the defects of mesh propagation in these planes, because the defects of the mesh model are not necessarily best seen with the given orientation of said predetermined construction planes.
Instead, according to the invention, the Data Plane DP is not necessarily a plane of construction of the 3D image. The orientation is selected for said Data Plane to show an intersection with the mesh model, denoted by mesh curve MC, where a defect of segmentation is best seen. This orientation of interest can be any orientation with respect to the 3D referential. The viewing means 60 may advantageously provide several images, such as 3D images of the 3D object and of the 3D mesh model, and one or several 2D views showing calculated 2D mesh curves MC representing the 2D intersection curves of the 3D mesh model by Data Plane(s) in different directions of orientation. These 2D mesh curves may favorably be highlighted and overlaid on a 2D grey-level view of the object of interest in the DP, as illustrated by FIG.4A to FIG.4C. According to the invention, the real-time interactive adaptation means 20 permits the user to interfere locally and in real time with the 2D mesh curve MC in the Data Plane DP, instead of directly acting on the 3D mesh model forming the segmented surface of the object of interest. However, the system of the invention modifies directly, in real time, the 3D mesh model, while the user acts on the 2D mesh curve MC in the Data Plane DP. Referring to FIG.3A, the real-time interactive adaptation means 20 permits the user to view the 2D mesh curve MC, and to select a portion of said 2D mesh curve to be modified, denoted by Aberrant Curve AC. The Aberrant Curve AC is the portion of the mesh curve MC where the user detects that the calculated mesh model does not correctly fit the surface of the object to be segmented, or does not correspond to the way the object of interest is chosen to be segmented.
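The mesh curve MC is the intersection of the 3D mesh model with the Data Plane. As an illustration only (the patent does not specify how this intersection is computed), the intersection points can be obtained by clipping each mesh edge against the plane; the function below is a minimal sketch, assuming the mesh is given as a vertex array and a list of edges:

```python
import numpy as np

def plane_mesh_intersection(vertices, edges, p0, normal):
    """Return the 3D points where mesh edges cross the plane (p0, normal).

    vertices : (N, 3) array of mesh-vertex positions
    edges    : iterable of (i, j) vertex-index pairs
    p0       : (3,) a point on the Data Plane
    normal   : (3,) the plane's unit normal
    """
    pts = []
    d = (vertices - p0) @ normal          # signed distance of each vertex to the plane
    for i, j in edges:
        if d[i] * d[j] < 0:               # edge straddles the plane
            t = d[i] / (d[i] - d[j])      # linear interpolation parameter along the edge
            pts.append(vertices[i] + t * (vertices[j] - vertices[i]))
    return np.array(pts)
```

Ordering the resulting points into a closed 2D polyline in the plane would then give the displayed mesh curve MC.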
The user actuates in real time the plane selection means 21 for selecting the best orientation of the Data Plane DP for visualizing said 2D mesh curve MC, and the orientation of the Data Plane is varied until the user finds a view of the 2D mesh curve MC where an Aberrant Curve AC is particularly visible, and where the user regards a modification or a correction of the mesh model as particularly necessary. Referring to FIG.1B and illustrated by FIG.3A, while examining the Aberrant Curve AC, the user may decide that this Aberrant Curve AC should be corrected so as to pass through a particular point of the Data Plane, denoted by User Point UP, as illustrated by FIG.4A, which is a view of an Object of Interest OI in a Data Plane, with overlaid representations of the mesh curve MC and Aberrant Curve AC. Thus, the interactive adaptation means 20 comprises interactive drawing means 22 for the user to draw this point, further denoted by Click Point CP, for instance by action of a key of the keyboard 72 or by a click of a mouse 71 as shown on FIG.5, or by any other drawing means. This Click Point CP may be at a distance from the Aberrant Curve AC in the Data Plane DP and in proximity to the User Point UP, as illustrated by FIG.4B, which is a view of the same Object of Interest OI in the same Data Plane as FIG.4A. Now the system has calculation means to modify the shape of the Aberrant Curve, for instance based on the distance between the Click Point and the Aberrant Curve. Referring to FIG.1B and FIG.3A, in an example of embodiment, the real-time interactive adaptation system 20 has measure means 23 to estimate the geometrical distance, denoted by Reference Distance Dref, between the Click Point CP and the nearest vertex of the mesh model in the Data Plane, denoted by Reference Point Pref. It is to be noted that the actually nearest vertex of the mesh model with respect to the click point may be located in a plane other than the Data Plane.
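What the measure means 23 computes can be sketched as follows: among the mesh-model vertices lying in the Data Plane (the points of the mesh curve MC), find the vertex nearest to the Click Point, and the corresponding distance Dref. The function name and array layout are illustrative, not taken from the patent:

```python
import numpy as np

def reference_point(curve_pts, click_pt):
    """Return (index of Pref, Pref, Dref) for a given click point.

    curve_pts : (N, 2) in-plane coordinates of the mesh-curve vertices
    click_pt  : (2,) in-plane coordinates of the Click Point CP
    """
    d = np.linalg.norm(curve_pts - click_pt, axis=1)  # distances of curve vertices to CP
    k = int(np.argmin(d))                             # nearest vertex is Pref
    return k, curve_pts[k], float(d[k])               # d[k] is the Reference Distance Dref
```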
The present system 20 has means to constrain this operation to the Data Plane DP. Thus, the selected nearest point is in the Data Plane DP. In the Data Plane DP, the system favorably further estimates the distance between the Click Point CP and the Reference Point Pref as the Reference Distance Dref. However, the user may choose Dref according to other criteria. Then, the system has calculation means for constructing a 3D correction surface G based on said Reference Distance Dref. The 3D correction surface G permits modifying the shape of the Aberrant Curve AC, while automatically modifying the shape of the 3D mesh model. The 3D correction surface G attracts the Reference Point Pref and its neighbors towards the Click Point CP and the points of said correction surface G. The modifications produced in 3D may be inspected in the Data Plane DP and in other cross-sections of the 3D mesh model represented in other 2D views, and displayed by the display means 60. According to the invention, a favorable 3D correction surface is a 3D Gaussian Surface. The user may define another type of 3D correction surface. In the case when a 3D Gaussian Surface is used, its parameter σ may be chosen as a function of the distance Dref. Favorably, its parameter σ is chosen equal to the distance Dref: σ = Dref. Hence, the system 20 has parameter calculation means 24 to calculate the parameter σ for defining the 3D Gaussian Surface G. Then, the system has calculation means 25 to define the 3D Gaussian Surface G from the parameter σ. A cross-section of the 3D Gaussian Surface in the Data Plane DP, with a parameter σ equal to Dref, is represented in FIG.3B. The highest point of the 3D Gaussian Surface G is attached to the Click Point CP, as illustrated by FIG.3C and FIG.4B in DP.
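A minimal sketch of the Gaussian correction profile with σ = Dref is given below. The text attaches the peak of the surface to the Click Point but does not give an explicit amplitude; taking the peak height equal to Dref (so that the surface spans the gap between the mesh curve and the Click Point) is an assumption of this sketch:

```python
import math

def gaussian_correction_height(r, d_ref, sigma=None):
    """Height of the 3D Gaussian Correction Surface G at in-plane radius r
    from the Click Point.  sigma defaults to Dref, the choice suggested in
    the text (sigma = Dref); the peak amplitude d_ref is an assumed choice,
    not stated in the text.
    """
    if sigma is None:
        sigma = d_ref
    return d_ref * math.exp(-(r / sigma) ** 2)
```

The cross-section of this surface in the Data Plane corresponds to the bell-shaped curve of FIG.3B, with its peak at the Click Point CP.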
Once the 3D Gaussian Surface is defined, motion vector calculation means 26 provides a motion vector V as a function of the distance Dref between the Click Point CP and the Reference Point Pref, for moving the Reference Point towards the Click Point. Hence V is a function of σ: V(σ) = Dref. The Reference Point Pref is pulled towards the Click Point CP using a pulling force F: F = V(σ). The modification of the Aberrant Curve AC using the Gaussian Surface G necessitates not only the definition of a pulling force F to move the Reference Point Pref towards the Click Point, but also the definition of pulling forces Fi to move the neighboring vertices in 3D towards the Gaussian Surface G. Thus, once the force F has been calculated for the Reference Point Pref, said vertex Pref of the Aberrant Curve can be moved towards the Click Point CP. Then, vertices, denoted by Pi, which are around the Reference Point in 3D, must be pulled towards the Gaussian Surface G using forces, denoted by Fi respectively, for modifying the shape of the mesh model in 3D. The forces Fi are calculated based on distances Di of the vertices Pi to the Gaussian Surface G. The distances Di are estimated based on the Reference Distance Dref and the distance between the different vertices Pi and the Reference Point Pref. FIG.3D represents two faces Φ1, Φ2 of a mesh model, a point Pref located at a vertex of the face Φ1, and the distance Dref between the Click Point CP and Pref. The distance Di for the vertex A of face Φ1 is the distance Da, which is the sum of the distance Dref and the distance between Pref and A on the edge of face Φ1: Di(A) = Dref + |APref| = Da. And the distance Di for B is: Di(B) = Db = Da + |AB|. The respective motion vectors Vi are given by the means 26 as a function of Dref: Vi = Di. Calculation means 27 calculates parameters, denoted by weights αi, for further calculating said forces Fi based on the respective distances Di assigned to the neighboring vertices Pi.
For moving the neighbors of the Reference Point around this point towards the Gaussian Surface, the weights αi are calculated based on the respective distances Di. These weights αi are assigned to the neighbor vertices Pi of Pref, for example: αi = e^(−Di²/σ²), with σ = Dref for instance. Calculation means 28 further gives the forces Fi for moving the respective vertices Pi:
Fi = αi · Vi
Hence, the forces applied to each vertex of the mesh model are a function of the distance of the neighbor vertex to the Reference Point Pref of the mesh model to be modified. Using the coefficients αi permits defining forces Fi that depend on the volume of the mesh model to modify. The larger the volume of the mesh model, the larger the variation of the forces to apply to the vertices. Referring to FIG.1B and illustrated by FIG.3E and FIG.4C, the processing means 29 of the Real-time Interactive Adaptation Means 20 further moves the vertex Pref and its neighbor vertices defined by the distances Di towards the Gaussian Surface G in 3D, respectively using the pulling forces F and Fi previously defined. Each time the user moves the Click Point CP, simply by sliding the mouse of the computing means or by any other drawing means, the Correction Surface G, which is attached to the sliding Click Point CP, follows said Click Point. This is very important because it permits the user to choose, in real time, the best Click Point location for correcting the mesh model. Also, the control means 15 permits the user to interactively modify the value of the distance Dref = σ. Hence a new Gaussian Surface G is calculated and attached to the Click Point. This is very important because it permits acting on the size of the area of the mesh model to be modified. At this stage, the real-time adaptation of the mesh model has been performed once. The user has display means for estimating the result of the real-time Interactive Adaptation means 20. Either the real-time Interactive Adaptation means yields directly a corrected mesh curve CC, or the user may cancel the last operations and select a new Data Plane and/or a new Click Point, which yields a new correction curve.
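One step of the attract-to-point operation can be sketched by putting the preceding pieces together. The distances Di are accumulated along mesh edges from Pref (here with Dijkstra's algorithm), the weights are αi = e^(−Di²/σ²) with σ = Dref, and each vertex is displaced by its weight times the Pref→Click-Point vector. Displacing all vertices along that single vector, rather than towards the full Gaussian surface, is a simplification of this sketch, and all names are illustrative:

```python
import heapq
import numpy as np

def attract_to_point(vertices, adjacency, p_ref, click_pt, d_ref):
    """Pull mesh vertices towards the Click Point with Gaussian weights.

    vertices  : (N, 3) float array of vertex positions
    adjacency : dict {vertex index: list of neighbor vertex indices}
    p_ref     : index of the Reference Point Pref
    click_pt  : (3,) position of the Click Point CP
    d_ref     : Reference Distance Dref (also used as sigma)
    """
    # Shortest edge-path distance from Pref to every vertex (Dijkstra).
    n = len(vertices)
    dist = np.full(n, np.inf)
    dist[p_ref] = 0.0
    heap = [(0.0, p_ref)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v in adjacency[u]:
            nd = d + float(np.linalg.norm(vertices[u] - vertices[v]))
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    di = d_ref + dist                    # Di = Dref + edge-path distance to Pref
    alpha = np.exp(-(di / d_ref) ** 2)   # weights alpha_i, with sigma = Dref
    pull = np.asarray(click_pt) - vertices[p_ref]
    out = vertices + alpha[:, None] * pull
    out[p_ref] = click_pt                # Pref itself is pulled all the way to CP
    return out
```

Vertices far from Pref receive exponentially smaller displacements, so the correction stays local, and enlarging Dref (σ) widens the affected area, as described above.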
Or the operations may be carried out in several different 2D Data Planes of the volume that contain aberrant curves resulting from preliminarily segmented zones, until the mesh model fulfills fitting conditions chosen by the user. This system permits refining in real time the segmented object in the 3D image, up to a level of fitness of the mapping of the mesh model onto the object that the user regards as a satisfying result. The real-time interactive segmentation may be performed in any plane of the volume for providing such an improved segmented 3D object. As described above, the processing means of the invention may be applied to as many Data Planes and Aberrant Curves as necessary for obtaining a segmentation of the object of interest that conforms to the needs of the user. When the user finally decides to accept the result yielded by the segmentation means, the user actuates the STOP means 30, as illustrated by FIG.1A. Then, the user disposes of a segmented image 2, or of segmented image data. Medical examination apparatus and viewing system The above-described means are included in or coupled to the viewing system of the invention. FIG.5 shows the basic components of an embodiment of an image viewing system in accordance with the present invention, incorporated in a medical examination apparatus. The medical examination apparatus 100 may include a bed 110 on which the patient lies or another element for localizing the patient relative to the imaging apparatus. The medical imaging apparatus 100 may be a CT scanner or another medical imaging apparatus, such as an X-ray or ultrasound apparatus. The image data produced by the apparatus 100 is fed to data processing means 70, such as a general-purpose computer, that comprises computation means and user control means appropriate to form the interactive adaptation means of the invention.
The data processing means 70 is typically associated with a visualization device, such as a monitor 60, and an input device 72, such as a keyboard, or a mouse 71, pointing device, etc., operable by the user so that he/she can interact with the system. The data processing device 70 is programmed to implement the processing means for processing medical image data according to the invention. In particular, the data processing device 70 has the computing means and memory means necessary to perform the operations described in relation to FIG.1 and FIG.4. A computer program product having pre-programmed instructions to carry out these operations can also be implemented. The drawings and their description hereinbefore illustrate rather than limit the invention. It will be evident that there are numerous alternatives that fall within the scope of the appended claims. Moreover, although the present invention has been described in terms of generating image data for display, the present invention is intended to cover substantially any form of visualization of the image data including, but not limited to, display on a display device, and printing. Any reference sign in a claim should not be construed as limiting the claim.

Claims
1. An image processing system comprising 3D processing means (10,11) of automatic segmentation of an object of interest using a 3-D deformable Surface Model in a 3-D image, further comprising means of real-time interactive adaptation (20) of the Surface Model to the actual image of the surface of the object of interest including: user-controlled means of selection (21) of a 2D Data Plane (DP) that intersects the 3-D Surface Model along a 2-D Model Curve (MC) with a 2-D portion (AC) to be modified; user-actuated means for actuating a Click Point (CP) in said 2-D Data Plane and real-time calculation means for yielding a 3D Correction Surface (G) attached to said Click Point; attract-to-points processing means for pulling the points of the 3D Surface Model, in a 3D neighborhood of the 2D portion to be modified, towards the 3D Correction Surface; and visualization means for visualizing in real-time the user-controlled actions.
2. The image processing system of claim 1, wherein the real-time calculation means for yielding a 3D Correction Surface (G) attached to the Click Point comprises: means of calculation of a reference point (Pref) of the Model Curve (MC) nearest to the Click Point (CP) and means of calculation of the corresponding shortest distance (Dref) defined in the Data Plane (DP); and means of calculation of said 3D Correction Surface (G) as a function of said shortest distance (Dref).
3. The image processing system of claim 2, wherein the 3D Correction Surface (G) is a Gaussian Surface whose parameter (σ) is a function of said shortest distance (Dref).
4. The image processing system of claim 3, wherein the parameter (σ) is user-controlled for modifying the shape of the Gaussian Surface (G).
5. The image processing system of one of Claims 2 to 4, wherein the attract-to-points processing means for pulling the points of the 3D Surface Model, in a 3D neighborhood of the 2D portion to be modified, towards the 3D Correction Surface, comprises: calculation means for calculating pulling forces (F, Fi) that are a function of said shortest distance (Dref), for pulling the reference point (Pref) and neighbor points (Pi) towards the Correction Surface in 3D.
6. The image processing system of claim 5, having means for calculation of the pulling forces (F, Fi) as functions of said shortest distance (Dref), weighted for the neighbor points as a function of their distance to the reference point (Pref).
7. The system of Claim 6, having means for calculation of the pulling forces based on the calculation of motion vectors that are a function of said shortest distance (Dref).
8. The system of Claim 7, having means for user-sliding the Click Point in the Data Plane for the user to select the best adaptation of the 3D Surface Model to the object of interest, using the Correction Surface, whereby the 3D Surface Model appears to be attracted to the Click Point and the Correction Surface; for optionally user-selecting a modification of the shape of the attached 3D Correction Surface, while sliding the Click Point; and for repeating the above operations until the user-controlled real-time adaptation of the 3D Surface Model is completed.
9. The system of one of Claims 1 to 8, wherein the Surface Model is a Mesh Model and the points of the Surface Model are vertices of the Mesh Model.
10. The system of Claim 9, comprising: Acquisition means for acquiring a three-dimensional image of an object of interest to be segmented, Automatic segmentation means (10) for generating a Mesh Model, formed of polygonal faces with common edges and nodes and automatically deforming the Mesh Model in order to map said Mesh Model onto said object of interest; Real-time Interactive adaptation means (20) for interactively adapting said Mesh Model in order to locally modify regions of the Mesh Model, by pulling vertices of the Model Surface towards a 3D Correction Surface (G) attached to a user-actuated Click Point (CP) in a selected Data Plane (DP).
11. The system of one of Claims 1 to 10, having display means to display 3D views of the Surface Model, 2D view of the Data Plane, 2D view of the intersection of the Data Plane with the Surface Model, with or without highlighting said intersection, 2D views of intersection of other planes with the Surface Model, said 3D and/or 2D views being displayed one at a time or several at a time.
12. A medical imaging system comprising a suitably programmed computer or a special purpose processor having circuit means, which are arranged to form an image processing system as claimed in one of Claims 1 to 9 to process medical image data.
13. A medical examination imaging apparatus having: Means to acquire a three-dimensional image of an organ of a body; and a system according to one of Claims 1 to 10.
14. A computer program product comprising a set of instructions to be used in a system as claimed in one of Claims 1 to 12.
PCT/IB2005/000083 2004-01-19 2005-01-14 Real time user interaction with deformable surfaces of segmentation WO2005078666A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP05702250A EP1709591A1 (en) 2004-01-19 2005-01-14 Real time user interaction with deformable surfaces of segmentation
JP2006548471A JP2007518484A (en) 2004-01-19 2005-01-14 Real-time user interaction of deformable surface segmentation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP04300027.2 2004-01-19
EP04300027 2004-01-19

Publications (1)

Publication Number Publication Date
WO2005078666A1 true WO2005078666A1 (en) 2005-08-25

Family

ID=34854735

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2005/000083 WO2005078666A1 (en) 2004-01-19 2005-01-14 Real time user interaction with deformable surfaces of segmentation

Country Status (3)

Country Link
EP (1) EP1709591A1 (en)
JP (1) JP2007518484A (en)
WO (1) WO2005078666A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009016530A2 (en) 2007-07-27 2009-02-05 Koninklijke Philips Electronics N.V. Interactive atlas to image registration
WO2010113051A1 (en) * 2009-04-03 2010-10-07 Koninklijke Philips Electronics N.V. System and method for interactive live-mesh segmentation
US8253726B1 (en) 2008-01-09 2012-08-28 Spaceclaim Corporation, Inc. Systems and methods for modifying three dimensional geometry using an arbitrary cross-section plane
WO2013003136A1 (en) * 2011-06-28 2013-01-03 General Electric Company Method and system for navigating, segmenting, and extracting a three-dimensional image
US8477153B2 (en) 2011-08-24 2013-07-02 General Electric Company Method and system for navigating, segmenting, and extracting a three-dimensional image
EP3360486A1 (en) * 2017-02-13 2018-08-15 Koninklijke Philips N.V. Ultrasound evaluation of anatomical features
EP3195272B1 (en) * 2014-09-02 2018-11-21 Koninklijke Philips N.V. Device for visualizing a 3d object
US10586398B2 (en) 2014-12-18 2020-03-10 Koninklijke Philips N.V. Medical image editing
US10984533B2 (en) 2016-10-25 2021-04-20 Koninklijke Philips N.V. Method and apparatus for segmenting a two-dimensional image of an anatomical structure
US11793574B2 (en) 2020-03-16 2023-10-24 Stryker Australia Pty Ltd Automated cut planning for removal of diseased regions

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
EP3493154A1 (en) * 2017-12-01 2019-06-05 Koninklijke Philips N.V. Segmentation system for segmenting an object in an image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ALLAN J B; WYVILL B; WITTEN I H: "A methodology for direct manipulation of polygon meshes", NEW ADVANCES IN COMPUTER GRAPHICS, June 1998 (1998-06-01), pages 451 - 469, XP008044110 *
JACKOWSKI M ET AL: "Interactive tools for image segmentation", PROCEEDINGS OF THE SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING SPIE-INT. SOC. OPT. ENG USA, vol. 3661, 1999, pages 1063 - 1074, XP002321032, ISSN: 0277-786X *
MCINERNEY T ET AL: "A DYNAMIC FINITE ELEMENT SURFACE MODEL FOR SEGMENTATION AND TRACKING IN MULTIDIMENSIONAL MEDICAL IMAGES WITH APPLICATION TO CARDIAC 4D IMAGE ANALYSIS", COMPUTERIZED MEDICAL IMAGING AND GRAPHICS, PERGAMON PRESS, NEW YORK, NY, US, vol. 19, no. 1, 1995, pages 69 - 83, XP000934040, ISSN: 0895-6111 *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009016530A3 (en) * 2007-07-27 2009-07-09 Koninkl Philips Electronics Nv Interactive atlas to image registration
WO2009016530A2 (en) 2007-07-27 2009-02-05 Koninklijke Philips Electronics N.V. Interactive atlas to image registration
US8554573B2 (en) 2007-07-27 2013-10-08 Koninklijke Philips N.V. Interactive atlas to image registration
US8253726B1 (en) 2008-01-09 2012-08-28 Spaceclaim Corporation, Inc. Systems and methods for modifying three dimensional geometry using an arbitrary cross-section plane
RU2523915C2 (en) * 2009-04-03 2014-07-27 Конинклейке Филипс Электроникс Н.В. System and method for interactive live-mesh segmentation
CN102378990A (en) * 2009-04-03 2012-03-14 皇家飞利浦电子股份有限公司 System and method for interactive live-mesh segmentation
WO2010113051A1 (en) * 2009-04-03 2010-10-07 Koninklijke Philips Electronics N.V. System and method for interactive live-mesh segmentation
US8907944B2 (en) 2011-06-28 2014-12-09 General Electric Company Method and system for navigating, segmenting, and extracting a three-dimensional image
WO2013003136A1 (en) * 2011-06-28 2013-01-03 General Electric Company Method and system for navigating, segmenting, and extracting a three-dimensional image
US8477153B2 (en) 2011-08-24 2013-07-02 General Electric Company Method and system for navigating, segmenting, and extracting a three-dimensional image
EP3195272B1 (en) * 2014-09-02 2018-11-21 Koninklijke Philips N.V. Device for visualizing a 3d object
US11000252B2 (en) 2014-09-02 2021-05-11 Koninklijke Philips N.V. Device for visualizing a 3D object
US10586398B2 (en) 2014-12-18 2020-03-10 Koninklijke Philips N.V. Medical image editing
US10984533B2 (en) 2016-10-25 2021-04-20 Koninklijke Philips N.V. Method and apparatus for segmenting a two-dimensional image of an anatomical structure
CN110300548A (en) * 2017-02-13 2019-10-01 皇家飞利浦有限公司 Ultrasound Evaluation anatomical features
WO2018146296A1 (en) 2017-02-13 2018-08-16 Koninklijke Philips N.V. Ultrasound evaluation of anatomical features
EP3360486A1 (en) * 2017-02-13 2018-08-15 Koninklijke Philips N.V. Ultrasound evaluation of anatomical features
US11484286B2 (en) 2017-02-13 2022-11-01 Koninklijke Philips N.V. Ultrasound evaluation of anatomical features
US11793574B2 (en) 2020-03-16 2023-10-24 Stryker Australia Pty Ltd Automated cut planning for removal of diseased regions

Also Published As

Publication number Publication date
EP1709591A1 (en) 2006-10-11
JP2007518484A (en) 2007-07-12

Similar Documents

Publication Publication Date Title
EP1709591A1 (en) Real time user interaction with deformable surfaces of segmentation
EP1565880B1 (en) Image processing system for automatic adaptation of a 3-d mesh model onto a 3-d surface of an object
EP1588325B1 (en) Image processing method for automatic adaptation of 3-d deformable model onto a substantially tubular surface of a 3-d object
EP1685534B1 (en) Three-dimensional segmentation using deformable surfaces
US7773786B2 (en) Method and apparatus for three-dimensional interactive tools for semi-automatic segmentation and editing of image objects
US8345927B2 (en) Registration processing apparatus, registration method, and storage medium
EP3100236B1 (en) Method and system for constructing personalized avatars using a parameterized deformable mesh
EP2710557B1 (en) Fast articulated motion tracking
US20040171922A1 (en) Image processing method for interacting with a 3-d surface represented in a 3-d image
US20070196007A1 (en) Device Systems and Methods for Imaging
US20130135305A1 (en) In-plane and interactive surface mesh adaptation
JP3712234B2 (en) Region of interest extraction method and image processing server
JP4170096B2 (en) Image processing apparatus for evaluating the suitability of a 3D mesh model mapped on a 3D surface of an object
Kjer et al. Free-form image registration of human cochlear μCT data using skeleton similarity as anatomical prior
US20220375099A1 (en) Segmentating a medical image
EP4197444A1 (en) Lung volume estimation from radiographic images
Shen et al. Deformable registration using spring mass system with cross-section correction
Vanacken et al. Force feedback to assist active contour modelling for tracheal stenosis segmentation

Legal Events

Date Code Title Description

AK Designated states
Kind code of ref document: A1
Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents
Kind code of ref document: A1
Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the EPO has been informed by WIPO that EP was designated in this application

WWE WIPO information: entry into national phase
Ref document number: 2005702250; Country of ref document: EP

WWE WIPO information: entry into national phase
Ref document number: 2006548471; Country of ref document: JP

NENP Non-entry into the national phase
Ref country code: DE

WWW WIPO information: withdrawn in national office
Country of ref document: DE

WWW WIPO information: withdrawn in national office
Ref document number: 2005702250; Country of ref document: EP

WWP WIPO information: published in national office
Ref document number: 2005702250; Country of ref document: EP