EP2111604A2 - Imaging system and imaging method for imaging an object
- Publication number
- EP2111604A2 (application EP07849520A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- scan parameter
- projection image
- dimensional
- dimensional model
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/41—Detecting, measuring or recording for evaluating the immune or lymphatic systems
- A61B5/414—Evaluating particular organs or parts of the immune or lymphatic systems
- A61B5/417—Evaluating particular organs or parts of the immune or lymphatic systems the bone marrow
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/467—Arrangements for interfacing with the operator or the patient characterised by special input means
- A61B6/469—Arrangements for interfacing with the operator or the patient characterised by special input means for selecting a region of interest [ROI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/113—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb occurring during breathing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/503—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of the heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/504—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of blood vessels, e.g. by angiography
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the invention relates to an imaging system, an imaging method and a computer program for imaging an object.
- the invention relates further to a scan parameter determination device, a scan parameter determination method and a computer program for determining a scan parameter.
- a computed tomography system is an imaging system for imaging an object.
- a computed tomography system comprises an X-ray source and a detection unit, which move relative to an examination zone, in which an object is located. Detection values are acquired, which depend on the radiation after having passed the object, and an image of the object is reconstructed using the acquired detection values.
- the X-ray source generally illuminates only the part of the object that has to be illuminated in order to acquire detection values sufficient for reconstructing a desired region of interest.
- the region of interest is generally defined by hand. For defining the region of interest by hand a two-dimensional projection image is generated and displayed on a monitor, and a user defines a region of interest on the two-dimensional projection image by using a graphical user interface.
- the two-dimensional projection image is generated by moving an object table, on which the object is located, linearly and by illuminating the object by radiation of the X-ray source, which does not rotate during generation of the two-dimensional projection image.
- This determination of the region of interest has the drawback that structures of the object overlaying each other in the projection direction cannot be distinguished in the two-dimensional projection image, which decreases the quality of the determination of the region of interest.
- an imaging system for imaging an object wherein the determination of a scan parameter, like the region of interest, is improved.
- an imaging system for imaging an object is presented, wherein the imaging system is adapted for scanning the object in accordance with a scan parameter, the imaging system comprising a projection image generation unit for generating a two-dimensional projection image of the object, a model provision unit for providing a three-dimensional model of the object, a registration unit for registering the three-dimensional model with the two-dimensional projection image, and a scan parameter determination unit for determining the scan parameter from the registered model.
- the determination of the scan parameter is improved, even if the object, which has to be scanned, is a moving object.
- the registration unit is adapted for using registration features, which are detectable in the two-dimensional projection image and in the three-dimensional model, for registration, wherein the scan parameter determination unit uses scan parameter determining features of the three-dimensional model for determining the scan parameter from the registered three-dimensional model. This allows optimizing the registration features for the registration and optimizing the scan parameter determining features for the determination of the scan parameter independently of each other. This further improves the quality of the registration and the quality of the determination of the scan parameter.
- the three-dimensional model of the object comprises not only the object itself, but also further objects or entities, in particular registration features.
- the model of the object preferentially comprises not only the heart, but several objects or entities located in a thorax region of a human patient. That is, if the object which has to be imaged is the heart of a patient, the model of the heart is preferentially a thorax model including the heart of the patient and further entities or objects within the thorax region, like the spinal column and the ribs, which can be used as registration features.
- the model provision unit is adapted for adapting the three-dimensional model to the two-dimensional projection image.
- the three-dimensional model can be transformed, for example, translated and/or rotated and/or contracted and/or extended, in order to match the projection image as well as possible.
- a similarity measure can be used, like a sum of absolute differences or a correlation, and the three-dimensional model can be transformed such that the similarity measure, applied to the two-dimensional projection image and to a simulated two-dimensional projection image (simulated, for example, by forward projecting the transformed three-dimensional model in the projection geometry of the provided two-dimensional projection image), is minimized.
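The forward-project-and-compare step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: a parallel-beam projection is approximated by summing the volume along one axis, the transform is restricted to a z-translation found by grid search, and all function names are hypothetical.

```python
import numpy as np

def forward_project(volume, axis=0):
    """Crude parallel-beam projection: sum the volume along one axis.
    A real system would use the scanner's projection geometry."""
    return volume.sum(axis=axis)

def sum_of_absolute_differences(image_a, image_b):
    """Similarity measure from the text: lower means a better match."""
    return np.abs(image_a.astype(float) - image_b.astype(float)).sum()

def best_shift(model_volume, measured_projection, shifts):
    """Grid search over z-translations of the model; keep the shift
    whose simulated projection minimizes the SAD."""
    best_dz, best_sad = None, None
    for dz in shifts:
        shifted = np.roll(model_volume, dz, axis=1)
        sad = sum_of_absolute_differences(
            forward_project(shifted), measured_projection)
        if best_sad is None or sad < best_sad:
            best_dz, best_sad = dz, sad
    return best_dz
```

In practice the transform would also include rotation, contraction and extension, and a continuous optimizer would replace the grid search.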
- a scan parameter determination device for determining a scan parameter is provided, which is usable by an imaging system for scanning an object in accordance with the scan parameter, the scan parameter determination device being provided with a two-dimensional projection image of the object generated by a projection image generation unit, wherein the scan parameter determination device comprises: a model provision unit for providing a three-dimensional model of the object, a registration unit for registering the three-dimensional model with the two-dimensional projection image, and a scan parameter determination unit for determining the scan parameter from the registered three-dimensional model.
- a scan parameter determination method for determining a scan parameter is provided, which is usable by an imaging system for scanning an object in accordance with the scan parameter, the scan parameter determination method being provided with a two-dimensional projection image of the object generated by a projection image generation unit, wherein the scan parameter determination method comprises the following steps: providing a three-dimensional model of the object by a model provision unit, registering the three-dimensional model with the two-dimensional projection image by a registration unit, and determining the scan parameter from the registered three-dimensional model by a scan parameter determination unit.
- a computer program for determining a scan parameter comprises program code means for causing a computer to carry out the steps of the scan parameter determination method as claimed in claim 7 when the computer program is carried out on a computer controlling a scan parameter determination device as claimed in claim 5.
- Fig. 1 shows schematically a representation of an imaging system in accordance with the invention.
- Fig. 4 shows schematically a flowchart illustrating a scan parameter determination method for determining a scan parameter in accordance with the invention.
- Fig. 1 shows an imaging system for imaging an object, which is, in this embodiment, a computed tomography system (CT system).
- the CT system includes a gantry 1, which is capable of rotation about an axis of rotation R, which extends parallel to the z direction.
- a radiation source 2 which is, in this embodiment, an X-ray source 2 is mounted on the gantry 1.
- the X-ray source 2 is provided with a collimator 3, which forms in this embodiment a conical radiation beam 4 from the radiation produced by the X-ray source 2.
- the radiation traverses an object (not shown), such as a patient, in an examination zone 5, which is in this embodiment cylindrical.
- the X-ray beam 4 is incident on a detection unit 6, which comprises in this embodiment a two-dimensional detection surface.
- the detection unit 6 is mounted on the gantry 1.
- the examination zone 5 including the object is moved parallel to the axis of rotation R or the z direction, and the radiation source 2 is not rotated, i.e. the radiation source 2 and the examination zone 5 move relative to each other along a linear trajectory.
- the collimator 3 can be adapted for forming a fan beam and the detection unit 6 can also be a one-dimensional detector.
- the control unit 9 is adapted such that during an acquisition of detection values, which will be used for reconstructing an image of the object, the object is scanned in accordance with one or more scan parameters.
- the radiation source 2 and the examination zone 5 move relative to each other along a linear trajectory, wherein detection values are acquired by the detection unit 6, which are transmitted to a projection image generation unit 15.
- the radiation source 2 is not rotated and the examination zone 5, and, therefore, the object are moved parallel to the z direction or the axis of rotation R, for example, by moving a patient table, on which the object is located, parallel to the z direction or the axis of rotation R.
- the projection image generation unit 15 generates a two-dimensional projection image of the object and transmits the generated two-dimensional projection image to a scan parameter determination device 12.
- the scan parameter determination device 12, which is schematically shown in more detail in Fig. 2, comprises a model provision unit 16 for providing a three-dimensional model of the object, a registration unit 17 for registering the three-dimensional model with the two-dimensional projection image, a scan parameter determination unit 18 for determining the scan parameter from the registered three-dimensional model, and a modification unit 19 for allowing a user to modify at least one of the registration between the two-dimensional projection image and the three-dimensional model and the determined scan parameter.
- the determined scan parameter and/or the three-dimensional registered model and the two-dimensional projection image can be shown on a display 11.
- the scan parameters determined by the scan parameter determination unit 18 are transmitted to the control unit 9, which controls the scanning of the object for acquiring detection values, which will be used for reconstructing an image of the object, in accordance with the determined scan parameters.
- the acquired detection values which have been acquired while the control unit 9 has controlled the imaging system in accordance with the determined one or more scan parameters, are provided to an image generation device 10 for generating an image of the object.
- the image generation device 10 reconstructs an image of the object using the acquired detection values.
- the control unit 9 is connected to an input unit 20, which is, for example, a keyboard or a mouse, in particular for allowing a user to select a desired scan. For example, a user can input that only a certain part of the object or, if several objects are present, a certain object should be imaged.
- the CT system further comprises, in this embodiment, a motion determination unit 14 for determining a motion of the object.
- the motion determination unit 14 is, for example, an electrocardiograph for acquiring an electrocardiogram, which is directly related to the movement of a heart of the patient, if a heart or a part of a heart, i.e. a region of interest, has to be imaged.
- the motion determination unit 14 can comprise a device for determining the motion of a patient caused by respiration. Furthermore, the motion determination unit 14 can be adapted such that it determines a motion of the object from the acquired detection values, which will be used for reconstructing an image of the object, wherein the images of the object can also image a part of an object or a region of interest of an object.
- a type of scan is provided, which has to be performed.
- a user can define the type of scan, for example, by inputting a corresponding input signal in the control unit 9 using the input unit 20.
- the user can, for example, define that a certain organ of a patient, like the heart, should be imaged; furthermore, a user can define that only a part of an object, like a part or a region of interest of an organ, should be imaged.
- a user can define that a cardiac computed tomography angiography scan (cardiac CTA scan) should be performed.
- a two-dimensional projection image is generated.
- the X-ray source 2 does not rotate and the examination zone 5 including the object is moved parallel to the z direction or the axis of rotation R, i.e. the X-ray source 2 and the examination zone 5 move relative to each other along a linear trajectory.
- the X-ray source 2 emits X-ray radiation traversing the object in the examination zone 5.
- the X-ray radiation, which has passed the object, is detected by the detection unit 6, which generates detection values. These detection values are transmitted to the projection image generation unit 15, which generates a projection image of the object from the detection values.
- This projection image is an overview image, i.e. the projection image shows a region of the examination zone 5 which is surely large enough to include the object, the part of the object, or the region of interest within the object which has to be scanned for performing the type of scan defined in step 101.
- the projection image is generated, and the detection values are acquired, such that the projection image surely shows at least the human heart.
- the projection image can show the whole thorax.
- Such a projection image is, for example, a scanogram.
- a three-dimensional model of the object is provided by the model provision unit 16 in step 103.
- the three-dimensional model does not only comprise the heart itself, but also registration features, which are detectable in the two-dimensional projection image and in the three-dimensional model. These registration features are, for example, bones of the thorax.
- the three-dimensional model is, for example, an anatomical model of the respective anatomical region, which corresponds to the type of scan defined in step 101.
- the anatomical model is, for example, a model of the head/neck region, the thorax/abdomen region, the pelvis or the legs region.
- These anatomical models comprise the corresponding bone structures and the soft tissue, in particular the organs, located in the respective anatomical region.
- the order of steps 102 and 103 can be exchanged, i.e. step 103 can be performed before step 102.
- the provided three-dimensional model and the two-dimensional projection image are transmitted to the registration unit 17, which registers the three-dimensional model with the two-dimensional projection image and which furthermore preferentially adapts the three-dimensional model to the two-dimensional projection image in step 104.
- the registration features are used.
- the registration features are preferentially bone structures, which can be identified in the three-dimensional model and in the two-dimensional projection image.
- a bone structure within the projection image can be detected by using thresholding and/or a casting of search rays and/or a generalized Hough transform.
- the use of a casting of search rays is, for example, disclosed in "Fast automated object detection by recursive casting of search rays", C. Lorenz, J.v.
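As a rough illustration of the thresholding option mentioned above (not the ray-casting method of the cited paper), a bone-like structure in a projection image might be segmented and coarsely localized as sketched below; the mean-plus-two-standard-deviations fallback threshold is an assumed heuristic, and the function names are illustrative.

```python
import numpy as np

def detect_bone_mask(projection, threshold=None):
    """Segment high-attenuation (bone-like) pixels by thresholding.
    The mean + 2*std fallback is an assumed heuristic, not a value
    from the patent."""
    proj = projection.astype(float)
    if threshold is None:
        threshold = proj.mean() + 2.0 * proj.std()
    return proj >= threshold

def bone_centroid(mask):
    """Centroid of the detected pixels, usable as a coarse position
    estimate for initializing the 2D-3D registration."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```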
- the registration unit 17 uses the registration features, i.e. in this embodiment the detected bone structure, for positioning the three-dimensional model with respect to the two-dimensional projection image and for adapting the three-dimensional model to the two-dimensional projection image.
- the positioning and adaptation of the model with respect to the projection image is performed by using a 2D-3D registration method.
- the model is positioned and transformed such that the registration features, in this embodiment the bone structure, of the model match the registration features of the projection image as well as possible.
- the transformation preferentially includes a translation, a rotation and an extension or contraction of the three-dimensional model.
- the 2D-3D registration can, for example, use simulated projection images or radiographs generated from the model as, for example, described in "An approach to 2D/3D registration of a vertebra in 2D X-ray fluoroscopies with 3D CT images", J. Weese, T.M. Buzug, C. Lorenz, C. Fassnacht, VRMed 1997, pages 119-128, ISBN 3-540-62794-0, or a matching of silhouette lines to edge features in the projection image as, for example, described in "Recovering the position and orientation of free form objects from image contours using 3D distance maps", S. Lavallee, R. Szeliski, IEEE PAMI, 17(4), 1995.
- a similarity measure can be used, like a sum of absolute differences or a correlation, and the three-dimensional model can be transformed such that the similarity measure, applied to the registration features of the two-dimensional projection image and to the registration features of a simulated two-dimensional projection image (simulated, for example, by forward projecting the transformed three-dimensional model, in particular its registration features, in the projection geometry of the provided two-dimensional projection image), is minimized.
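The silhouette-to-edge matching with distance maps mentioned in the cited Lavallee/Szeliski reference can be sketched in 2D as follows. This is an illustrative toy: projected silhouette points are moved by a similarity transform (rotation, translation, scale), and the cost is the mean value of a precomputed edge distance map at the transformed points. The exhaustive grid search stands in for the gradient-based optimizer a real implementation would use; all names are hypothetical.

```python
import numpy as np

def transform_points(points, angle, shift, scale):
    """Apply a 2D similarity transform (rotation, translation, scale)
    to an (N, 2) array of silhouette points."""
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s], [s, c]])
    return scale * points @ rot.T + shift

def mean_edge_distance(points, distance_map):
    """Average value of a precomputed distance map at the (rounded,
    clipped) point locations -- the cost in distance-map matching."""
    idx = np.clip(np.round(points).astype(int), 0,
                  np.array(distance_map.shape) - 1)
    return distance_map[idx[:, 0], idx[:, 1]].mean()

def register(points, distance_map, angles, shifts, scales):
    """Exhaustive grid search over the transform parameters; return
    the (angle, shift, scale) combination with the lowest cost."""
    best = None
    for a in angles:
        for dx, dy in shifts:
            for sc in scales:
                cost = mean_edge_distance(
                    transform_points(points, a, np.array([dx, dy]), sc),
                    distance_map)
                if best is None or cost < best[0]:
                    best = (cost, a, (dx, dy), sc)
    return best[1:]
```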
- registration features are, for example, bones and, in particular, silhouette lines and edge features of bones.
- the motion determination unit 14 has determined the motion of the object, for example, the motion of a heart present in the examination zone 5.
- the provided three-dimensional model is therefore preferentially a moving three-dimensional model of the object, wherein the registration unit 17 is adapted for registering the moving three-dimensional model with, and preferentially adapting it to, the two-dimensional projection image using the motion of the object determined during the acquisition of the projection image. Therefore, for each of one or several projection images the corresponding motion phase of the object is determined, and during the registration, and preferentially the adaptation, the moving three-dimensional model in the respective motion phase is registered with, and preferentially adapted to, a two-dimensional projection image having this respective motion phase.
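Determining the motion phase of an acquisition time from an electrocardiogram, and selecting the matching phase of the moving model, could look like the sketch below. It assumes the phase varies linearly within each R-R interval and that the moving model is stored at a discrete set of phases; both assumptions and all names are illustrative, not from the patent.

```python
def cardiac_phase(time, r_peaks):
    """Fraction of the R-R interval elapsed at `time` (0 <= phase < 1).
    `r_peaks` are ECG R-peak times in seconds, sorted ascending."""
    for start, end in zip(r_peaks, r_peaks[1:]):
        if start <= time < end:
            return (time - start) / (end - start)
    raise ValueError("time outside recorded R-R intervals")

def select_model_phase(time, r_peaks, model_phases):
    """Pick the precomputed model phase closest to the acquisition
    phase of the projection image."""
    phase = cardiac_phase(time, r_peaks)
    return min(model_phases, key=lambda p: abs(p - phase))
```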
- the registered three-dimensional model is transmitted to the scan parameter determination unit 18 for determining one or more scan parameters from the registered three-dimensional model. Since within the three-dimensional model the dimensions and the position of an object are known, one or more scan parameters can be determined such that an object, i.e., for instance, a whole object, a part of an object or a region of interest of an object, can be imaged by the imaging system.
- if the object which has to be imaged is a heart and if the three-dimensional model is a thorax model, which has been registered with respect to the two-dimensional projection image, the region which has to be scanned by the rays of the radiation source 2 can be determined as a scan parameter such that the whole heart, a part of the heart, or a region of interest of the heart can be reconstructed.
- the slice and the in-slice position for an aorta contrast peak measurement used for a bolus injection delay time can be determined from the registered three-dimensional thorax model including a three-dimensional model of a heart.
- the three-dimensional model is an anatomical three-dimensional model, like a thorax model, which is composed of entities (for instance individual vertebrae, ribs, organs) carrying an entity-specific local coordinate system.
- the relations between the individual coordinate systems are known, for example, by learning during the generation of the anatomical three-dimensional model. Since the positions, orientations and dimensions of the registration features, which are in this embodiment bone structures, are known after the registration step 104, and since the spatial relations between the individual coordinate systems of the registration features, i.e. the registration entities, and of the other entities, like a soft tissue organ, are known, the positions, orientations and dimensions of objects within the anatomical three-dimensional model, or of a region of interest within the anatomical model, for example, the position, orientation and dimensions of the heart, can easily be determined. Since the positions, orientations and dimensions of one or several objects or entities are known, this information can be used for determining at least one scan parameter depending on the type of scan defined in step 101. For example, if a cardiac scan, which requires imaging the whole heart, has been defined in step 101, the region of the patient which has to be illuminated for reconstructing an image of the heart can easily be determined, because the position, orientation and dimensions of the heart of the specific patient are known.
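Chaining the registered pose of a bone entity with the learned bone-to-organ relation is, in essence, a product of homogeneous transforms. A minimal sketch (the names and the pure-translation example are illustrative, not from the patent):

```python
import numpy as np

def locate_entity(world_from_bone, bone_from_entity):
    """Chain the registered pose of the bone structure (world_from_bone)
    with the learned bone-to-entity relation (bone_from_entity) to get
    the entity's pose in scanner (world) coordinates. Both inputs are
    4x4 homogeneous transforms."""
    return world_from_bone @ bone_from_entity

def translation(t):
    """Build a pure-translation 4x4 homogeneous transform."""
    m = np.eye(4)
    m[:3, 3] = t
    return m
```

With the entity's world pose known, its position, orientation and extent directly yield the region that has to be illuminated.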
- the position, orientation and dimensions of the aorta can be determined from the registered anatomical model, and the scan parameters can be defined such that an image of the aorta can be reconstructed for a contrast peak measurement used for the bolus injection delay time.
- the bolus is a contrast agent bolus used for visualizing the heart vessels.
- the determination of the one or more scan parameters is preferentially performed automatically.
- the display 11 displays at least one of the two-dimensional projection image, the registered three-dimensional model and the determined scan parameters.
- the registration of the three-dimensional model with the two-dimensional projection image and/or the determined scan parameters can be modified by a user using the modification unit 19.
- the modification unit 19 comprises, for example, a graphical user interface allowing a user to modify the registration, for example, by modifying the position of the three-dimensional model with respect to the two-dimensional projection image, or the determined scan parameters, for example, by modifying a region of interest of the examination zone 5, which has preferentially been determined as a scan parameter in step 105.
- In step 108 the examination zone 5 and, therefore, the object is scanned in accordance with the scan parameters determined in step 105.
- a scan parameter can be a region of the examination zone 5 which has to be imaged, i.e., for example, a region of interest.
- the radiation source 2 and the examination zone 5 move relative to each other such that detection values are acquired, which are sufficient to reconstruct an image of the region of the examination zone defined in step 105.
- In step 109 an image of the object is reconstructed using the detection values acquired in step 108.
- the reconstruction is performed by a reconstruction unit, which preferentially uses a filtered back projection technique for reconstructing the object. Also other reconstruction techniques can be used, like a Radon inversion.
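A toy parallel-beam filtered back projection illustrates the preferred technique; it is a simplified sketch, not the system's reconstruction (the actual scanner acquires cone- or fan-beam data, and the FFT ramp filter and nearest-neighbour backprojection here are deliberately crude).

```python
import numpy as np

def ramp_filter(sinogram):
    """Filter each projection row with a ramp filter in Fourier space;
    `sinogram` has shape (num_angles, num_detector_bins)."""
    freqs = np.fft.fftfreq(sinogram.shape[1])
    return np.real(np.fft.ifft(
        np.fft.fft(sinogram, axis=1) * np.abs(freqs), axis=1))

def backproject(filtered, angles, size):
    """Smear each filtered projection back across a size x size grid
    using nearest-neighbour interpolation on the detector axis."""
    image = np.zeros((size, size))
    center = (size - 1) / 2.0
    ys, xs = np.mgrid[0:size, 0:size] - center
    for row, theta in zip(filtered, angles):
        # detector coordinate of each pixel for this viewing angle
        t = xs * np.cos(theta) + ys * np.sin(theta) + center
        idx = np.clip(np.round(t).astype(int), 0, size - 1)
        image += row[idx]
    return image * np.pi / len(angles)
```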
- the motion of the object determined by the motion determination unit 14 is considered.
- Steps 106 and/or 107 can be omitted. If step 106 is omitted, the imaging method for imaging an object does not visualize at least one of the registration of the three-dimensional model with the two-dimensional projection image and the determined scan parameters. If step 107 is omitted, the imaging method for imaging an object does not provide the possibility to modify at least one of the registration of the three-dimensional model with the two-dimensional projection image and of the determined scan parameters.
- a scan parameter determination method for determining a scan parameter, which is usable by an imaging system for scanning an object in accordance with the scan parameter, will be described with reference to a flowchart shown in Fig. 4.
- In step 201 a three-dimensional model of an object, which has to be imaged, is provided by the model provision unit 16. This provision of a three-dimensional model corresponds to step 103 in Fig. 3.
- In step 202 the provided three-dimensional model and a two-dimensional projection image of the object provided by the projection image generation unit 15 are registered by the registration unit 17. Furthermore, preferentially the registration unit 17 adapts the three-dimensional model to the two-dimensional projection image.
- step 202 corresponds to step 104.
- the registered three-dimensional model is transmitted to the scan parameter determination unit 18 for determining one or more scan parameters from the registered three-dimensional model in step 203. This determination of one or more scan parameters from the registered three-dimensional model corresponds to step 105 in Fig. 3.
- In step 204 the display 11 displays at least one of the two-dimensional projection image, the registered three-dimensional model and the determined one or more scan parameters.
- In step 205 the registration of the three-dimensional model with the two-dimensional projection image and/or the determined one or more scan parameters can be modified by a user using the modification unit 19.
- This provision of a modification by a modification unit 19 corresponds to step 107 in Fig. 3.
- the scan parameter determination method for determining a scan parameter ends in step 206.
- Steps 204 and/or 205 can be omitted, wherein, if step 204 is omitted, at least one of the registration of the three-dimensional model and the two-dimensional projection image and the scan parameters is not visualized, and wherein, if step 205 is omitted, the scan parameter determination method does not provide a possibility for a user to modify at least one of the registration of the three-dimensional model with the two-dimensional projection image and the determined scan parameters.
- although the imaging system which has been described above comprises a motion determination unit 14, the invention is not limited to an imaging system comprising a motion determination unit.
- the imaging system can also be an imaging system, which does not comprise a motion determination unit.
- The expression "imaging of an object" and similar expressions include the imaging of a whole object, a part of an object and/or a region of interest of an object.
- The imaging system can also image several objects.
- The invention is not limited to an imaging system being a computed tomography system.
- The imaging system can also be an X-ray system mounted on a C-arm.
- The term "a scan parameter" in claim 1 does not limit the invention to the determination of only one scan parameter; several scan parameters can also be determined in accordance with the invention.
- A single unit may fulfill the functions of several items recited in the claims.
- The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
- A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium, supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Optics & Photonics (AREA)
- High Energy & Nuclear Physics (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Pulmonology (AREA)
- Hematology (AREA)
- Immunology (AREA)
- Vascular Medicine (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Cardiology (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Analysing Materials By The Use Of Radiation (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP07849520A EP2111604A2 (de) | 2006-12-22 | 2007-12-17 | Bildgebungssystem und bildgebungsverfahren zur abbildung eines objekts |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP06127045 | 2006-12-22 | | |
| EP07849520A EP2111604A2 (de) | 2006-12-22 | 2007-12-17 | Bildgebungssystem und bildgebungsverfahren zur abbildung eines objekts |
| PCT/IB2007/055155 WO2008078259A2 (en) | 2006-12-22 | 2007-12-17 | Imaging system and imaging method for imaging an object |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP2111604A2 true EP2111604A2 (de) | 2009-10-28 |
Family
ID=39563029
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP07849520A Ceased EP2111604A2 (de) | 2006-12-22 | 2007-12-17 | Bildgebungssystem und bildgebungsverfahren zur abbildung eines objekts |
Country Status (4)
| Country | Link |
|---|---|
| EP (1) | EP2111604A2 (de) |
| JP (1) | JP5345947B2 (de) |
| CN (1) | CN101689298B (de) |
| WO (1) | WO2008078259A2 (de) |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2010025946A1 (en) * | 2008-09-08 | 2010-03-11 | Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts | System and method for automated, intrinsic gated imaging |
| WO2010109345A1 (en) * | 2009-03-25 | 2010-09-30 | Koninklijke Philips Electronics N.V. | Method and apparatus for breathing adapted imaging |
| DE102009049818A1 (de) | 2009-10-19 | 2011-04-21 | Siemens Aktiengesellschaft | Method for determining the projection geometry of an X-ray system |
| US8693634B2 (en) | 2010-03-19 | 2014-04-08 | Hologic Inc | System and method for generating enhanced density distribution in a three dimensional model of a structure for use in skeletal assessment using a limited number of two-dimensional views |
| JP5844353B2 (ja) | 2010-05-26 | 2016-01-13 | Koninklijke Philips N.V. | High volume rate three-dimensional ultrasonic diagnostic imaging of the heart |
| US10610192B2 (en) | 2010-05-26 | 2020-04-07 | Koninklijke Philips N.V. | High volume rate 3D ultrasonic diagnostic imaging |
| WO2012059867A1 (en) * | 2010-11-05 | 2012-05-10 | Koninklijke Philips Electronics N.V. | Imaging apparatus for imaging an object |
| GB201113683D0 (en) * | 2011-08-09 | 2011-09-21 | Imorphics Ltd | Image processing method |
| GB201203883D0 (en) * | 2012-03-05 | 2012-04-18 | King S College London | Method and system to assist 2D-3D image registration |
| CN103829966B (zh) * | 2012-11-27 | 2018-12-07 | GE Medical Systems Global Technology Co., LLC | Method and system for automatically determining a positioning line in a scout image |
| US9972093B2 (en) * | 2015-03-30 | 2018-05-15 | Siemens Healthcare Gmbh | Automated region of interest detection using machine learning and extended Hough transform |
| RU2727244C2 (ru) * | 2015-11-04 | 2020-07-21 | Конинклейке Филипс Н.В. | Устройство для визуализации объекта |
| CN105761304B (zh) * | 2016-02-02 | 2018-07-20 | 飞依诺科技(苏州)有限公司 | Method and apparatus for constructing a three-dimensional organ model |
| CN107510466B (zh) * | 2016-06-15 | 2022-04-12 | 中慧医学成像有限公司 | Three-dimensional imaging method and system |
| MX386184B (es) * | 2016-06-30 | 2025-03-18 | Sicpa Holding Sa | Sistemas, metodos y programas de computadora para generar una imagen de un objeto y generar una medicion de autenticidad del objeto |
| JP7191020B2 (ja) * | 2016-12-08 | 2022-12-16 | Koninklijke Philips N.V. | Simplified navigation of medical imaging data of the spine |
| WO2019212016A1 (ja) * | 2018-05-01 | 2019-11-07 | Tohoku University | Image processing apparatus, image processing method, and image processing program |
| KR102394901B1 (ko) * | 2020-04-06 | 2022-05-09 | Curexo, Inc. | Apparatus and method for planning spinal surgery based on two-dimensional medical images |
| DE102022003163B4 * | 2022-08-30 | 2025-03-20 | Ziehm Imaging Gmbh | Method for recording a large-area X-ray image |
| CN117197204B (zh) * | 2023-09-14 | 2026-02-10 | 武汉联影智融医疗科技有限公司 | Registration method and apparatus for a two-dimensional image and a three-dimensional model |
| CN120859534B (zh) * | 2025-07-24 | 2026-01-30 | 中世康恺科技有限公司 | Geometric parameter calibration method for an X-ray imaging system |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3864106B2 (ja) * | 2002-03-27 | 2006-12-27 | GE Medical Systems Global Technology Company, LLC | Transmission X-ray data acquisition apparatus and X-ray tomography apparatus |
| JP2003310592A (ja) * | 2002-04-22 | 2003-11-05 | Toshiba Corp | Remote X-ray imaging method, remote X-ray imaging system, simulation method for a medical image diagnostic apparatus, information processing service method, and modality simulator system |
| JP2004201730A (ja) * | 2002-12-24 | 2004-07-22 | Hitachi Ltd | Method for generating a three-dimensional shape using projection images from multiple directions |
| EP1603461A2 * | 2003-03-10 | 2005-12-14 | Philips Intellectual Property & Standards GmbH | Device and method for adapting the recording parameters of an X-ray exposure |
| JP4429694B2 (ja) * | 2003-11-13 | 2010-03-10 | Hitachi Medical Corporation | X-ray CT apparatus |
| JP4554185B2 (ja) * | 2003-11-18 | 2010-09-29 | Hitachi Medical Corporation | X-ray CT apparatus |
| US7327872B2 (en) * | 2004-10-13 | 2008-02-05 | General Electric Company | Method and system for registering 3D models of anatomical regions with projection images of the same |
| JP4731151B2 (ja) * | 2004-10-22 | 2011-07-20 | Hitachi Medical Corporation | X-ray tube current determination method and X-ray CT apparatus |
| US20060173268A1 (en) * | 2005-01-28 | 2006-08-03 | General Electric Company | Methods and systems for controlling acquisition of images |
| GB0503236D0 * | 2005-02-16 | 2005-03-23 | Ccbr As | Vertebral fracture quantification |
| JP4679951B2 (ja) * | 2005-04-11 | 2011-05-11 | Hitachi Medical Corporation | X-ray CT apparatus |
- 2007
- 2007-12-17 EP EP07849520A patent/EP2111604A2/de not_active Ceased
- 2007-12-17 JP JP2009542335A patent/JP5345947B2/ja not_active Expired - Fee Related
- 2007-12-17 WO PCT/IB2007/055155 patent/WO2008078259A2/en not_active Ceased
- 2007-12-17 CN CN2007800473164A patent/CN101689298B/zh active Active
Non-Patent Citations (2)
| Title |
|---|
| None * |
| See also references of WO2008078259A2 * |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2008078259A2 (en) | 2008-07-03 |
| CN101689298B (zh) | 2013-05-01 |
| CN101689298A (zh) | 2010-03-31 |
| JP5345947B2 (ja) | 2013-11-20 |
| WO2008078259A3 (en) | 2009-06-04 |
| JP2010512915A (ja) | 2010-04-30 |
Similar Documents
| Publication | Title |
|---|---|
| JP5345947B2 (ja) | Imaging system and imaging method for imaging an object |
| JP7051307B2 (ja) | Medical image diagnostic apparatus |
| US7426256B2 (en) | Motion-corrected three-dimensional volume imaging method |
| JP5134957B2 (ja) | Dynamic tracking of a moving target |
| EP3349660B1 (de) | System for tracking an ultrasound probe in a body part |
| EP3586309B1 (de) | Setup for deep-inspiration breath-hold using X-ray imaging |
| JP7027046B2 (ja) | Medical image capturing apparatus and method |
| US20070053482A1 (en) | Reconstruction of an image of a moving object from volumetric data |
| EP2017785A1 (de) | Image recording method for motion analysis |
| US20030181809A1 (en) | 3D imaging for catheter interventions by use of 2D/3D image fusion |
| US10540764B2 (en) | Medical image capturing apparatus and method |
| JP6951117B2 (ja) | Medical image diagnostic apparatus |
| CN114929112A (zh) | Field-of-view matching for mobile 3D imaging |
| US10537293B2 (en) | X-ray CT system, image display device, and image display method |
| CN1973297A (zh) | Information-enhanced image-guided intervention |
| JP6637781B2 (ja) | Radiation imaging apparatus and image processing program |
| JP2024504867A (ja) | Image-based planning of a tomographic scan |
| CN102427767B (zh) | Data acquisition and visualization mode for low-dose interventional guidance in computed tomography |
| US10463328B2 (en) | Medical image diagnostic apparatus |
| US20250127476A1 (en) | Methods, systems, and mediums for scanning |
| EP2220618B1 (de) | Apparatus for determining a parameter of a moving object |
| JP2017217154A (ja) | X-ray CT apparatus |
| US10796475B2 (en) | Bone segmentation and display for 3D extremity imaging |
| WO2006085253A2 (en) | Computer tomography apparatus, method of examining an object of interest with a computer tomography apparatus, computer-readable medium and program element |
| JP6855173B2 (ja) | X-ray CT apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR |
| | 17P | Request for examination filed | Effective date: 20091204 |
| | RBV | Designated contracting states (corrected) | Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR |
| | 17Q | First examination report despatched | Effective date: 20100324 |
| | DAX | Request for extension of the european patent (deleted) | |
| | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: PHILIPS INTELLECTUAL PROPERTY & STANDARDS GMBH; Owner name: KONINKLIJKE PHILIPS N.V. |
| | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| | 18R | Application refused | Effective date: 20170304 |