EP3370616B1 - Device for imaging an object (Vorrichtung zur Abbildung eines Objekts)
Device for imaging an object
- Publication number
- EP3370616B1 (application EP16790337.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- imaging
- region
- anatomical
- orientation data
- imaged
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion.)
- Active
Links
- 238000003384 imaging method Methods 0.000 title claims description 118
- 238000004590 computer program Methods 0.000 claims description 23
- 238000000034 method Methods 0.000 claims description 20
- 210000003484 anatomy Anatomy 0.000 claims description 12
- 230000003287 optical effect Effects 0.000 claims description 5
- 238000002604 ultrasonography Methods 0.000 claims description 4
- 230000001419 dependent effect Effects 0.000 description 4
- 238000002594 fluoroscopy Methods 0.000 description 3
- 206010033307 Overweight Diseases 0.000 description 2
- 238000002059 diagnostic imaging Methods 0.000 description 2
- 238000011045 prefiltration Methods 0.000 description 2
- 230000003936 working memory Effects 0.000 description 2
- 238000012879 PET imaging Methods 0.000 description 1
- 239000003795 chemical substances by application Substances 0.000 description 1
- 239000002872 contrast media Substances 0.000 description 1
- 238000003745 diagnosis Methods 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 230000001435 haemodynamic effect Effects 0.000 description 1
- 230000005865 ionizing radiation Effects 0.000 description 1
- 210000000056 organ Anatomy 0.000 description 1
- 238000002601 radiography Methods 0.000 description 1
- 230000029058 respiratory gaseous exchange Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/545—Control of apparatus or devices for radiation diagnosis involving automatic set-up of acquisition parameters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/48—Diagnostic techniques
- A61B6/486—Diagnostic techniques involving generating temporal series of image data
- A61B6/487—Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/547—Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- the invention relates to a device for imaging of an object, a system for imaging of an object, a method for imaging of an object, a computer program element for controlling such device or system for performing such method and a computer readable medium having stored such computer program element.
- in radiology, the anatomical region examined is known before the examination. Hence, it is possible to preselect an appropriate image processing in nearly all cases. Furthermore, in radiology there is usually a delay between examination and diagnosis, which gives a radiologist a possibility to change or adapt the image processing later on. On the contrary, in fluoroscopy, the anatomical region examined is often changing during the examination. This means that, compared to radiology, a more general image processing has to be applied to images of different anatomical regions. Such general image processing may be non-optimal for particular anatomical regions.
- US 2010/183206 discloses an automatically adjusting acquisition protocol for dynamic medical imaging, such as dynamic CT, MRI or PET imaging.
- the protocols are adjusted based on anatomic and dynamic models which are individualized or fitted to each patient based on a scout scan.
- the adjustment can compensate for changes in the patient due to patient motion (e.g. breathing or heartbeat) or flow of contrast or tracing agent during the sequence.
- the dynamic model can be a motion model used to predict the motion of anatomic/physiologic features, typically organs, during scanning, or a haemodynamic model used to predict flow of the contrast agent allowing for precise timing of the scanning sequence.
- DE 10 2012 201798A1 discloses a method for planning an X-ray imaging of an examination area of an object with low radiation exposure, comprising the following steps: S1) Receiving configuration parameters of the X-ray imaging device; S2) Determining the position of at least one part of the object; S3) Determining at least one irradiated region of the object that is imaged depending on the configuration parameters of the X-ray imaging device and the position of at least one part of the object. Furthermore, the invention of that cited document describes a device for planning an X-ray imaging with low radiation exposure.
- WO 2014/033614 A1 discloses an apparatus and method for automatically or semi-automatically controlling a collimator of an X-ray imager to collimate imager's X-ray beam and adjusting an alignment of the X-ray imager in respect of an object.
- the collimation and alignment operation is based on 3D image data of the object to be imaged.
- the 3D image data is acquired by a sensor.
- the sensor operates on non-ionizing radiation.
- the 3D image data describes a shape in 3D of the object and anatomic landmarks are derived therefrom to define a collimation window for a region of interest. Based on the collimation window the collimator's setting and imager alignment is adjusted accordingly.
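- purely as an illustration of the collimation-window idea described above (the function name, margin and landmark coordinates are invented for this sketch and are not taken from the cited document), a rectangular window can be derived from projected landmarks as follows:

```python
# Illustrative sketch only: deriving a rectangular collimation window for a
# region of interest from anatomic landmarks projected onto the detector plane.
# The landmark coordinates and the margin are invented example values.
import numpy as np

def collimation_window(landmarks_2d: np.ndarray, margin_mm: float = 20.0):
    """landmarks_2d: (N, 2) landmark coordinates in the detector plane [mm]."""
    lower = landmarks_2d.min(axis=0) - margin_mm
    upper = landmarks_2d.max(axis=0) + margin_mm
    return lower, upper  # opposite corners of the collimator opening

lo, hi = collimation_window(np.array([[10.0, 40.0], [80.0, 35.0], [55.0, 120.0]]))
print(lo, hi)  # [-10. 15.] [100. 140.]
```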
- the image processing may, however, still be improved.
- the imaging device comprises a provision unit and a processing unit.
- the provision unit is configured to provide position and orientation data of the object to be imaged.
- the provision unit is further configured to provide position and orientation data of an imaging unit adjusted for a subsequent imaging of a region of the object to be imaged.
- the processing unit is configured to combine the position and orientation data of the object and the position and orientation data of the imaging unit to determine the region to be subsequently imaged and to set at least one imaging parameter of the imaging unit based on the determined region to be subsequently imaged.
- position data as well as orientation data may be either two-dimensional ("2D") or three-dimensional ("3D").
- the position data may be provided in the form of coordinates along two or three mutually independent directions e.g. in 2D or 3D Cartesian space, respectively.
- the orientation data may for example be provided in the form of rotations along two or three mutually independent directions e.g. in 2D or 3D Cartesian space, respectively.
- the position data and orientation data employ a common (sub-)set of 2D or 3D directions.
- the position data and orientation data employ mutually different sets of 2D or 3D directions.
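- as a minimal sketch of how such position and orientation data could be represented in software (the class name, fields and frame conventions below are assumptions made for illustration, not taken from the patent), a pose may combine a translation vector with a rotation matrix:

```python
# Minimal sketch (not from the patent): representing position and orientation
# data of the object or of the imaging unit as a rigid pose in 3D space.
from dataclasses import dataclass
import numpy as np


@dataclass
class Pose3D:
    position: np.ndarray      # (3,) coordinates along three mutually independent directions
    orientation: np.ndarray   # (3, 3) rotation matrix in the same Cartesian frame

    def relative_to(self, other: "Pose3D") -> "Pose3D":
        """Express this pose in the coordinate frame of `other`."""
        r = other.orientation.T @ self.orientation
        t = other.orientation.T @ (self.position - other.position)
        return Pose3D(position=t, orientation=r)


# Example: pose of the imaging unit expressed relative to the object (patient).
patient = Pose3D(np.array([0.0, 0.0, 0.0]), np.eye(3))
imager = Pose3D(np.array([0.0, -1.0, 1.2]), np.eye(3))
imager_in_patient_frame = imager.relative_to(patient)
```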
- the processing unit is capable of determining, i.e. predicting, based on a combination of the position data and orientation data of both the object and the imaging unit, the region to be subsequently imaged. While doing so, the processing unit does not need to rely on (yet may incorporate) information concerning the medical imaging protocol at hand. For example, the processing unit may be able to distinguish between a posterior-anterior ("PA") exposure and an anterior-posterior ("AP") exposure based on a combination of orientation data of the object with orientation data of the imaging unit. Furthermore, the processing unit may be able to determine the scan direction, e.g. from head to toe or vice versa, based on a combination of position data of the object and position data of the imaging unit, and it may be able to determine the region to be subsequently imaged based on the position data of the imaging unit and such scan direction.
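- a rough sketch of how such a combination could be evaluated in practice (the axis conventions and function names are assumptions for illustration, not the patent's implementation):

```python
# Hypothetical sketch: distinguishing a PA from an AP exposure and deriving the
# scan direction by combining object and imaging-unit pose data. The axis
# conventions are assumptions made for this example only.
import numpy as np

def exposure_view(patient_anterior_axis: np.ndarray, beam_direction: np.ndarray) -> str:
    """PA if the X-ray beam travels from the patient's back towards the front."""
    return "PA" if np.dot(beam_direction, patient_anterior_axis) > 0 else "AP"

def scan_direction(imager_positions, head_axis: np.ndarray) -> str:
    """Head-to-toe or toe-to-head, from successive imaging-unit positions."""
    displacement = imager_positions[-1] - imager_positions[0]
    return "head-to-toe" if np.dot(displacement, head_axis) < 0 else "toe-to-head"

# Example: beam pointing along the patient's anterior axis -> PA exposure;
# imaging unit moving away from the head -> head-to-toe scan.
print(exposure_view(np.array([0, 1, 0]), np.array([0, 1, 0])))                  # PA
print(scan_direction([np.array([0, 0, 1.8]), np.array([0, 0, 1.2])],
                     head_axis=np.array([0, 0, 1.0])))                          # head-to-toe
```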
- a dynamic X-ray system, for example a fluoroscopy system, is configured for imaging a non-static region, i.e. a region that is not constant over time.
- the device for imaging of an object according to the present invention particularly allows for successful clinical application in such a dynamic X-ray system, either in diagnostic or interventional use. That is, the device according to the present invention - owing to its capability to predict the region to be subsequently imaged and to set at least one imaging parameter based on such prediction - is advantageously capable of automatically selecting and setting, in dependence of the region to be subsequently imaged, an optimal, i.e. most appropriate, value for the at least one imaging parameter.
- the term imaging parameter includes parameters concerning image processing as well as parameters concerning the imaging itself, e.g. parameters concerning irradiation such as generator voltage, tube pre-filtration and dose, in case the imaging unit comprises an X-ray source.
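- one conceivable way to select such parameters in dependence of the determined region is a simple lookup keyed on the anatomical region; the region names and numeric values below are invented placeholders for illustration only, not values from the patent:

```python
# Illustrative only: mapping a determined anatomical region to imaging
# parameters (image-processing preset, generator voltage, pre-filtration, dose).
from typing import TypedDict

class ImagingParameters(TypedDict):
    processing_preset: str
    tube_voltage_kv: float
    prefiltration_mm_cu: float
    dose_mas: float

REGION_PRESETS: dict = {
    "thorax":  {"processing_preset": "lung", "tube_voltage_kv": 120.0,
                "prefiltration_mm_cu": 0.2, "dose_mas": 2.0},
    "abdomen": {"processing_preset": "soft", "tube_voltage_kv": 80.0,
                "prefiltration_mm_cu": 0.1, "dose_mas": 5.0},
}

def parameters_for(region: str) -> ImagingParameters:
    """Fall back to a general-purpose preset when the region is unknown."""
    general: ImagingParameters = {"processing_preset": "general",
                                  "tube_voltage_kv": 90.0,
                                  "prefiltration_mm_cu": 0.1, "dose_mas": 3.0}
    return REGION_PRESETS.get(region, general)
```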
- the object to be imaged is a patient and the region to be subsequently imaged is an anatomical region.
- the provision unit is configured to provide position and/or orientation data of the object and/or the imaging unit relative to the region of the object to be imaged.
- the provision unit is configured to provide absolute position and/or orientation data of the object and/or the imaging unit.
- the processing unit may be configured to calculate relative position and/or orientation data of the object and/or the imaging unit.
- the imaging unit may be an X-ray system.
- the position and orientation data of the imaging unit comprise a position and an orientation of an X-ray tube, an X-ray detector and/or a collimator.
- the provision unit is further configured to provide an anatomical model of the object, such anatomical model having one or more anatomical landmarks, and to provide the position and orientation data of the object by combining the anatomical model and the one or more anatomical landmarks.
- at least three landmarks are used to estimate the position and orientation of an object in 3D space.
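- one standard way to obtain such an estimate is rigid registration of at least three corresponding model landmarks to detected landmarks via the Kabsch/SVD algorithm; this is a generic technique shown for illustration and not necessarily the method used here:

```python
# Generic Kabsch/SVD rigid registration: estimate the rotation and translation
# that map >= 3 anatomical-model landmarks onto the detected landmark positions.
import numpy as np

def estimate_pose(model_pts: np.ndarray, detected_pts: np.ndarray):
    """model_pts, detected_pts: (N, 3) arrays of corresponding landmarks, N >= 3."""
    mc, dc = model_pts.mean(axis=0), detected_pts.mean(axis=0)
    h = (model_pts - mc).T @ (detected_pts - dc)       # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))             # guard against reflection
    rotation = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    translation = dc - rotation @ mc
    return rotation, translation

# Example with three landmarks related by a pure translation.
model = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
detected = model + np.array([0.1, 0.2, 0.0])
r, t = estimate_pose(model, detected)   # r ~ identity, t ~ [0.1, 0.2, 0.0]
```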
- the anatomical model is preselected out of a group of anatomical models based on patient data and based on a previous image. Thereby an anatomical model for, e.g., a child or an adult, or for a slim, a normal or an overweight patient can be preselected.
- the anatomical model is adapted into an adapted anatomical model based on patient data and/or based on a previous image of the region to be subsequently imaged.
- the anatomical model can be adapted to the patient at hand, for example a child or an adult, or a slim, a normal or an overweight patient.
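- purely for illustration of such a preselection or adaptation based on patient data (the categories and thresholds below are invented, not taken from the patent), the choice could be keyed on age and body-mass index:

```python
# Invented example of preselecting an anatomical model out of a group based on
# patient data (age, body-mass index); categories and thresholds are placeholders.
def preselect_model(age_years: float, height_m: float, weight_kg: float) -> str:
    bmi = weight_kg / (height_m ** 2)
    build = "slim" if bmi < 18.5 else "normal" if bmi < 30 else "overweight"
    group = "child" if age_years < 16 else "adult"
    return f"{group}_{build}"          # e.g. "adult_normal" selects that model

print(preselect_model(age_years=42, height_m=1.80, weight_kg=78))  # adult_normal
```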
- the provision unit is further configured to detect the position of an anatomical landmark in patient data acquired by 2D and/or 3D optical, video, infrared and/or ultrasound means.
- provided that the position and orientation of the object to be imaged are stationary relative to the imaging unit and that the orientation of an X-ray tube relative to an X-ray detector is known, a two-dimensional estimation of the anatomical landmark(s) may be sufficient. This estimation may be based on simple video images detecting an outline of the object to be imaged.
- the device for imaging of an object may be part of a dynamic X-ray system, i.e. a fluoroscopy system, in which imaging parameters are automatically selected based on the region to be imaged next.
- the region to be subsequently imaged is identified by combining the position and orientation data of the imaging unit with the position and orientation data of the object as derived from an anatomical model having anatomical landmarks, wherein the landmarks are calculated from a previous image or pre-scan e.g. using optical, infrared and/or ultrasound means.
- the processing unit is configured for updating the anatomical model, and hence the anatomical landmarks, based on a previous image as acquired by the dynamic X-ray system. That is, a dynamic X-ray system typically generates a series of images, which allows the processing unit to improve the precision and/or quality with which the position of the anatomical landmarks is determined.
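- as a hedged sketch of how successive images might refine the landmark estimates (a simple exponential moving average; the patent does not specify this particular update scheme):

```python
# Assumed update scheme (not specified in the patent): refine the estimated
# position of each anatomical landmark with every new frame using an
# exponential moving average, so that later images improve precision.
import numpy as np

def update_landmarks(current: dict, measured: dict, gain: float = 0.3) -> dict:
    """current, measured: landmark name -> np.ndarray position estimates."""
    updated = dict(current)
    for name, measurement in measured.items():
        previous = current.get(name, measurement)
        updated[name] = (1.0 - gain) * previous + gain * measurement
    return updated

landmarks = {"sternum": np.array([0.00, 0.10, 0.95])}
landmarks = update_landmarks(landmarks, {"sternum": np.array([0.02, 0.10, 0.97])})
```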
- the imaging system comprises an imaging unit and the imaging device as described above.
- the imaging unit may comprise an X-ray source and an X-ray detector.
- Such X-ray source may comprise an X-ray tube driven by a high voltage generator.
- At least one imaging parameter of the imaging unit is set based on a region determined by the imaging device to be subsequently imaged.
- the method for imaging of an object according to the disclosure may comprise the following steps, not necessarily in this order: a) providing position and orientation data of the object to be imaged, b) providing position and orientation data of an imaging unit adjusted for a subsequent imaging of a region of the object to be imaged, c) combining the position and orientation data of the object and of the imaging unit to determine the region to be subsequently imaged, and d) setting at least one imaging parameter of the imaging unit based on the determined region to be subsequently imaged.
- the method may further comprise a step e) of imaging the determined region to be subsequently imaged.
- the computer program element comprises program code means for causing the imaging system as defined in the independent claim to carry out the steps of the imaging method as defined in the independent claim when the computer program is run on a computer controlling the imaging system.
- the imaging device, the imaging system, the imaging method, the computer program element for controlling such device and the computer readable medium having stored such computer program element according to the independent claims have similar and/or identical preferred embodiments, in particular, as defined in the dependent claims, if any.
- Fig. 1 shows schematically and exemplarily an embodiment of an imaging system 1 for an object 30 in accordance with the invention.
- the object 30 to be imaged may be a patient and the region to be subsequently imaged may be an anatomical region.
- the imaging system 1 comprises an imaging unit 20 and an imaging device 10 for an object 30 to be imaged.
- the imaging unit 20 may comprise an X-ray tube. At least one imaging parameter of the imaging unit 20 is set based on a region determined by the imaging device 10 to be subsequently imaged.
- the imaging device 10 comprises a provision unit 11 and a processing unit 12.
- the provision unit 11 provides position and orientation data of an object 30 to be imaged and position and orientation data of an imaging unit 20 relative to a region of the object 30.
- the imaging unit 20 is adjusted for a subsequent imaging of the region.
- the position and orientation data of the object 30 may comprise positions of anatomical landmarks of the object 30 and the provision unit 11 is configured to provide an anatomical model and to combine the anatomical model and the positions of the object's anatomical landmarks to provide the position and orientation data of the object 30.
- the provision unit may further detect the position of an anatomical landmark in patient data acquired by 2D and/or 3D optical, video, infrared and/or ultrasound means.
- the position and orientation data of the imaging unit 20 may comprise a position and an orientation of an X-ray tube, an X-ray detector and a collimator.
- the processing unit 12 combines the position and orientation data of the object 30 and the position and orientation data of the imaging unit 20 to determine the region to be subsequently imaged and to set at least one imaging parameter of the imaging unit 20 based on the determined region to be subsequently imaged.
- the imaging device 10 automatically sets the most appropriate imaging parameter(s), and thereby the most appropriate image processing, for the next anatomical region to be imaged and thus for each subsequently imaged anatomical region.
- the imaging device 10 may not only be used to adapt the image processing but also to improve the irradiation by the imaging unit 20.
- the imaging unit 20 may comprise an X-ray tube and the processing unit 12 may set an irradiation parameter of the imaging unit 20 based on the determined region to be subsequently imaged.
- examples of such irradiation parameters are the X-ray beam quality (e.g. generator voltage (kV) and tube pre-filtration) and the dose (e.g. generator mAs).
- Fig. 2 shows a schematic overview of steps of a method for imaging of an object 30.
- the method comprises the following steps, not necessarily in this order: providing position and orientation data of the object 30 to be imaged, providing position and orientation data of the imaging unit 20 adjusted for a subsequent imaging of a region of the object 30, combining these data to determine the region to be subsequently imaged, and setting at least one imaging parameter of the imaging unit 20 based on the determined region to be subsequently imaged.
- a computer program or a computer program element is provided that is characterized by being adapted to execute the method steps of the method according to one of the preceding embodiments, on an appropriate system.
- the computer program element might therefore be stored on a computing unit, which might also be part of an embodiment of the present invention.
- This computing unit may be adapted to perform or induce a performing of the steps of the method described above. Moreover, it may be adapted to operate the components of the above described apparatus.
- the computing unit can be adapted to operate automatically and/or to execute the orders of a user.
- a computer program may be loaded into a working memory of a data processor.
- the data processor may thus be equipped to carry out the method of the invention.
- This exemplary embodiment of the invention covers both a computer program that uses the invention right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the invention.
- the computer program element might be able to provide all necessary steps to fulfil the procedure of an exemplary embodiment of the method as described above.
- a computer readable medium, such as a CD-ROM, is presented, wherein the computer readable medium has a computer program element stored on it, which computer program element is described by the preceding section.
- a computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
- the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network.
- a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- Radiology & Medical Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Optics & Photonics (AREA)
- Biophysics (AREA)
- High Energy & Nuclear Physics (AREA)
- Veterinary Medicine (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Analysing Materials By The Use Of Radiation (AREA)
Claims (12)
- Device (10) for imaging an object (30), comprising: - a provision unit (11), and - a processing unit (12), wherein the provision unit (11) is configured to provide position and orientation data of the object (30) to be imaged, wherein the position and orientation data of the object (30) comprise positions of anatomical landmarks of the object (30), wherein the provision unit (11) is further configured to provide an anatomical model of the object (30), said anatomical model having one or more anatomical landmarks, and to provide the position and orientation data of the object by combining the anatomical model and one or more anatomical landmarks of the object (30), wherein the provision unit (11) is further configured to provide position and orientation data of an imaging unit (20) adjusted for a subsequent imaging of a region of the object (30) to be imaged, wherein the processing unit (12) is further configured to combine the position and orientation data of the object (30) and the position and orientation data of the imaging unit (20) to determine the region to be subsequently imaged, and wherein the processing unit (12) is further configured to set at least one imaging parameter of the imaging unit (20) based on the determined region to be subsequently imaged, characterized in that the anatomical model is preselected out of a group of anatomical models based on a previous image of the region to be subsequently imaged, wherein the anatomical model is preselected out of the group of anatomical models based on patient data.
- Device (10) according to claim 1, wherein the object (30) to be imaged is a patient and the region to be subsequently imaged is an anatomical region.
- Device (10) according to claim 1 or 2, wherein the provision unit (11) is configured to provide position and/or orientation data of the imaging unit (20) relative to the region of the object (30) to be imaged.
- Device (10) according to any one of the preceding claims, wherein the provision unit (11) is further configured to detect positions and/or orientations of the anatomical landmarks in patient data acquired by 2D and/or 3D optical, video, infrared and/or ultrasound means.
- Device (10) according to any one of the preceding claims, wherein the anatomical model is adapted into an adapted anatomical model based on patient data and/or based on a previous image of the region to be subsequently imaged.
- Device (10) according to any one of the preceding claims, wherein the position and orientation data of the imaging unit (20) comprise a position and an orientation of an X-ray tube, an X-ray detector or a collimator.
- Device (10) according to any one of the preceding claims, wherein the processing unit (12) is further configured to set an irradiation parameter of the imaging unit (20) based on the determined region to be subsequently imaged.
- System (1) for imaging an object (30), comprising: - an imaging unit (20), and - the device (10) for imaging an object (30) according to any one of the preceding claims, wherein at least one imaging parameter of the imaging unit (20) is set based on a region to be subsequently imaged as determined by the imaging device (10).
- Method for imaging an object (30), comprising the following steps: a) providing position and orientation data of the object (30) to be imaged, wherein the position and orientation data of the object (30) comprise positions of anatomical landmarks of the object (30), b) preselecting an anatomical model out of a group of anatomical models based on patient data and based on a previous image of the region to be subsequently imaged, said anatomical model having one or more anatomical landmarks, c) providing the position and orientation data of the object by combining the anatomical model and one or more anatomical landmarks of the object (30), d) providing position and orientation data of an imaging unit (20) adjusted for a subsequent imaging of the region, e) combining the position and orientation data of the object (30) and the position and orientation data of the imaging unit (20) to determine the region to be subsequently imaged, and f) setting at least one imaging parameter of the imaging unit (20) based on the determined region to be subsequently imaged.
- Method for imaging an object (30) according to the preceding claim, further comprising the step e) of imaging the determined region to be subsequently imaged.
- A computer program element for controlling a device or a system according to any one of claims 1 to 8 which, when being executed by a processing unit (12), is adapted to carry out the method steps of claims 10 or 11.
- A computer readable medium having stored thereon the program element of the preceding claim.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15192891 | 2015-11-04 | ||
PCT/EP2016/076314 WO2017076841A1 (en) | 2015-11-04 | 2016-11-02 | Device for imaging an object |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3370616A1 EP3370616A1 (de) | 2018-09-12 |
EP3370616B1 true EP3370616B1 (de) | 2022-08-17 |
Family
ID=54366111
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16790337.6A Active EP3370616B1 (de) | 2015-11-04 | 2016-11-02 | Vorrichtung zur abbildung eines objekts |
Country Status (7)
Country | Link |
---|---|
US (1) | US10786220B2 (de) |
EP (1) | EP3370616B1 (de) |
JP (2) | JP7165053B2 (de) |
CN (1) | CN108348205A (de) |
BR (1) | BR112018008899A8 (de) |
RU (1) | RU2727244C2 (de) |
WO (1) | WO2017076841A1 (de) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3370616B1 (de) * | 2015-11-04 | 2022-08-17 | Koninklijke Philips N.V. | Vorrichtung zur abbildung eines objekts |
EP3508130A1 (de) * | 2018-01-03 | 2019-07-10 | Koninklijke Philips N.V. | Anpassung des sichtfeldes |
EP3847967A1 (de) * | 2020-01-07 | 2021-07-14 | Koninklijke Philips N.V. | Patientenmodellschätzung für interventionen |
US10863964B1 (en) | 2020-06-01 | 2020-12-15 | King Saud University | Infrared alignment guide for an X-ray machine |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000197621A (ja) | 1999-01-06 | 2000-07-18 | Toshiba Corp | 医用画像撮影装置 |
EP1324694A1 (de) | 2000-10-02 | 2003-07-09 | Koninklijke Philips Electronics N.V. | Verfahren und röntgenvorrichtung zur optimalen darstellung der menschlichen anatomie |
JP2003310592A (ja) | 2002-04-22 | 2003-11-05 | Toshiba Corp | 遠隔x線撮像方法、遠隔x線撮像システム、医用画像診断装置のシミュレーション方法、情報処理サービス方法、及びモダリティシミュレータシステム |
DE10232676B4 (de) | 2002-07-18 | 2006-01-19 | Siemens Ag | Verfahren und Vorrichtung zur Positionierung eines Patienten in einem medizinischen Diagnose- oder Therapiegerät |
US7873403B2 (en) * | 2003-07-15 | 2011-01-18 | Brainlab Ag | Method and device for determining a three-dimensional form of a body from two-dimensional projection images |
US8000445B2 (en) | 2006-07-31 | 2011-08-16 | Koninklijke Philips Electronics N.V. | Rotational X-ray scan planning system |
EP2111604A2 (de) | 2006-12-22 | 2009-10-28 | Koninklijke Philips Electronics N.V. | Bildgebungssystem und bildgebungsverfahren zur abbildung eines objekts |
US7627084B2 (en) | 2007-03-30 | 2009-12-01 | General Electric Compnay | Image acquisition and processing chain for dual-energy radiography using a portable flat panel detector |
CN101686825B (zh) | 2007-06-21 | 2012-08-22 | 皇家飞利浦电子股份有限公司 | 使用动态模型调整用于动态医学成像的采集协议 |
EP2130491B1 (de) | 2008-06-06 | 2015-08-05 | Cefla S.C. | Verfahren und Vorrichtung zur Röntgenbildgebung |
FI123713B (fi) | 2011-03-21 | 2013-09-30 | Planmeca Oy | Järjestely intraoraaliröntgenkuvantamisen yhteydessä |
DE102012201798A1 (de) * | 2012-02-07 | 2013-08-08 | Siemens Aktiengesellschaft | Verfahren und Vorrichtung zur Planung einer Röntgenbildgebung mit geringer Strahlenbelastung |
RU2640566C2 (ru) | 2012-08-27 | 2018-01-09 | Конинклейке Филипс Н.В. | Персональная и автоматическая корректировка рентгеновской системы на основе оптического обнаружения и интерпретации трехмерной сцены |
DE102013220665A1 (de) * | 2013-10-14 | 2015-04-16 | Siemens Aktiengesellschaft | Bestimmung eines Werts eines Aufnahmeparameters mittels einer anatomischen Landmarke |
JP6707542B2 (ja) * | 2014-12-18 | 2020-06-10 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 物体の細長い関心領域を撮像するための撮像システム |
EP3370616B1 (de) | 2015-11-04 | 2022-08-17 | Koninklijke Philips N.V. | Vorrichtung zur abbildung eines objekts |
- 2016
- 2016-11-02 EP EP16790337.6A patent/EP3370616B1/de active Active
- 2016-11-02 BR BR112018008899A patent/BR112018008899A8/pt not_active Application Discontinuation
- 2016-11-02 RU RU2018120336A patent/RU2727244C2/ru active
- 2016-11-02 JP JP2018521566A patent/JP7165053B2/ja active Active
- 2016-11-02 US US15/772,604 patent/US10786220B2/en active Active
- 2016-11-02 WO PCT/EP2016/076314 patent/WO2017076841A1/en active Application Filing
- 2016-11-02 CN CN201680064540.3A patent/CN108348205A/zh active Pending
- 2022
- 2022-09-09 JP JP2022143388A patent/JP7320116B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
BR112018008899A2 (pt) | 2018-11-06 |
RU2018120336A (ru) | 2019-12-04 |
CN108348205A (zh) | 2018-07-31 |
JP7320116B2 (ja) | 2023-08-02 |
US10786220B2 (en) | 2020-09-29 |
JP7165053B2 (ja) | 2022-11-02 |
RU2018120336A3 (de) | 2020-02-17 |
JP2018532503A (ja) | 2018-11-08 |
BR112018008899A8 (pt) | 2019-02-26 |
EP3370616A1 (de) | 2018-09-12 |
JP2022173271A (ja) | 2022-11-18 |
RU2727244C2 (ru) | 2020-07-21 |
US20190117183A1 (en) | 2019-04-25 |
WO2017076841A1 (en) | 2017-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9724049B2 (en) | Radiotherapy system | |
US10172574B2 (en) | Interventional X-ray system with automatic iso-centering | |
CN104586417B (zh) | 增大锥束计算机层析成像获取中的视场的方法和成像设备 | |
CN109938758B (zh) | 用于确保针对放射成像记录的正确定位的方法和装置 | |
US8886286B2 (en) | Determining and verifying the coordinate transformation between an X-ray system and a surgery navigation system | |
EP2465435B1 (de) | Auswahl des optimalen Betrachtungswinkels zur Optimierung der Anatomiesichtbarkeit und Patientenhautdosis | |
CN106456082B (zh) | 用于脊椎层面的成像系统 | |
JP7320116B2 (ja) | 対象を撮像する装置 | |
US20220160322A1 (en) | Positioning of an x-ray imaging system | |
JP2009022754A (ja) | 放射線画像の位置揃えを補正する方法 | |
US10245001B2 (en) | Generation of a three-dimensional reconstruction of a body part by an X-ray machine | |
US10546398B2 (en) | Device and method for fine adjustment of the reconstruction plane of a digital combination image | |
EP3206183A1 (de) | Verfahren und vorrichtung zur benutzerführung zur auswahl einer zweidimensionalen angiografischen projektion | |
EP4274502B1 (de) | Navigationsunterstützung | |
JP2017202308A (ja) | X線ct装置及び医用情報管理装置 | |
CN110267594B (zh) | C型臂计算机断层摄影中的等中心 | |
EP4312188A1 (de) | Kombinierte optische und nicht optische 3d-rekonstruktion | |
KR102240184B1 (ko) | 해부학적 구조의 모델 생성 시스템 및 이를 포함하는 의료 도구 가이드 시스템 | |
WO2024022907A1 (en) | Combined optical and non-optical 3d reconstruction |
Legal Events
Code | Title | Description |
---|---|---|
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: UNKNOWN |
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | ORIGINAL CODE: 0009012 |
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed | Effective date: 20180604 |
AK | Designated contracting states | Kind code of ref document: A1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the European patent | Extension state: BA ME |
DAV | Request for validation of the European patent (deleted) | |
DAX | Request for extension of the European patent (deleted) | |
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: EXAMINATION IS IN PROGRESS |
17Q | First examination report despatched | Effective date: 20191126 |
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: EXAMINATION IS IN PROGRESS |
RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: KONINKLIJKE PHILIPS N.V. |
GRAP | Despatch of communication of intention to grant a patent | ORIGINAL CODE: EPIDOSNIGR1 |
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: GRANT OF PATENT IS INTENDED |
INTG | Intention to grant announced | Effective date: 20220322 |
GRAS | Grant fee paid | ORIGINAL CODE: EPIDOSNIGR3 |
GRAA | (expected) grant | ORIGINAL CODE: 0009210 |
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: THE PATENT HAS BEEN GRANTED |
AK | Designated contracting states | Kind code of ref document: B1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
REG | Reference to a national code | Ref country code: CH; ref legal event code: EP |
REG | Reference to a national code | Ref country code: DE; ref legal event code: R096; ref document number: 602016074366 |
REG | Reference to a national code | Ref country code: IE; ref legal event code: FG4D |
REG | Reference to a national code | Ref country code: AT; ref legal event code: REF; ref document number: 1511642; kind code: T; effective date: 20220915 |
REG | Reference to a national code | Ref country code: DE; ref legal event code: R084; ref document number: 602016074366 |
REG | Reference to a national code | Ref country code: GB; ref legal event code: 746; effective date: 20221013 |
REG | Reference to a national code | Ref country code: NL; ref legal event code: MP; effective date: 20220817 |
REG | Reference to a national code | Ref country code: LT; ref legal event code: MG9D |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SE, RS, NL, LV, LT, FI (effective date: 20220817); PT (20221219); NO (20221117) |
REG | Reference to a national code | Ref country code: AT; ref legal event code: MK05; ref document number: 1511642; kind code: T; effective date: 20220817 |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: PL, HR (effective date: 20220817); IS (20221217); GR (20221118) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SM, RO, ES, DK, CZ, AT (effective date: 20220817) |
REG | Reference to a national code | Ref country code: DE; ref legal event code: R097; ref document number: 602016074366 |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SK, EE (effective date: 20220817) |
PLBE | No opposition filed within time limit | ORIGINAL CODE: 0009261 |
STAA | Information on the status of an EP patent application or granted EP patent | STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MC, AL (effective date: 20220817) |
REG | Reference to a national code | Ref country code: CH; ref legal event code: PL |
26N | No opposition filed | Effective date: 20230519 |
REG | Reference to a national code | Ref country code: BE; ref legal event code: MM; effective date: 20221130 |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of non-payment of due fees: LI, CH (effective date: 20221130) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | SI: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit (effective date: 20220817); LU: lapse because of non-payment of due fees (effective date: 20221102) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of non-payment of due fees: IE (effective date: 20221102) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of non-payment of due fees: FR, BE (effective date: 20221130) |
PGFP | Annual fee paid to national office [announced via postgrant information from national office to EPO] | Ref country code: GB; payment date: 20231121; year of fee payment: 8 |
PGFP | Annual fee paid to national office [announced via postgrant information from national office to EPO] | Ref country code: DE; payment date: 20231127; year of fee payment: 8 |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | HU: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit; INVALID AB INITIO (effective date: 20161102) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: CY (effective date: 20220817) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MK, IT (effective date: 20220817) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: BG (effective date: 20220817) |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MT (effective date: 20220817) |